Obiguard provides a robust and secure gateway to facilitate the integration of various Large Language Models (LLMs) into your applications, including Nomic.

Nomic has become especially popular for its high-quality embeddings and is now available through Obiguard's AI gateway as well.

With Obiguard, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through a virtual key system.

Provider slug: nomic

Obiguard SDK Integration with Nomic

Obiguard provides a consistent API to interact with embedding models from various providers. To integrate Nomic with Obiguard:

1. Create a Virtual Key for Nomic in your Obiguard account

Head over to the Virtual Keys tab in your Obiguard account and create one for Nomic. This virtual key will then be used to make API requests to Nomic without exposing your protected API key.

2. Install the Obiguard SDK and Initialize with this Virtual Key

Add the Obiguard SDK to your application to interact with Nomic’s API through Obiguard’s gateway.

from obiguard import Obiguard

client = Obiguard(
  obiguard_api_key="vk-obg***",  # Your Obiguard API key
  virtual_key="VIRTUAL_KEY"  # Replace with your virtual key for Nomic
)

3. Invoke the Embeddings API with Nomic

Use the Obiguard client to send embedding requests to Nomic. You can also override the virtual key directly in the API call if needed.

embeddings = client.embeddings.create(
  input='create a vector representation of this sentence',
  model='nomic-embed-text-v1.5'
)

print(embeddings)
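Embedding vectors like the ones returned above are typically consumed by comparing them, most often with cosine similarity. Below is a minimal, self-contained sketch in pure Python; the short vectors and the `embeddings.data[i].embedding` response path mentioned in the comments are illustrative assumptions, not a documented part of the Obiguard SDK.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Stand-ins for real embedding vectors (in practice these would come
# from the API response, e.g. something like embeddings.data[0].embedding;
# check the SDK's response object for the exact attribute names).
vec_a = [0.1, 0.2, 0.3]
vec_b = [0.1, 0.2, 0.25]

score = cosine_similarity(vec_a, vec_b)
print(round(score, 3))  # values close to 1.0 mean very similar texts
```

Identical directions score 1.0 and orthogonal vectors score 0.0, which is why cosine similarity is the usual choice for ranking semantically similar sentences.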

Next Steps

The complete list of features supported in the SDK is available at the link below.

Obiguard SDK Client

Learn more about the Obiguard SDK Client