Obiguard provides a robust and secure gateway to facilitate the integration of various Large Language Models (LLMs) into your applications, including Snowflake Cortex APIs.

With Obiguard, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through a virtual key system.

Provider Slug: cortex

Obiguard SDK Integration with Snowflake Cortex Models

Obiguard provides a consistent API to interact with models from various providers. To integrate Snowflake Cortex with Obiguard:

1. Install the Obiguard SDK

Add the Obiguard SDK to your application to interact with the Snowflake Cortex API through Obiguard's gateway.

pip install obiguard

2. Initialize Obiguard with the Virtual Key

To use Snowflake Cortex with Obiguard, obtain your API key (JWT token) from the Snowflake platform, then add it to Obiguard to create a virtual key.

from obiguard import Obiguard

client = Obiguard(
  obiguard_api_key="vk-obg***",  # Your Obiguard virtual key
)
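Rather than hard-coding the virtual key, you may prefer to read it from the environment. This is a minimal sketch; the variable name OBIGUARD_API_KEY is an assumed convention, not something the SDK reads automatically.

```python
import os

# Read the Obiguard virtual key from the environment instead of hard-coding it.
# OBIGUARD_API_KEY is an assumed variable name; the fallback placeholder below
# is illustrative only.
virtual_key = os.environ.get("OBIGUARD_API_KEY", "vk-obg***")
```

You can then pass `virtual_key` as the `obiguard_api_key` argument when constructing the client.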

3. Invoke Chat Completions with Snowflake Cortex

Use the Obiguard client to send requests to Snowflake Cortex. You can also override the virtual key directly in the API call if needed.

completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="claude-3-5-sonnet",
)

print(completion)
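The `messages` argument follows the role/content chat format shown above, so multi-turn context is passed by appending earlier turns to the list. A minimal sketch (the conversation content is illustrative):

```python
# Build a multi-turn message list in the same role/content format as above.
# Each turn is a dict with a "role" ("system", "user", or "assistant") and a
# "content" string; earlier turns give the model conversational context.
messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Say this is a test"},
    {"role": "assistant", "content": "This is a test."},
    {"role": "user", "content": "Now say it in French."},
]

# Pass the list to the same call as above:
# completion = client.chat.completions.create(
#     messages=messages, model="claude-3-5-sonnet"
# )
```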

Next Steps

The complete list of features supported in the SDK is available at the link below.

Obiguard SDK Client

Learn more about the Obiguard SDK Client