With Obiguard, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through a virtual key system.

Provider Slug: azure-openai

Obiguard SDK Integration with Azure OpenAI

Obiguard provides a consistent API to interact with models from various providers. To integrate Azure OpenAI with Obiguard:

First, add your Azure details to Obiguard’s Virtual Keys

Here’s a step-by-step guide:

  1. Request access to Azure OpenAI here.
  2. Create a resource in the Azure portal here. (This will be your Resource Name)
  3. Deploy a model in Azure OpenAI Studio here. (This will be your Deployment Name)
  4. Select your Foundation Model from the dropdown in the modal.
  5. Now, in Azure OpenAI Studio, open any playground (chat or completions) and click “View code”. Note down the API version & API key shown there. (These will be your Azure API Version & Azure API Key)

When you input these details, the foundation model will be auto-populated. More details in this guide.

If you do not want to add your Azure details to Obiguard, you can also directly pass them while instantiating the Obiguard client. More on that here.

Now, let’s make a request using this virtual key!

1. Install the Obiguard SDK

Add the Obiguard SDK to your application to interact with Azure OpenAI’s API through Obiguard’s gateway.

pip install obiguard

2. Initialize Obiguard with the Virtual Key

Set up Obiguard with your virtual key as part of the initialization configuration. You can create a virtual key for Azure in the Obiguard UI.

from obiguard import Obiguard

client = Obiguard(
  obiguard_api_key="sk-obg***",  # Your Obiguard API key
  virtual_key="AZURE_VIRTUAL_KEY"   # Replace with your virtual key for Azure
)

3. Invoke Chat Completions with Azure OpenAI

Use the Obiguard instance to send requests to your Azure deployments. You can also override the virtual key directly in the API call if needed.

completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="custom_model_name"
)

print(completion.choices)
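If you work with more than one Azure deployment, a simple pattern is to instantiate one client per virtual key and pick the client per request. This is a minimal sketch using only the constructor shown above; the virtual key and model names are placeholders, not values from your account:

from obiguard import Obiguard

# One client per Azure deployment, each backed by its own virtual key (placeholder names)
gpt4o_client = Obiguard(
    obiguard_api_key="sk-obg***",          # Your Obiguard API key
    virtual_key="AZURE_GPT4O_VIRTUAL_KEY"  # Placeholder: virtual key for a GPT-4o deployment
)

mini_client = Obiguard(
    obiguard_api_key="sk-obg***",
    virtual_key="AZURE_GPT4O_MINI_VIRTUAL_KEY"  # Placeholder: virtual key for a smaller deployment
)

completion = mini_client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="custom_model_name"  # Placeholder: the model name of that Azure deployment
)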

Image Generation

Obiguard supports multiple modalities for Azure OpenAI, and you can make image generation requests through Obiguard’s AI Gateway the same way you make completion calls.

from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="sk-obg***",  # Your Obiguard API key
    virtual_key="DALL-E_VIRTUAL_KEY"   # Referencing a Dall-E Azure deployment with Virtual Key
)

image = client.images.generate(
  prompt="Lucy in the sky with diamonds",
  size="1024x1024"
)
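Assuming the response follows the OpenAI-compatible images schema (an assumption based on Obiguard’s consistent API rather than anything documented here), the generated image can typically be read like this:

# Assumption: OpenAI-compatible image response shape
print(image.data[0].url)  # URL of the generated image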

Obiguard’s fast AI gateway captures information about each request on your Obiguard dashboard. On the Logs screen, you can see this request along with its full request and response payloads.

Log view for an image generation request on Azure OpenAI

More information on image generation is available in the API Reference.


Making Requests Without Virtual Keys

Here’s how you can pass your Azure OpenAI details and secrets directly, without using the Virtual Keys feature.

Key Mapping

In a typical Azure OpenAI request,

curl https://{YOUR_RESOURCE_NAME}.openai.azure.com/openai/deployments/{YOUR_DEPLOYMENT_NAME}/chat/completions?api-version={API_VERSION} \
  -H "Content-Type: application/json" \
  -H "api-key: {YOUR_API_KEY}" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant"
      },
      {
        "role": "user",
        "content": "what is a deer?"
      }
    ]
}'
| Parameter | Node SDK | Python SDK | REST Headers |
| --- | --- | --- | --- |
| AZURE RESOURCE NAME | azureResourceName | azure_resource_name | x-obiguard-azure-resource-name |
| AZURE DEPLOYMENT NAME | azureDeploymentId | azure_deployment_id | x-obiguard-azure-deployment-id |
| API VERSION | azureApiVersion | azure_api_version | x-obiguard-azure-api-version |
| AZURE API KEY | Authorization: "Bearer <API_KEY>" | Authorization = "Bearer <API_KEY>" | Authorization |
| AZURE MODEL NAME | azureModelName | azure_model_name | x-obiguard-azure-model-name |

Example

from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="sk-obg***",  # Your Obiguard API key
    provider="azure-openai",
    azure_resource_name="AZURE_RESOURCE_NAME",
    azure_deployment_id="AZURE_DEPLOYMENT_NAME",
    azure_api_version="AZURE_API_VERSION",
    azure_model_name="AZURE_MODEL_NAME",
    Authorization="Bearer API_KEY"
)
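With the client configured this way, requests follow the same pattern as the virtual-key examples above. A minimal sketch; whether the model argument is still required once azure_model_name is set on the client is an assumption to verify against your deployment:

completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="AZURE_MODEL_NAME"  # Assumption: may be redundant when azure_model_name is set on the client
)

print(completion.choices)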

How to Pass JWT (JSON Web Tokens)

If you have configured fine-grained access for Azure OpenAI and need to use a JSON Web Token (JWT) in the Authorization header instead of the regular API key, you can use the forward_headers parameter to do this.

from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="sk-obg***",  # Your Obiguard API key
    provider="azure-openai",
    azure_resource_name="AZURE_RESOURCE_NAME",
    azure_deployment_id="AZURE_DEPLOYMENT_NAME",
    azure_api_version="AZURE_API_VERSION",
    azure_model_name="AZURE_MODEL_NAME",
    Authorization="Bearer JWT_TOKEN",      # Pass your JWT here
    forward_headers=["Authorization"]      # Forward the Authorization header as-is to Azure
)

For further questions on custom Azure deployments or fine-grained access tokens, reach out to us at [email protected].

Next Steps

The complete list of features supported in the SDK is available at the link below.

SDK