Obiguard provides a robust and secure gateway for integrating various Large Language Models (LLMs), including Mistral AI's APIs, into your applications.

With Obiguard, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through a virtual key system.

Provider slug: `mistral-ai`

Obiguard SDK Integration with Mistral AI Models

Obiguard provides a consistent API to interact with models from various providers. To integrate Mistral AI with Obiguard:

1. Install the Obiguard SDK

Add the Obiguard SDK to your application to interact with Mistral AI’s API through Obiguard’s gateway.

pip install obiguard

2. Initialize Obiguard with the Virtual Key

To use Mistral AI with Obiguard, get your API key from the Mistral AI console, then add it to Obiguard to create a virtual key.

from obiguard import Obiguard

client = Obiguard(
  obiguard_api_key="sk-obg***", # Your Obiguard API key
  virtual_key="VIRTUAL_KEY" # Replace with your virtual key for Mistral AI
)

3. Invoke Chat Completions with Mistral AI

Use the Obiguard instance to send requests to Mistral AI. You can also override the virtual key directly in the API call if needed; a sketch of this follows the example below.

You can also call the new Codestral model here!

completion = client.chat.completions.create(
  messages=[{"role": "user", "content": "Say this is a test"}],
  model="codestral-latest"
)

print(completion)
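As noted above, the virtual key can also be overridden on a single request. A minimal sketch, assuming the create call accepts a virtual_key keyword argument (the exact per-request override mechanism may differ in your SDK version):

completion = client.chat.completions.create(
  messages=[{"role": "user", "content": "Say this is a test"}],
  model="codestral-latest",
  virtual_key="ANOTHER_MISTRAL_VIRTUAL_KEY"  # Assumed per-request override parameter
)

print(completion)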

Invoke Codestral Endpoint

Using Obiguard, you can also call the Mistral API's new Codestral endpoint. Just pass the Codestral URL https://codestral.mistral.ai/v1 via the custom_host parameter.

from obiguard import Obiguard

client = Obiguard(
  obiguard_api_key="sk-obg***", # Your Obiguard API key
  virtual_key="MISTRAL_VIRTUAL_KEY",
  custom_host="https://codestral.mistral.ai/v1"
)

code_completion = client.chat.completions.create(
  model="codestral-latest",
  messages=[{"role": "user", "content": "Write a minimalist Python code to validate the proof for the special number 1729"}]
)

print(code_completion.choices[0].message.content)

Mistral Tool Calling

The tool calling feature lets models trigger external tools based on conversation context. You define the available functions, the model decides when to use them, and your application executes them and returns the results.

Obiguard supports Mistral Tool Calling and makes it interoperable across multiple providers.

Get Weather Tool
tools = [{
  "type": "function",
  "function": {
    "name": "getWeather",
    "description": "Get the current weather",
    "parameters": {
      "type": "object",
      "properties": {
        "location": {"type": "string", "description": "City and state"},
        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
      },
      "required": ["location"]
    }
  }
}]
response = client.chat.completions.create(
  model="your_mistral_model_name",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather like in Delhi - respond in JSON"}
  ],
  tools=tools,
  tool_choice="auto"
)

print(response.choices[0].finish_reason)
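When the model opts to use the tool, finish_reason is tool_calls and the call details are returned on the message. Below is a minimal sketch of the execute-and-return step described above, assuming an OpenAI-compatible response shape; get_weather is a hypothetical local implementation, and the model name is a placeholder.

import json

# Hypothetical local implementation backing the getWeather tool
def get_weather(location, unit="celsius"):
  return json.dumps({"location": location, "temperature": 30, "unit": unit})

message = response.choices[0].message

if message.tool_calls:
  tool_call = message.tool_calls[0]
  args = json.loads(tool_call.function.arguments)
  result = get_weather(**args)

  # Send the tool result back so the model can compose the final answer
  follow_up = client.chat.completions.create(
    model="your_mistral_model_name",
    messages=[
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What's the weather like in Delhi - respond in JSON"},
      message,  # Assistant message carrying the tool call
      {"role": "tool", "tool_call_id": tool_call.id, "name": "getWeather", "content": result}
    ],
    tools=tools
  )
  print(follow_up.choices[0].message.content)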

Next Steps

The complete list of features supported in the SDK is available at the link below.

Obiguard SDK Client

Learn more about the Obiguard SDK Client