Obiguard provides a robust and secure gateway that makes it easy to integrate various Large Language Models (LLMs), including OpenRouter, into your applications.
With Obiguard, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through a virtual key system.
Provider slug: openrouter
Obiguard SDK Integration with OpenRouter Models
Obiguard provides a consistent API to interact with models from various providers. To integrate OpenRouter with Obiguard:
1. Install the Obiguard SDK
Add the Obiguard SDK to your application to interact with OpenRouter's API through Obiguard's gateway.
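If you are working in Python, the SDK can typically be installed from PyPI; the package name below is assumed from the from obiguard import Obiguard statement used in the next step.

# Assumed PyPI package name, matching the import used below
pip install obiguard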
2. Initialize Obiguard with the Virtual Key
To use OpenRouter with Obiguard, get your OpenRouter API key, then add it to Obiguard to create a virtual key.
from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="vk-obg***",  # Your Obiguard virtual key
)
3. Invoke Chat Completions with OpenRouter
Use the Obiguard instance to send requests to OpenRouter. You can also override the virtual key directly in the API call if needed (one approach is sketched after the example below).
completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="mistralai/mistral-medium",  # OpenRouter model slugs are provider-prefixed
)
print(completion)
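If particular requests need a different virtual key, one approach that relies only on the constructor shown above is to create a separate client for them. The snippet below is a sketch of that pattern; whether the SDK also accepts a per-request override keyword is not covered here.

# A minimal sketch: a second client bound to another Obiguard virtual key,
# used only for the calls that need it.
other_client = Obiguard(
    obiguard_api_key="vk-obg-other***",  # a different Obiguard virtual key
)

completion = other_client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="mistralai/mistral-medium",
)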
The tool calling feature lets models trigger external tools based on conversation context.
You define the available functions, the model chooses when to call them, and your application executes them and returns the results; a sketch of that round trip follows the Python example below.
Obiguard supports OpenRouter tool calling and makes it interoperable across multiple providers.
tools = [{
    "type": "function",
    "function": {
        "name": "getWeather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["location"]
        }
    }
}]

response = client.chat.completions.create(
    model="openai/gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What's the weather like in Delhi - respond in JSON"}
    ],
    tools=tools,
    tool_choice="auto"
)

print(response.choices[0].finish_reason)
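As noted above, your application executes the chosen tool and returns the result to the model. Below is a minimal sketch of that loop; it assumes the OpenAI-compatible response shape (choices[0].message.tool_calls) and uses a hypothetical local get_weather function in place of a real weather lookup.

import json

# Hypothetical local implementation of the getWeather tool declared above.
def get_weather(location, unit="celsius"):
    return {"location": location, "temperature": 30, "unit": unit}

message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    args = json.loads(call.function.arguments)  # arguments arrive as a JSON string
    result = get_weather(**args)

    # Return the tool result so the model can produce its final answer.
    followup = client.chat.completions.create(
        model="openai/gpt-4o",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "What's the weather like in Delhi - respond in JSON"},
            {
                "role": "assistant",
                "tool_calls": [{
                    "id": call.id,
                    "type": "function",
                    "function": {"name": call.function.name, "arguments": call.function.arguments},
                }],
            },
            {"role": "tool", "tool_call_id": call.id, "content": json.dumps(result)},
        ],
        tools=tools,
    )
    print(followup.choices[0].message.content)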
curl -X POST "https://gateway.obiguard.ai/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "x-obiguard-api-key: $OBIGUARD_API_KEY" \
  -d '{
    "model": "openai/gpt-4o",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What'\''s the weather like in Delhi - respond in JSON"}
    ],
    "tools": [{
      "type": "function",
      "function": {
        "name": "getWeather",
        "description": "Get the current weather",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string", "description": "City and state"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
          },
          "required": ["location"]
        }
      }
    }],
    "tool_choice": "auto"
  }'
Next Steps
The complete list of features supported in the SDK is available at the link below.
Obiguard SDK Client: Learn more about the Obiguard SDK Client.