Obiguard provides a robust and secure gateway to facilitate the integration of various Large Language Models (LLMs) into your applications, including Anthropic’s Claude APIs.
With Obiguard, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through a virtual key system.
Use the Obiguard instance to send requests to Anthropic. You can also override the virtual key directly in the API call if needed.
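If you have not yet created the client, a minimal initialization looks like the sketch below. It mirrors the Obiguard client setup used in the later examples on this page; replace OBIGUARD_API_KEY with your own key.

from obiguard import Obiguard

# Initialize the Obiguard client with your Obiguard API key
client = Obiguard(
    obiguard_api_key="OBIGUARD_API_KEY",  # Your Obiguard API key
)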
chat_completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="claude-3-opus-20240229",
    max_tokens=250  # Required field for Anthropic
)

print(chat_completion.choices[0].message.content)
Obiguard makes Anthropic models interoperable with the OpenAI schema and SDK methods.
So, instead of passing the system prompt separately, you can pass it as part of the messages body, just as you would with OpenAI:
completion = client.chat.completions.create(
    messages=[
        {"role": "system", "content": "Your system prompt"},
        {"role": "user", "content": "Say this is a test"}
    ],
    model="claude-3-opus-20240229",
    max_tokens=250  # Required field for Anthropic
)

print(completion.choices)
Obiguard’s multimodal Gateway fully supports Anthropic’s vision models claude-3-sonnet, claude-3-haiku, claude-3-opus, and the latest claude-3.5-sonnet.
Obiguard follows the OpenAI schema, which means you can send your image data to Anthropic in the same format as OpenAI.
Anthropic ONLY accepts base64-encoded images. Unlike OpenAI, it does not support image URLs.
With Obiguard, you can use the same format to send base64-encoded images to both Anthropic and OpenAI models.
Here’s an example using Anthropic’s claude-3.5-sonnet model:
import base64
import httpx
from obiguard import Obiguard

# Fetch and encode the image
image_url = "https://upload.wikimedia.org/wikipedia/commons/a/a7/Camponotus_flavomarginatus_ant.jpg"
image_data = base64.b64encode(httpx.get(image_url).content).decode("utf-8")

client = Obiguard(
    obiguard_api_key="OBIGUARD_API_KEY",  # Your Obiguard API key
)

# Create the request
response = client.chat.completions.create(
    model="claude-3-5-sonnet-20240620",
    messages=[
        {
            "role": "system",
            "content": "You are a helpful assistant, who describes images"
        },
        {
            "role": "user",
            "content": [
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_data}"}
                }
            ]
        }
    ],
    max_tokens=1400,
)

print(response)
On completion, the request will get logged in Obiguard where any image inputs or outputs can be viewed. Obiguard will automatically render the base64 images to help you debug any issues quickly.
Anthropic Claude can now process PDFs to extract text, analyze charts, and understand visual content from documents.
With Obiguard, you can seamlessly integrate this capability into your applications using the familiar OpenAI-compatible API schema.
PDF support is available on the following Claude models:
Claude 3.7 Sonnet (claude-3-7-sonnet-20250219)
Claude 3.5 Sonnet (claude-3-5-sonnet-20241022, claude-3-5-sonnet-20240620)
Claude 3.5 Haiku (claude-3-5-haiku-20241022)
When using PDF support with Obiguard, be aware of its current limitation: Obiguard supports PDF processing via base64-encoded PDF documents, following the same pattern as image handling in Claude’s multimodal capabilities.
Python
from obiguard import Obiguard
import base64
import httpx

client = Obiguard(
    obiguard_api_key="OBIGUARD_API_KEY",  # Your Obiguard API key
)

# Fetch and encode the PDF
pdf_url = "https://assets.anthropic.com/m/1cd9d098ac3e6467/original/Claude-3-Model-Card-October-Addendum.pdf"
pdf_data = "data:application/pdf;base64," + base64.standard_b64encode(httpx.get(pdf_url).content).decode("utf-8")

# Alternative: Load from a local file
# with open("document.pdf", "rb") as f:
#     pdf_data = "data:application/pdf;base64," + base64.standard_b64encode(f.read()).decode("utf-8")

# Create the request
response = client.chat.completions.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1024,
    messages=[
        {
            "role": "system",
            "content": "You are a helpful document analysis assistant."
        },
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "What are the key findings in this document?"
                },
                {
                    "type": "file",
                    "file": {
                        # "file_url": "https://pdfobject.com/pdf/sample.pdf",  # if you want to pass a PDF file from a URL
                        "mime_type": "application/pdf",
                        "file_data": pdf_data
                    }
                }
                # {  # if you want to pass a plain text file
                #     "type": "file",
                #     "file": {
                #         "mime_type": "text/plain",
                #         "file_data": "This is a plain text file"
                #     }
                # }
            ]
        }
    ]
)

print(response.choices[0].message.content)
cURL
# First, encode your PDF to base64 (this example uses a command line approach)
# For example using curl + base64:
PDF_BASE64=$(curl -s "https://assets.anthropic.com/m/1cd9d098ac3e6467/original/Claude-3-Model-Card-October-Addendum.pdf" | base64)

# Alternatively, from a local file:
# PDF_BASE64=$(base64 -i document.pdf)

# Then make the API call with the base64-encoded PDF
# (replace BASE64_PDF_DATA in the payload with the encoded content, e.g. the value of $PDF_BASE64)
# To send a plain text file instead, use "mime_type": "text/plain" with the text as "file_data"
curl "https://gateway.obiguard.ai/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "x-obiguard-api-key: $OBIGUARD_API_KEY" \
  -H "x-obiguard-provider: anthropic" \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -d '{
    "model": "claude-3-5-sonnet-20240620",
    "max_tokens": 1024,
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful document analysis assistant."
      },
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What are the key findings in this document?"
          },
          {
            "type": "file",
            "file": {
              "mime_type": "application/pdf",
              "file_data": "BASE64_PDF_DATA"
            }
          }
        ]
      }
    ]
  }'
Obiguard also works with Anthropic’s prompt caching feature, helping you save time and money on your Anthropic requests. Refer to the prompt caching guide to learn how to enable it.
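For illustration only, a cached system prompt could look like the sketch below. This assumes Obiguard forwards Anthropic’s cache_control field on content parts unchanged; confirm the exact format in the prompt caching guide before relying on it.

# Hypothetical sketch: mark a large, reusable system prompt as cacheable.
# Assumes the gateway passes Anthropic's cache_control marker through as-is.
response = client.chat.completions.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=500,
    messages=[
        {
            "role": "system",
            "content": [
                {
                    "type": "text",
                    "text": "You are a support assistant. <large, reusable knowledge base goes here>",
                    "cache_control": {"type": "ephemeral"}  # Anthropic prompt caching marker
                }
            ]
        },
        {"role": "user", "content": "How do I reset my password?"}
    ]
)
print(response.choices[0].message.content)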
Models like claude-3-7-sonnet-latest support extended thinking.
This is similar to OpenAI’s reasoning models, but you also get the model’s reasoning as it processes the request.
The assistant’s thinking response is returned in the response_chunk.choices[0].delta.content_blocks array, not in the response.choices[0].message.content string.
Python
from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="OBIGUARD_API_KEY",  # Your Obiguard API key
)

# Create the request
response = client.chat.completions.create(
    model="claude-3-7-sonnet-latest",
    max_tokens=3000,
    thinking={
        "type": "enabled",
        "budget_tokens": 2030
    },
    stream=False,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "when does the flight from new york to bengaluru land tomorrow, what time, what is its flight number, and what is its baggage belt?"
                }
            ]
        }
    ]
)
print(response)

# In case of streaming responses, parse the response_chunk.choices[0].delta.content_blocks array:
# response = client.chat.completions.create(
#     ...same config as above but with stream=True
# )
# for chunk in response:
#     if chunk.choices[0].delta:
#         content_blocks = chunk.choices[0].delta.get("content_blocks")
#         if content_blocks is not None:
#             for content_block in content_blocks:
#                 print(content_block)
OpenAI Python
from openai import OpenAI
from obiguard import OBIGUARD_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="Anthropic_API_KEY",
    base_url=OBIGUARD_GATEWAY_URL,
    default_headers=createHeaders(
        provider="anthropic",
        obiguard_api_key="OBIGUARD_API_KEY"
    )
)

response = client.chat.completions.create(
    model="claude-3-7-sonnet-latest",
    max_tokens=3000,
    thinking={
        "type": "enabled",
        "budget_tokens": 2030
    },
    stream=False,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "when does the flight from new york to bengaluru land tomorrow, what time, what is its flight number, and what is its baggage belt?"
                }
            ]
        }
    ]
)
print(response)
cURL
curl "https://gateway.oobiguard.ai/v1/chat/completions" \ -H "Content-Type: application/json" \ -H "x-obiguard-api-key: $OBIGUARD_API_KEY" \ -H "x-obiguard-provider: anthropic" \ -H "x-obiguard--key: $ANTHROPIC_API_KEY" \ -H "x-obiguard-strict-open-ai-compliance: false" \ -d '{ "model": "claude-3-7-sonnet-latest", "max_tokens": 3000, "thinking": { "type": "enabled", "budget_tokens": 2030 }, "stream": false, "messages": [ { "role": "user", "content": [ { "type": "text", "text": "when does the flight from new york to bengaluru land tomorrow, what time, what is its flight number, and what is its baggage belt?" } ] } ] }'
To continue the conversation in a later turn, pass the earlier thinking block (including its signature) back in the assistant message, as shown below:
Python
from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="OBIGUARD_API_KEY",  # Your Obiguard API key
    strict_open_ai_compliance=False
)

response = client.chat.completions.create(
    model="claude-3-7-sonnet-latest",
    max_tokens=3000,
    thinking={
        "type": "enabled",
        "budget_tokens": 2030
    },
    stream=False,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "when does the flight from baroda to bangalore land tomorrow, what time, what is its flight number, and what is its baggage belt?"
                }
            ]
        },
        {
            "role": "assistant",
            "content": [
                {
                    "type": "thinking",
                    "thinking": "The user is asking several questions about a flight from Baroda (also known as Vadodara) to Bangalore:\n1. When does the flight land tomorrow\n2. What time does it land\n3. What is the flight number\n4. What is the baggage belt number at the arrival airport\n\nTo properly answer these questions, I would need access to airline flight schedules and airport information systems. However, I don't have:\n- Real-time or scheduled flight information\n- Access to airport baggage claim allocation systems\n- Information about specific flights between these cities\n- The ability to look up tomorrow's specific flight schedules\n\nThis question requires current, specific flight information that I don't have access to. Instead of guessing or providing potentially incorrect information, I should explain this limitation and suggest ways the user could find this information.",
                    "signature": "EqoBCkgIARABGAIiQBVA7FBNLRtWarDSy9TAjwtOpcTSYHJ+2GYEoaorq3V+d3eapde04bvEfykD/66xZXjJ5yyqogJ8DEkNMotspRsSDKzuUJ9FKhSNt/3PdxoMaFZuH+1z1aLF8OeQIjCrA1+T2lsErrbgrve6eDWeMvP+1sqVqv/JcIn1jOmuzrPi2tNz5M0oqkOO9txJf7QqEPPw6RG3JLO2h7nV1BMN6wE="
                }
            ]
        },
        {
            "role": "user",
            "content": "thanks that's good to know, how about to chennai?"
        }
    ]
)

print(response)
OpenAI Python
from openai import OpenAI
from obiguard import OBIGUARD_GATEWAY_URL, createHeaders

openai = OpenAI(
    api_key="Anthropic_API_KEY",
    base_url=OBIGUARD_GATEWAY_URL,
    default_headers=createHeaders(
        provider="anthropic",
        obiguard_api_key="OBIGUARD_API_KEY",
        strict_open_ai_compliance=False
    )
)

response = openai.chat.completions.create(
    model="claude-3-7-sonnet-latest",
    max_tokens=3000,
    thinking={
        "type": "enabled",
        "budget_tokens": 2030
    },
    stream=False,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "when does the flight from baroda to bangalore land tomorrow, what time, what is its flight number, and what is its baggage belt?"
                }
            ]
        },
        {
            "role": "assistant",
            "content": [
                {
                    "type": "thinking",
                    "thinking": "The user is asking several questions about a flight from Baroda (also known as Vadodara) to Bangalore:\n1. When does the flight land tomorrow\n2. What time does it land\n3. What is the flight number\n4. What is the baggage belt number at the arrival airport\n\nTo properly answer these questions, I would need access to airline flight schedules and airport information systems. However, I don't have:\n- Real-time or scheduled flight information\n- Access to airport baggage claim allocation systems\n- Information about specific flights between these cities\n- The ability to look up tomorrow's specific flight schedules\n\nThis question requires current, specific flight information that I don't have access to. Instead of guessing or providing potentially incorrect information, I should explain this limitation and suggest ways the user could find this information.",
                    "signature": "EqoBCkgIARABGAIiQBVA7FBNLRtWarDSy9TAjwtOpcTSYHJ+2GYEoaorq3V+d3eapde04bvEfykD/66xZXjJ5yyqogJ8DEkNMotspRsSDKzuUJ9FKhSNt/3PdxoMaFZuH+1z1aLF8OeQIjCrA1+T2lsErrbgrve6eDWeMvP+1sqVqv/JcIn1jOmuzrPi2tNz5M0oqkOO9txJf7QqEPPw6RG3JLO2h7nV1BMN6wE="
                }
            ]
        },
        {
            "role": "user",
            "content": "thanks that's good to know, how about to chennai?"
        }
    ]
)

print(response)
cURL
curl "https://gateway.obiguard.ai/v1/chat/completions" \ -H "Content-Type: application/json" \ -H "x-obiguard-api-key: $OBIGUARD_API_KEY" \ -H "x-obiguard-provider: anthropic" \ -H "x-obiguard-api-key: $ANTHROPIC_API_KEY" \ -H "x-obiguard-strict-open-ai-compliance: false" \ -d '{ "model": "claude-3-7-sonnet-latest", "max_tokens": 3000, "thinking": { "type": "enabled", "budget_tokens": 2030 }, "stream": false, "messages": [ { "role": "user", "content": [ { "type": "text", "text": "when does the flight from baroda to bangalore land tomorrow, what time, what is its flight number, and what is its baggage belt?" } ] }, { "role": "assistant", "content": [ { "type": "thinking", "thinking": "The user is asking several questions about a flight from Baroda (also known as Vadodara) to Bangalore:\n1. When does the flight land tomorrow\n2. What time does it land\n3. What is the flight number\n4. What is the baggage belt number at the arrival airport\n\nTo properly answer these questions, I would need access to airline flight schedules and airport information systems. However, I don't have:\n- Real-time or scheduled flight information\n- Access to airport baggage claim allocation systems\n- Information about specific flights between these cities\n- The ability to look up tomorrow's specific flight schedules\n\nThis question requires current, specific flight information that I don't have access to. Instead of guessing or providing potentially incorrect information, I should explain this limitation and suggest ways the user could find this information.", "signature": "EqoBCkgIARABGAIiQBVA7FBNLRtWarDSy9TAjwtOpcTSYHJ+2GYEoaorq3V+d3eapde04bvEfykD/66xZXjJ5yyqogJ8DEkNMotspRsSDKzuUJ9FKhSNt/3PdxoMaFZuH+1z1aLF8OeQIjCrA1+T2lsErrbgrve6eDWeMvP+1sqVqv/JcIn1jOmuzrPi2tNz5M0oqkOO9txJf7QqEPPw6RG3JLO2h7nV1BMN6wE=" } ] }, { "role": "user", "content": "thanks that's good to know, how about to chennai?" } ] }'
Extended thinking API through Obiguard is currently in beta.
Obiguard enables support for Anthropic’s beta features by including the feature name in the header: "x-obiguard-anthropic-beta": "token-efficient-tools-2025-02-19"
For SDKs, you can specify this during client initialization or use createHeaders with the OpenAI SDK:
Python
from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="OBIGUARD_API_KEY",  # Your Obiguard API key
    anthropic_beta="token-efficient-tools-2025-02-19",
    strict_open_ai_compliance=False
)
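With the OpenAI SDK, one way to send the same beta feature is to merge the documented x-obiguard-anthropic-beta header into the headers produced by createHeaders. The sketch below assumes createHeaders returns a plain dict of headers; whether createHeaders also accepts a dedicated anthropic_beta parameter is not confirmed here.

from openai import OpenAI
from obiguard import OBIGUARD_GATEWAY_URL, createHeaders

# Sketch: add the beta feature header alongside the standard Obiguard headers
client = OpenAI(
    api_key="Anthropic_API_KEY",
    base_url=OBIGUARD_GATEWAY_URL,
    default_headers={
        **createHeaders(
            provider="anthropic",
            obiguard_api_key="OBIGUARD_API_KEY",
            strict_open_ai_compliance=False
        ),
        "x-obiguard-anthropic-beta": "token-efficient-tools-2025-02-19",
    },
)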