Integrate Obiguard with your agents in just two lines of code

LangChain
from langchain_openai import ChatOpenAI
from obiguard import OBIGUARD_GATEWAY_URL, createHeaders

llm = ChatOpenAI(
    api_key="OpenAI_API_Key",  # your OpenAI API key
    base_url=OBIGUARD_GATEWAY_URL,  # route requests through the Obiguard gateway
    default_headers=createHeaders(
        provider="openai",  # choose your provider
        obiguard_api_key="sk-obg******",  # your Obiguard API key
    )
)

Key Production Features

Routing your agent’s requests through Obiguard makes them production-grade with the following features.

1. Interoperability

Switch between LLM providers without rewriting your agent. Call models from Anthropic, Gemini, Mistral, Azure OpenAI, Google Vertex AI, AWS Bedrock, and more by changing only the provider and Obiguard API key in the LLM object, as shown below.
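For example, pointing the same LangChain object at Anthropic is a two-line change. This is a minimal sketch; the provider value and Claude model name are illustrative, so check the Obiguard provider list for the exact values:

from langchain_openai import ChatOpenAI
from obiguard import OBIGUARD_GATEWAY_URL, createHeaders

# Same gateway URL; only the provider, key, and model change
llm = ChatOpenAI(
    api_key="Anthropic_API_Key",  # your Anthropic API key
    model="claude-3-5-sonnet-20241022",  # illustrative model name
    base_url=OBIGUARD_GATEWAY_URL,
    default_headers=createHeaders(
        provider="anthropic",  # assumed provider slug
        obiguard_api_key="sk-obg******",  # your Obiguard API key
    )
)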

2. Observability

Obiguard automatically logs key details about your agent runs, including cost, tokens used, and response time. For agent-level observability, add a Trace ID to the request headers of each agent; you can then filter analytics by Trace ID for deeper monitoring and analysis, as sketched below.
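A minimal sketch of attaching a Trace ID per agent run. The trace_id parameter is an assumption based on similar gateway SDKs; confirm the exact argument name in the Obiguard reference:

import uuid

from langchain_openai import ChatOpenAI
from obiguard import OBIGUARD_GATEWAY_URL, createHeaders

trace_id = str(uuid.uuid4())  # one ID per agent run

llm = ChatOpenAI(
    api_key="OpenAI_API_Key",
    base_url=OBIGUARD_GATEWAY_URL,
    default_headers=createHeaders(
        provider="openai",
        obiguard_api_key="sk-obg******",
        trace_id=trace_id,  # assumed parameter; groups all calls from this run
    )
)

Every LLM call made through this object now carries the same Trace ID, so the whole run appears as a single trace in your analytics.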

3. Logs

Access a dedicated logs section to view records of action executions, including parameters, outcomes, and errors. Filter the logs of an agent run by trace ID, model, tokens used, metadata, and more; attaching metadata to requests, as sketched below, gives you custom dimensions to filter on.
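A minimal sketch of tagging requests with metadata for log filtering. The metadata parameter is an assumption modeled on similar gateway SDKs, and the keys shown are arbitrary examples:

from langchain_openai import ChatOpenAI
from obiguard import OBIGUARD_GATEWAY_URL, createHeaders

llm = ChatOpenAI(
    api_key="OpenAI_API_Key",
    base_url=OBIGUARD_GATEWAY_URL,
    default_headers=createHeaders(
        provider="openai",
        obiguard_api_key="sk-obg******",
        metadata={"agent": "research-agent", "env": "prod"},  # assumed parameter; example keys
    )
)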