Instead of building and maintaining separate integrations for each multimodal LLM, you can use a single, unified API to access models from OpenAI, Anthropic, Meta, Cohere, Mistral, and over 100 models across 15+ providers.

Documentation Index
Fetch the complete documentation index at: https://docs.obiguard.ai/llms.txt
Use this file to discover all available pages before exploring further.
Obiguard Uses the OpenAI API Standard
The Obiguard API leverages its robust open-source AI Gateway to translate all incoming requests into the OpenAI API format, ensuring responses are fully OpenAI-compatible.

Seamlessly Switch Between Providers
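Because every request is normalized to the OpenAI chat-completions format, switching providers amounts to changing a routing field while the request body stays identical. The sketch below illustrates this with the standard library only; the gateway base URL and the `x-obiguard-provider` header name are assumptions for illustration, so check the Obiguard reference for the exact values your gateway version expects.

```python
import json
from urllib import request

# Hypothetical gateway base URL -- replace with your deployment's URL.
GATEWAY_URL = "https://api.obiguard.ai/v1"

def build_chat_request(provider: str, model: str, prompt: str) -> request.Request:
    """Build an OpenAI-format chat request routed to `provider`.

    The `x-obiguard-provider` header name is an assumption used here
    to show that only the routing metadata changes between providers.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return request.Request(
        f"{GATEWAY_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "x-obiguard-provider": provider,  # the only field that changes
        },
        method="POST",
    )

# Same request shape for two different providers:
openai_req = build_chat_request("openai", "gpt-4o", "Hello")
anthropic_req = build_chat_request("anthropic", "claude-3-5-sonnet", "Hello")
```

The point of the sketch is the symmetry: both requests share one body schema and one endpoint, so no provider-specific client code is needed.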
Integrating Local or Private Models
Obiguard can connect to and monitor your locally or privately hosted LLMs, provided the model is compatible with one of the 15+ providers supported by Obiguard and its URL is accessible. Just set the custom_host parameter along with the provider name, and Obiguard will manage communication with your local model.
When using custom_host, make sure to include the version segment (e.g., /v1) in the URL. Obiguard will automatically append the correct endpoint path (/chat/completions, /completions, or /embeddings).

For Ollama models, this process is different. See the documentation for details.
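The URL rule above can be made concrete with a small sketch: you supply the base URL up to and including the version segment, and the gateway appends the endpoint path itself. The helper function below is illustrative, not part of the Obiguard SDK, and the localhost URL is an assumed example.

```python
def resolve_endpoint(custom_host: str, endpoint: str) -> str:
    """Join a custom_host base (which must already include the version
    segment, e.g. /v1) with the endpoint path the gateway selects.

    Illustrative helper only -- the gateway performs this join internally.
    """
    return custom_host.rstrip("/") + endpoint

# A locally hosted OpenAI-compatible model (hypothetical URL):
url = resolve_endpoint("http://localhost:8080/v1", "/chat/completions")
# Forgetting /v1 would produce http://localhost:8080/chat/completions,
# which an OpenAI-compatible server would reject.
```

This is why omitting the version segment is the most common misconfiguration: the gateway adds only the endpoint path, never the /v1 prefix.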
