Universal API
Obiguard Universal API provides a unified interface for accessing multiple LLM providers, simplifying integration and management.
Instead of building and maintaining a separate integration for each provider, you can use a single, unified API to access models from OpenAI, Anthropic, Meta, Cohere, Mistral, and others: more than 100 models across 15+ providers.
Obiguard Uses the OpenAI API Standard
The Obiguard API leverages its robust open-source AI Gateway to translate all incoming requests into the OpenAI API format, ensuring responses are fully OpenAI-compatible.
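Because every request is translated into the OpenAI format, the same request body works no matter which model sits behind the gateway. A minimal sketch of an OpenAI-format chat-completion payload (the model slug is illustrative; any provider's model works with the same shape):

```python
import json

# A standard OpenAI-format chat completion request. Obiguard translates
# every provider into this format, so the payload shape is identical
# regardless of the underlying model.
payload = {
    "model": "gpt-4o",  # illustrative model slug
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

# The body is plain JSON, so it can be sent with any HTTP client.
body = json.dumps(payload)
```

The response you get back follows the OpenAI schema as well, so existing OpenAI-compatible tooling can parse it unchanged.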
Seamlessly Switch Between Providers
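Because the request format is shared, switching providers usually means changing only the routing metadata, not the message payload. A sketch, assuming a hypothetical `provider` field (consult the Obiguard docs for the actual field or header name):

```python
def make_request(provider: str, model: str, prompt: str) -> dict:
    # The message body stays in the OpenAI format for every provider;
    # only the routing metadata changes. The "provider" key here is
    # illustrative, not Obiguard's real field name.
    return {
        "provider": provider,
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

openai_req = make_request("openai", "gpt-4o", "Hi")
anthropic_req = make_request("anthropic", "claude-3-5-sonnet-20240620", "Hi")
```

The two requests differ only in `provider` and `model`; the `messages` payload is byte-for-byte identical, which is what makes provider switching a one-line change.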
Integrating Local or Private Models
Obiguard can connect to and monitor your locally or privately hosted LLMs, provided the model is compatible with one of the 15+ providers supported by Obiguard and its URL is accessible.
Just set the `custom_host` parameter along with the provider name, and Obiguard will manage communication with your local model.

When using `custom_host`, make sure to include the version segment (e.g., `/v1`) in the URL. Obiguard will automatically append the correct endpoint path (`/chat/completions`, `/completions`, or `/embeddings`).
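The URL behavior described above can be sketched as a small helper. This is an illustration of how the base URL and endpoint path combine, not Obiguard's internal code:

```python
def build_url(custom_host: str, endpoint: str) -> str:
    # custom_host must already include the version segment (e.g. /v1);
    # the gateway appends the endpoint path itself.
    return custom_host.rstrip("/") + endpoint

# e.g. a locally hosted OpenAI-compatible server on port 8080
url = build_url("http://localhost:8080/v1", "/chat/completions")
```

Here `url` resolves to `http://localhost:8080/v1/chat/completions`; if you omit `/v1` from `custom_host`, the resulting URL will be missing the version segment and the request will fail.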
Ollama models are handled differently; see the documentation for details.
Multimodality
Obiguard offers seamless integration with multimodal models using the same unified API, enabling support for vision, audio, image generation, and additional advanced capabilities across multiple providers.
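Multimodal requests use the same OpenAI-format payload, with message content expressed as a list of typed parts. A sketch of a vision request (model name and image URL are illustrative):

```python
# An OpenAI-format vision request: text and an image URL combined in a
# single user message. Any vision-capable model behind the gateway
# accepts this same shape.
vision_payload = {
    "model": "gpt-4o",  # illustrative vision-capable model
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/photo.png"},
                },
            ],
        }
    ],
}
```

Audio and image-generation requests follow the same pattern: the provider-specific details are handled by the gateway, while your payload stays in the unified format.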
Multimodal Capabilities