Obiguard Uses the OpenAI API Standard
The Obiguard API leverages its robust open-source AI Gateway to translate all incoming requests into the OpenAI API format, ensuring responses are fully OpenAI-compatible.

Seamlessly Switch Between Providers
Integrating Local or Private Models
Obiguard can connect to and monitor your locally or privately hosted LLMs, provided the model is compatible with one of the 15+ providers supported by Obiguard and its URL is accessible. Just set the custom_host parameter along with the provider name, and Obiguard will manage communication with your local model.
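As a minimal sketch of the setup above: the snippet below builds the per-request configuration that routes a call through the gateway to a self-hosted model. The header names (x-obiguard-*) and the example host URL are assumptions for illustration only; consult your Obiguard dashboard for the actual values.

```python
# Hypothetical sketch: routing a request to a privately hosted model.
# Header names and values below are illustrative assumptions, not the
# confirmed Obiguard API surface.

def obiguard_headers(api_key: str, provider: str, custom_host: str) -> dict:
    """Build per-request headers pointing Obiguard at a self-hosted model."""
    return {
        "Authorization": f"Bearer {api_key}",
        "x-obiguard-provider": provider,        # e.g. a supported provider name
        "x-obiguard-custom-host": custom_host,  # include the version segment
    }

headers = obiguard_headers(
    api_key="OBIGUARD_API_KEY",
    provider="openai",
    custom_host="http://localhost:8080/v1",  # your private deployment URL
)
```

These headers would then be attached to each request sent through the gateway, for example via the default-headers option of an OpenAI-compatible client.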
When using custom_host, make sure to include the version segment (e.g., /v1) in the URL. Obiguard will automatically append the correct endpoint path (/chat/completions, /completions, or /embeddings).

For Ollama models, this process is different. See the documentation for details.
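To make the URL rule above concrete, here is a small sketch of how the gateway's path resolution behaves: the base URL you supply in custom_host keeps its version segment, and the endpoint path is appended to it. The helper function itself is illustrative, not part of the Obiguard SDK.

```python
# Illustrative sketch of the documented URL behavior: custom_host supplies
# the base (including /v1), and the gateway appends the endpoint path.

def resolve_url(custom_host: str, endpoint: str) -> str:
    """Join the custom_host base URL with an endpoint path."""
    return custom_host.rstrip("/") + endpoint

url = resolve_url("http://localhost:8080/v1", "/chat/completions")
# → "http://localhost:8080/v1/chat/completions"
```

Omitting the /v1 segment from custom_host would produce a base URL the upstream model server does not serve, which is why the version segment must be included.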