# Obiguard

## Docs

- [Create Speech](https://docs.obiguard.ai/api-reference/inference-api/audio/create-speech.md)
- [Create Transcription](https://docs.obiguard.ai/api-reference/inference-api/audio/create-transcription.md)
- [Create Translation](https://docs.obiguard.ai/api-reference/inference-api/audio/create-translation.md)
- [Authentication](https://docs.obiguard.ai/api-reference/inference-api/authentication.md)
- [Chat](https://docs.obiguard.ai/api-reference/inference-api/chat.md)
- [Completions](https://docs.obiguard.ai/api-reference/inference-api/completions.md)
- [Embeddings](https://docs.obiguard.ai/api-reference/inference-api/embeddings.md)
- [Errors](https://docs.obiguard.ai/api-reference/inference-api/error-codes.md)
- [Gateway to Other APIs](https://docs.obiguard.ai/api-reference/inference-api/gateway-for-other-apis.md): Access any custom provider endpoint through the Obiguard API
- [Headers](https://docs.obiguard.ai/api-reference/inference-api/headers.md): Header requirements and options for the Obiguard API
- [Create Image](https://docs.obiguard.ai/api-reference/inference-api/images/create-image.md)
- [Create Image Edit](https://docs.obiguard.ai/api-reference/inference-api/images/create-image-edit.md)
- [Create Image Variation](https://docs.obiguard.ai/api-reference/inference-api/images/create-image-variation.md)
- [Introduction](https://docs.obiguard.ai/api-reference/inference-api/introduction.md): Detailed information about the ways you can access and interact with Obiguard - **a robust AI gateway** designed to simplify and enhance your experience with Large Language Models (LLMs) such as OpenAI's GPT models.
- [Supported Providers](https://docs.obiguard.ai/api-reference/inference-api/supported-providers.md)
- [Supported SDKs](https://docs.obiguard.ai/api-reference/sdk/list.md): Find the best way to use Obiguard in your preferred language.
- [Obiguard Python SDK](https://docs.obiguard.ai/api-reference/sdk/python.md): Official Obiguard Python SDK to help take your AI apps to production
- [Obiguard Features](https://docs.obiguard.ai/features-overview.md): Explore the powerful features of Obiguard.
- [AI Gateway](https://docs.obiguard.ai/gateway/AI-gateway.md): A unified interface to interact with 250+ AI models.
- [Function Calling](https://docs.obiguard.ai/gateway/multimodal-capabilities/function-calling.md): Obiguard’s AI Gateway enables function calling, allowing you to define functions in API requests. Supported models can respond with either plain text or a function name and its parameters.
- [Image Generation](https://docs.obiguard.ai/gateway/multimodal-capabilities/image-generation.md): The Obiguard AI gateway enables image generation features provided by various foundation model vendors.
- [Speech-to-Text](https://docs.obiguard.ai/gateway/multimodal-capabilities/speech-to-text.md): Obiguard's AI gateway enables speech-to-text using models such as OpenAI Whisper.
- [Text-to-Speech](https://docs.obiguard.ai/gateway/multimodal-capabilities/text-to-speech.md): Obiguard’s AI gateway offers text-to-speech support for both `OpenAI` and `Azure OpenAI` models.
- [Thinking Mode](https://docs.obiguard.ai/gateway/multimodal-capabilities/thinking-mode.md): Enable Thinking Mode to enhance your AI's reasoning capabilities.
- [Vision](https://docs.obiguard.ai/gateway/multimodal-capabilities/vision.md): Obiguard’s AI gateway enables integration with leading vision models such as OpenAI’s GPT-4V, Google’s Gemini, and others.
- [Realtime API](https://docs.obiguard.ai/gateway/realtime-api.md): Leverage OpenAI's Realtime API with logging, cost tracking, and additional features!
- [Remote MCP](https://docs.obiguard.ai/gateway/remote-mcp.md): Obiguard's AI gateway supports the remote MCP servers that many foundation model providers offer.
- [Strict OpenAI Compliance](https://docs.obiguard.ai/gateway/strict-open-ai-compliance.md): Obiguard ensures that all responses strictly adhere to the [OpenAI specification](https://platform.openai.com/docs/api-reference/chat/create) by default.
- [Universal API](https://docs.obiguard.ai/gateway/universal-api.md): The Obiguard Universal API provides a unified interface for accessing multiple LLM providers, simplifying integration and management.
- [Guardrail Policies](https://docs.obiguard.ai/guardrail-AI/guardrail-policies.md): Obiguard enables the screening of LLM interactions to address various threats. Validator checks, flagging logic, and strictness levels are all centrally managed through an Obiguard guardrail policy.
- [Guardrail Validators](https://docs.obiguard.ai/guardrail-AI/guardrail-validators.md): Confidently deploy to production with Obiguard guardrails protecting your requests and responses
- [Overview](https://docs.obiguard.ai/integrations/agents.md): Obiguard helps bring your agents to production
- [Autogen](https://docs.obiguard.ai/integrations/agents/autogen.md): Use Obiguard with Autogen to take your AI Agents to production
- [Bring Your Own Agents](https://docs.obiguard.ai/integrations/agents/bring-your-own-agents.md): You can also use Obiguard if you are doing custom agent orchestration!
- [CrewAI](https://docs.obiguard.ai/integrations/agents/crewai.md): Use Obiguard with CrewAI to take your AI Agents to production
- [OpenAI Agents SDK (Python)](https://docs.obiguard.ai/integrations/agents/openai-agents.md): Use Obiguard with the OpenAI Agents SDK to take your AI Agents to production
- [OpenAI Swarm](https://docs.obiguard.ai/integrations/agents/openai-swarm.md): The Obiguard x Swarm integration brings advanced AI gateway capabilities, full-stack observability, and reliability features to build production-ready AI agents.
- [Pydantic AI](https://docs.obiguard.ai/integrations/agents/pydantic-ai.md): Use Obiguard with PydanticAI to take your AI Agents to production
- [Overview](https://docs.obiguard.ai/integrations/ai-apps.md)
- [Integrations](https://docs.obiguard.ai/integrations/ecosystem.md)
- [n8n](https://docs.obiguard.ai/integrations/libraries/n8n.md): Add usage tracking, cost controls, and security guardrails to your n8n workflows
- [Overview](https://docs.obiguard.ai/integrations/llms.md): Obiguard integrates seamlessly with leading LLM providers and orchestration frameworks.
- [AI21](https://docs.obiguard.ai/integrations/llms/ai21.md)
- [Anthropic](https://docs.obiguard.ai/integrations/llms/anthropic.md)
- [Computer use tool](https://docs.obiguard.ai/integrations/llms/anthropic/computer-use.md)
- [Anyscale](https://docs.obiguard.ai/integrations/llms/anyscale-llama2-mistral-zephyr.md): Integrate Anyscale endpoints with Obiguard seamlessly and make your OSS models production-ready
- [AWS SageMaker](https://docs.obiguard.ai/integrations/llms/aws-sagemaker.md): Route to your AWS SageMaker models through Obiguard
- [Azure AI Foundry](https://docs.obiguard.ai/integrations/llms/azure-foundry.md): Learn how to integrate Azure AI Foundry with Obiguard to access a wide range of AI models with enhanced observability and reliability features.
- [Azure OpenAI](https://docs.obiguard.ai/integrations/llms/azure-openai/azure-openai.md): Azure OpenAI is a great alternative for accessing the best models, including GPT-4 and more, in your private environments. Obiguard provides complete support for Azure OpenAI.
- [AWS Bedrock](https://docs.obiguard.ai/integrations/llms/bedrock/aws-bedrock.md)
- [Embeddings](https://docs.obiguard.ai/integrations/llms/bedrock/embeddings.md): Get embeddings from Bedrock
- [Bring Your Own LLM](https://docs.obiguard.ai/integrations/llms/byollm.md): Integrate your privately hosted LLMs with Obiguard for unified management, observability, and reliability.
- [Cerebras](https://docs.obiguard.ai/integrations/llms/cerebras.md)
- [DeepSeek](https://docs.obiguard.ai/integrations/llms/deepseek.md): Obiguard provides a robust and secure gateway to facilitate the integration of various Large Language Models (LLMs) into your applications, including DeepSeek models.
- [Google Gemini](https://docs.obiguard.ai/integrations/llms/gemini.md)
- [GitHub](https://docs.obiguard.ai/integrations/llms/github.md)
- [Groq](https://docs.obiguard.ai/integrations/llms/groq.md)
- [Hugging Face](https://docs.obiguard.ai/integrations/llms/huggingface.md)
- [Lepton AI](https://docs.obiguard.ai/integrations/llms/lepton.md)
- [LocalAI](https://docs.obiguard.ai/integrations/llms/local-ai.md)
- [Mistral AI](https://docs.obiguard.ai/integrations/llms/mistral-ai.md)
- [Nebius](https://docs.obiguard.ai/integrations/llms/nebius.md)
- [Nomic](https://docs.obiguard.ai/integrations/llms/nomic.md)
- [Ollama](https://docs.obiguard.ai/integrations/llms/ollama.md)
- [OpenAI](https://docs.obiguard.ai/integrations/llms/openai.md): Discover how to integrate OpenAI with Obiguard for seamless completions, prompt management, and advanced features like streaming, function calling, and fine-tuning.
- [Files](https://docs.obiguard.ai/integrations/llms/openai/files.md): Upload files to OpenAI
- [Structured Outputs](https://docs.obiguard.ai/integrations/llms/openai/structured-outputs.md): Structured Outputs ensure that the model always follows your supplied [JSON schema](https://json-schema.org/overview/what-is-jsonschema). Obiguard supports OpenAI's Structured Outputs feature out of the box with our SDKs & APIs.
- [OpenRouter](https://docs.obiguard.ai/integrations/llms/openrouter.md)
- [Perplexity AI](https://docs.obiguard.ai/integrations/llms/perplexity-ai.md)
- [Snowflake Cortex](https://docs.obiguard.ai/integrations/llms/snowflake-cortex.md)
- [Stability AI](https://docs.obiguard.ai/integrations/llms/stability-ai.md)
- [Suggest a new integration!](https://docs.obiguard.ai/integrations/llms/suggest-a-new-integration.md)
- [Together AI](https://docs.obiguard.ai/integrations/llms/together-ai.md)
- [Triton](https://docs.obiguard.ai/integrations/llms/triton.md): Integrate Triton-hosted custom models with Obiguard and take them to production
- [Google Vertex AI](https://docs.obiguard.ai/integrations/llms/vertex-ai.md)
- [Controlled Generations](https://docs.obiguard.ai/integrations/llms/vertex-ai/controlled-generations.md): Controlled Generations ensure that the model always follows your supplied [JSON schema](https://json-schema.org/overview/what-is-jsonschema). Obiguard supports Vertex AI's Controlled Generations feature out of the box with our SDKs & APIs.
- [Embeddings](https://docs.obiguard.ai/integrations/llms/vertex-ai/embeddings.md): Get embeddings from Vertex AI
- [Files](https://docs.obiguard.ai/integrations/llms/vertex-ai/files.md): Upload files to Google Cloud Storage for Vertex AI fine-tuning and batch inference
- [vLLM](https://docs.obiguard.ai/integrations/llms/vllm.md): Integrate vLLM-hosted custom models with Obiguard and take them to production
- [xAI](https://docs.obiguard.ai/integrations/llms/x-ai.md): Obiguard supports xAI's chat completions, completions, and embeddings APIs.
- [Milvus](https://docs.obiguard.ai/integrations/vector-databases/milvus.md)
- [Qdrant](https://docs.obiguard.ai/integrations/vector-databases/qdrant.md)
- [Making Your First Request](https://docs.obiguard.ai/making-your-first-request.md): Integrate Obiguard and secure your first LLM call in 2 minutes!
- [Analytics](https://docs.obiguard.ai/observability/analytics.md): Obiguard Analytics provides a powerful dashboard for visualizing and analyzing your LLM request data.
- [Logs](https://docs.obiguard.ai/observability/logs.md): The Logs section displays a time-ordered record of all requests handled by Obiguard.
- [Observability](https://docs.obiguard.ai/observability/overview.md): Monitor performance, analyze metrics, and accelerate troubleshooting with our all-in-one observability platform.
- [Tracing](https://docs.obiguard.ai/observability/tracing.md): Obiguard Tracing lets you track the full lifecycle of your LLM requests in a single, chronological view.
- [Projects](https://docs.obiguard.ai/projects.md): Developers frequently rely on multiple GenAI-powered applications to handle various use cases, with each application comprising several components that utilize LLMs for different tasks. Obiguard is built to seamlessly protect and operate across every part of your application that leverages GenAI for…
- [Common Errors and Resolutions](https://docs.obiguard.ai/support/common-errors-and-resolutions.md): Common errors encountered when using the Obiguard Gateway and their resolutions.
- [Contact Us](https://docs.obiguard.ai/support/contact-us.md): Even with thorough troubleshooting and testing, you might encounter unresolved issues. Contact the Obiguard team below, and we will respond as soon as possible.
- [Virtual Keys](https://docs.obiguard.ai/virtual-keys.md): Obiguard’s virtual key system lets you securely store your LLM API keys in our vault and manage them easily using a unique virtual identifier.
- [What is Obiguard?](https://docs.obiguard.ai/what-is-obiguard.md): Obiguard is a comprehensive platform designed to streamline the security of agentic AI systems for developers and organizations. It serves as a unified interface for interacting with over 250 AI models, offering advanced tools for control, visibility, and security in your agentic AI apps.
- [AI Agent vs Agentic AI](https://docs.obiguard.ai/white-papers/ai-agent-vs-agentic-ai.md): Understanding AI Agents and Agentic AI, and the Role of Obiguard
- [Banking](https://docs.obiguard.ai/white-papers/banking.md): Working Paper: AI-Driven Transaction Governance Using Obiguard in Financial Institutions
- [Cloud](https://docs.obiguard.ai/white-papers/cloud.md): Working Paper: Obiguard AI Governance Platform in Cloud Data Center Infrastructure
- [Cryptocurrencies](https://docs.obiguard.ai/white-papers/cryptocurrencies.md): Working Paper: Leveraging Obiguard for AI Governance in Cryptocurrency Platforms
- [Obiguard for CEOs](https://docs.obiguard.ai/white-papers/for-ceos.md): Building Trusted and Safe Agents: An AI White Paper for CEOs
- [Obiguard White Paper](https://docs.obiguard.ai/white-papers/our-white-paper.md): Obiguard: A Secure, Scalable, and High-Performance Platform for AI Agent Governance

## OpenAPI Specs

- [openapi](https://docs.obiguard.ai/openapi.yaml)