Getting Started
1. Install the required packages (see the sketch after this list).
2. Configure your Autogen configs (see the sketch after this list).
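A minimal sketch covering both steps, assuming the package names `pyautogen` and `obiguard`, an OpenAI-compatible gateway URL (`https://api.obiguard.ai/v1`), and `x-obiguard-*` header names; none of these are confirmed by the Obiguard docs here, so substitute the exact values from your Obiguard dashboard.

```python
# Step 1 (assumed package names):
#   pip install -U pyautogen obiguard

import autogen

# Step 2: route Autogen's OpenAI-compatible client through the Obiguard AI Gateway.
# The gateway URL and header names below are assumptions, not the confirmed API.
config_list = [
    {
        "model": "gpt-4o",
        "api_key": "dummy",  # provider credentials are resolved by Obiguard
        "base_url": "https://api.obiguard.ai/v1",  # assumed gateway URL
        "default_headers": {
            "x-obiguard-api-key": "YOUR_OBIGUARD_API_KEY",  # assumed header name
            "x-obiguard-provider": "openai",                # assumed header name
        },
    }
]

# Standard Autogen agents; only the config above is Obiguard-specific.
assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=0,
    code_execution_config=False,
)
user_proxy.initiate_chat(assistant, message="Say hello.")
```

Because every request flows through the gateway, Obiguard's observability and routing features apply to each agent call without further changes to the agent code.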
Make your agents Production-ready with Obiguard
Obiguard makes your Autogen agents reliable, robust, and production-grade with its observability suite and AI Gateway. Seamlessly integrate 200+ LLMs with your Autogen agents using Obiguard, gain granular insights into agent performance and costs, and continuously optimize your AI operations, all with just two lines of code. Let's dive in and go through each of the use cases.

1. Interoperability
Easily switch between 200+ LLMs. Call various LLMs such as Anthropic, Gemini, Mistral, Azure OpenAI, Google Vertex AI, AWS Bedrock, and many more by simply changing the provider and obiguard_api_key in the ChatOpenAI object.
If you are using OpenAI with Autogen, your code would look like this:
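A sketch of what that OpenAI configuration might look like behind Obiguard; the gateway URL and header names are the same assumptions as in the Getting Started sketch above.

```python
# OpenAI routed through the Obiguard gateway (URL and header names are assumptions).
config_list = [
    {
        "model": "gpt-4o",
        "api_key": "dummy",  # the OpenAI key is attached by Obiguard
        "base_url": "https://api.obiguard.ai/v1",  # assumed gateway URL
        "default_headers": {
            "x-obiguard-api-key": "YOUR_OBIGUARD_API_KEY",  # assumed header name
            "x-obiguard-provider": "openai",                # assumed header name
        },
    }
]
llm_config = {"config_list": config_list}
```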
To switch to Azure as your provider, add your Azure details to the Obiguard vault (here's how) and use Azure OpenAI through virtual keys, as in the sketch below.
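A sketch of the Azure switch, assuming the virtual key is passed through a hypothetical `x-obiguard-virtual-key` header; only the headers change, and the rest of the agent code stays the same.

```python
# Azure OpenAI via an Obiguard virtual key (header names are assumptions).
config_list = [
    {
        "model": "gpt-4o",
        "api_key": "dummy",  # Azure credentials live in the Obiguard vault
        "base_url": "https://api.obiguard.ai/v1",  # assumed gateway URL
        "default_headers": {
            "x-obiguard-api-key": "YOUR_OBIGUARD_API_KEY",       # assumed header name
            "x-obiguard-virtual-key": "YOUR_AZURE_VIRTUAL_KEY",  # assumed header name
        },
    }
]
llm_config = {"config_list": config_list}
```

Because the provider is resolved inside the gateway, swapping Azure for Bedrock, Vertex AI, or any other supported provider is just a matter of pointing to a different virtual key.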