Obiguard provides a robust and secure gateway for integrating various Large Language Models (LLMs), including Lepton AI APIs, into your applications. With Obiguard, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while keeping your LLM API keys secure through a virtual key system.

Documentation Index
Fetch the complete documentation index at: https://docs.obiguard.ai/llms.txt
Use this file to discover all available pages before exploring further.
Provider Slug

lepton

Obiguard SDK Integration with Lepton AI Models
Obiguard provides a consistent API to interact with models from various providers. To integrate Lepton AI with Obiguard:

1. Install the Obiguard SDK
Add the Obiguard SDK to your application to interact with Lepton AI’s API through Obiguard’s gateway.

- Python SDK
2. Initialize Obiguard with the Virtual Key
To use Lepton AI with Obiguard, get your API key from Lepton AI, then add it to Obiguard to create a virtual key.

- Python SDK
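As a sketch, initializing the client might look like the following. The package name, the `Obiguard` client class, and its constructor parameters are assumptions based on common gateway SDKs; check the SDK reference for the exact surface.

```python
# Minimal sketch (assumed names): install with `pip install obiguard`,
# then create a client bound to the Lepton AI virtual key.
from obiguard import Obiguard

client = Obiguard(
    api_key="YOUR_OBIGUARD_API_KEY",        # your Obiguard account API key
    virtual_key="YOUR_LEPTON_VIRTUAL_KEY",  # virtual key wrapping your Lepton AI key
)
```

The virtual key keeps the raw Lepton AI credential out of your application code; only the Obiguard key and the virtual key reference are needed at runtime.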
3. Invoke Chat Completions with Lepton AI
Use the Obiguard instance to send requests to Lepton AI. You can also override the virtual key directly in the API call if needed.

- Python SDK
- cURL
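A sketch of the chat completion call, assuming the SDK mirrors an OpenAI-style `chat.completions` interface; the client class, method names, and the model slug below are assumptions, not confirmed API details.

```python
# Sketch of a chat completion routed through Obiguard to Lepton AI.
# All names here (package, client class, model slug) are assumptions.
from obiguard import Obiguard

client = Obiguard(
    api_key="YOUR_OBIGUARD_API_KEY",
    virtual_key="YOUR_LEPTON_VIRTUAL_KEY",
)

completion = client.chat.completions.create(
    model="llama3-8b",  # example Lepton AI model slug (assumption)
    messages=[{"role": "user", "content": "Say this is a test"}],
    # The virtual key can also be overridden per request if the SDK
    # accepts it as a call argument (assumption).
)
print(completion.choices[0].message.content)
```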
Speech-to-Text (Transcription)
Lepton AI provides speech-to-text capabilities through Obiguard’s unified API:

- Python SDK
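A transcription call might look like the sketch below, assuming an OpenAI-style `audio.transcriptions` surface on the SDK; the method path, model slug, and file name are all assumptions.

```python
# Sketch of a speech-to-text (transcription) request via Obiguard.
# Method names and the model slug are assumptions.
from obiguard import Obiguard

client = Obiguard(
    api_key="YOUR_OBIGUARD_API_KEY",
    virtual_key="YOUR_LEPTON_VIRTUAL_KEY",
)

with open("speech.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-large-v3",  # example speech model slug (assumption)
        file=audio_file,
    )
print(transcript.text)
```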
Advanced Features
Streaming Responses
Lepton AI supports streaming responses to provide real-time generation:

- Python SDK
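Streaming might be enabled with a `stream=True` flag yielding incremental chunks, as in OpenAI-compatible SDKs; this is a sketch under that assumption, and all names below are unconfirmed.

```python
# Sketch of streaming a chat completion through Obiguard.
# Assumes `stream=True` returns an iterator of partial deltas.
from obiguard import Obiguard

client = Obiguard(
    api_key="YOUR_OBIGUARD_API_KEY",
    virtual_key="YOUR_LEPTON_VIRTUAL_KEY",
)

stream = client.chat.completions.create(
    model="llama3-8b",  # example Lepton AI model slug (assumption)
    messages=[{"role": "user", "content": "Write a haiku about gateways"}],
    stream=True,
)
for chunk in stream:
    # Each chunk carries a partial delta of the generated text.
    delta = chunk.choices[0].delta
    if getattr(delta, "content", None):
        print(delta.content, end="", flush=True)
```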
Next Steps
The complete list of features supported in the SDK is available at the link below.

Obiguard SDK Client
Learn more about the Obiguard SDK Client

