With Obiguard, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through a virtual key system.

Provider Slug: deepseek

Obiguard SDK Integration with DeepSeek Models

Obiguard provides a consistent API to interact with models from various providers. To integrate DeepSeek with Obiguard:

1. Install the Obiguard SDK

Add the Obiguard SDK to your application to interact with DeepSeek AI’s API through Obiguard’s gateway.

pip install obiguard

2. Initialize Obiguard with the Virtual Key

To use DeepSeek with Obiguard, get your API key from the DeepSeek platform, then add it to Obiguard to create a virtual key.

from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="sk-obg***",  # Your Obiguard API key
    virtual_key="VIRTUAL_KEY"   # Replace with your virtual key for DeepSeek
)
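
Hard-coding keys is fine for a quick test, but in practice it is safer to read them from the environment. A minimal sketch, assuming environment-variable names of your own choosing (`OBIGUARD_API_KEY` and `DEEPSEEK_VIRTUAL_KEY` below are illustrative, not an Obiguard convention):

```python
import os

def load_obiguard_config():
    """Read Obiguard credentials from the environment, with placeholder fallbacks."""
    return {
        "obiguard_api_key": os.environ.get("OBIGUARD_API_KEY", "sk-obg***"),
        "virtual_key": os.environ.get("DEEPSEEK_VIRTUAL_KEY", "VIRTUAL_KEY"),
    }

cfg = load_obiguard_config()
# A real client would then be created with Obiguard(**cfg)
print(sorted(cfg))
```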

3. Invoke Chat Completions with DeepSeek

Use the Obiguard instance to send requests to DeepSeek. You can also override the virtual key directly in the API call if needed.

completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="deepseek-chat"
)

4. Invoke Multi-round Conversation with DeepSeek

from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="sk-obg***",  # Your Obiguard API key
    virtual_key="VIRTUAL_KEY"   # Replace with your virtual key for DeepSeek
)

# Round 1
messages = [{"role": "user", "content": "What's the highest mountain in the world?"}]
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=messages
)

messages.append(response.choices[0].message)
print(f"Messages Round 1: {messages}")

# Round 2
messages.append({"role": "user", "content": "What is the second?"})
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=messages
)

messages.append(response.choices[0].message)
print(f"Messages Round 2: {messages}")
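
The key point in the multi-round flow is that the chat API is stateless: each round re-sends the full history, and the model's context comes entirely from the `messages` list you maintain. A minimal offline sketch of how that list grows (the assistant replies here are hard-coded stand-ins for real responses, so no API call is made):

```python
# Round 1: user asks, and the assistant's reply is appended to the history
messages = [{"role": "user", "content": "What's the highest mountain in the world?"}]
messages.append({"role": "assistant", "content": "Mount Everest."})  # stand-in reply

# Round 2: the follow-up question only makes sense because the prior
# turns are re-sent alongside it
messages.append({"role": "user", "content": "What is the second?"})
messages.append({"role": "assistant", "content": "K2."})  # stand-in reply

# The roles alternate user/assistant as the conversation accumulates
print([m["role"] for m in messages])  # -> ['user', 'assistant', 'user', 'assistant']
```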

5. JSON Output with DeepSeek

import json
from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="sk-obg***",  # Your Obiguard API key
    virtual_key="VIRTUAL_KEY"   # Replace with your virtual key for DeepSeek
)

system_prompt = """
The user will provide some exam text. Please parse the "question" and "answer" and output them in JSON format.

EXAMPLE INPUT:
Which is the highest mountain in the world? Mount Everest.

EXAMPLE JSON OUTPUT:
{
    "question": "Which is the highest mountain in the world?",
    "answer": "Mount Everest"
}
"""

user_prompt = "Which is the longest river in the world? The Nile River."

messages = [{"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt}]

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=messages,
    response_format={"type": "json_object"}
)

print(json.loads(response.choices[0].message.content))
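
Even with JSON mode enabled, it is prudent to validate the reply before using it, since a model can occasionally emit malformed or incomplete JSON. A hedged sketch of such a guard (`parse_exam_json` is an illustrative helper, not part of the Obiguard SDK; a real call would pass `response.choices[0].message.content`):

```python
import json

def parse_exam_json(raw: str) -> dict:
    """Parse the model's JSON reply, raising a clear error on malformed output."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"Model did not return valid JSON: {raw!r}") from exc
    missing = {"question", "answer"} - data.keys()
    if missing:
        raise ValueError(f"JSON reply is missing keys: {missing}")
    return data

# Stand-in reply matching the prompt's expected schema
reply = '{"question": "Which is the longest river in the world?", "answer": "The Nile River"}'
parsed = parse_exam_json(reply)
print(parsed["answer"])  # -> The Nile River
```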

Supported Endpoints

  1. CHAT_COMPLETIONS
  2. STREAM_CHAT_COMPLETIONS
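
STREAM_CHAT_COMPLETIONS delivers the reply as incremental chunks rather than one message. A sketch of consuming such a stream, assuming the OpenAI-compatible chunk shape (`chunk.choices[0].delta.content`); the live call is shown as a comment, and the stream below is simulated locally so the snippet runs without credentials:

```python
from types import SimpleNamespace

def collect_stream(chunks):
    """Concatenate the incremental text deltas from a streamed chat completion."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks (e.g. the final one) may carry no text
            parts.append(delta)
    return "".join(parts)

# A live call would look like (assuming stream=True on the OpenAI-compatible API):
# stream = client.chat.completions.create(
#     model="deepseek-chat",
#     messages=[{"role": "user", "content": "Say this is a test"}],
#     stream=True,
# )
# print(collect_stream(stream))

# Simulated chunks standing in for the gateway's streamed deltas:
fake = [SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=t))])
        for t in ["This ", "is ", "a ", "test"]]
print(collect_stream(fake))  # -> This is a test
```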

The complete list of features supported in the SDK is available at the link below.

Obiguard SDK Client

Learn more about the Obiguard SDK Client