Obiguard provides a robust, secure gateway for integrating various Large Language Models (LLMs) into your applications, including models served by the Cerebras Inference API.
Obiguard SDK Integration with Cerebras
Obiguard provides a consistent API to interact with models from various providers. To integrate Cerebras with Obiguard:
1. Install the Obiguard SDK
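The SDK can typically be installed from PyPI; the package name below is assumed from the SDK's import name, so check the Obiguard docs for the exact command.

```shell
# Install the Obiguard Python SDK (package name assumed; verify in the Obiguard docs)
pip install obiguard
```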
2. Initialize Obiguard with Cerebras
To use Cerebras with Obiguard, first get your Cerebras API key, then add it to Obiguard to create a virtual key.
```python
from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="sk-obg***",  # Your Obiguard API key
    virtual_key="CEREBRAS_VIRTUAL_KEY"  # Your Cerebras Inference virtual key
)
```
3. Invoke Chat Completions
```python
completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="llama3.1-8b"
)

print(completion.choices[0].message.content)
```
Supported Models
Cerebras currently supports Llama-3.1-8B and Llama-3.1-70B. You can find more info in the Cerebras documentation.
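To switch between the supported models, pass the corresponding identifier as the `model` parameter. A small mapping of the identifiers quoted in this guide (verify against the current Cerebras documentation, as the list may change):

```python
# Model identifiers accepted via Obiguard for Cerebras Inference,
# mapped to their display names as given in this guide.
CEREBRAS_MODELS = {
    "llama3.1-8b": "Llama-3.1-8B",
    "llama3.1-70b": "Llama-3.1-70B",
}

for model_id, display_name in CEREBRAS_MODELS.items():
    print(f"{model_id} -> {display_name}")
```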
Next Steps
The complete list of features supported in the SDK is available at the link below.