Obiguard’s suite of features, including the AI gateway, observability, prompt management, and continuous fine-tuning, is enabled for the OSS models (Llama 2, Mistral, Zephyr, and more) available on Anyscale endpoints.
Obiguard SDK Integration with Anyscale
1. Install the Obiguard SDK
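Assuming the SDK is published on PyPI under the name obiguard (an assumption; check the package name in the Obiguard docs), install it with pip:

```shell
pip install obiguard
```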
2. Initialize Obiguard with Anyscale Virtual Key
To use Anyscale with Obiguard, get your Anyscale API key, then add it to Obiguard to create a virtual key.
from obiguard import Obiguard

client = Obiguard(
    obiguard_api_key="sk-obg***",  # Your Obiguard API key
)
3. Invoke Chat Completions with Anyscale
completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="mistralai/Mistral-7B-Instruct-v0.1",
)
Directly Using Obiguard’s REST API
Alternatively, you can call Anyscale models directly through Obiguard’s REST API. It works exactly like the OpenAI API, with two differences:
- You send your requests to Obiguard’s Gateway URL: https://gateway.obiguard.ai/v1/chat/completions
- You add Obiguard-specific headers, such as x-obiguard-api-key for your Obiguard API key or virtual key.
curl https://gateway.obiguard.ai/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $ANYSCALE_API_KEY" \
-H "x-obiguard-api-key: $OBIGUARD_API_KEY" \
-H "x-obiguard-provider: anyscale" \
-d '{
"model": "mistralai/Mistral-7B-Instruct-v0.1",
"messages": [{"role": "user","content": "Hello!"}]
}'
List of all possible Obiguard headers.
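The Obiguard-specific headers in the curl example follow a simple x-obiguard-* naming convention. As a rough illustration (not the actual SDK implementation), a helper assembling them might look like this:

```python
def build_obiguard_headers(obiguard_api_key: str, provider: str) -> dict:
    """Assemble the Obiguard-specific headers shown in the curl example above."""
    return {
        "x-obiguard-api-key": obiguard_api_key,
        "x-obiguard-provider": provider,
    }

headers = build_obiguard_headers("sk-obg***", "anyscale")
```

The provider header tells the gateway which upstream service to route the request to, while the Authorization header still carries the provider's own key.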
Using the OpenAI Python or Node SDKs for Anyscale
You can also set the baseURL param (base_url in Python) in the standard OpenAI SDKs and make calls to Obiguard + Anyscale directly from there. As in the REST API example, you only need to change the base URL and add default headers to your client instance. The Obiguard SDK's createHeaders helper makes this simpler:
from openai import OpenAI
from obiguard import OBIGUARD_GATEWAY_URL, createHeaders

anyscale = OpenAI(
    api_key="ANYSCALE_API_KEY",  # defaults to os.environ.get("OPENAI_API_KEY")
    base_url=OBIGUARD_GATEWAY_URL,
    default_headers=createHeaders(
        provider="anyscale",
        obiguard_api_key="sk-obg******",  # Your Obiguard API key
    ),
)

chat_complete = anyscale.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Say this is a test"}],
)
This request will be automatically logged by Obiguard, and you can view it in your logs dashboard.
Obiguard logs the tokens used, execution time, and cost of each request.
You can also drill into each log entry to review the exact request and response data.
Advanced Use Cases
Streaming Responses
Obiguard supports streaming responses using Server Sent Events (SSE).
from openai import OpenAI
from obiguard import OBIGUARD_GATEWAY_URL, createHeaders

anyscale = OpenAI(
    api_key="ANYSCALE_API_KEY",  # defaults to os.environ.get("OPENAI_API_KEY")
    base_url=OBIGUARD_GATEWAY_URL,
    default_headers=createHeaders(
        provider="anyscale",
        obiguard_api_key="sk-obg******",  # Your Obiguard API key
    ),
)

chat_complete = anyscale.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Say this is a test"}],
    stream=True,
)

for chunk in chat_complete:
    print(chunk.choices[0].delta.content, end="", flush=True)
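Under the hood, an SSE response arrives as a sequence of data:-prefixed lines, with a [DONE] sentinel marking the end of an OpenAI-style stream. A minimal illustration of parsing such a stream (the SDK handles this for you; this is not its actual implementation):

```python
def parse_sse(lines):
    """Yield the payload of each 'data:' line in a Server-Sent Events stream."""
    for line in lines:
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload == "[DONE]":  # OpenAI-style end-of-stream sentinel
                return
            yield payload

# Example with a hypothetical raw stream:
events = list(parse_sse(["data: hello", "", "data: world", "data: [DONE]"]))
# events is ["hello", "world"]
```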
Obiguard Features
Obiguard supports its full set of features via the OpenAI SDK, so you don't need to migrate away from it.