Using Functions

Obiguard allows you to define functions in API requests using the OpenAI function-calling signature. Pass your function definitions in the tools parameter; any model that supports function or tool calling can use them.

from obiguard import Obiguard

client = Obiguard(
  obiguard_api_key='vk-obg***',  # Your Obiguard virtual key here
)

tools = [
  {
    "type": "function",
    "function": {
      "name": "get_current_weather",
      "description": "Get the current weather in a given location",
      "parameters": {
        "type": "object",
        "properties": {
          "location": {
            "type": "string",
            "description": "The city and state, e.g. San Francisco, CA",
          },
          "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["location"],
      },
    },
  }
]

messages = [{"role": "user", "content": "What's the weather like in Boston today?"}]

completion = client.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=messages,
  tools=tools,
  tool_choice="auto"
)

print(completion)
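
If the model decides to call the function, the response contains the tool call and its arguments. The sketch below, which assumes the completion mirrors the standard OpenAI chat completion shape, shows one way to parse the call, run your own function, and send the result back to the model. The local get_current_weather implementation is hypothetical and stands in for whatever function you actually expose.

import json

# Minimal sketch: inspect the model's tool calls and return results.
# Assumes the completion follows the OpenAI response shape; adapt as needed.
tool_calls = completion.choices[0].message.tool_calls or []

for call in tool_calls:
  if call.function.name == "get_current_weather":
    args = json.loads(call.function.arguments)
    # get_current_weather(...) is a hypothetical local function you implement.
    result = get_current_weather(args["location"], args.get("unit", "fahrenheit"))

    # Append the assistant's tool call and your tool result, then ask again.
    messages.append(completion.choices[0].message)
    messages.append({
      "role": "tool",
      "tool_call_id": call.id,
      "content": json.dumps(result),
    })

follow_up = client.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=messages,
  tools=tools,
)
print(follow_up.choices[0].message.content)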

Managing Functions and Tools in Prompts

With Obiguard’s Prompt Library, you can create prompt templates that include function or tool definitions and specify the tool choice parameter. Obiguard also validates your tool definitions in real time to help prevent syntax errors.
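
Because Obiguard follows the OpenAI signature, you can also force a specific function instead of leaving the decision to the model. The snippet below is a sketch that reuses the tools and messages defined above and pins tool_choice to a named function.

# Sketch: force the model to call get_current_weather rather than choosing freely.
completion = client.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=messages,
  tools=tools,
  tool_choice={"type": "function", "function": {"name": "get_current_weather"}},
)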

Supported Providers and Models

Function calling is supported by the following providers, with more being added regularly.

Provider                  | Supported Models
OpenAI                    | gpt-4 series, gpt-3.5-turbo series
Azure OpenAI              | gpt-4 series, gpt-3.5-turbo series
Anyscale                  | mistralai/Mistral-7B-Instruct-v0.1, mistralai/Mixtral-8x7B-Instruct-v0.1
Together AI               | mistralai/Mixtral-8x7B-Instruct-v0.1, mistralai/Mistral-7B-Instruct-v0.1, togethercomputer/CodeLlama-34b-Instruct
Fireworks AI              | firefunction-v1, fw-function-call-34b-v0
Google Gemini / Vertex AI | gemini-1.0-pro, gemini-1.0-pro-001, gemini-1.5-pro-latest
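
Since the request format stays the same across providers, switching usually comes down to using a virtual key configured for the target provider and changing the model name. The sketch below assumes the virtual key shown (placeholder value, same as above) is configured for Google Vertex AI.

# Sketch: the same tools payload against a Gemini model.
# Assumes this virtual key is configured for Google Vertex AI.
gemini_client = Obiguard(
  obiguard_api_key='vk-obg***',
)

gemini_completion = gemini_client.chat.completions.create(
  model="gemini-1.5-pro-latest",
  messages=messages,
  tools=tools,
  tool_choice="auto",
)
print(gemini_completion)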