Triton
Integrate Triton-hosted custom models with Obiguard and take them to production
Obiguard provides a robust and secure platform to observe, govern, and manage your locally or privately hosted custom models using Triton.
See the official Triton Inference Server documentation for more details.
Integrating Custom Models with Obiguard SDK
Expose your Triton Server
Expose your Triton server by using a tunneling service like ngrok or any other way you prefer. You can skip this step if you’re self-hosting the Gateway.
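For example, with ngrok (assuming Triton is serving HTTP on its default port, 8000):

```shell
# Expose the local Triton HTTP endpoint (default port 8000) through ngrok.
# The generated https forwarding URL is what you pass to Obiguard as customHost.
ngrok http 8000
```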
Install the Obiguard SDK
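Assuming the Python SDK is published on PyPI under the package name `obiguard` (check the SDK docs for the exact package name):

```shell
# Install the Obiguard SDK (package name assumed)
pip install obiguard
```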
Initialize Obiguard with Triton custom URL
- Pass your publicly exposed Triton server URL to Obiguard with `customHost`
- Set the target `provider` as `triton`

More on `custom_host` here.
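As a sketch of how these two settings reach the Gateway, gateways of this kind typically forward them as request headers alongside your API key. The header names below are assumptions modeled on common gateway conventions, not the official Obiguard wire format:

```python
# Build the headers that carry the provider and custom-host settings
# to the gateway. Header names are assumptions, not the official API.
def obiguard_headers(api_key: str, custom_host: str) -> dict:
    return {
        "Authorization": f"Bearer {api_key}",
        "x-obiguard-provider": "triton",        # target provider
        "x-obiguard-custom-host": custom_host,  # your exposed Triton URL
    }

headers = obiguard_headers("OBIGUARD_API_KEY", "https://my-triton.example.com")
```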
Invoke Chat Completions
Use the Obiguard SDK to invoke chat completions (generate) from your model, just as you would with any other provider:
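The exact call signature depends on the SDK version; as a minimal sketch, the request body follows the familiar OpenAI-style chat-completions shape. The model name and message content below are placeholders, not real defaults:

```python
# Build an OpenAI-style chat-completions payload for the gateway.
# "my-triton-model" is a placeholder for a model registered on your
# Triton server; it is not a real default.
def build_chat_payload(model: str, user_message: str) -> dict:
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_payload("my-triton-model", "Say this is a test")
```

You would then send this payload through the Obiguard SDK or Gateway with your Obiguard credentials, exactly as for any other provider.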
Next Steps
Explore the complete list of features supported in the SDK: