Integrate Triton-hosted custom models with Obiguard and take them to production
Obiguard provides a robust and secure platform to observe, govern, and manage your locally or privately hosted custom models served with Triton.
Refer to the official Triton Inference Server documentation for more details.
Expose your Triton Server
Expose your Triton server by using a tunneling service like ngrok or any other way you prefer. You can skip this step if you’re self-hosting the Gateway.
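For example, with ngrok you can tunnel the Triton HTTP endpoint with a single command. This sketch assumes Triton is listening on its default HTTP port, 8000; adjust the port if you started the server differently.

```shell
# Expose the local Triton HTTP endpoint (default port 8000) via ngrok.
# ngrok prints a public forwarding URL, e.g. https://<subdomain>.ngrok.app,
# which you will use as the custom host in the next steps.
ngrok http 8000
```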
Install the Obiguard SDK
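Assuming the Python SDK is published under the package name `obiguard` (verify the exact package name in the SDK docs), installation is a standard pip command:

```shell
# Install the Obiguard Python SDK (package name assumed to be `obiguard`)
pip install obiguard
```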
Initialize Obiguard with Triton custom URL
Set customHost to your exposed Triton server URL and provider as triton. More on custom_host here.
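A minimal sketch of that configuration is below. The parameter names `provider` and `custom_host` follow this guide; the client class name and constructor signature are assumptions, so check them against the Obiguard SDK reference before use.

```python
# Sketch: configuration for routing Obiguard requests to a Triton backend.
# `provider` and `custom_host` are the settings named in this guide; the
# placeholder URL and API key should be replaced with your own values.
triton_config = {
    "api_key": "OBIGUARD_API_KEY",                 # your Obiguard API key
    "provider": "triton",                          # route to the Triton integration
    "custom_host": "https://my-triton.example.com" # your exposed Triton URL
}

# Assumed constructor — verify the class name in the SDK docs:
# from obiguard import Obiguard
# client = Obiguard(**triton_config)
print(triton_config["provider"])
```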
Invoke Chat Completions
Use the Obiguard SDK to invoke chat completions (generate) from your model, just as you would with any other provider:
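The request follows the OpenAI-style chat-completions schema that gateway SDKs commonly accept. In this sketch, the model name is a placeholder for whatever model your Triton server is serving, and the `client.chat.completions.create` call shape is an assumption to verify against the SDK reference.

```python
# Sketch of an OpenAI-style chat-completion request against a Triton-backed
# model. `my-triton-model` is a placeholder for the model name your Triton
# server actually serves.
request_body = {
    "model": "my-triton-model",  # placeholder: your Triton-deployed model
    "messages": [
        {"role": "user", "content": "What is the capital of France?"}
    ],
}

# Assumed SDK call shape — verify against the Obiguard SDK reference:
# response = client.chat.completions.create(**request_body)
# print(response.choices[0].message.content)
print(request_body["model"])
```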
Explore the complete list of features supported in the SDK: