Obiguard provides a robust and secure platform to observe, govern, and manage your locally or privately hosted custom models using Triton.
See the official Triton Inference Server documentation for more details.
Integrating Custom Models with Obiguard SDK
Expose your Triton Server
Expose your Triton server using a tunneling service such as ngrok, or any other method you prefer. You can skip this step if you’re self-hosting the Gateway.
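For example, a tunnel can be opened programmatically with the pyngrok wrapper. This is a minimal sketch, not the only way to do it: it assumes Triton is serving on its default HTTP port 8000 and that an ngrok authtoken is already configured on the machine.

```python
# Minimal sketch: expose a local Triton HTTP endpoint with pyngrok.
# Assumes Triton's default HTTP port (8000) and a configured ngrok authtoken.
from pyngrok import ngrok

# Open a public HTTP tunnel to localhost:8000.
tunnel = ngrok.connect(8000, "http")
print("Public Triton URL:", tunnel.public_url)
```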
Initialize Obiguard with Triton custom URL
- Pass your publicly-exposed Triton server URL to Obiguard with `customHost` (`custom_host` in the Python SDK); see the sketch after this list.
- Set the target `provider` as `triton`.
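Putting both steps together, here is a hedged sketch of the Python SDK initialization. The `Obiguard` client name and import path are assumptions modeled on the `provider` and `custom_host` parameters named above; check the SDK reference for the exact signature.

```python
# Hedged sketch: initialize the Obiguard client against a Triton backend.
# The import path and client name are assumptions; provider and custom_host
# mirror the parameters described in this guide.
from obiguard import Obiguard  # assumed import path

client = Obiguard(
    api_key="YOUR_OBIGUARD_API_KEY",
    provider="triton",                              # route requests to Triton
    custom_host="https://<your-tunnel>.ngrok.app",  # placeholder: your exposed Triton URL
)

# Assuming an OpenAI-compatible request surface, calls then target
# models served by your Triton instance:
response = client.chat.completions.create(
    model="your-triton-model",  # placeholder: a model name on your Triton server
    messages=[{"role": "user", "content": "Hello from Obiguard + Triton!"}],
)
print(response)
```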
