Refer to the official Triton Inference Server documentation for more details.
Integrating Custom Models with Obiguard SDK
Expose your Triton Server
Expose your Triton server by using a tunneling service like ngrok or any other way you prefer. You can skip this step if you’re self-hosting the Gateway.
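As a sketch, one way to expose a locally running Triton server is with ngrok. This assumes Triton's default HTTP port of 8000; adjust the port if your deployment differs.

```shell
# Expose Triton's HTTP endpoint (default port 8000) through an ngrok tunnel.
# ngrok prints a public forwarding URL such as https://<subdomain>.ngrok.app,
# which you then pass to Obiguard as the custom host.
ngrok http 8000
```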
Initialize Obiguard with Triton custom URL
- Pass your publicly-exposed Triton server URL to Obiguard with `customHost`.
- Set the target `provider` as `triton`.
- In the Python SDK, this parameter is `custom_host`.
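The initialization step above can be sketched as follows. This is a minimal illustration, not a verified Obiguard SDK signature: it assembles the settings the docs mention (`provider` set to `triton`, `custom_host` set to your exposed URL) as a plain dictionary, since the exact client class and keyword arguments are assumptions here.

```python
# Minimal sketch of the configuration described above. The key names
# mirror the parameters the docs mention (provider, custom_host);
# treat this as illustrative, not the exact Obiguard client API.

def build_obiguard_config(triton_url: str) -> dict:
    """Return provider settings for routing requests to a
    custom-hosted Triton server through the Obiguard gateway."""
    return {
        "provider": "triton",       # target provider is Triton
        "custom_host": triton_url,  # publicly exposed Triton server URL
    }

# Example: use the public URL printed by your tunneling service.
config = build_obiguard_config("https://example.ngrok.app")
print(config)
```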
