# Invariant Proxy
Invariant Proxy is a lightweight Docker service that acts as an intermediary between AI Agents and LLM providers (such as OpenAI and Anthropic). It captures and forwards agent interactions to the Invariant Explorer, enabling seamless debugging, visualization and exploration of traces.
## Why Use Invariant Proxy?
- ✅ Intercept AI interactions for better debugging and analysis.
- ✅ Seamlessly forward API requests to OpenAI, Anthropic, and other LLM providers (supports streaming responses too).
- ✅ Automatically store and organize traces in the Invariant Explorer.
## Getting Started
To integrate the Proxy with your AI agent, you’ll need to modify how your client interacts with LLM providers.
### 🔹 OpenAI Integration

1. **Get an API Key**

   Follow the instructions here to obtain an API key.

2. **Modify OpenAI Client Setup**

   Instead of connecting directly to OpenAI, configure your `OpenAI` client to use the proxy:

   ```python
   from httpx import Client
   from openai import OpenAI

   client = OpenAI(
       http_client=Client(
           headers={
               "Invariant-Authorization": "Bearer <invariant-api-key>"
           },
       ),
       base_url="https://explorer.invariantlabs.ai/api/v1/proxy/<add-your-dataset-name-here>/openai",
   )

   # Make API requests to OpenAI using the client as usual.
   ```
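The proxy endpoint follows a fixed pattern: the Explorer host, the proxy path, your dataset name, and a provider segment. The small helper below is hypothetical (not part of the Invariant or OpenAI SDKs); it just assembles that URL, which can be convenient when configuring several agents against different datasets:

```python
# Hypothetical helper (not part of any SDK): assembles the proxy
# base URL used in the client setup above.
def invariant_proxy_url(dataset: str, provider: str = "openai") -> str:
    """Build the Invariant Proxy base URL for a dataset/provider pair."""
    return f"https://explorer.invariantlabs.ai/api/v1/proxy/{dataset}/{provider}"

print(invariant_proxy_url("my-agent-traces"))
# → https://explorer.invariantlabs.ai/api/v1/proxy/my-agent-traces/openai
```

The dataset name here (`my-agent-traces`) is only an example; traces are grouped under whichever dataset name you put in the URL.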
### 🔹 Anthropic Integration

1. **Get an API Key**

   Follow the instructions here to obtain an API key.

2. **Modify Anthropic Client Setup**

   Instead of connecting directly to Anthropic, configure your `Anthropic` client to use the proxy:

   ```python
   from httpx import Client
   from anthropic import Anthropic

   client = Anthropic(
       http_client=Client(
           headers={
               "Invariant-Authorization": "Bearer <invariant-api-key>"
           },
       ),
       base_url="https://explorer.invariantlabs.ai/api/v1/proxy/<add-your-dataset-name-here>/anthropic",
   )

   # Make API requests to Anthropic using the client as usual.
   ```
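The two integrations differ only in the SDK class and the final path segment; the `Invariant-Authorization` header is identical. As a sketch (the helper name is made up, not part of any SDK), the shared setup can be factored out:

```python
# Hypothetical helper (not from any SDK): returns the base_url and the
# headers shared by the OpenAI and Anthropic client setups above.
def invariant_client_config(dataset: str, invariant_api_key: str, provider: str) -> dict:
    """Prepare proxy settings for either supported provider."""
    if provider not in ("openai", "anthropic"):
        raise ValueError(f"unsupported provider: {provider}")
    return {
        "base_url": f"https://explorer.invariantlabs.ai/api/v1/proxy/{dataset}/{provider}",
        "headers": {"Invariant-Authorization": f"Bearer {invariant_api_key}"},
    }

cfg = invariant_client_config("my-agent-traces", "<invariant-api-key>", "anthropic")
# cfg["base_url"] goes to the SDK's base_url argument; cfg["headers"]
# goes to the httpx.Client passed as http_client.
```

This keeps the Invariant key in one place if an agent talks to both providers.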
## Run

```bash
./run.sh up
```

## Run tests

```bash
./run.sh tests
```
