# Invariant Proxy
Invariant Proxy is a lightweight Docker service that acts as an intermediary between AI Agents and LLM providers (such as OpenAI and Anthropic). It captures and forwards agent interactions to the Invariant Explorer, enabling seamless debugging, visualization, and exploration of traces.
## Why Use Invariant Proxy?
- ✅ Intercept AI interactions for better debugging and analysis.
- ✅ Seamlessly forward API requests to OpenAI, Anthropic, and other LLM providers (supports streaming responses too).
- ✅ Works with AI Agent platforms and systems like OpenHands, SWE-agent, etc.
- ✅ Automatically store and organize traces in the Invariant Explorer.
## Getting Started

### Run the Proxy Locally

To start the Invariant Proxy, run:

```bash
bash run.sh build && bash run.sh up
```

This will launch the proxy at `http://localhost:8005/api/v1/proxy/`.
### Set Up an Invariant API Key
- Follow the instructions here to obtain an API key. This allows the proxy to push traces to Invariant Explorer.
## Integration Guides
### 🔹 OpenAI Integration
1. Follow these steps to obtain an OpenAI API key.

2. **Modify OpenAI Client Setup**

   Instead of connecting directly to OpenAI, configure your `OpenAI` client to use the proxy:

   ```python
   from httpx import Client
   from openai import OpenAI

   client = OpenAI(
       http_client=Client(
           headers={
               "Invariant-Authorization": "Bearer your-invariant-api-key"
           },
       ),
       base_url="http://localhost:8005/api/v1/proxy/{add-your-dataset-name-here}/openai",
   )
   ```

   **Note:** Do not include the curly braces `{}`. If the dataset does not exist in Invariant Explorer, it will be created before adding traces.
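The `base_url` is the only part that varies per dataset and provider. A small illustrative helper (not part of any SDK) that assembles it and guards against leftover placeholder braces:

```python
def proxy_base_url(dataset: str, provider: str = "openai",
                   host: str = "http://localhost:8005") -> str:
    """Assemble the Invariant Proxy base URL for a dataset/provider pair."""
    if "{" in dataset or "}" in dataset:
        raise ValueError("Replace the placeholder with a real dataset name (no curly braces).")
    return f"{host}/api/v1/proxy/{dataset}/{provider}"

print(proxy_base_url("my-agent-traces"))
# -> http://localhost:8005/api/v1/proxy/my-agent-traces/openai
```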
### 🔹 Anthropic Integration
1. Follow these steps to obtain an Anthropic API key.

2. **Modify Anthropic Client Setup**

   ```python
   from httpx import Client
   from anthropic import Anthropic

   client = Anthropic(
       http_client=Client(
           headers={
               "Invariant-Authorization": "Bearer your-invariant-api-key"
           },
       ),
       base_url="http://localhost:8005/api/v1/proxy/{add-your-dataset-name-here}/anthropic",
   )
   ```

   **Note:** Do not include the curly braces `{}`. If the dataset does not exist in Invariant Explorer, it will be created before adding traces.
### OpenHands Integration

OpenHands (formerly OpenDevin) is a platform for software development agents powered by AI.

#### How to Integrate OpenHands with Invariant Proxy
**Step 1: Modify the API Base**

Enable the `Advanced Options` toggle under settings and update the Base URL to point at the proxy, e.g. `http://localhost:8005/api/v1/proxy/{add-your-dataset-name-here}/openai`.
**Step 2: Adjust the API Key Format**

Set the API Key using the following format:

```
{your-llm-api-key}|invariant-auth: {your-invariant-api-key}
```

**Note:** Do not include the curly braces `{}`.
The Invariant Proxy extracts the invariant-auth field from the API key and correctly forwards it to Invariant Explorer while sending the actual API key to OpenAI or Anthropic.
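Conceptually, the proxy splits the combined value on the `|invariant-auth:` separator. A hypothetical sketch of that parsing, with illustrative names and example keys (not the proxy's actual implementation):

```python
def split_combined_key(value: str) -> tuple[str, str]:
    """Split 'llm-key|invariant-auth: inv-key' into (llm_key, invariant_key)."""
    llm_key, sep, inv_key = value.partition("|invariant-auth:")
    if not sep:
        return value.strip(), ""  # no Invariant key attached
    return llm_key.strip(), inv_key.strip()

llm, inv = split_combined_key("sk-example|invariant-auth: inv-example")
print(llm, inv)  # -> sk-example inv-example
```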
### SWE-agent Integration

SWE-agent allows your preferred language model (e.g., GPT-4o or Claude 3.5 Sonnet) to autonomously use tools for various tasks, such as fixing issues in real GitHub repositories.
#### Using SWE-agent with Invariant Proxy
SWE-agent does not support custom headers, so you cannot pass the Invariant API Key via Invariant-Authorization. However, there is a workaround using the Invariant Proxy.
**Step 1: Modify the API Base**

Run `sweagent` with the following flag:

```
--agent.model.api_base=http://localhost:8005/api/v1/proxy/{add-your-dataset-name-here}/openai
```

**Note:** Do not include the curly braces `{}`.
**Step 2: Adjust the API Key Format**

Instead of setting your API key normally, set the environment variable as follows (quote the value so the shell does not interpret `|` as a pipe):

```bash
export OPENAI_API_KEY="{your-openai-api-key}|invariant-auth: {your-invariant-api-key}"
export ANTHROPIC_API_KEY="{your-anthropic-api-key}|invariant-auth: {your-invariant-api-key}"
```

**Note:** Do not include the curly braces `{}`.
With this setup, SWE-agent sends its requests through the Invariant Proxy, authenticating to both the LLM provider and Invariant Explorer. 🚀
## Development

### Pushing to Local Explorer
By default, the proxy points to the production Explorer instance. To point it to your local Explorer instance, modify the `INVARIANT_API_URL` value inside `.env`; the comments in `.env` explain how.
### Run Tests

To run tests, execute:

```bash
./run.sh tests
```
