FuzzForge LiteLLM Proxy Configuration

This directory contains configuration for the LiteLLM proxy with model-agnostic virtual keys.

Quick Start (Fresh Clone)

1. Create Your .env File

cp .env.template .env

2. Add Your Provider API Keys

Edit .env and add your real API keys:

LITELLM_OPENAI_API_KEY=sk-proj-YOUR-OPENAI-KEY-HERE
LITELLM_ANTHROPIC_API_KEY=sk-ant-api03-YOUR-ANTHROPIC-KEY-HERE

3. Start Services

cd ../..  # Back to repo root
COMPOSE_PROFILES=secrets docker compose up -d
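
Optionally, confirm the containers came up and watch bootstrap do its work (service names as used in Troubleshooting below):

docker compose ps
docker compose logs -f llm-proxy-bootstrap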

Bootstrap will automatically:

  • Generate 3 virtual keys with individual budgets
  • Write them to your .env file (see the sketch below)
  • Apply no model restrictions (keys are model-agnostic)
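
For illustration only, the entries bootstrap appends to .env look roughly like this (key names match the table below; the sk-... values are placeholders for whatever the proxy generates):

# Illustrative placeholders, real values are written by bootstrap
OPENAI_API_KEY=sk-<generated-virtual-key>
TASK_AGENT_API_KEY=sk-<generated-virtual-key>
COGNEE_API_KEY=sk-<generated-virtual-key>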

Files

  • .env.template - Clean template (checked into git)
  • .env - Your real keys (git ignored, you create this)

Virtual Keys (Auto-Generated)

Bootstrap creates 3 keys with budget controls:

Key                  Budget   Duration   Used By
OPENAI_API_KEY       $100     30 days    CLI, SDK
TASK_AGENT_API_KEY   $25      30 days    Task Agent
COGNEE_API_KEY       $50      30 days    Cognee

All keys are model-agnostic by default (no restrictions).
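
If you ever need an extra key outside of bootstrap, the LiteLLM proxy exposes a key-generation endpoint. A minimal sketch, assuming the API listens on port 10999 (the UI port below) and that LITELLM_MASTER_KEY holds the proxy's master key (both are assumptions, not values defined in this README):

# Sketch: create one more budget-limited, model-agnostic virtual key
curl -s http://localhost:10999/key/generate \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{"max_budget": 10, "budget_duration": "30d", "key_alias": "scratch-key"}'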

Using Models

Registered models in volumes/litellm/proxy_config.yaml:

  • gpt-5-mini → openai/gpt-5-mini
  • claude-sonnet-4-5 → anthropic/claude-sonnet-4-5-20250929
  • text-embedding-3-large → openai/text-embedding-3-large

Use Registered Aliases:

fuzzforge workflow run llm_secret_detection . -n llm_model=gpt-5-mini
fuzzforge workflow run llm_secret_detection . -n llm_model=claude-sonnet-4-5

Use Any Model (Direct):

# Works without registering first!
fuzzforge workflow run llm_secret_detection . -n llm_model=openai/gpt-5-nano
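
The proxy also speaks the OpenAI API, so the same model names work from any OpenAI-compatible client. A rough sketch with curl, assuming the API is served on the same port as the UI (10999) and OPENAI_API_KEY is the virtual key bootstrap wrote to .env:

# Sketch: call a registered alias through the proxy's OpenAI-compatible endpoint
curl -s http://localhost:10999/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5-mini", "messages": [{"role": "user", "content": "ping"}]}'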

Proxy UI

http://localhost:10999/ui

  • User: fuzzforge / Pass: fuzzforge123

Troubleshooting

# Check bootstrap logs
docker compose logs llm-proxy-bootstrap

# Verify keys generated
grep "API_KEY=" .env | grep -v "^#" | grep -v "your-"

# Restart services
docker compose restart llm-proxy task-agent
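
# If keys exist but requests still fail, sanity-check one against the proxy
# (assumes the API is on port 10999, same as the UI)
curl -s http://localhost:10999/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"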