# FuzzForge LiteLLM Proxy Configuration
This directory contains configuration for the LiteLLM proxy with model-agnostic virtual keys.
## Quick Start (Fresh Clone)

### 1. Create Your .env File

```bash
cp .env.template .env
```
### 2. Add Your Provider API Keys

Edit `.env` and add your real API keys:

```
LITELLM_OPENAI_API_KEY=sk-proj-YOUR-OPENAI-KEY-HERE
LITELLM_ANTHROPIC_API_KEY=sk-ant-api03-YOUR-ANTHROPIC-KEY-HERE
```
### 3. Start Services

```bash
cd ../..  # Back to repo root
COMPOSE_PROFILES=secrets docker compose up -d
```
Bootstrap will automatically:

- Generate 3 virtual keys with individual budgets
- Write them to your `.env` file
- Apply no model restrictions (model-agnostic)
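Once bootstrap has run, `.env` should contain the three generated virtual keys. A sketch of what that section looks like (values elided):

```
OPENAI_API_KEY=sk-...
TASK_AGENT_API_KEY=sk-...
COGNEE_API_KEY=sk-...
```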
## Files

- `.env.template` - Clean template (checked into git)
- `.env` - Your real keys (git-ignored; you create this)
- `.env.example` - Legacy example
## Virtual Keys (Auto-Generated)

Bootstrap creates 3 keys with budget controls:

| Key | Budget | Duration | Used By |
|---|---|---|---|
| `OPENAI_API_KEY` | $100 | 30 days | CLI, SDK |
| `TASK_AGENT_API_KEY` | $25 | 30 days | Task Agent |
| `COGNEE_API_KEY` | $50 | 30 days | Cognee |

All keys are model-agnostic by default (no restrictions).
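Because LiteLLM exposes an OpenAI-compatible API, any of these virtual keys can be exercised with a plain HTTP client. A minimal sketch; the proxy URL and port are assumptions, so adjust them to your compose setup:

```shell
# Sketch: call the proxy's OpenAI-compatible endpoint with a virtual key.
# PROXY_URL (and port 4000) is an assumption; adjust to your deployment.
PROXY_URL="${PROXY_URL:-http://localhost:4000}"
curl -sS --max-time 5 "$PROXY_URL/v1/chat/completions" \
  -H "Authorization: Bearer ${OPENAI_API_KEY:-sk-placeholder}" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5-mini", "messages": [{"role": "user", "content": "ping"}]}' \
  || echo "proxy not reachable at $PROXY_URL"
```

Any key from the table works here, since none of them carry model restrictions.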
## Using Models

Registered models in `volumes/litellm/proxy_config.yaml`:

- `gpt-5-mini` → `openai/gpt-5-mini`
- `claude-sonnet-4-5` → `anthropic/claude-sonnet-4-5-20250929`
- `text-embedding-3-large` → `openai/text-embedding-3-large`
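A registered alias in `proxy_config.yaml` follows LiteLLM's standard `model_list` shape. A sketch (your actual file may differ):

```yaml
model_list:
  - model_name: gpt-5-mini            # alias callers use
    litellm_params:
      model: openai/gpt-5-mini        # provider/model the proxy routes to
      api_key: os.environ/LITELLM_OPENAI_API_KEY
  - model_name: claude-sonnet-4-5
    litellm_params:
      model: anthropic/claude-sonnet-4-5-20250929
      api_key: os.environ/LITELLM_ANTHROPIC_API_KEY
```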
**Use Registered Aliases:**

```bash
fuzzforge workflow run llm_secret_detection . -n llm_model=gpt-5-mini
fuzzforge workflow run llm_secret_detection . -n llm_model=claude-sonnet-4-5
```
**Use Any Model (Direct):**

```bash
# Works without registering first!
fuzzforge workflow run llm_secret_detection . -n llm_model=openai/gpt-5-nano
```
## Proxy UI

- User: `fuzzforge`
- Pass: `fuzzforge123`
## Troubleshooting

```bash
# Check bootstrap logs
docker compose logs llm-proxy-bootstrap

# Verify keys were generated
grep "API_KEY=" .env | grep -v "^#" | grep -v "your-"

# Restart services
docker compose restart llm-proxy task-agent
```
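The key-verification grep above filters out comment lines and unfilled `your-...` placeholders, so only keys that bootstrap actually wrote will print. It can be sanity-checked on a throwaway sample file:

```shell
# Demonstrate the verification pipeline on a sample .env:
# only the filled-in key survives all three filters.
cat > /tmp/sample.env <<'EOF'
# LiteLLM provider keys
OPENAI_API_KEY=sk-generated-abc123
TASK_AGENT_API_KEY=your-key-here
EOF
grep "API_KEY=" /tmp/sample.env | grep -v "^#" | grep -v "your-"
```

Empty output against your real `.env` means bootstrap has not written the keys yet; check the bootstrap logs.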