# FuzzForge LiteLLM Proxy Configuration

This directory contains configuration for the LiteLLM proxy with model-agnostic virtual keys.

## Quick Start (Fresh Clone)

### 1. Create Your `.env` File

```bash
cp .env.template .env
```

### 2. Add Your Provider API Keys

Edit `.env` and add your **real** API keys:

```bash
LITELLM_OPENAI_API_KEY=sk-proj-YOUR-OPENAI-KEY-HERE
LITELLM_ANTHROPIC_API_KEY=sk-ant-api03-YOUR-ANTHROPIC-KEY-HERE
```
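Optionally, a quick sanity check that the placeholders were actually replaced (a minimal sketch; the pattern assumes the placeholder text shown above):

```bash
# No output means both placeholders were replaced with real keys
grep -n "YOUR-.*-KEY-HERE" .env
```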
### 3. Start Services

```bash
cd ../.. # Back to repo root
COMPOSE_PROFILES=secrets docker compose up -d
```
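Before relying on the generated keys, it can help to confirm both containers came up (service names taken from the troubleshooting commands below):

```bash
# The bootstrap container exits once keys are written; the proxy should stay up
docker compose ps llm-proxy llm-proxy-bootstrap
```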
Bootstrap will automatically:

- Generate 3 virtual keys with individual budgets (see the sketch below)
- Write them to your `.env` file
- Leave all keys model-agnostic (no model restrictions)
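For reference, key generation goes through LiteLLM's key management endpoint. A hedged sketch of the kind of request bootstrap issues (the master key variable name and alias are assumptions; the budget matches the table below):

```bash
# Hypothetical example: mint a budgeted, model-agnostic virtual key
curl -s http://localhost:10999/key/generate \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{"key_alias": "fuzzforge-cli", "max_budget": 100, "duration": "30d"}'
```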
## Files

- **`.env.template`** - Clean template (checked into git)
- **`.env`** - Your real keys (git-ignored; you create this)
- **`.env.example`** - Legacy example

## Virtual Keys (Auto-Generated)

Bootstrap creates 3 keys with budget controls:

| Key | Budget | Duration | Used By |
|-----|--------|----------|---------|
| `OPENAI_API_KEY` | $100 | 30 days | CLI, SDK |
| `TASK_AGENT_API_KEY` | $25 | 30 days | Task Agent |
| `COGNEE_API_KEY` | $50 | 30 days | Cognee |

All keys are **model-agnostic** by default (no restrictions).
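Because the proxy exposes an OpenAI-compatible API, any virtual key can be exercised directly; a minimal sketch using the CLI key and one of the aliases registered below:

```bash
# Send a test completion through the proxy with a virtual key
curl -s http://localhost:10999/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5-mini", "messages": [{"role": "user", "content": "ping"}]}'
```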
## Using Models

Registered models in `volumes/litellm/proxy_config.yaml`:

- `gpt-5-mini` → `openai/gpt-5-mini`
- `claude-sonnet-4-5` → `anthropic/claude-sonnet-4-5-20250929`
- `text-embedding-3-large` → `openai/text-embedding-3-large`
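Each alias maps to a provider model through the config's `model_list`. A representative entry in LiteLLM's standard config format (a sketch, not a verbatim copy of the shipped file):

```yaml
model_list:
  - model_name: gpt-5-mini                        # alias exposed by the proxy
    litellm_params:
      model: openai/gpt-5-mini                    # provider/model it routes to
      api_key: os.environ/LITELLM_OPENAI_API_KEY  # resolved from the proxy's env
```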
### Use Registered Aliases:

```bash
fuzzforge workflow run llm_secret_detection . -n llm_model=gpt-5-mini
fuzzforge workflow run llm_secret_detection . -n llm_model=claude-sonnet-4-5
```

### Use Any Model (Direct):

```bash
# Works without registering first!
fuzzforge workflow run llm_secret_detection . -n llm_model=openai/gpt-5-nano
```

## Proxy UI

- URL: http://localhost:10999/ui
- User: `fuzzforge` / Pass: `fuzzforge123`
## Troubleshooting

```bash
# Check bootstrap logs
docker compose logs llm-proxy-bootstrap

# Verify keys generated
grep "API_KEY=" .env | grep -v "^#" | grep -v "your-"

# Restart services
docker compose restart llm-proxy task-agent
```
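If keys look correct but requests still fail, checking the proxy itself is a reasonable next step (the liveness path follows LiteLLM's standard health endpoints; treat it as an assumption):

```bash
# Liveness probe on the proxy port
curl -s http://localhost:10999/health/liveliness

# Follow proxy logs for auth or routing errors
docker compose logs -f llm-proxy
```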