This commit addresses three important bugs:
1. SQL Injection Prevention (proxy.ts:70-75):
- Added whitelist validation for DATABASE_TABLE environment variable
- Table names are now validated against ALLOWED_TABLES before use
- Prevents potential SQL injection through malicious table names
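The whitelist check could look like the following sketch. The `ALLOWED_TABLES` name comes from the commit description; the table names, default value, and `resolveTable` helper are illustrative assumptions:

```typescript
// Assumed allow-list contents; the real set lives in proxy.ts.
const ALLOWED_TABLES = new Set(["requests", "metrics"]);

function resolveTable(envValue: string | undefined): string {
  const table = envValue ?? "requests"; // assumed default table
  if (!ALLOWED_TABLES.has(table)) {
    // Reject anything not on the allow-list before it reaches a query,
    // e.g. DATABASE_TABLE="users; DROP TABLE users;--"
    throw new Error(`DATABASE_TABLE "${table}" is not an allowed table`);
  }
  return table; // safe to interpolate as a SQL identifier
}
```

Allow-listing is the right tool here because table names are identifiers, which parameterized queries cannot cover.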
2. SQL Interval Parameter Bug (dashboard/app/api/metrics/route.ts):
- Fixed incorrect INTERVAL syntax in PostgreSQL queries
- Changed from INTERVAL '$1 hours' to INTERVAL '1 hour' * $1
- Properly uses parameterized queries with interval multiplication
- Affects all 4 queries: summary, recent, model breakdown, and trends
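The corrected query shape is sketched below. Table and column names are illustrative, not the actual schema; the point is that a `$1` inside a string literal (`INTERVAL '$1 hours'`) is never substituted by PostgreSQL, whereas multiplying a fixed interval by the parameter is:

```typescript
// One representative query of the four; schema names are assumed.
const summaryQuery = `
  SELECT COUNT(*) AS request_count
  FROM requests
  WHERE created_at >= NOW() - INTERVAL '1 hour' * $1
`;
// Invoked as e.g. client.query(summaryQuery, [hours]),
// where hours is a validated number, not raw user input.
```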
3. Incorrect Property Reference (proxy.ts:206):
- Changed usage.cached_tokens to usage.prompt_tokens_details?.cached_tokens
- Aligns with OpenAI API response structure for cached tokens
- Ensures accurate logging of cached token usage
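A minimal sketch of the fix, with the `Usage` interface reduced to the fields relevant here. Since `prompt_tokens_details` may be absent from a response, optional chaining with a `0` fallback avoids reading a property off `undefined`:

```typescript
// Reduced shape of the OpenAI usage object (illustrative, not the full type).
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
  prompt_tokens_details?: { cached_tokens?: number };
}

function cachedTokens(usage: Usage): number {
  // was: usage.cached_tokens — a property that does not exist at the top level
  return usage.prompt_tokens_details?.cached_tokens ?? 0;
}
```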
Additional fixes:
- Add input validation for the hours and limit query parameters to prevent NaN values and DoS via oversized queries
- Replace || with ?? for proper nullish coalescing in the metrics summary
- Fix IPv6 normalization to prevent an empty string when the IP is malformed
- Fix stream parsing to skip empty JSON strings and avoid parse errors
- Remove redundant .toString() calls on authorization header
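The parameter validation and the `??` change can be sketched together. The `parseBounded` helper, its bounds, and the variable names are illustrative assumptions, not the actual code:

```typescript
// Hypothetical bounded-integer parser for query params like ?hours=24.
function parseBounded(raw: string | null, fallback: number, max: number): number {
  const n = Number(raw);
  // Rejects NaN, non-integers, and out-of-range values that could
  // drive unbounded queries (the DoS concern noted above).
  if (!Number.isInteger(n) || n < 1 || n > max) return fallback;
  return n;
}

// ?? falls back only on null/undefined; || would also clobber
// legitimate falsy values such as a summary count of 0.
const zeroCount: number | null = 0;
const shown = zeroCount ?? -1; // stays 0; with || it would become -1
```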
Add a lightweight Next.js dashboard to visualize OpenProxy metrics in real-time. The dashboard provides comprehensive insights into LLM API usage, costs, and performance.
Features:
- Real-time metrics overview (requests, tokens, costs, response times)
- Model breakdown with usage statistics
- Hourly trends visualization with charts
- Recent requests table with detailed information
- Auto-refresh every 30 seconds
- Configurable time ranges (1h, 6h, 24h, 7d)
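The time-range selection and refresh cadence above can be sketched as a small mapping; the endpoint path is assumed from the route mentioned earlier, and the exact mapping is illustrative:

```typescript
// Assumed mapping from UI range labels to the hours query parameter.
type Range = "1h" | "6h" | "24h" | "7d";
const RANGE_HOURS: Record<Range, number> = { "1h": 1, "6h": 6, "24h": 24, "7d": 168 };
const REFRESH_MS = 30_000; // auto-refresh every 30 seconds

function metricsUrl(range: Range): string {
  return `/api/metrics?hours=${RANGE_HOURS[range]}`;
}
// The dashboard would poll with e.g.
// setInterval(() => fetch(metricsUrl(current)).then(/* update state */), REFRESH_MS);
```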
Technical details:
- Built with Next.js 14 and React 18
- Uses Recharts for data visualization
- Connects directly to PostgreSQL database
- Runs on port 3008 by default
- TypeScript for type safety
- Minimal dependencies for lightweight deployment
The dashboard complements the proxy server by providing a user-friendly interface for monitoring and analyzing LLM API usage patterns.