diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md deleted file mode 100644 index 0247004..0000000 --- a/.github/ISSUE_TEMPLATE/bug_report.md +++ /dev/null @@ -1,48 +0,0 @@ ---- -name: šŸ› Bug Report -about: Create a report to help us improve FuzzForge -title: "[BUG] " -labels: bug -assignees: '' ---- - -## Description -A clear and concise description of the bug you encountered. - -## Environment -Please provide details about your environment: -- **OS**: (e.g., macOS 14.0, Ubuntu 22.04, Windows 11) -- **Python version**: (e.g., 3.9.7) -- **Docker version**: (e.g., 24.0.6) -- **FuzzForge version**: (e.g., 0.6.0) - -## Steps to Reproduce -Clear steps to recreate the issue: - -1. Go to '...' -2. Run command '...' -3. Click on '...' -4. See error - -## Expected Behavior -A clear and concise description of what should happen. - -## Actual Behavior -A clear and concise description of what actually happens. - -## Logs -Please include relevant error messages and stack traces: - -``` -Paste logs here -``` - -## Screenshots -If applicable, add screenshots to help explain your problem. - -## Additional Context -Add any other context about the problem here (workflow used, specific target, configuration, etc.). - ---- - -šŸ’¬ **Need help?** Join our [Discord Community](https://discord.com/invite/acqv9FVG) for real-time support. \ No newline at end of file diff --git a/.github/ISSUE_TEMPLATE/config.yml b/.github/ISSUE_TEMPLATE/config.yml deleted file mode 100644 index e31f032..0000000 --- a/.github/ISSUE_TEMPLATE/config.yml +++ /dev/null @@ -1,8 +0,0 @@ -blank_issues_enabled: false -contact_links: - - name: šŸ’¬ Community Discord - url: https://discord.com/invite/acqv9FVG - about: Join our Discord to discuss ideas, workflows, and security research with the community. - - name: šŸ“– Documentation - url: https://github.com/FuzzingLabs/fuzzforge_ai/tree/main/docs - about: Check our documentation for guides, tutorials, and API reference. 
diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md deleted file mode 100644 index 5b57b75..0000000 --- a/.github/ISSUE_TEMPLATE/feature_request.md +++ /dev/null @@ -1,38 +0,0 @@ ---- -name: ✨ Feature Request -about: Suggest an idea for FuzzForge -title: "[FEATURE] " -labels: enhancement -assignees: '' ---- - -## Use Case -Why is this feature needed? Describe the problem you're trying to solve or the improvement you'd like to see. - -## Proposed Solution -How should it work? Describe your ideal solution in detail. - -## Alternatives -What other approaches have you considered? List any alternative solutions or features you've thought about. - -## Implementation -**(Optional)** Do you have any technical considerations or implementation ideas? - -## Category -What area of FuzzForge would this feature enhance? - -- [ ] šŸ¤– AI Agents for Security -- [ ] šŸ›  Workflow Automation -- [ ] šŸ“ˆ Vulnerability Research -- [ ] šŸ”— Fuzzer Integration -- [ ] 🌐 Community Marketplace -- [ ] šŸ”’ Enterprise Features -- [ ] šŸ“š Documentation -- [ ] šŸŽÆ Other - -## Additional Context -Add any other context, screenshots, references, or examples about the feature request here. - ---- - -šŸ’¬ **Want to discuss this idea?** Join our [Discord Community](https://discord.com/invite/acqv9FVG) to collaborate with other contributors! \ No newline at end of file diff --git a/.github/ISSUE_TEMPLATE/workflow_submission.md b/.github/ISSUE_TEMPLATE/workflow_submission.md deleted file mode 100644 index 92d692f..0000000 --- a/.github/ISSUE_TEMPLATE/workflow_submission.md +++ /dev/null @@ -1,67 +0,0 @@ ---- -name: šŸ”„ Workflow Submission -about: Contribute a security workflow or module to the FuzzForge community -title: "[WORKFLOW] " -labels: workflow, community -assignees: '' ---- - -## Workflow Name -Provide a short, descriptive name for your workflow. - -## Description -Explain what this workflow does and what security problems it solves. 
- -## Category -What type of security workflow is this? - -- [ ] šŸ›”ļø **Security Assessment** - Static analysis, vulnerability scanning -- [ ] šŸ” **Secret Detection** - Credential and secret scanning -- [ ] šŸŽÆ **Fuzzing** - Dynamic testing and fuzz testing -- [ ] šŸ”„ **Reverse Engineering** - Binary analysis and decompilation -- [ ] 🌐 **Infrastructure Security** - Container, cloud, network security -- [ ] šŸ”’ **Penetration Testing** - Offensive security testing -- [ ] šŸ“‹ **Other** - Please describe - -## Files -Please attach or provide links to your workflow files: - -- [ ] `workflow.py` - Main Temporal flow implementation -- [ ] `Dockerfile` - Container definition -- [ ] `metadata.yaml` - Workflow metadata -- [ ] Test files or examples -- [ ] Documentation - -## Testing -How did you test this workflow? Please describe: - -- **Test targets used**: (e.g., vulnerable_app, custom test cases) -- **Expected outputs**: (e.g., SARIF format, specific vulnerabilities detected) -- **Validation results**: (e.g., X vulnerabilities found, Y false positives) - -## SARIF Compliance -- [ ] My workflow outputs results in SARIF format -- [ ] Results include severity levels and descriptions -- [ ] Code flow information is provided where applicable - -## Security Guidelines -- [ ] This workflow focuses on **defensive security** purposes only -- [ ] I have not included any malicious tools or capabilities -- [ ] All secrets/credentials are parameterized (no hardcoded values) -- [ ] I have followed responsible disclosure practices - -## Registry Integration -Have you updated the workflow registry? - -- [ ] Added import statement to `backend/toolbox/workflows/registry.py` -- [ ] Added registry entry with proper metadata -- [ ] Tested workflow registration and deployment - -## Additional Notes -Anything else the maintainers should know about this workflow? 
- ---- - -šŸš€ **Thank you for contributing to FuzzForge!** Your workflow will help the security community automate and scale their testing efforts. - -šŸ’¬ **Questions?** Join our [Discord Community](https://discord.com/invite/acqv9FVG) to discuss your contribution! \ No newline at end of file diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md deleted file mode 100644 index 04ece70..0000000 --- a/.github/pull_request_template.md +++ /dev/null @@ -1,79 +0,0 @@ -## Description - - - -## Type of Change - - - -- [ ] šŸ› Bug fix (non-breaking change which fixes an issue) -- [ ] ✨ New feature (non-breaking change which adds functionality) -- [ ] šŸ’„ Breaking change (fix or feature that would cause existing functionality to not work as expected) -- [ ] šŸ“ Documentation update -- [ ] šŸ”§ Configuration change -- [ ] ā™»ļø Refactoring (no functional changes) -- [ ] šŸŽØ Style/formatting changes -- [ ] āœ… Test additions or updates - -## Related Issues - - - - -## Changes Made - - - -- -- -- - -## Testing - - - -### Tested Locally - -- [ ] All tests pass (`pytest`, `uv build`, etc.) 
-- [ ] Linting passes (`ruff check`) -- [ ] Code builds successfully - -### Worker Changes (if applicable) - -- [ ] Docker images build successfully (`docker compose build`) -- [ ] Worker containers start correctly -- [ ] Tested with actual workflow execution - -### Documentation - -- [ ] Documentation updated (if needed) -- [ ] README updated (if needed) -- [ ] CHANGELOG.md updated (if user-facing changes) - -## Pre-Merge Checklist - - - -- [ ] My code follows the project's coding standards -- [ ] I have performed a self-review of my code -- [ ] I have commented my code, particularly in hard-to-understand areas -- [ ] I have made corresponding changes to the documentation -- [ ] My changes generate no new warnings -- [ ] I have added tests that prove my fix is effective or that my feature works -- [ ] New and existing unit tests pass locally with my changes -- [ ] Any dependent changes have been merged and published - -### Worker-Specific Checks (if workers/ modified) - -- [ ] All worker files properly tracked by git (not gitignored) -- [ ] Worker validation script passes (`.github/scripts/validate-workers.sh`) -- [ ] Docker images build without errors -- [ ] Worker configuration updated in `docker-compose.yml` (if needed) - -## Screenshots (if applicable) - - - -## Additional Notes - - diff --git a/.github/scripts/validate-workers.sh b/.github/scripts/validate-workers.sh deleted file mode 100755 index 6b2c5f6..0000000 --- a/.github/scripts/validate-workers.sh +++ /dev/null @@ -1,127 +0,0 @@ -#!/bin/bash -# Worker Validation Script -# Ensures all workers defined in docker-compose.yml exist in the repository -# and are properly tracked by git. - -set -e - -echo "šŸ” Validating worker completeness..." - -# Colors for output -RED='\033[0;31m' -GREEN='\033[0;32m' -YELLOW='\033[1;33m' -NC='\033[0m' # No Color - -ERRORS=0 -WARNINGS=0 - -# Extract worker service names from docker-compose.yml -echo "" -echo "šŸ“‹ Checking workers defined in docker-compose.yml..." 
-WORKERS=$(grep -E "^\s+worker-" docker-compose.yml | grep -v "#" | cut -d: -f1 | tr -d ' ' | sort -u) - -if [ -z "$WORKERS" ]; then - echo -e "${RED}āŒ No workers found in docker-compose.yml${NC}" - exit 1 -fi - -echo "Found workers:" -for worker in $WORKERS; do - echo " - $worker" -done - -# Check each worker -echo "" -echo "šŸ”Ž Validating worker files..." -for worker in $WORKERS; do - WORKER_DIR="workers/${worker#worker-}" - - echo "" - echo "Checking $worker ($WORKER_DIR)..." - - # Check if directory exists - if [ ! -d "$WORKER_DIR" ]; then - echo -e "${RED} āŒ Directory not found: $WORKER_DIR${NC}" - ERRORS=$((ERRORS + 1)) - continue - fi - - # Check Dockerfile (single file or multi-platform pattern) - if [ -f "$WORKER_DIR/Dockerfile" ]; then - # Single Dockerfile - if ! git ls-files --error-unmatch "$WORKER_DIR/Dockerfile" &> /dev/null; then - echo -e "${RED} āŒ File not tracked by git: $WORKER_DIR/Dockerfile${NC}" - echo -e "${YELLOW} Check .gitignore patterns!${NC}" - ERRORS=$((ERRORS + 1)) - else - echo -e "${GREEN} āœ“ Dockerfile (tracked)${NC}" - fi - elif compgen -G "$WORKER_DIR/Dockerfile.*" > /dev/null; then - # Multi-platform Dockerfiles (e.g., Dockerfile.amd64, Dockerfile.arm64) - PLATFORM_DOCKERFILES=$(ls "$WORKER_DIR"/Dockerfile.* 2>/dev/null) - DOCKERFILE_FOUND=false - for dockerfile in $PLATFORM_DOCKERFILES; do - if git ls-files --error-unmatch "$dockerfile" &> /dev/null; then - echo -e "${GREEN} āœ“ $(basename "$dockerfile") (tracked)${NC}" - DOCKERFILE_FOUND=true - else - echo -e "${RED} āŒ File not tracked by git: $dockerfile${NC}" - ERRORS=$((ERRORS + 1)) - fi - done - if [ "$DOCKERFILE_FOUND" = false ]; then - echo -e "${RED} āŒ No platform-specific Dockerfiles found${NC}" - ERRORS=$((ERRORS + 1)) - fi - else - echo -e "${RED} āŒ Missing Dockerfile or Dockerfile.* files${NC}" - ERRORS=$((ERRORS + 1)) - fi - - # Check other required files - REQUIRED_FILES=("requirements.txt" "worker.py") - for file in "${REQUIRED_FILES[@]}"; do - 
FILE_PATH="$WORKER_DIR/$file" - - if [ ! -f "$FILE_PATH" ]; then - echo -e "${RED} āŒ Missing file: $FILE_PATH${NC}" - ERRORS=$((ERRORS + 1)) - else - # Check if file is tracked by git - if ! git ls-files --error-unmatch "$FILE_PATH" &> /dev/null; then - echo -e "${RED} āŒ File not tracked by git: $FILE_PATH${NC}" - echo -e "${YELLOW} Check .gitignore patterns!${NC}" - ERRORS=$((ERRORS + 1)) - else - echo -e "${GREEN} āœ“ $file (tracked)${NC}" - fi - fi - done -done - -# Check for any ignored worker files -echo "" -echo "🚫 Checking for gitignored worker files..." -IGNORED_FILES=$(git check-ignore workers/*/* 2>/dev/null || true) -if [ -n "$IGNORED_FILES" ]; then - echo -e "${YELLOW}āš ļø Warning: Some worker files are being ignored:${NC}" - echo "$IGNORED_FILES" | while read -r file; do - echo -e "${YELLOW} - $file${NC}" - done - WARNINGS=$((WARNINGS + 1)) -fi - -# Summary -echo "" -echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" -if [ $ERRORS -eq 0 ] && [ $WARNINGS -eq 0 ]; then - echo -e "${GREEN}āœ… All workers validated successfully!${NC}" - exit 0 -elif [ $ERRORS -eq 0 ]; then - echo -e "${YELLOW}āš ļø Validation passed with $WARNINGS warning(s)${NC}" - exit 0 -else - echo -e "${RED}āŒ Validation failed with $ERRORS error(s) and $WARNINGS warning(s)${NC}" - exit 1 -fi diff --git a/.github/workflows/benchmark.yml b/.github/workflows/benchmark.yml deleted file mode 100644 index a5b2a46..0000000 --- a/.github/workflows/benchmark.yml +++ /dev/null @@ -1,165 +0,0 @@ -name: Benchmarks - -on: - # Disabled automatic runs - benchmarks not ready for CI/CD yet - # schedule: - # - cron: '0 2 * * *' # 2 AM UTC every day - - # Allow manual trigger for testing - workflow_dispatch: - inputs: - compare_with: - description: 'Baseline commit to compare against (optional)' - required: false - default: '' - - # pull_request: - # paths: - # - 'backend/benchmarks/**' - # - 'backend/toolbox/modules/**' - # - '.github/workflows/benchmark.yml' - -jobs: - benchmark: - name: Run 
Benchmarks - runs-on: ubuntu-latest - - steps: - - uses: actions/checkout@v4 - with: - fetch-depth: 0 # Fetch all history for comparison - - - name: Set up Python - uses: actions/setup-python@v5 - with: - python-version: '3.11' - - - name: Install system dependencies - run: | - sudo apt-get update - sudo apt-get install -y build-essential - - - name: Install Python dependencies - working-directory: ./backend - run: | - python -m pip install --upgrade pip - pip install -e ".[dev]" - pip install pytest pytest-asyncio pytest-benchmark pytest-benchmark[histogram] - pip install -e ../sdk # Install SDK for benchmarks - - - name: Run benchmarks - working-directory: ./backend - run: | - pytest benchmarks/ \ - -v \ - --benchmark-only \ - --benchmark-json=benchmark-results.json \ - --benchmark-histogram=benchmark-histogram - - - name: Store benchmark results - uses: actions/upload-artifact@v4 - with: - name: benchmark-results-${{ github.run_number }} - path: | - backend/benchmark-results.json - backend/benchmark-histogram.svg - - - name: Download baseline benchmarks - if: github.event_name == 'pull_request' - uses: dawidd6/action-download-artifact@v3 - continue-on-error: true - with: - workflow: benchmark.yml - branch: ${{ github.base_ref }} - name: benchmark-results-* - path: ./baseline - search_artifacts: true - - - name: Compare with baseline - if: github.event_name == 'pull_request' && hashFiles('baseline/benchmark-results.json') != '' - run: | - python -c " - import json - import sys - - with open('backend/benchmark-results.json') as f: - current = json.load(f) - - with open('baseline/benchmark-results.json') as f: - baseline = json.load(f) - - print('\\n## Benchmark Comparison\\n') - print('| Benchmark | Current | Baseline | Change |') - print('|-----------|---------|----------|--------|') - - regressions = [] - - for bench in current['benchmarks']: - name = bench['name'] - current_time = bench['stats']['mean'] - - # Find matching baseline - baseline_bench = next((b 
for b in baseline['benchmarks'] if b['name'] == name), None) - if baseline_bench: - baseline_time = baseline_bench['stats']['mean'] - change = ((current_time - baseline_time) / baseline_time) * 100 - - print(f'| {name} | {current_time:.4f}s | {baseline_time:.4f}s | {change:+.2f}% |') - - # Flag regressions > 10% - if change > 10: - regressions.append((name, change)) - else: - print(f'| {name} | {current_time:.4f}s | N/A | NEW |') - - if regressions: - print('\\nāš ļø **Performance Regressions Detected:**') - for name, change in regressions: - print(f'- {name}: +{change:.2f}%') - sys.exit(1) - else: - print('\\nāœ… No significant performance regressions detected') - " - - - name: Comment PR with results - if: github.event_name == 'pull_request' - uses: actions/github-script@v7 - with: - script: | - const fs = require('fs'); - const results = JSON.parse(fs.readFileSync('backend/benchmark-results.json', 'utf8')); - - let body = '## Benchmark Results\\n\\n'; - body += '| Category | Benchmark | Mean Time | Std Dev |\\n'; - body += '|----------|-----------|-----------|---------|\\n'; - - for (const bench of results.benchmarks) { - const group = bench.group || 'ungrouped'; - const name = bench.name.split('::').pop(); - const mean = bench.stats.mean.toFixed(4); - const stddev = bench.stats.stddev.toFixed(4); - body += `| ${group} | ${name} | ${mean}s | ${stddev}s |\\n`; - } - - body += '\\nšŸ“Š Full benchmark results available in artifacts.'; - - github.rest.issues.createComment({ - issue_number: context.issue.number, - owner: context.repo.owner, - repo: context.repo.repo, - body: body - }); - - benchmark-summary: - name: Benchmark Summary - runs-on: ubuntu-latest - needs: benchmark - if: always() - steps: - - name: Check results - run: | - if [ "${{ needs.benchmark.result }}" != "success" ]; then - echo "Benchmarks failed or detected regressions" - exit 1 - fi - echo "Benchmarks completed successfully!" 
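The inline comparison step in the deleted benchmark workflow above parses pytest-benchmark's JSON output and flags regressions above 10%. The same logic can be sketched as a standalone function, assuming only the standard `--benchmark-json` schema (a top-level `benchmarks` list where each entry carries `name` and `stats.mean`); file paths and the threshold are parameters, not values from the workflow:

```python
import json

def find_regressions(current_path, baseline_path, threshold_pct=10.0):
    """Compare two pytest-benchmark JSON files.

    Returns a list of (benchmark_name, percent_change) tuples for every
    benchmark whose mean time grew by more than threshold_pct.
    """
    with open(current_path) as f:
        current = json.load(f)
    with open(baseline_path) as f:
        baseline = json.load(f)

    # Index baseline means by benchmark name for O(1) lookups.
    baseline_means = {b["name"]: b["stats"]["mean"] for b in baseline["benchmarks"]}

    regressions = []
    for bench in current["benchmarks"]:
        base_mean = baseline_means.get(bench["name"])
        if base_mean is None:
            continue  # new benchmark: nothing to compare against
        change_pct = (bench["stats"]["mean"] - base_mean) / base_mean * 100
        if change_pct > threshold_pct:
            regressions.append((bench["name"], change_pct))
    return regressions
```

A CI job would call this after downloading the baseline artifact and exit non-zero when the returned list is non-empty, mirroring the `sys.exit(1)` in the workflow's inline script.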
diff --git a/.github/workflows/ci-python.yml b/.github/workflows/ci-python.yml deleted file mode 100644 index 35138f1..0000000 --- a/.github/workflows/ci-python.yml +++ /dev/null @@ -1,70 +0,0 @@ -name: Python CI - -# This is a dumb Ci to ensure that the python client and backend builds correctly -# It could be optimized to run faster, building, testing and linting only changed code -# but for now it is good enough. It runs on every push and PR to any branch. -# It also runs on demand. - -on: - workflow_dispatch: - - push: - paths: - - "ai/**" - - "backend/**" - - "cli/**" - - "sdk/**" - - "src/**" - pull_request: - paths: - - "ai/**" - - "backend/**" - - "cli/**" - - "sdk/**" - - "src/**" - -jobs: - ci: - name: ci - runs-on: ubuntu-latest - - steps: - - uses: actions/checkout@v5 - - - name: Setup uv - uses: astral-sh/setup-uv@v6 - with: - enable-cache: true - - - name: Set up Python - run: uv python install - - # Validate no obvious issues - # Quick hack because CLI returns non-zero exit code when no args are provided - - name: Run base command - run: | - set +e - uv run ff - if [ $? -ne 2 ]; then - echo "Expected exit code 2 from 'uv run ff', got $?" 
- exit 1 - fi - - - name: Build fuzzforge_ai package - run: uv build - - - name: Build ai package - working-directory: ai - run: uv build - - - name: Build cli package - working-directory: cli - run: uv build - - - name: Build sdk package - working-directory: sdk - run: uv build - - - name: Build backend package - working-directory: backend - run: uv build diff --git a/.github/workflows/docs-deploy.yml b/.github/workflows/docs-deploy.yml deleted file mode 100644 index b5f866c..0000000 --- a/.github/workflows/docs-deploy.yml +++ /dev/null @@ -1,57 +0,0 @@ -name: Deploy Docusaurus to GitHub Pages - -on: - workflow_dispatch: - - push: - branches: - - master - paths: - - "docs/**" - -jobs: - build: - name: Build Docusaurus - runs-on: ubuntu-latest - defaults: - run: - working-directory: ./docs - steps: - - uses: actions/checkout@v4 - with: - fetch-depth: 0 - - uses: actions/setup-node@v4 - with: - node-version: 24 - cache: npm - cache-dependency-path: "**/package-lock.json" - - - name: Install dependencies - run: npm ci - - name: Build website - run: npm run build - - - name: Upload Build Artifact - uses: actions/upload-pages-artifact@v3 - with: - path: ./docs/build - - deploy: - name: Deploy to GitHub Pages - needs: build - - # Grant GITHUB_TOKEN the permissions required to make a Pages deployment - permissions: - pages: write # to deploy to Pages - id-token: write # to verify the deployment originates from an appropriate source - - # Deploy to the github-pages environment - environment: - name: github-pages - url: ${{ steps.deployment.outputs.page_url }} - - runs-on: ubuntu-latest - steps: - - name: Deploy to GitHub Pages - id: deployment - uses: actions/deploy-pages@v4 diff --git a/.github/workflows/docs-test-deploy.yml b/.github/workflows/docs-test-deploy.yml deleted file mode 100644 index c42d773..0000000 --- a/.github/workflows/docs-test-deploy.yml +++ /dev/null @@ -1,33 +0,0 @@ -name: Docusaurus test deployment - -on: - workflow_dispatch: - - push: - paths: - - 
"docs/**" - pull_request: - paths: - - "docs/**" - -jobs: - test-deploy: - name: Test deployment - runs-on: ubuntu-latest - defaults: - run: - working-directory: ./docs - steps: - - uses: actions/checkout@v4 - with: - fetch-depth: 0 - - uses: actions/setup-node@v4 - with: - node-version: 24 - cache: npm - cache-dependency-path: "**/package-lock.json" - - - name: Install dependencies - run: npm ci - - name: Test build website - run: npm run build diff --git a/.github/workflows/examples/security-scan.yml b/.github/workflows/examples/security-scan.yml deleted file mode 100644 index 1fd4922..0000000 --- a/.github/workflows/examples/security-scan.yml +++ /dev/null @@ -1,152 +0,0 @@ -# FuzzForge CI/CD Example - Security Scanning -# -# This workflow demonstrates how to integrate FuzzForge into your CI/CD pipeline -# for automated security testing on pull requests and pushes. -# -# Features: -# - Runs entirely in GitHub Actions (no external infrastructure needed) -# - Auto-starts FuzzForge services on-demand -# - Fails builds on error-level SARIF findings -# - Uploads SARIF results to GitHub Security tab -# - Exports findings as artifacts -# -# Prerequisites: -# - Ubuntu runner with Docker support -# - At least 4GB RAM available -# - ~90 seconds startup time - -name: Security Scan Example - -on: - pull_request: - branches: [main, develop] - push: - branches: [main] - -jobs: - security-scan: - name: Security Assessment - runs-on: ubuntu-latest - timeout-minutes: 30 - - steps: - - name: Checkout code - uses: actions/checkout@v4 - - - name: Start FuzzForge - run: | - bash scripts/ci-start.sh - - - name: Set up Python - uses: actions/setup-python@v5 - with: - python-version: '3.11' - - - name: Install FuzzForge CLI - run: | - pip install ./cli - - - name: Initialize FuzzForge - run: | - ff init --api-url http://localhost:8000 --name "GitHub Actions Security Scan" - - - name: Run Security Assessment - run: | - ff workflow run security_assessment . 
\ - --wait \ - --fail-on error \ - --export-sarif results.sarif - - - name: Upload SARIF to GitHub Security - if: always() - uses: github/codeql-action/upload-sarif@v3 - with: - sarif_file: results.sarif - - - name: Upload findings as artifact - if: always() - uses: actions/upload-artifact@v4 - with: - name: security-findings - path: results.sarif - retention-days: 30 - - - name: Stop FuzzForge - if: always() - run: | - bash scripts/ci-stop.sh - - secret-scan: - name: Secret Detection - runs-on: ubuntu-latest - timeout-minutes: 15 - - steps: - - uses: actions/checkout@v4 - - - name: Start FuzzForge - run: bash scripts/ci-start.sh - - - name: Install CLI - run: | - pip install ./cli - - - name: Initialize & Scan - run: | - ff init --api-url http://localhost:8000 --name "Secret Detection" - ff workflow run secret_detection . \ - --wait \ - --fail-on all \ - --export-sarif secrets.sarif - - - name: Upload results - if: always() - uses: actions/upload-artifact@v4 - with: - name: secret-scan-results - path: secrets.sarif - retention-days: 30 - - - name: Cleanup - if: always() - run: bash scripts/ci-stop.sh - - # Example: Nightly fuzzing campaign (long-running) - nightly-fuzzing: - name: Nightly Fuzzing - runs-on: ubuntu-latest - timeout-minutes: 120 - # Only run on schedule - if: github.event_name == 'schedule' - - steps: - - uses: actions/checkout@v4 - - - name: Start FuzzForge - run: bash scripts/ci-start.sh - - - name: Install CLI - run: pip install ./cli - - - name: Run Fuzzing Campaign - run: | - ff init --api-url http://localhost:8000 - ff workflow run atheris_fuzzing . 
\ - max_iterations=100000000 \ - timeout_seconds=7200 \ - --wait \ - --export-sarif fuzzing-results.sarif - # Don't fail on fuzzing findings, just report - continue-on-error: true - - - name: Upload fuzzing results - if: always() - uses: actions/upload-artifact@v4 - with: - name: fuzzing-results - path: fuzzing-results.sarif - retention-days: 90 - - - name: Cleanup - if: always() - run: bash scripts/ci-stop.sh diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml deleted file mode 100644 index 9f79b46..0000000 --- a/.github/workflows/test.yml +++ /dev/null @@ -1,248 +0,0 @@ -name: Tests - -on: - push: - branches: [ main, master, dev, develop, feature/** ] - pull_request: - branches: [ main, master, dev, develop ] - -jobs: - validate-workers: - name: Validate Workers - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v4 - - - name: Run worker validation - run: | - chmod +x .github/scripts/validate-workers.sh - .github/scripts/validate-workers.sh - - build-workers: - name: Build Worker Docker Images - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v4 - with: - fetch-depth: 0 # Fetch all history for proper diff - - - name: Check which workers were modified - id: check-workers - run: | - if [ "${{ github.event_name }}" == "pull_request" ]; then - # For PRs, check changed files - CHANGED_FILES=$(git diff --name-only origin/${{ github.base_ref }}...HEAD) - echo "Changed files:" - echo "$CHANGED_FILES" - else - # For direct pushes, check last commit - CHANGED_FILES=$(git diff --name-only HEAD~1 HEAD) - fi - - # Check if docker-compose.yml changed (build all workers) - if echo "$CHANGED_FILES" | grep -q "^docker-compose.yml"; then - echo "workers_to_build=worker-python worker-secrets worker-rust worker-android worker-ossfuzz" >> $GITHUB_OUTPUT - echo "workers_modified=true" >> $GITHUB_OUTPUT - echo "āœ… docker-compose.yml modified - building all workers" - exit 0 - fi - - # Detect which specific workers changed - WORKERS_TO_BUILD="" - - 
if echo "$CHANGED_FILES" | grep -q "^workers/python/"; then - WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-python" - echo "āœ… Python worker modified" - fi - - if echo "$CHANGED_FILES" | grep -q "^workers/secrets/"; then - WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-secrets" - echo "āœ… Secrets worker modified" - fi - - if echo "$CHANGED_FILES" | grep -q "^workers/rust/"; then - WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-rust" - echo "āœ… Rust worker modified" - fi - - if echo "$CHANGED_FILES" | grep -q "^workers/android/"; then - WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-android" - echo "āœ… Android worker modified" - fi - - if echo "$CHANGED_FILES" | grep -q "^workers/ossfuzz/"; then - WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-ossfuzz" - echo "āœ… OSS-Fuzz worker modified" - fi - - if [ -z "$WORKERS_TO_BUILD" ]; then - echo "workers_modified=false" >> $GITHUB_OUTPUT - echo "ā­ļø No worker changes detected - skipping build" - else - echo "workers_to_build=$WORKERS_TO_BUILD" >> $GITHUB_OUTPUT - echo "workers_modified=true" >> $GITHUB_OUTPUT - echo "Building workers:$WORKERS_TO_BUILD" - fi - - - name: Set up Docker Buildx - if: steps.check-workers.outputs.workers_modified == 'true' - uses: docker/setup-buildx-action@v3 - - - name: Build worker images - if: steps.check-workers.outputs.workers_modified == 'true' - run: | - WORKERS="${{ steps.check-workers.outputs.workers_to_build }}" - echo "Building worker Docker images: $WORKERS" - docker compose build $WORKERS --no-cache - continue-on-error: false - - lint: - name: Lint - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v4 - - - name: Set up Python - uses: actions/setup-python@v5 - with: - python-version: '3.11' - - - name: Install dependencies - run: | - python -m pip install --upgrade pip - pip install ruff mypy - - - name: Run ruff - run: ruff check backend/src backend/toolbox backend/tests backend/benchmarks --output-format=github - - - name: Run mypy (continue on error) - run: mypy backend/src 
backend/toolbox || true - continue-on-error: true - - unit-tests: - name: Unit Tests - runs-on: ubuntu-latest - strategy: - matrix: - python-version: ['3.11', '3.12'] - - steps: - - uses: actions/checkout@v4 - - - name: Set up Python ${{ matrix.python-version }} - uses: actions/setup-python@v5 - with: - python-version: ${{ matrix.python-version }} - - - name: Install system dependencies - run: | - sudo apt-get update - sudo apt-get install -y build-essential - - - name: Install Python dependencies - working-directory: ./backend - run: | - python -m pip install --upgrade pip - pip install -e ".[dev]" - pip install pytest pytest-asyncio pytest-cov pytest-xdist - - - name: Run unit tests - working-directory: ./backend - run: | - pytest tests/unit/ \ - -v \ - --cov=toolbox/modules \ - --cov=src \ - --cov-report=xml \ - --cov-report=term \ - --cov-report=html \ - -n auto - - - name: Upload coverage to Codecov - if: matrix.python-version == '3.11' - uses: codecov/codecov-action@v4 - with: - file: ./backend/coverage.xml - flags: unittests - name: codecov-backend - - - name: Upload coverage HTML - if: matrix.python-version == '3.11' - uses: actions/upload-artifact@v4 - with: - name: coverage-report - path: ./backend/htmlcov/ - - # integration-tests: - # name: Integration Tests - # runs-on: ubuntu-latest - # needs: unit-tests - # - # services: - # postgres: - # image: postgres:15 - # env: - # POSTGRES_USER: postgres - # POSTGRES_PASSWORD: postgres - # POSTGRES_DB: fuzzforge_test - # options: >- - # --health-cmd pg_isready - # --health-interval 10s - # --health-timeout 5s - # --health-retries 5 - # ports: - # - 5432:5432 - # - # steps: - # - uses: actions/checkout@v4 - # - # - name: Set up Python - # uses: actions/setup-python@v5 - # with: - # python-version: '3.11' - # - # - name: Set up Docker Buildx - # uses: docker/setup-buildx-action@v3 - # - # - name: Install Python dependencies - # working-directory: ./backend - # run: | - # python -m pip install --upgrade pip - # pip 
install -e ".[dev]" - # pip install pytest pytest-asyncio - # - # - name: Start services (Temporal, MinIO) - # run: | - # docker-compose -f docker-compose.yml up -d temporal minio - # sleep 30 - # - # - name: Run integration tests - # working-directory: ./backend - # run: | - # pytest tests/integration/ -v --tb=short - # env: - # DATABASE_URL: postgresql://postgres:postgres@localhost:5432/fuzzforge_test - # TEMPORAL_ADDRESS: localhost:7233 - # MINIO_ENDPOINT: localhost:9000 - # - # - name: Shutdown services - # if: always() - # run: docker-compose down - - test-summary: - name: Test Summary - runs-on: ubuntu-latest - needs: [validate-workers, lint, unit-tests] - if: always() - steps: - - name: Check test results - run: | - if [ "${{ needs.validate-workers.result }}" != "success" ]; then - echo "Worker validation failed" - exit 1 - fi - if [ "${{ needs.unit-tests.result }}" != "success" ]; then - echo "Unit tests failed" - exit 1 - fi - echo "All tests passed!" diff --git a/.gitlab-ci.example.yml b/.gitlab-ci.example.yml deleted file mode 100644 index 57301ca..0000000 --- a/.gitlab-ci.example.yml +++ /dev/null @@ -1,121 +0,0 @@ -# FuzzForge CI/CD Example - GitLab CI -# -# This file demonstrates how to integrate FuzzForge into your GitLab CI/CD pipeline. -# Copy this to `.gitlab-ci.yml` in your project root to enable security scanning. 
-# -# Features: -# - Runs entirely in GitLab runners (no external infrastructure) -# - Auto-starts FuzzForge services on-demand -# - Fails pipelines on critical/high severity findings -# - Uploads SARIF reports to GitLab Security Dashboard -# - Exports findings as artifacts -# -# Prerequisites: -# - GitLab Runner with Docker support (docker:dind) -# - At least 4GB RAM available -# - ~90 seconds startup time - -stages: - - security - -variables: - FUZZFORGE_API_URL: "http://localhost:8000" - DOCKER_DRIVER: overlay2 - DOCKER_TLS_CERTDIR: "" - -# Base template for all FuzzForge jobs -.fuzzforge_template: - image: docker:24 - services: - - docker:24-dind - before_script: - # Install dependencies - - apk add --no-cache bash curl python3 py3-pip git - # Start FuzzForge - - bash scripts/ci-start.sh - # Install CLI - - pip3 install ./cli --break-system-packages - # Initialize project - - ff init --api-url $FUZZFORGE_API_URL --name "GitLab CI Security Scan" - after_script: - # Cleanup - - bash scripts/ci-stop.sh || true - -# Security Assessment - Comprehensive code analysis -security:scan: - extends: .fuzzforge_template - stage: security - timeout: 30 minutes - script: - - ff workflow run security_assessment . --wait --fail-on error --export-sarif results.sarif - artifacts: - when: always - reports: - sast: results.sarif - paths: - - results.sarif - expire_in: 30 days - only: - - merge_requests - - main - - develop - -# Secret Detection - Scan for exposed credentials -security:secrets: - extends: .fuzzforge_template - stage: security - timeout: 15 minutes - script: - - ff workflow run secret_detection . --wait --fail-on all --export-sarif secrets.sarif - artifacts: - when: always - paths: - - secrets.sarif - expire_in: 30 days - only: - - merge_requests - - main - -# Nightly Fuzzing - Long-running fuzzing campaign (scheduled only) -security:fuzzing: - extends: .fuzzforge_template - stage: security - timeout: 2 hours - script: - - | - ff workflow run atheris_fuzzing . 
\ - max_iterations=100000000 \ - timeout_seconds=7200 \ - --wait \ - --export-sarif fuzzing-results.sarif - artifacts: - when: always - paths: - - fuzzing-results.sarif - expire_in: 90 days - allow_failure: true # Don't fail pipeline on fuzzing findings - only: - - schedules - -# OSS-Fuzz Campaign (for supported projects) -security:ossfuzz: - extends: .fuzzforge_template - stage: security - timeout: 1 hour - script: - - | - ff workflow run ossfuzz_campaign . \ - project_name=your-project-name \ - campaign_duration_hours=0.5 \ - --wait \ - --export-sarif ossfuzz-results.sarif - artifacts: - when: always - paths: - - ossfuzz-results.sarif - expire_in: 90 days - allow_failure: true - only: - - schedules - # Uncomment and set your project name - # when: manual diff --git a/ARCHITECTURE.md b/ARCHITECTURE.md deleted file mode 100644 index aa265b4..0000000 --- a/ARCHITECTURE.md +++ /dev/null @@ -1,1020 +0,0 @@ -# FuzzForge AI Architecture - -**Last Updated:** 2025-10-14 -**Status:** Production - Temporal with Vertical Workers - ---- - -## Table of Contents - -1. [Executive Summary](#executive-summary) -2. [Current Architecture (Temporal + Vertical Workers)](#current-architecture-temporal--vertical-workers) -3. [Vertical Worker Model](#vertical-worker-model) -4. [Storage Strategy (MinIO)](#storage-strategy-minio) -5. [Dynamic Workflow Loading](#dynamic-workflow-loading) -6. [Architecture Principles](#architecture-principles) -7. [Component Details](#component-details) -8. [Scaling Strategy](#scaling-strategy) -9. [File Lifecycle Management](#file-lifecycle-management) -10. [Future: Nomad Migration](#future-nomad-migration) - ---- - -## Executive Summary - -### The Architecture - -**Temporal orchestration** with a **vertical worker architecture** where each worker is pre-built with domain-specific security toolchains (Android, Rust, Web, iOS, Blockchain, OSS-Fuzz, etc.). Uses **MinIO** for unified S3-compatible storage across dev and production environments. 
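
Task routing in this model is a naming convention: each workflow's `metadata.yaml` declares a `vertical`, and that vertical maps to the Temporal task queue its long-lived worker listens on. A minimal sketch of the mapping (the helper name is an illustrative assumption, not part of the FuzzForge codebase):

```python
def task_queue_for(metadata: dict) -> str:
    """Map workflow metadata to its vertical's Temporal task queue.

    A workflow declaring `vertical: android` is routed to the long-lived
    android worker, which listens on the "android-queue" task queue.
    """
    vertical = metadata.get("vertical")
    if not vertical:
        raise ValueError("workflow metadata must declare a 'vertical'")
    return f"{vertical}-queue"
```

A client starting a workflow would pass this value as the `task_queue` argument, which is how Temporal delivers the task to the matching vertical worker.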
- -### Key Architecture Features - -1. **Vertical Specialization:** Pre-built toolchains (Android: Frida, apktool; Rust: AFL++, cargo-fuzz) -2. **Zero Startup Overhead:** Long-lived workers (no container spawn per workflow) -3. **Dynamic Workflows:** Add workflows without rebuilding images (mount as volume) -4. **Unified Storage:** MinIO works identically in dev and prod -5. **Better Security:** No host filesystem mounts, isolated uploaded targets -6. **Automatic Cleanup:** MinIO lifecycle policies handle file expiration -7. **Scalability:** Clear path from single-host to multi-host to Nomad cluster - ---- - -## Current Architecture (Temporal + Vertical Workers) - -### Infrastructure Overview - -``` -ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” -│ FuzzForge Platform │ -│ │ -│ ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” │ -│ │ Temporal Server │◄────────│ MinIO (S3 Storage) │ │ -│ │ - Workflows │ │ - Uploaded targets │ │ -│ │ - State mgmt │ │ - Results (optional) │ │ -│ │ - Task queues │ │ - Lifecycle policies │ │ -│ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ │ -│ │ │ -│ │ (Task queue routing) │ -│ │ │ -│ ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”“ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” │ -│ │ Vertical Workers (Long-lived) │ │ -│ │ │ │ -│ │ ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”ā”‚ │ -│ │ │ Android │ │ Rust/Native │ │ Web/JS ││ │ -│ │ │ - apktool │ │ - AFL++ │ │ - Node.js ││ │ -│ │ │ - Frida │ │ - cargo-fuzz │ │ - OWASP 
ZAP ││ │ -│ │ │ - jadx │ │ - gdb │ │ - semgrep ││ │ -│ │ │ - MobSF │ │ - valgrind │ │ - eslint ││ │ -│ │ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ā”‚ │ -│ │ │ │ -│ │ ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” │ │ -│ │ │ iOS │ │ Blockchain │ │ │ -│ │ │ - class-dump │ │ - mythril │ │ │ -│ │ │ - Clutch │ │ - slither │ │ │ -│ │ │ - Frida │ │ - echidna │ │ │ -│ │ │ - Hopper │ │ - manticore │ │ │ -│ │ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ │ │ -│ │ │ │ -│ │ All workers have: │ │ -│ │ - /app/toolbox mounted (workflow code) │ │ -│ │ - /cache for MinIO downloads │ │ -│ │ - Dynamic workflow discovery at startup │ │ -│ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ │ -ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ -``` - -### Service Breakdown - -```yaml -services: - temporal: # Workflow orchestration + embedded SQLite (dev) or Postgres (prod) - minio: # S3-compatible storage for targets and results - minio-setup: # One-time: create buckets, set policies - worker-android: # Android security vertical (scales independently) - worker-rust: # Rust/native security vertical - worker-web: # Web security vertical - # Additional verticals as needed: ios, blockchain, go, etc. 
- -Total: 6+ services (scales with verticals) -``` - -### Resource Usage - -``` -Temporal: ~500MB (includes embedded DB in dev) -MinIO: ~256MB (with CI_CD=true flag) -MinIO-setup: ~20MB (ephemeral, exits after setup) -Worker-android: ~512MB (varies by toolchain) -Worker-rust: ~512MB -Worker-web: ~512MB -───────────────────────── -Total: ~2.3GB - -Note: +450MB overhead is worth it for: - - Unified dev/prod architecture - - No host filesystem mounts (security) - - Auto cleanup (lifecycle policies) - - Multi-host ready -``` - ---- - -## Vertical Worker Model - -### Concept - -Instead of generic workers that spawn workflow-specific containers, we have **specialized long-lived workers** pre-built with complete security toolchains for specific domains. - -### Vertical Taxonomy - -| Vertical | Tools Included | Use Cases | Workflows | -|----------|---------------|-----------|-----------| -| **android** | apktool, jadx, Frida, MobSF, androguard | APK analysis, reverse engineering, dynamic instrumentation | APK security assessment, malware analysis, repackaging detection | -| **rust** | AFL++, cargo-fuzz, gdb, valgrind, AddressSanitizer | Native fuzzing, memory safety | Cargo fuzzing campaigns, binary analysis | -| **web** | Node.js, OWASP ZAP, Burp Suite, semgrep, eslint | Web app security testing | XSS detection, SQL injection scanning, API fuzzing | -| **ios** | class-dump, Clutch, Frida, Hopper, ios-deploy | iOS app analysis | IPA analysis, jailbreak detection, runtime hooking | -| **blockchain** | mythril, slither, echidna, manticore, solc | Smart contract security | Solidity static analysis, property-based fuzzing | -| **go** | go-fuzz, staticcheck, gosec, dlv | Go security testing | Go fuzzing, static analysis | - -### Vertical Worker Architecture - -```dockerfile -# Example: workers/android/Dockerfile -FROM python:3.11-slim - -# Install Android SDK and tools -RUN apt-get update && apt-get install -y \ - openjdk-17-jdk \ - android-sdk \ - && rm -rf 
/var/lib/apt/lists/* - -# Install security tools -RUN pip install --no-cache-dir \ - apktool \ - androguard \ - frida-tools \ - pyaxmlparser - -# Install MobSF dependencies -RUN apt-get update && apt-get install -y \ - libxml2-dev \ - libxslt-dev \ - && rm -rf /var/lib/apt/lists/* - -# Install Temporal Python SDK -RUN pip install --no-cache-dir \ - temporalio \ - boto3 \ - pydantic - -# Copy worker entrypoint -COPY worker.py /app/ -WORKDIR /app - -# Worker will mount /app/toolbox and discover workflows at runtime -CMD ["python", "worker.py"] -``` - -### Dynamic Workflow Discovery - -```python -# workers/android/worker.py -import asyncio -import inspect -import logging -import os -from pathlib import Path - -import yaml -from temporalio.client import Client -from temporalio.worker import Worker - -logger = logging.getLogger(__name__) - -async def discover_workflows(vertical: str): - """Discover workflows for this vertical from mounted toolbox""" - workflows = [] - toolbox = Path("/app/toolbox/workflows") - - for workflow_dir in toolbox.iterdir(): - if not workflow_dir.is_dir(): - continue - - metadata_file = workflow_dir / "metadata.yaml" - if not metadata_file.exists(): - continue - - # Parse metadata - with open(metadata_file) as f: - metadata = yaml.safe_load(f) - - # Check if workflow is for this vertical - if metadata.get("vertical") == vertical: - # Dynamically import workflow module - workflow_module = f"toolbox.workflows.{workflow_dir.name}.workflow" - module = __import__(workflow_module, fromlist=['']) - - # Find @workflow.defn decorated classes - for name, obj in inspect.getmembers(module, inspect.isclass): - if hasattr(obj, '__temporal_workflow_definition'): - workflows.append(obj) - logger.info(f"Discovered workflow: {name} for vertical {vertical}") - - return workflows - -async def main(): - vertical = os.getenv("WORKER_VERTICAL", "android") - temporal_address = os.getenv("TEMPORAL_ADDRESS", "localhost:7233") - - # Discover workflows for this vertical - workflows = await discover_workflows(vertical) - - if not workflows: - logger.warning(f"No
workflows found for vertical: {vertical}") - return - - # Connect to Temporal - client = await Client.connect(temporal_address) - - # Start worker with discovered workflows - worker = Worker( - client, - task_queue=f"{vertical}-queue", - workflows=workflows, - activities=[ - get_target_activity, - cleanup_cache_activity, - # ... vertical-specific activities - ] - ) - - logger.info(f"Worker started for vertical '{vertical}' with {len(workflows)} workflows") - await worker.run() - -if __name__ == "__main__": - asyncio.run(main()) -``` - -### Workflow Declaration - -```yaml -# toolbox/workflows/android_apk_analysis/metadata.yaml -name: android_apk_analysis -version: 1.0.0 -description: "Deep analysis of Android APK files" -vertical: android # ← Routes to worker-android -dependencies: - python: - - androguard==4.1.0 # Additional Python deps (optional) - - pyaxmlparser==0.3.28 -``` - -```python -# toolbox/workflows/android_apk_analysis/workflow.py -from temporalio import workflow -from datetime import timedelta - -@workflow.defn -class AndroidApkAnalysisWorkflow: - """ - Comprehensive Android APK security analysis - Runs in worker-android with apktool, Frida, jadx pre-installed - """ - - @workflow.run - async def run(self, target_id: str) -> dict: - # Activity 1: Download target from MinIO - apk_path = await workflow.execute_activity( - "get_target", - target_id, - start_to_close_timeout=timedelta(minutes=5) - ) - - # Activity 2: Extract manifest (uses apktool - pre-installed) - manifest = await workflow.execute_activity( - "extract_manifest", - apk_path, - start_to_close_timeout=timedelta(minutes=5) - ) - - # Activity 3: Static analysis (uses jadx - pre-installed) - static_results = await workflow.execute_activity( - "static_analysis", - apk_path, - start_to_close_timeout=timedelta(minutes=30) - ) - - # Activity 4: Frida instrumentation (uses Frida - pre-installed) - dynamic_results = await workflow.execute_activity( - "dynamic_analysis", - apk_path, - 
start_to_close_timeout=timedelta(hours=2) - ) - - # Activity 5: Cleanup local cache - await workflow.execute_activity( - "cleanup_cache", - apk_path, - start_to_close_timeout=timedelta(minutes=1) - ) - - return { - "manifest": manifest, - "static": static_results, - "dynamic": dynamic_results - } -``` - ---- - -## Storage Strategy (MinIO) - -### Why MinIO? - -**Goal:** Unified storage that works identically in dev and production, eliminating environment-specific code. - -**Alternatives considered:** -1. āŒ **LocalVolumeStorage** (mount /Users, /home): Security risk, platform-specific, doesn't scale -2. āŒ **Different storage per environment**: Complex, error-prone, dual maintenance -3. āœ… **MinIO everywhere**: Lightweight (+256MB), S3-compatible, multi-host ready - -### MinIO Configuration - -```yaml -# docker-compose.yaml -services: - minio: - image: minio/minio:latest - command: server /data --console-address ":9001" - ports: - - "9000:9000" # S3 API - - "9001:9001" # Web Console (http://localhost:9001) - volumes: - - minio_data:/data - environment: - MINIO_ROOT_USER: fuzzforge - MINIO_ROOT_PASSWORD: fuzzforge123 - MINIO_CI_CD: "true" # Reduces memory to 256MB (from 1GB) - healthcheck: - test: ["CMD", "mc", "ready", "local"] - interval: 5s - timeout: 5s - retries: 5 - - # One-time setup: create buckets and set lifecycle policies - minio-setup: - image: minio/mc:latest - depends_on: - minio: - condition: service_healthy - entrypoint: > - /bin/sh -c " - mc alias set fuzzforge http://minio:9000 fuzzforge fuzzforge123; - mc mb fuzzforge/targets --ignore-existing; - mc mb fuzzforge/results --ignore-existing; - mc ilm add fuzzforge/targets --expiry-days 7; - mc anonymous set download fuzzforge/results; - " -``` - -### Storage Backend Implementation - -```python -# backend/src/storage/s3_cached.py -import logging -import os -from datetime import datetime, timedelta -from pathlib import Path -from uuid import uuid4 - -import boto3 - -logger = logging.getLogger(__name__) - -class S3CachedStorage: - """ - 
S3-compatible storage with local caching. - Works with MinIO (dev/prod) or AWS S3 (cloud). - """ - - def __init__(self): - self.s3 = boto3.client( - 's3', - endpoint_url=os.getenv('S3_ENDPOINT', 'http://minio:9000'), - aws_access_key_id=os.getenv('S3_ACCESS_KEY', 'fuzzforge'), - aws_secret_access_key=os.getenv('S3_SECRET_KEY', 'fuzzforge123') - ) - self.bucket = os.getenv('S3_BUCKET', 'targets') - self.cache_dir = Path(os.getenv('CACHE_DIR', '/cache')) - self.cache_max_size = self._parse_size(os.getenv('CACHE_MAX_SIZE', '10GB')) - self.cache_ttl = self._parse_duration(os.getenv('CACHE_TTL', '7d')) - - async def upload_target(self, file_path: Path, user_id: str) -> str: - """Upload target to MinIO and return target ID""" - target_id = str(uuid4()) - - # Upload with metadata for lifecycle management - self.s3.upload_file( - str(file_path), - self.bucket, - f'{target_id}/target', - ExtraArgs={ - 'Metadata': { - 'user_id': user_id, - 'uploaded_at': datetime.now().isoformat(), - 'filename': file_path.name - } - } - ) - - logger.info(f"Uploaded target {target_id} ({file_path.name})") - return target_id - - async def get_target(self, target_id: str) -> Path: - """ - Get target from cache or download from MinIO. - Returns local path to cached file. 
- """ - cache_path = self.cache_dir / target_id - cached_file = cache_path / "target" - - # Check cache - if cached_file.exists(): - # Update access time for LRU - cached_file.touch() - logger.info(f"Cache hit: {target_id}") - return cached_file - - # Cache miss - download from MinIO - logger.info(f"Cache miss: {target_id}, downloading from MinIO") - cache_path.mkdir(parents=True, exist_ok=True) - - self.s3.download_file( - self.bucket, - f'{target_id}/target', - str(cached_file) - ) - - return cached_file - - async def cleanup_cache(self): - """LRU eviction when cache exceeds max size""" - cache_files = [] - total_size = 0 - - for cache_file in self.cache_dir.rglob('*'): - if cache_file.is_file(): - stat = cache_file.stat() - cache_files.append({ - 'path': cache_file, - 'size': stat.st_size, - 'atime': stat.st_atime - }) - total_size += stat.st_size - - if total_size > self.cache_max_size: - # Sort by access time (oldest first) - cache_files.sort(key=lambda x: x['atime']) - - for file_info in cache_files: - if total_size <= self.cache_max_size: - break - - file_info['path'].unlink() - total_size -= file_info['size'] - logger.info(f"Evicted from cache: {file_info['path']}") -``` - -### Performance Characteristics - -| Operation | Direct Filesystem | MinIO (Local) | Impact | -|-----------|------------------|---------------|---------| -| Small file (<1MB) | ~1ms | ~5-10ms | Negligible for security workflows | -| Large file (>100MB) | ~200ms | ~220ms | ~10% overhead | -| Workflow duration | 5-60 minutes | 5-60 minutes + 2-4s upload | <1% overhead | -| Subsequent scans | Same | **Cached (0ms)** | Better than filesystem | - -**Verdict:** 2-4 second upload overhead is **negligible** for workflows that run 5-60 minutes. - -### Workspace Isolation - -To support concurrent workflows safely, FuzzForge implements workspace isolation with three modes: - -**1. 
Isolated Mode (Default)** -```python -# Each workflow run gets its own workspace -cache_path = f"/cache/{target_id}/{run_id}/workspace/" -``` - -- **Use for:** Fuzzing workflows that modify files (corpus, crashes) -- **Advantages:** Safe for concurrent execution, no file conflicts -- **Cleanup:** Entire run directory removed after workflow completes - -**2. Shared Mode** -```python -# All runs share the same workspace -cache_path = f"/cache/{target_id}/workspace/" -``` - -- **Use for:** Read-only analysis workflows (security scanning, static analysis) -- **Advantages:** Efficient (downloads once), lower bandwidth/storage -- **Cleanup:** No cleanup (workspace persists for reuse) - -**3. Copy-on-Write Mode** -```python -# Download once to shared location, copy per run -shared_cache = f"/cache/{target_id}/shared/workspace/" -run_cache = f"/cache/{target_id}/{run_id}/workspace/" -``` - -- **Use for:** Large targets that need isolation -- **Advantages:** Download once, isolated per-run execution -- **Cleanup:** Run-specific copies removed, shared cache persists - -**Configuration:** - -Workflows specify isolation mode in `metadata.yaml`: -```yaml -name: atheris_fuzzing -workspace_isolation: "isolated" # or "shared" or "copy-on-write" -``` - -Workers automatically handle download, extraction, and cleanup based on the mode. - ---- - -## Dynamic Workflow Loading - -### The Problem - -**Requirement:** Workflows must be dynamically added without modifying the codebase or rebuilding Docker images. 
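
The three workspace isolation modes described above all reduce to a cache-path convention; a minimal sketch of that resolution (the helper name and exact path layout are assumptions, not the actual FuzzForge implementation):

```python
from pathlib import Path

def resolve_workspace(cache_root: str, target_id: str, run_id: str, mode: str) -> Path:
    """Resolve the workspace directory a workflow run should use."""
    base = Path(cache_root) / target_id
    if mode == "isolated":
        # Private per-run workspace; removed when the workflow completes.
        return base / run_id / "workspace"
    if mode == "shared":
        # One workspace reused by every run of this target; persists for reuse.
        return base / "workspace"
    if mode == "copy-on-write":
        # Per-run copy, seeded from the shared download at base / "shared" / "workspace".
        return base / run_id / "workspace"
    raise ValueError(f"unknown workspace_isolation mode: {mode}")
```

A worker would read `workspace_isolation` from the workflow's `metadata.yaml` and then download, reuse, or copy the target into the resolved path accordingly.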
- -**Traditional approach (doesn't work):** -- Build Docker image per workflow with dependencies -- Push to registry -- Worker pulls and spawns container -- āŒ Requires rebuild for every workflow change -- āŒ Registry overhead -- āŒ Slow (5-10s startup per workflow) - -**Our approach (works):** -- Workflow code mounted as volume into long-lived workers -- Workers scan `/app/toolbox/workflows` at startup -- Dynamically import and register workflows matching vertical -- āœ… No rebuild needed -- āœ… No registry -- āœ… Zero startup overhead - -### Implementation - -**1. Docker Compose volume mount:** -```yaml -worker-android: - volumes: - - ./toolbox:/app/toolbox:ro # Mount workflow code as read-only -``` - -**2. Worker discovers workflows:** -```python -# Runs at worker startup -for workflow_dir in Path("/app/toolbox/workflows").iterdir(): - metadata = yaml.safe_load((workflow_dir / "metadata.yaml").read_text()) - - # Only load workflows for this vertical - if metadata.get("vertical") == os.getenv("WORKER_VERTICAL"): - # Dynamically import workflow.py - module = importlib.import_module(f"toolbox.workflows.{workflow_dir.name}.workflow") - - # Find @workflow.defn classes - workflows.append(module.MyWorkflowClass) -``` - -**3. Developer adds workflow:** -```bash -# 1. Create workflow directory -mkdir -p toolbox/workflows/my_new_workflow - -# 2. 
Write metadata -cat > toolbox/workflows/my_new_workflow/metadata.yaml < toolbox/workflows/my_new_workflow/workflow.py <80%, memory >90%) - -### Phase 2: Multi-Host (6-18 months) - -**Configuration:** -``` -Host 1: Temporal + MinIO -Host 2: 5Ɨ worker-android -Host 3: 5Ɨ worker-rust -Host 4: 5Ɨ worker-web -``` - -**Changes required:** -```yaml -# Point all workers to central Temporal/MinIO -environment: - TEMPORAL_ADDRESS: temporal.prod.fuzzforge.ai:7233 - S3_ENDPOINT: http://minio.prod.fuzzforge.ai:9000 -``` - -**Capacity:** 3Ɨ Phase 1 = 45-150 concurrent workflows - -### Phase 3: Nomad Cluster (18+ months, if needed) - -**Trigger Points:** -- Managing 10+ hosts manually -- Need auto-scaling based on queue depth -- Need multi-tenancy (customer namespaces) - -**Migration effort:** 1-2 weeks (workers unchanged, just change deployment method) - ---- - -## File Lifecycle Management - -### Automatic Cleanup via MinIO Lifecycle Policies - -```bash -# Set on bucket (done by minio-setup service) -mc ilm add fuzzforge/targets --expiry-days 7 - -# MinIO automatically deletes objects older than 7 days -``` - -### Local Cache Eviction (LRU) - -```python -# Worker background task (runs every 30 minutes) -async def cleanup_cache_task(): - while True: - await storage.cleanup_cache() # LRU eviction - await asyncio.sleep(1800) # 30 minutes -``` - -### Manual Deletion (API) - -```python -@app.delete("/api/targets/{target_id}") -async def delete_target(target_id: str): - """Allow users to manually delete uploaded targets""" - s3.delete_object(Bucket='targets', Key=f'{target_id}/target') - return {"status": "deleted"} -``` - -### Retention Policies - -| Object Type | Default TTL | Configurable | Notes | -|-------------|-------------|--------------|-------| -| Uploaded targets | 7 days | Yes (env var) | Auto-deleted by MinIO | -| Worker cache | LRU (10GB limit) | Yes | Evicted when cache full | -| Workflow results | 30 days (optional) | Yes | Can store in MinIO | - ---- - -## Future: 
Nomad Migration - -### When to Add Nomad? - -**Trigger points:** -- Managing 10+ hosts manually becomes painful -- Need auto-scaling based on queue depth -- Need multi-tenancy with resource quotas -- Want sophisticated scheduling (bin-packing, affinity rules) - -**Estimated timing:** 18-24 months - -### Migration Complexity - -**Effort:** 1-2 weeks - -**What changes:** -- Deployment method (docker-compose → Nomad jobs) -- Orchestration layer (manual → Nomad scheduler) - -**What stays the same:** -- Worker Docker images (unchanged) -- Workflows (unchanged) -- Temporal (unchanged) -- MinIO (unchanged) -- Storage backend (unchanged) - -### Nomad Job Example - -```hcl -job "fuzzforge-worker-android" { - datacenters = ["dc1"] - type = "service" - - group "workers" { - count = 5 # Auto-scales based on queue depth - - scaling { - min = 1 - max = 20 - - policy { - evaluation_interval = "30s" - - check "queue_depth" { - source = "prometheus" - query = "temporal_queue_depth{queue='android-queue'}" - - strategy "target-value" { - target = 10 # Scale up if >10 tasks queued - } - } - } - } - - task "worker" { - driver = "docker" - - config { - image = "fuzzforge/worker-android:latest" - - volumes = [ - "/opt/fuzzforge/toolbox:/app/toolbox:ro" - ] - } - - env { - TEMPORAL_ADDRESS = "temporal.service.consul:7233" - WORKER_VERTICAL = "android" - S3_ENDPOINT = "http://minio.service.consul:9000" - } - - resources { - cpu = 500 # MHz - memory = 512 # MB - } - } - } -} -``` - -### Licensing Considerations - -**Nomad BSL 1.1 Risk:** Depends on FuzzForge positioning - -**Safe positioning (LOW risk):** -- āœ… Market as "Android/Rust/Web security verticals" -- āœ… Emphasize domain expertise, not orchestration -- āœ… Nomad is internal infrastructure -- āœ… Customers buy security services, not Nomad - -**Risky positioning (MEDIUM risk):** -- āš ļø Market as "generic workflow orchestration platform" -- āš ļø Emphasize flexibility over domain expertise -- āš ļø Could be seen as competing 
with HashiCorp - -**Mitigation:** -- Keep marketing focused on security verticals -- Get legal review before Phase 3 -- Alternative: Use Kubernetes (Apache 2.0, zero risk) - ---- - -## Migration Timeline - -### Phase 1: Foundation (Weeks 1-2) -- āœ… Create feature branch -- Set up Temporal docker-compose -- Add MinIO service -- Implement S3CachedStorage backend -- Create cleanup/lifecycle logic - -### Phase 2: First Vertical Worker (Weeks 3-4) -- Design worker base template -- Create worker-rust with AFL++, cargo-fuzz -- Implement dynamic workflow discovery -- Test workflow loading from mounted volume - -### Phase 3: Migrate Workflows (Weeks 5-6) -- Port security_assessment workflow to Temporal -- Update workflow metadata format -- Test end-to-end flow (upload → analyze → results) -- Verify cleanup/lifecycle - -### Phase 4: Additional Verticals (Weeks 7-8) -- Create worker-android, worker-web -- Document vertical development guide -- Update CLI for MinIO uploads -- Update backend API for Temporal - -### Phase 5: Testing & Docs (Weeks 9-10) -- Comprehensive testing -- Update README -- Migration guide for existing users -- Troubleshooting documentation - -**Total: 10 weeks, rollback possible at any phase** - ---- - -## Decision Log - -### 2025-09-30: Architecture Implementation -- **Decision:** Temporal with Vertical Workers -- **Rationale:** Simpler infrastructure, better reliability, clear scaling path - -### 2025-10-01: Vertical Worker Model -- **Decision:** Use long-lived vertical workers instead of ephemeral per-workflow containers -- **Rationale:** - - Zero startup overhead (5s saved per workflow) - - Pre-built toolchains (Android, Rust, Web, etc.) 
- - Dynamic workflows via mounted volumes (no image rebuild) - - Better marketing (sell verticals, not orchestration) - - Safer Nomad BSL positioning - -### 2025-10-01: Unified MinIO Storage -- **Decision:** Use MinIO for both dev and production (no LocalVolumeStorage) -- **Rationale:** - - Unified codebase (no environment-specific code) - - Lightweight (256MB with CI_CD=true) - - Negligible overhead (2-4s for 250MB upload) - - Better security (no host filesystem mounts) - - Multi-host ready - - Automatic cleanup via lifecycle policies - -### 2025-10-01: Dynamic Workflow Loading -- **Decision:** Mount workflow code as volume, discover at runtime -- **Rationale:** - - Add workflows without rebuilding images - - No registry overhead - - Supports user-contributed workflows - - Faster iteration for developers - ---- - -**Document Version:** 2.0 -**Last Updated:** 2025-10-01 -**Next Review:** After Phase 1 implementation (2 weeks) diff --git a/CHANGELOG.md b/CHANGELOG.md deleted file mode 100644 index c852469..0000000 --- a/CHANGELOG.md +++ /dev/null @@ -1,200 +0,0 @@ -# Changelog - -All notable changes to FuzzForge will be documented in this file. - -The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), -and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). 
- -## [Unreleased] - -### šŸ“ Documentation -- Added comprehensive worker startup documentation across all guides -- Added workflow-to-worker mapping tables in README, troubleshooting guide, getting started guide, and docker setup guide -- Fixed broken documentation links in CLI reference -- Added WEEK_SUMMARY*.md pattern to .gitignore - ---- - -## [0.7.3] - 2025-10-30 - -### šŸŽÆ Major Features - -#### Android Static Analysis Workflow -- **Added comprehensive Android security testing workflow** (`android_static_analysis`): - - Jadx decompiler for APK → Java source code decompilation - - OpenGrep/Semgrep static analysis with custom Android security rules - - MobSF integration for comprehensive mobile security scanning - - SARIF report generation with unified findings format - - Test results: Successfully decompiled 4,145 Java files, found 8 security vulnerabilities - - Full workflow completes in ~1.5 minutes - -#### Platform-Aware Worker Architecture -- **ARM64 (Apple Silicon) support**: - - Automatic platform detection (ARM64 vs x86_64) in CLI using `platform.machine()` - - Worker metadata convention (`metadata.yaml`) for platform-specific capabilities - - Multi-Dockerfile support: `Dockerfile.amd64` (full toolchain) and `Dockerfile.arm64` (optimized) - - Conditional module imports for graceful degradation (MobSF skips on ARM64) - - Backend path resolution via `FUZZFORGE_HOST_ROOT` for CLI worker management -- **Worker selection logic**: - - CLI automatically selects appropriate Dockerfile based on detected platform - - Multi-strategy path resolution (API → .fuzzforge marker → environment variable) - - Platform-specific tool availability documented in metadata - -#### Python SAST Workflow -- **Added Python Static Application Security Testing workflow** (`python_sast`): - - Bandit for Python security linting (SAST) - - MyPy for static type checking - - Safety for dependency vulnerability scanning - - Integrated SARIF reporter for unified findings format - - 
Auto-start Python worker on-demand - -### ✨ Enhancements - -#### CI/CD Improvements -- Added automated worker validation in CI pipeline -- Docker build checks for all workers before merge -- Worker file change detection for selective builds -- Optimized Docker layer caching for faster builds -- Dev branch testing workflow triggers - -#### CLI Improvements -- Fixed live monitoring bug in `ff monitor live` command -- Enhanced `ff findings` command with better table formatting -- Improved `ff monitor` with clearer status displays -- Auto-start workers on-demand when workflows require them -- Better error messages with actionable manual start commands - -#### Worker Management -- Standardized worker service names (`worker-python`, `worker-android`, etc.) -- Added missing `worker-secrets` to repository -- Improved worker naming consistency across codebase - -#### LiteLLM Integration -- Centralized LLM provider management with proxy -- Governance and request/response routing -- OTEL collector integration for observability -- Environment-based configurable timeouts -- Optional `.env.litellm` configuration - -### šŸ› Bug Fixes - -- Fixed MobSF API key generation from secret file (SHA256 hash) -- Corrected Temporal activity names (decompile_with_jadx, scan_with_opengrep, scan_with_mobsf) -- Resolved linter errors across codebase -- Fixed unused import issues to pass CI checks -- Removed deprecated workflow parameters -- Docker Compose version compatibility fixes - -### šŸ”§ Technical Changes - -- Conditional import pattern for optional dependencies (MobSF on ARM64) -- Multi-platform Dockerfile architecture -- Worker metadata convention for capability declaration -- Improved CI worker build optimization -- Enhanced storage activity error handling - -### šŸ“ Test Projects - -- Added `test_projects/android_test/` with BeetleBug.apk and shopnest.apk -- Android workflow validation with real APK samples -- ARM64 platform testing and validation - ---- - -## [0.7.2] - 2025-10-22 
- -### šŸ› Bug Fixes -- Fixed worker naming inconsistencies across codebase -- Improved monitor command consolidation and usability -- Enhanced findings CLI with better formatting and display -- Added missing secrets worker to repository - -### šŸ“ Documentation -- Added benchmark results files to git for secret detection workflows - -**Note:** v0.7.1 was re-tagged as v0.7.2 (both point to the same commit) - ---- - -## [0.7.0] - 2025-10-16 - -### šŸŽÆ Major Features - -#### Secret Detection Workflows -- **Added three secret detection workflows**: - - `gitleaks_detection` - Pattern-based secret scanning - - `trufflehog_detection` - Entropy-based secret detection with verification - - `llm_secret_detection` - AI-powered semantic secret detection using LLMs -- **Comprehensive benchmarking infrastructure**: - - 32-secret ground truth dataset for precision/recall testing - - Difficulty levels: 12 Easy, 10 Medium, 10 Hard secrets - - SARIF-formatted output for all workflows - - Achieved 100% recall with LLM-based detection on benchmark dataset - -#### AI Module & Agent Integration -- Added A2A (Agent-to-Agent) wrapper for multi-agent orchestration -- Task agent implementation with Google ADK -- LLM analysis workflow for code security analysis -- Reactivated AI agent command (`ff ai agent`) - -#### Temporal Migration Complete -- Fully migrated from Prefect to Temporal for workflow orchestration -- MinIO storage for unified file handling (replaces volume mounts) -- Vertical workers with pre-built security toolchains -- Improved worker lifecycle management - -#### CI/CD Integration -- Ephemeral deployment model for testing -- Automated workflow validation in CI pipeline - -### ✨ Enhancements - -#### Documentation -- Updated README for Temporal + MinIO architecture -- Added `.env` configuration guide for AI agent API keys -- Fixed worker startup instructions with correct service names -- Updated docker compose commands to modern syntax - -#### Worker Management -- Added 
`worker_service` field to API responses for correct service naming -- Improved error messages with actionable manual start commands -- Fixed default parameters for gitleaks (now uses `no_git=True` by default) - -### šŸ› Bug Fixes - -- Fixed default parameters from metadata.yaml not being applied to workflows when no parameters provided -- Fixed gitleaks workflow failing on uploaded directories without Git history -- Fixed worker startup command suggestions (now uses `docker compose up -d` with service names) -- Fixed missing `cognify_text` method in CogneeProjectIntegration - -### šŸ”§ Technical Changes - -- Updated all package versions to 0.7.0 -- Improved SARIF output formatting for secret detection workflows -- Enhanced benchmark validation with ground truth JSON -- Better integration between CLI and backend for worker management - -### šŸ“ Test Projects - -- Added `secret_detection_benchmark` with 32 documented secrets -- Ground truth JSON for automated precision/recall calculations -- Updated `vulnerable_app` for comprehensive security testing - ---- - -## [0.6.0] - Undocumented - -### Features -- Initial Temporal migration -- Fuzzing workflows (Atheris, Cargo, OSS-Fuzz) -- Security assessment workflow -- Basic CLI commands - -**Note:** No git tag exists for v0.6.0. Release date undocumented. - ---- - -[0.7.3]: https://github.com/FuzzingLabs/fuzzforge_ai/compare/v0.7.2...v0.7.3 -[0.7.2]: https://github.com/FuzzingLabs/fuzzforge_ai/compare/v0.7.0...v0.7.2 -[0.7.0]: https://github.com/FuzzingLabs/fuzzforge_ai/releases/tag/v0.7.0 -[0.6.0]: https://github.com/FuzzingLabs/fuzzforge_ai/tree/v0.6.0 diff --git a/QUICKSTART_TEMPORAL.md b/QUICKSTART_TEMPORAL.md deleted file mode 100644 index 4264037..0000000 --- a/QUICKSTART_TEMPORAL.md +++ /dev/null @@ -1,421 +0,0 @@ -# FuzzForge Temporal Architecture - Quick Start Guide - -This guide walks you through starting and testing the new Temporal-based architecture. 
- -## Prerequisites - -- Docker and Docker Compose installed -- At least 2GB free RAM (core services only, workers start on-demand) -- Ports available: 7233, 8233, 9000, 9001, 8000 - -## Step 1: Start Core Services - -```bash -# From project root -cd /path/to/fuzzforge_ai - -# Start core services (Temporal, MinIO, Backend) -docker-compose up -d - -# Workers are pre-built but don't auto-start (saves ~6-7GB RAM) -# They'll start automatically when workflows need them - -# Check status -docker-compose ps -``` - -**Expected output:** -``` -NAME STATUS PORTS -fuzzforge-minio healthy 0.0.0.0:9000-9001->9000-9001/tcp -fuzzforge-temporal healthy 0.0.0.0:7233->7233/tcp -fuzzforge-temporal-postgresql healthy 5432/tcp -fuzzforge-backend healthy 0.0.0.0:8000->8000/tcp -fuzzforge-minio-setup exited (0) -# Workers NOT running (will start on-demand) -``` - -**First startup takes ~30-60 seconds** for health checks to pass. - -## Step 2: Verify Worker Discovery - -Check worker logs to ensure workflows are discovered: - -```bash -docker logs fuzzforge-worker-rust -``` - -**Expected output:** -``` -============================================================ -FuzzForge Vertical Worker: rust -============================================================ -Temporal Address: temporal:7233 -Task Queue: rust-queue -Max Concurrent Activities: 5 -============================================================ -Discovering workflows for vertical: rust -Importing workflow module: toolbox.workflows.rust_test.workflow -āœ“ Discovered workflow: RustTestWorkflow from rust_test (vertical: rust) -Discovered 1 workflows for vertical 'rust' -Connecting to Temporal at temporal:7233... 
-āœ“ Connected to Temporal successfully -Creating worker on task queue: rust-queue -āœ“ Worker created successfully -============================================================ -šŸš€ Worker started for vertical 'rust' -šŸ“¦ Registered 1 workflows -āš™ļø Registered 3 activities -šŸ“Ø Listening on task queue: rust-queue -============================================================ -Worker is ready to process tasks... -``` - -## Step 2.5: Worker Lifecycle Management (New in v0.7.0) - -Workers start on-demand when workflows need them: - -```bash -# Check worker status (should show Exited or not running) -docker ps -a --filter "name=fuzzforge-worker" - -# Run a workflow - worker starts automatically -ff workflow run ossfuzz_campaign . project_name=zlib - -# Worker is now running -docker ps --filter "name=fuzzforge-worker-ossfuzz" -``` - -**Configuration** (`.fuzzforge/config.yaml`): -```yaml -workers: - auto_start_workers: true # Default: auto-start - auto_stop_workers: false # Default: keep running - worker_startup_timeout: 60 # Startup timeout in seconds -``` - -**CLI Control**: -```bash -# Disable auto-start -ff workflow run ossfuzz_campaign . --no-auto-start - -# Enable auto-stop after completion -ff workflow run ossfuzz_campaign . 
--wait --auto-stop -``` - -## Step 3: Access Web UIs - -### Temporal Web UI -- URL: http://localhost:8233 -- View workflows, executions, and task queues - -### MinIO Console -- URL: http://localhost:9001 -- Login: `fuzzforge` / `fuzzforge123` -- View uploaded targets and results - -## Step 4: Test Workflow Execution - -### Option A: Using Temporal CLI (tctl) - -```bash -# Install tctl (if not already installed) -brew install temporal # macOS -# or download from https://github.com/temporalio/tctl/releases - -# Execute test workflow -tctl workflow run \ - --address localhost:7233 \ - --taskqueue rust-queue \ - --workflow_type RustTestWorkflow \ - --input '{"target_id": "test-123", "test_message": "Hello Temporal!"}' -``` - -### Option B: Using Python Client - -Create `test_workflow.py`: - -```python -import asyncio -from temporalio.client import Client - -async def main(): - # Connect to Temporal - client = await Client.connect("localhost:7233") - - # Start workflow - result = await client.execute_workflow( - "RustTestWorkflow", - {"target_id": "test-123", "test_message": "Hello Temporal!"}, - id="test-workflow-1", - task_queue="rust-queue" - ) - - print("Workflow result:", result) - -if __name__ == "__main__": - asyncio.run(main()) -``` - -```bash -python test_workflow.py -``` - -### Option C: Upload Target and Run (Full Flow) - -```python -# upload_and_run.py -import asyncio -import boto3 -from pathlib import Path -from temporalio.client import Client - -async def main(): - # 1. 
Upload target to MinIO - s3 = boto3.client( - 's3', - endpoint_url='http://localhost:9000', - aws_access_key_id='fuzzforge', - aws_secret_access_key='fuzzforge123', - region_name='us-east-1' - ) - - # Create a test file - test_file = Path('/tmp/test_target.txt') - test_file.write_text('This is a test target file') - - # Upload to MinIO - target_id = 'my-test-target-001' - s3.upload_file( - str(test_file), - 'targets', - f'{target_id}/target' - ) - print(f"āœ“ Uploaded target: {target_id}") - - # 2. Run workflow - client = await Client.connect("localhost:7233") - - result = await client.execute_workflow( - "RustTestWorkflow", - {"target_id": target_id, "test_message": "Full flow test!"}, - id=f"workflow-{target_id}", - task_queue="rust-queue" - ) - - print("āœ“ Workflow completed!") - print("Results:", result) - -if __name__ == "__main__": - asyncio.run(main()) -``` - -```bash -# Install dependencies -pip install temporalio boto3 - -# Run test -python upload_and_run.py -``` - -## Step 5: Monitor Execution - -### View in Temporal UI - -1. Open http://localhost:8233 -2. Click on "Workflows" -3. Find your workflow by ID -4. Click to see: - - Execution history - - Activity results - - Error stack traces (if any) - -### View Logs - -```bash -# Worker logs (shows activity execution) -docker logs -f fuzzforge-worker-rust - -# Temporal server logs -docker logs -f fuzzforge-temporal -``` - -### Check MinIO Storage - -1. Open http://localhost:9001 -2. Login: `fuzzforge` / `fuzzforge123` -3. 
Browse buckets: - - `targets/` - Uploaded target files - - `results/` - Workflow results (if uploaded) - - `cache/` - Worker cache (temporary) - -## Troubleshooting - -### Services Not Starting - -```bash -# Check logs for all services -docker-compose -f docker-compose.temporal.yaml logs - -# Check specific service -docker-compose -f docker-compose.temporal.yaml logs temporal -docker-compose -f docker-compose.temporal.yaml logs minio -docker-compose -f docker-compose.temporal.yaml logs worker-rust -``` - -### Worker Not Discovering Workflows - -**Issue**: Worker logs show "No workflows found for vertical: rust" - -**Solution**: -1. Check toolbox mount: `docker exec fuzzforge-worker-rust ls /app/toolbox/workflows` -2. Verify metadata.yaml exists and has `vertical: rust` -3. Check workflow.py has `@workflow.defn` decorator - -### Cannot Connect to Temporal - -**Issue**: `Failed to connect to Temporal` - -**Solution**: -```bash -# Wait for Temporal to be healthy -docker-compose -f docker-compose.temporal.yaml ps - -# Check Temporal health manually -curl http://localhost:8233 - -# Restart Temporal if needed -docker-compose -f docker-compose.temporal.yaml restart temporal -``` - -### MinIO Connection Failed - -**Issue**: `Failed to download target` - -**Solution**: -```bash -# Check MinIO is running -docker ps | grep minio - -# Check buckets exist -docker exec fuzzforge-minio mc ls fuzzforge/ - -# Verify target was uploaded -docker exec fuzzforge-minio mc ls fuzzforge/targets/ -``` - -### Workflow Hangs - -**Issue**: Workflow starts but never completes - -**Check**: -1. Worker logs for errors: `docker logs fuzzforge-worker-rust` -2. Activity timeouts in workflow code -3. 
Target file actually exists in MinIO - -## Scaling - -### Add More Workers - -```bash -# Scale rust workers horizontally -docker-compose -f docker-compose.temporal.yaml up -d --scale worker-rust=3 - -# Verify all workers are running -docker ps | grep worker-rust -``` - -### Increase Concurrent Activities - -Edit `docker-compose.temporal.yaml`: - -```yaml -worker-rust: - environment: - MAX_CONCURRENT_ACTIVITIES: 10 # Increase from 5 -``` - -```bash -# Apply changes -docker-compose -f docker-compose.temporal.yaml up -d worker-rust -``` - -## Cleanup - -```bash -# Stop all services -docker-compose -f docker-compose.temporal.yaml down - -# Remove volumes (WARNING: deletes all data) -docker-compose -f docker-compose.temporal.yaml down -v - -# Remove everything including images -docker-compose -f docker-compose.temporal.yaml down -v --rmi all -``` - -## Next Steps - -1. **Add More Workflows**: Create workflows in `backend/toolbox/workflows/` -2. **Add More Verticals**: Create new worker types (android, web, etc.) - see `workers/README.md` -3. **Integrate with Backend**: Update FastAPI backend to use Temporal client -4. 
**Update CLI**: Modify `ff` CLI to work with Temporal workflows - -## Useful Commands - -```bash -# View all logs -docker-compose -f docker-compose.temporal.yaml logs -f - -# View specific service logs -docker-compose -f docker-compose.temporal.yaml logs -f worker-rust - -# Restart a service -docker-compose -f docker-compose.temporal.yaml restart worker-rust - -# Check service status -docker-compose -f docker-compose.temporal.yaml ps - -# Execute command in worker -docker exec -it fuzzforge-worker-rust bash - -# View worker Python environment -docker exec fuzzforge-worker-rust pip list - -# Check workflow discovery manually -docker exec fuzzforge-worker-rust python -c " -from pathlib import Path -import yaml -for w in Path('/app/toolbox/workflows').iterdir(): - if w.is_dir(): - meta = w / 'metadata.yaml' - if meta.exists(): - print(f'{w.name}: {yaml.safe_load(meta.read_text()).get(\"vertical\")}')" -``` - -## Architecture Overview - -``` -ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” -│ Temporal │────▶│ Task Queue │────▶│ Worker-Rust │ -│ Server │ │ rust-queue │ │ (Long-lived)│ -ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ - │ │ - │ │ - ā–¼ ā–¼ -ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” -│ Postgres │ │ MinIO │ -│ (State) │ │ (Storage) │ -ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ - │ - ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”“ā”€ā”€ā”€ā”€ā”€ā”€ā” - │ │ - ā”Œā”€ā”€ā”€ā”€ā–¼ā”€ā”€ā”€ā”€ā” ā”Œā”€ā”€ā”€ā”€ā”€ā–¼ā”€ā”€ā”€ā”€ā” - │ Targets │ │ Results │ - ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ -``` - -## Support - -- **Documentation**: See `ARCHITECTURE.md` for detailed design -- **Worker Guide**: See `workers/README.md` for adding verticals -- **Issues**: 
Open GitHub issue with logs and steps to reproduce diff --git a/ai/.gitignore b/ai/.gitignore deleted file mode 100644 index 5f018c8..0000000 --- a/ai/.gitignore +++ /dev/null @@ -1,6 +0,0 @@ -.env -__pycache__/ -*.pyc -fuzzforge_sessions.db -agentops.log -*.log diff --git a/ai/README.md b/ai/README.md deleted file mode 100644 index 254fdd2..0000000 --- a/ai/README.md +++ /dev/null @@ -1,110 +0,0 @@ -# FuzzForge AI Module - -FuzzForge AI is the multi-agent layer that lets you operate the FuzzForge security platform through natural language. It orchestrates local tooling, registered Agent-to-Agent (A2A) peers, and the Temporal-powered backend while keeping long-running context in memory and project knowledge graphs. - -## Quick Start - -1. **Initialise a project** - ```bash - cd /path/to/project - fuzzforge init - ``` -2. **Review environment settings** – copy `.fuzzforge/.env.template` to `.fuzzforge/.env`, then edit the values to match your provider. The template ships with commented defaults for OpenAI-style usage and placeholders for Cognee keys. - ```env - LLM_PROVIDER=openai - LITELLM_MODEL=gpt-5-mini - OPENAI_API_KEY=sk-your-key - FUZZFORGE_MCP_URL=http://localhost:8010/mcp - SESSION_PERSISTENCE=sqlite - ``` - Optional flags you may want to enable early: - ```env - MEMORY_SERVICE=inmemory - AGENTOPS_API_KEY=sk-your-agentops-key # Enable hosted tracing - LOG_LEVEL=INFO # CLI / server log level - ``` -3. **Populate the knowledge graph** - ```bash - fuzzforge ingest --path . --recursive - # alias: fuzzforge rag ingest --path . --recursive - ``` -4. **Launch the agent shell** - ```bash - fuzzforge ai agent - ``` - Keep the backend running (Temporal API at `FUZZFORGE_MCP_URL`) so workflow commands succeed. - -## Everyday Workflow - -- Run `fuzzforge ai agent` and start with `list available fuzzforge workflows` or `/memory status` to confirm everything is wired. 
-- Use natural prompts for automation (`run fuzzforge workflow …`, `search project knowledge for …`) and fall back to slash commands for precision (`/recall`, `/sendfile`). -- Keep `/memory datasets` handy to see which Cognee datasets are available after each ingest. -- Start the HTTP surface with `python -m fuzzforge_ai` when external agents need access to artifacts or graph queries. The CLI stays usable at the same time. -- Refresh the knowledge graph regularly: `fuzzforge ingest --path . --recursive --force` keeps responses aligned with recent code changes. - -## What the Agent Can Do - -- **Route requests** – automatically selects the right local tool or remote agent using the A2A capability registry. -- **Run security workflows** – list, submit, and monitor FuzzForge workflows via MCP wrappers. -- **Manage artifacts** – create downloadable files for reports, code edits, and shared attachments. -- **Maintain context** – stores session history, semantic recall, and Cognee project graphs. -- **Serve over HTTP** – expose the same agent as an A2A server using `python -m fuzzforge_ai`. - -## Essential Commands - -Inside `fuzzforge ai agent` you can mix slash commands and free-form prompts: - -```text -/list # Show registered A2A agents -/register http://:10201 # Add a remote agent -/artifacts # List generated files -/sendfile SecurityAgent src/report.md "Please review" -You> route_to SecurityAnalyzer: scan ./backend for secrets -You> run fuzzforge workflow static_analysis_scan on ./test_projects/demo -You> search project knowledge for "temporal status" using INSIGHTS -``` - -Artifacts created during the conversation are served from `.fuzzforge/artifacts/` and exposed through the A2A HTTP API. - -## Memory & Knowledge - -The module layers three storage systems: - -- **Session persistence** (SQLite or in-memory) for chat transcripts. -- **Semantic recall** via the ADK memory service for fuzzy search. 
-- **Cognee graphs** for project-wide knowledge built from ingestion runs. - -Re-run ingestion after major code changes to keep graph answers relevant. If Cognee variables are not set, graph-specific tools automatically respond with a polite "not configured" message. - -## Sample Prompts - -Use these to validate the setup once the agent shell is running: - -- `list available fuzzforge workflows` -- `run fuzzforge workflow static_analysis_scan on ./backend with target_branch=main` -- `show findings for that run once it finishes` -- `refresh the project knowledge graph for ./backend` -- `search project knowledge for "temporal readiness" using INSIGHTS` -- `/recall terraform secrets` -- `/memory status` -- `ROUTE_TO SecurityAnalyzer: audit infrastructure_vulnerable` - -## Need More Detail? - -Dive into the dedicated guides under `ai/docs/advanced/`: - -- [Architecture](https://docs.fuzzforge.ai/docs/ai/intro) – High-level architecture with diagrams and component breakdowns. -- [Ingestion](https://docs.fuzzforge.ai/docs/ai/ingestion.md) – Command options, Cognee persistence, and prompt examples. -- [Configuration](https://docs.fuzzforge.ai/docs/ai/configuration.md) – LLM provider matrix, local model setup, and tracing options. -- [Prompts](https://docs.fuzzforge.ai/docs/ai/prompts.md) – Slash commands, workflow prompts, and routing tips. -- [A2A Services](https://docs.fuzzforge.ai/docs/ai/a2a-services.md) – HTTP endpoints, agent card, and collaboration flow. -- [Memory Persistence](https://docs.fuzzforge.ai/docs/ai/architecture.md#memory--persistence) – Deep dive on memory storage, datasets, and how `/memory status` inspects them. 
- -## Development Notes - -- Entry point for the CLI: `ai/src/fuzzforge_ai/cli.py` -- A2A HTTP server: `ai/src/fuzzforge_ai/a2a_server.py` -- Tool routing & workflow glue: `ai/src/fuzzforge_ai/agent_executor.py` -- Ingestion helpers: `ai/src/fuzzforge_ai/ingest_utils.py` - -Install the module in editable mode (`pip install -e ai`) while iterating so CLI changes are picked up immediately. diff --git a/ai/agents/task_agent/.dockerignore b/ai/agents/task_agent/.dockerignore deleted file mode 100644 index 227dc09..0000000 --- a/ai/agents/task_agent/.dockerignore +++ /dev/null @@ -1,9 +0,0 @@ -__pycache__ -*.pyc -*.pyo -*.pytest_cache -*.coverage -coverage.xml -build/ -dist/ -.env diff --git a/ai/agents/task_agent/ARCHITECTURE.md b/ai/agents/task_agent/ARCHITECTURE.md deleted file mode 100644 index 2210dca..0000000 --- a/ai/agents/task_agent/ARCHITECTURE.md +++ /dev/null @@ -1,82 +0,0 @@ -# Architecture Overview - -This package is a minimal ADK agent that keeps runtime behaviour and A2A access in separate layers so it can double as boilerplate. 
- -## Directory Layout - -```text -agent_with_adk_format/ -ā”œā”€ā”€ __init__.py # Exposes root_agent for ADK runners -ā”œā”€ā”€ a2a_hot_swap.py # JSON-RPC helper for model/prompt swaps -ā”œā”€ā”€ README.md, QUICKSTART.md # Operational docs -ā”œā”€ā”€ ARCHITECTURE.md # This document -ā”œā”€ā”€ .env # Active environment (gitignored) -ā”œā”€ā”€ .env.example # Environment template -└── litellm_agent/ - ā”œā”€ā”€ agent.py # Root Agent definition (LiteLLM shell) - ā”œā”€ā”€ callbacks.py # before_agent / before_model hooks - ā”œā”€ā”€ config.py # Defaults, state keys, control prefix - ā”œā”€ā”€ control.py # HOTSWAP command parsing/serialization - ā”œā”€ā”€ state.py # Session state wrapper + LiteLLM factory - ā”œā”€ā”€ tools.py # set_model / set_prompt / get_config - ā”œā”€ā”€ prompts.py # Base instruction text - └── agent.json # A2A agent card (served under /.well-known) -``` - -```mermaid -flowchart TD - subgraph ADK Runner - A["adk api_server / adk web / adk run"] - B["agent_with_adk_format/__init__.py"] - C["litellm_agent/agent.py (root_agent)"] - D["HotSwapState (state.py)"] - E["LiteLlm(model, provider)"] - end - - subgraph Session State - S1[app:litellm_agent/model] - S2[app:litellm_agent/provider] - S3[app:litellm_agent/prompt] - end - - A --> B --> C - C --> D - D -->|instantiate| E - D --> S1 - D --> S2 - D --> S3 - E --> C -``` - -## Runtime Flow (ADK Runners) - -1. **Startup**: `adk api_server`/`adk web` imports `agent_with_adk_format`, which exposes `root_agent` from `litellm_agent/agent.py`. `.env` at package root is loaded before the runner constructs the agent. -2. **Session State**: `callbacks.py` and `tools.py` read/write through `state.py`. We store `model`, `provider`, and `prompt` keys (prefixed `app:litellm_agent/…`) which persist across turns. -3. **Instruction Generation**: `provide_instruction` composes the base persona from `prompts.py` plus any stored prompt override. The current model/provider is appended for observability. -4. 
**Model Hot-Swap**: When a control message is detected (`[HOTSWAP:MODEL:…]`) the callback parses it via `control.py`, updates the session state, and calls `state.apply_state_to_agent` to instantiate a new `LiteLlm(model=…, custom_llm_provider=…)`. ADK runners reuse that instance for subsequent turns. -5. **Prompt Hot-Swap**: Similar path (`set_prompt` tool/callback) updates state; the dynamic instruction immediately reflects the change. -6. **Config Reporting**: Both the callback and the tool surface the summary string produced by `HotSwapState.describe()`, ensuring CLI, A2A, and UI all show the same data. - -## A2A Integration - -- `agent.json` defines the agent card and enables ADK to register `/a2a/litellm_agent` routes when launched with `--a2a`. -- `a2a_hot_swap.py` uses `a2a.client.A2AClient` to programmatically send control messages and user text via JSON-RPC. It supports streaming when available and falls back to blocking requests otherwise. - -```mermaid -sequenceDiagram - participant Client as a2a_hot_swap.py - participant Server as ADK API Server - participant Agent as root_agent - - Client->>Server: POST /a2a/litellm_agent (message/stream or message/send) - Server->>Agent: Invoke callbacks/tools - Agent->>Server: Status / artifacts / final message - Server->>Client: Streamed Task events - Client->>Client: Extract text & print summary -``` - -## Extending the Boilerplate - -- Add tools under `litellm_agent/tools.py` and register them in `agent.py` to expose new capabilities. -- Use `state.py` to track additional configuration or session data (store under your own prefix to avoid collisions). -- When layering business logic, prefer expanding callbacks or adding higher-level agents while leaving the hot-swap mechanism untouched for reuse. 
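The runtime flow above hinges on recognising HOTSWAP control messages before normal model invocation. A minimal sketch of that parsing step, for illustration only — the actual implementation lives in `litellm_agent/control.py` and may differ:

```python
import re
from typing import Optional, Tuple

# Grammar assumed from the control messages documented above:
#   [HOTSWAP:MODEL:provider/model]  [HOTSWAP:PROMPT:text]  [HOTSWAP:GET_CONFIG]
CONTROL_RE = re.compile(r"^\[HOTSWAP:(MODEL|PROMPT|GET_CONFIG)(?::(.*))?\]$", re.DOTALL)

def parse_control(text: str) -> Optional[Tuple[str, Optional[str]]]:
    """Return (command, payload) for a HOTSWAP control message, else None.

    A callback would dispatch on the command: MODEL updates session state and
    rebuilds the LiteLlm instance, PROMPT updates the stored instruction,
    GET_CONFIG reports the current summary.
    """
    match = CONTROL_RE.match(text.strip())
    if not match:
        return None
    return match.group(1), match.group(2)
```

Anything that does not match the pattern falls through to the normal agent turn, which is why ordinary user text and control traffic can share the same channel.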
diff --git a/ai/agents/task_agent/DEPLOY.md b/ai/agents/task_agent/DEPLOY.md deleted file mode 100644 index bf4d24c..0000000 --- a/ai/agents/task_agent/DEPLOY.md +++ /dev/null @@ -1,71 +0,0 @@ -# Docker & Kubernetes Deployment - -## Local Docker - -Build from the repository root: - -```bash -docker build -t litellm-hot-swap:latest agent_with_adk_format -``` - -Run the container (port 8000, inject provider keys via env file or flags): - -```bash -docker run \ - -p 8000:8000 \ - --env-file agent_with_adk_format/.env \ - litellm-hot-swap:latest -``` - -The container serves Uvicorn on `http://localhost:8000`. Update `.env` (or pass `-e KEY=value`) before launching if you plan to hot-swap providers. - -## Kubernetes (example manifest) - -Use the same image, optionally pushed to a registry (`docker tag` + `docker push`). A simple Deployment/Service pair: - -```yaml -apiVersion: apps/v1 -kind: Deployment -metadata: - name: litellm-hot-swap -spec: - replicas: 1 - selector: - matchLabels: - app: litellm-hot-swap - template: - metadata: - labels: - app: litellm-hot-swap - spec: - containers: - - name: server - image: /litellm-hot-swap:latest - ports: - - containerPort: 8000 - env: - - name: PORT - value: "8000" - - name: LITELLM_MODEL - value: gemini/gemini-2.0-flash-001 - # Add provider keys as needed - # - name: OPENAI_API_KEY - # valueFrom: - # secretKeyRef: - # name: litellm-secrets - # key: OPENAI_API_KEY ---- -apiVersion: v1 -kind: Service -metadata: - name: litellm-hot-swap -spec: - type: LoadBalancer - selector: - app: litellm-hot-swap - ports: - - port: 80 - targetPort: 8000 -``` - -Apply with `kubectl apply -f deployment.yaml`. Provide secrets via `env` or Kubernetes Secrets. 
diff --git a/ai/agents/task_agent/Dockerfile b/ai/agents/task_agent/Dockerfile deleted file mode 100644 index c2b6686..0000000 --- a/ai/agents/task_agent/Dockerfile +++ /dev/null @@ -1,24 +0,0 @@ -# syntax=docker/dockerfile:1 - -FROM python:3.11-slim AS base - -ENV PYTHONUNBUFFERED=1 \ - PYTHONDONTWRITEBYTECODE=1 \ - PIP_NO_CACHE_DIR=1 \ - PORT=8000 - -WORKDIR /app - -COPY requirements.txt ./requirements.txt -RUN pip install --upgrade pip && pip install -r requirements.txt - -COPY . /app/agent_with_adk_format -WORKDIR /app/agent_with_adk_format -ENV PYTHONPATH=/app - -# Copy and set up entrypoint -COPY docker-entrypoint.sh /docker-entrypoint.sh -RUN chmod +x /docker-entrypoint.sh - -ENTRYPOINT ["/docker-entrypoint.sh"] -CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"] diff --git a/ai/agents/task_agent/QUICKSTART.md b/ai/agents/task_agent/QUICKSTART.md deleted file mode 100644 index 756a054..0000000 --- a/ai/agents/task_agent/QUICKSTART.md +++ /dev/null @@ -1,61 +0,0 @@ -# Quick Start Guide - -## Launch the Agent - -From the repository root you can expose the agent through any ADK entry point: - -```bash -# A2A / HTTP server -adk api_server --a2a --port 8000 agent_with_adk_format - -# Browser UI -adk web agent_with_adk_format - -# Interactive terminal -adk run agent_with_adk_format -``` - -The A2A server exposes the JSON-RPC endpoint at `http://localhost:8000/a2a/litellm_agent`. - -## Hot-Swap from the Command Line - -Use the bundled helper to change model and prompt via A2A without touching the UI: - -```bash -python agent_with_adk_format/a2a_hot_swap.py \ - --model openai gpt-4o \ - --prompt "You are concise." \ - --config \ - --context demo-session -``` - -The script sends the control messages for you and prints the server’s responses. The `--context` flag lets you reuse the same conversation across multiple invocations. 
- -### Follow-up Messages - -Once the swaps are applied you can send a user message on the same session: - -```bash -python agent_with_adk_format/a2a_hot_swap.py \ - --context demo-session \ - --message "Summarise the current configuration in five words." -``` - -### Clearing the Prompt - -```bash -python agent_with_adk_format/a2a_hot_swap.py \ - --context demo-session \ - --prompt "" \ - --config -``` - -## Control Messages (for reference) - -Behind the scenes the helper sends plain text messages understood by the callbacks: - -- `[HOTSWAP:MODEL:provider/model]` -- `[HOTSWAP:PROMPT:text]` -- `[HOTSWAP:GET_CONFIG]` - -You can craft the same messages from any A2A client if you prefer. diff --git a/ai/agents/task_agent/README.md b/ai/agents/task_agent/README.md deleted file mode 100644 index 692e4e6..0000000 --- a/ai/agents/task_agent/README.md +++ /dev/null @@ -1,365 +0,0 @@ -# LiteLLM Agent with Hot-Swap Support - -A flexible AI agent powered by LiteLLM that supports runtime hot-swapping of models and system prompts. Compatible with ADK and A2A protocols. 
- -## Features - -- šŸ”„ **Hot-Swap Models**: Change LLM models on-the-fly without restarting -- šŸ“ **Dynamic Prompts**: Update system prompts during conversation -- 🌐 **Multi-Provider Support**: Works with OpenAI, Anthropic, Google, OpenRouter, and more -- šŸ”Œ **A2A Compatible**: Can be served as an A2A agent -- šŸ› ļø **ADK Integration**: Run with `adk web`, `adk run`, or `adk api_server` - -## Architecture - -``` -task_agent/ -ā”œā”€ā”€ __init__.py # Exposes root_agent for ADK -ā”œā”€ā”€ a2a_hot_swap.py # JSON-RPC helper for hot-swapping -ā”œā”€ā”€ README.md # This guide -ā”œā”€ā”€ QUICKSTART.md # Quick-start walkthrough -ā”œā”€ā”€ .env # Active environment (gitignored) -ā”œā”€ā”€ .env.example # Environment template -└── litellm_agent/ - ā”œā”€ā”€ __init__.py - ā”œā”€ā”€ agent.py # Main agent implementation - ā”œā”€ā”€ agent.json # A2A agent card - ā”œā”€ā”€ callbacks.py # ADK callbacks - ā”œā”€ā”€ config.py # Defaults and state keys - ā”œā”€ā”€ control.py # HOTSWAP message helpers - ā”œā”€ā”€ prompts.py # Base instruction - ā”œā”€ā”€ state.py # Session state utilities - └── tools.py # set_model / set_prompt / get_config -``` - -## Setup - -### 1. Environment Configuration - -Copying the example file is optional—the repository already ships with a root-level `.env` seeded with defaults. Adjust the values at the package root: -```bash -cd task_agent -# Optionally refresh from the template -# cp .env.example .env -``` - -Edit `.env` (or `.env.example`) and add your proxy + API keys. 
The agent must be restarted after changes so the values are picked up: -```bash -# Route every request through the proxy container (use http://localhost:10999 from the host) -FF_LLM_PROXY_BASE_URL=http://llm-proxy:4000 - -# Default model + provider the agent boots with -LITELLM_MODEL=openai/gpt-4o-mini -LITELLM_PROVIDER=openai - -# Virtual key issued by the proxy to the task agent (bootstrap replaces the placeholder) -OPENAI_API_KEY=sk-proxy-default - -# Upstream keys stay inside the proxy. Store real secrets under the LiteLLM -# aliases and the bootstrapper mirrors them into .env.litellm for the proxy container. -LITELLM_OPENAI_API_KEY=your_real_openai_api_key -LITELLM_ANTHROPIC_API_KEY=your_real_anthropic_key -LITELLM_GEMINI_API_KEY=your_real_gemini_key -LITELLM_MISTRAL_API_KEY=your_real_mistral_key -LITELLM_OPENROUTER_API_KEY=your_real_openrouter_key -``` - -> When running the agent outside of Docker, swap `FF_LLM_PROXY_BASE_URL` to the host port (default `http://localhost:10999`). - -The bootstrap container provisions LiteLLM, copies provider secrets into -`volumes/env/.env.litellm`, and rewrites `volumes/env/.env` with the virtual key. -Populate the `LITELLM_*_API_KEY` values before the first launch so the proxy can -reach your upstream providers as soon as the bootstrap script runs. - -### 2. Install Dependencies - -```bash -pip install "google-adk" "a2a-sdk[all]" "python-dotenv" "litellm" -``` - -### 3. Run in Docker - -Build the container (this image can be pushed to any registry or run locally): - -```bash -docker build -t litellm-hot-swap:latest task_agent -``` - -Provide environment configuration at runtime (either pass variables individually or mount a file): - -```bash -docker run \ - -p 8000:8000 \ - --env-file task_agent/.env \ - litellm-hot-swap:latest -``` - -The container starts Uvicorn with the ADK app (`main.py`) listening on port 8000. 
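Because the agent only reads `.env` at startup, it can help to validate the wiring before launching. The following preflight sketch checks for the variables used in the template above; the variable names are assumptions taken from that template, not an official checklist:

```python
import os

# Variables referenced in the .env template above; adjust for your deployment.
REQUIRED = ["FF_LLM_PROXY_BASE_URL", "LITELLM_MODEL", "OPENAI_API_KEY"]

def missing_vars(env: dict[str, str]) -> list[str]:
    """Return required variables that are absent, empty, or still placeholders."""
    return [k for k in REQUIRED if not env.get(k) or env[k].startswith("your_real_")]

if __name__ == "__main__":
    gaps = missing_vars(dict(os.environ))
    if gaps:
        print("Missing or placeholder env vars:", ", ".join(gaps))
    else:
        print("Environment looks complete.")
```

Run it once in the same shell (or container) you launch the agent from, since values edited after startup are not picked up until a restart.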
- -## Running the Agent - -### Option 1: ADK Web UI (Recommended for Testing) - -Start the web interface: -```bash -adk web task_agent -``` - -> **Tip:** before launching `adk web`/`adk run`/`adk api_server`, ensure the root-level `.env` contains valid API keys for any provider you plan to hot-swap to (e.g. set `OPENAI_API_KEY` before switching to `openai/gpt-4o`). - -Open http://localhost:8000 in your browser and interact with the agent. - -### Option 2: ADK Terminal - -Run in terminal mode: -```bash -adk run task_agent -``` - -### Option 3: A2A API Server - -Start as an A2A-compatible API server: -```bash -adk api_server --a2a --port 8000 task_agent -``` - -The agent will be available at: `http://localhost:8000/a2a/litellm_agent` - -### Command-line helper - -Use the bundled script to drive hot-swaps and user messages over A2A: - -```bash -python task_agent/a2a_hot_swap.py \ - --url http://127.0.0.1:8000/a2a/litellm_agent \ - --model openai gpt-4o \ - --prompt "You are concise." \ - --config \ - --context demo-session -``` - -To send a follow-up prompt in the same session (with a larger timeout for long answers): - -```bash -python task_agent/a2a_hot_swap.py \ - --url http://127.0.0.1:8000/a2a/litellm_agent \ - --model openai gpt-4o \ - --prompt "You are concise." \ - --message "Give me a fuzzing harness." \ - --context demo-session \ - --timeout 120 -``` - -> Ensure the corresponding provider keys are present in `.env` (or passed via environment variables) before issuing model swaps. - -## Hot-Swap Tools - -The agent provides three special tools: - -### 1. `set_model` - Change the LLM Model - -Change the model during conversation: - -``` -User: Use the set_model tool to change to gpt-4o with openai provider -Agent: āœ… Model configured to: openai/gpt-4o - This change is active now! 
-``` - -**Parameters:** -- `model`: Model name (e.g., "gpt-4o", "claude-3-sonnet-20240229") -- `custom_llm_provider`: Optional provider prefix (e.g., "openai", "anthropic", "openrouter") - -**Examples:** -- OpenAI: `set_model(model="gpt-4o", custom_llm_provider="openai")` -- Anthropic: `set_model(model="claude-3-sonnet-20240229", custom_llm_provider="anthropic")` -- Google: `set_model(model="gemini-2.0-flash-001", custom_llm_provider="gemini")` - -### 2. `set_prompt` - Change System Prompt - -Update the system instructions: - -``` -User: Use set_prompt to change my behavior to "You are a helpful coding assistant" -Agent: āœ… System prompt updated: - You are a helpful coding assistant - - This change is active now! -``` - -### 3. `get_config` - View Configuration - -Check current model and prompt: - -``` -User: Use get_config to show me your configuration -Agent: šŸ“Š Current Configuration: - ━━━━━━━━━━━━━━━━━━━━━━ - Model: openai/gpt-4o - System Prompt: You are a helpful coding assistant - ━━━━━━━━━━━━━━━━━━━━━━ -``` - -## Testing - -### Basic A2A Client Test - -```bash -python agent/test_a2a_client.py -``` - -### Hot-Swap Functionality Test - -```bash -python agent/test_hotswap.py -``` - -This will: -1. Check initial configuration -2. Query with default model -3. Hot-swap to GPT-4o -4. Verify model changed -5. Change system prompt -6. Test new prompt behavior -7. Hot-swap to Claude -8. Verify final configuration - -### Command-Line Hot-Swap Helper - -You can trigger model and prompt changes directly against the A2A endpoint without the interactive CLI: - -```bash -# Start the agent first (in another terminal): -adk api_server --a2a --port 8000 task_agent - -# Apply swaps via pure A2A calls -python task_agent/a2a_hot_swap.py --model openai gpt-4o --prompt "You are concise." 
--config
-python task_agent/a2a_hot_swap.py --model anthropic claude-3-sonnet-20240229 --context shared-session --config
-python task_agent/a2a_hot_swap.py --prompt "" --context shared-session --config # Clear the prompt and show current state
-```
-
-`--model` accepts either a single `provider/model` string or a provider and model as two separate arguments (e.g. `--model openai gpt-4o`). Add `--context` if you want to reuse the same conversation across invocations. Use `--config` to dump the agent's configuration after the changes are applied.
-
-## Supported Models
-
-### OpenAI
-- `openai/gpt-4o`
-- `openai/gpt-4-turbo`
-- `openai/gpt-3.5-turbo`
-
-### Anthropic
-- `anthropic/claude-3-opus-20240229`
-- `anthropic/claude-3-sonnet-20240229`
-- `anthropic/claude-3-haiku-20240307`
-
-### Google
-- `gemini/gemini-2.0-flash-001`
-- `gemini/gemini-2.5-pro-exp-03-25`
-- `vertex_ai/gemini-2.0-flash-001`
-
-### OpenRouter
-- `openrouter/anthropic/claude-3-opus`
-- `openrouter/openai/gpt-4`
-- Any model from the OpenRouter catalog
-
-## How It Works
-
-### Session State
-- Model and prompt settings are stored in session state
-- Each session maintains its own configuration
-- Settings persist across messages in the same session
-
-### Hot-Swap Mechanism
-1. Tools update session state with the new model/prompt
-2. `before_agent_callback` intercepts control messages; `before_model_callback` re-applies the stored state before each request
-3. If the model changed, the agent is updated directly: `agent.model = LiteLlm(model=new_model)`
-4. 
Dynamic instruction function reads custom prompt from session state - -### A2A Compatibility -- Agent card at `agent.json` defines A2A metadata -- Served at `/a2a/litellm_agent` endpoint -- Compatible with A2A client protocol - -## Example Usage - -### Interactive Session - -```python -from a2a.client import A2AClient -import asyncio - -async def chat(): - client = A2AClient("http://localhost:8000/a2a/litellm_agent") - context_id = "my-session-123" - - # Start with default model - async for msg in client.send_message("Hello!", context_id=context_id): - print(msg) - - # Switch to GPT-4 - async for msg in client.send_message( - "Use set_model with model gpt-4o and provider openai", - context_id=context_id - ): - print(msg) - - # Continue with new model - async for msg in client.send_message( - "Help me write a function", - context_id=context_id - ): - print(msg) - -asyncio.run(chat()) -``` - -## Troubleshooting - -### Model Not Found -- Ensure API key for the provider is set in `.env` -- Check model name is correct for the provider -- Verify LiteLLM supports the model (https://docs.litellm.ai/docs/providers) - -### Connection Refused -- Ensure the agent is running (`adk api_server --a2a task_agent`) -- Check the port matches (default: 8000) -- Verify no firewall blocking localhost - -### Hot-Swap Not Working -- Check that you're using the same `context_id` across messages -- Ensure the tool is being called (not just asked to switch) -- Look for `šŸ”„ Hot-swapped model to:` in server logs - -## Development - -### Adding New Tools - -```python -async def my_tool(tool_ctx: ToolContext, param: str) -> str: - """Your tool description.""" - # Access session state - tool_ctx.state["my_key"] = "my_value" - return "Tool result" - -# Add to agent -root_agent = LlmAgent( - # ... 
- tools=[set_model, set_prompt, get_config, my_tool], -) -``` - -### Modifying Callbacks - -```python -async def after_model_callback( - callback_context: CallbackContext, - llm_response: LlmResponse -) -> Optional[LlmResponse]: - """Modify response after model generates it.""" - # Your logic here - return llm_response -``` - -## License - -Apache 2.0 diff --git a/ai/agents/task_agent/__init__.py b/ai/agents/task_agent/__init__.py deleted file mode 100644 index 9d35e10..0000000 --- a/ai/agents/task_agent/__init__.py +++ /dev/null @@ -1,5 +0,0 @@ -"""Package entry point for the ADK-formatted hot swap agent.""" - -from .litellm_agent.agent import root_agent - -__all__ = ["root_agent"] diff --git a/ai/agents/task_agent/a2a_hot_swap.py b/ai/agents/task_agent/a2a_hot_swap.py deleted file mode 100644 index 8fbe140..0000000 --- a/ai/agents/task_agent/a2a_hot_swap.py +++ /dev/null @@ -1,224 +0,0 @@ -#!/usr/bin/env python3 -"""Minimal A2A client utility for hot-swapping LiteLLM model/prompt.""" - -from __future__ import annotations - -import argparse -import asyncio -from typing import Optional -from uuid import uuid4 - -import httpx -from a2a.client import A2AClient -from a2a.client.errors import A2AClientHTTPError -from a2a.types import ( - JSONRPCErrorResponse, - Message, - MessageSendConfiguration, - MessageSendParams, - Part, - Role, - SendMessageRequest, - SendStreamingMessageRequest, - Task, - TaskArtifactUpdateEvent, - TaskStatusUpdateEvent, - TextPart, -) - -from litellm_agent.control import ( - HotSwapCommand, - build_control_message, - parse_model_spec, - serialize_model_spec, -) - -DEFAULT_URL = "http://localhost:8000/a2a/litellm_agent" - - -async def _collect_text(client: A2AClient, message: str, context_id: str) -> str: - """Send a message and collect streamed agent text into a single string.""" - - params = MessageSendParams( - configuration=MessageSendConfiguration(blocking=True), - message=Message( - context_id=context_id, - message_id=str(uuid4()), - 
role=Role.user, - parts=[Part(root=TextPart(text=message))], - ), - ) - - stream_request = SendStreamingMessageRequest(id=str(uuid4()), params=params) - buffer: list[str] = [] - try: - async for response in client.send_message_streaming(stream_request): - root = response.root - if isinstance(root, JSONRPCErrorResponse): - raise RuntimeError(f"A2A error: {root.error}") - - payload = root.result - buffer.extend(_extract_text(payload)) - except A2AClientHTTPError as exc: - if "text/event-stream" not in str(exc): - raise - - send_request = SendMessageRequest(id=str(uuid4()), params=params) - response = await client.send_message(send_request) - root = response.root - if isinstance(root, JSONRPCErrorResponse): - raise RuntimeError(f"A2A error: {root.error}") - payload = root.result - buffer.extend(_extract_text(payload)) - - if buffer: - buffer = list(dict.fromkeys(buffer)) - return "\n".join(buffer).strip() - - -def _extract_text( - result: Message | Task | TaskStatusUpdateEvent | TaskArtifactUpdateEvent, -) -> list[str]: - texts: list[str] = [] - if isinstance(result, Message): - if result.role is Role.agent: - for part in result.parts: - root_part = part.root - text = getattr(root_part, "text", None) - if text: - texts.append(text) - elif isinstance(result, Task) and result.history: - for msg in result.history: - if msg.role is Role.agent: - for part in msg.parts: - root_part = part.root - text = getattr(root_part, "text", None) - if text: - texts.append(text) - elif isinstance(result, TaskStatusUpdateEvent): - message = result.status.message - if message: - texts.extend(_extract_text(message)) - elif isinstance(result, TaskArtifactUpdateEvent): - artifact = result.artifact - if artifact and artifact.parts: - for part in artifact.parts: - root_part = part.root - text = getattr(root_part, "text", None) - if text: - texts.append(text) - return texts - - -def _split_model_args(model_args: Optional[list[str]]) -> tuple[Optional[str], Optional[str]]: - if not model_args: - 
return None, None
-
-    if len(model_args) == 1:
-        return model_args[0], None
-
-    provider = model_args[0]
-    model = " ".join(model_args[1:])
-    return model, provider
-
-
-async def hot_swap(
-    url: str,
-    *,
-    model_args: Optional[list[str]],
-    provider: Optional[str],
-    prompt: Optional[str],
-    message: Optional[str],
-    show_config: bool,
-    context_id: Optional[str],
-    timeout: float,
-) -> None:
-    """Execute the requested hot-swap operations against the A2A endpoint."""
-
-    timeout_config = httpx.Timeout(timeout)
-    async with httpx.AsyncClient(timeout=timeout_config) as http_client:
-        client = A2AClient(url=url, httpx_client=http_client)
-        session_id = context_id or str(uuid4())
-
-        model, derived_provider = _split_model_args(model_args)
-
-        if model:
-            spec = parse_model_spec(model, provider=provider or derived_provider)
-            payload = serialize_model_spec(spec)
-            control_msg = build_control_message(HotSwapCommand.MODEL, payload)
-            result = await _collect_text(client, control_msg, session_id)
-            print(f"Model response: {result or '(no response)'}")
-
-        if prompt is not None:
-            control_msg = build_control_message(HotSwapCommand.PROMPT, prompt)
-            result = await _collect_text(client, control_msg, session_id)
-            print(f"Prompt response: {result or '(no response)'}")
-
-        if show_config:
-            control_msg = build_control_message(HotSwapCommand.GET_CONFIG)
-            result = await _collect_text(client, control_msg, session_id)
-            print(f"Config:\n{result or '(no response)'}")
-
-        if message:
-            result = await _collect_text(client, message, session_id)
-            print(f"Message response: {result or '(no response)'}")
-
-        print(f"Context ID: {session_id}")
-
-
-def parse_args() -> argparse.Namespace:
-    parser = argparse.ArgumentParser(description=__doc__)
-    parser.add_argument(
-        "--url",
-        default=DEFAULT_URL,
-        help=f"A2A endpoint for the agent (default: {DEFAULT_URL})",
-    )
-    parser.add_argument(
-        "--model",
-        nargs="+",
-        help="LiteLLM model spec: either 'provider/model' or '<provider> <model>'.",
-    )
-    
parser.add_argument( - "--provider", - help="Optional LiteLLM provider when --model lacks a prefix.") - parser.add_argument( - "--prompt", - help="Set the system prompt (omit to leave unchanged; empty string clears it).", - ) - parser.add_argument( - "--message", - help="Send an additional user message after the swaps complete.") - parser.add_argument( - "--config", - action="store_true", - help="Print the agent configuration after performing swaps.") - parser.add_argument( - "--context", - help="Optional context/session identifier to reuse across calls.") - parser.add_argument( - "--timeout", - type=float, - default=60.0, - help="Request timeout (seconds) for A2A calls (default: 60).", - ) - return parser.parse_args() - - -def main() -> None: - args = parse_args() - asyncio.run( - hot_swap( - args.url, - model_args=args.model, - provider=args.provider, - prompt=args.prompt, - message=args.message, - show_config=args.config, - context_id=args.context, - timeout=args.timeout, - ) - ) - - -if __name__ == "__main__": - main() diff --git a/ai/agents/task_agent/docker-compose.yml b/ai/agents/task_agent/docker-compose.yml deleted file mode 100644 index b22a9ac..0000000 --- a/ai/agents/task_agent/docker-compose.yml +++ /dev/null @@ -1,24 +0,0 @@ -version: '3.8' - -services: - task-agent: - build: - context: . 
- dockerfile: Dockerfile - container_name: fuzzforge-task-agent - ports: - - "10900:8000" - env_file: - - ../../../volumes/env/.env - environment: - - PORT=8000 - - PYTHONUNBUFFERED=1 - volumes: - # Mount volumes/env for runtime config access - - ../../../volumes/env:/app/config:ro - restart: unless-stopped - healthcheck: - test: ["CMD", "curl", "-f", "http://localhost:8000/health"] - interval: 30s - timeout: 10s - retries: 3 diff --git a/ai/agents/task_agent/docker-entrypoint.sh b/ai/agents/task_agent/docker-entrypoint.sh deleted file mode 100644 index 88e3733..0000000 --- a/ai/agents/task_agent/docker-entrypoint.sh +++ /dev/null @@ -1,31 +0,0 @@ -#!/bin/bash -set -e - -# Wait for .env file to have keys (max 30 seconds) -echo "[task-agent] Waiting for virtual keys to be provisioned..." -for i in $(seq 1 30); do - if [ -f /app/config/.env ]; then - # Check if TASK_AGENT_API_KEY has a value (not empty) - KEY=$(grep -E '^TASK_AGENT_API_KEY=' /app/config/.env | cut -d'=' -f2) - if [ -n "$KEY" ] && [ "$KEY" != "" ]; then - echo "[task-agent] Virtual keys found, loading environment..." - # Export keys from .env file - export TASK_AGENT_API_KEY="$KEY" - export OPENAI_API_KEY=$(grep -E '^OPENAI_API_KEY=' /app/config/.env | cut -d'=' -f2) - export FF_LLM_PROXY_BASE_URL=$(grep -E '^FF_LLM_PROXY_BASE_URL=' /app/config/.env | cut -d'=' -f2) - echo "[task-agent] Loaded TASK_AGENT_API_KEY: ${TASK_AGENT_API_KEY:0:15}..." - echo "[task-agent] Loaded FF_LLM_PROXY_BASE_URL: $FF_LLM_PROXY_BASE_URL" - break - fi - fi - echo "[task-agent] Keys not ready yet, waiting... ($i/30)" - sleep 1 -done - -if [ -z "$TASK_AGENT_API_KEY" ]; then - echo "[task-agent] ERROR: Virtual keys were not provisioned within 30 seconds!" - exit 1 -fi - -echo "[task-agent] Starting uvicorn..." 
-exec "$@" diff --git a/ai/agents/task_agent/litellm_agent/__init__.py b/ai/agents/task_agent/litellm_agent/__init__.py deleted file mode 100644 index 09c0772..0000000 --- a/ai/agents/task_agent/litellm_agent/__init__.py +++ /dev/null @@ -1,55 +0,0 @@ -"""LiteLLM hot-swap agent package exports.""" - -from .agent import root_agent -from .callbacks import ( - before_agent_callback, - before_model_callback, - provide_instruction, -) -from .config import ( - AGENT_DESCRIPTION, - AGENT_NAME, - CONTROL_PREFIX, - DEFAULT_MODEL, - DEFAULT_PROVIDER, - STATE_MODEL_KEY, - STATE_PROVIDER_KEY, - STATE_PROMPT_KEY, -) -from .control import ( - HotSwapCommand, - ModelSpec, - build_control_message, - parse_control_message, - parse_model_spec, - serialize_model_spec, -) -from .state import HotSwapState, apply_state_to_agent -from .tools import HOTSWAP_TOOLS, get_config, set_model, set_prompt - -__all__ = [ - "root_agent", - "before_agent_callback", - "before_model_callback", - "provide_instruction", - "AGENT_DESCRIPTION", - "AGENT_NAME", - "CONTROL_PREFIX", - "DEFAULT_MODEL", - "DEFAULT_PROVIDER", - "STATE_MODEL_KEY", - "STATE_PROVIDER_KEY", - "STATE_PROMPT_KEY", - "HotSwapCommand", - "ModelSpec", - "HotSwapState", - "apply_state_to_agent", - "build_control_message", - "parse_control_message", - "parse_model_spec", - "serialize_model_spec", - "HOTSWAP_TOOLS", - "get_config", - "set_model", - "set_prompt", -] diff --git a/ai/agents/task_agent/litellm_agent/agent.json b/ai/agents/task_agent/litellm_agent/agent.json deleted file mode 100644 index 05f1112..0000000 --- a/ai/agents/task_agent/litellm_agent/agent.json +++ /dev/null @@ -1,24 +0,0 @@ -{ - "name": "litellm_agent", - "description": "A flexible AI agent powered by LiteLLM with hot-swappable models from OpenRouter and other providers", - "url": "http://localhost:8000", - "version": "1.0.0", - "defaultInputModes": ["text/plain"], - "defaultOutputModes": ["text/plain"], - "capabilities": { - "streaming": true - }, - "skills": [ - 
{ - "id": "litellm-general-purpose", - "name": "General Purpose AI Assistant", - "description": "A flexible AI assistant that can help with various tasks using any LiteLLM-supported model. Supports runtime model and prompt hot-swapping.", - "tags": ["ai", "assistant", "litellm", "flexible", "hot-swap"], - "examples": [ - "Help me write a Python function", - "Explain quantum computing", - "Switch to Claude model and help me code" - ] - } - ] -} diff --git a/ai/agents/task_agent/litellm_agent/agent.py b/ai/agents/task_agent/litellm_agent/agent.py deleted file mode 100644 index c3189ce..0000000 --- a/ai/agents/task_agent/litellm_agent/agent.py +++ /dev/null @@ -1,29 +0,0 @@ -"""Root agent definition for the LiteLLM hot-swap shell.""" - -from __future__ import annotations - -from google.adk.agents import Agent - -from .callbacks import ( - before_agent_callback, - before_model_callback, - provide_instruction, -) -from .config import AGENT_DESCRIPTION, AGENT_NAME, DEFAULT_MODEL, DEFAULT_PROVIDER -from .state import HotSwapState -from .tools import HOTSWAP_TOOLS - -_initial_state = HotSwapState(model=DEFAULT_MODEL, provider=DEFAULT_PROVIDER) - -root_agent = Agent( - name=AGENT_NAME, - model=_initial_state.instantiate_llm(), - description=AGENT_DESCRIPTION, - instruction=provide_instruction, - tools=HOTSWAP_TOOLS, - before_agent_callback=before_agent_callback, - before_model_callback=before_model_callback, -) - - -__all__ = ["root_agent"] diff --git a/ai/agents/task_agent/litellm_agent/callbacks.py b/ai/agents/task_agent/litellm_agent/callbacks.py deleted file mode 100644 index 0faaa82..0000000 --- a/ai/agents/task_agent/litellm_agent/callbacks.py +++ /dev/null @@ -1,137 +0,0 @@ -"""Callbacks and instruction providers for the LiteLLM hot-swap agent.""" - -from __future__ import annotations - -import logging -from typing import Optional - -from google.adk.agents.callback_context import CallbackContext -from google.adk.agents.readonly_context import ReadonlyContext -from 
google.adk.models.llm_request import LlmRequest -from google.genai import types - -from .config import CONTROL_PREFIX, DEFAULT_MODEL -from .control import HotSwapCommand, parse_control_message, parse_model_spec -from .prompts import BASE_INSTRUCTION -from .state import HotSwapState, apply_state_to_agent - -_LOGGER = logging.getLogger(__name__) - - -def provide_instruction(ctx: ReadonlyContext | None = None) -> str: - """Compose the system instruction using the stored state.""" - - state_mapping = getattr(ctx, "state", None) - state = HotSwapState.from_mapping(state_mapping) - prompt = state.prompt or BASE_INSTRUCTION - return f"{prompt}\n\nActive model: {state.display_model}" - - -def _ensure_state(callback_context: CallbackContext) -> HotSwapState: - state = HotSwapState.from_mapping(callback_context.state) - state.persist(callback_context.state) - return state - - -def _session_id(callback_context: CallbackContext) -> str: - session = getattr(callback_context, "session", None) - if session is None: - session = getattr(callback_context._invocation_context, "session", None) - return getattr(session, "id", "unknown-session") - - -async def before_model_callback( - callback_context: CallbackContext, - llm_request: LlmRequest, -) -> Optional[types.Content]: - """Ensure outgoing requests use the active model from session state.""" - - state = _ensure_state(callback_context) - try: - apply_state_to_agent(callback_context._invocation_context, state) - except Exception: # pragma: no cover - defensive logging - _LOGGER.exception( - "Failed to apply LiteLLM model '%s' (provider=%s) for session %s", - state.model, - state.provider, - callback_context.session.id, - ) - llm_request.model = state.model or DEFAULT_MODEL - return None - - -async def before_agent_callback( - callback_context: CallbackContext, -) -> Optional[types.Content]: - """Intercept hot-swap control messages and update session state.""" - - user_content = callback_context.user_content - if not user_content or 
not user_content.parts: - return None - - first_part = user_content.parts[0] - message_text = (first_part.text or "").strip() - if not message_text.startswith(CONTROL_PREFIX): - return None - - parsed = parse_control_message(message_text) - if not parsed: - return None - - command, payload = parsed - state = _ensure_state(callback_context) - - if command is HotSwapCommand.MODEL: - if not payload: - return _render("āŒ Missing model specification for hot-swap.") - try: - spec = parse_model_spec(payload) - except ValueError as exc: - return _render(f"āŒ Invalid model specification: {exc}") - - state.model = spec.model - state.provider = spec.provider - state.persist(callback_context.state) - try: - apply_state_to_agent(callback_context._invocation_context, state) - except Exception: # pragma: no cover - defensive logging - _LOGGER.exception( - "Failed to apply LiteLLM model '%s' (provider=%s) for session %s", - state.model, - state.provider, - _session_id(callback_context), - ) - _LOGGER.info( - "Hot-swapped model to %s (provider=%s, session=%s)", - state.model, - state.provider, - _session_id(callback_context), - ) - label = state.display_model - return _render(f"āœ… Model switched to: {label}") - - if command is HotSwapCommand.PROMPT: - prompt_value = (payload or "").strip() - state.prompt = prompt_value or None - state.persist(callback_context.state) - if state.prompt: - _LOGGER.info( - "Updated prompt for session %s", _session_id(callback_context) - ) - return _render( - "āœ… System prompt updated. This change takes effect immediately." - ) - return _render("āœ… System prompt cleared. Reverting to default instruction.") - - if command is HotSwapCommand.GET_CONFIG: - return _render(state.describe()) - - expected = ", ".join(HotSwapCommand.choices()) - return _render( - "āš ļø Unsupported hot-swap command. Available verbs: " - f"{expected}." 
- ) - - -def _render(message: str) -> types.ModelContent: - return types.ModelContent(parts=[types.Part(text=message)]) diff --git a/ai/agents/task_agent/litellm_agent/config.py b/ai/agents/task_agent/litellm_agent/config.py deleted file mode 100644 index 54ab609..0000000 --- a/ai/agents/task_agent/litellm_agent/config.py +++ /dev/null @@ -1,35 +0,0 @@ -"""Configuration constants for the LiteLLM hot-swap agent.""" - -from __future__ import annotations - -import os - - -def _normalize_proxy_base_url(raw_value: str | None) -> str | None: - if not raw_value: - return None - cleaned = raw_value.strip() - if not cleaned: - return None - # Avoid double slashes in downstream requests - return cleaned.rstrip("/") - -AGENT_NAME = "litellm_agent" -AGENT_DESCRIPTION = ( - "A LiteLLM-backed shell that exposes hot-swappable model and prompt controls." -) - -DEFAULT_MODEL = os.getenv("LITELLM_MODEL", "openai/gpt-4o-mini") -DEFAULT_PROVIDER = os.getenv("LITELLM_PROVIDER") or None -PROXY_BASE_URL = _normalize_proxy_base_url( - os.getenv("FF_LLM_PROXY_BASE_URL") - or os.getenv("LITELLM_API_BASE") - or os.getenv("LITELLM_BASE_URL") -) - -STATE_PREFIX = "app:litellm_agent/" -STATE_MODEL_KEY = f"{STATE_PREFIX}model" -STATE_PROVIDER_KEY = f"{STATE_PREFIX}provider" -STATE_PROMPT_KEY = f"{STATE_PREFIX}prompt" - -CONTROL_PREFIX = "[HOTSWAP" diff --git a/ai/agents/task_agent/litellm_agent/control.py b/ai/agents/task_agent/litellm_agent/control.py deleted file mode 100644 index 1c23379..0000000 --- a/ai/agents/task_agent/litellm_agent/control.py +++ /dev/null @@ -1,99 +0,0 @@ -"""Control message helpers for hot-swapping model and prompt.""" - -from __future__ import annotations - -import re -from dataclasses import dataclass -from enum import Enum -from typing import Optional, Tuple - -from .config import DEFAULT_PROVIDER - - -class HotSwapCommand(str, Enum): - """Supported control verbs embedded in user messages.""" - - MODEL = "MODEL" - PROMPT = "PROMPT" - GET_CONFIG = "GET_CONFIG" - - 
@classmethod
-    def choices(cls) -> tuple[str, ...]:
-        return tuple(item.value for item in cls)
-
-
-@dataclass(frozen=True)
-class ModelSpec:
-    """Represents a LiteLLM model and optional provider."""
-
-    model: str
-    provider: Optional[str] = None
-
-
-_COMMAND_PATTERN = re.compile(
-    r"^\[HOTSWAP:(?P<verb>[A-Z_]+)(?::(?P<payload>.*))?\]$",
-)
-
-
-def parse_control_message(text: str) -> Optional[Tuple[HotSwapCommand, Optional[str]]]:
-    """Return hot-swap command tuple when the string matches the control format."""
-
-    match = _COMMAND_PATTERN.match(text.strip())
-    if not match:
-        return None
-
-    verb = match.group("verb")
-    if verb not in HotSwapCommand.choices():
-        return None
-
-    payload = match.group("payload")
-    return HotSwapCommand(verb), payload if payload else None
-
-
-def build_control_message(command: HotSwapCommand, payload: Optional[str] = None) -> str:
-    """Serialise a control command for downstream clients."""
-
-    if command not in HotSwapCommand:
-        raise ValueError(f"Unsupported hot-swap command: {command}")
-    if payload is None or payload == "":
-        return f"[HOTSWAP:{command.value}]"
-    return f"[HOTSWAP:{command.value}:{payload}]"
-
-
-def parse_model_spec(model: str, provider: Optional[str] = None) -> ModelSpec:
-    """Parse model/provider inputs into a structured ModelSpec."""
-
-    candidate = (model or "").strip()
-    if not candidate:
-        raise ValueError("Model name cannot be empty")
-
-    if provider:
-        provider_clean = provider.strip()
-        if not provider_clean:
-            raise ValueError("Provider cannot be empty when supplied")
-        if "/" in candidate:
-            raise ValueError(
-                "Provide either provider/model or use provider argument, not both",
-            )
-        return ModelSpec(model=candidate, provider=provider_clean)
-
-    if "/" in candidate:
-        provider_part, model_part = candidate.split("/", 1)
-        provider_part = provider_part.strip()
-        model_part = model_part.strip()
-        if not provider_part or not model_part:
-            raise ValueError("Model spec must include provider and model when using '/' 
format") - return ModelSpec(model=model_part, provider=provider_part) - - if DEFAULT_PROVIDER: - return ModelSpec(model=candidate, provider=DEFAULT_PROVIDER.strip()) - - return ModelSpec(model=candidate, provider=None) - - -def serialize_model_spec(spec: ModelSpec) -> str: - """Render a ModelSpec to provider/model string for control messages.""" - - if spec.provider: - return f"{spec.provider}/{spec.model}" - return spec.model diff --git a/ai/agents/task_agent/litellm_agent/prompts.py b/ai/agents/task_agent/litellm_agent/prompts.py deleted file mode 100644 index ec4d603..0000000 --- a/ai/agents/task_agent/litellm_agent/prompts.py +++ /dev/null @@ -1,9 +0,0 @@ -"""System prompt templates for the LiteLLM agent.""" - -BASE_INSTRUCTION = ( - "You are a focused orchestration layer that relays between the user and a" - " LiteLLM managed model." - "\n- Keep answers concise and actionable." - "\n- Prefer plain language; reveal intermediate reasoning only when helpful." - "\n- Surface any tool results clearly with short explanations." 
-) diff --git a/ai/agents/task_agent/litellm_agent/state.py b/ai/agents/task_agent/litellm_agent/state.py deleted file mode 100644 index 54f1308..0000000 --- a/ai/agents/task_agent/litellm_agent/state.py +++ /dev/null @@ -1,254 +0,0 @@ -"""Session state utilities for the LiteLLM hot-swap agent.""" - -from __future__ import annotations - -from dataclasses import dataclass -import os -from typing import Any, Mapping, MutableMapping, Optional - -import httpx - -from .config import ( - DEFAULT_MODEL, - DEFAULT_PROVIDER, - PROXY_BASE_URL, - STATE_MODEL_KEY, - STATE_PROMPT_KEY, - STATE_PROVIDER_KEY, -) - - -@dataclass(slots=True) -class HotSwapState: - """Lightweight view of the hot-swap session state.""" - - model: str = DEFAULT_MODEL - provider: Optional[str] = None - prompt: Optional[str] = None - - @classmethod - def from_mapping(cls, mapping: Optional[Mapping[str, Any]]) -> "HotSwapState": - if not mapping: - return cls() - - raw_model = mapping.get(STATE_MODEL_KEY, DEFAULT_MODEL) - raw_provider = mapping.get(STATE_PROVIDER_KEY) - raw_prompt = mapping.get(STATE_PROMPT_KEY) - - model = raw_model.strip() if isinstance(raw_model, str) else DEFAULT_MODEL - provider = raw_provider.strip() if isinstance(raw_provider, str) else None - if not provider and DEFAULT_PROVIDER: - provider = DEFAULT_PROVIDER.strip() or None - prompt = raw_prompt.strip() if isinstance(raw_prompt, str) else None - return cls( - model=model or DEFAULT_MODEL, - provider=provider or None, - prompt=prompt or None, - ) - - def persist(self, store: MutableMapping[str, object]) -> None: - store[STATE_MODEL_KEY] = self.model - if self.provider: - store[STATE_PROVIDER_KEY] = self.provider - else: - store[STATE_PROVIDER_KEY] = None - store[STATE_PROMPT_KEY] = self.prompt - - def describe(self) -> str: - prompt_value = self.prompt if self.prompt else "(default prompt)" - provider_value = self.provider if self.provider else "(default provider)" - return ( - "šŸ“Š Current Configuration\n" - 
"━━━━━━━━━━━━━━━━━━━━━━\n" - f"Model: {self.model}\n" - f"Provider: {provider_value}\n" - f"System Prompt: {prompt_value}\n" - "━━━━━━━━━━━━━━━━━━━━━━" - ) - - def instantiate_llm(self): - """Create a LiteLlm instance for the current state.""" - - from google.adk.models.lite_llm import LiteLlm # Lazy import to avoid cycle - from google.adk.models.lite_llm import LiteLLMClient - from litellm.types.utils import Choices, Message, ModelResponse, Usage - - kwargs = {"model": self.model} - if self.provider: - kwargs["custom_llm_provider"] = self.provider - if PROXY_BASE_URL: - provider = (self.provider or DEFAULT_PROVIDER or "").lower() - if provider and provider != "openai": - kwargs["api_base"] = f"{PROXY_BASE_URL.rstrip('/')}/{provider}" - else: - kwargs["api_base"] = PROXY_BASE_URL - kwargs.setdefault("api_key", os.environ.get("TASK_AGENT_API_KEY") or os.environ.get("OPENAI_API_KEY")) - - provider = (self.provider or DEFAULT_PROVIDER or "").lower() - model_suffix = self.model.split("/", 1)[-1] - use_responses = provider == "openai" and ( - model_suffix.startswith("gpt-5") or model_suffix.startswith("o1") - ) - if use_responses: - kwargs.setdefault("use_responses_api", True) - - llm = LiteLlm(**kwargs) - - if use_responses and PROXY_BASE_URL: - - class _ResponsesAwareClient(LiteLLMClient): - def __init__(self, base_client: LiteLLMClient, api_base: str, api_key: str): - self._base_client = base_client - self._api_base = api_base.rstrip("/") - self._api_key = api_key - - async def acompletion(self, model, messages, tools, **kwargs): # type: ignore[override] - use_responses_api = kwargs.pop("use_responses_api", False) - if not use_responses_api: - return await self._base_client.acompletion( - model=model, - messages=messages, - tools=tools, - **kwargs, - ) - - resolved_model = model - if "/" not in resolved_model: - resolved_model = f"openai/{resolved_model}" - - payload = { - "model": resolved_model, - "input": _messages_to_responses_input(messages), - } - - timeout = 
kwargs.get("timeout", 60) - headers = { - "Authorization": f"Bearer {self._api_key}", - "Content-Type": "application/json", - } - - async with httpx.AsyncClient(timeout=timeout) as client: - response = await client.post( - f"{self._api_base}/v1/responses", - json=payload, - headers=headers, - ) - try: - response.raise_for_status() - except httpx.HTTPStatusError as exc: - text = exc.response.text - raise RuntimeError( - f"LiteLLM responses request failed: {text}" - ) from exc - data = response.json() - - text_output = _extract_output_text(data) - usage = data.get("usage", {}) - - return ModelResponse( - id=data.get("id"), - model=model, - choices=[ - Choices( - finish_reason="stop", - index=0, - message=Message(role="assistant", content=text_output), - provider_specific_fields={"bifrost_response": data}, - ) - ], - usage=Usage( - prompt_tokens=usage.get("input_tokens"), - completion_tokens=usage.get("output_tokens"), - reasoning_tokens=usage.get("output_tokens_details", {}).get( - "reasoning_tokens" - ), - total_tokens=usage.get("total_tokens"), - ), - ) - - llm.llm_client = _ResponsesAwareClient( - llm.llm_client, - PROXY_BASE_URL, - os.environ.get("TASK_AGENT_API_KEY") or os.environ.get("OPENAI_API_KEY", ""), - ) - - return llm - - @property - def display_model(self) -> str: - if self.provider: - return f"{self.provider}/{self.model}" - return self.model - - -def apply_state_to_agent(invocation_context, state: HotSwapState) -> None: - """Update the provided agent with a LiteLLM instance matching state.""" - - agent = invocation_context.agent - agent.model = state.instantiate_llm() - - -def _messages_to_responses_input(messages: list[dict[str, Any]]) -> list[dict[str, Any]]: - inputs: list[dict[str, Any]] = [] - for message in messages: - role = message.get("role", "user") - content = message.get("content", "") - text_segments: list[str] = [] - - if isinstance(content, list): - for item in content: - if isinstance(item, dict): - text = item.get("text") or 
item.get("content") - if text: - text_segments.append(str(text)) - elif isinstance(item, str): - text_segments.append(item) - elif isinstance(content, str): - text_segments.append(content) - - text = "\n".join(segment.strip() for segment in text_segments if segment) - if not text: - continue - - entry_type = "input_text" - if role == "assistant": - entry_type = "output_text" - - inputs.append( - { - "role": role, - "content": [ - { - "type": entry_type, - "text": text, - } - ], - } - ) - - if not inputs: - inputs.append( - { - "role": "user", - "content": [ - { - "type": "input_text", - "text": "", - } - ], - } - ) - return inputs - - -def _extract_output_text(response_json: dict[str, Any]) -> str: - outputs = response_json.get("output", []) - collected: list[str] = [] - for item in outputs: - if isinstance(item, dict) and item.get("type") == "message": - for part in item.get("content", []): - if isinstance(part, dict) and part.get("type") == "output_text": - text = part.get("text", "") - if text: - collected.append(str(text)) - return "\n\n".join(collected).strip() diff --git a/ai/agents/task_agent/litellm_agent/tools.py b/ai/agents/task_agent/litellm_agent/tools.py deleted file mode 100644 index ff60a5f..0000000 --- a/ai/agents/task_agent/litellm_agent/tools.py +++ /dev/null @@ -1,64 +0,0 @@ -"""Tool definitions exposed to the LiteLLM agent.""" - -from __future__ import annotations - -from typing import Optional - -from google.adk.tools import FunctionTool, ToolContext - -from .control import parse_model_spec -from .state import HotSwapState, apply_state_to_agent - - -async def set_model( - model: str, - *, - provider: Optional[str] = None, - tool_context: ToolContext, -) -> str: - """Hot-swap the active LiteLLM model for this session.""" - - spec = parse_model_spec(model, provider=provider) - state = HotSwapState.from_mapping(tool_context.state) - state.model = spec.model - state.provider = spec.provider - state.persist(tool_context.state) - try: - 
apply_state_to_agent(tool_context._invocation_context, state) - except Exception as exc: # pragma: no cover - defensive reporting - return f"āŒ Failed to apply model '{state.display_model}': {exc}" - return f"āœ… Model switched to: {state.display_model}" - - -async def set_prompt(prompt: str, *, tool_context: ToolContext) -> str: - """Update or clear the system prompt used for this session.""" - - state = HotSwapState.from_mapping(tool_context.state) - prompt_value = prompt.strip() - state.prompt = prompt_value or None - state.persist(tool_context.state) - if state.prompt: - return "āœ… System prompt updated. This change takes effect immediately." - return "āœ… System prompt cleared. Reverting to default instruction." - - -async def get_config(*, tool_context: ToolContext) -> str: - """Return a summary of the current model and prompt configuration.""" - - state = HotSwapState.from_mapping(tool_context.state) - return state.describe() - - -HOTSWAP_TOOLS = [ - FunctionTool(set_model), - FunctionTool(set_prompt), - FunctionTool(get_config), -] - - -__all__ = [ - "set_model", - "set_prompt", - "get_config", - "HOTSWAP_TOOLS", -] diff --git a/ai/agents/task_agent/main.py b/ai/agents/task_agent/main.py deleted file mode 100644 index d675cad..0000000 --- a/ai/agents/task_agent/main.py +++ /dev/null @@ -1,13 +0,0 @@ -"""ASGI entrypoint for containerized deployments.""" - -from pathlib import Path - -from google.adk.cli.fast_api import get_fast_api_app - -AGENT_DIR = Path(__file__).resolve().parent - -app = get_fast_api_app( - agents_dir=str(AGENT_DIR), - web=False, - a2a=True, -) diff --git a/ai/agents/task_agent/requirements.txt b/ai/agents/task_agent/requirements.txt deleted file mode 100644 index 132e57b..0000000 --- a/ai/agents/task_agent/requirements.txt +++ /dev/null @@ -1,4 +0,0 @@ -google-adk -a2a-sdk[all] -litellm -python-dotenv diff --git a/ai/llm.txt b/ai/llm.txt deleted file mode 100644 index 4c54800..0000000 --- a/ai/llm.txt +++ /dev/null @@ -1,93 +0,0 @@ 
-FuzzForge AI LLM Configuration Guide -=================================== - -This note summarises the environment variables and libraries that drive LiteLLM (via the Google ADK runtime) inside the FuzzForge AI module. For complete matrices and advanced examples, read `docs/advanced/configuration.md`. - -Core Libraries --------------- -- `google-adk` – hosts the agent runtime, memory services, and LiteLLM bridge. -- `litellm` – provider-agnostic LLM client used by ADK and the executor. -- Provider SDKs – install the SDK that matches your target backend (`openai`, `anthropic`, `google-cloud-aiplatform`, `groq`, etc.). -- Optional extras: `agentops` for tracing, `cognee[all]` for knowledge-graph ingestion, `ollama` CLI for running local models. - -Quick install foundation:: - -``` -pip install google-adk litellm openai -``` - -Add any provider-specific SDKs (for example `pip install anthropic groq`) on top of that base. - -Baseline Setup --------------- -Copy `.fuzzforge/.env.template` to `.fuzzforge/.env` and set the core fields: - -``` -LLM_PROVIDER=openai -LITELLM_MODEL=gpt-5-mini -OPENAI_API_KEY=sk-your-key -FUZZFORGE_MCP_URL=http://localhost:8010/mcp -SESSION_PERSISTENCE=sqlite -MEMORY_SERVICE=inmemory -``` - -LiteLLM Provider Examples -------------------------- - -OpenAI-compatible (Azure, etc.):: -``` -LLM_PROVIDER=azure_openai -LITELLM_MODEL=gpt-4o-mini -LLM_API_KEY=sk-your-azure-key -LLM_ENDPOINT=https://your-resource.openai.azure.com -``` - -Anthropic:: -``` -LLM_PROVIDER=anthropic -LITELLM_MODEL=claude-3-haiku-20240307 -ANTHROPIC_API_KEY=sk-your-key -``` - -Ollama (local):: -``` -LLM_PROVIDER=ollama_chat -LITELLM_MODEL=codellama:latest -OLLAMA_API_BASE=http://localhost:11434 -``` -Run `ollama pull codellama:latest` so the adapter can respond immediately. 
- -Vertex AI:: -``` -LLM_PROVIDER=vertex_ai -LITELLM_MODEL=gemini-1.5-pro -GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json -``` - -Provider Checklist ------------------- -- **OpenAI / Azure OpenAI**: `LLM_PROVIDER`, `LITELLM_MODEL`, API key, optional endpoint + API version (Azure). -- **Anthropic**: `LLM_PROVIDER=anthropic`, `LITELLM_MODEL`, `ANTHROPIC_API_KEY`. -- **Google Vertex AI**: `LLM_PROVIDER=vertex_ai`, `LITELLM_MODEL`, `GOOGLE_APPLICATION_CREDENTIALS`, `GOOGLE_CLOUD_PROJECT`. -- **Groq**: `LLM_PROVIDER=groq`, `LITELLM_MODEL`, `GROQ_API_KEY`. -- **Ollama / Local**: `LLM_PROVIDER=ollama_chat`, `LITELLM_MODEL`, `OLLAMA_API_BASE`, and the model pulled locally (`ollama pull <model>`). - -Knowledge Graph Add-ons ------------------------ -Set these only if you plan to use Cognee project graphs: - -``` -LLM_COGNEE_PROVIDER=openai -LLM_COGNEE_MODEL=gpt-5-mini -LLM_COGNEE_API_KEY=sk-your-key -``` - -Tracing & Debugging -------------------- -- Provide `AGENTOPS_API_KEY` to enable hosted traces for every conversation. -- Set `FUZZFORGE_DEBUG=1` (and optionally `LOG_LEVEL=DEBUG`) for verbose executor output. -- Restart the agent after changing environment variables; LiteLLM loads configuration on boot. - -Further Reading ----------------- -`docs/advanced/configuration.md` – provider comparison, debugging flags, and referenced modules. diff --git a/ai/proxy/README.md deleted file mode 100644 index fc941eb..0000000 --- a/ai/proxy/README.md +++ /dev/null @@ -1,5 +0,0 @@ -# LLM Proxy Integrations - -This directory contains vendor source trees included only for reference when integrating LLM gateways. The actual FuzzForge deployment uses the official Docker images for each project. - -See `docs/docs/how-to/llm-proxy.md` for up-to-date instructions on running the proxy services and issuing keys for the agents. 
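The `llm.txt` provider checklist above maps each backend onto a handful of environment variables, and the translation into LiteLLM call arguments is mechanical — the same pattern `instantiate_llm` in the deleted `state.py` follows. A minimal sketch of that mapping (the helper name and the explicit `env` dict are illustrative, not part of the FuzzForge API):

```python
def build_litellm_kwargs(env: dict[str, str]) -> dict[str, str]:
    """Translate FuzzForge-style env vars into LiteLLM completion kwargs.

    Mirrors the checklist: the model is mandatory; provider, API key, and
    endpoint are passed through only when the backend needs them.
    """
    kwargs = {"model": env["LITELLM_MODEL"]}
    provider = env.get("LLM_PROVIDER")
    if provider:
        kwargs["custom_llm_provider"] = provider
    # Generic key first, then the OpenAI-specific fallback.
    api_key = env.get("LLM_API_KEY") or env.get("OPENAI_API_KEY")
    if api_key:
        kwargs["api_key"] = api_key
    # Azure-style endpoint or a local Ollama server.
    endpoint = env.get("LLM_ENDPOINT") or env.get("OLLAMA_API_BASE")
    if endpoint:
        kwargs["api_base"] = endpoint
    return kwargs


# The Ollama configuration from the guide above.
env = {
    "LLM_PROVIDER": "ollama_chat",
    "LITELLM_MODEL": "codellama:latest",
    "OLLAMA_API_BASE": "http://localhost:11434",
}
print(build_litellm_kwargs(env))
```

The resulting dict can be splatted into `litellm.completion(**kwargs, messages=...)` or into a `LiteLlm(**kwargs)` model wrapper under ADK.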
diff --git a/ai/pyproject.toml b/ai/pyproject.toml deleted file mode 100644 index 120b9cc..0000000 --- a/ai/pyproject.toml +++ /dev/null @@ -1,44 +0,0 @@ -[project] -name = "fuzzforge-ai" -version = "0.7.3" -description = "FuzzForge AI orchestration module" -readme = "README.md" -requires-python = ">=3.11" -dependencies = [ - "google-adk", - "a2a-sdk", - "litellm", - "python-dotenv", - "httpx", - "uvicorn", - "rich", - "agentops", - "fastmcp", - "mcp", - "typing-extensions", - "cognee>=0.3.0", -] - -[project.optional-dependencies] -dev = [ - "pytest", - "pytest-asyncio", - "black", - "ruff", -] - -[build-system] -requires = ["hatchling"] -build-backend = "hatchling.build" - -[tool.hatch.build.targets.wheel] -packages = ["src/fuzzforge_ai"] - -[tool.hatch.metadata] -allow-direct-references = true - -[tool.uv] -dev-dependencies = [ - "pytest", - "pytest-asyncio", -] diff --git a/ai/src/fuzzforge_ai/__init__.py b/ai/src/fuzzforge_ai/__init__.py deleted file mode 100644 index eefecd9..0000000 --- a/ai/src/fuzzforge_ai/__init__.py +++ /dev/null @@ -1,24 +0,0 @@ -""" -FuzzForge AI Module - Agent-to-Agent orchestration system - -This module integrates the fuzzforge_ai components into FuzzForge, -providing intelligent AI agent capabilities for security analysis. - -Usage: - from fuzzforge_ai.a2a_wrapper import send_agent_task - from fuzzforge_ai.agent import FuzzForgeAgent - from fuzzforge_ai.config_manager import ConfigManager -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- - -__version__ = "0.7.3" \ No newline at end of file diff --git a/ai/src/fuzzforge_ai/__main__.py b/ai/src/fuzzforge_ai/__main__.py deleted file mode 100644 index 297369f..0000000 --- a/ai/src/fuzzforge_ai/__main__.py +++ /dev/null @@ -1,110 +0,0 @@ -# ruff: noqa: E402 # Imports delayed for environment/logging setup -""" -FuzzForge A2A Server -Run this to expose FuzzForge as an A2A-compatible agent -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import os -import warnings -import logging -from dotenv import load_dotenv - -from fuzzforge_ai.config_bridge import ProjectConfigManager - -# Suppress warnings -warnings.filterwarnings("ignore") -logging.getLogger("google.adk").setLevel(logging.ERROR) -logging.getLogger("google.adk.tools.base_authenticated_tool").setLevel(logging.ERROR) - -# Load .env from .fuzzforge directory first, then fallback -from pathlib import Path - -# Ensure Cognee logs stay inside the project workspace -project_root = Path.cwd() -default_log_dir = project_root / ".fuzzforge" / "logs" -default_log_dir.mkdir(parents=True, exist_ok=True) -log_path = default_log_dir / "cognee.log" -os.environ.setdefault("COGNEE_LOG_PATH", str(log_path)) -fuzzforge_env = Path.cwd() / ".fuzzforge" / ".env" -if fuzzforge_env.exists(): - load_dotenv(fuzzforge_env, override=True) -else: - load_dotenv(override=True) - -# Ensure Cognee uses the project-specific storage paths when available -try: - project_config = ProjectConfigManager() - project_config.setup_cognee_environment() -except Exception: - # Project may not be initialized; fall 
through with default settings - pass - -# Check configuration -if not os.getenv('LITELLM_MODEL'): - print("[ERROR] LITELLM_MODEL not set in .env file") - print("Please set LITELLM_MODEL to your desired model (e.g., gpt-4o-mini)") - exit(1) - -from .agent import get_fuzzforge_agent -from .a2a_server import create_a2a_app as create_custom_a2a_app - - -def create_a2a_app(): - """Create the A2A application""" - # Get configuration - port = int(os.getenv('FUZZFORGE_PORT', 10100)) - - # Get the FuzzForge agent - fuzzforge = get_fuzzforge_agent() - - # Print ASCII banner - print("\033[95m") # Purple color - print(" ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā•— ā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•—") - print(" ā–ˆā–ˆā•”ā•ā•ā•ā•ā•ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā•šā•ā•ā–ˆā–ˆā–ˆā•”ā•ā•šā•ā•ā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā•”ā•ā•ā•ā•ā•ā–ˆā–ˆā•”ā•ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•”ā•ā•ā•ā•ā• ā–ˆā–ˆā•”ā•ā•ā•ā•ā• ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•‘") - print(" ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā•‘ ā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•‘ā–ˆā–ˆā•‘") - print(" ā–ˆā–ˆā•”ā•ā•ā• ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā•”ā•ā•ā• ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā•”ā•ā•ā• ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•‘ā–ˆā–ˆā•‘") - print(" ā–ˆā–ˆā•‘ ā•šā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā•‘ ā•šā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā•šā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā•‘") - print(" ā•šā•ā• ā•šā•ā•ā•ā•ā•ā• ā•šā•ā•ā•ā•ā•ā•ā•ā•šā•ā•ā•ā•ā•ā•ā•ā•šā•ā• ā•šā•ā•ā•ā•ā•ā• ā•šā•ā• ā•šā•ā• ā•šā•ā•ā•ā•ā•ā• ā•šā•ā•ā•ā•ā•ā•ā• ā•šā•ā• 
ā•šā•ā•ā•šā•ā•") - print("\033[0m") # Reset color - - # Create A2A app - print("šŸš€ Starting FuzzForge A2A Server") - print(f" Model: {fuzzforge.model}") - if fuzzforge.cognee_url: - print(f" Memory: Cognee at {fuzzforge.cognee_url}") - print(f" Port: {port}") - - app = create_custom_a2a_app(fuzzforge.adk_agent, port=port, executor=fuzzforge.executor) - - print("\nāœ… FuzzForge A2A Server ready!") - print(f" Agent card: http://localhost:{port}/.well-known/agent-card.json") - print(f" A2A endpoint: http://localhost:{port}/") - print(f"\nšŸ“” Other agents can register FuzzForge at: http://localhost:{port}") - - return app - - -def main(): - """Start the A2A server using uvicorn.""" - import uvicorn - - app = create_a2a_app() - port = int(os.getenv('FUZZFORGE_PORT', 10100)) - - print("\nšŸŽÆ Starting server with uvicorn...") - uvicorn.run(app, host="127.0.0.1", port=port) - - -if __name__ == "__main__": - main() diff --git a/ai/src/fuzzforge_ai/a2a_server.py b/ai/src/fuzzforge_ai/a2a_server.py deleted file mode 100644 index 8c67e8e..0000000 --- a/ai/src/fuzzforge_ai/a2a_server.py +++ /dev/null @@ -1,229 +0,0 @@ -"""Custom A2A wiring so we can access task store and queue manager.""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- - -from __future__ import annotations - -import logging -from typing import Optional, Union - -from starlette.applications import Starlette -from starlette.responses import Response, FileResponse - -from google.adk.a2a.executor.a2a_agent_executor import A2aAgentExecutor -from google.adk.a2a.utils.agent_card_builder import AgentCardBuilder -from google.adk.a2a.experimental import a2a_experimental -from google.adk.agents.base_agent import BaseAgent -from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService -from google.adk.auth.credential_service.in_memory_credential_service import InMemoryCredentialService -from google.adk.cli.utils.logs import setup_adk_logger -from google.adk.memory.in_memory_memory_service import InMemoryMemoryService -from google.adk.runners import Runner -from google.adk.sessions.in_memory_session_service import InMemorySessionService - -from a2a.server.apps import A2AStarletteApplication -from a2a.server.request_handlers.default_request_handler import DefaultRequestHandler -from a2a.server.tasks.inmemory_task_store import InMemoryTaskStore -from a2a.server.events.in_memory_queue_manager import InMemoryQueueManager -from a2a.types import AgentCard - -from .agent_executor import FuzzForgeExecutor - - -import json - - -async def serve_artifact(request): - """Serve artifact files via HTTP for A2A agents""" - artifact_id = request.path_params["artifact_id"] - - # Try to get the executor instance to access artifact cache - # We'll store a reference to it during app creation - executor = getattr(serve_artifact, '_executor', None) - if not executor: - return Response("Artifact service not available", status_code=503) - - try: - # Look in the artifact cache directory - artifact_cache_dir = executor._artifact_cache_dir - artifact_dir = artifact_cache_dir / artifact_id - - if not artifact_dir.exists(): - return Response("Artifact not found", status_code=404) - - # Find the artifact file (should be only one file in the 
directory) - artifact_files = list(artifact_dir.glob("*")) - if not artifact_files: - return Response("Artifact file not found", status_code=404) - - artifact_file = artifact_files[0] # Take the first (and should be only) file - - # Determine mime type from file extension or default to octet-stream - import mimetypes - mime_type, _ = mimetypes.guess_type(str(artifact_file)) - if not mime_type: - mime_type = 'application/octet-stream' - - return FileResponse( - path=str(artifact_file), - media_type=mime_type, - filename=artifact_file.name - ) - - except Exception as e: - return Response(f"Error serving artifact: {str(e)}", status_code=500) - - -async def knowledge_query(request): - """Expose knowledge graph search over HTTP for external agents.""" - executor = getattr(knowledge_query, '_executor', None) - if not executor: - return Response("Knowledge service not available", status_code=503) - - try: - payload = await request.json() - except Exception: - return Response("Invalid JSON body", status_code=400) - - query = payload.get("query") - if not query: - return Response("'query' is required", status_code=400) - - search_type = payload.get("search_type", "INSIGHTS") - dataset = payload.get("dataset") - - result = await executor.query_project_knowledge_api( - query=query, - search_type=search_type, - dataset=dataset, - ) - - status = 200 if not isinstance(result, dict) or "error" not in result else 400 - return Response( - json.dumps(result, default=str), - status_code=status, - media_type="application/json", - ) - - -async def create_file_artifact(request): - """Create an artifact from a project file via HTTP.""" - executor = getattr(create_file_artifact, '_executor', None) - if not executor: - return Response("File service not available", status_code=503) - - try: - payload = await request.json() - except Exception: - return Response("Invalid JSON body", status_code=400) - - path = payload.get("path") - if not path: - return Response("'path' is required", 
status_code=400) - - result = await executor.create_project_file_artifact_api(path) - status = 200 if not isinstance(result, dict) or "error" not in result else 400 - return Response( - json.dumps(result, default=str), - status_code=status, - media_type="application/json", - ) - - -def _load_agent_card(agent_card: Optional[Union[AgentCard, str]]) -> Optional[AgentCard]: - if agent_card is None: - return None - if isinstance(agent_card, AgentCard): - return agent_card - - import json - from pathlib import Path - - path = Path(agent_card) - with path.open('r', encoding='utf-8') as handle: - data = json.load(handle) - return AgentCard(**data) - - -@a2a_experimental -def create_a2a_app( - agent: BaseAgent, - *, - host: str = "localhost", - port: int = 8000, - protocol: str = "http", - agent_card: Optional[Union[AgentCard, str]] = None, - executor=None, # Accept executor reference -) -> Starlette: - """Variant of google.adk.a2a.utils.to_a2a that exposes task-store handles.""" - - setup_adk_logger(logging.INFO) - - async def create_runner() -> Runner: - return Runner( - agent=agent, - app_name=agent.name or "fuzzforge", - artifact_service=InMemoryArtifactService(), - session_service=InMemorySessionService(), - memory_service=InMemoryMemoryService(), - credential_service=InMemoryCredentialService(), - ) - - task_store = InMemoryTaskStore() - queue_manager = InMemoryQueueManager() - - agent_executor = A2aAgentExecutor(runner=create_runner) - request_handler = DefaultRequestHandler( - agent_executor=agent_executor, - task_store=task_store, - queue_manager=queue_manager, - ) - - rpc_url = f"{protocol}://{host}:{port}/" - provided_card = _load_agent_card(agent_card) - - card_builder = AgentCardBuilder(agent=agent, rpc_url=rpc_url) - - app = Starlette() - - async def setup() -> None: - if provided_card is not None: - final_card = provided_card - else: - final_card = await card_builder.build() - - a2a_app = A2AStarletteApplication( - agent_card=final_card, - 
http_handler=request_handler, - ) - a2a_app.add_routes_to_app(app) - - # Add artifact serving route - app.router.add_route("/artifacts/{artifact_id}", serve_artifact, methods=["GET"]) - app.router.add_route("/graph/query", knowledge_query, methods=["POST"]) - app.router.add_route("/project/files", create_file_artifact, methods=["POST"]) - - app.add_event_handler("startup", setup) - - # Expose handles so the executor can emit task updates later - FuzzForgeExecutor.task_store = task_store - FuzzForgeExecutor.queue_manager = queue_manager - - # Store reference to executor for artifact serving - serve_artifact._executor = executor - knowledge_query._executor = executor - create_file_artifact._executor = executor - - return app - - -__all__ = ["create_a2a_app"] diff --git a/ai/src/fuzzforge_ai/a2a_wrapper.py b/ai/src/fuzzforge_ai/a2a_wrapper.py deleted file mode 100644 index 0535404..0000000 --- a/ai/src/fuzzforge_ai/a2a_wrapper.py +++ /dev/null @@ -1,288 +0,0 @@ -""" -A2A Wrapper Module for FuzzForge -Programmatic interface to send tasks to A2A agents with custom model/prompt/context -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- -from __future__ import annotations - -from typing import Optional, Any -from uuid import uuid4 - -import httpx -from a2a.client import A2AClient -from a2a.client.errors import A2AClientHTTPError -from a2a.types import ( - JSONRPCErrorResponse, - Message, - MessageSendConfiguration, - MessageSendParams, - Part, - Role, - SendMessageRequest, - SendStreamingMessageRequest, - Task, - TaskArtifactUpdateEvent, - TaskStatusUpdateEvent, - TextPart, -) - - -class A2ATaskResult: - """Result from an A2A agent task""" - - def __init__(self, text: str, context_id: str, raw_response: Any = None): - self.text = text - self.context_id = context_id - self.raw_response = raw_response - - def __str__(self) -> str: - return self.text - - def __repr__(self) -> str: - return f"A2ATaskResult(text={self.text[:50]}..., context_id={self.context_id})" - - -def _build_control_message(command: str, payload: Optional[str] = None) -> str: - """Build a control message for hot-swapping agent configuration""" - if payload is None or payload == "": - return f"[HOTSWAP:{command}]" - return f"[HOTSWAP:{command}:{payload}]" - - -def _extract_text( - result: Message | Task | TaskStatusUpdateEvent | TaskArtifactUpdateEvent, -) -> list[str]: - """Extract text content from A2A response objects""" - texts: list[str] = [] - if isinstance(result, Message): - if result.role is Role.agent: - for part in result.parts: - root_part = part.root - text = getattr(root_part, "text", None) - if text: - texts.append(text) - elif isinstance(result, Task) and result.history: - for msg in result.history: - if msg.role is Role.agent: - for part in msg.parts: - root_part = part.root - text = getattr(root_part, "text", None) - if text: - texts.append(text) - elif isinstance(result, TaskStatusUpdateEvent): - message = result.status.message - if message: - texts.extend(_extract_text(message)) - elif isinstance(result, TaskArtifactUpdateEvent): - artifact = result.artifact - if artifact and artifact.parts: - for part in 
artifact.parts: - root_part = part.root - text = getattr(root_part, "text", None) - if text: - texts.append(text) - return texts - - -async def _send_message( - client: A2AClient, - message: str, - context_id: str, -) -> str: - """Send a message to the A2A agent and collect the response""" - - params = MessageSendParams( - configuration=MessageSendConfiguration(blocking=True), - message=Message( - context_id=context_id, - message_id=str(uuid4()), - role=Role.user, - parts=[Part(root=TextPart(text=message))], - ), - ) - - stream_request = SendStreamingMessageRequest(id=str(uuid4()), params=params) - buffer: list[str] = [] - - try: - async for response in client.send_message_streaming(stream_request): - root = response.root - if isinstance(root, JSONRPCErrorResponse): - raise RuntimeError(f"A2A error: {root.error}") - - payload = root.result - buffer.extend(_extract_text(payload)) - except A2AClientHTTPError as exc: - if "text/event-stream" not in str(exc): - raise - - # Fallback to non-streaming - send_request = SendMessageRequest(id=str(uuid4()), params=params) - response = await client.send_message(send_request) - root = response.root - if isinstance(root, JSONRPCErrorResponse): - raise RuntimeError(f"A2A error: {root.error}") - payload = root.result - buffer.extend(_extract_text(payload)) - - if buffer: - buffer = list(dict.fromkeys(buffer)) # Remove duplicates - return "\n".join(buffer).strip() - - -async def send_agent_task( - url: str, - message: str, - *, - model: Optional[str] = None, - provider: Optional[str] = None, - prompt: Optional[str] = None, - context: Optional[str] = None, - timeout: float = 120.0, -) -> A2ATaskResult: - """ - Send a task to an A2A agent with optional model/prompt configuration. 
- - Args: - url: A2A endpoint URL (e.g., "http://127.0.0.1:8000/a2a/litellm_agent") - message: The task message to send to the agent - model: Optional model name (e.g., "gpt-4o", "gemini-2.0-flash") - provider: Optional provider name (e.g., "openai", "gemini") - prompt: Optional system prompt to set before sending the message - context: Optional context/session ID (generated if not provided) - timeout: Request timeout in seconds (default: 120) - - Returns: - A2ATaskResult with the agent's response text and context ID - - Example: - >>> result = await send_agent_task( - ... url="http://127.0.0.1:8000/a2a/litellm_agent", - ... model="gpt-4o", - ... provider="openai", - ... prompt="You are concise.", - ... message="Give me a fuzzing harness.", - ... context="fuzzing", - ... timeout=120 - ... ) - >>> print(result.text) - """ - timeout_config = httpx.Timeout(timeout) - context_id = context or str(uuid4()) - - async with httpx.AsyncClient(timeout=timeout_config) as http_client: - client = A2AClient(url=url, httpx_client=http_client) - - # Set model if provided - if model: - model_spec = f"{provider}/{model}" if provider else model - control_msg = _build_control_message("MODEL", model_spec) - await _send_message(client, control_msg, context_id) - - # Set prompt if provided - if prompt is not None: - control_msg = _build_control_message("PROMPT", prompt) - await _send_message(client, control_msg, context_id) - - # Send the actual task message - response_text = await _send_message(client, message, context_id) - - return A2ATaskResult( - text=response_text, - context_id=context_id, - ) - - -async def get_agent_config( - url: str, - context: Optional[str] = None, - timeout: float = 60.0, -) -> str: - """ - Get the current configuration of an A2A agent. 
- - Args: - url: A2A endpoint URL - context: Optional context/session ID - timeout: Request timeout in seconds - - Returns: - Configuration string from the agent - """ - timeout_config = httpx.Timeout(timeout) - context_id = context or str(uuid4()) - - async with httpx.AsyncClient(timeout=timeout_config) as http_client: - client = A2AClient(url=url, httpx_client=http_client) - control_msg = _build_control_message("GET_CONFIG") - config_text = await _send_message(client, control_msg, context_id) - return config_text - - -async def hot_swap_model( - url: str, - model: str, - provider: Optional[str] = None, - context: Optional[str] = None, - timeout: float = 60.0, -) -> str: - """ - Hot-swap the model of an A2A agent without sending a task. - - Args: - url: A2A endpoint URL - model: Model name to switch to - provider: Optional provider name - context: Optional context/session ID - timeout: Request timeout in seconds - - Returns: - Response from the agent - """ - timeout_config = httpx.Timeout(timeout) - context_id = context or str(uuid4()) - - async with httpx.AsyncClient(timeout=timeout_config) as http_client: - client = A2AClient(url=url, httpx_client=http_client) - model_spec = f"{provider}/{model}" if provider else model - control_msg = _build_control_message("MODEL", model_spec) - response = await _send_message(client, control_msg, context_id) - return response - - -async def hot_swap_prompt( - url: str, - prompt: str, - context: Optional[str] = None, - timeout: float = 60.0, -) -> str: - """ - Hot-swap the system prompt of an A2A agent. 
- - Args: - url: A2A endpoint URL - prompt: System prompt to set - context: Optional context/session ID - timeout: Request timeout in seconds - - Returns: - Response from the agent - """ - timeout_config = httpx.Timeout(timeout) - context_id = context or str(uuid4()) - - async with httpx.AsyncClient(timeout=timeout_config) as http_client: - client = A2AClient(url=url, httpx_client=http_client) - control_msg = _build_control_message("PROMPT", prompt) - response = await _send_message(client, control_msg, context_id) - return response diff --git a/ai/src/fuzzforge_ai/agent.py b/ai/src/fuzzforge_ai/agent.py deleted file mode 100644 index b33b6cd..0000000 --- a/ai/src/fuzzforge_ai/agent.py +++ /dev/null @@ -1,133 +0,0 @@ -""" -FuzzForge Agent Definition -The core agent that combines all components -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- - -import os -from pathlib import Path -from typing import Dict, Any, List -from google.adk import Agent -from google.adk.models.lite_llm import LiteLlm -from .agent_card import get_fuzzforge_agent_card -from .agent_executor import FuzzForgeExecutor -from .memory_service import FuzzForgeMemoryService, HybridMemoryManager - -# Load environment variables from the AI module's .env file -try: - from dotenv import load_dotenv - _ai_dir = Path(__file__).parent - _env_file = _ai_dir / ".env" - if _env_file.exists(): - load_dotenv(_env_file, override=False) # Don't override existing env vars -except ImportError: - # dotenv not available, skip loading - pass - - -class FuzzForgeAgent: - """The main FuzzForge agent that combines card, executor, and ADK agent""" - - def __init__( - self, - model: str = None, - cognee_url: str = None, - port: int = 10100, - ): - """Initialize FuzzForge agent with configuration""" - self.model = model or os.getenv('LITELLM_MODEL', 'gpt-4o-mini') - self.cognee_url = cognee_url or os.getenv('COGNEE_MCP_URL') - self.port = port - - # Initialize ADK Memory Service for conversational memory - memory_type = os.getenv('MEMORY_SERVICE', 'inmemory') - self.memory_service = FuzzForgeMemoryService(memory_type=memory_type) - - # Create the executor (the brain) with memory and session services - self.executor = FuzzForgeExecutor( - model=self.model, - cognee_url=self.cognee_url, - debug=os.getenv('FUZZFORGE_DEBUG', '0') == '1', - memory_service=self.memory_service, - session_persistence=os.getenv('SESSION_PERSISTENCE', 'inmemory'), - fuzzforge_mcp_url=None, # Disabled - ) - - # Create Hybrid Memory Manager (ADK + Cognee direct integration) - # MCP tools removed - using direct Cognee integration only - self.memory_manager = HybridMemoryManager( - memory_service=self.memory_service, - cognee_tools=None # No MCP tools, direct integration used instead - ) - - # Get the agent card (the identity) - self.agent_card = 
get_fuzzforge_agent_card(f"http://localhost:{self.port}") - - # Create the ADK agent (for A2A server mode) - self.adk_agent = self._create_adk_agent() - - def _create_adk_agent(self) -> Agent: - """Create the ADK agent for A2A server mode""" - # Build instruction - instruction = f"""You are {self.agent_card.name}, {self.agent_card.description} - -Your capabilities include: -""" - for skill in self.agent_card.skills: - instruction += f"\n- {skill.name}: {skill.description}" - - instruction += """ - -When responding to requests: -1. Use your registered agents when appropriate -2. Use Cognee memory tools when available -3. Provide helpful, concise responses -4. Maintain context across conversations -""" - - # Create ADK agent - return Agent( - model=LiteLlm(model=self.model), - name=self.agent_card.name, - description=self.agent_card.description, - instruction=instruction, - tools=self.executor.agent.tools if hasattr(self.executor.agent, 'tools') else [] - ) - - async def process_message(self, message: str, context_id: str = None) -> str: - """Process a message using the executor""" - result = await self.executor.execute(message, context_id or "default") - return result.get("response", "No response generated") - - async def register_agent(self, url: str) -> Dict[str, Any]: - """Register a new agent""" - return await self.executor.register_agent(url) - - def list_agents(self) -> List[Dict[str, Any]]: - """List registered agents""" - return self.executor.list_agents() - - async def cleanup(self): - """Clean up resources""" - await self.executor.cleanup() - - -# Create a singleton instance for import -_instance = None - -def get_fuzzforge_agent() -> FuzzForgeAgent: - """Get the singleton FuzzForge agent instance""" - global _instance - if _instance is None: - _instance = FuzzForgeAgent() - return _instance diff --git a/ai/src/fuzzforge_ai/agent_card.py b/ai/src/fuzzforge_ai/agent_card.py deleted file mode 100644 index 175473d..0000000 --- 
a/ai/src/fuzzforge_ai/agent_card.py +++ /dev/null @@ -1,182 +0,0 @@ -""" -FuzzForge Agent Card and Skills Definition -Defines what FuzzForge can do and how others can discover it -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -from dataclasses import dataclass -from typing import List, Dict, Any - -@dataclass -class AgentSkill: - """Represents a specific capability of the agent""" - id: str - name: str - description: str - tags: List[str] - examples: List[str] - input_modes: List[str] = None - output_modes: List[str] = None - - def to_dict(self) -> Dict[str, Any]: - """Convert to dictionary for JSON serialization""" - return { - "id": self.id, - "name": self.name, - "description": self.description, - "tags": self.tags, - "examples": self.examples, - "inputModes": self.input_modes or ["text/plain"], - "outputModes": self.output_modes or ["text/plain"] - } - - -@dataclass -class AgentCapabilities: - """Defines agent capabilities for A2A protocol""" - streaming: bool = False - push_notifications: bool = False - multi_turn: bool = True - context_retention: bool = True - - def to_dict(self) -> Dict[str, Any]: - return { - "streaming": self.streaming, - "pushNotifications": self.push_notifications, - "multiTurn": self.multi_turn, - "contextRetention": self.context_retention - } - - -@dataclass -class AgentCard: - """The agent's business card - tells others what this agent can do""" - name: str - description: str - version: str - url: str - skills: List[AgentSkill] - capabilities: AgentCapabilities - default_input_modes: List[str] = None - 
default_output_modes: List[str] = None - preferred_transport: str = "JSONRPC" - protocol_version: str = "0.3.0" - - def to_dict(self) -> Dict[str, Any]: - """Convert to A2A-compliant agent card JSON""" - return { - "name": self.name, - "description": self.description, - "version": self.version, - "url": self.url, - "protocolVersion": self.protocol_version, - "preferredTransport": self.preferred_transport, - "defaultInputModes": self.default_input_modes or ["text/plain"], - "defaultOutputModes": self.default_output_modes or ["text/plain"], - "capabilities": self.capabilities.to_dict(), - "skills": [skill.to_dict() for skill in self.skills] - } - - -# Define FuzzForge's skills -orchestration_skill = AgentSkill( - id="orchestration", - name="Agent Orchestration", - description="Route requests to appropriate registered agents based on their capabilities", - tags=["orchestration", "routing", "coordination"], - examples=[ - "Route this to the calculator", - "Send this to the appropriate agent", - "Which agent should handle this?" 
- ] -) - -memory_skill = AgentSkill( - id="memory", - name="Memory Management", - description="Store and retrieve information using Cognee knowledge graph", - tags=["memory", "knowledge", "storage", "cognee"], - examples=[ - "Remember that my favorite color is blue", - "What do you remember about me?", - "Search your memory for project details" - ] -) - -conversation_skill = AgentSkill( - id="conversation", - name="General Conversation", - description="Engage in general conversation and answer questions using LLM", - tags=["chat", "conversation", "qa", "llm"], - examples=[ - "What is the meaning of life?", - "Explain quantum computing", - "Help me understand this concept" - ] -) - -workflow_automation_skill = AgentSkill( - id="workflow_automation", - name="Workflow Automation", - description="Operate project workflows via MCP, monitor runs, and share results", - tags=["workflow", "automation", "mcp", "orchestration"], - examples=[ - "Submit the security assessment workflow", - "Kick off the infrastructure scan and monitor it", - "Summarise findings for run abc123" - ] -) - -agent_management_skill = AgentSkill( - id="agent_management", - name="Agent Registry Management", - description="Register, list, and manage connections to other A2A agents", - tags=["registry", "management", "discovery"], - examples=[ - "Register agent at http://localhost:10201", - "List all registered agents", - "Show agent capabilities" - ] -) - -# Define FuzzForge's capabilities -fuzzforge_capabilities = AgentCapabilities( - streaming=False, - push_notifications=True, - multi_turn=True, # We support multi-turn conversations - context_retention=True # We maintain context across turns -) - -# Create the public agent card -def get_fuzzforge_agent_card(url: str = "http://localhost:10100") -> AgentCard: - """Get FuzzForge's agent card with current configuration""" - return AgentCard( - name="ProjectOrchestrator", - description=( - "An A2A-capable project agent that can launch and monitor FuzzForge 
workflows, " - "consult the project knowledge graph, and coordinate with speciality agents." - ), - version="project-agent", - url=url, - skills=[ - orchestration_skill, - memory_skill, - conversation_skill, - agent_management_skill - ], - capabilities=fuzzforge_capabilities, - default_input_modes=["text/plain", "application/json"], - default_output_modes=["text/plain", "application/json"], - preferred_transport="JSONRPC", - protocol_version="0.3.0" - ) diff --git a/ai/src/fuzzforge_ai/agent_executor.py b/ai/src/fuzzforge_ai/agent_executor.py deleted file mode 100644 index 41613c0..0000000 --- a/ai/src/fuzzforge_ai/agent_executor.py +++ /dev/null @@ -1,2313 +0,0 @@ -# ruff: noqa: E402 # Imports delayed for environment/logging setup -"""FuzzForge Agent Executor - orchestrates workflows and delegation.""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- - -import asyncio -import time -import uuid -import json -from typing import Dict, Any, List, Union -from datetime import datetime -import os -import warnings -import logging -from pathlib import Path -import mimetypes -import hashlib -import tempfile - -# Suppress warnings -warnings.filterwarnings("ignore") -logging.getLogger("google.adk").setLevel(logging.ERROR) -logging.getLogger("google.adk.tools.base_authenticated_tool").setLevel(logging.ERROR) -logging.getLogger("agentops").setLevel(logging.ERROR) - -from google.genai import types -from google.adk.runners import Runner -from google.adk.sessions import DatabaseSessionService, InMemorySessionService -from google.adk.agents import LlmAgent -from google.adk.models.lite_llm import LiteLlm -from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService -from google.adk.artifacts.gcs_artifact_service import GcsArtifactService -from google.adk.events.event import Event -from google.adk.events.event_actions import EventActions -from google.adk.tools import FunctionTool -from google.adk.tools.long_running_tool import LongRunningFunctionTool -from google.adk.tools.tool_context import ToolContext - -# Optional AgentOps -try: - import agentops - AGENTOPS_AVAILABLE = True -except ImportError: - AGENTOPS_AVAILABLE = False - -# MCP functionality removed - keeping direct Cognee integration only - -from google.genai.types import Part -from a2a.types import ( - Task, - TaskStatus, - TaskState, - TaskStatusUpdateEvent, - Message, - Part as A2APart, -) - -from .remote_agent import RemoteAgentConnection -from .config_bridge import ProjectConfigManager - - -class FuzzForgeExecutor: - """Executes tasks for FuzzForge - the brain of the operation""" - - task_store = None - queue_manager = None - - def __init__( - self, - model: str = None, - cognee_url: str = None, - debug: bool = False, - memory_service=None, - session_persistence: str = None, - fuzzforge_mcp_url: str = None, - ): - """Initialize the executor 
with configuration""" - self.model = model or os.getenv('LITELLM_MODEL', 'gpt-5-mini') - self.cognee_url = cognee_url or os.getenv('COGNEE_MCP_URL') - self.debug = debug - self.memory_service = memory_service # ADK memory service - self.session_persistence = session_persistence or os.getenv('SESSION_PERSISTENCE', 'inmemory') - self.fuzzforge_mcp_url = fuzzforge_mcp_url or os.getenv('FUZZFORGE_MCP_URL') - self._background_tasks: set[asyncio.Task] = set() - self.pending_runs: Dict[str, Dict[str, Any]] = {} - self.session_metadata: Dict[str, Dict[str, Any]] = {} - self._artifact_cache_dir = Path(os.getenv('FUZZFORGE_ARTIFACT_DIR', Path.cwd() / '.fuzzforge' / 'artifacts')) - self._knowledge_integration = None - - # Initialize Cognee service if available - self.cognee_service = None - self._cognee_initialized = False - - # Agent registry - stores registered agents - self.agents: Dict[str, Dict[str, Any]] = {} - - # Session management - self.sessions: Dict[str, Any] = {} - self.session_lookup: Dict[str, str] = {} - - # Create session service based on persistence setting - self.session_service = self._create_session_service() - - # Initialize artifact service (A2A compliant) - self.artifact_service = self._create_artifact_service() - # Local artifact cache for quick access - self.artifacts: Dict[str, List[Dict[str, Any]]] = {} - - # Initialize AgentOps if available - self.agentops_trace = None - if AGENTOPS_AVAILABLE and os.getenv('AGENTOPS_API_KEY'): - try: - agentops.init(api_key=os.getenv('AGENTOPS_API_KEY')) - self.agentops_trace = agentops.start_trace() - if self.debug: - print("[DEBUG] AgentOps tracking enabled") - except Exception as e: - if self.debug: - print(f"[DEBUG] AgentOps init failed: {e}") - - # Initialize the core agent - self._initialize_agent() - - # Auto-register agents from config - self._auto_register_agents() - - # Ensure task store/queue manager exist for CLI usage even without A2A server - if getattr(FuzzForgeExecutor, "task_store", None) is None: 
- try: - from a2a.server.tasks.inmemory_task_store import InMemoryTaskStore - FuzzForgeExecutor.task_store = InMemoryTaskStore() - except Exception: - FuzzForgeExecutor.task_store = None - if getattr(FuzzForgeExecutor, "queue_manager", None) is None: - try: - from a2a.server.events.in_memory_queue_manager import InMemoryQueueManager - FuzzForgeExecutor.queue_manager = InMemoryQueueManager() - except Exception: - FuzzForgeExecutor.queue_manager = None - - self.task_store = FuzzForgeExecutor.task_store - self.queue_manager = FuzzForgeExecutor.queue_manager - - def _auto_register_agents(self): - """Auto-register agents from config file""" - try: - from .config_manager import ConfigManager - config_mgr = ConfigManager() - registered = config_mgr.get_registered_agents() - - if registered and self.debug: - print(f"[DEBUG] Auto-registering {len(registered)} agents from config") - - for agent_config in registered: - url = agent_config.get('url') - name = agent_config.get('name', '') - if url: - # Register silently (don't wait for async) - import asyncio - try: - loop = asyncio.get_event_loop() - if loop.is_running(): - # Schedule for later if loop is already running - asyncio.create_task(self._register_agent_async(url, name)) - else: - # Run now if no loop is running - loop.run_until_complete(self._register_agent_async(url, name)) - except Exception: - # Ignore auto-registration failures - pass - except Exception as e: - if self.debug: - print(f"[DEBUG] Auto-registration failed: {e}") - - async def _register_agent_async(self, url: str, name: str): - """Async helper for auto-registration""" - try: - result = await self.register_agent(url) - if self.debug: - if result.get('success'): - print(f"[DEBUG] Auto-registered: {name or result.get('name')} at {url} as RemoteA2aAgent sub-agent") - else: - print(f"[DEBUG] Failed to auto-register {url}: {result.get('error')}") - except Exception as e: - if self.debug: - print(f"[DEBUG] Auto-registration error for {url}: {e}") - - def 
_create_artifact_service(self): - """Create artifact service based on configuration""" - artifact_storage = os.getenv('ARTIFACT_STORAGE', 'inmemory') - - if artifact_storage.lower() == 'gcs': - # Use Google Cloud Storage for artifacts - bucket_name = os.getenv('GCS_ARTIFACT_BUCKET', 'fuzzforge-artifacts') - if self.debug: - print(f"[DEBUG] Using GCS artifact storage: {bucket_name}") - try: - return GcsArtifactService(bucket_name=bucket_name) - except Exception as e: - if self.debug: - print(f"[DEBUG] GCS artifact service failed: {e}, falling back to in-memory") - return InMemoryArtifactService() - else: - # Default to in-memory artifacts - if self.debug: - print("[DEBUG] Using in-memory artifact service") - return InMemoryArtifactService() - - def _prepare_artifact_cache_dir(self) -> Path: - """Ensure a shared directory exists for delegated artifacts.""" - try: - self._artifact_cache_dir.mkdir(parents=True, exist_ok=True) - return self._artifact_cache_dir - except Exception: - fallback = Path(tempfile.gettempdir()) / "fuzzforge_artifacts" - fallback.mkdir(parents=True, exist_ok=True) - self._artifact_cache_dir = fallback - if self.debug: - print(f"[DEBUG] Falling back to artifact cache dir {fallback}") - return fallback - - def _register_artifact_bytes( - self, - *, - name: str, - data: bytes, - mime_type: str, - sha256_digest: str, - size: int, - artifact_id: str = None, # Optional: use provided ID instead of generating new one - ) -> Dict[str, Any]: - """Persist artifact bytes to cache directory and return metadata.""" - base_dir = self._prepare_artifact_cache_dir() - if artifact_id is None: - artifact_id = uuid.uuid4().hex - artifact_dir = base_dir / artifact_id - artifact_dir.mkdir(parents=True, exist_ok=True) - file_path = artifact_dir / name - file_path.write_bytes(data) - - # Create HTTP URL for A2A artifact serving instead of file:// URI - port = int(os.getenv('FUZZFORGE_PORT', 10100)) - http_uri = f"http://127.0.0.1:{port}/artifacts/{artifact_id}" - - 
return { - "id": artifact_id, - "file_uri": http_uri, - "path": str(file_path), - "name": name, - "mime_type": mime_type, - "sha256": sha256_digest, - "size": size, - } - - def _create_session_service(self): - """Create session service based on persistence setting""" - if self.session_persistence.lower() == 'sqlite': - # Use SQLite for persistent sessions - db_path = os.getenv('SESSION_DB_PATH', './fuzzforge_sessions.db') - # Convert to absolute path for SQLite URL - abs_db_path = os.path.abspath(db_path) - db_url = f"sqlite:///{abs_db_path}" - if self.debug: - print(f"[DEBUG] Using SQLite session persistence: {db_url}") - return DatabaseSessionService(db_url=db_url) - else: - # Default to in-memory sessions - if self.debug: - print("[DEBUG] Using in-memory session service (non-persistent)") - return InMemorySessionService() - - async def _get_cognee_service(self): - """Get or initialize shared Cognee service""" - if self.cognee_service is None or not self._cognee_initialized: - try: - from .cognee_service import CogneeService - - config = ProjectConfigManager() - if not config.is_initialized(): - raise ValueError("FuzzForge project not initialized. 
Run 'fuzzforge init' first.") - - self.cognee_service = CogneeService(config) - await self.cognee_service.initialize() - self._cognee_initialized = True - - if self.debug: - print("[DEBUG] Shared Cognee service initialized") - - except Exception as e: - if self.debug: - print(f"[DEBUG] Failed to initialize Cognee service: {e}") - raise - - return self.cognee_service - - async def _get_knowledge_integration(self): - """Get reusable Cognee project integration for structured queries.""" - if self._knowledge_integration is not None: - return self._knowledge_integration - - try: - from .cognee_integration import CogneeProjectIntegration - - integration = CogneeProjectIntegration() - initialised = await integration.initialize() - if not initialised: - if self.debug: - print("[DEBUG] CogneeProjectIntegration initialization failed") - return None - - self._knowledge_integration = integration - return integration - except Exception as exc: - if self.debug: - print(f"[DEBUG] Knowledge integration unavailable: {exc}") - return None - - def _initialize_agent(self): - """Initialize the LLM agent with tools""" - # Build tools list - tools = [] - - # Add custom function tools for Cognee operations (making it callable as a tool) - - # Define Cognee tool functions - async def cognee_add(text: str) -> str: - """Add information to Cognee knowledge graph memory""" - try: - if self.cognee_service: - result = await self.cognee_service.add_to_memory(text) - return f"Added to Cognee: {result}" - return "Cognee service not available" - except Exception as e: - return f"Error adding to Cognee: {e}" - - async def cognee_search(query: str) -> str: - """Search Cognee knowledge graph memory""" - try: - if self.cognee_service: - results = await self.cognee_service.search_memory(query) - return f"Cognee search results: {results}" - return "Cognee service not available" - except Exception as e: - return f"Error searching Cognee: {e}" - - # Add Cognee project integration tools - async def 
search_project_knowledge(query: str, dataset: str, search_type: str) -> str: - """Search the project's knowledge graph (codebase, documentation, specs, etc.) - - Args: - query: Search query - dataset: Specific dataset to search (optional, searches all if empty) - search_type: Type of search - any SearchType: INSIGHTS, CHUNKS, GRAPH_COMPLETION, CODE, SUMMARIES, RAG_COMPLETION, NATURAL_LANGUAGE, etc. - """ - try: - from cognee.modules.search.types import SearchType - - # Use shared cognee service - cognee_service = await self._get_cognee_service() - config = cognee_service.config - - # Get SearchType enum value dynamically - try: - search_type_enum = getattr(SearchType, search_type.upper()) - except AttributeError: - # Fallback to INSIGHTS if invalid search type - search_type_enum = SearchType.INSIGHTS - search_type = "INSIGHTS" - - # Handle empty/default values - if not dataset: - dataset = None - if not search_type: - search_type = "INSIGHTS" - search_type_enum = SearchType.INSIGHTS - - # Use direct cognee import like ingest command - import cognee - - # Set up user context - try: - from cognee.modules.users.methods import get_user - user_email = f"project_{config.get_project_context()['project_id']}@fuzzforge.example" - user = await get_user(user_email) - cognee.set_user(user) - except Exception: - pass # User context not critical - - # Use cognee search directly for maximum flexibility - search_kwargs = { - "query_type": search_type_enum, - "query_text": query - } - - if dataset: - search_kwargs["datasets"] = [dataset] - - results = await cognee.search(**search_kwargs) - - if not results: - return f"No results found for '{query}'" + (f" in dataset '{dataset}'" if dataset else "") - - project_context = config.get_project_context() - output = f"Search results for '{query}' in project {project_context['project_name']} (search_type: {search_type}):\n\n" - - for i, result in enumerate(results[:5], 1): # Top 5 results - if isinstance(result, str): - preview = 
result[:200] + "..." if len(result) > 200 else result - output += f"{i}. {preview}\n\n" - else: - output += f"{i}. {str(result)[:200]}...\n\n" - - return output - - except Exception as e: - return f"Error searching project knowledge: {e}" - - async def list_project_knowledge() -> str: - """List available knowledge and datasets in the project's knowledge graph""" - try: - import logging - logger = logging.getLogger(__name__) - - # Use shared cognee service - cognee_service = await self._get_cognee_service() - config = cognee_service.config - - project_context = config.get_project_context() - result = f"Available knowledge in project {project_context['project_name']}:\n\n" - - # Use direct cognee import like ingest command does - try: - import cognee - from cognee.modules.search.types import SearchType - - # Set up user context like ingest command - try: - from cognee.modules.users.methods import create_user, get_user - - user_email = f"project_{project_context['project_id']}@fuzzforge.example" - user_tenant = project_context['tenant_id'] - - try: - user = await get_user(user_email) - logger.info(f"Using existing user: {user_email}") - except Exception: - try: - user = await create_user(user_email, user_tenant) - logger.info(f"Created new user: {user_email}") - except Exception: - user = None - - if user: - cognee.set_user(user) - except Exception as e: - logger.warning(f"User context setup failed: {e}") - - # List available datasets - datasets = await cognee.datasets.list_datasets() - logger.info(f"Found datasets: {datasets}") - - if datasets and len(datasets) > 0: - dataset_name = f"{project_context['project_name']}_codebase" - - # Try to search for some basic info to show data exists - try: - sample_results = await cognee.search( - query_type=SearchType.INSIGHTS, - query_text="project overview files functions", - datasets=[dataset_name] - ) - - if sample_results: - data = [f"Dataset '{dataset_name}' contains {len(sample_results)} insights"] + sample_results[:3] - 
else: - data = [f"Dataset '{dataset_name}' exists but no insights found"] - except Exception as search_e: - logger.info(f"Search failed: {search_e}") - data = [f"Dataset '{dataset_name}' exists in: {[str(ds) for ds in datasets]}"] - else: - data = None - - except Exception as e: - data = None - logger.warning(f"Error accessing cognee: {e}") - - if not data: - result += "No data available in knowledge graph\n" - result += "Use 'fuzzforge ingest' to ingest code, documentation, or other project files\n" - else: - # Extract datasets from data - datasets = set() - if isinstance(data, list): - for item in data: - if isinstance(item, dict) and 'dataset_name' in item: - datasets.add(item['dataset_name']) - - if datasets: - result += f"Available Datasets ({len(datasets)}):\n" - for i, dataset in enumerate(sorted(datasets), 1): - result += f" {i}. {dataset}\n" - result += "\n" - - result += f"Total data items: {len(data)}\n" - - # Show sample of available data - result += "\nSample content:\n" - for i, item in enumerate(data[:3], 1): - if isinstance(item, dict): - item_str = str(item)[:100] + "..." if len(str(item)) > 100 else str(item) - result += f" {i}. {item_str}\n" - else: - item_str = str(item)[:100] + "..." if len(str(item)) > 100 else str(item) - result += f" {i}. {item_str}\n" - - return result - - except Exception as e: - return f"Error listing knowledge: {e}" - - async def ingest_to_dataset(content: str, dataset: str) -> str: - """Ingest text content (code, documentation, notes) into a specific project dataset - - Args: - content: Text content to ingest (code, docs, specs, research, etc.) 
- dataset: Dataset name to ingest into - """ - try: - # Use shared cognee service - cognee_service = await self._get_cognee_service() - config = cognee_service.config - - # Ingest the content - success = await cognee_service.ingest_text(content, dataset) - - if success: - project_context = config.get_project_context() - return f"Successfully ingested {len(content)} characters into dataset '{dataset}' for project {project_context['project_name']}" - else: - return f"Failed to ingest content into dataset '{dataset}'" - - except Exception as e: - return f"Error ingesting to dataset: {e}" - - async def cognify_information(text: str) -> str: - """Transform information into knowledge graph format""" - try: - from .cognee_integration import CogneeProjectIntegration - integration = CogneeProjectIntegration() - result = await integration.cognify_text(text) - - if "error" in result: - return f"Error cognifying information: {result['error']}" - - project = result.get('project', 'Unknown') - return f"Successfully transformed information into knowledge graph for project {project}" - except Exception as e: - return f"Error cognifying information: {e}" - - tools.extend([ - FunctionTool(search_project_knowledge), - FunctionTool(list_project_knowledge), - FunctionTool(ingest_to_dataset), - FunctionTool(cognify_information), - FunctionTool(self.query_project_knowledge_api) - ]) - - # Add project-local filesystem tools - async def list_project_files(path: str, pattern: str) -> str: - """List files in the current project directory with optional pattern - - Args: - path: Relative path within project (e.g. '.' for root, 'src', 'tests') - pattern: Glob pattern (e.g. '*.py', '**/*.js', '') - """ - try: - - # Get project root from config - config = ProjectConfigManager() - if not config.is_initialized(): - return "Project not initialized. Run 'fuzzforge init' first." 
- - project_root = config.config_path.parent # Parent of .fuzzforge - requested_path = project_root / path - - # Security check - ensure we stay within project - try: - requested_path = requested_path.resolve() - project_root = project_root.resolve() - requested_path.relative_to(project_root) - except ValueError: - return f"Access denied: Path '{path}' is outside project directory" - - if not requested_path.exists(): - return f"Path does not exist: {path}" - - if not requested_path.is_dir(): - return f"Not a directory: {path}" - - # List contents - if not pattern: - # Simple directory listing - items = [] - for item in sorted(requested_path.iterdir()): - relative = item.relative_to(project_root) - if item.is_dir(): - items.append(f"šŸ“ {relative}/") - else: - size = item.stat().st_size - size_str = f"({size} bytes)" if size < 1024 else f"({size//1024}KB)" - items.append(f"šŸ“„ {relative} {size_str}") - - return f"Project files in '{path}':\n" + "\n".join(items) if items else "Empty directory" - else: - # Pattern matching - matches = list(requested_path.glob(pattern)) - if matches: - files = [] - for f in sorted(matches): - if f.is_file(): - relative = f.relative_to(project_root) - size = f.stat().st_size - size_str = f" ({size//1024}KB)" if size >= 1024 else f" ({size}B)" - files.append(f"šŸ“„ {relative}{size_str}") - - return f"Found {len(files)} files matching '{pattern}' in project:\n" + "\n".join(files[:100]) - else: - return f"No files found matching '{pattern}' in project path '{path}'" - - except Exception as e: - return f"Error listing project files: {e}" - - async def read_project_file(file_path: str, max_lines: int) -> str: - """Read a file from the current project - - Args: - file_path: Relative path to file within project - max_lines: Maximum lines to read (0 for all, default 200 for large files) - """ - try: - - # Get project root from config - config = ProjectConfigManager() - if not config.is_initialized(): - return "Project not initialized. 
Run 'fuzzforge init' first."
-
-            project_root = config.config_path.parent
-            requested_file = project_root / file_path
-
-            # Security check - ensure we stay within project
-            try:
-                requested_file = requested_file.resolve()
-                project_root = project_root.resolve()
-                requested_file.relative_to(project_root)
-            except ValueError:
-                return f"Access denied: File '{file_path}' is outside project directory"
-
-            if not requested_file.exists():
-                return f"File does not exist: {file_path}"
-
-            if not requested_file.is_file():
-                return f"Not a file: {file_path}"
-
-            # Check file size
-            size_mb = requested_file.stat().st_size / (1024 * 1024)
-            if size_mb > 5:
-                return f"File too large ({size_mb:.1f} MB). Use max_lines parameter to read portions."
-
-            # Set reasonable default for max_lines
-            if max_lines == 0:
-                max_lines = 200 if size_mb > 0.1 else 0  # Default limit for larger files
-
-            with open(requested_file, 'r', encoding='utf-8', errors='replace') as f:
-                if max_lines == 0:
-                    content = f.read()
-                else:
-                    lines = []
-                    for i, line in enumerate(f, 1):
-                        if i > max_lines:
-                            lines.append(f"... (truncated at {max_lines} lines)")
-                            break
-                        lines.append(f"{i:4d}: {line.rstrip()}")
-                    content = "\n".join(lines)
-
-            relative_path = requested_file.relative_to(project_root)
-            return f"Contents of {relative_path}:\n{content}"
-
-        except UnicodeDecodeError:
-            return f"Cannot read file (binary or encoding issue): {file_path}"
-        except Exception as e:
-            return f"Error reading file: {e}"
-
-        async def search_project_files(search_pattern: str, file_pattern: str, path: str) -> str:
-            """Search for text patterns in project files
-
-            Args:
-                search_pattern: Text/regex pattern to find
-                file_pattern: File pattern to search in (e.g. '*.py', '**/*.js')
-                path: Relative project path to search in (e.g. '.', 'src')
-            """
-            try:
-                import re
-
-                # Get project root from config
-                config = ProjectConfigManager()
-                if not config.is_initialized():
-                    return "Project not initialized. Run 'fuzzforge init' first."
-
-                project_root = config.config_path.parent
-                search_path = project_root / path
-
-                # Security check
-                try:
-                    search_path = search_path.resolve()
-                    project_root = project_root.resolve()
-                    search_path.relative_to(project_root)
-                except ValueError:
-                    return f"Access denied: Path '{path}' is outside project directory"
-
-                if not search_path.exists():
-                    return f"Search path does not exist: {path}"
-
-                matches = []
-                files_searched = 0
-
-                # Search in files
-                for file_path in search_path.glob(file_pattern):
-                    if file_path.is_file():
-                        files_searched += 1
-                        try:
-                            with open(file_path, 'r', encoding='utf-8', errors='replace') as f:
-                                for line_num, line in enumerate(f, 1):
-                                    if re.search(search_pattern, line, re.IGNORECASE):
-                                        relative = file_path.relative_to(project_root)
-                                        matches.append(f"{relative}:{line_num}: {line.strip()}")
-                                        if len(matches) >= 50:  # Limit results
-                                            break
-                        except (PermissionError, OSError):
-                            continue
-
-                    if len(matches) >= 50:
-                        break
-
-                if matches:
-                    result = f"Found '{search_pattern}' in {len(matches)} locations (searched {files_searched} files):\n"
-                    result += "\n".join(matches[:50])
-                    if len(matches) >= 50:
-                        result += "\n... (showing first 50 matches)"
-                    return result
-                else:
-                    return f"No matches found for '{search_pattern}' in {files_searched} files matching '{file_pattern}'"
-
-            except Exception as e:
-                return f"Error searching project files: {e}"
-
-        tools.extend([
-            FunctionTool(list_project_files),
-            FunctionTool(read_project_file),
-            FunctionTool(search_project_files),
-            FunctionTool(self.create_project_file_artifact_api)
-        ])
-
-        async def send_file_to_agent(agent_name: str, file_path: str, note: str, tool_context: ToolContext) -> str:
-            """Send a local file to a registered agent (agent_name, file_path, note)."""
-            # Handle empty note parameter
-            if not note:
-                note = ""
-
-            session = None
-            context_id = None
-            if tool_context and getattr(tool_context, "invocation_context", None):
-                invocation = tool_context.invocation_context
-                session = invocation.session
-                context_id = self.session_lookup.get(getattr(session, 'id', None))
-            return await self.delegate_file_to_agent(agent_name, file_path, note, session=session, context_id=context_id)
-
-        tools.append(FunctionTool(send_file_to_agent))
-
-        if self.debug:
-            print("[DEBUG] Added Cognee project integration tools")
-
-        # Add FuzzForge backend workflow tools if MCP endpoint configured
-        if self.fuzzforge_mcp_url:
-            if self.debug:
-                print(f"[DEBUG] FuzzForge MCP endpoint configured at {self.fuzzforge_mcp_url}")
-
-            async def _call_fuzzforge_mcp(tool_name: str, payload: Dict[str, Any] | None = None) -> Any:
-                return await self._call_mcp_generic(tool_name, payload or {})
-
-            async def list_fuzzforge_workflows(tool_context: ToolContext | None = None) -> Any:
-                return await _call_fuzzforge_mcp("list_workflows_mcp")
-
-            async def get_fuzzforge_workflow_metadata(workflow_name: str, tool_context: ToolContext | None = None) -> Any:
-                return await _call_fuzzforge_mcp("get_workflow_metadata_mcp", {"workflow_name": workflow_name})
-
-            async def get_fuzzforge_workflow_parameters(workflow_name: str, tool_context: ToolContext | None = None) -> Any:
-                return await _call_fuzzforge_mcp("get_workflow_parameters_mcp", {"workflow_name": workflow_name})
-
-            async def get_fuzzforge_workflow_schema(tool_context: ToolContext | None = None) -> Any:
-                return await _call_fuzzforge_mcp("get_workflow_metadata_schema_mcp")
-
-            async def list_fuzzforge_runs(
-                limit: int = 10,
-                workflow_name: str = "",
-                states: str = "",
-                tool_context: ToolContext | None = None,
-            ) -> Any:
-                payload: Dict[str, Any] = {"limit": limit}
-                workflow_name = (workflow_name or "").strip()
-                if workflow_name:
-                    payload["workflow_name"] = workflow_name
-
-                state_tokens = [
-                    token.strip()
-                    for token in (states or "").split(",")
-                    if token.strip()
-                ]
-                if state_tokens:
-                    payload["states"] = state_tokens
-                return await _call_fuzzforge_mcp("list_recent_runs_mcp", payload)
-
-            async def submit_security_scan_mcp(
-                workflow_name: str,
-                target_path: str = "",
-                parameters: Dict[str, Any] | None = None,
-                tool_context: ToolContext | None = None,
-            ) -> Any:
-                # Resolve the target path to an absolute path for validation
-                resolved_path = target_path or "."
-                try:
-                    resolved_path = str(Path(resolved_path).expanduser().resolve())
-                except Exception:
-                    # If resolution fails, use the raw value
-                    resolved_path = target_path
-
-                # Ensure configuration objects default to dictionaries instead of None
-                cleaned_parameters: Dict[str, Any] = {}
-                if parameters:
-                    for key, value in parameters.items():
-                        if isinstance(key, str) and key.endswith("_config") and value is None:
-                            cleaned_parameters[key] = {}
-                        else:
-                            cleaned_parameters[key] = value
-
-                # Merge in default parameter schema for known workflows to avoid missing dicts
-                try:
-                    param_info = await get_fuzzforge_workflow_parameters(workflow_name)
-                    if isinstance(param_info, dict):
-                        defaults = param_info.get("defaults") or {}
-                        if isinstance(defaults, dict):
-                            for key, value in defaults.items():
-                                if key.endswith("_config") and key not in cleaned_parameters:
-                                    cleaned_parameters[key] = value or {}
-                except Exception:
-                    # Defaults fetch is best-effort – continue with whatever we have
-                    pass
-
-                # Final pass – replace any lingering None configs with empty dicts
-                for key, value in list(cleaned_parameters.items()):
-                    if isinstance(key, str) and key.endswith("_config") and value is None:
-                        cleaned_parameters[key] = {}
-
-                payload = {
-                    "workflow_name": workflow_name,
-                    "target_path": resolved_path,
-                    "parameters": cleaned_parameters,
-                }
-                result = await _call_fuzzforge_mcp("submit_security_scan_mcp", payload)
-
-                if isinstance(result, dict):
-                    run_id = result.get("run_id") or result.get("id")
-                    if run_id and tool_context:
-                        context_id = tool_context.invocation_context.session.id
-                        session_meta = self.session_metadata.get(context_id, {})
-                        self.pending_runs[run_id] = {
-                            "context_id": context_id,
-                            "session_id": session_meta.get("session_id"),
-                            "user_id": session_meta.get("user_id"),
-                            "app_name": session_meta.get("app_name", "fuzzforge"),
-                            "workflow_name": workflow_name,
-                            "submitted_at": datetime.now().isoformat(),
-                        }
-                        tool_context.actions.state_delta[
-                            f"fuzzforge.run.{run_id}.status"
-                        ] = "submitted"
-                        await self._publish_task_pending(run_id, context_id, workflow_name)
-                        self._schedule_run_followup(run_id)
-
-                return result
-
-            async def get_fuzzforge_run_status(run_id: str, tool_context: ToolContext | None = None) -> Any:
-                return await _call_fuzzforge_mcp("get_run_status_mcp", {"run_id": run_id})
-
-            async def get_fuzzforge_summary(run_id: str, tool_context: ToolContext | None = None) -> Any:
-                return await _call_fuzzforge_mcp("get_comprehensive_scan_summary", {"run_id": run_id})
-
-            async def get_fuzzforge_findings(run_id: str, tool_context: ToolContext | None = None) -> Any:
-                return await _call_fuzzforge_mcp("get_run_findings_mcp", {"run_id": run_id})
-
-            async def get_fuzzforge_fuzzing_stats(run_id: str, tool_context: ToolContext | None = None) -> Any:
-                return await _call_fuzzforge_mcp("get_fuzzing_stats_mcp", {"run_id": run_id})
-
-            tools.extend([
-                FunctionTool(list_fuzzforge_workflows),
-                FunctionTool(get_fuzzforge_workflow_metadata),
-                FunctionTool(get_fuzzforge_workflow_parameters),
-                FunctionTool(get_fuzzforge_workflow_schema),
-                FunctionTool(list_fuzzforge_runs),
-                LongRunningFunctionTool(submit_security_scan_mcp),
-                FunctionTool(get_fuzzforge_run_status),
-                FunctionTool(get_fuzzforge_summary),
-                FunctionTool(get_fuzzforge_findings),
-                FunctionTool(get_fuzzforge_fuzzing_stats),
-            ])
-
-        # Add agent introspection tools
-        async def get_agent_capabilities(agent_name: str) -> str:
-            """Get detailed capabilities and tools of a registered agent"""
-            # Handle empty agent_name
-            if not agent_name or agent_name.strip() == "":
-                # List all agents with their capabilities
-                if not self.agents:
-                    return "No agents are currently registered"
-
-                result = "Registered agents and their capabilities:\n\n"
-                for name, info in self.agents.items():
-                    card = info.get("card", {})
-                    result += f"{name}\n"
-                    result += f"  Description: {card.get('description', 'No description')}\n"
-
-                    # Get skills/tools from agent card
-                    skills = card.get('skills', [])
-                    if skills:
-                        result += f"  Tools ({len(skills)}):\n"
-                        for skill in skills:
-                            skill_name = skill.get('name', 'Unknown')
-                            skill_desc = skill.get('description', 'No description')
-                            result += f"    - {skill_name}: {skill_desc}\n"
-                    else:
-                        result += "  Tools: Not specified in agent card\n"
-                    result += "\n"
-                return result
-            else:
-                # Get specific agent details
-                if agent_name not in self.agents:
-                    return f"Agent '{agent_name}' not found. Available agents: {', '.join(self.agents.keys())}"
-
-                info = self.agents[agent_name]
-                card = info.get("card", {})
-
-                result = f"{agent_name} - Detailed Capabilities\n\n"
-                result += f"URL: {info.get('url')}\n"
-                result += f"Description: {card.get('description', 'No description')}\n\n"
-
-                # Detailed skills/tools
-                skills = card.get('skills', [])
-                if skills:
-                    result += f"Available Tools ({len(skills)}):\n"
-                    for i, skill in enumerate(skills, 1):
-                        skill_name = skill.get('name', 'Unknown')
-                        skill_desc = skill.get('description', 'No description')
-                        result += f"{i}. {skill_name}\n   {skill_desc}\n\n"
-                else:
-                    result += "Tools: Not specified in agent card\n\n"
-
-                # Additional capabilities
-                capabilities = card.get('capabilities', {})
-                if capabilities:
-                    result += "Capabilities:\n"
-                    for key, value in capabilities.items():
-                        result += f"  - {key}: {value}\n"
-                    result += "\n"
-
-                # Input/Output modes
-                input_modes = card.get('defaultInputModes', card.get('default_input_modes', []))
-                output_modes = card.get('defaultOutputModes', card.get('default_output_modes', []))
-
-                if input_modes:
-                    result += f"Supported Input Modes: {', '.join(input_modes)}\n"
-                if output_modes:
-                    result += f"Supported Output Modes: {', '.join(output_modes)}\n"
-
-                return result
-
-        # Add task tracking tools
-        async def create_task_list(tasks: List[str]) -> str:
-            """Create a task list for tracking project progress"""
-            if not hasattr(self, 'task_lists'):
-                self.task_lists = {}
-
-            task_id = f"task_list_{len(self.task_lists)}"
-            self.task_lists[task_id] = {
-                'tasks': [{'id': i, 'description': task, 'status': 'pending'} for i, task in enumerate(tasks)],
-                'created_at': datetime.now().isoformat()
-            }
-            return f"Created task list {task_id} with {len(tasks)} tasks"
-
-        async def update_task_status(task_list_id: str, task_id: int, status: str) -> str:
-            """Update the status of a task (pending, in_progress, completed)"""
-            if not hasattr(self, 'task_lists') or task_list_id not in self.task_lists:
-                return f"Task list {task_list_id} not found"
-
-            tasks = self.task_lists[task_list_id]['tasks']
-            for task in tasks:
-                if task['id'] == task_id:
-                    task['status'] = status
-                    return f"Updated task {task_id} to {status}"
-            return f"Task {task_id} not found"
-
-        async def get_task_list(task_list_id: str) -> str:
-            """Get current task list status"""
-            # Handle empty task_list_id
-            if not task_list_id or task_list_id.strip() == "":
-                task_list_id = "default"
-
-            if not hasattr(self, 'task_lists'):
-                return "No task lists created"
-
-            if task_list_id:
-                if task_list_id in self.task_lists:
-                    tasks = self.task_lists[task_list_id]['tasks']
-                    result = f"Task List {task_list_id}:\n"
-                    for task in tasks:
-                        result += f"  [{task['status']}] {task['id']}: {task['description']}\n"
-                    return result
-                return f"Task list {task_list_id} not found"
-            else:
-                # Return all task lists
-                result = "All task lists:\n"
-                for list_id, list_data in self.task_lists.items():
-                    completed = sum(1 for t in list_data['tasks'] if t['status'] == 'completed')
-                    total = len(list_data['tasks'])
-                    result += f"  {list_id}: {completed}/{total} completed\n"
-                return result
-
-        tools.extend([
-            FunctionTool(get_agent_capabilities),
-            FunctionTool(create_task_list),
-            FunctionTool(update_task_status),
-            FunctionTool(get_task_list)
-        ])
-
-
-        # Create the agent with LiteLLM configuration
-        llm_kwargs = {}
-        api_key = os.getenv('OPENAI_API_KEY') or os.getenv('LLM_API_KEY')
-        api_base = os.getenv('LLM_ENDPOINT') or os.getenv('LLM_API_BASE') or os.getenv('OPENAI_API_BASE')
-
-        if api_key:
-            llm_kwargs['api_key'] = api_key
-        if api_base:
-            llm_kwargs['api_base'] = api_base
-
-        self.agent = LlmAgent(
-            model=LiteLlm(model=self.model, **llm_kwargs),
-            name="fuzzforge_executor",
-            description="Intelligent A2A orchestrator with memory",
-            instruction=self._build_instruction(),
-            tools=tools  # Always pass tools list (empty list is fine)
-        )
-
-        # Create runner with our session service
-        self.runner = Runner(
-            agent=self.agent,
-            session_service=self.session_service,  # Use our configured session service
-            app_name="fuzzforge"
-        )
-
-        # Connect runner to our artifact service
-        if hasattr(self.runner, 'artifact_service'):
-            # Override with our configured artifact service
-            self.runner.artifact_service = self.artifact_service
-
-    def _build_instruction(self) -> str:
-        """Build the agent's instruction prompt"""
-        instruction = """You are FuzzForge, an intelligent A2A orchestrator with dual memory systems.
-
-## Your Core Responsibilities:
-
-1. **Agent Orchestration (Primary)**
-   - Always use get_agent_capabilities() tool to check available agents
-   - When users ask about agent tools/capabilities, use get_agent_capabilities(agent_name)
-   - When a user mentions any registered agent by name, delegate to that agent
-   - When a request matches an agent's capabilities, route to it
-   - To route to an agent, format your response as: "ROUTE_TO: [agent_name] [message]"
-   - The system follows A2A protocol standards for agent communication
-   - Be agent-agnostic - work with whatever agents are registered
-   - Prefer using your built-in FuzzForge workflow tools directly unless the user explicitly requests delegation
-
-2. **FuzzForge Platform Tools (Secondary)**
-   - Use your FuzzForge MCP tools by default for workflow submission, monitoring, and findings retrieval
-   - Use the appropriate tool for the user's request
-   - You can submit and monitor FuzzForge workflows via MCP tools (list_workflows_mcp, submit_security_scan_mcp, list_recent_runs_mcp, get_run_status_mcp, get_comprehensive_scan_summary)
-   - Treat any absolute path the user provides as mountable; the backend handles volume access. Do NOT ask the user to upload, move, or zip projects—just call submit_security_scan_mcp with the supplied path and options.
-   - When asked to send local files or binaries to another agent, call send_file_to_agent(agent_name, file_path, note="...")
-
-3. **Dual Memory Systems**:
-
-   a) **Conversational Memory** (ADK MemoryService - for past conversations)
-      - Automatically ingests completed sessions
-      - Search with "recall from past conversations about X"
-      - Uses semantic search (VertexAI) or keyword matching (InMemory)
-
-   b) **Project Knowledge Graph** (Cognee - for ingested code, documentation, specs, and structured data)
-      - Use search_project_knowledge(query, dataset="", search_type="INSIGHTS") to search project knowledge
-      - Available search_type options: INSIGHTS, CHUNKS, GRAPH_COMPLETION, CODE, SUMMARIES, RAG_COMPLETION, NATURAL_LANGUAGE, CYPHER, TEMPORAL, FEELING_LUCKY
-      - Use list_project_knowledge() to see available datasets and knowledge
-      - Use ingest_to_dataset(content, dataset) to add content to specific datasets
-      - Use cognify_information(text) to add new information to knowledge graph
-      - Automatically uses current project context and directory
-      - Example: "what functions are in the codebase?" -> use search_project_knowledge("functions classes methods", search_type="CHUNKS")
-      - Example: "what documentation exists?" -> use search_project_knowledge("documentation specs readme", search_type="INSIGHTS")
-      - Example: "search security docs" -> use search_project_knowledge("security vulnerabilities", dataset="security_docs")
-
-   c) **Project Filesystem Access** (Project-local file operations)
-      - Use list_project_files(path, pattern) to explore project structure
-      - Use read_project_file(file_path, max_lines) to examine file contents
-      - Use search_project_files(search_pattern, file_pattern, path) to find text in files
-      - All file operations are restricted to the current project directory for security
-      - Example: "show me all Python files" -> use list_project_files(".", "*.py")
-      - Example: "read the main agent file" -> use read_project_file("agent.py", 0)
-      - Example: "find TODO comments" -> use search_project_files("TODO", "**/*.py", ".")
-
-4. **Artifact Creation**
-   - When generating code, configurations, or documents, create an artifact
-   - Format: "ARTIFACT: [type] [title]\n```\n[content]\n```"
-   - Types: code, config, document, data, diagram
-
-5. **Multi-Step Task Execution with Graph Building**
-   - Chain multiple actions together
-   - When user says "ask agent X and then save to memory":
-     a) Route to agent X
-     b) Use `cognify` to structure the response as a knowledge graph
-     c) This automatically creates searchable nodes and relationships
-   - Build a growing knowledge graph from all interactions
-   - Connect new information to existing graph nodes
-
-6. **General Assistance**
-   - Only answer directly if no suitable agent is registered AND no FuzzForge tool can help
-   - Provide helpful responses
-   - Maintain conversation context
-
-## Tool Usage Protocol:
-- ALWAYS use get_agent_capabilities() tool when asked about agents or their tools
-- Use get_agent_capabilities(agent_name) for specific agent details
-- Use get_agent_capabilities() without parameters to list all agents
-- If an agent's skills/description match the request, use "ROUTE_TO: [name] [message]"
-- After receiving agent response:
-  - If user wants to save/store: Use `cognify` to create knowledge graph
-  - Structure the data as: entities (nodes) and relationships (edges)
-  - Example cognify text: "Entity: 1001 (Number). Property: is_prime=false. Relationship: 1001 CHECKED_BY CalculatorAgent. Relationship: 1001 HAS_FACTORS [7, 11, 13]"
-- When searching memory, use GRAPH_COMPLETION mode to traverse relationships
-
-## Important Rules:
-- NEVER mention specific types of agents or tasks in greetings
-- Do NOT say things like "I can run calculations" or mention specific capabilities
-- Keep greetings generic: just say you're an orchestrator that can help
-- When user asks for chained actions, acknowledge and execute all steps
-
-Be concise and intelligent in your responses."""
-
-
-        return instruction
-
-    async def execute(self, message: str, context_id: str = None) -> Dict[str, Any]:
-        """Execute a task/message and return the result"""
-
-        # Use default context if none provided
-        if not context_id:
-            context_id = "default"
-
-        # Get or create session
-        if context_id not in self.sessions:
-            session_obj = await self._create_session()
-            self.sessions[context_id] = session_obj
-            self.session_metadata[context_id] = {
-                "session_id": getattr(session_obj, 'id', context_id),
-                "user_id": getattr(session_obj, 'user_id', 'user'),
-                "app_name": getattr(session_obj, 'app_name', 'fuzzforge'),
-            }
-            if self.debug:
-                print(f"[DEBUG] Created new session for context: {context_id}")
-
-        session = self.sessions[context_id]
-        session_id = getattr(session, 'id', context_id)
-        self.session_lookup[session_id] = context_id
-        if context_id not in self.session_metadata:
-            self.session_metadata[context_id] = {
-                "session_id": getattr(session, 'id', context_id),
-                "user_id": getattr(session, 'user_id', 'user'),
-                "app_name": getattr(session, 'app_name', 'fuzzforge'),
-            }
-
-        # Search conversational memory if relevant
-        if self.memory_service and any(word in message.lower() for word in ['recall', 'remember', 'past conversation', 'previously']):
-            try:
-                memory_results = await self.memory_service.search_memory(
-                    query=message,
-                    app_name="fuzzforge",
-                    user_id=getattr(session, 'user_id', 'user')
-                )
-                if memory_results and memory_results.memories:
-                    # Add memory context to session state
-                    # MemoryEntry has 'text' field
-                    session.state["memory_context"] = [
-                        {"text": getattr(m, 'text', str(m))}
-                        for m in memory_results.memories
-                    ]
-                    if self.debug:
-                        print(f"[DEBUG] Found {len(memory_results.memories)} memories")
-            except Exception as e:
-                if self.debug:
-                    print(f"[DEBUG] Memory search failed: {e}")
-
-        # Update session with registered agents following A2A AgentCard standard
-        registered_agents = []
-        for name, info in self.agents.items():
-            card = info.get("card", {})
-            skills = card.get("skills", [])
-
-            # Format according to A2A AgentSkill standard
-            agent_info = {
-                "name": name,
-                "url": info["url"],
-                "description": card.get("description", ""),
-                "skills": [
-                    {
-                        "id": skill.get("id", ""),
-                        "name": skill.get("name", ""),
-                        "description": skill.get("description", ""),
-                        "tags": skill.get("tags", [])
-                    }
-                    for skill in skills
-                ],
-                "skill_count": len(skills),
-                "default_input_modes": card.get("defaultInputModes", card.get("default_input_modes", [])),
-                "default_output_modes": card.get("defaultOutputModes", card.get("default_output_modes", []))
-            }
-            registered_agents.append(agent_info)
-
-        session.state["registered_agents"] = registered_agents
-        session.state["agent_names"] = list(self.agents.keys())
-
-        # Track if this is a multi-step request
-        multi_step_keywords = ["and then", "then save", "and save", "store the", "save the result", "save to memory", "remember"]
-        is_multi_step = any(keyword in message.lower() for keyword in multi_step_keywords)
-
-        if is_multi_step:
-            session.state["multi_step_request"] = message
-            session.state["pending_actions"] = []
-
-        # Process with LLM
-        content = types.Content(
-            role='user',
-            parts=[types.Part.from_text(text=message)]
-        )
-
-        response = ""
-        try:
-            # Try to use existing session ID or create a new one
-            session_id = getattr(session, 'id', context_id)
-            user_id = getattr(session, 'user_id', 'user')
-
-            if self.debug:
-                print(f"[DEBUG] Running with session_id: {session_id}, user_id: {user_id}")
-
-            async for event in self.runner.run_async(
-                user_id=user_id,
-                session_id=session_id,
-                new_message=content
-            ):
-                # Check if event has content before accessing parts
-                if event and event.content:
-                    # Normal content handling
-                    if event.content:
-                        if hasattr(event.content, 'parts') and event.content.parts:
-                            # Get text from the first part that has text
-                            for part in event.content.parts:
-                                if hasattr(part, 'text') and part.text:
-                                    response = part.text
-                                    break
-                            if not response and len(event.content.parts) > 0:
-                                # Fallback to string representation
-                                response = str(event.content.parts[0])
-                        elif hasattr(event.content, 'text'):
-                            # Direct text content
-                            response = event.content.text
-                else:
-                    # Log for debugging
-                    if self.debug:
-                        print(f"[DEBUG] Event content type: {type(event.content)}, has parts: {hasattr(event.content, 'parts')}")
-
-            # Check if LLM wants to route to an agent
-            if "ROUTE_TO:" in response:
-                # Extract routing command from response
-                route_line = None
-                for line in response.split('\n'):
-                    if line.strip().startswith("ROUTE_TO:"):
-                        route_line = line.strip()
-                        break
-
-                if route_line:
-                    # Parse routing command more robustly
-                    route_content = route_line[9:].strip()  # Remove "ROUTE_TO:"
-
-                    # Try to match against registered agents
-                    agent_name = None
-                    agent_message = route_content
-
-                    # Check each registered agent name
-                    for registered_name in self.agents.keys():
-                        if route_content.lower().startswith(registered_name.lower()):
-                            agent_name = registered_name
-                            # Extract message after agent name
-                            agent_message = route_content[len(registered_name):].strip()
-                            break
-
-                    if not agent_name:
-                        # Fallback: try first word as agent name
-                        parts = route_content.split(None, 1)
-                        if parts:
-                            agent_name = parts[0]
-                            agent_message = parts[1] if len(parts) > 1 else message
-
-                    # Route to the agent
-                    if agent_name in self.agents:
-                        try:
-                            connection = self.agents[agent_name]["connection"]
-                            routed_response = await connection.send_message(agent_message)
-                            agent_result = f"[{agent_name}]: {routed_response}"
-
-                            # If this was a multi-step request, process next steps
-                            if is_multi_step:
-                                # Store the agent response for next action
-                                session.state["last_agent_response"] = routed_response
-
-                                # Ask LLM to continue with next steps
-                                followup_content = types.Content(
-                                    role='user',
-                                    parts=[types.Part.from_text(
-                                        text=f"The agent responded: {routed_response}\n\nNow complete the remaining actions from the original request: {message}"
-                                    )]
-                                )
-
-                                # Process followup
-                                async for followup_event in self.runner.run_async(
-                                    user_id=user_id,
-                                    session_id=session_id,
-                                    new_message=followup_content
-                                ):
-                                    if followup_event.content.parts and followup_event.content.parts[0].text:
-                                        followup_response = followup_event.content.parts[0].text
-                                        response = f"{agent_result}\n\n{followup_response}"
-                                        break
-                            else:
-                                response = agent_result
-
-                        except Exception as e:
-                            response = f"Error routing to {agent_name}: {e}"
-                    else:
-                        response = f"Agent {agent_name} not found. Available agents: {', '.join(self.agents.keys())}"
-
-            # Check for artifacts in response
-            elif "ARTIFACT:" in response:
-                response = await self._extract_and_store_artifact(response, session, context_id)
-        except Exception as e:
-            if self.debug:
-                print(f"[DEBUG] Runner error: {e}")
-                print(f"[DEBUG] Error type: {type(e).__name__}")
-                import traceback
-                print(f"[DEBUG] Traceback: {traceback.format_exc()}")
-            # Fallback to direct agent response
-            response = f"I encountered an issue processing your request: {str(e) if self.debug else 'Please try again.'}"
-
-        try:
-            save_session = getattr(self.runner.session_service, "save_session", None)
-            if callable(save_session):
-                await save_session(session)
-        except Exception as exc:
-            if self.debug:
-                print(f"[DEBUG] Failed to save session: {exc}")
-
-        return {
-            "response": response or "No response generated",
-            "context_id": context_id,
-            "routed": False
-        }
-
-    async def _create_session(self) -> Any:
-        """Create a new session"""
-        try:
-            # Create session with proper parameters
-            session = await self.runner.session_service.create_session(
-                app_name="fuzzforge",
-                user_id=f"user_{datetime.now().strftime('%Y%m%d_%H%M%S')}"
-            )
-            return session
-        except Exception as e:
-            # If session service fails, create a simple mock session
-            if self.debug:
-                print(f"[DEBUG] Session creation failed: {e}, using mock session")
-
-            # Return a simple session object
-            from types import SimpleNamespace
-            return SimpleNamespace(
-                id=f"session_{datetime.now().strftime('%Y%m%d_%H%M%S')}",
-                state={},
-                app_name="fuzzforge",
-                user_id="user"
-            )
-
-
-    async def _extract_and_store_artifact(self, response: str, session: Any, context_id: str) -> str:
-        """Extract and store artifacts from response using ADK artifact service (A2A compliant)"""
-        import re
-
-        # Pattern to match artifact format - handle both inline and multiline formats
-        # Format: ARTIFACT: type filename\n```content``` (with possible extra newlines)
-        pattern = r'ARTIFACT:\s*(\w+)\s+(.+?)\s*\n```([^`]*?)```'
-        matches = re.findall(pattern, response, re.DOTALL)
-
-        if self.debug:
-            print(f"[DEBUG] Looking for artifacts in response. Found {len(matches)} matches.")
-            if matches:
-                for i, (artifact_type, title, content) in enumerate(matches):
-                    print(f"[DEBUG] Artifact {i+1}: type={artifact_type}, title={title.strip()}, content_length={len(content)}")
-            else:
-                # Show first 500 chars of response to debug regex issues
-                print(f"[DEBUG] No artifacts found. Response preview: {response[:500]}...")
-
-        if matches:
-            artifacts_created = []
-
-            for artifact_type, title, content in matches:
-                # Determine MIME type based on artifact type
-                mime_type_map = {
-                    "code": "text/plain",
-                    "c": "text/x-c",
-                    "cpp": "text/x-c++",
-                    "python": "text/x-python",
-                    "javascript": "text/javascript",
-                    "json": "application/json",
-                    "config": "text/plain",
-                    "document": "text/markdown",
-                    "data": "application/json",
-                    "diagram": "text/plain",
-                    "yaml": "text/yaml",
-                    "xml": "text/xml",
-                    "html": "text/html"
-                }
-                mime_type = mime_type_map.get(artifact_type, "text/plain")
-
-                # Create proper A2A artifact format
-                title_clean = title.strip().replace(' ', '_')
-                # If title already has extension, use it as-is, otherwise add artifact_type as extension
-                if '.' in title_clean:
-                    filename = title_clean
-                else:
-                    filename = f"{title_clean}.{artifact_type}"
-                artifact_id = f"artifact_{uuid.uuid4().hex[:8]}"
-
-                try:
-                    # Store using ADK artifact service if available
-                    if self.artifact_service:
-                        # Create artifact metadata for A2A
-                        artifact_metadata = {
-                            "id": artifact_id,
-                            "name": title.strip(),
-                            "type": artifact_type,
-                            "mimeType": mime_type,
-                            "filename": filename,
-                            "size": len(content),
-                            "createdAt": datetime.now().isoformat()
-                        }
-
-                        # Store content in artifact service
-                        # Save to ADK artifact service using correct API
-                        try:
-                            from google.genai import types
-
-                            # Detect content type and extension from artifact metadata
-                            filename = artifact_metadata.get("filename", f"{artifact_id}.txt")
-                            mime_type = artifact_metadata.get("mimeType", "text/plain")
-
-                            # Handle different content types
-                            if isinstance(content, str):
-                                content_bytes = content.encode('utf-8')
-                            elif isinstance(content, bytes):
-                                content_bytes = content
-                            else:
-                                content_bytes = str(content).encode('utf-8')
-
-                            # Create ADK artifact using correct API
-                            artifact_part = types.Part(
-                                inline_data=types.Blob(
-                                    mime_type=mime_type,
-                                    data=content_bytes
-                                )
-                            )
-
-                            # Save using ADK artifact service
-                            await self.artifact_service.save_artifact(
-                                filename=filename,
-                                artifact=artifact_part
-                            )
-
-                            if self.debug:
-                                print(f"[DEBUG] Saved artifact to ADK service: (unknown)")
-
-                        except ImportError as e:
-                            # Fallback: just store in local cache if ADK not available
-                            if self.debug:
-                                print(f"[DEBUG] ADK types not available ({e}), using local storage only")
-                        except Exception as e:
-                            if self.debug:
-                                print(f"[DEBUG] ADK artifact service error: {e}, using local storage only")
-
-                        if self.debug:
-                            print(f"[DEBUG] Saved artifact to service: {artifact_id}")
-
-                    # Store to file system cache for HTTP serving
-                    try:
-                        content_bytes = content.encode('utf-8') if isinstance(content, str) else content
-                        sha256_digest = hashlib.sha256(content_bytes).hexdigest()
-
-                        file_cache_result = self._register_artifact_bytes(
-                            name=filename,
-                            data=content_bytes,
-                            mime_type=mime_type,
-                            sha256_digest=sha256_digest,
-                            size=len(content_bytes),
-                            artifact_id=artifact_id  # Use the display ID for file system
-                        )
-
-                        if self.debug:
-                            print(f"[DEBUG] Stored artifact to file cache: {file_cache_result['file_uri']}")
-                    except Exception as e:
-                        if self.debug:
-                            print(f"[DEBUG] Failed to store to file cache: {e}")
-
-                    # Also store in local cache for quick access
-                    if context_id not in self.artifacts:
-                        self.artifacts[context_id] = []
-
-                    artifact = {
-                        "id": artifact_id,
-                        "type": artifact_type,
-                        "title": title.strip(),
-                        "filename": filename,
-                        "mimeType": mime_type,
-                        "content": content.strip(),
-                        "size": len(content),
-                        "created_at": datetime.now().isoformat()
-                    }
-
-                    self.artifacts[context_id].append(artifact)
-                    artifacts_created.append(f"{title.strip()} ({artifact_type})")
-
-                    if self.debug:
-                        print(f"[DEBUG] Stored artifact: {artifact['id']} - {artifact['title']}")
-
-                except Exception as e:
-                    if self.debug:
-                        print(f"[DEBUG] Failed to store artifact: {e}")
-
-            # Create A2A compliant response with artifact references
-            artifact_list = ", ".join(artifacts_created)
-            clean_response = re.sub(pattern, "", response)
-
-            # Add artifact notification in A2A format
-            artifact_response = f"{clean_response}\n\nšŸ“Ž Created artifacts: {artifact_list}"
-
-            return artifact_response
-
-        return response
-
-    async def get_artifacts(self, context_id: str = None) -> List[Dict[str, Any]]:
-        """Get artifacts for a context or all artifacts"""
-        if self.debug:
-            print(f"[DEBUG] get_artifacts called with context_id: {context_id}")
-            print(f"[DEBUG] Available artifact contexts: {list(self.artifacts.keys())}")
-            print(f"[DEBUG] Total artifacts stored: {sum(len(artifacts) for artifacts in self.artifacts.values())}")
-
-        if context_id:
-            result = self.artifacts.get(context_id, [])
-            if self.debug:
-                print(f"[DEBUG] Returning {len(result)} artifacts for context {context_id}")
-            return result
-
-        # Return all artifacts
-        all_artifacts = []
-        for ctx_id, artifacts in self.artifacts.items():
-            for artifact in artifacts:
-                artifact_copy = artifact.copy()
-                artifact_copy['context_id'] = ctx_id
-                all_artifacts.append(artifact_copy)
-
-        if self.debug:
-            print(f"[DEBUG] Returning {len(all_artifacts)} total artifacts")
-        return all_artifacts
-
-    def format_artifacts_for_a2a(self, context_id: str) -> List[Dict[str, Any]]:
-        """Format artifacts for A2A protocol response"""
-        artifacts = self.artifacts.get(context_id, [])
-        a2a_artifacts = []
-
-        for artifact in artifacts:
-            # Create A2A compliant artifact format
-            a2a_artifact = {
-                "id": artifact["id"],
-                "type": "artifact",
-                "mimeType": artifact.get("mimeType", "text/plain"),
-                "name": artifact.get("title", artifact.get("filename", "untitled")),
-                "parts": [
-                    {
-                        "type": "text",
-                        "text": artifact.get("content", "")
-                    }
-                ],
-                "metadata": {
-                    "filename": artifact.get("filename"),
-                    "size": artifact.get("size", 0),
-                    "createdAt": artifact.get("created_at")
-                }
-            }
-            a2a_artifacts.append(a2a_artifact)
-
-        return a2a_artifacts
-
-    async def register_agent(self, url: str) -> Dict[str, Any]:
-        """Register a new A2A agent with persistence"""
-        try:
-            conn = RemoteAgentConnection(url)
-            card = await conn.get_agent_card()
-
-            if not card:
-                return {"success": False, "error": "Failed to get agent card"}
-
-            name = card.get("name", f"agent_{len(self.agents)}")
-            description = card.get("description", "")
-
-            self.agents[name] = {
-                "url": url,
-                "card": card,
-                "connection": conn
-            }
-
-            if self.debug:
-                print(f"[DEBUG] Registered agent {name} for ROUTE_TO delegation")
-
-            # Update session state with registered agents for the LLM
-            if hasattr(self, 'sessions'):
-                for session in self.sessions.values():
-                    if hasattr(session, 'state'):
-                        session.state["registered_agents"] = list(self.agents.keys())
-
-            # Persist to config
-            from .config_manager import ConfigManager
-            config_mgr = ConfigManager()
-            config_mgr.add_registered_agent(name, url, description)
-
-            return {
-                "success": True,
-                "name": name,
-                "capabilities": len(card.get("skills", [])),
-                "description": description
-            }
-
-        except Exception as e:
-            return {"success": False, "error": str(e)}
-
-    def list_agents(self) -> List[Dict[str, Any]]:
-        """List all registered agents"""
-        return [
-            {
-                "name": name,
-                "url": info["url"],
-                "description": info.get("card", {}).get("description", ""),
-                "skills": len(info.get("card", {}).get("skills", []))
-            }
-            for name, info in self.agents.items()
-        ]
-
-    async def cleanup(self):
-        """Clean up resources"""
-        # Close agent connections
-        for agent in self.agents.values():
-            conn = agent.get("connection")
-            if conn:
-                await conn.close()
-
-        # End AgentOps trace
-        if self.agentops_trace:
-            try:
-                agentops.end_trace()
-            except Exception:
-                pass
-
-        # Cancel background monitors
-        for task in list(self._background_tasks):
-            task.cancel()
-        self._background_tasks.clear()
-
-    def _schedule_run_followup(self, run_id: str) -> None:
-        if run_id not in self.pending_runs:
-            return
-
-        try:
-            task = asyncio.create_task(self._monitor_run_and_notify(run_id), name=f"fuzzforge_run_{run_id}")
-            self._background_tasks.add(task)
-
-            def _cleanup(t: asyncio.Task) -> None:
-                self._background_tasks.discard(t)
-                try:
-                    t.result()
-                except asyncio.CancelledError:
-                    if self.debug:
-                        print(f"[DEBUG] Run monitor for {run_id} cancelled")
-                except Exception as exc:
-                    if self.debug:
-                        print(f"[DEBUG] Run monitor for {run_id} failed: {exc}")
-
-            task.add_done_callback(_cleanup)
-        except RuntimeError as exc:
-            if self.debug:
-                print(f"[DEBUG] Unable to schedule run follow-up: {exc}")
-
-    async def _monitor_run_and_notify(self, run_id: str) -> None:
-        try:
-            run_meta = self.pending_runs.get(run_id)
-            if not run_meta:
-                return
-            context_id = run_meta.get("context_id")
-            while True:
-                status = await self._call_mcp_status(run_id)
-                if isinstance(status, dict) and status.get("is_completed"):
-                    break
-                await asyncio.sleep(5)
-
-            summary = await self._call_mcp_summary(run_id)
-            findings: Any | None = None
-            try:
-                findings = await self._call_mcp_generic(
-                    "get_run_findings_mcp", {"run_id": run_id}
-                )
-            except Exception as exc:
-                if self.debug:
-                    print(f"[DEBUG] Unable to fetch findings for {run_id}: {exc}")
-
-            artifact_info = None
-            try:
-                artifact_info = await self._create_run_artifact(
-                    run_id=run_id,
-                    run_meta=run_meta,
-                    status=status,
-                    summary=summary,
-                    findings=findings,
-                )
-                if artifact_info:
-                    run_meta["artifact"] = artifact_info
-            except Exception as exc:
-                if self.debug:
-                    print(f"[DEBUG] Failed to create artifact for {run_id}: {exc}")
-
-            message = self._format_run_summary(run_id, status, summary)
-            if artifact_info and artifact_info.get("file_uri"):
-                message += (
-                    f"\nArtifact: {artifact_info['file_uri']}"
-                    f" ({artifact_info.get('name', 'run-summary')})"
-                )
-            if context_id:
-                await self._append_session_message(context_id, message, run_id)
-            await self._publish_task_update(
-                run_id,
-                context_id,
-                status,
-                summary,
-                message,
-                artifact_info,
-            )
-            self.pending_runs.pop(run_id, None)
-        except asyncio.CancelledError:
-            raise
-        except Exception as exc:
-            if self.debug:
-                print(f"[DEBUG] Follow-up notification failed for {run_id}: {exc}")
-
-    async def _call_mcp_status(self, run_id: str) -> Any:
-        return await self._call_mcp_generic("get_run_status_mcp", {"run_id": run_id})
-
-    async def _call_mcp_summary(self, run_id: str) -> Any:
-        return await self._call_mcp_generic("get_comprehensive_scan_summary", {"run_id": run_id})
-
-    async def _call_mcp_generic(self, tool_name: str, payload: Dict[str, Any]) -> Any:
-        if not self.fuzzforge_mcp_url:
-            return {"error": "FUZZFORGE_MCP_URL not configured"}
-
-        try:
-            from fastmcp.client import Client
-        except ImportError as exc:
-            return {"error": f"fastmcp not installed: {exc}"}
-
-        async with Client(self.fuzzforge_mcp_url) as client:
-            result = await
client.call_tool(tool_name, payload) - - if hasattr(result, "content") and result.content: - raw = result.content[0] if isinstance(result.content, list) else result.content - if isinstance(raw, dict) and "text" in raw: - raw = raw["text"] - if isinstance(raw, str): - stripped = raw.strip() - if stripped.startswith("{") or stripped.startswith("["): - try: - return json.loads(stripped) - except json.JSONDecodeError: - return raw - return raw - return raw - - if isinstance(result, (dict, list)): - return result - return str(result) - - def _format_run_summary(self, run_id: str, status: Any, summary: Any) -> str: - lines = [f"FuzzForge workflow {run_id} completed."] - if isinstance(status, dict): - state = status.get("status") or status.get("state") - if state: - lines.append(f"Status: {state}") - updated = status.get("updated_at") or status.get("completed_at") - if updated: - lines.append(f"Completed at: {updated}") - if isinstance(summary, dict): - total = summary.get("total_findings") - if total is not None: - lines.append(f"Total findings: {total}") - severity = summary.get("severity_summary") - if isinstance(severity, dict): - lines.append("Severity breakdown: " + ", ".join(f"{k}={v}" for k, v in severity.items())) - recommendations = summary.get("recommendations") - if recommendations: - if isinstance(recommendations, list): - lines.append("Recommendations:") - lines.extend(f"- {item}" for item in recommendations) - else: - lines.append(f"Recommendations: {recommendations}") - else: - lines.append(str(summary)) - lines.append("You can request more detail with get_run_findings_mcp(run_id) or get_run_status_mcp(run_id).") - return "\n".join(lines) - - async def query_project_knowledge_api( - self, - query: str, - search_type: str = "INSIGHTS", - dataset: str = "", - ) -> Dict[str, Any]: - integration = await self._get_knowledge_integration() - if integration is None: - return {"error": "Knowledge graph integration unavailable"} - - try: - result = await 
integration.search_knowledge_graph( - query=query, - search_type=search_type, - dataset=dataset or None, - ) - return json.loads(json.dumps(result, default=str)) - except Exception as exc: - return {"error": f"Knowledge graph query failed: {exc}"} - - async def create_project_file_artifact_api(self, file_path: str) -> Dict[str, Any]: - try: - config = ProjectConfigManager() - if not config.is_initialized(): - return {"error": "Project not initialized. Run 'fuzzforge init' first."} - - project_root = config.config_path.parent.resolve() - requested_file = (project_root / file_path).resolve() - - try: - requested_file.relative_to(project_root) - except ValueError: - return {"error": f"Access denied: '{file_path}' is outside the project"} - - if not requested_file.exists() or not requested_file.is_file(): - return {"error": f"File not found: {file_path}"} - - size = requested_file.stat().st_size - max_bytes = int(os.getenv("FUZZFORGE_ARTIFACT_MAX_BYTES", str(25 * 1024 * 1024))) - if size > max_bytes: - return { - "error": ( - f"File {file_path} is {size} bytes, exceeding the limit of {max_bytes} bytes" - ) - } - - data = requested_file.read_bytes() - mime_type, _ = mimetypes.guess_type(str(requested_file)) - if not mime_type: - mime_type = "application/octet-stream" - - artifact_id = f"project_file_{uuid.uuid4().hex[:8]}" - sha256_digest = hashlib.sha256(data).hexdigest() - - if self.artifact_service: - try: - artifact_part = types.Part( - inline_data=types.Blob( - mime_type=mime_type, - data=data, - ) - ) - await self.artifact_service.save_artifact( - filename=requested_file.name, - artifact=artifact_part, - ) - if self.debug: - print( - f"[DEBUG] Saved project file artifact to service: {requested_file.name}" - ) - except Exception as exc: - if self.debug: - print(f"[DEBUG] Artifact service save failed: {exc}") - - local_meta = self._register_artifact_bytes( - name=requested_file.name, - data=data, - mime_type=mime_type, - sha256_digest=sha256_digest, - size=size, - 
artifact_id=artifact_id, - ) - - local_meta.update( - { - "path": str(requested_file), - "size": size, - "name": requested_file.name, - "mime_type": mime_type, - } - ) - return local_meta - except Exception as exc: - return {"error": f"Failed to create artifact: {exc}"} - - async def _create_run_artifact( - self, - *, - run_id: str, - run_meta: Dict[str, Any], - status: Any, - summary: Any, - findings: Any | None = None, - ) -> Dict[str, Any] | None: - workflow_name = run_meta.get("workflow_name") or "workflow" - safe_workflow = "".join( - ch if ch.isalnum() or ch in {"-", "_"} else "_" for ch in workflow_name - ) or "workflow" - artifact_filename = f"{safe_workflow}_{run_id}_summary.json" - - payload: Dict[str, Any] = { - "run_id": run_id, - "workflow": workflow_name, - "submitted_at": run_meta.get("submitted_at"), - "status": status, - "summary": summary, - } - - if isinstance(findings, dict) and not findings.get("error"): - payload["findings"] = findings - - artifact_bytes = json.dumps(payload, indent=2, default=str).encode("utf-8") - - if self.artifact_service: - try: - artifact_part = types.Part( - inline_data=types.Blob( - mime_type="application/json", - data=artifact_bytes, - ) - ) - await self.artifact_service.save_artifact( - filename=artifact_filename, - artifact=artifact_part, - ) - if self.debug: - print( - f"[DEBUG] Saved run artifact to artifact service: {artifact_filename}" - ) - except Exception as exc: - if self.debug: - print(f"[DEBUG] Artifact service save failed: {exc}") - - sha256_digest = hashlib.sha256(artifact_bytes).hexdigest() - local_meta = self._register_artifact_bytes( - name=artifact_filename, - data=artifact_bytes, - mime_type="application/json", - sha256_digest=sha256_digest, - size=len(artifact_bytes), - artifact_id=f"fuzzforge_run_{run_id}", - ) - - return local_meta - - async def _append_session_message(self, context_id: str, message: str, run_id: str) -> None: - meta = self.session_metadata.get(context_id) - if not meta: - return 
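The `_create_run_artifact` path above sanitizes the workflow name, serializes the run payload with `default=str`, and records a SHA-256 digest before registering the bytes. A minimal, self-contained sketch of that same packaging step (the function name `build_run_artifact` is illustrative, not from the codebase):

```python
import hashlib
import json


def build_run_artifact(run_id: str, workflow_name: str, status, summary):
    """Build a JSON summary artifact for a completed run (sketch)."""
    # Keep only filesystem-safe characters in the workflow name
    safe = "".join(
        ch if ch.isalnum() or ch in {"-", "_"} else "_" for ch in workflow_name
    ) or "workflow"
    payload = {
        "run_id": run_id,
        "workflow": workflow_name,
        "status": status,
        "summary": summary,
    }
    # default=str keeps non-serializable values (datetimes, enums) from raising
    data = json.dumps(payload, indent=2, default=str).encode("utf-8")
    digest = hashlib.sha256(data).hexdigest()
    return f"{safe}_{run_id}_summary.json", data, digest
```

The digest lets downstream consumers verify the artifact bytes independently of where they are cached.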
- service = self.runner.session_service - session_obj = None - if hasattr(service, "sessions"): - session_obj = ( - service.sessions - .get(meta.get("app_name", "fuzzforge"), {}) - .get(meta.get("user_id"), {}) - .get(meta.get("session_id")) - ) - if not session_obj: - if self.debug: - print(f"[DEBUG] Could not locate session for context {context_id}") - return - - event = Event( - invocationId=str(uuid.uuid4()), - id=str(uuid.uuid4()), - author=getattr(self.agent, 'name', 'FuzzForge'), - content=types.Content( - role='assistant', - parts=[Part.from_text(text=message)] - ), - actions=EventActions(), - ) - event.actions.state_delta[f"fuzzforge.run.{run_id}.status"] = "completed" - event.actions.state_delta[f"fuzzforge.run.{run_id}.timestamp"] = datetime.now().isoformat() - - await service.append_event(session_obj, event) - session_obj.last_update_time = time.time() - - cached_session = self.sessions.get(context_id) - if cached_session and hasattr(cached_session, 'events'): - cached_session.events.append(event) - elif cached_session: - cached_session.events = [event] - - async def _append_external_event(self, session: Any, agent_name: str, message_text: str) -> None: - if session is None: - return - event = Event( - invocationId=str(uuid.uuid4()), - id=str(uuid.uuid4()), - author=agent_name, - content=types.Content( - role='assistant', - parts=[Part.from_text(text=message_text)] - ), - actions=EventActions(), - ) - await self.runner.session_service.append_event(session, event) - if hasattr(session, 'events'): - session.events.append(event) - else: - session.events = [event] - - async def _send_to_agent( - self, - agent_name: str, - message: Union[str, Dict[str, Any], List[Dict[str, Any]]], - session: Any, - context_id: str, - ) -> str: - agent_entry = self.agents.get(agent_name) - if not agent_entry: - return f"Agent '{agent_name}' is not registered." 
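The `_send_to_agent` path that follows reconstructs a missing remote connection on demand and caches it back onto the agent entry. A generic sketch of that lazy-init-and-cache pattern, where `make_conn` is a hypothetical async factory standing in for `RemoteAgentConnection` (the real code also fetches the agent card at creation time):

```python
import asyncio


async def get_connection(agent_entry: dict, make_conn):
    """Return a cached connection for this agent, creating it on first use.

    `make_conn` is an assumed async factory; in the executor this is where
    RemoteAgentConnection is constructed and its agent card fetched.
    """
    conn = agent_entry.get("connection")
    if conn is None:
        conn = await make_conn(agent_entry["url"])
        agent_entry["connection"] = conn  # cache for subsequent sends
    return conn
```

Caching on the registry entry means repeated delegations to the same agent reuse one connection instead of re-handshaking per message.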
- - conn = agent_entry.get('connection') - if conn is None: - conn = RemoteAgentConnection(agent_entry['url']) - await conn.get_agent_card() - agent_entry['connection'] = conn - - conn.context_id = context_id - response = await conn.send_message(message) - response_text = response if isinstance(response, str) else str(response) - await self._append_external_event(session, agent_name, response_text) - return response_text - - async def delegate_file_to_agent( - self, - agent_name: str, - file_path: str, - note: str = "", - session: Any = None, - context_id: str | None = None, - ) -> str: - try: - project_root = None - try: - config = ProjectConfigManager() - if config.is_initialized(): - project_root = config.config_path.parent - except Exception: - project_root = None - - path_obj = Path(file_path).expanduser() - if not path_obj.is_absolute() and project_root: - path_obj = (project_root / path_obj).resolve() - else: - path_obj = path_obj.resolve() - - if not path_obj.is_file(): - return f"File not found: {path_obj}" - - data = path_obj.read_bytes() - except Exception as exc: - return f"Failed to read file '{file_path}': {exc}" - - message_text = note or f"Please analyse the artifact {path_obj.name}." - - if session is None: - if not self.sessions: - return "No active session available for delegation." 
- default_context = next(iter(self.sessions.keys())) - session = self.sessions[default_context] - context_id = default_context - - if context_id is None: - session_id = getattr(session, 'id', None) - context_id = self.session_lookup.get(session_id, session_id or 'default') - - app_name = getattr(session, 'app_name', 'fuzzforge') - user_id = getattr(session, 'user_id', 'user') - session_id = getattr(session, 'id', context_id) - - mime_type, _ = mimetypes.guess_type(str(path_obj)) - if not mime_type: - mime_type = 'application/octet-stream' - - sha256_digest = hashlib.sha256(data).hexdigest() - size = len(data) - - artifact_version = None - if self.artifact_service: - try: - artifact_part = types.Part( - inline_data=types.Blob(data=data, mime_type=mime_type) - ) - artifact_version = await self.artifact_service.save_artifact( - app_name=app_name, - user_id=user_id, - session_id=session_id, - filename=path_obj.name, - artifact=artifact_part, - ) - except Exception as exc: - artifact_version = None - if self.debug: - print(f"[DEBUG] Failed to persist artifact in service: {exc}") - - artifact_meta = self._register_artifact_bytes( - name=path_obj.name, - data=data, - mime_type=mime_type, - sha256_digest=sha256_digest, - size=size, - ) - - artifact_info = { - "file_uri": artifact_meta["file_uri"], # HTTP URL for download - "artifact_url": artifact_meta["file_uri"], # Alias for reverse agent compatibility - "cache_path": artifact_meta["path"], - "filename": path_obj.name, - "mime_type": mime_type, - "sha256": sha256_digest, - "size": size, - "session": { - "app_name": app_name, - "user_id": user_id, - "session_id": session_id, - }, - } - if artifact_version is not None: - artifact_info["artifact_version"] = artifact_version - - parts: List[Dict[str, Any]] = [ - {"type": "text", "text": message_text}, - { - "type": "file", - "file": { - "uri": artifact_meta["file_uri"], - "name": path_obj.name, - "mime_type": mime_type, - }, - }, - { - "type": "text", - "text": 
f"artifact_metadata: {json.dumps(artifact_info)}", - }, - ] - - return await self._send_to_agent(agent_name, {"parts": parts}, session, context_id) - - async def _publish_task_pending(self, run_id: str, context_id: str, workflow_name: str) -> None: - task_store = self.task_store - queue_manager = self.queue_manager - if not task_store or not queue_manager: - return - - context_identifier = context_id or "default" - - status_obj = TaskStatus( - state=TaskState.working, - timestamp=datetime.now().isoformat(), - ) - - task = Task( - id=run_id, - context_id=context_identifier, - status=status_obj, - metadata={"workflow": workflow_name}, - ) - await task_store.save(task) - - status_event = TaskStatusUpdateEvent( - taskId=run_id, - contextId=context_identifier, - status=status_obj, - final=False, - metadata={"workflow": workflow_name}, - ) - - queue = await queue_manager.create_or_tap(run_id) - await queue.enqueue_event(status_event) # type: ignore[arg-type] - - async def _publish_task_update( - self, - run_id: str, - context_id: str | None, - status_payload: Any, - summary_payload: Any, - message_text: str, - artifact_info: Dict[str, Any] | None = None, - ) -> None: - if not FuzzForgeExecutor.task_store or not FuzzForgeExecutor.queue_manager: - return - - task_store = self.task_store - queue_manager = self.queue_manager - - context_identifier = context_id or "default" - existing_task = await task_store.get(run_id) - - message_obj = Message( - messageId=str(uuid.uuid4()), - role="agent", - parts=[A2APart.model_validate({"type": "text", "text": message_text})], - contextId=context_identifier, - taskId=run_id, - ) - - status_obj = TaskStatus( - state=TaskState.completed, - timestamp=datetime.now().isoformat(), - message=message_obj, - ) - - metadata = { - "status": status_payload, - "summary": summary_payload, - } - if artifact_info: - metadata["artifact"] = artifact_info - - status_event = TaskStatusUpdateEvent( - taskId=run_id, - contextId=context_identifier, - 
status=status_obj, - final=True, - metadata=metadata, - ) - - if existing_task: - existing_task.status = status_obj - if existing_task.metadata is None: - existing_task.metadata = {} - existing_task.metadata.update(metadata) - if existing_task.history: - existing_task.history.append(message_obj) - else: - existing_task.history = [message_obj] - await task_store.save(existing_task) - else: - new_task = Task( - id=run_id, - context_id=context_identifier, - status=status_obj, - metadata=metadata, - history=[message_obj], - ) - await task_store.save(new_task) - - queue = await queue_manager.create_or_tap(run_id) - await queue.enqueue_event(status_event) # type: ignore[arg-type] diff --git a/ai/src/fuzzforge_ai/cli.py b/ai/src/fuzzforge_ai/cli.py deleted file mode 100755 index 4f5549f..0000000 --- a/ai/src/fuzzforge_ai/cli.py +++ /dev/null @@ -1,971 +0,0 @@ -# ruff: noqa: E402 # Imports delayed for environment/logging setup -#!/usr/bin/env python3 -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
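The `_publish_task_update` logic above is an upsert: mutate the stored task if one exists, otherwise create a fresh record, and in both cases merge metadata and append to history before saving. A minimal sketch under the assumption that `task_store` exposes async `get`/`save` (as the A2A task store does); `SimpleNamespace` stands in for the real `Task` model:

```python
import asyncio
from types import SimpleNamespace


async def upsert_task(task_store, run_id, status, metadata, message):
    """Update an existing task record, or create one on first sight (sketch)."""
    task = await task_store.get(run_id)
    if task is None:
        task = SimpleNamespace(
            id=run_id, status=status, metadata=dict(metadata), history=[message]
        )
    else:
        task.status = status
        if task.metadata is None:
            task.metadata = {}
        task.metadata.update(metadata)
        # history may be None on tasks created elsewhere
        if task.history:
            task.history.append(message)
        else:
            task.history = [message]
    await task_store.save(task)
    return task
```

Guarding for `None` metadata and history mirrors the deleted code: tasks created by other publishers may not have initialized those fields.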
- -""" -FuzzForge CLI - Clean modular version -Uses the separated agent components -""" - -import asyncio -import shlex -import os -import sys -import signal -import warnings -import logging -import random -from datetime import datetime -from contextlib import contextmanager -from pathlib import Path - -from dotenv import load_dotenv - -# Ensure Cognee writes logs inside the project workspace -project_root = Path.cwd() -default_log_dir = project_root / ".fuzzforge" / "logs" -default_log_dir.mkdir(parents=True, exist_ok=True) -log_path = default_log_dir / "cognee.log" -os.environ.setdefault("COGNEE_LOG_PATH", str(log_path)) - -# Suppress warnings -warnings.filterwarnings("ignore") -logging.basicConfig(level=logging.ERROR) - -# Load .env file with explicit path handling -# 1. First check current working directory for .fuzzforge/.env -fuzzforge_env = Path.cwd() / ".fuzzforge" / ".env" -if fuzzforge_env.exists(): - load_dotenv(fuzzforge_env, override=True) -else: - # 2. Then check parent directories for .fuzzforge projects - current_path = Path.cwd() - for parent in [current_path] + list(current_path.parents): - fuzzforge_dir = parent / ".fuzzforge" - if fuzzforge_dir.exists(): - project_env = fuzzforge_dir / ".env" - if project_env.exists(): - load_dotenv(project_env, override=True) - break - else: - # 3. 
Fallback to generic load_dotenv - load_dotenv(override=True) - -# Enhanced readline configuration for Rich Console input compatibility -try: - import readline - # Enable Rich-compatible input features - readline.parse_and_bind("tab: complete") - readline.parse_and_bind("set editing-mode emacs") - readline.parse_and_bind("set show-all-if-ambiguous on") - readline.parse_and_bind("set completion-ignore-case on") - readline.parse_and_bind("set colored-completion-prefix on") - readline.parse_and_bind("set enable-bracketed-paste on") # Better paste support - # Navigation bindings for better editing - readline.parse_and_bind("Control-a: beginning-of-line") - readline.parse_and_bind("Control-e: end-of-line") - readline.parse_and_bind("Control-u: unix-line-discard") - readline.parse_and_bind("Control-k: kill-line") - readline.parse_and_bind("Control-w: unix-word-rubout") - readline.parse_and_bind("Meta-Backspace: backward-kill-word") - # History and completion - readline.set_history_length(2000) - readline.set_startup_hook(None) - # Enable multiline editing hints - readline.parse_and_bind("set horizontal-scroll-mode off") - readline.parse_and_bind("set mark-symlinked-directories on") - READLINE_AVAILABLE = True -except ImportError: - READLINE_AVAILABLE = False - -from rich.console import Console -from rich.table import Table -from rich.panel import Panel -from rich import box - - -from .agent import FuzzForgeAgent -from .config_manager import ConfigManager -from .config_bridge import ProjectConfigManager - -console = Console() - -# Global shutdown flag -shutdown_requested = False - -# Dynamic status messages for better UX -THINKING_MESSAGES = [ - "Thinking", "Processing", "Computing", "Analyzing", "Working", - "Pondering", "Deliberating", "Calculating", "Reasoning", "Evaluating" -] - -WORKING_MESSAGES = [ - "Working", "Processing", "Handling", "Executing", "Running", - "Operating", "Performing", "Conducting", "Managing", "Coordinating" -] - -SEARCH_MESSAGES = [ - 
"Searching", "Scanning", "Exploring", "Investigating", "Hunting", - "Seeking", "Probing", "Examining", "Inspecting", "Browsing" -] - -# Cool prompt symbols -PROMPT_STYLES = [ - "ā–¶", "āÆ", "āž¤", "→", "Ā»", "⟩", "ā–·", "⇨", "⟶", "ā—†" -] - -def get_dynamic_status(action_type="thinking"): - """Get a random status message based on action type""" - if action_type == "thinking": - return f"{random.choice(THINKING_MESSAGES)}..." - elif action_type == "working": - return f"{random.choice(WORKING_MESSAGES)}..." - elif action_type == "searching": - return f"{random.choice(SEARCH_MESSAGES)}..." - else: - return f"{random.choice(THINKING_MESSAGES)}..." - -def get_prompt_symbol(): - """Get prompt symbol indicating where to write""" - return ">>" - -def signal_handler(signum, frame): - """Handle Ctrl+C gracefully""" - global shutdown_requested - shutdown_requested = True - console.print("\n\n[yellow]Shutting down gracefully...[/yellow]") - sys.exit(0) - -signal.signal(signal.SIGINT, signal_handler) - -@contextmanager -def safe_status(message: str): - """Safe status context manager""" - status = console.status(message, spinner="dots") - try: - status.start() - yield - finally: - status.stop() - - -class FuzzForgeCLI: - """Command-line interface for FuzzForge""" - - def __init__(self): - """Initialize the CLI""" - # Ensure .env is loaded from .fuzzforge directory - fuzzforge_env = Path.cwd() / ".fuzzforge" / ".env" - if fuzzforge_env.exists(): - load_dotenv(fuzzforge_env, override=True) - - # Load configuration for agent registry - self.config_manager = ConfigManager() - - # Check environment configuration - if not os.getenv('LITELLM_MODEL'): - console.print("[red]ERROR: LITELLM_MODEL not set in .env file[/red]") - console.print("Please set LITELLM_MODEL to your desired model") - sys.exit(1) - - # Create the agent (uses env vars directly) - self.agent = FuzzForgeAgent() - - # Create a consistent context ID for this CLI session - self.context_id = 
f"cli_{datetime.now().strftime('%Y%m%d_%H%M%S')}" - - # Track registered agents for config persistence - self.agents_modified = False - - # Command handlers - self.commands = { - "/help": self.cmd_help, - "/register": self.cmd_register, - "/unregister": self.cmd_unregister, - "/list": self.cmd_list, - "/memory": self.cmd_memory, - "/recall": self.cmd_recall, - "/artifacts": self.cmd_artifacts, - "/tasks": self.cmd_tasks, - "/skills": self.cmd_skills, - "/sessions": self.cmd_sessions, - "/clear": self.cmd_clear, - "/sendfile": self.cmd_sendfile, - "/quit": self.cmd_quit, - "/exit": self.cmd_quit, - } - - self.background_tasks: set[asyncio.Task] = set() - - def print_banner(self): - """Print welcome banner""" - card = self.agent.agent_card - - # Print ASCII banner - console.print("[medium_purple3] ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā•— ā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•—[/medium_purple3]") - console.print("[medium_purple3] ā–ˆā–ˆā•”ā•ā•ā•ā•ā•ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā•šā•ā•ā–ˆā–ˆā–ˆā•”ā•ā•šā•ā•ā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā•”ā•ā•ā•ā•ā•ā–ˆā–ˆā•”ā•ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•”ā•ā•ā•ā•ā• ā–ˆā–ˆā•”ā•ā•ā•ā•ā• ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•‘[/medium_purple3]") - console.print("[medium_purple3] ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā•‘ ā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•‘ā–ˆā–ˆā•‘[/medium_purple3]") - console.print("[medium_purple3] ā–ˆā–ˆā•”ā•ā•ā• ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā•”ā•ā•ā• ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā•”ā•ā•ā• ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•‘ā–ˆā–ˆā•‘[/medium_purple3]") - console.print("[medium_purple3] ā–ˆā–ˆā•‘ 
ā•šā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā•‘ ā•šā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā•šā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā•‘[/medium_purple3]") - console.print("[medium_purple3] ā•šā•ā• ā•šā•ā•ā•ā•ā•ā• ā•šā•ā•ā•ā•ā•ā•ā•ā•šā•ā•ā•ā•ā•ā•ā•ā•šā•ā• ā•šā•ā•ā•ā•ā•ā• ā•šā•ā• ā•šā•ā• ā•šā•ā•ā•ā•ā•ā• ā•šā•ā•ā•ā•ā•ā•ā• ā•šā•ā• ā•šā•ā•ā•šā•ā•[/medium_purple3]") - console.print(f"\n[dim]{card.description}[/dim]\n") - - provider = ( - os.getenv("LLM_PROVIDER") - or os.getenv("LLM_COGNEE_PROVIDER") - or os.getenv("COGNEE_LLM_PROVIDER") - or "unknown" - ) - - console.print( - "LLM Provider: [medium_purple1]{provider}[/medium_purple1]".format( - provider=provider - ) - ) - console.print( - "LLM Model: [medium_purple1]{model}[/medium_purple1]".format( - model=self.agent.model - ) - ) - if self.agent.executor.agentops_trace: - console.print("Tracking: [medium_purple1]AgentOps active[/medium_purple1]") - - # Show skills - console.print("\nSkills:") - for skill in card.skills: - console.print( - f" • [deep_sky_blue1]{skill.name}[/deep_sky_blue1] – {skill.description}" - ) - console.print("\nType /help for commands or just chat\n") - - async def cmd_help(self, args: str = "") -> None: - """Show help""" - help_text = """ -[bold]Commands:[/bold] - /register - Register an A2A agent (saves to config) - /unregister - Remove agent from registry and config - /list - List registered agents - -[bold]Memory Systems:[/bold] - /recall - Search past conversations (ADK Memory) - /memory - Show knowledge graph (Cognee) - /memory save - Save to knowledge graph - /memory search - Search knowledge graph - -[bold]Other:[/bold] - /artifacts - List created artifacts - /artifacts - Show artifact content - /tasks [id] - Show task list or details - /skills - Show FuzzForge skills - /sessions - List active sessions - /sendfile [message] - Attach file as artifact and route 
to agent - /clear - Clear screen - /help - Show this help - /quit - Exit - -[bold]Sample prompts:[/bold] - run fuzzforge workflow security_assessment on /absolute/path --volume-mode ro - list fuzzforge runs limit=5 - get fuzzforge summary - query project knowledge about "unsafe Rust" using GRAPH_COMPLETION - export project file src/lib.rs as artifact - /memory search "recent findings" - -[bold]Input Editing:[/bold] - Arrow keys - Move cursor - Ctrl+A/E - Start/end of line - Up/Down - Command history - """ - console.print(help_text) - - async def cmd_register(self, args: str) -> None: - """Register an agent""" - if not args: - console.print("Usage: /register ") - return - - with safe_status(f"{get_dynamic_status('working')} Registering {args}"): - result = await self.agent.register_agent(args.strip()) - - if result["success"]: - console.print(f"āœ… Registered: [bold]{result['name']}[/bold]") - console.print(f" Capabilities: {result['capabilities']} skills") - - # Get description from the agent's card - agents = self.agent.list_agents() - description = "" - for agent in agents: - if agent['name'] == result['name']: - description = agent.get('description', '') - break - - # Add to config for persistence - self.config_manager.add_registered_agent( - name=result['name'], - url=args.strip(), - description=description - ) - console.print(" [dim]Saved to config for auto-registration[/dim]") - else: - console.print(f"[red]Failed: {result['error']}[/red]") - - async def cmd_unregister(self, args: str) -> None: - """Unregister an agent and remove from config""" - if not args: - console.print("Usage: /unregister ") - return - - # Try to find the agent - agents = self.agent.list_agents() - agent_to_remove = None - - for agent in agents: - if agent['name'].lower() == args.lower() or agent['url'] == args: - agent_to_remove = agent - break - - if not agent_to_remove: - console.print(f"[yellow]Agent '{args}' not found[/yellow]") - return - - # Remove from config - if 
self.config_manager.remove_registered_agent(name=agent_to_remove['name'], url=agent_to_remove['url']): - console.print(f"āœ… Unregistered: [bold]{agent_to_remove['name']}[/bold]") - console.print(" [dim]Removed from config (won't auto-register next time)[/dim]") - else: - console.print("[yellow]Agent unregistered from session but not found in config[/yellow]") - - async def cmd_list(self, args: str = "") -> None: - """List registered agents""" - agents = self.agent.list_agents() - - if not agents: - console.print("No agents registered. Use /register ") - return - - table = Table(title="Registered Agents", box=box.ROUNDED) - table.add_column("Name", style="medium_purple3") - table.add_column("URL", style="deep_sky_blue3") - table.add_column("Skills", style="plum3") - table.add_column("Description", style="dim") - - for agent in agents: - desc = agent['description'] - if len(desc) > 40: - desc = desc[:37] + "..." - table.add_row( - agent['name'], - agent['url'], - str(agent['skills']), - desc - ) - - console.print(table) - - async def cmd_recall(self, args: str = "") -> None: - """Search conversational memory (past conversations)""" - if not args: - console.print("Usage: /recall ") - return - - await self._sync_conversational_memory() - - # First try MemoryService (for ingested memories) - with safe_status(get_dynamic_status('searching')): - results = await self.agent.memory_manager.search_conversational_memory(args) - - if results and results.memories: - console.print(f"[bold]Found {len(results.memories)} memories:[/bold]\n") - for i, memory in enumerate(results.memories, 1): - # MemoryEntry has 'text' field, not 'content' - text = getattr(memory, 'text', str(memory)) - if len(text) > 200: - text = text[:200] + "..." - console.print(f"{i}. 
{text}") - else: - # If MemoryService is empty, search SQLite directly - console.print("[yellow]No memories in MemoryService, searching SQLite sessions...[/yellow]") - - # Check if using DatabaseSessionService - if hasattr(self.agent.executor, 'session_service'): - service_type = type(self.agent.executor.session_service).__name__ - if service_type == 'DatabaseSessionService': - # Search SQLite database directly - import sqlite3 - import os - db_path = os.getenv('SESSION_DB_PATH', './fuzzforge_sessions.db') - - if os.path.exists(db_path): - conn = sqlite3.connect(db_path) - cursor = conn.cursor() - - # Search in events table - query = f"%{args}%" - cursor.execute( - "SELECT content FROM events WHERE content LIKE ? LIMIT 10", - (query,) - ) - - rows = cursor.fetchall() - conn.close() - - if rows: - console.print(f"[green]Found {len(rows)} matches in SQLite sessions:[/green]\n") - for i, (content,) in enumerate(rows, 1): - # Parse JSON content - import json - try: - data = json.loads(content) - if 'parts' in data and data['parts']: - text = data['parts'][0].get('text', '')[:150] - role = data.get('role', 'unknown') - console.print(f"{i}. [{role}]: {text}...") - except Exception: - console.print(f"{i}. 
{content[:150]}...") - else: - console.print("[yellow]No matches found in SQLite either[/yellow]") - else: - console.print("[yellow]SQLite database not found[/yellow]") - else: - console.print(f"[dim]Using {service_type} (not searchable)[/dim]") - else: - console.print("[yellow]No session history available[/yellow]") - - async def cmd_memory(self, args: str = "") -> None: - """Inspect conversational memory and knowledge graph state.""" - raw_args = (args or "").strip() - lower_args = raw_args.lower() - - if not raw_args or lower_args in {"status", "info"}: - await self._show_memory_status() - return - - if lower_args == "datasets": - await self._show_dataset_summary() - return - - if lower_args.startswith("search ") or lower_args.startswith("recall "): - query = raw_args.split(" ", 1)[1].strip() if " " in raw_args else "" - if not query: - console.print("Usage: /memory search <query>") - return - await self.cmd_recall(query) - return - - console.print("Usage: /memory [status|datasets|search <query>]") - console.print("[dim]/memory search <query> is an alias for /recall <query>[/dim]") - - async def _sync_conversational_memory(self) -> None: - """Ensure the ADK memory service ingests any completed sessions.""" - memory_service = getattr(self.agent.memory_manager, "memory_service", None) - executor_sessions = getattr(self.agent.executor, "sessions", {}) - metadata_map = getattr(self.agent.executor, "session_metadata", {}) - - if not memory_service or not executor_sessions: - return - - for context_id, session in list(executor_sessions.items()): - meta = metadata_map.get(context_id, {}) - if meta.get('memory_synced'): - continue - - add_session = getattr(memory_service, "add_session_to_memory", None) - if not callable(add_session): - return - - try: - await add_session(session) - meta['memory_synced'] = True - metadata_map[context_id] = meta - except Exception as exc: # pragma: no cover - defensive logging - if os.getenv('FUZZFORGE_DEBUG', '0') == '1': - console.print(f"[yellow]Memory sync 
failed:[/yellow] {exc}") - - async def _show_memory_status(self) -> None: - """Render conversational memory, session store, and knowledge graph status.""" - await self._sync_conversational_memory() - - status = self.agent.memory_manager.get_status() - - conversational = status.get("conversational_memory", {}) - conv_type = conversational.get("type", "unknown") - conv_active = "yes" if conversational.get("active") else "no" - conv_details = conversational.get("details", "") - - session_service = getattr(self.agent.executor, "session_service", None) - session_service_name = type(session_service).__name__ if session_service else "Unavailable" - - session_lines = [ - f"[bold]Service:[/bold] {session_service_name}" - ] - - session_count = None - event_count = None - db_path_display = None - - if session_service_name == "DatabaseSessionService": - import sqlite3 - - db_path = os.getenv('SESSION_DB_PATH', './fuzzforge_sessions.db') - session_path = Path(db_path).expanduser().resolve() - db_path_display = str(session_path) - - if session_path.exists(): - try: - with sqlite3.connect(session_path) as conn: - cursor = conn.cursor() - cursor.execute("SELECT COUNT(*) FROM sessions") - session_count = cursor.fetchone()[0] - cursor.execute("SELECT COUNT(*) FROM events") - event_count = cursor.fetchone()[0] - except Exception as exc: - session_lines.append(f"[yellow]Warning:[/yellow] Unable to read session database ({exc})") - else: - session_lines.append("[yellow]SQLite session database not found yet[/yellow]") - - elif session_service_name == "InMemorySessionService": - session_lines.append("[dim]Session data persists for the current process only[/dim]") - - if db_path_display: - session_lines.append(f"[bold]Database:[/bold] {db_path_display}") - if session_count is not None: - session_lines.append(f"[bold]Sessions Recorded:[/bold] {session_count}") - if event_count is not None: - session_lines.append(f"[bold]Events Logged:[/bold] {event_count}") - - conv_lines = [ - 
f"[bold]Type:[/bold] {conv_type}", - f"[bold]Active:[/bold] {conv_active}" - ] - if conv_details: - conv_lines.append(f"[bold]Details:[/bold] {conv_details}") - - console.print(Panel("\n".join(conv_lines), title="Conversation Memory", border_style="medium_purple3")) - console.print(Panel("\n".join(session_lines), title="Session Store", border_style="deep_sky_blue3")) - - # Knowledge graph section - knowledge = status.get("knowledge_graph", {}) - kg_active = knowledge.get("active", False) - kg_lines = [ - f"[bold]Active:[/bold] {'yes' if kg_active else 'no'}", - f"[bold]Purpose:[/bold] {knowledge.get('purpose', 'N/A')}" - ] - - cognee_data = None - cognee_error = None - try: - project_config = ProjectConfigManager() - cognee_data = project_config.get_cognee_config() - except Exception as exc: # pragma: no cover - defensive - cognee_error = str(exc) - - if cognee_data: - data_dir = cognee_data.get('data_directory') - system_dir = cognee_data.get('system_directory') - if data_dir: - kg_lines.append(f"[bold]Data dir:[/bold] {data_dir}") - if system_dir: - kg_lines.append(f"[bold]System dir:[/bold] {system_dir}") - elif cognee_error: - kg_lines.append(f"[yellow]Config unavailable:[/yellow] {cognee_error}") - - dataset_summary = None - if kg_active: - try: - integration = await self.agent.executor._get_knowledge_integration() - if integration: - dataset_summary = await integration.list_datasets() - except Exception as exc: # pragma: no cover - defensive - kg_lines.append(f"[yellow]Dataset listing failed:[/yellow] {exc}") - - if dataset_summary: - if dataset_summary.get("error"): - kg_lines.append(f"[yellow]Dataset listing failed:[/yellow] {dataset_summary['error']}") - else: - datasets = dataset_summary.get("datasets", []) - total = dataset_summary.get("total_datasets") - if total is not None: - kg_lines.append(f"[bold]Datasets:[/bold] {total}") - if datasets: - preview = ", ".join(sorted(datasets)[:5]) - if len(datasets) > 5: - preview += ", …" - 
kg_lines.append(f"[bold]Samples:[/bold] {preview}") - else: - kg_lines.append("[dim]Run `fuzzforge ingest` to populate the knowledge graph[/dim]") - - console.print(Panel("\n".join(kg_lines), title="Knowledge Graph", border_style="spring_green4")) - console.print("\n[dim]Subcommands: /memory datasets | /memory search [/dim]") - - async def _show_dataset_summary(self) -> None: - """List datasets available in the Cognee knowledge graph.""" - try: - integration = await self.agent.executor._get_knowledge_integration() - except Exception as exc: - console.print(f"[yellow]Knowledge graph unavailable:[/yellow] {exc}") - return - - if not integration: - console.print("[yellow]Knowledge graph is not initialised yet.[/yellow]") - console.print("[dim]Run `fuzzforge ingest --path . --recursive` to create the project dataset.[/dim]") - return - - with safe_status(get_dynamic_status('searching')): - dataset_info = await integration.list_datasets() - - if dataset_info.get("error"): - console.print(f"[red]{dataset_info['error']}[/red]") - return - - datasets = dataset_info.get("datasets", []) - if not datasets: - console.print("[yellow]No datasets found.[/yellow]") - console.print("[dim]Run `fuzzforge ingest` to populate the knowledge graph.[/dim]") - return - - table = Table(title="Cognee Datasets", box=box.ROUNDED) - table.add_column("Dataset", style="medium_purple3") - table.add_column("Notes", style="dim") - - for name in sorted(datasets): - note = "" - if name.endswith("_codebase"): - note = "primary project dataset" - table.add_row(name, note) - - console.print(table) - console.print( - "[dim]Use knowledge graph prompts (e.g. 
`search project knowledge for \"topic\" using INSIGHTS`) to query these datasets.[/dim]" - ) - - async def cmd_artifacts(self, args: str = "") -> None: - """List or show artifacts""" - if args: - # Show specific artifact - artifacts = await self.agent.executor.get_artifacts(self.context_id) - for artifact in artifacts: - if artifact['id'] == args or args in artifact['id']: - console.print(Panel( - f"[bold]{artifact['title']}[/bold]\n" - f"Type: {artifact['type']} | Created: {artifact['created_at'][:19]}\n\n" - f"[code]{artifact['content']}[/code]", - title=f"Artifact: {artifact['id']}", - border_style="medium_purple3" - )) - return - console.print(f"[yellow]Artifact {args} not found[/yellow]") - return - - # List all artifacts - artifacts = await self.agent.executor.get_artifacts(self.context_id) - - if not artifacts: - console.print("No artifacts created yet") - console.print("[dim]Artifacts are created when generating code, configs, or documents[/dim]") - return - - table = Table(title="Artifacts", box=box.ROUNDED) - table.add_column("ID", style="medium_purple3") - table.add_column("Type", style="deep_sky_blue3") - table.add_column("Title", style="plum3") - table.add_column("Size", style="dim") - table.add_column("Created", style="dim") - - for artifact in artifacts: - size = f"{len(artifact['content'])} chars" - created = artifact['created_at'][:19] # Just date and time - - table.add_row( - artifact['id'], - artifact['type'], - artifact['title'][:40] + "..." 
if len(artifact['title']) > 40 else artifact['title'], - size, - created - ) - - console.print(table) - console.print("\n[dim]Use /artifacts to view artifact content[/dim]") - - async def cmd_tasks(self, args: str = "") -> None: - """List tasks or show details for a specific task.""" - store = getattr(self.agent.executor, "task_store", None) - if not store or not hasattr(store, "tasks"): - console.print("Task store not available") - return - - task_id = args.strip() - - async with store.lock: - tasks = dict(store.tasks) - - if not tasks: - console.print("No tasks recorded yet") - return - - if task_id: - task = tasks.get(task_id) - if not task: - console.print(f"Task '{task_id}' not found") - return - - state_str = task.status.state.value if hasattr(task.status.state, "value") else str(task.status.state) - console.print(f"\n[bold]Task {task.id}[/bold]") - console.print(f"Context: {task.context_id}") - console.print(f"State: {state_str}") - console.print(f"Timestamp: {task.status.timestamp}") - if task.metadata: - console.print("Metadata:") - for key, value in task.metadata.items(): - console.print(f" • {key}: {value}") - if task.history: - console.print("History:") - for entry in task.history[-5:]: - text = getattr(entry, "text", None) - if not text and hasattr(entry, "parts"): - text = " ".join( - getattr(part, "text", "") for part in getattr(entry, "parts", []) - ) - console.print(f" - {text}") - return - - table = Table(title="FuzzForge Tasks", box=box.ROUNDED) - table.add_column("ID", style="medium_purple3") - table.add_column("State", style="white") - table.add_column("Workflow", style="deep_sky_blue3") - table.add_column("Updated", style="green") - - for task in tasks.values(): - state_value = task.status.state.value if hasattr(task.status.state, "value") else str(task.status.state) - workflow = "" - if task.metadata: - workflow = task.metadata.get("workflow") or task.metadata.get("workflow_name") or "" - timestamp = task.status.timestamp if task.status else 
"" - table.add_row(task.id, state_value, workflow, timestamp) - - console.print(table) - console.print("\n[dim]Use /tasks to view task details[/dim]") - - async def cmd_sessions(self, args: str = "") -> None: - """List active sessions""" - sessions = self.agent.executor.sessions - - if not sessions: - console.print("No active sessions") - return - - table = Table(title="Active Sessions", box=box.ROUNDED) - table.add_column("Context ID", style="medium_purple3") - table.add_column("Session ID", style="deep_sky_blue3") - table.add_column("User ID", style="plum3") - table.add_column("State", style="dim") - - for context_id, session in sessions.items(): - # Get session info - session_id = getattr(session, 'id', 'N/A') - user_id = getattr(session, 'user_id', 'N/A') - state = getattr(session, 'state', {}) - - # Format state info - agents_count = len(state.get('registered_agents', [])) - state_info = f"{agents_count} agents registered" - - table.add_row( - context_id[:20] + "..." if len(context_id) > 20 else context_id, - session_id[:20] + "..." 
if len(str(session_id)) > 20 else str(session_id), - user_id, - state_info - ) - - console.print(table) - console.print(f"\n[dim]Current session: {self.context_id}[/dim]") - - async def cmd_skills(self, args: str = "") -> None: - """Show FuzzForge skills""" - card = self.agent.agent_card - - table = Table(title=f"{card.name} Skills", box=box.ROUNDED) - table.add_column("Skill", style="medium_purple3") - table.add_column("Description", style="white") - table.add_column("Tags", style="deep_sky_blue3") - - for skill in card.skills: - table.add_row( - skill.name, - skill.description, - ", ".join(skill.tags[:3]) - ) - - console.print(table) - - async def cmd_clear(self, args: str = "") -> None: - """Clear screen""" - console.clear() - self.print_banner() - - async def cmd_sendfile(self, args: str) -> None: - """Encode a local file as an artifact and route it to a registered agent.""" - tokens = shlex.split(args) - if len(tokens) < 2: - console.print("Usage: /sendfile <agent> <file> [message]") - return - - agent_name = tokens[0] - file_arg = tokens[1] - note = " ".join(tokens[2:]).strip() - - file_path = Path(file_arg).expanduser() - if not file_path.exists(): - console.print(f"[red]File not found:[/red] {file_path}") - return - - session = self.agent.executor.sessions.get(self.context_id) - if not session: - console.print("[red]No active session available. 
Try sending a prompt first.[/red]") - return - - console.print(f"[dim]Delegating {file_path.name} to {agent_name}...[/dim]") - - async def _delegate() -> None: - try: - response = await self.agent.executor.delegate_file_to_agent( - agent_name, - str(file_path), - note, - session=session, - context_id=self.context_id, - ) - console.print(f"[{agent_name}]: {response}") - except Exception as exc: - console.print(f"[red]Failed to delegate file:[/red] {exc}") - finally: - self.background_tasks.discard(asyncio.current_task()) - - task = asyncio.create_task(_delegate()) - self.background_tasks.add(task) - console.print("[dim]Delegation in progress… you can continue working.[/dim]") - - async def cmd_quit(self, args: str = "") -> None: - """Exit the CLI""" - console.print("\n[green]Shutting down...[/green]") - await self.agent.cleanup() - if self.background_tasks: - for task in list(self.background_tasks): - task.cancel() - await asyncio.gather(*self.background_tasks, return_exceptions=True) - console.print("Goodbye!\n") - sys.exit(0) - - async def process_command(self, text: str) -> bool: - """Process slash commands""" - if not text.startswith('/'): - return False - - parts = text.split(maxsplit=1) - cmd = parts[0].lower() - args = parts[1] if len(parts) > 1 else "" - - if cmd in self.commands: - await self.commands[cmd](args) - return True - - console.print(f"Unknown command: {cmd}") - return True - - async def auto_register_agents(self): - """Auto-register agents from config on startup""" - agents_to_register = self.config_manager.get_registered_agents() - - if agents_to_register: - console.print(f"\n[dim]Auto-registering {len(agents_to_register)} agents from config...[/dim]") - - for agent_config in agents_to_register: - url = agent_config.get('url') - name = agent_config.get('name', 'Unknown') - - if url: - try: - with safe_status(f"Registering {name}..."): - result = await self.agent.register_agent(url) - - if result["success"]: - console.print(f" āœ… {name}: 
[green]Connected[/green]") - else: - console.print(f" āš ļø {name}: [yellow]Failed - {result.get('error', 'Unknown error')}[/yellow]") - except Exception as e: - console.print(f" āš ļø {name}: [yellow]Failed - {e}[/yellow]") - - console.print("") # Empty line for spacing - - async def run(self): - """Main CLI loop""" - self.print_banner() - - # Auto-register agents from config - await self.auto_register_agents() - - while not shutdown_requested: - try: - # Use standard input with non-deletable colored prompt - prompt_symbol = get_prompt_symbol() - try: - # Print colored prompt then use input() for non-deletable behavior - console.print(f"[medium_purple3]{prompt_symbol}[/medium_purple3] ", end="") - user_input = input().strip() - except (EOFError, KeyboardInterrupt): - raise - - if not user_input: - continue - - # Check for commands - if await self.process_command(user_input): - continue - - # Process message - with safe_status(get_dynamic_status('thinking')): - response = await self.agent.process_message(user_input, self.context_id) - - # Display response - console.print(f"\n{response}\n") - - except KeyboardInterrupt: - await self.cmd_quit() - - except EOFError: - await self.cmd_quit() - - except Exception as e: - console.print(f"[red]Error: {e}[/red]") - if os.getenv('FUZZFORGE_DEBUG') == '1': - console.print_exception() - console.print("") - - await self.agent.cleanup() - - -def main(): - """Main entry point""" - try: - cli = FuzzForgeCLI() - asyncio.run(cli.run()) - except KeyboardInterrupt: - console.print("\n[yellow]Interrupted[/yellow]") - sys.exit(0) - except Exception as e: - console.print(f"[red]Fatal error: {e}[/red]") - if os.getenv('FUZZFORGE_DEBUG') == '1': - console.print_exception() - sys.exit(1) - - -if __name__ == "__main__": - main() diff --git a/ai/src/fuzzforge_ai/cognee_integration.py b/ai/src/fuzzforge_ai/cognee_integration.py deleted file mode 100644 index 90d005d..0000000 --- a/ai/src/fuzzforge_ai/cognee_integration.py +++ /dev/null @@ 
-1,469 +0,0 @@ -""" -Cognee Integration Module for FuzzForge -Provides standardized access to project-specific knowledge graphs -Can be reused by external agents and other components -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import os -from typing import Dict, Any, Optional -from pathlib import Path - - -class CogneeProjectIntegration: - """ - Standardized Cognee integration that can be reused across agents - Automatically detects project context and provides knowledge graph access - """ - - def __init__(self, project_dir: Optional[str] = None): - """ - Initialize with project directory (defaults to current working directory) - - Args: - project_dir: Path to project directory (optional, defaults to cwd) - """ - self.project_dir = Path(project_dir) if project_dir else Path.cwd() - self.config_file = self.project_dir / ".fuzzforge" / "config.yaml" - self.project_context = None - self._cognee = None - self._initialized = False - - async def initialize(self) -> bool: - """ - Initialize Cognee with project context - - Returns: - bool: True if initialization successful - """ - try: - # Import Cognee - import cognee - self._cognee = cognee - - # Load project context - if not self._load_project_context(): - return False - - # Configure Cognee for this project - await self._setup_cognee_config() - - self._initialized = True - return True - - except ImportError: - print("Cognee not installed. 
Install with: pip install cognee") - return False - except Exception as e: - print(f"Failed to initialize Cognee: {e}") - return False - - def _load_project_context(self) -> bool: - """Load project context from FuzzForge config""" - try: - if not self.config_file.exists(): - print(f"No FuzzForge config found at {self.config_file}") - return False - - import yaml - with open(self.config_file, 'r') as f: - config = yaml.safe_load(f) - - self.project_context = { - "project_name": config.get("project", {}).get("name", "default"), - "project_id": config.get("project", {}).get("id", "default"), - "tenant_id": config.get("cognee", {}).get("tenant", "default") - } - return True - - except Exception as e: - print(f"Error loading project context: {e}") - return False - - async def _setup_cognee_config(self): - """Configure Cognee for project-specific access""" - # Set API key and model - api_key = os.getenv('OPENAI_API_KEY') - model = os.getenv('LITELLM_MODEL', 'gpt-4o-mini') - - if not api_key: - raise ValueError("OPENAI_API_KEY required for Cognee operations") - - # Configure Cognee - self._cognee.config.set_llm_api_key(api_key) - self._cognee.config.set_llm_model(model) - self._cognee.config.set_llm_provider("openai") - - # Set project-specific directories - project_cognee_dir = self.project_dir / ".fuzzforge" / "cognee" / f"project_{self.project_context['project_id']}" - - self._cognee.config.data_root_directory(str(project_cognee_dir / "data")) - self._cognee.config.system_root_directory(str(project_cognee_dir / "system")) - - # Ensure directories exist - project_cognee_dir.mkdir(parents=True, exist_ok=True) - (project_cognee_dir / "data").mkdir(exist_ok=True) - (project_cognee_dir / "system").mkdir(exist_ok=True) - - async def search_knowledge_graph(self, query: str, search_type: str = "GRAPH_COMPLETION", dataset: str = None) -> Dict[str, Any]: - """ - Search the project's knowledge graph - - Args: - query: Search query - search_type: Type of search 
("GRAPH_COMPLETION", "INSIGHTS", "CHUNKS", etc.) - dataset: Specific dataset to search (optional) - - Returns: - Dict containing search results - """ - if not self._initialized: - await self.initialize() - - if not self._initialized: - return {"error": "Cognee not initialized"} - - try: - from cognee.modules.search.types import SearchType - - # Resolve search type dynamically; fallback to GRAPH_COMPLETION - try: - search_type_enum = getattr(SearchType, search_type.upper()) - except AttributeError: - search_type_enum = SearchType.GRAPH_COMPLETION - search_type = "GRAPH_COMPLETION" - - # Prepare search kwargs - search_kwargs = { - "query_type": search_type_enum, - "query_text": query - } - - # Add dataset filter if specified - if dataset: - search_kwargs["datasets"] = [dataset] - - results = await self._cognee.search(**search_kwargs) - - return { - "query": query, - "search_type": search_type, - "dataset": dataset, - "results": results, - "project": self.project_context["project_name"] - } - except Exception as e: - return {"error": f"Search failed: {e}"} - - async def list_knowledge_data(self) -> Dict[str, Any]: - """ - List available data in the knowledge graph - - Returns: - Dict containing available data - """ - if not self._initialized: - await self.initialize() - - if not self._initialized: - return {"error": "Cognee not initialized"} - - try: - data = await self._cognee.list_data() - return { - "project": self.project_context["project_name"], - "available_data": data - } - except Exception as e: - return {"error": f"Failed to list data: {e}"} - - async def cognify_text(self, text: str, dataset: str = None) -> Dict[str, Any]: - """ - Cognify text content into knowledge graph - - Args: - text: Text to cognify - dataset: Dataset name (defaults to project_name_codebase) - - Returns: - Dict containing cognify results - """ - if not self._initialized: - await self.initialize() - - if not self._initialized: - return {"error": "Cognee not initialized"} - - if not 
dataset: - dataset = f"{self.project_context['project_name']}_codebase" - - try: - # Add text to dataset - await self._cognee.add([text], dataset_name=dataset) - - # Process (cognify) the dataset - await self._cognee.cognify([dataset]) - - return { - "text_length": len(text), - "dataset": dataset, - "project": self.project_context["project_name"], - "status": "success" - } - except Exception as e: - return {"error": f"Cognify failed: {e}"} - - async def ingest_text_to_dataset(self, text: str, dataset: str = None) -> Dict[str, Any]: - """ - Ingest text content into a specific dataset - - Args: - text: Text to ingest - dataset: Dataset name (defaults to project_name_codebase) - - Returns: - Dict containing ingest results - """ - if not self._initialized: - await self.initialize() - - if not self._initialized: - return {"error": "Cognee not initialized"} - - if not dataset: - dataset = f"{self.project_context['project_name']}_codebase" - - try: - # Add text to dataset - await self._cognee.add([text], dataset_name=dataset) - - # Process (cognify) the dataset - await self._cognee.cognify([dataset]) - - return { - "text_length": len(text), - "dataset": dataset, - "project": self.project_context["project_name"], - "status": "success" - } - except Exception as e: - return {"error": f"Ingest failed: {e}"} - - async def ingest_files_to_dataset(self, file_paths: list, dataset: str = None) -> Dict[str, Any]: - """ - Ingest multiple files into a specific dataset - - Args: - file_paths: List of file paths to ingest - dataset: Dataset name (defaults to project_name_codebase) - - Returns: - Dict containing ingest results - """ - if not self._initialized: - await self.initialize() - - if not self._initialized: - return {"error": "Cognee not initialized"} - - if not dataset: - dataset = f"{self.project_context['project_name']}_codebase" - - try: - # Validate and filter readable files - valid_files = [] - for file_path in file_paths: - try: - path = Path(file_path) - if path.exists() 
and path.is_file(): - # Test if file is readable - with open(path, 'r', encoding='utf-8') as f: - f.read(1) - valid_files.append(str(path)) - except (UnicodeDecodeError, PermissionError, OSError): - continue - - if not valid_files: - return {"error": "No valid files found to ingest"} - - # Add files to dataset - await self._cognee.add(valid_files, dataset_name=dataset) - - # Process (cognify) the dataset - await self._cognee.cognify([dataset]) - - return { - "files_processed": len(valid_files), - "total_files_requested": len(file_paths), - "dataset": dataset, - "project": self.project_context["project_name"], - "status": "success" - } - except Exception as e: - return {"error": f"Ingest failed: {e}"} - - async def list_datasets(self) -> Dict[str, Any]: - """ - List all datasets available in the project - - Returns: - Dict containing available datasets - """ - if not self._initialized: - await self.initialize() - - if not self._initialized: - return {"error": "Cognee not initialized"} - - try: - # Get available datasets by searching for data - data = await self._cognee.list_data() - - # Extract unique dataset names from the data - datasets = set() - if isinstance(data, list): - for item in data: - if isinstance(item, dict) and 'dataset_name' in item: - datasets.add(item['dataset_name']) - - return { - "project": self.project_context["project_name"], - "datasets": list(datasets), - "total_datasets": len(datasets) - } - except Exception as e: - return {"error": f"Failed to list datasets: {e}"} - - async def create_dataset(self, dataset: str) -> Dict[str, Any]: - """ - Create a new dataset (dataset is created automatically when data is added) - - Args: - dataset: Dataset name to create - - Returns: - Dict containing creation result - """ - if not self._initialized: - await self.initialize() - - if not self._initialized: - return {"error": "Cognee not initialized"} - - try: - # In Cognee, datasets are created implicitly when data is added - # We'll add empty content to 
create the dataset - await self._cognee.add([f"Dataset {dataset} initialized for project {self.project_context['project_name']}"], - dataset_name=dataset) - - return { - "dataset": dataset, - "project": self.project_context["project_name"], - "status": "created" - } - except Exception as e: - return {"error": f"Failed to create dataset: {e}"} - - def get_project_context(self) -> Optional[Dict[str, str]]: - """Get current project context""" - return self.project_context - - def is_initialized(self) -> bool: - """Check if Cognee is initialized""" - return self._initialized - - -# Convenience functions for easy integration -async def search_project_codebase(query: str, project_dir: Optional[str] = None, dataset: str = None, search_type: str = "GRAPH_COMPLETION") -> str: - """ - Convenience function to search project codebase - - Args: - query: Search query - project_dir: Project directory (optional, defaults to cwd) - dataset: Specific dataset to search (optional) - search_type: Type of search ("GRAPH_COMPLETION", "INSIGHTS", "CHUNKS") - - Returns: - Formatted search results as string - """ - cognee_integration = CogneeProjectIntegration(project_dir) - result = await cognee_integration.search_knowledge_graph(query, search_type, dataset) - - if "error" in result: - return f"Error searching codebase: {result['error']}" - - project_name = result.get("project", "Unknown") - results = result.get("results", []) - - if not results: - return f"No results found for '{query}' in project {project_name}" - - output = f"Search results for '{query}' in project {project_name}:\n\n" - - # Format results - if isinstance(results, list): - for i, item in enumerate(results, 1): - if isinstance(item, dict): - # Handle structured results - output += f"{i}. 
" - if "search_result" in item: - output += f"Dataset: {item.get('dataset_name', 'Unknown')}\n" - for result_item in item["search_result"]: - if isinstance(result_item, dict): - if "name" in result_item: - output += f" - {result_item['name']}: {result_item.get('description', '')}\n" - elif "text" in result_item: - text = result_item["text"][:200] + "..." if len(result_item["text"]) > 200 else result_item["text"] - output += f" - {text}\n" - else: - output += f" - {str(result_item)[:200]}...\n" - else: - output += f"{str(item)[:200]}...\n" - output += "\n" - else: - output += f"{i}. {str(item)[:200]}...\n\n" - else: - output += f"{str(results)[:500]}..." - - return output - - -async def list_project_knowledge(project_dir: Optional[str] = None) -> str: - """ - Convenience function to list project knowledge - - Args: - project_dir: Project directory (optional, defaults to cwd) - - Returns: - Formatted list of available data - """ - cognee_integration = CogneeProjectIntegration(project_dir) - result = await cognee_integration.list_knowledge_data() - - if "error" in result: - return f"Error listing knowledge: {result['error']}" - - project_name = result.get("project", "Unknown") - data = result.get("available_data", []) - - output = f"Available knowledge in project {project_name}:\n\n" - - if not data: - output += "No data available in knowledge graph" - else: - for i, item in enumerate(data, 1): - output += f"{i}. {item}\n" - - return output diff --git a/ai/src/fuzzforge_ai/cognee_service.py b/ai/src/fuzzforge_ai/cognee_service.py deleted file mode 100644 index ba14a30..0000000 --- a/ai/src/fuzzforge_ai/cognee_service.py +++ /dev/null @@ -1,428 +0,0 @@ -""" -Cognee Service for FuzzForge -Provides integrated Cognee functionality for codebase analysis and knowledge graphs -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. 
-# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import os -import logging -from pathlib import Path -from typing import Dict, List, Any - -logger = logging.getLogger(__name__) - - -class CogneeService: - """ - Service for managing Cognee integration with FuzzForge - Handles multi-tenant isolation and project-specific knowledge graphs - """ - - def __init__(self, config): - """Initialize with FuzzForge config""" - self.config = config - self.cognee_config = config.get_cognee_config() - self.project_context = config.get_project_context() - self._cognee = None - self._user = None - self._initialized = False - - async def initialize(self): - """Initialize Cognee with project-specific configuration""" - try: - # Ensure environment variables for Cognee are set before import - self.config.setup_cognee_environment() - logger.debug( - "Cognee environment configured", - extra={ - "data": self.cognee_config.get("data_directory"), - "system": self.cognee_config.get("system_directory"), - }, - ) - - import cognee - self._cognee = cognee - - # Configure LLM with API key BEFORE any other cognee operations - provider = os.getenv("LLM_PROVIDER", "openai") - model = os.getenv("LLM_MODEL") or os.getenv("LITELLM_MODEL", "gpt-4o-mini") - api_key = os.getenv("COGNEE_API_KEY") or os.getenv("LLM_API_KEY") or os.getenv("OPENAI_API_KEY") - endpoint = os.getenv("LLM_ENDPOINT") - api_version = os.getenv("LLM_API_VERSION") - max_tokens = os.getenv("LLM_MAX_TOKENS") - - if provider.lower() in {"openai", "azure_openai", "custom"} and not api_key: - raise ValueError( - "OpenAI-compatible API key is required for Cognee LLM operations. 
" - "Set OPENAI_API_KEY, LLM_API_KEY, or COGNEE_LLM_API_KEY in your .env" - ) - - # Expose environment variables for downstream libraries - os.environ["LLM_PROVIDER"] = provider - os.environ["LITELLM_MODEL"] = model - os.environ["LLM_MODEL"] = model - if api_key: - os.environ["LLM_API_KEY"] = api_key - # Maintain compatibility with components still expecting OPENAI_API_KEY - if provider.lower() in {"openai", "azure_openai", "custom"}: - os.environ.setdefault("OPENAI_API_KEY", api_key) - if endpoint: - os.environ["LLM_ENDPOINT"] = endpoint - os.environ.setdefault("LLM_API_BASE", endpoint) - os.environ.setdefault("OPENAI_API_BASE", endpoint) - os.environ.setdefault("LITELLM_PROXY_API_BASE", endpoint) - if api_key: - os.environ.setdefault("LITELLM_PROXY_API_KEY", api_key) - if api_version: - os.environ["LLM_API_VERSION"] = api_version - if max_tokens: - os.environ["LLM_MAX_TOKENS"] = str(max_tokens) - - # Configure Cognee's runtime using its configuration helpers when available - embedding_model = os.getenv("LLM_EMBEDDING_MODEL") - embedding_endpoint = os.getenv("LLM_EMBEDDING_ENDPOINT") - if embedding_endpoint: - os.environ.setdefault("LLM_EMBEDDING_API_BASE", embedding_endpoint) - - if hasattr(cognee.config, "set_llm_provider"): - cognee.config.set_llm_provider(provider) - if hasattr(cognee.config, "set_llm_model"): - cognee.config.set_llm_model(model) - if api_key and hasattr(cognee.config, "set_llm_api_key"): - cognee.config.set_llm_api_key(api_key) - if endpoint and hasattr(cognee.config, "set_llm_endpoint"): - cognee.config.set_llm_endpoint(endpoint) - if embedding_model and hasattr(cognee.config, "set_llm_embedding_model"): - cognee.config.set_llm_embedding_model(embedding_model) - if embedding_endpoint and hasattr(cognee.config, "set_llm_embedding_endpoint"): - cognee.config.set_llm_embedding_endpoint(embedding_endpoint) - if api_version and hasattr(cognee.config, "set_llm_api_version"): - cognee.config.set_llm_api_version(api_version) - if max_tokens and 
hasattr(cognee.config, "set_llm_max_tokens"): - cognee.config.set_llm_max_tokens(int(max_tokens)) - - # Configure graph database - cognee.config.set_graph_db_config({ - "graph_database_provider": self.cognee_config.get("graph_database_provider", "kuzu"), - }) - - # Set data directories - data_dir = self.cognee_config.get("data_directory") - system_dir = self.cognee_config.get("system_directory") - - if data_dir: - logger.debug("Setting cognee data root", extra={"path": data_dir}) - cognee.config.data_root_directory(data_dir) - if system_dir: - logger.debug("Setting cognee system root", extra={"path": system_dir}) - cognee.config.system_root_directory(system_dir) - - # Setup multi-tenant user context - await self._setup_user_context() - - self._initialized = True - logger.info(f"Cognee initialized for project {self.project_context['project_name']} " - f"with Kuzu at {system_dir}") - - except ImportError: - logger.error("Cognee not installed. Install with: pip install cognee") - raise - except Exception as e: - logger.error(f"Failed to initialize Cognee: {e}") - raise - - async def create_dataset(self): - """Create dataset for this project if it doesn't exist""" - if not self._initialized: - await self.initialize() - - try: - # Dataset creation is handled automatically by Cognee when adding files - # We just ensure we have the right context set up - dataset_name = f"{self.project_context['project_name']}_codebase" - logger.info(f"Dataset {dataset_name} ready for project {self.project_context['project_name']}") - return dataset_name - except Exception as e: - logger.error(f"Failed to create dataset: {e}") - raise - - async def _setup_user_context(self): - """Setup user context for multi-tenant isolation""" - try: - from cognee.modules.users.methods import create_user, get_user - - # Always try fallback email first to avoid validation issues - fallback_email = f"project_{self.project_context['project_id']}@fuzzforge.example" - user_tenant = 
self.project_context['tenant_id'] - - # Try to get existing fallback user first - try: - self._user = await get_user(fallback_email) - logger.info(f"Using existing user: {fallback_email}") - return - except Exception: - # User doesn't exist, try to create fallback - pass - - # Create fallback user - try: - self._user = await create_user(fallback_email, user_tenant) - logger.info(f"Created fallback user: {fallback_email} for tenant: {user_tenant}") - return - except Exception as fallback_error: - logger.warning(f"Fallback user creation failed: {fallback_error}") - self._user = None - return - - except Exception as e: - logger.warning(f"Could not setup multi-tenant user context: {e}") - logger.info("Proceeding with default context") - self._user = None - - def get_project_dataset_name(self, dataset_suffix: str = "codebase") -> str: - """Get project-specific dataset name""" - return f"{self.project_context['project_name']}_{dataset_suffix}" - - async def ingest_text(self, content: str, dataset: str = "fuzzforge") -> bool: - """Ingest text content into knowledge graph""" - if not self._initialized: - await self.initialize() - - try: - await self._cognee.add([content], dataset) - await self._cognee.cognify([dataset]) - return True - except Exception as e: - logger.error(f"Failed to ingest text: {e}") - return False - - async def ingest_files(self, file_paths: List[Path], dataset: str = "fuzzforge") -> Dict[str, Any]: - """Ingest multiple files into knowledge graph""" - if not self._initialized: - await self.initialize() - - results = { - "success": 0, - "failed": 0, - "errors": [] - } - - try: - ingest_paths: List[str] = [] - for file_path in file_paths: - try: - with open(file_path, 'r', encoding='utf-8'): - ingest_paths.append(str(file_path)) - results["success"] += 1 - except (UnicodeDecodeError, PermissionError) as exc: - results["failed"] += 1 - results["errors"].append(f"{file_path}: {exc}") - logger.warning("Skipping %s: %s", file_path, exc) - - if ingest_paths: 
- await self._cognee.add(ingest_paths, dataset_name=dataset) - await self._cognee.cognify([dataset]) - - except Exception as e: - logger.error(f"Failed to ingest files: {e}") - results["errors"].append(f"Cognify error: {str(e)}") - - return results - - async def search_insights(self, query: str, dataset: str = None) -> List[str]: - """Search for insights in the knowledge graph""" - if not self._initialized: - await self.initialize() - - try: - from cognee.modules.search.types import SearchType - - kwargs = { - "query_type": SearchType.INSIGHTS, - "query_text": query - } - - if dataset: - kwargs["datasets"] = [dataset] - - results = await self._cognee.search(**kwargs) - return results if isinstance(results, list) else [] - - except Exception as e: - logger.error(f"Failed to search insights: {e}") - return [] - - async def search_chunks(self, query: str, dataset: str = None) -> List[str]: - """Search for relevant text chunks""" - if not self._initialized: - await self.initialize() - - try: - from cognee.modules.search.types import SearchType - - kwargs = { - "query_type": SearchType.CHUNKS, - "query_text": query - } - - if dataset: - kwargs["datasets"] = [dataset] - - results = await self._cognee.search(**kwargs) - return results if isinstance(results, list) else [] - - except Exception as e: - logger.error(f"Failed to search chunks: {e}") - return [] - - async def search_graph_completion(self, query: str) -> List[str]: - """Search for graph completion (relationships)""" - if not self._initialized: - await self.initialize() - - try: - from cognee.modules.search.types import SearchType - - results = await self._cognee.search( - query_type=SearchType.GRAPH_COMPLETION, - query_text=query - ) - return results if isinstance(results, list) else [] - - except Exception as e: - logger.error(f"Failed to search graph completion: {e}") - return [] - - async def get_status(self) -> Dict[str, Any]: - """Get service status and statistics""" - status = { - "initialized": 
self._initialized, - "enabled": self.cognee_config.get("enabled", True), - "provider": self.cognee_config.get("graph_database_provider", "kuzu"), - "data_directory": self.cognee_config.get("data_directory"), - "system_directory": self.cognee_config.get("system_directory"), - } - - if self._initialized: - try: - # Check if directories exist and get sizes - data_dir = Path(status["data_directory"]) - system_dir = Path(status["system_directory"]) - - status.update({ - "data_dir_exists": data_dir.exists(), - "system_dir_exists": system_dir.exists(), - "kuzu_db_exists": (system_dir / "kuzu_db").exists(), - "lancedb_exists": (system_dir / "lancedb").exists(), - }) - - except Exception as e: - status["status_error"] = str(e) - - return status - - async def clear_data(self, confirm: bool = False): - """Clear all ingested data (dangerous!)""" - if not confirm: - raise ValueError("Must confirm data clearing with confirm=True") - - if not self._initialized: - await self.initialize() - - try: - await self._cognee.prune.prune_data() - await self._cognee.prune.prune_system(metadata=True) - logger.info("Cognee data cleared") - except Exception as e: - logger.error(f"Failed to clear data: {e}") - raise - - -class FuzzForgeCogneeIntegration: - """ - Main integration class for FuzzForge + Cognee - Provides high-level operations for security analysis - """ - - def __init__(self, config): - self.service = CogneeService(config) - - async def analyze_codebase(self, path: Path, recursive: bool = True) -> Dict[str, Any]: - """ - Analyze a codebase and extract security-relevant insights - """ - # Collect code files - from fuzzforge_ai.ingest_utils import collect_ingest_files - - files = collect_ingest_files(path, recursive, None, []) - - if not files: - return {"error": "No files found to analyze"} - - # Ingest files - results = await self.service.ingest_files(files, "security_analysis") - - if results["success"] == 0: - return {"error": "Failed to ingest any files", "details": results} - 
- # Extract security insights - security_queries = [ - "vulnerabilities security risks", - "authentication authorization", - "input validation sanitization", - "encryption cryptography", - "error handling exceptions", - "logging sensitive data" - ] - - insights = {} - for query in security_queries: - insight_results = await self.service.search_insights(query, "security_analysis") - if insight_results: - insights[query.replace(" ", "_")] = insight_results - - return { - "files_processed": results["success"], - "files_failed": results["failed"], - "errors": results["errors"], - "security_insights": insights - } - - async def query_codebase(self, query: str, search_type: str = "insights") -> List[str]: - """Query the ingested codebase""" - if search_type == "insights": - return await self.service.search_insights(query) - elif search_type == "chunks": - return await self.service.search_chunks(query) - elif search_type == "graph": - return await self.service.search_graph_completion(query) - else: - raise ValueError(f"Unknown search type: {search_type}") - - async def get_project_summary(self) -> Dict[str, Any]: - """Get a summary of the analyzed project""" - # Search for general project insights - summary_queries = [ - "project structure components", - "main functionality features", - "programming languages frameworks", - "dependencies libraries" - ] - - summary = {} - for query in summary_queries: - results = await self.service.search_insights(query) - if results: - summary[query.replace(" ", "_")] = results[:3] # Top 3 results - - return summary diff --git a/ai/src/fuzzforge_ai/config.yaml b/ai/src/fuzzforge_ai/config.yaml deleted file mode 100644 index 133c61d..0000000 --- a/ai/src/fuzzforge_ai/config.yaml +++ /dev/null @@ -1,9 +0,0 @@ -# FuzzForge Registered Agents -# These agents will be automatically registered on startup - -registered_agents: - -# Example entries: -# - name: Calculator -# url: http://localhost:10201 -# description: Mathematical calculations agent 
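The `query_codebase` method removed in the hunk above routes a query string to one of three Cognee search modes ("insights", "chunks", "graph") and raises on anything else. A minimal standalone sketch of that dispatch pattern, with hypothetical stand-in searchers rather than the deleted Cognee service:

```python
import asyncio

# Stand-in async searchers; the real implementations called CogneeService.
async def search_insights(query: str) -> list[str]:
    return [f"insight: {query}"]

async def search_chunks(query: str) -> list[str]:
    return [f"chunk: {query}"]

async def search_graph(query: str) -> list[str]:
    return [f"graph: {query}"]

# Table-driven dispatch avoids the if/elif chain in the original.
_SEARCHERS = {
    "insights": search_insights,
    "chunks": search_chunks,
    "graph": search_graph,
}

async def query_codebase(query: str, search_type: str = "insights") -> list[str]:
    try:
        searcher = _SEARCHERS[search_type]
    except KeyError:
        raise ValueError(f"Unknown search type: {search_type}") from None
    return await searcher(query)

if __name__ == "__main__":
    print(asyncio.run(query_codebase("auth flow", "chunks")))  # → ['chunk: auth flow']
```

A dict lookup keeps the set of supported search types in one place, so adding a fourth mode is a one-line change instead of another `elif` branch.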
diff --git a/ai/src/fuzzforge_ai/config_bridge.py b/ai/src/fuzzforge_ai/config_bridge.py deleted file mode 100644 index 32a7905..0000000 --- a/ai/src/fuzzforge_ai/config_bridge.py +++ /dev/null @@ -1,31 +0,0 @@ -"""Bridge module providing access to the host CLI configuration manager.""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -try: - from fuzzforge_cli.config import ProjectConfigManager as _ProjectConfigManager -except ImportError: # pragma: no cover - used when CLI not available - class _ProjectConfigManager: # type: ignore[no-redef] - """Fallback implementation that raises a helpful error.""" - - def __init__(self, *args, **kwargs): - raise ImportError( - "ProjectConfigManager is unavailable. Install the FuzzForge CLI " - "package or supply a compatible configuration object." - ) - - def __getattr__(name): # pragma: no cover - defensive - raise ImportError("ProjectConfigManager unavailable") - -ProjectConfigManager = _ProjectConfigManager - -__all__ = ["ProjectConfigManager"] diff --git a/ai/src/fuzzforge_ai/config_manager.py b/ai/src/fuzzforge_ai/config_manager.py deleted file mode 100644 index 9aa76ca..0000000 --- a/ai/src/fuzzforge_ai/config_manager.py +++ /dev/null @@ -1,134 +0,0 @@ -""" -Configuration manager for FuzzForge -Handles loading and saving registered agents -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. 
-# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import os -import yaml -from typing import Dict, Any, List - -class ConfigManager: - """Manages FuzzForge agent registry configuration""" - - def __init__(self, config_path: str = None): - """Initialize config manager""" - if config_path: - self.config_path = config_path - else: - # Check for local .fuzzforge/agents.yaml first, then fall back to global - local_config = os.path.join(os.getcwd(), '.fuzzforge', 'agents.yaml') - global_config = os.path.join(os.path.dirname(__file__), 'config.yaml') - - if os.path.exists(local_config): - self.config_path = local_config - if os.getenv("FUZZFORGE_DEBUG", "0") == "1": - print(f"[CONFIG] Using local config: {local_config}") - else: - self.config_path = global_config - if os.getenv("FUZZFORGE_DEBUG", "0") == "1": - print(f"[CONFIG] Using global config: {global_config}") - - self.config = self.load_config() - - def load_config(self) -> Dict[str, Any]: - """Load configuration from YAML file""" - if not os.path.exists(self.config_path): - # Create default config if it doesn't exist - return {'registered_agents': []} - - try: - with open(self.config_path, 'r') as f: - config = yaml.safe_load(f) or {} - # Ensure registered_agents is a list - if 'registered_agents' not in config or config['registered_agents'] is None: - config['registered_agents'] = [] - return config - except Exception as e: - print(f"[WARNING] Failed to load config: {e}") - return {'registered_agents': []} - - def save_config(self): - """Save current configuration to file""" - try: - # Create a clean config with comments - config_content = """# FuzzForge Registered Agents -# These agents will be automatically registered on startup - -""" - # Add the 
agents list - if self.config.get('registered_agents'): - config_content += yaml.dump({'registered_agents': self.config['registered_agents']}, - default_flow_style=False, sort_keys=False) - else: - config_content += "registered_agents: []\n" - - config_content += """ -# Example entries: -# - name: Calculator -# url: http://localhost:10201 -# description: Mathematical calculations agent -""" - - with open(self.config_path, 'w') as f: - f.write(config_content) - - return True - except Exception as e: - print(f"[ERROR] Failed to save config: {e}") - return False - - def get_registered_agents(self) -> List[Dict[str, Any]]: - """Get list of registered agents from config""" - return self.config.get('registered_agents', []) - - def add_registered_agent(self, name: str, url: str, description: str = "") -> bool: - """Add a new registered agent to config""" - if 'registered_agents' not in self.config: - self.config['registered_agents'] = [] - - # Check if agent already exists - for agent in self.config['registered_agents']: - if agent.get('url') == url: - # Update existing agent - agent['name'] = name - agent['description'] = description - return self.save_config() - - # Add new agent - self.config['registered_agents'].append({ - 'name': name, - 'url': url, - 'description': description - }) - - return self.save_config() - - def remove_registered_agent(self, name: str = None, url: str = None) -> bool: - """Remove a registered agent from config""" - if 'registered_agents' not in self.config: - return False - - original_count = len(self.config['registered_agents']) - - # Filter out the agent - self.config['registered_agents'] = [ - agent for agent in self.config['registered_agents'] - if not ((name and agent.get('name') == name) or - (url and agent.get('url') == url)) - ] - - if len(self.config['registered_agents']) < original_count: - return self.save_config() - - return False diff --git a/ai/src/fuzzforge_ai/ingest_utils.py b/ai/src/fuzzforge_ai/ingest_utils.py deleted file 
mode 100644 index ef272d5..0000000 --- a/ai/src/fuzzforge_ai/ingest_utils.py +++ /dev/null @@ -1,104 +0,0 @@ -"""Utilities for collecting files to ingest into Cognee.""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -from __future__ import annotations - -import fnmatch -from pathlib import Path -from typing import Iterable, List, Optional - -_DEFAULT_FILE_TYPES = [ - ".py", - ".js", - ".ts", - ".java", - ".cpp", - ".c", - ".h", - ".rs", - ".go", - ".rb", - ".php", - ".cs", - ".swift", - ".kt", - ".scala", - ".clj", - ".hs", - ".md", - ".txt", - ".yaml", - ".yml", - ".json", - ".toml", - ".cfg", - ".ini", -] - -_DEFAULT_EXCLUDE = [ - "*.pyc", - "__pycache__", - ".git", - ".svn", - ".hg", - "node_modules", - ".venv", - "venv", - ".env", - "dist", - "build", - ".pytest_cache", - ".mypy_cache", - ".tox", - "coverage", - "*.log", - "*.tmp", -] - - -def collect_ingest_files( - path: Path, - recursive: bool = True, - file_types: Optional[Iterable[str]] = None, - exclude: Optional[Iterable[str]] = None, -) -> List[Path]: - """Return a list of files eligible for ingestion.""" - path = path.resolve() - files: List[Path] = [] - - extensions = list(file_types) if file_types else list(_DEFAULT_FILE_TYPES) - exclusions = list(exclude) if exclude else [] - exclusions.extend(_DEFAULT_EXCLUDE) - - def should_exclude(file_path: Path) -> bool: - file_str = str(file_path) - for pattern in exclusions: - if fnmatch.fnmatch(file_str, f"*{pattern}*") or fnmatch.fnmatch(file_path.name, pattern): - return True - return False - - if path.is_file(): - if not 
should_exclude(path) and any(str(path).endswith(ext) for ext in extensions): - files.append(path) - return files - - pattern = "**/*" if recursive else "*" - for file_path in path.glob(pattern): - if file_path.is_file() and not should_exclude(file_path): - if any(str(file_path).endswith(ext) for ext in extensions): - files.append(file_path) - - return files - - -__all__ = ["collect_ingest_files"] diff --git a/ai/src/fuzzforge_ai/memory_service.py b/ai/src/fuzzforge_ai/memory_service.py deleted file mode 100644 index f00b7c3..0000000 --- a/ai/src/fuzzforge_ai/memory_service.py +++ /dev/null @@ -1,244 +0,0 @@ -""" -FuzzForge Memory Service -Implements ADK MemoryService pattern for conversational memory -Separate from Cognee which will be used for RAG/codebase analysis -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- - -import os -from typing import Dict, Any -import logging - -# ADK Memory imports -from google.adk.memory import InMemoryMemoryService, BaseMemoryService -from google.adk.memory.base_memory_service import SearchMemoryResponse - -# Optional VertexAI Memory Bank -try: - from google.adk.memory import VertexAiMemoryBankService - VERTEX_AVAILABLE = True -except ImportError: - VERTEX_AVAILABLE = False - -logger = logging.getLogger(__name__) - - -class FuzzForgeMemoryService: - """ - Manages conversational memory using ADK patterns - This is separate from Cognee which will handle RAG/codebase - """ - - def __init__(self, memory_type: str = "inmemory", **kwargs): - """ - Initialize memory service - - Args: - memory_type: "inmemory" or "vertexai" - **kwargs: Additional args for specific memory service - For vertexai: project, location, agent_engine_id - """ - self.memory_type = memory_type - self.service = self._create_service(memory_type, **kwargs) - - def _create_service(self, memory_type: str, **kwargs) -> BaseMemoryService: - """Create the appropriate memory service""" - - if memory_type == "inmemory": - # Use ADK's InMemoryMemoryService for local development - logger.info("Using InMemory MemoryService for conversational memory") - return InMemoryMemoryService() - - elif memory_type == "vertexai" and VERTEX_AVAILABLE: - # Use VertexAI Memory Bank for production - project = kwargs.get('project') or os.getenv('GOOGLE_CLOUD_PROJECT') - location = kwargs.get('location') or os.getenv('GOOGLE_CLOUD_LOCATION', 'us-central1') - agent_engine_id = kwargs.get('agent_engine_id') or os.getenv('AGENT_ENGINE_ID') - - if not all([project, location, agent_engine_id]): - logger.warning("VertexAI config missing, falling back to InMemory") - return InMemoryMemoryService() - - logger.info(f"Using VertexAI MemoryBank: {agent_engine_id}") - return VertexAiMemoryBankService( - project=project, - location=location, - agent_engine_id=agent_engine_id - ) - else: - # Default to in-memory - 
logger.info("Defaulting to InMemory MemoryService") - return InMemoryMemoryService() - - async def add_session_to_memory(self, session: Any) -> None: - """ - Add a completed session to long-term memory - This extracts meaningful information from the conversation - - Args: - session: The session object to process - """ - try: - # Let the underlying service handle the ingestion - # It will extract relevant information based on the implementation - await self.service.add_session_to_memory(session) - - logger.debug(f"Added session {session.id} to {self.memory_type} memory") - - except Exception as e: - logger.error(f"Failed to add session to memory: {e}") - - async def search_memory(self, - query: str, - app_name: str = "fuzzforge", - user_id: str = None, - max_results: int = 10) -> SearchMemoryResponse: - """ - Search long-term memory for relevant information - - Args: - query: The search query - app_name: Application name for filtering - user_id: User ID for filtering (optional) - max_results: Maximum number of results - - Returns: - SearchMemoryResponse with relevant memories - """ - try: - # Search the memory service - results = await self.service.search_memory( - app_name=app_name, - user_id=user_id, - query=query - ) - - logger.debug(f"Memory search for '{query}' returned {len(results.memories)} results") - return results - - except Exception as e: - logger.error(f"Memory search failed: {e}") - # Return empty results on error - return SearchMemoryResponse(memories=[]) - - async def ingest_completed_sessions(self, session_service) -> int: - """ - Batch ingest all completed sessions into memory - Useful for initial memory population - - Args: - session_service: The session service containing sessions - - Returns: - Number of sessions ingested - """ - ingested = 0 - - try: - # Get all sessions from the session service - sessions = await session_service.list_sessions(app_name="fuzzforge") - - for session_info in sessions: - # Load full session - session = await 
session_service.load_session( - app_name="fuzzforge", - user_id=session_info.get('user_id'), - session_id=session_info.get('id') - ) - - if session and len(session.get_events()) > 0: - await self.add_session_to_memory(session) - ingested += 1 - - logger.info(f"Ingested {ingested} sessions into {self.memory_type} memory") - - except Exception as e: - logger.error(f"Failed to batch ingest sessions: {e}") - - return ingested - - def get_status(self) -> Dict[str, Any]: - """Get memory service status""" - return { - "type": self.memory_type, - "active": self.service is not None, - "vertex_available": VERTEX_AVAILABLE, - "details": { - "inmemory": "Non-persistent, keyword search", - "vertexai": "Persistent, semantic search with LLM extraction" - }.get(self.memory_type, "Unknown") - } - - -class HybridMemoryManager: - """ - Manages both ADK MemoryService (conversational) and Cognee (RAG/codebase) - Provides unified interface for both memory systems - """ - - def __init__(self, - memory_service: FuzzForgeMemoryService = None, - cognee_tools = None): - """ - Initialize with both memory systems - - Args: - memory_service: ADK-pattern memory for conversations - cognee_tools: Cognee MCP tools for RAG/codebase - """ - # ADK memory for conversations - self.memory_service = memory_service or FuzzForgeMemoryService() - - # Cognee for knowledge graphs and RAG (future) - self.cognee_tools = cognee_tools - - async def search_conversational_memory(self, query: str) -> SearchMemoryResponse: - """Search past conversations using ADK memory""" - return await self.memory_service.search_memory(query) - - async def search_knowledge_graph(self, query: str, search_type: str = "GRAPH_COMPLETION"): - """Search Cognee knowledge graph (for RAG/codebase in future)""" - if not self.cognee_tools: - return None - - try: - # Use Cognee's graph search - return await self.cognee_tools.search( - query=query, - search_type=search_type - ) - except Exception as e: - logger.debug(f"Cognee search failed: 
{e}") - return None - - async def store_in_graph(self, content: str): - """Store in Cognee knowledge graph (for codebase analysis later)""" - if not self.cognee_tools: - return None - - try: - # Use cognify to create graph structures - return await self.cognee_tools.cognify(content) - except Exception as e: - logger.debug(f"Cognee store failed: {e}") - return None - - def get_status(self) -> Dict[str, Any]: - """Get status of both memory systems""" - return { - "conversational_memory": self.memory_service.get_status(), - "knowledge_graph": { - "active": self.cognee_tools is not None, - "purpose": "RAG/codebase analysis (future)" - } - } \ No newline at end of file diff --git a/ai/src/fuzzforge_ai/remote_agent.py b/ai/src/fuzzforge_ai/remote_agent.py deleted file mode 100644 index 979fffa..0000000 --- a/ai/src/fuzzforge_ai/remote_agent.py +++ /dev/null @@ -1,171 +0,0 @@ -""" -Remote Agent Connection Handler -Handles A2A protocol communication with remote agents -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- - -import httpx -import uuid -from typing import Dict, Any, Optional, List - - -class RemoteAgentConnection: - """Handles A2A protocol communication with remote agents""" - - def __init__(self, url: str): - """Initialize connection to a remote agent""" - self.url = url.rstrip('/') - self.agent_card = None - self.client = httpx.AsyncClient(timeout=120.0, follow_redirects=True) - self.context_id = None - - async def get_agent_card(self) -> Optional[Dict[str, Any]]: - """Get the agent card from the remote agent""" - # If URL already points to a .json file, fetch it directly - if self.url.endswith('.json'): - try: - response = await self.client.get(self.url) - response.raise_for_status() - self.agent_card = response.json() - - # Use canonical URL from agent card if provided - if isinstance(self.agent_card, dict) and "url" in self.agent_card: - self.url = self.agent_card["url"].rstrip('/') - - return self.agent_card - except Exception as e: - print(f"Failed to get agent card from {self.url}: {e}") - return None - - # Try both agent-card.json (A2A 0.3.0+) and agent.json (legacy) - well_known_paths = [ - "/.well-known/agent-card.json", - "/.well-known/agent.json", - ] - - for path in well_known_paths: - try: - response = await self.client.get(f"{self.url}{path}") - response.raise_for_status() - self.agent_card = response.json() - - # Use canonical URL from agent card if provided - if isinstance(self.agent_card, dict) and "url" in self.agent_card: - self.url = self.agent_card["url"].rstrip('/') - - return self.agent_card - except Exception: - continue - - print(f"Failed to get agent card from {self.url}") - print("Tip: If agent is at /a2a/something, use full URL: /register http://host:port/a2a/something") - return None - - async def send_message(self, message: str | Dict[str, Any] | List[Dict[str, Any]]) -> str: - """Send a message to the remote agent using A2A protocol""" - try: - parts: List[Dict[str, Any]] - metadata: Dict[str, Any] | None = None - if 
isinstance(message, dict): - metadata = message.get("metadata") if isinstance(message.get("metadata"), dict) else None - raw_parts = message.get("parts", []) - if not raw_parts: - text_value = message.get("text") or message.get("message") - if isinstance(text_value, str): - raw_parts = [{"type": "text", "text": text_value}] - parts = [raw_part for raw_part in raw_parts if isinstance(raw_part, dict)] - elif isinstance(message, list): - parts = [part for part in message if isinstance(part, dict)] - metadata = None - else: - parts = [{"type": "text", "text": message}] - metadata = None - - if not parts: - parts = [{"type": "text", "text": ""}] - - # Build JSON-RPC request per A2A spec - payload = { - "jsonrpc": "2.0", - "method": "message/send", - "params": { - "message": { - "messageId": str(uuid.uuid4()), - "role": "user", - "parts": parts, - } - }, - "id": 1 - } - - if metadata: - payload["params"]["message"]["metadata"] = metadata - - # Include context if we have one - if self.context_id: - payload["params"]["contextId"] = self.context_id - - # Send to root endpoint per A2A protocol - response = await self.client.post(self.url, json=payload) - response.raise_for_status() - result = response.json() - - # Extract response based on A2A JSON-RPC format - if isinstance(result, dict): - # Update context for continuity - if "result" in result and isinstance(result["result"], dict): - if "contextId" in result["result"]: - self.context_id = result["result"]["contextId"] - - # Extract text from artifacts - if "artifacts" in result["result"]: - texts = [] - for artifact in result["result"]["artifacts"]: - if isinstance(artifact, dict) and "parts" in artifact: - for part in artifact["parts"]: - if isinstance(part, dict) and "text" in part: - texts.append(part["text"]) - if texts: - return " ".join(texts) - - # Extract from message format - if "message" in result["result"]: - msg = result["result"]["message"] - if isinstance(msg, dict) and "parts" in msg: - texts = [] - for 
part in msg["parts"]: - if isinstance(part, dict) and "text" in part: - texts.append(part["text"]) - return " ".join(texts) if texts else str(msg) - return str(msg) - - return str(result["result"]) - - # Handle error response - elif "error" in result: - error = result["error"] - if isinstance(error, dict): - return f"Error: {error.get('message', str(error))}" - return f"Error: {error}" - - # Fallback - return result.get("response", result.get("message", str(result))) - - return str(result) - - except Exception as e: - return f"Error communicating with agent: {e}" - - async def close(self): - """Close the connection properly""" - await self.client.aclose() diff --git a/backend/Dockerfile b/backend/Dockerfile deleted file mode 100644 index 7a49c84..0000000 --- a/backend/Dockerfile +++ /dev/null @@ -1,37 +0,0 @@ -FROM python:3.11-slim - -WORKDIR /app - -# Install system dependencies including Docker client and rsync -RUN apt-get update && apt-get install -y \ - curl \ - ca-certificates \ - gnupg \ - lsb-release \ - rsync \ - && curl -fsSL https://download.docker.com/linux/debian/gpg | gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg \ - && echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/debian $(lsb_release -cs) stable" | tee /etc/apt/sources.list.d/docker.list > /dev/null \ - && apt-get update \ - && apt-get install -y docker-ce-cli \ - && rm -rf /var/lib/apt/lists/* - -# Docker client configuration removed - localhost:5001 doesn't require insecure registry config - -# Copy project files -COPY pyproject.toml ./ - -# Install dependencies with pip -RUN pip install --no-cache-dir -e . - -# Copy source code -COPY . . 
- -# Expose ports (API on 8000, MCP on 8010) -EXPOSE 8000 8010 - -# Health check -HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \ - CMD curl -f http://localhost:8000/health || exit 1 - -# Start the application -CMD ["uvicorn", "src.main:app", "--host", "0.0.0.0", "--port", "8000"] \ No newline at end of file diff --git a/backend/README.md b/backend/README.md deleted file mode 100644 index 74589b0..0000000 --- a/backend/README.md +++ /dev/null @@ -1,316 +0,0 @@ -# FuzzForge Backend - -A stateless API server for security testing workflow orchestration using Temporal. This system dynamically discovers workflows, executes them in isolated worker environments, and returns findings in SARIF format. - -## Architecture Overview - -### Core Components - -1. **Workflow Discovery System**: Automatically discovers workflows at startup -2. **Module System**: Reusable components (scanner, analyzer, reporter) with a common interface -3. **Temporal Integration**: Handles workflow orchestration, execution, and monitoring with vertical workers -4. **File Upload & Storage**: HTTP multipart upload to MinIO for target files -5. 
**SARIF Output**: Standardized security findings format - -### Key Features - -- **Stateless**: No persistent data, fully scalable -- **Generic**: No hardcoded workflows, automatic discovery -- **Isolated**: Each workflow runs in specialized vertical workers -- **Extensible**: Easy to add new workflows and modules -- **Secure**: File upload with MinIO storage, automatic cleanup via lifecycle policies -- **Observable**: Comprehensive logging and status tracking - -## Quick Start - -### Prerequisites - -- Docker and Docker Compose - -### Installation - -From the project root, start all services: - -```bash -docker-compose -f docker-compose.temporal.yaml up -d -``` - -This will start: -- Temporal server (Web UI at http://localhost:8233, gRPC at :7233) -- MinIO (S3 storage at http://localhost:9000, Console at http://localhost:9001) -- PostgreSQL database (for Temporal state) -- Vertical workers (worker-rust, worker-android, worker-web, etc.) -- FuzzForge backend API (port 8000) - -**Note**: MinIO console login: `fuzzforge` / `fuzzforge123` - -## API Endpoints - -### Workflows - -- `GET /workflows` - List all discovered workflows -- `GET /workflows/{name}/metadata` - Get workflow metadata and parameters -- `GET /workflows/{name}/parameters` - Get workflow parameter schema -- `GET /workflows/metadata/schema` - Get metadata.yaml schema -- `POST /workflows/{name}/submit` - Submit a workflow for execution (path-based, legacy) -- `POST /workflows/{name}/upload-and-submit` - **Upload local files and submit workflow** (recommended) - -### Runs - -- `GET /runs/{run_id}/status` - Get run status -- `GET /runs/{run_id}/findings` - Get SARIF findings from completed run -- `GET /runs/{workflow_name}/findings/{run_id}` - Alternative findings endpoint with workflow name - -## Workflow Structure - -Each workflow must have: - -``` -toolbox/workflows/{workflow_name}/ - workflow.py # Temporal workflow definition - metadata.yaml # Mandatory metadata (parameters, version, vertical, etc.) 
- requirements.txt # Optional Python dependencies (installed in vertical worker) -``` - -**Note**: With Temporal architecture, workflows run in pre-built vertical workers (e.g., `worker-rust`, `worker-android`), not individual Docker containers. The workflow code is mounted as a volume and discovered at runtime. - -### Example metadata.yaml - -```yaml -name: security_assessment -version: "1.0.0" -description: "Comprehensive security analysis workflow" -author: "FuzzForge Team" -category: "comprehensive" -vertical: "rust" # Routes to worker-rust -tags: - - "security" - - "analysis" - - "comprehensive" - -requirements: - tools: - - "file_scanner" - - "security_analyzer" - - "sarif_reporter" - resources: - memory: "512Mi" - cpu: "500m" - timeout: 1800 - -has_docker: true - -parameters: - type: object - properties: - target_path: - type: string - default: "/workspace" - description: "Path to analyze" - scanner_config: - type: object - description: "Scanner configuration" - properties: - max_file_size: - type: integer - description: "Maximum file size to scan (bytes)" - -output_schema: - type: object - properties: - sarif: - type: object - description: "SARIF-formatted security findings" - summary: - type: object - description: "Scan execution summary" -``` - -### Metadata Field Descriptions - -- **name**: Workflow identifier (must match directory name) -- **version**: Semantic version (x.y.z format) -- **description**: Human-readable description of the workflow -- **author**: Workflow author/maintainer -- **category**: Workflow category (comprehensive, specialized, fuzzing, focused) -- **tags**: Array of descriptive tags for categorization -- **requirements.tools**: Required security tools that the workflow uses -- **requirements.resources**: Resource requirements enforced at runtime: - - `memory`: Memory limit (e.g., "512Mi", "1Gi") - - `cpu`: CPU limit (e.g., "500m" for 0.5 cores, "1" for 1 core) - - `timeout`: Maximum execution time in seconds -- **parameters**: 
JSON Schema object defining workflow parameters -- **output_schema**: Expected output format (typically SARIF) - -### Resource Requirements - -Resource requirements defined in workflow metadata are automatically enforced. Users can override defaults when submitting workflows: - -```bash -curl -X POST "http://localhost:8000/workflows/security_assessment/submit" \ - -H "Content-Type: application/json" \ - -d '{ - "target_path": "/tmp/project", - "resource_limits": { - "memory_limit": "1Gi", - "cpu_limit": "1" - } - }' -``` - -Resource precedence: User limits > Workflow requirements > System defaults - -## File Upload and Target Access - -### Upload Endpoint - -The backend provides an upload endpoint for submitting workflows with local files: - -``` -POST /workflows/{workflow_name}/upload-and-submit -Content-Type: multipart/form-data - -Parameters: - file: File upload (supports .tar.gz for directories) - parameters: JSON string of workflow parameters (optional) - timeout: Execution timeout in seconds (optional) -``` - -Example using curl: - -```bash -# Upload a directory (create tarball first) -tar -czf project.tar.gz /path/to/project -curl -X POST "http://localhost:8000/workflows/security_assessment/upload-and-submit" \ - -F "file=@project.tar.gz" \ - -F "parameters={\"check_secrets\":true}" - -# Upload a single file -curl -X POST "http://localhost:8000/workflows/security_assessment/upload-and-submit" \ - -F "file=@binary.elf" -``` - -### Storage Flow - -1. **CLI/API uploads file** via HTTP multipart -2. **Backend receives file** and streams to temporary location (max 10GB) -3. **Backend uploads to MinIO** with generated `target_id` -4. **Workflow is submitted** to Temporal with `target_id` -5. **Worker downloads target** from MinIO to local cache -6. **Workflow processes target** from cache -7. 
**MinIO lifecycle policy** deletes files after 7 days - -### Advantages - -- **No host filesystem access required** - workers can run anywhere -- **Automatic cleanup** - lifecycle policies prevent disk exhaustion -- **Caching** - repeated workflows reuse cached targets -- **Multi-host ready** - targets accessible from any worker -- **Secure** - isolated storage, no arbitrary host path access - -## Module Development - -Modules implement the `BaseModule` interface: - -```python -from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult - -class MyModule(BaseModule): - def get_metadata(self) -> ModuleMetadata: - return ModuleMetadata( - name="my_module", - version="1.0.0", - description="Module description", - category="scanner", - ... - ) - - async def execute(self, config: Dict, workspace: Path) -> ModuleResult: - # Module logic here - findings = [...] - return self.create_result(findings=findings) - - def validate_config(self, config: Dict) -> bool: - # Validate configuration - return True -``` - -## Submitting a Workflow - -### With File Upload (Recommended) - -```bash -# Automatic tarball and upload -tar -czf project.tar.gz /home/user/project -curl -X POST "http://localhost:8000/workflows/security_assessment/upload-and-submit" \ - -F "file=@project.tar.gz" \ - -F "parameters={\"scanner_config\":{\"patterns\":[\"*.py\"]},\"analyzer_config\":{\"check_secrets\":true}}" -``` - -### Legacy Path-Based Submission - -```bash -# Only works if backend and target are on same machine -curl -X POST "http://localhost:8000/workflows/security_assessment/submit" \ - -H "Content-Type: application/json" \ - -d '{ - "target_path": "/home/user/project", - "parameters": { - "scanner_config": {"patterns": ["*.py"]}, - "analyzer_config": {"check_secrets": true} - } - }' -``` - -## Getting Findings - -```bash -curl "http://localhost:8000/runs/{run_id}/findings" -``` - -Returns SARIF-formatted findings: - -```json -{ - "workflow": "security_assessment", - "run_id": 
"abc-123", - "sarif": { - "version": "2.1.0", - "runs": [{ - "tool": {...}, - "results": [...] - }] - } -} -``` - -## Security Considerations - -1. **File Upload Security**: Files uploaded to MinIO with isolated storage -2. **Read-Only Default**: Target files accessed as read-only unless explicitly set -3. **Worker Isolation**: Each workflow runs in isolated vertical workers -4. **Resource Limits**: Can set CPU/memory limits per worker -5. **Automatic Cleanup**: MinIO lifecycle policies delete old files after 7 days - -## Development - -### Adding a New Workflow - -1. Create directory: `toolbox/workflows/my_workflow/` -2. Add `workflow.py` with a Temporal workflow (using `@workflow.defn`) -3. Add mandatory `metadata.yaml` with `vertical` field -4. Restart the appropriate worker: `docker-compose -f docker-compose.temporal.yaml restart worker-rust` -5. Worker will automatically discover and register the new workflow - -### Adding a New Module - -1. Create module in `toolbox/modules/{category}/` -2. Implement `BaseModule` interface -3. Use in workflows via import - -### Adding a New Vertical Worker - -1. Create worker directory: `workers/{vertical}/` -2. Create `Dockerfile` with required tools -3. Add worker to `docker-compose.temporal.yaml` -4. Worker will automatically discover workflows with matching `vertical` in metadata \ No newline at end of file diff --git a/backend/benchmarks/README.md b/backend/benchmarks/README.md deleted file mode 100644 index fc29286..0000000 --- a/backend/benchmarks/README.md +++ /dev/null @@ -1,184 +0,0 @@ -# FuzzForge Benchmark Suite - -Performance benchmarking infrastructure organized by module category. 
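The accuracy-oriented categories below are scored with standard precision/recall/F1 over (filename, line) pairs, matching the `calculate_metrics` helper used by the secret-detection comparison benchmark. As a simplified, self-contained sketch of that scoring (example data here is hypothetical, not from the ground-truth dataset):

```python
# Illustrative sketch of the precision/recall/F1 scoring used by the
# accuracy benchmarks -- simplified relative to the suite's calculate_metrics.
# Detections and ground truth are compared as (filename, line) pairs.

def score(expected: set[tuple[str, int]], detected: set[tuple[str, int]]) -> dict[str, float]:
    tp = len(expected & detected)   # real secrets that were found
    fp = len(detected - expected)   # detections with no ground-truth match
    fn = len(expected - detected)   # real secrets that were missed

    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}


if __name__ == "__main__":
    # Hypothetical example: 3 documented secrets, 3 detections, 2 correct.
    truth = {("config.py", 10), ("keys.env", 3), ("app.py", 42)}
    found = {("config.py", 10), ("keys.env", 3), ("README.md", 1)}
    print(score(truth, found))  # precision = recall = f1 = 2/3
```

A tool is then judged against the category thresholds (e.g. minimum 90% precision, 95% recall for secret detection) by asserting on these values, as the benchmark tests do.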
- -## Directory Structure - -``` -benchmarks/ -ā”œā”€ā”€ conftest.py # Benchmark fixtures -ā”œā”€ā”€ category_configs.py # Category-specific thresholds -ā”œā”€ā”€ by_category/ # Benchmarks organized by category -│ ā”œā”€ā”€ fuzzer/ -│ │ ā”œā”€ā”€ bench_cargo_fuzz.py -│ │ └── bench_atheris.py -│ ā”œā”€ā”€ scanner/ -│ │ └── bench_file_scanner.py -│ ā”œā”€ā”€ secret_detection/ -│ │ ā”œā”€ā”€ bench_gitleaks.py -│ │ └── bench_trufflehog.py -│ └── analyzer/ -│ └── bench_security_analyzer.py -ā”œā”€ā”€ fixtures/ # Benchmark test data -│ ā”œā”€ā”€ small/ # ~1K LOC -│ ā”œā”€ā”€ medium/ # ~10K LOC -│ └── large/ # ~100K LOC -└── results/ # Benchmark results (JSON) -``` - -## Module Categories - -### Fuzzer -**Expected Metrics**: execs/sec, coverage_rate, time_to_crash, memory_usage - -**Performance Thresholds**: -- Min 1000 execs/sec -- Max 10s for small projects -- Max 2GB memory - -### Scanner -**Expected Metrics**: files/sec, LOC/sec, findings_count - -**Performance Thresholds**: -- Min 100 files/sec -- Min 10K LOC/sec -- Max 512MB memory - -### Secret Detection -**Expected Metrics**: patterns/sec, precision, recall, F1 - -**Performance Thresholds**: -- Min 90% precision -- Min 95% recall -- Max 5 false positives per 100 secrets - -### Analyzer -**Expected Metrics**: analysis_depth, files/sec, accuracy - -**Performance Thresholds**: -- Min 10 files/sec (deep analysis) -- Min 85% accuracy -- Max 2GB memory - -## Running Benchmarks - -### All Benchmarks -```bash -cd backend -pytest benchmarks/ --benchmark-only -v -``` - -### Specific Category -```bash -pytest benchmarks/by_category/fuzzer/ --benchmark-only -v -``` - -### With Comparison -```bash -# Run and save baseline -pytest benchmarks/ --benchmark-only --benchmark-save=baseline - -# Compare against baseline -pytest benchmarks/ --benchmark-only --benchmark-compare=baseline -``` - -### Generate Histogram -```bash -pytest benchmarks/ --benchmark-only --benchmark-histogram=histogram -``` - -## Benchmark Results - -Results 
are saved as JSON and include: -- Mean execution time -- Standard deviation -- Min/Max values -- Iterations per second -- Memory usage - -Example output: -``` ------------------------- benchmark: fuzzer -------------------------- -Name Mean StdDev Ops/Sec -bench_cargo_fuzz[discovery] 0.0012s 0.0001s 833.33 -bench_cargo_fuzz[execution] 0.1250s 0.0050s 8.00 -bench_cargo_fuzz[memory] 0.0100s 0.0005s 100.00 ---------------------------------------------------------------------- -``` - -## CI/CD Integration - -Benchmarks run: -- **Nightly**: Full benchmark suite, track trends -- **On PR**: When benchmarks/ or modules/ changed -- **Manual**: Via workflow_dispatch - -### Regression Detection - -Benchmarks automatically fail if: -- Performance degrades >10% -- Memory usage exceeds thresholds -- Throughput drops below minimum - -See `.github/workflows/benchmark.yml` for configuration. - -## Adding New Benchmarks - -### 1. Create benchmark file in category directory -```python -# benchmarks/by_category/fuzzer/bench_new_fuzzer.py - -import pytest -from benchmarks.category_configs import ModuleCategory, get_threshold - -@pytest.mark.benchmark(group="fuzzer") -def test_execution_performance(benchmark, new_fuzzer, test_workspace): - """Benchmark execution speed""" - result = benchmark(new_fuzzer.execute, config, test_workspace) - - # Validate against threshold - threshold = get_threshold(ModuleCategory.FUZZER, "max_execution_time_small") - assert result.execution_time < threshold -``` - -### 2. Update category_configs.py if needed -Add new thresholds or metrics for your module. - -### 3. Run locally -```bash -pytest benchmarks/by_category/fuzzer/bench_new_fuzzer.py --benchmark-only -v -``` - -## Best Practices - -1. **Use mocking** for external dependencies (network, disk I/O) -2. **Fixed iterations** for consistent benchmarking -3. **Warm-up runs** for JIT-compiled code -4. **Category-specific metrics** aligned with module purpose -5. 
**Realistic fixtures** that represent actual use cases -6. **Memory profiling** using tracemalloc -7. **Compare apples to apples** within the same category - -## Interpreting Results - -### Good Performance -- āœ… Execution time below threshold -- āœ… Memory usage within limits -- āœ… Throughput meets minimum -- āœ… <5% variance across runs - -### Performance Issues -- āš ļø Execution time 10-20% over threshold -- āŒ Execution time >20% over threshold -- āŒ Memory leaks (increasing over iterations) -- āŒ High variance (>10%) indicates instability - -## Tracking Performance Over Time - -Benchmark results are stored as artifacts with: -- Commit SHA -- Timestamp -- Environment details (Python version, OS) -- Full metrics - -Use these to track long-term performance trends and detect gradual degradation. diff --git a/backend/benchmarks/by_category/fuzzer/bench_cargo_fuzz.py b/backend/benchmarks/by_category/fuzzer/bench_cargo_fuzz.py deleted file mode 100644 index 0cd97ca..0000000 --- a/backend/benchmarks/by_category/fuzzer/bench_cargo_fuzz.py +++ /dev/null @@ -1,221 +0,0 @@ -""" -Benchmarks for CargoFuzzer module - -Tests performance characteristics of Rust fuzzing: -- Execution throughput (execs/sec) -- Coverage rate -- Memory efficiency -- Time to first crash -""" - -import pytest -import asyncio -from pathlib import Path -from unittest.mock import AsyncMock, patch -import sys - -sys.path.insert(0, str(Path(__file__).resolve().parents[3] / "toolbox")) - -from modules.fuzzer.cargo_fuzzer import CargoFuzzer -from benchmarks.category_configs import ModuleCategory, get_threshold - - -@pytest.fixture -def cargo_fuzzer(): - """Create CargoFuzzer instance for benchmarking""" - return CargoFuzzer() - - -@pytest.fixture -def benchmark_config(): - """Benchmark-optimized configuration""" - return { - "target_name": None, - "max_iterations": 10000, # Fixed iterations for consistent benchmarking - "timeout_seconds": 30, - "sanitizer": "address" - } - - -@pytest.fixture -def 
mock_rust_workspace(tmp_path): - """Create a minimal Rust workspace for benchmarking""" - workspace = tmp_path / "rust_project" - workspace.mkdir() - - # Cargo.toml - (workspace / "Cargo.toml").write_text("""[package] -name = "bench_project" -version = "0.1.0" -edition = "2021" -""") - - # src/lib.rs - src = workspace / "src" - src.mkdir() - (src / "lib.rs").write_text(""" -pub fn benchmark_function(data: &[u8]) -> Vec { - data.to_vec() -} -""") - - # fuzz structure - fuzz = workspace / "fuzz" - fuzz.mkdir() - (fuzz / "Cargo.toml").write_text("""[package] -name = "bench_project-fuzz" -version = "0.0.0" -edition = "2021" - -[dependencies] -libfuzzer-sys = "0.4" - -[dependencies.bench_project] -path = ".." - -[[bin]] -name = "fuzz_target_1" -path = "fuzz_targets/fuzz_target_1.rs" -""") - - targets = fuzz / "fuzz_targets" - targets.mkdir() - (targets / "fuzz_target_1.rs").write_text("""#![no_main] -use libfuzzer_sys::fuzz_target; -use bench_project::benchmark_function; - -fuzz_target!(|data: &[u8]| { - let _ = benchmark_function(data); -}); -""") - - return workspace - - -class TestCargoFuzzerPerformance: - """Benchmark CargoFuzzer performance metrics""" - - @pytest.mark.benchmark(group="fuzzer") - def test_target_discovery_performance(self, benchmark, cargo_fuzzer, mock_rust_workspace): - """Benchmark fuzz target discovery speed""" - def discover(): - return asyncio.run(cargo_fuzzer._discover_fuzz_targets(mock_rust_workspace)) - - result = benchmark(discover) - assert len(result) > 0 - - @pytest.mark.benchmark(group="fuzzer") - def test_config_validation_performance(self, benchmark, cargo_fuzzer, benchmark_config): - """Benchmark configuration validation speed""" - result = benchmark(cargo_fuzzer.validate_config, benchmark_config) - assert result is True - - @pytest.mark.benchmark(group="fuzzer") - def test_module_initialization_performance(self, benchmark): - """Benchmark module instantiation time""" - def init_module(): - return CargoFuzzer() - - module = 
benchmark(init_module) - assert module is not None - - -class TestCargoFuzzerThroughput: - """Benchmark execution throughput""" - - @pytest.mark.benchmark(group="fuzzer") - def test_execution_throughput(self, benchmark, cargo_fuzzer, mock_rust_workspace, benchmark_config): - """Benchmark fuzzing execution throughput""" - - # Mock actual fuzzing to focus on orchestration overhead - async def mock_run(workspace, target, config, callback): - # Simulate 10K execs at 1000 execs/sec - if callback: - await callback({ - "total_execs": 10000, - "execs_per_sec": 1000.0, - "crashes": 0, - "coverage": 50, - "corpus_size": 10, - "elapsed_time": 10 - }) - return [], {"total_executions": 10000, "execution_time": 10.0} - - with patch.object(cargo_fuzzer, '_build_fuzz_target', new_callable=AsyncMock, return_value=True): - with patch.object(cargo_fuzzer, '_run_fuzzing', side_effect=mock_run): - with patch.object(cargo_fuzzer, '_parse_crash_artifacts', new_callable=AsyncMock, return_value=[]): - def run_fuzzer(): - # Run in new event loop - loop = asyncio.new_event_loop() - try: - return loop.run_until_complete( - cargo_fuzzer.execute(benchmark_config, mock_rust_workspace) - ) - finally: - loop.close() - - result = benchmark(run_fuzzer) - assert result.status == "success" - - # Verify performance threshold - threshold = get_threshold(ModuleCategory.FUZZER, "max_execution_time_small") - assert result.execution_time < threshold, \ - f"Execution time {result.execution_time}s exceeds threshold {threshold}s" - - -class TestCargoFuzzerMemory: - """Benchmark memory efficiency""" - - @pytest.mark.benchmark(group="fuzzer") - def test_memory_overhead(self, benchmark, cargo_fuzzer, mock_rust_workspace, benchmark_config): - """Benchmark memory usage during execution""" - import tracemalloc - - def measure_memory(): - tracemalloc.start() - - # Simulate operations - cargo_fuzzer.validate_config(benchmark_config) - asyncio.run(cargo_fuzzer._discover_fuzz_targets(mock_rust_workspace)) - - current, 
peak = tracemalloc.get_traced_memory() - tracemalloc.stop() - - return peak / 1024 / 1024 # Convert to MB - - peak_mb = benchmark(measure_memory) - - # Check against threshold - max_memory = get_threshold(ModuleCategory.FUZZER, "max_memory_mb") - assert peak_mb < max_memory, \ - f"Peak memory {peak_mb:.2f}MB exceeds threshold {max_memory}MB" - - -class TestCargoFuzzerScalability: - """Benchmark scalability characteristics""" - - @pytest.mark.benchmark(group="fuzzer") - def test_multiple_target_discovery(self, benchmark, cargo_fuzzer, tmp_path): - """Benchmark discovery with multiple targets""" - workspace = tmp_path / "multi_target" - workspace.mkdir() - - # Create workspace with 10 fuzz targets - (workspace / "Cargo.toml").write_text("[package]\nname = \"test\"\nversion = \"0.1.0\"\nedition = \"2021\"") - src = workspace / "src" - src.mkdir() - (src / "lib.rs").write_text("pub fn test() {}") - - fuzz = workspace / "fuzz" - fuzz.mkdir() - targets = fuzz / "fuzz_targets" - targets.mkdir() - - for i in range(10): - (targets / f"fuzz_target_{i}.rs").write_text("// Target") - - def discover(): - return asyncio.run(cargo_fuzzer._discover_fuzz_targets(workspace)) - - result = benchmark(discover) - assert len(result) == 10 diff --git a/backend/benchmarks/by_category/secret_detection/README.md b/backend/benchmarks/by_category/secret_detection/README.md deleted file mode 100644 index 9fd437d..0000000 --- a/backend/benchmarks/by_category/secret_detection/README.md +++ /dev/null @@ -1,240 +0,0 @@ -# Secret Detection Benchmarks - -Comprehensive benchmarking suite comparing secret detection tools via complete workflow execution: -- **Gitleaks** - Fast pattern-based detection -- **TruffleHog** - Entropy analysis with verification -- **LLM Detector** - AI-powered semantic analysis (gpt-4o-mini, gpt-5-mini) - -## Quick Start - -### Run All Comparisons - -```bash -cd backend -python benchmarks/by_category/secret_detection/compare_tools.py -``` - -This will run all workflows on 
`test_projects/secret_detection_benchmark/` and generate comparison reports. - -### Run Benchmark Tests - -```bash -# All benchmarks (Gitleaks, TruffleHog, LLM with 3 models) -pytest benchmarks/by_category/secret_detection/bench_comparison.py --benchmark-only -v - -# Specific tool only -pytest benchmarks/by_category/secret_detection/bench_comparison.py::TestSecretDetectionComparison::test_gitleaks_workflow --benchmark-only -v - -# Performance tests only -pytest benchmarks/by_category/secret_detection/bench_comparison.py::TestSecretDetectionPerformance --benchmark-only -v -``` - -## Ground Truth Dataset - -**Controlled Benchmark** (`test_projects/secret_detection_benchmark/`) - -**Exactly 32 documented secrets** for accurate precision/recall testing: -- **12 Easy**: Standard patterns (AWS keys, GitHub PATs, Stripe keys, SSH keys) -- **10 Medium**: Obfuscated (Base64, hex, concatenated, in comments, Unicode) -- **10 Hard**: Well hidden (ROT13, binary, XOR, reversed, template strings, regex patterns) - -All secrets documented in `secret_detection_benchmark_GROUND_TRUTH.json` with exact file paths and line numbers. - -See `test_projects/secret_detection_benchmark/README.md` for details. - -## Metrics Measured - -### Accuracy Metrics -- **Precision**: TP / (TP + FP) - How many detected secrets are real? -- **Recall**: TP / (TP + FN) - How many real secrets were found? 
-- **F1 Score**: Harmonic mean of precision and recall -- **False Positive Rate**: FP / Total Detected - -### Performance Metrics -- **Execution Time**: Total time to scan all files -- **Throughput**: Files/secrets scanned per second -- **Memory Usage**: Peak memory during execution - -### Thresholds (from `category_configs.py`) -- Minimum Precision: 90% -- Minimum Recall: 95% -- Max Execution Time (small): 2.0s -- Max False Positives: 5 per 100 secrets - -## Tool Comparison - -### Gitleaks -**Strengths:** -- Fastest execution -- Git-aware (commit history scanning) -- Low false positive rate -- No API required -- Works offline - -**Weaknesses:** -- Pattern-based only -- May miss obfuscated secrets -- Limited to known patterns - -### TruffleHog -**Strengths:** -- Secret verification (validates if active) -- High detection rate with entropy analysis -- Multiple detectors (600+ secret types) -- Catches high-entropy strings - -**Weaknesses:** -- Slower than Gitleaks -- Higher false positive rate -- Verification requires network calls - -### LLM Detector -**Strengths:** -- Semantic understanding of context -- Catches novel/custom secret patterns -- Can reason about what "looks like" a secret -- Multiple model options (GPT-4, Claude, etc.) 
-- Understands code context - -**Weaknesses:** -- Slowest (API latency + LLM processing) -- Most expensive (LLM API costs) -- Requires A2A agent infrastructure -- Accuracy varies by model -- May miss well-disguised secrets - -## Results Directory - -After running comparisons, results are saved to: -``` -benchmarks/by_category/secret_detection/results/ -ā”œā”€ā”€ comparison_report.md # Human-readable comparison with: -│ # - Summary table with secrets/files/avg per file/time -│ # - Agreement analysis (secrets found by N tools) -│ # - Tool agreement matrix (overlap between pairs) -│ # - Per-file detailed comparison table -│ # - File type breakdown -│ # - Files analyzed by each tool -│ # - Overlap analysis and performance summary -└── comparison_results.json # Machine-readable data with findings_by_file -``` - -## Latest Benchmark Results - -Run the benchmark to generate results: -```bash -cd backend -python benchmarks/by_category/secret_detection/compare_tools.py -``` - -Results are saved to `results/comparison_report.md` with: -- Summary table (secrets found, files scanned, time) -- Agreement analysis (how many tools found each secret) -- Tool agreement matrix (overlap between tools) -- Per-file detailed comparison -- File type breakdown - -## CI/CD Integration - -Add to your CI pipeline: - -```yaml -# .github/workflows/benchmark-secrets.yml -name: Secret Detection Benchmark - -on: - schedule: - - cron: '0 0 * * 0' # Weekly - workflow_dispatch: - -jobs: - benchmark: - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v3 - - - name: Set up Python - uses: actions/setup-python@v4 - with: - python-version: '3.11' - - - name: Install dependencies - run: | - pip install -r backend/requirements.txt - pip install pytest-benchmark - - - name: Run benchmarks - env: - GITGUARDIAN_API_KEY: ${{ secrets.GITGUARDIAN_API_KEY }} - run: | - cd backend - pytest benchmarks/by_category/secret_detection/bench_comparison.py \ - --benchmark-only \ - --benchmark-json=results.json \ 
- --gitguardian-api-key - - - name: Upload results - uses: actions/upload-artifact@v3 - with: - name: benchmark-results - path: backend/results.json -``` - -## Adding New Tools - -To benchmark a new secret detection tool: - -1. Create module in `toolbox/modules/secret_detection/` -2. Register in `__init__.py` -3. Add to `compare_tools.py` in `run_all_tools()` -4. Add test in `bench_comparison.py` - -## Interpreting Results - -### High Precision, Low Recall -Tool is conservative - few false positives but misses secrets. -**Use case**: Production environments where false positives are costly. - -### Low Precision, High Recall -Tool is aggressive - finds most secrets but many false positives. -**Use case**: Initial scans where manual review is acceptable. - -### Balanced (High F1) -Tool has good balance of precision and recall. -**Use case**: General purpose scanning. - -### Fast Execution -Suitable for CI/CD pipelines and pre-commit hooks. - -### Slow but Accurate -Better for comprehensive security audits. - -## Best Practices - -1. **Use multiple tools**: Each has strengths/weaknesses -2. **Combine results**: Union of all findings for maximum coverage -3. **Filter intelligently**: Remove known false positives -4. **Verify findings**: Check if secrets are actually valid -5. **Track over time**: Monitor precision/recall trends -6. 
**Update regularly**: Patterns evolve, tools improve - -## Troubleshooting - -### GitGuardian Tests Skipped -- Set `GITGUARDIAN_API_KEY` environment variable -- Use `--gitguardian-api-key` flag - -### LLM Tests Skipped -- Ensure A2A agent is running -- Check agent URL in config -- Use `--llm-enabled` flag - -### Low Recall -- Check if ground truth is up to date -- Verify tool is configured correctly -- Review missed secrets manually - -### High False Positives -- Adjust tool sensitivity -- Add exclusion patterns -- Review false positive list diff --git a/backend/benchmarks/by_category/secret_detection/bench_comparison.py b/backend/benchmarks/by_category/secret_detection/bench_comparison.py deleted file mode 100644 index 1dc040e..0000000 --- a/backend/benchmarks/by_category/secret_detection/bench_comparison.py +++ /dev/null @@ -1,285 +0,0 @@ -""" -Secret Detection Tool Comparison Benchmark - -Compares Gitleaks, TruffleHog, and LLM-based detection -on the vulnerable_app ground truth dataset via workflow execution. 
-""" - -import pytest -import json -from pathlib import Path -from typing import Dict, List, Any -import sys - -sys.path.insert(0, str(Path(__file__).resolve().parents[3] / "sdk" / "src")) - -from fuzzforge_sdk import FuzzForgeClient -from benchmarks.category_configs import ModuleCategory, get_threshold - - -@pytest.fixture -def target_path(): - """Path to vulnerable_app""" - path = Path(__file__).parent.parent.parent.parent.parent / "test_projects" / "vulnerable_app" - assert path.exists(), f"Target not found: {path}" - return path - - -@pytest.fixture -def ground_truth(target_path): - """Load ground truth data""" - metadata_file = target_path / "SECRETS_GROUND_TRUTH.json" - assert metadata_file.exists(), f"Ground truth not found: {metadata_file}" - - with open(metadata_file) as f: - return json.load(f) - - -@pytest.fixture -def sdk_client(): - """FuzzForge SDK client""" - client = FuzzForgeClient(base_url="http://localhost:8000") - yield client - client.close() - - -def calculate_metrics(sarif_results: List[Dict], ground_truth: Dict[str, Any]) -> Dict[str, float]: - """Calculate precision, recall, and F1 score""" - - # Extract expected secrets from ground truth - expected_secrets = set() - for file_info in ground_truth["files"]: - if "secrets" in file_info: - for secret in file_info["secrets"]: - expected_secrets.add((file_info["filename"], secret["line"])) - - # Extract detected secrets from SARIF - detected_secrets = set() - for result in sarif_results: - locations = result.get("locations", []) - for location in locations: - physical_location = location.get("physicalLocation", {}) - artifact_location = physical_location.get("artifactLocation", {}) - region = physical_location.get("region", {}) - - uri = artifact_location.get("uri", "") - line = region.get("startLine", 0) - - if uri and line: - file_path = Path(uri) - filename = file_path.name - detected_secrets.add((filename, line)) - # Also try with relative path - if len(file_path.parts) > 1: - rel_path = 
str(Path(*file_path.parts[-2:])) - detected_secrets.add((rel_path, line)) - - # Calculate metrics - true_positives = len(expected_secrets & detected_secrets) - false_positives = len(detected_secrets - expected_secrets) - false_negatives = len(expected_secrets - detected_secrets) - - precision = true_positives / (true_positives + false_positives) if (true_positives + false_positives) > 0 else 0 - recall = true_positives / (true_positives + false_negatives) if (true_positives + false_negatives) > 0 else 0 - f1 = 2 * (precision * recall) / (precision + recall) if (precision + recall) > 0 else 0 - - return { - "precision": precision, - "recall": recall, - "f1": f1, - "true_positives": true_positives, - "false_positives": false_positives, - "false_negatives": false_negatives - } - - -class TestSecretDetectionComparison: - """Compare all secret detection tools""" - - @pytest.mark.benchmark(group="secret_detection") - def test_gitleaks_workflow(self, benchmark, sdk_client, target_path, ground_truth): - """Benchmark Gitleaks workflow accuracy and performance""" - - def run_gitleaks(): - run = sdk_client.submit_workflow_with_upload( - workflow_name="gitleaks_detection", - target_path=str(target_path), - parameters={ - "scan_mode": "detect", - "no_git": True, - "redact": False - } - ) - - result = sdk_client.wait_for_completion(run.run_id, timeout=300) - assert result.status == "completed", f"Workflow failed: {result.status}" - - findings = sdk_client.get_run_findings(run.run_id) - assert findings and findings.sarif, "No findings returned" - - return findings - - findings = benchmark(run_gitleaks) - - # Extract SARIF results - sarif_results = [] - for run_data in findings.sarif.get("runs", []): - sarif_results.extend(run_data.get("results", [])) - - # Calculate metrics - metrics = calculate_metrics(sarif_results, ground_truth) - - # Log results - print(f"\n=== Gitleaks Workflow Results ===") - print(f"Precision: {metrics['precision']:.2%}") - print(f"Recall: 
{metrics['recall']:.2%}") - print(f"F1 Score: {metrics['f1']:.2%}") - print(f"True Positives: {metrics['true_positives']}") - print(f"False Positives: {metrics['false_positives']}") - print(f"False Negatives: {metrics['false_negatives']}") - print(f"Findings Count: {len(sarif_results)}") - - # Assert meets thresholds - min_precision = get_threshold(ModuleCategory.SECRET_DETECTION, "min_precision") - min_recall = get_threshold(ModuleCategory.SECRET_DETECTION, "min_recall") - - assert metrics['precision'] >= min_precision, \ - f"Precision {metrics['precision']:.2%} below threshold {min_precision:.2%}" - assert metrics['recall'] >= min_recall, \ - f"Recall {metrics['recall']:.2%} below threshold {min_recall:.2%}" - - @pytest.mark.benchmark(group="secret_detection") - def test_trufflehog_workflow(self, benchmark, sdk_client, target_path, ground_truth): - """Benchmark TruffleHog workflow accuracy and performance""" - - def run_trufflehog(): - run = sdk_client.submit_workflow_with_upload( - workflow_name="trufflehog_detection", - target_path=str(target_path), - parameters={ - "verify": False, - "max_depth": 10 - } - ) - - result = sdk_client.wait_for_completion(run.run_id, timeout=300) - assert result.status == "completed", f"Workflow failed: {result.status}" - - findings = sdk_client.get_run_findings(run.run_id) - assert findings and findings.sarif, "No findings returned" - - return findings - - findings = benchmark(run_trufflehog) - - sarif_results = [] - for run_data in findings.sarif.get("runs", []): - sarif_results.extend(run_data.get("results", [])) - - metrics = calculate_metrics(sarif_results, ground_truth) - - print(f"\n=== TruffleHog Workflow Results ===") - print(f"Precision: {metrics['precision']:.2%}") - print(f"Recall: {metrics['recall']:.2%}") - print(f"F1 Score: {metrics['f1']:.2%}") - print(f"True Positives: {metrics['true_positives']}") - print(f"False Positives: {metrics['false_positives']}") - print(f"False Negatives: {metrics['false_negatives']}") - 
print(f"Findings Count: {len(sarif_results)}") - - min_precision = get_threshold(ModuleCategory.SECRET_DETECTION, "min_precision") - min_recall = get_threshold(ModuleCategory.SECRET_DETECTION, "min_recall") - - assert metrics['precision'] >= min_precision - assert metrics['recall'] >= min_recall - - @pytest.mark.benchmark(group="secret_detection") - @pytest.mark.parametrize("model", [ - "gpt-4o-mini", - "gpt-4o", - "claude-3-5-sonnet-20241022" - ]) - def test_llm_workflow(self, benchmark, sdk_client, target_path, ground_truth, model): - """Benchmark LLM workflow with different models""" - - def run_llm(): - provider = "openai" if "gpt" in model else "anthropic" - - run = sdk_client.submit_workflow_with_upload( - workflow_name="llm_secret_detection", - target_path=str(target_path), - parameters={ - "agent_url": "http://fuzzforge-task-agent:8000/a2a/litellm_agent", - "llm_model": model, - "llm_provider": provider, - "max_files": 20, - "timeout": 60 - } - ) - - result = sdk_client.wait_for_completion(run.run_id, timeout=300) - assert result.status == "completed", f"Workflow failed: {result.status}" - - findings = sdk_client.get_run_findings(run.run_id) - assert findings and findings.sarif, "No findings returned" - - return findings - - findings = benchmark(run_llm) - - sarif_results = [] - for run_data in findings.sarif.get("runs", []): - sarif_results.extend(run_data.get("results", [])) - - metrics = calculate_metrics(sarif_results, ground_truth) - - print(f"\n=== LLM ({model}) Workflow Results ===") - print(f"Precision: {metrics['precision']:.2%}") - print(f"Recall: {metrics['recall']:.2%}") - print(f"F1 Score: {metrics['f1']:.2%}") - print(f"True Positives: {metrics['true_positives']}") - print(f"False Positives: {metrics['false_positives']}") - print(f"False Negatives: {metrics['false_negatives']}") - print(f"Findings Count: {len(sarif_results)}") - - -class TestSecretDetectionPerformance: - """Performance benchmarks for each tool""" - - 
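The set arithmetic in `calculate_metrics` above can be sanity-checked in isolation. The sketch below uses made-up `(file, line)` pairs, not real benchmark data; it only demonstrates how true/false positives and the derived precision, recall, and F1 fall out of set intersection and difference:

```python
# Hypothetical ground truth and tool output, as (filename, line) pairs.
expected = {("config.py", 7), (".env", 3), ("app.py", 6)}
detected = {("config.py", 7), (".env", 3), ("deploy.sh", 5)}

tp = len(expected & detected)   # pairs matched exactly
fp = len(detected - expected)   # reported but not in ground truth
fn = len(expected - detected)   # in ground truth but missed

precision = tp / (tp + fp) if (tp + fp) else 0.0
recall = tp / (tp + fn) if (tp + fn) else 0.0
f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

print(round(precision, 3), round(recall, 3), round(f1, 3))  # → 0.667 0.667 0.667
```

Because matching is exact on `(filename, line)`, an off-by-one line number from a tool counts as both a false positive and a false negative, which is why `calculate_metrics` also tries a two-component relative path for each detection.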
@pytest.mark.benchmark(group="secret_detection") - def test_gitleaks_performance(self, benchmark, sdk_client, target_path): - """Benchmark Gitleaks workflow execution speed""" - - def run(): - run = sdk_client.submit_workflow_with_upload( - workflow_name="gitleaks_detection", - target_path=str(target_path), - parameters={"scan_mode": "detect", "no_git": True} - ) - result = sdk_client.wait_for_completion(run.run_id, timeout=300) - return result - - result = benchmark(run) - - max_time = get_threshold(ModuleCategory.SECRET_DETECTION, "max_execution_time_small") - # Note: Workflow execution time includes orchestration overhead - # so we allow 2x the module threshold - assert result.execution_time < max_time * 2 - - @pytest.mark.benchmark(group="secret_detection") - def test_trufflehog_performance(self, benchmark, sdk_client, target_path): - """Benchmark TruffleHog workflow execution speed""" - - def run(): - run = sdk_client.submit_workflow_with_upload( - workflow_name="trufflehog_detection", - target_path=str(target_path), - parameters={"verify": False} - ) - result = sdk_client.wait_for_completion(run.run_id, timeout=300) - return result - - result = benchmark(run) - - max_time = get_threshold(ModuleCategory.SECRET_DETECTION, "max_execution_time_small") - assert result.execution_time < max_time * 2 diff --git a/backend/benchmarks/by_category/secret_detection/compare_tools.py b/backend/benchmarks/by_category/secret_detection/compare_tools.py deleted file mode 100644 index ae03c99..0000000 --- a/backend/benchmarks/by_category/secret_detection/compare_tools.py +++ /dev/null @@ -1,547 +0,0 @@ -""" -Secret Detection Tools Comparison Report Generator - -Generates comparison reports showing strengths/weaknesses of each tool. -Uses workflow execution via SDK to test complete pipeline. 
-""" - -import asyncio -import json -import time -from pathlib import Path -from typing import Dict, List, Any, Optional -from dataclasses import dataclass, asdict -import sys - -sys.path.insert(0, str(Path(__file__).resolve().parents[3] / "sdk" / "src")) - -from fuzzforge_sdk import FuzzForgeClient - - -@dataclass -class ToolResult: - """Results from running a tool""" - tool_name: str - execution_time: float - findings_count: int - findings_by_file: Dict[str, List[int]] # file_path -> [line_numbers] - unique_files: int - unique_locations: int # unique (file, line) pairs - secret_density: float # average secrets per file - file_types: Dict[str, int] # file extension -> count of files with secrets - - -class SecretDetectionComparison: - """Compare secret detection tools""" - - def __init__(self, target_path: Path, api_url: str = "http://localhost:8000"): - self.target_path = target_path - self.client = FuzzForgeClient(base_url=api_url) - - async def run_workflow(self, workflow_name: str, tool_name: str, config: Dict[str, Any] = None) -> Optional[ToolResult]: - """Run a workflow and extract findings""" - print(f"\nRunning {tool_name} workflow...") - - start_time = time.time() - - try: - # Start workflow - run = self.client.submit_workflow_with_upload( - workflow_name=workflow_name, - target_path=str(self.target_path), - parameters=config or {} - ) - - print(f" Started run: {run.run_id}") - - # Wait for completion (up to 30 minutes for slow LLMs) - print(f" Waiting for completion...") - result = self.client.wait_for_completion(run.run_id, timeout=1800) - - execution_time = time.time() - start_time - - if result.status != "COMPLETED": - print(f"āŒ {tool_name} workflow failed: {result.status}") - return None - - # Get findings from SARIF - findings = self.client.get_run_findings(run.run_id) - - if not findings or not findings.sarif: - print(f"āš ļø {tool_name} produced no findings") - return None - - # Extract results from SARIF and group by file - findings_by_file = 
{} - unique_locations = set() - - for run_data in findings.sarif.get("runs", []): - for result in run_data.get("results", []): - locations = result.get("locations", []) - for location in locations: - physical_location = location.get("physicalLocation", {}) - artifact_location = physical_location.get("artifactLocation", {}) - region = physical_location.get("region", {}) - - uri = artifact_location.get("uri", "") - line = region.get("startLine", 0) - - if uri and line: - if uri not in findings_by_file: - findings_by_file[uri] = [] - findings_by_file[uri].append(line) - unique_locations.add((uri, line)) - - # Sort line numbers for each file - for file_path in findings_by_file: - findings_by_file[file_path] = sorted(set(findings_by_file[file_path])) - - # Calculate file type distribution - file_types = {} - for file_path in findings_by_file: - ext = Path(file_path).suffix or Path(file_path).name # Use full name for files like .env - if ext.startswith('.'): - file_types[ext] = file_types.get(ext, 0) + 1 - else: - file_types['[no extension]'] = file_types.get('[no extension]', 0) + 1 - - # Calculate secret density - secret_density = len(unique_locations) / len(findings_by_file) if findings_by_file else 0 - - print(f" āœ“ Found {len(unique_locations)} secrets in {len(findings_by_file)} files (avg {secret_density:.1f} per file)") - - return ToolResult( - tool_name=tool_name, - execution_time=execution_time, - findings_count=len(unique_locations), - findings_by_file=findings_by_file, - unique_files=len(findings_by_file), - unique_locations=len(unique_locations), - secret_density=secret_density, - file_types=file_types - ) - - except Exception as e: - print(f"āŒ {tool_name} error: {e}") - return None - - - async def run_all_tools(self, llm_models: List[str] = None) -> List[ToolResult]: - """Run all available tools""" - results = [] - - if llm_models is None: - llm_models = ["gpt-4o-mini"] - - # Gitleaks - result = await self.run_workflow("gitleaks_detection", "Gitleaks", { 
- "scan_mode": "detect", - "no_git": True, - "redact": False - }) - if result: - results.append(result) - - # TruffleHog - result = await self.run_workflow("trufflehog_detection", "TruffleHog", { - "verify": False, - "max_depth": 10 - }) - if result: - results.append(result) - - # LLM Detector with multiple models - for model in llm_models: - tool_name = f"LLM ({model})" - result = await self.run_workflow("llm_secret_detection", tool_name, { - "agent_url": "http://fuzzforge-task-agent:8000/a2a/litellm_agent", - "llm_model": model, - "llm_provider": "openai" if "gpt" in model else "anthropic", - "max_files": 20, - "timeout": 60, - "file_patterns": [ - "*.py", "*.js", "*.ts", "*.java", "*.go", "*.env", "*.yaml", "*.yml", - "*.json", "*.xml", "*.ini", "*.sql", "*.properties", "*.sh", "*.bat", - "*.config", "*.conf", "*.toml", "*id_rsa*", "*.txt" - ] - }) - if result: - results.append(result) - - return results - - def _calculate_agreement_matrix(self, results: List[ToolResult]) -> Dict[str, Dict[str, int]]: - """Calculate overlap matrix showing common secrets between tool pairs""" - matrix = {} - - for i, result1 in enumerate(results): - matrix[result1.tool_name] = {} - # Convert to set of (file, line) tuples - secrets1 = set() - for file_path, lines in result1.findings_by_file.items(): - for line in lines: - secrets1.add((file_path, line)) - - for result2 in results: - secrets2 = set() - for file_path, lines in result2.findings_by_file.items(): - for line in lines: - secrets2.add((file_path, line)) - - # Count common secrets - common = len(secrets1 & secrets2) - matrix[result1.tool_name][result2.tool_name] = common - - return matrix - - def _get_per_file_comparison(self, results: List[ToolResult]) -> Dict[str, Dict[str, int]]: - """Get per-file breakdown of findings across all tools""" - all_files = set() - for result in results: - all_files.update(result.findings_by_file.keys()) - - comparison = {} - for file_path in sorted(all_files): - comparison[file_path] = {} - 
for result in results: - comparison[file_path][result.tool_name] = len(result.findings_by_file.get(file_path, [])) - - return comparison - - def _get_agreement_stats(self, results: List[ToolResult]) -> Dict[int, int]: - """Calculate how many secrets are found by 1, 2, 3, or all tools""" - # Collect all unique (file, line) pairs across all tools - all_secrets = {} # (file, line) -> list of tools that found it - - for result in results: - for file_path, lines in result.findings_by_file.items(): - for line in lines: - key = (file_path, line) - if key not in all_secrets: - all_secrets[key] = [] - all_secrets[key].append(result.tool_name) - - # Count by number of tools - agreement_counts = {} - for secret, tools in all_secrets.items(): - count = len(set(tools)) # Unique tools - agreement_counts[count] = agreement_counts.get(count, 0) + 1 - - return agreement_counts - - def generate_markdown_report(self, results: List[ToolResult]) -> str: - """Generate markdown comparison report""" - report = [] - report.append("# Secret Detection Tools Comparison\n") - report.append(f"**Target**: {self.target_path.name}") - report.append(f"**Tools**: {', '.join([r.tool_name for r in results])}\n") - - # Summary table with extended metrics - report.append("\n## Summary\n") - report.append("| Tool | Secrets | Files | Avg/File | Time (s) |") - report.append("|------|---------|-------|----------|----------|") - - for result in results: - report.append( - f"| {result.tool_name} | " - f"{result.findings_count} | " - f"{result.unique_files} | " - f"{result.secret_density:.1f} | " - f"{result.execution_time:.2f} |" - ) - - # Agreement Analysis - agreement_stats = self._get_agreement_stats(results) - report.append("\n## Agreement Analysis\n") - report.append("Secrets found by different numbers of tools:\n") - for num_tools in sorted(agreement_stats.keys(), reverse=True): - count = agreement_stats[num_tools] - if num_tools == len(results): - report.append(f"- **All {num_tools} tools agree**: 
{count} secrets") - elif num_tools == 1: - report.append(f"- **Only 1 tool found**: {count} secrets") - else: - report.append(f"- **{num_tools} tools agree**: {count} secrets") - - # Agreement Matrix - agreement_matrix = self._calculate_agreement_matrix(results) - report.append("\n## Tool Agreement Matrix\n") - report.append("Number of common secrets found by tool pairs:\n") - - # Header row - header = "| Tool |" - separator = "|------|" - for result in results: - short_name = result.tool_name.replace("LLM (", "").replace(")", "") - header += f" {short_name} |" - separator += "------|" - report.append(header) - report.append(separator) - - # Data rows - for result in results: - short_name = result.tool_name.replace("LLM (", "").replace(")", "") - row = f"| {short_name} |" - for result2 in results: - count = agreement_matrix[result.tool_name][result2.tool_name] - row += f" {count} |" - report.append(row) - - # Per-File Comparison - per_file = self._get_per_file_comparison(results) - report.append("\n## Per-File Detailed Comparison\n") - report.append("Secrets found per file by each tool:\n") - - # Header - header = "| File |" - separator = "|------|" - for result in results: - short_name = result.tool_name.replace("LLM (", "").replace(")", "") - header += f" {short_name} |" - separator += "------|" - header += " Total |" - separator += "------|" - report.append(header) - report.append(separator) - - # Show top 15 files by total findings - file_totals = [(f, sum(counts.values())) for f, counts in per_file.items()] - file_totals.sort(key=lambda x: x[1], reverse=True) - - for file_path, total in file_totals[:15]: - row = f"| `{file_path}` |" - for result in results: - count = per_file[file_path].get(result.tool_name, 0) - row += f" {count} |" - row += f" **{total}** |" - report.append(row) - - if len(file_totals) > 15: - report.append(f"| ... and {len(file_totals) - 15} more files | ... | ... | ... | ... | ... 
|") - - # File Type Breakdown - report.append("\n## File Type Breakdown\n") - all_extensions = set() - for result in results: - all_extensions.update(result.file_types.keys()) - - if all_extensions: - header = "| Type |" - separator = "|------|" - for result in results: - short_name = result.tool_name.replace("LLM (", "").replace(")", "") - header += f" {short_name} |" - separator += "------|" - report.append(header) - report.append(separator) - - for ext in sorted(all_extensions): - row = f"| `{ext}` |" - for result in results: - count = result.file_types.get(ext, 0) - row += f" {count} files |" - report.append(row) - - # File analysis - report.append("\n## Files Analyzed\n") - - # Collect all unique files across all tools - all_files = set() - for result in results: - all_files.update(result.findings_by_file.keys()) - - report.append(f"**Total unique files with secrets**: {len(all_files)}\n") - - for result in results: - report.append(f"\n### {result.tool_name}\n") - report.append(f"Found secrets in **{result.unique_files} files**:\n") - - # Sort files by number of findings (descending) - sorted_files = sorted( - result.findings_by_file.items(), - key=lambda x: len(x[1]), - reverse=True - ) - - # Show top 10 files - for file_path, lines in sorted_files[:10]: - report.append(f"- `{file_path}`: {len(lines)} secrets (lines: {', '.join(map(str, lines[:5]))}{'...' if len(lines) > 5 else ''})") - - if len(sorted_files) > 10: - report.append(f"- ... 
and {len(sorted_files) - 10} more files") - - # Overlap analysis - if len(results) >= 2: - report.append("\n## Overlap Analysis\n") - - # Find common files - file_sets = [set(r.findings_by_file.keys()) for r in results] - common_files = set.intersection(*file_sets) if file_sets else set() - - if common_files: - report.append(f"\n**Files found by all tools** ({len(common_files)}):\n") - for file_path in sorted(common_files)[:10]: - report.append(f"- `{file_path}`") - else: - report.append("\n**No files were found by all tools**\n") - - # Find tool-specific files - for i, result in enumerate(results): - unique_to_tool = set(result.findings_by_file.keys()) - for j, other_result in enumerate(results): - if i != j: - unique_to_tool -= set(other_result.findings_by_file.keys()) - - if unique_to_tool: - report.append(f"\n**Unique to {result.tool_name}** ({len(unique_to_tool)} files):\n") - for file_path in sorted(unique_to_tool)[:5]: - report.append(f"- `{file_path}`") - if len(unique_to_tool) > 5: - report.append(f"- ... 
and {len(unique_to_tool) - 5} more") - - # Ground Truth Analysis (if available) - ground_truth_path = Path(__file__).parent / "secret_detection_benchmark_GROUND_TRUTH.json" - if ground_truth_path.exists(): - report.append("\n## Ground Truth Analysis\n") - try: - with open(ground_truth_path) as f: - gt_data = json.load(f) - - gt_total = gt_data.get("total_secrets", 30) - report.append(f"**Expected secrets**: {gt_total} (documented in ground truth)\n") - - # Build ground truth set of (file, line) tuples - gt_secrets = set() - for secret in gt_data.get("secrets", []): - gt_secrets.add((secret["file"], secret["line"])) - - report.append("### Tool Performance vs Ground Truth\n") - report.append("| Tool | Found | Expected | Recall | Extra Findings |") - report.append("|------|-------|----------|--------|----------------|") - - for result in results: - # Build tool findings set - tool_secrets = set() - for file_path, lines in result.findings_by_file.items(): - for line in lines: - tool_secrets.add((file_path, line)) - - # Calculate metrics - true_positives = len(gt_secrets & tool_secrets) - recall = (true_positives / gt_total * 100) if gt_total > 0 else 0 - extra = len(tool_secrets - gt_secrets) - - report.append( - f"| {result.tool_name} | " - f"{result.findings_count} | " - f"{gt_total} | " - f"{recall:.1f}% | " - f"{extra} |" - ) - - # Analyze LLM extra findings - llm_results = [r for r in results if "LLM" in r.tool_name] - if llm_results: - report.append("\n### LLM Extra Findings Explanation\n") - report.append("LLMs may find more than 30 secrets because they detect:\n") - report.append("- **Split secret components**: Each part of `DB_PASS_PART1 + PART2 + PART3` counted separately") - report.append("- **Join operations**: Lines like `''.join(AWS_SECRET_CHARS)` flagged as additional exposure") - report.append("- **Decoding functions**: Code that reveals secrets (e.g., `base64.b64decode()`, `codecs.decode()`)") - report.append("- **Comment identifiers**: Lines marking 
secret locations without plaintext values") - report.append("\nThese are *technically correct* detections of secret exposure points, not false positives.") - report.append("The ground truth documents 30 'primary' secrets, but the codebase has additional derivative exposures.\n") - - except Exception as e: - report.append(f"*Could not load ground truth: {e}*\n") - - # Performance summary - if results: - report.append("\n## Performance Summary\n") - most_findings = max(results, key=lambda r: r.findings_count) - most_files = max(results, key=lambda r: r.unique_files) - fastest = min(results, key=lambda r: r.execution_time) - - report.append(f"- **Most secrets found**: {most_findings.tool_name} ({most_findings.findings_count} secrets)") - report.append(f"- **Most files covered**: {most_files.tool_name} ({most_files.unique_files} files)") - report.append(f"- **Fastest**: {fastest.tool_name} ({fastest.execution_time:.2f}s)") - - return "\n".join(report) - - def save_json_report(self, results: List[ToolResult], output_path: Path): - """Save results as JSON""" - data = { - "target_path": str(self.target_path), - "results": [asdict(r) for r in results] - } - - with open(output_path, 'w') as f: - json.dump(data, f, indent=2) - - print(f"\nāœ… JSON report saved to: {output_path}") - - def cleanup(self): - """Cleanup SDK client""" - self.client.close() - - -async def main(): - """Run comparison and generate reports""" - # Get target path (secret_detection_benchmark) - target_path = Path(__file__).parent.parent.parent.parent.parent / "test_projects" / "secret_detection_benchmark" - - if not target_path.exists(): - print(f"āŒ Target not found at: {target_path}") - return 1 - - print("=" * 80) - print("Secret Detection Tools Comparison") - print("=" * 80) - print(f"Target: {target_path}") - - # LLM models to test - llm_models = [ - "gpt-4o-mini", - "gpt-5-mini" - ] - print(f"LLM models: {', '.join(llm_models)}\n") - - # Run comparison - comparison = 
SecretDetectionComparison(target_path) - - try: - results = await comparison.run_all_tools(llm_models=llm_models) - - if not results: - print("āŒ No tools ran successfully") - return 1 - - # Generate reports - print("\n" + "=" * 80) - markdown_report = comparison.generate_markdown_report(results) - print(markdown_report) - - # Save reports - output_dir = Path(__file__).parent / "results" - output_dir.mkdir(exist_ok=True) - - markdown_path = output_dir / "comparison_report.md" - with open(markdown_path, 'w') as f: - f.write(markdown_report) - print(f"\nāœ… Markdown report saved to: {markdown_path}") - - json_path = output_dir / "comparison_results.json" - comparison.save_json_report(results, json_path) - - print("\n" + "=" * 80) - print("āœ… Comparison complete!") - print("=" * 80) - - return 0 - - finally: - comparison.cleanup() - - -if __name__ == "__main__": - exit_code = asyncio.run(main()) - sys.exit(exit_code) diff --git a/backend/benchmarks/by_category/secret_detection/results/comparison_report.md b/backend/benchmarks/by_category/secret_detection/results/comparison_report.md deleted file mode 100644 index 220cb33..0000000 --- a/backend/benchmarks/by_category/secret_detection/results/comparison_report.md +++ /dev/null @@ -1,169 +0,0 @@ -# Secret Detection Tools Comparison - -**Target**: secret_detection_benchmark -**Tools**: Gitleaks, TruffleHog, LLM (gpt-4o-mini), LLM (gpt-5-mini) - - -## Summary - -| Tool | Secrets | Files | Avg/File | Time (s) | -|------|---------|-------|----------|----------| -| Gitleaks | 12 | 10 | 1.2 | 5.18 | -| TruffleHog | 1 | 1 | 1.0 | 5.06 | -| LLM (gpt-4o-mini) | 30 | 15 | 2.0 | 296.85 | -| LLM (gpt-5-mini) | 41 | 16 | 2.6 | 618.55 | - -## Agreement Analysis - -Secrets found by different numbers of tools: - -- **3 tools agree**: 6 secrets -- **2 tools agree**: 22 secrets -- **Only 1 tool found**: 22 secrets - -## Tool Agreement Matrix - -Number of common secrets found by tool pairs: - -| Tool | Gitleaks | TruffleHog | gpt-4o-mini 
| gpt-5-mini | -|------|------|------|------|------| -| Gitleaks | 12 | 0 | 7 | 11 | -| TruffleHog | 0 | 1 | 0 | 0 | -| gpt-4o-mini | 7 | 0 | 30 | 22 | -| gpt-5-mini | 11 | 0 | 22 | 41 | - -## Per-File Detailed Comparison - -Secrets found per file by each tool: - -| File | Gitleaks | TruffleHog | gpt-4o-mini | gpt-5-mini | Total | -|------|------|------|------|------|------| -| `src/obfuscated.py` | 2 | 0 | 6 | 7 | **15** | -| `src/advanced.js` | 0 | 0 | 5 | 7 | **12** | -| `src/config.py` | 1 | 0 | 0 | 6 | **7** | -| `.env` | 1 | 0 | 2 | 2 | **5** | -| `config/keys.yaml` | 1 | 0 | 2 | 2 | **5** | -| `config/oauth.json` | 1 | 0 | 2 | 2 | **5** | -| `config/settings.py` | 2 | 0 | 0 | 3 | **5** | -| `scripts/deploy.sh` | 1 | 0 | 2 | 2 | **5** | -| `config/legacy.ini` | 0 | 0 | 2 | 2 | **4** | -| `src/Crypto.go` | 0 | 0 | 2 | 2 | **4** | -| `config/app.properties` | 1 | 0 | 1 | 1 | **3** | -| `config/database.yaml` | 0 | 1 | 1 | 1 | **3** | -| `src/Main.java` | 1 | 0 | 1 | 1 | **3** | -| `id_rsa` | 1 | 0 | 1 | 0 | **2** | -| `scripts/webhook.js` | 0 | 0 | 1 | 1 | **2** | -| ... and 2 more files | ... | ... | ... | ... | ... 
| - -## File Type Breakdown - -| Type | Gitleaks | TruffleHog | gpt-4o-mini | gpt-5-mini | -|------|------|------|------|------| -| `.env` | 1 files | 0 files | 1 files | 1 files | -| `.go` | 0 files | 0 files | 1 files | 1 files | -| `.ini` | 0 files | 0 files | 1 files | 1 files | -| `.java` | 1 files | 0 files | 1 files | 1 files | -| `.js` | 0 files | 0 files | 2 files | 2 files | -| `.json` | 1 files | 0 files | 1 files | 1 files | -| `.properties` | 1 files | 0 files | 1 files | 1 files | -| `.py` | 3 files | 0 files | 2 files | 4 files | -| `.sh` | 1 files | 0 files | 1 files | 1 files | -| `.sql` | 0 files | 0 files | 1 files | 1 files | -| `.yaml` | 1 files | 1 files | 2 files | 2 files | -| `[no extension]` | 1 files | 0 files | 1 files | 0 files | - -## Files Analyzed - -**Total unique files with secrets**: 17 - - -### Gitleaks - -Found secrets in **10 files**: - -- `config/settings.py`: 2 secrets (lines: 6, 9) -- `src/obfuscated.py`: 2 secrets (lines: 7, 17) -- `.env`: 1 secrets (lines: 3) -- `config/app.properties`: 1 secrets (lines: 6) -- `config/keys.yaml`: 1 secrets (lines: 6) -- `id_rsa`: 1 secrets (lines: 1) -- `config/oauth.json`: 1 secrets (lines: 4) -- `scripts/deploy.sh`: 1 secrets (lines: 5) -- `src/Main.java`: 1 secrets (lines: 5) -- `src/config.py`: 1 secrets (lines: 7) - -### TruffleHog - -Found secrets in **1 files**: - -- `config/database.yaml`: 1 secrets (lines: 6) - -### LLM (gpt-4o-mini) - -Found secrets in **15 files**: - -- `src/obfuscated.py`: 6 secrets (lines: 7, 10, 13, 18, 20...) -- `src/advanced.js`: 5 secrets (lines: 4, 7, 10, 12, 17) -- `src/Crypto.go`: 2 secrets (lines: 6, 10) -- `.env`: 2 secrets (lines: 3, 4) -- `config/keys.yaml`: 2 secrets (lines: 6, 12) -- `config/oauth.json`: 2 secrets (lines: 3, 4) -- `config/legacy.ini`: 2 secrets (lines: 4, 7) -- `scripts/deploy.sh`: 2 secrets (lines: 6, 9) -- `src/app.py`: 1 secrets (lines: 7) -- `scripts/webhook.js`: 1 secrets (lines: 4) -- ... 
and 5 more files - -### LLM (gpt-5-mini) - -Found secrets in **16 files**: - -- `src/obfuscated.py`: 7 secrets (lines: 7, 10, 13, 14, 17...) -- `src/advanced.js`: 7 secrets (lines: 4, 7, 9, 10, 13...) -- `src/config.py`: 6 secrets (lines: 7, 10, 13, 14, 15...) -- `config/settings.py`: 3 secrets (lines: 6, 9, 20) -- `src/Crypto.go`: 2 secrets (lines: 10, 15) -- `.env`: 2 secrets (lines: 3, 4) -- `config/keys.yaml`: 2 secrets (lines: 6, 12) -- `config/oauth.json`: 2 secrets (lines: 3, 4) -- `config/legacy.ini`: 2 secrets (lines: 3, 7) -- `scripts/deploy.sh`: 2 secrets (lines: 5, 10) -- ... and 6 more files - -## Overlap Analysis - - -**No files were found by all tools** - - -## Ground Truth Analysis - -**Expected secrets**: 32 (documented in ground truth) - -### Tool Performance vs Ground Truth - -| Tool | Found | Expected | Recall | Extra Findings | -|------|-------|----------|--------|----------------| -| Gitleaks | 12 | 32 | 37.5% | 0 | -| TruffleHog | 1 | 32 | 0.0% | 1 | -| LLM (gpt-4o-mini) | 30 | 32 | 56.2% | 12 | -| LLM (gpt-5-mini) | 41 | 32 | 84.4% | 14 | - -### LLM Extra Findings Explanation - -LLMs may find more than 30 secrets because they detect: - -- **Split secret components**: Each part of `DB_PASS_PART1 + PART2 + PART3` counted separately -- **Join operations**: Lines like `''.join(AWS_SECRET_CHARS)` flagged as additional exposure -- **Decoding functions**: Code that reveals secrets (e.g., `base64.b64decode()`, `codecs.decode()`) -- **Comment identifiers**: Lines marking secret locations without plaintext values - -These are *technically correct* detections of secret exposure points, not false positives. -The ground truth documents 30 'primary' secrets, but the codebase has additional derivative exposures. 
- - -## Performance Summary - -- **Most secrets found**: LLM (gpt-5-mini) (41 secrets) -- **Most files covered**: LLM (gpt-5-mini) (16 files) -- **Fastest**: TruffleHog (5.06s) \ No newline at end of file diff --git a/backend/benchmarks/by_category/secret_detection/results/comparison_results.json b/backend/benchmarks/by_category/secret_detection/results/comparison_results.json deleted file mode 100644 index 4e9c89f..0000000 --- a/backend/benchmarks/by_category/secret_detection/results/comparison_results.json +++ /dev/null @@ -1,253 +0,0 @@ -{ - "target_path": "/Users/tduhamel/Documents/FuzzingLabs/fuzzforge_ai/test_projects/secret_detection_benchmark", - "results": [ - { - "tool_name": "Gitleaks", - "execution_time": 5.177123069763184, - "findings_count": 12, - "findings_by_file": { - ".env": [ - 3 - ], - "config/app.properties": [ - 6 - ], - "config/keys.yaml": [ - 6 - ], - "id_rsa": [ - 1 - ], - "config/oauth.json": [ - 4 - ], - "scripts/deploy.sh": [ - 5 - ], - "config/settings.py": [ - 6, - 9 - ], - "src/Main.java": [ - 5 - ], - "src/obfuscated.py": [ - 7, - 17 - ], - "src/config.py": [ - 7 - ] - }, - "unique_files": 10, - "unique_locations": 12, - "secret_density": 1.2, - "file_types": { - ".env": 1, - ".properties": 1, - ".yaml": 1, - "[no extension]": 1, - ".json": 1, - ".sh": 1, - ".py": 3, - ".java": 1 - } - }, - { - "tool_name": "TruffleHog", - "execution_time": 5.061383008956909, - "findings_count": 1, - "findings_by_file": { - "config/database.yaml": [ - 6 - ] - }, - "unique_files": 1, - "unique_locations": 1, - "secret_density": 1.0, - "file_types": { - ".yaml": 1 - } - }, - { - "tool_name": "LLM (gpt-4o-mini)", - "execution_time": 296.8492441177368, - "findings_count": 30, - "findings_by_file": { - "src/obfuscated.py": [ - 7, - 10, - 13, - 18, - 20, - 23 - ], - "src/app.py": [ - 7 - ], - "scripts/webhook.js": [ - 4 - ], - "src/advanced.js": [ - 4, - 7, - 10, - 12, - 17 - ], - "src/Main.java": [ - 5 - ], - "src/Crypto.go": [ - 6, - 10 - ], - ".env": [ 
- 3, - 4 - ], - "config/keys.yaml": [ - 6, - 12 - ], - "config/database.yaml": [ - 7 - ], - "config/oauth.json": [ - 3, - 4 - ], - "config/legacy.ini": [ - 4, - 7 - ], - "src/database.sql": [ - 4 - ], - "config/app.properties": [ - 6 - ], - "scripts/deploy.sh": [ - 6, - 9 - ], - "id_rsa": [ - 1 - ] - }, - "unique_files": 15, - "unique_locations": 30, - "secret_density": 2.0, - "file_types": { - ".py": 2, - ".js": 2, - ".java": 1, - ".go": 1, - ".env": 1, - ".yaml": 2, - ".json": 1, - ".ini": 1, - ".sql": 1, - ".properties": 1, - ".sh": 1, - "[no extension]": 1 - } - }, - { - "tool_name": "LLM (gpt-5-mini)", - "execution_time": 618.5462851524353, - "findings_count": 41, - "findings_by_file": { - "config/settings.py": [ - 6, - 9, - 20 - ], - "src/obfuscated.py": [ - 7, - 10, - 13, - 14, - 17, - 20, - 23 - ], - "src/app.py": [ - 7 - ], - "src/config.py": [ - 7, - 10, - 13, - 14, - 15, - 16 - ], - "scripts/webhook.js": [ - 4 - ], - "src/advanced.js": [ - 4, - 7, - 9, - 10, - 13, - 17, - 19 - ], - "src/Main.java": [ - 5 - ], - "src/Crypto.go": [ - 10, - 15 - ], - ".env": [ - 3, - 4 - ], - "config/keys.yaml": [ - 6, - 12 - ], - "config/database.yaml": [ - 7 - ], - "config/oauth.json": [ - 3, - 4 - ], - "config/legacy.ini": [ - 3, - 7 - ], - "src/database.sql": [ - 6 - ], - "config/app.properties": [ - 6 - ], - "scripts/deploy.sh": [ - 5, - 10 - ] - }, - "unique_files": 16, - "unique_locations": 41, - "secret_density": 2.5625, - "file_types": { - ".py": 4, - ".js": 2, - ".java": 1, - ".go": 1, - ".env": 1, - ".yaml": 2, - ".json": 1, - ".ini": 1, - ".sql": 1, - ".properties": 1, - ".sh": 1 - } - } - ] -} \ No newline at end of file diff --git a/backend/benchmarks/by_category/secret_detection/secret_detection_benchmark_GROUND_TRUTH.json b/backend/benchmarks/by_category/secret_detection/secret_detection_benchmark_GROUND_TRUTH.json deleted file mode 100644 index cd6223c..0000000 --- 
a/backend/benchmarks/by_category/secret_detection/secret_detection_benchmark_GROUND_TRUTH.json +++ /dev/null @@ -1,344 +0,0 @@ -{ - "description": "Ground truth dataset for secret detection benchmarking - Exactly 32 secrets", - "version": "1.1.0", - "total_secrets": 32, - "secrets_by_difficulty": { - "easy": 12, - "medium": 10, - "hard": 10 - }, - "secrets": [ - { - "id": 1, - "file": ".env", - "line": 3, - "difficulty": "easy", - "type": "aws_access_key", - "value": "AKIAIOSFODNN7EXAMPLE", - "severity": "critical" - }, - { - "id": 2, - "file": ".env", - "line": 4, - "difficulty": "easy", - "type": "aws_secret_access_key", - "value": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY", - "severity": "critical" - }, - { - "id": 3, - "file": "config/settings.py", - "line": 6, - "difficulty": "easy", - "type": "github_pat", - "value": "ghp_vR8jK2mN4pQ6tX9bC3wY7zA1eF5hI8kL", - "severity": "critical" - }, - { - "id": 4, - "file": "config/settings.py", - "line": 9, - "difficulty": "easy", - "type": "stripe_api_key", - "value": "sk_live_51MabcdefghijklmnopqrstuvwxyzABCDEF123456789", - "severity": "critical" - }, - { - "id": 5, - "file": "config/settings.py", - "line": 17, - "difficulty": "easy", - "type": "database_password", - "value": "ProdDB_P@ssw0rd_2024_Secure!", - "severity": "critical" - }, - { - "id": 6, - "file": "src/app.py", - "line": 6, - "difficulty": "easy", - "type": "jwt_secret", - "value": "my-super-secret-jwt-key-do-not-share-2024", - "severity": "critical" - }, - { - "id": 7, - "file": "config/database.yaml", - "line": 7, - "difficulty": "easy", - "type": "azure_storage_key", - "value": "DefaultEndpointsProtocol=https;AccountName=prodstore;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;EndpointSuffix=core.windows.net", - "severity": "critical" - }, - { - "id": 8, - "file": "scripts/webhook.js", - "line": 4, - "difficulty": "easy", - "type": "slack_webhook", - "value": 
"https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXX", - "severity": "high" - }, - { - "id": 9, - "file": "config/app.properties", - "line": 6, - "difficulty": "easy", - "type": "api_key", - "value": "sk_test_4eC39HqLyjWDarjtT1zdp7dc", - "severity": "high" - }, - { - "id": 10, - "file": "id_rsa", - "line": 1, - "difficulty": "easy", - "type": "ssh_private_key", - "value": "-----BEGIN OPENSSH PRIVATE KEY-----", - "severity": "critical" - }, - { - "id": 11, - "file": "config/oauth.json", - "line": 4, - "difficulty": "easy", - "type": "oauth_client_secret", - "value": "GOCSPX-Ab12Cd34Ef56Gh78Ij90Kl12", - "severity": "critical" - }, - { - "id": 12, - "file": "src/Main.java", - "line": 5, - "difficulty": "easy", - "type": "google_oauth_secret", - "value": "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", - "severity": "critical" - }, - { - "id": 13, - "file": "src/config.py", - "line": 7, - "difficulty": "medium", - "type": "aws_access_key_base64", - "value": "QUtJQUlPU0ZPRE5ON0VYQU1QTEU=", - "decoded": "AKIAIOSFODNN7EXAMPLE", - "severity": "critical" - }, - { - "id": 14, - "file": "src/config.py", - "line": 10, - "difficulty": "medium", - "type": "api_token_hex", - "value": "6170695f746f6b656e5f616263313233787977373839", - "decoded": "api_token_abc123xyz789", - "severity": "high" - }, - { - "id": 15, - "file": "src/config.py", - "line": 16, - "difficulty": "medium", - "type": "database_password_concatenated", - "value": "MySecurePassword2024!", - "note": "Built from DB_PASS_PART1 + DB_PASS_PART2 + DB_PASS_PART3", - "severity": "critical" - }, - { - "id": 16, - "file": "scripts/deploy.sh", - "line": 5, - "difficulty": "medium", - "type": "api_key_export", - "value": "sk_prod_1234567890abcdefghijklmnopqrstuvwxyz", - "severity": "critical" - }, - { - "id": 17, - "file": "scripts/deploy.sh", - "line": 11, - "difficulty": "medium", - "type": "database_password_url_encoded", - "value": "mysql://admin:MyP%40ssw0rd%21@db.example.com:3306/prod", - "decoded": 
"mysql://admin:MyP@ssw0rd!@db.example.com:3306/prod", - "note": "In comment", - "severity": "critical" - }, - { - "id": 18, - "file": "config/keys.yaml", - "line": 6, - "difficulty": "medium", - "type": "rsa_private_key_multiline", - "value": "-----BEGIN RSA PRIVATE KEY-----", - "note": "Multi-line YAML literal block", - "severity": "critical" - }, - { - "id": 19, - "file": "config/keys.yaml", - "line": 11, - "difficulty": "medium", - "type": "api_token_unicode", - "value": "tĆøkęn_śęçrėt_ẃïth_ŭñïçődė_123456", - "severity": "high" - }, - { - "id": 20, - "file": "src/database.sql", - "line": 6, - "difficulty": "medium", - "type": "database_connection_string", - "value": "postgresql://admin:Pr0dDB_S3cr3t_P@ss@db.prod.example.com:5432/prod_db", - "note": "In SQL comment", - "severity": "critical" - }, - { - "id": 21, - "file": "config/legacy.ini", - "line": 3, - "difficulty": "medium", - "type": "database_password", - "value": "L3g@cy_DB_P@ssw0rd_2023", - "severity": "critical" - }, - { - "id": 22, - "file": "config/legacy.ini", - "line": 7, - "difficulty": "medium", - "type": "api_key_commented", - "value": "backup_key_xyz789abc123def456ghi", - "note": "Commented backup key", - "severity": "high" - }, - { - "id": 23, - "file": "src/obfuscated.py", - "line": 7, - "difficulty": "hard", - "type": "stripe_key_rot13", - "value": "fx_yvir_frperg_xrl_12345", - "decoded": "sk_live_secret_key_12345", - "severity": "critical" - }, - { - "id": 24, - "file": "src/obfuscated.py", - "line": 10, - "difficulty": "hard", - "type": "github_token_binary", - "value": "b'\\x67\\x68\\x70\\x5f\\x4d\\x79\\x47\\x69\\x74\\x48\\x75\\x62\\x54\\x6f\\x6b\\x65\\x6e\\x31\\x32\\x33\\x34\\x35\\x36'", - "decoded": "ghp_MyGitHubToken123456", - "severity": "critical" - }, - { - "id": 25, - "file": "src/obfuscated.py", - "line": 13, - "difficulty": "hard", - "type": "aws_secret_char_array", - "value": "['A','W','S','_','S','E','C','R','E','T','_','K','E','Y','_','X','Y','Z','7','8','9']", - "decoded": 
"AWS_SECRET_KEY_XYZ789", - "severity": "critical" - }, - { - "id": 26, - "file": "src/obfuscated.py", - "line": 17, - "difficulty": "hard", - "type": "api_token_reversed", - "value": "321cba_desrever_nekot_ipa", - "decoded": "api_token_reversed_abc123", - "severity": "high" - }, - { - "id": 27, - "file": "src/advanced.js", - "line": 4, - "difficulty": "hard", - "type": "secret_template_string", - "value": "sk_prod_template_key_xyz", - "note": "Built from template literals", - "severity": "critical" - }, - { - "id": 28, - "file": "src/advanced.js", - "line": 7, - "difficulty": "hard", - "type": "password_in_regex", - "value": "password_regex_secret_789", - "note": "Inside regex pattern", - "severity": "medium" - }, - { - "id": 29, - "file": "src/advanced.js", - "line": 10, - "difficulty": "hard", - "type": "api_key_xor", - "value": "[65,82,90,75,94,91,92,75,93,67,65,90,67,92,75,91,67,95]", - "decoded": "api_xor_secret_key", - "note": "XOR encrypted with key 42", - "severity": "critical" - }, - { - "id": 30, - "file": "src/advanced.js", - "line": 17, - "difficulty": "hard", - "type": "api_key_escaped_json", - "value": "sk_escaped_json_key_456", - "note": "Escaped JSON within string", - "severity": "high" - }, - { - "id": 31, - "file": "src/Crypto.go", - "line": 10, - "difficulty": "hard", - "type": "secret_in_heredoc", - "value": "golang_heredoc_secret_999", - "note": "In heredoc/multi-line string", - "severity": "high" - }, - { - "id": 32, - "file": "src/Crypto.go", - "line": 15, - "difficulty": "hard", - "type": "stripe_key_typo", - "value": "strippe_sk_live_corrected_key", - "decoded": "stripe_sk_live_corrected_key", - "note": "Intentional typo corrected programmatically", - "severity": "critical" - } - ], - "file_summary": { - ".env": 2, - "config/settings.py": 3, - "src/app.py": 1, - "config/database.yaml": 1, - "scripts/webhook.js": 1, - "config/app.properties": 1, - "id_rsa": 1, - "config/oauth.json": 1, - "src/Main.java": 1, - "src/config.py": 3, - 
"scripts/deploy.sh": 2, - "config/keys.yaml": 2, - "src/database.sql": 1, - "config/legacy.ini": 2, - "src/obfuscated.py": 4, - "src/advanced.js": 4, - "src/Crypto.go": 2 - }, - "notes": { - "easy_secrets": "Standard patterns that any decent secret scanner should detect", - "medium_secrets": "Slightly obfuscated - base64, hex, concatenated, or in comments", - "hard_secrets": "Well hidden - ROT13, binary, XOR, reversed, split across constructs" - } -} diff --git a/backend/benchmarks/category_configs.py b/backend/benchmarks/category_configs.py deleted file mode 100644 index 429a68f..0000000 --- a/backend/benchmarks/category_configs.py +++ /dev/null @@ -1,151 +0,0 @@ -""" -Category-specific benchmark configurations - -Defines expected metrics and performance thresholds for each module category. -""" - -from dataclasses import dataclass -from typing import List, Dict -from enum import Enum - - -class ModuleCategory(str, Enum): - """Module categories for benchmarking""" - FUZZER = "fuzzer" - SCANNER = "scanner" - ANALYZER = "analyzer" - SECRET_DETECTION = "secret_detection" - REPORTER = "reporter" - - -@dataclass -class CategoryBenchmarkConfig: - """Benchmark configuration for a module category""" - category: ModuleCategory - expected_metrics: List[str] - performance_thresholds: Dict[str, float] - description: str - - -# Fuzzer category configuration -FUZZER_CONFIG = CategoryBenchmarkConfig( - category=ModuleCategory.FUZZER, - expected_metrics=[ - "execs_per_sec", - "coverage_rate", - "time_to_first_crash", - "corpus_efficiency", - "execution_time", - "peak_memory_mb" - ], - performance_thresholds={ - "min_execs_per_sec": 1000, # Minimum executions per second - "max_execution_time_small": 10.0, # Max time for small project (seconds) - "max_execution_time_medium": 60.0, # Max time for medium project - "max_memory_mb": 2048, # Maximum memory usage - "min_coverage_rate": 1.0, # Minimum new coverage per second - }, - description="Fuzzing modules: coverage-guided fuzz 
testing" -) - -# Scanner category configuration -SCANNER_CONFIG = CategoryBenchmarkConfig( - category=ModuleCategory.SCANNER, - expected_metrics=[ - "files_per_sec", - "loc_per_sec", - "execution_time", - "peak_memory_mb", - "findings_count" - ], - performance_thresholds={ - "min_files_per_sec": 100, # Minimum files scanned per second - "min_loc_per_sec": 10000, # Minimum lines of code per second - "max_execution_time_small": 1.0, - "max_execution_time_medium": 10.0, - "max_memory_mb": 512, - }, - description="File scanning modules: fast pattern-based scanning" -) - -# Secret detection category configuration -SECRET_DETECTION_CONFIG = CategoryBenchmarkConfig( - category=ModuleCategory.SECRET_DETECTION, - expected_metrics=[ - "patterns_per_sec", - "precision", - "recall", - "f1_score", - "false_positive_rate", - "execution_time", - "peak_memory_mb" - ], - performance_thresholds={ - "min_patterns_per_sec": 1000, - "min_precision": 0.90, # 90% precision target - "min_recall": 0.95, # 95% recall target - "max_false_positives": 5, # Max false positives per 100 secrets - "max_execution_time_small": 2.0, - "max_execution_time_medium": 20.0, - "max_memory_mb": 1024, - }, - description="Secret detection modules: high precision pattern matching" -) - -# Analyzer category configuration -ANALYZER_CONFIG = CategoryBenchmarkConfig( - category=ModuleCategory.ANALYZER, - expected_metrics=[ - "analysis_depth", - "files_analyzed_per_sec", - "execution_time", - "peak_memory_mb", - "findings_count", - "accuracy" - ], - performance_thresholds={ - "min_files_per_sec": 10, # Slower than scanners due to deep analysis - "max_execution_time_small": 5.0, - "max_execution_time_medium": 60.0, - "max_memory_mb": 2048, - "min_accuracy": 0.85, # 85% accuracy target - }, - description="Code analysis modules: deep semantic analysis" -) - -# Reporter category configuration -REPORTER_CONFIG = CategoryBenchmarkConfig( - category=ModuleCategory.REPORTER, - expected_metrics=[ - "report_generation_time", 
- "findings_per_sec", - "peak_memory_mb" - ], - performance_thresholds={ - "max_report_time_100_findings": 1.0, # Max 1 second for 100 findings - "max_report_time_1000_findings": 10.0, # Max 10 seconds for 1000 findings - "max_memory_mb": 256, - }, - description="Reporting modules: fast report generation" -) - - -# Category configurations map -CATEGORY_CONFIGS = { - ModuleCategory.FUZZER: FUZZER_CONFIG, - ModuleCategory.SCANNER: SCANNER_CONFIG, - ModuleCategory.SECRET_DETECTION: SECRET_DETECTION_CONFIG, - ModuleCategory.ANALYZER: ANALYZER_CONFIG, - ModuleCategory.REPORTER: REPORTER_CONFIG, -} - - -def get_category_config(category: ModuleCategory) -> CategoryBenchmarkConfig: - """Get benchmark configuration for a category""" - return CATEGORY_CONFIGS[category] - - -def get_threshold(category: ModuleCategory, metric: str) -> float: - """Get performance threshold for a specific metric""" - config = get_category_config(category) - return config.performance_thresholds.get(metric, 0.0) diff --git a/backend/benchmarks/conftest.py b/backend/benchmarks/conftest.py deleted file mode 100644 index 2710fb4..0000000 --- a/backend/benchmarks/conftest.py +++ /dev/null @@ -1,60 +0,0 @@ -""" -Benchmark fixtures and configuration -""" - -import sys -from pathlib import Path -import pytest - -# Add parent directories to path -BACKEND_ROOT = Path(__file__).resolve().parents[1] -TOOLBOX = BACKEND_ROOT / "toolbox" - -if str(BACKEND_ROOT) not in sys.path: - sys.path.insert(0, str(BACKEND_ROOT)) -if str(TOOLBOX) not in sys.path: - sys.path.insert(0, str(TOOLBOX)) - - -# ============================================================================ -# Benchmark Fixtures -# ============================================================================ - -@pytest.fixture(scope="session") -def benchmark_fixtures_dir(): - """Path to benchmark fixtures directory""" - return Path(__file__).parent / "fixtures" - - -@pytest.fixture(scope="session") -def small_project_fixture(benchmark_fixtures_dir): - 
"""Small project fixture (~1K LOC)""" - return benchmark_fixtures_dir / "small" - - -@pytest.fixture(scope="session") -def medium_project_fixture(benchmark_fixtures_dir): - """Medium project fixture (~10K LOC)""" - return benchmark_fixtures_dir / "medium" - - -@pytest.fixture(scope="session") -def large_project_fixture(benchmark_fixtures_dir): - """Large project fixture (~100K LOC)""" - return benchmark_fixtures_dir / "large" - - -# ============================================================================ -# pytest-benchmark Configuration -# ============================================================================ - -def pytest_configure(config): - """Configure pytest-benchmark""" - config.addinivalue_line( - "markers", "benchmark: mark test as a benchmark" - ) - - -def pytest_benchmark_group_stats(config, benchmarks, group_by): - """Group benchmark results by category""" - return group_by diff --git a/backend/mcp-config.json b/backend/mcp-config.json deleted file mode 100644 index 4f06ce4..0000000 --- a/backend/mcp-config.json +++ /dev/null @@ -1,121 +0,0 @@ -{ - "name": "FuzzForge Security Testing Platform", - "description": "MCP server for FuzzForge security testing workflows via Docker Compose", - "version": "0.6.0", - "connection": { - "type": "http", - "host": "localhost", - "port": 8010, - "base_url": "http://localhost:8010", - "mcp_endpoint": "/mcp" - }, - "docker_compose": { - "service": "fuzzforge-backend", - "command": "docker compose up -d", - "health_check": "http://localhost:8000/health" - }, - "capabilities": { - "tools": [ - { - "name": "submit_security_scan_mcp", - "description": "Submit a security scanning workflow for execution", - "parameters": { - "workflow_name": "string", - "target_path": "string", - "parameters": "object" - } - }, - { - "name": "get_comprehensive_scan_summary", - "description": "Get a comprehensive summary of scan results with analysis", - "parameters": { - "run_id": "string" - } - } - ], - "fastapi_routes": [ - { - 
"method": "GET", - "path": "/", - "description": "Get API status and loaded workflows count" - }, - { - "method": "GET", - "path": "/workflows/", - "description": "List all available security testing workflows" - }, - { - "method": "POST", - "path": "/workflows/{workflow_name}/submit", - "description": "Submit a security scanning workflow for execution" - }, - { - "method": "GET", - "path": "/runs/{run_id}/status", - "description": "Get the current status of a security scan run" - }, - { - "method": "GET", - "path": "/runs/{run_id}/findings", - "description": "Get security findings from a completed scan" - }, - { - "method": "GET", - "path": "/fuzzing/{run_id}/stats", - "description": "Get fuzzing statistics for a run" - } - ] - }, - "examples": { - "start_infrastructure_scan": { - "description": "Run infrastructure security scan on a project", - "steps": [ - "1. Start Docker Compose: docker compose up -d", - "2. Submit scan via MCP tool: submit_security_scan_mcp", - "3. Monitor status and get results" - ], - "workflow_name": "infrastructure_scan", - "target_path": "/Users/tduhamel/Documents/FuzzingLabs/fuzzforge_alpha/test_projects/infrastructure_vulnerable", - "parameters": { - "checkov_config": { - "severity": ["HIGH", "MEDIUM", "LOW"] - }, - "hadolint_config": { - "severity": ["error", "warning", "info", "style"] - } - } - }, - "static_analysis_scan": { - "description": "Run static analysis security scan", - "workflow_name": "static_analysis_scan", - "target_path": "/Users/tduhamel/Documents/FuzzingLabs/fuzzforge_alpha/test_projects/static_analysis_vulnerable", - "parameters": { - "bandit_config": { - "severity": ["HIGH", "MEDIUM", "LOW"] - }, - "opengrep_config": { - "severity": ["HIGH", "MEDIUM", "LOW"] - } - } - }, - "secret_detection_scan": { - "description": "Run secret detection scan", - "workflow_name": "secret_detection_scan", - "target_path": "/Users/tduhamel/Documents/FuzzingLabs/fuzzforge_alpha/test_projects/secret_detection_vulnerable", - 
"parameters": { - "trufflehog_config": { - "verified_only": false - }, - "gitleaks_config": { - "no_git": true - } - } - } - }, - "usage": { - "via_mcp": "Connect MCP client to http://localhost:8010/mcp after starting Docker Compose", - "via_api": "Use FastAPI endpoints directly at http://localhost:8000", - "start_system": "docker compose up -d", - "stop_system": "docker compose down" - } -} diff --git a/backend/pyproject.toml b/backend/pyproject.toml deleted file mode 100644 index 595d473..0000000 --- a/backend/pyproject.toml +++ /dev/null @@ -1,41 +0,0 @@ -[project] -name = "backend" -version = "0.7.3" -description = "FuzzForge OSS backend" -authors = [] -readme = "README.md" -requires-python = ">=3.11" -dependencies = [ - "fastapi>=0.116.1", - "temporalio>=1.6.0", - "boto3>=1.34.0", - "pydantic>=2.0.0", - "pyyaml>=6.0", - "docker>=7.0.0", - "aiofiles>=23.0.0", - "uvicorn>=0.30.0", - "aiohttp>=3.12.15", - "fastmcp", -] - -[project.optional-dependencies] -dev = [ - "pytest>=8.0.0", - "pytest-asyncio>=0.23.0", - "pytest-benchmark>=4.0.0", - "pytest-cov>=5.0.0", - "pytest-xdist>=3.5.0", - "pytest-mock>=3.12.0", - "httpx>=0.27.0", - "ruff>=0.1.0", -] - -[tool.pytest.ini_options] -asyncio_mode = "auto" -testpaths = ["tests", "benchmarks"] -python_files = ["test_*.py", "bench_*.py"] -python_classes = ["Test*"] -python_functions = ["test_*"] -markers = [ - "benchmark: mark test as a benchmark", -] diff --git a/backend/src/__init__.py b/backend/src/__init__.py deleted file mode 100644 index 43bcfe7..0000000 --- a/backend/src/__init__.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. 
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - diff --git a/backend/src/api/__init__.py b/backend/src/api/__init__.py deleted file mode 100644 index 43bcfe7..0000000 --- a/backend/src/api/__init__.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - diff --git a/backend/src/api/fuzzing.py b/backend/src/api/fuzzing.py deleted file mode 100644 index 166319a..0000000 --- a/backend/src/api/fuzzing.py +++ /dev/null @@ -1,325 +0,0 @@ -""" -API endpoints for fuzzing workflow management and real-time monitoring -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- -import logging -from typing import List, Dict -from fastapi import APIRouter, HTTPException, WebSocket, WebSocketDisconnect -from fastapi.responses import StreamingResponse -import asyncio -import json -from datetime import datetime - -from src.models.findings import ( - FuzzingStats, - CrashReport -) - -logger = logging.getLogger(__name__) - -router = APIRouter(prefix="/fuzzing", tags=["fuzzing"]) - -# In-memory storage for real-time stats (in production, use Redis or similar) -fuzzing_stats: Dict[str, FuzzingStats] = {} -crash_reports: Dict[str, List[CrashReport]] = {} -active_connections: Dict[str, List[WebSocket]] = {} - - -def initialize_fuzzing_tracking(run_id: str, workflow_name: str): - """ - Initialize fuzzing tracking for a new run. - - This function should be called when a workflow is submitted to enable - real-time monitoring and stats collection. - - Args: - run_id: The run identifier - workflow_name: Name of the workflow - """ - fuzzing_stats[run_id] = FuzzingStats( - run_id=run_id, - workflow=workflow_name - ) - crash_reports[run_id] = [] - active_connections[run_id] = [] - - -@router.get("/{run_id}/stats", response_model=FuzzingStats) -async def get_fuzzing_stats(run_id: str) -> FuzzingStats: - """ - Get current fuzzing statistics for a run. - - Args: - run_id: The fuzzing run ID - - Returns: - Current fuzzing statistics - - Raises: - HTTPException: 404 if run not found - """ - if run_id not in fuzzing_stats: - raise HTTPException( - status_code=404, - detail=f"Fuzzing run not found: {run_id}" - ) - - return fuzzing_stats[run_id] - - -@router.get("/{run_id}/crashes", response_model=List[CrashReport]) -async def get_crash_reports(run_id: str) -> List[CrashReport]: - """ - Get crash reports for a fuzzing run. 
- - Args: - run_id: The fuzzing run ID - - Returns: - List of crash reports - - Raises: - HTTPException: 404 if run not found - """ - if run_id not in crash_reports: - raise HTTPException( - status_code=404, - detail=f"Fuzzing run not found: {run_id}" - ) - - return crash_reports[run_id] - - -@router.post("/{run_id}/stats") -async def update_fuzzing_stats(run_id: str, stats: FuzzingStats): - """ - Update fuzzing statistics (called by fuzzing workflows). - - Args: - run_id: The fuzzing run ID - stats: Updated statistics - - Raises: - HTTPException: 404 if run not found - """ - if run_id not in fuzzing_stats: - raise HTTPException( - status_code=404, - detail=f"Fuzzing run not found: {run_id}" - ) - - # Update stats - fuzzing_stats[run_id] = stats - - # Debug: log reception for live instrumentation - try: - logger.info( - "Received fuzzing stats update: run_id=%s exec=%s eps=%.2f crashes=%s corpus=%s coverage=%s elapsed=%ss", - run_id, - stats.executions, - stats.executions_per_sec, - stats.crashes, - stats.corpus_size, - stats.coverage, - stats.elapsed_time, - ) - except Exception: - pass - - # Notify connected WebSocket clients - if run_id in active_connections: - message = { - "type": "stats_update", - "data": stats.model_dump() - } - for websocket in active_connections[run_id][:]: # Copy to avoid modification during iteration - try: - await websocket.send_text(json.dumps(message)) - except Exception: - # Remove disconnected clients - active_connections[run_id].remove(websocket) - - -@router.post("/{run_id}/crash") -async def report_crash(run_id: str, crash: CrashReport): - """ - Report a new crash (called by fuzzing workflows). 
- - Args: - run_id: The fuzzing run ID - crash: Crash report details - """ - if run_id not in crash_reports: - crash_reports[run_id] = [] - - # Add crash report - crash_reports[run_id].append(crash) - - # Update stats - if run_id in fuzzing_stats: - fuzzing_stats[run_id].crashes += 1 - fuzzing_stats[run_id].last_crash_time = crash.timestamp - - # Notify connected WebSocket clients - if run_id in active_connections: - message = { - "type": "crash_report", - "data": crash.model_dump() - } - for websocket in active_connections[run_id][:]: - try: - await websocket.send_text(json.dumps(message)) - except Exception: - active_connections[run_id].remove(websocket) - - -@router.websocket("/{run_id}/live") -async def websocket_endpoint(websocket: WebSocket, run_id: str): - """ - WebSocket endpoint for real-time fuzzing updates. - - Args: - websocket: WebSocket connection - run_id: The fuzzing run ID to monitor - """ - await websocket.accept() - - # Initialize connection tracking - if run_id not in active_connections: - active_connections[run_id] = [] - active_connections[run_id].append(websocket) - - try: - # Send current stats on connection - if run_id in fuzzing_stats: - current = fuzzing_stats[run_id] - if isinstance(current, dict): - payload = current - elif hasattr(current, "model_dump"): - payload = current.model_dump() - elif hasattr(current, "dict"): - payload = current.dict() - else: - payload = getattr(current, "__dict__", {"run_id": run_id}) - message = {"type": "stats_update", "data": payload} - await websocket.send_text(json.dumps(message)) - - # Keep connection alive - while True: - try: - # Wait for ping or handle disconnect - data = await asyncio.wait_for(websocket.receive_text(), timeout=30.0) - # Echo back for ping-pong - if data == "ping": - await websocket.send_text("pong") - except asyncio.TimeoutError: - # Send periodic heartbeat - await websocket.send_text(json.dumps({"type": "heartbeat"})) - - except WebSocketDisconnect: - # Clean up connection - if 
run_id in active_connections and websocket in active_connections[run_id]: - active_connections[run_id].remove(websocket) - except Exception as e: - logger.error(f"WebSocket error for run {run_id}: {e}") - if run_id in active_connections and websocket in active_connections[run_id]: - active_connections[run_id].remove(websocket) - - -@router.get("/{run_id}/stream") -async def stream_fuzzing_updates(run_id: str): - """ - Server-Sent Events endpoint for real-time fuzzing updates. - - Args: - run_id: The fuzzing run ID to monitor - - Returns: - Streaming response with real-time updates - """ - if run_id not in fuzzing_stats: - raise HTTPException( - status_code=404, - detail=f"Fuzzing run not found: {run_id}" - ) - - async def event_stream(): - """Generate server-sent events for fuzzing updates""" - last_stats_time = datetime.utcnow() - - while True: - try: - # Send current stats - if run_id in fuzzing_stats: - current_stats = fuzzing_stats[run_id] - if isinstance(current_stats, dict): - stats_payload = current_stats - elif hasattr(current_stats, "model_dump"): - stats_payload = current_stats.model_dump() - elif hasattr(current_stats, "dict"): - stats_payload = current_stats.dict() - else: - stats_payload = getattr(current_stats, "__dict__", {"run_id": run_id}) - event_data = f"data: {json.dumps({'type': 'stats', 'data': stats_payload})}\n\n" - yield event_data - - # Send recent crashes - if run_id in crash_reports: - recent_crashes = [ - crash for crash in crash_reports[run_id] - if crash.timestamp > last_stats_time - ] - for crash in recent_crashes: - event_data = f"data: {json.dumps({'type': 'crash', 'data': crash.model_dump()})}\n\n" - yield event_data - - last_stats_time = datetime.utcnow() - await asyncio.sleep(5) # Update every 5 seconds - - except Exception as e: - logger.error(f"Error in event stream for run {run_id}: {e}") - break - - return StreamingResponse( - event_stream(), - media_type="text/event-stream", - headers={ - "Cache-Control": "no-cache", - 
"Connection": "keep-alive", - } - ) - - -@router.delete("/{run_id}") -async def cleanup_fuzzing_run(run_id: str): - """ - Clean up fuzzing run data. - - Args: - run_id: The fuzzing run ID to clean up - """ - # Clean up tracking data - fuzzing_stats.pop(run_id, None) - crash_reports.pop(run_id, None) - - # Close any active WebSocket connections - if run_id in active_connections: - for websocket in active_connections[run_id]: - try: - await websocket.close() - except Exception: - pass - del active_connections[run_id] - - return {"message": f"Cleaned up fuzzing run {run_id}"} diff --git a/backend/src/api/runs.py b/backend/src/api/runs.py deleted file mode 100644 index b975f4b..0000000 --- a/backend/src/api/runs.py +++ /dev/null @@ -1,183 +0,0 @@ -""" -API endpoints for workflow run management and findings retrieval -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import logging -from fastapi import APIRouter, HTTPException, Depends - -from src.models.findings import WorkflowFindings, WorkflowStatus - -logger = logging.getLogger(__name__) - -router = APIRouter(prefix="/runs", tags=["runs"]) - - -def get_temporal_manager(): - """Dependency to get the Temporal manager instance""" - from src.main import temporal_mgr - return temporal_mgr - - -@router.get("/{run_id}/status", response_model=WorkflowStatus) -async def get_run_status( - run_id: str, - temporal_mgr=Depends(get_temporal_manager) -) -> WorkflowStatus: - """ - Get the current status of a workflow run. 
- - Args: - run_id: The workflow run ID - - Returns: - Status information including state, timestamps, and completion flags - - Raises: - HTTPException: 404 if run not found - """ - try: - status = await temporal_mgr.get_workflow_status(run_id) - - # Map Temporal status to response format - workflow_status = status.get("status", "UNKNOWN") - is_completed = workflow_status in ["COMPLETED", "FAILED", "CANCELLED"] - is_failed = workflow_status == "FAILED" - is_running = workflow_status == "RUNNING" - - # Extract workflow name from run_id (format: workflow_name-unique_id) - workflow_name = run_id.rsplit('-', 1)[0] if '-' in run_id else "unknown" - - return WorkflowStatus( - run_id=run_id, - workflow=workflow_name, - status=workflow_status, - is_completed=is_completed, - is_failed=is_failed, - is_running=is_running, - created_at=status.get("start_time"), - updated_at=status.get("close_time") or status.get("execution_time") - ) - - except Exception as e: - logger.error(f"Failed to get status for run {run_id}: {e}") - raise HTTPException( - status_code=404, - detail=f"Run not found: {run_id}" - ) - - -@router.get("/{run_id}/findings", response_model=WorkflowFindings) -async def get_run_findings( - run_id: str, - temporal_mgr=Depends(get_temporal_manager) -) -> WorkflowFindings: - """ - Get the findings from a completed workflow run. - - Args: - run_id: The workflow run ID - - Returns: - SARIF-formatted findings from the workflow execution - - Raises: - HTTPException: 404 if run not found, 400 if run not completed - """ - try: - # Get run status first - status = await temporal_mgr.get_workflow_status(run_id) - workflow_status = status.get("status", "UNKNOWN") - - if workflow_status not in ["COMPLETED", "FAILED", "CANCELLED"]: - if workflow_status == "RUNNING": - raise HTTPException( - status_code=400, - detail=f"Run {run_id} is still running. Current status: {workflow_status}" - ) - else: - raise HTTPException( - status_code=400, - detail=f"Run {run_id} not completed. 
Status: {workflow_status}" - ) - - if workflow_status == "FAILED": - raise HTTPException( - status_code=400, - detail=f"Run {run_id} failed. Status: {workflow_status}" - ) - - # Get the workflow result - result = await temporal_mgr.get_workflow_result(run_id) - - # Extract SARIF from result (handle None for backwards compatibility) - if isinstance(result, dict): - sarif = result.get("sarif") or {} - else: - sarif = {} - - # Extract workflow name from run_id (format: workflow_name-unique_id) - workflow_name = run_id.rsplit('-', 1)[0] if '-' in run_id else "unknown" - - # Metadata - metadata = { - "completion_time": status.get("close_time"), - "workflow_version": "unknown" - } - - return WorkflowFindings( - workflow=workflow_name, - run_id=run_id, - sarif=sarif, - metadata=metadata - ) - - except HTTPException: - raise - except Exception as e: - logger.error(f"Failed to get findings for run {run_id}: {e}") - raise HTTPException( - status_code=500, - detail=f"Failed to retrieve findings: {str(e)}" - ) - - -@router.get("/{workflow_name}/findings/{run_id}", response_model=WorkflowFindings) -async def get_workflow_findings( - workflow_name: str, - run_id: str, - temporal_mgr=Depends(get_temporal_manager) -) -> WorkflowFindings: - """ - Get findings for a specific workflow run. - - Alternative endpoint that includes workflow name in the path for clarity. 
- - Args: - workflow_name: Name of the workflow - run_id: The workflow run ID - - Returns: - SARIF-formatted findings from the workflow execution - - Raises: - HTTPException: 404 if workflow or run not found, 400 if run not completed - """ - if workflow_name not in temporal_mgr.workflows: - raise HTTPException( - status_code=404, - detail=f"Workflow not found: {workflow_name}" - ) - - # Delegate to the main findings endpoint - return await get_run_findings(run_id, temporal_mgr) diff --git a/backend/src/api/system.py b/backend/src/api/system.py deleted file mode 100644 index a4ee1a6..0000000 --- a/backend/src/api/system.py +++ /dev/null @@ -1,47 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -""" -System information endpoints for FuzzForge API. - -Provides system configuration and filesystem paths to CLI for worker management. -""" - -import os -from typing import Dict - -from fastapi import APIRouter - -router = APIRouter(prefix="/system", tags=["system"]) - - -@router.get("/info") -async def get_system_info() -> Dict[str, str]: - """ - Get system information including host filesystem paths. - - This endpoint exposes paths needed by the CLI to manage workers via docker-compose. - The FUZZFORGE_HOST_ROOT environment variable is set by docker-compose and points - to the FuzzForge installation directory on the host machine. 
- - Returns: - Dictionary containing: - - host_root: Absolute path to FuzzForge root on host - - docker_compose_path: Path to docker-compose.yml on host - - workers_dir: Path to workers directory on host - """ - host_root = os.getenv("FUZZFORGE_HOST_ROOT", "") - - return { - "host_root": host_root, - "docker_compose_path": f"{host_root}/docker-compose.yml" if host_root else "", - "workers_dir": f"{host_root}/workers" if host_root else "", - } diff --git a/backend/src/api/workflows.py b/backend/src/api/workflows.py deleted file mode 100644 index a4d1b7c..0000000 --- a/backend/src/api/workflows.py +++ /dev/null @@ -1,667 +0,0 @@ -""" -API endpoints for workflow management with enhanced error handling -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
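The deleted `/system/info` handler above derives every host path from a single `FUZZFORGE_HOST_ROOT` environment variable. A minimal standalone sketch of that derivation, for reference while reviewing this removal (the helper name is illustrative; the env-var fallback behavior is taken from the handler above):

```python
import os


def derive_host_paths() -> dict:
    # FUZZFORGE_HOST_ROOT is assumed to be injected by docker-compose;
    # when it is unset, all derived paths fall back to empty strings,
    # matching the deleted endpoint's behavior.
    host_root = os.getenv("FUZZFORGE_HOST_ROOT", "")
    return {
        "host_root": host_root,
        "docker_compose_path": f"{host_root}/docker-compose.yml" if host_root else "",
        "workers_dir": f"{host_root}/workers" if host_root else "",
    }
```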
- -import logging -import traceback -import tempfile -from typing import List, Dict, Any, Optional -from fastapi import APIRouter, HTTPException, Depends, UploadFile, File, Form -from pathlib import Path - -from src.models.findings import ( - WorkflowSubmission, - WorkflowMetadata, - WorkflowListItem, - RunSubmissionResponse -) -from src.temporal.discovery import WorkflowDiscovery - -logger = logging.getLogger(__name__) - -# Configuration for file uploads -MAX_UPLOAD_SIZE = 10 * 1024 * 1024 * 1024 # 10 GB -ALLOWED_CONTENT_TYPES = [ - "application/gzip", - "application/x-gzip", - "application/x-tar", - "application/x-compressed-tar", - "application/octet-stream", # Generic binary -] - -router = APIRouter(prefix="/workflows", tags=["workflows"]) - - -def extract_defaults_from_json_schema(metadata: Dict[str, Any]) -> Dict[str, Any]: - """ - Extract default parameter values from JSON Schema format. - - Converts from: - parameters: - properties: - param_name: - default: value - - To: - {param_name: value} - - Args: - metadata: Workflow metadata dictionary - - Returns: - Dictionary of parameter defaults - """ - defaults = {} - - # Check if there's a legacy default_parameters field - if "default_parameters" in metadata: - defaults.update(metadata["default_parameters"]) - - # Extract defaults from JSON Schema parameters - parameters = metadata.get("parameters", {}) - properties = parameters.get("properties", {}) - - for param_name, param_spec in properties.items(): - if "default" in param_spec: - defaults[param_name] = param_spec["default"] - - return defaults - - -def create_structured_error_response( - error_type: str, - message: str, - workflow_name: Optional[str] = None, - run_id: Optional[str] = None, - container_info: Optional[Dict[str, Any]] = None, - deployment_info: Optional[Dict[str, Any]] = None, - suggestions: Optional[List[str]] = None -) -> Dict[str, Any]: - """Create a structured error response with rich context.""" - error_response = { - "error": { - 
"type": error_type, - "message": message, - "timestamp": __import__("datetime").datetime.utcnow().isoformat() + "Z" - } - } - - if workflow_name: - error_response["error"]["workflow_name"] = workflow_name - - if run_id: - error_response["error"]["run_id"] = run_id - - if container_info: - error_response["error"]["container"] = container_info - - if deployment_info: - error_response["error"]["deployment"] = deployment_info - - if suggestions: - error_response["error"]["suggestions"] = suggestions - - return error_response - - -def get_temporal_manager(): - """Dependency to get the Temporal manager instance""" - from src.main import temporal_mgr - return temporal_mgr - - -@router.get("/", response_model=List[WorkflowListItem]) -async def list_workflows( - temporal_mgr=Depends(get_temporal_manager) -) -> List[WorkflowListItem]: - """ - List all discovered workflows with their metadata. - - Returns a summary of each workflow including name, version, description, - author, and tags. - """ - workflows = [] - for name, info in temporal_mgr.workflows.items(): - workflows.append(WorkflowListItem( - name=name, - version=info.metadata.get("version", "0.6.0"), - description=info.metadata.get("description", ""), - author=info.metadata.get("author"), - tags=info.metadata.get("tags", []) - )) - - return workflows - - -@router.get("/metadata/schema") -async def get_metadata_schema() -> Dict[str, Any]: - """ - Get the JSON schema for workflow metadata files. - - This schema defines the structure and requirements for metadata.yaml files - that must accompany each workflow. - """ - return WorkflowDiscovery.get_metadata_schema() - - -@router.get("/{workflow_name}/metadata", response_model=WorkflowMetadata) -async def get_workflow_metadata( - workflow_name: str, - temporal_mgr=Depends(get_temporal_manager) -) -> WorkflowMetadata: - """ - Get complete metadata for a specific workflow. 
- - Args: - workflow_name: Name of the workflow - - Returns: - Complete metadata including parameters schema, supported volume modes, - required modules, and more. - - Raises: - HTTPException: 404 if workflow not found - """ - if workflow_name not in temporal_mgr.workflows: - available_workflows = list(temporal_mgr.workflows.keys()) - error_response = create_structured_error_response( - error_type="WorkflowNotFound", - message=f"Workflow '{workflow_name}' not found", - workflow_name=workflow_name, - suggestions=[ - f"Available workflows: {', '.join(available_workflows)}", - "Use GET /workflows/ to see all available workflows", - "Check workflow name spelling and case sensitivity" - ] - ) - raise HTTPException( - status_code=404, - detail=error_response - ) - - info = temporal_mgr.workflows[workflow_name] - metadata = info.metadata - - return WorkflowMetadata( - name=workflow_name, - version=metadata.get("version", "0.6.0"), - description=metadata.get("description", ""), - author=metadata.get("author"), - tags=metadata.get("tags", []), - parameters=metadata.get("parameters", {}), - default_parameters=extract_defaults_from_json_schema(metadata), - required_modules=metadata.get("required_modules", []) - ) - - -@router.post("/{workflow_name}/submit", response_model=RunSubmissionResponse) -async def submit_workflow( - workflow_name: str, - submission: WorkflowSubmission, - temporal_mgr=Depends(get_temporal_manager) -) -> RunSubmissionResponse: - """ - Submit a workflow for execution. 
- - Args: - workflow_name: Name of the workflow to execute - submission: Submission parameters including target path and parameters - - Returns: - Run submission response with run_id and initial status - - Raises: - HTTPException: 404 if workflow not found, 400 for invalid parameters - """ - if workflow_name not in temporal_mgr.workflows: - available_workflows = list(temporal_mgr.workflows.keys()) - error_response = create_structured_error_response( - error_type="WorkflowNotFound", - message=f"Workflow '{workflow_name}' not found", - workflow_name=workflow_name, - suggestions=[ - f"Available workflows: {', '.join(available_workflows)}", - "Use GET /workflows/ to see all available workflows", - "Check workflow name spelling and case sensitivity" - ] - ) - raise HTTPException( - status_code=404, - detail=error_response - ) - - try: - # Upload target file to MinIO and get target_id - target_path = Path(submission.target_path) - if not target_path.exists(): - raise ValueError(f"Target path does not exist: {submission.target_path}") - - # Upload target (using anonymous user for now) - target_id = await temporal_mgr.upload_target( - file_path=target_path, - user_id="api-user", - metadata={"workflow": workflow_name} - ) - - # Merge default parameters with user parameters - workflow_info = temporal_mgr.workflows[workflow_name] - metadata = workflow_info.metadata or {} - defaults = extract_defaults_from_json_schema(metadata) - user_params = submission.parameters or {} - workflow_params = {**defaults, **user_params} - - # Start workflow execution - handle = await temporal_mgr.run_workflow( - workflow_name=workflow_name, - target_id=target_id, - workflow_params=workflow_params - ) - - run_id = handle.id - - # Initialize fuzzing tracking if this looks like a fuzzing workflow - workflow_info = temporal_mgr.workflows.get(workflow_name, {}) - workflow_tags = workflow_info.metadata.get("tags", []) if hasattr(workflow_info, 'metadata') else [] - if "fuzzing" in workflow_tags or 
"fuzz" in workflow_name.lower(): - from src.api.fuzzing import initialize_fuzzing_tracking - initialize_fuzzing_tracking(run_id, workflow_name) - - return RunSubmissionResponse( - run_id=run_id, - status="RUNNING", - workflow=workflow_name, - message=f"Workflow '{workflow_name}' submitted successfully" - ) - - except ValueError as e: - # Parameter validation errors - error_response = create_structured_error_response( - error_type="ValidationError", - message=str(e), - workflow_name=workflow_name, - suggestions=[ - "Check parameter types and values", - "Use GET /workflows/{workflow_name}/parameters for schema", - "Ensure all required parameters are provided" - ] - ) - raise HTTPException(status_code=400, detail=error_response) - - except Exception as e: - logger.error(f"Failed to submit workflow '{workflow_name}': {e}") - logger.error(f"Traceback: {traceback.format_exc()}") - - # Try to get more context about the error - container_info = None - deployment_info = None - suggestions = [] - - error_message = str(e) - error_type = "WorkflowSubmissionError" - - # Detect specific error patterns - if "workflow" in error_message.lower() and "not found" in error_message.lower(): - error_type = "WorkflowError" - suggestions.extend([ - "Check if Temporal server is running and accessible", - "Verify workflow workers are running", - "Check if workflow is registered with correct vertical", - "Ensure Docker is running and has sufficient resources" - ]) - - elif "volume" in error_message.lower() or "mount" in error_message.lower(): - error_type = "VolumeError" - suggestions.extend([ - "Check if the target path exists and is accessible", - "Verify file permissions (Docker needs read access)", - "Ensure the path is not in use by another process", - "Try using an absolute path instead of relative path" - ]) - - elif "memory" in error_message.lower() or "resource" in error_message.lower(): - error_type = "ResourceError" - suggestions.extend([ - "Check system memory and CPU 
availability", - "Consider reducing resource limits or dataset size", - "Monitor Docker resource usage", - "Increase Docker memory limits if needed" - ]) - - elif "image" in error_message.lower(): - error_type = "ImageError" - suggestions.extend([ - "Check if the workflow image exists", - "Verify Docker registry access", - "Try rebuilding the workflow image", - "Check network connectivity to registries" - ]) - - else: - suggestions.extend([ - "Check FuzzForge backend logs for details", - "Verify all services are running (docker-compose up -d)", - "Try restarting the workflow deployment", - "Contact support if the issue persists" - ]) - - error_response = create_structured_error_response( - error_type=error_type, - message=f"Failed to submit workflow: {error_message}", - workflow_name=workflow_name, - container_info=container_info, - deployment_info=deployment_info, - suggestions=suggestions - ) - - raise HTTPException( - status_code=500, - detail=error_response - ) - - -@router.post("/{workflow_name}/upload-and-submit", response_model=RunSubmissionResponse) -async def upload_and_submit_workflow( - workflow_name: str, - file: UploadFile = File(..., description="Target file or tarball to analyze"), - parameters: Optional[str] = Form(None, description="JSON-encoded workflow parameters"), - timeout: Optional[int] = Form(None, description="Timeout in seconds"), - temporal_mgr=Depends(get_temporal_manager) -) -> RunSubmissionResponse: - """ - Upload a target file/tarball and submit workflow for execution. - - This endpoint accepts multipart/form-data uploads and is the recommended - way to submit workflows from remote CLI clients. 
- - Args: - workflow_name: Name of the workflow to execute - file: Target file or tarball (compressed directory) - parameters: JSON string of workflow parameters (optional) - timeout: Execution timeout in seconds (optional) - - Returns: - Run submission response with run_id and initial status - - Raises: - HTTPException: 404 if workflow not found, 400 for invalid parameters, - 413 if file too large - """ - if workflow_name not in temporal_mgr.workflows: - available_workflows = list(temporal_mgr.workflows.keys()) - error_response = create_structured_error_response( - error_type="WorkflowNotFound", - message=f"Workflow '{workflow_name}' not found", - workflow_name=workflow_name, - suggestions=[ - f"Available workflows: {', '.join(available_workflows)}", - "Use GET /workflows/ to see all available workflows" - ] - ) - raise HTTPException(status_code=404, detail=error_response) - - temp_file_path = None - - try: - # Validate file size - file_size = 0 - chunk_size = 1024 * 1024 # 1MB chunks - - # Create temporary file - temp_fd, temp_file_path = tempfile.mkstemp(suffix=".tar.gz") - - logger.info(f"Receiving file upload for workflow '{workflow_name}': {file.filename}") - - # Stream file to disk - with open(temp_fd, 'wb') as temp_file: - while True: - chunk = await file.read(chunk_size) - if not chunk: - break - - file_size += len(chunk) - - # Check size limit - if file_size > MAX_UPLOAD_SIZE: - raise HTTPException( - status_code=413, - detail=create_structured_error_response( - error_type="FileTooLarge", - message=f"File size exceeds maximum allowed size of {MAX_UPLOAD_SIZE / (1024**3):.1f} GB", - workflow_name=workflow_name, - suggestions=[ - "Reduce the size of your target directory", - "Exclude unnecessary files (build artifacts, dependencies, etc.)", - "Consider splitting into smaller analysis targets" - ] - ) - ) - - temp_file.write(chunk) - - logger.info(f"Received file: {file_size / (1024**2):.2f} MB") - - # Parse parameters - workflow_params = {} - if parameters: 
- try: - import json - workflow_params = json.loads(parameters) - if not isinstance(workflow_params, dict): - raise ValueError("Parameters must be a JSON object") - except (json.JSONDecodeError, ValueError) as e: - raise HTTPException( - status_code=400, - detail=create_structured_error_response( - error_type="InvalidParameters", - message=f"Invalid parameters JSON: {e}", - workflow_name=workflow_name, - suggestions=["Ensure parameters is valid JSON object"] - ) - ) - - # Upload to MinIO - target_id = await temporal_mgr.upload_target( - file_path=Path(temp_file_path), - user_id="api-user", - metadata={ - "workflow": workflow_name, - "original_filename": file.filename, - "upload_method": "multipart" - } - ) - - logger.info(f"Uploaded to MinIO with target_id: {target_id}") - - # Merge default parameters with user parameters - workflow_info = temporal_mgr.workflows.get(workflow_name) - metadata = workflow_info.metadata or {} - defaults = extract_defaults_from_json_schema(metadata) - workflow_params = {**defaults, **workflow_params} - - # Start workflow execution - handle = await temporal_mgr.run_workflow( - workflow_name=workflow_name, - target_id=target_id, - workflow_params=workflow_params - ) - - run_id = handle.id - - # Initialize fuzzing tracking if needed - workflow_info = temporal_mgr.workflows.get(workflow_name, {}) - workflow_tags = workflow_info.metadata.get("tags", []) if hasattr(workflow_info, 'metadata') else [] - if "fuzzing" in workflow_tags or "fuzz" in workflow_name.lower(): - from src.api.fuzzing import initialize_fuzzing_tracking - initialize_fuzzing_tracking(run_id, workflow_name) - - return RunSubmissionResponse( - run_id=run_id, - status="RUNNING", - workflow=workflow_name, - message=f"Workflow '{workflow_name}' submitted successfully with uploaded target" - ) - - except HTTPException: - raise - except Exception as e: - logger.error(f"Failed to upload and submit workflow '{workflow_name}': {e}") - logger.error(f"Traceback: 
{traceback.format_exc()}") - - error_response = create_structured_error_response( - error_type="WorkflowSubmissionError", - message=f"Failed to process upload and submit workflow: {str(e)}", - workflow_name=workflow_name, - suggestions=[ - "Check if the uploaded file is a valid tarball", - "Verify MinIO storage is accessible", - "Check backend logs for detailed error information", - "Ensure Temporal workers are running" - ] - ) - - raise HTTPException(status_code=500, detail=error_response) - - finally: - # Cleanup temporary file - if temp_file_path and Path(temp_file_path).exists(): - try: - Path(temp_file_path).unlink() - logger.debug(f"Cleaned up temp file: {temp_file_path}") - except Exception as e: - logger.warning(f"Failed to cleanup temp file {temp_file_path}: {e}") - - -@router.get("/{workflow_name}/worker-info") -async def get_workflow_worker_info( - workflow_name: str, - temporal_mgr=Depends(get_temporal_manager) -) -> Dict[str, Any]: - """ - Get worker information for a workflow. - - Returns details about which worker is required to execute this workflow, - including container name, task queue, and vertical. 
- - Args: - workflow_name: Name of the workflow - - Returns: - Worker information including vertical, container name, and task queue - - Raises: - HTTPException: 404 if workflow not found - """ - if workflow_name not in temporal_mgr.workflows: - available_workflows = list(temporal_mgr.workflows.keys()) - error_response = create_structured_error_response( - error_type="WorkflowNotFound", - message=f"Workflow '{workflow_name}' not found", - workflow_name=workflow_name, - suggestions=[ - f"Available workflows: {', '.join(available_workflows)}", - "Use GET /workflows/ to see all available workflows" - ] - ) - raise HTTPException( - status_code=404, - detail=error_response - ) - - info = temporal_mgr.workflows[workflow_name] - metadata = info.metadata - - # Extract vertical from metadata - vertical = metadata.get("vertical") - - if not vertical: - error_response = create_structured_error_response( - error_type="MissingVertical", - message=f"Workflow '{workflow_name}' does not specify a vertical in metadata", - workflow_name=workflow_name, - suggestions=[ - "Check workflow metadata.yaml for 'vertical' field", - "Contact workflow author for support" - ] - ) - raise HTTPException( - status_code=500, - detail=error_response - ) - - return { - "workflow": workflow_name, - "vertical": vertical, - "worker_service": f"worker-{vertical}", - "task_queue": f"{vertical}-queue", - "required": True - } - - -@router.get("/{workflow_name}/parameters") -async def get_workflow_parameters( - workflow_name: str, - temporal_mgr=Depends(get_temporal_manager) -) -> Dict[str, Any]: - """ - Get the parameters schema for a workflow. 
- - Args: - workflow_name: Name of the workflow - - Returns: - Parameters schema with types, descriptions, and defaults - - Raises: - HTTPException: 404 if workflow not found - """ - if workflow_name not in temporal_mgr.workflows: - available_workflows = list(temporal_mgr.workflows.keys()) - error_response = create_structured_error_response( - error_type="WorkflowNotFound", - message=f"Workflow '{workflow_name}' not found", - workflow_name=workflow_name, - suggestions=[ - f"Available workflows: {', '.join(available_workflows)}", - "Use GET /workflows/ to see all available workflows" - ] - ) - raise HTTPException( - status_code=404, - detail=error_response - ) - - info = temporal_mgr.workflows[workflow_name] - metadata = info.metadata - - # Return parameters with enhanced schema information - parameters_schema = metadata.get("parameters", {}) - - # Extract the actual parameter definitions from JSON schema structure - if "properties" in parameters_schema: - param_definitions = parameters_schema["properties"] - else: - param_definitions = parameters_schema - - # Extract default values from JSON Schema - default_params = extract_defaults_from_json_schema(metadata) - - return { - "workflow": workflow_name, - "parameters": param_definitions, - "default_parameters": default_params, - "required_parameters": [ - name for name, schema in param_definitions.items() - if isinstance(schema, dict) and schema.get("required", False) - ] - } \ No newline at end of file diff --git a/backend/src/core/__init__.py b/backend/src/core/__init__.py deleted file mode 100644 index 43bcfe7..0000000 --- a/backend/src/core/__init__.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. 
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - diff --git a/backend/src/core/setup.py b/backend/src/core/setup.py deleted file mode 100644 index 97b3a46..0000000 --- a/backend/src/core/setup.py +++ /dev/null @@ -1,45 +0,0 @@ -""" -Setup utilities for FuzzForge infrastructure -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import logging - -logger = logging.getLogger(__name__) - - -async def setup_result_storage(): - """ - Setup result storage (MinIO). - - MinIO is used for both target upload and result storage. - This is a placeholder for any MinIO-specific setup if needed. - """ - logger.info("Result storage (MinIO) configured") - # MinIO is configured via environment variables in docker-compose - # No additional setup needed here - return True - - -async def validate_infrastructure(): - """ - Validate all required infrastructure components. - - This should be called during startup to ensure everything is ready. - """ - logger.info("Validating infrastructure...") - - # Setup storage (MinIO) - await setup_result_storage() - - logger.info("Infrastructure validation completed") diff --git a/backend/src/main.py b/backend/src/main.py deleted file mode 100644 index c219742..0000000 --- a/backend/src/main.py +++ /dev/null @@ -1,712 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. 
-# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import asyncio -import logging -import os -from contextlib import AsyncExitStack, asynccontextmanager, suppress -from typing import Any, Dict, Optional, List - -import uvicorn -from fastapi import FastAPI -from starlette.applications import Starlette -from starlette.routing import Mount - -from fastmcp.server.http import create_sse_app - -from src.temporal.manager import TemporalManager -from src.core.setup import setup_result_storage, validate_infrastructure -from src.api import workflows, runs, fuzzing, system - -from fastmcp import FastMCP - -logging.basicConfig(level=logging.INFO) -logger = logging.getLogger(__name__) - -temporal_mgr = TemporalManager() - - -class TemporalBootstrapState: - """Tracks Temporal initialization progress for API and MCP consumers.""" - - def __init__(self) -> None: - self.ready: bool = False - self.status: str = "not_started" - self.last_error: Optional[str] = None - self.task_running: bool = False - - def as_dict(self) -> Dict[str, Any]: - return { - "ready": self.ready, - "status": self.status, - "last_error": self.last_error, - "task_running": self.task_running, - } - - -temporal_bootstrap_state = TemporalBootstrapState() - -# Configure retry strategy for bootstrapping Temporal + infrastructure -STARTUP_RETRY_SECONDS = max(1, int(os.getenv("FUZZFORGE_STARTUP_RETRY_SECONDS", "5"))) -STARTUP_RETRY_MAX_SECONDS = max( - STARTUP_RETRY_SECONDS, - int(os.getenv("FUZZFORGE_STARTUP_RETRY_MAX_SECONDS", "60")), -) - -temporal_bootstrap_task: Optional[asyncio.Task] = None - -# --------------------------------------------------------------------------- -# FastAPI application (REST API) -# 
--------------------------------------------------------------------------- - -app = FastAPI( - title="FuzzForge API", - description="Security testing workflow orchestration API with fuzzing support", - version="0.6.0", -) - -app.include_router(workflows.router) -app.include_router(runs.router) -app.include_router(fuzzing.router) -app.include_router(system.router) - - -def get_temporal_status() -> Dict[str, Any]: - """Return a snapshot of Temporal bootstrap state for diagnostics.""" - status = temporal_bootstrap_state.as_dict() - status["workflows_loaded"] = len(temporal_mgr.workflows) - status["bootstrap_task_running"] = ( - temporal_bootstrap_task is not None and not temporal_bootstrap_task.done() - ) - return status - - -def _temporal_not_ready_status() -> Optional[Dict[str, Any]]: - """Return status details if Temporal is not ready yet.""" - status = get_temporal_status() - if status.get("ready"): - return None - return status - - -@app.get("/") -async def root() -> Dict[str, Any]: - status = get_temporal_status() - return { - "name": "FuzzForge API", - "version": "0.6.0", - "status": "ready" if status.get("ready") else "initializing", - "workflows_loaded": status.get("workflows_loaded", 0), - "temporal": status, - } - - -@app.get("/health") -async def health() -> Dict[str, str]: - status = get_temporal_status() - health_status = "healthy" if status.get("ready") else "initializing" - return {"status": health_status} - - -# Map FastAPI OpenAPI operationIds to readable MCP tool names -FASTAPI_MCP_NAME_OVERRIDES: Dict[str, str] = { - "list_workflows_workflows__get": "api_list_workflows", - "get_metadata_schema_workflows_metadata_schema_get": "api_get_metadata_schema", - "get_workflow_metadata_workflows__workflow_name__metadata_get": "api_get_workflow_metadata", - "submit_workflow_workflows__workflow_name__submit_post": "api_submit_workflow", - "get_workflow_parameters_workflows__workflow_name__parameters_get": "api_get_workflow_parameters", - 
"get_run_status_runs__run_id__status_get": "api_get_run_status", - "get_run_findings_runs__run_id__findings_get": "api_get_run_findings", - "get_workflow_findings_runs__workflow_name__findings__run_id__get": "api_get_workflow_findings", - "get_fuzzing_stats_fuzzing__run_id__stats_get": "api_get_fuzzing_stats", - "update_fuzzing_stats_fuzzing__run_id__stats_post": "api_update_fuzzing_stats", - "get_crash_reports_fuzzing__run_id__crashes_get": "api_get_crash_reports", - "report_crash_fuzzing__run_id__crash_post": "api_report_crash", - "stream_fuzzing_updates_fuzzing__run_id__stream_get": "api_stream_fuzzing_updates", - "cleanup_fuzzing_run_fuzzing__run_id__delete": "api_cleanup_fuzzing_run", - "root__get": "api_root", - "health_health_get": "api_health", -} - - -# Create an MCP adapter exposing all FastAPI endpoints via OpenAPI parsing -FASTAPI_MCP_ADAPTER = FastMCP.from_fastapi( - app, - name="FuzzForge FastAPI", - mcp_names=FASTAPI_MCP_NAME_OVERRIDES, -) -_fastapi_mcp_imported = False - - -# --------------------------------------------------------------------------- -# FastMCP server (runs on dedicated port outside FastAPI) -# --------------------------------------------------------------------------- - -mcp = FastMCP(name="FuzzForge MCP") - - -async def _bootstrap_temporal_with_retries() -> None: - """Initialize Temporal infrastructure with exponential backoff retries.""" - - attempt = 0 - - while True: - attempt += 1 - temporal_bootstrap_state.task_running = True - temporal_bootstrap_state.status = "starting" - temporal_bootstrap_state.ready = False - temporal_bootstrap_state.last_error = None - - try: - logger.info("Bootstrapping Temporal infrastructure...") - await validate_infrastructure() - await setup_result_storage() - await temporal_mgr.initialize() - - temporal_bootstrap_state.ready = True - temporal_bootstrap_state.status = "ready" - temporal_bootstrap_state.task_running = False - logger.info("Temporal infrastructure ready") - return - - except 
asyncio.CancelledError: - temporal_bootstrap_state.status = "cancelled" - temporal_bootstrap_state.task_running = False - logger.info("Temporal bootstrap task cancelled") - raise - - except Exception as exc: # pragma: no cover - defensive logging on infra startup - logger.exception("Temporal bootstrap failed") - temporal_bootstrap_state.ready = False - temporal_bootstrap_state.status = "error" - temporal_bootstrap_state.last_error = str(exc) - - # Ensure partial initialization does not leave stale state behind - temporal_mgr.workflows.clear() - - wait_time = min( - STARTUP_RETRY_SECONDS * (2 ** (attempt - 1)), - STARTUP_RETRY_MAX_SECONDS, - ) - logger.info("Retrying Temporal bootstrap in %s second(s)", wait_time) - - try: - await asyncio.sleep(wait_time) - except asyncio.CancelledError: - temporal_bootstrap_state.status = "cancelled" - temporal_bootstrap_state.task_running = False - raise - - -def _lookup_workflow(workflow_name: str): - info = temporal_mgr.workflows.get(workflow_name) - if not info: - return None - metadata = info.metadata - defaults = metadata.get("default_parameters", {}) - default_target_path = metadata.get("default_target_path") or defaults.get("target_path") - return { - "name": workflow_name, - "version": metadata.get("version", "0.6.0"), - "description": metadata.get("description", ""), - "author": metadata.get("author"), - "tags": metadata.get("tags", []), - "parameters": metadata.get("parameters", {}), - "default_parameters": metadata.get("default_parameters", {}), - "required_modules": metadata.get("required_modules", []), - "default_target_path": default_target_path - } - - -@mcp.tool -async def list_workflows_mcp() -> Dict[str, Any]: - """List all discovered workflows and their metadata summary.""" - not_ready = _temporal_not_ready_status() - if not_ready: - return { - "workflows": [], - "temporal": not_ready, - "message": "Temporal infrastructure is still initializing", - } - - workflows_summary = [] - for name, info in 
temporal_mgr.workflows.items(): - metadata = info.metadata - defaults = metadata.get("default_parameters", {}) - workflows_summary.append({ - "name": name, - "version": metadata.get("version", "0.6.0"), - "description": metadata.get("description", ""), - "author": metadata.get("author"), - "tags": metadata.get("tags", []), - "default_target_path": metadata.get("default_target_path") - or defaults.get("target_path") - }) - return {"workflows": workflows_summary, "temporal": get_temporal_status()} - - -@mcp.tool -async def get_workflow_metadata_mcp(workflow_name: str) -> Dict[str, Any]: - """Fetch detailed metadata for a workflow.""" - not_ready = _temporal_not_ready_status() - if not_ready: - return { - "error": "Temporal infrastructure not ready", - "temporal": not_ready, - } - - data = _lookup_workflow(workflow_name) - if not data: - return {"error": f"Workflow not found: {workflow_name}"} - return data - - -@mcp.tool -async def get_workflow_parameters_mcp(workflow_name: str) -> Dict[str, Any]: - """Return the parameter schema and defaults for a workflow.""" - not_ready = _temporal_not_ready_status() - if not_ready: - return { - "error": "Temporal infrastructure not ready", - "temporal": not_ready, - } - - data = _lookup_workflow(workflow_name) - if not data: - return {"error": f"Workflow not found: {workflow_name}"} - return { - "parameters": data.get("parameters", {}), - "defaults": data.get("default_parameters", {}), - } - - -@mcp.tool -async def get_workflow_metadata_schema_mcp() -> Dict[str, Any]: - """Return the JSON schema describing workflow metadata files.""" - from src.temporal.discovery import WorkflowDiscovery - return WorkflowDiscovery.get_metadata_schema() - - -@mcp.tool -async def submit_security_scan_mcp( - workflow_name: str, - target_id: str, - parameters: Dict[str, Any] | None = None, -) -> Dict[str, Any] | Dict[str, str]: - """Submit a Temporal workflow via MCP.""" - try: - not_ready = _temporal_not_ready_status() - if not_ready: - return { - 
"error": "Temporal infrastructure not ready", - "temporal": not_ready, - } - - workflow_info = temporal_mgr.workflows.get(workflow_name) - if not workflow_info: - return {"error": f"Workflow '{workflow_name}' not found"} - - metadata = workflow_info.metadata or {} - defaults = metadata.get("default_parameters", {}) - - parameters = parameters or {} - cleaned_parameters: Dict[str, Any] = {**defaults, **parameters} - - # Ensure *_config structures default to dicts - for key, value in list(cleaned_parameters.items()): - if isinstance(key, str) and key.endswith("_config") and value is None: - cleaned_parameters[key] = {} - - # Some workflows expect configuration dictionaries even when omitted - parameter_definitions = ( - metadata.get("parameters", {}).get("properties", {}) - if isinstance(metadata.get("parameters"), dict) - else {} - ) - for key, definition in parameter_definitions.items(): - if not isinstance(key, str) or not key.endswith("_config"): - continue - if key not in cleaned_parameters: - default_value = definition.get("default") if isinstance(definition, dict) else None - cleaned_parameters[key] = default_value if default_value is not None else {} - elif cleaned_parameters[key] is None: - cleaned_parameters[key] = {} - - # Start workflow - handle = await temporal_mgr.run_workflow( - workflow_name=workflow_name, - target_id=target_id, - workflow_params=cleaned_parameters, - ) - - return { - "run_id": handle.id, - "status": "RUNNING", - "workflow": workflow_name, - "message": f"Workflow '{workflow_name}' submitted successfully", - "target_id": target_id, - "parameters": cleaned_parameters, - "mcp_enabled": True, - } - except Exception as exc: # pragma: no cover - defensive logging - logger.exception("MCP submit failed") - return {"error": f"Failed to submit workflow: {exc}"} - - -@mcp.tool -async def get_comprehensive_scan_summary(run_id: str) -> Dict[str, Any] | Dict[str, str]: - """Return a summary for the given workflow run via MCP.""" - try: - not_ready 
= _temporal_not_ready_status() - if not_ready: - return { - "error": "Temporal infrastructure not ready", - "temporal": not_ready, - } - - status = await temporal_mgr.get_workflow_status(run_id) - - # Try to get result if completed - total_findings = 0 - severity_summary = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} - - if status.get("status") == "COMPLETED": - try: - result = await temporal_mgr.get_workflow_result(run_id) - if isinstance(result, dict): - summary = result.get("summary", {}) - total_findings = summary.get("total_findings", 0) - except Exception as e: - logger.debug(f"Could not retrieve result for {run_id}: {e}") - - return { - "run_id": run_id, - "workflow": "unknown", # Temporal doesn't track workflow name in status - "status": status.get("status", "unknown"), - "is_completed": status.get("status") == "COMPLETED", - "total_findings": total_findings, - "severity_summary": severity_summary, - "scan_duration": status.get("close_time", "In progress"), - "recommendations": ( - [ - "Review high and critical severity findings first", - "Implement security fixes based on finding recommendations", - "Re-run scan after applying fixes to verify remediation", - ] - if total_findings > 0 - else ["No security issues found"] - ), - "mcp_analysis": True, - } - except Exception as exc: # pragma: no cover - logger.exception("MCP summary failed") - return {"error": f"Failed to summarize run: {exc}"} - - -@mcp.tool -async def get_run_status_mcp(run_id: str) -> Dict[str, Any]: - """Return current status information for a Temporal run.""" - try: - not_ready = _temporal_not_ready_status() - if not_ready: - return { - "error": "Temporal infrastructure not ready", - "temporal": not_ready, - } - - status = await temporal_mgr.get_workflow_status(run_id) - - return { - "run_id": run_id, - "workflow": "unknown", - "status": status["status"], - "is_completed": status["status"] in ["COMPLETED", "FAILED", "CANCELLED"], - "is_failed": status["status"] == "FAILED", 
- "is_running": status["status"] == "RUNNING", - "created_at": status.get("start_time"), - "updated_at": status.get("close_time") or status.get("execution_time"), - } - except Exception as exc: - logger.exception("MCP run status failed") - return {"error": f"Failed to get run status: {exc}"} - - -@mcp.tool -async def get_run_findings_mcp(run_id: str) -> Dict[str, Any]: - """Return SARIF findings for a completed run.""" - try: - not_ready = _temporal_not_ready_status() - if not_ready: - return { - "error": "Temporal infrastructure not ready", - "temporal": not_ready, - } - - status = await temporal_mgr.get_workflow_status(run_id) - if status.get("status") != "COMPLETED": - return {"error": f"Run {run_id} not completed. Status: {status.get('status')}"} - - result = await temporal_mgr.get_workflow_result(run_id) - - metadata = { - "completion_time": status.get("close_time"), - "workflow_version": "unknown", - } - - sarif = result.get("sarif", {}) if isinstance(result, dict) else {} - - return { - "workflow": "unknown", - "run_id": run_id, - "sarif": sarif, - "metadata": metadata, - } - except Exception as exc: - logger.exception("MCP findings failed") - return {"error": f"Failed to retrieve findings: {exc}"} - - -@mcp.tool -async def list_recent_runs_mcp( - limit: int = 10, - workflow_name: str | None = None, -) -> Dict[str, Any]: - """List recent Temporal runs with optional workflow filter.""" - - not_ready = _temporal_not_ready_status() - if not_ready: - return { - "runs": [], - "temporal": not_ready, - "message": "Temporal infrastructure is still initializing", - } - - try: - limit_value = int(limit) - except (TypeError, ValueError): - limit_value = 10 - limit_value = max(1, min(limit_value, 100)) - - try: - # Build filter query - filter_query = None - if workflow_name: - workflow_info = temporal_mgr.workflows.get(workflow_name) - if workflow_info: - filter_query = f'WorkflowType="{workflow_info.workflow_type}"' - - workflows = await 
temporal_mgr.list_workflows(filter_query, limit_value) - - results: List[Dict[str, Any]] = [] - for wf in workflows: - results.append({ - "run_id": wf["workflow_id"], - "workflow": workflow_name or "unknown", - "state": wf["status"], - "state_type": wf["status"], - "is_completed": wf["status"] in ["COMPLETED", "FAILED", "CANCELLED"], - "is_running": wf["status"] == "RUNNING", - "is_failed": wf["status"] == "FAILED", - "created_at": wf.get("start_time"), - "updated_at": wf.get("close_time"), - }) - - return {"runs": results, "temporal": get_temporal_status()} - - except Exception as exc: - logger.exception("Failed to list runs") - return { - "runs": [], - "temporal": get_temporal_status(), - "error": str(exc) - } - - -@mcp.tool -async def get_fuzzing_stats_mcp(run_id: str) -> Dict[str, Any]: - """Return fuzzing statistics for a run if available.""" - not_ready = _temporal_not_ready_status() - if not_ready: - return { - "error": "Temporal infrastructure not ready", - "temporal": not_ready, - } - - stats = fuzzing.fuzzing_stats.get(run_id) - if not stats: - return {"error": f"Fuzzing run not found: {run_id}"} - # Be resilient if a plain dict slipped into the cache - if isinstance(stats, dict): - return stats - if hasattr(stats, "model_dump"): - return stats.model_dump() - if hasattr(stats, "dict"): - return stats.dict() - # Last resort - return getattr(stats, "__dict__", {"run_id": run_id}) - - -@mcp.tool -async def get_fuzzing_crash_reports_mcp(run_id: str) -> Dict[str, Any]: - """Return crash reports collected for a fuzzing run.""" - not_ready = _temporal_not_ready_status() - if not_ready: - return { - "error": "Temporal infrastructure not ready", - "temporal": not_ready, - } - - reports = fuzzing.crash_reports.get(run_id) - if reports is None: - return {"error": f"Fuzzing run not found: {run_id}"} - return {"run_id": run_id, "crashes": [report.model_dump() for report in reports]} - - -@mcp.tool -async def get_backend_status_mcp() -> Dict[str, Any]: - """Expose 
backend readiness, workflows, and registered MCP tools.""" - - status = get_temporal_status() - response: Dict[str, Any] = {"temporal": status} - - if status.get("ready"): - response["workflows"] = list(temporal_mgr.workflows.keys()) - - try: - tools = await mcp._tool_manager.list_tools() - response["mcp_tools"] = sorted(tool.name for tool in tools) - except Exception as exc: # pragma: no cover - defensive logging - logger.debug("Failed to enumerate MCP tools: %s", exc) - - return response - - -def create_mcp_transport_app() -> Starlette: - """Build a Starlette app serving HTTP + SSE transports on one port.""" - - http_app = mcp.http_app(path="/", transport="streamable-http") - sse_app = create_sse_app( - server=mcp, - message_path="/messages", - sse_path="/", - auth=mcp.auth, - ) - - routes = [ - Mount("/mcp", app=http_app), - Mount("/mcp/sse", app=sse_app), - ] - - @asynccontextmanager - async def lifespan(app: Starlette): # pragma: no cover - integration wiring - async with AsyncExitStack() as stack: - await stack.enter_async_context( - http_app.router.lifespan_context(http_app) - ) - await stack.enter_async_context( - sse_app.router.lifespan_context(sse_app) - ) - yield - - combined_app = Starlette(routes=routes, lifespan=lifespan) - combined_app.state.fastmcp_server = mcp - combined_app.state.http_app = http_app - combined_app.state.sse_app = sse_app - return combined_app - - -# --------------------------------------------------------------------------- -# Combined lifespan: Temporal init + dedicated MCP transports -# --------------------------------------------------------------------------- - -@asynccontextmanager -async def combined_lifespan(app: FastAPI): - global temporal_bootstrap_task, _fastapi_mcp_imported - - logger.info("Starting FuzzForge backend...") - - # Ensure FastAPI endpoints are exposed via MCP once - if not _fastapi_mcp_imported: - try: - await mcp.import_server(FASTAPI_MCP_ADAPTER) - _fastapi_mcp_imported = True - logger.info("Mounted 
FastAPI endpoints as MCP tools") - except Exception as exc: - logger.exception("Failed to import FastAPI endpoints into MCP", exc_info=exc) - - # Kick off Temporal bootstrap in the background if needed - if temporal_bootstrap_task is None or temporal_bootstrap_task.done(): - temporal_bootstrap_task = asyncio.create_task(_bootstrap_temporal_with_retries()) - logger.info("Temporal bootstrap task started") - else: - logger.info("Temporal bootstrap task already running") - - # Start MCP transports on shared port (HTTP + SSE) - mcp_app = create_mcp_transport_app() - mcp_config = uvicorn.Config( - app=mcp_app, - host="0.0.0.0", - port=8010, - log_level="info", - lifespan="on", - ) - mcp_server = uvicorn.Server(mcp_config) - mcp_server.install_signal_handlers = lambda: None # type: ignore[assignment] - mcp_task = asyncio.create_task(mcp_server.serve()) - - async def _wait_for_uvicorn_startup() -> None: - started_attr = getattr(mcp_server, "started", None) - if hasattr(started_attr, "wait"): - await asyncio.wait_for(started_attr.wait(), timeout=10) - return - - # Fallback for uvicorn versions where "started" is a bool - poll_interval = 0.1 - checks = int(10 / poll_interval) - for _ in range(checks): - if getattr(mcp_server, "started", False): - return - await asyncio.sleep(poll_interval) - raise asyncio.TimeoutError - - try: - await _wait_for_uvicorn_startup() - except asyncio.TimeoutError: # pragma: no cover - defensive logging - if mcp_task.done(): - raise RuntimeError("MCP server failed to start") from mcp_task.exception() - logger.warning("Timed out waiting for MCP server startup; continuing anyway") - - logger.info("MCP HTTP available at http://0.0.0.0:8010/mcp") - logger.info("MCP SSE available at http://0.0.0.0:8010/mcp/sse") - - try: - yield - finally: - logger.info("Shutting down MCP transports...") - mcp_server.should_exit = True - mcp_server.force_exit = True - await asyncio.gather(mcp_task, return_exceptions=True) - - if temporal_bootstrap_task and not 
temporal_bootstrap_task.done(): - temporal_bootstrap_task.cancel() - with suppress(asyncio.CancelledError): - await temporal_bootstrap_task - temporal_bootstrap_state.task_running = False - if not temporal_bootstrap_state.ready: - temporal_bootstrap_state.status = "stopped" - temporal_bootstrap_task = None - - # Close Temporal client - await temporal_mgr.close() - logger.info("Shutting down FuzzForge backend...") - - -app.router.lifespan_context = combined_lifespan diff --git a/backend/src/models/__init__.py b/backend/src/models/__init__.py deleted file mode 100644 index 43bcfe7..0000000 --- a/backend/src/models/__init__.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - diff --git a/backend/src/models/findings.py b/backend/src/models/findings.py deleted file mode 100644 index b71a9b6..0000000 --- a/backend/src/models/findings.py +++ /dev/null @@ -1,120 +0,0 @@ -""" -Models for workflow findings and submissions -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- -from pydantic import BaseModel, Field -from typing import Dict, Any, Optional, List -from datetime import datetime - - -class WorkflowFindings(BaseModel): - """Findings from a workflow execution in SARIF format""" - workflow: str = Field(..., description="Workflow name") - run_id: str = Field(..., description="Unique run identifier") - sarif: Dict[str, Any] = Field(..., description="SARIF formatted findings") - metadata: Dict[str, Any] = Field(default_factory=dict, description="Additional metadata") - - -class WorkflowSubmission(BaseModel): - """ - Submit a workflow with configurable settings. - - Note: This model is deprecated in favor of the /upload-and-submit endpoint - which handles file uploads directly. - """ - parameters: Dict[str, Any] = Field( - default_factory=dict, - description="Workflow-specific parameters" - ) - timeout: Optional[int] = Field( - default=None, # Allow workflow-specific defaults - description="Timeout in seconds (None for workflow default)", - ge=1, - le=604800 # Max 7 days to support fuzzing campaigns - ) - - -class WorkflowStatus(BaseModel): - """Status of a workflow run""" - run_id: str = Field(..., description="Unique run identifier") - workflow: str = Field(..., description="Workflow name") - status: str = Field(..., description="Current status") - is_completed: bool = Field(..., description="Whether the run is completed") - is_failed: bool = Field(..., description="Whether the run failed") - is_running: bool = Field(..., description="Whether the run is currently running") - created_at: datetime = Field(..., description="Run creation time") - updated_at: datetime = Field(..., description="Last update time") - - -class WorkflowMetadata(BaseModel): - """Complete metadata for a workflow""" - name: str = Field(..., description="Workflow name") - version: str = Field(..., description="Semantic version") - description: str = Field(..., description="Workflow description") - author: Optional[str] = Field(None, description="Workflow 
author") - tags: List[str] = Field(default_factory=list, description="Workflow tags") - parameters: Dict[str, Any] = Field(..., description="Parameters schema") - default_parameters: Dict[str, Any] = Field( - default_factory=dict, - description="Default parameter values" - ) - required_modules: List[str] = Field( - default_factory=list, - description="Required module names" - ) - - -class WorkflowListItem(BaseModel): - """Summary information for a workflow in list views""" - name: str = Field(..., description="Workflow name") - version: str = Field(..., description="Semantic version") - description: str = Field(..., description="Workflow description") - author: Optional[str] = Field(None, description="Workflow author") - tags: List[str] = Field(default_factory=list, description="Workflow tags") - - -class RunSubmissionResponse(BaseModel): - """Response after submitting a workflow""" - run_id: str = Field(..., description="Unique run identifier") - status: str = Field(..., description="Initial status") - workflow: str = Field(..., description="Workflow name") - message: str = Field(default="Workflow submitted successfully") - - -class FuzzingStats(BaseModel): - """Real-time fuzzing statistics""" - run_id: str = Field(..., description="Unique run identifier") - workflow: str = Field(..., description="Workflow name") - executions: int = Field(default=0, description="Total executions") - executions_per_sec: float = Field(default=0.0, description="Current execution rate") - crashes: int = Field(default=0, description="Total crashes found") - unique_crashes: int = Field(default=0, description="Unique crashes") - coverage: Optional[float] = Field(None, description="Code coverage percentage") - corpus_size: int = Field(default=0, description="Current corpus size") - elapsed_time: int = Field(default=0, description="Elapsed time in seconds") - last_crash_time: Optional[datetime] = Field(None, description="Time of last crash") - - -class CrashReport(BaseModel): - 
"""Individual crash report from fuzzing""" - run_id: str = Field(..., description="Run identifier") - crash_id: str = Field(..., description="Unique crash identifier") - timestamp: datetime = Field(default_factory=datetime.utcnow) - signal: Optional[str] = Field(None, description="Crash signal (SIGSEGV, etc.)") - crash_type: Optional[str] = Field(None, description="Type of crash") - stack_trace: Optional[str] = Field(None, description="Stack trace") - input_file: Optional[str] = Field(None, description="Path to crashing input") - reproducer: Optional[str] = Field(None, description="Minimized reproducer") - severity: str = Field(default="medium", description="Crash severity") - exploitability: Optional[str] = Field(None, description="Exploitability assessment") \ No newline at end of file diff --git a/backend/src/storage/__init__.py b/backend/src/storage/__init__.py deleted file mode 100644 index 4f78cff..0000000 --- a/backend/src/storage/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -""" -Storage abstraction layer for FuzzForge. - -Provides unified interface for storing and retrieving targets and results. -""" - -from .base import StorageBackend -from .s3_cached import S3CachedStorage - -__all__ = ["StorageBackend", "S3CachedStorage"] diff --git a/backend/src/storage/base.py b/backend/src/storage/base.py deleted file mode 100644 index 7323fd3..0000000 --- a/backend/src/storage/base.py +++ /dev/null @@ -1,153 +0,0 @@ -""" -Base storage backend interface. - -All storage implementations must implement this interface. -""" - -from abc import ABC, abstractmethod -from pathlib import Path -from typing import Optional, Dict, Any - - -class StorageBackend(ABC): - """ - Abstract base class for storage backends. - - Implementations handle storage and retrieval of: - - Uploaded targets (code, binaries, etc.) 
- - Workflow results - - Temporary files - """ - - @abstractmethod - async def upload_target( - self, - file_path: Path, - user_id: str, - metadata: Optional[Dict[str, Any]] = None - ) -> str: - """ - Upload a target file to storage. - - Args: - file_path: Local path to file to upload - user_id: ID of user uploading the file - metadata: Optional metadata to store with file - - Returns: - Target ID (unique identifier for retrieval) - - Raises: - FileNotFoundError: If file_path doesn't exist - StorageError: If upload fails - """ - pass - - @abstractmethod - async def get_target(self, target_id: str) -> Path: - """ - Get target file from storage. - - Args: - target_id: Unique identifier from upload_target() - - Returns: - Local path to cached file - - Raises: - FileNotFoundError: If target doesn't exist - StorageError: If download fails - """ - pass - - @abstractmethod - async def delete_target(self, target_id: str) -> None: - """ - Delete target from storage. - - Args: - target_id: Unique identifier to delete - - Raises: - StorageError: If deletion fails (doesn't raise if not found) - """ - pass - - @abstractmethod - async def upload_results( - self, - workflow_id: str, - results: Dict[str, Any], - results_format: str = "json" - ) -> str: - """ - Upload workflow results to storage. - - Args: - workflow_id: Workflow execution ID - results: Results dictionary - results_format: Format (json, sarif, etc.) - - Returns: - URL to uploaded results - - Raises: - StorageError: If upload fails - """ - pass - - @abstractmethod - async def get_results(self, workflow_id: str) -> Dict[str, Any]: - """ - Get workflow results from storage. - - Args: - workflow_id: Workflow execution ID - - Returns: - Results dictionary - - Raises: - FileNotFoundError: If results don't exist - StorageError: If download fails - """ - pass - - @abstractmethod - async def list_targets( - self, - user_id: Optional[str] = None, - limit: int = 100 - ) -> list[Dict[str, Any]]: - """ - List uploaded targets. 
- - Args: - user_id: Filter by user ID (None = all users) - limit: Maximum number of results - - Returns: - List of target metadata dictionaries - - Raises: - StorageError: If listing fails - """ - pass - - @abstractmethod - async def cleanup_cache(self) -> int: - """ - Clean up local cache (LRU eviction). - - Returns: - Number of files removed - - Raises: - StorageError: If cleanup fails - """ - pass - - -class StorageError(Exception): - """Base exception for storage operations.""" - pass diff --git a/backend/src/storage/s3_cached.py b/backend/src/storage/s3_cached.py deleted file mode 100644 index 99c8e3a..0000000 --- a/backend/src/storage/s3_cached.py +++ /dev/null @@ -1,423 +0,0 @@ -""" -S3-compatible storage backend with local caching. - -Works with MinIO (dev/prod) or AWS S3 (cloud). -""" - -import json -import logging -import os -import shutil -from datetime import datetime -from pathlib import Path -from typing import Optional, Dict, Any -from uuid import uuid4 - -import boto3 -from botocore.exceptions import ClientError - -from .base import StorageBackend, StorageError - -logger = logging.getLogger(__name__) - - -class S3CachedStorage(StorageBackend): - """ - S3-compatible storage with local caching. - - Features: - - Upload targets to S3/MinIO - - Download with local caching (LRU eviction) - - Lifecycle management (auto-cleanup old files) - - Metadata tracking - """ - - def __init__( - self, - endpoint_url: Optional[str] = None, - access_key: Optional[str] = None, - secret_key: Optional[str] = None, - bucket: str = "targets", - region: str = "us-east-1", - use_ssl: bool = False, - cache_dir: Optional[Path] = None, - cache_max_size_gb: int = 10 - ): - """ - Initialize S3 storage backend. 
- - Args: - endpoint_url: S3 endpoint (None = AWS S3, or MinIO URL) - access_key: S3 access key (None = from env) - secret_key: S3 secret key (None = from env) - bucket: S3 bucket name - region: AWS region - use_ssl: Use HTTPS - cache_dir: Local cache directory - cache_max_size_gb: Maximum cache size in GB - """ - # Use environment variables as defaults - self.endpoint_url = endpoint_url or os.getenv('S3_ENDPOINT', 'http://minio:9000') - self.access_key = access_key or os.getenv('S3_ACCESS_KEY', 'fuzzforge') - self.secret_key = secret_key or os.getenv('S3_SECRET_KEY', 'fuzzforge123') - self.bucket = bucket or os.getenv('S3_BUCKET', 'targets') - self.region = region or os.getenv('S3_REGION', 'us-east-1') - self.use_ssl = use_ssl or os.getenv('S3_USE_SSL', 'false').lower() == 'true' - - # Cache configuration - self.cache_dir = cache_dir or Path(os.getenv('CACHE_DIR', '/tmp/fuzzforge-cache')) - self.cache_max_size = cache_max_size_gb * (1024 ** 3) # Convert to bytes - - # Ensure cache directory exists - self.cache_dir.mkdir(parents=True, exist_ok=True) - - # Initialize S3 client - try: - self.s3_client = boto3.client( - 's3', - endpoint_url=self.endpoint_url, - aws_access_key_id=self.access_key, - aws_secret_access_key=self.secret_key, - region_name=self.region, - use_ssl=self.use_ssl - ) - logger.info(f"Initialized S3 storage: {self.endpoint_url}/{self.bucket}") - except Exception as e: - logger.error(f"Failed to initialize S3 client: {e}") - raise StorageError(f"S3 initialization failed: {e}") - - async def upload_target( - self, - file_path: Path, - user_id: str, - metadata: Optional[Dict[str, Any]] = None - ) -> str: - """Upload target file to S3/MinIO.""" - if not file_path.exists(): - raise FileNotFoundError(f"File not found: {file_path}") - - # Generate unique target ID - target_id = str(uuid4()) - - # Prepare metadata - upload_metadata = { - 'user_id': user_id, - 'uploaded_at': datetime.now().isoformat(), - 'filename': file_path.name, - 'size': 
str(file_path.stat().st_size) - } - if metadata: - upload_metadata.update(metadata) - - # Upload to S3 - s3_key = f'{target_id}/target' - try: - logger.info(f"Uploading target to s3://{self.bucket}/{s3_key}") - - self.s3_client.upload_file( - str(file_path), - self.bucket, - s3_key, - ExtraArgs={ - 'Metadata': upload_metadata - } - ) - - file_size_mb = file_path.stat().st_size / (1024 * 1024) - logger.info( - f"āœ“ Uploaded target {target_id} " - f"({file_path.name}, {file_size_mb:.2f} MB)" - ) - - return target_id - - except ClientError as e: - logger.error(f"S3 upload failed: {e}", exc_info=True) - raise StorageError(f"Failed to upload target: {e}") - except Exception as e: - logger.error(f"Upload failed: {e}", exc_info=True) - raise StorageError(f"Upload error: {e}") - - async def get_target(self, target_id: str) -> Path: - """Get target from cache or download from S3/MinIO.""" - # Check cache first - cache_path = self.cache_dir / target_id - cached_file = cache_path / "target" - - if cached_file.exists(): - # Update access time for LRU - cached_file.touch() - logger.info(f"Cache HIT: {target_id}") - return cached_file - - # Cache miss - download from S3 - logger.info(f"Cache MISS: {target_id}, downloading from S3...") - - try: - # Create cache directory - cache_path.mkdir(parents=True, exist_ok=True) - - # Download from S3 - s3_key = f'{target_id}/target' - logger.info(f"Downloading s3://{self.bucket}/{s3_key}") - - self.s3_client.download_file( - self.bucket, - s3_key, - str(cached_file) - ) - - # Verify download - if not cached_file.exists(): - raise StorageError(f"Downloaded file not found: {cached_file}") - - file_size_mb = cached_file.stat().st_size / (1024 * 1024) - logger.info(f"āœ“ Downloaded target {target_id} ({file_size_mb:.2f} MB)") - - return cached_file - - except ClientError as e: - error_code = e.response.get('Error', {}).get('Code') - if error_code in ['404', 'NoSuchKey']: - logger.error(f"Target not found: {target_id}") - raise 
FileNotFoundError(f"Target {target_id} not found in storage") - else: - logger.error(f"S3 download failed: {e}", exc_info=True) - raise StorageError(f"Download failed: {e}") - except Exception as e: - logger.error(f"Download error: {e}", exc_info=True) - # Cleanup partial download - if cache_path.exists(): - shutil.rmtree(cache_path, ignore_errors=True) - raise StorageError(f"Download error: {e}") - - async def delete_target(self, target_id: str) -> None: - """Delete target from S3/MinIO.""" - try: - s3_key = f'{target_id}/target' - logger.info(f"Deleting s3://{self.bucket}/{s3_key}") - - self.s3_client.delete_object( - Bucket=self.bucket, - Key=s3_key - ) - - # Also delete from cache if present - cache_path = self.cache_dir / target_id - if cache_path.exists(): - shutil.rmtree(cache_path, ignore_errors=True) - logger.info(f"āœ“ Deleted target {target_id} from S3 and cache") - else: - logger.info(f"āœ“ Deleted target {target_id} from S3") - - except ClientError as e: - logger.error(f"S3 delete failed: {e}", exc_info=True) - # Don't raise error if object doesn't exist - if e.response.get('Error', {}).get('Code') not in ['404', 'NoSuchKey']: - raise StorageError(f"Delete failed: {e}") - except Exception as e: - logger.error(f"Delete error: {e}", exc_info=True) - raise StorageError(f"Delete error: {e}") - - async def upload_results( - self, - workflow_id: str, - results: Dict[str, Any], - results_format: str = "json" - ) -> str: - """Upload workflow results to S3/MinIO.""" - try: - # Prepare results content - if results_format == "json": - content = json.dumps(results, indent=2).encode('utf-8') - content_type = 'application/json' - file_ext = 'json' - elif results_format == "sarif": - content = json.dumps(results, indent=2).encode('utf-8') - content_type = 'application/sarif+json' - file_ext = 'sarif' - else: - content = json.dumps(results, indent=2).encode('utf-8') - content_type = 'application/json' - file_ext = 'json' - - # Upload to results bucket - results_bucket 
= 'results' - s3_key = f'{workflow_id}/results.{file_ext}' - - logger.info(f"Uploading results to s3://{results_bucket}/{s3_key}") - - self.s3_client.put_object( - Bucket=results_bucket, - Key=s3_key, - Body=content, - ContentType=content_type, - Metadata={ - 'workflow_id': workflow_id, - 'format': results_format, - 'uploaded_at': datetime.now().isoformat() - } - ) - - # Construct URL - results_url = f"{self.endpoint_url}/{results_bucket}/{s3_key}" - logger.info(f"āœ“ Uploaded results: {results_url}") - - return results_url - - except Exception as e: - logger.error(f"Results upload failed: {e}", exc_info=True) - raise StorageError(f"Results upload failed: {e}") - - async def get_results(self, workflow_id: str) -> Dict[str, Any]: - """Get workflow results from S3/MinIO.""" - try: - results_bucket = 'results' - s3_key = f'{workflow_id}/results.json' - - logger.info(f"Downloading results from s3://{results_bucket}/{s3_key}") - - response = self.s3_client.get_object( - Bucket=results_bucket, - Key=s3_key - ) - - content = response['Body'].read().decode('utf-8') - results = json.loads(content) - - logger.info(f"āœ“ Downloaded results for workflow {workflow_id}") - return results - - except ClientError as e: - error_code = e.response.get('Error', {}).get('Code') - if error_code in ['404', 'NoSuchKey']: - logger.error(f"Results not found: {workflow_id}") - raise FileNotFoundError(f"Results for workflow {workflow_id} not found") - else: - logger.error(f"Results download failed: {e}", exc_info=True) - raise StorageError(f"Results download failed: {e}") - except Exception as e: - logger.error(f"Results download error: {e}", exc_info=True) - raise StorageError(f"Results download error: {e}") - - async def list_targets( - self, - user_id: Optional[str] = None, - limit: int = 100 - ) -> list[Dict[str, Any]]: - """List uploaded targets.""" - try: - targets = [] - paginator = self.s3_client.get_paginator('list_objects_v2') - - for page in paginator.paginate(Bucket=self.bucket, 
PaginationConfig={'MaxItems': limit}): - for obj in page.get('Contents', []): - # Get object metadata - try: - metadata_response = self.s3_client.head_object( - Bucket=self.bucket, - Key=obj['Key'] - ) - metadata = metadata_response.get('Metadata', {}) - - # Filter by user_id if specified - if user_id and metadata.get('user_id') != user_id: - continue - - targets.append({ - 'target_id': obj['Key'].split('/')[0], - 'key': obj['Key'], - 'size': obj['Size'], - 'last_modified': obj['LastModified'].isoformat(), - 'metadata': metadata - }) - - except Exception as e: - logger.warning(f"Failed to get metadata for {obj['Key']}: {e}") - continue - - logger.info(f"Listed {len(targets)} targets (user_id={user_id})") - return targets - - except Exception as e: - logger.error(f"List targets failed: {e}", exc_info=True) - raise StorageError(f"List targets failed: {e}") - - async def cleanup_cache(self) -> int: - """Clean up local cache using LRU eviction.""" - try: - cache_files = [] - total_size = 0 - - # Gather all cached files with metadata - for cache_file in self.cache_dir.rglob('*'): - if cache_file.is_file(): - try: - stat = cache_file.stat() - cache_files.append({ - 'path': cache_file, - 'size': stat.st_size, - 'atime': stat.st_atime # Last access time - }) - total_size += stat.st_size - except Exception as e: - logger.warning(f"Failed to stat {cache_file}: {e}") - continue - - # Check if cleanup is needed - if total_size <= self.cache_max_size: - logger.info( - f"Cache size OK: {total_size / (1024**3):.2f} GB / " - f"{self.cache_max_size / (1024**3):.2f} GB" - ) - return 0 - - # Sort by access time (oldest first) - cache_files.sort(key=lambda x: x['atime']) - - # Remove files until under limit - removed_count = 0 - for file_info in cache_files: - if total_size <= self.cache_max_size: - break - - try: - file_info['path'].unlink() - total_size -= file_info['size'] - removed_count += 1 - logger.debug(f"Evicted from cache: {file_info['path']}") - except Exception as e: - 
logger.warning(f"Failed to delete {file_info['path']}: {e}") - continue - - logger.info( - f"āœ“ Cache cleanup: removed {removed_count} files, " - f"new size: {total_size / (1024**3):.2f} GB" - ) - return removed_count - - except Exception as e: - logger.error(f"Cache cleanup failed: {e}", exc_info=True) - raise StorageError(f"Cache cleanup failed: {e}") - - def get_cache_stats(self) -> Dict[str, Any]: - """Get cache statistics.""" - try: - total_size = 0 - file_count = 0 - - for cache_file in self.cache_dir.rglob('*'): - if cache_file.is_file(): - total_size += cache_file.stat().st_size - file_count += 1 - - return { - 'total_size_bytes': total_size, - 'total_size_gb': total_size / (1024 ** 3), - 'file_count': file_count, - 'max_size_gb': self.cache_max_size / (1024 ** 3), - 'usage_percent': (total_size / self.cache_max_size) * 100 - } - except Exception as e: - logger.error(f"Failed to get cache stats: {e}") - return {'error': str(e)} diff --git a/backend/src/temporal/__init__.py b/backend/src/temporal/__init__.py deleted file mode 100644 index acaa368..0000000 --- a/backend/src/temporal/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -""" -Temporal integration for FuzzForge. - -Handles workflow execution, monitoring, and management. -""" - -from .manager import TemporalManager -from .discovery import WorkflowDiscovery - -__all__ = ["TemporalManager", "WorkflowDiscovery"] diff --git a/backend/src/temporal/discovery.py b/backend/src/temporal/discovery.py deleted file mode 100644 index 07da6f8..0000000 --- a/backend/src/temporal/discovery.py +++ /dev/null @@ -1,257 +0,0 @@ -""" -Workflow Discovery for Temporal - -Discovers workflows from the toolbox/workflows directory -and provides metadata about available workflows. 
-"""
-
-import logging
-import yaml
-from pathlib import Path
-from typing import Dict, Any
-from pydantic import BaseModel, Field, ConfigDict
-
-logger = logging.getLogger(__name__)
-
-
-class WorkflowInfo(BaseModel):
-    """Information about a discovered workflow"""
-    name: str = Field(..., description="Workflow name")
-    path: Path = Field(..., description="Path to workflow directory")
-    workflow_file: Path = Field(..., description="Path to workflow.py file")
-    metadata: Dict[str, Any] = Field(..., description="Workflow metadata from YAML")
-    workflow_type: str = Field(..., description="Workflow class name")
-    vertical: str = Field(..., description="Vertical (worker type) for this workflow")
-
-    model_config = ConfigDict(arbitrary_types_allowed=True)
-
-
-class WorkflowDiscovery:
-    """
-    Discovers workflows from the filesystem.
-
-    Scans toolbox/workflows/ for directories containing:
-    - metadata.yaml (required)
-    - workflow.py (required)
-
-    Each workflow declares its vertical (rust, android, web, etc.)
-    which determines which worker pool will execute it.
-    """
-
-    def __init__(self, workflows_dir: Path):
-        """
-        Initialize workflow discovery.
-
-        Args:
-            workflows_dir: Path to the workflows directory
-        """
-        self.workflows_dir = workflows_dir
-        if not self.workflows_dir.exists():
-            self.workflows_dir.mkdir(parents=True, exist_ok=True)
-            logger.info(f"Created workflows directory: {self.workflows_dir}")
-
-    async def discover_workflows(self) -> Dict[str, WorkflowInfo]:
-        """
-        Discover workflows by scanning the workflows directory.
- - Returns: - Dictionary mapping workflow names to their information - """ - workflows = {} - - logger.info(f"Scanning for workflows in: {self.workflows_dir}") - - for workflow_dir in self.workflows_dir.iterdir(): - if not workflow_dir.is_dir(): - continue - - # Skip special directories - if workflow_dir.name.startswith('.') or workflow_dir.name == '__pycache__': - continue - - metadata_file = workflow_dir / "metadata.yaml" - if not metadata_file.exists(): - logger.debug(f"No metadata.yaml in {workflow_dir.name}, skipping") - continue - - workflow_file = workflow_dir / "workflow.py" - if not workflow_file.exists(): - logger.warning( - f"Workflow {workflow_dir.name} has metadata but no workflow.py, skipping" - ) - continue - - try: - # Parse metadata - with open(metadata_file) as f: - metadata = yaml.safe_load(f) - - # Validate required fields - if 'name' not in metadata: - logger.warning(f"Workflow {workflow_dir.name} metadata missing 'name' field") - metadata['name'] = workflow_dir.name - - if 'vertical' not in metadata: - logger.warning( - f"Workflow {workflow_dir.name} metadata missing 'vertical' field" - ) - continue - - # Infer workflow class name from metadata or use convention - workflow_type = metadata.get('workflow_class') - if not workflow_type: - # Convention: convert snake_case to PascalCase + Workflow - # e.g., rust_test -> RustTestWorkflow - parts = workflow_dir.name.split('_') - workflow_type = ''.join(part.capitalize() for part in parts) + 'Workflow' - - # Create workflow info - info = WorkflowInfo( - name=metadata['name'], - path=workflow_dir, - workflow_file=workflow_file, - metadata=metadata, - workflow_type=workflow_type, - vertical=metadata['vertical'] - ) - - workflows[info.name] = info - logger.info( - f"āœ“ Discovered workflow: {info.name} " - f"(vertical: {info.vertical}, class: {info.workflow_type})" - ) - - except Exception as e: - logger.error( - f"Error discovering workflow {workflow_dir.name}: {e}", - exc_info=True - ) - continue - - 
logger.info(f"Discovered {len(workflows)} workflows") - return workflows - - def get_workflows_by_vertical( - self, - workflows: Dict[str, WorkflowInfo], - vertical: str - ) -> Dict[str, WorkflowInfo]: - """ - Filter workflows by vertical. - - Args: - workflows: All discovered workflows - vertical: Vertical name to filter by - - Returns: - Filtered workflows dictionary - """ - return { - name: info - for name, info in workflows.items() - if info.vertical == vertical - } - - def get_available_verticals(self, workflows: Dict[str, WorkflowInfo]) -> list[str]: - """ - Get list of all verticals from discovered workflows. - - Args: - workflows: All discovered workflows - - Returns: - List of unique vertical names - """ - return list(set(info.vertical for info in workflows.values())) - - @staticmethod - def get_metadata_schema() -> Dict[str, Any]: - """ - Get the JSON schema for workflow metadata. - - Returns: - JSON schema dictionary - """ - return { - "type": "object", - "required": ["name", "version", "description", "author", "vertical", "parameters"], - "properties": { - "name": { - "type": "string", - "description": "Workflow name" - }, - "version": { - "type": "string", - "pattern": "^\\d+\\.\\d+\\.\\d+$", - "description": "Semantic version (x.y.z)" - }, - "vertical": { - "type": "string", - "description": "Vertical worker type (rust, android, web, etc.)" - }, - "description": { - "type": "string", - "description": "Workflow description" - }, - "author": { - "type": "string", - "description": "Workflow author" - }, - "category": { - "type": "string", - "enum": ["comprehensive", "specialized", "fuzzing", "focused"], - "description": "Workflow category" - }, - "tags": { - "type": "array", - "items": {"type": "string"}, - "description": "Workflow tags for categorization" - }, - "requirements": { - "type": "object", - "required": ["tools", "resources"], - "properties": { - "tools": { - "type": "array", - "items": {"type": "string"}, - "description": "Required security 
tools" - }, - "resources": { - "type": "object", - "required": ["memory", "cpu", "timeout"], - "properties": { - "memory": { - "type": "string", - "pattern": "^\\d+[GMK]i$", - "description": "Memory limit (e.g., 1Gi, 512Mi)" - }, - "cpu": { - "type": "string", - "pattern": "^\\d+m?$", - "description": "CPU limit (e.g., 1000m, 2)" - }, - "timeout": { - "type": "integer", - "minimum": 60, - "maximum": 7200, - "description": "Workflow timeout in seconds" - } - } - } - } - }, - "parameters": { - "type": "object", - "description": "Workflow parameters schema" - }, - "default_parameters": { - "type": "object", - "description": "Default parameter values" - }, - "required_modules": { - "type": "array", - "items": {"type": "string"}, - "description": "Required module names" - } - } - } diff --git a/backend/src/temporal/manager.py b/backend/src/temporal/manager.py deleted file mode 100644 index 96d9a84..0000000 --- a/backend/src/temporal/manager.py +++ /dev/null @@ -1,392 +0,0 @@ -""" -Temporal Manager - Workflow execution and management - -Handles: -- Workflow discovery from toolbox -- Workflow execution (submit to Temporal) -- Status monitoring -- Results retrieval -""" - -import logging -import os -from pathlib import Path -from typing import Dict, Optional, Any -from uuid import uuid4 - -from temporalio.client import Client, WorkflowHandle -from temporalio.common import RetryPolicy -from datetime import timedelta - -from .discovery import WorkflowDiscovery, WorkflowInfo -from src.storage import S3CachedStorage - -logger = logging.getLogger(__name__) - - -class TemporalManager: - """ - Manages Temporal workflow execution for FuzzForge. 
- - This class: - - Discovers available workflows from toolbox - - Submits workflow executions to Temporal - - Monitors workflow status - - Retrieves workflow results - """ - - def __init__( - self, - workflows_dir: Optional[Path] = None, - temporal_address: Optional[str] = None, - temporal_namespace: str = "default", - storage: Optional[S3CachedStorage] = None - ): - """ - Initialize Temporal manager. - - Args: - workflows_dir: Path to workflows directory (default: toolbox/workflows) - temporal_address: Temporal server address (default: from env or localhost:7233) - temporal_namespace: Temporal namespace - storage: Storage backend for file uploads (default: S3CachedStorage) - """ - if workflows_dir is None: - workflows_dir = Path("toolbox/workflows") - - self.temporal_address = temporal_address or os.getenv( - 'TEMPORAL_ADDRESS', - 'localhost:7233' - ) - self.temporal_namespace = temporal_namespace - self.discovery = WorkflowDiscovery(workflows_dir) - self.workflows: Dict[str, WorkflowInfo] = {} - self.client: Optional[Client] = None - - # Initialize storage backend - self.storage = storage or S3CachedStorage() - - logger.info( - f"TemporalManager initialized: {self.temporal_address} " - f"(namespace: {self.temporal_namespace})" - ) - - async def initialize(self): - """Initialize the manager by discovering workflows and connecting to Temporal.""" - try: - # Discover workflows - self.workflows = await self.discovery.discover_workflows() - - if not self.workflows: - logger.warning("No workflows discovered") - else: - logger.info( - f"Discovered {len(self.workflows)} workflows: " - f"{list(self.workflows.keys())}" - ) - - # Connect to Temporal - self.client = await Client.connect( - self.temporal_address, - namespace=self.temporal_namespace - ) - logger.info(f"āœ“ Connected to Temporal: {self.temporal_address}") - - except Exception as e: - logger.error(f"Failed to initialize Temporal manager: {e}", exc_info=True) - raise - - async def close(self): - """Close 
Temporal client connection.""" - if self.client: - # Temporal client doesn't need explicit close in Python SDK - pass - - async def get_workflows(self) -> Dict[str, WorkflowInfo]: - """ - Get all discovered workflows. - - Returns: - Dictionary mapping workflow names to their info - """ - return self.workflows - - async def get_workflow(self, name: str) -> Optional[WorkflowInfo]: - """ - Get workflow info by name. - - Args: - name: Workflow name - - Returns: - WorkflowInfo or None if not found - """ - return self.workflows.get(name) - - async def upload_target( - self, - file_path: Path, - user_id: str, - metadata: Optional[Dict[str, Any]] = None - ) -> str: - """ - Upload target file to storage. - - Args: - file_path: Local path to file - user_id: User ID - metadata: Optional metadata - - Returns: - Target ID for use in workflow execution - """ - target_id = await self.storage.upload_target(file_path, user_id, metadata) - logger.info(f"Uploaded target: {target_id}") - return target_id - - async def run_workflow( - self, - workflow_name: str, - target_id: str, - workflow_params: Optional[Dict[str, Any]] = None, - workflow_id: Optional[str] = None - ) -> WorkflowHandle: - """ - Execute a workflow. - - Args: - workflow_name: Name of workflow to execute - target_id: Target ID (from upload_target) - workflow_params: Additional workflow parameters - workflow_id: Optional workflow ID (generated if not provided) - - Returns: - WorkflowHandle for monitoring/results - - Raises: - ValueError: If workflow not found or client not initialized - """ - if not self.client: - raise ValueError("Temporal client not initialized. 
Call initialize() first.") - - # Get workflow info - workflow_info = self.workflows.get(workflow_name) - if not workflow_info: - raise ValueError(f"Workflow not found: {workflow_name}") - - # Generate workflow ID if not provided - if not workflow_id: - workflow_id = f"{workflow_name}-{str(uuid4())[:8]}" - - # Prepare workflow input arguments - workflow_params = workflow_params or {} - - # Build args list: [target_id, ...workflow_params in schema order] - # The workflow parameters are passed as individual positional args - workflow_args = [target_id] - - # Add parameters in order based on metadata schema - # This ensures parameters match the workflow signature order - # Apply defaults from metadata.yaml if parameter not provided - if 'parameters' in workflow_info.metadata: - param_schema = workflow_info.metadata['parameters'].get('properties', {}) - logger.debug(f"Found {len(param_schema)} parameters in schema") - # Iterate parameters in schema order and add values - for param_name in param_schema.keys(): - param_spec = param_schema[param_name] - - # Use provided param, or fall back to default from metadata - if workflow_params and param_name in workflow_params: - param_value = workflow_params[param_name] - logger.debug(f"Using provided value for {param_name}: {param_value}") - elif 'default' in param_spec: - param_value = param_spec['default'] - logger.debug(f"Using default for {param_name}: {param_value}") - else: - param_value = None - logger.debug(f"No value or default for {param_name}, using None") - - workflow_args.append(param_value) - else: - logger.debug("No 'parameters' section found in workflow metadata") - - # Determine task queue from workflow vertical - vertical = workflow_info.metadata.get("vertical", "default") - task_queue = f"{vertical}-queue" - - logger.info( - f"Starting workflow: {workflow_name} " - f"(id={workflow_id}, queue={task_queue}, target={target_id})" - ) - logger.info(f"DEBUG: workflow_args = {workflow_args}") - logger.info(f"DEBUG: 
workflow_params received = {workflow_params}") - - try: - # Start workflow execution with positional arguments - handle = await self.client.start_workflow( - workflow=workflow_info.workflow_type, # Workflow class name - args=workflow_args, # Positional arguments - id=workflow_id, - task_queue=task_queue, - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(minutes=1), - maximum_attempts=3 - ) - ) - - logger.info(f"āœ“ Workflow started: {workflow_id}") - return handle - - except Exception as e: - logger.error(f"Failed to start workflow {workflow_name}: {e}", exc_info=True) - raise - - async def get_workflow_status(self, workflow_id: str) -> Dict[str, Any]: - """ - Get workflow execution status. - - Args: - workflow_id: Workflow execution ID - - Returns: - Status dictionary with workflow state - - Raises: - ValueError: If client not initialized or workflow not found - """ - if not self.client: - raise ValueError("Temporal client not initialized") - - try: - # Get workflow handle - handle = self.client.get_workflow_handle(workflow_id) - - # Try to get result (non-blocking describe) - description = await handle.describe() - - status = { - "workflow_id": workflow_id, - "status": description.status.name, - "start_time": description.start_time.isoformat() if description.start_time else None, - "execution_time": description.execution_time.isoformat() if description.execution_time else None, - "close_time": description.close_time.isoformat() if description.close_time else None, - "task_queue": description.task_queue, - } - - logger.info(f"Workflow {workflow_id} status: {status['status']}") - return status - - except Exception as e: - logger.error(f"Failed to get workflow status: {e}", exc_info=True) - raise - - async def get_workflow_result( - self, - workflow_id: str, - timeout: Optional[timedelta] = None - ) -> Any: - """ - Get workflow execution result (blocking). 
- - Args: - workflow_id: Workflow execution ID - timeout: Maximum time to wait for result - - Returns: - Workflow result - - Raises: - ValueError: If client not initialized - TimeoutError: If timeout exceeded - """ - if not self.client: - raise ValueError("Temporal client not initialized") - - try: - handle = self.client.get_workflow_handle(workflow_id) - - logger.info(f"Waiting for workflow result: {workflow_id}") - - # Wait for workflow to complete and get result - if timeout: - # Use asyncio timeout if provided - import asyncio - result = await asyncio.wait_for(handle.result(), timeout=timeout.total_seconds()) - else: - result = await handle.result() - - logger.info(f"āœ“ Workflow {workflow_id} completed") - return result - - except Exception as e: - logger.error(f"Failed to get workflow result: {e}", exc_info=True) - raise - - async def cancel_workflow(self, workflow_id: str) -> None: - """ - Cancel a running workflow. - - Args: - workflow_id: Workflow execution ID - - Raises: - ValueError: If client not initialized - """ - if not self.client: - raise ValueError("Temporal client not initialized") - - try: - handle = self.client.get_workflow_handle(workflow_id) - await handle.cancel() - - logger.info(f"āœ“ Workflow cancelled: {workflow_id}") - - except Exception as e: - logger.error(f"Failed to cancel workflow: {e}", exc_info=True) - raise - - async def list_workflows( - self, - filter_query: Optional[str] = None, - limit: int = 100 - ) -> list[Dict[str, Any]]: - """ - List workflow executions. 
-
-        Args:
-            filter_query: Optional Temporal list filter query
-            limit: Maximum number of results
-
-        Returns:
-            List of workflow execution info
-
-        Raises:
-            ValueError: If client not initialized
-        """
-        if not self.client:
-            raise ValueError("Temporal client not initialized")
-
-        try:
-            workflows = []
-
-            # Use Temporal's list API
-            async for workflow in self.client.list_workflows(filter_query):
-                workflows.append({
-                    "workflow_id": workflow.id,
-                    "workflow_type": workflow.workflow_type,
-                    "status": workflow.status.name,
-                    "start_time": workflow.start_time.isoformat() if workflow.start_time else None,
-                    "close_time": workflow.close_time.isoformat() if workflow.close_time else None,
-                    "task_queue": workflow.task_queue,
-                })
-
-                if len(workflows) >= limit:
-                    break
-
-            logger.info(f"Listed {len(workflows)} workflows")
-            return workflows
-
-        except Exception as e:
-            logger.error(f"Failed to list workflows: {e}", exc_info=True)
-            raise
diff --git a/backend/tests/README.md b/backend/tests/README.md
deleted file mode 100644
index a1cada4..0000000
--- a/backend/tests/README.md
+++ /dev/null
@@ -1,119 +0,0 @@
-# FuzzForge Test Suite
-
-Comprehensive test infrastructure for FuzzForge modules and workflows.
-
-## Directory Structure
-
-```
-tests/
-ā”œā”€ā”€ conftest.py              # Shared pytest fixtures
-ā”œā”€ā”€ unit/                    # Fast, isolated unit tests
-│   ā”œā”€ā”€ test_modules/        # Module-specific tests
-│   │   ā”œā”€ā”€ test_cargo_fuzzer.py
-│   │   └── test_atheris_fuzzer.py
-│   ā”œā”€ā”€ test_workflows/      # Workflow tests
-│   └── test_api/            # API endpoint tests
-ā”œā”€ā”€ integration/             # Integration tests (requires Docker)
-└── fixtures/                # Test data and projects
-    ā”œā”€ā”€ test_projects/       # Vulnerable projects for testing
-    └── expected_results/    # Expected output for validation
-```
-
-## Running Tests
-
-### All Tests
-```bash
-cd backend
-pytest tests/ -v
-```
-
-### Unit Tests Only (Fast)
-```bash
-pytest tests/unit/ -v
-```
-
-### Integration Tests (Requires Docker)
-```bash
-# Start services
-docker-compose up -d
-
-# Run integration tests
-pytest tests/integration/ -v
-
-# Cleanup
-docker-compose down
-```
-
-### With Coverage
-```bash
-pytest tests/ --cov=toolbox/modules --cov=src --cov-report=html
-```
-
-### Parallel Execution
-```bash
-pytest tests/unit/ -n auto
-```
-
-## Available Fixtures
-
-### Workspace Fixtures
-- `temp_workspace`: Empty temporary workspace
-- `python_test_workspace`: Python project with vulnerabilities
-- `rust_test_workspace`: Rust project with fuzz targets
-
-### Module Fixtures
-- `atheris_fuzzer`: AtherisFuzzer instance
-- `cargo_fuzzer`: CargoFuzzer instance
-- `file_scanner`: FileScanner instance
-
-### Configuration Fixtures
-- `atheris_config`: Default Atheris configuration
-- `cargo_fuzz_config`: Default cargo-fuzz configuration
-- `gitleaks_config`: Default Gitleaks configuration
-
-### Mock Fixtures
-- `mock_stats_callback`: Mock stats callback for fuzzing
-- `mock_temporal_context`: Mock Temporal activity context
-
-## Writing Tests
-
-### Unit Test Example
-```python
-import pytest
-
-@pytest.mark.asyncio
-async def test_module_execution(cargo_fuzzer, rust_test_workspace, cargo_fuzz_config):
-    """Test module execution"""
-    result = await cargo_fuzzer.execute(cargo_fuzz_config, rust_test_workspace)
-
-    assert result.status == "success"
-    assert result.execution_time > 0
-```
-
-### Integration Test Example
-```python
-@pytest.mark.integration
-async def test_end_to_end_workflow():
-    """Test complete workflow execution"""
-    # Test full workflow with real services
-    pass
-```
-
-## CI/CD Integration
-
-Tests run automatically on:
-- **Push to main/develop**: Full test suite
-- **Pull requests**: Full test suite + coverage
-- **Nightly**: Extended integration tests
-
-See `.github/workflows/test.yml` for configuration.
-
-## Code Coverage
-
-Target coverage: **80%+** for core modules
-
-View coverage report:
-```bash
-pytest tests/ --cov --cov-report=html
-open htmlcov/index.html
-```
diff --git a/backend/tests/conftest.py b/backend/tests/conftest.py
deleted file mode 100644
index 0bc6eee..0000000
--- a/backend/tests/conftest.py
+++ /dev/null
@@ -1,230 +0,0 @@
-# Copyright (c) 2025 FuzzingLabs
-#
-# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
-# at the root of this repository for details.
-#
-# After the Change Date (four years from publication), this version of the
-# Licensed Work will be made available under the Apache License, Version 2.0.
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
-#
-# Additional attribution and requirements are provided in the NOTICE file.
-
-import sys
-from pathlib import Path
-from typing import Dict, Any
-import pytest
-
-# Ensure project root is on sys.path so `src` is importable
-ROOT = Path(__file__).resolve().parents[1]
-if str(ROOT) not in sys.path:
-    sys.path.insert(0, str(ROOT))
-
-# Add toolbox to path for module imports
-TOOLBOX = ROOT / "toolbox"
-if str(TOOLBOX) not in sys.path:
-    sys.path.insert(0, str(TOOLBOX))
-
-
-# ============================================================================
-# Workspace Fixtures
-# ============================================================================
-
-@pytest.fixture
-def temp_workspace(tmp_path):
-    """Create a temporary workspace directory for testing"""
-    workspace = tmp_path / "workspace"
-    workspace.mkdir()
-    return workspace
-
-
-@pytest.fixture
-def python_test_workspace(temp_workspace):
-    """Create a Python test workspace with sample files"""
-    # Create a simple Python project structure
-    (temp_workspace / "main.py").write_text("""
-def process_data(data):
-    # Intentional bug: no bounds checking
-    return data[0:100]
-
-def divide(a, b):
-    # Division by zero vulnerability
-    return a / b
-""")
-
-    (temp_workspace / "config.py").write_text("""
-# Hardcoded secrets for testing
-API_KEY = "sk_test_1234567890abcdef"
-DATABASE_URL = "postgresql://admin:password123@localhost/db"
-AWS_SECRET = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
-""")
-
-    return temp_workspace
-
-
-@pytest.fixture
-def rust_test_workspace(temp_workspace):
-    """Create a Rust test workspace with fuzz targets"""
-    # Create Cargo.toml
-    (temp_workspace / "Cargo.toml").write_text("""[package]
-name = "test_project"
-version = "0.1.0"
-edition = "2021"
-
-[dependencies]
-""")
-
-    # Create src/lib.rs
-    src_dir = temp_workspace / "src"
-    src_dir.mkdir()
-    (src_dir / "lib.rs").write_text("""
-pub fn process_buffer(data: &[u8]) -> Vec<u8> {
-    if data.len() < 4 {
-        return Vec::new();
-    }
-
-    // Vulnerability: bounds checking issue
-    let size = data[0] as usize;
-    let mut result = Vec::new();
-    for i in 0..size {
-        result.push(data[i]);
-    }
-    result
-}
-""")
-
-    # Create fuzz directory structure
-    fuzz_dir = temp_workspace / "fuzz"
-    fuzz_dir.mkdir()
-
-    (fuzz_dir / "Cargo.toml").write_text("""[package]
-name = "test_project-fuzz"
-version = "0.0.0"
-edition = "2021"
-
-[dependencies]
-libfuzzer-sys = "0.4"
-
-[dependencies.test_project]
-path = ".."
-
-[[bin]]
-name = "fuzz_target_1"
-path = "fuzz_targets/fuzz_target_1.rs"
-""")
-
-    fuzz_targets_dir = fuzz_dir / "fuzz_targets"
-    fuzz_targets_dir.mkdir()
-
-    (fuzz_targets_dir / "fuzz_target_1.rs").write_text("""#![no_main]
-use libfuzzer_sys::fuzz_target;
-use test_project::process_buffer;
-
-fuzz_target!(|data: &[u8]| {
-    let _ = process_buffer(data);
-});
-""")
-
-    return temp_workspace
-
-
-# ============================================================================
-# Module Configuration Fixtures
-# ============================================================================
-
-@pytest.fixture
-def atheris_config():
-    """Default Atheris fuzzer configuration"""
-    return {
-        "target_file": "auto-discover",
-        "max_iterations": 1000,
-        "timeout_seconds": 10,
-        "corpus_dir": None
-    }
-
-
-@pytest.fixture
-def cargo_fuzz_config():
-    """Default cargo-fuzz configuration"""
-    return {
-        "target_name": None,
-        "max_iterations": 1000,
-        "timeout_seconds": 10,
-        "sanitizer": "address"
-    }
-
-
-@pytest.fixture
-def gitleaks_config():
-    """Default Gitleaks configuration"""
-    return {
-        "config_path": None,
-        "scan_uncommitted": True
-    }
-
-
-@pytest.fixture
-def file_scanner_config():
-    """Default file scanner configuration"""
-    return {
-        "scan_patterns": ["*.py", "*.rs", "*.js"],
-        "exclude_patterns": ["*.test.*", "*.spec.*"],
-        "max_file_size": 1048576  # 1MB
-    }
-
-
-# ============================================================================
-# Module Instance Fixtures
-# ============================================================================
-
-@pytest.fixture
-def atheris_fuzzer():
-    """Create an AtherisFuzzer instance"""
-    from modules.fuzzer.atheris_fuzzer import AtherisFuzzer
-    return AtherisFuzzer()
-
-
-@pytest.fixture
-def cargo_fuzzer():
-    """Create a CargoFuzzer instance"""
-    from modules.fuzzer.cargo_fuzzer import CargoFuzzer
-    return CargoFuzzer()
-
-
-@pytest.fixture
-def file_scanner():
-    """Create a FileScanner instance"""
-    from modules.scanner.file_scanner import FileScanner
-    return FileScanner()
-
-
-# ============================================================================
-# Mock Fixtures
-# ============================================================================
-
-@pytest.fixture
-def mock_stats_callback():
-    """Mock stats callback for fuzzing"""
-    stats_received = []
-
-    async def callback(stats: Dict[str, Any]):
-        stats_received.append(stats)
-
-    callback.stats_received = stats_received
-    return callback
-
-
-@pytest.fixture
-def mock_temporal_context():
-    """Mock Temporal activity context"""
-    class MockActivityInfo:
-        def __init__(self):
-            self.workflow_id = "test-workflow-123"
-            self.activity_id = "test-activity-1"
-            self.attempt = 1
-
-    class MockContext:
-        def __init__(self):
-            self.info = MockActivityInfo()
-
-    return MockContext()
-
diff --git a/backend/tests/fixtures/__init__.py b/backend/tests/fixtures/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/backend/tests/integration/__init__.py b/backend/tests/integration/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/backend/tests/unit/__init__.py b/backend/tests/unit/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/backend/tests/unit/test_api/__init__.py b/backend/tests/unit/test_api/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/backend/tests/unit/test_modules/__init__.py b/backend/tests/unit/test_modules/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/backend/tests/unit/test_modules/test_atheris_fuzzer.py b/backend/tests/unit/test_modules/test_atheris_fuzzer.py
deleted file mode 100644
index 9cd01ce..0000000
--- a/backend/tests/unit/test_modules/test_atheris_fuzzer.py
+++ /dev/null
@@ -1,177 +0,0 @@
-"""
-Unit tests for AtherisFuzzer module
-"""
-
-import pytest
-from unittest.mock import AsyncMock, patch
-
-
-@pytest.mark.asyncio
-class TestAtherisFuzzerMetadata:
-    """Test AtherisFuzzer metadata"""
-
-    async def test_metadata_structure(self, atheris_fuzzer):
-        """Test that module metadata is properly defined"""
-        metadata = atheris_fuzzer.get_metadata()
-
-        assert metadata.name == "atheris_fuzzer"
-        assert metadata.category == "fuzzer"
-        assert "fuzzing" in metadata.tags
-        assert "python" in metadata.tags
-
-
-@pytest.mark.asyncio
-class TestAtherisFuzzerConfigValidation:
-    """Test configuration validation"""
-
-    async def test_valid_config(self, atheris_fuzzer, atheris_config):
-        """Test validation of valid configuration"""
-        assert atheris_fuzzer.validate_config(atheris_config) is True
-
-    async def test_invalid_max_iterations(self, atheris_fuzzer):
-        """Test validation fails with invalid max_iterations"""
-        config = {
-            "target_file": "fuzz_target.py",
-            "max_iterations": -1,
-            "timeout_seconds": 10
-        }
-        with pytest.raises(ValueError, match="max_iterations"):
-            atheris_fuzzer.validate_config(config)
-
-    async def test_invalid_timeout(self, atheris_fuzzer):
-        """Test validation fails with invalid timeout"""
-        config = {
-            "target_file": "fuzz_target.py",
-            "max_iterations": 1000,
-            "timeout_seconds": 0
-        }
-        with pytest.raises(ValueError, match="timeout_seconds"):
-            atheris_fuzzer.validate_config(config)
-
-
-@pytest.mark.asyncio
-class TestAtherisFuzzerDiscovery:
-    """Test fuzz target discovery"""
-
-    async def test_auto_discover(self, atheris_fuzzer, python_test_workspace):
-        """Test auto-discovery of Python fuzz targets"""
-        # Create a fuzz target file
-        (python_test_workspace / "fuzz_target.py").write_text("""
-import atheris
-import sys
-
-def TestOneInput(data):
-    pass
-
-if __name__ == "__main__":
-    atheris.Setup(sys.argv, TestOneInput)
-    atheris.Fuzz()
-""")
-
-        # Pass None for auto-discovery
-        target = atheris_fuzzer._discover_target(python_test_workspace, None)
-
-        assert target is not None
-        assert "fuzz_target.py" in str(target)
-
-
-@pytest.mark.asyncio
-class TestAtherisFuzzerExecution:
-    """Test fuzzer execution logic"""
-
-    async def test_execution_creates_result(self, atheris_fuzzer, python_test_workspace, atheris_config):
-        """Test that execution returns a ModuleResult"""
-        # Create a simple fuzz target
-        (python_test_workspace / "fuzz_target.py").write_text("""
-import atheris
-import sys
-
-def TestOneInput(data):
-    if len(data) > 0:
-        pass
-
-if __name__ == "__main__":
-    atheris.Setup(sys.argv, TestOneInput)
-    atheris.Fuzz()
-""")
-
-        # Use a very short timeout for testing
-        test_config = {
-            "target_file": "fuzz_target.py",
-            "max_iterations": 10,
-            "timeout_seconds": 1
-        }
-
-        # Mock the fuzzing subprocess to avoid actual execution
-        with patch.object(atheris_fuzzer, '_run_fuzzing', new_callable=AsyncMock, return_value=([], {"total_executions": 10})):
-            result = await atheris_fuzzer.execute(test_config, python_test_workspace)
-
-        assert result.module == "atheris_fuzzer"
-        assert result.status in ["success", "partial", "failed"]
-        assert isinstance(result.execution_time, float)
-
-
-@pytest.mark.asyncio
-class TestAtherisFuzzerStatsCallback:
-    """Test stats callback functionality"""
-
-    async def test_stats_callback_invoked(self, atheris_fuzzer, python_test_workspace, atheris_config, mock_stats_callback):
-        """Test that stats callback is invoked during fuzzing"""
-        (python_test_workspace / "fuzz_target.py").write_text("""
-import atheris
-import sys
-
-def TestOneInput(data):
-    pass
-
-if __name__ == "__main__":
-    atheris.Setup(sys.argv, TestOneInput)
-    atheris.Fuzz()
-""")
-
-        # Mock fuzzing to simulate stats
-        async def mock_run_fuzzing(test_one_input, target_path, workspace, max_iterations, timeout_seconds, stats_callback):
-            if stats_callback:
-                await stats_callback({
-                    "total_execs": 100,
-                    "execs_per_sec": 10.0,
-                    "crashes": 0,
-                    "coverage": 5,
-                    "corpus_size": 2,
-                    "elapsed_time": 10
-                })
-            return
-
-        with patch.object(atheris_fuzzer, '_run_fuzzing', side_effect=mock_run_fuzzing):
-            with patch.object(atheris_fuzzer, '_load_target_module', return_value=lambda x: None):
-                # Put stats_callback in config dict, not as kwarg
-                atheris_config["target_file"] = "fuzz_target.py"
-                atheris_config["stats_callback"] = mock_stats_callback
-                await atheris_fuzzer.execute(atheris_config, python_test_workspace)
-
-        # Verify callback was invoked
-        assert len(mock_stats_callback.stats_received) > 0
-
-
-@pytest.mark.asyncio
-class TestAtherisFuzzerFindingGeneration:
-    """Test finding generation from crashes"""
-
-    async def test_create_crash_finding(self, atheris_fuzzer):
-        """Test crash finding creation"""
-        finding = atheris_fuzzer.create_finding(
-            title="Crash: Exception in TestOneInput",
-            description="IndexError: list index out of range",
-            severity="high",
-            category="crash",
-            file_path="fuzz_target.py",
-            metadata={
-                "crash_type": "IndexError",
-                "stack_trace": "Traceback..."
-            }
-        )
-
-        assert finding.title == "Crash: Exception in TestOneInput"
-        assert finding.severity == "high"
-        assert finding.category == "crash"
-        assert "IndexError" in finding.metadata["crash_type"]
diff --git a/backend/tests/unit/test_modules/test_cargo_fuzzer.py b/backend/tests/unit/test_modules/test_cargo_fuzzer.py
deleted file mode 100644
index f550b9a..0000000
--- a/backend/tests/unit/test_modules/test_cargo_fuzzer.py
+++ /dev/null
@@ -1,177 +0,0 @@
-"""
-Unit tests for CargoFuzzer module
-"""
-
-import pytest
-from unittest.mock import AsyncMock, patch
-
-
-@pytest.mark.asyncio
-class TestCargoFuzzerMetadata:
-    """Test CargoFuzzer metadata"""
-
-    async def test_metadata_structure(self, cargo_fuzzer):
-        """Test that module metadata is properly defined"""
-        metadata = cargo_fuzzer.get_metadata()
-
-        assert metadata.name == "cargo_fuzz"
-        assert metadata.version == "0.11.2"
-        assert metadata.category == "fuzzer"
-        assert "fuzzing" in metadata.tags
-        assert "rust" in metadata.tags
-
-
-@pytest.mark.asyncio
-class TestCargoFuzzerConfigValidation:
-    """Test configuration validation"""
-
-    async def test_valid_config(self, cargo_fuzzer, cargo_fuzz_config):
-        """Test validation of valid configuration"""
-        assert cargo_fuzzer.validate_config(cargo_fuzz_config) is True
-
-    async def test_invalid_max_iterations(self, cargo_fuzzer):
-        """Test validation fails with invalid max_iterations"""
-        config = {
-            "max_iterations": -1,
-            "timeout_seconds": 10,
-            "sanitizer": "address"
-        }
-        with pytest.raises(ValueError, match="max_iterations"):
-            cargo_fuzzer.validate_config(config)
-
-    async def test_invalid_timeout(self, cargo_fuzzer):
-        """Test validation fails with invalid timeout"""
-        config = {
-            "max_iterations": 1000,
-            "timeout_seconds": 0,
-            "sanitizer": "address"
-        }
-        with pytest.raises(ValueError, match="timeout_seconds"):
-            cargo_fuzzer.validate_config(config)
-
-    async def test_invalid_sanitizer(self, cargo_fuzzer):
-        """Test validation fails with invalid sanitizer"""
-        config = {
-            "max_iterations": 1000,
-            "timeout_seconds": 10,
-            "sanitizer": "invalid_sanitizer"
-        }
-        with pytest.raises(ValueError, match="sanitizer"):
-            cargo_fuzzer.validate_config(config)
-
-
-@pytest.mark.asyncio
-class TestCargoFuzzerWorkspaceValidation:
-    """Test workspace validation"""
-
-    async def test_valid_workspace(self, cargo_fuzzer, rust_test_workspace):
-        """Test validation of valid workspace"""
-        assert cargo_fuzzer.validate_workspace(rust_test_workspace) is True
-
-    async def test_nonexistent_workspace(self, cargo_fuzzer, tmp_path):
-        """Test validation fails with nonexistent workspace"""
-        nonexistent = tmp_path / "does_not_exist"
-        with pytest.raises(ValueError, match="does not exist"):
-            cargo_fuzzer.validate_workspace(nonexistent)
-
-    async def test_workspace_is_file(self, cargo_fuzzer, tmp_path):
-        """Test validation fails when workspace is a file"""
-        file_path = tmp_path / "file.txt"
-        file_path.write_text("test")
-        with pytest.raises(ValueError, match="not a directory"):
-            cargo_fuzzer.validate_workspace(file_path)
-
-
-@pytest.mark.asyncio
-class TestCargoFuzzerDiscovery:
-    """Test fuzz target discovery"""
-
-    async def test_discover_targets(self, cargo_fuzzer, rust_test_workspace):
-        """Test discovery of fuzz targets"""
-        targets = await cargo_fuzzer._discover_fuzz_targets(rust_test_workspace)
-
-        assert len(targets) == 1
-        assert "fuzz_target_1" in targets
-
-    async def test_no_fuzz_directory(self, cargo_fuzzer, temp_workspace):
-        """Test discovery with no fuzz directory"""
-        targets = await cargo_fuzzer._discover_fuzz_targets(temp_workspace)
-
-        assert targets == []
-
-
-@pytest.mark.asyncio
-class TestCargoFuzzerExecution:
-    """Test fuzzer execution logic"""
-
-    async def test_execution_creates_result(self, cargo_fuzzer, rust_test_workspace, cargo_fuzz_config):
-        """Test that execution returns a ModuleResult"""
-        # Mock the build and run methods to avoid actual fuzzing
-        with patch.object(cargo_fuzzer, '_build_fuzz_target', new_callable=AsyncMock, return_value=True):
-            with patch.object(cargo_fuzzer, '_run_fuzzing', new_callable=AsyncMock, return_value=([], {"total_executions": 0, "crashes_found": 0})):
-                with patch.object(cargo_fuzzer, '_parse_crash_artifacts', new_callable=AsyncMock, return_value=[]):
-                    result = await cargo_fuzzer.execute(cargo_fuzz_config, rust_test_workspace)
-
-        assert result.module == "cargo_fuzz"
-        assert result.status == "success"
-        assert isinstance(result.execution_time, float)
-        assert result.execution_time >= 0
-
-    async def test_execution_with_no_targets(self, cargo_fuzzer, temp_workspace, cargo_fuzz_config):
-        """Test execution fails gracefully with no fuzz targets"""
-        result = await cargo_fuzzer.execute(cargo_fuzz_config, temp_workspace)
-
-        assert result.status == "failed"
-        assert "No fuzz targets found" in result.error
-
-
-@pytest.mark.asyncio
-class TestCargoFuzzerStatsCallback:
-    """Test stats callback functionality"""
-
-    async def test_stats_callback_invoked(self, cargo_fuzzer, rust_test_workspace, cargo_fuzz_config, mock_stats_callback):
-        """Test that stats callback is invoked during fuzzing"""
-        # Mock build/run to simulate stats generation
-        async def mock_run_fuzzing(workspace, target, config, callback):
-            # Simulate stats callback
-            if callback:
-                await callback({
-                    "total_execs": 1000,
-                    "execs_per_sec": 100.0,
-                    "crashes": 0,
-                    "coverage": 10,
-                    "corpus_size": 5,
-                    "elapsed_time": 10
-                })
-            return [], {"total_executions": 1000}
-
-        with patch.object(cargo_fuzzer, '_build_fuzz_target', new_callable=AsyncMock, return_value=True):
-            with patch.object(cargo_fuzzer, '_run_fuzzing', side_effect=mock_run_fuzzing):
-                with patch.object(cargo_fuzzer, '_parse_crash_artifacts', new_callable=AsyncMock, return_value=[]):
-                    await cargo_fuzzer.execute(cargo_fuzz_config, rust_test_workspace, stats_callback=mock_stats_callback)
-
-        # Verify callback was invoked
-        assert len(mock_stats_callback.stats_received) > 0
-        assert mock_stats_callback.stats_received[0]["total_execs"] == 1000
-
-
-@pytest.mark.asyncio
-class TestCargoFuzzerFindingGeneration:
-    """Test finding generation from crashes"""
-
-    async def test_create_finding_from_crash(self, cargo_fuzzer):
-        """Test finding creation"""
-        finding = cargo_fuzzer.create_finding(
-            title="Crash: Segmentation Fault",
-            description="Test crash",
-            severity="critical",
-            category="crash",
-            file_path="fuzz/fuzz_targets/fuzz_target_1.rs",
-            metadata={"crash_type": "SIGSEGV"}
-        )
-
-        assert finding.title == "Crash: Segmentation Fault"
-        assert finding.severity == "critical"
-        assert finding.category == "crash"
-        assert finding.file_path == "fuzz/fuzz_targets/fuzz_target_1.rs"
-        assert finding.metadata["crash_type"] == "SIGSEGV"
diff --git a/backend/tests/unit/test_modules/test_file_scanner.py b/backend/tests/unit/test_modules/test_file_scanner.py
deleted file mode 100644
index 12332f0..0000000
--- a/backend/tests/unit/test_modules/test_file_scanner.py
+++ /dev/null
@@ -1,349 +0,0 @@
-"""
-Unit tests for FileScanner module
-"""
-
-import sys
-from pathlib import Path
-
-import pytest
-
-sys.path.insert(0, str(Path(__file__).resolve().parents[3] / "toolbox"))
-
-
-
-@pytest.mark.asyncio
-class TestFileScannerMetadata:
-    """Test FileScanner metadata"""
-
-    async def test_metadata_structure(self, file_scanner):
-        """Test that metadata has correct structure"""
-        metadata = file_scanner.get_metadata()
-
-        assert metadata.name == "file_scanner"
-        assert metadata.version == "1.0.0"
-        assert metadata.category == "scanner"
-        assert "files" in metadata.tags
-        assert "enumeration" in metadata.tags
-        assert metadata.requires_workspace is True
-
-
-@pytest.mark.asyncio
-class TestFileScannerConfigValidation:
-    """Test configuration validation"""
-
-    async def test_valid_config(self, file_scanner):
-        """Test that valid config passes validation"""
-        config = {
-            "patterns": ["*.py", "*.js"],
-            "max_file_size": 1048576,
-            "check_sensitive": True,
-            "calculate_hashes": False
-        }
-        assert file_scanner.validate_config(config) is True
-
-    async def test_default_config(self, file_scanner):
-        """Test that empty config uses defaults"""
-        config = {}
-        assert file_scanner.validate_config(config) is True
-
-    async def test_invalid_patterns_type(self, file_scanner):
-        """Test that non-list patterns raises error"""
-        config = {"patterns": "*.py"}
-        with pytest.raises(ValueError, match="patterns must be a list"):
-            file_scanner.validate_config(config)
-
-    async def test_invalid_max_file_size(self, file_scanner):
-        """Test that invalid max_file_size raises error"""
-        config = {"max_file_size": -1}
-        with pytest.raises(ValueError, match="max_file_size must be a positive integer"):
-            file_scanner.validate_config(config)
-
-    async def test_invalid_max_file_size_type(self, file_scanner):
-        """Test that non-integer max_file_size raises error"""
-        config = {"max_file_size": "large"}
-        with pytest.raises(ValueError, match="max_file_size must be a positive integer"):
-            file_scanner.validate_config(config)
-
-
-@pytest.mark.asyncio
-class TestFileScannerExecution:
-    """Test scanner execution"""
-
-    async def test_scan_python_files(self, file_scanner, python_test_workspace):
-        """Test scanning Python files"""
-        config = {
-            "patterns": ["*.py"],
-            "check_sensitive": False,
-            "calculate_hashes": False
-        }
-
-        result = await file_scanner.execute(config, python_test_workspace)
-
-        assert result.module == "file_scanner"
-        assert result.status == "success"
-        assert len(result.findings) > 0
-
-        # Check that Python files were found
-        python_files = [f for f in result.findings if f.file_path.endswith('.py')]
-        assert len(python_files) > 0
-
-    async def test_scan_all_files(self, file_scanner, python_test_workspace):
-        """Test scanning all files with wildcard"""
-        config = {
-            "patterns": ["*"],
-            "check_sensitive": False,
-            "calculate_hashes": False
-        }
-
-        result = await file_scanner.execute(config, python_test_workspace)
-
-        assert result.status == "success"
-        assert len(result.findings) > 0
-        assert result.summary["total_files"] > 0
-
-    async def test_scan_with_multiple_patterns(self, file_scanner, python_test_workspace):
-        """Test scanning with multiple patterns"""
-        config = {
-            "patterns": ["*.py", "*.txt"],
-            "check_sensitive": False,
-            "calculate_hashes": False
-        }
-
-        result = await file_scanner.execute(config, python_test_workspace)
-
-        assert result.status == "success"
-        assert len(result.findings) > 0
-
-    async def test_empty_workspace(self, file_scanner, temp_workspace):
-        """Test scanning empty workspace"""
-        config = {
-            "patterns": ["*.py"],
-            "check_sensitive": False
-        }
-
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        assert len(result.findings) == 0
-        assert result.summary["total_files"] == 0
-
-
-@pytest.mark.asyncio
-class TestFileScannerSensitiveDetection:
-    """Test sensitive file detection"""
-
-    async def test_detect_env_file(self, file_scanner, temp_workspace):
-        """Test detection of .env file"""
-        # Create .env file
-        (temp_workspace / ".env").write_text("API_KEY=secret123")
-
-        config = {
-            "patterns": ["*"],
-            "check_sensitive": True,
-            "calculate_hashes": False
-        }
-
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-
-        # Check for sensitive file finding
-        sensitive_findings = [f for f in result.findings if f.category == "sensitive_file"]
-        assert len(sensitive_findings) > 0
-        assert any(".env" in f.title for f in sensitive_findings)
-
-    async def test_detect_private_key(self, file_scanner, temp_workspace):
-        """Test detection of private key file"""
-        # Create private key file
-        (temp_workspace / "id_rsa").write_text("-----BEGIN RSA PRIVATE KEY-----")
-
-        config = {
-            "patterns": ["*"],
-            "check_sensitive": True
-        }
-
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        sensitive_findings = [f for f in result.findings if f.category == "sensitive_file"]
-        assert len(sensitive_findings) > 0
-
-    async def test_no_sensitive_detection_when_disabled(self, file_scanner, temp_workspace):
-        """Test that sensitive detection can be disabled"""
-        (temp_workspace / ".env").write_text("API_KEY=secret123")
-
-        config = {
-            "patterns": ["*"],
-            "check_sensitive": False
-        }
-
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        sensitive_findings = [f for f in result.findings if f.category == "sensitive_file"]
-        assert len(sensitive_findings) == 0
-
-
-@pytest.mark.asyncio
-class TestFileScannerHashing:
-    """Test file hashing functionality"""
-
-    async def test_hash_calculation(self, file_scanner, temp_workspace):
-        """Test SHA256 hash calculation"""
-        # Create test file
-        test_file = temp_workspace / "test.txt"
-        test_file.write_text("Hello World")
-
-        config = {
-            "patterns": ["*.txt"],
-            "calculate_hashes": True
-        }
-
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-
-        # Find the test.txt finding
-        txt_findings = [f for f in result.findings if "test.txt" in f.file_path]
-        assert len(txt_findings) > 0
-
-        # Check that hash was calculated
-        finding = txt_findings[0]
-        assert finding.metadata.get("file_hash") is not None
-        assert len(finding.metadata["file_hash"]) == 64  # SHA256 hex length
-
-    async def test_no_hash_when_disabled(self, file_scanner, temp_workspace):
-        """Test that hashing can be disabled"""
-        test_file = temp_workspace / "test.txt"
-        test_file.write_text("Hello World")
-
-        config = {
-            "patterns": ["*.txt"],
-            "calculate_hashes": False
-        }
-
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        txt_findings = [f for f in result.findings if "test.txt" in f.file_path]
-
-        if len(txt_findings) > 0:
-            finding = txt_findings[0]
-            assert finding.metadata.get("file_hash") is None
-
-
-@pytest.mark.asyncio
-class TestFileScannerFileTypes:
-    """Test file type detection"""
-
-    async def test_detect_python_type(self, file_scanner, temp_workspace):
-        """Test detection of Python file type"""
-        (temp_workspace / "script.py").write_text("print('hello')")
-
-        config = {"patterns": ["*.py"]}
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        py_findings = [f for f in result.findings if "script.py" in f.file_path]
-        assert len(py_findings) > 0
-        assert "python" in py_findings[0].metadata["file_type"]
-
-    async def test_detect_javascript_type(self, file_scanner, temp_workspace):
-        """Test detection of JavaScript file type"""
-        (temp_workspace / "app.js").write_text("console.log('hello')")
-
-        config = {"patterns": ["*.js"]}
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        js_findings = [f for f in result.findings if "app.js" in f.file_path]
-        assert len(js_findings) > 0
-        assert "javascript" in js_findings[0].metadata["file_type"]
-
-    async def test_file_type_summary(self, file_scanner, temp_workspace):
-        """Test that file type summary is generated"""
-        (temp_workspace / "script.py").write_text("print('hello')")
-        (temp_workspace / "app.js").write_text("console.log('hello')")
-        (temp_workspace / "readme.txt").write_text("Documentation")
-
-        config = {"patterns": ["*"]}
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        assert "file_types" in result.summary
-        assert len(result.summary["file_types"]) > 0
-
-
-@pytest.mark.asyncio
-class TestFileScannerSizeLimits:
-    """Test file size handling"""
-
-    async def test_skip_large_files(self, file_scanner, temp_workspace):
-        """Test that large files are skipped"""
-        # Create a "large" file
-        large_file = temp_workspace / "large.txt"
-        large_file.write_text("x" * 1000)
-
-        config = {
-            "patterns": ["*.txt"],
-            "max_file_size": 500  # Set limit smaller than file
-        }
-
-        result = await file_scanner.execute(config, temp_workspace)
-
-        # Should succeed but skip the large file
-        assert result.status == "success"
-
-        # The file should still be counted but not have a detailed finding
-        assert result.summary["total_files"] > 0
-
-    async def test_process_small_files(self, file_scanner, temp_workspace):
-        """Test that small files are processed"""
-        small_file = temp_workspace / "small.txt"
-        small_file.write_text("small content")
-
-        config = {
-            "patterns": ["*.txt"],
-            "max_file_size": 1048576  # 1MB
-        }
-
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        txt_findings = [f for f in result.findings if "small.txt" in f.file_path]
-        assert len(txt_findings) > 0
-
-
-@pytest.mark.asyncio
-class TestFileScannerSummary:
-    """Test result summary generation"""
-
-    async def test_summary_structure(self, file_scanner, python_test_workspace):
-        """Test that summary has correct structure"""
-        config = {"patterns": ["*"]}
-        result = await file_scanner.execute(config, python_test_workspace)
-
-        assert result.status == "success"
-        assert "total_files" in result.summary
-        assert "total_size_bytes" in result.summary
-        assert "file_types" in result.summary
-        assert "patterns_scanned" in result.summary
-
-        assert isinstance(result.summary["total_files"], int)
-        assert isinstance(result.summary["total_size_bytes"], int)
-        assert isinstance(result.summary["file_types"], dict)
-        assert isinstance(result.summary["patterns_scanned"], list)
-
-    async def test_summary_counts(self, file_scanner, temp_workspace):
-        """Test that summary counts are accurate"""
-        # Create known files
-        (temp_workspace / "file1.py").write_text("content1")
-        (temp_workspace / "file2.py").write_text("content2")
-        (temp_workspace / "file3.txt").write_text("content3")
-
-        config = {"patterns": ["*"]}
-        result = await file_scanner.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        assert result.summary["total_files"] == 3
-        assert result.summary["total_size_bytes"] > 0
diff --git a/backend/tests/unit/test_modules/test_security_analyzer.py b/backend/tests/unit/test_modules/test_security_analyzer.py
deleted file mode 100644
index 7365a78..0000000
--- a/backend/tests/unit/test_modules/test_security_analyzer.py
+++ /dev/null
@@ -1,493 +0,0 @@
-"""
-Unit tests for SecurityAnalyzer module
-"""
-
-import pytest
-import sys
-from pathlib import Path
-
-sys.path.insert(0, str(Path(__file__).resolve().parents[3] / "toolbox"))
-
-from modules.analyzer.security_analyzer import SecurityAnalyzer
-
-
-@pytest.fixture
-def security_analyzer():
-    """Create SecurityAnalyzer instance"""
-    return SecurityAnalyzer()
-
-
-@pytest.mark.asyncio
-class TestSecurityAnalyzerMetadata:
-    """Test SecurityAnalyzer metadata"""
-
-    async def test_metadata_structure(self, security_analyzer):
-        """Test that metadata has correct structure"""
-        metadata = security_analyzer.get_metadata()
-
-        assert metadata.name == "security_analyzer"
-        assert metadata.version == "1.0.0"
-        assert metadata.category == "analyzer"
-        assert "security" in metadata.tags
-        assert "vulnerabilities" in metadata.tags
-        assert metadata.requires_workspace is True
-
-
-@pytest.mark.asyncio
-class TestSecurityAnalyzerConfigValidation:
-    """Test configuration validation"""
-
-    async def test_valid_config(self, security_analyzer):
-        """Test that valid config passes validation"""
-        config = {
-            "file_extensions": [".py", ".js"],
-            "check_secrets": True,
-            "check_sql": True,
-            "check_dangerous_functions": True
-        }
-        assert security_analyzer.validate_config(config) is True
-
-    async def test_default_config(self, security_analyzer):
-        """Test that empty config uses defaults"""
-        config = {}
-        assert security_analyzer.validate_config(config) is True
-
-    async def test_invalid_extensions_type(self, security_analyzer):
-        """Test that non-list extensions raises error"""
-        config = {"file_extensions": ".py"}
-        with pytest.raises(ValueError, match="file_extensions must be a list"):
-            security_analyzer.validate_config(config)
-
-
-@pytest.mark.asyncio
-class TestSecurityAnalyzerSecretDetection:
-    """Test hardcoded secret detection"""
-
-    async def test_detect_api_key(self, security_analyzer, temp_workspace):
-        """Test detection of hardcoded API key"""
-        code_file = temp_workspace / "config.py"
-        code_file.write_text("""
-# Configuration file
-api_key = "apikey_live_abcdefghijklmnopqrstuvwxyzabcdefghijk"
-database_url = "postgresql://localhost/db"
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_secrets": True,
-            "check_sql": False,
-            "check_dangerous_functions": False
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        secret_findings = [f for f in result.findings if f.category == "hardcoded_secret"]
-        assert len(secret_findings) > 0
-        assert any("API Key" in f.title for f in secret_findings)
-
-    async def test_detect_password(self, security_analyzer, temp_workspace):
-        """Test detection of hardcoded password"""
-        code_file = temp_workspace / "auth.py"
-        code_file.write_text("""
-def connect():
-    password = "mySecretP@ssw0rd"
-    return connect_db(password)
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_secrets": True,
-            "check_sql": False,
-            "check_dangerous_functions": False
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        secret_findings = [f for f in result.findings if f.category == "hardcoded_secret"]
-        assert len(secret_findings) > 0
-
-    async def test_detect_aws_credentials(self, security_analyzer, temp_workspace):
-        """Test detection of AWS credentials"""
-        code_file = temp_workspace / "aws_config.py"
-        code_file.write_text("""
-aws_access_key = "AKIAIOSFODNN7REALKEY"
-aws_secret_key = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYREALKEY"
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_secrets": True
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        aws_findings = [f for f in result.findings if "AWS" in f.title]
-        assert len(aws_findings) >= 2  # Both access key and secret key
-
-    async def test_no_secret_detection_when_disabled(self, security_analyzer, temp_workspace):
-        """Test that secret detection can be disabled"""
-        code_file = temp_workspace / "config.py"
-        code_file.write_text('api_key = "sk_live_1234567890abcdef"')
-
-        config = {
-            "file_extensions": [".py"],
-            "check_secrets": False
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        secret_findings = [f for f in result.findings if f.category == "hardcoded_secret"]
-        assert len(secret_findings) == 0
-
-
-@pytest.mark.asyncio
-class TestSecurityAnalyzerSQLInjection:
-    """Test SQL injection detection"""
-
-    async def test_detect_string_concatenation(self, security_analyzer, temp_workspace):
-        """Test detection of SQL string concatenation"""
-        code_file = temp_workspace / "db.py"
-        code_file.write_text("""
-def get_user(user_id):
-    query = "SELECT * FROM users WHERE id = " + user_id
-    return execute(query)
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_secrets": False,
-            "check_sql": True,
-            "check_dangerous_functions": False
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        sql_findings = [f for f in result.findings if f.category == "sql_injection"]
-        assert len(sql_findings) > 0
-
-    async def test_detect_f_string_sql(self, security_analyzer, temp_workspace):
-        """Test detection of f-string in SQL"""
-        code_file = temp_workspace / "db.py"
-        code_file.write_text("""
-def get_user(name):
-    query = f"SELECT * FROM users WHERE name = '{name}'"
-    return execute(query)
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_sql": True
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        sql_findings = [f for f in result.findings if f.category == "sql_injection"]
-        assert len(sql_findings) > 0
-
-    async def test_detect_dynamic_query_building(self, security_analyzer, temp_workspace):
-        """Test detection of dynamic query building"""
-        code_file = temp_workspace / "queries.py"
-        code_file.write_text("""
-def search(keyword):
-    query = "SELECT * FROM products WHERE name LIKE " + keyword
-    execute(query + " ORDER BY price")
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_sql": True
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        sql_findings = [f for f in result.findings if f.category == "sql_injection"]
-        assert len(sql_findings) > 0
-
-    async def test_no_sql_detection_when_disabled(self, security_analyzer, temp_workspace):
-        """Test that SQL detection can be disabled"""
-        code_file = temp_workspace / "db.py"
-        code_file.write_text('query = "SELECT * FROM users WHERE id = " + user_id')
-
-        config = {
-            "file_extensions": [".py"],
-            "check_sql": False
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        sql_findings = [f for f in result.findings if f.category == "sql_injection"]
-        assert len(sql_findings) == 0
-
-
-@pytest.mark.asyncio
-class TestSecurityAnalyzerDangerousFunctions:
-    """Test dangerous function detection"""
-
-    async def test_detect_eval(self, security_analyzer, temp_workspace):
-        """Test detection of eval() usage"""
-        code_file = temp_workspace / "dangerous.py"
-        code_file.write_text("""
-def process_input(user_input):
-    result = eval(user_input)
-    return result
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_secrets": False,
-            "check_sql": False,
-            "check_dangerous_functions": True
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        dangerous_findings = [f for f in result.findings if f.category == "dangerous_function"]
-        assert len(dangerous_findings) > 0
-        assert any("eval" in f.title.lower() for f in dangerous_findings)
-
-    async def test_detect_exec(self, security_analyzer, temp_workspace):
-        """Test detection of exec() usage"""
-        code_file = temp_workspace / "runner.py"
-        code_file.write_text("""
-def run_code(code):
-    exec(code)
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_dangerous_functions": True
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        dangerous_findings = [f for f in result.findings if f.category == "dangerous_function"]
-        assert len(dangerous_findings) > 0
-
-    async def test_detect_os_system(self, security_analyzer, temp_workspace):
-        """Test detection of os.system() usage"""
-        code_file = temp_workspace / "commands.py"
-        code_file.write_text("""
-import os
-
-def run_command(cmd):
-    os.system(cmd)
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_dangerous_functions": True
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        dangerous_findings = [f for f in result.findings if f.category == "dangerous_function"]
-        assert len(dangerous_findings) > 0
-        assert any("os.system" in f.title for f in dangerous_findings)
-
-    async def test_detect_pickle_loads(self, security_analyzer, temp_workspace):
-        """Test detection of pickle.loads() usage"""
-        code_file = temp_workspace / "serializer.py"
-        code_file.write_text("""
-import pickle
-
-def deserialize(data):
-    return pickle.loads(data)
-""")
-
-        config = {
-            "file_extensions": [".py"],
-            "check_dangerous_functions": True
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        dangerous_findings = [f for f in result.findings if f.category == "dangerous_function"]
-        assert len(dangerous_findings) > 0
-
-    async def test_detect_javascript_eval(self, security_analyzer, temp_workspace):
-        """Test detection of eval() in JavaScript"""
-        code_file = temp_workspace / "app.js"
-        code_file.write_text("""
-function processInput(userInput) {
-    return eval(userInput);
-}
-""")
-
-        config = {
-            "file_extensions": [".js"],
-            "check_dangerous_functions": True
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        dangerous_findings = [f for f in result.findings if f.category == "dangerous_function"]
-        assert len(dangerous_findings) > 0
-
-    async def test_detect_innerHTML(self, security_analyzer, temp_workspace):
-        """Test detection of innerHTML (XSS risk)"""
-        code_file = temp_workspace / "dom.js"
-        code_file.write_text("""
-function updateContent(html) {
-    document.getElementById("content").innerHTML = html;
-}
-""")
-
-        config = {
-            "file_extensions": [".js"],
-            "check_dangerous_functions": True
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        dangerous_findings = [f for f in result.findings if f.category == "dangerous_function"]
-        assert len(dangerous_findings) > 0
-
-    async def test_no_dangerous_detection_when_disabled(self, security_analyzer, temp_workspace):
-        """Test that dangerous function detection can be disabled"""
-        code_file = temp_workspace / "code.py"
-        code_file.write_text('result = eval(user_input)')
-
-        config = {
-            "file_extensions": [".py"],
-            "check_dangerous_functions": False
-        }
-
-        result = await security_analyzer.execute(config, temp_workspace)
-
-        assert result.status == "success"
-        dangerous_findings = [f for f in result.findings if f.category == "dangerous_function"]
-        assert len(dangerous_findings) == 0
-
-
-@pytest.mark.asyncio
-class TestSecurityAnalyzerMultipleIssues:
-    """Test detection of multiple issues in same file"""
-
-    async def test_detect_multiple_vulnerabilities(self, security_analyzer, temp_workspace):
-        """Test detection of multiple vulnerability types"""
-        code_file = temp_workspace / "vulnerable.py"
-        code_file.write_text("""
-import os
-
-# Hardcoded
credentials -api_key = "apikey_live_abcdefghijklmnopqrstuvwxyzabcdef" -password = "MySecureP@ssw0rd" - -def process_query(user_input): - # SQL injection - query = "SELECT * FROM users WHERE name = " + user_input - - # Dangerous function - result = eval(user_input) - - # Command injection - os.system(user_input) - - return result -""") - - config = { - "file_extensions": [".py"], - "check_secrets": True, - "check_sql": True, - "check_dangerous_functions": True - } - - result = await security_analyzer.execute(config, temp_workspace) - - assert result.status == "success" - - # Should find multiple types of issues - secret_findings = [f for f in result.findings if f.category == "hardcoded_secret"] - sql_findings = [f for f in result.findings if f.category == "sql_injection"] - dangerous_findings = [f for f in result.findings if f.category == "dangerous_function"] - - assert len(secret_findings) > 0 - assert len(sql_findings) > 0 - assert len(dangerous_findings) > 0 - - -@pytest.mark.asyncio -class TestSecurityAnalyzerSummary: - """Test result summary generation""" - - async def test_summary_structure(self, security_analyzer, temp_workspace): - """Test that summary has correct structure""" - (temp_workspace / "test.py").write_text("print('hello')") - - config = {"file_extensions": [".py"]} - result = await security_analyzer.execute(config, temp_workspace) - - assert result.status == "success" - assert "files_analyzed" in result.summary - assert "total_findings" in result.summary - assert "extensions_scanned" in result.summary - - assert isinstance(result.summary["files_analyzed"], int) - assert isinstance(result.summary["total_findings"], int) - assert isinstance(result.summary["extensions_scanned"], list) - - async def test_empty_workspace(self, security_analyzer, temp_workspace): - """Test analyzing empty workspace""" - config = {"file_extensions": [".py"]} - result = await security_analyzer.execute(config, temp_workspace) - - assert result.status == "partial" # No 
files found - assert result.summary["files_analyzed"] == 0 - - async def test_analyze_multiple_file_types(self, security_analyzer, temp_workspace): - """Test analyzing multiple file types""" - (temp_workspace / "app.py").write_text("print('hello')") - (temp_workspace / "script.js").write_text("console.log('hello')") - (temp_workspace / "index.php").write_text("") - - config = {"file_extensions": [".py", ".js", ".php"]} - result = await security_analyzer.execute(config, temp_workspace) - - assert result.status == "success" - assert result.summary["files_analyzed"] == 3 - - -@pytest.mark.asyncio -class TestSecurityAnalyzerFalsePositives: - """Test false positive filtering""" - - async def test_skip_test_secrets(self, security_analyzer, temp_workspace): - """Test that test/example secrets are filtered""" - code_file = temp_workspace / "test_config.py" - code_file.write_text(""" -# Test configuration - should be filtered -api_key = "test_key_example" -password = "dummy_password_123" -token = "sample_token_placeholder" -""") - - config = { - "file_extensions": [".py"], - "check_secrets": True - } - - result = await security_analyzer.execute(config, temp_workspace) - - assert result.status == "success" - # These should be filtered as false positives - secret_findings = [f for f in result.findings if f.category == "hardcoded_secret"] - # Should have fewer or no findings due to false positive filtering - assert len(secret_findings) == 0 or all( - not any(fp in f.description.lower() for fp in ['test', 'example', 'dummy', 'sample']) - for f in secret_findings - ) diff --git a/backend/tests/unit/test_workflows/__init__.py b/backend/tests/unit/test_workflows/__init__.py deleted file mode 100644 index e69de29..0000000 diff --git a/backend/toolbox/__init__.py b/backend/toolbox/__init__.py deleted file mode 100644 index 43bcfe7..0000000 --- a/backend/toolbox/__init__.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source 
License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - diff --git a/backend/toolbox/common/storage_activities.py b/backend/toolbox/common/storage_activities.py deleted file mode 100644 index a09a83c..0000000 --- a/backend/toolbox/common/storage_activities.py +++ /dev/null @@ -1,369 +0,0 @@ -""" -FuzzForge Common Storage Activities - -Activities for interacting with MinIO storage: -- get_target_activity: Download target from MinIO to local cache -- cleanup_cache_activity: Remove target from local cache -- upload_results_activity: Upload workflow results to MinIO -""" - -import logging -import os -import shutil -from pathlib import Path - -import boto3 -from botocore.exceptions import ClientError -from temporalio import activity - -# Configure logging -logger = logging.getLogger(__name__) - -# Initialize S3 client (MinIO) -s3_client = boto3.client( - 's3', - endpoint_url=os.getenv('S3_ENDPOINT', 'http://minio:9000'), - aws_access_key_id=os.getenv('S3_ACCESS_KEY', 'fuzzforge'), - aws_secret_access_key=os.getenv('S3_SECRET_KEY', 'fuzzforge123'), - region_name=os.getenv('S3_REGION', 'us-east-1'), - use_ssl=os.getenv('S3_USE_SSL', 'false').lower() == 'true' -) - -# Configuration -S3_BUCKET = os.getenv('S3_BUCKET', 'targets') -CACHE_DIR = Path(os.getenv('CACHE_DIR', '/cache')) -CACHE_MAX_SIZE_GB = int(os.getenv('CACHE_MAX_SIZE', '10').rstrip('GB')) - - -@activity.defn(name="get_target") -async def get_target_activity( - target_id: str, - run_id: str = None, - workspace_isolation: str = "isolated" -) -> str: - """ - Download target from MinIO to local cache. 
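The three `workspace_isolation` modes documented here map to distinct cache layouts under `CACHE_DIR`. As an illustrative sketch only — `run_cache_dir` is a hypothetical helper mirroring the path conventions of `get_target_activity`, not part of the module:

```python
from pathlib import Path
from typing import Optional

CACHE_DIR = Path("/cache")  # same default as the CACHE_DIR env fallback above

def run_cache_dir(target_id: str, run_id: Optional[str], mode: str) -> Path:
    """Cache directory a run works out of, per isolation mode (sketch only)."""
    if mode == "isolated":
        # Private workspace per run: /cache/<target>/<run>/
        return CACHE_DIR / target_id / run_id
    if mode == "shared":
        # One workspace reused by every run (legacy): /cache/<target>/
        return CACHE_DIR / target_id
    if mode == "copy-on-write":
        # Downloaded once into /cache/<target>/shared, then copied per run
        return CACHE_DIR / target_id / run_id
    raise ValueError(f"unknown isolation mode: {mode}")
```

Note that for `copy-on-write` the run directory is the same shape as `isolated`; the difference is only where the initial download lives.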
- - Args: - target_id: UUID of the uploaded target - run_id: Workflow run ID for isolation (required for isolated mode) - workspace_isolation: Isolation mode - "isolated" (default), "shared", or "copy-on-write" - - Returns: - Local path to the cached target workspace - - Raises: - FileNotFoundError: If target doesn't exist in MinIO - ValueError: If run_id not provided for isolated mode - Exception: For other download errors - """ - logger.info( - f"Activity: get_target (target_id={target_id}, run_id={run_id}, " - f"isolation={workspace_isolation})" - ) - - # Validate isolation mode - valid_modes = ["isolated", "shared", "copy-on-write"] - if workspace_isolation not in valid_modes: - raise ValueError( - f"Invalid workspace_isolation mode: {workspace_isolation}. " - f"Must be one of: {valid_modes}" - ) - - # Require run_id for isolated and copy-on-write modes - if workspace_isolation in ["isolated", "copy-on-write"] and not run_id: - raise ValueError( - f"run_id is required for workspace_isolation='{workspace_isolation}'" - ) - - # Define cache paths based on isolation mode - if workspace_isolation == "isolated": - # Each run gets its own isolated workspace - cache_path = CACHE_DIR / target_id / run_id - cached_file = cache_path / "target" - elif workspace_isolation == "shared": - # All runs share the same workspace (legacy behavior) - cache_path = CACHE_DIR / target_id - cached_file = cache_path / "target" - else: # copy-on-write - # Shared download, run-specific copy - shared_cache_path = CACHE_DIR / target_id / "shared" - cache_path = CACHE_DIR / target_id / run_id - cached_file = shared_cache_path / "target" - - # Handle copy-on-write mode - if workspace_isolation == "copy-on-write": - # Check if shared cache exists - if cached_file.exists(): - logger.info(f"Copy-on-write: Shared cache HIT for {target_id}") - - # Copy shared workspace to run-specific path - shared_workspace = shared_cache_path / "workspace" - run_workspace = cache_path / "workspace" - - if 
shared_workspace.exists(): - logger.info(f"Copying workspace to isolated run path: {run_workspace}") - cache_path.mkdir(parents=True, exist_ok=True) - shutil.copytree(shared_workspace, run_workspace) - return str(run_workspace) - else: - # Shared file exists but not extracted (non-tarball) - run_file = cache_path / "target" - cache_path.mkdir(parents=True, exist_ok=True) - shutil.copy2(cached_file, run_file) - return str(run_file) - # If shared cache doesn't exist, fall through to download - - # Check if target is already cached (isolated or shared mode) - elif cached_file.exists(): - # Update access time for LRU - cached_file.touch() - logger.info(f"Cache HIT: {target_id} (mode: {workspace_isolation})") - - # Check if workspace directory exists (extracted tarball) - workspace_dir = cache_path / "workspace" - if workspace_dir.exists() and workspace_dir.is_dir(): - logger.info(f"Returning cached workspace: {workspace_dir}") - return str(workspace_dir) - else: - # Return cached file (not a tarball) - return str(cached_file) - - # Cache miss - download from MinIO - logger.info( - f"Cache MISS: {target_id} (mode: {workspace_isolation}), " - f"downloading from MinIO..." 
- ) - - try: - # Create cache directory - cache_path.mkdir(parents=True, exist_ok=True) - - # Download from S3/MinIO - s3_key = f'{target_id}/target' - logger.info(f"Downloading s3://{S3_BUCKET}/{s3_key} -> {cached_file}") - - s3_client.download_file( - Bucket=S3_BUCKET, - Key=s3_key, - Filename=str(cached_file) - ) - - # Verify file was downloaded - if not cached_file.exists(): - raise FileNotFoundError(f"Downloaded file not found: {cached_file}") - - file_size = cached_file.stat().st_size - logger.info( - f"āœ“ Downloaded target {target_id} " - f"({file_size / 1024 / 1024:.2f} MB)" - ) - - # Extract tarball if it's an archive - import tarfile - workspace_dir = cache_path / "workspace" - - if tarfile.is_tarfile(str(cached_file)): - logger.info(f"Extracting tarball to {workspace_dir}...") - workspace_dir.mkdir(parents=True, exist_ok=True) - - with tarfile.open(str(cached_file), 'r:*') as tar: - tar.extractall(path=workspace_dir) - - logger.info(f"āœ“ Extracted tarball to {workspace_dir}") - - # For copy-on-write mode, copy to run-specific path - if workspace_isolation == "copy-on-write": - run_cache_path = CACHE_DIR / target_id / run_id - run_workspace = run_cache_path / "workspace" - logger.info(f"Copy-on-write: Copying to {run_workspace}") - run_cache_path.mkdir(parents=True, exist_ok=True) - shutil.copytree(workspace_dir, run_workspace) - return str(run_workspace) - - return str(workspace_dir) - else: - # Not a tarball - if workspace_isolation == "copy-on-write": - # Copy file to run-specific path - run_cache_path = CACHE_DIR / target_id / run_id - run_file = run_cache_path / "target" - logger.info(f"Copy-on-write: Copying file to {run_file}") - run_cache_path.mkdir(parents=True, exist_ok=True) - shutil.copy2(cached_file, run_file) - return str(run_file) - - return str(cached_file) - - except ClientError as e: - error_code = e.response['Error']['Code'] - if error_code == '404' or error_code == 'NoSuchKey': - logger.error(f"Target not found in MinIO: 
{target_id}") - raise FileNotFoundError(f"Target {target_id} not found in storage") - else: - logger.error(f"S3/MinIO error downloading target: {e}", exc_info=True) - raise - - except Exception as e: - logger.error(f"Failed to download target {target_id}: {e}", exc_info=True) - # Cleanup partial download - if cache_path.exists(): - shutil.rmtree(cache_path, ignore_errors=True) - raise - - -@activity.defn(name="cleanup_cache") -async def cleanup_cache_activity( - target_path: str, - workspace_isolation: str = "isolated" -) -> None: - """ - Remove target from local cache after workflow completes. - - Args: - target_path: Path to the cached target workspace (from get_target_activity) - workspace_isolation: Isolation mode used - determines cleanup scope - - Notes: - - "isolated" mode: Removes the entire run-specific directory - - "copy-on-write" mode: Removes run-specific directory, keeps shared cache - - "shared" mode: Does NOT remove cache (shared across runs) - """ - logger.info( - f"Activity: cleanup_cache (path={target_path}, " - f"isolation={workspace_isolation})" - ) - - try: - target = Path(target_path) - - # For shared mode, don't clean up (cache is shared across runs) - if workspace_isolation == "shared": - logger.info( - f"Skipping cleanup for shared workspace (mode={workspace_isolation})" - ) - return - - # For isolated and copy-on-write modes, clean up run-specific directory - # Navigate up to the run-specific directory: /cache/{target_id}/{run_id}/ - if target.name == "workspace": - # Path is .../workspace, go up one level to run directory - run_dir = target.parent - else: - # Path is a file, go up one level to run directory - run_dir = target.parent - - # Validate it's in cache and looks like a run-specific path - if run_dir.exists() and run_dir.is_relative_to(CACHE_DIR): - # Check if parent is target_id directory (validate structure) - target_id_dir = run_dir.parent - if target_id_dir.is_relative_to(CACHE_DIR): - shutil.rmtree(run_dir) - logger.info( - 
f"āœ“ Cleaned up run-specific directory: {run_dir} " - f"(mode={workspace_isolation})" - ) - else: - logger.warning( - f"Unexpected cache structure, skipping cleanup: {run_dir}" - ) - else: - logger.warning( - f"Cache path not in CACHE_DIR or doesn't exist: {run_dir}" - ) - - except Exception as e: - # Don't fail workflow if cleanup fails - logger.error( - f"Failed to cleanup cache {target_path}: {e}", - exc_info=True - ) - - -@activity.defn(name="upload_results") -async def upload_results_activity( - workflow_id: str, - results: dict, - results_format: str = "json" -) -> str: - """ - Upload workflow results to MinIO. - - Args: - workflow_id: Workflow execution ID - results: Results dictionary to upload - results_format: Format for results (json, sarif, etc.) - - Returns: - S3 URL to the uploaded results - """ - logger.info( - f"Activity: upload_results " - f"(workflow_id={workflow_id}, format={results_format})" - ) - - try: - import json - - # Prepare results content - if results_format == "json": - content = json.dumps(results, indent=2).encode('utf-8') - content_type = 'application/json' - file_ext = 'json' - elif results_format == "sarif": - content = json.dumps(results, indent=2).encode('utf-8') - content_type = 'application/sarif+json' - file_ext = 'sarif' - else: - # Default to JSON - content = json.dumps(results, indent=2).encode('utf-8') - content_type = 'application/json' - file_ext = 'json' - - # Upload to MinIO - s3_key = f'{workflow_id}/results.{file_ext}' - logger.info(f"Uploading results to s3://results/{s3_key}") - - s3_client.put_object( - Bucket='results', - Key=s3_key, - Body=content, - ContentType=content_type, - Metadata={ - 'workflow_id': workflow_id, - 'format': results_format - } - ) - - # Construct S3 URL - s3_endpoint = os.getenv('S3_ENDPOINT', 'http://minio:9000') - s3_url = f"{s3_endpoint}/results/{s3_key}" - - logger.info(f"āœ“ Uploaded results: {s3_url}") - return s3_url - - except Exception as e: - logger.error( - f"Failed to upload 
results for workflow {workflow_id}: {e}", - exc_info=True - ) - raise - - -def _check_cache_size(): - """Check total cache size and log warning if exceeding limit""" - try: - total_size = 0 - for item in CACHE_DIR.rglob('*'): - if item.is_file(): - total_size += item.stat().st_size - - total_size_gb = total_size / (1024 ** 3) - if total_size_gb > CACHE_MAX_SIZE_GB: - logger.warning( - f"Cache size ({total_size_gb:.2f} GB) exceeds " - f"limit ({CACHE_MAX_SIZE_GB} GB). Consider cleanup." - ) - - except Exception as e: - logger.error(f"Failed to check cache size: {e}") diff --git a/backend/toolbox/modules/__init__.py b/backend/toolbox/modules/__init__.py deleted file mode 100644 index 43bcfe7..0000000 --- a/backend/toolbox/modules/__init__.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - diff --git a/backend/toolbox/modules/analyzer/__init__.py b/backend/toolbox/modules/analyzer/__init__.py deleted file mode 100644 index 8bffdab..0000000 --- a/backend/toolbox/modules/analyzer/__init__.py +++ /dev/null @@ -1,16 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
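The `_check_cache_size` helper above only logs a warning when the cache outgrows `CACHE_MAX_SIZE`. Since `get_target_activity` already `touch()`es cached entries on hits precisely to maintain access times, a least-recently-used eviction pass is the natural extension. A hedged sketch — `evict_lru` is not part of the original module, and real deployments would need locking against concurrent runs:

```python
import shutil
from pathlib import Path
from typing import List

def evict_lru(cache_dir: Path, max_bytes: int) -> List[Path]:
    """Delete least-recently-accessed top-level cache entries until the
    total size fits under max_bytes. Returns the entries removed."""
    def tree_size(p: Path) -> int:
        return sum(f.stat().st_size for f in p.rglob("*") if f.is_file())

    entries = [p for p in cache_dir.iterdir() if p.is_dir()]
    # Oldest access time first; cache hits touch() entries, refreshing atime
    entries.sort(key=lambda p: p.stat().st_atime)

    total = sum(tree_size(p) for p in entries)
    removed = []
    for entry in entries:
        if total <= max_bytes:
            break
        total -= tree_size(entry)
        shutil.rmtree(entry, ignore_errors=True)
        removed.append(entry)
    return removed
```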
-
-from .security_analyzer import SecurityAnalyzer
-from .bandit_analyzer import BanditAnalyzer
-from .mypy_analyzer import MypyAnalyzer
-
-__all__ = ["SecurityAnalyzer", "BanditAnalyzer", "MypyAnalyzer"]
\ No newline at end of file
diff --git a/backend/toolbox/modules/analyzer/bandit_analyzer.py b/backend/toolbox/modules/analyzer/bandit_analyzer.py
deleted file mode 100644
index ecf81a8..0000000
--- a/backend/toolbox/modules/analyzer/bandit_analyzer.py
+++ /dev/null
@@ -1,328 +0,0 @@
-"""
-Bandit Analyzer Module - Analyzes Python code for security issues using Bandit
-"""
-
-# Copyright (c) 2025 FuzzingLabs
-#
-# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
-# at the root of this repository for details.
-#
-# After the Change Date (four years from publication), this version of the
-# Licensed Work will be made available under the Apache License, Version 2.0.
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
-#
-# Additional attribution and requirements are provided in the NOTICE file.
-
-import asyncio
-import json
-import logging
-import time
-from pathlib import Path
-from typing import Dict, Any, List
-
-try:
-    from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding
-except ImportError:
-    try:
-        from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding
-    except ImportError:
-        from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding
-
-logger = logging.getLogger(__name__)
-
-
-class BanditAnalyzer(BaseModule):
-    """
-    Analyzes Python code for security issues using Bandit.
-
-    This module:
-    - Runs Bandit security linter on Python files
-    - Detects common security issues (SQL injection, hardcoded secrets, etc.)
- - Reports findings with severity levels - """ - - # Severity mapping from Bandit levels to our standard - SEVERITY_MAP = { - "LOW": "low", - "MEDIUM": "medium", - "HIGH": "high" - } - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="bandit_analyzer", - version="1.0.0", - description="Analyzes Python code for security issues using Bandit", - author="FuzzForge Team", - category="analyzer", - tags=["python", "security", "bandit", "sast"], - input_schema={ - "severity_level": { - "type": "string", - "enum": ["low", "medium", "high"], - "description": "Minimum severity level to report", - "default": "low" - }, - "confidence_level": { - "type": "string", - "enum": ["low", "medium", "high"], - "description": "Minimum confidence level to report", - "default": "medium" - }, - "exclude_tests": { - "type": "boolean", - "description": "Exclude test files from analysis", - "default": True - }, - "skip_ids": { - "type": "array", - "items": {"type": "string"}, - "description": "List of Bandit test IDs to skip", - "default": [] - } - }, - output_schema={ - "findings": { - "type": "array", - "description": "List of security issues found by Bandit" - } - }, - requires_workspace=True - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - severity = config.get("severity_level", "low") - if severity not in ["low", "medium", "high"]: - raise ValueError("severity_level must be one of: low, medium, high") - - confidence = config.get("confidence_level", "medium") - if confidence not in ["low", "medium", "high"]: - raise ValueError("confidence_level must be one of: low, medium, high") - - skip_ids = config.get("skip_ids", []) - if not isinstance(skip_ids, list): - raise ValueError("skip_ids must be a list") - - return True - - async def _run_bandit( - self, - workspace: Path, - severity_level: str, - confidence_level: str, - exclude_tests: bool, - skip_ids: List[str] - ) -> 
Dict[str, Any]:
-        """
-        Run Bandit on the workspace.
-
-        Args:
-            workspace: Path to workspace
-            severity_level: Minimum severity to report
-            confidence_level: Minimum confidence to report
-            exclude_tests: Whether to exclude test files
-            skip_ids: List of test IDs to skip
-
-        Returns:
-            Bandit JSON output as dict
-        """
-        try:
-            # Build bandit command
-            cmd = [
-                "bandit",
-                "-r", str(workspace),
-                "-f", "json",
-                "-l",  # -l reports LOW severity and above; -ll would pre-filter to
-                       # MEDIUM+, silently dropping findings that the Python-side
-                       # _should_include_finding filter is meant to handle
-            ]
-
-            # Add exclude patterns for test files
-            if exclude_tests:
-                cmd.extend(["-x", "*/test_*.py,*/tests/*,*_test.py"])
-
-            # Add skip IDs if specified
-            if skip_ids:
-                cmd.extend(["-s", ",".join(skip_ids)])
-
-            logger.info(f"Running Bandit on: {workspace}")
-            process = await asyncio.create_subprocess_exec(
-                *cmd,
-                stdout=asyncio.subprocess.PIPE,
-                stderr=asyncio.subprocess.PIPE
-            )
-
-            stdout, stderr = await process.communicate()
-
-            # Bandit returns non-zero if issues found, which is expected
-            if process.returncode not in [0, 1]:
-                logger.error(f"Bandit failed: {stderr.decode()}")
-                return {"results": []}
-
-            # Parse JSON output
-            result = json.loads(stdout.decode())
-            return result
-
-        except Exception as e:
-            logger.error(f"Error running Bandit: {e}")
-            return {"results": []}
-
-    def _should_include_finding(
-        self,
-        issue: Dict[str, Any],
-        min_severity: str,
-        min_confidence: str
-    ) -> bool:
-        """
-        Determine if a Bandit issue should be included based on severity/confidence.
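The gate described here reduces to comparing positions on an ordered low → medium → high scale. A standalone sketch (`meets_threshold` is a hypothetical name, not the module's method) that additionally tolerates levels outside the scale — Bandit can also emit an `UNDEFINED` confidence level, which a bare `list.index()` lookup would reject with `ValueError`:

```python
SEVERITY_ORDER = ["low", "medium", "high"]

def meets_threshold(issue_severity: str, issue_confidence: str,
                    min_severity: str, min_confidence: str) -> bool:
    """True when both levels sit at or above their thresholds.

    Unknown levels (e.g. Bandit's "UNDEFINED") rank lowest instead of
    raising, a defensive tweak over a plain .index() lookup.
    """
    def rank(level: str) -> int:
        level = level.lower()
        return SEVERITY_ORDER.index(level) if level in SEVERITY_ORDER else 0

    return (rank(issue_severity) >= rank(min_severity)
            and rank(issue_confidence) >= rank(min_confidence))
```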
- - Args: - issue: Bandit issue dict - min_severity: Minimum severity threshold - min_confidence: Minimum confidence threshold - - Returns: - True if issue should be included - """ - severity_order = ["low", "medium", "high"] - issue_severity = issue.get("issue_severity", "LOW").lower() - issue_confidence = issue.get("issue_confidence", "LOW").lower() - - severity_meets_threshold = severity_order.index(issue_severity) >= severity_order.index(min_severity) - confidence_meets_threshold = severity_order.index(issue_confidence) >= severity_order.index(min_confidence) - - return severity_meets_threshold and confidence_meets_threshold - - def _convert_to_findings( - self, - bandit_result: Dict[str, Any], - workspace: Path, - min_severity: str, - min_confidence: str - ) -> List[ModuleFinding]: - """ - Convert Bandit results to ModuleFindings. - - Args: - bandit_result: Bandit JSON output - workspace: Workspace path for relative paths - min_severity: Minimum severity to include - min_confidence: Minimum confidence to include - - Returns: - List of ModuleFindings - """ - findings = [] - - for issue in bandit_result.get("results", []): - # Filter by severity and confidence - if not self._should_include_finding(issue, min_severity, min_confidence): - continue - - # Extract issue details - test_id = issue.get("test_id", "B000") - test_name = issue.get("test_name", "unknown") - issue_text = issue.get("issue_text", "No description") - severity = self.SEVERITY_MAP.get(issue.get("issue_severity", "LOW"), "low") - - # File location - filename = issue.get("filename", "") - line_number = issue.get("line_number", 0) - code = issue.get("code", "") - - # Try to get relative path - try: - file_path = Path(filename) - rel_path = file_path.relative_to(workspace) - except (ValueError, TypeError): - rel_path = Path(filename).name - - # Create finding - finding = self.create_finding( - title=f"{test_name} ({test_id})", - description=issue_text, - severity=severity, - 
category="security-issue", - file_path=str(rel_path), - line_start=line_number, - line_end=line_number, - code_snippet=code.strip() if code else None, - recommendation=f"Review and fix the security issue identified by Bandit test {test_id}", - metadata={ - "test_id": test_id, - "test_name": test_name, - "confidence": issue.get("issue_confidence", "LOW").lower(), - "cwe": issue.get("issue_cwe", {}).get("id") if issue.get("issue_cwe") else None, - "more_info": issue.get("more_info", "") - } - ) - findings.append(finding) - - return findings - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute the Bandit analyzer module. - - Args: - config: Module configuration - workspace: Path to workspace - - Returns: - ModuleResult with security findings - """ - start_time = time.time() - metadata = self.get_metadata() - - # Validate inputs - self.validate_config(config) - self.validate_workspace(workspace) - - # Get configuration - severity_level = config.get("severity_level", "low") - confidence_level = config.get("confidence_level", "medium") - exclude_tests = config.get("exclude_tests", True) - skip_ids = config.get("skip_ids", []) - - # Run Bandit - logger.info("Starting Bandit analysis...") - bandit_result = await self._run_bandit( - workspace, - severity_level, - confidence_level, - exclude_tests, - skip_ids - ) - - # Convert to findings - findings = self._convert_to_findings( - bandit_result, - workspace, - severity_level, - confidence_level - ) - - # Calculate summary - severity_counts = {} - for finding in findings: - sev = finding.severity - severity_counts[sev] = severity_counts.get(sev, 0) + 1 - - execution_time = time.time() - start_time - - return ModuleResult( - module=metadata.name, - version=metadata.version, - status="success", - execution_time=execution_time, - findings=findings, - summary={ - "total_issues": len(findings), - "by_severity": severity_counts, - "files_analyzed": len(set(f.file_path for f in 
findings if f.file_path)) - }, - metadata={ - "bandit_version": bandit_result.get("generated_at", "unknown"), - "metrics": bandit_result.get("metrics", {}) - } - ) diff --git a/backend/toolbox/modules/analyzer/llm_analyzer.py b/backend/toolbox/modules/analyzer/llm_analyzer.py deleted file mode 100644 index b3b1374..0000000 --- a/backend/toolbox/modules/analyzer/llm_analyzer.py +++ /dev/null @@ -1,349 +0,0 @@ -""" -LLM Analyzer Module - Uses AI to analyze code for security issues -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import logging -from pathlib import Path -from typing import Dict, Any, List - -try: - from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult -except ImportError: - try: - from modules.base import BaseModule, ModuleMetadata, ModuleResult - except ImportError: - from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult - -logger = logging.getLogger(__name__) - - -class LLMAnalyzer(BaseModule): - """ - Uses an LLM to analyze code for potential security issues. 
- - This module: - - Sends code to an LLM agent via A2A protocol - - Asks the LLM to identify security vulnerabilities - - Collects findings and returns them in structured format - """ - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="llm_analyzer", - version="1.0.0", - description="Uses AI to analyze code for security issues", - author="FuzzForge Team", - category="analyzer", - tags=["llm", "ai", "security", "analysis"], - input_schema={ - "agent_url": { - "type": "string", - "description": "A2A agent endpoint URL", - "default": "http://fuzzforge-task-agent:8000/a2a/litellm_agent" - }, - "llm_model": { - "type": "string", - "description": "LLM model to use", - "default": "gpt-4o-mini" - }, - "llm_provider": { - "type": "string", - "description": "LLM provider (openai, anthropic, etc.)", - "default": "openai" - }, - "file_patterns": { - "type": "array", - "items": {"type": "string"}, - "description": "File patterns to analyze", - "default": ["*.py", "*.js", "*.ts", "*.java", "*.go"] - }, - "max_files": { - "type": "integer", - "description": "Maximum number of files to analyze", - "default": 5 - }, - "max_file_size": { - "type": "integer", - "description": "Maximum file size in bytes", - "default": 50000 # 50KB - }, - "timeout": { - "type": "integer", - "description": "Timeout per file in seconds", - "default": 60 - } - }, - output_schema={ - "findings": { - "type": "array", - "description": "Security issues identified by LLM" - } - }, - requires_workspace=True - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - # Lazy import to avoid Temporal sandbox restrictions - try: - from fuzzforge_ai.a2a_wrapper import send_agent_task # noqa: F401 - except ImportError: - raise RuntimeError( - "A2A wrapper not available. Ensure fuzzforge_ai module is accessible." 
- ) - - agent_url = config.get("agent_url") - if not agent_url or not isinstance(agent_url, str): - raise ValueError("agent_url must be a valid URL string") - - max_files = config.get("max_files", 5) - if not isinstance(max_files, int) or max_files <= 0: - raise ValueError("max_files must be a positive integer") - - return True - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute the LLM analysis module. - - Args: - config: Module configuration - workspace: Path to the workspace containing code to analyze - - Returns: - ModuleResult with findings from LLM analysis - """ - # Start execution timer - self.start_timer() - - logger.info(f"Starting LLM analysis in workspace: {workspace}") - - # Extract configuration - agent_url = config.get("agent_url", "http://fuzzforge-task-agent:8000/a2a/litellm_agent") - llm_model = config.get("llm_model", "gpt-4o-mini") - llm_provider = config.get("llm_provider", "openai") - file_patterns = config.get("file_patterns", ["*.py", "*.js", "*.ts", "*.java", "*.go"]) - max_files = config.get("max_files", 5) - max_file_size = config.get("max_file_size", 50000) - timeout = config.get("timeout", 60) - - # Find files to analyze - files_to_analyze = [] - for pattern in file_patterns: - for file_path in workspace.rglob(pattern): - if file_path.is_file(): - try: - # Check file size - if file_path.stat().st_size > max_file_size: - logger.debug(f"Skipping {file_path} (too large)") - continue - - files_to_analyze.append(file_path) - - if len(files_to_analyze) >= max_files: - break - except Exception as e: - logger.warning(f"Error checking file {file_path}: {e}") - continue - - if len(files_to_analyze) >= max_files: - break - - logger.info(f"Found {len(files_to_analyze)} files to analyze") - - # Analyze each file - all_findings = [] - for file_path in files_to_analyze: - logger.info(f"Analyzing: {file_path.relative_to(workspace)}") - - try: - findings = await self._analyze_file( - file_path=file_path, 
- workspace=workspace, - agent_url=agent_url, - llm_model=llm_model, - llm_provider=llm_provider, - timeout=timeout - ) - all_findings.extend(findings) - - except Exception as e: - logger.error(f"Error analyzing {file_path}: {e}") - # Continue with next file - continue - - logger.info(f"LLM analysis complete. Found {len(all_findings)} issues.") - - # Create result using base module helper - return self.create_result( - findings=all_findings, - status="success", - summary={ - "files_analyzed": len(files_to_analyze), - "total_findings": len(all_findings), - "agent_url": agent_url, - "model": f"{llm_provider}/{llm_model}" - } - ) - - async def _analyze_file( - self, - file_path: Path, - workspace: Path, - agent_url: str, - llm_model: str, - llm_provider: str, - timeout: int - ) -> List[Dict[str, Any]]: - """Analyze a single file with LLM""" - - # Read file content - try: - with open(file_path, 'r', encoding='utf-8') as f: - code_content = f.read() - except Exception as e: - logger.error(f"Failed to read {file_path}: {e}") - return [] - - # Determine language from extension - extension = file_path.suffix.lower() - language_map = { - ".py": "python", - ".js": "javascript", - ".ts": "typescript", - ".java": "java", - ".go": "go", - ".rs": "rust", - ".c": "c", - ".cpp": "cpp", - } - language = language_map.get(extension, "code") - - # Build prompt for LLM - system_prompt = ( - "You are a security code analyzer. Analyze the provided code and identify " - "potential security vulnerabilities, bugs, and code quality issues. " - "For each issue found, respond in this exact format:\n" - "ISSUE: [short title]\n" - "SEVERITY: [error/warning/note]\n" - "LINE: [line number or 'unknown']\n" - "DESCRIPTION: [detailed explanation]\n\n" - "If no issues are found, respond with 'NO_ISSUES_FOUND'." 
- ) - - user_message = ( - f"Analyze this {language} code for security vulnerabilities:\n\n" - f"File: {file_path.relative_to(workspace)}\n\n" - f"```{language}\n{code_content}\n```" - ) - - # Call LLM via A2A wrapper (lazy import to avoid Temporal sandbox restrictions) - try: - from fuzzforge_ai.a2a_wrapper import send_agent_task - - result = await send_agent_task( - url=agent_url, - model=llm_model, - provider=llm_provider, - prompt=system_prompt, - message=user_message, - context=f"llm_analysis_{file_path.stem}", - timeout=float(timeout) - ) - - llm_response = result.text - - except Exception as e: - logger.error(f"A2A call failed for {file_path}: {e}") - return [] - - # Parse LLM response into findings - findings = self._parse_llm_response( - llm_response=llm_response, - file_path=file_path, - workspace=workspace - ) - - return findings - - def _parse_llm_response( - self, - llm_response: str, - file_path: Path, - workspace: Path - ) -> List: - """Parse LLM response into structured findings""" - - if "NO_ISSUES_FOUND" in llm_response: - return [] - - findings = [] - relative_path = str(file_path.relative_to(workspace)) - - # Simple parser for the expected format - lines = llm_response.split('\n') - current_issue = {} - - for line in lines: - line = line.strip() - - if line.startswith("ISSUE:"): - # Save previous issue if exists - if current_issue: - findings.append(self._create_module_finding(current_issue, relative_path)) - current_issue = {"title": line.replace("ISSUE:", "").strip()} - - elif line.startswith("SEVERITY:"): - current_issue["severity"] = line.replace("SEVERITY:", "").strip().lower() - - elif line.startswith("LINE:"): - line_num = line.replace("LINE:", "").strip() - try: - current_issue["line"] = int(line_num) - except ValueError: - current_issue["line"] = None - - elif line.startswith("DESCRIPTION:"): - current_issue["description"] = line.replace("DESCRIPTION:", "").strip() - - # Save last issue - if current_issue: - 
findings.append(self._create_module_finding(current_issue, relative_path)) - - return findings - - def _create_module_finding(self, issue: Dict[str, Any], file_path: str): - """Create a ModuleFinding from parsed issue""" - - severity_map = { - "error": "critical", - "warning": "medium", - "note": "low", - "info": "low" - } - - # Use base class helper to create proper ModuleFinding - return self.create_finding( - title=issue.get("title", "Security issue detected"), - description=issue.get("description", ""), - severity=severity_map.get(issue.get("severity", "warning"), "medium"), - category="security", - file_path=file_path, - line_start=issue.get("line"), - metadata={ - "tool": "llm-analyzer", - "type": "llm-security-analysis" - } - ) diff --git a/backend/toolbox/modules/analyzer/mypy_analyzer.py b/backend/toolbox/modules/analyzer/mypy_analyzer.py deleted file mode 100644 index 9d3e39f..0000000 --- a/backend/toolbox/modules/analyzer/mypy_analyzer.py +++ /dev/null @@ -1,269 +0,0 @@ -""" -Mypy Analyzer Module - Analyzes Python code for type safety issues using Mypy -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- -import asyncio -import logging -import re -import time -from pathlib import Path -from typing import Dict, Any, List - -try: - from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding -except ImportError: - try: - from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - except ImportError: - from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - -logger = logging.getLogger(__name__) - - -class MypyAnalyzer(BaseModule): - """ - Analyzes Python code for type safety issues using Mypy. - - This module: - - Runs Mypy type checker on Python files - - Detects type errors and inconsistencies - - Reports findings with configurable strictness - """ - - # Map Mypy error codes to severity - ERROR_SEVERITY_MAP = { - "error": "medium", - "note": "info" - } - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="mypy_analyzer", - version="1.0.0", - description="Analyzes Python code for type safety issues using Mypy", - author="FuzzForge Team", - category="analyzer", - tags=["python", "type-checking", "mypy", "sast"], - input_schema={ - "strict_mode": { - "type": "boolean", - "description": "Enable strict type checking", - "default": False - }, - "ignore_missing_imports": { - "type": "boolean", - "description": "Ignore errors about missing imports", - "default": True - }, - "follow_imports": { - "type": "string", - "enum": ["normal", "silent", "skip", "error"], - "description": "How to handle imports", - "default": "silent" - } - }, - output_schema={ - "findings": { - "type": "array", - "description": "List of type errors found by Mypy" - } - }, - requires_workspace=True - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - follow_imports = config.get("follow_imports", "silent") - if follow_imports not in ["normal", "silent", "skip", "error"]: - raise ValueError("follow_imports 
must be one of: normal, silent, skip, error") - - return True - - async def _run_mypy( - self, - workspace: Path, - strict_mode: bool, - ignore_missing_imports: bool, - follow_imports: str - ) -> str: - """ - Run Mypy on the workspace. - - Args: - workspace: Path to workspace - strict_mode: Enable strict checking - ignore_missing_imports: Ignore missing import errors - follow_imports: How to handle imports - - Returns: - Mypy output as string - """ - try: - # Build mypy command - cmd = [ - "mypy", - str(workspace), - "--show-column-numbers", - "--no-error-summary", - f"--follow-imports={follow_imports}" - ] - - if strict_mode: - cmd.append("--strict") - - if ignore_missing_imports: - cmd.append("--ignore-missing-imports") - - logger.info(f"Running Mypy on: {workspace}") - process = await asyncio.create_subprocess_exec( - *cmd, - stdout=asyncio.subprocess.PIPE, - stderr=asyncio.subprocess.PIPE - ) - - stdout, stderr = await process.communicate() - - # Mypy returns non-zero if errors found, which is expected - output = stdout.decode() - return output - - except Exception as e: - logger.error(f"Error running Mypy: {e}") - return "" - - def _parse_mypy_output(self, output: str, workspace: Path) -> List[ModuleFinding]: - """ - Parse Mypy output and convert to findings. - - Mypy output format: - file.py:10:5: error: Incompatible return value type [return-value] - file.py:15: note: See https://... 
- - Args: - output: Mypy stdout - workspace: Workspace path for relative paths - - Returns: - List of ModuleFindings - """ - findings = [] - - # Regex to parse mypy output lines - # Format: filename:line:column: level: message [error-code] - pattern = r'^(.+?):(\d+)(?::(\d+))?: (error|note): (.+?)(?:\s+\[([^\]]+)\])?$' - - for line in output.splitlines(): - match = re.match(pattern, line.strip()) - if not match: - continue - - filename, line_num, column, level, message, error_code = match.groups() - - # Convert to relative path - try: - file_path = Path(filename) - rel_path = file_path.relative_to(workspace) - except (ValueError, TypeError): - rel_path = Path(filename).name - - # Skip if it's just a note (unless it's a standalone note) - if level == "note" and not error_code: - continue - - # Map severity - severity = self.ERROR_SEVERITY_MAP.get(level, "medium") - - # Create finding - title = f"Type error: {error_code or 'type-issue'}" - description = message - - finding = self.create_finding( - title=title, - description=description, - severity=severity, - category="type-error", - file_path=str(rel_path), - line_start=int(line_num), - line_end=int(line_num), - recommendation="Review and fix the type inconsistency or add appropriate type annotations", - metadata={ - "error_code": error_code or "unknown", - "column": int(column) if column else None, - "level": level - } - ) - findings.append(finding) - - return findings - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute the Mypy analyzer module. 
- - Args: - config: Module configuration - workspace: Path to workspace - - Returns: - ModuleResult with type checking findings - """ - start_time = time.time() - metadata = self.get_metadata() - - # Validate inputs - self.validate_config(config) - self.validate_workspace(workspace) - - # Get configuration - strict_mode = config.get("strict_mode", False) - ignore_missing_imports = config.get("ignore_missing_imports", True) - follow_imports = config.get("follow_imports", "silent") - - # Run Mypy - logger.info("Starting Mypy analysis...") - mypy_output = await self._run_mypy( - workspace, - strict_mode, - ignore_missing_imports, - follow_imports - ) - - # Parse output to findings - findings = self._parse_mypy_output(mypy_output, workspace) - - # Calculate summary - error_code_counts = {} - for finding in findings: - code = finding.metadata.get("error_code", "unknown") - error_code_counts[code] = error_code_counts.get(code, 0) + 1 - - execution_time = time.time() - start_time - - return ModuleResult( - module=metadata.name, - version=metadata.version, - status="success", - execution_time=execution_time, - findings=findings, - summary={ - "total_errors": len(findings), - "by_error_code": error_code_counts, - "files_with_errors": len(set(f.file_path for f in findings if f.file_path)) - }, - metadata={ - "strict_mode": strict_mode, - "ignore_missing_imports": ignore_missing_imports - } - ) diff --git a/backend/toolbox/modules/analyzer/security_analyzer.py b/backend/toolbox/modules/analyzer/security_analyzer.py deleted file mode 100644 index 3b4a2ea..0000000 --- a/backend/toolbox/modules/analyzer/security_analyzer.py +++ /dev/null @@ -1,368 +0,0 @@ -""" -Security Analyzer Module - Analyzes code for security vulnerabilities -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. 
-# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import logging -import re -from pathlib import Path -from typing import Dict, Any, List - -try: - from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding -except ImportError: - try: - from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - except ImportError: - from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - -logger = logging.getLogger(__name__) - - -class SecurityAnalyzer(BaseModule): - """ - Analyzes source code for common security vulnerabilities. - - This module: - - Detects hardcoded secrets and credentials - - Identifies dangerous function calls - - Finds SQL injection vulnerabilities - - Detects insecure configurations - """ - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="security_analyzer", - version="1.0.0", - description="Analyzes code for security vulnerabilities", - author="FuzzForge Team", - category="analyzer", - tags=["security", "vulnerabilities", "static-analysis"], - input_schema={ - "file_extensions": { - "type": "array", - "items": {"type": "string"}, - "description": "File extensions to analyze", - "default": [".py", ".js", ".java", ".php", ".rb", ".go"] - }, - "check_secrets": { - "type": "boolean", - "description": "Check for hardcoded secrets", - "default": True - }, - "check_sql": { - "type": "boolean", - "description": "Check for SQL injection risks", - "default": True - }, - "check_dangerous_functions": { - "type": "boolean", - "description": "Check for dangerous function calls", - "default": True - } - }, - output_schema={ - "findings": { - "type": "array", - 
"description": "List of security findings" - } - }, - requires_workspace=True - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - extensions = config.get("file_extensions", []) - if not isinstance(extensions, list): - raise ValueError("file_extensions must be a list") - - return True - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute the security analysis module. - - Args: - config: Module configuration - workspace: Path to the workspace directory - - Returns: - ModuleResult with security findings - """ - self.start_timer() - self.validate_workspace(workspace) - self.validate_config(config) - - findings = [] - files_analyzed = 0 - - # Get configuration - file_extensions = config.get("file_extensions", [".py", ".js", ".java", ".php", ".rb", ".go"]) - check_secrets = config.get("check_secrets", True) - check_sql = config.get("check_sql", True) - check_dangerous = config.get("check_dangerous_functions", True) - - logger.info(f"Analyzing files with extensions: {file_extensions}") - - try: - # Analyze each file - for ext in file_extensions: - for file_path in workspace.rglob(f"*{ext}"): - if not file_path.is_file(): - continue - - files_analyzed += 1 - relative_path = file_path.relative_to(workspace) - - try: - content = file_path.read_text(encoding='utf-8', errors='ignore') - lines = content.splitlines() - - # Check for secrets - if check_secrets: - secret_findings = self._check_hardcoded_secrets( - content, lines, relative_path - ) - findings.extend(secret_findings) - - # Check for SQL injection - if check_sql and ext in [".py", ".php", ".java", ".js"]: - sql_findings = self._check_sql_injection( - content, lines, relative_path - ) - findings.extend(sql_findings) - - # Check for dangerous functions - if check_dangerous: - dangerous_findings = self._check_dangerous_functions( - content, lines, relative_path, ext - ) - findings.extend(dangerous_findings) - - except 
Exception as e: - logger.error(f"Error analyzing file {relative_path}: {e}") - - # Create summary - summary = { - "files_analyzed": files_analyzed, - "total_findings": len(findings), - "extensions_scanned": file_extensions - } - - return self.create_result( - findings=findings, - status="success" if files_analyzed > 0 else "partial", - summary=summary, - metadata={ - "workspace": str(workspace), - "config": config - } - ) - - except Exception as e: - logger.error(f"Security analyzer failed: {e}") - return self.create_result( - findings=findings, - status="failed", - error=str(e) - ) - - def _check_hardcoded_secrets( - self, content: str, lines: List[str], file_path: Path - ) -> List[ModuleFinding]: - """ - Check for hardcoded secrets in code. - - Args: - content: File content - lines: File lines - file_path: Relative file path - - Returns: - List of findings - """ - findings = [] - - # Patterns for secrets - secret_patterns = [ - (r'api[_-]?key\s*=\s*["\']([^"\']{20,})["\']', 'API Key'), - (r'api[_-]?secret\s*=\s*["\']([^"\']{20,})["\']', 'API Secret'), - (r'password\s*=\s*["\']([^"\']+)["\']', 'Hardcoded Password'), - (r'token\s*=\s*["\']([^"\']{20,})["\']', 'Authentication Token'), - (r'aws[_-]?access[_-]?key\s*=\s*["\']([^"\']+)["\']', 'AWS Access Key'), - (r'aws[_-]?secret[_-]?key\s*=\s*["\']([^"\']+)["\']', 'AWS Secret Key'), - (r'private[_-]?key\s*=\s*["\']([^"\']+)["\']', 'Private Key'), - (r'["\']([A-Za-z0-9]{32,})["\']', 'Potential Secret Hash'), - (r'Bearer\s+([A-Za-z0-9\-_]+\.[A-Za-z0-9\-_]+\.[A-Za-z0-9\-_]+)', 'JWT Token'), - ] - - for pattern, secret_type in secret_patterns: - for match in re.finditer(pattern, content, re.IGNORECASE): - # Find line number - line_num = content[:match.start()].count('\n') + 1 - line_content = lines[line_num - 1] if line_num <= len(lines) else "" - - # Skip common false positives - if self._is_false_positive_secret(match.group(0)): - continue - - findings.append(self.create_finding( - title=f"Hardcoded {secret_type} 
detected", - description=f"Found potential hardcoded {secret_type} in {file_path}", - severity="high" if "key" in secret_type.lower() else "medium", - category="hardcoded_secret", - file_path=str(file_path), - line_start=line_num, - code_snippet=line_content.strip()[:100], - recommendation=f"Remove hardcoded {secret_type} and use environment variables or secure vault", - metadata={"secret_type": secret_type} - )) - - return findings - - def _check_sql_injection( - self, content: str, lines: List[str], file_path: Path - ) -> List[ModuleFinding]: - """ - Check for potential SQL injection vulnerabilities. - - Args: - content: File content - lines: File lines - file_path: Relative file path - - Returns: - List of findings - """ - findings = [] - - # SQL injection patterns - sql_patterns = [ - (r'(SELECT|INSERT|UPDATE|DELETE).*\+\s*[\'"]?\s*\+?\s*\w+', 'String concatenation in SQL'), - (r'(SELECT|INSERT|UPDATE|DELETE).*%\s*[\'"]?\s*%?\s*\w+', 'String formatting in SQL'), - (r'f[\'"].*?(SELECT|INSERT|UPDATE|DELETE).*?\{.*?\}', 'F-string in SQL query'), - (r'query\s*=.*?\+', 'Dynamic query building'), - (r'execute\s*\(.*?\+.*?\)', 'Dynamic execute statement'), - ] - - for pattern, vuln_type in sql_patterns: - for match in re.finditer(pattern, content, re.IGNORECASE): - line_num = content[:match.start()].count('\n') + 1 - line_content = lines[line_num - 1] if line_num <= len(lines) else "" - - findings.append(self.create_finding( - title=f"Potential SQL Injection: {vuln_type}", - description=f"Detected potential SQL injection vulnerability via {vuln_type}", - severity="high", - category="sql_injection", - file_path=str(file_path), - line_start=line_num, - code_snippet=line_content.strip()[:100], - recommendation="Use parameterized queries or prepared statements instead", - metadata={"vulnerability_type": vuln_type} - )) - - return findings - - def _check_dangerous_functions( - self, content: str, lines: List[str], file_path: Path, ext: str - ) -> List[ModuleFinding]: - """ 
- Check for dangerous function calls. - - Args: - content: File content - lines: File lines - file_path: Relative file path - ext: File extension - - Returns: - List of findings - """ - findings = [] - - # Language-specific dangerous functions - dangerous_functions = { - ".py": [ - (r'eval\s*\(', 'eval()', 'Arbitrary code execution'), - (r'exec\s*\(', 'exec()', 'Arbitrary code execution'), - (r'os\.system\s*\(', 'os.system()', 'Command injection risk'), - (r'subprocess\.call\s*\(.*shell=True', 'subprocess with shell=True', 'Command injection risk'), - (r'pickle\.loads?\s*\(', 'pickle.load()', 'Deserialization vulnerability'), - ], - ".js": [ - (r'eval\s*\(', 'eval()', 'Arbitrary code execution'), - (r'new\s+Function\s*\(', 'new Function()', 'Arbitrary code execution'), - (r'innerHTML\s*=', 'innerHTML', 'XSS vulnerability'), - (r'document\.write\s*\(', 'document.write()', 'XSS vulnerability'), - ], - ".php": [ - (r'eval\s*\(', 'eval()', 'Arbitrary code execution'), - (r'exec\s*\(', 'exec()', 'Command execution'), - (r'system\s*\(', 'system()', 'Command execution'), - (r'shell_exec\s*\(', 'shell_exec()', 'Command execution'), - (r'\$_GET\[', 'Direct $_GET usage', 'Input validation missing'), - (r'\$_POST\[', 'Direct $_POST usage', 'Input validation missing'), - ] - } - - if ext in dangerous_functions: - for pattern, func_name, risk_type in dangerous_functions[ext]: - for match in re.finditer(pattern, content): - line_num = content[:match.start()].count('\n') + 1 - line_content = lines[line_num - 1] if line_num <= len(lines) else "" - - findings.append(self.create_finding( - title=f"Dangerous function: {func_name}", - description=f"Use of potentially dangerous function {func_name}: {risk_type}", - severity="medium", - category="dangerous_function", - file_path=str(file_path), - line_start=line_num, - code_snippet=line_content.strip()[:100], - recommendation=f"Consider safer alternatives to {func_name}", - metadata={ - "function": func_name, - "risk": risk_type - } - 
)) - - return findings - - def _is_false_positive_secret(self, value: str) -> bool: - """ - Check if a potential secret is likely a false positive. - - Args: - value: Potential secret value - - Returns: - True if likely false positive - """ - false_positive_patterns = [ - 'example', - 'test', - 'demo', - 'sample', - 'dummy', - 'placeholder', - 'xxx', - '123', - 'change', - 'your', - 'here' - ] - - value_lower = value.lower() - return any(pattern in value_lower for pattern in false_positive_patterns) \ No newline at end of file diff --git a/backend/toolbox/modules/android/__init__.py b/backend/toolbox/modules/android/__init__.py deleted file mode 100644 index ef2c74c..0000000 --- a/backend/toolbox/modules/android/__init__.py +++ /dev/null @@ -1,31 +0,0 @@ -""" -Android Security Analysis Modules - -Modules for Android application security testing: -- JadxDecompiler: APK decompilation using Jadx -- MobSFScanner: Mobile security analysis using MobSF -- OpenGrepAndroid: Static analysis using OpenGrep/Semgrep with Android-specific rules -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- -from .jadx_decompiler import JadxDecompiler -from .opengrep_android import OpenGrepAndroid - -# MobSF is optional (not available on ARM64 platform) -try: - from .mobsf_scanner import MobSFScanner - __all__ = ["JadxDecompiler", "MobSFScanner", "OpenGrepAndroid"] -except ImportError: - # MobSF dependencies not available (e.g., ARM64 platform) - MobSFScanner = None - __all__ = ["JadxDecompiler", "OpenGrepAndroid"] diff --git a/backend/toolbox/modules/android/custom_rules/clipboard-sensitive-data.yaml b/backend/toolbox/modules/android/custom_rules/clipboard-sensitive-data.yaml deleted file mode 100644 index df7944e..0000000 --- a/backend/toolbox/modules/android/custom_rules/clipboard-sensitive-data.yaml +++ /dev/null @@ -1,15 +0,0 @@ -rules: - - id: clipboard-sensitive-data - severity: WARNING - languages: [java] - message: "Sensitive data may be copied to the clipboard." - metadata: - authors: - - Guerric ELOI (FuzzingLabs) - category: security - area: clipboard - verification-level: [L1] - paths: - include: - - "**/*.java" - pattern: "$CLIPBOARD.setPrimaryClip($CLIP)" diff --git a/backend/toolbox/modules/android/custom_rules/hardcoded-secrets.yaml b/backend/toolbox/modules/android/custom_rules/hardcoded-secrets.yaml deleted file mode 100644 index c353c96..0000000 --- a/backend/toolbox/modules/android/custom_rules/hardcoded-secrets.yaml +++ /dev/null @@ -1,23 +0,0 @@ -rules: - - id: hardcoded-secrets - severity: WARNING - languages: [java] - message: "Possible hardcoded secret found in variable '$NAME'." 
- metadata: - authors: - - Guerric ELOI (FuzzingLabs) - owasp-mobile: M2 - category: secrets - verification-level: [L1] - paths: - include: - - "**/*.java" - patterns: - - pattern-either: - - pattern: 'String $NAME = "$VAL";' - - pattern: 'final String $NAME = "$VAL";' - - pattern: 'private String $NAME = "$VAL";' - - pattern: 'public static String $NAME = "$VAL";' - - pattern: 'static final String $NAME = "$VAL";' - - pattern-regex: "$NAME =~ /(?i).*(api|key|token|secret|pass|auth|session|bearer|access|private).*/" - diff --git a/backend/toolbox/modules/android/custom_rules/insecure-data-storage.yaml b/backend/toolbox/modules/android/custom_rules/insecure-data-storage.yaml deleted file mode 100644 index c22546d..0000000 --- a/backend/toolbox/modules/android/custom_rules/insecure-data-storage.yaml +++ /dev/null @@ -1,18 +0,0 @@ -rules: - - id: insecure-data-storage - severity: WARNING - languages: [java] - message: "Potential insecure data storage (external storage)." - metadata: - authors: - - Guerric ELOI (FuzzingLabs) - owasp-mobile: M2 - category: security - area: storage - verification-level: [L1] - paths: - include: - - "**/*.java" - pattern-either: - - pattern: "$CTX.openFileOutput($NAME, $MODE)" - - pattern: "Environment.getExternalStorageDirectory()" diff --git a/backend/toolbox/modules/android/custom_rules/insecure-deeplink.yaml b/backend/toolbox/modules/android/custom_rules/insecure-deeplink.yaml deleted file mode 100644 index 4be31ad..0000000 --- a/backend/toolbox/modules/android/custom_rules/insecure-deeplink.yaml +++ /dev/null @@ -1,16 +0,0 @@ -rules: - - id: insecure-deeplink - severity: WARNING - languages: [xml] - message: "Potential insecure deeplink found in intent-filter." 
- metadata: - authors: - - Guerric ELOI (FuzzingLabs) - category: component - area: manifest - verification-level: [L1] - paths: - include: - - "**/AndroidManifest.xml" - pattern: | - diff --git a/backend/toolbox/modules/android/custom_rules/insecure-logging.yaml b/backend/toolbox/modules/android/custom_rules/insecure-logging.yaml deleted file mode 100644 index f36f2a7..0000000 --- a/backend/toolbox/modules/android/custom_rules/insecure-logging.yaml +++ /dev/null @@ -1,21 +0,0 @@ -rules: - - id: insecure-logging - severity: WARNING - languages: [java] - message: "Sensitive data logged via Android Log API." - metadata: - authors: - - Guerric ELOI (FuzzingLabs) - owasp-mobile: M2 - category: logging - verification-level: [L1] - paths: - include: - - "**/*.java" - patterns: - - pattern-either: - - pattern: "Log.d($TAG, $MSG)" - - pattern: "Log.e($TAG, $MSG)" - - pattern: "System.out.println($MSG)" - - pattern-regex: "$MSG =~ /(?i).*(password|token|secret|api|auth|session).*/" - diff --git a/backend/toolbox/modules/android/custom_rules/intent-redirection.yaml b/backend/toolbox/modules/android/custom_rules/intent-redirection.yaml deleted file mode 100644 index ade522a..0000000 --- a/backend/toolbox/modules/android/custom_rules/intent-redirection.yaml +++ /dev/null @@ -1,15 +0,0 @@ -rules: - - id: intent-redirection - severity: WARNING - languages: [java] - message: "Potential intent redirection: using getIntent().getExtras() without validation." 
- metadata: - authors: - - Guerric ELOI (FuzzingLabs) - category: intent - area: intercomponent - verification-level: [L1] - paths: - include: - - "**/*.java" - pattern: "$ACT.getIntent().getExtras()" diff --git a/backend/toolbox/modules/android/custom_rules/sensitive_data_sharedPreferences.yaml b/backend/toolbox/modules/android/custom_rules/sensitive_data_sharedPreferences.yaml deleted file mode 100644 index 4f8f28f..0000000 --- a/backend/toolbox/modules/android/custom_rules/sensitive_data_sharedPreferences.yaml +++ /dev/null @@ -1,18 +0,0 @@ -rules: - - id: sensitive-data-in-shared-preferences - severity: WARNING - languages: [java] - message: "Sensitive data may be stored in SharedPreferences. Please review the key '$KEY'." - metadata: - authors: - - Guerric ELOI (FuzzingLabs) - owasp-mobile: M2 - category: security - area: storage - verification-level: [L1] - paths: - include: - - "**/*.java" - patterns: - - pattern: "$EDITOR.putString($KEY, $VAL);" - - pattern-regex: "$KEY =~ /(?i).*(username|password|pass|token|auth_token|api_key|secret|sessionid|email).*/" diff --git a/backend/toolbox/modules/android/custom_rules/sqlite-injection.yaml b/backend/toolbox/modules/android/custom_rules/sqlite-injection.yaml deleted file mode 100644 index 5d07e22..0000000 --- a/backend/toolbox/modules/android/custom_rules/sqlite-injection.yaml +++ /dev/null @@ -1,21 +0,0 @@ -rules: - - id: sqlite-injection - severity: ERROR - languages: [java] - message: "Possible SQL injection: concatenated input in rawQuery or execSQL." 
- metadata: - authors: - - Guerric ELOI (FuzzingLabs) - owasp-mobile: M7 - category: injection - area: database - verification-level: [L1] - paths: - include: - - "**/*.java" - patterns: - - pattern-either: - - pattern: "$DB.rawQuery($QUERY, ...)" - - pattern: "$DB.execSQL($QUERY)" - - pattern-regex: "$QUERY =~ /.*\".*\".*\\+.*/" - diff --git a/backend/toolbox/modules/android/custom_rules/vulnerable-activity.yaml b/backend/toolbox/modules/android/custom_rules/vulnerable-activity.yaml deleted file mode 100644 index 0cef4fc..0000000 --- a/backend/toolbox/modules/android/custom_rules/vulnerable-activity.yaml +++ /dev/null @@ -1,16 +0,0 @@ -rules: - - id: vulnerable-activity - severity: WARNING - languages: [xml] - message: "Activity exported without permission." - metadata: - authors: - - Guerric ELOI (FuzzingLabs) - category: component - area: manifest - verification-level: [L1] - paths: - include: - - "**/AndroidManifest.xml" - pattern: | - ModuleMetadata: - return ModuleMetadata( - name="jadx_decompiler", - version="1.5.0", - description="Android APK decompilation using Jadx - converts DEX bytecode to Java source", - author="FuzzForge Team", - category="android", - tags=["android", "jadx", "decompilation", "reverse", "apk"], - input_schema={ - "type": "object", - "properties": { - "apk_path": { - "type": "string", - "description": "Path to the APK to decompile (absolute or relative to workspace)", - }, - "output_dir": { - "type": "string", - "description": "Directory (relative to workspace) where Jadx output should be written", - "default": "jadx_output", - }, - "overwrite": { - "type": "boolean", - "description": "Overwrite existing output directory if present", - "default": True, - }, - "threads": { - "type": "integer", - "description": "Number of Jadx decompilation threads", - "default": 4, - "minimum": 1, - "maximum": 32, - }, - "decompiler_args": { - "type": "array", - "items": {"type": "string"}, - "description": "Additional arguments passed directly to 
Jadx", - "default": [], - }, - }, - "required": ["apk_path"], - }, - output_schema={ - "type": "object", - "properties": { - "output_dir": { - "type": "string", - "description": "Path to decompiled output directory", - }, - "source_dir": { - "type": "string", - "description": "Path to decompiled Java sources", - }, - "resource_dir": { - "type": "string", - "description": "Path to extracted resources", - }, - "java_files": { - "type": "integer", - "description": "Number of Java files decompiled", - }, - }, - }, - requires_workspace=True, - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - apk_path = config.get("apk_path") - if not apk_path: - raise ValueError("'apk_path' must be provided for Jadx decompilation") - - threads = config.get("threads", 4) - if not isinstance(threads, int) or threads < 1 or threads > 32: - raise ValueError("threads must be between 1 and 32") - - return True - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute Jadx decompilation on an APK file. - - Args: - config: Configuration dict with apk_path, output_dir, etc. 
- workspace: Workspace directory path - - Returns: - ModuleResult with decompilation summary and metadata - """ - self.start_timer() - - try: - self.validate_config(config) - self.validate_workspace(workspace) - - workspace = workspace.resolve() - - # Resolve APK path - apk_path = Path(config["apk_path"]) - if not apk_path.is_absolute(): - apk_path = (workspace / apk_path).resolve() - - if not apk_path.exists(): - raise ValueError(f"APK not found: {apk_path}") - - if apk_path.is_dir(): - raise ValueError(f"APK path must be a file, not a directory: {apk_path}") - - logger.info(f"Decompiling APK: {apk_path}") - - # Resolve output directory - output_dir = Path(config.get("output_dir", "jadx_output")) - if not output_dir.is_absolute(): - output_dir = (workspace / output_dir).resolve() - - # Handle existing output directory - if output_dir.exists(): - if config.get("overwrite", True): - logger.info(f"Removing existing output directory: {output_dir}") - shutil.rmtree(output_dir) - else: - raise ValueError( - f"Output directory already exists: {output_dir}. Set overwrite=true to replace it." 
- ) - - output_dir.mkdir(parents=True, exist_ok=True) - - # Build Jadx command - threads = str(config.get("threads", 4)) - extra_args = config.get("decompiler_args", []) or [] - - cmd = [ - "jadx", - "--threads-count", - threads, - "--deobf", # Deobfuscate code - "--output-dir", - str(output_dir), - ] - cmd.extend(extra_args) - cmd.append(str(apk_path)) - - logger.info(f"Running Jadx: {' '.join(cmd)}") - - # Execute Jadx - process = await asyncio.create_subprocess_exec( - *cmd, - stdout=asyncio.subprocess.PIPE, - stderr=asyncio.subprocess.PIPE, - cwd=str(workspace), - ) - - stdout, stderr = await process.communicate() - stdout_str = stdout.decode(errors="ignore") if stdout else "" - stderr_str = stderr.decode(errors="ignore") if stderr else "" - - if stdout_str: - logger.debug(f"Jadx stdout: {stdout_str[:200]}...") - if stderr_str: - logger.debug(f"Jadx stderr: {stderr_str[:200]}...") - - if process.returncode != 0: - error_output = stderr_str or stdout_str or "No error output" - raise RuntimeError( - f"Jadx failed with exit code {process.returncode}: {error_output[:500]}" - ) - - # Verify output structure - source_dir = output_dir / "sources" - resource_dir = output_dir / "resources" - - if not source_dir.exists(): - logger.warning( - f"Jadx sources directory not found at expected path: {source_dir}" - ) - # Use output_dir as fallback - source_dir = output_dir - - # Count decompiled Java files - java_files = 0 - if source_dir.exists(): - java_files = sum(1 for _ in source_dir.rglob("*.java")) - logger.info(f"Decompiled {java_files} Java files") - - # Log sample files for debugging - sample_files = [] - for idx, file_path in enumerate(source_dir.rglob("*.java")): - sample_files.append(str(file_path.relative_to(workspace))) - if idx >= 4: - break - if sample_files: - logger.debug(f"Sample Java files: {sample_files}") - - # Create summary - summary = { - "output_dir": str(output_dir), - "source_dir": str(source_dir if source_dir.exists() else output_dir), - 
"resource_dir": str( - resource_dir if resource_dir.exists() else output_dir - ), - "java_files": java_files, - "apk_name": apk_path.name, - "apk_size_bytes": apk_path.stat().st_size, - } - - metadata = { - "apk_path": str(apk_path), - "output_dir": str(output_dir), - "source_dir": summary["source_dir"], - "resource_dir": summary["resource_dir"], - "threads": threads, - "decompiler": "jadx", - "decompiler_version": "1.5.0", - } - - logger.info( - f"āœ“ Jadx decompilation completed: {java_files} Java files generated" - ) - - return self.create_result( - findings=[], # Jadx doesn't generate findings, only decompiles - status="success", - summary=summary, - metadata=metadata, - ) - - except Exception as exc: - logger.error(f"Jadx decompilation failed: {exc}", exc_info=True) - return self.create_result( - findings=[], - status="failed", - error=str(exc), - metadata={"decompiler": "jadx", "apk_path": config.get("apk_path")}, - ) diff --git a/backend/toolbox/modules/android/mobsf_scanner.py b/backend/toolbox/modules/android/mobsf_scanner.py deleted file mode 100644 index 3b16e1b..0000000 --- a/backend/toolbox/modules/android/mobsf_scanner.py +++ /dev/null @@ -1,437 +0,0 @@ -""" -MobSF Scanner Module - -Mobile Security Framework (MobSF) integration for comprehensive Android app security analysis. -Performs static analysis on APK files including permissions, manifest analysis, code analysis, and behavior checks. -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- -import logging -import os -from collections import Counter -from pathlib import Path -from typing import Dict, Any, List -import aiohttp - -try: - from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult -except ImportError: - try: - from modules.base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult - except ImportError: - from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult - -logger = logging.getLogger(__name__) - - -class MobSFScanner(BaseModule): - """Mobile Security Framework (MobSF) scanner module for Android applications""" - - SEVERITY_MAP = { - "dangerous": "critical", - "high": "high", - "warning": "medium", - "medium": "medium", - "low": "low", - "info": "low", - "secure": "low", - } - - def get_metadata(self) -> ModuleMetadata: - return ModuleMetadata( - name="mobsf_scanner", - version="3.9.7", - description="Comprehensive Android security analysis using Mobile Security Framework (MobSF)", - author="FuzzForge Team", - category="android", - tags=["mobile", "android", "mobsf", "sast", "scanner", "security"], - input_schema={ - "type": "object", - "properties": { - "mobsf_url": { - "type": "string", - "description": "MobSF server URL", - "default": "http://localhost:8877", - }, - "file_path": { - "type": "string", - "description": "Path to the APK file to scan (absolute or relative to workspace)", - }, - "api_key": { - "type": "string", - "description": "MobSF API key (if not provided, will try MOBSF_API_KEY env var)", - "default": None, - }, - "rescan": { - "type": "boolean", - "description": "Force rescan even if file was previously analyzed", - "default": False, - }, - }, - "required": ["file_path"], - }, - output_schema={ - "type": "object", - "properties": { - "findings": { - "type": "array", - "description": "Security findings from MobSF analysis" - }, - "scan_hash": {"type": "string"}, - "total_findings": {"type": "integer"}, - "severity_counts": {"type": 
"object"}, - } - }, - requires_workspace=True, - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - if "mobsf_url" in config and not isinstance(config["mobsf_url"], str): - raise ValueError("mobsf_url must be a string") - - file_path = config.get("file_path") - if not file_path: - raise ValueError("file_path is required for MobSF scanning") - - return True - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute MobSF security analysis on an APK file. - - Args: - config: Configuration dict with file_path, mobsf_url, api_key - workspace: Workspace directory path - - Returns: - ModuleResult with security findings from MobSF - """ - self.start_timer() - - try: - self.validate_config(config) - self.validate_workspace(workspace) - - # Get configuration - mobsf_url = config.get("mobsf_url", "http://localhost:8877") - file_path_str = config["file_path"] - rescan = config.get("rescan", False) - - # Get API key from config or environment - api_key = config.get("api_key") or os.environ.get("MOBSF_API_KEY", "") - if not api_key: - logger.warning("No MobSF API key provided. 
Some functionality may be limited.") - - # Resolve APK file path - file_path = Path(file_path_str) - if not file_path.is_absolute(): - file_path = (workspace / file_path).resolve() - - if not file_path.exists(): - raise FileNotFoundError(f"APK file not found: {file_path}") - - if not file_path.is_file(): - raise ValueError(f"APK path must be a file: {file_path}") - - logger.info(f"Starting MobSF scan of APK: {file_path}") - - # Upload and scan APK - scan_hash = await self._upload_file(mobsf_url, file_path, api_key) - logger.info(f"APK uploaded to MobSF with hash: {scan_hash}") - - # Start scan - await self._start_scan(mobsf_url, scan_hash, api_key, rescan=rescan) - logger.info(f"MobSF scan completed for hash: {scan_hash}") - - # Get JSON results - scan_results = await self._get_json_results(mobsf_url, scan_hash, api_key) - - # Parse results into findings - findings = self._parse_scan_results(scan_results, file_path) - - # Create summary - summary = self._create_summary(findings, scan_hash) - - logger.info(f"āœ“ MobSF scan completed: {len(findings)} findings") - - return self.create_result( - findings=findings, - status="success", - summary=summary, - metadata={ - "tool": "mobsf", - "tool_version": "3.9.7", - "scan_hash": scan_hash, - "apk_file": str(file_path), - "mobsf_url": mobsf_url, - } - ) - - except Exception as exc: - logger.error(f"MobSF scanner failed: {exc}", exc_info=True) - return self.create_result( - findings=[], - status="failed", - error=str(exc), - metadata={"tool": "mobsf", "file_path": config.get("file_path")} - ) - - async def _upload_file(self, mobsf_url: str, file_path: Path, api_key: str) -> str: - """ - Upload APK file to MobSF server. 
- - Returns: - Scan hash for the uploaded file - """ - headers = {'X-Mobsf-Api-Key': api_key} if api_key else {} - - # Create multipart form data - filename = file_path.name - - async with aiohttp.ClientSession() as session: - with open(file_path, 'rb') as f: - data = aiohttp.FormData() - data.add_field('file', - f, - filename=filename, - content_type='application/vnd.android.package-archive') - - async with session.post( - f"{mobsf_url}/api/v1/upload", - headers=headers, - data=data, - timeout=aiohttp.ClientTimeout(total=300) - ) as response: - if response.status != 200: - error_text = await response.text() - raise Exception(f"Failed to upload file to MobSF: {error_text}") - - result = await response.json() - scan_hash = result.get('hash') - if not scan_hash: - raise Exception(f"MobSF upload failed: {result}") - - return scan_hash - - async def _start_scan(self, mobsf_url: str, scan_hash: str, api_key: str, rescan: bool = False) -> Dict[str, Any]: - """ - Start MobSF scan for uploaded file. - - Returns: - Scan result dictionary - """ - headers = {'X-Mobsf-Api-Key': api_key} if api_key else {} - data = { - 'hash': scan_hash, - 're_scan': '1' if rescan else '0' - } - - async with aiohttp.ClientSession() as session: - async with session.post( - f"{mobsf_url}/api/v1/scan", - headers=headers, - data=data, - timeout=aiohttp.ClientTimeout(total=600) # 10 minutes for scan - ) as response: - if response.status != 200: - error_text = await response.text() - raise Exception(f"MobSF scan failed: {error_text}") - - result = await response.json() - return result - - async def _get_json_results(self, mobsf_url: str, scan_hash: str, api_key: str) -> Dict[str, Any]: - """ - Retrieve JSON scan results from MobSF. 
- - Returns: - Scan results dictionary - """ - headers = {'X-Mobsf-Api-Key': api_key} if api_key else {} - data = {'hash': scan_hash} - - async with aiohttp.ClientSession() as session: - async with session.post( - f"{mobsf_url}/api/v1/report_json", - headers=headers, - data=data, - timeout=aiohttp.ClientTimeout(total=60) - ) as response: - if response.status != 200: - error_text = await response.text() - raise Exception(f"Failed to retrieve MobSF results: {error_text}") - - return await response.json() - - def _parse_scan_results(self, scan_data: Dict[str, Any], apk_path: Path) -> List[ModuleFinding]: - """Parse MobSF JSON results into standardized findings""" - findings = [] - - # Parse permissions - if 'permissions' in scan_data: - for perm_name, perm_attrs in scan_data['permissions'].items(): - if isinstance(perm_attrs, dict): - severity = self.SEVERITY_MAP.get( - perm_attrs.get('status', '').lower(), 'low' - ) - - finding = self.create_finding( - title=f"Android Permission: {perm_name}", - description=perm_attrs.get('description', 'No description'), - severity=severity, - category="android-permission", - metadata={ - 'permission': perm_name, - 'status': perm_attrs.get('status'), - 'info': perm_attrs.get('info'), - 'tool': 'mobsf', - } - ) - findings.append(finding) - - # Parse manifest analysis - if 'manifest_analysis' in scan_data: - manifest_findings = scan_data['manifest_analysis'].get('manifest_findings', []) - for item in manifest_findings: - if isinstance(item, dict): - severity = self.SEVERITY_MAP.get(item.get('severity', '').lower(), 'medium') - - finding = self.create_finding( - title=item.get('title') or item.get('name') or "Manifest Issue", - description=item.get('description', 'No description'), - severity=severity, - category="android-manifest", - metadata={ - 'rule': item.get('rule'), - 'tool': 'mobsf', - } - ) - findings.append(finding) - - # Parse code analysis - if 'code_analysis' in scan_data: - code_findings = 
scan_data['code_analysis'].get('findings', {}) - for finding_name, finding_data in code_findings.items(): - if isinstance(finding_data, dict): - metadata_dict = finding_data.get('metadata', {}) - severity = self.SEVERITY_MAP.get( - metadata_dict.get('severity', '').lower(), 'medium' - ) - - # MobSF returns 'files' as a dict: {filename: line_numbers} - files_dict = finding_data.get('files', {}) - - # Create a finding for each affected file - if isinstance(files_dict, dict) and files_dict: - for file_path, line_numbers in files_dict.items(): - finding = self.create_finding( - title=finding_name, - description=metadata_dict.get('description', 'No description'), - severity=severity, - category="android-code-analysis", - file_path=file_path, - line_number=line_numbers, # Can be string like "28" or "65,81" - metadata={ - 'cwe': metadata_dict.get('cwe'), - 'owasp': metadata_dict.get('owasp'), - 'masvs': metadata_dict.get('masvs'), - 'cvss': metadata_dict.get('cvss'), - 'ref': metadata_dict.get('ref'), - 'line_numbers': line_numbers, - 'tool': 'mobsf', - } - ) - findings.append(finding) - else: - # Fallback: create one finding without file info - finding = self.create_finding( - title=finding_name, - description=metadata_dict.get('description', 'No description'), - severity=severity, - category="android-code-analysis", - metadata={ - 'cwe': metadata_dict.get('cwe'), - 'owasp': metadata_dict.get('owasp'), - 'masvs': metadata_dict.get('masvs'), - 'cvss': metadata_dict.get('cvss'), - 'ref': metadata_dict.get('ref'), - 'tool': 'mobsf', - } - ) - findings.append(finding) - - # Parse behavior analysis - if 'behaviour' in scan_data: - for key, value in scan_data['behaviour'].items(): - if isinstance(value, dict): - metadata_dict = value.get('metadata', {}) - labels = metadata_dict.get('label', []) - label = labels[0] if labels else 'Unknown Behavior' - - severity = self.SEVERITY_MAP.get( - metadata_dict.get('severity', '').lower(), 'medium' - ) - - # MobSF returns 'files' as a 
dict: {filename: line_numbers} - files_dict = value.get('files', {}) - - # Create a finding for each affected file - if isinstance(files_dict, dict) and files_dict: - for file_path, line_numbers in files_dict.items(): - finding = self.create_finding( - title=f"Behavior: {label}", - description=metadata_dict.get('description', 'No description'), - severity=severity, - category="android-behavior", - file_path=file_path, - line_number=line_numbers, - metadata={ - 'line_numbers': line_numbers, - 'behavior_key': key, - 'tool': 'mobsf', - } - ) - findings.append(finding) - else: - # Fallback: create one finding without file info - finding = self.create_finding( - title=f"Behavior: {label}", - description=metadata_dict.get('description', 'No description'), - severity=severity, - category="android-behavior", - metadata={ - 'behavior_key': key, - 'tool': 'mobsf', - } - ) - findings.append(finding) - - logger.debug(f"Parsed {len(findings)} findings from MobSF results") - return findings - - def _create_summary(self, findings: List[ModuleFinding], scan_hash: str) -> Dict[str, Any]: - """Create analysis summary""" - severity_counter = Counter() - category_counter = Counter() - - for finding in findings: - severity_counter[finding.severity] += 1 - category_counter[finding.category] += 1 - - return { - "scan_hash": scan_hash, - "total_findings": len(findings), - "severity_counts": dict(severity_counter), - "category_counts": dict(category_counter), - } diff --git a/backend/toolbox/modules/android/opengrep_android.py b/backend/toolbox/modules/android/opengrep_android.py deleted file mode 100644 index 01e32c4..0000000 --- a/backend/toolbox/modules/android/opengrep_android.py +++ /dev/null @@ -1,440 +0,0 @@ -""" -OpenGrep Android Static Analysis Module - -Pattern-based static analysis for Android applications using OpenGrep/Semgrep -with Android-specific security rules. -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). 
See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import asyncio -import json -import logging -from pathlib import Path -from typing import Dict, Any, List - -try: - from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult -except ImportError: - try: - from modules.base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult - except ImportError: - from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult - -logger = logging.getLogger(__name__) - - -class OpenGrepAndroid(BaseModule): - """OpenGrep static analysis module specialized for Android security""" - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="opengrep_android", - version="1.45.0", - description="Android-focused static analysis using OpenGrep/Semgrep with custom security rules for Java/Kotlin", - author="FuzzForge Team", - category="android", - tags=["sast", "android", "opengrep", "semgrep", "java", "kotlin", "security"], - input_schema={ - "type": "object", - "properties": { - "config": { - "type": "string", - "enum": ["auto", "p/security-audit", "p/owasp-top-ten", "p/cwe-top-25"], - "default": "auto", - "description": "Rule configuration to use" - }, - "custom_rules_path": { - "type": "string", - "description": "Path to a directory containing custom OpenGrep rules (Android-specific rules recommended)", - "default": None, - }, - "languages": { - "type": "array", - "items": {"type": "string"}, - "description": "Specific languages to analyze (defaults to java, kotlin for Android)", - "default": ["java", "kotlin"], - }, - "include_patterns": { - 
"type": "array", - "items": {"type": "string"}, - "description": "File patterns to include", - "default": [], - }, - "exclude_patterns": { - "type": "array", - "items": {"type": "string"}, - "description": "File patterns to exclude", - "default": [], - }, - "max_target_bytes": { - "type": "integer", - "default": 1000000, - "description": "Maximum file size to analyze (bytes)" - }, - "timeout": { - "type": "integer", - "default": 300, - "description": "Analysis timeout in seconds" - }, - "severity": { - "type": "array", - "items": {"type": "string", "enum": ["ERROR", "WARNING", "INFO"]}, - "default": ["ERROR", "WARNING", "INFO"], - "description": "Minimum severity levels to report" - }, - "confidence": { - "type": "array", - "items": {"type": "string", "enum": ["HIGH", "MEDIUM", "LOW"]}, - "default": ["HIGH", "MEDIUM", "LOW"], - "description": "Minimum confidence levels to report" - } - } - }, - output_schema={ - "type": "object", - "properties": { - "findings": { - "type": "array", - "description": "Security findings from OpenGrep analysis" - }, - "total_findings": {"type": "integer"}, - "severity_counts": {"type": "object"}, - "files_analyzed": {"type": "integer"}, - } - }, - requires_workspace=True, - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate configuration""" - timeout = config.get("timeout", 300) - if not isinstance(timeout, int) or timeout < 30 or timeout > 3600: - raise ValueError("Timeout must be between 30 and 3600 seconds") - - max_bytes = config.get("max_target_bytes", 1000000) - if not isinstance(max_bytes, int) or max_bytes < 1000 or max_bytes > 10000000: - raise ValueError("max_target_bytes must be between 1000 and 10000000") - - custom_rules_path = config.get("custom_rules_path") - if custom_rules_path: - rules_path = Path(custom_rules_path) - if not rules_path.exists(): - logger.warning(f"Custom rules path does not exist: {custom_rules_path}") - - return True - - async def execute(self, config: Dict[str, Any], 
workspace: Path) -> ModuleResult: - """Execute OpenGrep static analysis on Android code""" - self.start_timer() - - try: - # Validate inputs - self.validate_config(config) - self.validate_workspace(workspace) - - logger.info(f"Running OpenGrep Android analysis on {workspace}") - - # Build opengrep command - cmd = ["opengrep", "scan", "--json"] - - # Add configuration - custom_rules_path = config.get("custom_rules_path") - use_custom_rules = False - if custom_rules_path and Path(custom_rules_path).exists(): - cmd.extend(["--config", custom_rules_path]) - use_custom_rules = True - logger.info(f"Using custom Android rules from: {custom_rules_path}") - else: - config_type = config.get("config", "auto") - if config_type == "auto": - cmd.extend(["--config", "auto"]) - else: - cmd.extend(["--config", config_type]) - - # Add timeout - cmd.extend(["--timeout", str(config.get("timeout", 300))]) - - # Add max target bytes - cmd.extend(["--max-target-bytes", str(config.get("max_target_bytes", 1000000))]) - - # Add languages if specified (but NOT when using custom rules) - languages = config.get("languages", ["java", "kotlin"]) - if languages and not use_custom_rules: - langs = ",".join(languages) - cmd.extend(["--lang", langs]) - logger.debug(f"Analyzing languages: {langs}") - - # Add include patterns - include_patterns = config.get("include_patterns", []) - for pattern in include_patterns: - cmd.extend(["--include", pattern]) - - # Add exclude patterns - exclude_patterns = config.get("exclude_patterns", []) - for pattern in exclude_patterns: - cmd.extend(["--exclude", pattern]) - - # Add severity filter if single level requested - severity_levels = config.get("severity", ["ERROR", "WARNING", "INFO"]) - if severity_levels and len(severity_levels) == 1: - cmd.extend(["--severity", severity_levels[0]]) - - # Disable metrics collection - cmd.append("--disable-version-check") - cmd.append("--no-git-ignore") - - # Add target directory - cmd.append(str(workspace)) - - 
logger.debug(f"Running command: {' '.join(cmd)}") - - # Run OpenGrep - process = await asyncio.create_subprocess_exec( - *cmd, - stdout=asyncio.subprocess.PIPE, - stderr=asyncio.subprocess.PIPE, - cwd=workspace - ) - - stdout, stderr = await process.communicate() - - # Parse results - findings = [] - if process.returncode in [0, 1]: # 0 = no findings, 1 = findings found - findings = self._parse_opengrep_output(stdout.decode(), workspace, config) - logger.info(f"OpenGrep found {len(findings)} potential security issues") - else: - error_msg = stderr.decode() - logger.error(f"OpenGrep failed: {error_msg}") - return self.create_result( - findings=[], - status="failed", - error=f"OpenGrep execution failed (exit code {process.returncode}): {error_msg[:500]}" - ) - - # Create summary - summary = self._create_summary(findings) - - return self.create_result( - findings=findings, - status="success", - summary=summary, - metadata={ - "tool": "opengrep", - "tool_version": "1.45.0", - "languages": languages, - "custom_rules": bool(custom_rules_path), - } - ) - - except Exception as e: - logger.error(f"OpenGrep Android module failed: {e}", exc_info=True) - return self.create_result( - findings=[], - status="failed", - error=str(e) - ) - - def _parse_opengrep_output(self, output: str, workspace: Path, config: Dict[str, Any]) -> List[ModuleFinding]: - """Parse OpenGrep JSON output into findings""" - findings = [] - - if not output.strip(): - return findings - - try: - data = json.loads(output) - results = data.get("results", []) - logger.debug(f"OpenGrep returned {len(results)} raw results") - - # Get filtering criteria - allowed_severities = set(config.get("severity", ["ERROR", "WARNING", "INFO"])) - allowed_confidences = set(config.get("confidence", ["HIGH", "MEDIUM", "LOW"])) - - for result in results: - # Extract basic info - rule_id = result.get("check_id", "unknown") - message = result.get("message", "") - extra = result.get("extra", {}) - severity = extra.get("severity", 
"INFO").upper() - - # File location info - path_info = result.get("path", "") - start_line = result.get("start", {}).get("line", 0) - end_line = result.get("end", {}).get("line", 0) - - # Code snippet - lines = extra.get("lines", "") - - # Metadata - rule_metadata = extra.get("metadata", {}) - cwe = rule_metadata.get("cwe", []) - owasp = rule_metadata.get("owasp", []) - confidence = extra.get("confidence", rule_metadata.get("confidence", "MEDIUM")).upper() - - # Apply severity filter - if severity not in allowed_severities: - continue - - # Apply confidence filter - if confidence not in allowed_confidences: - continue - - # Make file path relative to workspace - if path_info: - try: - rel_path = Path(path_info).relative_to(workspace) - path_info = str(rel_path) - except ValueError: - pass - - # Map severity to our standard levels - finding_severity = self._map_severity(severity) - - # Create finding - finding = self.create_finding( - title=f"Android Security: {rule_id}", - description=message or f"OpenGrep rule {rule_id} triggered", - severity=finding_severity, - category=self._get_category(rule_id, extra), - file_path=path_info if path_info else None, - line_start=start_line if start_line > 0 else None, - line_end=end_line if end_line > 0 and end_line != start_line else None, - code_snippet=lines.strip() if lines else None, - recommendation=self._get_recommendation(rule_id, extra), - metadata={ - "rule_id": rule_id, - "opengrep_severity": severity, - "confidence": confidence, - "cwe": cwe, - "owasp": owasp, - "fix": extra.get("fix", ""), - "impact": extra.get("impact", ""), - "likelihood": extra.get("likelihood", ""), - "references": extra.get("references", []), - "tool": "opengrep", - } - ) - - findings.append(finding) - - except json.JSONDecodeError as e: - logger.warning(f"Failed to parse OpenGrep output: {e}. 
Output snippet: {output[:200]}...") - except Exception as e: - logger.warning(f"Error processing OpenGrep results: {e}", exc_info=True) - - return findings - - def _map_severity(self, opengrep_severity: str) -> str: - """Map OpenGrep severity to our standard severity levels""" - severity_map = { - "ERROR": "high", - "WARNING": "medium", - "INFO": "low" - } - return severity_map.get(opengrep_severity.upper(), "medium") - - def _get_category(self, rule_id: str, extra: Dict[str, Any]) -> str: - """Determine finding category based on rule and metadata""" - rule_metadata = extra.get("metadata", {}) - cwe_list = rule_metadata.get("cwe", []) - owasp_list = rule_metadata.get("owasp", []) - - rule_lower = rule_id.lower() - - # Android-specific categories - if "injection" in rule_lower or "sql" in rule_lower: - return "injection" - elif "intent" in rule_lower: - return "android-intent" - elif "webview" in rule_lower: - return "android-webview" - elif "deeplink" in rule_lower: - return "android-deeplink" - elif "storage" in rule_lower or "sharedpreferences" in rule_lower: - return "android-storage" - elif "logging" in rule_lower or "log" in rule_lower: - return "android-logging" - elif "clipboard" in rule_lower: - return "android-clipboard" - elif "activity" in rule_lower or "service" in rule_lower or "provider" in rule_lower: - return "android-component" - elif "crypto" in rule_lower or "encrypt" in rule_lower: - return "cryptography" - elif "hardcode" in rule_lower or "secret" in rule_lower: - return "secrets" - elif "auth" in rule_lower: - return "authentication" - elif cwe_list: - return f"cwe-{cwe_list[0]}" - elif owasp_list: - return f"owasp-{owasp_list[0].replace(' ', '-').lower()}" - else: - return "android-security" - - def _get_recommendation(self, rule_id: str, extra: Dict[str, Any]) -> str: - """Generate recommendation based on rule and metadata""" - fix_suggestion = extra.get("fix", "") - if fix_suggestion: - return fix_suggestion - - rule_lower = rule_id.lower() 
- - # Android-specific recommendations - if "injection" in rule_lower or "sql" in rule_lower: - return "Use parameterized queries or Room database with type-safe queries to prevent SQL injection." - elif "intent" in rule_lower: - return "Validate all incoming Intent data and use explicit Intents when possible to prevent Intent manipulation attacks." - elif "webview" in rule_lower and "javascript" in rule_lower: - return "Disable JavaScript in WebView if not needed, or implement proper JavaScript interfaces with @JavascriptInterface annotation." - elif "deeplink" in rule_lower: - return "Validate all deeplink URLs and sanitize user input to prevent deeplink hijacking attacks." - elif "storage" in rule_lower or "sharedpreferences" in rule_lower: - return "Encrypt sensitive data before storing in SharedPreferences or use EncryptedSharedPreferences for Android API 23+." - elif "logging" in rule_lower: - return "Remove sensitive data from logs in production builds. Use ProGuard/R8 to strip logging statements." - elif "clipboard" in rule_lower: - return "Avoid placing sensitive data on the clipboard. If necessary, clear clipboard data when no longer needed." - elif "crypto" in rule_lower: - return "Use modern cryptographic algorithms (AES-GCM, RSA-OAEP) and Android Keystore for key management." - elif "hardcode" in rule_lower or "secret" in rule_lower: - return "Remove hardcoded secrets. Use Android Keystore, environment variables, or secure configuration management." - else: - return "Review this Android security issue and apply appropriate fixes based on Android security best practices." 
- - def _create_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: - """Create analysis summary""" - severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0} - category_counts = {} - rule_counts = {} - - for finding in findings: - # Count by severity - severity_counts[finding.severity] += 1 - - # Count by category - category = finding.category - category_counts[category] = category_counts.get(category, 0) + 1 - - # Count by rule - rule_id = finding.metadata.get("rule_id", "unknown") - rule_counts[rule_id] = rule_counts.get(rule_id, 0) + 1 - - return { - "total_findings": len(findings), - "severity_counts": severity_counts, - "category_counts": category_counts, - "top_rules": dict(sorted(rule_counts.items(), key=lambda x: x[1], reverse=True)[:10]), - "files_analyzed": len(set(f.file_path for f in findings if f.file_path)) - } diff --git a/backend/toolbox/modules/base.py b/backend/toolbox/modules/base.py deleted file mode 100644 index dcef98d..0000000 --- a/backend/toolbox/modules/base.py +++ /dev/null @@ -1,271 +0,0 @@ -""" -Base module interface for all FuzzForge modules -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- -from abc import ABC, abstractmethod -from pathlib import Path -from typing import Dict, Any, List, Optional -from pydantic import BaseModel, Field -import logging - -logger = logging.getLogger(__name__) - - -class ModuleMetadata(BaseModel): - """Metadata describing a module's capabilities and requirements""" - name: str = Field(..., description="Module name") - version: str = Field(..., description="Module version") - description: str = Field(..., description="Module description") - author: Optional[str] = Field(None, description="Module author") - category: str = Field(..., description="Module category (scanner, analyzer, reporter, etc.)") - tags: List[str] = Field(default_factory=list, description="Module tags") - input_schema: Dict[str, Any] = Field(default_factory=dict, description="Expected input schema") - output_schema: Dict[str, Any] = Field(default_factory=dict, description="Output schema") - requires_workspace: bool = Field(True, description="Whether module requires workspace access") - - -class ModuleFinding(BaseModel): - """Individual finding from a module""" - id: str = Field(..., description="Unique finding ID") - title: str = Field(..., description="Finding title") - description: str = Field(..., description="Detailed description") - severity: str = Field(..., description="Severity level (info, low, medium, high, critical)") - category: str = Field(..., description="Finding category") - file_path: Optional[str] = Field(None, description="Affected file path relative to workspace") - line_start: Optional[int] = Field(None, description="Starting line number") - line_end: Optional[int] = Field(None, description="Ending line number") - code_snippet: Optional[str] = Field(None, description="Relevant code snippet") - recommendation: Optional[str] = Field(None, description="Remediation recommendation") - metadata: Dict[str, Any] = Field(default_factory=dict, description="Additional metadata") - - -class ModuleResult(BaseModel): - """Standard result format 
from module execution""" - module: str = Field(..., description="Module name") - version: str = Field(..., description="Module version") - status: str = Field(default="success", description="Execution status (success, partial, failed)") - execution_time: float = Field(..., description="Execution time in seconds") - findings: List[ModuleFinding] = Field(default_factory=list, description="List of findings") - summary: Dict[str, Any] = Field(default_factory=dict, description="Summary statistics") - metadata: Dict[str, Any] = Field(default_factory=dict, description="Additional metadata") - error: Optional[str] = Field(None, description="Error message if failed") - sarif: Optional[Dict[str, Any]] = Field(None, description="SARIF report if generated by reporter module") - - -class BaseModule(ABC): - """ - Base interface for all security testing modules. - - All modules must inherit from this class and implement the required methods. - Modules are designed to be stateless and reusable across different workflows. - """ - - def __init__(self): - """Initialize the module""" - self._metadata = self.get_metadata() - self._start_time = None - logger.info(f"Initialized module: {self._metadata.name} v{self._metadata.version}") - - @abstractmethod - def get_metadata(self) -> ModuleMetadata: - """ - Get module metadata. - - Returns: - ModuleMetadata object describing the module - """ - pass - - @abstractmethod - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute the module with given configuration and workspace. - - Args: - config: Module-specific configuration parameters - workspace: Path to the mounted workspace directory - - Returns: - ModuleResult containing findings and metadata - """ - pass - - @abstractmethod - def validate_config(self, config: Dict[str, Any]) -> bool: - """ - Validate the provided configuration against module requirements. 
- - Args: - config: Configuration to validate - - Returns: - True if configuration is valid, False otherwise - - Raises: - ValueError: If configuration is invalid with details - """ - pass - - def validate_workspace(self, workspace: Path) -> bool: - """ - Validate that the workspace exists and is accessible. - - Args: - workspace: Path to the workspace - - Returns: - True if workspace is valid - - Raises: - ValueError: If workspace is invalid - """ - if not workspace.exists(): - raise ValueError(f"Workspace does not exist: {workspace}") - - if not workspace.is_dir(): - raise ValueError(f"Workspace is not a directory: {workspace}") - - return True - - def create_finding( - self, - title: str, - description: str, - severity: str, - category: str, - **kwargs - ) -> ModuleFinding: - """ - Helper method to create a standardized finding. - - Args: - title: Finding title - description: Detailed description - severity: Severity level - category: Finding category - **kwargs: Additional finding fields - - Returns: - ModuleFinding object - """ - import uuid - finding_id = str(uuid.uuid4()) - - return ModuleFinding( - id=finding_id, - title=title, - description=description, - severity=severity, - category=category, - **kwargs - ) - - def start_timer(self): - """Start the execution timer""" - from time import time - self._start_time = time() - - def get_execution_time(self) -> float: - """Get the execution time in seconds""" - from time import time - if self._start_time is None: - return 0.0 - return time() - self._start_time - - def create_result( - self, - findings: List[ModuleFinding], - status: str = "success", - summary: Dict[str, Any] = None, - metadata: Dict[str, Any] = None, - error: str = None - ) -> ModuleResult: - """ - Helper method to create a module result. 
- - Args: - findings: List of findings - status: Execution status - summary: Summary statistics - metadata: Additional metadata - error: Error message if failed - - Returns: - ModuleResult object - """ - return ModuleResult( - module=self._metadata.name, - version=self._metadata.version, - status=status, - execution_time=self.get_execution_time(), - findings=findings, - summary=summary or self._generate_summary(findings), - metadata=metadata or {}, - error=error - ) - - def _generate_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: - """ - Generate summary statistics from findings. - - Args: - findings: List of findings - - Returns: - Summary dictionary - """ - severity_counts = { - "info": 0, - "low": 0, - "medium": 0, - "high": 0, - "critical": 0 - } - - category_counts = {} - - for finding in findings: - # Count by severity - if finding.severity in severity_counts: - severity_counts[finding.severity] += 1 - - # Count by category - if finding.category not in category_counts: - category_counts[finding.category] = 0 - category_counts[finding.category] += 1 - - return { - "total_findings": len(findings), - "severity_counts": severity_counts, - "category_counts": category_counts, - "highest_severity": self._get_highest_severity(findings) - } - - def _get_highest_severity(self, findings: List[ModuleFinding]) -> str: - """ - Get the highest severity from findings. 
- - Args: - findings: List of findings - - Returns: - Highest severity level - """ - severity_order = ["critical", "high", "medium", "low", "info"] - - for severity in severity_order: - if any(f.severity == severity for f in findings): - return severity - - return "none" \ No newline at end of file diff --git a/backend/toolbox/modules/fuzzer/__init__.py b/backend/toolbox/modules/fuzzer/__init__.py deleted file mode 100644 index ad0d1ba..0000000 --- a/backend/toolbox/modules/fuzzer/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -""" -Fuzzing modules for FuzzForge - -This package contains fuzzing modules for different fuzzing engines. -""" - -from .atheris_fuzzer import AtherisFuzzer -from .cargo_fuzzer import CargoFuzzer - -__all__ = ["AtherisFuzzer", "CargoFuzzer"] diff --git a/backend/toolbox/modules/fuzzer/atheris_fuzzer.py b/backend/toolbox/modules/fuzzer/atheris_fuzzer.py deleted file mode 100644 index 3f0c42d..0000000 --- a/backend/toolbox/modules/fuzzer/atheris_fuzzer.py +++ /dev/null @@ -1,608 +0,0 @@ -""" -Atheris Fuzzer Module - -Reusable module for fuzzing Python code using Atheris. -Discovers and fuzzes user-provided Python targets with TestOneInput() function. -""" - -import asyncio -import base64 -import importlib.util -import logging -import multiprocessing -import os -import sys -import time -from datetime import datetime -from pathlib import Path -from typing import Dict, Any, List, Optional, Callable -import uuid - -import httpx -from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - -logger = logging.getLogger(__name__) - - -def _run_atheris_in_subprocess( - target_path_str: str, - corpus_dir_str: str, - max_iterations: int, - timeout_seconds: int, - shared_crashes: Any, - exec_counter: multiprocessing.Value, - crash_counter: multiprocessing.Value, - coverage_counter: multiprocessing.Value -): - """ - Run atheris.Fuzz() in a separate process to isolate os._exit() calls. 
- - This function runs in a subprocess and loads the target module, - sets up atheris, and runs fuzzing. Stats are communicated via shared memory. - - Args: - target_path_str: String path to target file - corpus_dir_str: String path to corpus directory - max_iterations: Maximum fuzzing iterations - timeout_seconds: Timeout in seconds - shared_crashes: Manager().list() for storing crash details - exec_counter: Shared counter for executions - crash_counter: Shared counter for crashes - coverage_counter: Shared counter for coverage edges - """ - import atheris - import importlib.util - import traceback - from pathlib import Path - - target_path = Path(target_path_str) - total_executions = 0 - - # NOTE: Crash details are written directly to shared_crashes (Manager().list()) - # so they can be accessed by parent process after subprocess exits. - # We don't use a local crashes list because os._exit() prevents cleanup code. - - try: - # Load target module in subprocess - module_name = f"fuzz_target_{uuid.uuid4().hex[:8]}" - spec = importlib.util.spec_from_file_location(module_name, target_path) - if spec is None or spec.loader is None: - raise ImportError(f"Could not load module from {target_path}") - - module = importlib.util.module_from_spec(spec) - sys.modules[module_name] = module - spec.loader.exec_module(module) - - if not hasattr(module, "TestOneInput"): - raise AttributeError("Module does not have TestOneInput() function") - - test_one_input = module.TestOneInput - - # Wrapper to track executions and crashes - def fuzz_wrapper(data): - nonlocal total_executions - total_executions += 1 - - # Update shared counter for live stats - with exec_counter.get_lock(): - exec_counter.value += 1 - - try: - test_one_input(data) - except Exception as e: - # Capture crash details to shared memory - crash_info = { - "input": bytes(data), # Convert to bytes for serialization - "exception_type": type(e).__name__, - "exception_message": str(e), - "stack_trace": 
traceback.format_exc(), - "execution": total_executions - } - # Write to shared memory so parent process can access crash details - shared_crashes.append(crash_info) - - # Update shared crash counter - with crash_counter.get_lock(): - crash_counter.value += 1 - - # Re-raise so Atheris detects it - raise - - # Check for dictionary file in target directory - dict_args = [] - target_dir = target_path.parent - for dict_name in ["fuzz.dict", "fuzzing.dict", "dict.txt"]: - dict_path = target_dir / dict_name - if dict_path.exists(): - dict_args.append(f"-dict={dict_path}") - break - - # Configure Atheris - atheris_args = [ - "atheris_fuzzer", - f"-runs={max_iterations}", - f"-max_total_time={timeout_seconds}", - "-print_final_stats=1" - ] + dict_args + [corpus_dir_str] # Corpus directory as positional arg - - atheris.Setup(atheris_args, fuzz_wrapper) - - # Run fuzzing (this will call os._exit() when done) - atheris.Fuzz() - - except SystemExit: - # Atheris exits when done - this is normal - # Crash details already written to shared_crashes - pass - except Exception: - # Fatal error outside fuzz_wrapper (e.g. target import failure). - # Any crashes hit before this point were already written to shared_crashes. - pass - - - class AtherisFuzzer(BaseModule): - """ - Atheris fuzzing module - discovers and fuzzes Python code. - - This module can be used by any workflow to fuzz Python targets.
- """ - - def __init__(self): - super().__init__() - self.crashes = [] - self.total_executions = 0 - self.start_time = None - self.last_stats_time = 0 - self.run_id = None - - def get_metadata(self) -> ModuleMetadata: - """Return module metadata""" - return ModuleMetadata( - name="atheris_fuzzer", - version="1.0.0", - description="Python fuzzing using Atheris - discovers and fuzzes TestOneInput() functions", - author="FuzzForge Team", - category="fuzzer", - tags=["fuzzing", "atheris", "python", "coverage"], - input_schema={ - "type": "object", - "properties": { - "target_file": { - "type": "string", - "description": "Python file with TestOneInput() function (auto-discovered if not specified)" - }, - "max_iterations": { - "type": "integer", - "description": "Maximum fuzzing iterations", - "default": 100000 - }, - "timeout_seconds": { - "type": "integer", - "description": "Fuzzing timeout in seconds", - "default": 300 - }, - "stats_callback": { - "description": "Optional callback for real-time statistics" - } - } - }, - requires_workspace=True - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate fuzzing configuration""" - max_iterations = config.get("max_iterations", 100000) - if not isinstance(max_iterations, int) or max_iterations <= 0: - raise ValueError(f"max_iterations must be positive integer, got: {max_iterations}") - - timeout = config.get("timeout_seconds", 300) - if not isinstance(timeout, int) or timeout <= 0: - raise ValueError(f"timeout_seconds must be positive integer, got: {timeout}") - - return True - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute Atheris fuzzing on user code. 
- - Args: - config: Fuzzing configuration - workspace: Path to user's uploaded code - - Returns: - ModuleResult with crash findings - """ - self.start_timer() - self.start_time = time.time() - - # Validate configuration - self.validate_config(config) - self.validate_workspace(workspace) - - # Extract config - target_file = config.get("target_file") - max_iterations = config.get("max_iterations", 100000) - timeout_seconds = config.get("timeout_seconds", 300) - stats_callback = config.get("stats_callback") - self.run_id = config.get("run_id") - - logger.info( - f"Starting Atheris fuzzing (max_iterations={max_iterations}, " - f"timeout={timeout_seconds}s, target={target_file or 'auto-discover'})" - ) - - try: - # Step 1: Discover or load target - target_path = self._discover_target(workspace, target_file) - logger.info(f"Using fuzz target: {target_path}") - - # Step 2: Load target module - test_one_input = self._load_target_module(target_path) - logger.info(f"Loaded TestOneInput function from {target_path}") - - # Step 3: Run fuzzing - await self._run_fuzzing( - test_one_input=test_one_input, - target_path=target_path, - workspace=workspace, - max_iterations=max_iterations, - timeout_seconds=timeout_seconds, - stats_callback=stats_callback - ) - - # Step 4: Generate findings from crashes - findings = await self._generate_findings(target_path) - - logger.info( - f"Fuzzing completed: {self.total_executions} executions, " - f"{len(self.crashes)} crashes found" - ) - - # Generate SARIF report (always, even with no findings) - from modules.reporter import SARIFReporter - reporter = SARIFReporter() - reporter_config = { - "findings": findings, - "tool_name": "Atheris Fuzzer", - "tool_version": self._metadata.version - } - reporter_result = await reporter.execute(reporter_config, workspace) - sarif_report = reporter_result.sarif - - return ModuleResult( - module=self._metadata.name, - version=self._metadata.version, - status="success", - 
execution_time=self.get_execution_time(), - findings=findings, - summary={ - "total_executions": self.total_executions, - "crashes_found": len(self.crashes), - "execution_time": self.get_execution_time(), - "target_file": str(target_path.relative_to(workspace)) - }, - metadata={ - "max_iterations": max_iterations, - "timeout_seconds": timeout_seconds - }, - sarif=sarif_report - ) - - except Exception as e: - logger.error(f"Fuzzing failed: {e}", exc_info=True) - return self.create_result( - findings=[], - status="failed", - error=str(e) - ) - - def _discover_target(self, workspace: Path, target_file: Optional[str]) -> Path: - """ - Discover fuzz target in workspace. - - Args: - workspace: Path to workspace - target_file: Explicit target file or None for auto-discovery - - Returns: - Path to target file - """ - if target_file: - # Use specified target - target_path = workspace / target_file - if not target_path.exists(): - raise FileNotFoundError(f"Target file not found: {target_file}") - return target_path - - # Auto-discover: look for fuzz_*.py or *_fuzz.py - logger.info("Auto-discovering fuzz targets...") - - candidates = [] - # Use rglob for recursive search (searches all subdirectories) - for pattern in ["fuzz_*.py", "*_fuzz.py", "fuzz_target.py"]: - matches = list(workspace.rglob(pattern)) - candidates.extend(matches) - - if not candidates: - raise FileNotFoundError( - "No fuzz targets found. Expected files matching: fuzz_*.py, *_fuzz.py, or fuzz_target.py" - ) - - # Use first candidate - target = candidates[0] - if len(candidates) > 1: - logger.warning( - f"Multiple fuzz targets found: {[str(c) for c in candidates]}. " - f"Using: {target.name}" - ) - - return target - - def _load_target_module(self, target_path: Path) -> Callable: - """ - Load target module and get TestOneInput function. 
- - Args: - target_path: Path to Python file with TestOneInput - - Returns: - TestOneInput function - """ - # Add target directory to sys.path - target_dir = target_path.parent - if str(target_dir) not in sys.path: - sys.path.insert(0, str(target_dir)) - - # Load module dynamically - module_name = target_path.stem - spec = importlib.util.spec_from_file_location(module_name, target_path) - if spec is None or spec.loader is None: - raise ImportError(f"Cannot load module from {target_path}") - - module = importlib.util.module_from_spec(spec) - spec.loader.exec_module(module) - - # Get TestOneInput function - if not hasattr(module, "TestOneInput"): - raise AttributeError( - f"Module {module_name} does not have TestOneInput() function. " - "Atheris requires a TestOneInput(data: bytes) function." - ) - - return module.TestOneInput - - async def _run_fuzzing( - self, - test_one_input: Callable, - target_path: Path, - workspace: Path, - max_iterations: int, - timeout_seconds: int, - stats_callback: Optional[Callable] = None - ): - """ - Run Atheris fuzzing with real-time monitoring. 
- - Args: - test_one_input: TestOneInput function to fuzz (not used, loaded in subprocess) - target_path: Path to target file - workspace: Path to workspace directory - max_iterations: Max iterations - timeout_seconds: Timeout in seconds - stats_callback: Optional callback for stats - """ - self.crashes = [] - self.total_executions = 0 - - # Create corpus directory in workspace - corpus_dir = workspace / ".fuzzforge_corpus" - corpus_dir.mkdir(exist_ok=True) - logger.info(f"Using corpus directory: {corpus_dir}") - - logger.info(f"Starting Atheris fuzzer in subprocess (max_runs={max_iterations}, timeout={timeout_seconds}s)...") - - # Create shared memory for subprocess communication - ctx = multiprocessing.get_context('spawn') - manager = ctx.Manager() - shared_crashes = manager.list() # Shared list for crash details - exec_counter = ctx.Value('i', 0) # Shared execution counter - crash_counter = ctx.Value('i', 0) # Shared crash counter - coverage_counter = ctx.Value('i', 0) # Shared coverage counter - - # Start fuzzing in subprocess - process = ctx.Process( - target=_run_atheris_in_subprocess, - args=(str(target_path), str(corpus_dir), max_iterations, timeout_seconds, shared_crashes, exec_counter, crash_counter, coverage_counter) - ) - - # Run fuzzing in a separate task with monitoring - async def monitor_stats(): - """Monitor and report stats every 0.5 seconds""" - while True: - await asyncio.sleep(0.5) - - if stats_callback: - elapsed = time.time() - self.start_time - # Read from shared counters - current_execs = exec_counter.value - current_crashes = crash_counter.value - current_coverage = coverage_counter.value - execs_per_sec = current_execs / elapsed if elapsed > 0 else 0 - - # Count corpus files - try: - corpus_size = len(list(corpus_dir.iterdir())) if corpus_dir.exists() else 0 - except Exception: - corpus_size = 0 - - # TODO: Get real coverage from Atheris - # For now use corpus_size as proxy - coverage_value = current_coverage if current_coverage > 0 else 
corpus_size - - await stats_callback({ - "total_execs": current_execs, - "execs_per_sec": execs_per_sec, - "crashes": current_crashes, - "corpus_size": corpus_size, - "coverage": coverage_value, # Using corpus as coverage proxy - "elapsed_time": int(elapsed) - }) - - # Start monitoring task - monitor_task = None - if stats_callback: - monitor_task = asyncio.create_task(monitor_stats()) - - try: - # Start subprocess - process.start() - logger.info(f"Fuzzing subprocess started (PID: {process.pid})") - - # Wait for subprocess to complete - while process.is_alive(): - await asyncio.sleep(0.1) - - # NOTE: We cannot use result_queue because Atheris calls os._exit() - # which terminates immediately without putting results in the queue. - # Instead, we rely on shared memory (Manager().list() and Value counters). - - # Read final values from shared memory - self.total_executions = exec_counter.value - total_crashes = crash_counter.value - - # Read crash details from shared memory and convert to our format - self.crashes = [] - for crash_data in shared_crashes: - # Reconstruct crash info with exception object - crash_info = { - "input": crash_data["input"], - "exception": Exception(crash_data["exception_message"]), - "exception_type": crash_data["exception_type"], - "stack_trace": crash_data["stack_trace"], - "execution": crash_data["execution"] - } - self.crashes.append(crash_info) - - logger.warning( - f"Crash found (execution {crash_data['execution']}): " - f"{crash_data['exception_type']}: {crash_data['exception_message']}" - ) - - logger.info(f"Fuzzing completed: {self.total_executions} executions, {total_crashes} crashes found") - - # Send final stats update - if stats_callback: - elapsed = time.time() - self.start_time - execs_per_sec = self.total_executions / elapsed if elapsed > 0 else 0 - - # Count final corpus size - try: - final_corpus_size = len(list(corpus_dir.iterdir())) if corpus_dir.exists() else 0 - except Exception: - final_corpus_size = 0 - - # TODO: 
Parse coverage from Atheris output - # For now, use corpus size as proxy (corpus grows with coverage) - # libFuzzer writes coverage to stdout but sys.stdout redirection - # doesn't work because it writes to FD 1 directly from C++ - final_coverage = coverage_counter.value if coverage_counter.value > 0 else final_corpus_size - - await stats_callback({ - "total_execs": self.total_executions, - "execs_per_sec": execs_per_sec, - "crashes": total_crashes, - "corpus_size": final_corpus_size, - "coverage": final_coverage, - "elapsed_time": int(elapsed) - }) - - # Wait for process to fully terminate - process.join(timeout=5) - - if process.exitcode is not None and process.exitcode != 0: - logger.warning(f"Subprocess exited with code: {process.exitcode}") - - except Exception as e: - logger.error(f"Fuzzing execution error: {e}") - if process.is_alive(): - logger.warning("Terminating fuzzing subprocess...") - process.terminate() - process.join(timeout=5) - if process.is_alive(): - process.kill() - raise - finally: - # Stop monitoring - if monitor_task: - monitor_task.cancel() - try: - await monitor_task - except asyncio.CancelledError: - pass - - async def _generate_findings(self, target_path: Path) -> List[ModuleFinding]: - """ - Generate ModuleFinding objects from crashes. 
- - Args: - target_path: Path to target file - - Returns: - List of findings - """ - findings = [] - - for idx, crash in enumerate(self.crashes): - # Encode crash input for storage - crash_input_b64 = base64.b64encode(crash["input"]).decode() - - finding = self.create_finding( - title=f"Crash: {crash['exception_type']}", - description=( - f"Atheris found a crash during fuzzing:\n" - f"Exception: {crash['exception_type']}\n" - f"Message: {str(crash['exception'])}\n" - f"Execution: {crash['execution']}" - ), - severity="critical", - category="crash", - file_path=str(target_path), - metadata={ - "crash_input_base64": crash_input_b64, - "crash_input_hex": crash["input"].hex(), - "exception_type": crash["exception_type"], - "stack_trace": crash["stack_trace"], - "execution_number": crash["execution"] - }, - recommendation=( - "Review the crash stack trace and input to identify the vulnerability. " - "The crash input is provided in base64 and hex formats for reproduction." - ) - ) - findings.append(finding) - - # Report crash to backend for real-time monitoring - if self.run_id: - try: - crash_report = { - "run_id": self.run_id, - "crash_id": f"crash_{idx + 1}", - "timestamp": datetime.utcnow().isoformat(), - "crash_type": crash["exception_type"], - "stack_trace": crash["stack_trace"], - "input_file": crash_input_b64, - "severity": "critical", - "exploitability": "unknown" - } - - backend_url = os.getenv("BACKEND_URL", "http://backend:8000") - async with httpx.AsyncClient(timeout=5.0) as client: - await client.post( - f"{backend_url}/fuzzing/{self.run_id}/crash", - json=crash_report - ) - logger.debug(f"Crash report sent to backend: {crash_report['crash_id']}") - except Exception as e: - logger.debug(f"Failed to post crash report to backend: {e}") - - return findings diff --git a/backend/toolbox/modules/fuzzer/cargo_fuzzer.py b/backend/toolbox/modules/fuzzer/cargo_fuzzer.py deleted file mode 100644 index c4fc746..0000000 --- a/backend/toolbox/modules/fuzzer/cargo_fuzzer.py
+++ /dev/null @@ -1,455 +0,0 @@ -""" -Cargo Fuzzer Module - -Reusable module for fuzzing Rust code using cargo-fuzz (libFuzzer). -Discovers and fuzzes user-provided Rust targets with fuzz_target!() macros. -""" - -import asyncio -import logging -import os -import re -import time -from pathlib import Path -from typing import Dict, Any, List, Optional, Callable - -from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - -logger = logging.getLogger(__name__) - - -class CargoFuzzer(BaseModule): - """ - Cargo-fuzz (libFuzzer) fuzzer module for Rust code. - - Discovers fuzz targets in user's Rust project and runs cargo-fuzz - to find crashes, undefined behavior, and memory safety issues. - """ - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="cargo_fuzz", - version="0.11.2", - description="Fuzz Rust code using cargo-fuzz with libFuzzer backend", - author="FuzzForge Team", - category="fuzzer", - tags=["fuzzing", "rust", "cargo-fuzz", "libfuzzer", "memory-safety"], - input_schema={ - "type": "object", - "properties": { - "target_name": { - "type": "string", - "description": "Fuzz target name (auto-discovered if not specified)" - }, - "max_iterations": { - "type": "integer", - "default": 1000000, - "description": "Maximum fuzzing iterations" - }, - "timeout_seconds": { - "type": "integer", - "default": 1800, - "description": "Fuzzing timeout in seconds" - }, - "sanitizer": { - "type": "string", - "enum": ["address", "memory", "undefined"], - "default": "address", - "description": "Sanitizer to use (address, memory, undefined)" - } - } - }, - output_schema={ - "type": "object", - "properties": { - "findings": { - "type": "array", - "description": "Crashes and memory safety issues found" - }, - "summary": { - "type": "object", - "description": "Fuzzing execution summary" - } - } - } - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate configuration""" - 
max_iterations = config.get("max_iterations", 1000000) - if not isinstance(max_iterations, int) or max_iterations < 1: - raise ValueError("max_iterations must be a positive integer") - - timeout = config.get("timeout_seconds", 1800) - if not isinstance(timeout, int) or timeout < 1: - raise ValueError("timeout_seconds must be a positive integer") - - sanitizer = config.get("sanitizer", "address") - if sanitizer not in ["address", "memory", "undefined"]: - raise ValueError("sanitizer must be one of: address, memory, undefined") - - return True - - async def execute( - self, - config: Dict[str, Any], - workspace: Path, - stats_callback: Optional[Callable] = None - ) -> ModuleResult: - """ - Execute cargo-fuzz on user's Rust code. - - Args: - config: Fuzzer configuration - workspace: Path to workspace directory containing Rust project - stats_callback: Optional callback for real-time stats updates - - Returns: - ModuleResult containing findings and summary - """ - self.start_timer() - - try: - # Validate inputs - self.validate_config(config) - self.validate_workspace(workspace) - - logger.info(f"Running cargo-fuzz on {workspace}") - - # Step 1: Discover fuzz targets - targets = await self._discover_fuzz_targets(workspace) - if not targets: - return self.create_result( - findings=[], - status="failed", - error="No fuzz targets found. Expected fuzz targets in fuzz/fuzz_targets/" - ) - - # Get target name from config or use first discovered target - target_name = config.get("target_name") - if not target_name: - target_name = targets[0] - logger.info(f"No target specified, using first discovered target: {target_name}") - elif target_name not in targets: - return self.create_result( - findings=[], - status="failed", - error=f"Target '{target_name}' not found. 
Available targets: {', '.join(targets)}" - ) - - # Step 2: Build fuzz target - logger.info(f"Building fuzz target: {target_name}") - build_success = await self._build_fuzz_target(workspace, target_name, config) - if not build_success: - return self.create_result( - findings=[], - status="failed", - error=f"Failed to build fuzz target: {target_name}" - ) - - # Step 3: Run fuzzing - logger.info(f"Starting fuzzing: {target_name}") - findings, stats = await self._run_fuzzing( - workspace, - target_name, - config, - stats_callback - ) - - # Step 4: Parse crash artifacts - crash_findings = await self._parse_crash_artifacts(workspace, target_name) - findings.extend(crash_findings) - - logger.info(f"Fuzzing completed: {len(findings)} crashes found") - - return self.create_result( - findings=findings, - status="success", - summary=stats - ) - - except Exception as e: - logger.error(f"Cargo fuzzer failed: {e}") - return self.create_result( - findings=[], - status="failed", - error=str(e) - ) - - async def _discover_fuzz_targets(self, workspace: Path) -> List[str]: - """ - Discover fuzz targets in the project. - - Looks for fuzz targets in fuzz/fuzz_targets/ directory. 
- """ - fuzz_targets_dir = workspace / "fuzz" / "fuzz_targets" - if not fuzz_targets_dir.exists(): - logger.warning(f"No fuzz targets directory found: {fuzz_targets_dir}") - return [] - - targets = [] - for file in fuzz_targets_dir.glob("*.rs"): - target_name = file.stem - targets.append(target_name) - logger.info(f"Discovered fuzz target: {target_name}") - - return targets - - async def _build_fuzz_target( - self, - workspace: Path, - target_name: str, - config: Dict[str, Any] - ) -> bool: - """Build the fuzz target with instrumentation""" - try: - sanitizer = config.get("sanitizer", "address") - - # Build command - cmd = [ - "cargo", "fuzz", "build", - target_name, - f"--sanitizer={sanitizer}" - ] - - logger.debug(f"Build command: {' '.join(cmd)}") - - proc = await asyncio.create_subprocess_exec( - *cmd, - cwd=workspace, - stdout=asyncio.subprocess.PIPE, - stderr=asyncio.subprocess.PIPE - ) - - stdout, stderr = await proc.communicate() - - if proc.returncode != 0: - logger.error(f"Build failed: {stderr.decode()}") - return False - - logger.info("Build successful") - return True - - except Exception as e: - logger.error(f"Build error: {e}") - return False - - async def _run_fuzzing( - self, - workspace: Path, - target_name: str, - config: Dict[str, Any], - stats_callback: Optional[Callable] - ) -> tuple[List[ModuleFinding], Dict[str, Any]]: - """ - Run cargo-fuzz and collect statistics. 
- - Returns: - Tuple of (findings, stats_dict) - """ - max_iterations = config.get("max_iterations", 1000000) - timeout_seconds = config.get("timeout_seconds", 1800) - sanitizer = config.get("sanitizer", "address") - - findings = [] - stats = { - "total_executions": 0, - "crashes_found": 0, - "corpus_size": 0, - "coverage": 0.0, - "execution_time": 0.0 - } - - try: - # Cargo fuzz run command - cmd = [ - "cargo", "fuzz", "run", - target_name, - f"--sanitizer={sanitizer}", - "--", - f"-runs={max_iterations}", - f"-max_total_time={timeout_seconds}" - ] - - logger.debug(f"Fuzz command: {' '.join(cmd)}") - - start_time = time.time() - proc = await asyncio.create_subprocess_exec( - *cmd, - cwd=workspace, - stdout=asyncio.subprocess.PIPE, - stderr=asyncio.subprocess.STDOUT - ) - - # Monitor output and extract stats - last_stats_time = time.time() - async for line in proc.stdout: - line_str = line.decode('utf-8', errors='ignore').strip() - - # Parse libFuzzer stats - # Example: "#12345 NEW cov: 123 ft: 456 corp: 10/234b" - stats_match = re.match(r'#(\d+)\s+.*cov:\s*(\d+).*corp:\s*(\d+)', line_str) - if stats_match: - execs = int(stats_match.group(1)) - cov = int(stats_match.group(2)) - corp = int(stats_match.group(3)) - - stats["total_executions"] = execs - stats["coverage"] = float(cov) - stats["corpus_size"] = corp - stats["execution_time"] = time.time() - start_time - - # Invoke stats callback for real-time monitoring - if stats_callback and time.time() - last_stats_time >= 0.5: - await stats_callback({ - "total_execs": execs, - "execs_per_sec": execs / stats["execution_time"] if stats["execution_time"] > 0 else 0, - "crashes": stats["crashes_found"], - "coverage": cov, - "corpus_size": corp, - "elapsed_time": int(stats["execution_time"]) - }) - last_stats_time = time.time() - - # Detect crash line - if "SUMMARY:" in line_str or "ERROR:" in line_str: - logger.info(f"Detected crash: {line_str}") - stats["crashes_found"] += 1 - - await proc.wait() - 
stats["execution_time"] = time.time() - start_time - - # Send final stats update - if stats_callback: - await stats_callback({ - "total_execs": stats["total_executions"], - "execs_per_sec": stats["total_executions"] / stats["execution_time"] if stats["execution_time"] > 0 else 0, - "crashes": stats["crashes_found"], - "coverage": stats["coverage"], - "corpus_size": stats["corpus_size"], - "elapsed_time": int(stats["execution_time"]) - }) - - logger.info( - f"Fuzzing completed: {stats['total_executions']} execs, " - f"{stats['crashes_found']} crashes" - ) - - except Exception as e: - logger.error(f"Fuzzing error: {e}") - - return findings, stats - - async def _parse_crash_artifacts( - self, - workspace: Path, - target_name: str - ) -> List[ModuleFinding]: - """ - Parse crash artifacts from fuzz/artifacts directory. - - Cargo-fuzz stores crashes in: fuzz/artifacts/{target_name}/ - """ - findings = [] - artifacts_dir = workspace / "fuzz" / "artifacts" / target_name - - if not artifacts_dir.exists(): - logger.info("No crash artifacts found") - return findings - - # Find all crash files - for crash_file in artifacts_dir.glob("crash-*"): - try: - finding = await self._analyze_crash(workspace, target_name, crash_file) - if finding: - findings.append(finding) - except Exception as e: - logger.warning(f"Failed to analyze crash {crash_file}: {e}") - - logger.info(f"Parsed {len(findings)} crash artifacts") - return findings - - async def _analyze_crash( - self, - workspace: Path, - target_name: str, - crash_file: Path - ) -> Optional[ModuleFinding]: - """ - Analyze a single crash file. - - Runs cargo-fuzz with the crash input to reproduce and get stack trace.
- """ - try: - # Read crash input - crash_input = crash_file.read_bytes() - - # Reproduce crash to get stack trace - cmd = [ - "cargo", "fuzz", "run", - target_name, - str(crash_file) - ] - - proc = await asyncio.create_subprocess_exec( - *cmd, - cwd=workspace, - stdout=asyncio.subprocess.PIPE, - stderr=asyncio.subprocess.STDOUT, - env={**os.environ, "RUST_BACKTRACE": "1"} - ) - - stdout, _ = await proc.communicate() - output = stdout.decode('utf-8', errors='ignore') - - # Parse stack trace and error type - error_type = "Unknown Crash" - stack_trace = output - - # Extract error type - if "SEGV" in output: - error_type = "Segmentation Fault" - severity = "critical" - elif "heap-use-after-free" in output: - error_type = "Use After Free" - severity = "critical" - elif "heap-buffer-overflow" in output: - error_type = "Heap Buffer Overflow" - severity = "critical" - elif "stack-buffer-overflow" in output: - error_type = "Stack Buffer Overflow" - severity = "high" - elif "panic" in output.lower(): - error_type = "Panic" - severity = "medium" - else: - severity = "high" - - # Create finding - finding = self.create_finding( - title=f"Crash: {error_type} in {target_name}", - description=f"Cargo-fuzz discovered a crash in target '{target_name}'. " - f"Error type: {error_type}. " - f"Input size: {len(crash_input)} bytes.", - severity=severity, - category="crash", - file_path=f"fuzz/fuzz_targets/{target_name}.rs", - code_snippet=stack_trace[:500], - recommendation="Review the crash details and fix the underlying bug. " - "Use AddressSanitizer to identify memory safety issues. 
" - "Consider adding bounds checks or using safer APIs.", - metadata={ - "error_type": error_type, - "crash_file": crash_file.name, - "input_size": len(crash_input), - "reproducer": crash_file.name, - "stack_trace": stack_trace - } - ) - - return finding - - except Exception as e: - logger.warning(f"Failed to analyze crash {crash_file}: {e}") - return None diff --git a/backend/toolbox/modules/reporter/__init__.py b/backend/toolbox/modules/reporter/__init__.py deleted file mode 100644 index 7812ff1..0000000 --- a/backend/toolbox/modules/reporter/__init__.py +++ /dev/null @@ -1,14 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -from .sarif_reporter import SARIFReporter - -__all__ = ["SARIFReporter"] \ No newline at end of file diff --git a/backend/toolbox/modules/reporter/sarif_reporter.py b/backend/toolbox/modules/reporter/sarif_reporter.py deleted file mode 100644 index 283c8dc..0000000 --- a/backend/toolbox/modules/reporter/sarif_reporter.py +++ /dev/null @@ -1,400 +0,0 @@ -""" -SARIF Reporter Module - Generates SARIF-formatted security reports -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
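Reviewer note: the SARIF reporter whose source follows assembles a SARIF 2.1.0 document in `_generate_sarif`. A hand-written sketch of the minimal envelope it produces; every value here is a placeholder, not real scan output:

```python
import json

# Skeleton of the SARIF 2.1.0 structure built by _generate_sarif (illustrative values).
sarif = {
    "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json",
    "version": "2.1.0",
    "runs": [{
        "tool": {"driver": {
            "name": "FuzzForge Security Assessment",
            "version": "1.0.0",
            "rules": [],  # populated from unique (category, severity) pairs
        }},
        "results": [{
            "ruleId": "sensitive_file_medium",
            "level": "warning",  # severity "medium" maps to SARIF "warning"
            "message": {"text": "Found potentially sensitive file at .env"},
            "locations": [{"physicalLocation": {
                "artifactLocation": {"uri": ".env", "uriBaseId": "WORKSPACE"},
            }}],
        }],
    }],
}

print(json.dumps(sarif, indent=2)[:80])
```

The `uriBaseId` of `WORKSPACE` only resolves when the run also carries `originalUriBaseIds`, which the reporter adds when a workspace path is supplied.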
- -import logging -from pathlib import Path -from typing import Dict, Any, List -from datetime import datetime - -try: - from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding -except ImportError: - try: - from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - except ImportError: - from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - -logger = logging.getLogger(__name__) - - -class SARIFReporter(BaseModule): - """ - Generates SARIF (Static Analysis Results Interchange Format) reports. - - This module: - - Converts findings to SARIF format - - Aggregates results from multiple modules - - Adds metadata and context - - Provides actionable recommendations - """ - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="sarif_reporter", - version="1.0.0", - description="Generates SARIF-formatted security reports", - author="FuzzForge Team", - category="reporter", - tags=["reporting", "sarif", "output"], - input_schema={ - "findings": { - "type": "array", - "description": "List of findings to report", - "required": True - }, - "tool_name": { - "type": "string", - "description": "Name of the tool", - "default": "FuzzForge Security Assessment" - }, - "tool_version": { - "type": "string", - "description": "Tool version", - "default": "1.0.0" - }, - "include_code_flows": { - "type": "boolean", - "description": "Include code flow information", - "default": False - } - }, - output_schema={ - "sarif": { - "type": "object", - "description": "SARIF 2.1.0 formatted report" - } - }, - requires_workspace=False # Reporter doesn't need direct workspace access - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - if "findings" not in config and "modules_results" not in config: - raise ValueError("Either 'findings' or 'modules_results' must be provided") - return True - - async def 
execute(self, config: Dict[str, Any], workspace: Path = None) -> ModuleResult: - """ - Execute the SARIF reporter module. - - Args: - config: Module configuration with findings - workspace: Optional workspace path for context - - Returns: - ModuleResult with SARIF report - """ - self.start_timer() - self.validate_config(config) - - # Get configuration - tool_name = config.get("tool_name", "FuzzForge Security Assessment") - tool_version = config.get("tool_version", "1.0.0") - include_code_flows = config.get("include_code_flows", False) - - # Collect findings from either direct findings or module results - all_findings = [] - - if "findings" in config: - # Direct findings provided - all_findings = config["findings"] - if isinstance(all_findings, list) and all(isinstance(f, dict) for f in all_findings): - # Convert dict findings to ModuleFinding objects - all_findings = [ModuleFinding(**f) if isinstance(f, dict) else f for f in all_findings] - elif "modules_results" in config: - # Aggregate from module results - for module_result in config["modules_results"]: - if isinstance(module_result, dict): - findings = module_result.get("findings", []) - all_findings.extend(findings) - elif hasattr(module_result, "findings"): - all_findings.extend(module_result.findings) - - logger.info(f"Generating SARIF report for {len(all_findings)} findings") - - try: - # Generate SARIF report - sarif_report = self._generate_sarif( - findings=all_findings, - tool_name=tool_name, - tool_version=tool_version, - include_code_flows=include_code_flows, - workspace_path=str(workspace) if workspace else None - ) - - # Create summary - summary = self._generate_report_summary(all_findings) - - return ModuleResult( - module=self.get_metadata().name, - version=self.get_metadata().version, - status="success", - execution_time=self.get_execution_time(), - findings=[], # Reporter doesn't generate new findings - summary=summary, - metadata={ - "tool_name": tool_name, - "tool_version": tool_version, - 
"report_format": "SARIF 2.1.0", - "total_findings": len(all_findings) - }, - error=None, - sarif=sarif_report # Add SARIF as custom field - ) - - except Exception as e: - logger.error(f"SARIF reporter failed: {e}") - return self.create_result( - findings=[], - status="failed", - error=str(e) - ) - - def _generate_sarif( - self, - findings: List[ModuleFinding], - tool_name: str, - tool_version: str, - include_code_flows: bool, - workspace_path: str = None - ) -> Dict[str, Any]: - """ - Generate SARIF 2.1.0 formatted report. - - Args: - findings: List of findings to report - tool_name: Name of the tool - tool_version: Tool version - include_code_flows: Whether to include code flow information - workspace_path: Optional workspace path - - Returns: - SARIF formatted dictionary - """ - # Create rules from unique finding types - rules = self._create_rules(findings) - - # Create results from findings - results = self._create_results(findings, include_code_flows) - - # Build SARIF structure - sarif = { - "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json", - "version": "2.1.0", - "runs": [ - { - "tool": { - "driver": { - "name": tool_name, - "version": tool_version, - "informationUri": "https://fuzzforge.ai", - "rules": rules - } - }, - "results": results, - "invocations": [ - { - "executionSuccessful": True, - "endTimeUtc": datetime.utcnow().isoformat() + "Z" - } - ] - } - ] - } - - # Add workspace information if available - if workspace_path: - sarif["runs"][0]["originalUriBaseIds"] = { - "WORKSPACE": { - "uri": f"file://{workspace_path}/", - "description": "The workspace root directory" - } - } - - return sarif - - def _create_rules(self, findings: List[ModuleFinding]) -> List[Dict[str, Any]]: - """ - Create SARIF rules from findings. 
- - Args: - findings: List of findings - - Returns: - List of SARIF rule objects - """ - rules_dict = {} - - for finding in findings: - rule_id = f"{finding.category}_{finding.severity}" - - if rule_id not in rules_dict: - rules_dict[rule_id] = { - "id": rule_id, - "name": finding.category.replace("_", " ").title(), - "shortDescription": { - "text": f"{finding.category} vulnerability" - }, - "fullDescription": { - "text": f"Detection rule for {finding.category} vulnerabilities with {finding.severity} severity" - }, - "defaultConfiguration": { - "level": self._severity_to_sarif_level(finding.severity) - }, - "properties": { - "category": finding.category, - "severity": finding.severity, - "tags": ["security", finding.category, finding.severity] - } - } - - return list(rules_dict.values()) - - def _create_results( - self, findings: List[ModuleFinding], include_code_flows: bool - ) -> List[Dict[str, Any]]: - """ - Create SARIF results from findings. - - Args: - findings: List of findings - include_code_flows: Whether to include code flows - - Returns: - List of SARIF result objects - """ - results = [] - - for finding in findings: - result = { - "ruleId": f"{finding.category}_{finding.severity}", - "level": self._severity_to_sarif_level(finding.severity), - "message": { - "text": finding.description - }, - "locations": [] - } - - # Add location information if available - if finding.file_path: - location = { - "physicalLocation": { - "artifactLocation": { - "uri": finding.file_path, - "uriBaseId": "WORKSPACE" - } - } - } - - # Add line information if available - if finding.line_start: - location["physicalLocation"]["region"] = { - "startLine": finding.line_start - } - if finding.line_end: - location["physicalLocation"]["region"]["endLine"] = finding.line_end - - # Add code snippet if available - if finding.code_snippet: - location["physicalLocation"]["region"]["snippet"] = { - "text": finding.code_snippet - } - - result["locations"].append(location) - - # Add fix 
suggestions if available - if finding.recommendation: - result["fixes"] = [ - { - "description": { - "text": finding.recommendation - } - } - ] - - # Add properties - result["properties"] = { - "findingId": finding.id, - "title": finding.title, - "metadata": finding.metadata - } - - results.append(result) - - return results - - def _severity_to_sarif_level(self, severity: str) -> str: - """ - Convert severity to SARIF level. - - Args: - severity: Finding severity - - Returns: - SARIF level string - """ - mapping = { - "critical": "error", - "high": "error", - "medium": "warning", - "low": "note", - "info": "none" - } - return mapping.get(severity.lower(), "warning") - - def _generate_report_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: - """ - Generate summary statistics for the report. - - Args: - findings: List of findings - - Returns: - Summary dictionary - """ - severity_counts = { - "critical": 0, - "high": 0, - "medium": 0, - "low": 0, - "info": 0 - } - - category_counts = {} - affected_files = set() - - for finding in findings: - # Count by severity - if finding.severity in severity_counts: - severity_counts[finding.severity] += 1 - - # Count by category - if finding.category not in category_counts: - category_counts[finding.category] = 0 - category_counts[finding.category] += 1 - - # Track affected files - if finding.file_path: - affected_files.add(finding.file_path) - - return { - "total_findings": len(findings), - "severity_distribution": severity_counts, - "category_distribution": category_counts, - "affected_files": len(affected_files), - "report_format": "SARIF 2.1.0", - "generated_at": datetime.utcnow().isoformat() - } diff --git a/backend/toolbox/modules/scanner/__init__.py b/backend/toolbox/modules/scanner/__init__.py deleted file mode 100644 index 3efefe6..0000000 --- a/backend/toolbox/modules/scanner/__init__.py +++ /dev/null @@ -1,15 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 
(BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -from .file_scanner import FileScanner -from .dependency_scanner import DependencyScanner - -__all__ = ["FileScanner", "DependencyScanner"] \ No newline at end of file diff --git a/backend/toolbox/modules/scanner/dependency_scanner.py b/backend/toolbox/modules/scanner/dependency_scanner.py deleted file mode 100644 index 4c7791c..0000000 --- a/backend/toolbox/modules/scanner/dependency_scanner.py +++ /dev/null @@ -1,302 +0,0 @@ -""" -Dependency Scanner Module - Scans Python dependencies for known vulnerabilities using pip-audit -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
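Reviewer note: the dependency scanner whose source follows consumes `pip-audit --format json`. An abridged, hand-written sketch of that output shape, limited to the keys the module actually reads; the package name and advisory IDs are illustrative placeholders:

```python
import json

# Illustrative pip-audit JSON payload: one package with one known vulnerability.
audit_json = '''
{
  "dependencies": [
    {
      "name": "example-lib",
      "version": "1.0.0",
      "vulns": [
        {
          "id": "GHSA-xxxx-xxxx-xxxx",
          "description": "Example vulnerability description.",
          "fix_versions": ["1.0.1"],
          "aliases": ["CVE-2024-00000"]
        }
      ]
    }
  ]
}
'''

result = json.loads(audit_json)
for dep in result["dependencies"]:
    for vuln in dep["vulns"]:
        # Mirrors the traversal in _convert_to_findings.
        print(f'{dep["name"]} {dep["version"]}: {vuln["id"]} -> fix {vuln["fix_versions"]}')
```

Packages with an empty `vulns` list still appear in `dependencies`, which is why the scanner iterates per-vulnerability rather than per-package.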
- -import asyncio -import json -import logging -import time -from pathlib import Path -from typing import Dict, Any, List - -try: - from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding -except ImportError: - try: - from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - except ImportError: - from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding - -logger = logging.getLogger(__name__) - - -class DependencyScanner(BaseModule): - """ - Scans Python dependencies for known vulnerabilities using pip-audit. - - This module: - - Discovers dependency files (requirements.txt, pyproject.toml, setup.py, Pipfile) - - Runs pip-audit to check for vulnerable dependencies - - Reports CVEs with severity and affected versions - """ - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="dependency_scanner", - version="1.0.0", - description="Scans Python dependencies for known vulnerabilities", - author="FuzzForge Team", - category="scanner", - tags=["dependencies", "cve", "vulnerabilities", "pip-audit"], - input_schema={ - "dependency_files": { - "type": "array", - "items": {"type": "string"}, - "description": "List of dependency files to scan (auto-discovered if empty)", - "default": [] - }, - "ignore_vulns": { - "type": "array", - "items": {"type": "string"}, - "description": "List of vulnerability IDs to ignore", - "default": [] - } - }, - output_schema={ - "findings": { - "type": "array", - "description": "List of vulnerable dependencies with CVE information" - } - }, - requires_workspace=True - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - dep_files = config.get("dependency_files", []) - if not isinstance(dep_files, list): - raise ValueError("dependency_files must be a list") - - ignore_vulns = config.get("ignore_vulns", []) - if not isinstance(ignore_vulns, list): - 
raise ValueError("ignore_vulns must be a list") - - return True - - def _discover_dependency_files(self, workspace: Path) -> List[Path]: - """ - Discover Python dependency files in workspace. - - Returns: - List of discovered dependency file paths - """ - dependency_patterns = [ - "requirements.txt", - "*requirements*.txt", - "pyproject.toml", - "setup.py", - "Pipfile", - "poetry.lock" - ] - - found_files = [] - for pattern in dependency_patterns: - found_files.extend(workspace.rglob(pattern)) - - # Deduplicate and return - unique_files = list(set(found_files)) - logger.info(f"Discovered {len(unique_files)} dependency files") - return unique_files - - async def _run_pip_audit(self, file_path: Path) -> Dict[str, Any]: - """ - Run pip-audit on a specific dependency file. - - Args: - file_path: Path to dependency file - - Returns: - pip-audit JSON output as dict - """ - try: - # Run pip-audit with JSON output - cmd = [ - "pip-audit", - "--requirement", str(file_path), - "--format", "json", - "--progress-spinner", "off" - ] - - logger.info(f"Running pip-audit on: {file_path.name}") - process = await asyncio.create_subprocess_exec( - *cmd, - stdout=asyncio.subprocess.PIPE, - stderr=asyncio.subprocess.PIPE - ) - - stdout, stderr = await process.communicate() - - # pip-audit returns 0 if no vulns, 1 if vulns found - if process.returncode not in [0, 1]: - logger.error(f"pip-audit failed: {stderr.decode()}") - return {"dependencies": []} - - # Parse JSON output - result = json.loads(stdout.decode()) - return result - - except Exception as e: - logger.error(f"Error running pip-audit on {file_path}: {e}") - return {"dependencies": []} - - def _convert_to_findings( - self, - audit_result: Dict[str, Any], - file_path: Path, - workspace: Path, - ignore_vulns: List[str] - ) -> List[ModuleFinding]: - """ - Convert pip-audit results to ModuleFindings. 
- - Args: - audit_result: pip-audit JSON output - file_path: Path to scanned file - workspace: Workspace path for relative path calculation - ignore_vulns: List of vulnerability IDs to ignore - - Returns: - List of ModuleFindings - """ - findings = [] - - # pip-audit format: {"dependencies": [{package, version, vulns: []}]} - for dep in audit_result.get("dependencies", []): - package_name = dep.get("name", "unknown") - package_version = dep.get("version", "unknown") - vulnerabilities = dep.get("vulns", []) - - for vuln in vulnerabilities: - vuln_id = vuln.get("id", "UNKNOWN") - - # Skip if in ignore list - if vuln_id in ignore_vulns: - logger.debug(f"Ignoring vulnerability: {vuln_id}") - continue - - description = vuln.get("description", "No description available") - fix_versions = vuln.get("fix_versions", []) - - # Map CVSS scores to severity - # pip-audit doesn't always provide CVSS, so we default to medium - severity = "medium" - - # Try to get relative path - try: - rel_path = file_path.relative_to(workspace) - except ValueError: - rel_path = file_path - - recommendation = f"Upgrade {package_name} to a fixed version: {', '.join(fix_versions)}" if fix_versions else f"Check for updates to {package_name}" - - finding = self.create_finding( - title=f"Vulnerable dependency: {package_name} ({vuln_id})", - description=f"{description}\n\nAffected package: {package_name} {package_version}", - severity=severity, - category="vulnerable-dependency", - file_path=str(rel_path), - recommendation=recommendation, - metadata={ - "cve_id": vuln_id, - "package": package_name, - "installed_version": package_version, - "fix_versions": fix_versions, - "aliases": vuln.get("aliases", []), - "link": vuln.get("link", "") - } - ) - findings.append(finding) - - return findings - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute the dependency scanning module. 
- - Args: - config: Module configuration - workspace: Path to workspace - - Returns: - ModuleResult with vulnerability findings - """ - start_time = time.time() - metadata = self.get_metadata() - - # Validate inputs - self.validate_config(config) - self.validate_workspace(workspace) - - # Get configuration - specified_files = config.get("dependency_files", []) - ignore_vulns = config.get("ignore_vulns", []) - - # Discover or use specified dependency files - if specified_files: - dep_files = [workspace / f for f in specified_files] - else: - dep_files = self._discover_dependency_files(workspace) - - if not dep_files: - logger.warning("No dependency files found in workspace") - return ModuleResult( - module=metadata.name, - version=metadata.version, - status="success", - execution_time=time.time() - start_time, - findings=[], - summary={ - "total_files": 0, - "total_vulnerabilities": 0, - "vulnerable_packages": 0 - } - ) - - # Scan each dependency file - all_findings = [] - files_scanned = 0 - - for dep_file in dep_files: - if not dep_file.exists(): - logger.warning(f"Dependency file not found: {dep_file}") - continue - - logger.info(f"Scanning dependencies in: {dep_file.name}") - audit_result = await self._run_pip_audit(dep_file) - findings = self._convert_to_findings(audit_result, dep_file, workspace, ignore_vulns) - - all_findings.extend(findings) - files_scanned += 1 - - # Calculate summary - unique_packages = len(set(f.metadata.get("package") for f in all_findings)) - - execution_time = time.time() - start_time - - return ModuleResult( - module=metadata.name, - version=metadata.version, - status="success", - execution_time=execution_time, - findings=all_findings, - summary={ - "total_files": files_scanned, - "total_vulnerabilities": len(all_findings), - "vulnerable_packages": unique_packages - }, - metadata={ - "scanned_files": [str(f.name) for f in dep_files if f.exists()] - } - ) diff --git a/backend/toolbox/modules/scanner/file_scanner.py 
b/backend/toolbox/modules/scanner/file_scanner.py deleted file mode 100644 index 22de200..0000000 --- a/backend/toolbox/modules/scanner/file_scanner.py +++ /dev/null @@ -1,315 +0,0 @@ -""" -File Scanner Module - Scans and enumerates files in the workspace -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import logging -import mimetypes -from pathlib import Path -from typing import Dict, Any -import hashlib - -try: - from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult -except ImportError: - try: - from modules.base import BaseModule, ModuleMetadata, ModuleResult - except ImportError: - from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult - -logger = logging.getLogger(__name__) - - -class FileScanner(BaseModule): - """ - Scans files in the mounted workspace and collects information. 
- - This module: - - Enumerates files based on patterns - - Detects file types - - Calculates file hashes - - Identifies potentially sensitive files - """ - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="file_scanner", - version="1.0.0", - description="Scans and enumerates files in the workspace", - author="FuzzForge Team", - category="scanner", - tags=["files", "enumeration", "discovery"], - input_schema={ - "patterns": { - "type": "array", - "items": {"type": "string"}, - "description": "File patterns to scan (e.g., ['*.py', '*.js'])", - "default": ["*"] - }, - "max_file_size": { - "type": "integer", - "description": "Maximum file size to scan in bytes", - "default": 10485760 # 10MB - }, - "check_sensitive": { - "type": "boolean", - "description": "Check for sensitive file patterns", - "default": True - }, - "calculate_hashes": { - "type": "boolean", - "description": "Calculate SHA256 hashes for files", - "default": False - } - }, - output_schema={ - "findings": { - "type": "array", - "description": "List of discovered files with metadata" - } - }, - requires_workspace=True - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - patterns = config.get("patterns", ["*"]) - if not isinstance(patterns, list): - raise ValueError("patterns must be a list") - - max_size = config.get("max_file_size", 10485760) - if not isinstance(max_size, int) or max_size <= 0: - raise ValueError("max_file_size must be a positive integer") - - return True - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute the file scanning module. 
- - Args: - config: Module configuration - workspace: Path to the workspace directory - - Returns: - ModuleResult with file findings - """ - self.start_timer() - self.validate_workspace(workspace) - self.validate_config(config) - - findings = [] - file_count = 0 - total_size = 0 - file_types = {} - - # Get configuration - patterns = config.get("patterns", ["*"]) - max_file_size = config.get("max_file_size", 10485760) - check_sensitive = config.get("check_sensitive", True) - calculate_hashes = config.get("calculate_hashes", False) - - logger.info(f"Scanning workspace with patterns: {patterns}") - - try: - # Scan for each pattern - for pattern in patterns: - for file_path in workspace.rglob(pattern): - if not file_path.is_file(): - continue - - file_count += 1 - relative_path = file_path.relative_to(workspace) - - # Get file stats - try: - stats = file_path.stat() - file_size = stats.st_size - total_size += file_size - - # Skip large files - if file_size > max_file_size: - logger.warning(f"Skipping large file: {relative_path} ({file_size} bytes)") - continue - - # Detect file type - file_type = self._detect_file_type(file_path) - if file_type not in file_types: - file_types[file_type] = 0 - file_types[file_type] += 1 - - # Check for sensitive files - if check_sensitive and self._is_sensitive_file(file_path): - findings.append(self.create_finding( - title=f"Potentially sensitive file: {relative_path.name}", - description=f"Found potentially sensitive file at {relative_path}", - severity="medium", - category="sensitive_file", - file_path=str(relative_path), - metadata={ - "file_size": file_size, - "file_type": file_type - } - )) - - # Calculate hash if requested - file_hash = None - if calculate_hashes and file_size < 1048576: # Only hash files < 1MB - file_hash = self._calculate_hash(file_path) - - # Create informational finding for each file - findings.append(self.create_finding( - title=f"File discovered: {relative_path.name}", - description=f"File: 
{relative_path}", - severity="info", - category="file_enumeration", - file_path=str(relative_path), - metadata={ - "file_size": file_size, - "file_type": file_type, - "file_hash": file_hash - } - )) - - except Exception as e: - logger.error(f"Error processing file {relative_path}: {e}") - - # Create summary - summary = { - "total_files": file_count, - "total_size_bytes": total_size, - "file_types": file_types, - "patterns_scanned": patterns - } - - return self.create_result( - findings=findings, - status="success", - summary=summary, - metadata={ - "workspace": str(workspace), - "config": config - } - ) - - except Exception as e: - logger.error(f"File scanner failed: {e}") - return self.create_result( - findings=findings, - status="failed", - error=str(e) - ) - - def _detect_file_type(self, file_path: Path) -> str: - """ - Detect the type of a file. - - Args: - file_path: Path to the file - - Returns: - File type string - """ - # Try to determine from extension - mime_type, _ = mimetypes.guess_type(str(file_path)) - if mime_type: - return mime_type - - # Check by extension - ext = file_path.suffix.lower() - type_map = { - '.py': 'text/x-python', - '.js': 'application/javascript', - '.java': 'text/x-java', - '.cpp': 'text/x-c++', - '.c': 'text/x-c', - '.go': 'text/x-go', - '.rs': 'text/x-rust', - '.rb': 'text/x-ruby', - '.php': 'text/x-php', - '.yaml': 'text/yaml', - '.yml': 'text/yaml', - '.json': 'application/json', - '.xml': 'text/xml', - '.md': 'text/markdown', - '.txt': 'text/plain', - '.sh': 'text/x-shellscript', - '.bat': 'text/x-batch', - '.ps1': 'text/x-powershell' - } - - return type_map.get(ext, 'application/octet-stream') - - def _is_sensitive_file(self, file_path: Path) -> bool: - """ - Check if a file might contain sensitive information. 
- - Args: - file_path: Path to the file - - Returns: - True if potentially sensitive - """ - sensitive_patterns = [ - '.env', - '.env.local', - '.env.production', - 'credentials', - 'password', - 'secret', - 'private_key', - 'id_rsa', - 'id_dsa', - '.pem', - '.key', - '.pfx', - '.p12', - 'wallet', - '.ssh', - 'token', - 'api_key', - 'config.json', - 'settings.json', - '.git-credentials', - '.npmrc', - '.pypirc', - '.docker/config.json' - ] - - file_name_lower = file_path.name.lower() - for pattern in sensitive_patterns: - if pattern in file_name_lower: - return True - - return False - - def _calculate_hash(self, file_path: Path) -> str: - """ - Calculate SHA256 hash of a file. - - Args: - file_path: Path to the file - - Returns: - Hex string of SHA256 hash - """ - try: - sha256_hash = hashlib.sha256() - with open(file_path, "rb") as f: - for byte_block in iter(lambda: f.read(4096), b""): - sha256_hash.update(byte_block) - return sha256_hash.hexdigest() - except Exception as e: - logger.error(f"Failed to calculate hash for {file_path}: {e}") - return None \ No newline at end of file diff --git a/backend/toolbox/modules/secret_detection/__init__.py b/backend/toolbox/modules/secret_detection/__init__.py deleted file mode 100644 index e3fc98e..0000000 --- a/backend/toolbox/modules/secret_detection/__init__.py +++ /dev/null @@ -1,38 +0,0 @@ -""" -Secret Detection Modules - -This package contains modules for detecting secrets, credentials, and sensitive information -in codebases and repositories. - -Available modules: -- TruffleHog: Comprehensive secret detection with verification -- Gitleaks: Git-specific secret scanning and leak detection -- GitGuardian: Enterprise secret detection using GitGuardian API -- LLM Secret Detector: AI-powered semantic secret detection -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. 
-# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -from typing import List, Type -from ..base import BaseModule - -# Module registry for automatic discovery -SECRET_DETECTION_MODULES: List[Type[BaseModule]] = [] - -def register_module(module_class: Type[BaseModule]): - """Register a secret detection module""" - SECRET_DETECTION_MODULES.append(module_class) - return module_class - -def get_available_modules() -> List[Type[BaseModule]]: - """Get all available secret detection modules""" - return SECRET_DETECTION_MODULES.copy() \ No newline at end of file diff --git a/backend/toolbox/modules/secret_detection/gitleaks.py b/backend/toolbox/modules/secret_detection/gitleaks.py deleted file mode 100644 index 7005236..0000000 --- a/backend/toolbox/modules/secret_detection/gitleaks.py +++ /dev/null @@ -1,353 +0,0 @@ -""" -Gitleaks Secret Detection Module - -This module uses Gitleaks to detect secrets and sensitive information in Git repositories -and file systems. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import asyncio -import json -from pathlib import Path -from typing import Dict, Any, List -import subprocess -import logging - -from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult -from . 
import register_module - -logger = logging.getLogger(__name__) - - -@register_module -class GitleaksModule(BaseModule): - """Gitleaks secret detection module""" - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="gitleaks", - version="8.18.0", - description="Git-specific secret scanning and leak detection using Gitleaks", - author="FuzzForge Team", - category="secret_detection", - tags=["secrets", "git", "leak-detection", "credentials"], - input_schema={ - "type": "object", - "properties": { - "scan_mode": { - "type": "string", - "enum": ["detect", "protect"], - "default": "detect", - "description": "Scan mode: detect (entire repo history) or protect (staged changes)" - }, - "config_file": { - "type": "string", - "description": "Path to custom Gitleaks configuration file" - }, - "baseline_file": { - "type": "string", - "description": "Path to baseline file to ignore known findings" - }, - "max_target_megabytes": { - "type": "integer", - "default": 100, - "description": "Maximum size of files to scan (in MB)" - }, - "redact": { - "type": "boolean", - "default": True, - "description": "Redact secrets in output" - }, - "no_git": { - "type": "boolean", - "default": False, - "description": "Scan files without Git context" - } - } - }, - output_schema={ - "type": "object", - "properties": { - "findings": { - "type": "array", - "items": { - "type": "object", - "properties": { - "rule_id": {"type": "string"}, - "category": {"type": "string"}, - "file_path": {"type": "string"}, - "line_number": {"type": "integer"}, - "secret": {"type": "string"} - } - } - } - } - } - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate configuration""" - scan_mode = config.get("scan_mode", "detect") - if scan_mode not in ["detect", "protect"]: - raise ValueError("scan_mode must be 'detect' or 'protect'") - - max_size = config.get("max_target_megabytes", 100) - if not isinstance(max_size, int) or max_size < 1 or 
max_size > 1000: - raise ValueError("max_target_megabytes must be between 1 and 1000") - - return True - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """Execute Gitleaks secret detection""" - self.start_timer() - - try: - # Validate inputs - self.validate_config(config) - self.validate_workspace(workspace) - - logger.info(f"Running Gitleaks on {workspace}") - - # Build Gitleaks command - scan_mode = config.get("scan_mode", "detect") - cmd = ["gitleaks", scan_mode] - - # Add source path - cmd.extend(["--source", str(workspace)]) - - # Create temp file for JSON output - import tempfile - output_file = tempfile.NamedTemporaryFile(mode='w+', suffix='.json', delete=False) - output_path = output_file.name - output_file.close() - - # Add report format and output file - cmd.extend(["--report-format", "json"]) - cmd.extend(["--report-path", output_path]) - - # Add redact option - if config.get("redact", True): - cmd.append("--redact") - - # Add max target size - max_size = config.get("max_target_megabytes", 100) - cmd.extend(["--max-target-megabytes", str(max_size)]) - - # Add config file if specified - if config.get("config_file"): - config_path = Path(config["config_file"]) - if config_path.exists(): - cmd.extend(["--config", str(config_path)]) - - # Add baseline file if specified - if config.get("baseline_file"): - baseline_path = Path(config["baseline_file"]) - if baseline_path.exists(): - cmd.extend(["--baseline-path", str(baseline_path)]) - - # Add no-git flag if specified - if config.get("no_git", False): - cmd.append("--no-git") - - # Add verbose output - cmd.append("--verbose") - - logger.debug(f"Running command: {' '.join(cmd)}") - - # Run Gitleaks - process = await asyncio.create_subprocess_exec( - *cmd, - stdout=asyncio.subprocess.PIPE, - stderr=asyncio.subprocess.PIPE, - cwd=workspace - ) - - stdout, stderr = await process.communicate() - - # Parse results - findings = [] - try: - # Read the JSON output from file - with 
open(output_path, 'r') as f: - output_content = f.read() - - if process.returncode == 0: - # No secrets found - logger.info("No secrets detected by Gitleaks") - elif process.returncode == 1: - # Secrets found - parse from file content - findings = self._parse_gitleaks_output(output_content, workspace) - else: - # Error occurred - error_msg = stderr.decode() - logger.error(f"Gitleaks failed: {error_msg}") - return self.create_result( - findings=[], - status="failed", - error=f"Gitleaks execution failed: {error_msg}" - ) - finally: - # Clean up temp file - import os - try: - os.unlink(output_path) - except: - pass - - # Create summary - summary = { - "total_leaks": len(findings), - "unique_rules": len(set(f.metadata.get("rule_id", "") for f in findings)), - "files_with_leaks": len(set(f.file_path for f in findings if f.file_path)), - "scan_mode": scan_mode - } - - logger.info(f"Gitleaks found {len(findings)} potential leaks") - - return self.create_result( - findings=findings, - status="success", - summary=summary - ) - - except Exception as e: - logger.error(f"Gitleaks module failed: {e}") - return self.create_result( - findings=[], - status="failed", - error=str(e) - ) - - def _parse_gitleaks_output(self, output: str, workspace: Path) -> List[ModuleFinding]: - """Parse Gitleaks JSON output into findings""" - findings = [] - - if not output.strip(): - return findings - - try: - # Gitleaks outputs JSON array - results = json.loads(output) - if not isinstance(results, list): - logger.warning("Unexpected Gitleaks output format") - return findings - - for result in results: - # Extract information - rule_id = result.get("RuleID", "unknown") - description = result.get("Description", "") - file_path = result.get("File", "") - line_number = result.get("StartLine", 0) # Gitleaks outputs "StartLine", not "LineNumber" - line_end = result.get("EndLine", 0) - secret = result.get("Secret", "") - match_text = result.get("Match", "") - - # Commit info (if available) - commit = 
result.get("Commit", "") - author = result.get("Author", "") - email = result.get("Email", "") - date = result.get("Date", "") - - # Make file path relative to workspace - if file_path: - try: - rel_path = Path(file_path).relative_to(workspace) - file_path = str(rel_path) - except ValueError: - # If file is outside workspace, keep absolute path - pass - - # Determine severity based on rule type - severity = self._get_leak_severity(rule_id, description) - - # Create finding - finding = self.create_finding( - title=f"Secret leak detected: {rule_id}", - description=self._get_leak_description(rule_id, description, commit), - severity=severity, - category="secret_leak", - file_path=file_path if file_path else None, - line_start=line_number if line_number > 0 else None, - line_end=line_end if line_end > 0 else None, - code_snippet=match_text if match_text else secret, - recommendation=self._get_leak_recommendation(rule_id), - metadata={ - "rule_id": rule_id, - "secret_type": description, - "commit": commit, - "author": author, - "email": email, - "date": date, - "entropy": result.get("Entropy", 0), - "fingerprint": result.get("Fingerprint", "") - } - ) - - findings.append(finding) - - except json.JSONDecodeError as e: - logger.warning(f"Failed to parse Gitleaks output: {e}") - except Exception as e: - logger.warning(f"Error processing Gitleaks results: {e}") - - return findings - - def _get_leak_severity(self, rule_id: str, description: str) -> str: - """Determine severity based on secret type""" - critical_patterns = [ - "aws", "amazon", "gcp", "google", "azure", "microsoft", - "private_key", "rsa", "ssh", "certificate", "database", - "password", "auth", "token", "secret", "key" - ] - - rule_lower = rule_id.lower() - desc_lower = description.lower() - - # Check for critical patterns - for pattern in critical_patterns: - if pattern in rule_lower or pattern in desc_lower: - if any(x in rule_lower for x in ["aws", "gcp", "azure"]): - return "critical" - elif any(x in 
rule_lower for x in ["private", "key", "password"]): - return "high" - else: - return "medium" - - return "low" - - def _get_leak_description(self, rule_id: str, description: str, commit: str) -> str: - """Get description for the leak finding""" - base_desc = f"Gitleaks detected a potential secret leak matching rule '{rule_id}'" - if description: - base_desc += f" ({description})" - - if commit: - base_desc += f" in commit {commit[:8]}" - - base_desc += ". This may indicate sensitive information has been committed to version control." - - return base_desc - - def _get_leak_recommendation(self, rule_id: str) -> str: - """Get remediation recommendation""" - base_rec = "Remove the secret from the codebase and Git history. " - - if any(pattern in rule_id.lower() for pattern in ["aws", "gcp", "azure"]): - base_rec += "Revoke the cloud credentials immediately and rotate them. " - - base_rec += "Consider using Git history rewriting tools (git-filter-branch, BFG) " \ - "to remove sensitive data from commit history. Implement pre-commit hooks " \ - "to prevent future secret commits." - - return base_rec \ No newline at end of file diff --git a/backend/toolbox/modules/secret_detection/llm_secret_detector.py b/backend/toolbox/modules/secret_detection/llm_secret_detector.py deleted file mode 100644 index 1adf341..0000000 --- a/backend/toolbox/modules/secret_detection/llm_secret_detector.py +++ /dev/null @@ -1,398 +0,0 @@ -""" -LLM Secret Detection Module - -This module uses an LLM to detect secrets and sensitive information via semantic understanding. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. 
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import logging -from pathlib import Path -from typing import Dict, Any, List - -from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult -from . import register_module - -logger = logging.getLogger(__name__) - - -@register_module -class LLMSecretDetectorModule(BaseModule): - """ - LLM-based secret detection module using AI semantic analysis. - - Uses an LLM agent to identify secrets through natural language understanding, - potentially catching secrets that pattern-based tools miss. - """ - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="llm_secret_detector", - version="1.0.0", - description="AI-powered secret detection using LLM semantic analysis", - author="FuzzForge Team", - category="secret_detection", - tags=["secrets", "llm", "ai", "semantic"], - input_schema={ - "type": "object", - "properties": { - "agent_url": { - "type": "string", - "default": "http://fuzzforge-task-agent:8000/a2a/litellm_agent", - "description": "A2A agent endpoint URL" - }, - "llm_model": { - "type": "string", - "default": "gpt-4o-mini", - "description": "LLM model to use" - }, - "llm_provider": { - "type": "string", - "default": "openai", - "description": "LLM provider (openai, anthropic, etc.)" - }, - "file_patterns": { - "type": "array", - "items": {"type": "string"}, - "default": ["*.py", "*.js", "*.ts", "*.java", "*.go", "*.env", "*.yaml", "*.yml", "*.json", "*.xml", "*.ini", "*.sql", "*.properties", "*.sh", "*.bat", "*.config", "*.conf", "*.toml", "*id_rsa*"], - "description": "File patterns to analyze" - }, - "max_files": { - "type": "integer", - "default": 20, - "description": "Maximum number of files to analyze" - }, - "max_file_size": { - "type": "integer", - "default": 30000, - "description": "Maximum file size in bytes (30KB default)" - }, - 
"timeout": { - "type": "integer", - "default": 45, - "description": "Timeout per file in seconds" - } - }, - "required": [] - }, - output_schema={ - "type": "object", - "properties": { - "findings": { - "type": "array", - "description": "Secrets identified by LLM" - } - } - } - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate module configuration""" - # Lazy import to avoid Temporal sandbox restrictions - try: - from fuzzforge_ai.a2a_wrapper import send_agent_task # noqa: F401 - except ImportError: - raise RuntimeError( - "A2A wrapper not available. Ensure fuzzforge_ai module is accessible." - ) - - agent_url = config.get("agent_url") - # agent_url is optional - will have default from metadata.yaml - if agent_url is not None and not isinstance(agent_url, str): - raise ValueError("agent_url must be a valid URL string") - - max_files = config.get("max_files", 20) - if not isinstance(max_files, int) or max_files <= 0: - raise ValueError("max_files must be a positive integer") - - return True - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """ - Execute LLM-based secret detection. 
- - Args: - config: Module configuration - workspace: Path to the workspace containing code to analyze - - Returns: - ModuleResult with secrets detected by LLM - """ - self.start_timer() - - logger.info(f"Starting LLM secret detection in workspace: {workspace}") - - # Extract configuration (defaults come from metadata.yaml via API) - agent_url = config["agent_url"] - llm_model = config["llm_model"] - llm_provider = config["llm_provider"] - file_patterns = config["file_patterns"] - max_files = config["max_files"] - max_file_size = config["max_file_size"] - timeout = config["timeout"] - - # Find files to analyze - # Skip files that are unlikely to contain secrets - skip_patterns = ['*.sarif', '*.md', '*.html', '*.css', '*.db', '*.sqlite'] - - files_to_analyze = [] - for pattern in file_patterns: - for file_path in workspace.rglob(pattern): - if file_path.is_file(): - try: - # Skip unlikely files - if any(file_path.match(skip) for skip in skip_patterns): - logger.debug(f"Skipping {file_path.name} (unlikely to have secrets)") - continue - - # Check file size - if file_path.stat().st_size > max_file_size: - logger.debug(f"Skipping {file_path} (too large)") - continue - - files_to_analyze.append(file_path) - - if len(files_to_analyze) >= max_files: - break - except Exception as e: - logger.warning(f"Error checking file {file_path}: {e}") - continue - - if len(files_to_analyze) >= max_files: - break - - logger.info(f"Found {len(files_to_analyze)} files to analyze for secrets") - - # Analyze each file with LLM - all_findings = [] - for file_path in files_to_analyze: - logger.info(f"Analyzing: {file_path.relative_to(workspace)}") - - try: - findings = await self._analyze_file_for_secrets( - file_path=file_path, - workspace=workspace, - agent_url=agent_url, - llm_model=llm_model, - llm_provider=llm_provider, - timeout=timeout - ) - all_findings.extend(findings) - - except Exception as e: - logger.error(f"Error analyzing {file_path}: {e}") - # Continue with next file - 
continue - - logger.info(f"LLM secret detection complete. Found {len(all_findings)} potential secrets.") - - # Create result - return self.create_result( - findings=all_findings, - status="success", - summary={ - "files_analyzed": len(files_to_analyze), - "total_secrets": len(all_findings), - "agent_url": agent_url, - "model": f"{llm_provider}/{llm_model}" - } - ) - - async def _analyze_file_for_secrets( - self, - file_path: Path, - workspace: Path, - agent_url: str, - llm_model: str, - llm_provider: str, - timeout: int - ) -> List[ModuleFinding]: - """Analyze a single file for secrets using LLM""" - - # Read file content - try: - with open(file_path, 'r', encoding='utf-8') as f: - code_content = f.read() - except Exception as e: - logger.error(f"Failed to read {file_path}: {e}") - return [] - - # Build specialized prompt for secret detection - system_prompt = ( - "You are a security expert specialized in detecting secrets and credentials in code. " - "Your job is to find REAL secrets that could be exploited. Be thorough and aggressive.\n\n" - "For each secret found, respond in this exact format:\n" - "SECRET_FOUND: [type like 'AWS Key', 'GitHub Token', 'Database Password']\n" - "SEVERITY: [critical/high/medium/low]\n" - "LINE: [exact line number]\n" - "CONFIDENCE: [high/medium/low]\n" - "DESCRIPTION: [brief explanation]\n\n" - "EXAMPLES of secrets to find:\n" - "1. API Keys: 'AKIA...', 'ghp_...', 'sk_live_...', 'SG.'\n" - "2. Tokens: Bearer tokens, OAuth tokens, JWT secrets\n" - "3. Passwords: Database passwords, admin passwords in configs\n" - "4. Connection Strings: mongodb://, postgres://, redis:// with credentials\n" - "5. Private Keys: -----BEGIN PRIVATE KEY-----, -----BEGIN RSA PRIVATE KEY-----\n" - "6. Cloud Credentials: AWS keys, GCP keys, Azure keys\n" - "7. Encryption Keys: AES keys, secret keys in config\n" - "8. 
Webhook URLs: URLs with tokens like hooks.slack.com/services/...\n\n" - "FIND EVERYTHING that looks like a real credential, password, key, or token.\n" - "DO NOT be overly cautious. Report anything suspicious.\n\n" - "If absolutely no secrets exist, respond with 'NO_SECRETS_FOUND'." - ) - - user_message = ( - f"Analyze this code for secrets and credentials:\n\n" - f"File: {file_path.relative_to(workspace)}\n\n" - f"```\n{code_content}\n```" - ) - - # Call LLM via A2A wrapper - try: - from fuzzforge_ai.a2a_wrapper import send_agent_task - - result = await send_agent_task( - url=agent_url, - model=llm_model, - provider=llm_provider, - prompt=system_prompt, - message=user_message, - context=f"secret_detection_{file_path.stem}", - timeout=float(timeout) - ) - - llm_response = result.text - - # Debug: Log LLM response - logger.debug(f"LLM response for {file_path.name}: {llm_response[:200]}...") - - except Exception as e: - logger.error(f"A2A call failed for {file_path}: {e}") - return [] - - # Parse LLM response into findings - findings = self._parse_llm_response( - llm_response=llm_response, - file_path=file_path, - workspace=workspace - ) - - if findings: - logger.info(f"Found {len(findings)} secrets in {file_path.name}") - else: - logger.debug(f"No secrets found in {file_path.name}. 
Response: {llm_response[:500]}") - - return findings - - def _parse_llm_response( - self, - llm_response: str, - file_path: Path, - workspace: Path - ) -> List[ModuleFinding]: - """Parse LLM response into structured findings""" - - if "NO_SECRETS_FOUND" in llm_response: - return [] - - findings = [] - relative_path = str(file_path.relative_to(workspace)) - - # Simple parser for the expected format - lines = llm_response.split('\n') - current_secret = {} - - for line in lines: - line = line.strip() - - if line.startswith("SECRET_FOUND:"): - # Save previous secret if exists - if current_secret: - findings.append(self._create_secret_finding(current_secret, relative_path)) - current_secret = {"type": line.replace("SECRET_FOUND:", "").strip()} - - elif line.startswith("SEVERITY:"): - severity = line.replace("SEVERITY:", "").strip().lower() - current_secret["severity"] = severity - - elif line.startswith("LINE:"): - line_num = line.replace("LINE:", "").strip() - try: - current_secret["line"] = int(line_num) - except ValueError: - current_secret["line"] = None - - elif line.startswith("CONFIDENCE:"): - confidence = line.replace("CONFIDENCE:", "").strip().lower() - current_secret["confidence"] = confidence - - elif line.startswith("DESCRIPTION:"): - current_secret["description"] = line.replace("DESCRIPTION:", "").strip() - - # Save last secret - if current_secret: - findings.append(self._create_secret_finding(current_secret, relative_path)) - - return findings - - def _create_secret_finding(self, secret: Dict[str, Any], file_path: str) -> ModuleFinding: - """Create a ModuleFinding from parsed secret""" - - severity_map = { - "critical": "critical", - "high": "high", - "medium": "medium", - "low": "low" - } - - severity = severity_map.get(secret.get("severity", "medium"), "medium") - confidence = secret.get("confidence", "medium") - - # Adjust severity based on confidence - if confidence == "low" and severity == "critical": - severity = "high" - elif confidence == "low" and 
severity == "high": - severity = "medium" - - # Create finding - title = f"LLM detected secret: {secret.get('type', 'Unknown secret')}" - description = secret.get("description", "An LLM identified this as a potential secret.") - description += f"\n\nConfidence: {confidence}" - - return self.create_finding( - title=title, - description=description, - severity=severity, - category="secret_detection", - file_path=file_path, - line_start=secret.get("line"), - recommendation=self._get_secret_recommendation(secret.get("type", "")), - metadata={ - "tool": "llm-secret-detector", - "secret_type": secret.get("type", "unknown"), - "confidence": confidence, - "detection_method": "semantic-analysis" - } - ) - - def _get_secret_recommendation(self, secret_type: str) -> str: - """Get remediation recommendation for detected secret""" - return ( - f"A potential {secret_type} was detected by AI analysis. " - f"Verify whether this is a real secret or a false positive. " - f"If real: (1) Revoke the credential immediately, " - f"(2) Remove from codebase and Git history, " - f"(3) Rotate to a new secret, " - f"(4) Use secret management tools for storage. " - f"Implement pre-commit hooks to prevent future leaks." - ) diff --git a/backend/toolbox/modules/secret_detection/trufflehog.py b/backend/toolbox/modules/secret_detection/trufflehog.py deleted file mode 100644 index 6c68e99..0000000 --- a/backend/toolbox/modules/secret_detection/trufflehog.py +++ /dev/null @@ -1,284 +0,0 @@ -""" -TruffleHog Secret Detection Module - -This module uses TruffleHog to detect secrets, credentials, and sensitive information -with verification capabilities. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. 
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import asyncio -import json -import tempfile -from pathlib import Path -from typing import Dict, Any, List -import subprocess -import logging - -from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult -from . import register_module - -logger = logging.getLogger(__name__) - - -@register_module -class TruffleHogModule(BaseModule): - """TruffleHog secret detection module""" - - def get_metadata(self) -> ModuleMetadata: - """Get module metadata""" - return ModuleMetadata( - name="trufflehog", - version="3.63.2", - description="Comprehensive secret detection with verification using TruffleHog", - author="FuzzForge Team", - category="secret_detection", - tags=["secrets", "credentials", "sensitive-data", "verification"], - input_schema={ - "type": "object", - "properties": { - "verify": { - "type": "boolean", - "default": False, - "description": "Verify discovered secrets" - }, - "include_detectors": { - "type": "array", - "items": {"type": "string"}, - "description": "Specific detectors to include" - }, - "exclude_detectors": { - "type": "array", - "items": {"type": "string"}, - "description": "Specific detectors to exclude" - }, - "concurrency": { - "type": "integer", - "default": 10, - "description": "Number of concurrent workers" - } - } - }, - output_schema={ - "type": "object", - "properties": { - "findings": { - "type": "array", - "items": { - "type": "object", - "properties": { - "detector": {"type": "string"}, - "verified": {"type": "boolean"}, - "file_path": {"type": "string"}, - "line": {"type": "integer"}, - "secret": {"type": "string"} - } - } - } - } - } - ) - - def validate_config(self, config: Dict[str, Any]) -> bool: - """Validate configuration""" - # Check concurrency bounds - concurrency = config.get("concurrency", 10) - if not isinstance(concurrency, int) or concurrency < 1 
or concurrency > 50: - raise ValueError("Concurrency must be between 1 and 50") - - return True - - async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: - """Execute TruffleHog secret detection""" - self.start_timer() - - try: - # Validate inputs - self.validate_config(config) - self.validate_workspace(workspace) - - logger.info(f"Running TruffleHog on {workspace}") - - # Build TruffleHog command - cmd = ["trufflehog", "filesystem", str(workspace)] - - # Add verification flag - if config.get("verify", False): - cmd.append("--verify") - else: - # Explicitly disable verification to get all unverified secrets - cmd.append("--no-verification") - - # Add JSON output - cmd.extend(["--json", "--no-update"]) - - # Add concurrency - cmd.extend(["--concurrency", str(config.get("concurrency", 10))]) - - # Add include/exclude detectors - if config.get("include_detectors"): - cmd.extend(["--include-detectors", ",".join(config["include_detectors"])]) - - if config.get("exclude_detectors"): - cmd.extend(["--exclude-detectors", ",".join(config["exclude_detectors"])]) - - logger.debug(f"Running command: {' '.join(cmd)}") - - # Run TruffleHog - process = await asyncio.create_subprocess_exec( - *cmd, - stdout=asyncio.subprocess.PIPE, - stderr=asyncio.subprocess.PIPE, - cwd=workspace - ) - - stdout, stderr = await process.communicate() - - # Parse results - findings = [] - if process.returncode == 0 or process.returncode == 1: # 1 indicates secrets found - findings = self._parse_trufflehog_output(stdout.decode(), workspace) - else: - error_msg = stderr.decode() - logger.error(f"TruffleHog failed: {error_msg}") - return self.create_result( - findings=[], - status="failed", - error=f"TruffleHog execution failed: {error_msg}" - ) - - # Create summary - summary = { - "total_secrets": len(findings), - "verified_secrets": len([f for f in findings if f.metadata.get("verified", False)]), - "detectors_triggered": len(set(f.metadata.get("detector", "") for f in 
findings)), - "files_with_secrets": len(set(f.file_path for f in findings if f.file_path)) - } - - logger.info(f"TruffleHog found {len(findings)} secrets") - - return self.create_result( - findings=findings, - status="success", - summary=summary - ) - - except Exception as e: - logger.error(f"TruffleHog module failed: {e}") - return self.create_result( - findings=[], - status="failed", - error=str(e) - ) - - def _parse_trufflehog_output(self, output: str, workspace: Path) -> List[ModuleFinding]: - """Parse TruffleHog JSON output into findings""" - findings = [] - - for line in output.strip().split('\n'): - if not line.strip(): - continue - - try: - result = json.loads(line) - - # Extract information - detector = result.get("DetectorName", "unknown") - verified = result.get("Verified", False) - raw_secret = result.get("Raw", "") - - # Source info - source_metadata = result.get("SourceMetadata", {}) - source_data = source_metadata.get("Data", {}) - file_path = source_data.get("Filesystem", {}).get("file", "") - line_num = source_data.get("Filesystem", {}).get("line", 0) - - # Make file path relative to workspace - if file_path: - try: - rel_path = Path(file_path).relative_to(workspace) - file_path = str(rel_path) - except ValueError: - # If file is outside workspace, keep absolute path - pass - - # Determine severity based on verification and detector type - severity = self._get_secret_severity(detector, verified, raw_secret) - - # Create finding - finding = self.create_finding( - title=f"{detector} secret detected", - description=self._get_secret_description(detector, verified), - severity=severity, - category="secret_detection", - file_path=file_path if file_path else None, - line_start=line_num if line_num > 0 else None, - code_snippet=self._truncate_secret(raw_secret), - recommendation=self._get_secret_recommendation(detector, verified), - metadata={ - "detector": detector, - "verified": verified, - "detector_type": result.get("DetectorType", ""), - 
"decoder_type": result.get("DecoderType", ""), - "structured_data": result.get("StructuredData", {}) - } - ) - - findings.append(finding) - - except json.JSONDecodeError as e: - logger.warning(f"Failed to parse TruffleHog output line: {e}") - continue - except Exception as e: - logger.warning(f"Error processing TruffleHog result: {e}") - continue - - return findings - - def _get_secret_severity(self, detector: str, verified: bool, secret: str) -> str: - """Determine severity based on secret type and verification status""" - if verified: - # Verified secrets are always high risk - critical_detectors = ["aws", "gcp", "azure", "github", "gitlab", "database"] - if any(crit in detector.lower() for crit in critical_detectors): - return "critical" - return "high" - - # Unverified secrets - high_risk_detectors = ["private_key", "certificate", "password", "token"] - if any(high in detector.lower() for high in high_risk_detectors): - return "medium" - - return "low" - - def _get_secret_description(self, detector: str, verified: bool) -> str: - """Get description for the secret finding""" - verification_status = "verified and active" if verified else "unverified" - return f"A {detector} secret was detected and is {verification_status}. " \ - f"This may represent a security risk if the credential is valid." - - def _get_secret_recommendation(self, detector: str, verified: bool) -> str: - """Get remediation recommendation""" - if verified: - return f"IMMEDIATE ACTION REQUIRED: This {detector} secret is verified and active. " \ - f"Revoke the credential immediately, remove it from the codebase, and " \ - f"implement proper secret management practices." - else: - return f"Review this {detector} secret to determine if it's valid. " \ - f"If real, revoke the credential and remove it from the codebase. " \ - f"Consider implementing secret scanning in CI/CD pipelines." 
- - def _truncate_secret(self, secret: str, max_length: int = 50) -> str: - """Truncate secret for display purposes""" - if len(secret) <= max_length: - return secret - return secret[:max_length] + "..." \ No newline at end of file diff --git a/backend/toolbox/workflows/__init__.py b/backend/toolbox/workflows/__init__.py deleted file mode 100644 index 43bcfe7..0000000 --- a/backend/toolbox/workflows/__init__.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - diff --git a/backend/toolbox/workflows/android_static_analysis/__init__.py b/backend/toolbox/workflows/android_static_analysis/__init__.py deleted file mode 100644 index aec13c5..0000000 --- a/backend/toolbox/workflows/android_static_analysis/__init__.py +++ /dev/null @@ -1,35 +0,0 @@ -""" -Android Static Analysis Workflow - -Comprehensive Android application security testing combining: -- Jadx APK decompilation -- OpenGrep/Semgrep static analysis with Android-specific rules -- MobSF mobile security framework analysis -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
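The TruffleHog module above consumes newline-delimited JSON, skipping malformed records and rewriting absolute paths relative to the workspace. A minimal sketch of that NDJSON loop (the flat `file` key is a simplification of TruffleHog's nested `SourceMetadata` structure):

```python
import json
from pathlib import Path


def parse_ndjson(output: str, workspace: Path) -> list[dict]:
    """Parse newline-delimited JSON, tolerating bad lines, and make
    file paths relative to the workspace when they fall inside it."""
    records = []
    for line in output.strip().splitlines():
        if not line.strip():
            continue
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip partial or garbled lines instead of failing
        path = rec.get("file", "")
        if path:
            try:
                rec["file"] = str(Path(path).relative_to(workspace))
            except ValueError:
                pass  # file outside workspace: keep the absolute path
        records.append(rec)
    return records


out = '{"file": "/ws/app/Main.java"}\nnot json\n{"file": "/etc/passwd"}'
print(parse_ndjson(out, Path("/ws")))
# → [{'file': 'app/Main.java'}, {'file': '/etc/passwd'}]
```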
- -from .workflow import AndroidStaticAnalysisWorkflow -from .activities import ( - decompile_with_jadx_activity, - scan_with_opengrep_activity, - scan_with_mobsf_activity, - generate_android_sarif_activity, -) - -__all__ = [ - "AndroidStaticAnalysisWorkflow", - "decompile_with_jadx_activity", - "scan_with_opengrep_activity", - "scan_with_mobsf_activity", - "generate_android_sarif_activity", -] diff --git a/backend/toolbox/workflows/android_static_analysis/activities.py b/backend/toolbox/workflows/android_static_analysis/activities.py deleted file mode 100644 index 5d37729..0000000 --- a/backend/toolbox/workflows/android_static_analysis/activities.py +++ /dev/null @@ -1,213 +0,0 @@ -""" -Android Static Analysis Workflow Activities - -Activities for the Android security testing workflow: -- decompile_with_jadx_activity: Decompile APK using Jadx -- scan_with_opengrep_activity: Analyze code with OpenGrep/Semgrep -- scan_with_mobsf_activity: Scan APK with MobSF -- generate_android_sarif_activity: Generate combined SARIF report -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import logging -import sys -from pathlib import Path - -from temporalio import activity - -# Configure logging -logger = logging.getLogger(__name__) - -# Add toolbox to path for module imports -sys.path.insert(0, '/app/toolbox') - - -@activity.defn(name="decompile_with_jadx") -async def decompile_with_jadx_activity(workspace_path: str, config: dict) -> dict: - """ - Decompile Android APK to Java source code using Jadx. 
- - Args: - workspace_path: Path to the workspace directory - config: JadxDecompiler configuration - - Returns: - Decompilation results dictionary - """ - logger.info(f"Activity: decompile_with_jadx (workspace={workspace_path})") - - try: - from modules.android import JadxDecompiler - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - decompiler = JadxDecompiler() - result = await decompiler.execute(config, workspace) - - logger.info( - f"āœ“ Jadx decompilation completed: " - f"{result.summary.get('java_files', 0)} Java files generated" - ) - return result.dict() - - except Exception as e: - logger.error(f"Jadx decompilation failed: {e}", exc_info=True) - raise - - -@activity.defn(name="scan_with_opengrep") -async def scan_with_opengrep_activity(workspace_path: str, config: dict) -> dict: - """ - Analyze Android code for security issues using OpenGrep/Semgrep. - - Args: - workspace_path: Path to the workspace directory - config: OpenGrepAndroid configuration - - Returns: - Analysis results dictionary - """ - logger.info(f"Activity: scan_with_opengrep (workspace={workspace_path})") - - try: - from modules.android import OpenGrepAndroid - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - analyzer = OpenGrepAndroid() - result = await analyzer.execute(config, workspace) - - logger.info( - f"āœ“ OpenGrep analysis completed: " - f"{result.summary.get('total_findings', 0)} security issues found" - ) - return result.dict() - - except Exception as e: - logger.error(f"OpenGrep analysis failed: {e}", exc_info=True) - raise - - -@activity.defn(name="scan_with_mobsf") -async def scan_with_mobsf_activity(workspace_path: str, config: dict) -> dict: - """ - Analyze Android APK for security issues using MobSF. 
- - Args: - workspace_path: Path to the workspace directory - config: MobSFScanner configuration - - Returns: - Scan results dictionary (or skipped status if MobSF unavailable) - """ - logger.info(f"Activity: scan_with_mobsf (workspace={workspace_path})") - - # Check if MobSF is installed (graceful degradation for ARM64 platform) - mobsf_path = Path("/app/mobsf") - if not mobsf_path.exists(): - logger.warning("MobSF not installed on this platform (ARM64/Rosetta limitation)") - return { - "status": "skipped", - "findings": [], - "summary": { - "total_findings": 0, - "skip_reason": "MobSF unavailable on ARM64 platform (Rosetta 2 incompatibility)" - } - } - - try: - from modules.android import MobSFScanner - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - scanner = MobSFScanner() - result = await scanner.execute(config, workspace) - - logger.info( - f"āœ“ MobSF scan completed: " - f"{result.summary.get('total_findings', 0)} findings" - ) - return result.dict() - - except Exception as e: - logger.error(f"MobSF scan failed: {e}", exc_info=True) - raise - - -@activity.defn(name="generate_android_sarif") -async def generate_android_sarif_activity( - jadx_result: dict, - opengrep_result: dict, - mobsf_result: dict, - config: dict, - workspace_path: str -) -> dict: - """ - Generate combined SARIF report from all Android security findings. 
- - Args: - jadx_result: Jadx decompilation results - opengrep_result: OpenGrep analysis results - mobsf_result: MobSF scan results (may be None if disabled) - config: Reporter configuration - workspace_path: Workspace path - - Returns: - SARIF report dictionary - """ - logger.info("Activity: generate_android_sarif") - - try: - from modules.reporter import SARIFReporter - - workspace = Path(workspace_path) - - # Collect all findings - all_findings = [] - all_findings.extend(opengrep_result.get("findings", [])) - - if mobsf_result: - all_findings.extend(mobsf_result.get("findings", [])) - - # Prepare reporter config - reporter_config = { - **(config or {}), - "findings": all_findings, - "tool_name": "FuzzForge Android Static Analysis", - "tool_version": "1.0.0", - "metadata": { - "jadx_version": "1.5.0", - "opengrep_version": "1.45.0", - "mobsf_version": "3.9.7", - "java_files_decompiled": jadx_result.get("summary", {}).get("java_files", 0), - } - } - - reporter = SARIFReporter() - result = await reporter.execute(reporter_config, workspace) - - sarif_report = result.dict().get("sarif", {}) - - logger.info(f"āœ“ SARIF report generated with {len(all_findings)} findings") - - return sarif_report - - except Exception as e: - logger.error(f"SARIF report generation failed: {e}", exc_info=True) - raise diff --git a/backend/toolbox/workflows/android_static_analysis/metadata.yaml b/backend/toolbox/workflows/android_static_analysis/metadata.yaml deleted file mode 100644 index cd77e48..0000000 --- a/backend/toolbox/workflows/android_static_analysis/metadata.yaml +++ /dev/null @@ -1,172 +0,0 @@ -name: android_static_analysis -version: "1.0.0" -vertical: android -description: "Comprehensive Android application security testing using Jadx decompilation, OpenGrep static analysis, and MobSF mobile security framework" -author: "FuzzForge Team" -tags: - - "android" - - "mobile" - - "static-analysis" - - "security" - - "opengrep" - - "semgrep" - - "mobsf" - - "jadx" - - "apk" - - 
"sarif" - -# Workspace isolation mode -# Using "shared" mode for read-only APK analysis (no file modifications except decompilation output) -workspace_isolation: "shared" - -parameters: - type: object - properties: - apk_path: - type: string - description: "Path to the APK file to analyze (relative to uploaded target or absolute within workspace)" - default: "" - - decompile_apk: - type: boolean - description: "Whether to decompile APK with Jadx before OpenGrep analysis" - default: true - - jadx_config: - type: object - description: "Jadx decompiler configuration" - properties: - output_dir: - type: string - description: "Output directory for decompiled sources" - default: "jadx_output" - overwrite: - type: boolean - description: "Overwrite existing decompilation output" - default: true - threads: - type: integer - description: "Number of decompilation threads" - default: 4 - minimum: 1 - maximum: 32 - decompiler_args: - type: array - items: - type: string - description: "Additional Jadx arguments" - default: [] - - opengrep_config: - type: object - description: "OpenGrep/Semgrep static analysis configuration" - properties: - config: - type: string - enum: ["auto", "p/security-audit", "p/owasp-top-ten", "p/cwe-top-25"] - description: "Preset OpenGrep ruleset (ignored if custom_rules_path is set)" - default: "auto" - custom_rules_path: - type: string - description: "Path to custom OpenGrep rules directory (use Android-specific rules for best results)" - default: "/app/toolbox/modules/android/custom_rules" - languages: - type: array - items: - type: string - description: "Programming languages to analyze (defaults to java, kotlin for Android)" - default: ["java", "kotlin"] - include_patterns: - type: array - items: - type: string - description: "File patterns to include in scan" - default: [] - exclude_patterns: - type: array - items: - type: string - description: "File patterns to exclude from scan" - default: [] - max_target_bytes: - type: integer - description: 
"Maximum file size to analyze (bytes)" - default: 1000000 - timeout: - type: integer - description: "Analysis timeout in seconds" - default: 300 - severity: - type: array - items: - type: string - enum: ["ERROR", "WARNING", "INFO"] - description: "Severity levels to include in results" - default: ["ERROR", "WARNING", "INFO"] - confidence: - type: array - items: - type: string - enum: ["HIGH", "MEDIUM", "LOW"] - description: "Confidence levels to include in results" - default: ["HIGH", "MEDIUM", "LOW"] - - mobsf_config: - type: object - description: "MobSF scanner configuration" - properties: - enabled: - type: boolean - description: "Enable MobSF analysis (requires APK file)" - default: true - mobsf_url: - type: string - description: "MobSF server URL" - default: "http://localhost:8877" - api_key: - type: string - description: "MobSF API key (if not provided, uses MOBSF_API_KEY env var)" - default: null - rescan: - type: boolean - description: "Force rescan even if APK was previously analyzed" - default: false - - reporter_config: - type: object - description: "SARIF reporter configuration" - properties: - include_code_flows: - type: boolean - description: "Include code flow information in SARIF output" - default: false - logical_id: - type: string - description: "Custom identifier for the SARIF report" - default: null - -output_schema: - type: object - properties: - sarif: - type: object - description: "SARIF-formatted findings from all Android security tools" - summary: - type: object - description: "Android security analysis summary" - properties: - total_findings: - type: integer - decompiled_java_files: - type: integer - description: "Number of Java files decompiled by Jadx" - opengrep_findings: - type: integer - description: "Findings from OpenGrep/Semgrep analysis" - mobsf_findings: - type: integer - description: "Findings from MobSF analysis" - severity_distribution: - type: object - category_distribution: - type: object diff --git 
a/backend/toolbox/workflows/android_static_analysis/workflow.py b/backend/toolbox/workflows/android_static_analysis/workflow.py deleted file mode 100644 index 8376cd2..0000000 --- a/backend/toolbox/workflows/android_static_analysis/workflow.py +++ /dev/null @@ -1,289 +0,0 @@ -""" -Android Static Analysis Workflow - Temporal Version - -Comprehensive security testing for Android applications using Jadx, OpenGrep, and MobSF. -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -from datetime import timedelta -from typing import Dict, Any, Optional -from pathlib import Path - -from temporalio import workflow -from temporalio.common import RetryPolicy - -# Import activity interfaces (will be executed by worker) -with workflow.unsafe.imports_passed_through(): - import logging - -logger = logging.getLogger(__name__) - - -@workflow.defn -class AndroidStaticAnalysisWorkflow: - """ - Android Static Application Security Testing workflow. - - This workflow: - 1. Downloads target (APK) from MinIO - 2. (Optional) Decompiles APK using Jadx - 3. Runs OpenGrep/Semgrep static analysis on decompiled code - 4. (Optional) Runs MobSF comprehensive security scan - 5. Generates a SARIF report with all findings - 6. Uploads results to MinIO - 7. 
Cleans up cache - """ - - @workflow.run - async def run( - self, - target_id: str, - apk_path: Optional[str] = None, - decompile_apk: bool = True, - jadx_config: Optional[Dict[str, Any]] = None, - opengrep_config: Optional[Dict[str, Any]] = None, - mobsf_config: Optional[Dict[str, Any]] = None, - reporter_config: Optional[Dict[str, Any]] = None - ) -> Dict[str, Any]: - """ - Main workflow execution. - - Args: - target_id: UUID of the uploaded target (APK) in MinIO - apk_path: Path to APK file within target (if target is not a single APK) - decompile_apk: Whether to decompile APK with Jadx before OpenGrep - jadx_config: Configuration for Jadx decompiler - opengrep_config: Configuration for OpenGrep analyzer - mobsf_config: Configuration for MobSF scanner - reporter_config: Configuration for SARIF reporter - - Returns: - Dictionary containing SARIF report and summary - """ - workflow_id = workflow.info().workflow_id - - workflow.logger.info( - f"Starting AndroidStaticAnalysisWorkflow " - f"(workflow_id={workflow_id}, target_id={target_id})" - ) - - # Default configurations - if not jadx_config: - jadx_config = { - "output_dir": "jadx_output", - "overwrite": True, - "threads": 4, - "decompiler_args": [] - } - - if not opengrep_config: - opengrep_config = { - "config": "auto", - "custom_rules_path": "/app/toolbox/modules/android/custom_rules", - "languages": ["java", "kotlin"], - "severity": ["ERROR", "WARNING", "INFO"], - "confidence": ["HIGH", "MEDIUM", "LOW"], - "timeout": 300, - } - - if not mobsf_config: - mobsf_config = { - "enabled": True, - "mobsf_url": "http://localhost:8877", - "api_key": None, - "rescan": False, - } - - if not reporter_config: - reporter_config = { - "include_code_flows": False - } - - # Activity retry policy - retry_policy = RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=60), - maximum_attempts=3, - backoff_coefficient=2.0, - ) - - # Phase 0: Download target from MinIO - 
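The activity retry policy above (1 s initial interval, 2.0 backoff coefficient, 60 s cap, 3 attempts) produces a capped exponential schedule. A stdlib sketch of the intervals between attempts (jitter, which the Temporal runtime may add, is ignored here):

```python
def backoff_schedule(initial: float, coefficient: float,
                     maximum: float, max_attempts: int) -> list[float]:
    """Sleep intervals between retries: initial * coefficient**n,
    capped at maximum. N attempts means N-1 waits."""
    return [min(initial * coefficient ** n, maximum)
            for n in range(max_attempts - 1)]

print(backoff_schedule(1.0, 2.0, 60.0, 3))  # → [1.0, 2.0]
print(backoff_schedule(1.0, 2.0, 60.0, 8))  # → [1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 60.0]
```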
workflow.logger.info(f"Phase 0: Downloading target from MinIO (target_id={target_id})") - workspace_path = await workflow.execute_activity( - "get_target", - args=[target_id, workflow.info().workflow_id, "shared"], - start_to_close_timeout=timedelta(minutes=10), - retry_policy=retry_policy, - ) - workflow.logger.info(f"āœ“ Target downloaded to: {workspace_path}") - - # Handle case where workspace_path is a file (single APK upload) - # vs. a directory containing files - workspace_path_obj = Path(workspace_path) - - # Determine actual workspace directory and APK path - if apk_path: - # User explicitly provided apk_path - actual_apk_path = apk_path - # workspace_path could be either a file or directory - # If it's a file and apk_path matches the filename, use parent as workspace - if workspace_path_obj.name == apk_path: - workspace_path = str(workspace_path_obj.parent) - workflow.logger.info(f"Adjusted workspace to parent directory: {workspace_path}") - else: - # No apk_path provided - check if workspace_path is an APK file - if workspace_path_obj.suffix.lower() == '.apk' or workspace_path_obj.name.endswith('.apk'): - # workspace_path is the APK file itself - actual_apk_path = workspace_path_obj.name - workspace_path = str(workspace_path_obj.parent) - workflow.logger.info(f"Detected single APK file: {actual_apk_path}, workspace: {workspace_path}") - else: - # workspace_path is a directory, need to find APK within it - actual_apk_path = None - workflow.logger.info("Workspace is a directory, APK detection will be handled by modules") - - # Phase 1: Jadx decompilation (if enabled and APK provided) - jadx_result = None - analysis_workspace = workspace_path - - if decompile_apk and actual_apk_path: - workflow.logger.info(f"Phase 1: Decompiling APK with Jadx (apk={actual_apk_path})") - - jadx_activity_config = { - **jadx_config, - "apk_path": actual_apk_path - } - - jadx_result = await workflow.execute_activity( - "decompile_with_jadx", - args=[workspace_path, 
jadx_activity_config], - start_to_close_timeout=timedelta(minutes=15), - retry_policy=retry_policy, - ) - - if jadx_result.get("status") == "success": - # Use decompiled sources as workspace for OpenGrep - source_dir = jadx_result.get("summary", {}).get("source_dir") - if source_dir: - analysis_workspace = source_dir - workflow.logger.info( - f"āœ“ Jadx decompiled {jadx_result.get('summary', {}).get('java_files', 0)} Java files" - ) - else: - workflow.logger.warning(f"Jadx decompilation failed: {jadx_result.get('error')}") - else: - workflow.logger.info("Phase 1: Jadx decompilation skipped") - - # Phase 2: OpenGrep static analysis - workflow.logger.info(f"Phase 2: OpenGrep analysis on {analysis_workspace}") - - opengrep_result = await workflow.execute_activity( - "scan_with_opengrep", - args=[analysis_workspace, opengrep_config], - start_to_close_timeout=timedelta(minutes=20), - retry_policy=retry_policy, - ) - - workflow.logger.info( - f"āœ“ OpenGrep completed: {opengrep_result.get('summary', {}).get('total_findings', 0)} findings" - ) - - # Phase 3: MobSF analysis (if enabled and APK provided) - mobsf_result = None - - if mobsf_config.get("enabled", True) and actual_apk_path: - workflow.logger.info(f"Phase 3: MobSF scan on APK: {actual_apk_path}") - - mobsf_activity_config = { - **mobsf_config, - "file_path": actual_apk_path - } - - try: - mobsf_result = await workflow.execute_activity( - "scan_with_mobsf", - args=[workspace_path, mobsf_activity_config], - start_to_close_timeout=timedelta(minutes=30), - retry_policy=RetryPolicy( - maximum_attempts=2 # MobSF can be flaky, limit retries - ), - ) - - # Handle skipped or completed status - if mobsf_result.get("status") == "skipped": - workflow.logger.warning( - f"āš ļø MobSF skipped: {mobsf_result.get('summary', {}).get('skip_reason', 'Unknown reason')}" - ) - else: - workflow.logger.info( - f"āœ“ MobSF completed: {mobsf_result.get('summary', {}).get('total_findings', 0)} findings" - ) - except Exception as e: - 
workflow.logger.warning(f"MobSF scan failed (continuing without it): {e}") - mobsf_result = None - else: - workflow.logger.info("Phase 3: MobSF scan skipped (disabled or no APK)") - - # Phase 4: Generate SARIF report - workflow.logger.info("Phase 4: Generating SARIF report") - - sarif_report = await workflow.execute_activity( - "generate_android_sarif", - args=[jadx_result or {}, opengrep_result, mobsf_result, reporter_config, workspace_path], - start_to_close_timeout=timedelta(minutes=5), - retry_policy=retry_policy, - ) - - # Phase 5: Upload results to MinIO - workflow.logger.info("Phase 5: Uploading results to MinIO") - - result_url = await workflow.execute_activity( - "upload_results", - args=[workflow.info().workflow_id, sarif_report, "sarif"], - start_to_close_timeout=timedelta(minutes=10), - retry_policy=retry_policy, - ) - - workflow.logger.info(f"āœ“ Results uploaded: {result_url}") - - # Phase 6: Cleanup cache - workflow.logger.info("Phase 6: Cleaning up cache") - - await workflow.execute_activity( - "cleanup_cache", - args=[workspace_path, "shared"], - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy(maximum_attempts=1), # Don't retry cleanup - ) - - # Calculate summary (guard against an empty "runs" list to avoid IndexError) - sarif_runs = sarif_report.get("runs") or [{}] - total_findings = len(sarif_runs[0].get("results", [])) - - summary = { - "workflow": "android_static_analysis", - "target_id": target_id, - "total_findings": total_findings, - "decompiled_java_files": jadx_result.get("summary", {}).get("java_files", 0) if jadx_result else 0, - "opengrep_findings": opengrep_result.get("summary", {}).get("total_findings", 0), - "mobsf_findings": mobsf_result.get("summary", {}).get("total_findings", 0) if mobsf_result else 0, - "result_url": result_url, - } - - workflow.logger.info( - f"āœ… AndroidStaticAnalysisWorkflow completed successfully: {total_findings} findings" - ) - - return { - "sarif": sarif_report, - "summary": summary, - } diff --git a/backend/toolbox/workflows/atheris_fuzzing/__init__.py 
b/backend/toolbox/workflows/atheris_fuzzing/__init__.py deleted file mode 100644 index 38b1648..0000000 --- a/backend/toolbox/workflows/atheris_fuzzing/__init__.py +++ /dev/null @@ -1,9 +0,0 @@ -""" -Atheris Fuzzing Workflow - -Fuzzes user-provided Python code using Atheris. -""" - -from .workflow import AtherisFuzzingWorkflow - -__all__ = ["AtherisFuzzingWorkflow"] diff --git a/backend/toolbox/workflows/atheris_fuzzing/activities.py b/backend/toolbox/workflows/atheris_fuzzing/activities.py deleted file mode 100644 index 2ed31b7..0000000 --- a/backend/toolbox/workflows/atheris_fuzzing/activities.py +++ /dev/null @@ -1,122 +0,0 @@ -""" -Atheris Fuzzing Workflow Activities - -Activities specific to the Atheris fuzzing workflow. -""" - -import logging -import sys -from datetime import datetime -from pathlib import Path -from typing import Dict, Any -import os - -import httpx -from temporalio import activity - -# Configure logging -logger = logging.getLogger(__name__) - -# Add toolbox to path for module imports -sys.path.insert(0, '/app/toolbox') - - -@activity.defn(name="fuzz_with_atheris") -async def fuzz_activity(workspace_path: str, config: dict) -> dict: - """ - Fuzzing activity using the AtherisFuzzer module on user code. - - This activity: - 1. Imports the reusable AtherisFuzzer module - 2. Sets up real-time stats callback - 3. Executes fuzzing on user's TestOneInput() function - 4. 
Returns findings as ModuleResult - - Args: - workspace_path: Path to the workspace directory (user's uploaded code) - config: Fuzzer configuration (target_file, max_iterations, timeout_seconds) - - Returns: - Fuzzer results dictionary (findings, summary, metadata) - """ - logger.info(f"Activity: fuzz_with_atheris (workspace={workspace_path})") - - try: - # Import reusable AtherisFuzzer module - from modules.fuzzer import AtherisFuzzer - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - # Get activity info for real-time stats - info = activity.info() - run_id = info.workflow_id - - # Define stats callback for real-time monitoring - async def stats_callback(stats_data: Dict[str, Any]): - """Callback for live fuzzing statistics""" - try: - # Prepare stats payload for backend - coverage_value = stats_data.get("coverage", 0) - logger.info(f"COVERAGE_DEBUG: coverage from stats_data = {coverage_value}") - - stats_payload = { - "run_id": run_id, - "workflow": "atheris_fuzzing", - "executions": stats_data.get("total_execs", 0), - "executions_per_sec": stats_data.get("execs_per_sec", 0.0), - "crashes": stats_data.get("crashes", 0), - "unique_crashes": stats_data.get("crashes", 0), - "coverage": coverage_value, - "corpus_size": stats_data.get("corpus_size", 0), - "elapsed_time": stats_data.get("elapsed_time", 0), - "last_crash_time": None - } - - # POST stats to backend API for real-time monitoring - backend_url = os.getenv("BACKEND_URL", "http://backend:8000") - async with httpx.AsyncClient(timeout=5.0) as client: - try: - await client.post( - f"{backend_url}/fuzzing/{run_id}/stats", - json=stats_payload - ) - except Exception as http_err: - logger.debug(f"Failed to post stats to backend: {http_err}") - - # Also log for debugging - logger.info("LIVE_STATS", extra={ - "stats_type": "fuzzing_live_update", - "workflow_type": "atheris_fuzzing", - "run_id": run_id, - "executions": 
stats_data.get("total_execs", 0), - "executions_per_sec": stats_data.get("execs_per_sec", 0.0), - "crashes": stats_data.get("crashes", 0), - "corpus_size": stats_data.get("corpus_size", 0), - "coverage": stats_data.get("coverage", 0.0), - "elapsed_time": stats_data.get("elapsed_time", 0), - "timestamp": datetime.utcnow().isoformat() - }) - except Exception as e: - logger.warning(f"Error in stats callback: {e}") - - # Add stats callback and run_id to config - config["stats_callback"] = stats_callback - config["run_id"] = run_id - - # Execute the fuzzer module - fuzzer = AtherisFuzzer() - result = await fuzzer.execute(config, workspace) - - logger.info( - f"āœ“ Fuzzing completed: " - f"{result.summary.get('total_executions', 0)} executions, " - f"{result.summary.get('crashes_found', 0)} crashes" - ) - - return result.dict() - - except Exception as e: - logger.error(f"Fuzzing failed: {e}", exc_info=True) - raise diff --git a/backend/toolbox/workflows/atheris_fuzzing/metadata.yaml b/backend/toolbox/workflows/atheris_fuzzing/metadata.yaml deleted file mode 100644 index c119aad..0000000 --- a/backend/toolbox/workflows/atheris_fuzzing/metadata.yaml +++ /dev/null @@ -1,60 +0,0 @@ -name: atheris_fuzzing -version: "1.0.0" -vertical: python -description: "Fuzz Python code using Atheris with real-time monitoring. Automatically discovers and fuzzes TestOneInput() functions in user code." 
-author: "FuzzForge Team" -tags: - - "fuzzing" - - "atheris" - - "python" - - "coverage" - - "security" - -# Workspace isolation mode (system-level configuration) -# - "isolated" (default): Each workflow run gets its own isolated workspace (safe for concurrent fuzzing) -# - "shared": All runs share the same workspace (for read-only analysis workflows) -# - "copy-on-write": Download once, copy for each run (balances performance and isolation) -workspace_isolation: "isolated" - -parameters: - type: object - properties: - target_file: - type: string - description: "Python file with TestOneInput() function (auto-discovered if not specified)" - max_iterations: - type: integer - default: 1000000 - description: "Maximum fuzzing iterations" - timeout_seconds: - type: integer - default: 1800 - description: "Fuzzing timeout in seconds (30 minutes)" - -output_schema: - type: object - properties: - findings: - type: array - description: "Crashes and vulnerabilities found during fuzzing" - items: - type: object - properties: - title: - type: string - severity: - type: string - category: - type: string - metadata: - type: object - summary: - type: object - description: "Fuzzing execution summary" - properties: - total_executions: - type: integer - crashes_found: - type: integer - execution_time: - type: number diff --git a/backend/toolbox/workflows/atheris_fuzzing/workflow.py b/backend/toolbox/workflows/atheris_fuzzing/workflow.py deleted file mode 100644 index a9b0cad..0000000 --- a/backend/toolbox/workflows/atheris_fuzzing/workflow.py +++ /dev/null @@ -1,175 +0,0 @@ -""" -Atheris Fuzzing Workflow - Temporal Version - -Fuzzes user-provided Python code using Atheris with real-time monitoring. 
-""" - -from datetime import timedelta -from typing import Dict, Any, Optional - -from temporalio import workflow -from temporalio.common import RetryPolicy - -# Import for type hints (will be executed by worker) -with workflow.unsafe.imports_passed_through(): - import logging - -logger = logging.getLogger(__name__) - - -@workflow.defn -class AtherisFuzzingWorkflow: - """ - Fuzz Python code using Atheris. - - User workflow: - 1. User runs: ff workflow run atheris_fuzzing . - 2. CLI uploads project to MinIO - 3. Worker downloads project - 4. Worker fuzzes TestOneInput() function - 5. Crashes reported as findings - """ - - @workflow.run - async def run( - self, - target_id: str, # MinIO UUID of uploaded user code - target_file: Optional[str] = None, # Optional: specific file to fuzz - max_iterations: int = 1000000, - timeout_seconds: int = 1800 # 30 minutes default for fuzzing - ) -> Dict[str, Any]: - """ - Main workflow execution. - - Args: - target_id: UUID of the uploaded target in MinIO - target_file: Optional specific Python file with TestOneInput() (auto-discovered if None) - max_iterations: Maximum fuzzing iterations - timeout_seconds: Fuzzing timeout in seconds - - Returns: - Dictionary containing findings and summary - """ - workflow_id = workflow.info().workflow_id - - workflow.logger.info( - f"Starting AtherisFuzzingWorkflow " - f"(workflow_id={workflow_id}, target_id={target_id}, " - f"target_file={target_file or 'auto-discover'}, max_iterations={max_iterations}, " - f"timeout_seconds={timeout_seconds})" - ) - - results = { - "workflow_id": workflow_id, - "target_id": target_id, - "status": "running", - "steps": [] - } - - try: - # Get run ID for workspace isolation - run_id = workflow.info().run_id - - # Step 1: Download user's project from MinIO - workflow.logger.info("Step 1: Downloading user code from MinIO") - target_path = await workflow.execute_activity( - "get_target", - args=[target_id, run_id, "isolated"], # target_id, run_id, 
workspace_isolation - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - results["steps"].append({ - "step": "download_target", - "status": "success", - "target_path": target_path - }) - workflow.logger.info(f"āœ“ User code downloaded to: {target_path}") - - # Step 2: Run Atheris fuzzing - workflow.logger.info("Step 2: Running Atheris fuzzing") - - # Use defaults if parameters are None - actual_max_iterations = max_iterations if max_iterations is not None else 1000000 - actual_timeout_seconds = timeout_seconds if timeout_seconds is not None else 1800 - - fuzz_config = { - "target_file": target_file, - "max_iterations": actual_max_iterations, - "timeout_seconds": actual_timeout_seconds - } - - fuzz_results = await workflow.execute_activity( - "fuzz_with_atheris", - args=[target_path, fuzz_config], - start_to_close_timeout=timedelta(seconds=actual_timeout_seconds + 60), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=1 # Fuzzing shouldn't retry - ) - ) - - results["steps"].append({ - "step": "fuzzing", - "status": "success", - "executions": fuzz_results.get("summary", {}).get("total_executions", 0), - "crashes": fuzz_results.get("summary", {}).get("crashes_found", 0) - }) - workflow.logger.info( - f"āœ“ Fuzzing completed: " - f"{fuzz_results.get('summary', {}).get('total_executions', 0)} executions, " - f"{fuzz_results.get('summary', {}).get('crashes_found', 0)} crashes" - ) - - # Step 3: Upload results to MinIO - workflow.logger.info("Step 3: Uploading results") - try: - results_url = await workflow.execute_activity( - "upload_results", - args=[workflow_id, fuzz_results, "json"], - start_to_close_timeout=timedelta(minutes=2) - ) - results["results_url"] = results_url - workflow.logger.info(f"āœ“ Results uploaded to: {results_url}") - except Exception
as e: - workflow.logger.warning(f"Failed to upload results: {e}") - results["results_url"] = None - - # Step 4: Cleanup cache - workflow.logger.info("Step 4: Cleaning up cache") - try: - await workflow.execute_activity( - "cleanup_cache", - args=[target_path, "isolated"], # target_path, workspace_isolation - start_to_close_timeout=timedelta(minutes=1) - ) - workflow.logger.info("āœ“ Cache cleaned up") - except Exception as e: - workflow.logger.warning(f"Cache cleanup failed: {e}") - - # Mark workflow as successful - results["status"] = "success" - results["findings"] = fuzz_results.get("findings", []) - results["summary"] = fuzz_results.get("summary", {}) - results["sarif"] = fuzz_results.get("sarif") or {} - workflow.logger.info( - f"āœ“ Workflow completed successfully: {workflow_id} " - f"({results['summary'].get('crashes_found', 0)} crashes found)" - ) - - return results - - except Exception as e: - workflow.logger.error(f"Workflow failed: {e}") - results["status"] = "error" - results["error"] = str(e) - results["steps"].append({ - "step": "error", - "status": "failed", - "error": str(e) - }) - raise diff --git a/backend/toolbox/workflows/cargo_fuzzing/__init__.py b/backend/toolbox/workflows/cargo_fuzzing/__init__.py deleted file mode 100644 index d496e88..0000000 --- a/backend/toolbox/workflows/cargo_fuzzing/__init__.py +++ /dev/null @@ -1,5 +0,0 @@ -"""Cargo Fuzzing Workflow""" - -from .workflow import CargoFuzzingWorkflow - -__all__ = ["CargoFuzzingWorkflow"] diff --git a/backend/toolbox/workflows/cargo_fuzzing/activities.py b/backend/toolbox/workflows/cargo_fuzzing/activities.py deleted file mode 100644 index e23e929..0000000 --- a/backend/toolbox/workflows/cargo_fuzzing/activities.py +++ /dev/null @@ -1,203 +0,0 @@ -""" -Cargo Fuzzing Workflow Activities - -Activities specific to the cargo-fuzz fuzzing workflow.
-""" - -import logging -import sys -from datetime import datetime -from pathlib import Path -from typing import Dict, Any -import os - -import httpx -from temporalio import activity - -# Configure logging -logger = logging.getLogger(__name__) - -# Add toolbox to path for module imports -sys.path.insert(0, '/app/toolbox') - - -@activity.defn(name="fuzz_with_cargo") -async def fuzz_activity(workspace_path: str, config: dict) -> dict: - """ - Fuzzing activity using the CargoFuzzer module on user code. - - This activity: - 1. Imports the reusable CargoFuzzer module - 2. Sets up real-time stats callback - 3. Executes fuzzing on user's fuzz_target!() functions - 4. Returns findings as ModuleResult - - Args: - workspace_path: Path to the workspace directory (user's uploaded Rust project) - config: Fuzzer configuration (target_name, max_iterations, timeout_seconds, sanitizer) - - Returns: - Fuzzer results dictionary (findings, summary, metadata) - """ - logger.info(f"Activity: fuzz_with_cargo (workspace={workspace_path})") - - try: - # Import reusable CargoFuzzer module - from modules.fuzzer import CargoFuzzer - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - # Get activity info for real-time stats - info = activity.info() - run_id = info.workflow_id - - # Define stats callback for real-time monitoring - async def stats_callback(stats_data: Dict[str, Any]): - """Callback for live fuzzing statistics""" - try: - # Prepare stats payload for backend - coverage_value = stats_data.get("coverage", 0) - - stats_payload = { - "run_id": run_id, - "workflow": "cargo_fuzzing", - "executions": stats_data.get("total_execs", 0), - "executions_per_sec": stats_data.get("execs_per_sec", 0.0), - "crashes": stats_data.get("crashes", 0), - "unique_crashes": stats_data.get("crashes", 0), - "coverage": coverage_value, - "corpus_size": stats_data.get("corpus_size", 0), - "elapsed_time": 
stats_data.get("elapsed_time", 0), - "last_crash_time": None - } - - # POST stats to backend API for real-time monitoring - backend_url = os.getenv("BACKEND_URL", "http://backend:8000") - async with httpx.AsyncClient(timeout=5.0) as client: - try: - await client.post( - f"{backend_url}/fuzzing/{run_id}/stats", - json=stats_payload - ) - except Exception as http_err: - logger.debug(f"Failed to post stats to backend: {http_err}") - - # Also log for debugging - logger.info("LIVE_STATS", extra={ - "stats_type": "fuzzing_live_update", - "workflow_type": "cargo_fuzzing", - "run_id": run_id, - "executions": stats_data.get("total_execs", 0), - "executions_per_sec": stats_data.get("execs_per_sec", 0.0), - "crashes": stats_data.get("crashes", 0), - "corpus_size": stats_data.get("corpus_size", 0), - "coverage": stats_data.get("coverage", 0.0), - "elapsed_time": stats_data.get("elapsed_time", 0), - "timestamp": datetime.utcnow().isoformat() - }) - - except Exception as e: - logger.error(f"Stats callback error: {e}") - - # Initialize CargoFuzzer module - fuzzer = CargoFuzzer() - - # Execute fuzzing with stats callback - module_result = await fuzzer.execute( - config=config, - workspace=workspace, - stats_callback=stats_callback - ) - - # Convert ModuleResult to dictionary - result_dict = { - "findings": [], - "summary": module_result.summary, - "metadata": module_result.metadata, - "status": module_result.status, - "error": module_result.error - } - - # Convert findings to dict format - for finding in module_result.findings: - finding_dict = { - "id": finding.id, - "title": finding.title, - "description": finding.description, - "severity": finding.severity, - "category": finding.category, - "file_path": finding.file_path, - "line_start": finding.line_start, - "line_end": finding.line_end, - "code_snippet": finding.code_snippet, - "recommendation": finding.recommendation, - "metadata": finding.metadata - } - result_dict["findings"].append(finding_dict) - - # Generate SARIF 
report from findings - if module_result.findings: - # Convert findings to SARIF format - severity_map = { - "critical": "error", - "high": "error", - "medium": "warning", - "low": "note", - "info": "note" - } - - results = [] - for finding in module_result.findings: - result = { - "ruleId": finding.metadata.get("rule_id", finding.category), - "level": severity_map.get(finding.severity, "warning"), - "message": {"text": finding.description}, - "locations": [] - } - - if finding.file_path: - location = { - "physicalLocation": { - "artifactLocation": {"uri": finding.file_path}, - "region": { - "startLine": finding.line_start or 1, - "endLine": finding.line_end or finding.line_start or 1 - } - } - } - result["locations"].append(location) - - results.append(result) - - result_dict["sarif"] = { - "version": "2.1.0", - "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json", - "runs": [{ - "tool": { - "driver": { - "name": "cargo-fuzz", - "version": "0.11.2" - } - }, - "results": results - }] - } - else: - result_dict["sarif"] = { - "version": "2.1.0", - "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json", - "runs": [] - } - - logger.info( - f"Fuzzing activity completed: {len(module_result.findings)} crashes found, " - f"{module_result.summary.get('total_executions', 0)} executions" - ) - - return result_dict - - except Exception as e: - logger.error(f"Fuzzing activity failed: {e}", exc_info=True) - raise diff --git a/backend/toolbox/workflows/cargo_fuzzing/metadata.yaml b/backend/toolbox/workflows/cargo_fuzzing/metadata.yaml deleted file mode 100644 index 829a1f3..0000000 --- a/backend/toolbox/workflows/cargo_fuzzing/metadata.yaml +++ /dev/null @@ -1,65 +0,0 @@ -name: cargo_fuzzing -version: "1.0.0" -vertical: rust -description: "Fuzz Rust code using cargo-fuzz with real-time monitoring. Automatically discovers and fuzzes fuzz_target!() functions in user code." 
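The findings-to-SARIF conversion above boils down to a fixed severity-to-level mapping plus an optional physical location. A minimal standalone sketch (the helper name `finding_to_sarif_result` is illustrative, not from the repo; the mapping itself mirrors the deleted activity):

```python
# Severity -> SARIF level mapping, as used in the deleted cargo-fuzz activity.
SEVERITY_TO_SARIF = {
    "critical": "error",
    "high": "error",
    "medium": "warning",
    "low": "note",
    "info": "note",
}

def finding_to_sarif_result(finding: dict) -> dict:
    """Convert one finding dict into a minimal SARIF 2.1.0 result object."""
    result = {
        # Fall back from an explicit rule_id to the finding's category.
        "ruleId": finding.get("metadata", {}).get(
            "rule_id", finding.get("category", "unknown")
        ),
        "level": SEVERITY_TO_SARIF.get(finding.get("severity", ""), "warning"),
        "message": {"text": finding.get("description", "")},
        "locations": [],
    }
    if finding.get("file_path"):
        result["locations"].append({
            "physicalLocation": {
                "artifactLocation": {"uri": finding["file_path"]},
                "region": {
                    "startLine": finding.get("line_start") or 1,
                    "endLine": finding.get("line_end")
                               or finding.get("line_start") or 1,
                },
            }
        })
    return result
```

The resulting dicts drop straight into `runs[0].results` of a SARIF 2.1.0 envelope, which is what both the cargo-fuzz and Gitleaks activities do.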
-author: "FuzzForge Team" -tags: - - "fuzzing" - - "cargo-fuzz" - - "rust" - - "libfuzzer" - - "memory-safety" - -# Workspace isolation mode (system-level configuration) -# - "isolated" (default): Each workflow run gets its own isolated workspace (safe for concurrent fuzzing) -# - "shared": All runs share the same workspace (for read-only analysis workflows) -# - "copy-on-write": Download once, copy for each run (balances performance and isolation) -workspace_isolation: "isolated" - -parameters: - type: object - properties: - target_name: - type: string - description: "Fuzz target name from fuzz/fuzz_targets/ (auto-discovered if not specified)" - max_iterations: - type: integer - default: 1000000 - description: "Maximum fuzzing iterations" - timeout_seconds: - type: integer - default: 1800 - description: "Fuzzing timeout in seconds (30 minutes)" - sanitizer: - type: string - enum: ["address", "memory", "undefined"] - default: "address" - description: "Sanitizer to use (address, memory, undefined)" - -output_schema: - type: object - properties: - findings: - type: array - description: "Crashes and memory safety issues found during fuzzing" - items: - type: object - properties: - title: - type: string - severity: - type: string - category: - type: string - metadata: - type: object - summary: - type: object - description: "Fuzzing execution summary" - properties: - total_executions: - type: integer - crashes_found: - type: integer - execution_time: - type: number diff --git a/backend/toolbox/workflows/cargo_fuzzing/workflow.py b/backend/toolbox/workflows/cargo_fuzzing/workflow.py deleted file mode 100644 index 5581ee0..0000000 --- a/backend/toolbox/workflows/cargo_fuzzing/workflow.py +++ /dev/null @@ -1,180 +0,0 @@ -""" -Cargo Fuzzing Workflow - Temporal Version - -Fuzzes user-provided Rust code using cargo-fuzz with real-time monitoring. 
-""" - -from datetime import timedelta -from typing import Dict, Any, Optional - -from temporalio import workflow -from temporalio.common import RetryPolicy - -# Import for type hints (will be executed by worker) -with workflow.unsafe.imports_passed_through(): - import logging - -logger = logging.getLogger(__name__) - - -@workflow.defn -class CargoFuzzingWorkflow: - """ - Fuzz Rust code using cargo-fuzz (libFuzzer). - - User workflow: - 1. User runs: ff workflow run cargo_fuzzing . - 2. CLI uploads Rust project to MinIO - 3. Worker downloads project - 4. Worker discovers fuzz targets in fuzz/fuzz_targets/ - 5. Worker fuzzes the target with cargo-fuzz - 6. Crashes reported as findings - """ - - @workflow.run - async def run( - self, - target_id: str, # MinIO UUID of uploaded user code - target_name: Optional[str] = None, # Optional: specific fuzz target name - max_iterations: int = 1000000, - timeout_seconds: int = 1800, # 30 minutes default for fuzzing - sanitizer: str = "address" - ) -> Dict[str, Any]: - """ - Main workflow execution. 
- - Args: - target_id: UUID of the uploaded target in MinIO - target_name: Optional specific fuzz target name (auto-discovered if None) - max_iterations: Maximum fuzzing iterations - timeout_seconds: Fuzzing timeout in seconds - sanitizer: Sanitizer to use (address, memory, undefined) - - Returns: - Dictionary containing findings and summary - """ - workflow_id = workflow.info().workflow_id - - workflow.logger.info( - f"Starting CargoFuzzingWorkflow " - f"(workflow_id={workflow_id}, target_id={target_id}, " - f"target_name={target_name or 'auto-discover'}, max_iterations={max_iterations}, " - f"timeout_seconds={timeout_seconds}, sanitizer={sanitizer})" - ) - - results = { - "workflow_id": workflow_id, - "target_id": target_id, - "status": "running", - "steps": [] - } - - try: - # Get run ID for workspace isolation - run_id = workflow.info().run_id - - # Step 1: Download user's Rust project from MinIO - workflow.logger.info("Step 1: Downloading user code from MinIO") - target_path = await workflow.execute_activity( - "get_target", - args=[target_id, run_id, "isolated"], # target_id, run_id, workspace_isolation - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - results["steps"].append({ - "step": "download_target", - "status": "success", - "target_path": target_path - }) - workflow.logger.info(f"āœ“ User code downloaded to: {target_path}") - - # Step 2: Run cargo-fuzz - workflow.logger.info("Step 2: Running cargo-fuzz") - - # Use defaults if parameters are None - actual_max_iterations = max_iterations if max_iterations is not None else 1000000 - actual_timeout_seconds = timeout_seconds if timeout_seconds is not None else 1800 - actual_sanitizer = sanitizer if sanitizer is not None else "address" - - fuzz_config = { - "target_name": target_name, - "max_iterations": actual_max_iterations, - "timeout_seconds": actual_timeout_seconds,
- "sanitizer": actual_sanitizer - } - - fuzz_results = await workflow.execute_activity( - "fuzz_with_cargo", - args=[target_path, fuzz_config], - start_to_close_timeout=timedelta(seconds=actual_timeout_seconds + 120), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=1 # Fuzzing shouldn't retry - ) - ) - - results["steps"].append({ - "step": "fuzzing", - "status": "success", - "executions": fuzz_results.get("summary", {}).get("total_executions", 0), - "crashes": fuzz_results.get("summary", {}).get("crashes_found", 0) - }) - workflow.logger.info( - f"āœ“ Fuzzing completed: " - f"{fuzz_results.get('summary', {}).get('total_executions', 0)} executions, " - f"{fuzz_results.get('summary', {}).get('crashes_found', 0)} crashes" - ) - - # Step 3: Upload results to MinIO - workflow.logger.info("Step 3: Uploading results") - try: - results_url = await workflow.execute_activity( - "upload_results", - args=[workflow_id, fuzz_results, "json"], - start_to_close_timeout=timedelta(minutes=2) - ) - results["results_url"] = results_url - workflow.logger.info(f"āœ“ Results uploaded to: {results_url}") - except Exception as e: - workflow.logger.warning(f"Failed to upload results: {e}") - results["results_url"] = None - - # Step 4: Cleanup cache - workflow.logger.info("Step 4: Cleaning up cache") - try: - await workflow.execute_activity( - "cleanup_cache", - args=[target_path, "isolated"], # target_path, workspace_isolation - start_to_close_timeout=timedelta(minutes=1) - ) - workflow.logger.info("āœ“ Cache cleaned up") - except Exception as e: - workflow.logger.warning(f"Cache cleanup failed: {e}") - - # Mark workflow as successful - results["status"] = "success" - results["findings"] = fuzz_results.get("findings", []) - results["summary"] = fuzz_results.get("summary", {}) - results["sarif"] = fuzz_results.get("sarif") or {} - workflow.logger.info( - f"āœ“ Workflow completed successfully: {workflow_id} " -
f"({results['summary'].get('crashes_found', 0)} crashes found)" - ) - - return results - - except Exception as e: - workflow.logger.error(f"Workflow failed: {e}") - results["status"] = "error" - results["error"] = str(e) - results["steps"].append({ - "step": "error", - "status": "failed", - "error": str(e) - }) - raise diff --git a/backend/toolbox/workflows/gitleaks_detection/__init__.py b/backend/toolbox/workflows/gitleaks_detection/__init__.py deleted file mode 100644 index e192e0e..0000000 --- a/backend/toolbox/workflows/gitleaks_detection/__init__.py +++ /dev/null @@ -1,19 +0,0 @@ -""" -Gitleaks Detection Workflow -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -from .workflow import GitleaksDetectionWorkflow -from .activities import scan_with_gitleaks - -__all__ = ["GitleaksDetectionWorkflow", "scan_with_gitleaks"] diff --git a/backend/toolbox/workflows/gitleaks_detection/activities.py b/backend/toolbox/workflows/gitleaks_detection/activities.py deleted file mode 100644 index c7273a3..0000000 --- a/backend/toolbox/workflows/gitleaks_detection/activities.py +++ /dev/null @@ -1,166 +0,0 @@ -""" -Gitleaks Detection Workflow Activities -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. 
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import logging -from pathlib import Path -from typing import Dict, Any - -from temporalio import activity - -try: - from toolbox.modules.secret_detection.gitleaks import GitleaksModule -except ImportError: - try: - from modules.secret_detection.gitleaks import GitleaksModule - except ImportError: - from src.toolbox.modules.secret_detection.gitleaks import GitleaksModule - -logger = logging.getLogger(__name__) - - -@activity.defn(name="scan_with_gitleaks") -async def scan_with_gitleaks(target_path: str, config: Dict[str, Any]) -> Dict[str, Any]: - """ - Scan code using Gitleaks. - - Args: - target_path: Path to the workspace containing code - config: Gitleaks configuration - - Returns: - Dictionary containing findings and summary - """ - activity.logger.info(f"Starting Gitleaks scan: {target_path}") - activity.logger.info(f"Config: {config}") - - workspace = Path(target_path) - - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {target_path}") - - # Create and execute Gitleaks module - gitleaks = GitleaksModule() - - # Validate configuration - gitleaks.validate_config(config) - - # Execute scan - result = await gitleaks.execute(config, workspace) - - if result.status == "failed": - raise RuntimeError(f"Gitleaks scan failed: {result.error or 'Unknown error'}") - - activity.logger.info( - f"Gitleaks scan completed: {len(result.findings)} findings from " - f"{result.summary.get('files_scanned', 0)} files" - ) - - # Convert ModuleFinding objects to dicts for serialization - findings_dicts = [finding.model_dump() for finding in result.findings] - - return { - "findings": findings_dicts, - "summary": result.summary - } - - -@activity.defn(name="gitleaks_generate_sarif") -async def gitleaks_generate_sarif(findings: list, metadata: Dict[str, Any]) -> Dict[str, Any]: - """ - Generate SARIF 
report from Gitleaks findings. - - Args: - findings: List of finding dictionaries - metadata: Metadata including tool_name, tool_version, run_id - - Returns: - SARIF report dictionary - """ - activity.logger.info(f"Generating SARIF report from {len(findings)} findings") - - # Debug: Check if first finding has line_start - if findings: - first_finding = findings[0] - activity.logger.info(f"First finding keys: {list(first_finding.keys())}") - activity.logger.info(f"line_start value: {first_finding.get('line_start')}") - - # Basic SARIF 2.1.0 structure - sarif_report = { - "version": "2.1.0", - "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json", - "runs": [ - { - "tool": { - "driver": { - "name": metadata.get("tool_name", "gitleaks"), - "version": metadata.get("tool_version", "8.18.0"), - "informationUri": "https://github.com/gitleaks/gitleaks" - } - }, - "results": [] - } - ] - } - - # Convert findings to SARIF results - for finding in findings: - sarif_result = { - "ruleId": finding.get("metadata", {}).get("rule_id", "unknown"), - "level": _severity_to_sarif_level(finding.get("severity", "warning")), - "message": { - "text": finding.get("title", "Secret leak detected") - }, - "locations": [] - } - - # Add description if present - if finding.get("description"): - sarif_result["message"]["markdown"] = finding["description"] - - # Add location if file path is present - if finding.get("file_path"): - location = { - "physicalLocation": { - "artifactLocation": { - "uri": finding["file_path"] - } - } - } - - # Add region if line number is present - if finding.get("line_start"): - location["physicalLocation"]["region"] = { - "startLine": finding["line_start"] - } - - sarif_result["locations"].append(location) - - sarif_report["runs"][0]["results"].append(sarif_result) - - activity.logger.info(f"Generated SARIF report with {len(sarif_report['runs'][0]['results'])} results") - - return sarif_report - - -def 
_severity_to_sarif_level(severity: str) -> str: - """Convert severity to SARIF level""" - severity_map = { - "critical": "error", - "high": "error", - "medium": "warning", - "low": "note", - "info": "note" - } - return severity_map.get(severity.lower(), "warning") diff --git a/backend/toolbox/workflows/gitleaks_detection/metadata.yaml b/backend/toolbox/workflows/gitleaks_detection/metadata.yaml deleted file mode 100644 index ad4ae45..0000000 --- a/backend/toolbox/workflows/gitleaks_detection/metadata.yaml +++ /dev/null @@ -1,34 +0,0 @@ -name: gitleaks_detection -version: "1.0.0" -vertical: secrets -description: "Detect secrets and credentials using Gitleaks" -author: "FuzzForge Team" -tags: - - "secrets" - - "gitleaks" - - "git" - - "leak-detection" - -workspace_isolation: "shared" - -parameters: - type: object - properties: - scan_mode: - type: string - enum: ["detect", "protect"] - default: "detect" - description: "Scan mode: detect (entire repo history) or protect (staged changes)" - - redact: - type: boolean - default: true - description: "Redact secrets in output" - - no_git: - type: boolean - default: false - description: "Scan files without Git context" - -required_modules: - - "gitleaks" diff --git a/backend/toolbox/workflows/gitleaks_detection/workflow.py b/backend/toolbox/workflows/gitleaks_detection/workflow.py deleted file mode 100644 index 4960e16..0000000 --- a/backend/toolbox/workflows/gitleaks_detection/workflow.py +++ /dev/null @@ -1,187 +0,0 @@ -""" -Gitleaks Detection Workflow - Temporal Version - -Scans code for secrets and credentials using Gitleaks. -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. 
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -from datetime import timedelta -from typing import Dict, Any - -from temporalio import workflow -from temporalio.common import RetryPolicy - -# Import for type hints (will be executed by worker) -with workflow.unsafe.imports_passed_through(): - import logging - -logger = logging.getLogger(__name__) - - -@workflow.defn -class GitleaksDetectionWorkflow: - """ - Scan code for secrets using Gitleaks. - - User workflow: - 1. User runs: ff workflow run gitleaks_detection . - 2. CLI uploads project to MinIO - 3. Worker downloads project - 4. Worker runs Gitleaks - 5. Secrets reported as findings in SARIF format - """ - - @workflow.run - async def run( - self, - target_id: str, # MinIO UUID of uploaded user code - scan_mode: str = "detect", - redact: bool = True, - no_git: bool = True - ) -> Dict[str, Any]: - """ - Main workflow execution. 
- - Args: - target_id: UUID of the uploaded target in MinIO - scan_mode: Scan mode ('detect' or 'protect') - redact: Redact secrets in output - no_git: Scan files without Git context - - Returns: - Dictionary containing findings and summary - """ - workflow_id = workflow.info().workflow_id - - workflow.logger.info( - f"Starting GitleaksDetectionWorkflow " - f"(workflow_id={workflow_id}, target_id={target_id}, scan_mode={scan_mode})" - ) - - results = { - "workflow_id": workflow_id, - "target_id": target_id, - "status": "running", - "steps": [], - "findings": [] - } - - try: - # Get run ID for workspace isolation - run_id = workflow.info().run_id - - # Step 1: Download user's project from MinIO - workflow.logger.info("Step 1: Downloading user code from MinIO") - target_path = await workflow.execute_activity( - "get_target", - args=[target_id, run_id, "shared"], - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - results["steps"].append({ - "step": "download", - "status": "success", - "target_path": target_path - }) - workflow.logger.info(f"āœ“ Target downloaded to: {target_path}") - - # Step 2: Run Gitleaks - workflow.logger.info("Step 2: Scanning with Gitleaks") - - scan_config = { - "scan_mode": scan_mode, - "redact": redact, - "no_git": no_git - } - - scan_results = await workflow.execute_activity( - "scan_with_gitleaks", - args=[target_path, scan_config], - start_to_close_timeout=timedelta(minutes=10), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=2 - ) - ) - - results["steps"].append({ - "step": "gitleaks_scan", - "status": "success", - "leaks_found": scan_results.get("summary", {}).get("total_leaks", 0) - }) - workflow.logger.info( - f"āœ“ Gitleaks scan completed: " - f"{scan_results.get('summary', {}).get('total_leaks', 0)} leaks found" - ) -
- # Step 3: Generate SARIF report - workflow.logger.info("Step 3: Generating SARIF report") - sarif_report = await workflow.execute_activity( - "gitleaks_generate_sarif", - args=[scan_results.get("findings", []), {"tool_name": "gitleaks", "tool_version": "8.18.0"}], - start_to_close_timeout=timedelta(minutes=2) - ) - - # Step 4: Upload results to MinIO - workflow.logger.info("Step 4: Uploading results") - try: - results_url = await workflow.execute_activity( - "upload_results", - args=[workflow_id, scan_results, "json"], - start_to_close_timeout=timedelta(minutes=2) - ) - results["results_url"] = results_url - workflow.logger.info(f"āœ“ Results uploaded to: {results_url}") - except Exception as e: - workflow.logger.warning(f"Failed to upload results: {e}") - results["results_url"] = None - - # Step 5: Cleanup cache - workflow.logger.info("Step 5: Cleaning up cache") - try: - await workflow.execute_activity( - "cleanup_cache", - args=[target_path, "shared"], - start_to_close_timeout=timedelta(minutes=1) - ) - workflow.logger.info("āœ“ Cache cleaned up") - except Exception as e: - workflow.logger.warning(f"Cache cleanup failed: {e}") - - # Mark workflow as successful - results["status"] = "success" - results["findings"] = scan_results.get("findings", []) - results["summary"] = scan_results.get("summary", {}) - results["sarif"] = sarif_report or {} - workflow.logger.info( - f"āœ“ Workflow completed successfully: {workflow_id} " - f"({results['summary'].get('total_leaks', 0)} leaks found)" - ) - - return results - - except Exception as e: - workflow.logger.error(f"Workflow failed: {e}") - results["status"] = "error" - results["error"] = str(e) - results["steps"].append({ - "step": "error", - "status": "failed", - "error": str(e) - }) - raise diff --git a/backend/toolbox/workflows/llm_analysis/__init__.py b/backend/toolbox/workflows/llm_analysis/__init__.py deleted file mode 100644 index 028946c..0000000 --- a/backend/toolbox/workflows/llm_analysis/__init__.py +++
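Every workflow in this diff shares the same step-bookkeeping shape: append a success record per step, and on failure convert the results dict into an error envelope before re-raising. A minimal sketch of that pattern in isolation (the helper names `record_step` and `record_failure` are illustrative, not from the repo):

```python
# Step-bookkeeping pattern shared by the deleted Temporal workflows:
# a mutable results dict accumulates one record per completed step.

def record_step(results: dict, step: str, **details) -> None:
    """Append a success record for a completed step."""
    results["steps"].append({"step": step, "status": "success", **details})

def record_failure(results: dict, exc: Exception) -> None:
    """Flip the results dict into the error envelope the workflows return."""
    results["status"] = "error"
    results["error"] = str(exc)
    results["steps"].append({"step": "error", "status": "failed", "error": str(exc)})

# Mirrors the gitleaks_detection run() flow above (values are made up).
results = {"workflow_id": "wf-1", "status": "running", "steps": []}
record_step(results, "download", target_path="/work/wf-1")
record_step(results, "gitleaks_scan", leaks_found=2)
```

Because Temporal replays workflow code deterministically, keeping this bookkeeping as plain dict mutation (no I/O, no clocks) is what makes it safe inside the workflow body; all side effects stay in activities.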
/dev/null @@ -1,19 +0,0 @@ -""" -LLM Analysis Workflow -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -from .workflow import LlmAnalysisWorkflow -from .activities import analyze_with_llm - -__all__ = ["LlmAnalysisWorkflow", "analyze_with_llm"] diff --git a/backend/toolbox/workflows/llm_analysis/activities.py b/backend/toolbox/workflows/llm_analysis/activities.py deleted file mode 100644 index cb47599..0000000 --- a/backend/toolbox/workflows/llm_analysis/activities.py +++ /dev/null @@ -1,162 +0,0 @@ -""" -LLM Analysis Workflow Activities -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- -import logging -from pathlib import Path -from typing import Dict, Any - -from temporalio import activity - -try: - from toolbox.modules.analyzer.llm_analyzer import LLMAnalyzer -except ImportError: - try: - from modules.analyzer.llm_analyzer import LLMAnalyzer - except ImportError: - from src.toolbox.modules.analyzer.llm_analyzer import LLMAnalyzer - -logger = logging.getLogger(__name__) - - -@activity.defn(name="llm_generate_sarif") -async def llm_generate_sarif(findings: list, metadata: Dict[str, Any]) -> Dict[str, Any]: - """ - Generate SARIF report from LLM findings. - - Args: - findings: List of finding dictionaries - metadata: Metadata including tool_name, tool_version, run_id - - Returns: - SARIF report dictionary - """ - activity.logger.info(f"Generating SARIF report from {len(findings)} findings") - - # Basic SARIF 2.1.0 structure - sarif_report = { - "version": "2.1.0", - "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json", - "runs": [ - { - "tool": { - "driver": { - "name": metadata.get("tool_name", "llm-analyzer"), - "version": metadata.get("tool_version", "1.0.0"), - "informationUri": "https://github.com/FuzzingLabs/fuzzforge_ai" - } - }, - "results": [] - } - ] - } - - # Convert findings to SARIF results - for finding in findings: - sarif_result = { - "ruleId": finding.get("id", "unknown"), - "level": _severity_to_sarif_level(finding.get("severity", "warning")), - "message": { - "text": finding.get("title", "Security issue detected") - }, - "locations": [] - } - - # Add description if present - if finding.get("description"): - sarif_result["message"]["markdown"] = finding["description"] - - # Add location if file path is present - if finding.get("file_path"): - location = { - "physicalLocation": { - "artifactLocation": { - "uri": finding["file_path"] - } - } - } - - # Add region if line number is present - if finding.get("line_start"): - location["physicalLocation"]["region"] = { - 
"startLine": finding["line_start"] - } - if finding.get("line_end"): - location["physicalLocation"]["region"]["endLine"] = finding["line_end"] - - sarif_result["locations"].append(location) - - sarif_report["runs"][0]["results"].append(sarif_result) - - activity.logger.info(f"Generated SARIF report with {len(sarif_report['runs'][0]['results'])} results") - - return sarif_report - - -def _severity_to_sarif_level(severity: str) -> str: - """Convert severity to SARIF level""" - severity_map = { - "critical": "error", - "high": "error", - "medium": "warning", - "low": "note", - "info": "note" - } - return severity_map.get(severity.lower(), "warning") - - -@activity.defn(name="analyze_with_llm") -async def analyze_with_llm(target_path: str, config: Dict[str, Any]) -> Dict[str, Any]: - """ - Analyze code using LLM. - - Args: - target_path: Path to the workspace containing code - config: LLM analyzer configuration - - Returns: - Dictionary containing findings and summary - """ - activity.logger.info(f"Starting LLM analysis: {target_path}") - activity.logger.info(f"Config: {config}") - - workspace = Path(target_path) - - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {target_path}") - - # Create and execute LLM analyzer - analyzer = LLMAnalyzer() - - # Validate configuration - analyzer.validate_config(config) - - # Execute analysis - result = await analyzer.execute(config, workspace) - - if result.status == "failed": - raise RuntimeError(f"LLM analysis failed: {result.error or 'Unknown error'}") - - activity.logger.info( - f"LLM analysis completed: {len(result.findings)} findings from " - f"{result.summary.get('files_analyzed', 0)} files" - ) - - # Convert ModuleFinding objects to dicts for serialization - findings_dicts = [finding.model_dump() for finding in result.findings] - - return { - "findings": findings_dicts, - "summary": result.summary - } diff --git a/backend/toolbox/workflows/llm_analysis/metadata.yaml 
b/backend/toolbox/workflows/llm_analysis/metadata.yaml deleted file mode 100644 index 2631b59..0000000 --- a/backend/toolbox/workflows/llm_analysis/metadata.yaml +++ /dev/null @@ -1,110 +0,0 @@ -name: llm_analysis -version: "1.0.0" -vertical: python -description: "Uses AI/LLM to analyze code for security vulnerabilities and code quality issues" -author: "FuzzForge Team" -tags: - - "llm" - - "ai" - - "security" - - "static-analysis" - - "code-quality" - -# Workspace isolation mode -workspace_isolation: "shared" - -parameters: - type: object - properties: - agent_url: - type: string - description: "A2A agent endpoint URL" - default: "http://fuzzforge-task-agent:8000/a2a/litellm_agent" - llm_model: - type: string - description: "LLM model to use (e.g., gpt-4o-mini, claude-3-5-sonnet)" - default: "gpt-5-mini" - llm_provider: - type: string - description: "LLM provider (openai, anthropic, etc.)" - default: "openai" - file_patterns: - type: array - items: - type: string - default: - - "*.py" - - "*.js" - - "*.ts" - - "*.jsx" - - "*.tsx" - - "*.java" - - "*.go" - - "*.rs" - - "*.c" - - "*.cpp" - - "*.h" - - "*.hpp" - - "*.cs" - - "*.php" - - "*.rb" - - "*.swift" - - "*.kt" - - "*.scala" - - "*.env" - - "*.yaml" - - "*.yml" - - "*.json" - - "*.xml" - - "*.ini" - - "*.sql" - - "*.properties" - - "*.sh" - - "*.bat" - - "*.ps1" - - "*.config" - - "*.conf" - - "*.toml" - - "*id_rsa*" - - "*id_dsa*" - - "*id_ecdsa*" - - "*id_ed25519*" - - "*.pem" - - "*.key" - - "*.pub" - - "*.txt" - - "*.md" - - "Dockerfile" - - "docker-compose.yml" - - ".gitignore" - - ".dockerignore" - description: "File patterns to analyze for security issues and secrets" - max_files: - type: integer - description: "Maximum number of files to analyze" - default: 10 - max_file_size: - type: integer - description: "Maximum file size in bytes" - default: 100000 - timeout: - type: integer - description: "Timeout per file in seconds" - default: 90 - -output_schema: - type: object - properties: - sarif: - type: 
object - description: "SARIF-formatted security findings from LLM" - summary: - type: object - description: "Analysis summary" - properties: - files_analyzed: - type: integer - total_findings: - type: integer - model_used: - type: string diff --git a/backend/toolbox/workflows/llm_analysis/workflow.py b/backend/toolbox/workflows/llm_analysis/workflow.py deleted file mode 100644 index 136e844..0000000 --- a/backend/toolbox/workflows/llm_analysis/workflow.py +++ /dev/null @@ -1,236 +0,0 @@ -""" -LLM Analysis Workflow - Temporal Version - -Uses AI/LLM to analyze code for security issues. -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -from datetime import timedelta -from typing import Dict, Any, Optional - -from temporalio import workflow -from temporalio.common import RetryPolicy - -# Import for type hints (will be executed by worker) -with workflow.unsafe.imports_passed_through(): - import logging - -logger = logging.getLogger(__name__) - - -@workflow.defn -class LlmAnalysisWorkflow: - """ - Analyze code using AI/LLM for security vulnerabilities. - - User workflow: - 1. User runs: ff workflow run llm_analysis . - 2. CLI uploads project to MinIO - 3. Worker downloads project - 4. Worker calls LLM analyzer module - 5. LLM analyzes code files and reports findings - 6. 
Results returned in SARIF format - """ - - @workflow.run - async def run( - self, - target_id: str, # MinIO UUID of uploaded user code - agent_url: Optional[str] = None, - llm_model: Optional[str] = None, - llm_provider: Optional[str] = None, - file_patterns: Optional[list] = None, - max_files: Optional[int] = None, - max_file_size: Optional[int] = None, - timeout: Optional[int] = None - ) -> Dict[str, Any]: - """ - Main workflow execution. - - Args: - target_id: UUID of the uploaded target in MinIO - agent_url: A2A agent endpoint URL - llm_model: LLM model to use - llm_provider: LLM provider - file_patterns: File patterns to analyze - max_files: Maximum number of files to analyze - max_file_size: Maximum file size in bytes - timeout: Timeout per file in seconds - - Returns: - Dictionary containing findings and summary - """ - workflow_id = workflow.info().workflow_id - - workflow.logger.info( - f"Starting LLMAnalysisWorkflow " - f"(workflow_id={workflow_id}, target_id={target_id}, model={llm_model})" - ) - - results = { - "workflow_id": workflow_id, - "target_id": target_id, - "status": "running", - "steps": [], - "findings": [] - } - - try: - # Get run ID for workspace isolation - run_id = workflow.info().run_id - - # Step 1: Download user's project from MinIO - workflow.logger.info("Step 1: Downloading user code from MinIO") - target_path = await workflow.execute_activity( - "get_target", - args=[target_id, run_id, "shared"], - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - results["steps"].append({ - "step": "download", - "status": "success", - "target_path": target_path - }) - workflow.logger.info(f"āœ“ Target downloaded to: {target_path}") - - # Step 2: Run LLM analysis - workflow.logger.info("Step 2: Analyzing code with LLM") - - # Build analyzer config - analyzer_config = {} - if agent_url: - 
analyzer_config["agent_url"] = agent_url - if llm_model: - analyzer_config["llm_model"] = llm_model - if llm_provider: - analyzer_config["llm_provider"] = llm_provider - if file_patterns: - analyzer_config["file_patterns"] = file_patterns - if max_files is not None: - analyzer_config["max_files"] = max_files - if max_file_size is not None: - analyzer_config["max_file_size"] = max_file_size - if timeout is not None: - analyzer_config["timeout"] = timeout - - analysis_results = await workflow.execute_activity( - "analyze_with_llm", - args=[target_path, analyzer_config], - start_to_close_timeout=timedelta(minutes=30), # LLM calls can be slow - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=5), - maximum_interval=timedelta(minutes=1), - maximum_attempts=2 - ) - ) - - findings = analysis_results.get("findings", []) - summary = analysis_results.get("summary", {}) - - results["steps"].append({ - "step": "llm_analysis", - "status": "success", - "files_analyzed": summary.get("files_analyzed", 0), - "findings_count": len(findings) - }) - - workflow.logger.info( - f"āœ“ LLM analysis completed: " - f"{summary.get('files_analyzed', 0)} files, " - f"{len(findings)} findings" - ) - - # Step 3: Generate SARIF report - workflow.logger.info("Step 3: Generating SARIF report") - - sarif_report = await workflow.execute_activity( - "llm_generate_sarif", - args=[findings, { - "tool_name": "llm-analyzer", - "tool_version": "1.0.0", - "run_id": run_id - }], - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - - results["steps"].append({ - "step": "sarif_generation", - "status": "success", - "results_count": len(sarif_report.get("runs", [{}])[0].get("results", [])) - }) - - workflow.logger.info( - f"āœ“ SARIF report generated: " - f"{len(sarif_report.get('runs', [{}])[0].get('results', []))} results" - ) - - # Step 4: Upload results to MinIO 
- workflow.logger.info("Step 4: Uploading results to MinIO") - - # Upload SARIF report - if sarif_report: - results_url = await workflow.execute_activity( - "upload_results", - args=[run_id, sarif_report], - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - results["results_url"] = results_url - workflow.logger.info(f"āœ“ Results uploaded to: {results_url}") - - # Step 5: Cleanup cache - workflow.logger.info("Step 5: Cleaning up cache") - await workflow.execute_activity( - "cleanup_cache", - args=[target_id], - start_to_close_timeout=timedelta(minutes=2), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=10), - maximum_attempts=2 - ) - ) - workflow.logger.info("āœ“ Cache cleaned up") - - # Mark workflow as successful - results["status"] = "success" - results["sarif"] = sarif_report - results["summary"] = summary - results["findings"] = findings - - workflow.logger.info( - f"āœ… LLMAnalysisWorkflow completed successfully: " - f"{len(findings)} findings" - ) - - except Exception as e: - workflow.logger.error(f"āŒ Workflow failed: {e}") - results["status"] = "failed" - results["error"] = str(e) - raise - - return results diff --git a/backend/toolbox/workflows/llm_secret_detection/__init__.py b/backend/toolbox/workflows/llm_secret_detection/__init__.py deleted file mode 100644 index 81148a7..0000000 --- a/backend/toolbox/workflows/llm_secret_detection/__init__.py +++ /dev/null @@ -1,6 +0,0 @@ -"""LLM Secret Detection Workflow""" - -from .workflow import LlmSecretDetectionWorkflow -from .activities import scan_with_llm - -__all__ = ["LlmSecretDetectionWorkflow", "scan_with_llm"] diff --git a/backend/toolbox/workflows/llm_secret_detection/activities.py b/backend/toolbox/workflows/llm_secret_detection/activities.py deleted file mode 100644 index c16691f..0000000 --- 
a/backend/toolbox/workflows/llm_secret_detection/activities.py +++ /dev/null @@ -1,112 +0,0 @@ -"""LLM Secret Detection Workflow Activities""" - -from pathlib import Path -from typing import Dict, Any -from temporalio import activity - -try: - from toolbox.modules.secret_detection.llm_secret_detector import LLMSecretDetectorModule -except ImportError: - from modules.secret_detection.llm_secret_detector import LLMSecretDetectorModule - -@activity.defn(name="scan_with_llm") -async def scan_with_llm(target_path: str, config: Dict[str, Any]) -> Dict[str, Any]: - """Scan code using LLM.""" - activity.logger.info(f"Starting LLM secret detection: {target_path}") - workspace = Path(target_path) - - llm_detector = LLMSecretDetectorModule() - llm_detector.validate_config(config) - result = await llm_detector.execute(config, workspace) - - if result.status == "failed": - raise RuntimeError(f"LLM detection failed: {result.error or 'Unknown error'}") - - findings_dicts = [finding.model_dump() for finding in result.findings] - return {"findings": findings_dicts, "summary": result.summary} - - -@activity.defn(name="llm_secret_generate_sarif") -async def llm_secret_generate_sarif(findings: list, metadata: Dict[str, Any]) -> Dict[str, Any]: - """ - Generate SARIF report from LLM secret detection findings. 
- - Args: - findings: List of finding dictionaries from LLM secret detector - metadata: Metadata including tool_name, tool_version - - Returns: - SARIF 2.1.0 report dictionary - """ - activity.logger.info(f"Generating SARIF report from {len(findings)} findings") - - # Basic SARIF 2.1.0 structure - sarif_report = { - "version": "2.1.0", - "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json", - "runs": [ - { - "tool": { - "driver": { - "name": metadata.get("tool_name", "llm-secret-detector"), - "version": metadata.get("tool_version", "1.0.0"), - "informationUri": "https://github.com/FuzzingLabs/fuzzforge_ai" - } - }, - "results": [] - } - ] - } - - # Convert findings to SARIF results - for finding in findings: - sarif_result = { - "ruleId": finding.get("id", finding.get("metadata", {}).get("secret_type", "unknown-secret")), - "level": _severity_to_sarif_level(finding.get("severity", "warning")), - "message": { - "text": finding.get("title", "Secret detected by LLM") - }, - "locations": [] - } - - # Add description if present - if finding.get("description"): - sarif_result["message"]["markdown"] = finding["description"] - - # Add location if file path is present - if finding.get("file_path"): - location = { - "physicalLocation": { - "artifactLocation": { - "uri": finding["file_path"] - } - } - } - - # Add region if line number is present - if finding.get("line_start"): - location["physicalLocation"]["region"] = { - "startLine": finding["line_start"] - } - if finding.get("line_end"): - location["physicalLocation"]["region"]["endLine"] = finding["line_end"] - - sarif_result["locations"].append(location) - - sarif_report["runs"][0]["results"].append(sarif_result) - - activity.logger.info(f"Generated SARIF report with {len(sarif_report['runs'][0]['results'])} results") - - return sarif_report - - -def _severity_to_sarif_level(severity: str) -> str: - """Convert severity to SARIF level""" - severity_map = { - 
"critical": "error", - "high": "error", - "medium": "warning", - "low": "note", - "info": "note" - } - return severity_map.get(severity.lower(), "warning") diff --git a/backend/toolbox/workflows/llm_secret_detection/metadata.yaml b/backend/toolbox/workflows/llm_secret_detection/metadata.yaml deleted file mode 100644 index a97b859..0000000 --- a/backend/toolbox/workflows/llm_secret_detection/metadata.yaml +++ /dev/null @@ -1,71 +0,0 @@ -name: llm_secret_detection -version: "1.0.0" -vertical: secrets -description: "AI-powered secret detection using LLM semantic analysis" -author: "FuzzForge Team" -tags: - - "secrets" - - "llm" - - "ai" - - "semantic" - -workspace_isolation: "shared" - -parameters: - type: object - properties: - agent_url: - type: string - default: "http://fuzzforge-task-agent:8000/a2a/litellm_agent" - - llm_model: - type: string - default: "gpt-5-mini" - - llm_provider: - type: string - default: "openai" - - max_files: - type: integer - default: 20 - - max_file_size: - type: integer - default: 30000 - description: "Maximum file size in bytes" - - timeout: - type: integer - default: 30 - description: "Timeout per file in seconds" - - file_patterns: - type: array - items: - type: string - default: - - "*.py" - - "*.js" - - "*.ts" - - "*.java" - - "*.go" - - "*.env" - - "*.yaml" - - "*.yml" - - "*.json" - - "*.xml" - - "*.ini" - - "*.sql" - - "*.properties" - - "*.sh" - - "*.bat" - - "*.config" - - "*.conf" - - "*.toml" - - "*id_rsa*" - - "*.txt" - description: "File patterns to scan for secrets" - -required_modules: - - "llm_secret_detector" diff --git a/backend/toolbox/workflows/llm_secret_detection/workflow.py b/backend/toolbox/workflows/llm_secret_detection/workflow.py deleted file mode 100644 index a0c66d2..0000000 --- a/backend/toolbox/workflows/llm_secret_detection/workflow.py +++ /dev/null @@ -1,159 +0,0 @@ -"""LLM Secret Detection Workflow""" - -from datetime import timedelta -from typing import Dict, Any, Optional -from temporalio import 
workflow -from temporalio.common import RetryPolicy - -@workflow.defn -class LlmSecretDetectionWorkflow: - """Scan code for secrets using LLM AI.""" - - @workflow.run - async def run( - self, - target_id: str, - agent_url: Optional[str] = None, - llm_model: Optional[str] = None, - llm_provider: Optional[str] = None, - max_files: Optional[int] = None, - max_file_size: Optional[int] = None, - timeout: Optional[int] = None, - file_patterns: Optional[list] = None - ) -> Dict[str, Any]: - workflow_id = workflow.info().workflow_id - run_id = workflow.info().run_id - - workflow.logger.info( - f"Starting LLM Secret Detection Workflow " - f"(workflow_id={workflow_id}, target_id={target_id}, model={llm_model})" - ) - - results = { - "workflow_id": workflow_id, - "target_id": target_id, - "status": "running", - "steps": [], - "findings": [] - } - - try: - # Step 1: Download target from MinIO - workflow.logger.info("Step 1: Downloading target from MinIO") - target_path = await workflow.execute_activity( - "get_target", - args=[target_id, run_id, "shared"], - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - results["steps"].append({ - "step": "download", - "status": "success", - "target_path": target_path - }) - workflow.logger.info(f"āœ“ Target downloaded to: {target_path}") - - # Step 2: Scan with LLM - workflow.logger.info("Step 2: Scanning with LLM") - config = {} - if agent_url: - config["agent_url"] = agent_url - if llm_model: - config["llm_model"] = llm_model - if llm_provider: - config["llm_provider"] = llm_provider - if max_files: - config["max_files"] = max_files - if max_file_size: - config["max_file_size"] = max_file_size - if timeout: - config["timeout"] = timeout - if file_patterns: - config["file_patterns"] = file_patterns - - scan_results = await workflow.execute_activity( - "scan_with_llm", - args=[target_path, config], - 
start_to_close_timeout=timedelta(minutes=30), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=2 - ) - ) - - findings_count = len(scan_results.get("findings", [])) - results["steps"].append({ - "step": "llm_scan", - "status": "success", - "secrets_found": findings_count - }) - workflow.logger.info(f"āœ“ LLM scan completed: {findings_count} secrets found") - - # Step 3: Generate SARIF report - workflow.logger.info("Step 3: Generating SARIF report") - sarif_report = await workflow.execute_activity( - "llm_generate_sarif", # Use shared LLM SARIF activity - args=[ - scan_results.get("findings", []), - { - "tool_name": f"llm-secret-detector ({llm_model or 'gpt-4o-mini'})", - "tool_version": "1.0.0" - } - ], - start_to_close_timeout=timedelta(minutes=2) - ) - workflow.logger.info("āœ“ SARIF report generated") - - # Step 4: Upload results to MinIO - workflow.logger.info("Step 4: Uploading results") - try: - results_url = await workflow.execute_activity( - "upload_results", - args=[workflow_id, scan_results, "json"], - start_to_close_timeout=timedelta(minutes=2) - ) - results["results_url"] = results_url - workflow.logger.info(f"āœ“ Results uploaded to: {results_url}") - except Exception as e: - workflow.logger.warning(f"Failed to upload results: {e}") - results["results_url"] = None - - # Step 5: Cleanup cache - workflow.logger.info("Step 5: Cleaning up cache") - try: - await workflow.execute_activity( - "cleanup_cache", - args=[target_path, "shared"], - start_to_close_timeout=timedelta(minutes=1) - ) - workflow.logger.info("āœ“ Cache cleaned up") - except Exception as e: - workflow.logger.warning(f"Cache cleanup failed: {e}") - - # Mark workflow as successful - results["status"] = "success" - results["findings"] = scan_results.get("findings", []) - results["summary"] = scan_results.get("summary", {}) - results["sarif"] = sarif_report or {} - workflow.logger.info( - f"āœ“ Workflow completed 
successfully: {workflow_id} " - f"({findings_count} secrets found)" - ) - - return results - - except Exception as e: - workflow.logger.error(f"Workflow failed: {e}") - results["status"] = "error" - results["error"] = str(e) - results["steps"].append({ - "step": "error", - "status": "failed", - "error": str(e) - }) - raise diff --git a/backend/toolbox/workflows/ossfuzz_campaign/metadata.yaml b/backend/toolbox/workflows/ossfuzz_campaign/metadata.yaml deleted file mode 100644 index d6766f9..0000000 --- a/backend/toolbox/workflows/ossfuzz_campaign/metadata.yaml +++ /dev/null @@ -1,106 +0,0 @@ -name: ossfuzz_campaign -version: "1.0.0" -vertical: ossfuzz -description: "Generic OSS-Fuzz fuzzing campaign. Automatically reads project configuration from OSS-Fuzz repo and runs fuzzing using Google's infrastructure." -author: "FuzzForge Team" -tags: - - "fuzzing" - - "oss-fuzz" - - "libfuzzer" - - "afl" - - "honggfuzz" - - "memory-safety" - - "security" - -# Workspace isolation mode -# OSS-Fuzz campaigns use isolated mode for safe concurrent campaigns -workspace_isolation: "isolated" - -parameters: - type: object - required: - - project_name - properties: - project_name: - type: string - description: "OSS-Fuzz project name (e.g., 'curl', 'sqlite3', 'libxml2')" - examples: - - "curl" - - "sqlite3" - - "libxml2" - - "openssl" - - "zlib" - - campaign_duration_hours: - type: integer - default: 1 - minimum: 1 - maximum: 168 # 1 week max - description: "How many hours to run the fuzzing campaign" - - override_engine: - type: string - enum: ["libfuzzer", "afl", "honggfuzz"] - description: "Override fuzzing engine from project.yaml (optional)" - - override_sanitizer: - type: string - enum: ["address", "memory", "undefined", "dataflow"] - description: "Override sanitizer from project.yaml (optional)" - - max_iterations: - type: integer - minimum: 1000 - description: "Optional limit on fuzzing iterations (optional)" - -output_schema: - type: object - properties: - project_name: - type: 
string - description: "OSS-Fuzz project that was fuzzed" - - summary: - type: object - description: "Campaign execution summary" - properties: - total_executions: - type: integer - crashes_found: - type: integer - unique_crashes: - type: integer - duration_hours: - type: number - engine_used: - type: string - sanitizer_used: - type: string - - crashes: - type: array - description: "List of crash file paths" - items: - type: string - - sarif: - type: object - description: "SARIF-formatted crash reports (future)" - -examples: - - name: "Fuzz curl for 1 hour" - parameters: - project_name: "curl" - campaign_duration_hours: 1 - - - name: "Fuzz sqlite3 with AFL" - parameters: - project_name: "sqlite3" - campaign_duration_hours: 2 - override_engine: "afl" - - - name: "Fuzz libxml2 with memory sanitizer" - parameters: - project_name: "libxml2" - campaign_duration_hours: 6 - override_sanitizer: "memory" diff --git a/backend/toolbox/workflows/ossfuzz_campaign/workflow.py b/backend/toolbox/workflows/ossfuzz_campaign/workflow.py deleted file mode 100644 index 7b735dd..0000000 --- a/backend/toolbox/workflows/ossfuzz_campaign/workflow.py +++ /dev/null @@ -1,219 +0,0 @@ -""" -OSS-Fuzz Campaign Workflow - Temporal Version - -Generic workflow for running OSS-Fuzz campaigns using Google's infrastructure. -Automatically reads project configuration from OSS-Fuzz project.yaml files. -""" - -import asyncio -from datetime import timedelta -from typing import Dict, Any, Optional - -from temporalio import workflow -from temporalio.common import RetryPolicy - -# Import for type hints (will be executed by worker) -with workflow.unsafe.imports_passed_through(): - import logging - -logger = logging.getLogger(__name__) - - -@workflow.defn -class OssfuzzCampaignWorkflow: - """ - Generic OSS-Fuzz fuzzing campaign workflow. - - User workflow: - 1. User runs: ff workflow run ossfuzz_campaign . project_name=curl - 2. Worker loads project config from OSS-Fuzz repo - 3. 
Worker builds project using OSS-Fuzz's build system - 4. Worker runs fuzzing with engines from project.yaml - 5. Crashes and corpus reported as findings - """ - - @workflow.run - async def run( - self, - target_id: str, # Required by FuzzForge (not used, OSS-Fuzz downloads from Google) - project_name: str, # Required: OSS-Fuzz project name (e.g., "curl", "sqlite3") - campaign_duration_hours: int = 1, - override_engine: Optional[str] = None, # Override engine from project.yaml - override_sanitizer: Optional[str] = None, # Override sanitizer from project.yaml - max_iterations: Optional[int] = None # Optional: limit fuzzing iterations - ) -> Dict[str, Any]: - """ - Main workflow execution. - - Args: - target_id: UUID of uploaded target (not used, required by FuzzForge) - project_name: Name of OSS-Fuzz project (e.g., "curl", "sqlite3", "libxml2") - campaign_duration_hours: How many hours to fuzz (default: 1) - override_engine: Override fuzzing engine from project.yaml - override_sanitizer: Override sanitizer from project.yaml - max_iterations: Optional limit on fuzzing iterations - - Returns: - Dictionary containing crashes, stats, and SARIF report - """ - workflow_id = workflow.info().workflow_id - - workflow.logger.info( - f"Starting OSS-Fuzz Campaign for project '{project_name}' " - f"(workflow_id={workflow_id}, duration={campaign_duration_hours}h)" - ) - - results = { - "workflow_id": workflow_id, - "project_name": project_name, - "status": "running", - "steps": [] - } - - try: - # Step 1: Load OSS-Fuzz project configuration - workflow.logger.info(f"Step 1: Loading project config for '{project_name}'") - project_config = await workflow.execute_activity( - "load_ossfuzz_project", - args=[project_name], - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - - results["steps"].append({ - "step": "load_config", - "status": "success", - 
"language": project_config.get("language"), - "engines": project_config.get("fuzzing_engines", []), - "sanitizers": project_config.get("sanitizers", []) - }) - - workflow.logger.info( - f"āœ“ Loaded config: language={project_config.get('language')}, " - f"engines={project_config.get('fuzzing_engines')}" - ) - - # Step 2: Build project using OSS-Fuzz infrastructure - workflow.logger.info(f"Step 2: Building project '{project_name}'") - - build_result = await workflow.execute_activity( - "build_ossfuzz_project", - args=[ - project_name, - project_config, - override_sanitizer, - override_engine - ], - start_to_close_timeout=timedelta(minutes=30), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=2 - ) - ) - - results["steps"].append({ - "step": "build_project", - "status": "success", - "fuzz_targets": len(build_result.get("fuzz_targets", [])), - "sanitizer": build_result.get("sanitizer_used"), - "engine": build_result.get("engine_used") - }) - - workflow.logger.info( - f"āœ“ Build completed: {len(build_result.get('fuzz_targets', []))} fuzz targets found" - ) - - if not build_result.get("fuzz_targets"): - raise Exception(f"No fuzz targets found for project {project_name}") - - # Step 3: Run fuzzing on discovered targets - workflow.logger.info(f"Step 3: Fuzzing {len(build_result['fuzz_targets'])} targets") - - # Determine which engine to use - engine_to_use = override_engine if override_engine else build_result["engine_used"] - duration_seconds = campaign_duration_hours * 3600 - - # Fuzz each target (in parallel if multiple targets) - fuzz_futures = [] - for target_path in build_result["fuzz_targets"]: - future = workflow.execute_activity( - "fuzz_target", - args=[target_path, engine_to_use, duration_seconds, None, None], - start_to_close_timeout=timedelta(seconds=duration_seconds + 300), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - 
maximum_interval=timedelta(seconds=60), - maximum_attempts=1 # Fuzzing shouldn't retry - ) - ) - fuzz_futures.append(future) - - # Wait for all fuzzing to complete - fuzz_results = await asyncio.gather(*fuzz_futures, return_exceptions=True) - - # Aggregate results - total_execs = 0 - total_crashes = 0 - all_crashes = [] - - for i, result in enumerate(fuzz_results): - if isinstance(result, Exception): - workflow.logger.error(f"Fuzzing failed for target {i}: {result}") - continue - - total_execs += result.get("total_executions", 0) - total_crashes += result.get("crashes", 0) - all_crashes.extend(result.get("crash_files", [])) - - results["steps"].append({ - "step": "fuzzing", - "status": "success", - "total_executions": total_execs, - "crashes_found": total_crashes, - "targets_fuzzed": len(build_result["fuzz_targets"]) - }) - - workflow.logger.info( - f"āœ“ Fuzzing completed: {total_execs} executions, {total_crashes} crashes" - ) - - # Step 4: Generate SARIF report - workflow.logger.info("Step 4: Generating SARIF report") - - # TODO: Implement crash minimization and SARIF generation - # For now, return raw results - - results["status"] = "success" - results["summary"] = { - "project": project_name, - "total_executions": total_execs, - "crashes_found": total_crashes, - "unique_crashes": len(set(all_crashes)), - "duration_hours": campaign_duration_hours, - "engine_used": engine_to_use, - "sanitizer_used": build_result.get("sanitizer_used") - } - results["crashes"] = all_crashes[:100] # Limit to first 100 crashes - - workflow.logger.info( - f"āœ“ Campaign completed: {project_name} - " - f"{total_execs} execs, {total_crashes} crashes" - ) - - return results - - except Exception as e: - workflow.logger.error(f"Workflow failed: {e}") - results["status"] = "error" - results["error"] = str(e) - results["steps"].append({ - "step": "error", - "status": "failed", - "error": str(e) - }) - raise diff --git a/backend/toolbox/workflows/python_sast/__init__.py 
b/backend/toolbox/workflows/python_sast/__init__.py deleted file mode 100644 index e436884..0000000 --- a/backend/toolbox/workflows/python_sast/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. diff --git a/backend/toolbox/workflows/python_sast/activities.py b/backend/toolbox/workflows/python_sast/activities.py deleted file mode 100644 index fea884f..0000000 --- a/backend/toolbox/workflows/python_sast/activities.py +++ /dev/null @@ -1,191 +0,0 @@ -""" -Python SAST Workflow Activities - -Activities specific to the Python SAST workflow: -- scan_dependencies_activity: Scan Python dependencies for CVEs using pip-audit -- analyze_with_bandit_activity: Analyze Python code for security issues using Bandit -- analyze_with_mypy_activity: Analyze Python code for type safety using Mypy -- generate_python_sast_sarif_activity: Generate SARIF report from all findings -""" - -import logging -import sys -from pathlib import Path - -from temporalio import activity - -# Configure logging -logger = logging.getLogger(__name__) - -# Add toolbox to path for module imports -sys.path.insert(0, '/app/toolbox') - - -@activity.defn(name="scan_dependencies") -async def scan_dependencies_activity(workspace_path: str, config: dict) -> dict: - """ - Scan Python dependencies for known vulnerabilities using pip-audit. 
- - Args: - workspace_path: Path to the workspace directory - config: DependencyScanner configuration - - Returns: - Scanner results dictionary - """ - logger.info(f"Activity: scan_dependencies (workspace={workspace_path})") - - try: - from modules.scanner import DependencyScanner - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - scanner = DependencyScanner() - result = await scanner.execute(config, workspace) - - logger.info( - f"āœ“ Dependency scanning completed: " - f"{result.summary.get('total_vulnerabilities', 0)} vulnerabilities found" - ) - return result.dict() - - except Exception as e: - logger.error(f"Dependency scanning failed: {e}", exc_info=True) - raise - - -@activity.defn(name="analyze_with_bandit") -async def analyze_with_bandit_activity(workspace_path: str, config: dict) -> dict: - """ - Analyze Python code for security issues using Bandit. - - Args: - workspace_path: Path to the workspace directory - config: BanditAnalyzer configuration - - Returns: - Analysis results dictionary - """ - logger.info(f"Activity: analyze_with_bandit (workspace={workspace_path})") - - try: - from modules.analyzer import BanditAnalyzer - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - analyzer = BanditAnalyzer() - result = await analyzer.execute(config, workspace) - - logger.info( - f"āœ“ Bandit analysis completed: " - f"{result.summary.get('total_issues', 0)} security issues found" - ) - return result.dict() - - except Exception as e: - logger.error(f"Bandit analysis failed: {e}", exc_info=True) - raise - - -@activity.defn(name="analyze_with_mypy") -async def analyze_with_mypy_activity(workspace_path: str, config: dict) -> dict: - """ - Analyze Python code for type safety issues using Mypy. 
- - Args: - workspace_path: Path to the workspace directory - config: MypyAnalyzer configuration - - Returns: - Analysis results dictionary - """ - logger.info(f"Activity: analyze_with_mypy (workspace={workspace_path})") - - try: - from modules.analyzer import MypyAnalyzer - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - analyzer = MypyAnalyzer() - result = await analyzer.execute(config, workspace) - - logger.info( - f"āœ“ Mypy analysis completed: " - f"{result.summary.get('total_errors', 0)} type errors found" - ) - return result.dict() - - except Exception as e: - logger.error(f"Mypy analysis failed: {e}", exc_info=True) - raise - - -@activity.defn(name="generate_python_sast_sarif") -async def generate_python_sast_sarif_activity( - dependency_results: dict, - bandit_results: dict, - mypy_results: dict, - config: dict, - workspace_path: str -) -> dict: - """ - Generate SARIF report from all SAST analysis results. 
- - Args: - dependency_results: Results from dependency scanner - bandit_results: Results from Bandit analyzer - mypy_results: Results from Mypy analyzer - config: Reporter configuration - workspace_path: Path to the workspace - - Returns: - SARIF report dictionary - """ - logger.info("Activity: generate_python_sast_sarif") - - try: - from modules.reporter import SARIFReporter - - workspace = Path(workspace_path) - - # Combine findings from all modules - all_findings = [] - - # Add dependency scanner findings - dependency_findings = dependency_results.get("findings", []) - all_findings.extend(dependency_findings) - - # Add Bandit findings - bandit_findings = bandit_results.get("findings", []) - all_findings.extend(bandit_findings) - - # Add Mypy findings - mypy_findings = mypy_results.get("findings", []) - all_findings.extend(mypy_findings) - - # Prepare reporter config - reporter_config = { - **config, - "findings": all_findings, - "tool_name": "FuzzForge Python SAST", - "tool_version": "1.0.0" - } - - reporter = SARIFReporter() - result = await reporter.execute(reporter_config, workspace) - - # Extract SARIF from result - sarif = result.dict().get("sarif", {}) - - logger.info(f"āœ“ SARIF report generated with {len(all_findings)} findings") - return sarif - - except Exception as e: - logger.error(f"SARIF report generation failed: {e}", exc_info=True) - raise diff --git a/backend/toolbox/workflows/python_sast/metadata.yaml b/backend/toolbox/workflows/python_sast/metadata.yaml deleted file mode 100644 index c7e209c..0000000 --- a/backend/toolbox/workflows/python_sast/metadata.yaml +++ /dev/null @@ -1,110 +0,0 @@ -name: python_sast -version: "1.0.0" -vertical: python -description: "Python Static Application Security Testing (SAST) workflow combining dependency scanning (pip-audit), security linting (Bandit), and type checking (Mypy)" -author: "FuzzForge Team" -tags: - - "python" - - "sast" - - "security" - - "type-checking" - - "dependencies" - - "bandit" - - "mypy" 
- - "pip-audit" - - "sarif" - -# Workspace isolation mode (system-level configuration) -# Using "shared" mode for read-only SAST analysis (no file modifications) -workspace_isolation: "shared" - -parameters: - type: object - properties: - dependency_config: - type: object - description: "Dependency scanner (pip-audit) configuration" - properties: - dependency_files: - type: array - items: - type: string - description: "List of dependency files to scan (auto-discovered if empty)" - default: [] - ignore_vulns: - type: array - items: - type: string - description: "List of vulnerability IDs to ignore" - default: [] - bandit_config: - type: object - description: "Bandit security analyzer configuration" - properties: - severity_level: - type: string - enum: ["low", "medium", "high"] - description: "Minimum severity level to report" - default: "low" - confidence_level: - type: string - enum: ["low", "medium", "high"] - description: "Minimum confidence level to report" - default: "medium" - exclude_tests: - type: boolean - description: "Exclude test files from analysis" - default: true - skip_ids: - type: array - items: - type: string - description: "List of Bandit test IDs to skip" - default: [] - mypy_config: - type: object - description: "Mypy type checker configuration" - properties: - strict_mode: - type: boolean - description: "Enable strict type checking" - default: false - ignore_missing_imports: - type: boolean - description: "Ignore errors about missing imports" - default: true - follow_imports: - type: string - enum: ["normal", "silent", "skip", "error"] - description: "How to handle imports" - default: "silent" - reporter_config: - type: object - description: "SARIF reporter configuration" - properties: - include_code_flows: - type: boolean - description: "Include code flow information" - default: false - -output_schema: - type: object - properties: - sarif: - type: object - description: "SARIF-formatted SAST findings from all tools" - summary: - type: object - 
description: "SAST execution summary" - properties: - total_findings: - type: integer - vulnerabilities: - type: integer - description: "CVEs found in dependencies" - security_issues: - type: integer - description: "Security issues found by Bandit" - type_errors: - type: integer - description: "Type errors found by Mypy" diff --git a/backend/toolbox/workflows/python_sast/workflow.py b/backend/toolbox/workflows/python_sast/workflow.py deleted file mode 100644 index 6d56a47..0000000 --- a/backend/toolbox/workflows/python_sast/workflow.py +++ /dev/null @@ -1,265 +0,0 @@ -""" -Python SAST Workflow - Temporal Version - -Static Application Security Testing for Python projects using multiple tools. -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -from datetime import timedelta -from typing import Dict, Any, Optional - -from temporalio import workflow -from temporalio.common import RetryPolicy - -# Import activity interfaces (will be executed by worker) -with workflow.unsafe.imports_passed_through(): - import logging - -logger = logging.getLogger(__name__) - - -@workflow.defn -class PythonSastWorkflow: - """ - Python Static Application Security Testing workflow. - - This workflow: - 1. Downloads target from MinIO - 2. Runs dependency scanning (pip-audit for CVEs) - 3. Runs security linting (Bandit for security issues) - 4. Runs type checking (Mypy for type safety) - 5. Generates a SARIF report with all findings - 6. Uploads results to MinIO - 7. 
Cleans up cache - """ - - @workflow.run - async def run( - self, - target_id: str, - dependency_config: Optional[Dict[str, Any]] = None, - bandit_config: Optional[Dict[str, Any]] = None, - mypy_config: Optional[Dict[str, Any]] = None, - reporter_config: Optional[Dict[str, Any]] = None - ) -> Dict[str, Any]: - """ - Main workflow execution. - - Args: - target_id: UUID of the uploaded target in MinIO - dependency_config: Configuration for dependency scanner - bandit_config: Configuration for Bandit analyzer - mypy_config: Configuration for Mypy analyzer - reporter_config: Configuration for SARIF reporter - - Returns: - Dictionary containing SARIF report and summary - """ - workflow_id = workflow.info().workflow_id - - workflow.logger.info( - f"Starting PythonSASTWorkflow " - f"(workflow_id={workflow_id}, target_id={target_id})" - ) - - # Default configurations - if not dependency_config: - dependency_config = { - "dependency_files": [], # Auto-discover - "ignore_vulns": [] - } - - if not bandit_config: - bandit_config = { - "severity_level": "low", - "confidence_level": "medium", - "exclude_tests": True, - "skip_ids": [] - } - - if not mypy_config: - mypy_config = { - "strict_mode": False, - "ignore_missing_imports": True, - "follow_imports": "silent" - } - - if not reporter_config: - reporter_config = { - "include_code_flows": False - } - - results = { - "workflow_id": workflow_id, - "target_id": target_id, - "status": "running", - "steps": [] - } - - try: - # Get run ID for workspace isolation (using shared mode for read-only analysis) - run_id = workflow.info().run_id - - # Step 1: Download target from MinIO - workflow.logger.info("Step 1: Downloading target from MinIO") - target_path = await workflow.execute_activity( - "get_target", - args=[target_id, run_id, "shared"], # target_id, run_id, workspace_isolation - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - 
maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - results["steps"].append({ - "step": "download_target", - "status": "success", - "target_path": target_path - }) - workflow.logger.info(f"āœ“ Target downloaded to: {target_path}") - - # Step 2: Dependency scanning (pip-audit) - workflow.logger.info("Step 2: Scanning dependencies for vulnerabilities") - dependency_results = await workflow.execute_activity( - "scan_dependencies", - args=[target_path, dependency_config], - start_to_close_timeout=timedelta(minutes=10), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=2 - ) - ) - results["steps"].append({ - "step": "dependency_scanning", - "status": "success", - "vulnerabilities": dependency_results.get("summary", {}).get("total_vulnerabilities", 0) - }) - workflow.logger.info( - f"āœ“ Dependency scanning completed: " - f"{dependency_results.get('summary', {}).get('total_vulnerabilities', 0)} vulnerabilities" - ) - - # Step 3: Security linting (Bandit) - workflow.logger.info("Step 3: Analyzing security issues with Bandit") - bandit_results = await workflow.execute_activity( - "analyze_with_bandit", - args=[target_path, bandit_config], - start_to_close_timeout=timedelta(minutes=10), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=2 - ) - ) - results["steps"].append({ - "step": "bandit_analysis", - "status": "success", - "issues": bandit_results.get("summary", {}).get("total_issues", 0) - }) - workflow.logger.info( - f"āœ“ Bandit analysis completed: " - f"{bandit_results.get('summary', {}).get('total_issues', 0)} security issues" - ) - - # Step 4: Type checking (Mypy) - workflow.logger.info("Step 4: Type checking with Mypy") - mypy_results = await workflow.execute_activity( - "analyze_with_mypy", - args=[target_path, mypy_config], - start_to_close_timeout=timedelta(minutes=10), - 
retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=2 - ) - ) - results["steps"].append({ - "step": "mypy_analysis", - "status": "success", - "type_errors": mypy_results.get("summary", {}).get("total_errors", 0) - }) - workflow.logger.info( - f"āœ“ Mypy analysis completed: " - f"{mypy_results.get('summary', {}).get('total_errors', 0)} type errors" - ) - - # Step 5: Generate SARIF report - workflow.logger.info("Step 5: Generating SARIF report") - sarif_report = await workflow.execute_activity( - "generate_python_sast_sarif", - args=[dependency_results, bandit_results, mypy_results, reporter_config, target_path], - start_to_close_timeout=timedelta(minutes=5) - ) - results["steps"].append({ - "step": "report_generation", - "status": "success" - }) - - # Count total findings in SARIF - total_findings = 0 - if sarif_report and "runs" in sarif_report: - total_findings = len(sarif_report["runs"][0].get("results", [])) - - workflow.logger.info(f"āœ“ SARIF report generated with {total_findings} findings") - - # Step 6: Upload results to MinIO - workflow.logger.info("Step 6: Uploading results") - try: - results_url = await workflow.execute_activity( - "upload_results", - args=[workflow_id, sarif_report, "sarif"], - start_to_close_timeout=timedelta(minutes=2) - ) - results["results_url"] = results_url - workflow.logger.info(f"āœ“ Results uploaded to: {results_url}") - except Exception as e: - workflow.logger.warning(f"Failed to upload results: {e}") - results["results_url"] = None - - # Step 7: Cleanup cache - workflow.logger.info("Step 7: Cleaning up cache") - try: - await workflow.execute_activity( - "cleanup_cache", - args=[target_path, "shared"], # target_path, workspace_isolation - start_to_close_timeout=timedelta(minutes=1) - ) - workflow.logger.info("āœ“ Cache cleaned up (skipped for shared mode)") - except Exception as e: - workflow.logger.warning(f"Cache cleanup failed: {e}") - - # Mark 
workflow as successful - results["status"] = "success" - results["sarif"] = sarif_report - results["summary"] = { - "total_findings": total_findings, - "vulnerabilities": dependency_results.get("summary", {}).get("total_vulnerabilities", 0), - "security_issues": bandit_results.get("summary", {}).get("total_issues", 0), - "type_errors": mypy_results.get("summary", {}).get("total_errors", 0) - } - workflow.logger.info(f"āœ“ Workflow completed successfully: {workflow_id}") - - return results - - except Exception as e: - workflow.logger.error(f"Workflow failed: {e}") - results["status"] = "error" - results["error"] = str(e) - results["steps"].append({ - "step": "error", - "status": "failed", - "error": str(e) - }) - raise diff --git a/backend/toolbox/workflows/security_assessment/__init__.py b/backend/toolbox/workflows/security_assessment/__init__.py deleted file mode 100644 index 43bcfe7..0000000 --- a/backend/toolbox/workflows/security_assessment/__init__.py +++ /dev/null @@ -1,11 +0,0 @@ -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
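Editorial note: the SAST and assessment workflows in this diff both derive `total_findings` by counting the `results` of the first SARIF run. Extracted as a standalone helper, that logic can be sketched as follows — the function name and the sample report are ours for illustration, not FuzzForge code; the report shape (`version`/`runs`/`results`) follows SARIF 2.1.0.

```python
# Sketch of the SARIF result-counting pattern used by the deleted workflows.
# Helper name and sample data are illustrative, not from the FuzzForge codebase.

def count_sarif_findings(sarif_report: dict) -> int:
    """Count results in the first run of a SARIF report, tolerating empty input."""
    if not sarif_report or not sarif_report.get("runs"):
        return 0
    return len(sarif_report["runs"][0].get("results", []))

# Hypothetical minimal SARIF 2.1.0 report with two findings.
sample = {
    "version": "2.1.0",
    "runs": [
        {
            "tool": {"driver": {"name": "demo-tool"}},
            "results": [{"ruleId": "B101"}, {"ruleId": "CVE-2024-0001"}],
        }
    ],
}

print(count_sarif_findings(sample))  # 2
print(count_sarif_findings({}))      # 0
```

Like the deleted workflow code, this only inspects `runs[0]`; a report produced by multiple tools may carry several runs, so a production version might sum across all of them.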
- diff --git a/backend/toolbox/workflows/security_assessment/activities.py b/backend/toolbox/workflows/security_assessment/activities.py deleted file mode 100644 index ca9182f..0000000 --- a/backend/toolbox/workflows/security_assessment/activities.py +++ /dev/null @@ -1,150 +0,0 @@ -""" -Security Assessment Workflow Activities - -Activities specific to the security assessment workflow: -- scan_files_activity: Scan files in the workspace -- analyze_security_activity: Analyze security vulnerabilities -- generate_sarif_report_activity: Generate SARIF report from findings -""" - -import logging -import sys -from pathlib import Path - -from temporalio import activity - -# Configure logging -logger = logging.getLogger(__name__) - -# Add toolbox to path for module imports -sys.path.insert(0, '/app/toolbox') - - -@activity.defn(name="scan_files") -async def scan_files_activity(workspace_path: str, config: dict) -> dict: - """ - Scan files in the workspace. - - Args: - workspace_path: Path to the workspace directory - config: Scanner configuration - - Returns: - Scanner results dictionary - """ - logger.info(f"Activity: scan_files (workspace={workspace_path})") - - try: - from modules.scanner import FileScanner - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - scanner = FileScanner() - result = await scanner.execute(config, workspace) - - logger.info( - f"āœ“ File scanning completed: " - f"{result.summary.get('total_files', 0)} files scanned" - ) - return result.dict() - - except Exception as e: - logger.error(f"File scanning failed: {e}", exc_info=True) - raise - - -@activity.defn(name="analyze_security") -async def analyze_security_activity(workspace_path: str, config: dict) -> dict: - """ - Analyze security vulnerabilities in the workspace. 
- - Args: - workspace_path: Path to the workspace directory - config: Analyzer configuration - - Returns: - Analysis results dictionary - """ - logger.info(f"Activity: analyze_security (workspace={workspace_path})") - - try: - from modules.analyzer import SecurityAnalyzer - - workspace = Path(workspace_path) - if not workspace.exists(): - raise FileNotFoundError(f"Workspace not found: {workspace_path}") - - analyzer = SecurityAnalyzer() - result = await analyzer.execute(config, workspace) - - logger.info( - f"āœ“ Security analysis completed: " - f"{result.summary.get('total_findings', 0)} findings" - ) - return result.dict() - - except Exception as e: - logger.error(f"Security analysis failed: {e}", exc_info=True) - raise - - -@activity.defn(name="generate_sarif_report") -async def generate_sarif_report_activity( - scan_results: dict, - analysis_results: dict, - config: dict, - workspace_path: str -) -> dict: - """ - Generate SARIF report from scan and analysis results. - - Args: - scan_results: Results from file scanner - analysis_results: Results from security analyzer - config: Reporter configuration - workspace_path: Path to the workspace - - Returns: - SARIF report dictionary - """ - logger.info("Activity: generate_sarif_report") - - try: - from modules.reporter import SARIFReporter - - workspace = Path(workspace_path) - - # Combine findings from all modules - all_findings = [] - - # Add scanner findings (only sensitive files, not all files) - scanner_findings = scan_results.get("findings", []) - sensitive_findings = [f for f in scanner_findings if f.get("severity") != "info"] - all_findings.extend(sensitive_findings) - - # Add analyzer findings - analyzer_findings = analysis_results.get("findings", []) - all_findings.extend(analyzer_findings) - - # Prepare reporter config - reporter_config = { - **config, - "findings": all_findings, - "tool_name": "FuzzForge Security Assessment", - "tool_version": "1.0.0" - } - - reporter = SARIFReporter() - result = await 
reporter.execute(reporter_config, workspace) - - # Extract SARIF from result - sarif = result.dict().get("sarif", {}) - - logger.info(f"āœ“ SARIF report generated with {len(all_findings)} findings") - return sarif - - except Exception as e: - logger.error(f"SARIF report generation failed: {e}", exc_info=True) - raise diff --git a/backend/toolbox/workflows/security_assessment/metadata.yaml b/backend/toolbox/workflows/security_assessment/metadata.yaml deleted file mode 100644 index 09addbd..0000000 --- a/backend/toolbox/workflows/security_assessment/metadata.yaml +++ /dev/null @@ -1,83 +0,0 @@ -name: security_assessment -version: "2.0.0" -vertical: python -description: "Comprehensive security assessment workflow that scans files, analyzes code for vulnerabilities, and generates SARIF reports" -author: "FuzzForge Team" -tags: - - "security" - - "scanner" - - "analyzer" - - "static-analysis" - - "sarif" - - "comprehensive" - -# Workspace isolation mode (system-level configuration) -# - "isolated" (default): Each workflow run gets its own isolated workspace (safe for concurrent fuzzing) -# - "shared": All runs share the same workspace (for read-only analysis workflows) -# - "copy-on-write": Download once, copy for each run (balances performance and isolation) -# Using "shared" mode for read-only security analysis (no file modifications) -workspace_isolation: "shared" - -parameters: - type: object - properties: - scanner_config: - type: object - description: "File scanner configuration" - properties: - patterns: - type: array - items: - type: string - description: "File patterns to scan" - check_sensitive: - type: boolean - description: "Check for sensitive files" - calculate_hashes: - type: boolean - description: "Calculate file hashes" - max_file_size: - type: integer - description: "Maximum file size to scan (bytes)" - analyzer_config: - type: object - description: "Security analyzer configuration" - properties: - file_extensions: - type: array - items: - type: string 
- description: "File extensions to analyze" - check_secrets: - type: boolean - description: "Check for hardcoded secrets" - check_sql: - type: boolean - description: "Check for SQL injection risks" - check_dangerous_functions: - type: boolean - description: "Check for dangerous function calls" - reporter_config: - type: object - description: "SARIF reporter configuration" - properties: - include_code_flows: - type: boolean - description: "Include code flow information" - -output_schema: - type: object - properties: - sarif: - type: object - description: "SARIF-formatted security findings" - summary: - type: object - description: "Scan execution summary" - properties: - total_findings: - type: integer - severity_counts: - type: object - tool_counts: - type: object diff --git a/backend/toolbox/workflows/security_assessment/workflow.py b/backend/toolbox/workflows/security_assessment/workflow.py deleted file mode 100644 index d7ff21c..0000000 --- a/backend/toolbox/workflows/security_assessment/workflow.py +++ /dev/null @@ -1,233 +0,0 @@ -""" -Security Assessment Workflow - Temporal Version - -Comprehensive security analysis using multiple modules. -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
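Editorial note: `SecurityAssessmentWorkflow` in this diff substitutes a complete default dict whenever a config argument is falsy (`if not scanner_config: scanner_config = {...}`). A per-key variant of that defaulting — a sketch under our own assumptions, with a helper name we invented; the deleted code's actual semantics are all-or-nothing replacement, not a merge — looks like:

```python
# Per-key config defaulting, a variation on the all-or-nothing pattern in the
# deleted workflow. `with_defaults` is our name, not a FuzzForge API.
from typing import Optional

def with_defaults(user_config: Optional[dict], defaults: dict) -> dict:
    """Fill missing keys from defaults; user-provided keys win."""
    return {**defaults, **(user_config or {})}

# Defaults mirror the scanner_config values hard-coded in the deleted workflow.
scanner_defaults = {
    "patterns": ["*"],
    "check_sensitive": True,
    "calculate_hashes": False,
    "max_file_size": 10 * 1024 * 1024,  # 10 MB
}

cfg = with_defaults({"calculate_hashes": True}, scanner_defaults)
print(cfg["calculate_hashes"])  # True
print(cfg["patterns"])          # ['*']
```

The per-key merge lets a caller override a single option without restating the rest, whereas the deleted code silently discards the entire default set as soon as any config dict is passed.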
- -from datetime import timedelta -from typing import Dict, Any, Optional - -from temporalio import workflow -from temporalio.common import RetryPolicy - -# Import activity interfaces (will be executed by worker) -with workflow.unsafe.imports_passed_through(): - import logging - -logger = logging.getLogger(__name__) - - -@workflow.defn -class SecurityAssessmentWorkflow: - """ - Comprehensive security assessment workflow. - - This workflow: - 1. Downloads target from MinIO - 2. Scans files in the workspace - 3. Analyzes code for security vulnerabilities - 4. Generates a SARIF report with all findings - 5. Uploads results to MinIO - 6. Cleans up cache - """ - - @workflow.run - async def run( - self, - target_id: str, - scanner_config: Optional[Dict[str, Any]] = None, - analyzer_config: Optional[Dict[str, Any]] = None, - reporter_config: Optional[Dict[str, Any]] = None - ) -> Dict[str, Any]: - """ - Main workflow execution. - - Args: - target_id: UUID of the uploaded target in MinIO - scanner_config: Configuration for file scanner - analyzer_config: Configuration for security analyzer - reporter_config: Configuration for SARIF reporter - - Returns: - Dictionary containing SARIF report and summary - """ - workflow_id = workflow.info().workflow_id - - workflow.logger.info( - f"Starting SecurityAssessmentWorkflow " - f"(workflow_id={workflow_id}, target_id={target_id})" - ) - - # Default configurations - if not scanner_config: - scanner_config = { - "patterns": ["*"], - "check_sensitive": True, - "calculate_hashes": False, - "max_file_size": 10485760 # 10MB - } - - if not analyzer_config: - analyzer_config = { - "file_extensions": [".py", ".js", ".java", ".php", ".rb", ".go"], - "check_secrets": True, - "check_sql": True, - "check_dangerous_functions": True - } - - if not reporter_config: - reporter_config = { - "include_code_flows": False - } - - results = { - "workflow_id": workflow_id, - "target_id": target_id, - "status": "running", - "steps": [] - } - - try: - # Get 
run ID for workspace isolation (using shared mode for read-only analysis) - run_id = workflow.info().run_id - - # Step 1: Download target from MinIO - workflow.logger.info("Step 1: Downloading target from MinIO") - target_path = await workflow.execute_activity( - "get_target", - args=[target_id, run_id, "shared"], # target_id, run_id, workspace_isolation - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - results["steps"].append({ - "step": "download_target", - "status": "success", - "target_path": target_path - }) - workflow.logger.info(f"āœ“ Target downloaded to: {target_path}") - - # Step 2: File scanning - workflow.logger.info("Step 2: Scanning files") - scan_results = await workflow.execute_activity( - "scan_files", - args=[target_path, scanner_config], - start_to_close_timeout=timedelta(minutes=10), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=2 - ) - ) - results["steps"].append({ - "step": "file_scanning", - "status": "success", - "files_scanned": scan_results.get("summary", {}).get("total_files", 0) - }) - workflow.logger.info( - f"āœ“ File scanning completed: " - f"{scan_results.get('summary', {}).get('total_files', 0)} files" - ) - - # Step 3: Security analysis - workflow.logger.info("Step 3: Analyzing security vulnerabilities") - analysis_results = await workflow.execute_activity( - "analyze_security", - args=[target_path, analyzer_config], - start_to_close_timeout=timedelta(minutes=15), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=2 - ) - ) - results["steps"].append({ - "step": "security_analysis", - "status": "success", - "findings": analysis_results.get("summary", {}).get("total_findings", 0) - }) - workflow.logger.info( - f"āœ“ Security analysis 
completed: " - f"{analysis_results.get('summary', {}).get('total_findings', 0)} findings" - ) - - # Step 4: Generate SARIF report - workflow.logger.info("Step 4: Generating SARIF report") - sarif_report = await workflow.execute_activity( - "generate_sarif_report", - args=[scan_results, analysis_results, reporter_config, target_path], - start_to_close_timeout=timedelta(minutes=5) - ) - results["steps"].append({ - "step": "report_generation", - "status": "success" - }) - - # Count total findings in SARIF - total_findings = 0 - if sarif_report and "runs" in sarif_report: - total_findings = len(sarif_report["runs"][0].get("results", [])) - - workflow.logger.info(f"āœ“ SARIF report generated with {total_findings} findings") - - # Step 5: Upload results to MinIO - workflow.logger.info("Step 5: Uploading results") - try: - results_url = await workflow.execute_activity( - "upload_results", - args=[workflow_id, sarif_report, "sarif"], - start_to_close_timeout=timedelta(minutes=2) - ) - results["results_url"] = results_url - workflow.logger.info(f"āœ“ Results uploaded to: {results_url}") - except Exception as e: - workflow.logger.warning(f"Failed to upload results: {e}") - results["results_url"] = None - - # Step 6: Cleanup cache - workflow.logger.info("Step 6: Cleaning up cache") - try: - await workflow.execute_activity( - "cleanup_cache", - args=[target_path, "shared"], # target_path, workspace_isolation - start_to_close_timeout=timedelta(minutes=1) - ) - workflow.logger.info("āœ“ Cache cleaned up (skipped for shared mode)") - except Exception as e: - workflow.logger.warning(f"Cache cleanup failed: {e}") - - # Mark workflow as successful - results["status"] = "success" - results["sarif"] = sarif_report - results["summary"] = { - "total_findings": total_findings, - "files_scanned": scan_results.get("summary", {}).get("total_files", 0) - } - workflow.logger.info(f"āœ“ Workflow completed successfully: {workflow_id}") - - return results - - except Exception as e: - 
workflow.logger.error(f"Workflow failed: {e}") - results["status"] = "error" - results["error"] = str(e) - results["steps"].append({ - "step": "error", - "status": "failed", - "error": str(e) - }) - raise diff --git a/backend/toolbox/workflows/trufflehog_detection/__init__.py b/backend/toolbox/workflows/trufflehog_detection/__init__.py deleted file mode 100644 index d580fb8..0000000 --- a/backend/toolbox/workflows/trufflehog_detection/__init__.py +++ /dev/null @@ -1,13 +0,0 @@ -""" -TruffleHog Detection Workflow -""" - -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. - -from .workflow import TrufflehogDetectionWorkflow -from .activities import scan_with_trufflehog, trufflehog_generate_sarif - -__all__ = ["TrufflehogDetectionWorkflow", "scan_with_trufflehog", "trufflehog_generate_sarif"] diff --git a/backend/toolbox/workflows/trufflehog_detection/activities.py b/backend/toolbox/workflows/trufflehog_detection/activities.py deleted file mode 100644 index 31bb92c..0000000 --- a/backend/toolbox/workflows/trufflehog_detection/activities.py +++ /dev/null @@ -1,110 +0,0 @@ -"""TruffleHog Detection Workflow Activities""" - -from pathlib import Path -from typing import Dict, Any -from temporalio import activity - -try: - from toolbox.modules.secret_detection.trufflehog import TruffleHogModule -except ImportError: - from modules.secret_detection.trufflehog import TruffleHogModule - -@activity.defn(name="scan_with_trufflehog") -async def scan_with_trufflehog(target_path: str, config: Dict[str, Any]) -> Dict[str, Any]: - """Scan code using TruffleHog.""" - activity.logger.info(f"Starting TruffleHog scan: {target_path}") - workspace = Path(target_path) - - trufflehog = TruffleHogModule() - trufflehog.validate_config(config) - result = await trufflehog.execute(config, workspace) - - if result.status == "failed": - raise RuntimeError(f"TruffleHog scan failed: 
{result.error}") - - findings_dicts = [finding.model_dump() for finding in result.findings] - return {"findings": findings_dicts, "summary": result.summary} - - -@activity.defn(name="trufflehog_generate_sarif") -async def trufflehog_generate_sarif(findings: list, metadata: Dict[str, Any]) -> Dict[str, Any]: - """ - Generate SARIF report from TruffleHog findings. - - Args: - findings: List of finding dictionaries - metadata: Metadata including tool_name, tool_version - - Returns: - SARIF report dictionary - """ - activity.logger.info(f"Generating SARIF report from {len(findings)} findings") - - # Basic SARIF 2.1.0 structure - sarif_report = { - "version": "2.1.0", - "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json", - "runs": [ - { - "tool": { - "driver": { - "name": metadata.get("tool_name", "trufflehog"), - "version": metadata.get("tool_version", "3.63.2"), - "informationUri": "https://github.com/trufflesecurity/trufflehog" - } - }, - "results": [] - } - ] - } - - # Convert findings to SARIF results - for finding in findings: - sarif_result = { - "ruleId": finding.get("metadata", {}).get("detector", "unknown"), - "level": _severity_to_sarif_level(finding.get("severity", "warning")), - "message": { - "text": finding.get("title", "Secret detected") - }, - "locations": [] - } - - # Add description if present - if finding.get("description"): - sarif_result["message"]["markdown"] = finding["description"] - - # Add location if file path is present - if finding.get("file_path"): - location = { - "physicalLocation": { - "artifactLocation": { - "uri": finding["file_path"] - } - } - } - - # Add region if line number is present - if finding.get("line_start"): - location["physicalLocation"]["region"] = { - "startLine": finding["line_start"] - } - - sarif_result["locations"].append(location) - - sarif_report["runs"][0]["results"].append(sarif_result) - - activity.logger.info(f"Generated SARIF report with 
{len(sarif_report['runs'][0]['results'])} results") - - return sarif_report - - -def _severity_to_sarif_level(severity: str) -> str: - """Convert severity to SARIF level""" - severity_map = { - "critical": "error", - "high": "error", - "medium": "warning", - "low": "note", - "info": "note" - } - return severity_map.get(severity.lower(), "warning") diff --git a/backend/toolbox/workflows/trufflehog_detection/metadata.yaml b/backend/toolbox/workflows/trufflehog_detection/metadata.yaml deleted file mode 100644 index d725061..0000000 --- a/backend/toolbox/workflows/trufflehog_detection/metadata.yaml +++ /dev/null @@ -1,27 +0,0 @@ -name: trufflehog_detection -version: "1.0.0" -vertical: secrets -description: "Detect secrets with verification using TruffleHog" -author: "FuzzForge Team" -tags: - - "secrets" - - "trufflehog" - - "verification" - -workspace_isolation: "shared" - -parameters: - type: object - properties: - verify: - type: boolean - default: true - description: "Verify discovered secrets" - - max_depth: - type: integer - default: 10 - description: "Maximum directory depth to scan" - -required_modules: - - "trufflehog" diff --git a/backend/toolbox/workflows/trufflehog_detection/workflow.py b/backend/toolbox/workflows/trufflehog_detection/workflow.py deleted file mode 100644 index 62336f3..0000000 --- a/backend/toolbox/workflows/trufflehog_detection/workflow.py +++ /dev/null @@ -1,104 +0,0 @@ -"""TruffleHog Detection Workflow""" - -from datetime import timedelta -from typing import Dict, Any -from temporalio import workflow -from temporalio.common import RetryPolicy - -@workflow.defn -class TrufflehogDetectionWorkflow: - """Scan code for secrets using TruffleHog.""" - - @workflow.run - async def run(self, target_id: str, verify: bool = False, concurrency: int = 10) -> Dict[str, Any]: - workflow_id = workflow.info().workflow_id - run_id = workflow.info().run_id - - workflow.logger.info( - f"Starting TrufflehogDetectionWorkflow " - f"(workflow_id={workflow_id}, 
target_id={target_id}, verify={verify})" - ) - - results = {"workflow_id": workflow_id, "status": "running", "findings": []} - - try: - # Step 1: Download target - workflow.logger.info("Step 1: Downloading target from MinIO") - target_path = await workflow.execute_activity( - "get_target", args=[target_id, run_id, "shared"], - start_to_close_timeout=timedelta(minutes=5), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=1), - maximum_interval=timedelta(seconds=30), - maximum_attempts=3 - ) - ) - workflow.logger.info(f"✓ Target downloaded to: {target_path}") - - # Step 2: Scan with TruffleHog - workflow.logger.info("Step 2: Scanning with TruffleHog") - scan_results = await workflow.execute_activity( - "scan_with_trufflehog", - args=[target_path, {"verify": verify, "concurrency": concurrency}], - start_to_close_timeout=timedelta(minutes=15), - retry_policy=RetryPolicy( - initial_interval=timedelta(seconds=2), - maximum_interval=timedelta(seconds=60), - maximum_attempts=2 - ) - ) - workflow.logger.info( - f"✓ TruffleHog scan completed: " - f"{scan_results.get('summary', {}).get('total_secrets', 0)} secrets found" - ) - - # Step 3: Generate SARIF report - workflow.logger.info("Step 3: Generating SARIF report") - sarif_report = await workflow.execute_activity( - "trufflehog_generate_sarif", - args=[scan_results.get("findings", []), {"tool_name": "trufflehog", "tool_version": "3.63.2"}], - start_to_close_timeout=timedelta(minutes=2) - ) - - # Step 4: Upload results to MinIO - workflow.logger.info("Step 4: Uploading results") - try: - results_url = await workflow.execute_activity( - "upload_results", - args=[workflow_id, scan_results, "json"], - start_to_close_timeout=timedelta(minutes=2) - ) - results["results_url"] = results_url - workflow.logger.info(f"✓ Results uploaded to: {results_url}") - except Exception as e: - workflow.logger.warning(f"Failed to upload results: {e}") - results["results_url"] = None - - # Step 5: Cleanup - 
workflow.logger.info("Step 5: Cleaning up cache") - try: - await workflow.execute_activity( - "cleanup_cache", args=[target_path, "shared"], - start_to_close_timeout=timedelta(minutes=1) - ) - workflow.logger.info("✓ Cache cleaned up") - except Exception as e: - workflow.logger.warning(f"Cache cleanup failed: {e}") - - # Mark workflow as successful - results["status"] = "success" - results["findings"] = scan_results.get("findings", []) - results["summary"] = scan_results.get("summary", {}) - results["sarif"] = sarif_report or {} - workflow.logger.info( - f"✓ Workflow completed successfully: {workflow_id} " - f"({results['summary'].get('total_secrets', 0)} secrets found)" - ) - - return results - - except Exception as e: - workflow.logger.error(f"Workflow failed: {e}") - results["status"] = "error" - results["error"] = str(e) - raise diff --git a/backend/uv.lock b/backend/uv.lock deleted file mode 100644 index 82803c8..0000000 --- a/backend/uv.lock +++ /dev/null @@ -1,2019 +0,0 @@ -version = 1 -revision = 3 -requires-python = ">=3.11" -resolution-markers = [ - "python_full_version >= '3.13'", - "python_full_version < '3.13'", -] - -[[package]] -name = "aiofiles" -version = "24.1.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/0b/03/a88171e277e8caa88a4c77808c20ebb04ba74cc4681bf1e9416c862de237/aiofiles-24.1.0.tar.gz", hash = "sha256:22a075c9e5a3810f0c2e48f3008c94d68c65d763b9b03857924c99e57355166c", size = 30247, upload-time = "2024-06-24T11:02:03.584Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a5/45/30bb92d442636f570cb5651bc661f52b610e2eec3f891a5dc3a4c3667db0/aiofiles-24.1.0-py3-none-any.whl", hash = "sha256:b4ec55f4195e3eb5d7abd1bf7e061763e864dd4954231fb8539a0ef8bb8260e5", size = 15896, upload-time = "2024-06-24T11:02:01.529Z" }, -] - -[[package]] -name = "aiohappyeyeballs" -version = "2.6.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/26/30/f84a107a9c4331c14b2b586036f40965c128aa4fee4dda5d3d51cb14ad54/aiohappyeyeballs-2.6.1.tar.gz", hash = "sha256:c3f9d0113123803ccadfdf3f0faa505bc78e6a72d1cc4806cbd719826e943558", size = 22760, upload-time = "2025-03-12T01:42:48.764Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/0f/15/5bf3b99495fb160b63f95972b81750f18f7f4e02ad051373b669d17d44f2/aiohappyeyeballs-2.6.1-py3-none-any.whl", hash = "sha256:f349ba8f4b75cb25c99c5c2d84e997e485204d2902a9597802b0371f09331fb8", size = 15265, upload-time = "2025-03-12T01:42:47.083Z" }, -] - -[[package]] -name = "aiohttp" -version = "3.12.15" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "aiohappyeyeballs" }, - { name = "aiosignal" }, - { name = "attrs" }, - { name = "frozenlist" }, - { name = "multidict" }, - { name = "propcache" }, - { name = "yarl" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/9b/e7/d92a237d8802ca88483906c388f7c201bbe96cd80a165ffd0ac2f6a8d59f/aiohttp-3.12.15.tar.gz", hash = "sha256:4fc61385e9c98d72fcdf47e6dd81833f47b2f77c114c29cd64a361be57a763a2", size = 7823716, upload-time = "2025-07-29T05:52:32.215Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/20/19/9e86722ec8e835959bd97ce8c1efa78cf361fa4531fca372551abcc9cdd6/aiohttp-3.12.15-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d3ce17ce0220383a0f9ea07175eeaa6aa13ae5a41f30bc61d84df17f0e9b1117", size = 711246, upload-time = "2025-07-29T05:50:15.937Z" }, - { url = "https://files.pythonhosted.org/packages/71/f9/0a31fcb1a7d4629ac9d8f01f1cb9242e2f9943f47f5d03215af91c3c1a26/aiohttp-3.12.15-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:010cc9bbd06db80fe234d9003f67e97a10fe003bfbedb40da7d71c1008eda0fe", size = 483515, upload-time = "2025-07-29T05:50:17.442Z" }, - { url = 
"https://files.pythonhosted.org/packages/62/6c/94846f576f1d11df0c2e41d3001000527c0fdf63fce7e69b3927a731325d/aiohttp-3.12.15-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3f9d7c55b41ed687b9d7165b17672340187f87a773c98236c987f08c858145a9", size = 471776, upload-time = "2025-07-29T05:50:19.568Z" }, - { url = "https://files.pythonhosted.org/packages/f8/6c/f766d0aaafcee0447fad0328da780d344489c042e25cd58fde566bf40aed/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bc4fbc61bb3548d3b482f9ac7ddd0f18c67e4225aaa4e8552b9f1ac7e6bda9e5", size = 1741977, upload-time = "2025-07-29T05:50:21.665Z" }, - { url = "https://files.pythonhosted.org/packages/17/e5/fb779a05ba6ff44d7bc1e9d24c644e876bfff5abe5454f7b854cace1b9cc/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7fbc8a7c410bb3ad5d595bb7118147dfbb6449d862cc1125cf8867cb337e8728", size = 1690645, upload-time = "2025-07-29T05:50:23.333Z" }, - { url = "https://files.pythonhosted.org/packages/37/4e/a22e799c2035f5d6a4ad2cf8e7c1d1bd0923192871dd6e367dafb158b14c/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:74dad41b3458dbb0511e760fb355bb0b6689e0630de8a22b1b62a98777136e16", size = 1789437, upload-time = "2025-07-29T05:50:25.007Z" }, - { url = "https://files.pythonhosted.org/packages/28/e5/55a33b991f6433569babb56018b2fb8fb9146424f8b3a0c8ecca80556762/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b6f0af863cf17e6222b1735a756d664159e58855da99cfe965134a3ff63b0b0", size = 1828482, upload-time = "2025-07-29T05:50:26.693Z" }, - { url = "https://files.pythonhosted.org/packages/c6/82/1ddf0ea4f2f3afe79dffed5e8a246737cff6cbe781887a6a170299e33204/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b5b7fe4972d48a4da367043b8e023fb70a04d1490aa7d68800e465d1b97e493b", size = 1730944, upload-time = 
"2025-07-29T05:50:28.382Z" }, - { url = "https://files.pythonhosted.org/packages/1b/96/784c785674117b4cb3877522a177ba1b5e4db9ce0fd519430b5de76eec90/aiohttp-3.12.15-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6443cca89553b7a5485331bc9bedb2342b08d073fa10b8c7d1c60579c4a7b9bd", size = 1668020, upload-time = "2025-07-29T05:50:30.032Z" }, - { url = "https://files.pythonhosted.org/packages/12/8a/8b75f203ea7e5c21c0920d84dd24a5c0e971fe1e9b9ebbf29ae7e8e39790/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:6c5f40ec615e5264f44b4282ee27628cea221fcad52f27405b80abb346d9f3f8", size = 1716292, upload-time = "2025-07-29T05:50:31.983Z" }, - { url = "https://files.pythonhosted.org/packages/47/0b/a1451543475bb6b86a5cfc27861e52b14085ae232896a2654ff1231c0992/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:2abbb216a1d3a2fe86dbd2edce20cdc5e9ad0be6378455b05ec7f77361b3ab50", size = 1711451, upload-time = "2025-07-29T05:50:33.989Z" }, - { url = "https://files.pythonhosted.org/packages/55/fd/793a23a197cc2f0d29188805cfc93aa613407f07e5f9da5cd1366afd9d7c/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:db71ce547012a5420a39c1b744d485cfb823564d01d5d20805977f5ea1345676", size = 1691634, upload-time = "2025-07-29T05:50:35.846Z" }, - { url = "https://files.pythonhosted.org/packages/ca/bf/23a335a6670b5f5dfc6d268328e55a22651b440fca341a64fccf1eada0c6/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:ced339d7c9b5030abad5854aa5413a77565e5b6e6248ff927d3e174baf3badf7", size = 1785238, upload-time = "2025-07-29T05:50:37.597Z" }, - { url = "https://files.pythonhosted.org/packages/57/4f/ed60a591839a9d85d40694aba5cef86dde9ee51ce6cca0bb30d6eb1581e7/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:7c7dd29c7b5bda137464dc9bfc738d7ceea46ff70309859ffde8c022e9b08ba7", size = 1805701, upload-time = "2025-07-29T05:50:39.591Z" }, - { url = 
"https://files.pythonhosted.org/packages/85/e0/444747a9455c5de188c0f4a0173ee701e2e325d4b2550e9af84abb20cdba/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:421da6fd326460517873274875c6c5a18ff225b40da2616083c5a34a7570b685", size = 1718758, upload-time = "2025-07-29T05:50:41.292Z" }, - { url = "https://files.pythonhosted.org/packages/36/ab/1006278d1ffd13a698e5dd4bfa01e5878f6bddefc296c8b62649753ff249/aiohttp-3.12.15-cp311-cp311-win32.whl", hash = "sha256:4420cf9d179ec8dfe4be10e7d0fe47d6d606485512ea2265b0d8c5113372771b", size = 428868, upload-time = "2025-07-29T05:50:43.063Z" }, - { url = "https://files.pythonhosted.org/packages/10/97/ad2b18700708452400278039272032170246a1bf8ec5d832772372c71f1a/aiohttp-3.12.15-cp311-cp311-win_amd64.whl", hash = "sha256:edd533a07da85baa4b423ee8839e3e91681c7bfa19b04260a469ee94b778bf6d", size = 453273, upload-time = "2025-07-29T05:50:44.613Z" }, - { url = "https://files.pythonhosted.org/packages/63/97/77cb2450d9b35f517d6cf506256bf4f5bda3f93a66b4ad64ba7fc917899c/aiohttp-3.12.15-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:802d3868f5776e28f7bf69d349c26fc0efadb81676d0afa88ed00d98a26340b7", size = 702333, upload-time = "2025-07-29T05:50:46.507Z" }, - { url = "https://files.pythonhosted.org/packages/83/6d/0544e6b08b748682c30b9f65640d006e51f90763b41d7c546693bc22900d/aiohttp-3.12.15-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f2800614cd560287be05e33a679638e586a2d7401f4ddf99e304d98878c29444", size = 476948, upload-time = "2025-07-29T05:50:48.067Z" }, - { url = "https://files.pythonhosted.org/packages/3a/1d/c8c40e611e5094330284b1aea8a4b02ca0858f8458614fa35754cab42b9c/aiohttp-3.12.15-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8466151554b593909d30a0a125d638b4e5f3836e5aecde85b66b80ded1cb5b0d", size = 469787, upload-time = "2025-07-29T05:50:49.669Z" }, - { url = 
"https://files.pythonhosted.org/packages/38/7d/b76438e70319796bfff717f325d97ce2e9310f752a267bfdf5192ac6082b/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e5a495cb1be69dae4b08f35a6c4579c539e9b5706f606632102c0f855bcba7c", size = 1716590, upload-time = "2025-07-29T05:50:51.368Z" }, - { url = "https://files.pythonhosted.org/packages/79/b1/60370d70cdf8b269ee1444b390cbd72ce514f0d1cd1a715821c784d272c9/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6404dfc8cdde35c69aaa489bb3542fb86ef215fc70277c892be8af540e5e21c0", size = 1699241, upload-time = "2025-07-29T05:50:53.628Z" }, - { url = "https://files.pythonhosted.org/packages/a3/2b/4968a7b8792437ebc12186db31523f541943e99bda8f30335c482bea6879/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3ead1c00f8521a5c9070fcb88f02967b1d8a0544e6d85c253f6968b785e1a2ab", size = 1754335, upload-time = "2025-07-29T05:50:55.394Z" }, - { url = "https://files.pythonhosted.org/packages/fb/c1/49524ed553f9a0bec1a11fac09e790f49ff669bcd14164f9fab608831c4d/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6990ef617f14450bc6b34941dba4f12d5613cbf4e33805932f853fbd1cf18bfb", size = 1800491, upload-time = "2025-07-29T05:50:57.202Z" }, - { url = "https://files.pythonhosted.org/packages/de/5e/3bf5acea47a96a28c121b167f5ef659cf71208b19e52a88cdfa5c37f1fcc/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd736ed420f4db2b8148b52b46b88ed038d0354255f9a73196b7bbce3ea97545", size = 1719929, upload-time = "2025-07-29T05:50:59.192Z" }, - { url = "https://files.pythonhosted.org/packages/39/94/8ae30b806835bcd1cba799ba35347dee6961a11bd507db634516210e91d8/aiohttp-3.12.15-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:3c5092ce14361a73086b90c6efb3948ffa5be2f5b6fbcf52e8d8c8b8848bb97c", size = 1635733, upload-time = "2025-07-29T05:51:01.394Z" }, - { url = "https://files.pythonhosted.org/packages/7a/46/06cdef71dd03acd9da7f51ab3a9107318aee12ad38d273f654e4f981583a/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:aaa2234bb60c4dbf82893e934d8ee8dea30446f0647e024074237a56a08c01bd", size = 1696790, upload-time = "2025-07-29T05:51:03.657Z" }, - { url = "https://files.pythonhosted.org/packages/02/90/6b4cfaaf92ed98d0ec4d173e78b99b4b1a7551250be8937d9d67ecb356b4/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:6d86a2fbdd14192e2f234a92d3b494dd4457e683ba07e5905a0b3ee25389ac9f", size = 1718245, upload-time = "2025-07-29T05:51:05.911Z" }, - { url = "https://files.pythonhosted.org/packages/2e/e6/2593751670fa06f080a846f37f112cbe6f873ba510d070136a6ed46117c6/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a041e7e2612041a6ddf1c6a33b883be6a421247c7afd47e885969ee4cc58bd8d", size = 1658899, upload-time = "2025-07-29T05:51:07.753Z" }, - { url = "https://files.pythonhosted.org/packages/8f/28/c15bacbdb8b8eb5bf39b10680d129ea7410b859e379b03190f02fa104ffd/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5015082477abeafad7203757ae44299a610e89ee82a1503e3d4184e6bafdd519", size = 1738459, upload-time = "2025-07-29T05:51:09.56Z" }, - { url = "https://files.pythonhosted.org/packages/00/de/c269cbc4faa01fb10f143b1670633a8ddd5b2e1ffd0548f7aa49cb5c70e2/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:56822ff5ddfd1b745534e658faba944012346184fbfe732e0d6134b744516eea", size = 1766434, upload-time = "2025-07-29T05:51:11.423Z" }, - { url = "https://files.pythonhosted.org/packages/52/b0/4ff3abd81aa7d929b27d2e1403722a65fc87b763e3a97b3a2a494bfc63bc/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b2acbbfff69019d9014508c4ba0401822e8bae5a5fdc3b6814285b71231b60f3", size = 1726045, upload-time = 
"2025-07-29T05:51:13.689Z" }, - { url = "https://files.pythonhosted.org/packages/71/16/949225a6a2dd6efcbd855fbd90cf476052e648fb011aa538e3b15b89a57a/aiohttp-3.12.15-cp312-cp312-win32.whl", hash = "sha256:d849b0901b50f2185874b9a232f38e26b9b3d4810095a7572eacea939132d4e1", size = 423591, upload-time = "2025-07-29T05:51:15.452Z" }, - { url = "https://files.pythonhosted.org/packages/2b/d8/fa65d2a349fe938b76d309db1a56a75c4fb8cc7b17a398b698488a939903/aiohttp-3.12.15-cp312-cp312-win_amd64.whl", hash = "sha256:b390ef5f62bb508a9d67cb3bba9b8356e23b3996da7062f1a57ce1a79d2b3d34", size = 450266, upload-time = "2025-07-29T05:51:17.239Z" }, - { url = "https://files.pythonhosted.org/packages/f2/33/918091abcf102e39d15aba2476ad9e7bd35ddb190dcdd43a854000d3da0d/aiohttp-3.12.15-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:9f922ffd05034d439dde1c77a20461cf4a1b0831e6caa26151fe7aa8aaebc315", size = 696741, upload-time = "2025-07-29T05:51:19.021Z" }, - { url = "https://files.pythonhosted.org/packages/b5/2a/7495a81e39a998e400f3ecdd44a62107254803d1681d9189be5c2e4530cd/aiohttp-3.12.15-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:2ee8a8ac39ce45f3e55663891d4b1d15598c157b4d494a4613e704c8b43112cd", size = 474407, upload-time = "2025-07-29T05:51:21.165Z" }, - { url = "https://files.pythonhosted.org/packages/49/fc/a9576ab4be2dcbd0f73ee8675d16c707cfc12d5ee80ccf4015ba543480c9/aiohttp-3.12.15-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3eae49032c29d356b94eee45a3f39fdf4b0814b397638c2f718e96cfadf4c4e4", size = 466703, upload-time = "2025-07-29T05:51:22.948Z" }, - { url = "https://files.pythonhosted.org/packages/09/2f/d4bcc8448cf536b2b54eed48f19682031ad182faa3a3fee54ebe5b156387/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b97752ff12cc12f46a9b20327104448042fce5c33a624f88c18f66f9368091c7", size = 1705532, upload-time = "2025-07-29T05:51:25.211Z" }, - { url = 
"https://files.pythonhosted.org/packages/f1/f3/59406396083f8b489261e3c011aa8aee9df360a96ac8fa5c2e7e1b8f0466/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:894261472691d6fe76ebb7fcf2e5870a2ac284c7406ddc95823c8598a1390f0d", size = 1686794, upload-time = "2025-07-29T05:51:27.145Z" }, - { url = "https://files.pythonhosted.org/packages/dc/71/164d194993a8d114ee5656c3b7ae9c12ceee7040d076bf7b32fb98a8c5c6/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5fa5d9eb82ce98959fc1031c28198b431b4d9396894f385cb63f1e2f3f20ca6b", size = 1738865, upload-time = "2025-07-29T05:51:29.366Z" }, - { url = "https://files.pythonhosted.org/packages/1c/00/d198461b699188a93ead39cb458554d9f0f69879b95078dce416d3209b54/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f0fa751efb11a541f57db59c1dd821bec09031e01452b2b6217319b3a1f34f3d", size = 1788238, upload-time = "2025-07-29T05:51:31.285Z" }, - { url = "https://files.pythonhosted.org/packages/85/b8/9e7175e1fa0ac8e56baa83bf3c214823ce250d0028955dfb23f43d5e61fd/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5346b93e62ab51ee2a9d68e8f73c7cf96ffb73568a23e683f931e52450e4148d", size = 1710566, upload-time = "2025-07-29T05:51:33.219Z" }, - { url = "https://files.pythonhosted.org/packages/59/e4/16a8eac9df39b48ae102ec030fa9f726d3570732e46ba0c592aeeb507b93/aiohttp-3.12.15-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:049ec0360f939cd164ecbfd2873eaa432613d5e77d6b04535e3d1fbae5a9e645", size = 1624270, upload-time = "2025-07-29T05:51:35.195Z" }, - { url = "https://files.pythonhosted.org/packages/1f/f8/cd84dee7b6ace0740908fd0af170f9fab50c2a41ccbc3806aabcb1050141/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b52dcf013b57464b6d1e51b627adfd69a8053e84b7103a7cd49c030f9ca44461", size = 1677294, 
upload-time = "2025-07-29T05:51:37.215Z" }, - { url = "https://files.pythonhosted.org/packages/ce/42/d0f1f85e50d401eccd12bf85c46ba84f947a84839c8a1c2c5f6e8ab1eb50/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:9b2af240143dd2765e0fb661fd0361a1b469cab235039ea57663cda087250ea9", size = 1708958, upload-time = "2025-07-29T05:51:39.328Z" }, - { url = "https://files.pythonhosted.org/packages/d5/6b/f6fa6c5790fb602538483aa5a1b86fcbad66244997e5230d88f9412ef24c/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ac77f709a2cde2cc71257ab2d8c74dd157c67a0558a0d2799d5d571b4c63d44d", size = 1651553, upload-time = "2025-07-29T05:51:41.356Z" }, - { url = "https://files.pythonhosted.org/packages/04/36/a6d36ad545fa12e61d11d1932eef273928b0495e6a576eb2af04297fdd3c/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:47f6b962246f0a774fbd3b6b7be25d59b06fdb2f164cf2513097998fc6a29693", size = 1727688, upload-time = "2025-07-29T05:51:43.452Z" }, - { url = "https://files.pythonhosted.org/packages/aa/c8/f195e5e06608a97a4e52c5d41c7927301bf757a8e8bb5bbf8cef6c314961/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:760fb7db442f284996e39cf9915a94492e1896baac44f06ae551974907922b64", size = 1761157, upload-time = "2025-07-29T05:51:45.643Z" }, - { url = "https://files.pythonhosted.org/packages/05/6a/ea199e61b67f25ba688d3ce93f63b49b0a4e3b3d380f03971b4646412fc6/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ad702e57dc385cae679c39d318def49aef754455f237499d5b99bea4ef582e51", size = 1710050, upload-time = "2025-07-29T05:51:48.203Z" }, - { url = "https://files.pythonhosted.org/packages/b4/2e/ffeb7f6256b33635c29dbed29a22a723ff2dd7401fff42ea60cf2060abfb/aiohttp-3.12.15-cp313-cp313-win32.whl", hash = "sha256:f813c3e9032331024de2eb2e32a88d86afb69291fbc37a3a3ae81cc9917fb3d0", size = 422647, upload-time = "2025-07-29T05:51:50.718Z" }, - { url = 
"https://files.pythonhosted.org/packages/1b/8e/78ee35774201f38d5e1ba079c9958f7629b1fd079459aea9467441dbfbf5/aiohttp-3.12.15-cp313-cp313-win_amd64.whl", hash = "sha256:1a649001580bdb37c6fdb1bebbd7e3bc688e8ec2b5c6f52edbb664662b17dc84", size = 449067, upload-time = "2025-07-29T05:51:52.549Z" }, -] - -[[package]] -name = "aiosignal" -version = "1.4.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "frozenlist" }, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/61/62/06741b579156360248d1ec624842ad0edf697050bbaf7c3e46394e106ad1/aiosignal-1.4.0.tar.gz", hash = "sha256:f47eecd9468083c2029cc99945502cb7708b082c232f9aca65da147157b251c7", size = 25007, upload-time = "2025-07-03T22:54:43.528Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/fb/76/641ae371508676492379f16e2fa48f4e2c11741bd63c48be4b12a6b09cba/aiosignal-1.4.0-py3-none-any.whl", hash = "sha256:053243f8b92b990551949e63930a839ff0cf0b0ebbe0597b0f3fb19e1a0fe82e", size = 7490, upload-time = "2025-07-03T22:54:42.156Z" }, -] - -[[package]] -name = "annotated-types" -version = "0.7.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" }, -] - -[[package]] -name = "anyio" -version = "4.10.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "idna" }, - { name = "sniffio" 
}, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/f1/b4/636b3b65173d3ce9a38ef5f0522789614e590dab6a8d505340a4efe4c567/anyio-4.10.0.tar.gz", hash = "sha256:3f3fae35c96039744587aa5b8371e7e8e603c0702999535961dd336026973ba6", size = 213252, upload-time = "2025-08-04T08:54:26.451Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6f/12/e5e0282d673bb9746bacfb6e2dba8719989d3660cdb2ea79aee9a9651afb/anyio-4.10.0-py3-none-any.whl", hash = "sha256:60e474ac86736bbfd6f210f7a61218939c318f43f9972497381f1c5e930ed3d1", size = 107213, upload-time = "2025-08-04T08:54:24.882Z" }, -] - -[[package]] -name = "attrs" -version = "25.3.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/5a/b0/1367933a8532ee6ff8d63537de4f1177af4bff9f3e829baf7331f595bb24/attrs-25.3.0.tar.gz", hash = "sha256:75d7cefc7fb576747b2c81b4442d4d4a1ce0900973527c011d1030fd3bf4af1b", size = 812032, upload-time = "2025-03-13T11:10:22.779Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/77/06/bb80f5f86020c4551da315d78b3ab75e8228f89f0162f2c3a819e407941a/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3", size = 63815, upload-time = "2025-03-13T11:10:21.14Z" }, -] - -[[package]] -name = "authlib" -version = "1.6.4" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "cryptography" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ce/bb/73a1f1c64ee527877f64122422dafe5b87a846ccf4ac933fe21bcbb8fee8/authlib-1.6.4.tar.gz", hash = "sha256:104b0442a43061dc8bc23b133d1d06a2b0a9c2e3e33f34c4338929e816287649", size = 164046, upload-time = "2025-09-17T09:59:23.897Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/0e/aa/91355b5f539caf1b94f0e66ff1e4ee39373b757fce08204981f7829ede51/authlib-1.6.4-py2.py3-none-any.whl", hash = 
"sha256:39313d2a2caac3ecf6d8f95fbebdfd30ae6ea6ae6a6db794d976405fdd9aa796", size = 243076, upload-time = "2025-09-17T09:59:22.259Z" }, -] - -[[package]] -name = "backend" -version = "0.6.0" -source = { virtual = "." } -dependencies = [ - { name = "aiofiles" }, - { name = "aiohttp" }, - { name = "boto3" }, - { name = "docker" }, - { name = "fastapi" }, - { name = "fastmcp" }, - { name = "pydantic" }, - { name = "pyyaml" }, - { name = "temporalio" }, - { name = "uvicorn" }, -] - -[package.optional-dependencies] -dev = [ - { name = "httpx" }, - { name = "pytest" }, - { name = "pytest-asyncio" }, - { name = "pytest-benchmark" }, - { name = "pytest-cov" }, - { name = "pytest-mock" }, - { name = "pytest-xdist" }, - { name = "ruff" }, -] - -[package.metadata] -requires-dist = [ - { name = "aiofiles", specifier = ">=23.0.0" }, - { name = "aiohttp", specifier = ">=3.12.15" }, - { name = "boto3", specifier = ">=1.34.0" }, - { name = "docker", specifier = ">=7.0.0" }, - { name = "fastapi", specifier = ">=0.116.1" }, - { name = "fastmcp" }, - { name = "httpx", marker = "extra == 'dev'", specifier = ">=0.27.0" }, - { name = "pydantic", specifier = ">=2.0.0" }, - { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.0.0" }, - { name = "pytest-asyncio", marker = "extra == 'dev'", specifier = ">=0.23.0" }, - { name = "pytest-benchmark", marker = "extra == 'dev'", specifier = ">=4.0.0" }, - { name = "pytest-cov", marker = "extra == 'dev'", specifier = ">=5.0.0" }, - { name = "pytest-mock", marker = "extra == 'dev'", specifier = ">=3.12.0" }, - { name = "pytest-xdist", marker = "extra == 'dev'", specifier = ">=3.5.0" }, - { name = "pyyaml", specifier = ">=6.0" }, - { name = "ruff", marker = "extra == 'dev'", specifier = ">=0.1.0" }, - { name = "temporalio", specifier = ">=1.6.0" }, - { name = "uvicorn", specifier = ">=0.30.0" }, -] -provides-extras = ["dev"] - -[[package]] -name = "boto3" -version = "1.40.44" -source = { registry = "https://pypi.org/simple" } -dependencies = 
[ - { name = "botocore" }, - { name = "jmespath" }, - { name = "s3transfer" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/98/e2/c291748090a9715cc8b74a58e3ba1d17b571b9c1ff6681cfb3191e9c117a/boto3-1.40.44.tar.gz", hash = "sha256:84ade2a253e5445902d2cb2064f48aedf9ba83d6f863244266c2e36c2f190cec", size = 111603, upload-time = "2025-10-02T20:14:25.087Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ae/3e/3505fd6d192dfc6bbeac09576ba4dbd7d242e9850c275ab0c066433955b7/boto3-1.40.44-py3-none-any.whl", hash = "sha256:281ddf688951773a98161ccb34c54c6376b2ecc7028ab99d77483df5990b448c", size = 139344, upload-time = "2025-10-02T20:14:23.109Z" }, -] - -[[package]] -name = "botocore" -version = "1.40.44" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "jmespath" }, - { name = "python-dateutil" }, - { name = "urllib3" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/97/04/8e4dbfc2ff0ffb0df68de687402a770bbf5c8578e37757d5edacdec5d190/botocore-1.40.44.tar.gz", hash = "sha256:8f6f96ef053dcdfe79c14dfee303c0d381608c111696862fafc6e38402ccf8fe", size = 14391194, upload-time = "2025-10-02T20:14:11.799Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/9c/92/a175f6c442005ed6335592539aa675f2ba0e8478d941186242af742bd912/botocore-1.40.44-py3-none-any.whl", hash = "sha256:6fa7274cdb69be7c7b3ce6ff46a7c3e35e270f259dd77ee3f8ad8c584352262b", size = 14060101, upload-time = "2025-10-02T20:14:08.471Z" }, -] - -[[package]] -name = "certifi" -version = "2025.8.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/dc/67/960ebe6bf230a96cda2e0abcf73af550ec4f090005363542f0765df162e0/certifi-2025.8.3.tar.gz", hash = "sha256:e564105f78ded564e3ae7c923924435e1daa7463faeab5bb932bc53ffae63407", size = 162386, upload-time = "2025-08-03T03:07:47.08Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/e5/48/1549795ba7742c948d2ad169c1c8cdbae65bc450d6cd753d124b17c8cd32/certifi-2025.8.3-py3-none-any.whl", hash = "sha256:f6c12493cfb1b06ba2ff328595af9350c65d6644968e5d3a2ffd78699af217a5", size = 161216, upload-time = "2025-08-03T03:07:45.777Z" }, -] - -[[package]] -name = "cffi" -version = "2.0.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pycparser", marker = "implementation_name != 'PyPy'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/12/4a/3dfd5f7850cbf0d06dc84ba9aa00db766b52ca38d8b86e3a38314d52498c/cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe", size = 184344, upload-time = "2025-09-08T23:22:26.456Z" }, - { url = "https://files.pythonhosted.org/packages/4f/8b/f0e4c441227ba756aafbe78f117485b25bb26b1c059d01f137fa6d14896b/cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c", size = 180560, upload-time = "2025-09-08T23:22:28.197Z" }, - { url = "https://files.pythonhosted.org/packages/b1/b7/1200d354378ef52ec227395d95c2576330fd22a869f7a70e88e1447eb234/cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92", size = 209613, upload-time = "2025-09-08T23:22:29.475Z" }, - { url = "https://files.pythonhosted.org/packages/b8/56/6033f5e86e8cc9bb629f0077ba71679508bdf54a9a5e112a3c0b91870332/cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = 
"sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93", size = 216476, upload-time = "2025-09-08T23:22:31.063Z" }, - { url = "https://files.pythonhosted.org/packages/dc/7f/55fecd70f7ece178db2f26128ec41430d8720f2d12ca97bf8f0a628207d5/cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5", size = 203374, upload-time = "2025-09-08T23:22:32.507Z" }, - { url = "https://files.pythonhosted.org/packages/84/ef/a7b77c8bdc0f77adc3b46888f1ad54be8f3b7821697a7b89126e829e676a/cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664", size = 202597, upload-time = "2025-09-08T23:22:34.132Z" }, - { url = "https://files.pythonhosted.org/packages/d7/91/500d892b2bf36529a75b77958edfcd5ad8e2ce4064ce2ecfeab2125d72d1/cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26", size = 215574, upload-time = "2025-09-08T23:22:35.443Z" }, - { url = "https://files.pythonhosted.org/packages/44/64/58f6255b62b101093d5df22dcb752596066c7e89dd725e0afaed242a61be/cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9", size = 218971, upload-time = "2025-09-08T23:22:36.805Z" }, - { url = "https://files.pythonhosted.org/packages/ab/49/fa72cebe2fd8a55fbe14956f9970fe8eb1ac59e5df042f603ef7c8ba0adc/cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414", size = 211972, upload-time = "2025-09-08T23:22:38.436Z" }, - { url = "https://files.pythonhosted.org/packages/0b/28/dd0967a76aab36731b6ebfe64dec4e981aff7e0608f60c2d46b46982607d/cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743", size 
= 217078, upload-time = "2025-09-08T23:22:39.776Z" }, - { url = "https://files.pythonhosted.org/packages/2b/c0/015b25184413d7ab0a410775fdb4a50fca20f5589b5dab1dbbfa3baad8ce/cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5", size = 172076, upload-time = "2025-09-08T23:22:40.95Z" }, - { url = "https://files.pythonhosted.org/packages/ae/8f/dc5531155e7070361eb1b7e4c1a9d896d0cb21c49f807a6c03fd63fc877e/cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5", size = 182820, upload-time = "2025-09-08T23:22:42.463Z" }, - { url = "https://files.pythonhosted.org/packages/95/5c/1b493356429f9aecfd56bc171285a4c4ac8697f76e9bbbbb105e537853a1/cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d", size = 177635, upload-time = "2025-09-08T23:22:43.623Z" }, - { url = "https://files.pythonhosted.org/packages/ea/47/4f61023ea636104d4f16ab488e268b93008c3d0bb76893b1b31db1f96802/cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d", size = 185271, upload-time = "2025-09-08T23:22:44.795Z" }, - { url = "https://files.pythonhosted.org/packages/df/a2/781b623f57358e360d62cdd7a8c681f074a71d445418a776eef0aadb4ab4/cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c", size = 181048, upload-time = "2025-09-08T23:22:45.938Z" }, - { url = "https://files.pythonhosted.org/packages/ff/df/a4f0fbd47331ceeba3d37c2e51e9dfc9722498becbeec2bd8bc856c9538a/cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe", size = 212529, upload-time = "2025-09-08T23:22:47.349Z" }, - { url = 
"https://files.pythonhosted.org/packages/d5/72/12b5f8d3865bf0f87cf1404d8c374e7487dcf097a1c91c436e72e6badd83/cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062", size = 220097, upload-time = "2025-09-08T23:22:48.677Z" }, - { url = "https://files.pythonhosted.org/packages/c2/95/7a135d52a50dfa7c882ab0ac17e8dc11cec9d55d2c18dda414c051c5e69e/cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e", size = 207983, upload-time = "2025-09-08T23:22:50.06Z" }, - { url = "https://files.pythonhosted.org/packages/3a/c8/15cb9ada8895957ea171c62dc78ff3e99159ee7adb13c0123c001a2546c1/cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037", size = 206519, upload-time = "2025-09-08T23:22:51.364Z" }, - { url = "https://files.pythonhosted.org/packages/78/2d/7fa73dfa841b5ac06c7b8855cfc18622132e365f5b81d02230333ff26e9e/cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba", size = 219572, upload-time = "2025-09-08T23:22:52.902Z" }, - { url = "https://files.pythonhosted.org/packages/07/e0/267e57e387b4ca276b90f0434ff88b2c2241ad72b16d31836adddfd6031b/cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94", size = 222963, upload-time = "2025-09-08T23:22:54.518Z" }, - { url = "https://files.pythonhosted.org/packages/b6/75/1f2747525e06f53efbd878f4d03bac5b859cbc11c633d0fb81432d98a795/cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187", size = 221361, upload-time = "2025-09-08T23:22:55.867Z" }, - { url = 
"https://files.pythonhosted.org/packages/7b/2b/2b6435f76bfeb6bbf055596976da087377ede68df465419d192acf00c437/cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18", size = 172932, upload-time = "2025-09-08T23:22:57.188Z" }, - { url = "https://files.pythonhosted.org/packages/f8/ed/13bd4418627013bec4ed6e54283b1959cf6db888048c7cf4b4c3b5b36002/cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5", size = 183557, upload-time = "2025-09-08T23:22:58.351Z" }, - { url = "https://files.pythonhosted.org/packages/95/31/9f7f93ad2f8eff1dbc1c3656d7ca5bfd8fb52c9d786b4dcf19b2d02217fa/cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6", size = 177762, upload-time = "2025-09-08T23:22:59.668Z" }, - { url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" }, - { url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" }, - { url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload-time = "2025-09-08T23:23:03.472Z" }, - { url = 
"https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" }, - { url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" }, - { url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" }, - { url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" }, - { url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" }, - { url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" }, - { url = 
"https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" }, - { url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" }, - { url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" }, - { url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" }, - { url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" }, - { url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" }, - { url = 
"https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" }, - { url = "https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" }, - { url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" }, - { url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" }, - { url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" }, - { url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload-time = "2025-09-08T23:23:44.61Z" }, - { url = 
"https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload-time = "2025-09-08T23:23:45.848Z" }, - { url = "https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload-time = "2025-09-08T23:23:47.105Z" }, - { url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" }, - { url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" }, - { url = "https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" }, - { url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" }, - { url = 
"https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" }, - { url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" }, - { url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" }, - { url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" }, - { url = "https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload-time = "2025-09-08T23:23:40.423Z" }, - { url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload-time = "2025-09-08T23:23:41.742Z" }, - { url = 
"https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload-time = "2025-09-08T23:23:43.004Z" }, -] - -[[package]] -name = "charset-normalizer" -version = "3.4.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/83/2d/5fd176ceb9b2fc619e63405525573493ca23441330fcdaee6bef9460e924/charset_normalizer-3.4.3.tar.gz", hash = "sha256:6fce4b8500244f6fcb71465d4a4930d132ba9ab8e71a7859e6a5d59851068d14", size = 122371, upload-time = "2025-08-09T07:57:28.46Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7f/b5/991245018615474a60965a7c9cd2b4efbaabd16d582a5547c47ee1c7730b/charset_normalizer-3.4.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:b256ee2e749283ef3ddcff51a675ff43798d92d746d1a6e4631bf8c707d22d0b", size = 204483, upload-time = "2025-08-09T07:55:53.12Z" }, - { url = "https://files.pythonhosted.org/packages/c7/2a/ae245c41c06299ec18262825c1569c5d3298fc920e4ddf56ab011b417efd/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:13faeacfe61784e2559e690fc53fa4c5ae97c6fcedb8eb6fb8d0a15b475d2c64", size = 145520, upload-time = "2025-08-09T07:55:54.712Z" }, - { url = "https://files.pythonhosted.org/packages/3a/a4/b3b6c76e7a635748c4421d2b92c7b8f90a432f98bda5082049af37ffc8e3/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:00237675befef519d9af72169d8604a067d92755e84fe76492fef5441db05b91", size = 158876, upload-time = "2025-08-09T07:55:56.024Z" }, - { url = 
"https://files.pythonhosted.org/packages/e2/e6/63bb0e10f90a8243c5def74b5b105b3bbbfb3e7bb753915fe333fb0c11ea/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:585f3b2a80fbd26b048a0be90c5aae8f06605d3c92615911c3a2b03a8a3b796f", size = 156083, upload-time = "2025-08-09T07:55:57.582Z" }, - { url = "https://files.pythonhosted.org/packages/87/df/b7737ff046c974b183ea9aa111b74185ac8c3a326c6262d413bd5a1b8c69/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0e78314bdc32fa80696f72fa16dc61168fda4d6a0c014e0380f9d02f0e5d8a07", size = 150295, upload-time = "2025-08-09T07:55:59.147Z" }, - { url = "https://files.pythonhosted.org/packages/61/f1/190d9977e0084d3f1dc169acd060d479bbbc71b90bf3e7bf7b9927dec3eb/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:96b2b3d1a83ad55310de8c7b4a2d04d9277d5591f40761274856635acc5fcb30", size = 148379, upload-time = "2025-08-09T07:56:00.364Z" }, - { url = "https://files.pythonhosted.org/packages/4c/92/27dbe365d34c68cfe0ca76f1edd70e8705d82b378cb54ebbaeabc2e3029d/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:939578d9d8fd4299220161fdd76e86c6a251987476f5243e8864a7844476ba14", size = 160018, upload-time = "2025-08-09T07:56:01.678Z" }, - { url = "https://files.pythonhosted.org/packages/99/04/baae2a1ea1893a01635d475b9261c889a18fd48393634b6270827869fa34/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:fd10de089bcdcd1be95a2f73dbe6254798ec1bda9f450d5828c96f93e2536b9c", size = 157430, upload-time = "2025-08-09T07:56:02.87Z" }, - { url = "https://files.pythonhosted.org/packages/2f/36/77da9c6a328c54d17b960c89eccacfab8271fdaaa228305330915b88afa9/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:1e8ac75d72fa3775e0b7cb7e4629cec13b7514d928d15ef8ea06bca03ef01cae", size = 151600, upload-time = 
"2025-08-09T07:56:04.089Z" }, - { url = "https://files.pythonhosted.org/packages/64/d4/9eb4ff2c167edbbf08cdd28e19078bf195762e9bd63371689cab5ecd3d0d/charset_normalizer-3.4.3-cp311-cp311-win32.whl", hash = "sha256:6cf8fd4c04756b6b60146d98cd8a77d0cdae0e1ca20329da2ac85eed779b6849", size = 99616, upload-time = "2025-08-09T07:56:05.658Z" }, - { url = "https://files.pythonhosted.org/packages/f4/9c/996a4a028222e7761a96634d1820de8a744ff4327a00ada9c8942033089b/charset_normalizer-3.4.3-cp311-cp311-win_amd64.whl", hash = "sha256:31a9a6f775f9bcd865d88ee350f0ffb0e25936a7f930ca98995c05abf1faf21c", size = 107108, upload-time = "2025-08-09T07:56:07.176Z" }, - { url = "https://files.pythonhosted.org/packages/e9/5e/14c94999e418d9b87682734589404a25854d5f5d0408df68bc15b6ff54bb/charset_normalizer-3.4.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e28e334d3ff134e88989d90ba04b47d84382a828c061d0d1027b1b12a62b39b1", size = 205655, upload-time = "2025-08-09T07:56:08.475Z" }, - { url = "https://files.pythonhosted.org/packages/7d/a8/c6ec5d389672521f644505a257f50544c074cf5fc292d5390331cd6fc9c3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0cacf8f7297b0c4fcb74227692ca46b4a5852f8f4f24b3c766dd94a1075c4884", size = 146223, upload-time = "2025-08-09T07:56:09.708Z" }, - { url = "https://files.pythonhosted.org/packages/fc/eb/a2ffb08547f4e1e5415fb69eb7db25932c52a52bed371429648db4d84fb1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c6fd51128a41297f5409deab284fecbe5305ebd7e5a1f959bee1c054622b7018", size = 159366, upload-time = "2025-08-09T07:56:11.326Z" }, - { url = "https://files.pythonhosted.org/packages/82/10/0fd19f20c624b278dddaf83b8464dcddc2456cb4b02bb902a6da126b87a1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:3cfb2aad70f2c6debfbcb717f23b7eb55febc0bb23dcffc0f076009da10c6392", size = 157104, upload-time = "2025-08-09T07:56:13.014Z" }, - { url = "https://files.pythonhosted.org/packages/16/ab/0233c3231af734f5dfcf0844aa9582d5a1466c985bbed6cedab85af9bfe3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1606f4a55c0fd363d754049cdf400175ee96c992b1f8018b993941f221221c5f", size = 151830, upload-time = "2025-08-09T07:56:14.428Z" }, - { url = "https://files.pythonhosted.org/packages/ae/02/e29e22b4e02839a0e4a06557b1999d0a47db3567e82989b5bb21f3fbbd9f/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:027b776c26d38b7f15b26a5da1044f376455fb3766df8fc38563b4efbc515154", size = 148854, upload-time = "2025-08-09T07:56:16.051Z" }, - { url = "https://files.pythonhosted.org/packages/05/6b/e2539a0a4be302b481e8cafb5af8792da8093b486885a1ae4d15d452bcec/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:42e5088973e56e31e4fa58eb6bd709e42fc03799c11c42929592889a2e54c491", size = 160670, upload-time = "2025-08-09T07:56:17.314Z" }, - { url = "https://files.pythonhosted.org/packages/31/e7/883ee5676a2ef217a40ce0bffcc3d0dfbf9e64cbcfbdf822c52981c3304b/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:cc34f233c9e71701040d772aa7490318673aa7164a0efe3172b2981218c26d93", size = 158501, upload-time = "2025-08-09T07:56:18.641Z" }, - { url = "https://files.pythonhosted.org/packages/c1/35/6525b21aa0db614cf8b5792d232021dca3df7f90a1944db934efa5d20bb1/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:320e8e66157cc4e247d9ddca8e21f427efc7a04bbd0ac8a9faf56583fa543f9f", size = 153173, upload-time = "2025-08-09T07:56:20.289Z" }, - { url = "https://files.pythonhosted.org/packages/50/ee/f4704bad8201de513fdc8aac1cabc87e38c5818c93857140e06e772b5892/charset_normalizer-3.4.3-cp312-cp312-win32.whl", hash = 
"sha256:fb6fecfd65564f208cbf0fba07f107fb661bcd1a7c389edbced3f7a493f70e37", size = 99822, upload-time = "2025-08-09T07:56:21.551Z" }, - { url = "https://files.pythonhosted.org/packages/39/f5/3b3836ca6064d0992c58c7561c6b6eee1b3892e9665d650c803bd5614522/charset_normalizer-3.4.3-cp312-cp312-win_amd64.whl", hash = "sha256:86df271bf921c2ee3818f0522e9a5b8092ca2ad8b065ece5d7d9d0e9f4849bcc", size = 107543, upload-time = "2025-08-09T07:56:23.115Z" }, - { url = "https://files.pythonhosted.org/packages/65/ca/2135ac97709b400c7654b4b764daf5c5567c2da45a30cdd20f9eefe2d658/charset_normalizer-3.4.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:14c2a87c65b351109f6abfc424cab3927b3bdece6f706e4d12faaf3d52ee5efe", size = 205326, upload-time = "2025-08-09T07:56:24.721Z" }, - { url = "https://files.pythonhosted.org/packages/71/11/98a04c3c97dd34e49c7d247083af03645ca3730809a5509443f3c37f7c99/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41d1fc408ff5fdfb910200ec0e74abc40387bccb3252f3f27c0676731df2b2c8", size = 146008, upload-time = "2025-08-09T07:56:26.004Z" }, - { url = "https://files.pythonhosted.org/packages/60/f5/4659a4cb3c4ec146bec80c32d8bb16033752574c20b1252ee842a95d1a1e/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:1bb60174149316da1c35fa5233681f7c0f9f514509b8e399ab70fea5f17e45c9", size = 159196, upload-time = "2025-08-09T07:56:27.25Z" }, - { url = "https://files.pythonhosted.org/packages/86/9e/f552f7a00611f168b9a5865a1414179b2c6de8235a4fa40189f6f79a1753/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:30d006f98569de3459c2fc1f2acde170b7b2bd265dc1943e87e1a4efe1b67c31", size = 156819, upload-time = "2025-08-09T07:56:28.515Z" }, - { url = 
"https://files.pythonhosted.org/packages/7e/95/42aa2156235cbc8fa61208aded06ef46111c4d3f0de233107b3f38631803/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:416175faf02e4b0810f1f38bcb54682878a4af94059a1cd63b8747244420801f", size = 151350, upload-time = "2025-08-09T07:56:29.716Z" }, - { url = "https://files.pythonhosted.org/packages/c2/a9/3865b02c56f300a6f94fc631ef54f0a8a29da74fb45a773dfd3dcd380af7/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6aab0f181c486f973bc7262a97f5aca3ee7e1437011ef0c2ec04b5a11d16c927", size = 148644, upload-time = "2025-08-09T07:56:30.984Z" }, - { url = "https://files.pythonhosted.org/packages/77/d9/cbcf1a2a5c7d7856f11e7ac2d782aec12bdfea60d104e60e0aa1c97849dc/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:fdabf8315679312cfa71302f9bd509ded4f2f263fb5b765cf1433b39106c3cc9", size = 160468, upload-time = "2025-08-09T07:56:32.252Z" }, - { url = "https://files.pythonhosted.org/packages/f6/42/6f45efee8697b89fda4d50580f292b8f7f9306cb2971d4b53f8914e4d890/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:bd28b817ea8c70215401f657edef3a8aa83c29d447fb0b622c35403780ba11d5", size = 158187, upload-time = "2025-08-09T07:56:33.481Z" }, - { url = "https://files.pythonhosted.org/packages/70/99/f1c3bdcfaa9c45b3ce96f70b14f070411366fa19549c1d4832c935d8e2c3/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:18343b2d246dc6761a249ba1fb13f9ee9a2bcd95decc767319506056ea4ad4dc", size = 152699, upload-time = "2025-08-09T07:56:34.739Z" }, - { url = "https://files.pythonhosted.org/packages/a3/ad/b0081f2f99a4b194bcbb1934ef3b12aa4d9702ced80a37026b7607c72e58/charset_normalizer-3.4.3-cp313-cp313-win32.whl", hash = "sha256:6fb70de56f1859a3f71261cbe41005f56a7842cc348d3aeb26237560bfa5e0ce", size = 99580, upload-time = "2025-08-09T07:56:35.981Z" }, - { url = 
"https://files.pythonhosted.org/packages/9a/8f/ae790790c7b64f925e5c953b924aaa42a243fb778fed9e41f147b2a5715a/charset_normalizer-3.4.3-cp313-cp313-win_amd64.whl", hash = "sha256:cf1ebb7d78e1ad8ec2a8c4732c7be2e736f6e5123a4146c5b89c9d1f585f8cef", size = 107366, upload-time = "2025-08-09T07:56:37.339Z" }, - { url = "https://files.pythonhosted.org/packages/8e/91/b5a06ad970ddc7a0e513112d40113e834638f4ca1120eb727a249fb2715e/charset_normalizer-3.4.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3cd35b7e8aedeb9e34c41385fda4f73ba609e561faedfae0a9e75e44ac558a15", size = 204342, upload-time = "2025-08-09T07:56:38.687Z" }, - { url = "https://files.pythonhosted.org/packages/ce/ec/1edc30a377f0a02689342f214455c3f6c2fbedd896a1d2f856c002fc3062/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b89bc04de1d83006373429975f8ef9e7932534b8cc9ca582e4db7d20d91816db", size = 145995, upload-time = "2025-08-09T07:56:40.048Z" }, - { url = "https://files.pythonhosted.org/packages/17/e5/5e67ab85e6d22b04641acb5399c8684f4d37caf7558a53859f0283a650e9/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2001a39612b241dae17b4687898843f254f8748b796a2e16f1051a17078d991d", size = 158640, upload-time = "2025-08-09T07:56:41.311Z" }, - { url = "https://files.pythonhosted.org/packages/f1/e5/38421987f6c697ee3722981289d554957c4be652f963d71c5e46a262e135/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8dcfc373f888e4fb39a7bc57e93e3b845e7f462dacc008d9749568b1c4ece096", size = 156636, upload-time = "2025-08-09T07:56:43.195Z" }, - { url = "https://files.pythonhosted.org/packages/a0/e4/5a075de8daa3ec0745a9a3b54467e0c2967daaaf2cec04c845f73493e9a1/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:18b97b8404387b96cdbd30ad660f6407799126d26a39ca65729162fd810a99aa", size = 150939, upload-time = "2025-08-09T07:56:44.819Z" }, - { url = "https://files.pythonhosted.org/packages/02/f7/3611b32318b30974131db62b4043f335861d4d9b49adc6d57c1149cc49d4/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ccf600859c183d70eb47e05a44cd80a4ce77394d1ac0f79dbd2dd90a69a3a049", size = 148580, upload-time = "2025-08-09T07:56:46.684Z" }, - { url = "https://files.pythonhosted.org/packages/7e/61/19b36f4bd67f2793ab6a99b979b4e4f3d8fc754cbdffb805335df4337126/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:53cd68b185d98dde4ad8990e56a58dea83a4162161b1ea9272e5c9182ce415e0", size = 159870, upload-time = "2025-08-09T07:56:47.941Z" }, - { url = "https://files.pythonhosted.org/packages/06/57/84722eefdd338c04cf3030ada66889298eaedf3e7a30a624201e0cbe424a/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:30a96e1e1f865f78b030d65241c1ee850cdf422d869e9028e2fc1d5e4db73b92", size = 157797, upload-time = "2025-08-09T07:56:49.756Z" }, - { url = "https://files.pythonhosted.org/packages/72/2a/aff5dd112b2f14bcc3462c312dce5445806bfc8ab3a7328555da95330e4b/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d716a916938e03231e86e43782ca7878fb602a125a91e7acb8b5112e2e96ac16", size = 152224, upload-time = "2025-08-09T07:56:51.369Z" }, - { url = "https://files.pythonhosted.org/packages/b7/8c/9839225320046ed279c6e839d51f028342eb77c91c89b8ef2549f951f3ec/charset_normalizer-3.4.3-cp314-cp314-win32.whl", hash = "sha256:c6dbd0ccdda3a2ba7c2ecd9d77b37f3b5831687d8dc1b6ca5f56a4880cc7b7ce", size = 100086, upload-time = "2025-08-09T07:56:52.722Z" }, - { url = "https://files.pythonhosted.org/packages/ee/7a/36fbcf646e41f710ce0a563c1c9a343c6edf9be80786edeb15b6f62e17db/charset_normalizer-3.4.3-cp314-cp314-win_amd64.whl", hash = "sha256:73dc19b562516fc9bcf6e5d6e596df0b4eb98d87e4f79f3ae71840e6ed21361c", size = 
107400, upload-time = "2025-08-09T07:56:55.172Z" }, - { url = "https://files.pythonhosted.org/packages/8a/1f/f041989e93b001bc4e44bb1669ccdcf54d3f00e628229a85b08d330615c5/charset_normalizer-3.4.3-py3-none-any.whl", hash = "sha256:ce571ab16d890d23b5c278547ba694193a45011ff86a9162a71307ed9f86759a", size = 53175, upload-time = "2025-08-09T07:57:26.864Z" }, -] - -[[package]] -name = "click" -version = "8.2.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "colorama", marker = "sys_platform == 'win32'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" }, -] - -[[package]] -name = "colorama" -version = "0.4.6" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, -] - -[[package]] -name = "coverage" -version = "7.10.7" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/51/26/d22c300112504f5f9a9fd2297ce33c35f3d353e4aeb987c8419453b2a7c2/coverage-7.10.7.tar.gz", hash = "sha256:f4ab143ab113be368a3e9b795f9cd7906c5ef407d6173fe9675a902e1fffc239", size = 827704, upload-time = "2025-09-21T20:03:56.815Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d2/5d/c1a17867b0456f2e9ce2d8d4708a4c3a089947d0bec9c66cdf60c9e7739f/coverage-7.10.7-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a609f9c93113be646f44c2a0256d6ea375ad047005d7f57a5c15f614dc1b2f59", size = 218102, upload-time = "2025-09-21T20:01:16.089Z" }, - { url = "https://files.pythonhosted.org/packages/54/f0/514dcf4b4e3698b9a9077f084429681bf3aad2b4a72578f89d7f643eb506/coverage-7.10.7-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:65646bb0359386e07639c367a22cf9b5bf6304e8630b565d0626e2bdf329227a", size = 218505, upload-time = "2025-09-21T20:01:17.788Z" }, - { url = "https://files.pythonhosted.org/packages/20/f6/9626b81d17e2a4b25c63ac1b425ff307ecdeef03d67c9a147673ae40dc36/coverage-7.10.7-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:5f33166f0dfcce728191f520bd2692914ec70fac2713f6bf3ce59c3deacb4699", size = 248898, upload-time = "2025-09-21T20:01:19.488Z" }, - { url = "https://files.pythonhosted.org/packages/b0/ef/bd8e719c2f7417ba03239052e099b76ea1130ac0cbb183ee1fcaa58aaff3/coverage-7.10.7-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:35f5e3f9e455bb17831876048355dca0f758b6df22f49258cb5a91da23ef437d", size = 250831, upload-time = "2025-09-21T20:01:20.817Z" }, - { url = "https://files.pythonhosted.org/packages/a5/b6/bf054de41ec948b151ae2b79a55c107f5760979538f5fb80c195f2517718/coverage-7.10.7-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4da86b6d62a496e908ac2898243920c7992499c1712ff7c2b6d837cc69d9467e", size = 252937, upload-time = "2025-09-21T20:01:22.171Z" }, - { url = 
"https://files.pythonhosted.org/packages/0f/e5/3860756aa6f9318227443c6ce4ed7bf9e70bb7f1447a0353f45ac5c7974b/coverage-7.10.7-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:6b8b09c1fad947c84bbbc95eca841350fad9cbfa5a2d7ca88ac9f8d836c92e23", size = 249021, upload-time = "2025-09-21T20:01:23.907Z" }, - { url = "https://files.pythonhosted.org/packages/26/0f/bd08bd042854f7fd07b45808927ebcce99a7ed0f2f412d11629883517ac2/coverage-7.10.7-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:4376538f36b533b46f8971d3a3e63464f2c7905c9800db97361c43a2b14792ab", size = 250626, upload-time = "2025-09-21T20:01:25.721Z" }, - { url = "https://files.pythonhosted.org/packages/8e/a7/4777b14de4abcc2e80c6b1d430f5d51eb18ed1d75fca56cbce5f2db9b36e/coverage-7.10.7-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:121da30abb574f6ce6ae09840dae322bef734480ceafe410117627aa54f76d82", size = 248682, upload-time = "2025-09-21T20:01:27.105Z" }, - { url = "https://files.pythonhosted.org/packages/34/72/17d082b00b53cd45679bad682fac058b87f011fd8b9fe31d77f5f8d3a4e4/coverage-7.10.7-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:88127d40df529336a9836870436fc2751c339fbaed3a836d42c93f3e4bd1d0a2", size = 248402, upload-time = "2025-09-21T20:01:28.629Z" }, - { url = "https://files.pythonhosted.org/packages/81/7a/92367572eb5bdd6a84bfa278cc7e97db192f9f45b28c94a9ca1a921c3577/coverage-7.10.7-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ba58bbcd1b72f136080c0bccc2400d66cc6115f3f906c499013d065ac33a4b61", size = 249320, upload-time = "2025-09-21T20:01:30.004Z" }, - { url = "https://files.pythonhosted.org/packages/2f/88/a23cc185f6a805dfc4fdf14a94016835eeb85e22ac3a0e66d5e89acd6462/coverage-7.10.7-cp311-cp311-win32.whl", hash = "sha256:972b9e3a4094b053a4e46832b4bc829fc8a8d347160eb39d03f1690316a99c14", size = 220536, upload-time = "2025-09-21T20:01:32.184Z" }, - { url = 
"https://files.pythonhosted.org/packages/fe/ef/0b510a399dfca17cec7bc2f05ad8bd78cf55f15c8bc9a73ab20c5c913c2e/coverage-7.10.7-cp311-cp311-win_amd64.whl", hash = "sha256:a7b55a944a7f43892e28ad4bc0561dfd5f0d73e605d1aa5c3c976b52aea121d2", size = 221425, upload-time = "2025-09-21T20:01:33.557Z" }, - { url = "https://files.pythonhosted.org/packages/51/7f/023657f301a276e4ba1850f82749bc136f5a7e8768060c2e5d9744a22951/coverage-7.10.7-cp311-cp311-win_arm64.whl", hash = "sha256:736f227fb490f03c6488f9b6d45855f8e0fd749c007f9303ad30efab0e73c05a", size = 220103, upload-time = "2025-09-21T20:01:34.929Z" }, - { url = "https://files.pythonhosted.org/packages/13/e4/eb12450f71b542a53972d19117ea5a5cea1cab3ac9e31b0b5d498df1bd5a/coverage-7.10.7-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7bb3b9ddb87ef7725056572368040c32775036472d5a033679d1fa6c8dc08417", size = 218290, upload-time = "2025-09-21T20:01:36.455Z" }, - { url = "https://files.pythonhosted.org/packages/37/66/593f9be12fc19fb36711f19a5371af79a718537204d16ea1d36f16bd78d2/coverage-7.10.7-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:18afb24843cbc175687225cab1138c95d262337f5473512010e46831aa0c2973", size = 218515, upload-time = "2025-09-21T20:01:37.982Z" }, - { url = "https://files.pythonhosted.org/packages/66/80/4c49f7ae09cafdacc73fbc30949ffe77359635c168f4e9ff33c9ebb07838/coverage-7.10.7-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:399a0b6347bcd3822be369392932884b8216d0944049ae22925631a9b3d4ba4c", size = 250020, upload-time = "2025-09-21T20:01:39.617Z" }, - { url = "https://files.pythonhosted.org/packages/a6/90/a64aaacab3b37a17aaedd83e8000142561a29eb262cede42d94a67f7556b/coverage-7.10.7-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:314f2c326ded3f4b09be11bc282eb2fc861184bc95748ae67b360ac962770be7", size = 252769, upload-time = "2025-09-21T20:01:41.341Z" }, - { url = 
"https://files.pythonhosted.org/packages/98/2e/2dda59afd6103b342e096f246ebc5f87a3363b5412609946c120f4e7750d/coverage-7.10.7-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c41e71c9cfb854789dee6fc51e46743a6d138b1803fab6cb860af43265b42ea6", size = 253901, upload-time = "2025-09-21T20:01:43.042Z" }, - { url = "https://files.pythonhosted.org/packages/53/dc/8d8119c9051d50f3119bb4a75f29f1e4a6ab9415cd1fa8bf22fcc3fb3b5f/coverage-7.10.7-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc01f57ca26269c2c706e838f6422e2a8788e41b3e3c65e2f41148212e57cd59", size = 250413, upload-time = "2025-09-21T20:01:44.469Z" }, - { url = "https://files.pythonhosted.org/packages/98/b3/edaff9c5d79ee4d4b6d3fe046f2b1d799850425695b789d491a64225d493/coverage-7.10.7-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a6442c59a8ac8b85812ce33bc4d05bde3fb22321fa8294e2a5b487c3505f611b", size = 251820, upload-time = "2025-09-21T20:01:45.915Z" }, - { url = "https://files.pythonhosted.org/packages/11/25/9a0728564bb05863f7e513e5a594fe5ffef091b325437f5430e8cfb0d530/coverage-7.10.7-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:78a384e49f46b80fb4c901d52d92abe098e78768ed829c673fbb53c498bef73a", size = 249941, upload-time = "2025-09-21T20:01:47.296Z" }, - { url = "https://files.pythonhosted.org/packages/e0/fd/ca2650443bfbef5b0e74373aac4df67b08180d2f184b482c41499668e258/coverage-7.10.7-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:5e1e9802121405ede4b0133aa4340ad8186a1d2526de5b7c3eca519db7bb89fb", size = 249519, upload-time = "2025-09-21T20:01:48.73Z" }, - { url = "https://files.pythonhosted.org/packages/24/79/f692f125fb4299b6f963b0745124998ebb8e73ecdfce4ceceb06a8c6bec5/coverage-7.10.7-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:d41213ea25a86f69efd1575073d34ea11aabe075604ddf3d148ecfec9e1e96a1", size = 251375, upload-time = "2025-09-21T20:01:50.529Z" }, - { url = 
"https://files.pythonhosted.org/packages/5e/75/61b9bbd6c7d24d896bfeec57acba78e0f8deac68e6baf2d4804f7aae1f88/coverage-7.10.7-cp312-cp312-win32.whl", hash = "sha256:77eb4c747061a6af8d0f7bdb31f1e108d172762ef579166ec84542f711d90256", size = 220699, upload-time = "2025-09-21T20:01:51.941Z" }, - { url = "https://files.pythonhosted.org/packages/ca/f3/3bf7905288b45b075918d372498f1cf845b5b579b723c8fd17168018d5f5/coverage-7.10.7-cp312-cp312-win_amd64.whl", hash = "sha256:f51328ffe987aecf6d09f3cd9d979face89a617eacdaea43e7b3080777f647ba", size = 221512, upload-time = "2025-09-21T20:01:53.481Z" }, - { url = "https://files.pythonhosted.org/packages/5c/44/3e32dbe933979d05cf2dac5e697c8599cfe038aaf51223ab901e208d5a62/coverage-7.10.7-cp312-cp312-win_arm64.whl", hash = "sha256:bda5e34f8a75721c96085903c6f2197dc398c20ffd98df33f866a9c8fd95f4bf", size = 220147, upload-time = "2025-09-21T20:01:55.2Z" }, - { url = "https://files.pythonhosted.org/packages/9a/94/b765c1abcb613d103b64fcf10395f54d69b0ef8be6a0dd9c524384892cc7/coverage-7.10.7-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:981a651f543f2854abd3b5fcb3263aac581b18209be49863ba575de6edf4c14d", size = 218320, upload-time = "2025-09-21T20:01:56.629Z" }, - { url = "https://files.pythonhosted.org/packages/72/4f/732fff31c119bb73b35236dd333030f32c4bfe909f445b423e6c7594f9a2/coverage-7.10.7-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:73ab1601f84dc804f7812dc297e93cd99381162da39c47040a827d4e8dafe63b", size = 218575, upload-time = "2025-09-21T20:01:58.203Z" }, - { url = "https://files.pythonhosted.org/packages/87/02/ae7e0af4b674be47566707777db1aa375474f02a1d64b9323e5813a6cdd5/coverage-7.10.7-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:a8b6f03672aa6734e700bbcd65ff050fd19cddfec4b031cc8cf1c6967de5a68e", size = 249568, upload-time = "2025-09-21T20:01:59.748Z" }, - { url = 
"https://files.pythonhosted.org/packages/a2/77/8c6d22bf61921a59bce5471c2f1f7ac30cd4ac50aadde72b8c48d5727902/coverage-7.10.7-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:10b6ba00ab1132a0ce4428ff68cf50a25efd6840a42cdf4239c9b99aad83be8b", size = 252174, upload-time = "2025-09-21T20:02:01.192Z" }, - { url = "https://files.pythonhosted.org/packages/b1/20/b6ea4f69bbb52dac0aebd62157ba6a9dddbfe664f5af8122dac296c3ee15/coverage-7.10.7-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c79124f70465a150e89340de5963f936ee97097d2ef76c869708c4248c63ca49", size = 253447, upload-time = "2025-09-21T20:02:02.701Z" }, - { url = "https://files.pythonhosted.org/packages/f9/28/4831523ba483a7f90f7b259d2018fef02cb4d5b90bc7c1505d6e5a84883c/coverage-7.10.7-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:69212fbccdbd5b0e39eac4067e20a4a5256609e209547d86f740d68ad4f04911", size = 249779, upload-time = "2025-09-21T20:02:04.185Z" }, - { url = "https://files.pythonhosted.org/packages/a7/9f/4331142bc98c10ca6436d2d620c3e165f31e6c58d43479985afce6f3191c/coverage-7.10.7-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7ea7c6c9d0d286d04ed3541747e6597cbe4971f22648b68248f7ddcd329207f0", size = 251604, upload-time = "2025-09-21T20:02:06.034Z" }, - { url = "https://files.pythonhosted.org/packages/ce/60/bda83b96602036b77ecf34e6393a3836365481b69f7ed7079ab85048202b/coverage-7.10.7-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b9be91986841a75042b3e3243d0b3cb0b2434252b977baaf0cd56e960fe1e46f", size = 249497, upload-time = "2025-09-21T20:02:07.619Z" }, - { url = "https://files.pythonhosted.org/packages/5f/af/152633ff35b2af63977edd835d8e6430f0caef27d171edf2fc76c270ef31/coverage-7.10.7-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:b281d5eca50189325cfe1f365fafade89b14b4a78d9b40b05ddd1fc7d2a10a9c", size = 249350, upload-time = "2025-09-21T20:02:10.34Z" }, - { url = 
"https://files.pythonhosted.org/packages/9d/71/d92105d122bd21cebba877228990e1646d862e34a98bb3374d3fece5a794/coverage-7.10.7-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:99e4aa63097ab1118e75a848a28e40d68b08a5e19ce587891ab7fd04475e780f", size = 251111, upload-time = "2025-09-21T20:02:12.122Z" }, - { url = "https://files.pythonhosted.org/packages/a2/9e/9fdb08f4bf476c912f0c3ca292e019aab6712c93c9344a1653986c3fd305/coverage-7.10.7-cp313-cp313-win32.whl", hash = "sha256:dc7c389dce432500273eaf48f410b37886be9208b2dd5710aaf7c57fd442c698", size = 220746, upload-time = "2025-09-21T20:02:13.919Z" }, - { url = "https://files.pythonhosted.org/packages/b1/b1/a75fd25df44eab52d1931e89980d1ada46824c7a3210be0d3c88a44aaa99/coverage-7.10.7-cp313-cp313-win_amd64.whl", hash = "sha256:cac0fdca17b036af3881a9d2729a850b76553f3f716ccb0360ad4dbc06b3b843", size = 221541, upload-time = "2025-09-21T20:02:15.57Z" }, - { url = "https://files.pythonhosted.org/packages/14/3a/d720d7c989562a6e9a14b2c9f5f2876bdb38e9367126d118495b89c99c37/coverage-7.10.7-cp313-cp313-win_arm64.whl", hash = "sha256:4b6f236edf6e2f9ae8fcd1332da4e791c1b6ba0dc16a2dc94590ceccb482e546", size = 220170, upload-time = "2025-09-21T20:02:17.395Z" }, - { url = "https://files.pythonhosted.org/packages/bb/22/e04514bf2a735d8b0add31d2b4ab636fc02370730787c576bb995390d2d5/coverage-7.10.7-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:a0ec07fd264d0745ee396b666d47cef20875f4ff2375d7c4f58235886cc1ef0c", size = 219029, upload-time = "2025-09-21T20:02:18.936Z" }, - { url = "https://files.pythonhosted.org/packages/11/0b/91128e099035ece15da3445d9015e4b4153a6059403452d324cbb0a575fa/coverage-7.10.7-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:dd5e856ebb7bfb7672b0086846db5afb4567a7b9714b8a0ebafd211ec7ce6a15", size = 219259, upload-time = "2025-09-21T20:02:20.44Z" }, - { url = 
"https://files.pythonhosted.org/packages/8b/51/66420081e72801536a091a0c8f8c1f88a5c4bf7b9b1bdc6222c7afe6dc9b/coverage-7.10.7-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:f57b2a3c8353d3e04acf75b3fed57ba41f5c0646bbf1d10c7c282291c97936b4", size = 260592, upload-time = "2025-09-21T20:02:22.313Z" }, - { url = "https://files.pythonhosted.org/packages/5d/22/9b8d458c2881b22df3db5bb3e7369e63d527d986decb6c11a591ba2364f7/coverage-7.10.7-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:1ef2319dd15a0b009667301a3f84452a4dc6fddfd06b0c5c53ea472d3989fbf0", size = 262768, upload-time = "2025-09-21T20:02:24.287Z" }, - { url = "https://files.pythonhosted.org/packages/f7/08/16bee2c433e60913c610ea200b276e8eeef084b0d200bdcff69920bd5828/coverage-7.10.7-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:83082a57783239717ceb0ad584de3c69cf581b2a95ed6bf81ea66034f00401c0", size = 264995, upload-time = "2025-09-21T20:02:26.133Z" }, - { url = "https://files.pythonhosted.org/packages/20/9d/e53eb9771d154859b084b90201e5221bca7674ba449a17c101a5031d4054/coverage-7.10.7-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:50aa94fb1fb9a397eaa19c0d5ec15a5edd03a47bf1a3a6111a16b36e190cff65", size = 259546, upload-time = "2025-09-21T20:02:27.716Z" }, - { url = "https://files.pythonhosted.org/packages/ad/b0/69bc7050f8d4e56a89fb550a1577d5d0d1db2278106f6f626464067b3817/coverage-7.10.7-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:2120043f147bebb41c85b97ac45dd173595ff14f2a584f2963891cbcc3091541", size = 262544, upload-time = "2025-09-21T20:02:29.216Z" }, - { url = "https://files.pythonhosted.org/packages/ef/4b/2514b060dbd1bc0aaf23b852c14bb5818f244c664cb16517feff6bb3a5ab/coverage-7.10.7-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:2fafd773231dd0378fdba66d339f84904a8e57a262f583530f4f156ab83863e6", size = 260308, upload-time = 
"2025-09-21T20:02:31.226Z" }, - { url = "https://files.pythonhosted.org/packages/54/78/7ba2175007c246d75e496f64c06e94122bdb914790a1285d627a918bd271/coverage-7.10.7-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:0b944ee8459f515f28b851728ad224fa2d068f1513ef6b7ff1efafeb2185f999", size = 258920, upload-time = "2025-09-21T20:02:32.823Z" }, - { url = "https://files.pythonhosted.org/packages/c0/b3/fac9f7abbc841409b9a410309d73bfa6cfb2e51c3fada738cb607ce174f8/coverage-7.10.7-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4b583b97ab2e3efe1b3e75248a9b333bd3f8b0b1b8e5b45578e05e5850dfb2c2", size = 261434, upload-time = "2025-09-21T20:02:34.86Z" }, - { url = "https://files.pythonhosted.org/packages/ee/51/a03bec00d37faaa891b3ff7387192cef20f01604e5283a5fabc95346befa/coverage-7.10.7-cp313-cp313t-win32.whl", hash = "sha256:2a78cd46550081a7909b3329e2266204d584866e8d97b898cd7fb5ac8d888b1a", size = 221403, upload-time = "2025-09-21T20:02:37.034Z" }, - { url = "https://files.pythonhosted.org/packages/53/22/3cf25d614e64bf6d8e59c7c669b20d6d940bb337bdee5900b9ca41c820bb/coverage-7.10.7-cp313-cp313t-win_amd64.whl", hash = "sha256:33a5e6396ab684cb43dc7befa386258acb2d7fae7f67330ebb85ba4ea27938eb", size = 222469, upload-time = "2025-09-21T20:02:39.011Z" }, - { url = "https://files.pythonhosted.org/packages/49/a1/00164f6d30d8a01c3c9c48418a7a5be394de5349b421b9ee019f380df2a0/coverage-7.10.7-cp313-cp313t-win_arm64.whl", hash = "sha256:86b0e7308289ddde73d863b7683f596d8d21c7d8664ce1dee061d0bcf3fbb4bb", size = 220731, upload-time = "2025-09-21T20:02:40.939Z" }, - { url = "https://files.pythonhosted.org/packages/23/9c/5844ab4ca6a4dd97a1850e030a15ec7d292b5c5cb93082979225126e35dd/coverage-7.10.7-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:b06f260b16ead11643a5a9f955bd4b5fd76c1a4c6796aeade8520095b75de520", size = 218302, upload-time = "2025-09-21T20:02:42.527Z" }, - { url = 
"https://files.pythonhosted.org/packages/f0/89/673f6514b0961d1f0e20ddc242e9342f6da21eaba3489901b565c0689f34/coverage-7.10.7-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:212f8f2e0612778f09c55dd4872cb1f64a1f2b074393d139278ce902064d5b32", size = 218578, upload-time = "2025-09-21T20:02:44.468Z" }, - { url = "https://files.pythonhosted.org/packages/05/e8/261cae479e85232828fb17ad536765c88dd818c8470aca690b0ac6feeaa3/coverage-7.10.7-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:3445258bcded7d4aa630ab8296dea4d3f15a255588dd535f980c193ab6b95f3f", size = 249629, upload-time = "2025-09-21T20:02:46.503Z" }, - { url = "https://files.pythonhosted.org/packages/82/62/14ed6546d0207e6eda876434e3e8475a3e9adbe32110ce896c9e0c06bb9a/coverage-7.10.7-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:bb45474711ba385c46a0bfe696c695a929ae69ac636cda8f532be9e8c93d720a", size = 252162, upload-time = "2025-09-21T20:02:48.689Z" }, - { url = "https://files.pythonhosted.org/packages/ff/49/07f00db9ac6478e4358165a08fb41b469a1b053212e8a00cb02f0d27a05f/coverage-7.10.7-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:813922f35bd800dca9994c5971883cbc0d291128a5de6b167c7aa697fcf59360", size = 253517, upload-time = "2025-09-21T20:02:50.31Z" }, - { url = "https://files.pythonhosted.org/packages/a2/59/c5201c62dbf165dfbc91460f6dbbaa85a8b82cfa6131ac45d6c1bfb52deb/coverage-7.10.7-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:93c1b03552081b2a4423091d6fb3787265b8f86af404cff98d1b5342713bdd69", size = 249632, upload-time = "2025-09-21T20:02:51.971Z" }, - { url = "https://files.pythonhosted.org/packages/07/ae/5920097195291a51fb00b3a70b9bbd2edbfe3c84876a1762bd1ef1565ebc/coverage-7.10.7-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:cc87dd1b6eaf0b848eebb1c86469b9f72a1891cb42ac7adcfbce75eadb13dd14", size = 251520, upload-time = 
"2025-09-21T20:02:53.858Z" }, - { url = "https://files.pythonhosted.org/packages/b9/3c/a815dde77a2981f5743a60b63df31cb322c944843e57dbd579326625a413/coverage-7.10.7-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:39508ffda4f343c35f3236fe8d1a6634a51f4581226a1262769d7f970e73bffe", size = 249455, upload-time = "2025-09-21T20:02:55.807Z" }, - { url = "https://files.pythonhosted.org/packages/aa/99/f5cdd8421ea656abefb6c0ce92556709db2265c41e8f9fc6c8ae0f7824c9/coverage-7.10.7-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:925a1edf3d810537c5a3abe78ec5530160c5f9a26b1f4270b40e62cc79304a1e", size = 249287, upload-time = "2025-09-21T20:02:57.784Z" }, - { url = "https://files.pythonhosted.org/packages/c3/7a/e9a2da6a1fc5d007dd51fca083a663ab930a8c4d149c087732a5dbaa0029/coverage-7.10.7-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2c8b9a0636f94c43cd3576811e05b89aa9bc2d0a85137affc544ae5cb0e4bfbd", size = 250946, upload-time = "2025-09-21T20:02:59.431Z" }, - { url = "https://files.pythonhosted.org/packages/ef/5b/0b5799aa30380a949005a353715095d6d1da81927d6dbed5def2200a4e25/coverage-7.10.7-cp314-cp314-win32.whl", hash = "sha256:b7b8288eb7cdd268b0304632da8cb0bb93fadcfec2fe5712f7b9cc8f4d487be2", size = 221009, upload-time = "2025-09-21T20:03:01.324Z" }, - { url = "https://files.pythonhosted.org/packages/da/b0/e802fbb6eb746de006490abc9bb554b708918b6774b722bb3a0e6aa1b7de/coverage-7.10.7-cp314-cp314-win_amd64.whl", hash = "sha256:1ca6db7c8807fb9e755d0379ccc39017ce0a84dcd26d14b5a03b78563776f681", size = 221804, upload-time = "2025-09-21T20:03:03.4Z" }, - { url = "https://files.pythonhosted.org/packages/9e/e8/71d0c8e374e31f39e3389bb0bd19e527d46f00ea8571ec7ec8fd261d8b44/coverage-7.10.7-cp314-cp314-win_arm64.whl", hash = "sha256:097c1591f5af4496226d5783d036bf6fd6cd0cbc132e071b33861de756efb880", size = 220384, upload-time = "2025-09-21T20:03:05.111Z" }, - { url = 
"https://files.pythonhosted.org/packages/62/09/9a5608d319fa3eba7a2019addeacb8c746fb50872b57a724c9f79f146969/coverage-7.10.7-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:a62c6ef0d50e6de320c270ff91d9dd0a05e7250cac2a800b7784bae474506e63", size = 219047, upload-time = "2025-09-21T20:03:06.795Z" }, - { url = "https://files.pythonhosted.org/packages/f5/6f/f58d46f33db9f2e3647b2d0764704548c184e6f5e014bef528b7f979ef84/coverage-7.10.7-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:9fa6e4dd51fe15d8738708a973470f67a855ca50002294852e9571cdbd9433f2", size = 219266, upload-time = "2025-09-21T20:03:08.495Z" }, - { url = "https://files.pythonhosted.org/packages/74/5c/183ffc817ba68e0b443b8c934c8795553eb0c14573813415bd59941ee165/coverage-7.10.7-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:8fb190658865565c549b6b4706856d6a7b09302c797eb2cf8e7fe9dabb043f0d", size = 260767, upload-time = "2025-09-21T20:03:10.172Z" }, - { url = "https://files.pythonhosted.org/packages/0f/48/71a8abe9c1ad7e97548835e3cc1adbf361e743e9d60310c5f75c9e7bf847/coverage-7.10.7-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:affef7c76a9ef259187ef31599a9260330e0335a3011732c4b9effa01e1cd6e0", size = 262931, upload-time = "2025-09-21T20:03:11.861Z" }, - { url = "https://files.pythonhosted.org/packages/84/fd/193a8fb132acfc0a901f72020e54be5e48021e1575bb327d8ee1097a28fd/coverage-7.10.7-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6e16e07d85ca0cf8bafe5f5d23a0b850064e8e945d5677492b06bbe6f09cc699", size = 265186, upload-time = "2025-09-21T20:03:13.539Z" }, - { url = "https://files.pythonhosted.org/packages/b1/8f/74ecc30607dd95ad50e3034221113ccb1c6d4e8085cc761134782995daae/coverage-7.10.7-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:03ffc58aacdf65d2a82bbeb1ffe4d01ead4017a21bfd0454983b88ca73af94b9", size = 259470, upload-time = 
"2025-09-21T20:03:15.584Z" }, - { url = "https://files.pythonhosted.org/packages/0f/55/79ff53a769f20d71b07023ea115c9167c0bb56f281320520cf64c5298a96/coverage-7.10.7-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:1b4fd784344d4e52647fd7857b2af5b3fbe6c239b0b5fa63e94eb67320770e0f", size = 262626, upload-time = "2025-09-21T20:03:17.673Z" }, - { url = "https://files.pythonhosted.org/packages/88/e2/dac66c140009b61ac3fc13af673a574b00c16efdf04f9b5c740703e953c0/coverage-7.10.7-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:0ebbaddb2c19b71912c6f2518e791aa8b9f054985a0769bdb3a53ebbc765c6a1", size = 260386, upload-time = "2025-09-21T20:03:19.36Z" }, - { url = "https://files.pythonhosted.org/packages/a2/f1/f48f645e3f33bb9ca8a496bc4a9671b52f2f353146233ebd7c1df6160440/coverage-7.10.7-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:a2d9a3b260cc1d1dbdb1c582e63ddcf5363426a1a68faa0f5da28d8ee3c722a0", size = 258852, upload-time = "2025-09-21T20:03:21.007Z" }, - { url = "https://files.pythonhosted.org/packages/bb/3b/8442618972c51a7affeead957995cfa8323c0c9bcf8fa5a027421f720ff4/coverage-7.10.7-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:a3cc8638b2480865eaa3926d192e64ce6c51e3d29c849e09d5b4ad95efae5399", size = 261534, upload-time = "2025-09-21T20:03:23.12Z" }, - { url = "https://files.pythonhosted.org/packages/b2/dc/101f3fa3a45146db0cb03f5b4376e24c0aac818309da23e2de0c75295a91/coverage-7.10.7-cp314-cp314t-win32.whl", hash = "sha256:67f8c5cbcd3deb7a60b3345dffc89a961a484ed0af1f6f73de91705cc6e31235", size = 221784, upload-time = "2025-09-21T20:03:24.769Z" }, - { url = "https://files.pythonhosted.org/packages/4c/a1/74c51803fc70a8a40d7346660379e144be772bab4ac7bb6e6b905152345c/coverage-7.10.7-cp314-cp314t-win_amd64.whl", hash = "sha256:e1ed71194ef6dea7ed2d5cb5f7243d4bcd334bfb63e59878519be558078f848d", size = 222905, upload-time = "2025-09-21T20:03:26.93Z" }, - { url = 
"https://files.pythonhosted.org/packages/12/65/f116a6d2127df30bcafbceef0302d8a64ba87488bf6f73a6d8eebf060873/coverage-7.10.7-cp314-cp314t-win_arm64.whl", hash = "sha256:7fe650342addd8524ca63d77b2362b02345e5f1a093266787d210c70a50b471a", size = 220922, upload-time = "2025-09-21T20:03:28.672Z" }, - { url = "https://files.pythonhosted.org/packages/ec/16/114df1c291c22cac3b0c127a73e0af5c12ed7bbb6558d310429a0ae24023/coverage-7.10.7-py3-none-any.whl", hash = "sha256:f7941f6f2fe6dd6807a1208737b8a0cbcf1cc6d7b07d24998ad2d63590868260", size = 209952, upload-time = "2025-09-21T20:03:53.918Z" }, -] - -[package.optional-dependencies] -toml = [ - { name = "tomli", marker = "python_full_version <= '3.11'" }, -] - -[[package]] -name = "cryptography" -version = "45.0.7" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "cffi", marker = "platform_python_implementation != 'PyPy'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/a7/35/c495bffc2056f2dadb32434f1feedd79abde2a7f8363e1974afa9c33c7e2/cryptography-45.0.7.tar.gz", hash = "sha256:4b1654dfc64ea479c242508eb8c724044f1e964a47d1d1cacc5132292d851971", size = 744980, upload-time = "2025-09-01T11:15:03.146Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/0c/91/925c0ac74362172ae4516000fe877912e33b5983df735ff290c653de4913/cryptography-45.0.7-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:3be4f21c6245930688bd9e162829480de027f8bf962ede33d4f8ba7d67a00cee", size = 7041105, upload-time = "2025-09-01T11:13:59.684Z" }, - { url = "https://files.pythonhosted.org/packages/fc/63/43641c5acce3a6105cf8bd5baeceeb1846bb63067d26dae3e5db59f1513a/cryptography-45.0.7-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:67285f8a611b0ebc0857ced2081e30302909f571a46bfa7a3cc0ad303fe015c6", size = 4205799, upload-time = "2025-09-01T11:14:02.517Z" }, - { url = 
"https://files.pythonhosted.org/packages/bc/29/c238dd9107f10bfde09a4d1c52fd38828b1aa353ced11f358b5dd2507d24/cryptography-45.0.7-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:577470e39e60a6cd7780793202e63536026d9b8641de011ed9d8174da9ca5339", size = 4430504, upload-time = "2025-09-01T11:14:04.522Z" }, - { url = "https://files.pythonhosted.org/packages/62/62/24203e7cbcc9bd7c94739428cd30680b18ae6b18377ae66075c8e4771b1b/cryptography-45.0.7-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:4bd3e5c4b9682bc112d634f2c6ccc6736ed3635fc3319ac2bb11d768cc5a00d8", size = 4209542, upload-time = "2025-09-01T11:14:06.309Z" }, - { url = "https://files.pythonhosted.org/packages/cd/e3/e7de4771a08620eef2389b86cd87a2c50326827dea5528feb70595439ce4/cryptography-45.0.7-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:465ccac9d70115cd4de7186e60cfe989de73f7bb23e8a7aa45af18f7412e75bf", size = 3889244, upload-time = "2025-09-01T11:14:08.152Z" }, - { url = "https://files.pythonhosted.org/packages/96/b8/bca71059e79a0bb2f8e4ec61d9c205fbe97876318566cde3b5092529faa9/cryptography-45.0.7-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:16ede8a4f7929b4b7ff3642eba2bf79aa1d71f24ab6ee443935c0d269b6bc513", size = 4461975, upload-time = "2025-09-01T11:14:09.755Z" }, - { url = "https://files.pythonhosted.org/packages/58/67/3f5b26937fe1218c40e95ef4ff8d23c8dc05aa950d54200cc7ea5fb58d28/cryptography-45.0.7-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:8978132287a9d3ad6b54fcd1e08548033cc09dc6aacacb6c004c73c3eb5d3ac3", size = 4209082, upload-time = "2025-09-01T11:14:11.229Z" }, - { url = "https://files.pythonhosted.org/packages/0e/e4/b3e68a4ac363406a56cf7b741eeb80d05284d8c60ee1a55cdc7587e2a553/cryptography-45.0.7-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:b6a0e535baec27b528cb07a119f321ac024592388c5681a5ced167ae98e9fff3", size = 4460397, upload-time = "2025-09-01T11:14:12.924Z" }, - { url = 
"https://files.pythonhosted.org/packages/22/49/2c93f3cd4e3efc8cb22b02678c1fad691cff9dd71bb889e030d100acbfe0/cryptography-45.0.7-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:a24ee598d10befaec178efdff6054bc4d7e883f615bfbcd08126a0f4931c83a6", size = 4337244, upload-time = "2025-09-01T11:14:14.431Z" }, - { url = "https://files.pythonhosted.org/packages/04/19/030f400de0bccccc09aa262706d90f2ec23d56bc4eb4f4e8268d0ddf3fb8/cryptography-45.0.7-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:fa26fa54c0a9384c27fcdc905a2fb7d60ac6e47d14bc2692145f2b3b1e2cfdbd", size = 4568862, upload-time = "2025-09-01T11:14:16.185Z" }, - { url = "https://files.pythonhosted.org/packages/29/56/3034a3a353efa65116fa20eb3c990a8c9f0d3db4085429040a7eef9ada5f/cryptography-45.0.7-cp311-abi3-win32.whl", hash = "sha256:bef32a5e327bd8e5af915d3416ffefdbe65ed975b646b3805be81b23580b57b8", size = 2936578, upload-time = "2025-09-01T11:14:17.638Z" }, - { url = "https://files.pythonhosted.org/packages/b3/61/0ab90f421c6194705a99d0fa9f6ee2045d916e4455fdbb095a9c2c9a520f/cryptography-45.0.7-cp311-abi3-win_amd64.whl", hash = "sha256:3808e6b2e5f0b46d981c24d79648e5c25c35e59902ea4391a0dcb3e667bf7443", size = 3405400, upload-time = "2025-09-01T11:14:18.958Z" }, - { url = "https://files.pythonhosted.org/packages/63/e8/c436233ddf19c5f15b25ace33979a9dd2e7aa1a59209a0ee8554179f1cc0/cryptography-45.0.7-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:bfb4c801f65dd61cedfc61a83732327fafbac55a47282e6f26f073ca7a41c3b2", size = 7021824, upload-time = "2025-09-01T11:14:20.954Z" }, - { url = "https://files.pythonhosted.org/packages/bc/4c/8f57f2500d0ccd2675c5d0cc462095adf3faa8c52294ba085c036befb901/cryptography-45.0.7-cp37-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:81823935e2f8d476707e85a78a405953a03ef7b7b4f55f93f7c2d9680e5e0691", size = 4202233, upload-time = "2025-09-01T11:14:22.454Z" }, - { url = 
"https://files.pythonhosted.org/packages/eb/ac/59b7790b4ccaed739fc44775ce4645c9b8ce54cbec53edf16c74fd80cb2b/cryptography-45.0.7-cp37-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3994c809c17fc570c2af12c9b840d7cea85a9fd3e5c0e0491f4fa3c029216d59", size = 4423075, upload-time = "2025-09-01T11:14:24.287Z" }, - { url = "https://files.pythonhosted.org/packages/b8/56/d4f07ea21434bf891faa088a6ac15d6d98093a66e75e30ad08e88aa2b9ba/cryptography-45.0.7-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:dad43797959a74103cb59c5dac71409f9c27d34c8a05921341fb64ea8ccb1dd4", size = 4204517, upload-time = "2025-09-01T11:14:25.679Z" }, - { url = "https://files.pythonhosted.org/packages/e8/ac/924a723299848b4c741c1059752c7cfe09473b6fd77d2920398fc26bfb53/cryptography-45.0.7-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:ce7a453385e4c4693985b4a4a3533e041558851eae061a58a5405363b098fcd3", size = 3882893, upload-time = "2025-09-01T11:14:27.1Z" }, - { url = "https://files.pythonhosted.org/packages/83/dc/4dab2ff0a871cc2d81d3ae6d780991c0192b259c35e4d83fe1de18b20c70/cryptography-45.0.7-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:b04f85ac3a90c227b6e5890acb0edbaf3140938dbecf07bff618bf3638578cf1", size = 4450132, upload-time = "2025-09-01T11:14:28.58Z" }, - { url = "https://files.pythonhosted.org/packages/12/dd/b2882b65db8fc944585d7fb00d67cf84a9cef4e77d9ba8f69082e911d0de/cryptography-45.0.7-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:48c41a44ef8b8c2e80ca4527ee81daa4c527df3ecbc9423c41a420a9559d0e27", size = 4204086, upload-time = "2025-09-01T11:14:30.572Z" }, - { url = "https://files.pythonhosted.org/packages/5d/fa/1d5745d878048699b8eb87c984d4ccc5da4f5008dfd3ad7a94040caca23a/cryptography-45.0.7-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:f3df7b3d0f91b88b2106031fd995802a2e9ae13e02c36c1fc075b43f420f3a17", size = 4449383, upload-time = "2025-09-01T11:14:32.046Z" }, - { url = 
"https://files.pythonhosted.org/packages/36/8b/fc61f87931bc030598e1876c45b936867bb72777eac693e905ab89832670/cryptography-45.0.7-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:dd342f085542f6eb894ca00ef70236ea46070c8a13824c6bde0dfdcd36065b9b", size = 4332186, upload-time = "2025-09-01T11:14:33.95Z" }, - { url = "https://files.pythonhosted.org/packages/0b/11/09700ddad7443ccb11d674efdbe9a832b4455dc1f16566d9bd3834922ce5/cryptography-45.0.7-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:1993a1bb7e4eccfb922b6cd414f072e08ff5816702a0bdb8941c247a6b1b287c", size = 4561639, upload-time = "2025-09-01T11:14:35.343Z" }, - { url = "https://files.pythonhosted.org/packages/71/ed/8f4c1337e9d3b94d8e50ae0b08ad0304a5709d483bfcadfcc77a23dbcb52/cryptography-45.0.7-cp37-abi3-win32.whl", hash = "sha256:18fcf70f243fe07252dcb1b268a687f2358025ce32f9f88028ca5c364b123ef5", size = 2926552, upload-time = "2025-09-01T11:14:36.929Z" }, - { url = "https://files.pythonhosted.org/packages/bc/ff/026513ecad58dacd45d1d24ebe52b852165a26e287177de1d545325c0c25/cryptography-45.0.7-cp37-abi3-win_amd64.whl", hash = "sha256:7285a89df4900ed3bfaad5679b1e668cb4b38a8de1ccbfc84b05f34512da0a90", size = 3392742, upload-time = "2025-09-01T11:14:38.368Z" }, - { url = "https://files.pythonhosted.org/packages/99/4e/49199a4c82946938a3e05d2e8ad9482484ba48bbc1e809e3d506c686d051/cryptography-45.0.7-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4a862753b36620af6fc54209264f92c716367f2f0ff4624952276a6bbd18cbde", size = 3584634, upload-time = "2025-09-01T11:14:50.593Z" }, - { url = "https://files.pythonhosted.org/packages/16/ce/5f6ff59ea9c7779dba51b84871c19962529bdcc12e1a6ea172664916c550/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:06ce84dc14df0bf6ea84666f958e6080cdb6fe1231be2a51f3fc1267d9f3fb34", size = 4149533, upload-time = "2025-09-01T11:14:52.091Z" }, - { url = 
"https://files.pythonhosted.org/packages/ce/13/b3cfbd257ac96da4b88b46372e662009b7a16833bfc5da33bb97dd5631ae/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d0c5c6bac22b177bf8da7435d9d27a6834ee130309749d162b26c3105c0795a9", size = 4385557, upload-time = "2025-09-01T11:14:53.551Z" }, - { url = "https://files.pythonhosted.org/packages/1c/c5/8c59d6b7c7b439ba4fc8d0cab868027fd095f215031bc123c3a070962912/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:2f641b64acc00811da98df63df7d59fd4706c0df449da71cb7ac39a0732b40ae", size = 4149023, upload-time = "2025-09-01T11:14:55.022Z" }, - { url = "https://files.pythonhosted.org/packages/55/32/05385c86d6ca9ab0b4d5bb442d2e3d85e727939a11f3e163fc776ce5eb40/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:f5414a788ecc6ee6bc58560e85ca624258a55ca434884445440a810796ea0e0b", size = 4385722, upload-time = "2025-09-01T11:14:57.319Z" }, - { url = "https://files.pythonhosted.org/packages/23/87/7ce86f3fa14bc11a5a48c30d8103c26e09b6465f8d8e9d74cf7a0714f043/cryptography-45.0.7-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:1f3d56f73595376f4244646dd5c5870c14c196949807be39e79e7bd9bac3da63", size = 3332908, upload-time = "2025-09-01T11:14:58.78Z" }, -] - -[[package]] -name = "cyclopts" -version = "3.24.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "attrs" }, - { name = "docstring-parser", marker = "python_full_version < '4'" }, - { name = "rich" }, - { name = "rich-rst" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/30/ca/7782da3b03242d5f0a16c20371dff99d4bd1fedafe26bc48ff82e42be8c9/cyclopts-3.24.0.tar.gz", hash = "sha256:de6964a041dfb3c57bf043b41e68c43548227a17de1bad246e3a0bfc5c4b7417", size = 76131, upload-time = "2025-09-08T15:40:57.75Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/f0/8b/2c95f0645c6f40211896375e6fa51f504b8ccb29c21f6ae661fe87ab044e/cyclopts-3.24.0-py3-none-any.whl", hash = "sha256:809d04cde9108617106091140c3964ee6fceb33cecdd537f7ffa360bde13ed71", size = 86154, upload-time = "2025-09-08T15:40:56.41Z" }, -] - -[[package]] -name = "dnspython" -version = "2.8.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/8c/8b/57666417c0f90f08bcafa776861060426765fdb422eb10212086fb811d26/dnspython-2.8.0.tar.gz", hash = "sha256:181d3c6996452cb1189c4046c61599b84a5a86e099562ffde77d26984ff26d0f", size = 368251, upload-time = "2025-09-07T18:58:00.022Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ba/5a/18ad964b0086c6e62e2e7500f7edc89e3faa45033c71c1893d34eed2b2de/dnspython-2.8.0-py3-none-any.whl", hash = "sha256:01d9bbc4a2d76bf0db7c1f729812ded6d912bd318d3b1cf81d30c0f845dbf3af", size = 331094, upload-time = "2025-09-07T18:57:58.071Z" }, -] - -[[package]] -name = "docker" -version = "7.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pywin32", marker = "sys_platform == 'win32'" }, - { name = "requests" }, - { name = "urllib3" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/91/9b/4a2ea29aeba62471211598dac5d96825bb49348fa07e906ea930394a83ce/docker-7.1.0.tar.gz", hash = "sha256:ad8c70e6e3f8926cb8a92619b832b4ea5299e2831c14284663184e200546fa6c", size = 117834, upload-time = "2024-05-23T11:13:57.216Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e3/26/57c6fb270950d476074c087527a558ccb6f4436657314bfb6cdf484114c4/docker-7.1.0-py3-none-any.whl", hash = "sha256:c96b93b7f0a746f9e77d325bcfb87422a3d8bd4f03136ae8a85b37f1898d5fc0", size = 147774, upload-time = "2024-05-23T11:13:55.01Z" }, -] - -[[package]] -name = "docstring-parser" -version = "0.17.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/b2/9d/c3b43da9515bd270df0f80548d9944e389870713cc1fe2b8fb35fe2bcefd/docstring_parser-0.17.0.tar.gz", hash = "sha256:583de4a309722b3315439bb31d64ba3eebada841f2e2cee23b99df001434c912", size = 27442, upload-time = "2025-07-21T07:35:01.868Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/55/e2/2537ebcff11c1ee1ff17d8d0b6f4db75873e3b0fb32c2d4a2ee31ecb310a/docstring_parser-0.17.0-py3-none-any.whl", hash = "sha256:cf2569abd23dce8099b300f9b4fa8191e9582dda731fd533daf54c4551658708", size = 36896, upload-time = "2025-07-21T07:35:00.684Z" }, -] - -[[package]] -name = "docutils" -version = "0.22.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e4/47/d869000fb74438584858acc628a364b277fc012695f0dfd513cb10f99768/docutils-0.22.1.tar.gz", hash = "sha256:d2fb50923a313532b6d41a77776d24cb459a594be9b7e4afa1fbcb5bda1893e6", size = 2291655, upload-time = "2025-09-17T17:58:45.409Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/8c/dc/1948b90c5d9dbfa4d1fd3991013a042ba3ac62ebd3afdcb3fac08366e755/docutils-0.22.1-py3-none-any.whl", hash = "sha256:806e896f256a17466426544038f30cb860a99f5d4af640e36c284bfcb1824512", size = 638455, upload-time = "2025-09-17T17:58:42.498Z" }, -] - -[[package]] -name = "email-validator" -version = "2.3.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "dnspython" }, - { name = "idna" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/f5/22/900cb125c76b7aaa450ce02fd727f452243f2e91a61af068b40adba60ea9/email_validator-2.3.0.tar.gz", hash = "sha256:9fc05c37f2f6cf439ff414f8fc46d917929974a82244c20eb10231ba60c54426", size = 51238, upload-time = "2025-08-26T13:09:06.831Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/de/15/545e2b6cf2e3be84bc1ed85613edd75b8aea69807a71c26f4ca6a9258e82/email_validator-2.3.0-py3-none-any.whl", hash = 
"sha256:80f13f623413e6b197ae73bb10bf4eb0908faf509ad8362c5edeb0be7fd450b4", size = 35604, upload-time = "2025-08-26T13:09:05.858Z" }, -] - -[[package]] -name = "exceptiongroup" -version = "1.3.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/0b/9f/a65090624ecf468cdca03533906e7c69ed7588582240cfe7cc9e770b50eb/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88", size = 29749, upload-time = "2025-05-10T17:42:51.123Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/36/f4/c6e662dade71f56cd2f3735141b265c3c79293c109549c1e6933b0651ffc/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10", size = 16674, upload-time = "2025-05-10T17:42:49.33Z" }, -] - -[[package]] -name = "execnet" -version = "2.1.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/bb/ff/b4c0dc78fbe20c3e59c0c7334de0c27eb4001a2b2017999af398bf730817/execnet-2.1.1.tar.gz", hash = "sha256:5189b52c6121c24feae288166ab41b32549c7e2348652736540b9e6e7d4e72e3", size = 166524, upload-time = "2024-04-08T09:04:19.245Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/43/09/2aea36ff60d16dd8879bdb2f5b3ee0ba8d08cbbdcdfe870e695ce3784385/execnet-2.1.1-py3-none-any.whl", hash = "sha256:26dee51f1b80cebd6d0ca8e74dd8745419761d3bef34163928cbebbdc4749fdc", size = 40612, upload-time = "2024-04-08T09:04:17.414Z" }, -] - -[[package]] -name = "fastapi" -version = "0.116.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pydantic" }, - { name = "starlette" }, - { name = "typing-extensions" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/78/d7/6c8b3bfe33eeffa208183ec037fee0cce9f7f024089ab1c5d12ef04bd27c/fastapi-0.116.1.tar.gz", hash = "sha256:ed52cbf946abfd70c5a0dccb24673f0670deeb517a88b3544d03c2a6bf283143", size = 296485, upload-time = "2025-07-11T16:22:32.057Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e5/47/d63c60f59a59467fda0f93f46335c9d18526d7071f025cb5b89d5353ea42/fastapi-0.116.1-py3-none-any.whl", hash = "sha256:c46ac7c312df840f0c9e220f7964bada936781bc4e2e6eb71f1c4d7553786565", size = 95631, upload-time = "2025-07-11T16:22:30.485Z" }, -] - -[[package]] -name = "fastmcp" -version = "2.12.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "authlib" }, - { name = "cyclopts" }, - { name = "exceptiongroup" }, - { name = "httpx" }, - { name = "mcp" }, - { name = "openapi-core" }, - { name = "openapi-pydantic" }, - { name = "pydantic", extra = ["email"] }, - { name = "pyperclip" }, - { name = "python-dotenv" }, - { name = "rich" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/99/5e/035fdfa23646de8811776cd62d93440e334e8a4557b35c63c1bff125c08c/fastmcp-2.12.3.tar.gz", hash = "sha256:541dd569d5b6c083140b04d997ba3dc47f7c10695cee700d0a733ce63b20bb65", size = 5246812, upload-time = "2025-09-12T12:28:07.136Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/96/79/0fd386e61819e205563d4eb15da76564b80dc2edd3c64b46f2706235daec/fastmcp-2.12.3-py3-none-any.whl", hash = "sha256:aee50872923a9cba731861fc0120e7dbe4642a2685ba251b2b202b82fb6c25a9", size = 314031, upload-time = "2025-09-12T12:28:05.024Z" }, -] - -[[package]] -name = "frozenlist" -version = "1.7.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/79/b1/b64018016eeb087db503b038296fd782586432b9c077fc5c7839e9cb6ef6/frozenlist-1.7.0.tar.gz", hash = "sha256:2e310d81923c2437ea8670467121cc3e9b0f76d3043cc1d2331d56c7fb7a3a8f", size = 45078, upload-time = 
"2025-06-09T23:02:35.538Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/34/7e/803dde33760128acd393a27eb002f2020ddb8d99d30a44bfbaab31c5f08a/frozenlist-1.7.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:aa51e147a66b2d74de1e6e2cf5921890de6b0f4820b257465101d7f37b49fb5a", size = 82251, upload-time = "2025-06-09T23:00:16.279Z" }, - { url = "https://files.pythonhosted.org/packages/75/a9/9c2c5760b6ba45eae11334db454c189d43d34a4c0b489feb2175e5e64277/frozenlist-1.7.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9b35db7ce1cd71d36ba24f80f0c9e7cff73a28d7a74e91fe83e23d27c7828750", size = 48183, upload-time = "2025-06-09T23:00:17.698Z" }, - { url = "https://files.pythonhosted.org/packages/47/be/4038e2d869f8a2da165f35a6befb9158c259819be22eeaf9c9a8f6a87771/frozenlist-1.7.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:34a69a85e34ff37791e94542065c8416c1afbf820b68f720452f636d5fb990cd", size = 47107, upload-time = "2025-06-09T23:00:18.952Z" }, - { url = "https://files.pythonhosted.org/packages/79/26/85314b8a83187c76a37183ceed886381a5f992975786f883472fcb6dc5f2/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a646531fa8d82c87fe4bb2e596f23173caec9185bfbca5d583b4ccfb95183e2", size = 237333, upload-time = "2025-06-09T23:00:20.275Z" }, - { url = "https://files.pythonhosted.org/packages/1f/fd/e5b64f7d2c92a41639ffb2ad44a6a82f347787abc0c7df5f49057cf11770/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:79b2ffbba483f4ed36a0f236ccb85fbb16e670c9238313709638167670ba235f", size = 231724, upload-time = "2025-06-09T23:00:21.705Z" }, - { url = "https://files.pythonhosted.org/packages/20/fb/03395c0a43a5976af4bf7534759d214405fbbb4c114683f434dfdd3128ef/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a26f205c9ca5829cbf82bb2a84b5c36f7184c4316617d7ef1b271a56720d6b30", size = 245842, upload-time = 
"2025-06-09T23:00:23.148Z" }, - { url = "https://files.pythonhosted.org/packages/d0/15/c01c8e1dffdac5d9803507d824f27aed2ba76b6ed0026fab4d9866e82f1f/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bcacfad3185a623fa11ea0e0634aac7b691aa925d50a440f39b458e41c561d98", size = 239767, upload-time = "2025-06-09T23:00:25.103Z" }, - { url = "https://files.pythonhosted.org/packages/14/99/3f4c6fe882c1f5514b6848aa0a69b20cb5e5d8e8f51a339d48c0e9305ed0/frozenlist-1.7.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:72c1b0fe8fe451b34f12dce46445ddf14bd2a5bcad7e324987194dc8e3a74c86", size = 224130, upload-time = "2025-06-09T23:00:27.061Z" }, - { url = "https://files.pythonhosted.org/packages/4d/83/220a374bd7b2aeba9d0725130665afe11de347d95c3620b9b82cc2fcab97/frozenlist-1.7.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61d1a5baeaac6c0798ff6edfaeaa00e0e412d49946c53fae8d4b8e8b3566c4ae", size = 235301, upload-time = "2025-06-09T23:00:29.02Z" }, - { url = "https://files.pythonhosted.org/packages/03/3c/3e3390d75334a063181625343e8daab61b77e1b8214802cc4e8a1bb678fc/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7edf5c043c062462f09b6820de9854bf28cc6cc5b6714b383149745e287181a8", size = 234606, upload-time = "2025-06-09T23:00:30.514Z" }, - { url = "https://files.pythonhosted.org/packages/23/1e/58232c19608b7a549d72d9903005e2d82488f12554a32de2d5fb59b9b1ba/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:d50ac7627b3a1bd2dcef6f9da89a772694ec04d9a61b66cf87f7d9446b4a0c31", size = 248372, upload-time = "2025-06-09T23:00:31.966Z" }, - { url = "https://files.pythonhosted.org/packages/c0/a4/e4a567e01702a88a74ce8a324691e62a629bf47d4f8607f24bf1c7216e7f/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:ce48b2fece5aeb45265bb7a58259f45027db0abff478e3077e12b05b17fb9da7", size = 
229860, upload-time = "2025-06-09T23:00:33.375Z" }, - { url = "https://files.pythonhosted.org/packages/73/a6/63b3374f7d22268b41a9db73d68a8233afa30ed164c46107b33c4d18ecdd/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:fe2365ae915a1fafd982c146754e1de6ab3478def8a59c86e1f7242d794f97d5", size = 245893, upload-time = "2025-06-09T23:00:35.002Z" }, - { url = "https://files.pythonhosted.org/packages/6d/eb/d18b3f6e64799a79673c4ba0b45e4cfbe49c240edfd03a68be20002eaeaa/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:45a6f2fdbd10e074e8814eb98b05292f27bad7d1883afbe009d96abdcf3bc898", size = 246323, upload-time = "2025-06-09T23:00:36.468Z" }, - { url = "https://files.pythonhosted.org/packages/5a/f5/720f3812e3d06cd89a1d5db9ff6450088b8f5c449dae8ffb2971a44da506/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:21884e23cffabb157a9dd7e353779077bf5b8f9a58e9b262c6caad2ef5f80a56", size = 233149, upload-time = "2025-06-09T23:00:37.963Z" }, - { url = "https://files.pythonhosted.org/packages/69/68/03efbf545e217d5db8446acfd4c447c15b7c8cf4dbd4a58403111df9322d/frozenlist-1.7.0-cp311-cp311-win32.whl", hash = "sha256:284d233a8953d7b24f9159b8a3496fc1ddc00f4db99c324bd5fb5f22d8698ea7", size = 39565, upload-time = "2025-06-09T23:00:39.753Z" }, - { url = "https://files.pythonhosted.org/packages/58/17/fe61124c5c333ae87f09bb67186d65038834a47d974fc10a5fadb4cc5ae1/frozenlist-1.7.0-cp311-cp311-win_amd64.whl", hash = "sha256:387cbfdcde2f2353f19c2f66bbb52406d06ed77519ac7ee21be0232147c2592d", size = 44019, upload-time = "2025-06-09T23:00:40.988Z" }, - { url = "https://files.pythonhosted.org/packages/ef/a2/c8131383f1e66adad5f6ecfcce383d584ca94055a34d683bbb24ac5f2f1c/frozenlist-1.7.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3dbf9952c4bb0e90e98aec1bd992b3318685005702656bc6f67c1a32b76787f2", size = 81424, upload-time = "2025-06-09T23:00:42.24Z" }, - { url = 
"https://files.pythonhosted.org/packages/4c/9d/02754159955088cb52567337d1113f945b9e444c4960771ea90eb73de8db/frozenlist-1.7.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:1f5906d3359300b8a9bb194239491122e6cf1444c2efb88865426f170c262cdb", size = 47952, upload-time = "2025-06-09T23:00:43.481Z" }, - { url = "https://files.pythonhosted.org/packages/01/7a/0046ef1bd6699b40acd2067ed6d6670b4db2f425c56980fa21c982c2a9db/frozenlist-1.7.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3dabd5a8f84573c8d10d8859a50ea2dec01eea372031929871368c09fa103478", size = 46688, upload-time = "2025-06-09T23:00:44.793Z" }, - { url = "https://files.pythonhosted.org/packages/d6/a2/a910bafe29c86997363fb4c02069df4ff0b5bc39d33c5198b4e9dd42d8f8/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa57daa5917f1738064f302bf2626281a1cb01920c32f711fbc7bc36111058a8", size = 243084, upload-time = "2025-06-09T23:00:46.125Z" }, - { url = "https://files.pythonhosted.org/packages/64/3e/5036af9d5031374c64c387469bfcc3af537fc0f5b1187d83a1cf6fab1639/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c193dda2b6d49f4c4398962810fa7d7c78f032bf45572b3e04dd5249dff27e08", size = 233524, upload-time = "2025-06-09T23:00:47.73Z" }, - { url = "https://files.pythonhosted.org/packages/06/39/6a17b7c107a2887e781a48ecf20ad20f1c39d94b2a548c83615b5b879f28/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bfe2b675cf0aaa6d61bf8fbffd3c274b3c9b7b1623beb3809df8a81399a4a9c4", size = 248493, upload-time = "2025-06-09T23:00:49.742Z" }, - { url = "https://files.pythonhosted.org/packages/be/00/711d1337c7327d88c44d91dd0f556a1c47fb99afc060ae0ef66b4d24793d/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8fc5d5cda37f62b262405cf9652cf0856839c4be8ee41be0afe8858f17f4c94b", size = 244116, upload-time = "2025-06-09T23:00:51.352Z" }, - { url = 
"https://files.pythonhosted.org/packages/24/fe/74e6ec0639c115df13d5850e75722750adabdc7de24e37e05a40527ca539/frozenlist-1.7.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b0d5ce521d1dd7d620198829b87ea002956e4319002ef0bc8d3e6d045cb4646e", size = 224557, upload-time = "2025-06-09T23:00:52.855Z" }, - { url = "https://files.pythonhosted.org/packages/8d/db/48421f62a6f77c553575201e89048e97198046b793f4a089c79a6e3268bd/frozenlist-1.7.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:488d0a7d6a0008ca0db273c542098a0fa9e7dfaa7e57f70acef43f32b3f69dca", size = 241820, upload-time = "2025-06-09T23:00:54.43Z" }, - { url = "https://files.pythonhosted.org/packages/1d/fa/cb4a76bea23047c8462976ea7b7a2bf53997a0ca171302deae9d6dd12096/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:15a7eaba63983d22c54d255b854e8108e7e5f3e89f647fc854bd77a237e767df", size = 236542, upload-time = "2025-06-09T23:00:56.409Z" }, - { url = "https://files.pythonhosted.org/packages/5d/32/476a4b5cfaa0ec94d3f808f193301debff2ea42288a099afe60757ef6282/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:1eaa7e9c6d15df825bf255649e05bd8a74b04a4d2baa1ae46d9c2d00b2ca2cb5", size = 249350, upload-time = "2025-06-09T23:00:58.468Z" }, - { url = "https://files.pythonhosted.org/packages/8d/ba/9a28042f84a6bf8ea5dbc81cfff8eaef18d78b2a1ad9d51c7bc5b029ad16/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:e4389e06714cfa9d47ab87f784a7c5be91d3934cd6e9a7b85beef808297cc025", size = 225093, upload-time = "2025-06-09T23:01:00.015Z" }, - { url = "https://files.pythonhosted.org/packages/bc/29/3a32959e68f9cf000b04e79ba574527c17e8842e38c91d68214a37455786/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:73bd45e1488c40b63fe5a7df892baf9e2a4d4bb6409a2b3b78ac1c6236178e01", size = 245482, upload-time = "2025-06-09T23:01:01.474Z" }, - { 
url = "https://files.pythonhosted.org/packages/80/e8/edf2f9e00da553f07f5fa165325cfc302dead715cab6ac8336a5f3d0adc2/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:99886d98e1643269760e5fe0df31e5ae7050788dd288947f7f007209b8c33f08", size = 249590, upload-time = "2025-06-09T23:01:02.961Z" }, - { url = "https://files.pythonhosted.org/packages/1c/80/9a0eb48b944050f94cc51ee1c413eb14a39543cc4f760ed12657a5a3c45a/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:290a172aae5a4c278c6da8a96222e6337744cd9c77313efe33d5670b9f65fc43", size = 237785, upload-time = "2025-06-09T23:01:05.095Z" }, - { url = "https://files.pythonhosted.org/packages/f3/74/87601e0fb0369b7a2baf404ea921769c53b7ae00dee7dcfe5162c8c6dbf0/frozenlist-1.7.0-cp312-cp312-win32.whl", hash = "sha256:426c7bc70e07cfebc178bc4c2bf2d861d720c4fff172181eeb4a4c41d4ca2ad3", size = 39487, upload-time = "2025-06-09T23:01:06.54Z" }, - { url = "https://files.pythonhosted.org/packages/0b/15/c026e9a9fc17585a9d461f65d8593d281fedf55fbf7eb53f16c6df2392f9/frozenlist-1.7.0-cp312-cp312-win_amd64.whl", hash = "sha256:563b72efe5da92e02eb68c59cb37205457c977aa7a449ed1b37e6939e5c47c6a", size = 43874, upload-time = "2025-06-09T23:01:07.752Z" }, - { url = "https://files.pythonhosted.org/packages/24/90/6b2cebdabdbd50367273c20ff6b57a3dfa89bd0762de02c3a1eb42cb6462/frozenlist-1.7.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee80eeda5e2a4e660651370ebffd1286542b67e268aa1ac8d6dbe973120ef7ee", size = 79791, upload-time = "2025-06-09T23:01:09.368Z" }, - { url = "https://files.pythonhosted.org/packages/83/2e/5b70b6a3325363293fe5fc3ae74cdcbc3e996c2a11dde2fd9f1fb0776d19/frozenlist-1.7.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d1a81c85417b914139e3a9b995d4a1c84559afc839a93cf2cb7f15e6e5f6ed2d", size = 47165, upload-time = "2025-06-09T23:01:10.653Z" }, - { url = 
"https://files.pythonhosted.org/packages/f4/25/a0895c99270ca6966110f4ad98e87e5662eab416a17e7fd53c364bf8b954/frozenlist-1.7.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cbb65198a9132ebc334f237d7b0df163e4de83fb4f2bdfe46c1e654bdb0c5d43", size = 45881, upload-time = "2025-06-09T23:01:12.296Z" }, - { url = "https://files.pythonhosted.org/packages/19/7c/71bb0bbe0832793c601fff68cd0cf6143753d0c667f9aec93d3c323f4b55/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dab46c723eeb2c255a64f9dc05b8dd601fde66d6b19cdb82b2e09cc6ff8d8b5d", size = 232409, upload-time = "2025-06-09T23:01:13.641Z" }, - { url = "https://files.pythonhosted.org/packages/c0/45/ed2798718910fe6eb3ba574082aaceff4528e6323f9a8570be0f7028d8e9/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6aeac207a759d0dedd2e40745575ae32ab30926ff4fa49b1635def65806fddee", size = 225132, upload-time = "2025-06-09T23:01:15.264Z" }, - { url = "https://files.pythonhosted.org/packages/ba/e2/8417ae0f8eacb1d071d4950f32f229aa6bf68ab69aab797b72a07ea68d4f/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bd8c4e58ad14b4fa7802b8be49d47993182fdd4023393899632c88fd8cd994eb", size = 237638, upload-time = "2025-06-09T23:01:16.752Z" }, - { url = "https://files.pythonhosted.org/packages/f8/b7/2ace5450ce85f2af05a871b8c8719b341294775a0a6c5585d5e6170f2ce7/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:04fb24d104f425da3540ed83cbfc31388a586a7696142004c577fa61c6298c3f", size = 233539, upload-time = "2025-06-09T23:01:18.202Z" }, - { url = "https://files.pythonhosted.org/packages/46/b9/6989292c5539553dba63f3c83dc4598186ab2888f67c0dc1d917e6887db6/frozenlist-1.7.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6a5c505156368e4ea6b53b5ac23c92d7edc864537ff911d2fb24c140bb175e60", size = 215646, 
upload-time = "2025-06-09T23:01:19.649Z" }, - { url = "https://files.pythonhosted.org/packages/72/31/bc8c5c99c7818293458fe745dab4fd5730ff49697ccc82b554eb69f16a24/frozenlist-1.7.0-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8bd7eb96a675f18aa5c553eb7ddc24a43c8c18f22e1f9925528128c052cdbe00", size = 232233, upload-time = "2025-06-09T23:01:21.175Z" }, - { url = "https://files.pythonhosted.org/packages/59/52/460db4d7ba0811b9ccb85af996019f5d70831f2f5f255f7cc61f86199795/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:05579bf020096fe05a764f1f84cd104a12f78eaab68842d036772dc6d4870b4b", size = 227996, upload-time = "2025-06-09T23:01:23.098Z" }, - { url = "https://files.pythonhosted.org/packages/ba/c9/f4b39e904c03927b7ecf891804fd3b4df3db29b9e487c6418e37988d6e9d/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:376b6222d114e97eeec13d46c486facd41d4f43bab626b7c3f6a8b4e81a5192c", size = 242280, upload-time = "2025-06-09T23:01:24.808Z" }, - { url = "https://files.pythonhosted.org/packages/b8/33/3f8d6ced42f162d743e3517781566b8481322be321b486d9d262adf70bfb/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:0aa7e176ebe115379b5b1c95b4096fb1c17cce0847402e227e712c27bdb5a949", size = 217717, upload-time = "2025-06-09T23:01:26.28Z" }, - { url = "https://files.pythonhosted.org/packages/3e/e8/ad683e75da6ccef50d0ab0c2b2324b32f84fc88ceee778ed79b8e2d2fe2e/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3fbba20e662b9c2130dc771e332a99eff5da078b2b2648153a40669a6d0e36ca", size = 236644, upload-time = "2025-06-09T23:01:27.887Z" }, - { url = "https://files.pythonhosted.org/packages/b2/14/8d19ccdd3799310722195a72ac94ddc677541fb4bef4091d8e7775752360/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:f3f4410a0a601d349dd406b5713fec59b4cee7e71678d5b17edda7f4655a940b", size = 238879, upload-time = "2025-06-09T23:01:29.524Z" }, - { 
url = "https://files.pythonhosted.org/packages/ce/13/c12bf657494c2fd1079a48b2db49fa4196325909249a52d8f09bc9123fd7/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e2cdfaaec6a2f9327bf43c933c0319a7c429058e8537c508964a133dffee412e", size = 232502, upload-time = "2025-06-09T23:01:31.287Z" }, - { url = "https://files.pythonhosted.org/packages/d7/8b/e7f9dfde869825489382bc0d512c15e96d3964180c9499efcec72e85db7e/frozenlist-1.7.0-cp313-cp313-win32.whl", hash = "sha256:5fc4df05a6591c7768459caba1b342d9ec23fa16195e744939ba5914596ae3e1", size = 39169, upload-time = "2025-06-09T23:01:35.503Z" }, - { url = "https://files.pythonhosted.org/packages/35/89/a487a98d94205d85745080a37860ff5744b9820a2c9acbcdd9440bfddf98/frozenlist-1.7.0-cp313-cp313-win_amd64.whl", hash = "sha256:52109052b9791a3e6b5d1b65f4b909703984b770694d3eb64fad124c835d7cba", size = 43219, upload-time = "2025-06-09T23:01:36.784Z" }, - { url = "https://files.pythonhosted.org/packages/56/d5/5c4cf2319a49eddd9dd7145e66c4866bdc6f3dbc67ca3d59685149c11e0d/frozenlist-1.7.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:a6f86e4193bb0e235ef6ce3dde5cbabed887e0b11f516ce8a0f4d3b33078ec2d", size = 84345, upload-time = "2025-06-09T23:01:38.295Z" }, - { url = "https://files.pythonhosted.org/packages/a4/7d/ec2c1e1dc16b85bc9d526009961953df9cec8481b6886debb36ec9107799/frozenlist-1.7.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:82d664628865abeb32d90ae497fb93df398a69bb3434463d172b80fc25b0dd7d", size = 48880, upload-time = "2025-06-09T23:01:39.887Z" }, - { url = "https://files.pythonhosted.org/packages/69/86/f9596807b03de126e11e7d42ac91e3d0b19a6599c714a1989a4e85eeefc4/frozenlist-1.7.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:912a7e8375a1c9a68325a902f3953191b7b292aa3c3fb0d71a216221deca460b", size = 48498, upload-time = "2025-06-09T23:01:41.318Z" }, - { url = 
"https://files.pythonhosted.org/packages/5e/cb/df6de220f5036001005f2d726b789b2c0b65f2363b104bbc16f5be8084f8/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9537c2777167488d539bc5de2ad262efc44388230e5118868e172dd4a552b146", size = 292296, upload-time = "2025-06-09T23:01:42.685Z" }, - { url = "https://files.pythonhosted.org/packages/83/1f/de84c642f17c8f851a2905cee2dae401e5e0daca9b5ef121e120e19aa825/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:f34560fb1b4c3e30ba35fa9a13894ba39e5acfc5f60f57d8accde65f46cc5e74", size = 273103, upload-time = "2025-06-09T23:01:44.166Z" }, - { url = "https://files.pythonhosted.org/packages/88/3c/c840bfa474ba3fa13c772b93070893c6e9d5c0350885760376cbe3b6c1b3/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:acd03d224b0175f5a850edc104ac19040d35419eddad04e7cf2d5986d98427f1", size = 292869, upload-time = "2025-06-09T23:01:45.681Z" }, - { url = "https://files.pythonhosted.org/packages/a6/1c/3efa6e7d5a39a1d5ef0abeb51c48fb657765794a46cf124e5aca2c7a592c/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f2038310bc582f3d6a09b3816ab01737d60bf7b1ec70f5356b09e84fb7408ab1", size = 291467, upload-time = "2025-06-09T23:01:47.234Z" }, - { url = "https://files.pythonhosted.org/packages/4f/00/d5c5e09d4922c395e2f2f6b79b9a20dab4b67daaf78ab92e7729341f61f6/frozenlist-1.7.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b8c05e4c8e5f36e5e088caa1bf78a687528f83c043706640a92cb76cd6999384", size = 266028, upload-time = "2025-06-09T23:01:48.819Z" }, - { url = "https://files.pythonhosted.org/packages/4e/27/72765be905619dfde25a7f33813ac0341eb6b076abede17a2e3fbfade0cb/frozenlist-1.7.0-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:765bb588c86e47d0b68f23c1bee323d4b703218037765dcf3f25c838c6fecceb", size = 284294, upload-time = "2025-06-09T23:01:50.394Z" }, - { url = "https://files.pythonhosted.org/packages/88/67/c94103a23001b17808eb7dd1200c156bb69fb68e63fcf0693dde4cd6228c/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:32dc2e08c67d86d0969714dd484fd60ff08ff81d1a1e40a77dd34a387e6ebc0c", size = 281898, upload-time = "2025-06-09T23:01:52.234Z" }, - { url = "https://files.pythonhosted.org/packages/42/34/a3e2c00c00f9e2a9db5653bca3fec306349e71aff14ae45ecc6d0951dd24/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:c0303e597eb5a5321b4de9c68e9845ac8f290d2ab3f3e2c864437d3c5a30cd65", size = 290465, upload-time = "2025-06-09T23:01:53.788Z" }, - { url = "https://files.pythonhosted.org/packages/bb/73/f89b7fbce8b0b0c095d82b008afd0590f71ccb3dee6eee41791cf8cd25fd/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:a47f2abb4e29b3a8d0b530f7c3598badc6b134562b1a5caee867f7c62fee51e3", size = 266385, upload-time = "2025-06-09T23:01:55.769Z" }, - { url = "https://files.pythonhosted.org/packages/cd/45/e365fdb554159462ca12df54bc59bfa7a9a273ecc21e99e72e597564d1ae/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:3d688126c242a6fabbd92e02633414d40f50bb6002fa4cf995a1d18051525657", size = 288771, upload-time = "2025-06-09T23:01:57.4Z" }, - { url = "https://files.pythonhosted.org/packages/00/11/47b6117002a0e904f004d70ec5194fe9144f117c33c851e3d51c765962d0/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:4e7e9652b3d367c7bd449a727dc79d5043f48b88d0cbfd4f9f1060cf2b414104", size = 288206, upload-time = "2025-06-09T23:01:58.936Z" }, - { url = "https://files.pythonhosted.org/packages/40/37/5f9f3c3fd7f7746082ec67bcdc204db72dad081f4f83a503d33220a92973/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:1a85e345b4c43db8b842cab1feb41be5cc0b10a1830e6295b69d7310f99becaf", size = 282620, upload-time = 
"2025-06-09T23:02:00.493Z" }, - { url = "https://files.pythonhosted.org/packages/0b/31/8fbc5af2d183bff20f21aa743b4088eac4445d2bb1cdece449ae80e4e2d1/frozenlist-1.7.0-cp313-cp313t-win32.whl", hash = "sha256:3a14027124ddb70dfcee5148979998066897e79f89f64b13328595c4bdf77c81", size = 43059, upload-time = "2025-06-09T23:02:02.072Z" }, - { url = "https://files.pythonhosted.org/packages/bb/ed/41956f52105b8dbc26e457c5705340c67c8cc2b79f394b79bffc09d0e938/frozenlist-1.7.0-cp313-cp313t-win_amd64.whl", hash = "sha256:3bf8010d71d4507775f658e9823210b7427be36625b387221642725b515dcf3e", size = 47516, upload-time = "2025-06-09T23:02:03.779Z" }, - { url = "https://files.pythonhosted.org/packages/ee/45/b82e3c16be2182bff01179db177fe144d58b5dc787a7d4492c6ed8b9317f/frozenlist-1.7.0-py3-none-any.whl", hash = "sha256:9a5af342e34f7e97caf8c995864c7a396418ae2859cc6fdf1b1073020d516a7e", size = 13106, upload-time = "2025-06-09T23:02:34.204Z" }, -] - -[[package]] -name = "h11" -version = "0.16.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" }, -] - -[[package]] -name = "httpcore" -version = "1.0.9" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "certifi" }, - { name = "h11" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = 
"sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" }, -] - -[[package]] -name = "httpx" -version = "0.28.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, - { name = "certifi" }, - { name = "httpcore" }, - { name = "idna" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" }, -] - -[[package]] -name = "httpx-sse" -version = "0.4.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/6e/fa/66bd985dd0b7c109a3bcb89272ee0bfb7e2b4d06309ad7b38ff866734b2a/httpx_sse-0.4.1.tar.gz", hash = "sha256:8f44d34414bc7b21bf3602713005c5df4917884f76072479b21f68befa4ea26e", size = 12998, upload-time = "2025-06-24T13:21:05.71Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/25/0a/6269e3473b09aed2dab8aa1a600c70f31f00ae1349bee30658f7e358a159/httpx_sse-0.4.1-py3-none-any.whl", hash = "sha256:cba42174344c3a5b06f255ce65b350880f962d99ead85e776f23c6618a377a37", size = 8054, upload-time = "2025-06-24T13:21:04.772Z" }, -] - -[[package]] -name = "idna" -version = "3.10" 
-source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" }, -] - -[[package]] -name = "iniconfig" -version = "2.1.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793, upload-time = "2025-03-19T20:09:59.721Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" }, -] - -[[package]] -name = "isodate" -version = "0.7.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/54/4d/e940025e2ce31a8ce1202635910747e5a87cc3a6a6bb2d00973375014749/isodate-0.7.2.tar.gz", hash = "sha256:4cd1aa0f43ca76f4a6c6c0292a85f40b35ec2e43e315b59f06e6d32171a953e6", size = 29705, upload-time = "2024-10-08T23:04:11.5Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/15/aa/0aca39a37d3c7eb941ba736ede56d689e7be91cab5d9ca846bde3999eba6/isodate-0.7.2-py3-none-any.whl", hash = "sha256:28009937d8031054830160fce6d409ed342816b543597cece116d966c6d99e15", size = 22320, upload-time = 
"2024-10-08T23:04:09.501Z" }, -] - -[[package]] -name = "jmespath" -version = "1.0.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/00/2a/e867e8531cf3e36b41201936b7fa7ba7b5702dbef42922193f05c8976cd6/jmespath-1.0.1.tar.gz", hash = "sha256:90261b206d6defd58fdd5e85f478bf633a2901798906be2ad389150c5c60edbe", size = 25843, upload-time = "2022-06-17T18:00:12.224Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/31/b4/b9b800c45527aadd64d5b442f9b932b00648617eb5d63d2c7a6587b7cafc/jmespath-1.0.1-py3-none-any.whl", hash = "sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980", size = 20256, upload-time = "2022-06-17T18:00:10.251Z" }, -] - -[[package]] -name = "jsonschema" -version = "4.25.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "attrs" }, - { name = "jsonschema-specifications" }, - { name = "referencing" }, - { name = "rpds-py" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/74/69/f7185de793a29082a9f3c7728268ffb31cb5095131a9c139a74078e27336/jsonschema-4.25.1.tar.gz", hash = "sha256:e4a9655ce0da0c0b67a085847e00a3a51449e1157f4f75e9fb5aa545e122eb85", size = 357342, upload-time = "2025-08-18T17:03:50.038Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/bf/9c/8c95d856233c1f82500c2450b8c68576b4cf1c871db3afac5c34ff84e6fd/jsonschema-4.25.1-py3-none-any.whl", hash = "sha256:3fba0169e345c7175110351d456342c364814cfcf3b964ba4587f22915230a63", size = 90040, upload-time = "2025-08-18T17:03:48.373Z" }, -] - -[[package]] -name = "jsonschema-path" -version = "0.3.4" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pathable" }, - { name = "pyyaml" }, - { name = "referencing" }, - { name = "requests" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/6e/45/41ebc679c2a4fced6a722f624c18d658dee42612b83ea24c1caf7c0eb3a8/jsonschema_path-0.3.4.tar.gz", hash = 
"sha256:8365356039f16cc65fddffafda5f58766e34bebab7d6d105616ab52bc4297001", size = 11159, upload-time = "2025-01-24T14:33:16.547Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/cb/58/3485da8cb93d2f393bce453adeef16896751f14ba3e2024bc21dc9597646/jsonschema_path-0.3.4-py3-none-any.whl", hash = "sha256:f502191fdc2b22050f9a81c9237be9d27145b9001c55842bece5e94e382e52f8", size = 14810, upload-time = "2025-01-24T14:33:14.652Z" }, -] - -[[package]] -name = "jsonschema-specifications" -version = "2025.9.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "referencing" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/19/74/a633ee74eb36c44aa6d1095e7cc5569bebf04342ee146178e2d36600708b/jsonschema_specifications-2025.9.1.tar.gz", hash = "sha256:b540987f239e745613c7a9176f3edb72b832a4ac465cf02712288397832b5e8d", size = 32855, upload-time = "2025-09-08T01:34:59.186Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/41/45/1a4ed80516f02155c51f51e8cedb3c1902296743db0bbc66608a0db2814f/jsonschema_specifications-2025.9.1-py3-none-any.whl", hash = "sha256:98802fee3a11ee76ecaca44429fda8a41bff98b00a0f2838151b113f210cc6fe", size = 18437, upload-time = "2025-09-08T01:34:57.871Z" }, -] - -[[package]] -name = "lazy-object-proxy" -version = "1.12.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/08/a2/69df9c6ba6d316cfd81fe2381e464db3e6de5db45f8c43c6a23504abf8cb/lazy_object_proxy-1.12.0.tar.gz", hash = "sha256:1f5a462d92fd0cfb82f1fab28b51bfb209fabbe6aabf7f0d51472c0c124c0c61", size = 43681, upload-time = "2025-08-22T13:50:06.783Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/01/b3/4684b1e128a87821e485f5a901b179790e6b5bc02f89b7ee19c23be36ef3/lazy_object_proxy-1.12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1cf69cd1a6c7fe2dbcc3edaa017cf010f4192e53796538cc7d5e1fedbfa4bcff", size = 26656, upload-time = "2025-08-22T13:42:30.605Z" }, - 
{ url = "https://files.pythonhosted.org/packages/3a/03/1bdc21d9a6df9ff72d70b2ff17d8609321bea4b0d3cffd2cea92fb2ef738/lazy_object_proxy-1.12.0-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:efff4375a8c52f55a145dc8487a2108c2140f0bec4151ab4e1843e52eb9987ad", size = 68832, upload-time = "2025-08-22T13:42:31.675Z" }, - { url = "https://files.pythonhosted.org/packages/3d/4b/5788e5e8bd01d19af71e50077ab020bc5cce67e935066cd65e1215a09ff9/lazy_object_proxy-1.12.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1192e8c2f1031a6ff453ee40213afa01ba765b3dc861302cd91dbdb2e2660b00", size = 69148, upload-time = "2025-08-22T13:42:32.876Z" }, - { url = "https://files.pythonhosted.org/packages/79/0e/090bf070f7a0de44c61659cb7f74c2fe02309a77ca8c4b43adfe0b695f66/lazy_object_proxy-1.12.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:3605b632e82a1cbc32a1e5034278a64db555b3496e0795723ee697006b980508", size = 67800, upload-time = "2025-08-22T13:42:34.054Z" }, - { url = "https://files.pythonhosted.org/packages/cf/d2/b320325adbb2d119156f7c506a5fbfa37fcab15c26d13cf789a90a6de04e/lazy_object_proxy-1.12.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a61095f5d9d1a743e1e20ec6d6db6c2ca511961777257ebd9b288951b23b44fa", size = 68085, upload-time = "2025-08-22T13:42:35.197Z" }, - { url = "https://files.pythonhosted.org/packages/6a/48/4b718c937004bf71cd82af3713874656bcb8d0cc78600bf33bb9619adc6c/lazy_object_proxy-1.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:997b1d6e10ecc6fb6fe0f2c959791ae59599f41da61d652f6c903d1ee58b7370", size = 26535, upload-time = "2025-08-22T13:42:36.521Z" }, - { url = "https://files.pythonhosted.org/packages/0d/1b/b5f5bd6bda26f1e15cd3232b223892e4498e34ec70a7f4f11c401ac969f1/lazy_object_proxy-1.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8ee0d6027b760a11cc18281e702c0309dd92da458a74b4c15025d7fc490deede", size = 26746, upload-time = 
"2025-08-22T13:42:37.572Z" }, - { url = "https://files.pythonhosted.org/packages/55/64/314889b618075c2bfc19293ffa9153ce880ac6153aacfd0a52fcabf21a66/lazy_object_proxy-1.12.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4ab2c584e3cc8be0dfca422e05ad30a9abe3555ce63e9ab7a559f62f8dbc6ff9", size = 71457, upload-time = "2025-08-22T13:42:38.743Z" }, - { url = "https://files.pythonhosted.org/packages/11/53/857fc2827fc1e13fbdfc0ba2629a7d2579645a06192d5461809540b78913/lazy_object_proxy-1.12.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:14e348185adbd03ec17d051e169ec45686dcd840a3779c9d4c10aabe2ca6e1c0", size = 71036, upload-time = "2025-08-22T13:42:40.184Z" }, - { url = "https://files.pythonhosted.org/packages/2b/24/e581ffed864cd33c1b445b5763d617448ebb880f48675fc9de0471a95cbc/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c4fcbe74fb85df8ba7825fa05eddca764138da752904b378f0ae5ab33a36c308", size = 69329, upload-time = "2025-08-22T13:42:41.311Z" }, - { url = "https://files.pythonhosted.org/packages/78/be/15f8f5a0b0b2e668e756a152257d26370132c97f2f1943329b08f057eff0/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:563d2ec8e4d4b68ee7848c5ab4d6057a6d703cb7963b342968bb8758dda33a23", size = 70690, upload-time = "2025-08-22T13:42:42.51Z" }, - { url = "https://files.pythonhosted.org/packages/5d/aa/f02be9bbfb270e13ee608c2b28b8771f20a5f64356c6d9317b20043c6129/lazy_object_proxy-1.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:53c7fd99eb156bbb82cbc5d5188891d8fdd805ba6c1e3b92b90092da2a837073", size = 26563, upload-time = "2025-08-22T13:42:43.685Z" }, - { url = "https://files.pythonhosted.org/packages/f4/26/b74c791008841f8ad896c7f293415136c66cc27e7c7577de4ee68040c110/lazy_object_proxy-1.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:86fd61cb2ba249b9f436d789d1356deae69ad3231dc3c0f17293ac535162672e", size = 26745, 
upload-time = "2025-08-22T13:42:44.982Z" }, - { url = "https://files.pythonhosted.org/packages/9b/52/641870d309e5d1fb1ea7d462a818ca727e43bfa431d8c34b173eb090348c/lazy_object_proxy-1.12.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:81d1852fb30fab81696f93db1b1e55a5d1ff7940838191062f5f56987d5fcc3e", size = 71537, upload-time = "2025-08-22T13:42:46.141Z" }, - { url = "https://files.pythonhosted.org/packages/47/b6/919118e99d51c5e76e8bf5a27df406884921c0acf2c7b8a3b38d847ab3e9/lazy_object_proxy-1.12.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:be9045646d83f6c2664c1330904b245ae2371b5c57a3195e4028aedc9f999655", size = 71141, upload-time = "2025-08-22T13:42:47.375Z" }, - { url = "https://files.pythonhosted.org/packages/e5/47/1d20e626567b41de085cf4d4fb3661a56c159feaa73c825917b3b4d4f806/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:67f07ab742f1adfb3966c40f630baaa7902be4222a17941f3d85fd1dae5565ff", size = 69449, upload-time = "2025-08-22T13:42:48.49Z" }, - { url = "https://files.pythonhosted.org/packages/58/8d/25c20ff1a1a8426d9af2d0b6f29f6388005fc8cd10d6ee71f48bff86fdd0/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:75ba769017b944fcacbf6a80c18b2761a1795b03f8899acdad1f1c39db4409be", size = 70744, upload-time = "2025-08-22T13:42:49.608Z" }, - { url = "https://files.pythonhosted.org/packages/c0/67/8ec9abe15c4f8a4bcc6e65160a2c667240d025cbb6591b879bea55625263/lazy_object_proxy-1.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:7b22c2bbfb155706b928ac4d74c1a63ac8552a55ba7fff4445155523ea4067e1", size = 26568, upload-time = "2025-08-22T13:42:57.719Z" }, - { url = "https://files.pythonhosted.org/packages/23/12/cd2235463f3469fd6c62d41d92b7f120e8134f76e52421413a0ad16d493e/lazy_object_proxy-1.12.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4a79b909aa16bde8ae606f06e6bbc9d3219d2e57fb3e0076e17879072b742c65", size = 
27391, upload-time = "2025-08-22T13:42:50.62Z" }, - { url = "https://files.pythonhosted.org/packages/60/9e/f1c53e39bbebad2e8609c67d0830cc275f694d0ea23d78e8f6db526c12d3/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:338ab2f132276203e404951205fe80c3fd59429b3a724e7b662b2eb539bb1be9", size = 80552, upload-time = "2025-08-22T13:42:51.731Z" }, - { url = "https://files.pythonhosted.org/packages/4c/b6/6c513693448dcb317d9d8c91d91f47addc09553613379e504435b4cc8b3e/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8c40b3c9faee2e32bfce0df4ae63f4e73529766893258eca78548bac801c8f66", size = 82857, upload-time = "2025-08-22T13:42:53.225Z" }, - { url = "https://files.pythonhosted.org/packages/12/1c/d9c4aaa4c75da11eb7c22c43d7c90a53b4fca0e27784a5ab207768debea7/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:717484c309df78cedf48396e420fa57fc8a2b1f06ea889df7248fdd156e58847", size = 80833, upload-time = "2025-08-22T13:42:54.391Z" }, - { url = "https://files.pythonhosted.org/packages/0b/ae/29117275aac7d7d78ae4f5a4787f36ff33262499d486ac0bf3e0b97889f6/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a6b7ea5ea1ffe15059eb44bcbcb258f97bcb40e139b88152c40d07b1a1dfc9ac", size = 79516, upload-time = "2025-08-22T13:42:55.812Z" }, - { url = "https://files.pythonhosted.org/packages/19/40/b4e48b2c38c69392ae702ae7afa7b6551e0ca5d38263198b7c79de8b3bdf/lazy_object_proxy-1.12.0-cp313-cp313t-win_amd64.whl", hash = "sha256:08c465fb5cd23527512f9bd7b4c7ba6cec33e28aad36fbbe46bf7b858f9f3f7f", size = 27656, upload-time = "2025-08-22T13:42:56.793Z" }, - { url = "https://files.pythonhosted.org/packages/ef/3a/277857b51ae419a1574557c0b12e0d06bf327b758ba94cafc664cb1e2f66/lazy_object_proxy-1.12.0-cp314-cp314-macosx_11_0_arm64.whl", hash = 
"sha256:c9defba70ab943f1df98a656247966d7729da2fe9c2d5d85346464bf320820a3", size = 26582, upload-time = "2025-08-22T13:49:49.366Z" }, - { url = "https://files.pythonhosted.org/packages/1a/b6/c5e0fa43535bb9c87880e0ba037cdb1c50e01850b0831e80eb4f4762f270/lazy_object_proxy-1.12.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6763941dbf97eea6b90f5b06eb4da9418cc088fce0e3883f5816090f9afcde4a", size = 71059, upload-time = "2025-08-22T13:49:50.488Z" }, - { url = "https://files.pythonhosted.org/packages/06/8a/7dcad19c685963c652624702f1a968ff10220b16bfcc442257038216bf55/lazy_object_proxy-1.12.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fdc70d81235fc586b9e3d1aeef7d1553259b62ecaae9db2167a5d2550dcc391a", size = 71034, upload-time = "2025-08-22T13:49:54.224Z" }, - { url = "https://files.pythonhosted.org/packages/12/ac/34cbfb433a10e28c7fd830f91c5a348462ba748413cbb950c7f259e67aa7/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:0a83c6f7a6b2bfc11ef3ed67f8cbe99f8ff500b05655d8e7df9aab993a6abc95", size = 69529, upload-time = "2025-08-22T13:49:55.29Z" }, - { url = "https://files.pythonhosted.org/packages/6f/6a/11ad7e349307c3ca4c0175db7a77d60ce42a41c60bcb11800aabd6a8acb8/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:256262384ebd2a77b023ad02fbcc9326282bcfd16484d5531154b02bc304f4c5", size = 70391, upload-time = "2025-08-22T13:49:56.35Z" }, - { url = "https://files.pythonhosted.org/packages/59/97/9b410ed8fbc6e79c1ee8b13f8777a80137d4bc189caf2c6202358e66192c/lazy_object_proxy-1.12.0-cp314-cp314-win_amd64.whl", hash = "sha256:7601ec171c7e8584f8ff3f4e440aa2eebf93e854f04639263875b8c2971f819f", size = 26988, upload-time = "2025-08-22T13:49:57.302Z" }, - { url = 
"https://files.pythonhosted.org/packages/41/a0/b91504515c1f9a299fc157967ffbd2f0321bce0516a3d5b89f6f4cad0355/lazy_object_proxy-1.12.0-pp39.pp310.pp311.graalpy311-none-any.whl", hash = "sha256:c3b2e0af1f7f77c4263759c4824316ce458fabe0fceadcd24ef8ca08b2d1e402", size = 15072, upload-time = "2025-08-22T13:50:05.498Z" }, -] - -[[package]] -name = "markdown-it-py" -version = "4.0.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "mdurl" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5b/f5/4ec618ed16cc4f8fb3b701563655a69816155e79e24a17b651541804721d/markdown_it_py-4.0.0.tar.gz", hash = "sha256:cb0a2b4aa34f932c007117b194e945bd74e0ec24133ceb5bac59009cda1cb9f3", size = 73070, upload-time = "2025-08-11T12:57:52.854Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/94/54/e7d793b573f298e1c9013b8c4dade17d481164aa517d1d7148619c2cedbf/markdown_it_py-4.0.0-py3-none-any.whl", hash = "sha256:87327c59b172c5011896038353a81343b6754500a08cd7a4973bb48c6d578147", size = 87321, upload-time = "2025-08-11T12:57:51.923Z" }, -] - -[[package]] -name = "markupsafe" -version = "3.0.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537, upload-time = "2024-10-18T15:21:54.129Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353, upload-time = "2024-10-18T15:21:02.187Z" }, - { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392, upload-time = "2024-10-18T15:21:02.941Z" }, - { url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984, upload-time = "2024-10-18T15:21:03.953Z" }, - { url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120, upload-time = "2024-10-18T15:21:06.495Z" }, - { url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032, upload-time = "2024-10-18T15:21:07.295Z" }, - { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057, upload-time = "2024-10-18T15:21:08.073Z" }, - { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359, upload-time = "2024-10-18T15:21:09.318Z" }, - { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = 
"sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306, upload-time = "2024-10-18T15:21:10.185Z" }, - { url = "https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094, upload-time = "2024-10-18T15:21:11.005Z" }, - { url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521, upload-time = "2024-10-18T15:21:12.911Z" }, - { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274, upload-time = "2024-10-18T15:21:13.777Z" }, - { url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348, upload-time = "2024-10-18T15:21:14.822Z" }, - { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149, upload-time = "2024-10-18T15:21:15.642Z" }, - { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118, 
upload-time = "2024-10-18T15:21:17.133Z" }, - { url = "https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993, upload-time = "2024-10-18T15:21:18.064Z" }, - { url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178, upload-time = "2024-10-18T15:21:18.859Z" }, - { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319, upload-time = "2024-10-18T15:21:19.671Z" }, - { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352, upload-time = "2024-10-18T15:21:20.971Z" }, - { url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097, upload-time = "2024-10-18T15:21:22.646Z" }, - { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601, upload-time = "2024-10-18T15:21:23.499Z" }, - { url = 
"https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274, upload-time = "2024-10-18T15:21:24.577Z" }, - { url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352, upload-time = "2024-10-18T15:21:25.382Z" }, - { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122, upload-time = "2024-10-18T15:21:26.199Z" }, - { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085, upload-time = "2024-10-18T15:21:27.029Z" }, - { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978, upload-time = "2024-10-18T15:21:27.846Z" }, - { url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208, upload-time = "2024-10-18T15:21:28.744Z" }, - { url = 
"https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357, upload-time = "2024-10-18T15:21:29.545Z" }, - { url = "https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344, upload-time = "2024-10-18T15:21:30.366Z" }, - { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101, upload-time = "2024-10-18T15:21:31.207Z" }, - { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603, upload-time = "2024-10-18T15:21:32.032Z" }, - { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510, upload-time = "2024-10-18T15:21:33.625Z" }, - { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486, upload-time = "2024-10-18T15:21:34.611Z" }, - { url = 
"https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480, upload-time = "2024-10-18T15:21:35.398Z" }, - { url = "https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914, upload-time = "2024-10-18T15:21:36.231Z" }, - { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796, upload-time = "2024-10-18T15:21:37.073Z" }, - { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473, upload-time = "2024-10-18T15:21:37.932Z" }, - { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114, upload-time = "2024-10-18T15:21:39.799Z" }, - { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098, upload-time = "2024-10-18T15:21:40.813Z" }, - { url = 
"https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208, upload-time = "2024-10-18T15:21:41.814Z" }, - { url = "https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739, upload-time = "2024-10-18T15:21:42.784Z" }, -] - -[[package]] -name = "mcp" -version = "1.14.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, - { name = "httpx" }, - { name = "httpx-sse" }, - { name = "jsonschema" }, - { name = "pydantic" }, - { name = "pydantic-settings" }, - { name = "python-multipart" }, - { name = "pywin32", marker = "sys_platform == 'win32'" }, - { name = "sse-starlette" }, - { name = "starlette" }, - { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/95/fd/d6e941a52446198b73e5e4a953441f667f1469aeb06fb382d9f6729d6168/mcp-1.14.0.tar.gz", hash = "sha256:2e7d98b195e08b2abc1dc6191f6f3dc0059604ac13ee6a40f88676274787fac4", size = 454855, upload-time = "2025-09-11T17:40:48.667Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/04/7b/84b0dd4c2c5a499d2c5d63fb7a1224c25fc4c8b6c24623fa7a566471480d/mcp-1.14.0-py3-none-any.whl", hash = "sha256:b2d27feba27b4c53d41b58aa7f4d090ae0cb740cbc4e339af10f8cbe54c4e19d", size = 163805, upload-time = "2025-09-11T17:40:46.891Z" }, -] - -[[package]] -name = "mdurl" -version = "0.1.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = 
"sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" }, -] - -[[package]] -name = "more-itertools" -version = "10.8.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ea/5d/38b681d3fce7a266dd9ab73c66959406d565b3e85f21d5e66e1181d93721/more_itertools-10.8.0.tar.gz", hash = "sha256:f638ddf8a1a0d134181275fb5d58b086ead7c6a72429ad725c67503f13ba30bd", size = 137431, upload-time = "2025-09-02T15:23:11.018Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a4/8e/469e5a4a2f5855992e425f3cb33804cc07bf18d48f2db061aec61ce50270/more_itertools-10.8.0-py3-none-any.whl", hash = "sha256:52d4362373dcf7c52546bc4af9a86ee7c4579df9a8dc268be0a2f949d376cc9b", size = 69667, upload-time = "2025-09-02T15:23:09.635Z" }, -] - -[[package]] -name = "multidict" -version = "6.6.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/69/7f/0652e6ed47ab288e3756ea9c0df8b14950781184d4bd7883f4d87dd41245/multidict-6.6.4.tar.gz", hash = "sha256:d2d4e4787672911b48350df02ed3fa3fffdc2f2e8ca06dd6afdf34189b76a9dd", size = 101843, upload-time = "2025-08-11T12:08:48.217Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6b/7f/90a7f01e2d005d6653c689039977f6856718c75c5579445effb7e60923d1/multidict-6.6.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:c7a0e9b561e6460484318a7612e725df1145d46b0ef57c6b9866441bf6e27e0c", size = 76472, upload-time = "2025-08-11T12:06:29.006Z" }, - { url = 
"https://files.pythonhosted.org/packages/54/a3/bed07bc9e2bb302ce752f1dabc69e884cd6a676da44fb0e501b246031fdd/multidict-6.6.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6bf2f10f70acc7a2446965ffbc726e5fc0b272c97a90b485857e5c70022213eb", size = 44634, upload-time = "2025-08-11T12:06:30.374Z" }, - { url = "https://files.pythonhosted.org/packages/a7/4b/ceeb4f8f33cf81277da464307afeaf164fb0297947642585884f5cad4f28/multidict-6.6.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:66247d72ed62d5dd29752ffc1d3b88f135c6a8de8b5f63b7c14e973ef5bda19e", size = 44282, upload-time = "2025-08-11T12:06:31.958Z" }, - { url = "https://files.pythonhosted.org/packages/03/35/436a5da8702b06866189b69f655ffdb8f70796252a8772a77815f1812679/multidict-6.6.4-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:105245cc6b76f51e408451a844a54e6823bbd5a490ebfe5bdfc79798511ceded", size = 229696, upload-time = "2025-08-11T12:06:33.087Z" }, - { url = "https://files.pythonhosted.org/packages/b6/0e/915160be8fecf1fca35f790c08fb74ca684d752fcba62c11daaf3d92c216/multidict-6.6.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:cbbc54e58b34c3bae389ef00046be0961f30fef7cb0dd9c7756aee376a4f7683", size = 246665, upload-time = "2025-08-11T12:06:34.448Z" }, - { url = "https://files.pythonhosted.org/packages/08/ee/2f464330acd83f77dcc346f0b1a0eaae10230291450887f96b204b8ac4d3/multidict-6.6.4-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:56c6b3652f945c9bc3ac6c8178cd93132b8d82dd581fcbc3a00676c51302bc1a", size = 225485, upload-time = "2025-08-11T12:06:35.672Z" }, - { url = "https://files.pythonhosted.org/packages/71/cc/9a117f828b4d7fbaec6adeed2204f211e9caf0a012692a1ee32169f846ae/multidict-6.6.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b95494daf857602eccf4c18ca33337dd2be705bccdb6dddbfc9d513e6addb9d9", 
size = 257318, upload-time = "2025-08-11T12:06:36.98Z" }, - { url = "https://files.pythonhosted.org/packages/25/77/62752d3dbd70e27fdd68e86626c1ae6bccfebe2bb1f84ae226363e112f5a/multidict-6.6.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e5b1413361cef15340ab9dc61523e653d25723e82d488ef7d60a12878227ed50", size = 254689, upload-time = "2025-08-11T12:06:38.233Z" }, - { url = "https://files.pythonhosted.org/packages/00/6e/fac58b1072a6fc59af5e7acb245e8754d3e1f97f4f808a6559951f72a0d4/multidict-6.6.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e167bf899c3d724f9662ef00b4f7fef87a19c22b2fead198a6f68b263618df52", size = 246709, upload-time = "2025-08-11T12:06:39.517Z" }, - { url = "https://files.pythonhosted.org/packages/01/ef/4698d6842ef5e797c6db7744b0081e36fb5de3d00002cc4c58071097fac3/multidict-6.6.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:aaea28ba20a9026dfa77f4b80369e51cb767c61e33a2d4043399c67bd95fb7c6", size = 243185, upload-time = "2025-08-11T12:06:40.796Z" }, - { url = "https://files.pythonhosted.org/packages/aa/c9/d82e95ae1d6e4ef396934e9b0e942dfc428775f9554acf04393cce66b157/multidict-6.6.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:8c91cdb30809a96d9ecf442ec9bc45e8cfaa0f7f8bdf534e082c2443a196727e", size = 237838, upload-time = "2025-08-11T12:06:42.595Z" }, - { url = "https://files.pythonhosted.org/packages/57/cf/f94af5c36baaa75d44fab9f02e2a6bcfa0cd90acb44d4976a80960759dbc/multidict-6.6.4-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:1a0ccbfe93ca114c5d65a2471d52d8829e56d467c97b0e341cf5ee45410033b3", size = 246368, upload-time = "2025-08-11T12:06:44.304Z" }, - { url = "https://files.pythonhosted.org/packages/4a/fe/29f23460c3d995f6a4b678cb2e9730e7277231b981f0b234702f0177818a/multidict-6.6.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:55624b3f321d84c403cb7d8e6e982f41ae233d85f85db54ba6286f7295dc8a9c", size = 253339, upload-time = 
"2025-08-11T12:06:45.597Z" }, - { url = "https://files.pythonhosted.org/packages/29/b6/fd59449204426187b82bf8a75f629310f68c6adc9559dc922d5abe34797b/multidict-6.6.4-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:4a1fb393a2c9d202cb766c76208bd7945bc194eba8ac920ce98c6e458f0b524b", size = 246933, upload-time = "2025-08-11T12:06:46.841Z" }, - { url = "https://files.pythonhosted.org/packages/19/52/d5d6b344f176a5ac3606f7a61fb44dc746e04550e1a13834dff722b8d7d6/multidict-6.6.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:43868297a5759a845fa3a483fb4392973a95fb1de891605a3728130c52b8f40f", size = 242225, upload-time = "2025-08-11T12:06:48.588Z" }, - { url = "https://files.pythonhosted.org/packages/ec/d3/5b2281ed89ff4d5318d82478a2a2450fcdfc3300da48ff15c1778280ad26/multidict-6.6.4-cp311-cp311-win32.whl", hash = "sha256:ed3b94c5e362a8a84d69642dbeac615452e8af9b8eb825b7bc9f31a53a1051e2", size = 41306, upload-time = "2025-08-11T12:06:49.95Z" }, - { url = "https://files.pythonhosted.org/packages/74/7d/36b045c23a1ab98507aefd44fd8b264ee1dd5e5010543c6fccf82141ccef/multidict-6.6.4-cp311-cp311-win_amd64.whl", hash = "sha256:d8c112f7a90d8ca5d20213aa41eac690bb50a76da153e3afb3886418e61cb22e", size = 46029, upload-time = "2025-08-11T12:06:51.082Z" }, - { url = "https://files.pythonhosted.org/packages/0f/5e/553d67d24432c5cd52b49047f2d248821843743ee6d29a704594f656d182/multidict-6.6.4-cp311-cp311-win_arm64.whl", hash = "sha256:3bb0eae408fa1996d87247ca0d6a57b7fc1dcf83e8a5c47ab82c558c250d4adf", size = 43017, upload-time = "2025-08-11T12:06:52.243Z" }, - { url = "https://files.pythonhosted.org/packages/05/f6/512ffd8fd8b37fb2680e5ac35d788f1d71bbaf37789d21a820bdc441e565/multidict-6.6.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0ffb87be160942d56d7b87b0fdf098e81ed565add09eaa1294268c7f3caac4c8", size = 76516, upload-time = "2025-08-11T12:06:53.393Z" }, - { url = 
"https://files.pythonhosted.org/packages/99/58/45c3e75deb8855c36bd66cc1658007589662ba584dbf423d01df478dd1c5/multidict-6.6.4-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d191de6cbab2aff5de6c5723101705fd044b3e4c7cfd587a1929b5028b9714b3", size = 45394, upload-time = "2025-08-11T12:06:54.555Z" }, - { url = "https://files.pythonhosted.org/packages/fd/ca/e8c4472a93a26e4507c0b8e1f0762c0d8a32de1328ef72fd704ef9cc5447/multidict-6.6.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:38a0956dd92d918ad5feff3db8fcb4a5eb7dba114da917e1a88475619781b57b", size = 43591, upload-time = "2025-08-11T12:06:55.672Z" }, - { url = "https://files.pythonhosted.org/packages/05/51/edf414f4df058574a7265034d04c935aa84a89e79ce90fcf4df211f47b16/multidict-6.6.4-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:6865f6d3b7900ae020b495d599fcf3765653bc927951c1abb959017f81ae8287", size = 237215, upload-time = "2025-08-11T12:06:57.213Z" }, - { url = "https://files.pythonhosted.org/packages/c8/45/8b3d6dbad8cf3252553cc41abea09ad527b33ce47a5e199072620b296902/multidict-6.6.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0a2088c126b6f72db6c9212ad827d0ba088c01d951cee25e758c450da732c138", size = 258299, upload-time = "2025-08-11T12:06:58.946Z" }, - { url = "https://files.pythonhosted.org/packages/3c/e8/8ca2e9a9f5a435fc6db40438a55730a4bf4956b554e487fa1b9ae920f825/multidict-6.6.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0f37bed7319b848097085d7d48116f545985db988e2256b2e6f00563a3416ee6", size = 242357, upload-time = "2025-08-11T12:07:00.301Z" }, - { url = "https://files.pythonhosted.org/packages/0f/84/80c77c99df05a75c28490b2af8f7cba2a12621186e0a8b0865d8e745c104/multidict-6.6.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:01368e3c94032ba6ca0b78e7ccb099643466cf24f8dc8eefcfdc0571d56e58f9", 
size = 268369, upload-time = "2025-08-11T12:07:01.638Z" }, - { url = "https://files.pythonhosted.org/packages/0d/e9/920bfa46c27b05fb3e1ad85121fd49f441492dca2449c5bcfe42e4565d8a/multidict-6.6.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8fe323540c255db0bffee79ad7f048c909f2ab0edb87a597e1c17da6a54e493c", size = 269341, upload-time = "2025-08-11T12:07:02.943Z" }, - { url = "https://files.pythonhosted.org/packages/af/65/753a2d8b05daf496f4a9c367fe844e90a1b2cac78e2be2c844200d10cc4c/multidict-6.6.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b8eb3025f17b0a4c3cd08cda49acf312a19ad6e8a4edd9dbd591e6506d999402", size = 256100, upload-time = "2025-08-11T12:07:04.564Z" }, - { url = "https://files.pythonhosted.org/packages/09/54/655be13ae324212bf0bc15d665a4e34844f34c206f78801be42f7a0a8aaa/multidict-6.6.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bbc14f0365534d35a06970d6a83478b249752e922d662dc24d489af1aa0d1be7", size = 253584, upload-time = "2025-08-11T12:07:05.914Z" }, - { url = "https://files.pythonhosted.org/packages/5c/74/ab2039ecc05264b5cec73eb018ce417af3ebb384ae9c0e9ed42cb33f8151/multidict-6.6.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:75aa52fba2d96bf972e85451b99d8e19cc37ce26fd016f6d4aa60da9ab2b005f", size = 251018, upload-time = "2025-08-11T12:07:08.301Z" }, - { url = "https://files.pythonhosted.org/packages/af/0a/ccbb244ac848e56c6427f2392741c06302bbfba49c0042f1eb3c5b606497/multidict-6.6.4-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:4fefd4a815e362d4f011919d97d7b4a1e566f1dde83dc4ad8cfb5b41de1df68d", size = 251477, upload-time = "2025-08-11T12:07:10.248Z" }, - { url = "https://files.pythonhosted.org/packages/0e/b0/0ed49bba775b135937f52fe13922bc64a7eaf0a3ead84a36e8e4e446e096/multidict-6.6.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:db9801fe021f59a5b375ab778973127ca0ac52429a26e2fd86aa9508f4d26eb7", size = 263575, upload-time = 
"2025-08-11T12:07:11.928Z" }, - { url = "https://files.pythonhosted.org/packages/3e/d9/7fb85a85e14de2e44dfb6a24f03c41e2af8697a6df83daddb0e9b7569f73/multidict-6.6.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:a650629970fa21ac1fb06ba25dabfc5b8a2054fcbf6ae97c758aa956b8dba802", size = 259649, upload-time = "2025-08-11T12:07:13.244Z" }, - { url = "https://files.pythonhosted.org/packages/03/9e/b3a459bcf9b6e74fa461a5222a10ff9b544cb1cd52fd482fb1b75ecda2a2/multidict-6.6.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:452ff5da78d4720d7516a3a2abd804957532dd69296cb77319c193e3ffb87e24", size = 251505, upload-time = "2025-08-11T12:07:14.57Z" }, - { url = "https://files.pythonhosted.org/packages/86/a2/8022f78f041dfe6d71e364001a5cf987c30edfc83c8a5fb7a3f0974cff39/multidict-6.6.4-cp312-cp312-win32.whl", hash = "sha256:8c2fcb12136530ed19572bbba61b407f655e3953ba669b96a35036a11a485793", size = 41888, upload-time = "2025-08-11T12:07:15.904Z" }, - { url = "https://files.pythonhosted.org/packages/c7/eb/d88b1780d43a56db2cba24289fa744a9d216c1a8546a0dc3956563fd53ea/multidict-6.6.4-cp312-cp312-win_amd64.whl", hash = "sha256:047d9425860a8c9544fed1b9584f0c8bcd31bcde9568b047c5e567a1025ecd6e", size = 46072, upload-time = "2025-08-11T12:07:17.045Z" }, - { url = "https://files.pythonhosted.org/packages/9f/16/b929320bf5750e2d9d4931835a4c638a19d2494a5b519caaaa7492ebe105/multidict-6.6.4-cp312-cp312-win_arm64.whl", hash = "sha256:14754eb72feaa1e8ae528468f24250dd997b8e2188c3d2f593f9eba259e4b364", size = 43222, upload-time = "2025-08-11T12:07:18.328Z" }, - { url = "https://files.pythonhosted.org/packages/3a/5d/e1db626f64f60008320aab00fbe4f23fc3300d75892a3381275b3d284580/multidict-6.6.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f46a6e8597f9bd71b31cc708195d42b634c8527fecbcf93febf1052cacc1f16e", size = 75848, upload-time = "2025-08-11T12:07:19.912Z" }, - { url = 
"https://files.pythonhosted.org/packages/4c/aa/8b6f548d839b6c13887253af4e29c939af22a18591bfb5d0ee6f1931dae8/multidict-6.6.4-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:22e38b2bc176c5eb9c0a0e379f9d188ae4cd8b28c0f53b52bce7ab0a9e534657", size = 45060, upload-time = "2025-08-11T12:07:21.163Z" }, - { url = "https://files.pythonhosted.org/packages/eb/c6/f5e97e5d99a729bc2aa58eb3ebfa9f1e56a9b517cc38c60537c81834a73f/multidict-6.6.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5df8afd26f162da59e218ac0eefaa01b01b2e6cd606cffa46608f699539246da", size = 43269, upload-time = "2025-08-11T12:07:22.392Z" }, - { url = "https://files.pythonhosted.org/packages/dc/31/d54eb0c62516776f36fe67f84a732f97e0b0e12f98d5685bebcc6d396910/multidict-6.6.4-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:49517449b58d043023720aa58e62b2f74ce9b28f740a0b5d33971149553d72aa", size = 237158, upload-time = "2025-08-11T12:07:23.636Z" }, - { url = "https://files.pythonhosted.org/packages/c4/1c/8a10c1c25b23156e63b12165a929d8eb49a6ed769fdbefb06e6f07c1e50d/multidict-6.6.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ae9408439537c5afdca05edd128a63f56a62680f4b3c234301055d7a2000220f", size = 257076, upload-time = "2025-08-11T12:07:25.049Z" }, - { url = "https://files.pythonhosted.org/packages/ad/86/90e20b5771d6805a119e483fd3d1e8393e745a11511aebca41f0da38c3e2/multidict-6.6.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:87a32d20759dc52a9e850fe1061b6e41ab28e2998d44168a8a341b99ded1dba0", size = 240694, upload-time = "2025-08-11T12:07:26.458Z" }, - { url = "https://files.pythonhosted.org/packages/e7/49/484d3e6b535bc0555b52a0a26ba86e4d8d03fd5587d4936dc59ba7583221/multidict-6.6.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:52e3c8d43cdfff587ceedce9deb25e6ae77daba560b626e97a56ddcad3756879", 
size = 266350, upload-time = "2025-08-11T12:07:27.94Z" }, - { url = "https://files.pythonhosted.org/packages/bf/b4/aa4c5c379b11895083d50021e229e90c408d7d875471cb3abf721e4670d6/multidict-6.6.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ad8850921d3a8d8ff6fbef790e773cecfc260bbfa0566998980d3fa8f520bc4a", size = 267250, upload-time = "2025-08-11T12:07:29.303Z" }, - { url = "https://files.pythonhosted.org/packages/80/e5/5e22c5bf96a64bdd43518b1834c6d95a4922cc2066b7d8e467dae9b6cee6/multidict-6.6.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:497a2954adc25c08daff36f795077f63ad33e13f19bfff7736e72c785391534f", size = 254900, upload-time = "2025-08-11T12:07:30.764Z" }, - { url = "https://files.pythonhosted.org/packages/17/38/58b27fed927c07035abc02befacab42491e7388ca105e087e6e0215ead64/multidict-6.6.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:024ce601f92d780ca1617ad4be5ac15b501cc2414970ffa2bb2bbc2bd5a68fa5", size = 252355, upload-time = "2025-08-11T12:07:32.205Z" }, - { url = "https://files.pythonhosted.org/packages/d0/a1/dad75d23a90c29c02b5d6f3d7c10ab36c3197613be5d07ec49c7791e186c/multidict-6.6.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:a693fc5ed9bdd1c9e898013e0da4dcc640de7963a371c0bd458e50e046bf6438", size = 250061, upload-time = "2025-08-11T12:07:33.623Z" }, - { url = "https://files.pythonhosted.org/packages/b8/1a/ac2216b61c7f116edab6dc3378cca6c70dc019c9a457ff0d754067c58b20/multidict-6.6.4-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:190766dac95aab54cae5b152a56520fd99298f32a1266d66d27fdd1b5ac00f4e", size = 249675, upload-time = "2025-08-11T12:07:34.958Z" }, - { url = "https://files.pythonhosted.org/packages/d4/79/1916af833b800d13883e452e8e0977c065c4ee3ab7a26941fbfdebc11895/multidict-6.6.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:34d8f2a5ffdceab9dcd97c7a016deb2308531d5f0fced2bb0c9e1df45b3363d7", size = 261247, upload-time = 
"2025-08-11T12:07:36.588Z" }, - { url = "https://files.pythonhosted.org/packages/c5/65/d1f84fe08ac44a5fc7391cbc20a7cedc433ea616b266284413fd86062f8c/multidict-6.6.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:59e8d40ab1f5a8597abcef00d04845155a5693b5da00d2c93dbe88f2050f2812", size = 257960, upload-time = "2025-08-11T12:07:39.735Z" }, - { url = "https://files.pythonhosted.org/packages/13/b5/29ec78057d377b195ac2c5248c773703a6b602e132a763e20ec0457e7440/multidict-6.6.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:467fe64138cfac771f0e949b938c2e1ada2b5af22f39692aa9258715e9ea613a", size = 250078, upload-time = "2025-08-11T12:07:41.525Z" }, - { url = "https://files.pythonhosted.org/packages/c4/0e/7e79d38f70a872cae32e29b0d77024bef7834b0afb406ddae6558d9e2414/multidict-6.6.4-cp313-cp313-win32.whl", hash = "sha256:14616a30fe6d0a48d0a48d1a633ab3b8bec4cf293aac65f32ed116f620adfd69", size = 41708, upload-time = "2025-08-11T12:07:43.405Z" }, - { url = "https://files.pythonhosted.org/packages/9d/34/746696dffff742e97cd6a23da953e55d0ea51fa601fa2ff387b3edcfaa2c/multidict-6.6.4-cp313-cp313-win_amd64.whl", hash = "sha256:40cd05eaeb39e2bc8939451f033e57feaa2ac99e07dbca8afe2be450a4a3b6cf", size = 45912, upload-time = "2025-08-11T12:07:45.082Z" }, - { url = "https://files.pythonhosted.org/packages/c7/87/3bac136181e271e29170d8d71929cdeddeb77f3e8b6a0c08da3a8e9da114/multidict-6.6.4-cp313-cp313-win_arm64.whl", hash = "sha256:f6eb37d511bfae9e13e82cb4d1af36b91150466f24d9b2b8a9785816deb16605", size = 43076, upload-time = "2025-08-11T12:07:46.746Z" }, - { url = "https://files.pythonhosted.org/packages/64/94/0a8e63e36c049b571c9ae41ee301ada29c3fee9643d9c2548d7d558a1d99/multidict-6.6.4-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:6c84378acd4f37d1b507dfa0d459b449e2321b3ba5f2338f9b085cf7a7ba95eb", size = 82812, upload-time = "2025-08-11T12:07:48.402Z" }, - { url = 
"https://files.pythonhosted.org/packages/25/1a/be8e369dfcd260d2070a67e65dd3990dd635cbd735b98da31e00ea84cd4e/multidict-6.6.4-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0e0558693063c75f3d952abf645c78f3c5dfdd825a41d8c4d8156fc0b0da6e7e", size = 48313, upload-time = "2025-08-11T12:07:49.679Z" }, - { url = "https://files.pythonhosted.org/packages/26/5a/dd4ade298674b2f9a7b06a32c94ffbc0497354df8285f27317c66433ce3b/multidict-6.6.4-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3f8e2384cb83ebd23fd07e9eada8ba64afc4c759cd94817433ab8c81ee4b403f", size = 46777, upload-time = "2025-08-11T12:07:51.318Z" }, - { url = "https://files.pythonhosted.org/packages/89/db/98aa28bc7e071bfba611ac2ae803c24e96dd3a452b4118c587d3d872c64c/multidict-6.6.4-cp313-cp313t-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:f996b87b420995a9174b2a7c1a8daf7db4750be6848b03eb5e639674f7963773", size = 229321, upload-time = "2025-08-11T12:07:52.965Z" }, - { url = "https://files.pythonhosted.org/packages/c7/bc/01ddda2a73dd9d167bd85d0e8ef4293836a8f82b786c63fb1a429bc3e678/multidict-6.6.4-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:cc356250cffd6e78416cf5b40dc6a74f1edf3be8e834cf8862d9ed5265cf9b0e", size = 249954, upload-time = "2025-08-11T12:07:54.423Z" }, - { url = "https://files.pythonhosted.org/packages/06/78/6b7c0f020f9aa0acf66d0ab4eb9f08375bac9a50ff5e3edb1c4ccd59eafc/multidict-6.6.4-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:dadf95aa862714ea468a49ad1e09fe00fcc9ec67d122f6596a8d40caf6cec7d0", size = 228612, upload-time = "2025-08-11T12:07:55.914Z" }, - { url = "https://files.pythonhosted.org/packages/00/44/3faa416f89b2d5d76e9d447296a81521e1c832ad6e40b92f990697b43192/multidict-6.6.4-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = 
"sha256:7dd57515bebffd8ebd714d101d4c434063322e4fe24042e90ced41f18b6d3395", size = 257528, upload-time = "2025-08-11T12:07:57.371Z" }, - { url = "https://files.pythonhosted.org/packages/05/5f/77c03b89af0fcb16f018f668207768191fb9dcfb5e3361a5e706a11db2c9/multidict-6.6.4-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:967af5f238ebc2eb1da4e77af5492219fbd9b4b812347da39a7b5f5c72c0fa45", size = 256329, upload-time = "2025-08-11T12:07:58.844Z" }, - { url = "https://files.pythonhosted.org/packages/cf/e9/ed750a2a9afb4f8dc6f13dc5b67b514832101b95714f1211cd42e0aafc26/multidict-6.6.4-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2a4c6875c37aae9794308ec43e3530e4aa0d36579ce38d89979bbf89582002bb", size = 247928, upload-time = "2025-08-11T12:08:01.037Z" }, - { url = "https://files.pythonhosted.org/packages/1f/b5/e0571bc13cda277db7e6e8a532791d4403dacc9850006cb66d2556e649c0/multidict-6.6.4-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:7f683a551e92bdb7fac545b9c6f9fa2aebdeefa61d607510b3533286fcab67f5", size = 245228, upload-time = "2025-08-11T12:08:02.96Z" }, - { url = "https://files.pythonhosted.org/packages/f3/a3/69a84b0eccb9824491f06368f5b86e72e4af54c3067c37c39099b6687109/multidict-6.6.4-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:3ba5aaf600edaf2a868a391779f7a85d93bed147854925f34edd24cc70a3e141", size = 235869, upload-time = "2025-08-11T12:08:04.746Z" }, - { url = "https://files.pythonhosted.org/packages/a9/9d/28802e8f9121a6a0804fa009debf4e753d0a59969ea9f70be5f5fdfcb18f/multidict-6.6.4-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:580b643b7fd2c295d83cad90d78419081f53fd532d1f1eb67ceb7060f61cff0d", size = 243446, upload-time = "2025-08-11T12:08:06.332Z" }, - { url = "https://files.pythonhosted.org/packages/38/ea/6c98add069b4878c1d66428a5f5149ddb6d32b1f9836a826ac764b9940be/multidict-6.6.4-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = 
"sha256:37b7187197da6af3ee0b044dbc9625afd0c885f2800815b228a0e70f9a7f473d", size = 252299, upload-time = "2025-08-11T12:08:07.931Z" }, - { url = "https://files.pythonhosted.org/packages/3a/09/8fe02d204473e14c0af3affd50af9078839dfca1742f025cca765435d6b4/multidict-6.6.4-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e1b93790ed0bc26feb72e2f08299691ceb6da5e9e14a0d13cc74f1869af327a0", size = 246926, upload-time = "2025-08-11T12:08:09.467Z" }, - { url = "https://files.pythonhosted.org/packages/37/3d/7b1e10d774a6df5175ecd3c92bff069e77bed9ec2a927fdd4ff5fe182f67/multidict-6.6.4-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a506a77ddee1efcca81ecbeae27ade3e09cdf21a8ae854d766c2bb4f14053f92", size = 243383, upload-time = "2025-08-11T12:08:10.981Z" }, - { url = "https://files.pythonhosted.org/packages/50/b0/a6fae46071b645ae98786ab738447de1ef53742eaad949f27e960864bb49/multidict-6.6.4-cp313-cp313t-win32.whl", hash = "sha256:f93b2b2279883d1d0a9e1bd01f312d6fc315c5e4c1f09e112e4736e2f650bc4e", size = 47775, upload-time = "2025-08-11T12:08:12.439Z" }, - { url = "https://files.pythonhosted.org/packages/b2/0a/2436550b1520091af0600dff547913cb2d66fbac27a8c33bc1b1bccd8d98/multidict-6.6.4-cp313-cp313t-win_amd64.whl", hash = "sha256:6d46a180acdf6e87cc41dc15d8f5c2986e1e8739dc25dbb7dac826731ef381a4", size = 53100, upload-time = "2025-08-11T12:08:13.823Z" }, - { url = "https://files.pythonhosted.org/packages/97/ea/43ac51faff934086db9c072a94d327d71b7d8b40cd5dcb47311330929ef0/multidict-6.6.4-cp313-cp313t-win_arm64.whl", hash = "sha256:756989334015e3335d087a27331659820d53ba432befdef6a718398b0a8493ad", size = 45501, upload-time = "2025-08-11T12:08:15.173Z" }, - { url = "https://files.pythonhosted.org/packages/fd/69/b547032297c7e63ba2af494edba695d781af8a0c6e89e4d06cf848b21d80/multidict-6.6.4-py3-none-any.whl", hash = "sha256:27d8f8e125c07cb954e54d75d04905a9bba8a439c1d84aca94949d4d03d8601c", size = 12313, upload-time = "2025-08-11T12:08:46.891Z" }, -] - -[[package]] -name = "nexus-rpc" 
-version = "1.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ef/66/540687556bd28cf1ec370cc6881456203dfddb9dab047b8979c6865b5984/nexus_rpc-1.1.0.tar.gz", hash = "sha256:d65ad6a2f54f14e53ebe39ee30555eaeb894102437125733fb13034a04a44553", size = 77383, upload-time = "2025-07-07T19:03:58.368Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/bf/2f/9e9d0dcaa4c6ffa22b7aa31069a8a264c753ff8027b36af602cce038c92f/nexus_rpc-1.1.0-py3-none-any.whl", hash = "sha256:d1b007af2aba186a27e736f8eaae39c03aed05b488084ff6c3d1785c9ba2ad38", size = 27743, upload-time = "2025-07-07T19:03:57.556Z" }, -] - -[[package]] -name = "openapi-core" -version = "0.19.5" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "isodate" }, - { name = "jsonschema" }, - { name = "jsonschema-path" }, - { name = "more-itertools" }, - { name = "openapi-schema-validator" }, - { name = "openapi-spec-validator" }, - { name = "parse" }, - { name = "typing-extensions" }, - { name = "werkzeug" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/b1/35/1acaa5f2fcc6e54eded34a2ec74b479439c4e469fc4e8d0e803fda0234db/openapi_core-0.19.5.tar.gz", hash = "sha256:421e753da56c391704454e66afe4803a290108590ac8fa6f4a4487f4ec11f2d3", size = 103264, upload-time = "2025-03-20T20:17:28.193Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/27/6f/83ead0e2e30a90445ee4fc0135f43741aebc30cca5b43f20968b603e30b6/openapi_core-0.19.5-py3-none-any.whl", hash = "sha256:ef7210e83a59394f46ce282639d8d26ad6fc8094aa904c9c16eb1bac8908911f", size = 106595, upload-time = "2025-03-20T20:17:26.77Z" }, -] - -[[package]] -name = "openapi-pydantic" -version = "0.5.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pydantic" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/02/2e/58d83848dd1a79cb92ed8e63f6ba901ca282c5f09d04af9423ec26c56fd7/openapi_pydantic-0.5.1.tar.gz", hash = "sha256:ff6835af6bde7a459fb93eb93bb92b8749b754fc6e51b2f1590a19dc3005ee0d", size = 60892, upload-time = "2025-01-08T19:29:27.083Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/12/cf/03675d8bd8ecbf4445504d8071adab19f5f993676795708e36402ab38263/openapi_pydantic-0.5.1-py3-none-any.whl", hash = "sha256:a3a09ef4586f5bd760a8df7f43028b60cafb6d9f61de2acba9574766255ab146", size = 96381, upload-time = "2025-01-08T19:29:25.275Z" }, -] - -[[package]] -name = "openapi-schema-validator" -version = "0.6.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "jsonschema" }, - { name = "jsonschema-specifications" }, - { name = "rfc3339-validator" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/8b/f3/5507ad3325169347cd8ced61c232ff3df70e2b250c49f0fe140edb4973c6/openapi_schema_validator-0.6.3.tar.gz", hash = "sha256:f37bace4fc2a5d96692f4f8b31dc0f8d7400fd04f3a937798eaf880d425de6ee", size = 11550, upload-time = "2025-01-10T18:08:22.268Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/21/c6/ad0fba32775ae749016829dace42ed80f4407b171da41313d1a3a5f102e4/openapi_schema_validator-0.6.3-py3-none-any.whl", hash = "sha256:f3b9870f4e556b5a62a1c39da72a6b4b16f3ad9c73dc80084b1b11e74ba148a3", size = 8755, upload-time = "2025-01-10T18:08:19.758Z" }, -] - -[[package]] -name = "openapi-spec-validator" -version = "0.7.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "jsonschema" }, - { name = "jsonschema-path" }, - { name = "lazy-object-proxy" }, - { name = "openapi-schema-validator" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/82/af/fe2d7618d6eae6fb3a82766a44ed87cd8d6d82b4564ed1c7cfb0f6378e91/openapi_spec_validator-0.7.2.tar.gz", hash = "sha256:cc029309b5c5dbc7859df0372d55e9d1ff43e96d678b9ba087f7c56fc586f734", size = 36855, 
upload-time = "2025-06-07T14:48:56.299Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/27/dd/b3fd642260cb17532f66cc1e8250f3507d1e580483e209dc1e9d13bd980d/openapi_spec_validator-0.7.2-py3-none-any.whl", hash = "sha256:4bbdc0894ec85f1d1bea1d6d9c8b2c3c8d7ccaa13577ef40da9c006c9fd0eb60", size = 39713, upload-time = "2025-06-07T14:48:54.077Z" }, -] - -[[package]] -name = "packaging" -version = "25.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" }, -] - -[[package]] -name = "parse" -version = "1.20.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/4f/78/d9b09ba24bb36ef8b83b71be547e118d46214735b6dfb39e4bfde0e9b9dd/parse-1.20.2.tar.gz", hash = "sha256:b41d604d16503c79d81af5165155c0b20f6c8d6c559efa66b4b695c3e5a0a0ce", size = 29391, upload-time = "2024-06-11T04:41:57.34Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d0/31/ba45bf0b2aa7898d81cbbfac0e88c267befb59ad91a19e36e1bc5578ddb1/parse-1.20.2-py2.py3-none-any.whl", hash = "sha256:967095588cb802add9177d0c0b6133b5ba33b1ea9007ca800e526f42a85af558", size = 20126, upload-time = "2024-06-11T04:41:55.057Z" }, -] - -[[package]] -name = "pathable" -version = "0.4.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/67/93/8f2c2075b180c12c1e9f6a09d1a985bc2036906b13dff1d8917e395f2048/pathable-0.4.4.tar.gz", hash = "sha256:6905a3cd17804edfac7875b5f6c9142a218c7caef78693c2dbbbfbac186d88b2", size = 8124, upload-time = "2025-01-10T18:43:13.247Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7d/eb/b6260b31b1a96386c0a880edebe26f89669098acea8e0318bff6adb378fd/pathable-0.4.4-py3-none-any.whl", hash = "sha256:5ae9e94793b6ef5a4cbe0a7ce9dbbefc1eec38df253763fd0aeeacf2762dbbc2", size = 9592, upload-time = "2025-01-10T18:43:11.88Z" }, -] - -[[package]] -name = "pluggy" -version = "1.6.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" }, -] - -[[package]] -name = "propcache" -version = "0.3.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/a6/16/43264e4a779dd8588c21a70f0709665ee8f611211bdd2c87d952cfa7c776/propcache-0.3.2.tar.gz", hash = "sha256:20d7d62e4e7ef05f221e0db2856b979540686342e7dd9973b815599c7057e168", size = 44139, upload-time = "2025-06-09T22:56:06.081Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/80/8d/e8b436717ab9c2cfc23b116d2c297305aa4cd8339172a456d61ebf5669b8/propcache-0.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:0b8d2f607bd8f80ddc04088bc2a037fdd17884a6fcadc47a96e334d72f3717be", size = 74207, upload-time = "2025-06-09T22:54:05.399Z" }, - { url = 
"https://files.pythonhosted.org/packages/d6/29/1e34000e9766d112171764b9fa3226fa0153ab565d0c242c70e9945318a7/propcache-0.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:06766d8f34733416e2e34f46fea488ad5d60726bb9481d3cddf89a6fa2d9603f", size = 43648, upload-time = "2025-06-09T22:54:08.023Z" }, - { url = "https://files.pythonhosted.org/packages/46/92/1ad5af0df781e76988897da39b5f086c2bf0f028b7f9bd1f409bb05b6874/propcache-0.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a2dc1f4a1df4fecf4e6f68013575ff4af84ef6f478fe5344317a65d38a8e6dc9", size = 43496, upload-time = "2025-06-09T22:54:09.228Z" }, - { url = "https://files.pythonhosted.org/packages/b3/ce/e96392460f9fb68461fabab3e095cb00c8ddf901205be4eae5ce246e5b7e/propcache-0.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:be29c4f4810c5789cf10ddf6af80b041c724e629fa51e308a7a0fb19ed1ef7bf", size = 217288, upload-time = "2025-06-09T22:54:10.466Z" }, - { url = "https://files.pythonhosted.org/packages/c5/2a/866726ea345299f7ceefc861a5e782b045545ae6940851930a6adaf1fca6/propcache-0.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:59d61f6970ecbd8ff2e9360304d5c8876a6abd4530cb752c06586849ac8a9dc9", size = 227456, upload-time = "2025-06-09T22:54:11.828Z" }, - { url = "https://files.pythonhosted.org/packages/de/03/07d992ccb6d930398689187e1b3c718339a1c06b8b145a8d9650e4726166/propcache-0.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:62180e0b8dbb6b004baec00a7983e4cc52f5ada9cd11f48c3528d8cfa7b96a66", size = 225429, upload-time = "2025-06-09T22:54:13.823Z" }, - { url = "https://files.pythonhosted.org/packages/5d/e6/116ba39448753b1330f48ab8ba927dcd6cf0baea8a0ccbc512dfb49ba670/propcache-0.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c144ca294a204c470f18cf4c9d78887810d04a3e2fbb30eea903575a779159df", size = 213472, upload-time = "2025-06-09T22:54:15.232Z" }, - { url = 
"https://files.pythonhosted.org/packages/a6/85/f01f5d97e54e428885a5497ccf7f54404cbb4f906688a1690cd51bf597dc/propcache-0.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c5c2a784234c28854878d68978265617aa6dc0780e53d44b4d67f3651a17a9a2", size = 204480, upload-time = "2025-06-09T22:54:17.104Z" }, - { url = "https://files.pythonhosted.org/packages/e3/79/7bf5ab9033b8b8194cc3f7cf1aaa0e9c3256320726f64a3e1f113a812dce/propcache-0.3.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:5745bc7acdafa978ca1642891b82c19238eadc78ba2aaa293c6863b304e552d7", size = 214530, upload-time = "2025-06-09T22:54:18.512Z" }, - { url = "https://files.pythonhosted.org/packages/31/0b/bd3e0c00509b609317df4a18e6b05a450ef2d9a963e1d8bc9c9415d86f30/propcache-0.3.2-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:c0075bf773d66fa8c9d41f66cc132ecc75e5bb9dd7cce3cfd14adc5ca184cb95", size = 205230, upload-time = "2025-06-09T22:54:19.947Z" }, - { url = "https://files.pythonhosted.org/packages/7a/23/fae0ff9b54b0de4e819bbe559508da132d5683c32d84d0dc2ccce3563ed4/propcache-0.3.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5f57aa0847730daceff0497f417c9de353c575d8da3579162cc74ac294c5369e", size = 206754, upload-time = "2025-06-09T22:54:21.716Z" }, - { url = "https://files.pythonhosted.org/packages/b7/7f/ad6a3c22630aaa5f618b4dc3c3598974a72abb4c18e45a50b3cdd091eb2f/propcache-0.3.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:eef914c014bf72d18efb55619447e0aecd5fb7c2e3fa7441e2e5d6099bddff7e", size = 218430, upload-time = "2025-06-09T22:54:23.17Z" }, - { url = "https://files.pythonhosted.org/packages/5b/2c/ba4f1c0e8a4b4c75910742f0d333759d441f65a1c7f34683b4a74c0ee015/propcache-0.3.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2a4092e8549031e82facf3decdbc0883755d5bbcc62d3aea9d9e185549936dcf", size = 223884, upload-time = "2025-06-09T22:54:25.539Z" }, - { url = 
"https://files.pythonhosted.org/packages/88/e4/ebe30fc399e98572019eee82ad0caf512401661985cbd3da5e3140ffa1b0/propcache-0.3.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:85871b050f174bc0bfb437efbdb68aaf860611953ed12418e4361bc9c392749e", size = 211480, upload-time = "2025-06-09T22:54:26.892Z" }, - { url = "https://files.pythonhosted.org/packages/96/0a/7d5260b914e01d1d0906f7f38af101f8d8ed0dc47426219eeaf05e8ea7c2/propcache-0.3.2-cp311-cp311-win32.whl", hash = "sha256:36c8d9b673ec57900c3554264e630d45980fd302458e4ac801802a7fd2ef7897", size = 37757, upload-time = "2025-06-09T22:54:28.241Z" }, - { url = "https://files.pythonhosted.org/packages/e1/2d/89fe4489a884bc0da0c3278c552bd4ffe06a1ace559db5ef02ef24ab446b/propcache-0.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:e53af8cb6a781b02d2ea079b5b853ba9430fcbe18a8e3ce647d5982a3ff69f39", size = 41500, upload-time = "2025-06-09T22:54:29.4Z" }, - { url = "https://files.pythonhosted.org/packages/a8/42/9ca01b0a6f48e81615dca4765a8f1dd2c057e0540f6116a27dc5ee01dfb6/propcache-0.3.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:8de106b6c84506b31c27168582cd3cb3000a6412c16df14a8628e5871ff83c10", size = 73674, upload-time = "2025-06-09T22:54:30.551Z" }, - { url = "https://files.pythonhosted.org/packages/af/6e/21293133beb550f9c901bbece755d582bfaf2176bee4774000bd4dd41884/propcache-0.3.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:28710b0d3975117239c76600ea351934ac7b5ff56e60953474342608dbbb6154", size = 43570, upload-time = "2025-06-09T22:54:32.296Z" }, - { url = "https://files.pythonhosted.org/packages/0c/c8/0393a0a3a2b8760eb3bde3c147f62b20044f0ddac81e9d6ed7318ec0d852/propcache-0.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce26862344bdf836650ed2487c3d724b00fbfec4233a1013f597b78c1cb73615", size = 43094, upload-time = "2025-06-09T22:54:33.929Z" }, - { url = 
"https://files.pythonhosted.org/packages/37/2c/489afe311a690399d04a3e03b069225670c1d489eb7b044a566511c1c498/propcache-0.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bca54bd347a253af2cf4544bbec232ab982f4868de0dd684246b67a51bc6b1db", size = 226958, upload-time = "2025-06-09T22:54:35.186Z" }, - { url = "https://files.pythonhosted.org/packages/9d/ca/63b520d2f3d418c968bf596839ae26cf7f87bead026b6192d4da6a08c467/propcache-0.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:55780d5e9a2ddc59711d727226bb1ba83a22dd32f64ee15594b9392b1f544eb1", size = 234894, upload-time = "2025-06-09T22:54:36.708Z" }, - { url = "https://files.pythonhosted.org/packages/11/60/1d0ed6fff455a028d678df30cc28dcee7af77fa2b0e6962ce1df95c9a2a9/propcache-0.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:035e631be25d6975ed87ab23153db6a73426a48db688070d925aa27e996fe93c", size = 233672, upload-time = "2025-06-09T22:54:38.062Z" }, - { url = "https://files.pythonhosted.org/packages/37/7c/54fd5301ef38505ab235d98827207176a5c9b2aa61939b10a460ca53e123/propcache-0.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ee6f22b6eaa39297c751d0e80c0d3a454f112f5c6481214fcf4c092074cecd67", size = 224395, upload-time = "2025-06-09T22:54:39.634Z" }, - { url = "https://files.pythonhosted.org/packages/ee/1a/89a40e0846f5de05fdc6779883bf46ba980e6df4d2ff8fb02643de126592/propcache-0.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7ca3aee1aa955438c4dba34fc20a9f390e4c79967257d830f137bd5a8a32ed3b", size = 212510, upload-time = "2025-06-09T22:54:41.565Z" }, - { url = "https://files.pythonhosted.org/packages/5e/33/ca98368586c9566a6b8d5ef66e30484f8da84c0aac3f2d9aec6d31a11bd5/propcache-0.3.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7a4f30862869fa2b68380d677cc1c5fcf1e0f2b9ea0cf665812895c75d0ca3b8", size = 222949, upload-time = 
"2025-06-09T22:54:43.038Z" }, - { url = "https://files.pythonhosted.org/packages/ba/11/ace870d0aafe443b33b2f0b7efdb872b7c3abd505bfb4890716ad7865e9d/propcache-0.3.2-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:b77ec3c257d7816d9f3700013639db7491a434644c906a2578a11daf13176251", size = 217258, upload-time = "2025-06-09T22:54:44.376Z" }, - { url = "https://files.pythonhosted.org/packages/5b/d2/86fd6f7adffcfc74b42c10a6b7db721d1d9ca1055c45d39a1a8f2a740a21/propcache-0.3.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:cab90ac9d3f14b2d5050928483d3d3b8fb6b4018893fc75710e6aa361ecb2474", size = 213036, upload-time = "2025-06-09T22:54:46.243Z" }, - { url = "https://files.pythonhosted.org/packages/07/94/2d7d1e328f45ff34a0a284cf5a2847013701e24c2a53117e7c280a4316b3/propcache-0.3.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:0b504d29f3c47cf6b9e936c1852246c83d450e8e063d50562115a6be6d3a2535", size = 227684, upload-time = "2025-06-09T22:54:47.63Z" }, - { url = "https://files.pythonhosted.org/packages/b7/05/37ae63a0087677e90b1d14710e532ff104d44bc1efa3b3970fff99b891dc/propcache-0.3.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:ce2ac2675a6aa41ddb2a0c9cbff53780a617ac3d43e620f8fd77ba1c84dcfc06", size = 234562, upload-time = "2025-06-09T22:54:48.982Z" }, - { url = "https://files.pythonhosted.org/packages/a4/7c/3f539fcae630408d0bd8bf3208b9a647ccad10976eda62402a80adf8fc34/propcache-0.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:62b4239611205294cc433845b914131b2a1f03500ff3c1ed093ed216b82621e1", size = 222142, upload-time = "2025-06-09T22:54:50.424Z" }, - { url = "https://files.pythonhosted.org/packages/7c/d2/34b9eac8c35f79f8a962546b3e97e9d4b990c420ee66ac8255d5d9611648/propcache-0.3.2-cp312-cp312-win32.whl", hash = "sha256:df4a81b9b53449ebc90cc4deefb052c1dd934ba85012aa912c7ea7b7e38b60c1", size = 37711, upload-time = "2025-06-09T22:54:52.072Z" }, - { url = 
"https://files.pythonhosted.org/packages/19/61/d582be5d226cf79071681d1b46b848d6cb03d7b70af7063e33a2787eaa03/propcache-0.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:7046e79b989d7fe457bb755844019e10f693752d169076138abf17f31380800c", size = 41479, upload-time = "2025-06-09T22:54:53.234Z" }, - { url = "https://files.pythonhosted.org/packages/dc/d1/8c747fafa558c603c4ca19d8e20b288aa0c7cda74e9402f50f31eb65267e/propcache-0.3.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ca592ed634a73ca002967458187109265e980422116c0a107cf93d81f95af945", size = 71286, upload-time = "2025-06-09T22:54:54.369Z" }, - { url = "https://files.pythonhosted.org/packages/61/99/d606cb7986b60d89c36de8a85d58764323b3a5ff07770a99d8e993b3fa73/propcache-0.3.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9ecb0aad4020e275652ba3975740f241bd12a61f1a784df044cf7477a02bc252", size = 42425, upload-time = "2025-06-09T22:54:55.642Z" }, - { url = "https://files.pythonhosted.org/packages/8c/96/ef98f91bbb42b79e9bb82bdd348b255eb9d65f14dbbe3b1594644c4073f7/propcache-0.3.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7f08f1cc28bd2eade7a8a3d2954ccc673bb02062e3e7da09bc75d843386b342f", size = 41846, upload-time = "2025-06-09T22:54:57.246Z" }, - { url = "https://files.pythonhosted.org/packages/5b/ad/3f0f9a705fb630d175146cd7b1d2bf5555c9beaed54e94132b21aac098a6/propcache-0.3.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d1a342c834734edb4be5ecb1e9fb48cb64b1e2320fccbd8c54bf8da8f2a84c33", size = 208871, upload-time = "2025-06-09T22:54:58.975Z" }, - { url = "https://files.pythonhosted.org/packages/3a/38/2085cda93d2c8b6ec3e92af2c89489a36a5886b712a34ab25de9fbca7992/propcache-0.3.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8a544caaae1ac73f1fecfae70ded3e93728831affebd017d53449e3ac052ac1e", size = 215720, upload-time = "2025-06-09T22:55:00.471Z" }, - { url = 
"https://files.pythonhosted.org/packages/61/c1/d72ea2dc83ac7f2c8e182786ab0fc2c7bd123a1ff9b7975bee671866fe5f/propcache-0.3.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:310d11aa44635298397db47a3ebce7db99a4cc4b9bbdfcf6c98a60c8d5261cf1", size = 215203, upload-time = "2025-06-09T22:55:01.834Z" }, - { url = "https://files.pythonhosted.org/packages/af/81/b324c44ae60c56ef12007105f1460d5c304b0626ab0cc6b07c8f2a9aa0b8/propcache-0.3.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c1396592321ac83157ac03a2023aa6cc4a3cc3cfdecb71090054c09e5a7cce3", size = 206365, upload-time = "2025-06-09T22:55:03.199Z" }, - { url = "https://files.pythonhosted.org/packages/09/73/88549128bb89e66d2aff242488f62869014ae092db63ccea53c1cc75a81d/propcache-0.3.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8cabf5b5902272565e78197edb682017d21cf3b550ba0460ee473753f28d23c1", size = 196016, upload-time = "2025-06-09T22:55:04.518Z" }, - { url = "https://files.pythonhosted.org/packages/b9/3f/3bdd14e737d145114a5eb83cb172903afba7242f67c5877f9909a20d948d/propcache-0.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0a2f2235ac46a7aa25bdeb03a9e7060f6ecbd213b1f9101c43b3090ffb971ef6", size = 205596, upload-time = "2025-06-09T22:55:05.942Z" }, - { url = "https://files.pythonhosted.org/packages/0f/ca/2f4aa819c357d3107c3763d7ef42c03980f9ed5c48c82e01e25945d437c1/propcache-0.3.2-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:92b69e12e34869a6970fd2f3da91669899994b47c98f5d430b781c26f1d9f387", size = 200977, upload-time = "2025-06-09T22:55:07.792Z" }, - { url = "https://files.pythonhosted.org/packages/cd/4a/e65276c7477533c59085251ae88505caf6831c0e85ff8b2e31ebcbb949b1/propcache-0.3.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:54e02207c79968ebbdffc169591009f4474dde3b4679e16634d34c9363ff56b4", size = 197220, upload-time = "2025-06-09T22:55:09.173Z" }, - { url = 
"https://files.pythonhosted.org/packages/7c/54/fc7152e517cf5578278b242396ce4d4b36795423988ef39bb8cd5bf274c8/propcache-0.3.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4adfb44cb588001f68c5466579d3f1157ca07f7504fc91ec87862e2b8e556b88", size = 210642, upload-time = "2025-06-09T22:55:10.62Z" }, - { url = "https://files.pythonhosted.org/packages/b9/80/abeb4a896d2767bf5f1ea7b92eb7be6a5330645bd7fb844049c0e4045d9d/propcache-0.3.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:fd3e6019dc1261cd0291ee8919dd91fbab7b169bb76aeef6c716833a3f65d206", size = 212789, upload-time = "2025-06-09T22:55:12.029Z" }, - { url = "https://files.pythonhosted.org/packages/b3/db/ea12a49aa7b2b6d68a5da8293dcf50068d48d088100ac016ad92a6a780e6/propcache-0.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4c181cad81158d71c41a2bce88edce078458e2dd5ffee7eddd6b05da85079f43", size = 205880, upload-time = "2025-06-09T22:55:13.45Z" }, - { url = "https://files.pythonhosted.org/packages/d1/e5/9076a0bbbfb65d1198007059c65639dfd56266cf8e477a9707e4b1999ff4/propcache-0.3.2-cp313-cp313-win32.whl", hash = "sha256:8a08154613f2249519e549de2330cf8e2071c2887309a7b07fb56098f5170a02", size = 37220, upload-time = "2025-06-09T22:55:15.284Z" }, - { url = "https://files.pythonhosted.org/packages/d3/f5/b369e026b09a26cd77aa88d8fffd69141d2ae00a2abaaf5380d2603f4b7f/propcache-0.3.2-cp313-cp313-win_amd64.whl", hash = "sha256:e41671f1594fc4ab0a6dec1351864713cb3a279910ae8b58f884a88a0a632c05", size = 40678, upload-time = "2025-06-09T22:55:16.445Z" }, - { url = "https://files.pythonhosted.org/packages/a4/3a/6ece377b55544941a08d03581c7bc400a3c8cd3c2865900a68d5de79e21f/propcache-0.3.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:9a3cf035bbaf035f109987d9d55dc90e4b0e36e04bbbb95af3055ef17194057b", size = 76560, upload-time = "2025-06-09T22:55:17.598Z" }, - { url = 
"https://files.pythonhosted.org/packages/0c/da/64a2bb16418740fa634b0e9c3d29edff1db07f56d3546ca2d86ddf0305e1/propcache-0.3.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:156c03d07dc1323d8dacaa221fbe028c5c70d16709cdd63502778e6c3ccca1b0", size = 44676, upload-time = "2025-06-09T22:55:18.922Z" }, - { url = "https://files.pythonhosted.org/packages/36/7b/f025e06ea51cb72c52fb87e9b395cced02786610b60a3ed51da8af017170/propcache-0.3.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:74413c0ba02ba86f55cf60d18daab219f7e531620c15f1e23d95563f505efe7e", size = 44701, upload-time = "2025-06-09T22:55:20.106Z" }, - { url = "https://files.pythonhosted.org/packages/a4/00/faa1b1b7c3b74fc277f8642f32a4c72ba1d7b2de36d7cdfb676db7f4303e/propcache-0.3.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f066b437bb3fa39c58ff97ab2ca351db465157d68ed0440abecb21715eb24b28", size = 276934, upload-time = "2025-06-09T22:55:21.5Z" }, - { url = "https://files.pythonhosted.org/packages/74/ab/935beb6f1756e0476a4d5938ff44bf0d13a055fed880caf93859b4f1baf4/propcache-0.3.2-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f1304b085c83067914721e7e9d9917d41ad87696bf70f0bc7dee450e9c71ad0a", size = 278316, upload-time = "2025-06-09T22:55:22.918Z" }, - { url = "https://files.pythonhosted.org/packages/f8/9d/994a5c1ce4389610838d1caec74bdf0e98b306c70314d46dbe4fcf21a3e2/propcache-0.3.2-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab50cef01b372763a13333b4e54021bdcb291fc9a8e2ccb9c2df98be51bcde6c", size = 282619, upload-time = "2025-06-09T22:55:24.651Z" }, - { url = "https://files.pythonhosted.org/packages/2b/00/a10afce3d1ed0287cef2e09506d3be9822513f2c1e96457ee369adb9a6cd/propcache-0.3.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fad3b2a085ec259ad2c2842666b2a0a49dea8463579c606426128925af1ed725", size = 265896, upload-time = "2025-06-09T22:55:26.049Z" }, - { url = 
"https://files.pythonhosted.org/packages/2e/a8/2aa6716ffa566ca57c749edb909ad27884680887d68517e4be41b02299f3/propcache-0.3.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:261fa020c1c14deafd54c76b014956e2f86991af198c51139faf41c4d5e83892", size = 252111, upload-time = "2025-06-09T22:55:27.381Z" }, - { url = "https://files.pythonhosted.org/packages/36/4f/345ca9183b85ac29c8694b0941f7484bf419c7f0fea2d1e386b4f7893eed/propcache-0.3.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:46d7f8aa79c927e5f987ee3a80205c987717d3659f035c85cf0c3680526bdb44", size = 268334, upload-time = "2025-06-09T22:55:28.747Z" }, - { url = "https://files.pythonhosted.org/packages/3e/ca/fcd54f78b59e3f97b3b9715501e3147f5340167733d27db423aa321e7148/propcache-0.3.2-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:6d8f3f0eebf73e3c0ff0e7853f68be638b4043c65a70517bb575eff54edd8dbe", size = 255026, upload-time = "2025-06-09T22:55:30.184Z" }, - { url = "https://files.pythonhosted.org/packages/8b/95/8e6a6bbbd78ac89c30c225210a5c687790e532ba4088afb8c0445b77ef37/propcache-0.3.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:03c89c1b14a5452cf15403e291c0ccd7751d5b9736ecb2c5bab977ad6c5bcd81", size = 250724, upload-time = "2025-06-09T22:55:31.646Z" }, - { url = "https://files.pythonhosted.org/packages/ee/b0/0dd03616142baba28e8b2d14ce5df6631b4673850a3d4f9c0f9dd714a404/propcache-0.3.2-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:0cc17efde71e12bbaad086d679ce575268d70bc123a5a71ea7ad76f70ba30bba", size = 268868, upload-time = "2025-06-09T22:55:33.209Z" }, - { url = "https://files.pythonhosted.org/packages/c5/98/2c12407a7e4fbacd94ddd32f3b1e3d5231e77c30ef7162b12a60e2dd5ce3/propcache-0.3.2-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:acdf05d00696bc0447e278bb53cb04ca72354e562cf88ea6f9107df8e7fd9770", size = 271322, upload-time = "2025-06-09T22:55:35.065Z" }, - { url = 
"https://files.pythonhosted.org/packages/35/91/9cb56efbb428b006bb85db28591e40b7736847b8331d43fe335acf95f6c8/propcache-0.3.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4445542398bd0b5d32df908031cb1b30d43ac848e20470a878b770ec2dcc6330", size = 265778, upload-time = "2025-06-09T22:55:36.45Z" }, - { url = "https://files.pythonhosted.org/packages/9a/4c/b0fe775a2bdd01e176b14b574be679d84fc83958335790f7c9a686c1f468/propcache-0.3.2-cp313-cp313t-win32.whl", hash = "sha256:f86e5d7cd03afb3a1db8e9f9f6eff15794e79e791350ac48a8c924e6f439f394", size = 41175, upload-time = "2025-06-09T22:55:38.436Z" }, - { url = "https://files.pythonhosted.org/packages/a4/ff/47f08595e3d9b5e149c150f88d9714574f1a7cbd89fe2817158a952674bf/propcache-0.3.2-cp313-cp313t-win_amd64.whl", hash = "sha256:9704bedf6e7cbe3c65eca4379a9b53ee6a83749f047808cbb5044d40d7d72198", size = 44857, upload-time = "2025-06-09T22:55:39.687Z" }, - { url = "https://files.pythonhosted.org/packages/cc/35/cc0aaecf278bb4575b8555f2b137de5ab821595ddae9da9d3cd1da4072c7/propcache-0.3.2-py3-none-any.whl", hash = "sha256:98f1ec44fb675f5052cccc8e609c46ed23a35a1cfd18545ad4e29002d858a43f", size = 12663, upload-time = "2025-06-09T22:56:04.484Z" }, -] - -[[package]] -name = "protobuf" -version = "6.32.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/fa/a4/cc17347aa2897568beece2e674674359f911d6fe21b0b8d6268cd42727ac/protobuf-6.32.1.tar.gz", hash = "sha256:ee2469e4a021474ab9baafea6cd070e5bf27c7d29433504ddea1a4ee5850f68d", size = 440635, upload-time = "2025-09-11T21:38:42.935Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c0/98/645183ea03ab3995d29086b8bf4f7562ebd3d10c9a4b14ee3f20d47cfe50/protobuf-6.32.1-cp310-abi3-win32.whl", hash = "sha256:a8a32a84bc9f2aad712041b8b366190f71dde248926da517bde9e832e4412085", size = 424411, upload-time = "2025-09-11T21:38:27.427Z" }, - { url = 
"https://files.pythonhosted.org/packages/8c/f3/6f58f841f6ebafe076cebeae33fc336e900619d34b1c93e4b5c97a81fdfa/protobuf-6.32.1-cp310-abi3-win_amd64.whl", hash = "sha256:b00a7d8c25fa471f16bc8153d0e53d6c9e827f0953f3c09aaa4331c718cae5e1", size = 435738, upload-time = "2025-09-11T21:38:30.959Z" }, - { url = "https://files.pythonhosted.org/packages/10/56/a8a3f4e7190837139e68c7002ec749190a163af3e330f65d90309145a210/protobuf-6.32.1-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:d8c7e6eb619ffdf105ee4ab76af5a68b60a9d0f66da3ea12d1640e6d8dab7281", size = 426454, upload-time = "2025-09-11T21:38:34.076Z" }, - { url = "https://files.pythonhosted.org/packages/3f/be/8dd0a927c559b37d7a6c8ab79034fd167dcc1f851595f2e641ad62be8643/protobuf-6.32.1-cp39-abi3-manylinux2014_aarch64.whl", hash = "sha256:2f5b80a49e1eb7b86d85fcd23fe92df154b9730a725c3b38c4e43b9d77018bf4", size = 322874, upload-time = "2025-09-11T21:38:35.509Z" }, - { url = "https://files.pythonhosted.org/packages/5c/f6/88d77011b605ef979aace37b7703e4eefad066f7e84d935e5a696515c2dd/protobuf-6.32.1-cp39-abi3-manylinux2014_x86_64.whl", hash = "sha256:b1864818300c297265c83a4982fd3169f97122c299f56a56e2445c3698d34710", size = 322013, upload-time = "2025-09-11T21:38:37.017Z" }, - { url = "https://files.pythonhosted.org/packages/97/b7/15cc7d93443d6c6a84626ae3258a91f4c6ac8c0edd5df35ea7658f71b79c/protobuf-6.32.1-py3-none-any.whl", hash = "sha256:2601b779fc7d32a866c6b4404f9d42a3f67c5b9f3f15b4db3cccabe06b95c346", size = 169289, upload-time = "2025-09-11T21:38:41.234Z" }, -] - -[[package]] -name = "py-cpuinfo" -version = "9.0.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/37/a8/d832f7293ebb21690860d2e01d8115e5ff6f2ae8bbdc953f0eb0fa4bd2c7/py-cpuinfo-9.0.0.tar.gz", hash = "sha256:3cdbbf3fac90dc6f118bfd64384f309edeadd902d7c8fb17f02ffa1fc3f49690", size = 104716, upload-time = "2022-10-25T20:38:06.303Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/e0/a9/023730ba63db1e494a271cb018dcd361bd2c917ba7004c3e49d5daf795a2/py_cpuinfo-9.0.0-py3-none-any.whl", hash = "sha256:859625bc251f64e21f077d099d4162689c762b5d6a4c3c97553d56241c9674d5", size = 22335, upload-time = "2022-10-25T20:38:27.636Z" }, -] - -[[package]] -name = "pycparser" -version = "2.23" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/fe/cf/d2d3b9f5699fb1e4615c8e32ff220203e43b248e1dfcc6736ad9057731ca/pycparser-2.23.tar.gz", hash = "sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2", size = 173734, upload-time = "2025-09-09T13:23:47.91Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a0/e3/59cd50310fc9b59512193629e1984c1f95e5c8ae6e5d8c69532ccc65a7fe/pycparser-2.23-py3-none-any.whl", hash = "sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934", size = 118140, upload-time = "2025-09-09T13:23:46.651Z" }, -] - -[[package]] -name = "pydantic" -version = "2.11.9" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "annotated-types" }, - { name = "pydantic-core" }, - { name = "typing-extensions" }, - { name = "typing-inspection" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ff/5d/09a551ba512d7ca404d785072700d3f6727a02f6f3c24ecfd081c7cf0aa8/pydantic-2.11.9.tar.gz", hash = "sha256:6b8ffda597a14812a7975c90b82a8a2e777d9257aba3453f973acd3c032a18e2", size = 788495, upload-time = "2025-09-13T11:26:39.325Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/3e/d3/108f2006987c58e76691d5ae5d200dd3e0f532cb4e5fa3560751c3a1feba/pydantic-2.11.9-py3-none-any.whl", hash = "sha256:c42dd626f5cfc1c6950ce6205ea58c93efa406da65f479dcb4029d5934857da2", size = 444855, upload-time = "2025-09-13T11:26:36.909Z" }, -] - -[package.optional-dependencies] -email = [ - { name = "email-validator" }, -] - -[[package]] -name = "pydantic-core" -version = "2.33.2" -source = { 
registry = "https://pypi.org/simple" } -dependencies = [ - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ad/88/5f2260bdfae97aabf98f1778d43f69574390ad787afb646292a638c923d4/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc", size = 435195, upload-time = "2025-04-23T18:33:52.104Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/3f/8d/71db63483d518cbbf290261a1fc2839d17ff89fce7089e08cad07ccfce67/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7", size = 2028584, upload-time = "2025-04-23T18:31:03.106Z" }, - { url = "https://files.pythonhosted.org/packages/24/2f/3cfa7244ae292dd850989f328722d2aef313f74ffc471184dc509e1e4e5a/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246", size = 1855071, upload-time = "2025-04-23T18:31:04.621Z" }, - { url = "https://files.pythonhosted.org/packages/b3/d3/4ae42d33f5e3f50dd467761304be2fa0a9417fbf09735bc2cce003480f2a/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f", size = 1897823, upload-time = "2025-04-23T18:31:06.377Z" }, - { url = "https://files.pythonhosted.org/packages/f4/f3/aa5976e8352b7695ff808599794b1fba2a9ae2ee954a3426855935799488/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc", size = 1983792, upload-time = "2025-04-23T18:31:07.93Z" }, - { url = "https://files.pythonhosted.org/packages/d5/7a/cda9b5a23c552037717f2b2a5257e9b2bfe45e687386df9591eff7b46d28/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de", size = 2136338, upload-time = "2025-04-23T18:31:09.283Z" }, - { url = "https://files.pythonhosted.org/packages/2b/9f/b8f9ec8dd1417eb9da784e91e1667d58a2a4a7b7b34cf4af765ef663a7e5/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a", size = 2730998, upload-time = "2025-04-23T18:31:11.7Z" }, - { url = "https://files.pythonhosted.org/packages/47/bc/cd720e078576bdb8255d5032c5d63ee5c0bf4b7173dd955185a1d658c456/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef", size = 2003200, upload-time = "2025-04-23T18:31:13.536Z" }, - { url = "https://files.pythonhosted.org/packages/ca/22/3602b895ee2cd29d11a2b349372446ae9727c32e78a94b3d588a40fdf187/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e", size = 2113890, upload-time = "2025-04-23T18:31:15.011Z" }, - { url = "https://files.pythonhosted.org/packages/ff/e6/e3c5908c03cf00d629eb38393a98fccc38ee0ce8ecce32f69fc7d7b558a7/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d", size = 2073359, upload-time = "2025-04-23T18:31:16.393Z" }, - { url = "https://files.pythonhosted.org/packages/12/e7/6a36a07c59ebefc8777d1ffdaf5ae71b06b21952582e4b07eba88a421c79/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30", size = 2245883, upload-time = "2025-04-23T18:31:17.892Z" }, - { url = "https://files.pythonhosted.org/packages/16/3f/59b3187aaa6cc0c1e6616e8045b284de2b6a87b027cce2ffcea073adf1d2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = 
"sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf", size = 2241074, upload-time = "2025-04-23T18:31:19.205Z" }, - { url = "https://files.pythonhosted.org/packages/e0/ed/55532bb88f674d5d8f67ab121a2a13c385df382de2a1677f30ad385f7438/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51", size = 1910538, upload-time = "2025-04-23T18:31:20.541Z" }, - { url = "https://files.pythonhosted.org/packages/fe/1b/25b7cccd4519c0b23c2dd636ad39d381abf113085ce4f7bec2b0dc755eb1/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab", size = 1952909, upload-time = "2025-04-23T18:31:22.371Z" }, - { url = "https://files.pythonhosted.org/packages/49/a9/d809358e49126438055884c4366a1f6227f0f84f635a9014e2deb9b9de54/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65", size = 1897786, upload-time = "2025-04-23T18:31:24.161Z" }, - { url = "https://files.pythonhosted.org/packages/18/8a/2b41c97f554ec8c71f2a8a5f85cb56a8b0956addfe8b0efb5b3d77e8bdc3/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc", size = 2009000, upload-time = "2025-04-23T18:31:25.863Z" }, - { url = "https://files.pythonhosted.org/packages/a1/02/6224312aacb3c8ecbaa959897af57181fb6cf3a3d7917fd44d0f2917e6f2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7", size = 1847996, upload-time = "2025-04-23T18:31:27.341Z" }, - { url = "https://files.pythonhosted.org/packages/d6/46/6dcdf084a523dbe0a0be59d054734b86a981726f221f4562aed313dbcb49/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025", size = 1880957, 
upload-time = "2025-04-23T18:31:28.956Z" }, - { url = "https://files.pythonhosted.org/packages/ec/6b/1ec2c03837ac00886ba8160ce041ce4e325b41d06a034adbef11339ae422/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011", size = 1964199, upload-time = "2025-04-23T18:31:31.025Z" }, - { url = "https://files.pythonhosted.org/packages/2d/1d/6bf34d6adb9debd9136bd197ca72642203ce9aaaa85cfcbfcf20f9696e83/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f", size = 2120296, upload-time = "2025-04-23T18:31:32.514Z" }, - { url = "https://files.pythonhosted.org/packages/e0/94/2bd0aaf5a591e974b32a9f7123f16637776c304471a0ab33cf263cf5591a/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88", size = 2676109, upload-time = "2025-04-23T18:31:33.958Z" }, - { url = "https://files.pythonhosted.org/packages/f9/41/4b043778cf9c4285d59742281a769eac371b9e47e35f98ad321349cc5d61/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1", size = 2002028, upload-time = "2025-04-23T18:31:39.095Z" }, - { url = "https://files.pythonhosted.org/packages/cb/d5/7bb781bf2748ce3d03af04d5c969fa1308880e1dca35a9bd94e1a96a922e/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b", size = 2100044, upload-time = "2025-04-23T18:31:41.034Z" }, - { url = "https://files.pythonhosted.org/packages/fe/36/def5e53e1eb0ad896785702a5bbfd25eed546cdcf4087ad285021a90ed53/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = 
"sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1", size = 2058881, upload-time = "2025-04-23T18:31:42.757Z" }, - { url = "https://files.pythonhosted.org/packages/01/6c/57f8d70b2ee57fc3dc8b9610315949837fa8c11d86927b9bb044f8705419/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6", size = 2227034, upload-time = "2025-04-23T18:31:44.304Z" }, - { url = "https://files.pythonhosted.org/packages/27/b9/9c17f0396a82b3d5cbea4c24d742083422639e7bb1d5bf600e12cb176a13/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea", size = 2234187, upload-time = "2025-04-23T18:31:45.891Z" }, - { url = "https://files.pythonhosted.org/packages/b0/6a/adf5734ffd52bf86d865093ad70b2ce543415e0e356f6cacabbc0d9ad910/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290", size = 1892628, upload-time = "2025-04-23T18:31:47.819Z" }, - { url = "https://files.pythonhosted.org/packages/43/e4/5479fecb3606c1368d496a825d8411e126133c41224c1e7238be58b87d7e/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2", size = 1955866, upload-time = "2025-04-23T18:31:49.635Z" }, - { url = "https://files.pythonhosted.org/packages/0d/24/8b11e8b3e2be9dd82df4b11408a67c61bb4dc4f8e11b5b0fc888b38118b5/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab", size = 1888894, upload-time = "2025-04-23T18:31:51.609Z" }, - { url = "https://files.pythonhosted.org/packages/46/8c/99040727b41f56616573a28771b1bfa08a3d3fe74d3d513f01251f79f172/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f", size = 2015688, upload-time = 
"2025-04-23T18:31:53.175Z" }, - { url = "https://files.pythonhosted.org/packages/3a/cc/5999d1eb705a6cefc31f0b4a90e9f7fc400539b1a1030529700cc1b51838/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6", size = 1844808, upload-time = "2025-04-23T18:31:54.79Z" }, - { url = "https://files.pythonhosted.org/packages/6f/5e/a0a7b8885c98889a18b6e376f344da1ef323d270b44edf8174d6bce4d622/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef", size = 1885580, upload-time = "2025-04-23T18:31:57.393Z" }, - { url = "https://files.pythonhosted.org/packages/3b/2a/953581f343c7d11a304581156618c3f592435523dd9d79865903272c256a/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a", size = 1973859, upload-time = "2025-04-23T18:31:59.065Z" }, - { url = "https://files.pythonhosted.org/packages/e6/55/f1a813904771c03a3f97f676c62cca0c0a4138654107c1b61f19c644868b/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916", size = 2120810, upload-time = "2025-04-23T18:32:00.78Z" }, - { url = "https://files.pythonhosted.org/packages/aa/c3/053389835a996e18853ba107a63caae0b9deb4a276c6b472931ea9ae6e48/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a", size = 2676498, upload-time = "2025-04-23T18:32:02.418Z" }, - { url = "https://files.pythonhosted.org/packages/eb/3c/f4abd740877a35abade05e437245b192f9d0ffb48bbbbd708df33d3cda37/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d", 
size = 2000611, upload-time = "2025-04-23T18:32:04.152Z" }, - { url = "https://files.pythonhosted.org/packages/59/a7/63ef2fed1837d1121a894d0ce88439fe3e3b3e48c7543b2a4479eb99c2bd/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56", size = 2107924, upload-time = "2025-04-23T18:32:06.129Z" }, - { url = "https://files.pythonhosted.org/packages/04/8f/2551964ef045669801675f1cfc3b0d74147f4901c3ffa42be2ddb1f0efc4/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5", size = 2063196, upload-time = "2025-04-23T18:32:08.178Z" }, - { url = "https://files.pythonhosted.org/packages/26/bd/d9602777e77fc6dbb0c7db9ad356e9a985825547dce5ad1d30ee04903918/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e", size = 2236389, upload-time = "2025-04-23T18:32:10.242Z" }, - { url = "https://files.pythonhosted.org/packages/42/db/0e950daa7e2230423ab342ae918a794964b053bec24ba8af013fc7c94846/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162", size = 2239223, upload-time = "2025-04-23T18:32:12.382Z" }, - { url = "https://files.pythonhosted.org/packages/58/4d/4f937099c545a8a17eb52cb67fe0447fd9a373b348ccfa9a87f141eeb00f/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849", size = 1900473, upload-time = "2025-04-23T18:32:14.034Z" }, - { url = "https://files.pythonhosted.org/packages/a0/75/4a0a9bac998d78d889def5e4ef2b065acba8cae8c93696906c3a91f310ca/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9", size = 1955269, upload-time = "2025-04-23T18:32:15.783Z" }, - { url = 
"https://files.pythonhosted.org/packages/f9/86/1beda0576969592f1497b4ce8e7bc8cbdf614c352426271b1b10d5f0aa64/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9", size = 1893921, upload-time = "2025-04-23T18:32:18.473Z" }, - { url = "https://files.pythonhosted.org/packages/a4/7d/e09391c2eebeab681df2b74bfe6c43422fffede8dc74187b2b0bf6fd7571/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac", size = 1806162, upload-time = "2025-04-23T18:32:20.188Z" }, - { url = "https://files.pythonhosted.org/packages/f1/3d/847b6b1fed9f8ed3bb95a9ad04fbd0b212e832d4f0f50ff4d9ee5a9f15cf/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5", size = 1981560, upload-time = "2025-04-23T18:32:22.354Z" }, - { url = "https://files.pythonhosted.org/packages/6f/9a/e73262f6c6656262b5fdd723ad90f518f579b7bc8622e43a942eec53c938/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9", size = 1935777, upload-time = "2025-04-23T18:32:25.088Z" }, - { url = "https://files.pythonhosted.org/packages/7b/27/d4ae6487d73948d6f20dddcd94be4ea43e74349b56eba82e9bdee2d7494c/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8", size = 2025200, upload-time = "2025-04-23T18:33:14.199Z" }, - { url = "https://files.pythonhosted.org/packages/f1/b8/b3cb95375f05d33801024079b9392a5ab45267a63400bf1866e7ce0f0de4/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593", size = 1859123, upload-time = "2025-04-23T18:33:16.555Z" }, - { url = 
"https://files.pythonhosted.org/packages/05/bc/0d0b5adeda59a261cd30a1235a445bf55c7e46ae44aea28f7bd6ed46e091/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612", size = 1892852, upload-time = "2025-04-23T18:33:18.513Z" }, - { url = "https://files.pythonhosted.org/packages/3e/11/d37bdebbda2e449cb3f519f6ce950927b56d62f0b84fd9cb9e372a26a3d5/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7", size = 2067484, upload-time = "2025-04-23T18:33:20.475Z" }, - { url = "https://files.pythonhosted.org/packages/8c/55/1f95f0a05ce72ecb02a8a8a1c3be0579bbc29b1d5ab68f1378b7bebc5057/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e", size = 2108896, upload-time = "2025-04-23T18:33:22.501Z" }, - { url = "https://files.pythonhosted.org/packages/53/89/2b2de6c81fa131f423246a9109d7b2a375e83968ad0800d6e57d0574629b/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8", size = 2069475, upload-time = "2025-04-23T18:33:24.528Z" }, - { url = "https://files.pythonhosted.org/packages/b8/e9/1f7efbe20d0b2b10f6718944b5d8ece9152390904f29a78e68d4e7961159/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf", size = 2239013, upload-time = "2025-04-23T18:33:26.621Z" }, - { url = "https://files.pythonhosted.org/packages/3c/b2/5309c905a93811524a49b4e031e9851a6b00ff0fb668794472ea7746b448/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb", size = 2238715, upload-time = 
"2025-04-23T18:33:28.656Z" }, - { url = "https://files.pythonhosted.org/packages/32/56/8a7ca5d2cd2cda1d245d34b1c9a942920a718082ae8e54e5f3e5a58b7add/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1", size = 2066757, upload-time = "2025-04-23T18:33:30.645Z" }, -] - -[[package]] -name = "pydantic-settings" -version = "2.10.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pydantic" }, - { name = "python-dotenv" }, - { name = "typing-inspection" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/68/85/1ea668bbab3c50071ca613c6ab30047fb36ab0da1b92fa8f17bbc38fd36c/pydantic_settings-2.10.1.tar.gz", hash = "sha256:06f0062169818d0f5524420a360d632d5857b83cffd4d42fe29597807a1614ee", size = 172583, upload-time = "2025-06-24T13:26:46.841Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/58/f0/427018098906416f580e3cf1366d3b1abfb408a0652e9f31600c24a1903c/pydantic_settings-2.10.1-py3-none-any.whl", hash = "sha256:a60952460b99cf661dc25c29c0ef171721f98bfcb52ef8d9ea4c943d7c8cc796", size = 45235, upload-time = "2025-06-24T13:26:45.485Z" }, -] - -[[package]] -name = "pygments" -version = "2.19.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" }, -] - -[[package]] -name = "pyperclip" -version = "1.10.0" -source = { registry = "https://pypi.org/simple" 
} -sdist = { url = "https://files.pythonhosted.org/packages/15/99/25f4898cf420efb6f45f519de018f4faea5391114a8618b16736ef3029f1/pyperclip-1.10.0.tar.gz", hash = "sha256:180c8346b1186921c75dfd14d9048a6b5d46bfc499778811952c6dd6eb1ca6be", size = 12193, upload-time = "2025-09-18T00:54:00.384Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1e/bc/22540e73c5f5ae18f02924cd3954a6c9a4aa6b713c841a94c98335d333a1/pyperclip-1.10.0-py3-none-any.whl", hash = "sha256:596fbe55dc59263bff26e61d2afbe10223e2fccb5210c9c96a28d6887cfcc7ec", size = 11062, upload-time = "2025-09-18T00:53:59.252Z" }, -] - -[[package]] -name = "pytest" -version = "8.4.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "colorama", marker = "sys_platform == 'win32'" }, - { name = "iniconfig" }, - { name = "packaging" }, - { name = "pluggy" }, - { name = "pygments" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/a3/5c/00a0e072241553e1a7496d638deababa67c5058571567b92a7eaa258397c/pytest-8.4.2.tar.gz", hash = "sha256:86c0d0b93306b961d58d62a4db4879f27fe25513d4b969df351abdddb3c30e01", size = 1519618, upload-time = "2025-09-04T14:34:22.711Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a8/a4/20da314d277121d6534b3a980b29035dcd51e6744bd79075a6ce8fa4eb8d/pytest-8.4.2-py3-none-any.whl", hash = "sha256:872f880de3fc3a5bdc88a11b39c9710c3497a547cfa9320bc3c5e62fbf272e79", size = 365750, upload-time = "2025-09-04T14:34:20.226Z" }, -] - -[[package]] -name = "pytest-asyncio" -version = "1.2.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pytest" }, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/42/86/9e3c5f48f7b7b638b216e4b9e645f54d199d7abbbab7a64a13b4e12ba10f/pytest_asyncio-1.2.0.tar.gz", hash = "sha256:c609a64a2a8768462d0c99811ddb8bd2583c33fd33cf7f21af1c142e824ffb57", size = 50119, upload-time = "2025-09-12T07:33:53.816Z" 
} -wheels = [ - { url = "https://files.pythonhosted.org/packages/04/93/2fa34714b7a4ae72f2f8dad66ba17dd9a2c793220719e736dda28b7aec27/pytest_asyncio-1.2.0-py3-none-any.whl", hash = "sha256:8e17ae5e46d8e7efe51ab6494dd2010f4ca8dae51652aa3c8d55acf50bfb2e99", size = 15095, upload-time = "2025-09-12T07:33:52.639Z" }, -] - -[[package]] -name = "pytest-benchmark" -version = "5.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "py-cpuinfo" }, - { name = "pytest" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/39/d0/a8bd08d641b393db3be3819b03e2d9bb8760ca8479080a26a5f6e540e99c/pytest-benchmark-5.1.0.tar.gz", hash = "sha256:9ea661cdc292e8231f7cd4c10b0319e56a2118e2c09d9f50e1b3d150d2aca105", size = 337810, upload-time = "2024-10-30T11:51:48.521Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/9e/d6/b41653199ea09d5969d4e385df9bbfd9a100f28ca7e824ce7c0a016e3053/pytest_benchmark-5.1.0-py3-none-any.whl", hash = "sha256:922de2dfa3033c227c96da942d1878191afa135a29485fb942e85dff1c592c89", size = 44259, upload-time = "2024-10-30T11:51:45.94Z" }, -] - -[[package]] -name = "pytest-cov" -version = "7.0.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "coverage", extra = ["toml"] }, - { name = "pluggy" }, - { name = "pytest" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5e/f7/c933acc76f5208b3b00089573cf6a2bc26dc80a8aece8f52bb7d6b1855ca/pytest_cov-7.0.0.tar.gz", hash = "sha256:33c97eda2e049a0c5298e91f519302a1334c26ac65c1a483d6206fd458361af1", size = 54328, upload-time = "2025-09-09T10:57:02.113Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ee/49/1377b49de7d0c1ce41292161ea0f721913fa8722c19fb9c1e3aa0367eecb/pytest_cov-7.0.0-py3-none-any.whl", hash = "sha256:3b8e9558b16cc1479da72058bdecf8073661c7f57f7d3c5f22a1c23507f2d861", size = 22424, upload-time = "2025-09-09T10:57:00.695Z" }, -] - -[[package]] -name = "pytest-mock" -version = "3.15.1" -source = { 
registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pytest" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/68/14/eb014d26be205d38ad5ad20d9a80f7d201472e08167f0bb4361e251084a9/pytest_mock-3.15.1.tar.gz", hash = "sha256:1849a238f6f396da19762269de72cb1814ab44416fa73a8686deac10b0d87a0f", size = 34036, upload-time = "2025-09-16T16:37:27.081Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/5a/cc/06253936f4a7fa2e0f48dfe6d851d9c56df896a9ab09ac019d70b760619c/pytest_mock-3.15.1-py3-none-any.whl", hash = "sha256:0a25e2eb88fe5168d535041d09a4529a188176ae608a6d249ee65abc0949630d", size = 10095, upload-time = "2025-09-16T16:37:25.734Z" }, -] - -[[package]] -name = "pytest-xdist" -version = "3.8.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "execnet" }, - { name = "pytest" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/78/b4/439b179d1ff526791eb921115fca8e44e596a13efeda518b9d845a619450/pytest_xdist-3.8.0.tar.gz", hash = "sha256:7e578125ec9bc6050861aa93f2d59f1d8d085595d6551c2c90b6f4fad8d3a9f1", size = 88069, upload-time = "2025-07-01T13:30:59.346Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ca/31/d4e37e9e550c2b92a9cbc2e4d0b7420a27224968580b5a447f420847c975/pytest_xdist-3.8.0-py3-none-any.whl", hash = "sha256:202ca578cfeb7370784a8c33d6d05bc6e13b4f25b5053c30a152269fd10f0b88", size = 46396, upload-time = "2025-07-01T13:30:56.632Z" }, -] - -[[package]] -name = "python-dateutil" -version = "2.9.0.post0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "six" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" }, -] - -[[package]] -name = "python-dotenv" -version = "1.1.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f6/b0/4bc07ccd3572a2f9df7e6782f52b0c6c90dcbb803ac4a167702d7d0dfe1e/python_dotenv-1.1.1.tar.gz", hash = "sha256:a8a6399716257f45be6a007360200409fce5cda2661e3dec71d23dc15f6189ab", size = 41978, upload-time = "2025-06-24T04:21:07.341Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/5f/ed/539768cf28c661b5b068d66d96a2f155c4971a5d55684a514c1a0e0dec2f/python_dotenv-1.1.1-py3-none-any.whl", hash = "sha256:31f23644fe2602f88ff55e1f5c79ba497e01224ee7737937930c448e4d0e24dc", size = 20556, upload-time = "2025-06-24T04:21:06.073Z" }, -] - -[[package]] -name = "python-multipart" -version = "0.0.20" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f3/87/f44d7c9f274c7ee665a29b885ec97089ec5dc034c7f3fafa03da9e39a09e/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13", size = 37158, upload-time = "2024-12-16T19:45:46.972Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546, upload-time = "2024-12-16T19:45:44.423Z" }, -] - -[[package]] -name = "pywin32" -version = "311" -source = { registry = "https://pypi.org/simple" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/7c/af/449a6a91e5d6db51420875c54f6aff7c97a86a3b13a0b4f1a5c13b988de3/pywin32-311-cp311-cp311-win32.whl", hash = "sha256:184eb5e436dea364dcd3d2316d577d625c0351bf237c4e9a5fabbcfa5a58b151", size = 8697031, upload-time = "2025-07-14T20:13:13.266Z" }, - { url = "https://files.pythonhosted.org/packages/51/8f/9bb81dd5bb77d22243d33c8397f09377056d5c687aa6d4042bea7fbf8364/pywin32-311-cp311-cp311-win_amd64.whl", hash = "sha256:3ce80b34b22b17ccbd937a6e78e7225d80c52f5ab9940fe0506a1a16f3dab503", size = 9508308, upload-time = "2025-07-14T20:13:15.147Z" }, - { url = "https://files.pythonhosted.org/packages/44/7b/9c2ab54f74a138c491aba1b1cd0795ba61f144c711daea84a88b63dc0f6c/pywin32-311-cp311-cp311-win_arm64.whl", hash = "sha256:a733f1388e1a842abb67ffa8e7aad0e70ac519e09b0f6a784e65a136ec7cefd2", size = 8703930, upload-time = "2025-07-14T20:13:16.945Z" }, - { url = "https://files.pythonhosted.org/packages/e7/ab/01ea1943d4eba0f850c3c61e78e8dd59757ff815ff3ccd0a84de5f541f42/pywin32-311-cp312-cp312-win32.whl", hash = "sha256:750ec6e621af2b948540032557b10a2d43b0cee2ae9758c54154d711cc852d31", size = 8706543, upload-time = "2025-07-14T20:13:20.765Z" }, - { url = "https://files.pythonhosted.org/packages/d1/a8/a0e8d07d4d051ec7502cd58b291ec98dcc0c3fff027caad0470b72cfcc2f/pywin32-311-cp312-cp312-win_amd64.whl", hash = "sha256:b8c095edad5c211ff31c05223658e71bf7116daa0ecf3ad85f3201ea3190d067", size = 9495040, upload-time = "2025-07-14T20:13:22.543Z" }, - { url = "https://files.pythonhosted.org/packages/ba/3a/2ae996277b4b50f17d61f0603efd8253cb2d79cc7ae159468007b586396d/pywin32-311-cp312-cp312-win_arm64.whl", hash = "sha256:e286f46a9a39c4a18b319c28f59b61de793654af2f395c102b4f819e584b5852", size = 8710102, upload-time = "2025-07-14T20:13:24.682Z" }, - { url = "https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = 
"sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload-time = "2025-07-14T20:13:26.471Z" }, - { url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload-time = "2025-07-14T20:13:28.243Z" }, - { url = "https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload-time = "2025-07-14T20:13:30.348Z" }, - { url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = "sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload-time = "2025-07-14T20:13:32.449Z" }, - { url = "https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload-time = "2025-07-14T20:13:34.312Z" }, - { url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload-time = "2025-07-14T20:13:36.379Z" }, -] - -[[package]] -name = "pyyaml" -version = "6.0.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631, upload-time = 
"2024-08-06T20:33:50.674Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612, upload-time = "2024-08-06T20:32:03.408Z" }, - { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040, upload-time = "2024-08-06T20:32:04.926Z" }, - { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829, upload-time = "2024-08-06T20:32:06.459Z" }, - { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167, upload-time = "2024-08-06T20:32:08.338Z" }, - { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952, upload-time = "2024-08-06T20:32:14.124Z" }, - { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301, upload-time = "2024-08-06T20:32:16.17Z" }, - { url = 
"https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638, upload-time = "2024-08-06T20:32:18.555Z" }, - { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850, upload-time = "2024-08-06T20:32:19.889Z" }, - { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980, upload-time = "2024-08-06T20:32:21.273Z" }, - { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873, upload-time = "2024-08-06T20:32:25.131Z" }, - { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302, upload-time = "2024-08-06T20:32:26.511Z" }, - { url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154, upload-time = "2024-08-06T20:32:28.363Z" }, - { url = 
"https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223, upload-time = "2024-08-06T20:32:30.058Z" }, - { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542, upload-time = "2024-08-06T20:32:31.881Z" }, - { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164, upload-time = "2024-08-06T20:32:37.083Z" }, - { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611, upload-time = "2024-08-06T20:32:38.898Z" }, - { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591, upload-time = "2024-08-06T20:32:40.241Z" }, - { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338, upload-time = "2024-08-06T20:32:41.93Z" }, - { url = 
"https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309, upload-time = "2024-08-06T20:32:43.4Z" }, - { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679, upload-time = "2024-08-06T20:32:44.801Z" }, - { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428, upload-time = "2024-08-06T20:32:46.432Z" }, - { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361, upload-time = "2024-08-06T20:32:51.188Z" }, - { url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523, upload-time = "2024-08-06T20:32:53.019Z" }, - { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660, upload-time = "2024-08-06T20:32:54.708Z" }, - { url = 
"https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597, upload-time = "2024-08-06T20:32:56.985Z" }, - { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527, upload-time = "2024-08-06T20:33:03.001Z" }, - { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload-time = "2024-08-06T20:33:04.33Z" }, -] - -[[package]] -name = "referencing" -version = "0.36.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "attrs" }, - { name = "rpds-py" }, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/2f/db/98b5c277be99dd18bfd91dd04e1b759cad18d1a338188c936e92f921c7e2/referencing-0.36.2.tar.gz", hash = "sha256:df2e89862cd09deabbdba16944cc3f10feb6b3e6f18e902f7cc25609a34775aa", size = 74744, upload-time = "2025-01-25T08:48:16.138Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c1/b1/3baf80dc6d2b7bc27a95a67752d0208e410351e3feb4eb78de5f77454d8d/referencing-0.36.2-py3-none-any.whl", hash = "sha256:e8699adbbf8b5c7de96d8ffa0eb5c158b3beafce084968e2ea8bb08c6794dcd0", size = 26775, upload-time = "2025-01-25T08:48:14.241Z" }, -] - -[[package]] -name = "requests" -version = "2.32.5" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "certifi" }, - { name = "charset-normalizer" }, - { name = "idna" }, - { name = "urllib3" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" }, -] - -[[package]] -name = "rfc3339-validator" -version = "0.1.4" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "six" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/28/ea/a9387748e2d111c3c2b275ba970b735e04e15cdb1eb30693b6b5708c4dbd/rfc3339_validator-0.1.4.tar.gz", hash = "sha256:138a2abdf93304ad60530167e51d2dfb9549521a836871b88d7f4695d0022f6b", size = 5513, upload-time = "2021-05-12T16:37:54.178Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl", hash = "sha256:24f6ec1eda14ef823da9e36ec7113124b39c04d50a4d3d3a3c2859577e7791fa", size = 3490, upload-time = "2021-05-12T16:37:52.536Z" }, -] - -[[package]] -name = "rich" -version = "14.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "markdown-it-py" }, - { name = "pygments" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/fe/75/af448d8e52bf1d8fa6a9d089ca6c07ff4453d86c65c145d0a300bb073b9b/rich-14.1.0.tar.gz", hash = "sha256:e497a48b844b0320d45007cdebfeaeed8db2a4f4bcf49f15e455cfc4af11eaa8", size = 224441, upload-time = "2025-07-25T07:32:58.125Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e3/30/3c4d035596d3cf444529e0b2953ad0466f6049528a879d27534700580395/rich-14.1.0-py3-none-any.whl", hash = 
"sha256:536f5f1785986d6dbdea3c75205c473f970777b4a0d6c6dd1b696aa05a3fa04f", size = 243368, upload-time = "2025-07-25T07:32:56.73Z" }, -] - -[[package]] -name = "rich-rst" -version = "1.3.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "docutils" }, - { name = "rich" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/b0/69/5514c3a87b5f10f09a34bb011bc0927bc12c596c8dae5915604e71abc386/rich_rst-1.3.1.tar.gz", hash = "sha256:fad46e3ba42785ea8c1785e2ceaa56e0ffa32dbe5410dec432f37e4107c4f383", size = 13839, upload-time = "2024-04-30T04:40:38.125Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/fd/bc/cc4e3dbc5e7992398dcb7a8eda0cbcf4fb792a0cdb93f857b478bf3cf884/rich_rst-1.3.1-py3-none-any.whl", hash = "sha256:498a74e3896507ab04492d326e794c3ef76e7cda078703aa592d1853d91098c1", size = 11621, upload-time = "2024-04-30T04:40:32.619Z" }, -] - -[[package]] -name = "rpds-py" -version = "0.27.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e9/dd/2c0cbe774744272b0ae725f44032c77bdcab6e8bcf544bffa3b6e70c8dba/rpds_py-0.27.1.tar.gz", hash = "sha256:26a1c73171d10b7acccbded82bf6a586ab8203601e565badc74bbbf8bc5a10f8", size = 27479, upload-time = "2025-08-27T12:16:36.024Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b5/c1/7907329fbef97cbd49db6f7303893bd1dd5a4a3eae415839ffdfb0762cae/rpds_py-0.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:be898f271f851f68b318872ce6ebebbc62f303b654e43bf72683dbdc25b7c881", size = 371063, upload-time = "2025-08-27T12:12:47.856Z" }, - { url = "https://files.pythonhosted.org/packages/11/94/2aab4bc86228bcf7c48760990273653a4900de89c7537ffe1b0d6097ed39/rpds_py-0.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:62ac3d4e3e07b58ee0ddecd71d6ce3b1637de2d373501412df395a0ec5f9beb5", size = 353210, upload-time = "2025-08-27T12:12:49.187Z" }, - { url = 
"https://files.pythonhosted.org/packages/3a/57/f5eb3ecf434342f4f1a46009530e93fd201a0b5b83379034ebdb1d7c1a58/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4708c5c0ceb2d034f9991623631d3d23cb16e65c83736ea020cdbe28d57c0a0e", size = 381636, upload-time = "2025-08-27T12:12:50.492Z" }, - { url = "https://files.pythonhosted.org/packages/ae/f4/ef95c5945e2ceb5119571b184dd5a1cc4b8541bbdf67461998cfeac9cb1e/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:abfa1171a9952d2e0002aba2ad3780820b00cc3d9c98c6630f2e93271501f66c", size = 394341, upload-time = "2025-08-27T12:12:52.024Z" }, - { url = "https://files.pythonhosted.org/packages/5a/7e/4bd610754bf492d398b61725eb9598ddd5eb86b07d7d9483dbcd810e20bc/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4b507d19f817ebaca79574b16eb2ae412e5c0835542c93fe9983f1e432aca195", size = 523428, upload-time = "2025-08-27T12:12:53.779Z" }, - { url = "https://files.pythonhosted.org/packages/9f/e5/059b9f65a8c9149361a8b75094864ab83b94718344db511fd6117936ed2a/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:168b025f8fd8d8d10957405f3fdcef3dc20f5982d398f90851f4abc58c566c52", size = 402923, upload-time = "2025-08-27T12:12:55.15Z" }, - { url = "https://files.pythonhosted.org/packages/f5/48/64cabb7daced2968dd08e8a1b7988bf358d7bd5bcd5dc89a652f4668543c/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cb56c6210ef77caa58e16e8c17d35c63fe3f5b60fd9ba9d424470c3400bcf9ed", size = 384094, upload-time = "2025-08-27T12:12:57.194Z" }, - { url = "https://files.pythonhosted.org/packages/ae/e1/dc9094d6ff566bff87add8a510c89b9e158ad2ecd97ee26e677da29a9e1b/rpds_py-0.27.1-cp311-cp311-manylinux_2_31_riscv64.whl", hash = "sha256:d252f2d8ca0195faa707f8eb9368955760880b2b42a8ee16d382bf5dd807f89a", size = 401093, upload-time = "2025-08-27T12:12:58.985Z" }, - { url = 
"https://files.pythonhosted.org/packages/37/8e/ac8577e3ecdd5593e283d46907d7011618994e1d7ab992711ae0f78b9937/rpds_py-0.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6e5e54da1e74b91dbc7996b56640f79b195d5925c2b78efaa8c5d53e1d88edde", size = 417969, upload-time = "2025-08-27T12:13:00.367Z" }, - { url = "https://files.pythonhosted.org/packages/66/6d/87507430a8f74a93556fe55c6485ba9c259949a853ce407b1e23fea5ba31/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ffce0481cc6e95e5b3f0a47ee17ffbd234399e6d532f394c8dce320c3b089c21", size = 558302, upload-time = "2025-08-27T12:13:01.737Z" }, - { url = "https://files.pythonhosted.org/packages/3a/bb/1db4781ce1dda3eecc735e3152659a27b90a02ca62bfeea17aee45cc0fbc/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:a205fdfe55c90c2cd8e540ca9ceba65cbe6629b443bc05db1f590a3db8189ff9", size = 589259, upload-time = "2025-08-27T12:13:03.127Z" }, - { url = "https://files.pythonhosted.org/packages/7b/0e/ae1c8943d11a814d01b482e1f8da903f88047a962dff9bbdadf3bd6e6fd1/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:689fb5200a749db0415b092972e8eba85847c23885c8543a8b0f5c009b1a5948", size = 554983, upload-time = "2025-08-27T12:13:04.516Z" }, - { url = "https://files.pythonhosted.org/packages/b2/d5/0b2a55415931db4f112bdab072443ff76131b5ac4f4dc98d10d2d357eb03/rpds_py-0.27.1-cp311-cp311-win32.whl", hash = "sha256:3182af66048c00a075010bc7f4860f33913528a4b6fc09094a6e7598e462fe39", size = 217154, upload-time = "2025-08-27T12:13:06.278Z" }, - { url = "https://files.pythonhosted.org/packages/24/75/3b7ffe0d50dc86a6a964af0d1cc3a4a2cdf437cb7b099a4747bbb96d1819/rpds_py-0.27.1-cp311-cp311-win_amd64.whl", hash = "sha256:b4938466c6b257b2f5c4ff98acd8128ec36b5059e5c8f8372d79316b1c36bb15", size = 228627, upload-time = "2025-08-27T12:13:07.625Z" }, - { url = 
"https://files.pythonhosted.org/packages/8d/3f/4fd04c32abc02c710f09a72a30c9a55ea3cc154ef8099078fd50a0596f8e/rpds_py-0.27.1-cp311-cp311-win_arm64.whl", hash = "sha256:2f57af9b4d0793e53266ee4325535a31ba48e2f875da81a9177c9926dfa60746", size = 220998, upload-time = "2025-08-27T12:13:08.972Z" }, - { url = "https://files.pythonhosted.org/packages/bd/fe/38de28dee5df58b8198c743fe2bea0c785c6d40941b9950bac4cdb71a014/rpds_py-0.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:ae2775c1973e3c30316892737b91f9283f9908e3cc7625b9331271eaaed7dc90", size = 361887, upload-time = "2025-08-27T12:13:10.233Z" }, - { url = "https://files.pythonhosted.org/packages/7c/9a/4b6c7eedc7dd90986bf0fab6ea2a091ec11c01b15f8ba0a14d3f80450468/rpds_py-0.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2643400120f55c8a96f7c9d858f7be0c88d383cd4653ae2cf0d0c88f668073e5", size = 345795, upload-time = "2025-08-27T12:13:11.65Z" }, - { url = "https://files.pythonhosted.org/packages/6f/0e/e650e1b81922847a09cca820237b0edee69416a01268b7754d506ade11ad/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16323f674c089b0360674a4abd28d5042947d54ba620f72514d69be4ff64845e", size = 385121, upload-time = "2025-08-27T12:13:13.008Z" }, - { url = "https://files.pythonhosted.org/packages/1b/ea/b306067a712988e2bff00dcc7c8f31d26c29b6d5931b461aa4b60a013e33/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9a1f4814b65eacac94a00fc9a526e3fdafd78e439469644032032d0d63de4881", size = 398976, upload-time = "2025-08-27T12:13:14.368Z" }, - { url = "https://files.pythonhosted.org/packages/2c/0a/26dc43c8840cb8fe239fe12dbc8d8de40f2365e838f3d395835dde72f0e5/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ba32c16b064267b22f1850a34051121d423b6f7338a12b9459550eb2096e7ec", size = 525953, upload-time = "2025-08-27T12:13:15.774Z" }, - { url = 
"https://files.pythonhosted.org/packages/22/14/c85e8127b573aaf3a0cbd7fbb8c9c99e735a4a02180c84da2a463b766e9e/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e5c20f33fd10485b80f65e800bbe5f6785af510b9f4056c5a3c612ebc83ba6cb", size = 407915, upload-time = "2025-08-27T12:13:17.379Z" }, - { url = "https://files.pythonhosted.org/packages/ed/7b/8f4fee9ba1fb5ec856eb22d725a4efa3deb47f769597c809e03578b0f9d9/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:466bfe65bd932da36ff279ddd92de56b042f2266d752719beb97b08526268ec5", size = 386883, upload-time = "2025-08-27T12:13:18.704Z" }, - { url = "https://files.pythonhosted.org/packages/86/47/28fa6d60f8b74fcdceba81b272f8d9836ac0340570f68f5df6b41838547b/rpds_py-0.27.1-cp312-cp312-manylinux_2_31_riscv64.whl", hash = "sha256:41e532bbdcb57c92ba3be62c42e9f096431b4cf478da9bc3bc6ce5c38ab7ba7a", size = 405699, upload-time = "2025-08-27T12:13:20.089Z" }, - { url = "https://files.pythonhosted.org/packages/d0/fd/c5987b5e054548df56953a21fe2ebed51fc1ec7c8f24fd41c067b68c4a0a/rpds_py-0.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f149826d742b406579466283769a8ea448eed82a789af0ed17b0cd5770433444", size = 423713, upload-time = "2025-08-27T12:13:21.436Z" }, - { url = "https://files.pythonhosted.org/packages/ac/ba/3c4978b54a73ed19a7d74531be37a8bcc542d917c770e14d372b8daea186/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:80c60cfb5310677bd67cb1e85a1e8eb52e12529545441b43e6f14d90b878775a", size = 562324, upload-time = "2025-08-27T12:13:22.789Z" }, - { url = "https://files.pythonhosted.org/packages/b5/6c/6943a91768fec16db09a42b08644b960cff540c66aab89b74be6d4a144ba/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:7ee6521b9baf06085f62ba9c7a3e5becffbc32480d2f1b351559c001c38ce4c1", size = 593646, upload-time = "2025-08-27T12:13:24.122Z" }, - { url = 
"https://files.pythonhosted.org/packages/11/73/9d7a8f4be5f4396f011a6bb7a19fe26303a0dac9064462f5651ced2f572f/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a512c8263249a9d68cac08b05dd59d2b3f2061d99b322813cbcc14c3c7421998", size = 558137, upload-time = "2025-08-27T12:13:25.557Z" }, - { url = "https://files.pythonhosted.org/packages/6e/96/6772cbfa0e2485bcceef8071de7821f81aeac8bb45fbfd5542a3e8108165/rpds_py-0.27.1-cp312-cp312-win32.whl", hash = "sha256:819064fa048ba01b6dadc5116f3ac48610435ac9a0058bbde98e569f9e785c39", size = 221343, upload-time = "2025-08-27T12:13:26.967Z" }, - { url = "https://files.pythonhosted.org/packages/67/b6/c82f0faa9af1c6a64669f73a17ee0eeef25aff30bb9a1c318509efe45d84/rpds_py-0.27.1-cp312-cp312-win_amd64.whl", hash = "sha256:d9199717881f13c32c4046a15f024971a3b78ad4ea029e8da6b86e5aa9cf4594", size = 232497, upload-time = "2025-08-27T12:13:28.326Z" }, - { url = "https://files.pythonhosted.org/packages/e1/96/2817b44bd2ed11aebacc9251da03689d56109b9aba5e311297b6902136e2/rpds_py-0.27.1-cp312-cp312-win_arm64.whl", hash = "sha256:33aa65b97826a0e885ef6e278fbd934e98cdcfed80b63946025f01e2f5b29502", size = 222790, upload-time = "2025-08-27T12:13:29.71Z" }, - { url = "https://files.pythonhosted.org/packages/cc/77/610aeee8d41e39080c7e14afa5387138e3c9fa9756ab893d09d99e7d8e98/rpds_py-0.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:e4b9fcfbc021633863a37e92571d6f91851fa656f0180246e84cbd8b3f6b329b", size = 361741, upload-time = "2025-08-27T12:13:31.039Z" }, - { url = "https://files.pythonhosted.org/packages/3a/fc/c43765f201c6a1c60be2043cbdb664013def52460a4c7adace89d6682bf4/rpds_py-0.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1441811a96eadca93c517d08df75de45e5ffe68aa3089924f963c782c4b898cf", size = 345574, upload-time = "2025-08-27T12:13:32.902Z" }, - { url = 
"https://files.pythonhosted.org/packages/20/42/ee2b2ca114294cd9847d0ef9c26d2b0851b2e7e00bf14cc4c0b581df0fc3/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:55266dafa22e672f5a4f65019015f90336ed31c6383bd53f5e7826d21a0e0b83", size = 385051, upload-time = "2025-08-27T12:13:34.228Z" }, - { url = "https://files.pythonhosted.org/packages/fd/e8/1e430fe311e4799e02e2d1af7c765f024e95e17d651612425b226705f910/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d78827d7ac08627ea2c8e02c9e5b41180ea5ea1f747e9db0915e3adf36b62dcf", size = 398395, upload-time = "2025-08-27T12:13:36.132Z" }, - { url = "https://files.pythonhosted.org/packages/82/95/9dc227d441ff2670651c27a739acb2535ccaf8b351a88d78c088965e5996/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae92443798a40a92dc5f0b01d8a7c93adde0c4dc965310a29ae7c64d72b9fad2", size = 524334, upload-time = "2025-08-27T12:13:37.562Z" }, - { url = "https://files.pythonhosted.org/packages/87/01/a670c232f401d9ad461d9a332aa4080cd3cb1d1df18213dbd0d2a6a7ab51/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c46c9dd2403b66a2a3b9720ec4b74d4ab49d4fabf9f03dfdce2d42af913fe8d0", size = 407691, upload-time = "2025-08-27T12:13:38.94Z" }, - { url = "https://files.pythonhosted.org/packages/03/36/0a14aebbaa26fe7fab4780c76f2239e76cc95a0090bdb25e31d95c492fcd/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2efe4eb1d01b7f5f1939f4ef30ecea6c6b3521eec451fb93191bf84b2a522418", size = 386868, upload-time = "2025-08-27T12:13:40.192Z" }, - { url = "https://files.pythonhosted.org/packages/3b/03/8c897fb8b5347ff6c1cc31239b9611c5bf79d78c984430887a353e1409a1/rpds_py-0.27.1-cp313-cp313-manylinux_2_31_riscv64.whl", hash = "sha256:15d3b4d83582d10c601f481eca29c3f138d44c92187d197aff663a269197c02d", size = 405469, upload-time = "2025-08-27T12:13:41.496Z" }, - { url = 
"https://files.pythonhosted.org/packages/da/07/88c60edc2df74850d496d78a1fdcdc7b54360a7f610a4d50008309d41b94/rpds_py-0.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4ed2e16abbc982a169d30d1a420274a709949e2cbdef119fe2ec9d870b42f274", size = 422125, upload-time = "2025-08-27T12:13:42.802Z" }, - { url = "https://files.pythonhosted.org/packages/6b/86/5f4c707603e41b05f191a749984f390dabcbc467cf833769b47bf14ba04f/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a75f305c9b013289121ec0f1181931975df78738cdf650093e6b86d74aa7d8dd", size = 562341, upload-time = "2025-08-27T12:13:44.472Z" }, - { url = "https://files.pythonhosted.org/packages/b2/92/3c0cb2492094e3cd9baf9e49bbb7befeceb584ea0c1a8b5939dca4da12e5/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:67ce7620704745881a3d4b0ada80ab4d99df390838839921f99e63c474f82cf2", size = 592511, upload-time = "2025-08-27T12:13:45.898Z" }, - { url = "https://files.pythonhosted.org/packages/10/bb/82e64fbb0047c46a168faa28d0d45a7851cd0582f850b966811d30f67ad8/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9d992ac10eb86d9b6f369647b6a3f412fc0075cfd5d799530e84d335e440a002", size = 557736, upload-time = "2025-08-27T12:13:47.408Z" }, - { url = "https://files.pythonhosted.org/packages/00/95/3c863973d409210da7fb41958172c6b7dbe7fc34e04d3cc1f10bb85e979f/rpds_py-0.27.1-cp313-cp313-win32.whl", hash = "sha256:4f75e4bd8ab8db624e02c8e2fc4063021b58becdbe6df793a8111d9343aec1e3", size = 221462, upload-time = "2025-08-27T12:13:48.742Z" }, - { url = "https://files.pythonhosted.org/packages/ce/2c/5867b14a81dc217b56d95a9f2a40fdbc56a1ab0181b80132beeecbd4b2d6/rpds_py-0.27.1-cp313-cp313-win_amd64.whl", hash = "sha256:f9025faafc62ed0b75a53e541895ca272815bec18abe2249ff6501c8f2e12b83", size = 232034, upload-time = "2025-08-27T12:13:50.11Z" }, - { url = 
"https://files.pythonhosted.org/packages/c7/78/3958f3f018c01923823f1e47f1cc338e398814b92d83cd278364446fac66/rpds_py-0.27.1-cp313-cp313-win_arm64.whl", hash = "sha256:ed10dc32829e7d222b7d3b93136d25a406ba9788f6a7ebf6809092da1f4d279d", size = 222392, upload-time = "2025-08-27T12:13:52.587Z" }, - { url = "https://files.pythonhosted.org/packages/01/76/1cdf1f91aed5c3a7bf2eba1f1c4e4d6f57832d73003919a20118870ea659/rpds_py-0.27.1-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:92022bbbad0d4426e616815b16bc4127f83c9a74940e1ccf3cfe0b387aba0228", size = 358355, upload-time = "2025-08-27T12:13:54.012Z" }, - { url = "https://files.pythonhosted.org/packages/c3/6f/bf142541229374287604caf3bb2a4ae17f0a580798fd72d3b009b532db4e/rpds_py-0.27.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:47162fdab9407ec3f160805ac3e154df042e577dd53341745fc7fb3f625e6d92", size = 342138, upload-time = "2025-08-27T12:13:55.791Z" }, - { url = "https://files.pythonhosted.org/packages/1a/77/355b1c041d6be40886c44ff5e798b4e2769e497b790f0f7fd1e78d17e9a8/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb89bec23fddc489e5d78b550a7b773557c9ab58b7946154a10a6f7a214a48b2", size = 380247, upload-time = "2025-08-27T12:13:57.683Z" }, - { url = "https://files.pythonhosted.org/packages/d6/a4/d9cef5c3946ea271ce2243c51481971cd6e34f21925af2783dd17b26e815/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e48af21883ded2b3e9eb48cb7880ad8598b31ab752ff3be6457001d78f416723", size = 390699, upload-time = "2025-08-27T12:13:59.137Z" }, - { url = "https://files.pythonhosted.org/packages/3a/06/005106a7b8c6c1a7e91b73169e49870f4af5256119d34a361ae5240a0c1d/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6f5b7bd8e219ed50299e58551a410b64daafb5017d54bbe822e003856f06a802", size = 521852, upload-time = "2025-08-27T12:14:00.583Z" }, - { url = 
"https://files.pythonhosted.org/packages/e5/3e/50fb1dac0948e17a02eb05c24510a8fe12d5ce8561c6b7b7d1339ab7ab9c/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:08f1e20bccf73b08d12d804d6e1c22ca5530e71659e6673bce31a6bb71c1e73f", size = 402582, upload-time = "2025-08-27T12:14:02.034Z" }, - { url = "https://files.pythonhosted.org/packages/cb/b0/f4e224090dc5b0ec15f31a02d746ab24101dd430847c4d99123798661bfc/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dc5dceeaefcc96dc192e3a80bbe1d6c410c469e97bdd47494a7d930987f18b2", size = 384126, upload-time = "2025-08-27T12:14:03.437Z" }, - { url = "https://files.pythonhosted.org/packages/54/77/ac339d5f82b6afff1df8f0fe0d2145cc827992cb5f8eeb90fc9f31ef7a63/rpds_py-0.27.1-cp313-cp313t-manylinux_2_31_riscv64.whl", hash = "sha256:d76f9cc8665acdc0c9177043746775aa7babbf479b5520b78ae4002d889f5c21", size = 399486, upload-time = "2025-08-27T12:14:05.443Z" }, - { url = "https://files.pythonhosted.org/packages/d6/29/3e1c255eee6ac358c056a57d6d6869baa00a62fa32eea5ee0632039c50a3/rpds_py-0.27.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:134fae0e36022edad8290a6661edf40c023562964efea0cc0ec7f5d392d2aaef", size = 414832, upload-time = "2025-08-27T12:14:06.902Z" }, - { url = "https://files.pythonhosted.org/packages/3f/db/6d498b844342deb3fa1d030598db93937a9964fcf5cb4da4feb5f17be34b/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:eb11a4f1b2b63337cfd3b4d110af778a59aae51c81d195768e353d8b52f88081", size = 557249, upload-time = "2025-08-27T12:14:08.37Z" }, - { url = "https://files.pythonhosted.org/packages/60/f3/690dd38e2310b6f68858a331399b4d6dbb9132c3e8ef8b4333b96caf403d/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:13e608ac9f50a0ed4faec0e90ece76ae33b34c0e8656e3dceb9a7db994c692cd", size = 587356, upload-time = "2025-08-27T12:14:10.034Z" }, - { url = 
"https://files.pythonhosted.org/packages/86/e3/84507781cccd0145f35b1dc32c72675200c5ce8d5b30f813e49424ef68fc/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dd2135527aa40f061350c3f8f89da2644de26cd73e4de458e79606384f4f68e7", size = 555300, upload-time = "2025-08-27T12:14:11.783Z" }, - { url = "https://files.pythonhosted.org/packages/e5/ee/375469849e6b429b3516206b4580a79e9ef3eb12920ddbd4492b56eaacbe/rpds_py-0.27.1-cp313-cp313t-win32.whl", hash = "sha256:3020724ade63fe320a972e2ffd93b5623227e684315adce194941167fee02688", size = 216714, upload-time = "2025-08-27T12:14:13.629Z" }, - { url = "https://files.pythonhosted.org/packages/21/87/3fc94e47c9bd0742660e84706c311a860dcae4374cf4a03c477e23ce605a/rpds_py-0.27.1-cp313-cp313t-win_amd64.whl", hash = "sha256:8ee50c3e41739886606388ba3ab3ee2aae9f35fb23f833091833255a31740797", size = 228943, upload-time = "2025-08-27T12:14:14.937Z" }, - { url = "https://files.pythonhosted.org/packages/70/36/b6e6066520a07cf029d385de869729a895917b411e777ab1cde878100a1d/rpds_py-0.27.1-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:acb9aafccaae278f449d9c713b64a9e68662e7799dbd5859e2c6b3c67b56d334", size = 362472, upload-time = "2025-08-27T12:14:16.333Z" }, - { url = "https://files.pythonhosted.org/packages/af/07/b4646032e0dcec0df9c73a3bd52f63bc6c5f9cda992f06bd0e73fe3fbebd/rpds_py-0.27.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:b7fb801aa7f845ddf601c49630deeeccde7ce10065561d92729bfe81bd21fb33", size = 345676, upload-time = "2025-08-27T12:14:17.764Z" }, - { url = "https://files.pythonhosted.org/packages/b0/16/2f1003ee5d0af4bcb13c0cf894957984c32a6751ed7206db2aee7379a55e/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fe0dd05afb46597b9a2e11c351e5e4283c741237e7f617ffb3252780cca9336a", size = 385313, upload-time = "2025-08-27T12:14:19.829Z" }, - { url = 
"https://files.pythonhosted.org/packages/05/cd/7eb6dd7b232e7f2654d03fa07f1414d7dfc980e82ba71e40a7c46fd95484/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b6dfb0e058adb12d8b1d1b25f686e94ffa65d9995a5157afe99743bf7369d62b", size = 399080, upload-time = "2025-08-27T12:14:21.531Z" }, - { url = "https://files.pythonhosted.org/packages/20/51/5829afd5000ec1cb60f304711f02572d619040aa3ec033d8226817d1e571/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ed090ccd235f6fa8bb5861684567f0a83e04f52dfc2e5c05f2e4b1309fcf85e7", size = 523868, upload-time = "2025-08-27T12:14:23.485Z" }, - { url = "https://files.pythonhosted.org/packages/05/2c/30eebca20d5db95720ab4d2faec1b5e4c1025c473f703738c371241476a2/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bf876e79763eecf3e7356f157540d6a093cef395b65514f17a356f62af6cc136", size = 408750, upload-time = "2025-08-27T12:14:24.924Z" }, - { url = "https://files.pythonhosted.org/packages/90/1a/cdb5083f043597c4d4276eae4e4c70c55ab5accec078da8611f24575a367/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:12ed005216a51b1d6e2b02a7bd31885fe317e45897de81d86dcce7d74618ffff", size = 387688, upload-time = "2025-08-27T12:14:27.537Z" }, - { url = "https://files.pythonhosted.org/packages/7c/92/cf786a15320e173f945d205ab31585cc43969743bb1a48b6888f7a2b0a2d/rpds_py-0.27.1-cp314-cp314-manylinux_2_31_riscv64.whl", hash = "sha256:ee4308f409a40e50593c7e3bb8cbe0b4d4c66d1674a316324f0c2f5383b486f9", size = 407225, upload-time = "2025-08-27T12:14:28.981Z" }, - { url = "https://files.pythonhosted.org/packages/33/5c/85ee16df5b65063ef26017bef33096557a4c83fbe56218ac7cd8c235f16d/rpds_py-0.27.1-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0b08d152555acf1f455154d498ca855618c1378ec810646fcd7c76416ac6dc60", size = 423361, upload-time = "2025-08-27T12:14:30.469Z" }, - { url = 
"https://files.pythonhosted.org/packages/4b/8e/1c2741307fcabd1a334ecf008e92c4f47bb6f848712cf15c923becfe82bb/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:dce51c828941973a5684d458214d3a36fcd28da3e1875d659388f4f9f12cc33e", size = 562493, upload-time = "2025-08-27T12:14:31.987Z" }, - { url = "https://files.pythonhosted.org/packages/04/03/5159321baae9b2222442a70c1f988cbbd66b9be0675dd3936461269be360/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:c1476d6f29eb81aa4151c9a31219b03f1f798dc43d8af1250a870735516a1212", size = 592623, upload-time = "2025-08-27T12:14:33.543Z" }, - { url = "https://files.pythonhosted.org/packages/ff/39/c09fd1ad28b85bc1d4554a8710233c9f4cefd03d7717a1b8fbfd171d1167/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:3ce0cac322b0d69b63c9cdb895ee1b65805ec9ffad37639f291dd79467bee675", size = 558800, upload-time = "2025-08-27T12:14:35.436Z" }, - { url = "https://files.pythonhosted.org/packages/c5/d6/99228e6bbcf4baa764b18258f519a9035131d91b538d4e0e294313462a98/rpds_py-0.27.1-cp314-cp314-win32.whl", hash = "sha256:dfbfac137d2a3d0725758cd141f878bf4329ba25e34979797c89474a89a8a3a3", size = 221943, upload-time = "2025-08-27T12:14:36.898Z" }, - { url = "https://files.pythonhosted.org/packages/be/07/c802bc6b8e95be83b79bdf23d1aa61d68324cb1006e245d6c58e959e314d/rpds_py-0.27.1-cp314-cp314-win_amd64.whl", hash = "sha256:a6e57b0abfe7cc513450fcf529eb486b6e4d3f8aee83e92eb5f1ef848218d456", size = 233739, upload-time = "2025-08-27T12:14:38.386Z" }, - { url = "https://files.pythonhosted.org/packages/c8/89/3e1b1c16d4c2d547c5717377a8df99aee8099ff050f87c45cb4d5fa70891/rpds_py-0.27.1-cp314-cp314-win_arm64.whl", hash = "sha256:faf8d146f3d476abfee026c4ae3bdd9ca14236ae4e4c310cbd1cf75ba33d24a3", size = 223120, upload-time = "2025-08-27T12:14:39.82Z" }, - { url = 
"https://files.pythonhosted.org/packages/62/7e/dc7931dc2fa4a6e46b2a4fa744a9fe5c548efd70e0ba74f40b39fa4a8c10/rpds_py-0.27.1-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:ba81d2b56b6d4911ce735aad0a1d4495e808b8ee4dc58715998741a26874e7c2", size = 358944, upload-time = "2025-08-27T12:14:41.199Z" }, - { url = "https://files.pythonhosted.org/packages/e6/22/4af76ac4e9f336bfb1a5f240d18a33c6b2fcaadb7472ac7680576512b49a/rpds_py-0.27.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:84f7d509870098de0e864cad0102711c1e24e9b1a50ee713b65928adb22269e4", size = 342283, upload-time = "2025-08-27T12:14:42.699Z" }, - { url = "https://files.pythonhosted.org/packages/1c/15/2a7c619b3c2272ea9feb9ade67a45c40b3eeb500d503ad4c28c395dc51b4/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a9e960fc78fecd1100539f14132425e1d5fe44ecb9239f8f27f079962021523e", size = 380320, upload-time = "2025-08-27T12:14:44.157Z" }, - { url = "https://files.pythonhosted.org/packages/a2/7d/4c6d243ba4a3057e994bb5bedd01b5c963c12fe38dde707a52acdb3849e7/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:62f85b665cedab1a503747617393573995dac4600ff51869d69ad2f39eb5e817", size = 391760, upload-time = "2025-08-27T12:14:45.845Z" }, - { url = "https://files.pythonhosted.org/packages/b4/71/b19401a909b83bcd67f90221330bc1ef11bc486fe4e04c24388d28a618ae/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fed467af29776f6556250c9ed85ea5a4dd121ab56a5f8b206e3e7a4c551e48ec", size = 522476, upload-time = "2025-08-27T12:14:47.364Z" }, - { url = "https://files.pythonhosted.org/packages/e4/44/1a3b9715c0455d2e2f0f6df5ee6d6f5afdc423d0773a8a682ed2b43c566c/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f2729615f9d430af0ae6b36cf042cb55c0936408d543fb691e1a9e36648fd35a", size = 403418, upload-time = "2025-08-27T12:14:49.991Z" }, - { url = 
"https://files.pythonhosted.org/packages/1c/4b/fb6c4f14984eb56673bc868a66536f53417ddb13ed44b391998100a06a96/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1b207d881a9aef7ba753d69c123a35d96ca7cb808056998f6b9e8747321f03b8", size = 384771, upload-time = "2025-08-27T12:14:52.159Z" }, - { url = "https://files.pythonhosted.org/packages/c0/56/d5265d2d28b7420d7b4d4d85cad8ef891760f5135102e60d5c970b976e41/rpds_py-0.27.1-cp314-cp314t-manylinux_2_31_riscv64.whl", hash = "sha256:639fd5efec029f99b79ae47e5d7e00ad8a773da899b6309f6786ecaf22948c48", size = 400022, upload-time = "2025-08-27T12:14:53.859Z" }, - { url = "https://files.pythonhosted.org/packages/8f/e9/9f5fc70164a569bdd6ed9046486c3568d6926e3a49bdefeeccfb18655875/rpds_py-0.27.1-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fecc80cb2a90e28af8a9b366edacf33d7a91cbfe4c2c4544ea1246e949cfebeb", size = 416787, upload-time = "2025-08-27T12:14:55.673Z" }, - { url = "https://files.pythonhosted.org/packages/d4/64/56dd03430ba491db943a81dcdef115a985aac5f44f565cd39a00c766d45c/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:42a89282d711711d0a62d6f57d81aa43a1368686c45bc1c46b7f079d55692734", size = 557538, upload-time = "2025-08-27T12:14:57.245Z" }, - { url = "https://files.pythonhosted.org/packages/3f/36/92cc885a3129993b1d963a2a42ecf64e6a8e129d2c7cc980dbeba84e55fb/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:cf9931f14223de59551ab9d38ed18d92f14f055a5f78c1d8ad6493f735021bbb", size = 588512, upload-time = "2025-08-27T12:14:58.728Z" }, - { url = "https://files.pythonhosted.org/packages/dd/10/6b283707780a81919f71625351182b4f98932ac89a09023cb61865136244/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:f39f58a27cc6e59f432b568ed8429c7e1641324fbe38131de852cd77b2d534b0", size = 555813, upload-time = "2025-08-27T12:15:00.334Z" }, - { url = 
"https://files.pythonhosted.org/packages/04/2e/30b5ea18c01379da6272a92825dd7e53dc9d15c88a19e97932d35d430ef7/rpds_py-0.27.1-cp314-cp314t-win32.whl", hash = "sha256:d5fa0ee122dc09e23607a28e6d7b150da16c662e66409bbe85230e4c85bb528a", size = 217385, upload-time = "2025-08-27T12:15:01.937Z" }, - { url = "https://files.pythonhosted.org/packages/32/7d/97119da51cb1dd3f2f3c0805f155a3aa4a95fa44fe7d78ae15e69edf4f34/rpds_py-0.27.1-cp314-cp314t-win_amd64.whl", hash = "sha256:6567d2bb951e21232c2f660c24cf3470bb96de56cdcb3f071a83feeaff8a2772", size = 230097, upload-time = "2025-08-27T12:15:03.961Z" }, - { url = "https://files.pythonhosted.org/packages/0c/ed/e1fba02de17f4f76318b834425257c8ea297e415e12c68b4361f63e8ae92/rpds_py-0.27.1-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:cdfe4bb2f9fe7458b7453ad3c33e726d6d1c7c0a72960bcc23800d77384e42df", size = 371402, upload-time = "2025-08-27T12:15:51.561Z" }, - { url = "https://files.pythonhosted.org/packages/af/7c/e16b959b316048b55585a697e94add55a4ae0d984434d279ea83442e460d/rpds_py-0.27.1-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:8fabb8fd848a5f75a2324e4a84501ee3a5e3c78d8603f83475441866e60b94a3", size = 354084, upload-time = "2025-08-27T12:15:53.219Z" }, - { url = "https://files.pythonhosted.org/packages/de/c1/ade645f55de76799fdd08682d51ae6724cb46f318573f18be49b1e040428/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eda8719d598f2f7f3e0f885cba8646644b55a187762bec091fa14a2b819746a9", size = 383090, upload-time = "2025-08-27T12:15:55.158Z" }, - { url = "https://files.pythonhosted.org/packages/1f/27/89070ca9b856e52960da1472efcb6c20ba27cfe902f4f23ed095b9cfc61d/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3c64d07e95606ec402a0a1c511fe003873fa6af630bda59bac77fac8b4318ebc", size = 394519, upload-time = "2025-08-27T12:15:57.238Z" }, - { url = 
"https://files.pythonhosted.org/packages/b3/28/be120586874ef906aa5aeeae95ae8df4184bc757e5b6bd1c729ccff45ed5/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:93a2ed40de81bcff59aabebb626562d48332f3d028ca2036f1d23cbb52750be4", size = 523817, upload-time = "2025-08-27T12:15:59.237Z" }, - { url = "https://files.pythonhosted.org/packages/a8/ef/70cc197bc11cfcde02a86f36ac1eed15c56667c2ebddbdb76a47e90306da/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:387ce8c44ae94e0ec50532d9cb0edce17311024c9794eb196b90e1058aadeb66", size = 403240, upload-time = "2025-08-27T12:16:00.923Z" }, - { url = "https://files.pythonhosted.org/packages/cf/35/46936cca449f7f518f2f4996e0e8344db4b57e2081e752441154089d2a5f/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aaf94f812c95b5e60ebaf8bfb1898a7d7cb9c1af5744d4a67fa47796e0465d4e", size = 385194, upload-time = "2025-08-27T12:16:02.802Z" }, - { url = "https://files.pythonhosted.org/packages/e1/62/29c0d3e5125c3270b51415af7cbff1ec587379c84f55a5761cc9efa8cd06/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_31_riscv64.whl", hash = "sha256:4848ca84d6ded9b58e474dfdbad4b8bfb450344c0551ddc8d958bf4b36aa837c", size = 402086, upload-time = "2025-08-27T12:16:04.806Z" }, - { url = "https://files.pythonhosted.org/packages/8f/66/03e1087679227785474466fdd04157fb793b3b76e3fcf01cbf4c693c1949/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2bde09cbcf2248b73c7c323be49b280180ff39fadcfe04e7b6f54a678d02a7cf", size = 419272, upload-time = "2025-08-27T12:16:06.471Z" }, - { url = "https://files.pythonhosted.org/packages/6a/24/e3e72d265121e00b063aef3e3501e5b2473cf1b23511d56e529531acf01e/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:94c44ee01fd21c9058f124d2d4f0c9dc7634bec93cd4b38eefc385dabe71acbf", size = 560003, upload-time = "2025-08-27T12:16:08.06Z" }, - { url 
= "https://files.pythonhosted.org/packages/26/ca/f5a344c534214cc2d41118c0699fffbdc2c1bc7046f2a2b9609765ab9c92/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_i686.whl", hash = "sha256:df8b74962e35c9249425d90144e721eed198e6555a0e22a563d29fe4486b51f6", size = 590482, upload-time = "2025-08-27T12:16:10.137Z" }, - { url = "https://files.pythonhosted.org/packages/ce/08/4349bdd5c64d9d193c360aa9db89adeee6f6682ab8825dca0a3f535f434f/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:dc23e6820e3b40847e2f4a7726462ba0cf53089512abe9ee16318c366494c17a", size = 556523, upload-time = "2025-08-27T12:16:12.188Z" }, -] - -[[package]] -name = "ruff" -version = "0.13.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/c7/8e/f9f9ca747fea8e3ac954e3690d4698c9737c23b51731d02df999c150b1c9/ruff-0.13.3.tar.gz", hash = "sha256:5b0ba0db740eefdfbcce4299f49e9eaefc643d4d007749d77d047c2bab19908e", size = 5438533, upload-time = "2025-10-02T19:29:31.582Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d2/33/8f7163553481466a92656d35dea9331095122bb84cf98210bef597dd2ecd/ruff-0.13.3-py3-none-linux_armv6l.whl", hash = "sha256:311860a4c5e19189c89d035638f500c1e191d283d0cc2f1600c8c80d6dcd430c", size = 12484040, upload-time = "2025-10-02T19:28:49.199Z" }, - { url = "https://files.pythonhosted.org/packages/b0/b5/4a21a4922e5dd6845e91896b0d9ef493574cbe061ef7d00a73c61db531af/ruff-0.13.3-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:2bdad6512fb666b40fcadb65e33add2b040fc18a24997d2e47fee7d66f7fcae2", size = 13122975, upload-time = "2025-10-02T19:28:52.446Z" }, - { url = "https://files.pythonhosted.org/packages/40/90/15649af836d88c9f154e5be87e64ae7d2b1baa5a3ef317cb0c8fafcd882d/ruff-0.13.3-py3-none-macosx_11_0_arm64.whl", hash = "sha256:fc6fa4637284708d6ed4e5e970d52fc3b76a557d7b4e85a53013d9d201d93286", size = 12346621, upload-time = "2025-10-02T19:28:54.712Z" }, - { url = 
"https://files.pythonhosted.org/packages/a5/42/bcbccb8141305f9a6d3f72549dd82d1134299177cc7eaf832599700f95a7/ruff-0.13.3-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c9e6469864f94a98f412f20ea143d547e4c652f45e44f369d7b74ee78185838", size = 12574408, upload-time = "2025-10-02T19:28:56.679Z" }, - { url = "https://files.pythonhosted.org/packages/ce/19/0f3681c941cdcfa2d110ce4515624c07a964dc315d3100d889fcad3bfc9e/ruff-0.13.3-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5bf62b705f319476c78891e0e97e965b21db468b3c999086de8ffb0d40fd2822", size = 12285330, upload-time = "2025-10-02T19:28:58.79Z" }, - { url = "https://files.pythonhosted.org/packages/10/f8/387976bf00d126b907bbd7725219257feea58650e6b055b29b224d8cb731/ruff-0.13.3-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:78cc1abed87ce40cb07ee0667ce99dbc766c9f519eabfd948ed87295d8737c60", size = 13980815, upload-time = "2025-10-02T19:29:01.577Z" }, - { url = "https://files.pythonhosted.org/packages/0c/a6/7c8ec09d62d5a406e2b17d159e4817b63c945a8b9188a771193b7e1cc0b5/ruff-0.13.3-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:4fb75e7c402d504f7a9a259e0442b96403fa4a7310ffe3588d11d7e170d2b1e3", size = 14987733, upload-time = "2025-10-02T19:29:04.036Z" }, - { url = "https://files.pythonhosted.org/packages/97/e5/f403a60a12258e0fd0c2195341cfa170726f254c788673495d86ab5a9a9d/ruff-0.13.3-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:17b951f9d9afb39330b2bdd2dd144ce1c1335881c277837ac1b50bfd99985ed3", size = 14439848, upload-time = "2025-10-02T19:29:06.684Z" }, - { url = "https://files.pythonhosted.org/packages/39/49/3de381343e89364c2334c9f3268b0349dc734fc18b2d99a302d0935c8345/ruff-0.13.3-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6052f8088728898e0a449f0dde8fafc7ed47e4d878168b211977e3e7e854f662", size = 13421890, upload-time = "2025-10-02T19:29:08.767Z" }, - { url = 
"https://files.pythonhosted.org/packages/ab/b5/c0feca27d45ae74185a6bacc399f5d8920ab82df2d732a17213fb86a2c4c/ruff-0.13.3-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc742c50f4ba72ce2a3be362bd359aef7d0d302bf7637a6f942eaa763bd292af", size = 13444870, upload-time = "2025-10-02T19:29:11.234Z" }, - { url = "https://files.pythonhosted.org/packages/50/a1/b655298a1f3fda4fdc7340c3f671a4b260b009068fbeb3e4e151e9e3e1bf/ruff-0.13.3-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:8e5640349493b378431637019366bbd73c927e515c9c1babfea3e932f5e68e1d", size = 13691599, upload-time = "2025-10-02T19:29:13.353Z" }, - { url = "https://files.pythonhosted.org/packages/32/b0/a8705065b2dafae007bcae21354e6e2e832e03eb077bb6c8e523c2becb92/ruff-0.13.3-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:6b139f638a80eae7073c691a5dd8d581e0ba319540be97c343d60fb12949c8d0", size = 12421893, upload-time = "2025-10-02T19:29:15.668Z" }, - { url = "https://files.pythonhosted.org/packages/0d/1e/cbe7082588d025cddbb2f23e6dfef08b1a2ef6d6f8328584ad3015b5cebd/ruff-0.13.3-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:6b547def0a40054825de7cfa341039ebdfa51f3d4bfa6a0772940ed351d2746c", size = 12267220, upload-time = "2025-10-02T19:29:17.583Z" }, - { url = "https://files.pythonhosted.org/packages/a5/99/4086f9c43f85e0755996d09bdcb334b6fee9b1eabdf34e7d8b877fadf964/ruff-0.13.3-py3-none-musllinux_1_2_i686.whl", hash = "sha256:9cc48a3564423915c93573f1981d57d101e617839bef38504f85f3677b3a0a3e", size = 13177818, upload-time = "2025-10-02T19:29:19.943Z" }, - { url = "https://files.pythonhosted.org/packages/9b/de/7b5db7e39947d9dc1c5f9f17b838ad6e680527d45288eeb568e860467010/ruff-0.13.3-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:1a993b17ec03719c502881cb2d5f91771e8742f2ca6de740034433a97c561989", size = 13618715, upload-time = "2025-10-02T19:29:22.527Z" }, - { url = 
"https://files.pythonhosted.org/packages/28/d3/bb25ee567ce2f61ac52430cf99f446b0e6d49bdfa4188699ad005fdd16aa/ruff-0.13.3-py3-none-win32.whl", hash = "sha256:f14e0d1fe6460f07814d03c6e32e815bff411505178a1f539a38f6097d3e8ee3", size = 12334488, upload-time = "2025-10-02T19:29:24.782Z" }, - { url = "https://files.pythonhosted.org/packages/cf/49/12f5955818a1139eed288753479ba9d996f6ea0b101784bb1fe6977ec128/ruff-0.13.3-py3-none-win_amd64.whl", hash = "sha256:621e2e5812b691d4f244638d693e640f188bacbb9bc793ddd46837cea0503dd2", size = 13455262, upload-time = "2025-10-02T19:29:26.882Z" }, - { url = "https://files.pythonhosted.org/packages/fe/72/7b83242b26627a00e3af70d0394d68f8f02750d642567af12983031777fc/ruff-0.13.3-py3-none-win_arm64.whl", hash = "sha256:9e9e9d699841eaf4c2c798fa783df2fabc680b72059a02ca0ed81c460bc58330", size = 12538484, upload-time = "2025-10-02T19:29:28.951Z" }, -] - -[[package]] -name = "s3transfer" -version = "0.14.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "botocore" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/62/74/8d69dcb7a9efe8baa2046891735e5dfe433ad558ae23d9e3c14c633d1d58/s3transfer-0.14.0.tar.gz", hash = "sha256:eff12264e7c8b4985074ccce27a3b38a485bb7f7422cc8046fee9be4983e4125", size = 151547, upload-time = "2025-09-09T19:23:31.089Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/48/f0/ae7ca09223a81a1d890b2557186ea015f6e0502e9b8cb8e1813f1d8cfa4e/s3transfer-0.14.0-py3-none-any.whl", hash = "sha256:ea3b790c7077558ed1f02a3072fb3cb992bbbd253392f4b6e9e8976941c7d456", size = 85712, upload-time = "2025-09-09T19:23:30.041Z" }, -] - -[[package]] -name = "six" -version = "1.17.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = 
"2024-12-04T17:35:28.174Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" }, -] - -[[package]] -name = "sniffio" -version = "1.3.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" }, -] - -[[package]] -name = "sse-starlette" -version = "3.0.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/42/6f/22ed6e33f8a9e76ca0a412405f31abb844b779d52c5f96660766edcd737c/sse_starlette-3.0.2.tar.gz", hash = "sha256:ccd60b5765ebb3584d0de2d7a6e4f745672581de4f5005ab31c3a25d10b52b3a", size = 20985, upload-time = "2025-07-27T09:07:44.565Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ef/10/c78f463b4ef22eef8491f218f692be838282cd65480f6e423d7730dfd1fb/sse_starlette-3.0.2-py3-none-any.whl", hash = "sha256:16b7cbfddbcd4eaca11f7b586f3b8a080f1afe952c15813455b162edea619e5a", size = 11297, upload-time = "2025-07-27T09:07:43.268Z" }, -] - -[[package]] -name = "starlette" -version = "0.47.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, - { name = "typing-extensions", marker = "python_full_version < 
'3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/15/b9/cc3017f9a9c9b6e27c5106cc10cc7904653c3eec0729793aec10479dd669/starlette-0.47.3.tar.gz", hash = "sha256:6bc94f839cc176c4858894f1f8908f0ab79dfec1a6b8402f6da9be26ebea52e9", size = 2584144, upload-time = "2025-08-24T13:36:42.122Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ce/fd/901cfa59aaa5b30a99e16876f11abe38b59a1a2c51ffb3d7142bb6089069/starlette-0.47.3-py3-none-any.whl", hash = "sha256:89c0778ca62a76b826101e7c709e70680a1699ca7da6b44d38eb0a7e61fe4b51", size = 72991, upload-time = "2025-08-24T13:36:40.887Z" }, -] - -[[package]] -name = "temporalio" -version = "1.18.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "nexus-rpc" }, - { name = "protobuf" }, - { name = "types-protobuf" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/09/7a/9f7885950cc040d71340a9379134b168d557b0a0e589c75d31e797f5a8bf/temporalio-1.18.1.tar.gz", hash = "sha256:46394498f8822e61b3ce70d6735de7618f5af0501fb90f3f90f4b4f9e7816d77", size = 1787082, upload-time = "2025-09-30T15:00:19.871Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/82/c0/9bad907dcf968c55acee1b5cc4ec0590a0fca3bc448dc32898785a577f7b/temporalio-1.18.1-cp39-abi3-macosx_10_12_x86_64.whl", hash = "sha256:748c0ec9f48aa1ab612a58fe516d9be28c1dd98194f560fd28a2ab09c6e2ca5e", size = 12809719, upload-time = "2025-09-30T14:59:58.177Z" }, - { url = "https://files.pythonhosted.org/packages/51/c5/490a2726aa67d4b856e8288d36848e7859801889b21d251cae8e8a6c9311/temporalio-1.18.1-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:5a789e7c483582d6d7dd49e7d2d2730d82dc94d9342fe71be76fa67afa4e6865", size = 12393639, upload-time = "2025-09-30T15:00:02.737Z" }, - { url = "https://files.pythonhosted.org/packages/92/89/e500e066df3c0fc1e6ee1a7cadbdfbc9812c62296ac0554fc09779555560/temporalio-1.18.1-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:a9f5cf75c4b887476a2b39d022a9c44c495f5eb1668087a022bd9258d3adddf9", size = 12732719, upload-time = "2025-09-30T15:00:07.458Z" }, - { url = "https://files.pythonhosted.org/packages/a4/18/7e5c4082b1550c38c802af02ae60ffe39d87646856aa51909cdd2789b7a6/temporalio-1.18.1-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f28a69394bf18b4a1c22a6a784d348e93482858c505d054570b278f0f5e13e9c", size = 12926861, upload-time = "2025-09-30T15:00:12.777Z" }, - { url = "https://files.pythonhosted.org/packages/10/49/e021b3205f06a1ec8a533dc8b02dcf5784d003cf99e4fd574eedb7439357/temporalio-1.18.1-cp39-abi3-win_amd64.whl", hash = "sha256:552b360f9ccdac8d5fc5d19c6578c2f6f634399ccc37439c4794aa58487f7fd5", size = 13059005, upload-time = "2025-09-30T15:00:17.586Z" }, -] - -[[package]] -name = "tomli" -version = "2.2.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175, upload-time = "2024-11-27T22:38:36.873Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077, upload-time = "2024-11-27T22:37:54.956Z" }, - { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429, upload-time = "2024-11-27T22:37:56.698Z" }, - { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067, upload-time = "2024-11-27T22:37:57.63Z" }, - { url = "https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030, upload-time = "2024-11-27T22:37:59.344Z" }, - { url = "https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898, upload-time = "2024-11-27T22:38:00.429Z" }, - { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894, upload-time = "2024-11-27T22:38:02.094Z" }, - { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319, upload-time = "2024-11-27T22:38:03.206Z" }, - { url = "https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273, upload-time = "2024-11-27T22:38:04.217Z" }, - { url = "https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 
98310, upload-time = "2024-11-27T22:38:05.908Z" }, - { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309, upload-time = "2024-11-27T22:38:06.812Z" }, - { url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762, upload-time = "2024-11-27T22:38:07.731Z" }, - { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453, upload-time = "2024-11-27T22:38:09.384Z" }, - { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486, upload-time = "2024-11-27T22:38:10.329Z" }, - { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349, upload-time = "2024-11-27T22:38:11.443Z" }, - { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159, upload-time = "2024-11-27T22:38:13.099Z" }, - { 
url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243, upload-time = "2024-11-27T22:38:14.766Z" }, - { url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645, upload-time = "2024-11-27T22:38:15.843Z" }, - { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584, upload-time = "2024-11-27T22:38:17.645Z" }, - { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875, upload-time = "2024-11-27T22:38:19.159Z" }, - { url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418, upload-time = "2024-11-27T22:38:20.064Z" }, - { url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708, upload-time = "2024-11-27T22:38:21.659Z" }, - { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = 
"sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582, upload-time = "2024-11-27T22:38:22.693Z" }, - { url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543, upload-time = "2024-11-27T22:38:24.367Z" }, - { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691, upload-time = "2024-11-27T22:38:26.081Z" }, - { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170, upload-time = "2024-11-27T22:38:27.921Z" }, - { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530, upload-time = "2024-11-27T22:38:29.591Z" }, - { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666, upload-time = "2024-11-27T22:38:30.639Z" }, - { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = 
"sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954, upload-time = "2024-11-27T22:38:31.702Z" }, - { url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724, upload-time = "2024-11-27T22:38:32.837Z" }, - { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383, upload-time = "2024-11-27T22:38:34.455Z" }, - { url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257, upload-time = "2024-11-27T22:38:35.385Z" }, -] - -[[package]] -name = "types-protobuf" -version = "6.32.1.20250918" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/69/5a/bd06c2dbb77ebd4ea764473c9c4c014c7ba94432192cb965a274f8544b9d/types_protobuf-6.32.1.20250918.tar.gz", hash = "sha256:44ce0ae98475909ca72379946ab61a4435eec2a41090821e713c17e8faf5b88f", size = 63780, upload-time = "2025-09-18T02:50:39.391Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/37/5a/8d93d4f4af5dc3dd62aa4f020deae746b34b1d94fb5bee1f776c6b7e9d6c/types_protobuf-6.32.1.20250918-py3-none-any.whl", hash = "sha256:22ba6133d142d11cc34d3788ad6dead2732368ebb0406eaa7790ea6ae46c8d0b", size = 77885, upload-time = "2025-09-18T02:50:38.028Z" }, -] - -[[package]] -name = "typing-extensions" -version = "4.15.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" }, -] - -[[package]] -name = "typing-inspection" -version = "0.4.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/f8/b1/0c11f5058406b3af7609f121aaa6b609744687f1d158b3c3a5bf4cc94238/typing_inspection-0.4.1.tar.gz", hash = "sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28", size = 75726, upload-time = "2025-05-21T18:55:23.885Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/17/69/cd203477f944c353c31bade965f880aa1061fd6bf05ded0726ca845b6ff7/typing_inspection-0.4.1-py3-none-any.whl", hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51", size = 14552, upload-time = "2025-05-21T18:55:22.152Z" }, -] - -[[package]] -name = "urllib3" -version = "2.5.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = 
"sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" }, -] - -[[package]] -name = "uvicorn" -version = "0.35.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "click" }, - { name = "h11" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5e/42/e0e305207bb88c6b8d3061399c6a961ffe5fbb7e2aa63c9234df7259e9cd/uvicorn-0.35.0.tar.gz", hash = "sha256:bc662f087f7cf2ce11a1d7fd70b90c9f98ef2e2831556dd078d131b96cc94a01", size = 78473, upload-time = "2025-06-28T16:15:46.058Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d2/e2/dc81b1bd1dcfe91735810265e9d26bc8ec5da45b4c0f6237e286819194c3/uvicorn-0.35.0-py3-none-any.whl", hash = "sha256:197535216b25ff9b785e29a0b79199f55222193d47f820816e7da751e9bc8d4a", size = 66406, upload-time = "2025-06-28T16:15:44.816Z" }, -] - -[[package]] -name = "werkzeug" -version = "3.1.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "markupsafe" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/32/af/d4502dc713b4ccea7175d764718d5183caf8d0867a4f0190d5d4a45cea49/werkzeug-3.1.1.tar.gz", hash = "sha256:8cd39dfbdfc1e051965f156163e2974e52c210f130810e9ad36858f0fd3edad4", size = 806453, upload-time = "2024-11-01T16:40:45.462Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ee/ea/c67e1dee1ba208ed22c06d1d547ae5e293374bfc43e0eb0ef5e262b68561/werkzeug-3.1.1-py3-none-any.whl", hash = "sha256:a71124d1ef06008baafa3d266c02f56e1836a5984afd6dd6c9230669d60d9fb5", size = 224371, upload-time = "2024-11-01T16:40:43.994Z" }, -] - -[[package]] -name = "yarl" -version = "1.20.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "idna" }, - { name = "multidict" }, - { name = "propcache" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/3c/fb/efaa23fa4e45537b827620f04cf8f3cd658b76642205162e072703a5b963/yarl-1.20.1.tar.gz", hash = 
"sha256:d017a4997ee50c91fd5466cef416231bb82177b93b029906cefc542ce14c35ac", size = 186428, upload-time = "2025-06-10T00:46:09.923Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b1/18/893b50efc2350e47a874c5c2d67e55a0ea5df91186b2a6f5ac52eff887cd/yarl-1.20.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:47ee6188fea634bdfaeb2cc420f5b3b17332e6225ce88149a17c413c77ff269e", size = 133833, upload-time = "2025-06-10T00:43:07.393Z" }, - { url = "https://files.pythonhosted.org/packages/89/ed/b8773448030e6fc47fa797f099ab9eab151a43a25717f9ac043844ad5ea3/yarl-1.20.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d0f6500f69e8402d513e5eedb77a4e1818691e8f45e6b687147963514d84b44b", size = 91070, upload-time = "2025-06-10T00:43:09.538Z" }, - { url = "https://files.pythonhosted.org/packages/e3/e3/409bd17b1e42619bf69f60e4f031ce1ccb29bd7380117a55529e76933464/yarl-1.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7a8900a42fcdaad568de58887c7b2f602962356908eedb7628eaf6021a6e435b", size = 89818, upload-time = "2025-06-10T00:43:11.575Z" }, - { url = "https://files.pythonhosted.org/packages/f8/77/64d8431a4d77c856eb2d82aa3de2ad6741365245a29b3a9543cd598ed8c5/yarl-1.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bad6d131fda8ef508b36be3ece16d0902e80b88ea7200f030a0f6c11d9e508d4", size = 347003, upload-time = "2025-06-10T00:43:14.088Z" }, - { url = "https://files.pythonhosted.org/packages/8d/d2/0c7e4def093dcef0bd9fa22d4d24b023788b0a33b8d0088b51aa51e21e99/yarl-1.20.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:df018d92fe22aaebb679a7f89fe0c0f368ec497e3dda6cb81a567610f04501f1", size = 336537, upload-time = "2025-06-10T00:43:16.431Z" }, - { url = "https://files.pythonhosted.org/packages/f0/f3/fc514f4b2cf02cb59d10cbfe228691d25929ce8f72a38db07d3febc3f706/yarl-1.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:8f969afbb0a9b63c18d0feecf0db09d164b7a44a053e78a7d05f5df163e43833", size = 362358, upload-time = "2025-06-10T00:43:18.704Z" }, - { url = "https://files.pythonhosted.org/packages/ea/6d/a313ac8d8391381ff9006ac05f1d4331cee3b1efaa833a53d12253733255/yarl-1.20.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:812303eb4aa98e302886ccda58d6b099e3576b1b9276161469c25803a8db277d", size = 357362, upload-time = "2025-06-10T00:43:20.888Z" }, - { url = "https://files.pythonhosted.org/packages/00/70/8f78a95d6935a70263d46caa3dd18e1f223cf2f2ff2037baa01a22bc5b22/yarl-1.20.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98c4a7d166635147924aa0bf9bfe8d8abad6fffa6102de9c99ea04a1376f91e8", size = 348979, upload-time = "2025-06-10T00:43:23.169Z" }, - { url = "https://files.pythonhosted.org/packages/cb/05/42773027968968f4f15143553970ee36ead27038d627f457cc44bbbeecf3/yarl-1.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:12e768f966538e81e6e7550f9086a6236b16e26cd964cf4df35349970f3551cf", size = 337274, upload-time = "2025-06-10T00:43:27.111Z" }, - { url = "https://files.pythonhosted.org/packages/05/be/665634aa196954156741ea591d2f946f1b78ceee8bb8f28488bf28c0dd62/yarl-1.20.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:fe41919b9d899661c5c28a8b4b0acf704510b88f27f0934ac7a7bebdd8938d5e", size = 363294, upload-time = "2025-06-10T00:43:28.96Z" }, - { url = "https://files.pythonhosted.org/packages/eb/90/73448401d36fa4e210ece5579895731f190d5119c4b66b43b52182e88cd5/yarl-1.20.1-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:8601bc010d1d7780592f3fc1bdc6c72e2b6466ea34569778422943e1a1f3c389", size = 358169, upload-time = "2025-06-10T00:43:30.701Z" }, - { url = "https://files.pythonhosted.org/packages/c3/b0/fce922d46dc1eb43c811f1889f7daa6001b27a4005587e94878570300881/yarl-1.20.1-cp311-cp311-musllinux_1_2_i686.whl", hash = 
"sha256:daadbdc1f2a9033a2399c42646fbd46da7992e868a5fe9513860122d7fe7a73f", size = 362776, upload-time = "2025-06-10T00:43:32.51Z" }, - { url = "https://files.pythonhosted.org/packages/f1/0d/b172628fce039dae8977fd22caeff3eeebffd52e86060413f5673767c427/yarl-1.20.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:03aa1e041727cb438ca762628109ef1333498b122e4c76dd858d186a37cec845", size = 381341, upload-time = "2025-06-10T00:43:34.543Z" }, - { url = "https://files.pythonhosted.org/packages/6b/9b/5b886d7671f4580209e855974fe1cecec409aa4a89ea58b8f0560dc529b1/yarl-1.20.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:642980ef5e0fa1de5fa96d905c7e00cb2c47cb468bfcac5a18c58e27dbf8d8d1", size = 379988, upload-time = "2025-06-10T00:43:36.489Z" }, - { url = "https://files.pythonhosted.org/packages/73/be/75ef5fd0fcd8f083a5d13f78fd3f009528132a1f2a1d7c925c39fa20aa79/yarl-1.20.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:86971e2795584fe8c002356d3b97ef6c61862720eeff03db2a7c86b678d85b3e", size = 371113, upload-time = "2025-06-10T00:43:38.592Z" }, - { url = "https://files.pythonhosted.org/packages/50/4f/62faab3b479dfdcb741fe9e3f0323e2a7d5cd1ab2edc73221d57ad4834b2/yarl-1.20.1-cp311-cp311-win32.whl", hash = "sha256:597f40615b8d25812f14562699e287f0dcc035d25eb74da72cae043bb884d773", size = 81485, upload-time = "2025-06-10T00:43:41.038Z" }, - { url = "https://files.pythonhosted.org/packages/f0/09/d9c7942f8f05c32ec72cd5c8e041c8b29b5807328b68b4801ff2511d4d5e/yarl-1.20.1-cp311-cp311-win_amd64.whl", hash = "sha256:26ef53a9e726e61e9cd1cda6b478f17e350fb5800b4bd1cd9fe81c4d91cfeb2e", size = 86686, upload-time = "2025-06-10T00:43:42.692Z" }, - { url = "https://files.pythonhosted.org/packages/5f/9a/cb7fad7d73c69f296eda6815e4a2c7ed53fc70c2f136479a91c8e5fbdb6d/yarl-1.20.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdcc4cd244e58593a4379fe60fdee5ac0331f8eb70320a24d591a3be197b94a9", size = 133667, upload-time = "2025-06-10T00:43:44.369Z" }, - { url = 
"https://files.pythonhosted.org/packages/67/38/688577a1cb1e656e3971fb66a3492501c5a5df56d99722e57c98249e5b8a/yarl-1.20.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b29a2c385a5f5b9c7d9347e5812b6f7ab267193c62d282a540b4fc528c8a9d2a", size = 91025, upload-time = "2025-06-10T00:43:46.295Z" }, - { url = "https://files.pythonhosted.org/packages/50/ec/72991ae51febeb11a42813fc259f0d4c8e0507f2b74b5514618d8b640365/yarl-1.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1112ae8154186dfe2de4732197f59c05a83dc814849a5ced892b708033f40dc2", size = 89709, upload-time = "2025-06-10T00:43:48.22Z" }, - { url = "https://files.pythonhosted.org/packages/99/da/4d798025490e89426e9f976702e5f9482005c548c579bdae792a4c37769e/yarl-1.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:90bbd29c4fe234233f7fa2b9b121fb63c321830e5d05b45153a2ca68f7d310ee", size = 352287, upload-time = "2025-06-10T00:43:49.924Z" }, - { url = "https://files.pythonhosted.org/packages/1a/26/54a15c6a567aac1c61b18aa0f4b8aa2e285a52d547d1be8bf48abe2b3991/yarl-1.20.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:680e19c7ce3710ac4cd964e90dad99bf9b5029372ba0c7cbfcd55e54d90ea819", size = 345429, upload-time = "2025-06-10T00:43:51.7Z" }, - { url = "https://files.pythonhosted.org/packages/d6/95/9dcf2386cb875b234353b93ec43e40219e14900e046bf6ac118f94b1e353/yarl-1.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4a979218c1fdb4246a05efc2cc23859d47c89af463a90b99b7c56094daf25a16", size = 365429, upload-time = "2025-06-10T00:43:53.494Z" }, - { url = "https://files.pythonhosted.org/packages/91/b2/33a8750f6a4bc224242a635f5f2cff6d6ad5ba651f6edcccf721992c21a0/yarl-1.20.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255b468adf57b4a7b65d8aad5b5138dce6a0752c139965711bdcb81bc370e1b6", size = 363862, upload-time = "2025-06-10T00:43:55.766Z" }, - { url = 
"https://files.pythonhosted.org/packages/98/28/3ab7acc5b51f4434b181b0cee8f1f4b77a65919700a355fb3617f9488874/yarl-1.20.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a97d67108e79cfe22e2b430d80d7571ae57d19f17cda8bb967057ca8a7bf5bfd", size = 355616, upload-time = "2025-06-10T00:43:58.056Z" }, - { url = "https://files.pythonhosted.org/packages/36/a3/f666894aa947a371724ec7cd2e5daa78ee8a777b21509b4252dd7bd15e29/yarl-1.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8570d998db4ddbfb9a590b185a0a33dbf8aafb831d07a5257b4ec9948df9cb0a", size = 339954, upload-time = "2025-06-10T00:43:59.773Z" }, - { url = "https://files.pythonhosted.org/packages/f1/81/5f466427e09773c04219d3450d7a1256138a010b6c9f0af2d48565e9ad13/yarl-1.20.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:97c75596019baae7c71ccf1d8cc4738bc08134060d0adfcbe5642f778d1dca38", size = 365575, upload-time = "2025-06-10T00:44:02.051Z" }, - { url = "https://files.pythonhosted.org/packages/2e/e3/e4b0ad8403e97e6c9972dd587388940a032f030ebec196ab81a3b8e94d31/yarl-1.20.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:1c48912653e63aef91ff988c5432832692ac5a1d8f0fb8a33091520b5bbe19ef", size = 365061, upload-time = "2025-06-10T00:44:04.196Z" }, - { url = "https://files.pythonhosted.org/packages/ac/99/b8a142e79eb86c926f9f06452eb13ecb1bb5713bd01dc0038faf5452e544/yarl-1.20.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:4c3ae28f3ae1563c50f3d37f064ddb1511ecc1d5584e88c6b7c63cf7702a6d5f", size = 364142, upload-time = "2025-06-10T00:44:06.527Z" }, - { url = "https://files.pythonhosted.org/packages/34/f2/08ed34a4a506d82a1a3e5bab99ccd930a040f9b6449e9fd050320e45845c/yarl-1.20.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c5e9642f27036283550f5f57dc6156c51084b458570b9d0d96100c8bebb186a8", size = 381894, upload-time = "2025-06-10T00:44:08.379Z" }, - { url = 
"https://files.pythonhosted.org/packages/92/f8/9a3fbf0968eac704f681726eff595dce9b49c8a25cd92bf83df209668285/yarl-1.20.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:2c26b0c49220d5799f7b22c6838409ee9bc58ee5c95361a4d7831f03cc225b5a", size = 383378, upload-time = "2025-06-10T00:44:10.51Z" }, - { url = "https://files.pythonhosted.org/packages/af/85/9363f77bdfa1e4d690957cd39d192c4cacd1c58965df0470a4905253b54f/yarl-1.20.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:564ab3d517e3d01c408c67f2e5247aad4019dcf1969982aba3974b4093279004", size = 374069, upload-time = "2025-06-10T00:44:12.834Z" }, - { url = "https://files.pythonhosted.org/packages/35/99/9918c8739ba271dcd935400cff8b32e3cd319eaf02fcd023d5dcd487a7c8/yarl-1.20.1-cp312-cp312-win32.whl", hash = "sha256:daea0d313868da1cf2fac6b2d3a25c6e3a9e879483244be38c8e6a41f1d876a5", size = 81249, upload-time = "2025-06-10T00:44:14.731Z" }, - { url = "https://files.pythonhosted.org/packages/eb/83/5d9092950565481b413b31a23e75dd3418ff0a277d6e0abf3729d4d1ce25/yarl-1.20.1-cp312-cp312-win_amd64.whl", hash = "sha256:48ea7d7f9be0487339828a4de0360d7ce0efc06524a48e1810f945c45b813698", size = 86710, upload-time = "2025-06-10T00:44:16.716Z" }, - { url = "https://files.pythonhosted.org/packages/8a/e1/2411b6d7f769a07687acee88a062af5833cf1966b7266f3d8dfb3d3dc7d3/yarl-1.20.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:0b5ff0fbb7c9f1b1b5ab53330acbfc5247893069e7716840c8e7d5bb7355038a", size = 131811, upload-time = "2025-06-10T00:44:18.933Z" }, - { url = "https://files.pythonhosted.org/packages/b2/27/584394e1cb76fb771371770eccad35de400e7b434ce3142c2dd27392c968/yarl-1.20.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:14f326acd845c2b2e2eb38fb1346c94f7f3b01a4f5c788f8144f9b630bfff9a3", size = 90078, upload-time = "2025-06-10T00:44:20.635Z" }, - { url = "https://files.pythonhosted.org/packages/bf/9a/3246ae92d4049099f52d9b0fe3486e3b500e29b7ea872d0f152966fc209d/yarl-1.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = 
"sha256:f60e4ad5db23f0b96e49c018596707c3ae89f5d0bd97f0ad3684bcbad899f1e7", size = 88748, upload-time = "2025-06-10T00:44:22.34Z" }, - { url = "https://files.pythonhosted.org/packages/a3/25/35afe384e31115a1a801fbcf84012d7a066d89035befae7c5d4284df1e03/yarl-1.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:49bdd1b8e00ce57e68ba51916e4bb04461746e794e7c4d4bbc42ba2f18297691", size = 349595, upload-time = "2025-06-10T00:44:24.314Z" }, - { url = "https://files.pythonhosted.org/packages/28/2d/8aca6cb2cabc8f12efcb82749b9cefecbccfc7b0384e56cd71058ccee433/yarl-1.20.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:66252d780b45189975abfed839616e8fd2dbacbdc262105ad7742c6ae58f3e31", size = 342616, upload-time = "2025-06-10T00:44:26.167Z" }, - { url = "https://files.pythonhosted.org/packages/0b/e9/1312633d16b31acf0098d30440ca855e3492d66623dafb8e25b03d00c3da/yarl-1.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:59174e7332f5d153d8f7452a102b103e2e74035ad085f404df2e40e663a22b28", size = 361324, upload-time = "2025-06-10T00:44:27.915Z" }, - { url = "https://files.pythonhosted.org/packages/bc/a0/688cc99463f12f7669eec7c8acc71ef56a1521b99eab7cd3abb75af887b0/yarl-1.20.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e3968ec7d92a0c0f9ac34d5ecfd03869ec0cab0697c91a45db3fbbd95fe1b653", size = 359676, upload-time = "2025-06-10T00:44:30.041Z" }, - { url = "https://files.pythonhosted.org/packages/af/44/46407d7f7a56e9a85a4c207724c9f2c545c060380718eea9088f222ba697/yarl-1.20.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d1a4fbb50e14396ba3d375f68bfe02215d8e7bc3ec49da8341fe3157f59d2ff5", size = 352614, upload-time = "2025-06-10T00:44:32.171Z" }, - { url = 
"https://files.pythonhosted.org/packages/b1/91/31163295e82b8d5485d31d9cf7754d973d41915cadce070491778d9c9825/yarl-1.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11a62c839c3a8eac2410e951301309426f368388ff2f33799052787035793b02", size = 336766, upload-time = "2025-06-10T00:44:34.494Z" }, - { url = "https://files.pythonhosted.org/packages/b4/8e/c41a5bc482121f51c083c4c2bcd16b9e01e1cf8729e380273a952513a21f/yarl-1.20.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:041eaa14f73ff5a8986b4388ac6bb43a77f2ea09bf1913df7a35d4646db69e53", size = 364615, upload-time = "2025-06-10T00:44:36.856Z" }, - { url = "https://files.pythonhosted.org/packages/e3/5b/61a3b054238d33d70ea06ebba7e58597891b71c699e247df35cc984ab393/yarl-1.20.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:377fae2fef158e8fd9d60b4c8751387b8d1fb121d3d0b8e9b0be07d1b41e83dc", size = 360982, upload-time = "2025-06-10T00:44:39.141Z" }, - { url = "https://files.pythonhosted.org/packages/df/a3/6a72fb83f8d478cb201d14927bc8040af901811a88e0ff2da7842dd0ed19/yarl-1.20.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:1c92f4390e407513f619d49319023664643d3339bd5e5a56a3bebe01bc67ec04", size = 369792, upload-time = "2025-06-10T00:44:40.934Z" }, - { url = "https://files.pythonhosted.org/packages/7c/af/4cc3c36dfc7c077f8dedb561eb21f69e1e9f2456b91b593882b0b18c19dc/yarl-1.20.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:d25ddcf954df1754ab0f86bb696af765c5bfaba39b74095f27eececa049ef9a4", size = 382049, upload-time = "2025-06-10T00:44:42.854Z" }, - { url = "https://files.pythonhosted.org/packages/19/3a/e54e2c4752160115183a66dc9ee75a153f81f3ab2ba4bf79c3c53b33de34/yarl-1.20.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:909313577e9619dcff8c31a0ea2aa0a2a828341d92673015456b3ae492e7317b", size = 384774, upload-time = "2025-06-10T00:44:45.275Z" }, - { url = 
"https://files.pythonhosted.org/packages/9c/20/200ae86dabfca89060ec6447649f219b4cbd94531e425e50d57e5f5ac330/yarl-1.20.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:793fd0580cb9664548c6b83c63b43c477212c0260891ddf86809e1c06c8b08f1", size = 374252, upload-time = "2025-06-10T00:44:47.31Z" }, - { url = "https://files.pythonhosted.org/packages/83/75/11ee332f2f516b3d094e89448da73d557687f7d137d5a0f48c40ff211487/yarl-1.20.1-cp313-cp313-win32.whl", hash = "sha256:468f6e40285de5a5b3c44981ca3a319a4b208ccc07d526b20b12aeedcfa654b7", size = 81198, upload-time = "2025-06-10T00:44:49.164Z" }, - { url = "https://files.pythonhosted.org/packages/ba/ba/39b1ecbf51620b40ab402b0fc817f0ff750f6d92712b44689c2c215be89d/yarl-1.20.1-cp313-cp313-win_amd64.whl", hash = "sha256:495b4ef2fea40596bfc0affe3837411d6aa3371abcf31aac0ccc4bdd64d4ef5c", size = 86346, upload-time = "2025-06-10T00:44:51.182Z" }, - { url = "https://files.pythonhosted.org/packages/43/c7/669c52519dca4c95153c8ad96dd123c79f354a376346b198f438e56ffeb4/yarl-1.20.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:f60233b98423aab21d249a30eb27c389c14929f47be8430efa7dbd91493a729d", size = 138826, upload-time = "2025-06-10T00:44:52.883Z" }, - { url = "https://files.pythonhosted.org/packages/6a/42/fc0053719b44f6ad04a75d7f05e0e9674d45ef62f2d9ad2c1163e5c05827/yarl-1.20.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:6f3eff4cc3f03d650d8755c6eefc844edde99d641d0dcf4da3ab27141a5f8ddf", size = 93217, upload-time = "2025-06-10T00:44:54.658Z" }, - { url = "https://files.pythonhosted.org/packages/4f/7f/fa59c4c27e2a076bba0d959386e26eba77eb52ea4a0aac48e3515c186b4c/yarl-1.20.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:69ff8439d8ba832d6bed88af2c2b3445977eba9a4588b787b32945871c2444e3", size = 92700, upload-time = "2025-06-10T00:44:56.784Z" }, - { url = 
"https://files.pythonhosted.org/packages/2f/d4/062b2f48e7c93481e88eff97a6312dca15ea200e959f23e96d8ab898c5b8/yarl-1.20.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3cf34efa60eb81dd2645a2e13e00bb98b76c35ab5061a3989c7a70f78c85006d", size = 347644, upload-time = "2025-06-10T00:44:59.071Z" }, - { url = "https://files.pythonhosted.org/packages/89/47/78b7f40d13c8f62b499cc702fdf69e090455518ae544c00a3bf4afc9fc77/yarl-1.20.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:8e0fe9364ad0fddab2688ce72cb7a8e61ea42eff3c7caeeb83874a5d479c896c", size = 323452, upload-time = "2025-06-10T00:45:01.605Z" }, - { url = "https://files.pythonhosted.org/packages/eb/2b/490d3b2dc66f52987d4ee0d3090a147ea67732ce6b4d61e362c1846d0d32/yarl-1.20.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8f64fbf81878ba914562c672024089e3401974a39767747691c65080a67b18c1", size = 346378, upload-time = "2025-06-10T00:45:03.946Z" }, - { url = "https://files.pythonhosted.org/packages/66/ad/775da9c8a94ce925d1537f939a4f17d782efef1f973039d821cbe4bcc211/yarl-1.20.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f6342d643bf9a1de97e512e45e4b9560a043347e779a173250824f8b254bd5ce", size = 353261, upload-time = "2025-06-10T00:45:05.992Z" }, - { url = "https://files.pythonhosted.org/packages/4b/23/0ed0922b47a4f5c6eb9065d5ff1e459747226ddce5c6a4c111e728c9f701/yarl-1.20.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:56dac5f452ed25eef0f6e3c6a066c6ab68971d96a9fb441791cad0efba6140d3", size = 335987, upload-time = "2025-06-10T00:45:08.227Z" }, - { url = "https://files.pythonhosted.org/packages/3e/49/bc728a7fe7d0e9336e2b78f0958a2d6b288ba89f25a1762407a222bf53c3/yarl-1.20.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7d7f497126d65e2cad8dc5f97d34c27b19199b6414a40cb36b52f41b79014be", size = 329361, 
upload-time = "2025-06-10T00:45:10.11Z" }, - { url = "https://files.pythonhosted.org/packages/93/8f/b811b9d1f617c83c907e7082a76e2b92b655400e61730cd61a1f67178393/yarl-1.20.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:67e708dfb8e78d8a19169818eeb5c7a80717562de9051bf2413aca8e3696bf16", size = 346460, upload-time = "2025-06-10T00:45:12.055Z" }, - { url = "https://files.pythonhosted.org/packages/70/fd/af94f04f275f95da2c3b8b5e1d49e3e79f1ed8b6ceb0f1664cbd902773ff/yarl-1.20.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:595c07bc79af2494365cc96ddeb772f76272364ef7c80fb892ef9d0649586513", size = 334486, upload-time = "2025-06-10T00:45:13.995Z" }, - { url = "https://files.pythonhosted.org/packages/84/65/04c62e82704e7dd0a9b3f61dbaa8447f8507655fd16c51da0637b39b2910/yarl-1.20.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:7bdd2f80f4a7df852ab9ab49484a4dee8030023aa536df41f2d922fd57bf023f", size = 342219, upload-time = "2025-06-10T00:45:16.479Z" }, - { url = "https://files.pythonhosted.org/packages/91/95/459ca62eb958381b342d94ab9a4b6aec1ddec1f7057c487e926f03c06d30/yarl-1.20.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:c03bfebc4ae8d862f853a9757199677ab74ec25424d0ebd68a0027e9c639a390", size = 350693, upload-time = "2025-06-10T00:45:18.399Z" }, - { url = "https://files.pythonhosted.org/packages/a6/00/d393e82dd955ad20617abc546a8f1aee40534d599ff555ea053d0ec9bf03/yarl-1.20.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:344d1103e9c1523f32a5ed704d576172d2cabed3122ea90b1d4e11fe17c66458", size = 355803, upload-time = "2025-06-10T00:45:20.677Z" }, - { url = "https://files.pythonhosted.org/packages/9e/ed/c5fb04869b99b717985e244fd93029c7a8e8febdfcffa06093e32d7d44e7/yarl-1.20.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:88cab98aa4e13e1ade8c141daeedd300a4603b7132819c484841bb7af3edce9e", size = 341709, upload-time = "2025-06-10T00:45:23.221Z" }, - { url = 
"https://files.pythonhosted.org/packages/24/fd/725b8e73ac2a50e78a4534ac43c6addf5c1c2d65380dd48a9169cc6739a9/yarl-1.20.1-cp313-cp313t-win32.whl", hash = "sha256:b121ff6a7cbd4abc28985b6028235491941b9fe8fe226e6fdc539c977ea1739d", size = 86591, upload-time = "2025-06-10T00:45:25.793Z" }, - { url = "https://files.pythonhosted.org/packages/94/c3/b2e9f38bc3e11191981d57ea08cab2166e74ea770024a646617c9cddd9f6/yarl-1.20.1-cp313-cp313t-win_amd64.whl", hash = "sha256:541d050a355bbbc27e55d906bc91cb6fe42f96c01413dd0f4ed5a5240513874f", size = 93003, upload-time = "2025-06-10T00:45:27.752Z" }, - { url = "https://files.pythonhosted.org/packages/b4/2d/2345fce04cfd4bee161bf1e7d9cdc702e3e16109021035dbb24db654a622/yarl-1.20.1-py3-none-any.whl", hash = "sha256:83b8eb083fe4683c6115795d9fc1cfaf2cbbefb19b3a1cb68f6527460f483a77", size = 46542, upload-time = "2025-06-10T00:46:07.521Z" }, -] diff --git a/cli/.gitignore b/cli/.gitignore deleted file mode 100644 index f24b22c..0000000 --- a/cli/.gitignore +++ /dev/null @@ -1,64 +0,0 @@ -# FuzzForge CLI specific .gitignore - -# Python -__pycache__/ -*.py[cod] -*$py.class -*.so -.Python -build/ -develop-eggs/ -dist/ -downloads/ -eggs/ -.eggs/ -lib/ -lib64/ -parts/ -sdist/ -var/ -wheels/ -*.egg-info/ -.installed.cfg -*.egg -MANIFEST - -# Virtual environments -.venv/ -venv/ -ENV/ -env/ - -# UV package manager - keep uv.lock for CLI -# uv.lock # Commented out - we want to keep this for reproducible CLI builds - -# IDE -.vscode/ -.idea/ -*.swp -*.swo - -# OS -.DS_Store -Thumbs.db - -# Testing -.coverage -.pytest_cache/ -.tox/ -htmlcov/ - -# MyPy -.mypy_cache/ - -# Local development -local_config.yaml -.env.local - -# Generated files -*.log -*.tmp - -# CLI specific -# Don't ignore uv.lock in CLI as it's needed for reproducible builds -!uv.lock \ No newline at end of file diff --git a/cli/README.md b/cli/README.md deleted file mode 100644 index 76e4f28..0000000 --- a/cli/README.md +++ /dev/null @@ -1,648 +0,0 @@ -# FuzzForge CLI - -šŸ›”ļø **FuzzForge 
CLI** - Command-line interface for FuzzForge security testing platform - -A comprehensive CLI for managing security testing workflows, monitoring runs in real-time, and analyzing findings with beautiful terminal interfaces and persistent project management. - -## ✨ Features - -- šŸ“ **Project Management** - Initialize and manage FuzzForge projects with local databases -- šŸ”§ **Workflow Management** - Browse, configure, and run security testing workflows -- šŸš€ **Workflow Execution** - Execute and manage security testing workflows -- šŸ” **Findings Analysis** - View, export, and analyze security findings in multiple formats -- šŸ“Š **Real-time Monitoring** - Live dashboards for fuzzing statistics and crash reports -- āš™ļø **Configuration** - Flexible project and global configuration management -- šŸŽØ **Rich UI** - Beautiful tables, progress bars, and interactive prompts -- šŸ’¾ **Persistent Storage** - SQLite database for runs, findings, and crash data -- šŸ›”ļø **Error Handling** - Comprehensive error handling with user-friendly messages -- šŸ”„ **Network Resilience** - Automatic retries and graceful degradation - -## šŸš€ Quick Start - -### Installation - -#### Prerequisites -- Python 3.11 or higher -- [uv](https://docs.astral.sh/uv/) package manager - -#### Install FuzzForge CLI -```bash -# Clone the repository -git clone https://github.com/FuzzingLabs/fuzzforge_alpha.git -cd fuzzforge_alpha/cli - -# Install globally with uv (recommended) -uv tool install . - -# Alternative: Install in development mode -uv sync -uv add --editable ../sdk -uv tool install --editable . 
- -# Verify installation -fuzzforge --help -``` - -#### Shell Completion (Optional) -```bash -# Install completion for your shell -fuzzforge --install-completion -``` - -### Initialize Your First Project - -```bash -# Create a new project directory -mkdir my-security-project -cd my-security-project - -# Initialize FuzzForge project -ff init - -# Check status -fuzzforge status -``` - -This creates a `.fuzzforge/` directory with: -- SQLite database for persistent storage -- Configuration file (`config.yaml`) -- Project metadata - -### Run Your First Analysis - -```bash -# List available workflows -fuzzforge workflows list - -# Get workflow details -fuzzforge workflows info security_assessment - -# Submit a workflow for analysis -fuzzforge workflow run security_assessment /path/to/your/code - - -# View findings when complete -fuzzforge finding -``` - -## šŸ“š Command Reference - -### Project Management - -#### `ff init` -Initialize a new FuzzForge project in the current directory. - -```bash -ff init --name "My Security Project" --api-url "http://localhost:8000" -``` - -**Options:** -- `--name, -n` - Project name (defaults to directory name) -- `--api-url, -u` - FuzzForge API URL (defaults to http://localhost:8000) -- `--force, -f` - Force initialization even if project exists - -#### `fuzzforge status` -Show comprehensive project and API status information. - -```bash -fuzzforge status -``` - -Displays: -- Project information and configuration -- Database statistics (runs, findings, crashes) -- API connectivity and available workflows - -### Workflow Management - -#### `fuzzforge workflows list` -List all available security testing workflows. - -```bash -fuzzforge workflows list -``` - -#### `fuzzforge workflows info <workflow>` -Show detailed information about a specific workflow. 
- -```bash -fuzzforge workflows info security_assessment -``` - -Displays: -- Workflow metadata (version, author, description) -- Parameter schema and requirements -- Supported volume modes and features - -#### `fuzzforge workflows parameters <workflow>` -Interactive parameter builder for workflows. - -```bash -# Interactive mode -fuzzforge workflows parameters security_assessment - -# Save parameters to file -fuzzforge workflows parameters security_assessment --output params.json - -# Non-interactive mode (show schema only) -fuzzforge workflows parameters security_assessment --no-interactive -``` - -### Workflow Execution - -#### `fuzzforge workflow run <workflow> <target>` -Execute a security testing workflow with **automatic file upload**. - -```bash -# Basic execution - CLI automatically detects local files and uploads them -fuzzforge workflow run security_assessment /path/to/code - -# With parameters -fuzzforge workflow run security_assessment /path/to/binary \ - --param timeout=3600 \ - --param iterations=10000 - -# With parameter file -fuzzforge workflow run security_assessment /path/to/code \ - --param-file my-params.json - -# Wait for completion -fuzzforge workflow run security_assessment /path/to/code --wait -``` - -**Automatic File Upload Behavior:** - -The CLI intelligently handles target files based on whether they exist locally: - -1. **Local file/directory exists** → **Automatic upload to MinIO**: - - CLI creates a compressed tarball (`.tar.gz`) for directories - - Uploads via HTTP to backend API - - Backend stores in MinIO with unique `target_id` - - Worker downloads from MinIO when ready to analyze - - āœ… **Works from any machine** (no shared filesystem needed) - -2. 
**Path doesn't exist locally** → **Path-based submission** (legacy): - - Path is sent to backend as-is - - Backend expects target to be accessible on its filesystem - - āš ļø Only works when CLI and backend share filesystem - -**Example workflow:** -```bash -$ ff workflow security_assessment ./my-project - -šŸ”§ Getting workflow information for: security_assessment -šŸ“¦ Detected local directory: ./my-project (21 files) -šŸ—œļø Creating compressed tarball... -šŸ“¤ Uploading to backend (0.01 MB)... -āœ… Upload complete! Target ID: 548193a1-f73f-4ec1-8068-19ec2660b8e4 - -šŸŽÆ Executing workflow: - Workflow: security_assessment - Target: my-project.tar.gz (uploaded) - Volume Mode: ro - Status: šŸ”„ RUNNING - -āœ… Workflow started successfully! - Execution ID: security_assessment-52781925 -``` - -**Upload Details:** -- **Max file size**: 10 GB (configurable on backend) -- **Compression**: Automatic for directories (reduces upload time) -- **Storage**: Files stored in MinIO (S3-compatible) -- **Lifecycle**: Automatic cleanup after 7 days -- **Caching**: Workers cache downloaded targets for faster repeated workflows - -**Options:** -- `--param, -p` - Parameter in key=value format (can be used multiple times) -- `--param-file, -f` - JSON file containing parameters -- `--volume-mode, -v` - Volume mount mode: `ro` (read-only) or `rw` (read-write) -- `--timeout, -t` - Execution timeout in seconds -- `--interactive/--no-interactive, -i/-n` - Interactive parameter input -- `--wait, -w` - Wait for execution to complete - -**Worker Lifecycle Options (v0.7.0):** -- `--auto-start/--no-auto-start` - Auto-start required worker (default: from config) -- `--auto-stop/--no-auto-stop` - Auto-stop worker after completion (default: from config) - -**Examples:** -```bash -# Worker starts automatically (default behavior) -fuzzforge workflow ossfuzz_campaign . project_name=zlib - -# Disable auto-start (worker must be running already) -fuzzforge workflow ossfuzz_campaign . 
--no-auto-start - -# Auto-stop worker after completion -fuzzforge workflow ossfuzz_campaign . --wait --auto-stop -``` - -#### `fuzzforge workflow status [execution-id]` -Check the status of a workflow execution. - -```bash -# Check specific execution -fuzzforge workflow status abc123def456 - -# Check most recent execution -fuzzforge workflow status -``` - -#### `fuzzforge workflow history` -Show workflow execution history from local database. - -```bash -# List all executions -fuzzforge workflow history - -# Filter by workflow -fuzzforge workflow history --workflow security_assessment - -# Filter by status -fuzzforge workflow history --status completed - -# Limit results -fuzzforge workflow history --limit 10 -``` - -#### `fuzzforge workflow retry <execution-id>` -Retry a workflow with the same or modified parameters. - -```bash -# Retry with same parameters -fuzzforge workflow retry abc123def456 - -# Modify parameters interactively -fuzzforge workflow retry abc123def456 --modify-params -``` - -### Findings Management - -#### `fuzzforge finding [execution-id]` -View security findings for a specific execution. - -```bash -# Display latest findings -fuzzforge finding - -# Display specific execution findings -fuzzforge finding abc123def456 -``` - -#### `fuzzforge findings` -Browse all security findings from local database. - -```bash -# List all findings -fuzzforge findings - -# Show findings history -fuzzforge findings history --limit 20 -``` - -#### `fuzzforge finding export [execution-id]` -Export security findings in various formats. 
- -```bash -# Export latest findings -fuzzforge finding export --format json - -# Export specific execution findings -fuzzforge finding export abc123def456 --format sarif - -# Export as CSV with output file -fuzzforge finding export abc123def456 --format csv --output report.csv - -# Export as HTML report -fuzzforge finding export --format html --output report.html -``` - -### Configuration Management - -#### `fuzzforge config show` -Display current configuration settings. - -```bash -# Show project configuration -fuzzforge config show - -# Show global configuration -fuzzforge config show --global -``` - -#### `fuzzforge config set <key> <value>` -Set a configuration value. - -```bash -# Project settings -fuzzforge config set project.api_url "http://api.fuzzforge.com" -fuzzforge config set project.default_timeout 7200 -fuzzforge config set project.default_workflow "security_assessment" - -# Retention settings -fuzzforge config set retention.max_runs 200 -fuzzforge config set retention.keep_findings_days 120 - -# Preferences -fuzzforge config set preferences.auto_save_findings true -fuzzforge config set preferences.show_progress_bars false - -# Global configuration -fuzzforge config set project.api_url "http://global.api.com" --global -``` - -#### `fuzzforge config get <key>` -Get a specific configuration value. - -```bash -fuzzforge config get project.api_url -fuzzforge config get retention.max_runs --global -``` - -#### `fuzzforge config reset` -Reset configuration to defaults. - -```bash -# Reset project configuration -fuzzforge config reset - -# Reset global configuration -fuzzforge config reset --global - -# Skip confirmation -fuzzforge config reset --force -``` - -#### `fuzzforge config edit` -Open configuration file in default editor. 

```bash
# Edit project configuration
fuzzforge config edit

# Edit global configuration
fuzzforge config edit --global
```

## šŸ—ļø Project Structure

When you initialize a FuzzForge project, the following structure is created:

```
my-project/
ā”œā”€ā”€ .fuzzforge/
│   ā”œā”€ā”€ config.yaml      # Project configuration
│   └── findings.db      # SQLite database
ā”œā”€ā”€ .gitignore           # Updated with FuzzForge entries
└── README.md            # Project README (if created)
```

### Database Schema

The SQLite database stores:

- **runs** - Workflow run history and metadata
- **findings** - Security findings and SARIF data
- **crashes** - Crash reports and fuzzing data

### Configuration Format

Project configuration (`.fuzzforge/config.yaml`):

```yaml
project:
  name: "My Security Project"
  api_url: "http://localhost:8000"
  default_timeout: 3600
  default_workflow: null

retention:
  max_runs: 100
  keep_findings_days: 90

preferences:
  auto_save_findings: true
  show_progress_bars: true
  table_style: "rich"
  color_output: true

workers:
  auto_start_workers: true        # Auto-start workers when needed
  auto_stop_workers: false        # Auto-stop workers after completion
  worker_startup_timeout: 60      # Worker startup timeout (seconds)
  docker_compose_file: null       # Custom docker-compose.yml path
```

## šŸ”§ Advanced Usage

### Parameter Handling

The FuzzForge CLI supports flexible parameter input:

1. **Command line parameters**:
   ```bash
   ff workflow workflow-name /path key1=value1 key2=value2
   ```

2. **Parameter files**:
   ```bash
   echo '{"timeout": 3600, "threads": 4}' > params.json
   ff workflow workflow-name /path --param-file params.json
   ```

3. **Interactive prompts**:
   ```bash
   ff workflow workflow-name /path --interactive
   ```

4. **Parameter builder**:
   ```bash
   ff workflows parameters workflow-name --output my-params.json
   ff workflow workflow-name /path --param-file my-params.json
   ```

### Environment Variables

Override configuration with environment variables:

```bash
export FUZZFORGE_API_URL="http://production.api.com"
export FUZZFORGE_TIMEOUT="7200"
```

### Data Retention

Configure automatic cleanup of old data:

```bash
# Keep only 50 runs
fuzzforge config set retention.max_runs 50

# Keep findings for 30 days
fuzzforge config set retention.keep_findings_days 30
```

### Export Formats

Multiple export formats are supported:

- **JSON** - Simplified findings structure
- **CSV** - Tabular data for spreadsheets
- **HTML** - Interactive web report
- **SARIF** - Static Analysis Results Interchange Format

## šŸ› ļø Development

### Setup Development Environment

```bash
# Clone repository
git clone https://github.com/FuzzingLabs/fuzzforge_alpha.git
cd fuzzforge_alpha/cli

# Install in development mode
uv sync
uv add --editable ../sdk

# Install CLI in editable mode
uv tool install --editable .
```

### Project Structure

```
cli/
ā”œā”€ā”€ src/fuzzforge_cli/
│   ā”œā”€ā”€ __init__.py
│   ā”œā”€ā”€ main.py              # Main CLI app
│   ā”œā”€ā”€ config.py            # Configuration management
│   ā”œā”€ā”€ database.py          # Database operations
│   ā”œā”€ā”€ exceptions.py        # Error handling
│   ā”œā”€ā”€ api_validation.py    # API response validation
│   └── commands/            # Command implementations
│       ā”œā”€ā”€ init.py          # Project initialization
│       ā”œā”€ā”€ workflows.py     # Workflow management
│       ā”œā”€ā”€ runs.py          # Run management
│       ā”œā”€ā”€ findings.py      # Findings management
│       ā”œā”€ā”€ config.py        # Configuration commands
│       └── status.py        # Status information
ā”œā”€ā”€ pyproject.toml           # Project configuration
└── README.md                # This file
```

### Running Tests

```bash
# Run tests (when available)
uv run pytest

# Code formatting
uv run black src/
uv run isort src/

# Type checking
uv run mypy src/
```

## āš ļø Troubleshooting

### Common Issues

#### "No FuzzForge project found"
```bash
# Initialize a project first
ff init
```

#### API Connection Failed
```bash
# Check API URL configuration
fuzzforge config get project.api_url

# Test API connectivity
fuzzforge status

# Update API URL if needed
fuzzforge config set project.api_url "http://correct-url:8000"
```

#### Permission Errors
```bash
# Ensure proper permissions for project directory
chmod -R 755 .fuzzforge/

# Check file ownership
ls -la .fuzzforge/
```

#### Database Issues
```bash
# Check database file exists
ls -la .fuzzforge/findings.db

# Reinitialize if corrupted (will lose data)
rm .fuzzforge/findings.db
ff init --force
```

### Environment Variables

Set these environment variables for debugging:

```bash
export FUZZFORGE_DEBUG=1         # Enable debug logging
export FUZZFORGE_API_URL="..."   # Override API URL
export FUZZFORGE_TIMEOUT="30"    # Override timeout
```

### Getting Help

```bash
# General help
fuzzforge --help

# Command-specific help
ff workflows --help
ff workflow run --help

# Show version
fuzzforge --version
```

## šŸ† Example Workflow

Here's a complete example of analyzing a project:

```bash
# 1. Initialize project
mkdir my-security-audit
cd my-security-audit
ff init --name "Security Audit 2024"

# 2. Check available workflows
fuzzforge workflows list

# 3. Submit comprehensive security assessment
ff workflow security_assessment /path/to/source/code --wait

# 4. View findings in table format
fuzzforge finding

# 5. Export detailed report
fuzzforge finding export --format html --output security_report.html

# 6. Check project statistics
fuzzforge status
```

## šŸ“œ License

This project is licensed under the terms specified in the main FuzzForge repository.

## šŸ¤ Contributing

Contributions are welcome! Please see the main FuzzForge repository for contribution guidelines.

---

**FuzzForge CLI** - Making security testing workflows accessible and efficient from the command line.
diff --git a/cli/completion_install.py b/cli/completion_install.py
deleted file mode 100644
index bc1784d..0000000
--- a/cli/completion_install.py
+++ /dev/null
@@ -1,323 +0,0 @@
#!/usr/bin/env python3
# Copyright (c) 2025 FuzzingLabs
#
# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
# at the root of this repository for details.
#
# After the Change Date (four years from publication), this version of the
# Licensed Work will be made available under the Apache License, Version 2.0.
# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
#
# Additional attribution and requirements are provided in the NOTICE file.

"""
Install shell completion for FuzzForge CLI.

This script installs completion using Typer's built-in --install-completion command.
"""

import os
import sys
import subprocess
from pathlib import Path

import typer


def run_fuzzforge_completion_install(shell: str) -> bool:
    """Install completion using the fuzzforge CLI itself."""
    try:
        # Use the CLI's built-in completion installation
        result = subprocess.run([
            sys.executable, "-m", "fuzzforge_cli.main",
            "--install-completion", shell
        ], capture_output=True, text=True, cwd=Path(__file__).parent.parent)

        if result.returncode == 0:
            print(f"āœ… {shell.capitalize()} completion installed successfully")
            return True
        else:
            print(f"āŒ Failed to install {shell} completion: {result.stderr}")
            return False

    except Exception as e:
        print(f"āŒ Error installing {shell} completion: {e}")
        return False


def create_manual_completion_scripts():
    """Create manual completion scripts as fallback."""
    scripts = {
        "bash": '''
# FuzzForge CLI completion for bash
_fuzzforge_completion() {
    local IFS=$'\\t'
    local response

    response=$(env COMP_WORDS="${COMP_WORDS[*]}" COMP_CWORD=$COMP_CWORD _FUZZFORGE_COMPLETE=bash_complete $1)

    for completion in $response; do
        IFS=',' read type value <<< "$completion"

        if [[ $type == 'dir' ]]; then
            COMPREPLY=()
            compopt -o dirnames
        elif [[ $type == 'file' ]]; then
            COMPREPLY=()
            compopt -o default
        elif [[ $type == 'plain' ]]; then
            COMPREPLY+=($value)
        fi
    done

    return 0
}

complete -o nosort -F _fuzzforge_completion fuzzforge
''',

        "zsh": '''
#compdef fuzzforge

_fuzzforge_completion() {
    local -a completions
    local -a completions_with_descriptions
    local -a response
    response=(${(f)"$(env COMP_WORDS="${words[*]}" COMP_CWORD=$((CURRENT-1)) _FUZZFORGE_COMPLETE=zsh_complete fuzzforge)"})

    for type_and_line in $response; do
        if [[ "$type_and_line" =~ ^([^,]*),(.*)$ ]]; then
            local type="$match[1]"
            local line="$match[2]"

            if [[ "$type" == "dir" ]]; then
                _path_files -/
            elif [[ "$type" == "file" ]]; then
                _path_files -f
            elif [[ "$type" == "plain" ]]; then
                if [[ "$line" =~ ^([^:]*):(.*)$ ]]; then
                    completions_with_descriptions+=("$match[1]":"$match[2]")
                else
                    completions+=("$line")
                fi
            fi
        fi
    done

    if [ -n "$completions_with_descriptions" ]; then
        _describe "" completions_with_descriptions -V unsorted
    fi

    if [ -n "$completions" ]; then
        compadd -U -V unsorted -a completions
    fi
}

compdef _fuzzforge_completion fuzzforge;
''',

        "fish": '''
# FuzzForge CLI completion for fish
function __fuzzforge_completion
    set -l response

    for value in (env _FUZZFORGE_COMPLETE=fish_complete COMP_WORDS=(commandline -cp) COMP_CWORD=(commandline -t) fuzzforge)
        set response $response $value
    end

    for completion in $response
        set -l metadata (string split "," $completion)

        if test $metadata[1] = "dir"
            __fish_complete_directories $metadata[2]
        else if test $metadata[1] = "file"
            __fish_complete_path $metadata[2]
        else if test $metadata[1] = "plain"
            echo $metadata[2]
        end
    end
end

complete --no-files --command fuzzforge --arguments "(__fuzzforge_completion)"
'''
    }

    return scripts


def install_bash_completion():
    """Install bash completion."""
    print("šŸ“ Installing bash completion...")

    # Get the manual completion script
    scripts = create_manual_completion_scripts()
    completion_script = scripts["bash"]

    # Try different locations for bash completion
    completion_dirs = [
        Path.home() / ".bash_completion.d",
        Path("/usr/local/etc/bash_completion.d"),
        Path("/etc/bash_completion.d")
    ]

    for completion_dir in completion_dirs:
        try:
            completion_dir.mkdir(exist_ok=True)
            completion_file = completion_dir / "fuzzforge"
            completion_file.write_text(completion_script)
            print(f"āœ… Bash completion installed to: {completion_file}")

            # Add source line to .bashrc if not present
            bashrc = Path.home() / ".bashrc"
            source_line = f"source {completion_file}"

            if bashrc.exists():
                bashrc_content = bashrc.read_text()
                if source_line not in bashrc_content:
                    with bashrc.open("a") as f:
                        f.write(f"\n# FuzzForge CLI completion\n{source_line}\n")
                    print("āœ… Added completion source to ~/.bashrc")

            return True
        except PermissionError:
            continue
        except Exception as e:
            print(f"āŒ Failed to install bash completion: {e}")
            continue

    print("āŒ Could not install bash completion (permission denied)")
    return False


def install_zsh_completion():
    """Install zsh completion."""
    print("šŸ“ Installing zsh completion...")

    # Get the manual completion script
    scripts = create_manual_completion_scripts()
    completion_script = scripts["zsh"]

    # Create completion directory
    comp_dir = Path.home() / ".zsh" / "completions"
    comp_dir.mkdir(parents=True, exist_ok=True)

    try:
        completion_file = comp_dir / "_fuzzforge"
        completion_file.write_text(completion_script)
        print(f"āœ… Zsh completion installed to: {completion_file}")

        # Add fpath to .zshrc if not present
        zshrc = Path.home() / ".zshrc"
        fpath_line = 'fpath=(~/.zsh/completions $fpath)'
        autoload_line = 'autoload -U compinit && compinit'

        if zshrc.exists():
            zshrc_content = zshrc.read_text()
            lines_to_add = []

            if fpath_line not in zshrc_content:
                lines_to_add.append(fpath_line)

            if autoload_line not in zshrc_content:
                lines_to_add.append(autoload_line)

            if lines_to_add:
                with zshrc.open("a") as f:
                    f.write("\n# FuzzForge CLI completion\n")
                    for line in lines_to_add:
                        f.write(f"{line}\n")
                print("āœ… Added completion setup to ~/.zshrc")

        return True
    except Exception as e:
        print(f"āŒ Failed to install zsh completion: {e}")
        return False


def install_fish_completion():
    """Install fish completion."""
    print("šŸ“ Installing fish completion...")

    # Get the manual completion script
    scripts = create_manual_completion_scripts()
    completion_script = scripts["fish"]

    # Fish completion directory
    comp_dir = Path.home() / ".config" / "fish" / "completions"
    comp_dir.mkdir(parents=True, exist_ok=True)

    try:
        completion_file = comp_dir / "fuzzforge.fish"
        completion_file.write_text(completion_script)
        print(f"āœ… Fish completion installed to: {completion_file}")
        return True
    except Exception as e:
        print(f"āŒ Failed to install fish completion: {e}")
        return False


def detect_shell():
    """Detect the current shell."""
    shell_path = os.environ.get('SHELL', '')
    if 'bash' in shell_path:
        return 'bash'
    elif 'zsh' in shell_path:
        return 'zsh'
    elif 'fish' in shell_path:
        return 'fish'
    else:
        return None


def main():
    """Install completion for the current shell or all shells."""
    print("šŸš€ FuzzForge CLI Completion Installer")
    print("=" * 50)

    current_shell = detect_shell()
    if current_shell:
        print(f"🐚 Detected shell: {current_shell}")

    # Check for command line arguments
    if len(sys.argv) > 1 and sys.argv[1] == "--all":
        install_all = True
        print("Installing completion for all shells...")
    else:
        # Ask user which shells to install (with default to current shell only)
        if current_shell:
            install_all = typer.confirm("Install completion for all supported shells (bash, zsh, fish)?", default=False)
            if not install_all:
                print(f"Installing completion for {current_shell} only...")
        else:
            install_all = typer.confirm("Install completion for all supported shells (bash, zsh, fish)?", default=True)

    success_count = 0

    if install_all or current_shell == 'bash':
        if install_bash_completion():
            success_count += 1

    if install_all or current_shell == 'zsh':
        if install_zsh_completion():
            success_count += 1

    if install_all or current_shell == 'fish':
        if install_fish_completion():
            success_count += 1

    print("\n" + "=" * 50)
    if success_count > 0:
        print(f"āœ… Successfully installed completion for {success_count} shell(s)!")
        print("\nšŸ“‹ To activate completion:")
        print("   • Bash: Restart your terminal or run 'source ~/.bashrc'")
        print("   • Zsh: Restart your terminal or run 'source ~/.zshrc'")
        print("   • Fish: Completion is active immediately")
        print("\nšŸ’” Try typing 'fuzzforge ' to test completion!")
    else:
        print("āŒ No completions were installed successfully.")
        return 1

    return 0


if __name__ == "__main__":
    sys.exit(main())
diff --git a/cli/main.py b/cli/main.py
deleted file mode 100644
index 627f3f9..0000000
--- a/cli/main.py
+++ /dev/null
@@ -1,21 +0,0 @@
"""
FuzzForge CLI - Command-line interface for FuzzForge security testing platform.

This module provides the main entry point for the FuzzForge CLI application.
"""
# Copyright (c) 2025 FuzzingLabs
#
# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
# at the root of this repository for details.
#
# After the Change Date (four years from publication), this version of the
# Licensed Work will be made available under the Apache License, Version 2.0.
# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
#
# Additional attribution and requirements are provided in the NOTICE file.

from src.fuzzforge_cli.main import app

if __name__ == "__main__":
    app()
diff --git a/cli/pyproject.toml b/cli/pyproject.toml
deleted file mode 100644
index 4a71d1e..0000000
--- a/cli/pyproject.toml
+++ /dev/null
@@ -1,41 +0,0 @@
[project]
name = "fuzzforge-cli"
version = "0.7.3"
description = "FuzzForge CLI - Command-line interface for FuzzForge security testing platform"
readme = "README.md"
authors = [
    { name = "Tanguy Duhamel", email = "tduhamel@fuzzinglabs.com" }
]
requires-python = ">=3.11"
dependencies = [
    "typer>=0.12.0",
    "rich>=13.0.0",
    "pyyaml>=6.0.0",
    "pydantic>=2.0.0",
    "httpx>=0.27.0",
    "websockets>=13.0",
    "sseclient-py>=1.8.0",
    "fuzzforge-sdk",
    "fuzzforge-ai",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "black>=24.0.0",
    "isort>=5.13.0",
    "mypy>=1.11.0",
]

[project.scripts]
fuzzforge = "fuzzforge_cli.main:main"
ff = "fuzzforge_cli.main:main"

[build-system]
requires = ["uv_build>=0.8.17,<0.9.0"]
build-backend = "uv_build"

[tool.uv.sources]
fuzzforge-sdk = { path = "../sdk", editable = true }
fuzzforge-ai = { path = "../ai", editable = true }
diff --git a/cli/src/fuzzforge_cli/__init__.py b/cli/src/fuzzforge_cli/__init__.py
deleted file mode 100644
index cc4a071..0000000
--- a/cli/src/fuzzforge_cli/__init__.py
+++ /dev/null
@@ -1,19 +0,0 @@
"""
FuzzForge CLI - Command-line interface for FuzzForge security testing platform.

A comprehensive CLI for managing workflows, runs, findings, and real-time monitoring
with local project management and persistent storage.
"""
# Copyright (c) 2025 FuzzingLabs
#
# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
# at the root of this repository for details.
#
# After the Change Date (four years from publication), this version of the
# Licensed Work will be made available under the Apache License, Version 2.0.
# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
#
# Additional attribution and requirements are provided in the NOTICE file.


__version__ = "0.7.3"
diff --git a/cli/src/fuzzforge_cli/api_validation.py b/cli/src/fuzzforge_cli/api_validation.py
deleted file mode 100644
index 1f9aa52..0000000
--- a/cli/src/fuzzforge_cli/api_validation.py
+++ /dev/null
@@ -1,310 +0,0 @@
"""
API response validation and graceful degradation utilities.
"""
# Copyright (c) 2025 FuzzingLabs
#
# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
# at the root of this repository for details.
#
# After the Change Date (four years from publication), this version of the
# Licensed Work will be made available under the Apache License, Version 2.0.
# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
#
# Additional attribution and requirements are provided in the NOTICE file.


import logging
from typing import Any, Dict, List, Optional

from pydantic import BaseModel, ValidationError as PydanticValidationError

from .exceptions import ValidationError

logger = logging.getLogger(__name__)


class WorkflowMetadata(BaseModel):
    """Expected workflow metadata structure"""
    name: str
    version: str
    author: Optional[str] = None
    description: Optional[str] = None
    parameters: Dict[str, Any] = {}


class RunStatus(BaseModel):
    """Expected run status structure"""
    run_id: str
    workflow: str
    status: str
    created_at: str
    updated_at: str

    @property
    def is_completed(self) -> bool:
        """Check if run is in a completed state"""
        return self.status.lower() in ["completed", "success", "finished"]

    @property
    def is_running(self) -> bool:
        """Check if run is currently running"""
        return self.status.lower() in ["running", "in_progress", "active"]

    @property
    def is_failed(self) -> bool:
        """Check if run has failed"""
        return self.status.lower() in ["failed", "error", "cancelled"]


class FindingsResponse(BaseModel):
    """Expected findings response structure"""
    run_id: str
    sarif: Dict[str, Any]
    total_issues: Optional[int] = None

    def model_post_init(self, __context: Any) -> None:
        """Validate SARIF structure after initialization"""
        if not self.sarif.get("runs"):
            logger.warning(f"SARIF data for run {self.run_id} missing 'runs' section")
        elif not isinstance(self.sarif["runs"], list):
            logger.warning(f"SARIF 'runs' section is not a list for run {self.run_id}")


def validate_api_response(response_data: Any, expected_model: type[BaseModel],
                          operation: str = "API operation") -> BaseModel:
    """
    Validate API response against expected Pydantic model.

    Args:
        response_data: Raw response data from API
        expected_model: Pydantic model class to validate against
        operation: Description of the operation for error messages

    Returns:
        Validated model instance

    Raises:
        ValidationError: If validation fails
    """
    try:
        return expected_model.model_validate(response_data)
    except PydanticValidationError as e:
        logger.error(f"API response validation failed for {operation}: {e}")
        raise ValidationError(
            f"API response for {operation}",
            str(response_data)[:200] + "..." if len(str(response_data)) > 200 else str(response_data),
            f"valid {expected_model.__name__} format"
        ) from e
    except Exception as e:
        logger.error(f"Unexpected error validating API response for {operation}: {e}")
        raise ValidationError(
            f"API response for {operation}",
            "invalid data",
            f"valid {expected_model.__name__} format"
        ) from e


def validate_sarif_structure(sarif_data: Dict[str, Any]) -> Dict[str, str]:
    """
    Validate basic SARIF structure and return validation issues.

    Args:
        sarif_data: SARIF data dictionary

    Returns:
        Dictionary of validation issues found
    """
    issues = {}

    # Check basic SARIF structure
    if not isinstance(sarif_data, dict):
        issues["structure"] = "SARIF data is not a dictionary"
        return issues

    if "runs" not in sarif_data:
        issues["runs"] = "Missing 'runs' section in SARIF data"
    elif not isinstance(sarif_data["runs"], list):
        issues["runs_type"] = "'runs' section is not a list"
    elif len(sarif_data["runs"]) == 0:
        issues["runs_empty"] = "'runs' section is empty"
    else:
        # Check first run structure
        run = sarif_data["runs"][0]
        if not isinstance(run, dict):
            issues["run_structure"] = "First run is not a dictionary"
        else:
            if "results" not in run:
                issues["results"] = "Missing 'results' section in run"
            elif not isinstance(run["results"], list):
                issues["results_type"] = "'results' section is not a list"

            if "tool" not in run:
                issues["tool"] = "Missing 'tool' section in run"
            elif not isinstance(run["tool"], dict):
                issues["tool_type"] = "'tool' section is not a dictionary"

    return issues


def safe_extract_sarif_summary(sarif_data: Dict[str, Any]) -> Dict[str, Any]:
    """
    Safely extract summary information from SARIF data with fallbacks.

    Args:
        sarif_data: SARIF data dictionary

    Returns:
        Summary dictionary with safe defaults
    """
    summary = {
        "total_issues": 0,
        "by_severity": {},
        "by_rule": {},
        "tools": [],
        "validation_issues": []
    }

    # Validate structure first
    validation_issues = validate_sarif_structure(sarif_data)
    if validation_issues:
        summary["validation_issues"] = list(validation_issues.values())
        logger.warning(f"SARIF validation issues: {validation_issues}")

    try:
        runs = sarif_data.get("runs", [])
        if not runs:
            return summary

        run = runs[0]
        results = run.get("results", [])

        summary["total_issues"] = len(results)

        # Count by severity/level
        for result in results:
            try:
                level = result.get("level", "note")
                rule_id = result.get("ruleId", "unknown")

                summary["by_severity"][level] = summary["by_severity"].get(level, 0) + 1
                summary["by_rule"][rule_id] = summary["by_rule"].get(rule_id, 0) + 1
            except Exception as e:
                logger.warning(f"Failed to process result: {e}")
                continue

        # Extract tool information safely
        try:
            tool = run.get("tool", {})
            driver = tool.get("driver", {})
            if driver.get("name"):
                summary["tools"].append({
                    "name": driver.get("name", "unknown"),
                    "version": driver.get("version", "unknown"),
                    "rules": len(driver.get("rules", []))
                })
        except Exception as e:
            logger.warning(f"Failed to extract tool information: {e}")

    except Exception as e:
        logger.error(f"Failed to extract SARIF summary: {e}")
        summary["validation_issues"].append(f"Summary extraction failed: {e}")

    return summary


def validate_workflow_parameters(parameters: Dict[str, Any],
                                 workflow_schema: Dict[str, Any]) -> List[str]:
    """
    Validate workflow parameters against schema with detailed error messages.

    Args:
        parameters: Parameters to validate
        workflow_schema: JSON schema for the workflow

    Returns:
        List of validation error messages
    """
    errors = []

    try:
        properties = workflow_schema.get("properties", {})
        required = set(workflow_schema.get("required", []))

        # Check required parameters
        missing_required = required - set(parameters.keys())
        if missing_required:
            errors.append(f"Missing required parameters: {', '.join(missing_required)}")

        # Validate individual parameters
        for param_name, param_value in parameters.items():
            if param_name not in properties:
                errors.append(f"Unknown parameter: {param_name}")
                continue

            param_schema = properties[param_name]
            param_type = param_schema.get("type", "string")

            # Type validation (bool is a subclass of int in Python, so exclude it
            # explicitly when checking the numeric types)
            if param_type == "integer" and (isinstance(param_value, bool) or not isinstance(param_value, int)):
                errors.append(f"Parameter '{param_name}' must be an integer")
            elif param_type == "number" and (isinstance(param_value, bool) or not isinstance(param_value, (int, float))):
                errors.append(f"Parameter '{param_name}' must be a number")
            elif param_type == "boolean" and not isinstance(param_value, bool):
                errors.append(f"Parameter '{param_name}' must be a boolean")
            elif param_type == "array" and not isinstance(param_value, list):
                errors.append(f"Parameter '{param_name}' must be an array")

            # Range validation for numbers
            if param_type in ["integer", "number"] and isinstance(param_value, (int, float)) and not isinstance(param_value, bool):
                minimum = param_schema.get("minimum")
                maximum = param_schema.get("maximum")

                if minimum is not None and param_value < minimum:
                    errors.append(f"Parameter '{param_name}' must be >= {minimum}")
                if maximum is not None and param_value > maximum:
                    errors.append(f"Parameter '{param_name}' must be <= {maximum}")

    except Exception as e:
        logger.error(f"Parameter validation failed: {e}")
        errors.append(f"Parameter validation error: {e}")

    return errors


def create_fallback_response(response_type: str, **kwargs) -> Dict[str, Any]:
    """
    Create fallback responses when API calls fail.

    Args:
        response_type: Type of response to create
        **kwargs: Additional data for the fallback

    Returns:
        Fallback response dictionary
    """
    fallbacks = {
        "workflow_list": {
            "workflows": [],
            "message": "Unable to fetch workflows from API"
        },
        "run_status": {
            "run_id": kwargs.get("run_id", "unknown"),
            "workflow": kwargs.get("workflow", "unknown"),
            "status": "unknown",
            "created_at": kwargs.get("created_at", "unknown"),
            "updated_at": kwargs.get("updated_at", "unknown"),
            "message": "Unable to fetch run status from API"
        },
        "findings": {
            "run_id": kwargs.get("run_id", "unknown"),
            "sarif": {
                "version": "2.1.0",
                "runs": []
            },
            "message": "Unable to fetch findings from API"
        }
    }

    fallback = fallbacks.get(response_type, {"message": f"No fallback available for {response_type}"})
    logger.info(f"Using fallback response for {response_type}: {fallback.get('message', 'Unknown fallback')}")

    return fallback
diff --git a/cli/src/fuzzforge_cli/commands/__init__.py b/cli/src/fuzzforge_cli/commands/__init__.py
deleted file mode 100644
index afcf0d9..0000000
--- a/cli/src/fuzzforge_cli/commands/__init__.py
+++ /dev/null
@@ -1,17 +0,0 @@
"""
Command modules for FuzzForge CLI.
"""
# Copyright (c) 2025 FuzzingLabs
#
# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
# at the root of this repository for details.
#
# After the Change Date (four years from publication), this version of the
# Licensed Work will be made available under the Apache License, Version 2.0.
# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
#
# Additional attribution and requirements are provided in the NOTICE file.

from . import worker

__all__ = ["worker"]
diff --git a/cli/src/fuzzforge_cli/commands/ai.py b/cli/src/fuzzforge_cli/commands/ai.py
deleted file mode 100644
index a5834dd..0000000
--- a/cli/src/fuzzforge_cli/commands/ai.py
+++ /dev/null
@@ -1,95 +0,0 @@
"""AI integration commands for the FuzzForge CLI."""
# Copyright (c) 2025 FuzzingLabs
#
# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
# at the root of this repository for details.
#
# After the Change Date (four years from publication), this version of the
# Licensed Work will be made available under the Apache License, Version 2.0.
# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
#
# Additional attribution and requirements are provided in the NOTICE file.


from __future__ import annotations

import asyncio
import os

import typer
from rich.console import Console


console = Console()
app = typer.Typer(name="ai", help="Interact with the FuzzForge AI system")


@app.command("agent")
def ai_agent() -> None:
    """Launch the full AI agent CLI with A2A orchestration."""
    console.print("[cyan]šŸ¤– Opening Project FuzzForge AI Agent session[/cyan]\n")
    try:
        from fuzzforge_ai.cli import FuzzForgeCLI

        cli = FuzzForgeCLI()
        asyncio.run(cli.run())
    except ImportError as exc:
        console.print(f"[red]Failed to import AI CLI:[/red] {exc}")
        raise typer.Exit(1) from exc


# Memory + health commands
@app.command("status")
def ai_status() -> None:
    """Show AI system health and configuration."""
    # TODO: Implement AI status checking
    # This command is a placeholder for future health monitoring functionality
    console.print("🚧 [yellow]AI status command is not yet implemented.[/yellow]")
    console.print("\nPlanned features:")
    console.print("  • LLM provider connectivity")
    console.print("  • API key validation")
    console.print("  • Registered agents status")
    console.print("  • Memory/session persistence health")
    console.print("\nFor now, use [cyan]ff ai agent[/cyan] to launch the AI agent.")


@app.command("server")
def ai_server(
    port: int = typer.Option(10100, "--port", "-p", help="Server port (default: 10100)"),
) -> None:
    """Start AI system as an A2A server."""
    console.print(f"[cyan]šŸš€ Starting FuzzForge AI Server on port {port}[/cyan]")
    console.print("[dim]Other agents can register this instance at the A2A endpoint[/dim]\n")

    try:
        os.environ["FUZZFORGE_PORT"] = str(port)
        from fuzzforge_ai.__main__ import main as start_server

        start_server()
    except Exception as exc:  # pragma: no cover
        console.print(f"[red]Failed to start AI server:[/red] {exc}")
        raise typer.Exit(1) from exc


# ---------------------------------------------------------------------------
# Helper functions (largely adapted from the OSS implementation)
# ---------------------------------------------------------------------------


@app.callback(invoke_without_command=True)
def ai_callback(ctx: typer.Context):
    """
    šŸ¤– AI integration features
    """
    # Check if a subcommand is being invoked
    if ctx.invoked_subcommand is not None:
        # Let the subcommand handle it
        return

    # Show not implemented message for default command
    console.print("🚧 [yellow]AI command is not fully implemented yet.[/yellow]")
    console.print("Please use specific subcommands:")
    console.print("  • [cyan]ff ai agent[/cyan] - Launch the full AI agent CLI")
    console.print("  • [cyan]ff ai status[/cyan] - Show AI system health and configuration")
    console.print("  • [cyan]ff ai server[/cyan] - Start AI system as an A2A server")
diff --git a/cli/src/fuzzforge_cli/commands/config.py b/cli/src/fuzzforge_cli/commands/config.py
deleted file mode 100644
index 1373fd7..0000000
--- a/cli/src/fuzzforge_cli/commands/config.py
+++ /dev/null
@@ -1,381 +0,0 @@
"""
Configuration management commands.
"""
# Copyright (c) 2025 FuzzingLabs
#
# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
# at the root of this repository for details.
#
# After the Change Date (four years from publication), this version of the
# Licensed Work will be made available under the Apache License, Version 2.0.
# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
#
# Additional attribution and requirements are provided in the NOTICE file.


import typer
from pathlib import Path
from rich.console import Console
from rich.table import Table
from rich.panel import Panel
from rich.prompt import Confirm
from rich import box

from ..config import (
    get_project_config,
    get_global_config,
    save_global_config,
    FuzzForgeConfig
)
from ..exceptions import require_project, ValidationError, handle_error

console = Console()
app = typer.Typer()


@app.command("show")
def show_config(
    global_config: bool = typer.Option(
        False, "--global", "-g",
        help="Show global configuration instead of project config"
    )
):
    """
    šŸ“‹ Display current configuration settings
    """
    if global_config:
        config = get_global_config()
        config_type = "Global"
        config_path = Path.home() / ".config" / "fuzzforge" / "config.yaml"
    else:
        try:
            require_project()
            config = get_project_config()
            if not config:
                raise ValidationError("project configuration", "missing", "initialized project")
        except Exception as e:
            handle_error(e, "loading project configuration")
            return  # Unreachable, but makes static analysis happy
        config_type = "Project"
        config_path = Path.cwd() / ".fuzzforge" / "config.yaml"

    console.print(f"\nāš™ļø [bold]{config_type} Configuration[/bold]\n")

    # Project settings
    project_table = Table(show_header=False, box=box.SIMPLE)
    project_table.add_column("Setting", style="bold cyan")
    project_table.add_column("Value")

    project_table.add_row("Project Name", config.project.name)
    project_table.add_row("API URL", config.project.api_url)
    project_table.add_row("Default Timeout", f"{config.project.default_timeout}s")
    if config.project.default_workflow:
        project_table.add_row("Default Workflow", config.project.default_workflow)

    console.print(
        Panel.fit(
            project_table,
            title="šŸ“ Project Settings",
            box=box.ROUNDED
        )
    )

    # Retention settings
    retention_table = Table(show_header=False, box=box.SIMPLE)
    retention_table.add_column("Setting", style="bold cyan")
    retention_table.add_column("Value")

    retention_table.add_row("Max Runs", str(config.retention.max_runs))
    retention_table.add_row("Keep Findings (days)", str(config.retention.keep_findings_days))

    console.print(
        Panel.fit(
            retention_table,
            title="šŸ—„ļø Data Retention",
            box=box.ROUNDED
        )
    )

    # Preferences
    prefs_table = Table(show_header=False, box=box.SIMPLE)
    prefs_table.add_column("Setting", style="bold cyan")
    prefs_table.add_column("Value")

    prefs_table.add_row("Auto Save Findings", "āœ… Yes" if config.preferences.auto_save_findings else "āŒ No")
    prefs_table.add_row("Show Progress Bars", "āœ… Yes" if config.preferences.show_progress_bars else "āŒ No")
    prefs_table.add_row("Table Style", config.preferences.table_style)
    prefs_table.add_row("Color Output", "āœ… Yes" if config.preferences.color_output else "āŒ No")

    console.print(
        Panel.fit(
            prefs_table,
            title="šŸŽØ Preferences",
            box=box.ROUNDED
        )
    )

    console.print(f"\nšŸ“ Config file: [dim]{config_path}[/dim]")


@app.command("set")
def set_config(
    key: str = typer.Argument(..., help="Configuration key to set (e.g., 'project.name', 'project.api_url')"),
    value: str = typer.Argument(..., help="Value to set"),
    global_config: bool = typer.Option(
        False, "--global", "-g",
        help="Set in global configuration instead of project config"
    )
):
    """
    āš™ļø Set a configuration value
    """
    if global_config:
        config = get_global_config()
        config_type = "global"
    else:
        config = get_project_config()
        if not config:
            console.print("āŒ No project
configuration found. Run 'ff init' first.", style="red") - raise typer.Exit(1) - config_type = "project" - - # Parse the key path - key_parts = key.split('.') - if len(key_parts) != 2: - console.print("āŒ Key must be in format 'section.setting' (e.g., 'project.name')", style="red") - raise typer.Exit(1) - - section, setting = key_parts - - try: - # Update configuration - if section == "project": - if setting == "name": - config.project.name = value - elif setting == "api_url": - config.project.api_url = value - elif setting == "default_timeout": - config.project.default_timeout = int(value) - elif setting == "default_workflow": - config.project.default_workflow = value if value.lower() != "none" else None - else: - console.print(f"āŒ Unknown project setting: {setting}", style="red") - raise typer.Exit(1) - - elif section == "retention": - if setting == "max_runs": - config.retention.max_runs = int(value) - elif setting == "keep_findings_days": - config.retention.keep_findings_days = int(value) - else: - console.print(f"āŒ Unknown retention setting: {setting}", style="red") - raise typer.Exit(1) - - elif section == "preferences": - if setting == "auto_save_findings": - config.preferences.auto_save_findings = value.lower() in ("true", "yes", "1", "on") - elif setting == "show_progress_bars": - config.preferences.show_progress_bars = value.lower() in ("true", "yes", "1", "on") - elif setting == "table_style": - config.preferences.table_style = value - elif setting == "color_output": - config.preferences.color_output = value.lower() in ("true", "yes", "1", "on") - else: - console.print(f"āŒ Unknown preferences setting: {setting}", style="red") - raise typer.Exit(1) - - else: - console.print(f"āŒ Unknown configuration section: {section}", style="red") - console.print("Valid sections: project, retention, preferences", style="dim") - raise typer.Exit(1) - - # Save configuration - if global_config: - save_global_config(config) - else: - config_path = Path.cwd() / 
".fuzzforge" / "config.yaml" - config.save_to_file(config_path) - - console.print(f"āœ… Set {config_type} configuration: [bold cyan]{key}[/bold cyan] = [bold]{value}[/bold]", style="green") - - except ValueError as e: - console.print(f"āŒ Invalid value for {key}: {e}", style="red") - raise typer.Exit(1) - except Exception as e: - console.print(f"āŒ Failed to set configuration: {e}", style="red") - raise typer.Exit(1) - - -@app.command("get") -def get_config( - key: str = typer.Argument(..., help="Configuration key to get (e.g., 'project.name')"), - global_config: bool = typer.Option( - False, "--global", "-g", - help="Get from global configuration instead of project config" - ) -): - """ - šŸ“– Get a specific configuration value - """ - if global_config: - config = get_global_config() - else: - config = get_project_config() - if not config: - console.print("āŒ No project configuration found. Run 'ff init' first.", style="red") - raise typer.Exit(1) - - # Parse the key path - key_parts = key.split('.') - if len(key_parts) != 2: - console.print("āŒ Key must be in format 'section.setting' (e.g., 'project.name')", style="red") - raise typer.Exit(1) - - section, setting = key_parts - - try: - # Get configuration value - if section == "project": - if setting == "name": - value = config.project.name - elif setting == "api_url": - value = config.project.api_url - elif setting == "default_timeout": - value = config.project.default_timeout - elif setting == "default_workflow": - value = config.project.default_workflow or "none" - else: - console.print(f"āŒ Unknown project setting: {setting}", style="red") - raise typer.Exit(1) - - elif section == "retention": - if setting == "max_runs": - value = config.retention.max_runs - elif setting == "keep_findings_days": - value = config.retention.keep_findings_days - else: - console.print(f"āŒ Unknown retention setting: {setting}", style="red") - raise typer.Exit(1) - - elif section == "preferences": - if setting == 
"auto_save_findings": - value = config.preferences.auto_save_findings - elif setting == "show_progress_bars": - value = config.preferences.show_progress_bars - elif setting == "table_style": - value = config.preferences.table_style - elif setting == "color_output": - value = config.preferences.color_output - else: - console.print(f"āŒ Unknown preferences setting: {setting}", style="red") - raise typer.Exit(1) - - else: - console.print(f"āŒ Unknown configuration section: {section}", style="red") - raise typer.Exit(1) - - console.print(f"{key}: [bold cyan]{value}[/bold cyan]") - - except Exception as e: - console.print(f"āŒ Failed to get configuration: {e}", style="red") - raise typer.Exit(1) - - -@app.command("reset") -def reset_config( - global_config: bool = typer.Option( - False, "--global", "-g", - help="Reset global configuration instead of project config" - ), - force: bool = typer.Option( - False, "--force", "-f", - help="Skip confirmation prompt" - ) -): - """ - šŸ”„ Reset configuration to defaults - """ - config_type = "global" if global_config else "project" - - if not force: - if not Confirm.ask(f"Reset {config_type} configuration to defaults?", default=False, console=console): - console.print("āŒ Reset cancelled", style="yellow") - raise typer.Exit(0) - - try: - # Create new default configuration - new_config = FuzzForgeConfig() - - if global_config: - save_global_config(new_config) - else: - if not Path.cwd().joinpath(".fuzzforge").exists(): - console.print("āŒ No project configuration found. 
Run 'ff init' first.", style="red") - raise typer.Exit(1) - - config_path = Path.cwd() / ".fuzzforge" / "config.yaml" - new_config.save_to_file(config_path) - - console.print(f"āœ… {config_type.title()} configuration reset to defaults", style="green") - - except Exception as e: - console.print(f"āŒ Failed to reset configuration: {e}", style="red") - raise typer.Exit(1) - - -@app.command("edit") -def edit_config( - global_config: bool = typer.Option( - False, "--global", "-g", - help="Edit global configuration instead of project config" - ) -): - """ - šŸ“ Open configuration file in default editor - """ - import subprocess - - if global_config: - config_path = Path.home() / ".config" / "fuzzforge" / "config.yaml" - config_type = "global" - else: - config_path = Path.cwd() / ".fuzzforge" / "config.yaml" - config_type = "project" - - if not config_path.exists(): - hint = " Run 'ff init' first." if config_type == "project" else "" - console.print(f"āŒ No {config_type} configuration found at {config_path}.{hint}", style="red") - raise typer.Exit(1) - - # Try to find a suitable editor - editors = ["code", "vim", "nano", "notepad"] - editor = None - - for e in editors: - try: - subprocess.run([e, "--version"], capture_output=True, check=True) - editor = e - break - except (subprocess.CalledProcessError, FileNotFoundError): - continue - - if not editor: - console.print(f"šŸ“ Configuration file: [bold cyan]{config_path}[/bold cyan]") - console.print("āŒ No suitable editor found. 
Please edit the file manually.", style="red") - raise typer.Exit(1) - - try: - console.print(f"šŸ“ Opening {config_type} configuration in {editor}...") - subprocess.run([editor, str(config_path)], check=True) - console.print("āœ… Configuration file edited", style="green") - - except subprocess.CalledProcessError as e: - console.print(f"āŒ Failed to open editor: {e}", style="red") - raise typer.Exit(1) - - -@app.callback() -def config_callback(): - """ - āš™ļø Manage configuration settings - """ - pass diff --git a/cli/src/fuzzforge_cli/commands/findings.py b/cli/src/fuzzforge_cli/commands/findings.py deleted file mode 100644 index 7058527..0000000 --- a/cli/src/fuzzforge_cli/commands/findings.py +++ /dev/null @@ -1,1076 +0,0 @@ -""" -Findings and security results management commands. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
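The findings module below works almost entirely on SARIF documents, repeatedly walking `runs[].results[]` and tallying results by their `level` field. A minimal, self-contained sketch of that core pattern — the helper name `tally_by_severity` and the sample document are illustrative, not part of the CLI:

```python
def tally_by_severity(sarif: dict) -> dict:
    """Count SARIF results per severity level.

    Assumes the standard SARIF 2.1.0 shape (runs[].results[].level),
    defaulting to "note" when a result records no level -- the same
    convention the findings commands below use.
    """
    counts = {}
    for run in sarif.get("runs", []):
        for result in run.get("results", []):
            level = result.get("level", "note")
            counts[level] = counts.get(level, 0) + 1
    return counts


# Hypothetical sample document, not real scanner output
sample = {
    "version": "2.1.0",
    "runs": [{"results": [
        {"ruleId": "hardcoded-secret", "level": "error"},
        {"ruleId": "weak-hash", "level": "warning"},
        {"ruleId": "todo-comment"},  # no level -> counted as "note"
    ]}],
}
print(tally_by_severity(sample))  # → {'error': 1, 'warning': 1, 'note': 1}
```

The same `.get(..., default)` chaining tolerates missing keys at every level of the document, which is why the commands below can consume partial or tool-specific SARIF without crashing.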
- - -import json -import csv -from datetime import datetime -from pathlib import Path -from typing import Optional, Dict, Any, List - -import typer -from rich.console import Console -from rich.table import Table -from rich.panel import Panel -from rich.syntax import Syntax -from rich.text import Text -from rich import box - -from ..config import get_project_config, FuzzForgeConfig -from ..database import get_project_db, ensure_project_db, FindingRecord -from ..exceptions import ( - retry_on_network_error, validate_run_id, - require_project, ValidationError -) -from fuzzforge_sdk import FuzzForgeClient - -console = Console() -app = typer.Typer() - - -@retry_on_network_error(max_retries=3, delay=1.0) -def get_client() -> FuzzForgeClient: - """Get configured FuzzForge client with retry on network errors""" - config = get_project_config() or FuzzForgeConfig() - return FuzzForgeClient(base_url=config.get_api_url(), timeout=config.get_timeout()) - - -def severity_style(severity: str) -> str: - """Get rich style for severity level""" - return { - "error": "bold red", - "warning": "bold yellow", - "note": "bold blue", - "info": "bold cyan" - }.get(severity.lower(), "white") - - -@app.command("get") -def get_findings( - run_id: str = typer.Argument(..., help="Run ID to get findings for"), - save: bool = typer.Option( - True, "--save/--no-save", - help="Save findings to local database" - ), - format: str = typer.Option( - "table", "--format", "-f", - help="Output format: table, json, sarif" - ) -): - """ - šŸ” Retrieve and display security findings for a run - """ - try: - require_project() - validate_run_id(run_id) - - if format not in ["table", "json", "sarif"]: - raise ValidationError("format", format, "one of: table, json, sarif") - with get_client() as client: - console.print(f"šŸ” Fetching findings for run: {run_id}") - findings = client.get_run_findings(run_id) - - # Save to database if requested - if save: - try: - db = ensure_project_db() - - # Extract summary 
from SARIF - sarif_data = findings.sarif - runs_data = sarif_data.get("runs", []) - summary = {} - - if runs_data: - results = runs_data[0].get("results", []) - summary = { - "total_issues": len(results), - "by_severity": {}, - "by_rule": {}, - "tools": [] - } - - for result in results: - level = result.get("level", "note") - rule_id = result.get("ruleId", "unknown") - - summary["by_severity"][level] = summary["by_severity"].get(level, 0) + 1 - summary["by_rule"][rule_id] = summary["by_rule"].get(rule_id, 0) + 1 - - # Extract tool info - tool = runs_data[0].get("tool", {}) - driver = tool.get("driver", {}) - if driver.get("name"): - summary["tools"].append({ - "name": driver.get("name"), - "version": driver.get("version"), - "rules": len(driver.get("rules", [])) - }) - - finding_record = FindingRecord( - run_id=run_id, - sarif_data=sarif_data, - summary=summary, - created_at=datetime.now() - ) - db.save_findings(finding_record) - console.print("āœ… Findings saved to local database", style="green") - except Exception as e: - console.print(f"āš ļø Failed to save findings to database: {e}", style="yellow") - - # Display findings - if format == "json": - findings_json = json.dumps(findings.sarif, indent=2) - console.print(Syntax(findings_json, "json", theme="monokai")) - - elif format == "sarif": - sarif_json = json.dumps(findings.sarif, indent=2) - console.print(sarif_json) - - else: # table format - display_findings_table(findings.sarif) - - # Suggest export command and show command - console.print(f"\nšŸ’” View full details of a finding: [bold cyan]ff finding show {run_id} --rule <rule_id>[/bold cyan]") - console.print(f"šŸ’” Export these findings: [bold cyan]ff findings export {run_id} --format sarif[/bold cyan]") - console.print(" Supported formats: [cyan]sarif[/cyan] (standard), [cyan]json[/cyan], [cyan]csv[/cyan], [cyan]html[/cyan]") - - except Exception as e: - console.print(f"āŒ Failed to get findings: {e}", style="red") - raise typer.Exit(1) - - -def show_finding( - 
run_id: str = typer.Argument(..., help="Run ID to get finding from"), - rule_id: str = typer.Option(..., "--rule", "-r", help="Rule ID of the specific finding to show") -): - """ - šŸ” Show detailed information about a specific finding - - This function is registered as a command in main.py under the finding (singular) command group. - """ - try: - require_project() - validate_run_id(run_id) - - # Try to get from database first, fallback to API - db = get_project_db() - findings_data = None - if db: - findings_data = db.get_findings(run_id) - - if not findings_data: - with get_client() as client: - console.print(f"šŸ” Fetching findings for run: {run_id}") - findings = client.get_run_findings(run_id) - sarif_data = findings.sarif - else: - sarif_data = findings_data.sarif_data - - # Find the specific finding by rule_id - runs = sarif_data.get("runs", []) - if not runs: - console.print("āŒ No findings data available", style="red") - raise typer.Exit(1) - - run_data = runs[0] - results = run_data.get("results", []) - tool = run_data.get("tool", {}).get("driver", {}) - - # Search for matching finding - matching_finding = None - for result in results: - if result.get("ruleId") == rule_id: - matching_finding = result - break - - if not matching_finding: - console.print(f"āŒ No finding found with rule ID: {rule_id}", style="red") - console.print(f"šŸ’” Use [bold cyan]ff findings get {run_id}[/bold cyan] to see all findings", style="dim") - raise typer.Exit(1) - - # Display detailed finding - display_finding_detail(matching_finding, tool, run_id) - - except Exception as e: - console.print(f"āŒ Failed to get finding: {e}", style="red") - raise typer.Exit(1) - - -def display_finding_detail(finding: Dict[str, Any], tool: Dict[str, Any], run_id: str): - """Display detailed information about a single finding""" - rule_id = finding.get("ruleId", "unknown") - level = finding.get("level", "note") - message = finding.get("message", {}) - message_text = message.get("text", "No 
summary available") - message_markdown = message.get("markdown", message_text) - - # Get location - locations = finding.get("locations", []) - location_str = "Unknown location" - code_snippet = None - - if locations: - physical_location = locations[0].get("physicalLocation", {}) - artifact_location = physical_location.get("artifactLocation", {}) - region = physical_location.get("region", {}) - - file_path = artifact_location.get("uri", "") - if file_path: - location_str = file_path - if region.get("startLine"): - location_str += f":{region['startLine']}" - if region.get("startColumn"): - location_str += f":{region['startColumn']}" - - # Get code snippet if available - if region.get("snippet", {}).get("text"): - code_snippet = region["snippet"]["text"].strip() - - # Get severity style - severity_color = { - "error": "red", - "warning": "yellow", - "note": "blue", - "info": "cyan" - }.get(level.lower(), "white") - - # Build detailed content - content_lines = [] - content_lines.append(f"[bold]Rule ID:[/bold] {rule_id}") - content_lines.append(f"[bold]Severity:[/bold] [{severity_color}]{level.upper()}[/{severity_color}]") - content_lines.append(f"[bold]Location:[/bold] {location_str}") - content_lines.append(f"[bold]Tool:[/bold] {tool.get('name', 'Unknown')} v{tool.get('version', 'unknown')}") - content_lines.append(f"[bold]Run ID:[/bold] {run_id}") - content_lines.append("") - content_lines.append("[bold]Summary:[/bold]") - content_lines.append(message_text) - content_lines.append("") - content_lines.append("[bold]Description:[/bold]") - content_lines.append(message_markdown) - - if code_snippet: - content_lines.append("") - content_lines.append("[bold]Code Snippet:[/bold]") - content_lines.append(f"[dim]{code_snippet}[/dim]") - - content = "\n".join(content_lines) - - # Display in panel - console.print() - console.print(Panel( - content, - title="šŸ” Finding Detail", - border_style=severity_color, - box=box.ROUNDED, - padding=(1, 2) - )) - console.print() - 
console.print(f"šŸ’” Export this run: [bold cyan]ff findings export {run_id} --format sarif[/bold cyan]") - - -def display_findings_table(sarif_data: Dict[str, Any]): - """Display SARIF findings in a rich table format""" - runs = sarif_data.get("runs", []) - if not runs: - console.print("ā„¹ļø No findings data available", style="dim") - return - - run_data = runs[0] - results = run_data.get("results", []) - tool = run_data.get("tool", {}) - driver = tool.get("driver", {}) - - # Tool information - console.print("\nšŸ” [bold]Security Analysis Results[/bold]") - if driver.get("name"): - console.print(f"Tool: {driver.get('name')} v{driver.get('version', 'unknown')}") - - if not results: - console.print("āœ… No security issues found!", style="green") - return - - # Summary statistics - summary_by_level = {} - for result in results: - level = result.get("level", "note") - summary_by_level[level] = summary_by_level.get(level, 0) + 1 - - summary_table = Table(show_header=False, box=box.SIMPLE) - summary_table.add_column("Severity", width=15, justify="left", style="bold") - summary_table.add_column("Count", width=8, justify="right", style="bold") - - for level, count in sorted(summary_by_level.items()): - # Create Rich Text object with color styling - level_text = level.upper() - severity_text = Text(level_text, style=severity_style(level)) - count_text = Text(str(count)) - - summary_table.add_row(severity_text, count_text) - - console.print( - Panel.fit( - summary_table, - title=f"šŸ“Š Summary ({len(results)} total issues)", - box=box.ROUNDED - ) - ) - - # Detailed results - Rich Text-based table with proper emoji alignment - results_table = Table(box=box.ROUNDED) - results_table.add_column("Severity", width=12, justify="left", no_wrap=True) - results_table.add_column("Rule", justify="left", style="bold cyan", no_wrap=True) - results_table.add_column("Message", width=45, justify="left", no_wrap=True) - results_table.add_column("Location", width=20, justify="left", 
style="dim", no_wrap=True) - - for result in results[:50]: # Limit to first 50 results - level = result.get("level", "note") - rule_id = result.get("ruleId", "unknown") - message = result.get("message", {}).get("text", "No message") - - # Extract location information - locations = result.get("locations", []) - location_str = "" - if locations: - physical_location = locations[0].get("physicalLocation", {}) - artifact_location = physical_location.get("artifactLocation", {}) - region = physical_location.get("region", {}) - - file_path = artifact_location.get("uri", "") - if file_path: - location_str = Path(file_path).name - if region.get("startLine"): - location_str += f":{region['startLine']}" - if region.get("startColumn"): - location_str += f":{region['startColumn']}" - - # Create Rich Text objects with color styling - severity_text = Text(level.upper(), style=severity_style(level)) - severity_text.truncate(12, overflow="ellipsis") - - # Show full rule ID without truncation - message_text = Text(message) - message_text.truncate(45, overflow="ellipsis") - - location_text = Text(location_str) - location_text.truncate(20, overflow="ellipsis") - - results_table.add_row( - severity_text, - rule_id, # Pass string directly to show full UUID - message_text, - location_text - ) - - console.print("\nšŸ“‹ [bold]Detailed Results[/bold]") - if len(results) > 50: - console.print(f"Showing first 50 of {len(results)} results") - console.print() - console.print(results_table) - - -@app.command("history") -def findings_history( - limit: int = typer.Option(20, "--limit", "-l", help="Maximum number of findings to show") -): - """ - šŸ“š Show findings history from local database - """ - db = get_project_db() - if not db: - console.print("āŒ No FuzzForge project found. 
Run 'ff init' first.", style="red") - raise typer.Exit(1) - - try: - findings = db.list_findings(limit=limit) - - if not findings: - console.print("āŒ No findings found in database", style="red") - return - - table = Table(box=box.ROUNDED) - table.add_column("Run ID", style="bold cyan", width=36) # Full UUID width - table.add_column("Date", justify="center") - table.add_column("Total Issues", justify="center", style="bold") - table.add_column("Errors", justify="center", style="red") - table.add_column("Warnings", justify="center", style="yellow") - table.add_column("Notes", justify="center", style="blue") - table.add_column("Tools", style="dim") - - for finding in findings: - summary = finding.summary - total_issues = summary.get("total_issues", 0) - by_severity = summary.get("by_severity", {}) - tools = summary.get("tools", []) - - tool_names = ", ".join([tool.get("name", "Unknown") for tool in tools]) - - table.add_row( - finding.run_id, # Show full Run ID - finding.created_at.strftime("%m-%d %H:%M"), - str(total_issues), - str(by_severity.get("error", 0)), - str(by_severity.get("warning", 0)), - str(by_severity.get("note", 0)), - tool_names[:30] + "..." 
if len(tool_names) > 30 else tool_names - ) - - console.print(f"\nšŸ“š [bold]Findings History ({len(findings)})[/bold]\n") - console.print(table) - - console.print("\nšŸ’” Use [bold cyan]fuzzforge finding <run_id>[/bold cyan] to view detailed findings") - - except Exception as e: - console.print(f"āŒ Failed to get findings history: {e}", style="red") - raise typer.Exit(1) - - -@app.command("export") -def export_findings( - run_id: str = typer.Argument(..., help="Run ID to export findings for"), - format: str = typer.Option( - "sarif", "--format", "-f", - help="Export format: sarif (standard), json, csv, html" - ), - output: Optional[str] = typer.Option( - None, "--output", "-o", - help="Output file path (defaults to findings-<run_id>-<timestamp>.<format>)" - ) -): - """ - šŸ“¤ Export security findings in various formats - - SARIF is the standard format for security findings and is recommended - for interoperability with other security tools. Filenames are automatically - made unique with timestamps to prevent overwriting previous exports. - """ - db = get_project_db() - if not db: - console.print("āŒ No FuzzForge project found. 
Run 'ff init' first.", style="red") - raise typer.Exit(1) - - try: - # Get findings from database first, fallback to API - findings_data = db.get_findings(run_id) - if not findings_data: - console.print(f"šŸ“” Fetching findings from API for run: {run_id}") - with get_client() as client: - findings = client.get_run_findings(run_id) - sarif_data = findings.sarif - else: - sarif_data = findings_data.sarif_data - - # Generate output filename with timestamp for uniqueness - if not output: - timestamp = datetime.now().strftime("%Y%m%d-%H%M%S") - output = f"findings-{run_id[:8]}-{timestamp}.{format}" - - output_path = Path(output) - - # Export based on format - if format == "sarif": - with open(output_path, 'w') as f: - json.dump(sarif_data, f, indent=2) - - elif format == "json": - # Simplified JSON format - simplified_data = extract_simplified_findings(sarif_data) - with open(output_path, 'w') as f: - json.dump(simplified_data, f, indent=2) - - elif format == "csv": - export_to_csv(sarif_data, output_path) - - elif format == "html": - export_to_html(sarif_data, output_path, run_id) - - else: - console.print(f"āŒ Unsupported format: {format}", style="red") - raise typer.Exit(1) - - console.print(f"āœ… Findings exported to: [bold cyan]{output_path}[/bold cyan]") - - except Exception as e: - console.print(f"āŒ Failed to export findings: {e}", style="red") - raise typer.Exit(1) - - -def extract_simplified_findings(sarif_data: Dict[str, Any]) -> Dict[str, Any]: - """Extract simplified findings structure from SARIF""" - runs = sarif_data.get("runs", []) - if not runs: - return {"findings": [], "summary": {}} - - run_data = runs[0] - results = run_data.get("results", []) - tool = run_data.get("tool", {}).get("driver", {}) - - simplified = { - "tool": { - "name": tool.get("name", "Unknown"), - "version": tool.get("version", "Unknown") - }, - "summary": { - "total_issues": len(results), - "by_severity": {} - }, - "findings": [] - } - - for result in results: - level = 
result.get("level", "note") - simplified["summary"]["by_severity"][level] = simplified["summary"]["by_severity"].get(level, 0) + 1 - - # Extract location - location_info = {} - locations = result.get("locations", []) - if locations: - physical_location = locations[0].get("physicalLocation", {}) - artifact_location = physical_location.get("artifactLocation", {}) - region = physical_location.get("region", {}) - - location_info = { - "file": artifact_location.get("uri", ""), - "line": region.get("startLine"), - "column": region.get("startColumn") - } - - simplified["findings"].append({ - "rule_id": result.get("ruleId", "unknown"), - "severity": level, - "message": result.get("message", {}).get("text", ""), - "location": location_info - }) - - return simplified - - -def export_to_csv(sarif_data: Dict[str, Any], output_path: Path): - """Export findings to CSV format""" - runs = sarif_data.get("runs", []) - if not runs: - return - - results = runs[0].get("results", []) - - with open(output_path, 'w', newline='', encoding='utf-8') as csvfile: - fieldnames = ['rule_id', 'severity', 'message', 'file', 'line', 'column'] - writer = csv.DictWriter(csvfile, fieldnames=fieldnames) - writer.writeheader() - - for result in results: - location_info = {"file": "", "line": "", "column": ""} - locations = result.get("locations", []) - if locations: - physical_location = locations[0].get("physicalLocation", {}) - artifact_location = physical_location.get("artifactLocation", {}) - region = physical_location.get("region", {}) - - location_info = { - "file": artifact_location.get("uri", ""), - "line": region.get("startLine", ""), - "column": region.get("startColumn", "") - } - - writer.writerow({ - "rule_id": result.get("ruleId", ""), - "severity": result.get("level", "note"), - "message": result.get("message", {}).get("text", ""), - **location_info - }) - - -def export_to_html(sarif_data: Dict[str, Any], output_path: Path, run_id: str): - """Export findings to HTML format""" - runs = 
sarif_data.get("runs", []) - if not runs: - return - - run_data = runs[0] - results = run_data.get("results", []) - tool = run_data.get("tool", {}).get("driver", {}) - - # Simple HTML template - html_content = f"""<!DOCTYPE html>
-<html>
-<head>
-    <title>Security Findings - {run_id}</title>
-</head>
-<body>
-    <h1>Security Findings Report</h1>
-    <p>Run ID: {run_id}</p>
-    <p>Tool: {tool.get('name', 'Unknown')} v{tool.get('version', 'Unknown')}</p>
-    <p>Generated: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}</p>
-    <h2>Summary</h2>
-    <p>Total Issues: {len(results)}</p>
-    <h2>Detailed Findings</h2>
-    <table>
-        <tr>
-            <th>Rule ID</th>
-            <th>Severity</th>
-            <th>Message</th>
-            <th>Location</th>
-        </tr>
-"""
- - for result in results: - level = result.get("level", "note") - rule_id = result.get("ruleId", "unknown") - message = result.get("message", {}).get("text", "") - - # Extract location - location_str = "" - locations = result.get("locations", []) - if locations: - physical_location = locations[0].get("physicalLocation", {}) - artifact_location = physical_location.get("artifactLocation", {}) - region = physical_location.get("region", {}) - - file_path = artifact_location.get("uri", "") - if file_path: - location_str = file_path - if region.get("startLine"): - location_str += f":{region['startLine']}" - - html_content += f"""
-        <tr>
-            <td>{rule_id}</td>
-            <td>{level}</td>
-            <td>{message}</td>
-            <td>{location_str}</td>
-        </tr>
-"""
- - html_content += """
-    </table>
-</body>
-</html>
-"""
- - with open(output_path, 'w', encoding='utf-8') as f: - f.write(html_content) - - -@app.command("all") -def all_findings( - workflow: Optional[str] = typer.Option( - None, "--workflow", "-w", - help="Filter by workflow name" - ), - severity: Optional[str] = typer.Option( - None, "--severity", "-s", - help="Filter by severity levels (comma-separated: error,warning,note,info)" - ), - since: Optional[str] = typer.Option( - None, "--since", - help="Show findings since date (YYYY-MM-DD)" - ), - limit: Optional[int] = typer.Option( - None, "--limit", "-l", - help="Maximum number of findings to show" - ), - export_format: Optional[str] = typer.Option( - None, "--export", "-e", - help="Export format: json, csv, html" - ), - output: Optional[str] = typer.Option( - None, "--output", "-o", - help="Output file for export" - ), - stats_only: bool = typer.Option( - False, "--stats", - help="Show statistics only" - ), - show_findings: bool = typer.Option( - False, "--show-findings", "-f", - help="Show actual findings content, not just summary" - ), - max_findings: int = typer.Option( - 50, "--max-findings", - help="Maximum number of individual findings to display" - ) -): - """ - šŸ“Š Show all findings for the entire project - """ - db = get_project_db() - if not db: - console.print("āŒ No FuzzForge project found. Run 'ff init' first.", style="red") - raise typer.Exit(1) - - try: - # Parse filters - severity_list = None - if severity: - severity_list = [s.strip().lower() for s in severity.split(",")] - - since_date = None - if since: - try: - since_date = datetime.strptime(since, "%Y-%m-%d") - except ValueError: - console.print(f"āŒ Invalid date format: {since}. 
Use YYYY-MM-DD", style="red") - raise typer.Exit(1) - - # Get aggregated stats - stats = db.get_aggregated_stats() - - # Show statistics - if stats_only or not export_format: - # Create summary panel - summary_text = f"""[bold]šŸ“Š Project Security Summary[/bold] - -[cyan]Total Findings Records:[/cyan] {stats['total_findings_records']} -[cyan]Total Runs Analyzed:[/cyan] {stats['total_runs']} -[cyan]Total Security Issues:[/cyan] {stats['total_issues']} -[cyan]Recent Findings (7 days):[/cyan] {stats['recent_findings']} - -[bold]Severity Distribution:[/bold] - šŸ”“ Errors: {stats['severity_distribution'].get('error', 0)} - 🟔 Warnings: {stats['severity_distribution'].get('warning', 0)} - šŸ”µ Notes: {stats['severity_distribution'].get('note', 0)} - ā„¹ļø Info: {stats['severity_distribution'].get('info', 0)} - -[bold]By Workflow:[/bold]""" - - for wf_name, count in stats['workflows'].items(): - summary_text += f"\n • {wf_name}: {count} findings" - - console.print(Panel(summary_text, box=box.ROUNDED, title="FuzzForge Project Analysis", border_style="cyan")) - - if stats_only: - return - - # Get all findings with filters - findings = db.get_all_findings( - workflow=workflow, - severity=severity_list, - since_date=since_date, - limit=limit - ) - - if not findings: - console.print("ā„¹ļø No findings match the specified filters", style="dim") - return - - # Export if requested - if export_format: - if not output: - timestamp = datetime.now().strftime("%Y%m%d_%H%M%S") - output = f"all_findings_{timestamp}.{export_format}" - - export_all_findings(findings, export_format, output) - console.print(f"āœ… Exported {len(findings)} findings to: {output}", style="green") - return - - # Display findings table - table = Table(box=box.ROUNDED, title=f"All Project Findings ({len(findings)} records)") - table.add_column("Run ID", style="bold cyan", width=36) # Full UUID width - table.add_column("Workflow", style="dim", width=20) - table.add_column("Date", justify="center") - 
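The display and export paths in this command all flatten a SARIF result the same way — `ruleId`, `level`, `message.text`, and the first `physicalLocation` — with the extraction logic repeated inline each time. A small helper capturing that shared pattern could look like this (a sketch; the name `summarize_sarif_result` is illustrative and not part of the CLI):

```python
def summarize_sarif_result(result: dict) -> dict:
    """Flatten one SARIF result dict into the fields the tables display."""
    location = "unknown"
    locations = result.get("locations", [])
    if locations:
        physical = locations[0].get("physicalLocation", {})
        uri = physical.get("artifactLocation", {}).get("uri", "")
        line = physical.get("region", {}).get("startLine")
        if uri:
            # SARIF line numbers are optional; only append when present
            location = f"{uri}:{line}" if line else uri
    return {
        "rule_id": result.get("ruleId", "unknown"),
        "level": result.get("level", "note"),  # SARIF defaults to "note"
        "message": result.get("message", {}).get("text", ""),
        "location": location,
    }
```

Each `result.get(...)` default mirrors the fallbacks used throughout this module, so missing SARIF fields degrade to the same placeholder values.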
table.add_column("Issues", justify="center", style="bold") - table.add_column("Errors", justify="center", style="red") - table.add_column("Warnings", justify="center", style="yellow") - table.add_column("Notes", justify="center", style="blue") - - # Get run info for each finding - runs_info = {} - for finding in findings: - run_id = finding.run_id - if run_id not in runs_info: - run_info = db.get_run(run_id) - runs_info[run_id] = run_info - - for finding in findings: - run_id = finding.run_id - run_info = runs_info.get(run_id) - workflow_name = run_info.workflow if run_info else "unknown" - - summary = finding.summary - total_issues = summary.get("total_issues", 0) - by_severity = summary.get("by_severity", {}) - - # Count issues from SARIF data if summary is incomplete - if total_issues == 0 and "runs" in finding.sarif_data: - for run in finding.sarif_data["runs"]: - total_issues += len(run.get("results", [])) - - table.add_row( - run_id, # Show full Run ID - workflow_name[:17] + "..." if len(workflow_name) > 20 else workflow_name, - finding.created_at.strftime("%Y-%m-%d %H:%M"), - str(total_issues), - str(by_severity.get("error", 0)), - str(by_severity.get("warning", 0)), - str(by_severity.get("note", 0)) - ) - - console.print(table) - - # Show actual findings if requested - if show_findings: - display_detailed_findings(findings, max_findings) - - console.print("\nšŸ’” Use filters to refine results: --workflow, --severity, --since") - console.print("šŸ’” Show findings content: --show-findings") - console.print("šŸ’” Export findings: --export json --output report.json") - console.print("šŸ’” View specific findings: [bold cyan]fuzzforge finding [/bold cyan]") - - except Exception as e: - console.print(f"āŒ Failed to get all findings: {e}", style="red") - raise typer.Exit(1) - - -def display_detailed_findings(findings: List[FindingRecord], max_findings: int): - """Display detailed findings content""" - console.print(f"\nšŸ“‹ [bold]Detailed Findings Content[/bold] 
(showing up to {max_findings} findings)\n") - - findings_count = 0 - - for finding_record in findings: - if findings_count >= max_findings: - remaining = sum(len(run.get("results", [])) - for f in findings[findings.index(finding_record):] - for run in f.sarif_data.get("runs", [])) - if remaining > 0: - console.print(f"\n... and {remaining} more findings (use --max-findings to show more)") - break - - # Get run info for this finding - sarif_data = finding_record.sarif_data - if not sarif_data or "runs" not in sarif_data: - continue - - for run in sarif_data["runs"]: - tool = run.get("tool", {}) - driver = tool.get("driver", {}) - tool_name = driver.get("name", "Unknown Tool") - - results = run.get("results", []) - if not results: - continue - - # Group results by severity - for result in results: - if findings_count >= max_findings: - break - - findings_count += 1 - - # Extract key information - rule_id = result.get("ruleId", "unknown") - level = result.get("level", "note").upper() - message_text = result.get("message", {}).get("text", "No description") - - # Get location information - locations = result.get("locations", []) - location_str = "Unknown location" - if locations: - physical = locations[0].get("physicalLocation", {}) - artifact = physical.get("artifactLocation", {}) - region = physical.get("region", {}) - - file_path = artifact.get("uri", "") - line_number = region.get("startLine", "") - - if file_path: - location_str = f"{file_path}" - if line_number: - location_str += f":{line_number}" - - # Get severity style - severity_style = { - "ERROR": "bold red", - "WARNING": "bold yellow", - "NOTE": "bold blue", - "INFO": "bold cyan" - }.get(level, "white") - - # Create finding panel - finding_content = f"""[bold]Rule:[/bold] {rule_id} -[bold]Location:[/bold] {location_str} -[bold]Tool:[/bold] {tool_name} -[bold]Run:[/bold] {finding_record.run_id[:12]}... 
- -[bold]Description:[/bold] -{message_text}""" - - # Add code context if available - region = locations[0].get("physicalLocation", {}).get("region", {}) if locations else {} - if region.get("snippet", {}).get("text"): - code_snippet = region["snippet"]["text"].strip() - finding_content += f"\n\n[bold]Code:[/bold]\n[dim]{code_snippet}[/dim]" - - console.print(Panel( - finding_content, - title=f"[{severity_style}]{level}[/{severity_style}] Finding #{findings_count}", - border_style=severity_style.split()[-1] if " " in severity_style else severity_style, - box=box.ROUNDED - )) - - console.print() # Add spacing between findings - - -def export_all_findings(findings: List[FindingRecord], format: str, output_path: str): - """Export all findings to specified format""" - output_file = Path(output_path) - - if format == "json": - # Combine all SARIF data - all_results = [] - for finding in findings: - if "runs" in finding.sarif_data: - for run in finding.sarif_data["runs"]: - for result in run.get("results", []): - result_entry = { - "run_id": finding.run_id, - "created_at": finding.created_at.isoformat(), - **result - } - all_results.append(result_entry) - - with open(output_file, 'w') as f: - json.dump({ - "total_findings": len(findings), - "export_date": datetime.now().isoformat(), - "results": all_results - }, f, indent=2) - - elif format == "csv": - # Export to CSV - with open(output_file, 'w', newline='') as f: - writer = csv.writer(f) - writer.writerow(["Run ID", "Date", "Severity", "Rule ID", "Message", "File", "Line"]) - - for finding in findings: - if "runs" in finding.sarif_data: - for run in finding.sarif_data["runs"]: - for result in run.get("results", []): - locations = result.get("locations", []) - location_info = locations[0] if locations else {} - physical = location_info.get("physicalLocation", {}) - artifact = physical.get("artifactLocation", {}) - region = physical.get("region", {}) - - writer.writerow([ - finding.run_id[:12], - 
finding.created_at.strftime("%Y-%m-%d %H:%M"), - result.get("level", "note"), - result.get("ruleId", ""), - result.get("message", {}).get("text", ""), - artifact.get("uri", ""), - region.get("startLine", "") - ]) - - elif format == "html": - # Generate HTML report - html_content = f""" - - - FuzzForge Security Findings Report - - - -

- <h1>FuzzForge Security Findings Report</h1>

-
-

- <p>Generated: {datetime.now().strftime("%Y-%m-%d %H:%M:%S")}</p>

-

- <p>Total Findings: {len(findings)}</p>

-
- - - - - - - - - """ - - for finding in findings: - if "runs" in finding.sarif_data: - for run in finding.sarif_data["runs"]: - for result in run.get("results", []): - level = result.get("level", "note") - locations = result.get("locations", []) - location_info = locations[0] if locations else {} - physical = location_info.get("physicalLocation", {}) - artifact = physical.get("artifactLocation", {}) - region = physical.get("region", {}) - - html_content += f""" - - - - - - - - """ - - html_content += """ -
- <tr><th>Run ID</th><th>Date</th><th>Severity</th><th>Rule</th><th>Message</th><th>Location</th></tr>
- <tr><td>{finding.run_id[:12]}</td><td>{finding.created_at.strftime("%Y-%m-%d %H:%M")}</td><td>{level.upper()}</td><td>{result.get("ruleId", "")}</td><td>{result.get("message", {}).get("text", "")}</td><td>{artifact.get("uri", "")}:{region.get("startLine", "")}</td></tr>
- -""" - - with open(output_file, 'w') as f: - f.write(html_content) - - -@app.callback(invoke_without_command=True) -def findings_callback(ctx: typer.Context): - """ - šŸ” View and export security findings - """ - # Check if a subcommand is being invoked - if ctx.invoked_subcommand is not None: - # Let the subcommand handle it - return - - # Default to history when no subcommand provided - findings_history(limit=20) \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/commands/ingest.py b/cli/src/fuzzforge_cli/commands/ingest.py deleted file mode 100644 index 20e657c..0000000 --- a/cli/src/fuzzforge_cli/commands/ingest.py +++ /dev/null @@ -1,251 +0,0 @@ -"""Cognee ingestion commands for FuzzForge CLI.""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
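Both `findings_callback` above and the ingest app below lean on the same Typer idiom: `invoke_without_command=True` plus a callback that inspects `ctx.invoked_subcommand`, so the bare command runs a default action while subcommands still dispatch normally. A minimal standalone sketch of that pattern (command names here are illustrative):

```python
import typer

app = typer.Typer()


@app.command()
def history(limit: int = 20):
    """A regular subcommand."""
    typer.echo(f"showing {limit} records")


@app.callback(invoke_without_command=True)
def main(ctx: typer.Context):
    """Runs on every invocation; acts as the default when no subcommand is given."""
    if ctx.invoked_subcommand is not None:
        return  # a subcommand like `history` will handle the invocation
    typer.echo("no subcommand: running default behavior")
```

Without `invoke_without_command=True`, Typer would print help and exit when the app is called with no arguments instead of reaching the callback body.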
- - -from __future__ import annotations - -import asyncio -import os -from pathlib import Path -from typing import List, Optional - -import typer -from rich.console import Console -from rich.prompt import Confirm - -from ..config import ProjectConfigManager -from ..ingest_utils import collect_ingest_files - -console = Console() -app = typer.Typer( - name="ingest", - help="Ingest files or directories into the Cognee knowledge graph for the current project", - invoke_without_command=True, -) - - -@app.callback() -def ingest_callback( - ctx: typer.Context, - path: Optional[Path] = typer.Argument( - None, - exists=True, - file_okay=True, - dir_okay=True, - readable=True, - resolve_path=True, - help="File or directory to ingest (defaults to current directory)", - ), - recursive: bool = typer.Option( - False, - "--recursive", - "-r", - help="Recursively ingest directories", - ), - file_types: Optional[List[str]] = typer.Option( - None, - "--file-types", - "-t", - help="File extensions to include (e.g. --file-types .py --file-types .js)", - ), - exclude: Optional[List[str]] = typer.Option( - None, - "--exclude", - "-e", - help="Glob patterns to exclude", - ), - dataset: Optional[str] = typer.Option( - None, - "--dataset", - "-d", - help="Dataset name to ingest into", - ), - force: bool = typer.Option( - False, - "--force", - "-f", - help="Force re-ingestion and skip confirmation", - ), -): - """Entry point for `fuzzforge ingest` when no subcommand is provided.""" - if ctx.invoked_subcommand: - return - - try: - config = ProjectConfigManager() - except FileNotFoundError as exc: - console.print(f"[red]Error:[/red] {exc}") - raise typer.Exit(1) from exc - - if not config.is_initialized(): - console.print("[red]Error: FuzzForge project not initialized. 
Run 'ff init' first.[/red]") - raise typer.Exit(1) - - config.setup_cognee_environment() - if os.getenv("FUZZFORGE_DEBUG", "0") == "1": - console.print( - "[dim]Cognee directories:\n" - f" DATA: {os.getenv('COGNEE_DATA_ROOT', 'unset')}\n" - f" SYSTEM: {os.getenv('COGNEE_SYSTEM_ROOT', 'unset')}\n" - f" USER: {os.getenv('COGNEE_USER_ID', 'unset')}\n", - ) - project_context = config.get_project_context() - - target_path = path or Path.cwd() - dataset_name = dataset or f"{project_context['project_name']}_codebase" - - try: - import cognee # noqa: F401 # Just to validate installation - except ImportError as exc: - console.print("[red]Cognee is not installed.[/red]") - console.print("Install with: pip install 'cognee[all]' litellm") - raise typer.Exit(1) from exc - - console.print(f"[bold]šŸ” Ingesting {target_path} into Cognee knowledge graph[/bold]") - console.print( - f"Project: [cyan]{project_context['project_name']}[/cyan] " - f"(ID: [dim]{project_context['project_id']}[/dim])" - ) - console.print(f"Dataset: [cyan]{dataset_name}[/cyan]") - console.print(f"Tenant: [dim]{project_context['tenant_id']}[/dim]") - - if not force: - confirm_message = f"Ingest {target_path} into knowledge graph for this project?" 
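The `--exclude` option above takes glob patterns, and the ingestion step later adds defaults such as `.fuzzforge/**` and `.git/**`. The real matching lives in `collect_ingest_files` (in `ingest_utils`), so this is only a stdlib approximation of how such patterns can be applied to project-relative paths:

```python
from fnmatch import fnmatch


def is_excluded(rel_path: str, patterns: list[str]) -> bool:
    """True if a POSIX-style relative path matches any exclude glob.

    fnmatch's `*` matches across `/`, so a pattern like ".git/**"
    excludes the entire .git subtree (though not ".git" itself).
    """
    return any(fnmatch(rel_path, pattern) for pattern in patterns)
```

Paths should be normalized with `Path.relative_to(root).as_posix()` before matching, so the same patterns behave identically on Windows and POSIX systems.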
- if not Confirm.ask(confirm_message, console=console): - console.print("[yellow]Ingestion cancelled[/yellow]") - raise typer.Exit(0) - - try: - asyncio.run( - _run_ingestion( - config=config, - path=target_path.resolve(), - recursive=recursive, - file_types=file_types, - exclude=exclude, - dataset=dataset_name, - force=force, - ) - ) - except KeyboardInterrupt: - console.print("\n[yellow]Ingestion cancelled by user[/yellow]") - raise typer.Exit(1) - except Exception as exc: # pragma: no cover - rich reporting - console.print(f"[red]Failed to ingest:[/red] {exc}") - raise typer.Exit(1) from exc - - -async def _run_ingestion( - *, - config: ProjectConfigManager, - path: Path, - recursive: bool, - file_types: Optional[List[str]], - exclude: Optional[List[str]], - dataset: str, - force: bool, -) -> None: - """Perform the actual ingestion work.""" - from fuzzforge_ai.cognee_service import CogneeService - - cognee_service = CogneeService(config) - await cognee_service.initialize() - - # Always skip internal bookkeeping directories - exclude_patterns = list(exclude or []) - default_excludes = { - ".fuzzforge/**", - ".git/**", - } - added_defaults = [] - for pattern in default_excludes: - if pattern not in exclude_patterns: - exclude_patterns.append(pattern) - added_defaults.append(pattern) - - if added_defaults and os.getenv("FUZZFORGE_DEBUG", "0") == "1": - console.print( - "[dim]Auto-excluding paths: {patterns}[/dim]".format( - patterns=", ".join(added_defaults) - ) - ) - - try: - files_to_ingest = collect_ingest_files(path, recursive, file_types, exclude_patterns) - except Exception as exc: - console.print(f"[red]Failed to collect files:[/red] {exc}") - return - - if not files_to_ingest: - console.print("[yellow]No files found to ingest[/yellow]") - return - - console.print(f"Found [green]{len(files_to_ingest)}[/green] files to ingest") - - if force: - console.print("Cleaning existing data for this project...") - try: - await cognee_service.clear_data(confirm=True) - 
except Exception as exc: - console.print(f"[yellow]Warning:[/yellow] Could not clean existing data: {exc}") - - console.print("Adding files to Cognee...") - valid_file_paths = [] - for file_path in files_to_ingest: - try: - with open(file_path, "r", encoding="utf-8") as fh: - fh.read(1) - valid_file_paths.append(file_path) - console.print(f" āœ“ {file_path}") - except (UnicodeDecodeError, PermissionError) as exc: - console.print(f"[yellow]Skipping {file_path}: {exc}[/yellow]") - - if not valid_file_paths: - console.print("[yellow]No readable files found to ingest[/yellow]") - return - - results = await cognee_service.ingest_files(valid_file_paths, dataset) - - console.print( - f"[green]āœ… Successfully ingested {results['success']} files into knowledge graph[/green]" - ) - if results["failed"]: - console.print( - f"[yellow]āš ļø Skipped {results['failed']} files due to errors[/yellow]" - ) - - try: - insights = await cognee_service.search_insights( - query=f"What insights can you provide about the {dataset} dataset?", - dataset=dataset, - ) - if insights: - console.print(f"\n[bold]šŸ“Š Generated {len(insights)} insights:[/bold]") - for index, insight in enumerate(insights[:3], 1): - console.print(f" {index}. {insight}") - if len(insights) > 3: - console.print(f" ... and {len(insights) - 3} more") - - chunks = await cognee_service.search_chunks( - query=f"functions classes methods in {dataset}", - dataset=dataset, - ) - if chunks: - console.print( - f"\n[bold]šŸ” Sample searchable content ({len(chunks)} chunks found):[/bold]" - ) - for index, chunk in enumerate(chunks[:2], 1): - preview = chunk[:100] + "..." if len(chunk) > 100 else chunk - console.print(f" {index}. 
{preview}") - except Exception: - # Best-effort stats — ignore failures here - pass diff --git a/cli/src/fuzzforge_cli/commands/init.py b/cli/src/fuzzforge_cli/commands/init.py deleted file mode 100644 index ceb3586..0000000 --- a/cli/src/fuzzforge_cli/commands/init.py +++ /dev/null @@ -1,303 +0,0 @@ -"""Project initialization commands.""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -from __future__ import annotations - -import os -from pathlib import Path -from textwrap import dedent -from typing import Optional - -import typer -from rich.console import Console -from rich.prompt import Confirm, Prompt - -from ..config import ensure_project_config -from ..database import ensure_project_db - -console = Console() -app = typer.Typer() - - -@app.command() -def project( - name: Optional[str] = typer.Option( - None, "--name", "-n", help="Project name (defaults to current directory name)" - ), - api_url: Optional[str] = typer.Option( - None, - "--api-url", - "-u", - help="FuzzForge API URL (defaults to http://localhost:8000)", - ), - force: bool = typer.Option( - False, - "--force", - "-f", - help="Force initialization even if project already exists", - ), -): - """ - šŸ“ Initialize a new FuzzForge project in the current directory. 
- - This creates a .fuzzforge directory with: - • SQLite database for storing runs, findings, and crashes - • Configuration file with project settings - • Default ignore patterns and preferences - """ - current_dir = Path.cwd() - fuzzforge_dir = current_dir / ".fuzzforge" - - # Check if project already exists - if fuzzforge_dir.exists() and not force: - if fuzzforge_dir.is_dir() and any(fuzzforge_dir.iterdir()): - console.print( - "āŒ FuzzForge project already exists in this directory", style="red" - ) - console.print("Use --force to reinitialize", style="dim") - raise typer.Exit(1) - - # Get project name - if not name: - name = Prompt.ask("Project name", default=current_dir.name, console=console) - - # Get API URL - if not api_url: - api_url = Prompt.ask( - "FuzzForge API URL", default="http://localhost:8000", console=console - ) - - # Confirm initialization - console.print(f"\nšŸ“ Initializing FuzzForge project: [bold cyan]{name}[/bold cyan]") - console.print(f"šŸ“ Location: [dim]{current_dir}[/dim]") - console.print(f"šŸ”— API URL: [dim]{api_url}[/dim]") - - if not Confirm.ask("\nProceed with initialization?", default=True, console=console): - console.print("āŒ Initialization cancelled", style="yellow") - raise typer.Exit(0) - - try: - # Create .fuzzforge directory - console.print("\nšŸ”Ø Creating project structure...") - fuzzforge_dir.mkdir(exist_ok=True) - - # Initialize configuration - console.print("āš™ļø Setting up configuration...") - ensure_project_config( - project_dir=current_dir, - project_name=name, - api_url=api_url, - ) - - # Initialize database - console.print("šŸ—„ļø Initializing database...") - ensure_project_db(current_dir) - - _ensure_env_file(fuzzforge_dir, force) - _ensure_agents_registry(fuzzforge_dir, force) - - # Create .gitignore if needed - gitignore_path = current_dir / ".gitignore" - gitignore_entries = [ - "# FuzzForge CLI", - ".fuzzforge/findings.db-*", # SQLite temp files - ".fuzzforge/cache/", - ".fuzzforge/temp/", - ] - - if 
gitignore_path.exists(): - with open(gitignore_path, "r") as f: - existing_content = f.read() - - if "# FuzzForge CLI" not in existing_content: - with open(gitignore_path, "a") as f: - f.write(f"\n{chr(10).join(gitignore_entries)}\n") - console.print("šŸ“ Updated .gitignore with FuzzForge entries") - else: - with open(gitignore_path, "w") as f: - f.write(f"{chr(10).join(gitignore_entries)}\n") - console.print("šŸ“ Created .gitignore") - - # Create README if it doesn't exist - readme_path = current_dir / "README.md" - if not readme_path.exists(): - readme_content = f"""# {name} - -FuzzForge security testing project. - -## Quick Start - -```bash -# List available workflows -fuzzforge workflows - -# Submit a workflow for analysis -fuzzforge workflow run /path/to/target - -# View findings -fuzzforge finding -``` - -## Project Structure - -- `.fuzzforge/` - Project data and configuration -- `.fuzzforge/config.yaml` - Project configuration -- `.fuzzforge/findings.db` - Local database for runs and findings -""" - - with open(readme_path, "w") as f: - f.write(readme_content) - console.print("šŸ“š Created README.md") - - console.print("\nāœ… FuzzForge project initialized successfully!", style="green") - console.print("\nšŸŽÆ Next steps:") - console.print(" • ff workflows - See available workflows") - console.print(" • ff status - Check API connectivity") - console.print(" • ff workflow - Start your first analysis") - console.print(" • edit .fuzzforge/.env with API keys & provider settings") - - except Exception as e: - console.print(f"\nāŒ Initialization failed: {e}", style="red") - raise typer.Exit(1) - - -@app.callback() -def init_callback(): - """ - šŸ“ Initialize FuzzForge projects and components - """ - - -def _ensure_env_file(fuzzforge_dir: Path, force: bool) -> None: - """Create or update the .fuzzforge/.env file with AI defaults.""" - - env_path = fuzzforge_dir / ".env" - if env_path.exists() and not force: - console.print("🧪 Using existing .fuzzforge/.env (use 
--force to regenerate)") - return - - console.print("🧠 Configuring AI environment...") - console.print(" • Default LLM provider: openai") - console.print(" • Default LLM model: litellm_proxy/gpt-5-mini") - console.print(" • To customise provider/model later, edit .fuzzforge/.env") - - llm_provider = "openai" - llm_model = "litellm_proxy/gpt-5-mini" - - # Check for global virtual keys from volumes/env/.env - global_env_key = None - for parent in fuzzforge_dir.parents: - global_env = parent / "volumes" / "env" / ".env" - if global_env.exists(): - try: - for line in global_env.read_text(encoding="utf-8").splitlines(): - if line.strip().startswith("OPENAI_API_KEY=") and "=" in line: - key_value = line.split("=", 1)[1].strip() - if key_value and not key_value.startswith("your-") and key_value.startswith("sk-"): - global_env_key = key_value - console.print(f" • Found virtual key in {global_env.relative_to(parent)}") - break - except Exception: - pass - break - - api_key = Prompt.ask( - "OpenAI API key (leave blank to use global virtual key)" if global_env_key else "OpenAI API key (leave blank to fill manually)", - default="", - show_default=False, - console=console, - ) - - # Use global key if user didn't provide one - if not api_key and global_env_key: - api_key = global_env_key - - session_db_path = fuzzforge_dir / "fuzzforge_sessions.db" - session_db_rel = session_db_path.relative_to(fuzzforge_dir.parent) - - env_lines = [ - "# FuzzForge AI configuration", - "# Populate the API key(s) that match your LLM provider", - "", - f"LLM_PROVIDER={llm_provider}", - f"LLM_MODEL={llm_model}", - f"LITELLM_MODEL={llm_model}", - "LLM_ENDPOINT=http://localhost:10999", - "LLM_API_KEY=", - "LLM_EMBEDDING_MODEL=litellm_proxy/text-embedding-3-large", - "LLM_EMBEDDING_ENDPOINT=http://localhost:10999", - f"OPENAI_API_KEY={api_key}", - "FUZZFORGE_MCP_URL=http://localhost:8010/mcp", - "", - "# Cognee configuration mirrors the primary LLM by default", - f"LLM_COGNEE_PROVIDER={llm_provider}", 
- f"LLM_COGNEE_MODEL={llm_model}", - "LLM_COGNEE_ENDPOINT=http://localhost:10999", - "LLM_COGNEE_API_KEY=", - "LLM_COGNEE_EMBEDDING_MODEL=litellm_proxy/text-embedding-3-large", - "LLM_COGNEE_EMBEDDING_ENDPOINT=http://localhost:10999", - "COGNEE_MCP_URL=", - "", - "# Session persistence options: inmemory | sqlite", - "SESSION_PERSISTENCE=sqlite", - f"SESSION_DB_PATH={session_db_rel}", - "", - "# Optional integrations", - "AGENTOPS_API_KEY=", - "FUZZFORGE_DEBUG=0", - "", - ] - - env_path.write_text("\n".join(env_lines), encoding="utf-8") - console.print(f"šŸ“ Created {env_path.relative_to(fuzzforge_dir.parent)}") - - template_path = fuzzforge_dir / ".env.template" - if not template_path.exists() or force: - template_lines = [] - for line in env_lines: - if line.startswith("OPENAI_API_KEY="): - template_lines.append("OPENAI_API_KEY=") - elif line.startswith("LLM_API_KEY="): - template_lines.append("LLM_API_KEY=") - elif line.startswith("LLM_COGNEE_API_KEY="): - template_lines.append("LLM_COGNEE_API_KEY=") - else: - template_lines.append(line) - template_path.write_text("\n".join(template_lines), encoding="utf-8") - console.print(f"šŸ“ Created {template_path.relative_to(fuzzforge_dir.parent)}") - - # SQLite session DB will be created automatically when first used by the AI agent - - -def _ensure_agents_registry(fuzzforge_dir: Path, force: bool) -> None: - """Create a starter agents.yaml registry if needed.""" - - agents_path = fuzzforge_dir / "agents.yaml" - if agents_path.exists() and not force: - return - - template = dedent( - """\ - # FuzzForge Registered Agents - # Populate this list to auto-register remote agents when the AI CLI starts - registered_agents: [] - - # Example: - # registered_agents: - # - name: Calculator - # url: http://localhost:10201 - # description: Sample math agent - """.strip() - ) - - agents_path.write_text(template + "\n", encoding="utf-8") - console.print(f"šŸ“ Created {agents_path.relative_to(fuzzforge_dir.parent)}") diff --git 
a/cli/src/fuzzforge_cli/commands/monitor.py b/cli/src/fuzzforge_cli/commands/monitor.py deleted file mode 100644 index eb6d3ba..0000000 --- a/cli/src/fuzzforge_cli/commands/monitor.py +++ /dev/null @@ -1,402 +0,0 @@ -""" -Real-time monitoring and statistics commands. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import time -from datetime import datetime - -import typer -from rich.console import Console -from rich.table import Table -from rich.panel import Panel -from rich.live import Live -from rich import box - -from ..config import get_project_config, FuzzForgeConfig -from ..database import ensure_project_db, CrashRecord -from fuzzforge_sdk import FuzzForgeClient - -console = Console() -app = typer.Typer() - - -def get_client() -> FuzzForgeClient: - """Get configured FuzzForge client""" - config = get_project_config() or FuzzForgeConfig() - return FuzzForgeClient(base_url=config.get_api_url(), timeout=config.get_timeout()) - - -def format_duration(seconds: int) -> str: - """Format duration in human readable format""" - if seconds < 60: - return f"{seconds}s" - elif seconds < 3600: - return f"{seconds // 60}m {seconds % 60}s" - else: - hours = seconds // 3600 - minutes = (seconds % 3600) // 60 - return f"{hours}h {minutes}m" - - -def format_number(num: int) -> str: - """Format large numbers with K, M suffixes""" - if num >= 1000000: - return f"{num / 1000000:.1f}M" - elif num >= 1000: - return f"{num / 1000:.1f}K" - else: - return str(num) - - -def create_stats_table(stats) -> Panel: - """Create a rich table for fuzzing 
statistics""" - # Create main stats table - stats_table = Table(show_header=False, box=box.SIMPLE) - stats_table.add_column("Metric", style="bold cyan") - stats_table.add_column("Value", justify="right", style="bold white") - - stats_table.add_row("Total Executions", format_number(stats.executions)) - stats_table.add_row("Executions/sec", f"{stats.executions_per_sec:.1f}") - stats_table.add_row("Total Crashes", format_number(stats.crashes)) - stats_table.add_row("Unique Crashes", format_number(stats.unique_crashes)) - - if stats.coverage is not None and stats.coverage > 0: - stats_table.add_row("Code Coverage", f"{stats.coverage} edges") - - stats_table.add_row("Corpus Size", format_number(stats.corpus_size)) - stats_table.add_row("Elapsed Time", format_duration(stats.elapsed_time)) - - if stats.last_crash_time: - time_since_crash = datetime.now() - stats.last_crash_time - stats_table.add_row("Last Crash", f"{format_duration(int(time_since_crash.total_seconds()))} ago") - - return Panel.fit( - stats_table, - title=f"šŸ“Š Fuzzing Statistics - {stats.workflow}", - subtitle=f"Run: {stats.run_id[:12]}...", - box=box.ROUNDED - ) - - -@app.command("crashes") -def crash_reports( - run_id: str = typer.Argument(..., help="Run ID to get crash reports for"), - save: bool = typer.Option( - True, "--save/--no-save", - help="Save crashes to local database" - ), - limit: int = typer.Option( - 50, "--limit", "-l", - help="Maximum number of crashes to show" - ) -): - """ - šŸ› Display crash reports for a fuzzing run - """ - try: - with get_client() as client: - console.print(f"šŸ› Fetching crash reports for run: {run_id}") - crashes = client.get_crash_reports(run_id) - - if not crashes: - console.print("āœ… No crashes found!", style="green") - return - - # Save to database if requested - if save: - db = ensure_project_db() - for crash in crashes: - crash_record = CrashRecord( - run_id=run_id, - crash_id=crash.crash_id, - signal=crash.signal, - stack_trace=crash.stack_trace, - 
input_file=crash.input_file, - severity=crash.severity, - timestamp=crash.timestamp - ) - db.save_crash(crash_record) - console.print("āœ… Crashes saved to local database") - - # Display crashes - crashes_to_show = crashes[:limit] - - # Summary - severity_counts = {} - signal_counts = {} - for crash in crashes: - severity_counts[crash.severity] = severity_counts.get(crash.severity, 0) + 1 - if crash.signal: - signal_counts[crash.signal] = signal_counts.get(crash.signal, 0) + 1 - - summary_table = Table(show_header=False, box=box.SIMPLE) - summary_table.add_column("Metric", style="bold cyan") - summary_table.add_column("Value", justify="right") - - summary_table.add_row("Total Crashes", str(len(crashes))) - summary_table.add_row("Unique Signals", str(len(signal_counts))) - - for severity, count in sorted(severity_counts.items()): - summary_table.add_row(f"{severity.title()} Severity", str(count)) - - console.print( - Panel.fit( - summary_table, - title="šŸ› Crash Summary", - box=box.ROUNDED - ) - ) - - # Detailed crash table - if crashes_to_show: - crashes_table = Table(box=box.ROUNDED) - crashes_table.add_column("Crash ID", style="bold cyan") - crashes_table.add_column("Signal", justify="center") - crashes_table.add_column("Severity", justify="center") - crashes_table.add_column("Timestamp", justify="center") - crashes_table.add_column("Input File", style="dim") - - for crash in crashes_to_show: - signal_emoji = { - "SIGSEGV": "šŸ’„", - "SIGABRT": "šŸ›‘", - "SIGFPE": "🧮", - "SIGILL": "āš ļø" - }.get(crash.signal or "", "šŸ›") - - severity_style = { - "high": "red", - "medium": "yellow", - "low": "green" - }.get(crash.severity.lower(), "white") - - input_display = "" - if crash.input_file: - input_display = crash.input_file.split("/")[-1] # Show just filename - - crashes_table.add_row( - crash.crash_id[:12] + "..." 
if len(crash.crash_id) > 15 else crash.crash_id, - f"{signal_emoji} {crash.signal or 'Unknown'}", - f"[{severity_style}]{crash.severity}[/{severity_style}]", - crash.timestamp.strftime("%H:%M:%S"), - input_display - ) - - console.print("\nšŸ› [bold]Crash Details[/bold]") - if len(crashes) > limit: - console.print(f"Showing first {limit} of {len(crashes)} crashes") - console.print() - console.print(crashes_table) - - console.print(f"\nšŸ’” Use [bold cyan]fuzzforge finding {run_id}[/bold cyan] for detailed analysis") - - except Exception as e: - console.print(f"āŒ Failed to get crash reports: {e}", style="red") - raise typer.Exit(1) - - -def _live_monitor(run_id: str, refresh: int, once: bool = False, style: str = "inline"): - """Helper for live monitoring with inline real-time display or table display""" - with get_client() as client: - start_time = time.time() - - def render_inline_stats(run_status, stats): - """Render inline stats display (non-dashboard)""" - lines = [] - - # Header line - workflow_name = getattr(stats, 'workflow', 'unknown') - status_emoji = "šŸ”„" if not getattr(run_status, 'is_completed', False) else "āœ…" - status_color = "yellow" if not getattr(run_status, 'is_completed', False) else "green" - - lines.append(f"\n[bold cyan]šŸ“Š Live Fuzzing Monitor[/bold cyan] - {workflow_name} (Run: {run_id[:12]}...)\n") - - # Stats lines with emojis - lines.append(f" [bold]⚔ Executions[/bold] {format_number(stats.executions):>8} [dim]({stats.executions_per_sec:,.1f}/sec)[/dim]") - lines.append(f" [bold]šŸ’„ Crashes[/bold] {stats.crashes:>8} [dim](unique: {stats.unique_crashes})[/dim]") - lines.append(f" [bold]šŸ“¦ Corpus[/bold] {stats.corpus_size:>8} inputs") - - if stats.coverage is not None and stats.coverage > 0: - lines.append(f" [bold]šŸ“ˆ Coverage[/bold] {stats.coverage:>8} edges") - - lines.append(f" [bold]ā±ļø Elapsed[/bold] {format_duration(stats.elapsed_time):>8}") - - # Last crash info - if stats.last_crash_time: - time_since = datetime.now() 
- stats.last_crash_time - crash_ago = format_duration(int(time_since.total_seconds())) - lines.append(f" [bold red]šŸ› Last Crash[/bold red] {crash_ago:>8} ago") - - # Status line - status_text = getattr(run_status, 'status', 'Unknown') - current_time = datetime.now().strftime('%H:%M:%S') - lines.append(f"\n[{status_color}]{status_emoji} Status: {status_text}[/{status_color}] | Last update: [dim]{current_time}[/dim] | Refresh: {refresh}s | [dim]Press Ctrl+C to stop[/dim]") - - return "\n".join(lines) - - # Fallback stats class - class FallbackStats: - def __init__(self, run_id): - self.run_id = run_id - self.workflow = "unknown" - self.executions = 0 - self.executions_per_sec = 0.0 - self.crashes = 0 - self.unique_crashes = 0 - self.coverage = None - self.corpus_size = 0 - self.elapsed_time = 0 - self.last_crash_time = None - - # Initial fetch - try: - run_status = client.get_run_status(run_id) - stats = client.get_fuzzing_stats(run_id) - except Exception: - stats = FallbackStats(run_id) - run_status = type("RS", (), {"status":"Unknown","is_completed":False,"is_failed":False})() - - # Handle --once mode: show stats once and exit - if once: - if style == "table": - console.print(create_stats_table(stats)) - else: - console.print(render_inline_stats(run_status, stats)) - return - - # Live monitoring mode - with Live(auto_refresh=False, console=console) as live: - # Render based on style - if style == "table": - live.update(create_stats_table(stats), refresh=True) - else: - live.update(render_inline_stats(run_status, stats), refresh=True) - - # Polling loop - consecutive_errors = 0 - max_errors = 5 - - while True: - try: - # Poll for updates - try: - run_status = client.get_run_status(run_id) - consecutive_errors = 0 - except Exception as e: - consecutive_errors += 1 - if consecutive_errors >= max_errors: - console.print(f"\nāŒ Too many errors getting run status: {e}", style="red") - break - time.sleep(refresh) - continue - - # Try to get fuzzing stats - try: - 
stats = client.get_fuzzing_stats(run_id) - except Exception: - stats = FallbackStats(run_id) - - # Update display based on style - if style == "table": - live.update(create_stats_table(stats), refresh=True) - else: - live.update(render_inline_stats(run_status, stats), refresh=True) - - # Check if completed - if getattr(run_status, 'is_completed', False) or getattr(run_status, 'is_failed', False): - break - - # Wait before next poll - time.sleep(refresh) - - except KeyboardInterrupt: - raise - except Exception as e: - console.print(f"\nāš ļø Monitoring error: {e}", style="yellow") - time.sleep(refresh) - - # Final status - final_status = getattr(run_status, 'status', 'Unknown') - total_time = format_duration(int(time.time() - start_time)) - - if getattr(run_status, 'is_completed', False): - console.print(f"\nāœ… [bold green]Run completed successfully[/bold green] | Total runtime: {total_time}") - else: - console.print(f"\nāš ļø [bold yellow]Run ended[/bold yellow] | Status: {final_status} | Total runtime: {total_time}") - - -@app.command("live") -def live_monitor( - run_id: str = typer.Argument(..., help="Run ID to monitor live"), - refresh: int = typer.Option( - 2, "--refresh", "-r", - help="Refresh interval in seconds" - ), - once: bool = typer.Option( - False, "--once", - help="Show stats once and exit" - ), - style: str = typer.Option( - "inline", "--style", - help="Display style: 'inline' (default) or 'table'" - ) -): - """ - šŸ“ŗ Real-time monitoring with live statistics updates - - Display styles: - - inline: Visual inline display with emojis (default) - - table: Clean table-based display - - Use --once to show stats once without continuous monitoring (useful for scripts) - """ - try: - # Validate style - if style not in ["inline", "table"]: - console.print(f"āŒ Invalid style: {style}. 
Must be 'inline' or 'table'", style="red") - raise typer.Exit(1) - - _live_monitor(run_id, refresh, once, style) - except KeyboardInterrupt: - console.print("\n\nšŸ“Š Monitoring stopped by user.", style="yellow") - except Exception as e: - console.print(f"\nāŒ Failed to start monitoring: {e}", style="red") - raise typer.Exit(1) - - -def create_progress_bar(percentage: float, color: str = "green") -> str: - """Create a simple text progress bar""" - width = 20 - filled = int((percentage / 100) * width) - bar = "ā–ˆ" * filled + "ā–‘" * (width - filled) - return f"[{color}]{bar}[/{color}] {percentage:.1f}%" - - -@app.callback(invoke_without_command=True) -def monitor_callback(ctx: typer.Context): - """ - šŸ“Š Real-time monitoring and statistics - """ - # Check if a subcommand is being invoked - if ctx.invoked_subcommand is not None: - # Let the subcommand handle it - return - - # Show help message for default command - from rich.console import Console - console = Console() - console.print("šŸ“Š [bold cyan]Monitor Command[/bold cyan]") - console.print("\nAvailable subcommands:") - console.print(" • [cyan]ff monitor live <run_id>[/cyan] - Monitor with live updates (supports --once, --style)") - console.print(" • [cyan]ff monitor crashes <run_id>[/cyan] - Show crash reports") diff --git a/cli/src/fuzzforge_cli/commands/status.py b/cli/src/fuzzforge_cli/commands/status.py deleted file mode 100644 index 5d78042..0000000 --- a/cli/src/fuzzforge_cli/commands/status.py +++ /dev/null @@ -1,165 +0,0 @@ -""" -Status command for showing project and API information. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. 
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -from pathlib import Path -from rich.console import Console -from rich.table import Table -from rich.panel import Panel -from rich import box - -from ..config import get_project_config, FuzzForgeConfig -from ..database import get_project_db -from fuzzforge_sdk import FuzzForgeClient - -console = Console() - - -def show_status(): - """Show comprehensive project and API status""" - current_dir = Path.cwd() - fuzzforge_dir = current_dir / ".fuzzforge" - - # Project status - console.print("\nšŸ“Š [bold]FuzzForge Project Status[/bold]\n") - - if not fuzzforge_dir.exists(): - console.print( - Panel.fit( - "āŒ No FuzzForge project found in current directory\n\n" - "Run [bold cyan]ff init[/bold cyan] to initialize a project", - title="Project Status", - box=box.ROUNDED - ) - ) - return - - # Load project configuration - config = get_project_config() - if not config: - config = FuzzForgeConfig() - - # Project info table - project_table = Table(show_header=False, box=box.SIMPLE) - project_table.add_column("Property", style="bold cyan") - project_table.add_column("Value") - - project_table.add_row("Project Name", config.project.name) - project_table.add_row("Location", str(current_dir)) - project_table.add_row("API URL", config.project.api_url) - project_table.add_row("Default Timeout", f"{config.project.default_timeout}s") - - console.print( - Panel.fit( - project_table, - title="āœ… Project Information", - box=box.ROUNDED - ) - ) - - # Database status - db = get_project_db() - if db: - try: - stats = db.get_stats() - db_table = Table(show_header=False, box=box.SIMPLE) - db_table.add_column("Metric", style="bold cyan") - db_table.add_column("Count", justify="right") - - db_table.add_row("Total Runs", str(stats["total_runs"])) - db_table.add_row("Total Findings", str(stats["total_findings"])) - db_table.add_row("Total 
Crashes", str(stats["total_crashes"])) - db_table.add_row("Runs (Last 7 days)", str(stats["runs_last_7_days"])) - - if stats["runs_by_status"]: - db_table.add_row("", "") # Spacer - for status, count in stats["runs_by_status"].items(): - status_emoji = { - "completed": "āœ…", - "running": "šŸ”„", - "failed": "āŒ", - "queued": "ā³", - "cancelled": "ā¹ļø" - }.get(status, "šŸ“‹") - db_table.add_row(f"{status_emoji} {status.title()}", str(count)) - - console.print( - Panel.fit( - db_table, - title="šŸ—„ļø Database Statistics", - box=box.ROUNDED - ) - ) - except Exception as e: - console.print(f"āš ļø Database error: {e}", style="yellow") - - # API status - console.print("\nšŸ”— [bold]API Connectivity[/bold]") - try: - with FuzzForgeClient(base_url=config.get_api_url(), timeout=10.0) as client: - api_status = client.get_api_status() - workflows = client.list_workflows() - - api_table = Table(show_header=False, box=box.SIMPLE) - api_table.add_column("Property", style="bold cyan") - api_table.add_column("Value") - - api_table.add_row("Status", "āœ… Connected") - api_table.add_row("Service", f"{api_status.name} v{api_status.version}") - api_table.add_row("Workflows", str(len(workflows))) - - console.print( - Panel.fit( - api_table, - title="āœ… API Status", - box=box.ROUNDED - ) - ) - - # Show available workflows - if workflows: - workflow_table = Table(box=box.SIMPLE_HEAD) - workflow_table.add_column("Name", style="bold") - workflow_table.add_column("Version", justify="center") - workflow_table.add_column("Description") - - for workflow in workflows[:10]: # Limit to first 10 - workflow_table.add_row( - workflow.name, - workflow.version, - workflow.description[:60] + "..." 
if len(workflow.description) > 60 else workflow.description - ) - - if len(workflows) > 10: - workflow_table.add_row("...", "...", f"and {len(workflows) - 10} more workflows") - - console.print( - Panel.fit( - workflow_table, - title=f"šŸ”§ Available Workflows ({len(workflows)})", - box=box.ROUNDED - ) - ) - - except Exception as e: - console.print( - Panel.fit( - f"āŒ Failed to connect to API\n\n" - f"Error: {str(e)}\n\n" - f"API URL: {config.get_api_url()}\n\n" - "Check that the FuzzForge API is running and accessible.", - title="āŒ API Connection Failed", - box=box.ROUNDED - ) - ) \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/commands/worker.py b/cli/src/fuzzforge_cli/commands/worker.py deleted file mode 100644 index 06b8b03..0000000 --- a/cli/src/fuzzforge_cli/commands/worker.py +++ /dev/null @@ -1,225 +0,0 @@ -""" -Worker management commands for FuzzForge CLI. - -Provides commands to start, stop, and list Temporal workers. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import subprocess -import sys -import typer -from pathlib import Path -from rich.console import Console -from rich.table import Table -from typing import Optional - -from ..worker_manager import WorkerManager - -console = Console() -app = typer.Typer( - name="worker", - help="šŸ”§ Manage Temporal workers", - no_args_is_help=True, -) - - -@app.command("stop") -def stop_workers( - all: bool = typer.Option( - False, "--all", - help="Stop all workers (default behavior, flag for clarity)" - ) -): - """ - šŸ›‘ Stop all running FuzzForge workers. 
- - This command stops all worker containers using the proper Docker Compose - profile flag to ensure workers are actually stopped (since they're in profiles). - - Examples: - $ ff worker stop - $ ff worker stop --all - """ - try: - worker_mgr = WorkerManager() - success = worker_mgr.stop_all_workers() - - if success: - sys.exit(0) - else: - console.print("āš ļø Some workers may not have stopped properly", style="yellow") - sys.exit(1) - - except Exception as e: - console.print(f"āŒ Error: {e}", style="red") - sys.exit(1) - - -@app.command("list") -def list_workers( - all: bool = typer.Option( - False, "--all", "-a", - help="Show all workers (including stopped)" - ) -): - """ - šŸ“‹ List FuzzForge workers and their status. - - By default, shows only running workers. Use --all to see all workers. - - Examples: - $ ff worker list - $ ff worker list --all - """ - try: - # Get list of running workers - result = subprocess.run( - ["docker", "ps", "--filter", "name=fuzzforge-worker-", - "--format", "{{.Names}}\t{{.Status}}\t{{.RunningFor}}"], - capture_output=True, - text=True, - check=False - ) - - running_workers = [] - if result.stdout.strip(): - for line in result.stdout.strip().splitlines(): - parts = line.split('\t') - if len(parts) >= 3: - running_workers.append({ - "name": parts[0].replace("fuzzforge-worker-", ""), - "status": "Running", - "uptime": parts[2] - }) - - # If --all, also get stopped workers - stopped_workers = [] - if all: - result_all = subprocess.run( - ["docker", "ps", "-a", "--filter", "name=fuzzforge-worker-", - "--format", "{{.Names}}\t{{.Status}}"], - capture_output=True, - text=True, - check=False - ) - - all_worker_names = set() - for line in result_all.stdout.strip().splitlines(): - parts = line.split('\t') - if len(parts) >= 2: - worker_name = parts[0].replace("fuzzforge-worker-", "") - all_worker_names.add(worker_name) - # If not running, it's stopped - if not any(w["name"] == worker_name for w in running_workers): - 
stopped_workers.append({ - "name": worker_name, - "status": "Stopped", - "uptime": "-" - }) - - # Display results - if not running_workers and not stopped_workers: - console.print("ā„¹ļø No workers found", style="cyan") - console.print("\nšŸ’” Start a worker with: [cyan]docker compose up -d worker-<name>[/cyan]") - console.print(" Or run a workflow, which auto-starts workers: [cyan]ff workflow run <workflow>[/cyan]") - return - - # Create table - table = Table(title="FuzzForge Workers", show_header=True, header_style="bold cyan") - table.add_column("Worker", style="cyan", no_wrap=True) - table.add_column("Status", style="green") - table.add_column("Uptime", style="dim") - - # Add running workers - for worker in running_workers: - table.add_row( - worker["name"], - f"[green]ā—[/green] {worker['status']}", - worker["uptime"] - ) - - # Add stopped workers if --all - for worker in stopped_workers: - table.add_row( - worker["name"], - f"[red]ā—[/red] {worker['status']}", - worker["uptime"] - ) - - console.print(table) - - # Summary - if running_workers: - console.print(f"\nāœ… {len(running_workers)} worker(s) running") - if stopped_workers: - console.print(f"ā¹ļø {len(stopped_workers)} worker(s) stopped") - - except Exception as e: - console.print(f"āŒ Error listing workers: {e}", style="red") - sys.exit(1) - - -@app.command("start") -def start_worker( - name: str = typer.Argument( - ..., - help="Worker name (e.g., 'python', 'android', 'secrets')" - ), - build: bool = typer.Option( - False, "--build", - help="Rebuild worker image before starting" - ) -): - """ - šŸš€ Start a specific worker. - - The worker name should be the vertical name (e.g., 'python', 'android', 'rust'). 
- - Examples: - $ ff worker start python - $ ff worker start android --build - """ - try: - service_name = f"worker-{name}" - - console.print(f"šŸš€ Starting worker: [cyan]{service_name}[/cyan]") - - # Build docker compose command - cmd = ["docker", "compose", "up", "-d"] - if build: - cmd.append("--build") - cmd.append(service_name) - - result = subprocess.run( - cmd, - capture_output=True, - text=True, - check=False - ) - - if result.returncode == 0: - console.print(f"āœ… Worker [cyan]{service_name}[/cyan] started successfully") - else: - console.print(f"āŒ Failed to start worker: {result.stderr}", style="red") - console.print( - f"\nšŸ’” Try manually: [yellow]docker compose up -d {service_name}[/yellow]", - style="dim" - ) - sys.exit(1) - - except Exception as e: - console.print(f"āŒ Error: {e}", style="red") - sys.exit(1) - - -if __name__ == "__main__": - app() diff --git a/cli/src/fuzzforge_cli/commands/workflow_exec.py b/cli/src/fuzzforge_cli/commands/workflow_exec.py deleted file mode 100644 index bfa7f12..0000000 --- a/cli/src/fuzzforge_cli/commands/workflow_exec.py +++ /dev/null @@ -1,772 +0,0 @@ -""" -Workflow execution and management commands. -Replaces the old 'runs' terminology with cleaner workflow-centric commands. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- - -import json -import time -from datetime import datetime -from pathlib import Path -from typing import Optional, Dict, Any, List - -import typer -from rich.console import Console -from rich.table import Table -from rich.panel import Panel -from rich.prompt import Prompt, Confirm -from rich import box - -from ..config import get_project_config, FuzzForgeConfig -from ..database import get_project_db, ensure_project_db, RunRecord -from ..exceptions import ( - handle_error, retry_on_network_error, safe_json_load, require_project, - ValidationError, DatabaseError -) -from ..validation import ( - validate_run_id, validate_workflow_name, validate_target_path, - validate_parameters, validate_timeout -) -from ..progress import step_progress -from ..constants import ( - STATUS_EMOJIS, MAX_RUN_ID_DISPLAY_LENGTH, - PROGRESS_STEP_DELAYS, MAX_RETRIES, RETRY_DELAY, POLL_INTERVAL -) -from ..worker_manager import WorkerManager -from fuzzforge_sdk import FuzzForgeClient, WorkflowSubmission - -console = Console() -app = typer.Typer() - - -@retry_on_network_error(max_retries=MAX_RETRIES, delay=RETRY_DELAY) -def get_client() -> FuzzForgeClient: - """Get configured FuzzForge client with retry on network errors""" - config = get_project_config() or FuzzForgeConfig() - return FuzzForgeClient(base_url=config.get_api_url(), timeout=config.get_timeout()) - - -def status_emoji(status: str) -> str: - """Get emoji for execution status""" - return STATUS_EMOJIS.get(status.lower(), STATUS_EMOJIS["unknown"]) - - -def should_fail_build(sarif_data: Dict[str, Any], fail_on: str) -> bool: - """ - Check if findings warrant build failure based on SARIF severity levels. 
- - Args: - sarif_data: SARIF format findings data - fail_on: Comma-separated SARIF levels (error,warning,note,info,all,none) - - Returns: - True if build should fail, False otherwise - """ - if fail_on == "none": - return False - - # Parse fail_on parameter - accept SARIF levels - if fail_on == "all": - check_levels = {"error", "warning", "note", "info"} - else: - check_levels = {s.strip().lower() for s in fail_on.split(",")} - - # Validate levels - valid_levels = {"error", "warning", "note", "info", "none"} - invalid = check_levels - valid_levels - if invalid: - console.print(f"āš ļø Invalid SARIF levels: {', '.join(invalid)}", style="yellow") - console.print("Valid levels: error, warning, note, info, all, none") - - # Check SARIF results - runs = sarif_data.get("runs", []) - if not runs: - return False - - results = runs[0].get("results", []) - for result in results: - level = result.get("level", "note") # SARIF default is "note" - if level in check_levels: - return True - - return False - - -def parse_inline_parameters(params: List[str]) -> Dict[str, Any]: - """Parse inline key=value parameters using improved validation""" - return validate_parameters(params) - - -def execute_workflow_submission( - client: FuzzForgeClient, - workflow: str, - target_path: str, - parameters: Dict[str, Any], - timeout: Optional[int], - interactive: bool -) -> Any: - """Handle the workflow submission process with file upload""" - # Get workflow metadata for parameter validation - console.print(f"šŸ”§ Getting workflow information for: {workflow}") - workflow_meta = client.get_workflow_metadata(workflow) - - # Interactive parameter input - if interactive and workflow_meta.parameters.get("properties"): - properties = workflow_meta.parameters.get("properties", {}) - required_params = set(workflow_meta.parameters.get("required", [])) - - missing_required = required_params - set(parameters.keys()) - - if missing_required: - console.print(f"\nšŸ“ [bold]Missing required 
parameters:[/bold] {', '.join(missing_required)}") - console.print("Please provide values:\n") - - for param_name in missing_required: - param_schema = properties.get(param_name, {}) - description = param_schema.get("description", "") - param_type = param_schema.get("type", "string") - - prompt_text = f"{param_name}" - if description: - prompt_text += f" ({description})" - prompt_text += f" [{param_type}]" - - while True: - user_input = Prompt.ask(prompt_text, console=console) - - try: - if param_type == "integer": - parameters[param_name] = int(user_input) - elif param_type == "number": - parameters[param_name] = float(user_input) - elif param_type == "boolean": - parameters[param_name] = user_input.lower() in ("true", "yes", "1", "on") - elif param_type == "array": - parameters[param_name] = [item.strip() for item in user_input.split(",") if item.strip()] - else: - parameters[param_name] = user_input - break - except ValueError as e: - console.print(f"āŒ Invalid {param_type}: {e}", style="red") - - # Show submission summary - console.print("\nšŸŽÆ [bold]Executing workflow:[/bold]") - console.print(f" Workflow: {workflow}") - console.print(f" Target: {target_path}") - if parameters: - console.print(f" Parameters: {len(parameters)} provided") - if timeout: - console.print(f" Timeout: {timeout}s") - - # Check if target path exists locally - target_path_obj = Path(target_path) - use_upload = target_path_obj.exists() - - if use_upload: - # Show file/directory info - if target_path_obj.is_dir(): - num_files = sum(1 for _ in target_path_obj.rglob("*") if _.is_file()) - console.print(f" Upload: Directory with {num_files} files") - else: - size_mb = target_path_obj.stat().st_size / (1024 * 1024) - console.print(f" Upload: File ({size_mb:.2f} MB)") - else: - console.print(" [yellow]āš ļø Warning: Target path does not exist locally[/yellow]") - console.print(" [yellow] Attempting to use path-based submission (backend must have access)[/yellow]") - - # Only ask for 
confirmation in interactive mode - if interactive: - if not Confirm.ask("\nExecute workflow?", default=True, console=console): - console.print("āŒ Execution cancelled", style="yellow") - raise typer.Exit(0) - else: - console.print("\nšŸš€ Executing workflow...") - - # Submit the workflow with enhanced progress - console.print(f"\nšŸš€ Executing workflow: [bold yellow]{workflow}[/bold yellow]") - - if use_upload: - # Use new upload-based submission - steps = [ - "Validating workflow configuration", - "Creating tarball (if directory)", - "Uploading target to backend", - "Starting workflow execution", - "Initializing execution environment" - ] - - with step_progress(steps, f"Executing {workflow}") as progress: - progress.next_step() # Validating - time.sleep(PROGRESS_STEP_DELAYS["validating"]) - - progress.next_step() # Creating tarball - time.sleep(PROGRESS_STEP_DELAYS["connecting"]) - - progress.next_step() # Uploading - # Use the new upload method - response = client.submit_workflow_with_upload( - workflow_name=workflow, - target_path=target_path, - parameters=parameters, - timeout=timeout - ) - time.sleep(PROGRESS_STEP_DELAYS["uploading"]) - - progress.next_step() # Starting - time.sleep(PROGRESS_STEP_DELAYS["creating"]) - - progress.next_step() # Initializing - time.sleep(PROGRESS_STEP_DELAYS["initializing"]) - - progress.complete("Workflow started successfully!") - else: - # Fall back to path-based submission (for backward compatibility) - steps = [ - "Validating workflow configuration", - "Connecting to FuzzForge API", - "Submitting workflow parameters", - "Creating workflow deployment", - "Initializing execution environment" - ] - - with step_progress(steps, f"Executing {workflow}") as progress: - progress.next_step() # Validating - time.sleep(PROGRESS_STEP_DELAYS["validating"]) - - progress.next_step() # Connecting - time.sleep(PROGRESS_STEP_DELAYS["connecting"]) - - progress.next_step() # Submitting - submission = WorkflowSubmission( - 
parameters=parameters, - timeout=timeout - ) - response = client.submit_workflow(workflow, submission) - time.sleep(PROGRESS_STEP_DELAYS["uploading"]) - - progress.next_step() # Creating deployment - time.sleep(PROGRESS_STEP_DELAYS["creating"]) - - progress.next_step() # Initializing - time.sleep(PROGRESS_STEP_DELAYS["initializing"]) - - progress.complete("Workflow started successfully!") - - return response - - -# Main workflow execution command (replaces 'runs submit') -@app.command(name="exec", hidden=True) # Hidden because it will be called from main workflow command -def execute_workflow( - workflow: str = typer.Argument(..., help="Workflow name to execute"), - target_path: str = typer.Argument(..., help="Path to analyze"), - params: List[str] = typer.Argument(default=None, help="Parameters as key=value pairs"), - param_file: Optional[str] = typer.Option( - None, "--param-file", "-f", - help="JSON file containing workflow parameters" - ), - timeout: Optional[int] = typer.Option( - None, "--timeout", "-t", - help="Execution timeout in seconds" - ), - interactive: bool = typer.Option( - True, "--interactive/--no-interactive", "-i/-n", - help="Interactive parameter input for missing required parameters" - ), - wait: bool = typer.Option( - False, "--wait", "-w", - help="Wait for execution to complete" - ), - live: bool = typer.Option( - False, "--live", "-l", - help="Start live monitoring after execution (useful for fuzzing workflows)" - ), - auto_start: Optional[bool] = typer.Option( - None, "--auto-start/--no-auto-start", - help="Automatically start required worker if not running (default: from config)" - ), - auto_stop: Optional[bool] = typer.Option( - None, "--auto-stop/--no-auto-stop", - help="Automatically stop worker after execution completes (default: from config)" - ), - fail_on: Optional[str] = typer.Option( - None, "--fail-on", - help="Fail build if findings match SARIF level (error,warning,note,info,all,none). 
Use with --wait" - ), - export_sarif: Optional[str] = typer.Option( - None, "--export-sarif", - help="Export SARIF results to file after completion. Use with --wait" - ) -): - """ - šŸš€ Execute a workflow on a target - - Use --live for fuzzing workflows to see real-time progress. - Use --wait to wait for completion without live dashboard. - Use --fail-on with --wait to fail CI builds based on finding severity. - Use --export-sarif with --wait to export SARIF findings to a file. - """ - try: - # Validate inputs - validate_workflow_name(workflow) - target_path_obj = validate_target_path(target_path, must_exist=True) - target_path = str(target_path_obj.absolute()) - validate_timeout(timeout) - - # Ensure we're in a project directory - require_project() - except Exception as e: - handle_error(e, "validating inputs") - - # Parse parameters - parameters = {} - - # Load from param file - if param_file: - try: - file_params = safe_json_load(param_file) - if isinstance(file_params, dict): - parameters.update(file_params) - else: - raise ValidationError("parameter file", param_file, "a JSON object") - except Exception as e: - handle_error(e, "loading parameter file") - - # Parse inline parameters - if params: - try: - inline_params = parse_inline_parameters(params) - parameters.update(inline_params) - except Exception as e: - handle_error(e, "parsing parameters") - - # Get config for worker management settings - config = get_project_config() or FuzzForgeConfig() - should_auto_start = auto_start if auto_start is not None else config.workers.auto_start_workers - should_auto_stop = auto_stop if auto_stop is not None else config.workers.auto_stop_workers - - worker_service = None # Track for cleanup - worker_mgr = None - wait_completed = False # Track if wait completed successfully - - try: - with get_client() as client: - # Get worker information for this workflow - try: - console.print(f"šŸ” Checking worker requirements for: {workflow}") - worker_info = 
client.get_workflow_worker_info(workflow) - - # Initialize worker manager - compose_file = config.workers.docker_compose_file - worker_mgr = WorkerManager( - compose_file=Path(compose_file) if compose_file else None, - startup_timeout=config.workers.worker_startup_timeout - ) - - # Ensure worker is running - worker_service = worker_info.get("worker_service", f"worker-{worker_info['vertical']}") - if not worker_mgr.ensure_worker_running(worker_info, auto_start=should_auto_start): - console.print( - f"āŒ Worker not available: {worker_info['vertical']}", - style="red" - ) - console.print( - f"šŸ’” Start the worker manually: docker compose up -d {worker_service}" - ) - raise typer.Exit(1) - - except typer.Exit: - raise # Re-raise Exit to preserve exit code - except Exception as e: - # If we can't get worker info, warn but continue (might be old backend) - console.print( - f"āš ļø Could not check worker requirements: {e}", - style="yellow" - ) - console.print( - " Continuing without worker management...", - style="yellow" - ) - - response = execute_workflow_submission( - client, workflow, target_path, parameters, - timeout, interactive - ) - - console.print("āœ… Workflow execution started!", style="green") - console.print(f" Execution ID: [bold cyan]{response.run_id}[/bold cyan]") - console.print(f" Status: {status_emoji(response.status)} {response.status}") - - # Save to database - try: - db = ensure_project_db() - run_record = RunRecord( - run_id=response.run_id, - workflow=workflow, - status=response.status, - target_path=target_path, - parameters=parameters, - created_at=datetime.now() - ) - db.save_run(run_record) - except Exception as e: - # Don't fail the whole operation if database save fails - console.print(f"āš ļø Failed to save execution to database: {e}", style="yellow") - - console.print(f"\nšŸ’” Monitor progress: [bold cyan]fuzzforge monitor live {response.run_id}[/bold cyan]") - console.print(f"šŸ’” Check status: [bold cyan]fuzzforge workflow status 
{response.run_id}[/bold cyan]") - - # Suggest --live for fuzzing workflows - if not live and not wait and "fuzzing" in workflow.lower(): - console.print(f"šŸ’” Next time try: [bold cyan]fuzzforge workflow run {workflow} {target_path} --live[/bold cyan] for real-time monitoring", style="dim") - - # Start live monitoring if requested - if live: - # Check if this is a fuzzing workflow to show appropriate messaging - is_fuzzing = "fuzzing" in workflow.lower() - if is_fuzzing: - console.print("\nšŸ“ŗ Starting live fuzzing monitor...") - console.print("šŸ’” You'll see real-time crash discovery, execution stats, and coverage data.") - else: - console.print("\nšŸ“ŗ Starting live monitoring...") - - console.print("Press Ctrl+C to stop monitoring (execution continues in background).\n") - - try: - from ..commands.monitor import _live_monitor - # Call helper function directly with proper parameters - _live_monitor(response.run_id, refresh=3, once=False, style="inline") - except KeyboardInterrupt: - console.print("\nā¹ļø Live monitoring stopped (execution continues in background)", style="yellow") - except Exception as e: - console.print(f"āš ļø Failed to start live monitoring: {e}", style="yellow") - console.print(f"šŸ’” You can still monitor manually: [bold cyan]fuzzforge monitor live {response.run_id}[/bold cyan]") - - # Wait for completion if requested - elif wait: - console.print("\nā³ Waiting for execution to complete...") - try: - final_status = client.wait_for_completion(response.run_id, poll_interval=POLL_INTERVAL) - - # Update database - try: - db.update_run_status( - response.run_id, - final_status.status, - completed_at=datetime.now() if final_status.is_completed else None - ) - except Exception as e: - console.print(f"āš ļø Failed to update database: {e}", style="yellow") - - console.print(f"šŸ Execution completed with status: {status_emoji(final_status.status)} {final_status.status}") - wait_completed = True # Mark wait as completed - - if 
final_status.is_completed: - # Export SARIF if requested - if export_sarif: - try: - console.print("\nšŸ“¤ Exporting SARIF results...") - findings = client.get_run_findings(response.run_id) - output_path = Path(export_sarif) - with open(output_path, 'w') as f: - json.dump(findings.sarif, f, indent=2) - console.print(f"āœ… SARIF exported to: [bold cyan]{output_path}[/bold cyan]") - except Exception as e: - console.print(f"āš ļø Failed to export SARIF: {e}", style="yellow") - - # Check if build should fail based on findings - if fail_on: - try: - console.print(f"\nšŸ” Checking findings against severity threshold: {fail_on}") - findings = client.get_run_findings(response.run_id) - if should_fail_build(findings.sarif, fail_on): - console.print("āŒ [bold red]Build failed: Found blocking security issues[/bold red]") - console.print(f"šŸ’” View details: [bold cyan]fuzzforge finding {response.run_id}[/bold cyan]") - raise typer.Exit(1) - else: - console.print("āœ… [bold green]No blocking security issues found[/bold green]") - except typer.Exit: - raise # Re-raise Exit to preserve exit code - except Exception as e: - console.print(f"āš ļø Failed to check findings: {e}", style="yellow") - - if not fail_on and not export_sarif: - console.print(f"šŸ’” View findings: [bold cyan]fuzzforge findings {response.run_id}[/bold cyan]") - - except KeyboardInterrupt: - console.print("\nā¹ļø Monitoring cancelled (execution continues in background)", style="yellow") - except typer.Exit: - raise # Re-raise Exit to preserve exit code - except Exception as e: - handle_error(e, "waiting for completion") - - except typer.Exit: - raise # Re-raise Exit to preserve exit code - except Exception as e: - handle_error(e, "executing workflow") - finally: - # Stop worker if auto-stop is enabled and wait completed - if should_auto_stop and worker_service and worker_mgr and wait_completed: - try: - console.print("\nšŸ›‘ Stopping worker (auto-stop enabled)...") - if 
worker_mgr.stop_worker(worker_service): - console.print(f"āœ… Worker stopped: {worker_service}") - except Exception as e: - console.print( - f"āš ļø Failed to stop worker: {e}", - style="yellow" - ) - - -@app.command("status") -def workflow_status( - execution_id: Optional[str] = typer.Argument(None, help="Execution ID to check (defaults to most recent)") -): - """ - šŸ“Š Check the status of a workflow execution - """ - try: - require_project() - - if execution_id: - validate_run_id(execution_id) - - db = get_project_db() - if not db: - raise DatabaseError("get project database", Exception("No database found")) - - # Get execution ID - if not execution_id: - recent_runs = db.list_runs(limit=1) - if not recent_runs: - console.print("āš ļø No executions found in project database", style="yellow") - raise typer.Exit(0) - execution_id = recent_runs[0].run_id - console.print(f"šŸ” Using most recent execution: {execution_id}") - else: - validate_run_id(execution_id) - - # Get status from API - with get_client() as client: - status = client.get_run_status(execution_id) - - # Update local database - try: - db.update_run_status( - execution_id, - status.status, - completed_at=status.updated_at if status.is_completed else None - ) - except Exception as e: - console.print(f"āš ļø Failed to update database: {e}", style="yellow") - - # Display status - console.print(f"\nšŸ“Š [bold]Execution Status: {execution_id}[/bold]\n") - - status_table = Table(show_header=False, box=box.SIMPLE) - status_table.add_column("Property", style="bold cyan") - status_table.add_column("Value") - - status_table.add_row("Execution ID", execution_id) - status_table.add_row("Workflow", status.workflow) - status_table.add_row("Status", f"{status_emoji(status.status)} {status.status}") - status_table.add_row("Created", status.created_at.strftime("%Y-%m-%d %H:%M:%S")) - status_table.add_row("Updated", status.updated_at.strftime("%Y-%m-%d %H:%M:%S")) - - if status.is_completed: - duration = 
status.updated_at - status.created_at - status_table.add_row("Duration", str(duration).split('.')[0]) # Remove microseconds - - console.print( - Panel.fit( - status_table, - title="šŸ“Š Status Information", - box=box.ROUNDED - ) - ) - - # Show next steps - if status.is_running: - console.print(f"\nšŸ’” Monitor live: [bold cyan]fuzzforge monitor live {execution_id}[/bold cyan]") - elif status.is_completed: - console.print(f"šŸ’” View findings: [bold cyan]fuzzforge finding {execution_id}[/bold cyan]") - elif status.is_failed: - console.print(f"šŸ’” Check logs: [bold cyan]fuzzforge workflow logs {execution_id}[/bold cyan]") - - except Exception as e: - handle_error(e, "getting execution status") - - -@app.command("history") -def workflow_history( - workflow: Optional[str] = typer.Option(None, "--workflow", "-w", help="Filter by workflow name"), - status: Optional[str] = typer.Option(None, "--status", "-s", help="Filter by status"), - limit: int = typer.Option(20, "--limit", "-l", help="Maximum number of executions to show") -): - """ - šŸ“‹ Show workflow execution history - """ - try: - require_project() - - if limit <= 0: - raise ValidationError("limit", limit, "a positive integer") - - db = get_project_db() - if not db: - raise DatabaseError("get project database", Exception("No database found")) - runs = db.list_runs(workflow=workflow, status=status, limit=limit) - - if not runs: - console.print("āš ļø No executions found matching criteria", style="yellow") - return - - table = Table(box=box.ROUNDED) - table.add_column("Execution ID", style="bold cyan") - table.add_column("Workflow", style="bold") - table.add_column("Status", justify="center") - table.add_column("Target", style="dim") - table.add_column("Created", justify="center") - table.add_column("Parameters", justify="center", style="dim") - - for run in runs: - param_count = len(run.parameters) if run.parameters else 0 - param_str = f"{param_count} params" if param_count > 0 else "-" - - table.add_row( - 
run.run_id[:MAX_RUN_ID_DISPLAY_LENGTH] + "..." if len(run.run_id) > MAX_RUN_ID_DISPLAY_LENGTH else run.run_id, - run.workflow, - f"{status_emoji(run.status)} {run.status}", - Path(run.target_path).name, - run.created_at.strftime("%m-%d %H:%M"), - param_str - ) - - console.print(f"\nšŸ“‹ [bold]Workflow Execution History ({len(runs)})[/bold]") - if workflow: - console.print(f" Filtered by workflow: {workflow}") - if status: - console.print(f" Filtered by status: {status}") - console.print() - console.print(table) - - console.print("\nšŸ’” Use [bold cyan]fuzzforge workflow status <execution-id>[/bold cyan] for detailed status") - - except Exception as e: - handle_error(e, "listing execution history") - - -@app.command("retry") -def retry_workflow( - execution_id: Optional[str] = typer.Argument(None, help="Execution ID to retry (defaults to most recent)"), - modify_params: bool = typer.Option( - False, "--modify-params", "-m", - help="Interactively modify parameters before retrying" - ) -): - """ - šŸ”„ Retry a workflow execution with the same or modified parameters - """ - try: - require_project() - - db = get_project_db() - if not db: - raise DatabaseError("get project database", Exception("No database found")) - - # Get execution ID if not provided - if not execution_id: - recent_runs = db.list_runs(limit=1) - if not recent_runs: - console.print("āš ļø No executions found to retry", style="yellow") - raise typer.Exit(0) - execution_id = recent_runs[0].run_id - console.print(f"šŸ”„ Retrying most recent execution: {execution_id}") - else: - validate_run_id(execution_id) - - # Get original execution - original_run = db.get_run(execution_id) - if not original_run: - raise ValidationError("execution_id", execution_id, "an existing execution ID in the database") - - console.print(f"šŸ”„ [bold]Retrying workflow:[/bold] {original_run.workflow}") - console.print(f" Original Execution ID: {execution_id}") - console.print(f" Target: {original_run.target_path}") - - parameters = original_run.parameters.copy() - 
# Modify parameters if requested - if modify_params and parameters: - console.print("\nšŸ“ [bold]Current parameters:[/bold]") - for key, value in parameters.items(): - new_value = Prompt.ask( - f"{key}", - default=str(value), - console=console - ) - if new_value != str(value): - # Try to maintain type - try: - if isinstance(value, bool): - parameters[key] = new_value.lower() in ("true", "yes", "1", "on") - elif isinstance(value, int): - parameters[key] = int(new_value) - elif isinstance(value, float): - parameters[key] = float(new_value) - elif isinstance(value, list): - parameters[key] = [item.strip() for item in new_value.split(",") if item.strip()] - else: - parameters[key] = new_value - except ValueError: - parameters[key] = new_value - - # Submit new execution - with get_client() as client: - submission = WorkflowSubmission( - target_path=original_run.target_path, - parameters=parameters - ) - - response = client.submit_workflow(original_run.workflow, submission) - - console.print("\nāœ… Retry submitted successfully!", style="green") - console.print(f" New Execution ID: [bold cyan]{response.run_id}[/bold cyan]") - console.print(f" Status: {status_emoji(response.status)} {response.status}") - - # Save to database - try: - run_record = RunRecord( - run_id=response.run_id, - workflow=original_run.workflow, - status=response.status, - target_path=original_run.target_path, - parameters=parameters, - created_at=datetime.now(), - metadata={"retry_of": execution_id} - ) - db.save_run(run_record) - except Exception as e: - console.print(f"āš ļø Failed to save execution to database: {e}", style="yellow") - - console.print(f"\nšŸ’” Monitor progress: [bold cyan]fuzzforge monitor live {response.run_id}[/bold cyan]") - - except Exception as e: - handle_error(e, "retrying workflow") - - -@app.callback() -def workflow_exec_callback(): - """ - šŸš€ Workflow execution management - """ \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/commands/workflows.py 
b/cli/src/fuzzforge_cli/commands/workflows.py deleted file mode 100644 index e38d247..0000000 --- a/cli/src/fuzzforge_cli/commands/workflows.py +++ /dev/null @@ -1,304 +0,0 @@ -""" -Workflow management commands. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import json -import typer -from rich.console import Console -from rich.table import Table -from rich.panel import Panel -from rich.prompt import Prompt -from rich.syntax import Syntax -from rich import box -from typing import Optional - -from ..config import get_project_config, FuzzForgeConfig -from ..fuzzy import enhanced_workflow_not_found_handler -from fuzzforge_sdk import FuzzForgeClient - -console = Console() -app = typer.Typer() - - -def get_client() -> FuzzForgeClient: - """Get configured FuzzForge client""" - config = get_project_config() or FuzzForgeConfig() - return FuzzForgeClient(base_url=config.get_api_url(), timeout=config.get_timeout()) - - -@app.command("list") -def list_workflows(): - """ - šŸ“‹ List all available security testing workflows - """ - try: - with get_client() as client: - workflows = client.list_workflows() - - if not workflows: - console.print("āŒ No workflows available", style="red") - return - - table = Table(box=box.ROUNDED) - table.add_column("Name", style="bold cyan") - table.add_column("Version", justify="center") - table.add_column("Description") - table.add_column("Tags", style="dim") - - for workflow in workflows: - tags_str = ", ".join(workflow.tags) if workflow.tags else "" - table.add_row( - workflow.name, - workflow.version, - 
workflow.description, - tags_str - ) - - console.print(f"\nšŸ”§ [bold]Available Workflows ({len(workflows)})[/bold]\n") - console.print(table) - - console.print("\nšŸ’” Use [bold cyan]fuzzforge workflows info <name>[/bold cyan] for detailed information") - - except Exception as e: - console.print(f"āŒ Failed to fetch workflows: {e}", style="red") - raise typer.Exit(1) - - -@app.command("info") -def workflow_info( - name: str = typer.Argument(..., help="Workflow name to get information about") -): - """ - šŸ“‹ Show detailed information about a specific workflow - """ - try: - with get_client() as client: - workflow = client.get_workflow_metadata(name) - - console.print(f"\nšŸ”§ [bold]Workflow: {workflow.name}[/bold]\n") - - # Basic information - info_table = Table(show_header=False, box=box.SIMPLE) - info_table.add_column("Property", style="bold cyan") - info_table.add_column("Value") - - info_table.add_row("Name", workflow.name) - info_table.add_row("Version", workflow.version) - info_table.add_row("Description", workflow.description) - if workflow.author: - info_table.add_row("Author", workflow.author) - if workflow.tags: - info_table.add_row("Tags", ", ".join(workflow.tags)) - info_table.add_row("Custom Docker", "āœ… Yes" if workflow.has_custom_docker else "āŒ No") - - console.print( - Panel.fit( - info_table, - title="ā„¹ļø Basic Information", - box=box.ROUNDED - ) - ) - - # Parameters - if workflow.parameters: - console.print("\nšŸ“ [bold]Parameters Schema[/bold]") - - param_table = Table(box=box.ROUNDED) - param_table.add_column("Parameter", style="bold") - param_table.add_column("Type", style="cyan") - param_table.add_column("Required", justify="center") - param_table.add_column("Default") - param_table.add_column("Description", style="dim") - - # Extract parameter information from JSON schema - properties = workflow.parameters.get("properties", {}) - required_params = set(workflow.parameters.get("required", [])) - defaults = workflow.default_parameters - - for 
param_name, param_schema in properties.items(): - param_type = param_schema.get("type", "unknown") - is_required = "āœ…" if param_name in required_params else "āŒ" - default_val = str(defaults.get(param_name, "")) if param_name in defaults else "" - description = param_schema.get("description", "") - - # Handle array types - if param_type == "array": - items_type = param_schema.get("items", {}).get("type", "unknown") - param_type = f"array[{items_type}]" - - param_table.add_row( - param_name, - param_type, - is_required, - default_val[:30] + "..." if len(default_val) > 30 else default_val, - description[:50] + "..." if len(description) > 50 else description - ) - - console.print(param_table) - - # Required modules - if workflow.required_modules: - console.print(f"\nšŸ”§ [bold]Required Modules:[/bold] {', '.join(workflow.required_modules)}") - - console.print(f"\nšŸ’” Use [bold cyan]fuzzforge workflows parameters {name}[/bold cyan] for interactive parameter builder") - - except Exception as e: - error_message = str(e) - if "not found" in error_message.lower() or "404" in error_message: - # Try fuzzy matching for workflow name - enhanced_workflow_not_found_handler(name) - else: - console.print(f"āŒ Failed to get workflow info: {e}", style="red") - raise typer.Exit(1) - - -@app.command("parameters") -def workflow_parameters( - name: str = typer.Argument(..., help="Workflow name"), - output_file: Optional[str] = typer.Option( - None, "--output", "-o", - help="Save parameters to JSON file" - ), - interactive: bool = typer.Option( - True, "--interactive/--no-interactive", "-i/-n", - help="Interactive parameter builder" - ) -): - """ - šŸ“ Interactive parameter builder for workflows - """ - try: - with get_client() as client: - workflow = client.get_workflow_metadata(name) - param_response = client.get_workflow_parameters(name) - - console.print(f"\nšŸ“ [bold]Parameter Builder: {name}[/bold]\n") - - if not workflow.parameters.get("properties"): - console.print("ā„¹ļø 
This workflow has no configurable parameters") - return - - parameters = {} - properties = workflow.parameters.get("properties", {}) - required_params = set(workflow.parameters.get("required", [])) - defaults = param_response.default_parameters - - if interactive: - console.print("šŸ”§ Enter parameter values (press Enter for default):\n") - - for param_name, param_schema in properties.items(): - param_type = param_schema.get("type", "string") - description = param_schema.get("description", "") - is_required = param_name in required_params - default_value = defaults.get(param_name) - - # Build prompt - prompt_text = f"{param_name}" - if description: - prompt_text += f" ({description})" - if param_type: - prompt_text += f" [{param_type}]" - if is_required: - prompt_text += " [bold red]*required*[/bold red]" - - # Get user input - while True: - if default_value is not None: - user_input = Prompt.ask( - prompt_text, - default=str(default_value), - console=console - ) - else: - user_input = Prompt.ask( - prompt_text, - console=console - ) - - # Validate and convert input - if user_input.strip() == "" and not is_required: - break - - if user_input.strip() == "" and is_required: - console.print("āŒ This parameter is required", style="red") - continue - - try: - # Type conversion - if param_type == "integer": - parameters[param_name] = int(user_input) - elif param_type == "number": - parameters[param_name] = float(user_input) - elif param_type == "boolean": - parameters[param_name] = user_input.lower() in ("true", "yes", "1", "on") - elif param_type == "array": - # Simple comma-separated array - parameters[param_name] = [item.strip() for item in user_input.split(",") if item.strip()] - else: - parameters[param_name] = user_input - - break - - except ValueError as e: - console.print(f"āŒ Invalid {param_type}: {e}", style="red") - - # Show summary - console.print("\nšŸ“‹ [bold]Parameter Summary:[/bold]") - summary_table = Table(show_header=False, box=box.SIMPLE) - 
summary_table.add_column("Parameter", style="cyan") - summary_table.add_column("Value", style="white") - - for key, value in parameters.items(): - summary_table.add_row(key, str(value)) - - console.print(summary_table) - - else: - # Non-interactive mode - show schema - console.print("šŸ“‹ Parameter Schema:") - schema_json = json.dumps(workflow.parameters, indent=2) - console.print(Syntax(schema_json, "json", theme="monokai")) - - if defaults: - console.print("\nšŸ“‹ Default Values:") - defaults_json = json.dumps(defaults, indent=2) - console.print(Syntax(defaults_json, "json", theme="monokai")) - - # Save to file if requested - if output_file: - if parameters or not interactive: - data_to_save = parameters if interactive else {"schema": workflow.parameters, "defaults": defaults} - with open(output_file, 'w') as f: - json.dump(data_to_save, f, indent=2) - console.print(f"\nšŸ’¾ Parameters saved to: {output_file}") - else: - console.print("\nāŒ No parameters to save", style="red") - - except Exception as e: - console.print(f"āŒ Failed to build parameters: {e}", style="red") - raise typer.Exit(1) - - -@app.callback(invoke_without_command=True) -def workflows_callback(ctx: typer.Context): - """ - šŸ”§ Manage security testing workflows - """ - # Check if a subcommand is being invoked - if ctx.invoked_subcommand is not None: - # Let the subcommand handle it - return - - # Default to list when no subcommand provided - list_workflows() \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/completion.py b/cli/src/fuzzforge_cli/completion.py deleted file mode 100644 index 7bd7c5b..0000000 --- a/cli/src/fuzzforge_cli/completion.py +++ /dev/null @@ -1,178 +0,0 @@ -""" -Shell auto-completion support for FuzzForge CLI. - -Provides intelligent tab completion for commands, workflows, run IDs, and parameters. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). 
See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import typer -from typing import List -from pathlib import Path - -from .config import get_project_config, FuzzForgeConfig -from .database import get_project_db -from fuzzforge_sdk import FuzzForgeClient - - -def complete_workflow_names(incomplete: str) -> List[str]: - """Auto-complete workflow names from the API.""" - try: - config = get_project_config() or FuzzForgeConfig() - with FuzzForgeClient(base_url=config.get_api_url(), timeout=5.0) as client: - workflows = client.list_workflows() - workflow_names = [w.name for w in workflows] - return [name for name in workflow_names if name.startswith(incomplete)] - except Exception: - # Fallback to common workflow names if API is unavailable - common_workflows = [ - "security_assessment", - "language_fuzzing", - "infrastructure_scan", - "static_analysis_scan", - "penetration_testing_scan", - "secret_detection_scan" - ] - return [name for name in common_workflows if name.startswith(incomplete)] - - -def complete_run_ids(incomplete: str) -> List[str]: - """Auto-complete run IDs from local database.""" - try: - db = get_project_db() - if db: - runs = db.get_recent_runs(limit=50) # Get recent runs for completion - run_ids = [run.run_id for run in runs] - return [run_id for run_id in run_ids if run_id.startswith(incomplete)] - except Exception: - pass - return [] - - -def complete_target_paths(incomplete: str) -> List[str]: - """Auto-complete file/directory paths.""" - try: - # Convert incomplete path to Path object - path = Path(incomplete) if incomplete else Path.cwd() - - if path.is_dir(): - # Complete directory contents - try: - entries = [] 
- for entry in path.iterdir(): - entry_str = str(entry) - if entry.is_dir(): - entry_str += "/" - entries.append(entry_str) - return entries - except PermissionError: - return [] - else: - # Complete parent directory contents that match the incomplete name - parent = path.parent - name = path.name - try: - entries = [] - for entry in parent.iterdir(): - if entry.name.startswith(name): - entry_str = str(entry) - if entry.is_dir(): - entry_str += "/" - entries.append(entry_str) - return entries - except (PermissionError, FileNotFoundError): - return [] - except Exception: - return [] - - -def complete_export_formats(incomplete: str) -> List[str]: - """Auto-complete export formats.""" - formats = ["json", "csv", "html", "sarif"] - return [fmt for fmt in formats if fmt.startswith(incomplete)] - - -def complete_severity_levels(incomplete: str) -> List[str]: - """Auto-complete severity levels.""" - severities = ["critical", "high", "medium", "low", "info"] - return [sev for sev in severities if sev.startswith(incomplete)] - - -def complete_workflow_tags(incomplete: str) -> List[str]: - """Auto-complete workflow tags.""" - try: - config = get_project_config() or FuzzForgeConfig() - with FuzzForgeClient(base_url=config.get_api_url(), timeout=5.0) as client: - workflows = client.list_workflows() - all_tags = set() - for w in workflows: - if w.tags: - all_tags.update(w.tags) - return [tag for tag in sorted(all_tags) if tag.startswith(incomplete)] - except Exception: - # Fallback tags - common_tags = [ - "security", "fuzzing", "static-analysis", "infrastructure", - "secrets", "containers", "vulnerabilities", "pentest" - ] - return [tag for tag in common_tags if tag.startswith(incomplete)] - - -def complete_config_keys(incomplete: str) -> List[str]: - """Auto-complete configuration keys.""" - config_keys = [ - "api_url", - "api_timeout", - "default_workflow", - "project_name", - "data_retention_days", - "auto_save_findings", - "notification_webhook" - ] - return [key for key 
in config_keys if key.startswith(incomplete)] - - -# Completion callbacks for Typer -WorkflowNameComplete = typer.Option( - autocompletion=complete_workflow_names, - help="Workflow name (tab completion available)" -) - -RunIdComplete = typer.Option( - autocompletion=complete_run_ids, - help="Run ID (tab completion available)" -) - -TargetPathComplete = typer.Argument( - autocompletion=complete_target_paths, - help="Target path (tab completion available)" -) - -ExportFormatComplete = typer.Option( - autocompletion=complete_export_formats, - help="Export format (tab completion available)" -) - -SeverityComplete = typer.Option( - autocompletion=complete_severity_levels, - help="Severity level (tab completion available)" -) - -WorkflowTagComplete = typer.Option( - autocompletion=complete_workflow_tags, - help="Workflow tag (tab completion available)" -) - -ConfigKeyComplete = typer.Option( - autocompletion=complete_config_keys, - help="Configuration key (tab completion available)" -) \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/config.py b/cli/src/fuzzforge_cli/config.py deleted file mode 100644 index 1a0ae28..0000000 --- a/cli/src/fuzzforge_cli/config.py +++ /dev/null @@ -1,491 +0,0 @@ -""" -Configuration management for FuzzForge CLI. - -Extends project configuration with Cognee integration metadata -and provides helpers for AI modules. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
- - -from __future__ import annotations - -import hashlib -import os -from pathlib import Path -from typing import Any, Dict, Optional - -try: # Optional dependency; fall back if not installed - from dotenv import load_dotenv -except ImportError: # pragma: no cover - optional dependency - load_dotenv = None - - -def _load_env_file_if_exists(path: Path, override: bool = False) -> bool: - if not path.exists(): - return False - # Always use manual parsing to handle empty values correctly - try: - for line in path.read_text(encoding="utf-8").splitlines(): - stripped = line.strip() - if not stripped or stripped.startswith("#") or "=" not in stripped: - continue - key, value = stripped.split("=", 1) - key = key.strip() - value = value.strip() - if override: - # Only override if value is non-empty - if value: - os.environ[key] = value - else: - # Set if not already in environment and value is non-empty - if key not in os.environ and value: - os.environ[key] = value - return True - except Exception: # pragma: no cover - best effort fallback - return False - - -def _find_shared_env_file(project_dir: Path) -> Path | None: - for directory in [project_dir] + list(project_dir.parents): - candidate = directory / "volumes" / "env" / ".env" - if candidate.exists(): - return candidate - return None - - -def load_project_env(project_dir: Optional[Path] = None) -> Path | None: - """Load project-local env, falling back to shared volumes/env/.env.""" - - project_dir = Path(project_dir or Path.cwd()) - shared_env = _find_shared_env_file(project_dir) - loaded_shared = False - if shared_env: - loaded_shared = _load_env_file_if_exists(shared_env, override=False) - - project_env = project_dir / ".fuzzforge" / ".env" - if _load_env_file_if_exists(project_env, override=True): - return project_env - - if loaded_shared: - return shared_env - - return None - -import yaml -from pydantic import BaseModel, Field - - -def _generate_project_id(project_dir: Path, project_name: str) -> str: - 
"""Generate a deterministic project identifier based on path and name.""" - resolved_path = str(project_dir.resolve()) - hash_input = f"{resolved_path}:{project_name}".encode() - return hashlib.sha256(hash_input).hexdigest()[:16] - - -class ProjectConfig(BaseModel): - """Project configuration model.""" - - name: str = "fuzzforge-project" - api_url: str = "http://localhost:8000" - default_timeout: int = 3600 - default_workflow: Optional[str] = None - id: Optional[str] = None - tenant_id: Optional[str] = None - - -class RetentionConfig(BaseModel): - """Data retention configuration.""" - - max_runs: int = 100 - keep_findings_days: int = 90 - - -class PreferencesConfig(BaseModel): - """User preferences.""" - - auto_save_findings: bool = True - show_progress_bars: bool = True - table_style: str = "rich" - color_output: bool = True - - -class WorkerConfig(BaseModel): - """Worker lifecycle management configuration.""" - - auto_start_workers: bool = True - auto_stop_workers: bool = False - worker_startup_timeout: int = 60 - docker_compose_file: Optional[str] = None - - -class CogneeConfig(BaseModel): - """Cognee integration metadata.""" - - enabled: bool = True - graph_database_provider: str = "kuzu" - data_directory: Optional[str] = None - system_directory: Optional[str] = None - backend_access_control: bool = True - project_id: Optional[str] = None - tenant_id: Optional[str] = None - - -class FuzzForgeConfig(BaseModel): - """Complete FuzzForge CLI configuration.""" - - project: ProjectConfig = Field(default_factory=ProjectConfig) - retention: RetentionConfig = Field(default_factory=RetentionConfig) - preferences: PreferencesConfig = Field(default_factory=PreferencesConfig) - workers: WorkerConfig = Field(default_factory=WorkerConfig) - cognee: CogneeConfig = Field(default_factory=CogneeConfig) - - @classmethod - def from_file(cls, config_path: Path) -> "FuzzForgeConfig": - """Load configuration from YAML file.""" - if not config_path.exists(): - return cls() - - try: - 
-            with open(config_path, "r", encoding="utf-8") as fh:
-                data = yaml.safe_load(fh) or {}
-            return cls(**data)
-        except Exception as exc:  # pragma: no cover - defensive fallback
-            print(f"Warning: Failed to load config from {config_path}: {exc}")
-            return cls()
-
-    def save_to_file(self, config_path: Path) -> None:
-        """Save configuration to YAML file."""
-        config_path.parent.mkdir(parents=True, exist_ok=True)
-        with open(config_path, "w", encoding="utf-8") as fh:
-            yaml.dump(
-                self.model_dump(),
-                fh,
-                default_flow_style=False,
-                sort_keys=False,
-            )
-
-    # ------------------------------------------------------------------
-    # Convenience helpers used by CLI and AI modules
-    # ------------------------------------------------------------------
-    def ensure_project_metadata(self, project_dir: Path) -> bool:
-        """Ensure project id/tenant metadata is populated."""
-        changed = False
-        project = self.project
-        if not project.id:
-            project.id = _generate_project_id(project_dir, project.name)
-            changed = True
-        if not project.tenant_id:
-            project.tenant_id = f"fuzzforge_project_{project.id}"
-            changed = True
-        return changed
-
-    def ensure_cognee_defaults(self, project_dir: Path) -> bool:
-        """Ensure Cognee configuration and directories exist."""
-        self.ensure_project_metadata(project_dir)
-        changed = False
-
-        cognee = self.cognee
-        if not cognee.project_id:
-            cognee.project_id = self.project.id
-            changed = True
-        if not cognee.tenant_id:
-            cognee.tenant_id = self.project.tenant_id
-            changed = True
-
-        base_dir = project_dir / ".fuzzforge" / "cognee" / f"project_{self.project.id}"
-        data_dir = base_dir / "data"
-        system_dir = base_dir / "system"
-
-        for path in (
-            base_dir,
-            data_dir,
-            system_dir,
-            system_dir / "kuzu_db",
-            system_dir / "lancedb",
-        ):
-            if not path.exists():
-                path.mkdir(parents=True, exist_ok=True)
-
-        if cognee.data_directory != str(data_dir):
-            cognee.data_directory = str(data_dir)
-            changed = True
-        if cognee.system_directory != str(system_dir):
-            cognee.system_directory = str(system_dir)
-            changed = True
-
-        return changed
-
-    def get_api_url(self) -> str:
-        """Get API URL with environment variable override."""
-        return os.getenv("FUZZFORGE_API_URL", self.project.api_url)
-
-    def get_timeout(self) -> int:
-        """Get timeout with environment variable override."""
-        env_timeout = os.getenv("FUZZFORGE_TIMEOUT")
-        if env_timeout and env_timeout.isdigit():
-            return int(env_timeout)
-        return self.project.default_timeout
-
-    def get_project_context(self, project_dir: Path) -> Dict[str, str]:
-        """Return project metadata for AI integrations."""
-        self.ensure_cognee_defaults(project_dir)
-        return {
-            "project_id": self.project.id or "unknown_project",
-            "project_name": self.project.name,
-            "tenant_id": self.project.tenant_id or "fuzzforge_tenant",
-            "data_directory": self.cognee.data_directory,
-            "system_directory": self.cognee.system_directory,
-        }
-
-    def get_cognee_config(self, project_dir: Path) -> Dict[str, Any]:
-        """Expose Cognee configuration as a plain dictionary."""
-        self.ensure_cognee_defaults(project_dir)
-        return self.cognee.model_dump()
-
-
-# ----------------------------------------------------------------------
-# Project-level helpers used across the CLI
-# ----------------------------------------------------------------------
-
-def _get_project_paths(project_dir: Path) -> Dict[str, Path]:
-    config_dir = project_dir / ".fuzzforge"
-    return {
-        "config_dir": config_dir,
-        "config_path": config_dir / "config.yaml",
-    }
-
-
-def get_project_config(project_dir: Optional[Path] = None) -> Optional[FuzzForgeConfig]:
-    """Get configuration for the current project."""
-    project_dir = Path(project_dir or Path.cwd())
-    paths = _get_project_paths(project_dir)
-    config_path = paths["config_path"]
-
-    if not config_path.exists():
-        return None
-
-    config = FuzzForgeConfig.from_file(config_path)
-    if config.ensure_cognee_defaults(project_dir):
-        config.save_to_file(config_path)
-    return config
-
-
-def ensure_project_config(
-    project_dir: Optional[Path] = None,
-    project_name: Optional[str] = None,
-    api_url: Optional[str] = None,
-) -> FuzzForgeConfig:
-    """Ensure project configuration exists, creating defaults if needed."""
-    project_dir = Path(project_dir or Path.cwd())
-    paths = _get_project_paths(project_dir)
-    config_dir = paths["config_dir"]
-    config_path = paths["config_path"]
-
-    config_dir.mkdir(parents=True, exist_ok=True)
-
-    if config_path.exists():
-        config = FuzzForgeConfig.from_file(config_path)
-    else:
-        config = FuzzForgeConfig()
-
-    if project_name:
-        config.project.name = project_name
-    if api_url:
-        config.project.api_url = api_url
-
-    if config.ensure_cognee_defaults(project_dir):
-        config.save_to_file(config_path)
-    else:
-        # Still ensure latest values persisted (e.g., updated name/url)
-        config.save_to_file(config_path)
-
-    return config
-
-
-def get_global_config() -> FuzzForgeConfig:
-    """Get global user configuration."""
-    home = Path.home()
-    global_config_dir = home / ".config" / "fuzzforge"
-    global_config_path = global_config_dir / "config.yaml"
-
-    if global_config_path.exists():
-        return FuzzForgeConfig.from_file(global_config_path)
-
-    return FuzzForgeConfig()
-
-
-def save_global_config(config: FuzzForgeConfig) -> None:
-    """Save global user configuration."""
-    home = Path.home()
-    global_config_dir = home / ".config" / "fuzzforge"
-    global_config_path = global_config_dir / "config.yaml"
-    config.save_to_file(global_config_path)
-
-
-# ----------------------------------------------------------------------
-# Compatibility layer for AI modules
-# ----------------------------------------------------------------------
-
-class ProjectConfigManager:
-    """Lightweight wrapper mimicking the legacy Config class used by the AI module."""
-
-    def __init__(self, project_dir: Optional[Path] = None):
-        self.project_dir = Path(project_dir or Path.cwd())
-        paths = _get_project_paths(self.project_dir)
-        self.config_path = paths["config_dir"]
-        self.file_path = paths["config_path"]
-        self._config = get_project_config(self.project_dir)
-        if self._config is None:
-            raise FileNotFoundError(
-                f"FuzzForge project not initialized in {self.project_dir}. Run 'ff init'."
-            )
-
-    # Legacy API ------------------------------------------------------
-    def is_initialized(self) -> bool:
-        return self.file_path.exists()
-
-    def get_project_context(self) -> Dict[str, str]:
-        return self._config.get_project_context(self.project_dir)
-
-    def get_cognee_config(self) -> Dict[str, Any]:
-        return self._config.get_cognee_config(self.project_dir)
-
-    def setup_cognee_environment(self) -> None:
-        cognee = self.get_cognee_config()
-        if not cognee.get("enabled", True):
-            return
-
-        load_project_env(self.project_dir)
-
-        backend_access = "true" if cognee.get("backend_access_control", True) else "false"
-        os.environ["ENABLE_BACKEND_ACCESS_CONTROL"] = backend_access
-        os.environ["GRAPH_DATABASE_PROVIDER"] = cognee.get("graph_database_provider", "kuzu")
-
-        data_dir = cognee.get("data_directory")
-        system_dir = cognee.get("system_directory")
-        tenant_id = cognee.get("tenant_id", "fuzzforge_tenant")
-
-        if data_dir:
-            os.environ["COGNEE_DATA_ROOT"] = data_dir
-        if system_dir:
-            os.environ["COGNEE_SYSTEM_ROOT"] = system_dir
-        os.environ["COGNEE_USER_ID"] = tenant_id
-        os.environ["COGNEE_TENANT_ID"] = tenant_id
-
-        # Configure LLM provider defaults for Cognee. Values prefixed with COGNEE_
-        # take precedence so users can segregate credentials.
-        def _env(*names: str, default: str | None = None) -> str | None:
-            for name in names:
-                value = os.getenv(name)
-                if value:
-                    return value
-            return default
-
-        provider = _env(
-            "LLM_COGNEE_PROVIDER",
-            "COGNEE_LLM_PROVIDER",
-            "LLM_PROVIDER",
-            default="openai",
-        )
-        model = _env(
-            "LLM_COGNEE_MODEL",
-            "COGNEE_LLM_MODEL",
-            "LLM_MODEL",
-            "LITELLM_MODEL",
-            default="gpt-4o-mini",
-        )
-        api_key = _env(
-            "LLM_COGNEE_API_KEY",
-            "COGNEE_LLM_API_KEY",
-            "LLM_API_KEY",
-            "OPENAI_API_KEY",
-        )
-        endpoint = _env("LLM_COGNEE_ENDPOINT", "COGNEE_LLM_ENDPOINT", "LLM_ENDPOINT")
-        embedding_model = _env(
-            "LLM_COGNEE_EMBEDDING_MODEL",
-            "COGNEE_LLM_EMBEDDING_MODEL",
-            "LLM_EMBEDDING_MODEL",
-        )
-        embedding_endpoint = _env(
-            "LLM_COGNEE_EMBEDDING_ENDPOINT",
-            "COGNEE_LLM_EMBEDDING_ENDPOINT",
-            "LLM_EMBEDDING_ENDPOINT",
-            "LLM_ENDPOINT",
-        )
-        api_version = _env(
-            "LLM_COGNEE_API_VERSION",
-            "COGNEE_LLM_API_VERSION",
-            "LLM_API_VERSION",
-        )
-        max_tokens = _env(
-            "LLM_COGNEE_MAX_TOKENS",
-            "COGNEE_LLM_MAX_TOKENS",
-            "LLM_MAX_TOKENS",
-        )
-
-        if provider:
-            os.environ["LLM_PROVIDER"] = provider
-        if model:
-            os.environ["LLM_MODEL"] = model
-            # Maintain backwards compatibility with components expecting LITELLM_MODEL
-            os.environ.setdefault("LITELLM_MODEL", model)
-        if api_key:
-            os.environ["LLM_API_KEY"] = api_key
-            # Provide OPENAI_API_KEY fallback when using OpenAI-compatible providers
-            if provider and provider.lower() in {"openai", "azure_openai", "custom"}:
-                os.environ.setdefault("OPENAI_API_KEY", api_key)
-        if endpoint:
-            os.environ["LLM_ENDPOINT"] = endpoint
-            os.environ.setdefault("LLM_API_BASE", endpoint)
-            os.environ.setdefault("LLM_EMBEDDING_ENDPOINT", endpoint)
-            os.environ.setdefault("LLM_EMBEDDING_API_BASE", endpoint)
-            os.environ.setdefault("OPENAI_API_BASE", endpoint)
-            # Set LiteLLM proxy environment variables for SDK usage
-            os.environ.setdefault("LITELLM_PROXY_API_BASE", endpoint)
-            if api_key:
-                # Set LiteLLM proxy API key from the virtual key
-                os.environ.setdefault("LITELLM_PROXY_API_KEY", api_key)
-        if embedding_model:
-            os.environ["LLM_EMBEDDING_MODEL"] = embedding_model
-        if embedding_endpoint:
-            os.environ["LLM_EMBEDDING_ENDPOINT"] = embedding_endpoint
-            os.environ.setdefault("LLM_EMBEDDING_API_BASE", embedding_endpoint)
-        if api_version:
-            os.environ["LLM_API_VERSION"] = api_version
-        if max_tokens:
-            os.environ["LLM_MAX_TOKENS"] = str(max_tokens)
-
-        # FuzzForge MCP backend connection - fallback if not in .env
-        if not os.getenv("FUZZFORGE_MCP_URL"):
-            os.environ["FUZZFORGE_MCP_URL"] = os.getenv(
-                "FUZZFORGE_DEFAULT_MCP_URL",
-                "http://localhost:8010/mcp",
-            )
-
-    def refresh(self) -> None:
-        """Reload configuration from disk."""
-        self._config = get_project_config(self.project_dir)
-        if self._config is None:
-            raise FileNotFoundError(
-                f"FuzzForge project not initialized in {self.project_dir}. Run 'ff init'."
-            )
-
-    # Convenience accessors ------------------------------------------
-    @property
-    def fuzzforge_dir(self) -> Path:
-        return self.config_path
-
-    def get_api_url(self) -> str:
-        return self._config.get_api_url()
-
-    def get_timeout(self) -> int:
-        return self._config.get_timeout()
diff --git a/cli/src/fuzzforge_cli/constants.py b/cli/src/fuzzforge_cli/constants.py
deleted file mode 100644
index 493dfb0..0000000
--- a/cli/src/fuzzforge_cli/constants.py
+++ /dev/null
@@ -1,69 +0,0 @@
-"""
-Constants for FuzzForge CLI.
-"""
-# Copyright (c) 2025 FuzzingLabs
-#
-# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
-# at the root of this repository for details.
-#
-# After the Change Date (four years from publication), this version of the
-# Licensed Work will be made available under the Apache License, Version 2.0.
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
-#
-# Additional attribution and requirements are provided in the NOTICE file.
-
-
-# Database constants
-DEFAULT_DB_TIMEOUT = 30.0
-DEFAULT_CLEANUP_DAYS = 90
-STATS_SAMPLE_SIZE = 100
-
-# Network constants
-DEFAULT_API_TIMEOUT = 30.0
-MAX_RETRIES = 3
-RETRY_DELAY = 1.0
-POLL_INTERVAL = 5.0
-
-# Display constants
-MAX_RUN_ID_DISPLAY_LENGTH = 15
-MAX_DESCRIPTION_LENGTH = 50
-MAX_DEFAULT_VALUE_LENGTH = 30
-
-# Progress constants
-PROGRESS_STEP_DELAYS = {
-    "validating": 0.3,
-    "connecting": 0.2,
-    "uploading": 0.2,
-    "creating": 0.3,
-    "initializing": 0.2
-}
-
-# Status emojis
-STATUS_EMOJIS = {
-    "completed": "āœ…",
-    "running": "šŸ”„",
-    "failed": "āŒ",
-    "queued": "ā³",
-    "cancelled": "ā¹ļø",
-    "pending": "šŸ“‹",
-    "unknown": "ā“"
-}
-
-# Severity styles for Rich
-SEVERITY_STYLES = {
-    "error": "bold red",
-    "warning": "bold yellow",
-    "note": "bold blue",
-    "info": "bold cyan"
-}
-
-# Default export formats
-DEFAULT_EXPORT_FORMAT = "sarif"
-SUPPORTED_EXPORT_FORMATS = ["sarif", "json", "csv"]
-
-# Default configuration
-DEFAULT_CONFIG = {
-    "api_url": "http://localhost:8000",
-    "timeout": DEFAULT_API_TIMEOUT,
-    "max_retries": MAX_RETRIES,
-}
\ No newline at end of file
diff --git a/cli/src/fuzzforge_cli/database.py b/cli/src/fuzzforge_cli/database.py
deleted file mode 100644
index 3c8e86c..0000000
--- a/cli/src/fuzzforge_cli/database.py
+++ /dev/null
@@ -1,661 +0,0 @@
-"""
-Database module for FuzzForge CLI.
-
-Handles SQLite database operations for local project management,
-including runs, findings, and crash storage.
-"""
-# Copyright (c) 2025 FuzzingLabs
-#
-# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
-# at the root of this repository for details.
-#
-# After the Change Date (four years from publication), this version of the
-# Licensed Work will be made available under the Apache License, Version 2.0.
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
-#
-# Additional attribution and requirements are provided in the NOTICE file.
-
-
-import sqlite3
-import json
-import logging
-from datetime import datetime
-from pathlib import Path
-from typing import Dict, Any, List, Optional, Union
-from contextlib import contextmanager
-
-from pydantic import BaseModel
-from .constants import DEFAULT_DB_TIMEOUT, DEFAULT_CLEANUP_DAYS, STATS_SAMPLE_SIZE
-
-logger = logging.getLogger(__name__)
-
-
-class RunRecord(BaseModel):
-    """Database record for workflow runs"""
-    run_id: str
-    workflow: str
-    status: str
-    target_path: str
-    parameters: Dict[str, Any] = {}
-    created_at: datetime
-    completed_at: Optional[datetime] = None
-    metadata: Dict[str, Any] = {}
-
-
-class FindingRecord(BaseModel):
-    """Database record for findings"""
-    id: Optional[int] = None
-    run_id: str
-    sarif_data: Dict[str, Any]
-    summary: Dict[str, Any] = {}
-    created_at: datetime
-
-
-class CrashRecord(BaseModel):
-    """Database record for crash reports"""
-    id: Optional[int] = None
-    run_id: str
-    crash_id: str
-    signal: Optional[str] = None
-    stack_trace: Optional[str] = None
-    input_file: Optional[str] = None
-    severity: str = "medium"
-    timestamp: datetime
-
-
-class FuzzForgeDatabase:
-    """SQLite database manager for FuzzForge CLI projects"""
-
-    SCHEMA = """
-    CREATE TABLE IF NOT EXISTS runs (
-        run_id TEXT PRIMARY KEY,
-        workflow TEXT NOT NULL,
-        status TEXT NOT NULL,
-        target_path TEXT NOT NULL,
-        parameters TEXT DEFAULT '{}',
-        created_at TIMESTAMP NOT NULL,
-        completed_at TIMESTAMP,
-        metadata TEXT DEFAULT '{}'
-    );
-
-    CREATE TABLE IF NOT EXISTS findings (
-        id INTEGER PRIMARY KEY AUTOINCREMENT,
-        run_id TEXT NOT NULL,
-        sarif_data TEXT NOT NULL,
-        summary TEXT DEFAULT '{}',
-        created_at TIMESTAMP NOT NULL,
-        FOREIGN KEY (run_id) REFERENCES runs (run_id)
-    );
-
-    CREATE TABLE IF NOT EXISTS crashes (
-        id INTEGER PRIMARY KEY AUTOINCREMENT,
-        run_id TEXT NOT NULL,
-        crash_id TEXT NOT NULL,
-        signal TEXT,
-        stack_trace TEXT,
-        input_file TEXT,
-        severity TEXT DEFAULT 'medium',
-        timestamp TIMESTAMP NOT NULL,
-        FOREIGN KEY (run_id) REFERENCES runs (run_id)
-    );
-
-    CREATE INDEX IF NOT EXISTS idx_runs_status ON runs (status);
-    CREATE INDEX IF NOT EXISTS idx_runs_workflow ON runs (workflow);
-    CREATE INDEX IF NOT EXISTS idx_runs_created_at ON runs (created_at);
-    CREATE INDEX IF NOT EXISTS idx_findings_run_id ON findings (run_id);
-    CREATE INDEX IF NOT EXISTS idx_crashes_run_id ON crashes (run_id);
-    """
-
-    def __init__(self, db_path: Union[str, Path]):
-        self.db_path = Path(db_path)
-        self.db_path.parent.mkdir(parents=True, exist_ok=True)
-        self._initialize_db()
-
-    def _initialize_db(self):
-        """Initialize database with schema, handling corruption"""
-        try:
-            with self.connection() as conn:
-                # Test database integrity first
-                conn.execute("PRAGMA integrity_check").fetchone()
-                conn.executescript(self.SCHEMA)
-        except sqlite3.DatabaseError as e:
-            logger.warning(f"Database corruption detected: {e}")
-            # Backup corrupted database
-            backup_path = self.db_path.with_suffix('.db.corrupted')
-            if self.db_path.exists():
-                self.db_path.rename(backup_path)
-                logger.info(f"Corrupted database backed up to: {backup_path}")
-
-            # Create fresh database
-            with self.connection() as conn:
-                conn.executescript(self.SCHEMA)
-            logger.info("Created fresh database after corruption")
-
-    @contextmanager
-    def connection(self):
-        """Context manager for database connections with proper resource management"""
-        conn = None
-        try:
-            conn = sqlite3.connect(
-                self.db_path,
-                detect_types=sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES,
-                timeout=DEFAULT_DB_TIMEOUT
-            )
-            conn.row_factory = sqlite3.Row
-            # Enable WAL mode for better concurrency
-            conn.execute("PRAGMA journal_mode=WAL")
-            # Enable query optimization
-            conn.execute("PRAGMA optimize")
-            yield conn
-            conn.commit()
-        except sqlite3.OperationalError as e:
-            if conn:
-                try:
-                    conn.rollback()
-                except Exception:
-                    pass  # Connection might be broken
-            if "database is locked" in str(e).lower():
-                raise sqlite3.OperationalError(
-                    "Database is locked. Another FuzzForge process may be running."
-                ) from e
-            elif "database disk image is malformed" in str(e).lower():
-                raise sqlite3.DatabaseError(
-                    "Database is corrupted. Use 'ff init --force' to reset."
-                ) from e
-            raise
-        except Exception:
-            if conn:
-                try:
-                    conn.rollback()
-                except Exception:
-                    pass  # Connection might be broken
-            raise
-        finally:
-            if conn:
-                try:
-                    conn.close()
-                except Exception:
-                    pass  # Ensure cleanup even if close fails
-
-    # Run management methods
-
-    def save_run(self, run: RunRecord) -> None:
-        """Save or update a run record with validation"""
-        try:
-            # Validate JSON serialization before database write
-            parameters_json = json.dumps(run.parameters)
-            metadata_json = json.dumps(run.metadata)
-
-            with self.connection() as conn:
-                conn.execute("""
-                    INSERT OR REPLACE INTO runs
-                    (run_id, workflow, status, target_path, parameters, created_at, completed_at, metadata)
-                    VALUES (?, ?, ?, ?, ?, ?, ?, ?)
-                """, (
-                    run.run_id,
-                    run.workflow,
-                    run.status,
-                    run.target_path,
-                    parameters_json,
-                    run.created_at,
-                    run.completed_at,
-                    metadata_json
-                ))
-        except (TypeError, ValueError) as e:
-            raise ValueError(f"Failed to serialize run data: {e}") from e
-
-    def get_run(self, run_id: str) -> Optional[RunRecord]:
-        """Get a run record by ID with error handling"""
-        with self.connection() as conn:
-            row = conn.execute(
-                "SELECT * FROM runs WHERE run_id = ?",
-                (run_id,)
-            ).fetchone()
-
-            if row:
-                try:
-                    return RunRecord(
-                        run_id=row["run_id"],
-                        workflow=row["workflow"],
-                        status=row["status"],
-                        target_path=row["target_path"],
-                        parameters=json.loads(row["parameters"] or "{}"),
-                        created_at=row["created_at"],
-                        completed_at=row["completed_at"],
-                        metadata=json.loads(row["metadata"] or "{}")
-                    )
-                except (json.JSONDecodeError, TypeError) as e:
-                    logger.warning(f"Failed to deserialize run {run_id}: {e}")
-                    # Return with empty dicts for corrupted JSON
-                    return RunRecord(
-                        run_id=row["run_id"],
-                        workflow=row["workflow"],
-                        status=row["status"],
-                        target_path=row["target_path"],
-                        parameters={},
-                        created_at=row["created_at"],
-                        completed_at=row["completed_at"],
-                        metadata={}
-                    )
-            return None
-
-    def list_runs(
-        self,
-        workflow: Optional[str] = None,
-        status: Optional[str] = None,
-        limit: int = 50
-    ) -> List[RunRecord]:
-        """List runs with optional filters"""
-        query = "SELECT * FROM runs WHERE 1=1"
-        params = []
-
-        if workflow:
-            query += " AND workflow = ?"
-            params.append(workflow)
-
-        if status:
-            query += " AND status = ?"
-            params.append(status)
-
-        query += " ORDER BY created_at DESC LIMIT ?"
-        params.append(limit)
-
-        with self.connection() as conn:
-            rows = conn.execute(query, params).fetchall()
-            runs = []
-            for row in rows:
-                try:
-                    runs.append(RunRecord(
-                        run_id=row["run_id"],
-                        workflow=row["workflow"],
-                        status=row["status"],
-                        target_path=row["target_path"],
-                        parameters=json.loads(row["parameters"] or "{}"),
-                        created_at=row["created_at"],
-                        completed_at=row["completed_at"],
-                        metadata=json.loads(row["metadata"] or "{}")
-                    ))
-                except (json.JSONDecodeError, TypeError) as e:
-                    logger.warning(f"Skipping corrupted run {row['run_id']}: {e}")
-                    # Skip corrupted records instead of failing
-                    continue
-            return runs
-
-    def update_run_status(self, run_id: str, status: str, completed_at: Optional[datetime] = None):
-        """Update run status"""
-        with self.connection() as conn:
-            conn.execute(
-                "UPDATE runs SET status = ?, completed_at = ? WHERE run_id = ?",
-                (status, completed_at, run_id)
-            )
-
-    # Findings management methods
-
-    def save_findings(self, finding: FindingRecord) -> int:
-        """Save findings and return the ID"""
-        with self.connection() as conn:
-            cursor = conn.execute("""
-                INSERT INTO findings (run_id, sarif_data, summary, created_at)
-                VALUES (?, ?, ?, ?)
-            """, (
-                finding.run_id,
-                json.dumps(finding.sarif_data),
-                json.dumps(finding.summary),
-                finding.created_at
-            ))
-            return cursor.lastrowid
-
-    def get_findings(self, run_id: str) -> Optional[FindingRecord]:
-        """Get findings for a run"""
-        with self.connection() as conn:
-            row = conn.execute(
-                "SELECT * FROM findings WHERE run_id = ? ORDER BY created_at DESC LIMIT 1",
-                (run_id,)
-            ).fetchone()
-
-            if row:
-                return FindingRecord(
-                    id=row["id"],
-                    run_id=row["run_id"],
-                    sarif_data=json.loads(row["sarif_data"]),
-                    summary=json.loads(row["summary"]),
-                    created_at=row["created_at"]
-                )
-            return None
-
-    def list_findings(self, limit: int = 50) -> List[FindingRecord]:
-        """List recent findings"""
-        with self.connection() as conn:
-            rows = conn.execute("""
-                SELECT * FROM findings
-                ORDER BY created_at DESC
-                LIMIT ?
-            """, (limit,)).fetchall()
-
-            return [
-                FindingRecord(
-                    id=row["id"],
-                    run_id=row["run_id"],
-                    sarif_data=json.loads(row["sarif_data"]),
-                    summary=json.loads(row["summary"]),
-                    created_at=row["created_at"]
-                )
-                for row in rows
-            ]
-
-    def get_all_findings(self,
-                         workflow: Optional[str] = None,
-                         severity: Optional[List[str]] = None,
-                         since_date: Optional[datetime] = None,
-                         limit: Optional[int] = None) -> List[FindingRecord]:
-        """Get all findings with optional filters"""
-        with self.connection() as conn:
-            query = """
-                SELECT f.*, r.workflow
-                FROM findings f
-                JOIN runs r ON f.run_id = r.run_id
-                WHERE 1=1
-            """
-            params = []
-
-            if workflow:
-                query += " AND r.workflow = ?"
-                params.append(workflow)
-
-            if since_date:
-                query += " AND f.created_at >= ?"
-                params.append(since_date)
-
-            query += " ORDER BY f.created_at DESC"
-
-            if limit:
-                query += " LIMIT ?"
-                params.append(limit)
-
-            rows = conn.execute(query, params).fetchall()
-
-            findings = []
-            for row in rows:
-                try:
-                    finding = FindingRecord(
-                        id=row["id"],
-                        run_id=row["run_id"],
-                        sarif_data=json.loads(row["sarif_data"]),
-                        summary=json.loads(row["summary"]),
-                        created_at=row["created_at"]
-                    )
-
-                    # Filter by severity if specified
-                    if severity:
-                        finding_severities = set()
-                        if "runs" in finding.sarif_data:
-                            for run in finding.sarif_data["runs"]:
-                                for result in run.get("results", []):
-                                    finding_severities.add(result.get("level", "note").lower())
-
-                        if not any(sev.lower() in finding_severities for sev in severity):
-                            continue
-
-                    findings.append(finding)
-                except (json.JSONDecodeError, KeyError) as e:
-                    logger.warning(f"Skipping malformed finding {row['id']}: {e}")
-                    continue
-
-            return findings
-
-    def get_findings_by_workflow(self, workflow: str) -> List[FindingRecord]:
-        """Get all findings for a specific workflow"""
-        return self.get_all_findings(workflow=workflow)
-
-    def get_aggregated_stats(self) -> Dict[str, Any]:
-        """Get aggregated statistics for all findings using SQL aggregation"""
-        with self.connection() as conn:
-            # Total findings and runs
-            total_findings = conn.execute("SELECT COUNT(*) FROM findings").fetchone()[0]
-            total_runs = conn.execute("SELECT COUNT(DISTINCT run_id) FROM findings").fetchone()[0]
-
-            # Findings by workflow
-            workflow_stats = conn.execute("""
-                SELECT r.workflow, COUNT(f.id) as count
-                FROM findings f
-                JOIN runs r ON f.run_id = r.run_id
-                GROUP BY r.workflow
-                ORDER BY count DESC
-            """).fetchall()
-
-            # Recent activity
-            recent_findings = conn.execute("""
-                SELECT COUNT(*) FROM findings
-                WHERE created_at > datetime('now', '-7 days')
-            """).fetchone()[0]
-
-            # Use SQL JSON functions to aggregate severity stats efficiently
-            # This avoids loading all findings into memory
-            severity_stats = conn.execute("""
-                SELECT
-                    SUM(json_array_length(json_extract(sarif_data, '$.runs[0].results'))) as total_issues,
-                    COUNT(*) as finding_count
-                FROM findings
-                WHERE json_extract(sarif_data, '$.runs[0].results') IS NOT NULL
-            """).fetchone()
-
-            total_issues = severity_stats["total_issues"] or 0
-
-            # Get severity distribution using SQL
-            # Note: This is a simplified version - for full accuracy we'd need JSON parsing
-            # But it's much more efficient than loading all data into Python
-            severity_counts = {"error": 0, "warning": 0, "note": 0, "info": 0}
-
-            # Sample the first N findings for severity distribution
-            # This gives a good approximation without loading everything
-            sample_findings = conn.execute("""
-                SELECT sarif_data
-                FROM findings
-                LIMIT ?
-            """, (STATS_SAMPLE_SIZE,)).fetchall()
-
-            for row in sample_findings:
-                try:
-                    data = json.loads(row["sarif_data"])
-                    if "runs" in data:
-                        for run in data["runs"]:
-                            for result in run.get("results", []):
-                                level = result.get("level", "note").lower()
-                                severity_counts[level] = severity_counts.get(level, 0) + 1
-                except (json.JSONDecodeError, KeyError):
-                    continue
-
-            # Extrapolate severity counts if we have more than sample size
-            if total_findings > STATS_SAMPLE_SIZE:
-                multiplier = total_findings / STATS_SAMPLE_SIZE
-                for key in severity_counts:
-                    severity_counts[key] = int(severity_counts[key] * multiplier)
-
-            return {
-                "total_findings_records": total_findings,
-                "total_runs": total_runs,
-                "total_issues": total_issues,
-                "severity_distribution": severity_counts,
-                "workflows": {row["workflow"]: row["count"] for row in workflow_stats},
-                "recent_findings": recent_findings,
-                "last_updated": datetime.now()
-            }
-
-    # Crash management methods
-
-    def save_crash(self, crash: CrashRecord) -> int:
-        """Save crash report and return the ID"""
-        with self.connection() as conn:
-            cursor = conn.execute("""
-                INSERT INTO crashes
-                (run_id, crash_id, signal, stack_trace, input_file, severity, timestamp)
-                VALUES (?, ?, ?, ?, ?, ?, ?)
-            """, (
-                crash.run_id,
-                crash.crash_id,
-                crash.signal,
-                crash.stack_trace,
-                crash.input_file,
-                crash.severity,
-                crash.timestamp
-            ))
-            return cursor.lastrowid
-
-    def get_crashes(self, run_id: str) -> List[CrashRecord]:
-        """Get all crashes for a run"""
-        with self.connection() as conn:
-            rows = conn.execute(
-                "SELECT * FROM crashes WHERE run_id = ? ORDER BY timestamp DESC",
-                (run_id,)
-            ).fetchall()
-
-            return [
-                CrashRecord(
-                    id=row["id"],
-                    run_id=row["run_id"],
-                    crash_id=row["crash_id"],
-                    signal=row["signal"],
-                    stack_trace=row["stack_trace"],
-                    input_file=row["input_file"],
-                    severity=row["severity"],
-                    timestamp=row["timestamp"]
-                )
-                for row in rows
-            ]
-
-    # Utility methods
-
-    def cleanup_old_runs(self, keep_days: int = DEFAULT_CLEANUP_DAYS) -> int:
-        """Remove old runs and associated data"""
-        cutoff_date = datetime.now().replace(
-            hour=0, minute=0, second=0, microsecond=0
-        ) - datetime.timedelta(days=keep_days)
-
-        with self.connection() as conn:
-            # Get run IDs to delete
-            old_runs = conn.execute(
-                "SELECT run_id FROM runs WHERE created_at < ?",
-                (cutoff_date,)
-            ).fetchall()
-
-            if not old_runs:
-                return 0
-
-            run_ids = [row["run_id"] for row in old_runs]
-            placeholders = ",".join("?" * len(run_ids))
-
-            # Delete associated findings and crashes
-            conn.execute(f"DELETE FROM findings WHERE run_id IN ({placeholders})", run_ids)
-            conn.execute(f"DELETE FROM crashes WHERE run_id IN ({placeholders})", run_ids)
-
-            # Delete runs
-            conn.execute(f"DELETE FROM runs WHERE run_id IN ({placeholders})", run_ids)
-
-            return len(run_ids)
-
-    def get_stats(self) -> Dict[str, Any]:
-        """Get database statistics"""
-        with self.connection() as conn:
-            stats = {}
-
-            # Run counts by status
-            run_stats = conn.execute("""
-                SELECT status, COUNT(*) as count
-                FROM runs
-                GROUP BY status
-            """).fetchall()
-            stats["runs_by_status"] = {row["status"]: row["count"] for row in run_stats}
-
-            # Total counts
-            stats["total_runs"] = conn.execute("SELECT COUNT(*) FROM runs").fetchone()[0]
-            stats["total_findings"] = conn.execute("SELECT COUNT(*) FROM findings").fetchone()[0]
-            stats["total_crashes"] = conn.execute("SELECT COUNT(*) FROM crashes").fetchone()[0]
-
-            # Recent activity
-            stats["runs_last_7_days"] = conn.execute("""
-                SELECT COUNT(*) FROM runs
-                WHERE created_at > datetime('now', '-7 days')
-            """).fetchone()[0]
-
-            return stats
-
-    def health_check(self) -> Dict[str, Any]:
-        """Perform database health check"""
-        health = {
-            "healthy": True,
-            "issues": [],
-            "recommendations": []
-        }
-
-        try:
-            with self.connection() as conn:
-                # Check database integrity
-                integrity_result = conn.execute("PRAGMA integrity_check").fetchone()
-                if integrity_result[0] != "ok":
-                    health["healthy"] = False
-                    health["issues"].append(f"Database integrity check failed: {integrity_result[0]}")
-
-                # Check for orphaned records
-                orphaned_findings = conn.execute("""
-                    SELECT COUNT(*) FROM findings
-                    WHERE run_id NOT IN (SELECT run_id FROM runs)
-                """).fetchone()[0]
-
-                if orphaned_findings > 0:
-                    health["issues"].append(f"Found {orphaned_findings} orphaned findings")
-                    health["recommendations"].append("Run database cleanup to remove orphaned records")
-
-                orphaned_crashes = conn.execute("""
-                    SELECT COUNT(*) FROM crashes
-                    WHERE run_id NOT IN (SELECT run_id FROM runs)
-                """).fetchone()[0]
-
-                if orphaned_crashes > 0:
-                    health["issues"].append(f"Found {orphaned_crashes} orphaned crashes")
-
-                # Check database size
-                db_size = self.db_path.stat().st_size if self.db_path.exists() else 0
-                if db_size > 100 * 1024 * 1024:  # 100MB
-                    health["recommendations"].append("Database is large (>100MB). Consider cleanup.")
-
-        except Exception as e:
-            health["healthy"] = False
-            health["issues"].append(f"Health check failed: {e}")
-
-        return health
-
-
-def get_project_db(project_dir: Optional[Path] = None) -> Optional[FuzzForgeDatabase]:
-    """Get the database for the current project with error handling"""
-    if project_dir is None:
-        project_dir = Path.cwd()
-
-    fuzzforge_dir = project_dir / ".fuzzforge"
-    if not fuzzforge_dir.exists():
-        return None
-
-    db_path = fuzzforge_dir / "findings.db"
-    try:
-        return FuzzForgeDatabase(db_path)
-    except Exception as e:
-        logger.error(f"Failed to open project database: {e}")
-        raise sqlite3.DatabaseError(f"Failed to open project database: {e}") from e
-
-
-def ensure_project_db(project_dir: Optional[Path] = None) -> FuzzForgeDatabase:
-    """Ensure project database exists, create if needed with error handling"""
-    if project_dir is None:
-        project_dir = Path.cwd()
-
-    fuzzforge_dir = project_dir / ".fuzzforge"
-    try:
-        fuzzforge_dir.mkdir(exist_ok=True)
-    except PermissionError as e:
-        raise PermissionError(f"Cannot create .fuzzforge directory: {e}") from e
-
-    db_path = fuzzforge_dir / "findings.db"
-    try:
-        return FuzzForgeDatabase(db_path)
-    except Exception as e:
-        logger.error(f"Failed to create/open project database: {e}")
-        raise sqlite3.DatabaseError(f"Failed to create project database: {e}") from e
\ No newline at end of file
diff --git a/cli/src/fuzzforge_cli/exceptions.py b/cli/src/fuzzforge_cli/exceptions.py
deleted file mode 100644
index d1137f3..0000000
--- a/cli/src/fuzzforge_cli/exceptions.py
+++ /dev/null
@@ -1,472 +0,0 @@
-"""
-Enhanced exception handling and error utilities for FuzzForge CLI with rich context display.
-"""
-# Copyright (c) 2025 FuzzingLabs
-#
-# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
-# at the root of this repository for details.
-#
-# After the Change Date (four years from publication), this version of the
-# Licensed Work will be made available under the Apache License, Version 2.0.
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
-#
-# Additional attribution and requirements are provided in the NOTICE file.
-
-
-import time
-import functools
-from typing import Any, Callable, Optional, Union, List
-from pathlib import Path
-
-import typer
-import httpx
-from rich.console import Console
-from rich.panel import Panel
-from rich.text import Text
-from rich.table import Table
-
-# Import SDK exceptions for rich handling
-from fuzzforge_sdk.exceptions import (
-    FuzzForgeError as SDKFuzzForgeError
-)
-
-console = Console()
-
-
-class FuzzForgeError(Exception):
-    """Base exception for FuzzForge CLI errors (legacy CLI-specific errors)"""
-
-    def __init__(self, message: str, hint: Optional[str] = None, exit_code: int = 1):
-        self.message = message
-        self.hint = hint
-        self.exit_code = exit_code
-        super().__init__(message)
-
-
-class ProjectNotFoundError(FuzzForgeError):
-    """Raised when no FuzzForge project is found in current directory"""
-
-    def __init__(self):
-        super().__init__(
-            "No FuzzForge project found in current directory",
-            "Run 'ff init' to initialize a new project"
-        )
-
-
-class APIConnectionError(FuzzForgeError):
-    """Legacy API connection error for backward compatibility"""
-
-    def __init__(self, url: str, original_error: Exception):
-        self.url = url
-        self.original_error = original_error
-
-        if isinstance(original_error, httpx.ConnectTimeout):
-            message = f"Connection timeout to FuzzForge API at {url}"
-            hint = "Check if the API server is running and the URL is correct"
-        elif isinstance(original_error, httpx.ConnectError):
-            message = f"Failed to connect to FuzzForge API at {url}"
-            hint = "Verify the API URL is correct and the server is accessible"
-        elif isinstance(original_error, httpx.TimeoutException):
-            message = f"Request timeout to FuzzForge API at {url}"
-            hint = "The API server may be overloaded. Try again later"
-        else:
-            message = f"API connection error: {str(original_error)}"
-            hint = "Check your network connection and API configuration"
-
-        super().__init__(message, hint)
-
-
-class DatabaseError(FuzzForgeError):
-    """Raised when database operations fail"""
-
-    def __init__(self, operation: str, original_error: Exception):
-        self.operation = operation
-        self.original_error = original_error
-
-        message = f"Database error during {operation}: {str(original_error)}"
-        hint = "The database may be corrupted. Try 'ff init --force' to reset"
-
-        super().__init__(message, hint)
-
-
-class ValidationError(FuzzForgeError):
-    """Legacy validation error for CLI-specific validation"""
-
-    def __init__(self, field: str, value: Any, expected: str):
-        self.field = field
-        self.value = value
-        self.expected = expected
-
-        message = f"Invalid {field}: {value}"
-        hint = f"Expected {expected}"
-
-        super().__init__(message, hint)
-
-
-class FileOperationError(FuzzForgeError):
-    """Raised when file operations fail"""
-
-    def __init__(self, operation: str, path: Union[str, Path], original_error: Exception):
-        self.operation = operation
-        self.path = Path(path)
-        self.original_error = original_error
-
-        if isinstance(original_error, FileNotFoundError):
-            message = f"File not found: {path}"
-            hint = "Check the path exists and you have permission to access it"
-        elif isinstance(original_error, PermissionError):
-            message = f"Permission denied: {path}"
-            hint = "Check file permissions or run with appropriate privileges"
-        else:
-            message = f"File operation failed ({operation}): {str(original_error)}"
-            hint = "Check the file path and permissions"
-
-        super().__init__(message, hint)
-
-
-def display_container_logs(diagnostics, title: str = "Container Logs"):
-    """Display container logs in a rich format."""
-    if not diagnostics or not diagnostics.logs:
-        return
-
-    # Show last 20 lines of logs
-    recent_logs = diagnostics.logs[-20:] if len(diagnostics.logs) > 20 else diagnostics.logs
-
-    log_content = []
-    for log_entry in recent_logs:
-        timestamp = log_entry.timestamp.strftime("%H:%M:%S")
-        level_color = {
-            'ERROR': 'red',
-            'WARNING': 'yellow',
-            'INFO': 'blue',
-            'DEBUG': 'dim white'
-        }.get(log_entry.level, 'white')
-
-        log_line = f"[dim]{timestamp}[/dim] [{level_color}]{log_entry.level}[/{level_color}] {log_entry.message}"
-        log_content.append(log_line)
-
-    if log_content:
-        logs_panel = Panel(
-            "\n".join(log_content),
-            title=title,
-            title_align="left",
-            border_style="dim",
-            expand=False
-        )
-        console.print(logs_panel)
-
-
-def display_container_diagnostics(diagnostics):
-    """Display comprehensive container diagnostics."""
-    if not diagnostics:
-        return
-
-    # Container Status Table
-    status_table = Table(title="Container Status", show_header=False, box=None)
-    status_table.add_column("Property", style="bold")
-    status_table.add_column("Value")
-
-    status_color = {
-        'running': 'green',
-        'exited': 'red',
-        'failed': 'red',
-        'created': 'yellow',
-        'unknown': 'dim'
-    }.get(diagnostics.status.lower(), 'white')
-
-    status_table.add_row("Status", f"[{status_color}]{diagnostics.status}[/{status_color}]")
-
-    if diagnostics.exit_code is not None:
-        exit_color = 'green' if diagnostics.exit_code == 0 else 'red'
-        status_table.add_row("Exit Code", f"[{exit_color}]{diagnostics.exit_code}[/{exit_color}]")
-
-    if diagnostics.error:
-        status_table.add_row("Error", f"[red]{diagnostics.error}[/red]")
-
-    # Resource Usage
-    if diagnostics.resource_usage:
-        memory_limit = diagnostics.resource_usage.get('memory_limit', 0)
-        if memory_limit > 0:
-            memory_mb = memory_limit // (1024 * 1024)
-            status_table.add_row("Memory Limit", f"{memory_mb} MB")
-
-    console.print(status_table)
-
-    # Volume Mounts
-    if diagnostics.volume_mounts:
-        console.print("\n[bold]Volume Mounts:[/bold]")
-        for mount in diagnostics.volume_mounts:
-            mount_info = f"  {mount['source']} → {mount['destination']} ([dim]{mount['mode']}[/dim])"
-            console.print(mount_info)
-
-
-def display_error_patterns(error_patterns):
-    """Display detected error patterns."""
-    if not error_patterns:
-        return
-
-    console.print("\n[bold red]šŸ” Detected Issues:[/bold red]")
-
-    for error_type, messages in error_patterns.items():
-        # Format error type name
-        formatted_type = error_type.replace('_', ' ').title()
-        console.print(f"\n[bold yellow]• {formatted_type}:[/bold yellow]")
-
-        for message in messages[:3]:  # Show first 3 messages
-            console.print(f"  [dim]ā–ø[/dim] {message}")
-
-        if len(messages) > 3:
-            console.print(f"  [dim]ā–ø ... and {len(messages) - 3} more similar messages[/dim]")
-
-
-def display_suggestions(suggestions: List[str]):
-    """Display actionable suggestions."""
-    if not suggestions:
-        return
-
-    console.print("\n[bold green]šŸ’” Suggested Fixes:[/bold green]")
-
-    for i, suggestion in enumerate(suggestions[:6], 1):  # Show max 6 suggestions
-        console.print(f"  [bold green]{i}.[/bold green] {suggestion}")
-
-
-def handle_error(error: Exception, context: str = "") -> None:
-    """
-    Display comprehensive error messages with rich context and exit appropriately.
- - Args: - error: The exception that occurred - context: Additional context about where the error occurred - """ - # Handle SDK errors with rich context - if isinstance(error, SDKFuzzForgeError): - console.print() # Add some spacing - - # Main error message - error_title = f"āŒ {error.__class__.__name__}" - if context: - error_title += f" during {context}" - - console.print(Panel( - error.get_summary(), - title=error_title, - title_align="left", - border_style="red", - expand=False - )) - - # Show detailed context if available - if hasattr(error, 'context') and error.context: - ctx = error.context - - # Error patterns - if ctx.error_patterns: - display_error_patterns(ctx.error_patterns) - - # API context - if ctx.url: - console.print(f"\n[dim]Request URL: {ctx.url}[/dim]") - - if ctx.response_data and isinstance(ctx.response_data, dict) and 'raw' not in ctx.response_data: - console.print(f"[dim]API Response: {ctx.response_data}[/dim]") - - # Suggestions - if ctx.suggested_fixes: - display_suggestions(ctx.suggested_fixes) - - console.print() # Add spacing before exit - raise typer.Exit(1) - - # Handle legacy CLI errors - elif isinstance(error, FuzzForgeError): - error_text = Text() - error_text.append("āŒ ", style="red") - error_text.append(error.message, style="red") - - if context: - error_text.append(f" ({context})", style="dim red") - - console.print(error_text) - - if error.hint: - hint_text = Text() - hint_text.append("šŸ’” ", style="yellow") - hint_text.append(error.hint, style="yellow") - console.print(hint_text) - - raise typer.Exit(error.exit_code) - - elif isinstance(error, KeyboardInterrupt): - console.print("\nā¹ļø Operation cancelled by user", style="yellow") - raise typer.Exit(130) # Standard exit code for SIGINT - - else: - # Unexpected errors - show minimal info to user, log details - console.print() - - error_panel = Panel( - f"An unexpected error occurred: {str(error)}", - title="āŒ Unexpected Error", - title_align="left", - 
border_style="red", - expand=False - ) - - if context: - error_panel.title += f" during {context}" - - console.print(error_panel) - - # Show error details for debugging - console.print(f"\n[dim yellow]Error type: {type(error).__name__}[/dim yellow]") - console.print("[dim yellow]Please report this issue if it persists[/dim yellow]") - console.print() - - raise typer.Exit(1) - - -def retry_on_network_error(max_retries: int = 3, delay: float = 1.0, backoff_multiplier: float = 2.0): - """ - Decorator to retry network operations with exponential backoff. - - Args: - max_retries: Maximum number of retry attempts - delay: Initial delay between retries in seconds - backoff_multiplier: Multiplier for exponential backoff - """ - def decorator(func: Callable) -> Callable: - @functools.wraps(func) - def wrapper(*args, **kwargs): - last_exception = None - current_delay = delay - - for attempt in range(max_retries + 1): - try: - return func(*args, **kwargs) - except (httpx.ConnectError, httpx.TimeoutException, httpx.NetworkError) as e: - last_exception = e - - if attempt < max_retries: - console.print( - f"šŸ”„ Network error, retrying in {current_delay:.1f}s... " - f"(attempt {attempt + 1}/{max_retries})", - style="yellow" - ) - time.sleep(current_delay) - current_delay *= backoff_multiplier - else: - # Convert to our custom error type - api_url = getattr(args[0], 'base_url', 'unknown') if args else 'unknown' - raise APIConnectionError(str(api_url), e) - - # Should never reach here, but just in case - if last_exception: - raise last_exception - - return wrapper - return decorator - - -def validate_path(path: Union[str, Path], must_exist: bool = True, must_be_file: bool = False, - must_be_dir: bool = False) -> Path: - """ - Validate file/directory paths with user-friendly error messages. 
- - Args: - path: Path to validate - must_exist: Whether the path must exist - must_be_file: Whether the path must be a file - must_be_dir: Whether the path must be a directory - - Returns: - Validated Path object - - Raises: - ValidationError: If validation fails - """ - path_obj = Path(path) - - if must_exist and not path_obj.exists(): - raise ValidationError("path", str(path), "an existing path") - - if must_be_file and path_obj.exists() and not path_obj.is_file(): - raise ValidationError("path", str(path), "a file") - - if must_be_dir and path_obj.exists() and not path_obj.is_dir(): - raise ValidationError("path", str(path), "a directory") - - return path_obj - - -def validate_run_id(run_id: str) -> str: - """ - Validate run ID format. - - Args: - run_id: Run ID to validate - - Returns: - Validated run ID - - Raises: - ValidationError: If run ID format is invalid - """ - if not run_id or len(run_id) < 8: - raise ValidationError("run_id", run_id, "at least 8 characters") - - # Allow alphanumeric characters, hyphens, and underscores - if not run_id.replace('-', '').replace('_', '').isalnum(): - raise ValidationError("run_id", run_id, "alphanumeric characters, hyphens, and underscores only") - - return run_id - - -def safe_json_load(file_path: Union[str, Path]) -> dict: - """ - Safely load JSON file with proper error handling. 
- - Args: - file_path: Path to JSON file - - Returns: - Parsed JSON data - - Raises: - FileOperationError: If file operation fails - ValidationError: If JSON is invalid - """ - path_obj = Path(file_path) - - # Import before the try block so json.JSONDecodeError is always defined in the except clauses below - import json - - try: - with open(path_obj, 'r', encoding='utf-8') as f: - return json.load(f) - except FileNotFoundError as e: - raise FileOperationError("read", path_obj, e) - except PermissionError as e: - raise FileOperationError("read", path_obj, e) - except json.JSONDecodeError as e: - raise ValidationError("JSON file", str(path_obj), f"valid JSON format (error: {e})") - except Exception as e: - raise FileOperationError("read", path_obj, e) - - -def require_project() -> Path: - """ - Ensure we're in a FuzzForge project directory. - - Returns: - Path to project root - - Raises: - ProjectNotFoundError: If not in a project directory - """ - current = Path.cwd() - - # Look for .fuzzforge directory in current or parent directories - for path in [current] + list(current.parents): - fuzzforge_dir = path / ".fuzzforge" - if fuzzforge_dir.is_dir(): - return path - - raise ProjectNotFoundError() \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/fuzzy.py b/cli/src/fuzzforge_cli/fuzzy.py deleted file mode 100644 index 4cec4de..0000000 --- a/cli/src/fuzzforge_cli/fuzzy.py +++ /dev/null @@ -1,305 +0,0 @@ -""" -Fuzzy matching and smart suggestions for FuzzForge CLI. - -Provides "Did you mean...?" functionality and intelligent command/parameter suggestions. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file.
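The deleted `fuzzy.py` module builds its "Did you mean...?" prompts on `difflib.get_close_matches` plus a `SequenceMatcher` ratio, as in `find_closest_command` below. A minimal standalone sketch of that pattern (the candidate list here is trimmed for illustration, not the CLI's full command table):

```python
import difflib

# Trimmed candidate list; the real CLI also tracks subcommands per group.
COMMANDS = ["init", "workflows", "runs", "findings", "config", "ai", "ingest"]


def did_you_mean(user_input: str, candidates=COMMANDS, cutoff: float = 0.6):
    """Return (best_match, similarity) or None, mirroring the CLI's matcher."""
    matches = difflib.get_close_matches(user_input, candidates, n=1, cutoff=cutoff)
    if not matches:
        return None
    # Recompute the exact ratio for display (get_close_matches only ranks)
    ratio = difflib.SequenceMatcher(None, user_input, matches[0]).ratio()
    return matches[0], ratio


print(did_you_mean("finings"))  # a close match for "findings"
print(did_you_mean("zzz"))      # None: nothing clears the cutoff
```

The `cutoff=0.6` default matches the threshold used throughout the module; raising it trades recall for fewer spurious suggestions.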
- -import difflib -from typing import Any, Dict, List, Optional, Tuple - -from rich.console import Console -from rich.panel import Panel -from rich.text import Text - -console = Console() - - -class FuzzyMatcher: - """Fuzzy matching engine for CLI commands and parameters.""" - - def __init__(self): - # Known commands and subcommands - self.commands = { - "init": ["project"], - "workflows": ["list", "info"], - "runs": ["submit", "status", "list", "rerun"], - "findings": ["get", "list", "export", "all"], - "config": ["set", "get", "list", "init"], - "ai": ["ask", "summarize", "explain"], - "ingest": ["project", "findings"], - } - - # Common workflow names - self.workflow_names = [ - "security_assessment", - "language_fuzzing", - "infrastructure_scan", - "static_analysis_scan", - "penetration_testing_scan", - "secret_detection_scan", - ] - - # Common parameter names - self.parameter_names = [ - "target_path", - "timeout", - "workflow", - "param", - "param-file", - "interactive", - "wait", - "format", - "output", - "severity", - "since", - "limit", - "stats", - "export", - ] - - # Common values - self.common_values = { - "format": ["json", "csv", "html", "sarif"], - "severity": ["critical", "high", "medium", "low", "info"], - } - - def find_closest_command( - self, user_input: str, command_group: Optional[str] = None - ) -> Optional[Tuple[str, float]]: - """Find the closest matching command.""" - if command_group and command_group in self.commands: - # Search within a specific command group - candidates = self.commands[command_group] - else: - # Search all main commands - candidates = list(self.commands.keys()) - - matches = difflib.get_close_matches(user_input, candidates, n=1, cutoff=0.6) - - if matches: - match = matches[0] - # Calculate similarity ratio - ratio = difflib.SequenceMatcher(None, user_input, match).ratio() - return match, ratio - - return None - - def find_closest_workflow(self, user_input: str) -> Optional[Tuple[str, float]]: - """Find the closest 
matching workflow name.""" - matches = difflib.get_close_matches( - user_input, self.workflow_names, n=1, cutoff=0.6 - ) - - if matches: - match = matches[0] - ratio = difflib.SequenceMatcher(None, user_input, match).ratio() - return match, ratio - - return None - - def find_closest_parameter(self, user_input: str) -> Optional[Tuple[str, float]]: - """Find the closest matching parameter name.""" - # Remove leading dashes - clean_input = user_input.lstrip("-") - - matches = difflib.get_close_matches( - clean_input, self.parameter_names, n=1, cutoff=0.6 - ) - - if matches: - match = matches[0] - ratio = difflib.SequenceMatcher(None, clean_input, match).ratio() - return match, ratio - - return None - - def suggest_parameter_values(self, parameter: str, user_input: str) -> List[str]: - """Suggest parameter values based on known options.""" - if parameter in self.common_values: - values = self.common_values[parameter] - if user_input: - # Filter values that start with user input - return [v for v in values if v.startswith(user_input.lower())] - else: - return values - - return [] - - def get_command_suggestions( - self, user_command: List[str] - ) -> Optional[Dict[str, Any]]: - """Get suggestions for a user command that may have typos.""" - if not user_command: - return None - - suggestions = {"type": None, "original": user_command, "suggestions": []} - - # Check main command - main_cmd = user_command[0] - if main_cmd not in self.commands: - closest = self.find_closest_command(main_cmd) - if closest: - match, confidence = closest - suggestions["type"] = "main_command" - suggestions["suggestions"].append( - {"text": match, "confidence": confidence, "type": "command"} - ) - - # Check subcommand if present - elif len(user_command) > 1: - sub_cmd = user_command[1] - if main_cmd in self.commands and sub_cmd not in self.commands[main_cmd]: - closest = self.find_closest_command(sub_cmd, main_cmd) - if closest: - match, confidence = closest - suggestions["type"] = "subcommand" 
- suggestions["suggestions"].append( - { - "text": f"{main_cmd} {match}", - "confidence": confidence, - "type": "subcommand", - } - ) - - return suggestions if suggestions["suggestions"] else None - - def suggest_workflow_fix(self, user_workflow: str) -> Optional[str]: - """Suggest a workflow name correction.""" - closest = self.find_closest_workflow(user_workflow) - if closest: - match, confidence = closest - if confidence > 0.6: # Only suggest if reasonably confident - return match - return None - - -def display_command_suggestion(suggestions: Dict[str, Any]): - """Display command suggestions to the user.""" - if not suggestions or not suggestions["suggestions"]: - return - - original = " ".join(suggestions["original"]) - suggestion_type = suggestions["type"] - - # Create suggestion text - text = Text() - text.append("ā“ Command not found: ", style="red") - text.append(f"'{original}'", style="bold red") - text.append("\n\n") - - text.append("šŸ’” Did you mean:\n", style="yellow") - - for i, suggestion in enumerate(suggestions["suggestions"], 1): - confidence_percent = int(suggestion["confidence"] * 100) - text.append(f" {i}. 
", style="bold cyan") - text.append(f"{suggestion['text']}", style="bold white") - text.append(f" ({confidence_percent}% match)", style="dim") - text.append("\n") - - # Add helpful context - if suggestion_type == "main_command": - text.append( - "\nšŸ’” Use 'fuzzforge --help' to see all available commands", style="dim" - ) - elif suggestion_type == "subcommand": - main_cmd = suggestions["original"][0] - text.append( - f"\nšŸ’” Use 'fuzzforge {main_cmd} --help' to see available subcommands", - style="dim", - ) - - console.print( - Panel(text, title="šŸ¤” Command Suggestion", border_style="yellow", expand=False) - ) - - -def display_workflow_suggestion(original: str, suggestion: str): - """Display workflow name suggestion.""" - text = Text() - text.append("ā“ Workflow not found: ", style="red") - text.append(f"'{original}'", style="bold red") - text.append("\n\n") - - text.append("šŸ’” Did you mean: ", style="yellow") - text.append(f"'{suggestion}'", style="bold green") - text.append("?\n\n") - - text.append( - "šŸ’” Use 'fuzzforge workflows' to see all available workflows", style="dim" - ) - - console.print( - Panel(text, title="šŸ”§ Workflow Suggestion", border_style="yellow", expand=False) - ) - - -def display_parameter_suggestion(original: str, suggestion: str): - """Display parameter name suggestion.""" - text = Text() - text.append("ā“ Unknown parameter: ", style="red") - text.append(f"'{original}'", style="bold red") - text.append("\n\n") - - text.append("šŸ’” Did you mean: ", style="yellow") - text.append(f"'--{suggestion}'", style="bold green") - text.append("?\n\n") - - text.append("šŸ’” Use '--help' to see all available parameters", style="dim") - - console.print( - Panel(text, title="āš™ļø Parameter Suggestion", border_style="yellow", expand=False) - ) - - -def enhanced_command_not_found_handler(command_parts: List[str]): - """Handle command not found with fuzzy matching suggestions.""" - matcher = FuzzyMatcher() - suggestions = 
matcher.get_command_suggestions(command_parts) - - if suggestions: - display_command_suggestion(suggestions) - else: - # Fallback to generic help - console.print("āŒ [red]Command not found[/red]") - console.print("šŸ’” Use 'fuzzforge --help' to see available commands") - - -def enhanced_workflow_not_found_handler(workflow_name: str): - """Handle workflow not found with suggestions.""" - matcher = FuzzyMatcher() - suggestion = matcher.suggest_workflow_fix(workflow_name) - - if suggestion: - display_workflow_suggestion(workflow_name, suggestion) - else: - console.print(f"āŒ [red]Workflow '{workflow_name}' not found[/red]") - console.print("šŸ’” Use 'fuzzforge workflows' to see available workflows") - - -def enhanced_parameter_not_found_handler(parameter_name: str): - """Handle unknown parameter with suggestions.""" - matcher = FuzzyMatcher() - closest = matcher.find_closest_parameter(parameter_name) - - if closest: - match, confidence = closest - if confidence > 0.6: - display_parameter_suggestion(parameter_name, match) - return - - console.print(f"āŒ [red]Unknown parameter: '{parameter_name}'[/red]") - console.print("šŸ’” Use '--help' to see available parameters") - - -# Global fuzzy matcher instance -fuzzy_matcher = FuzzyMatcher() diff --git a/cli/src/fuzzforge_cli/ingest_utils.py b/cli/src/fuzzforge_cli/ingest_utils.py deleted file mode 100644 index 8b90a4c..0000000 --- a/cli/src/fuzzforge_cli/ingest_utils.py +++ /dev/null @@ -1,105 +0,0 @@ -"""Utilities for collecting files to ingest into Cognee.""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. 
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -from __future__ import annotations - -import fnmatch -from pathlib import Path -from typing import Iterable, List, Optional - -# Default extensions and exclusions mirrored from the OSS implementation -_DEFAULT_FILE_TYPES = [ - ".py", - ".js", - ".ts", - ".java", - ".cpp", - ".c", - ".h", - ".rs", - ".go", - ".rb", - ".php", - ".cs", - ".swift", - ".kt", - ".scala", - ".clj", - ".hs", - ".md", - ".txt", - ".yaml", - ".yml", - ".json", - ".toml", - ".cfg", - ".ini", -] - -_DEFAULT_EXCLUDE = [ - "*.pyc", - "__pycache__", - ".git", - ".svn", - ".hg", - "node_modules", - ".venv", - "venv", - ".env", - "dist", - "build", - ".pytest_cache", - ".mypy_cache", - ".tox", - "coverage", - "*.log", - "*.tmp", -] - - -def collect_ingest_files( - path: Path, - recursive: bool = True, - file_types: Optional[Iterable[str]] = None, - exclude: Optional[Iterable[str]] = None, -) -> List[Path]: - """Return a list of files eligible for ingestion.""" - path = path.resolve() - files: List[Path] = [] - - extensions = list(file_types) if file_types else list(_DEFAULT_FILE_TYPES) - exclusions = list(exclude) if exclude else [] - exclusions.extend(_DEFAULT_EXCLUDE) - - def should_exclude(file_path: Path) -> bool: - file_str = str(file_path) - for pattern in exclusions: - if fnmatch.fnmatch(file_str, f"*{pattern}*") or fnmatch.fnmatch(file_path.name, pattern): - return True - return False - - if path.is_file(): - if not should_exclude(path) and any(str(path).endswith(ext) for ext in extensions): - files.append(path) - return files - - pattern = "**/*" if recursive else "*" - for file_path in path.glob(pattern): - if file_path.is_file() and not should_exclude(file_path): - if any(str(file_path).endswith(ext) for ext in extensions): - files.append(file_path) - - return files - - -__all__ = ["collect_ingest_files"] diff --git 
a/cli/src/fuzzforge_cli/main.py b/cli/src/fuzzforge_cli/main.py deleted file mode 100644 index f869c8c..0000000 --- a/cli/src/fuzzforge_cli/main.py +++ /dev/null @@ -1,425 +0,0 @@ -""" -Main CLI application with improved command structure. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import typer -from rich.console import Console -from rich.traceback import install -from typing import Optional, List -import sys - -from .config import load_project_env - -from .commands import ( - workflows, - workflow_exec, - findings, - monitor, - config as config_cmd, - ai, - ingest, - worker, -) -from .fuzzy import enhanced_command_not_found_handler - -# Install rich traceback handler -install(show_locals=True) - -# Ensure environment variables are available before command execution -load_project_env() - -# Create console for rich output -console = Console() - -# Create the main Typer app -app = typer.Typer( - name="fuzzforge", - help=( - "\b\n" - "[cyan]ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā•— ā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—\n" - "ā–ˆā–ˆā•”ā•ā•ā•ā•ā•ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā•šā•ā•ā–ˆā–ˆā–ˆā•”ā•ā•šā•ā•ā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā•”ā•ā•ā•ā•ā•ā–ˆā–ˆā•”ā•ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•”ā•ā•ā•ā•ā• ā–ˆā–ˆā•”ā•ā•ā•ā•ā•\n" - "ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā–ˆā–ˆā–ˆā•— ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā•‘ 
ā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā•— \n" - "ā–ˆā–ˆā•”ā•ā•ā• ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā–ˆā•”ā• ā–ˆā–ˆā•”ā•ā•ā• ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā•”ā•ā•ā–ˆā–ˆā•—ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā–ˆā–ˆā•”ā•ā•ā• \n" - "ā–ˆā–ˆā•‘ ā•šā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—ā–ˆā–ˆā•‘ ā•šā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā•‘ ā–ˆā–ˆā•‘ā•šā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•”ā•ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā•—\n" - "ā•šā•ā• ā•šā•ā•ā•ā•ā•ā• ā•šā•ā•ā•ā•ā•ā•ā•ā•šā•ā•ā•ā•ā•ā•ā•ā•šā•ā• ā•šā•ā•ā•ā•ā•ā• ā•šā•ā• ā•šā•ā• ā•šā•ā•ā•ā•ā•ā• ā•šā•ā•ā•ā•ā•ā•ā•[/cyan]\n\n" - "šŸ›”ļø Security testing workflow orchestration platform" - ), - rich_markup_mode="rich", - no_args_is_help=True, - context_settings={ - # Prevent help text from wrapping so ASCII art stays aligned - "max_content_width": 200, - # Keep common help flags - "help_option_names": ["--help", "-h"], - }, -) - -# Create workflow singular command group -workflow_app = typer.Typer( - name="workflow", - help="šŸš€ Execute and manage individual workflows", - no_args_is_help=False, # Allow direct execution -) - -# Create finding singular command group -finding_app = typer.Typer( - name="finding", - help="šŸ” View and analyze individual findings", - no_args_is_help=False, -) - - -# === Top-level commands === - -@app.command() -def init( - name: Optional[str] = typer.Option( - None, "--name", "-n", - help="Project name (defaults to current directory name)" - ), - api_url: Optional[str] = typer.Option( - None, "--api-url", "-u", - help="FuzzForge API URL (defaults to http://localhost:8000)" - ), - force: bool = typer.Option( - False, "--force", "-f", - help="Force initialization even if project already exists" - ) -): - """ - šŸ“ Initialize a new FuzzForge project - """ - from .commands.init import project - project(name=name, api_url=api_url, force=force) - - -@app.command() -def status(): - """ - šŸ“Š Show project and latest execution status - """ - from .commands.status import 
show_status - show_status() - - -@app.command() -def config( - key: Optional[str] = typer.Argument(None, help="Configuration key"), - value: Optional[str] = typer.Argument(None, help="Configuration value to set") -): - """ - āš™ļø Manage configuration (show all, get, or set values) - """ - - if key is None: - # No arguments: show all config - config_cmd.show_config(global_config=False) - elif value is None: - # Key only: get specific value - config_cmd.get_config(key=key, global_config=False) - else: - # Key and value: set value - config_cmd.set_config(key=key, value=value, global_config=False) - - -@app.command() -def clean( - days: int = typer.Option( - 90, "--days", "-d", - help="Remove data older than this many days" - ), - dry_run: bool = typer.Option( - False, "--dry-run", - help="Show what would be deleted without actually deleting" - ) -): - """ - 🧹 Clean old execution data and findings - """ - from .database import get_project_db - from .exceptions import require_project - - try: - require_project() - db = get_project_db() - if not db: - console.print("āŒ No project database found", style="red") - raise typer.Exit(1) - - # Guard the destructive call so --dry-run never deletes data - if dry_run: - console.print(f"šŸ” [bold]Dry run:[/bold] Would clean data older than {days} days", style="yellow") - else: - deleted = db.cleanup_old_runs(keep_days=days) - console.print(f"āœ… Cleaned {deleted} old executions", style="green") - except Exception as e: - console.print(f"āŒ Failed to clean data: {e}", style="red") - raise typer.Exit(1) - - -# === Workflow commands (singular) === - -# Add workflow subcommands first (before callback) -workflow_app.command("status")(workflow_exec.workflow_status) -workflow_app.command("history")(workflow_exec.workflow_history) -workflow_app.command("retry")(workflow_exec.retry_workflow) -workflow_app.command("info")(workflows.workflow_info) -workflow_app.command("params")(workflows.workflow_parameters) -
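The `workflow_app.command("status")(workflow_exec.workflow_status)` lines above apply Typer's command decorator as a plain function call, which lets already-defined functions be registered under chosen command names. A toy registry (not Typer itself) showing why that calling style works:

```python
# Minimal registry mimicking the decorator-as-function pattern:
# group.command("name") returns a decorator, so
# group.command("name")(func) registers func immediately.


class CommandGroup:
    def __init__(self):
        self.commands = {}

    def command(self, name):
        def decorator(func):
            self.commands[name] = func
            return func
        return decorator


workflow_app = CommandGroup()


def workflow_status():
    return "status ok"


# Same registration style as the CLI code above:
workflow_app.command("status")(workflow_status)

print(workflow_app.commands["status"]())  # -> status ok
```

Registering this way keeps the command implementations in their own modules (`workflow_exec`, `workflows`) while the main module owns the command names.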
-@workflow_app.command("run") -def run_workflow( - workflow: str = typer.Argument(help="Workflow name"), - target: str = typer.Argument(help="Target path"), - params: List[str] = typer.Argument(default=None, help="Parameters as key=value pairs"), - param_file: Optional[str] = typer.Option( - None, "--param-file", "-f", - help="JSON file containing workflow parameters" - ), - timeout: Optional[int] = typer.Option( - None, "--timeout", "-t", - help="Execution timeout in seconds" - ), - interactive: bool = typer.Option( - True, "--interactive/--no-interactive", "-i/-n", - help="Interactive parameter input for missing required parameters" - ), - wait: bool = typer.Option( - False, "--wait", "-w", - help="Wait for execution to complete" - ), - live: bool = typer.Option( - False, "--live", "-l", - help="Start live monitoring after execution (useful for fuzzing workflows)" - ), - auto_start: Optional[bool] = typer.Option( - None, "--auto-start/--no-auto-start", - help="Automatically start required worker if not running (default: from config)" - ), - auto_stop: Optional[bool] = typer.Option( - None, "--auto-stop/--no-auto-stop", - help="Automatically stop worker after execution completes (default: from config)" - ), - fail_on: Optional[str] = typer.Option( - None, "--fail-on", - help="Fail build if findings match SARIF level (error,warning,note,info,all,none). Use with --wait" - ), - export_sarif: Optional[str] = typer.Option( - None, "--export-sarif", - help="Export SARIF results to file after completion. Use with --wait" - ) -): - """ - šŸš€ Execute a security testing workflow - - Use --fail-on with --wait to fail CI builds based on finding severity. - Use --export-sarif with --wait to export SARIF findings to a file. 
- """ - from .commands.workflow_exec import execute_workflow - - execute_workflow( - workflow=workflow, - target_path=target, - params=params, - param_file=param_file, - timeout=timeout, - interactive=interactive, - wait=wait, - live=live, - auto_start=auto_start, - auto_stop=auto_stop, - fail_on=fail_on, - export_sarif=export_sarif - ) - -@workflow_app.callback() -def workflow_main(): - """ - Execute workflows and manage workflow executions - - Examples: - fuzzforge workflow run security_assessment ./target # Execute workflow - fuzzforge workflow status # Check latest status - fuzzforge workflow history # Show execution history - """ - pass - - -# === Finding commands (singular) === - -@finding_app.command("show") -def show_finding_detail( - run_id: str = typer.Argument(..., help="Run ID to get finding from"), - rule_id: str = typer.Option(..., "--rule", "-r", help="Rule ID of the specific finding to show") -): - """ - šŸ” Show detailed information about a specific finding - """ - from .commands.findings import show_finding - show_finding(run_id=run_id, rule_id=rule_id) - - -@finding_app.callback(invoke_without_command=True) -def finding_main( - ctx: typer.Context, -): - """ - View and analyze individual findings - - Examples: - fuzzforge finding # Show latest finding - fuzzforge finding # Show specific finding - fuzzforge finding show --rule # Show specific finding detail - """ - # Check if a subcommand is being invoked - if ctx.invoked_subcommand is not None: - # Let the subcommand handle it - return - - # Get remaining arguments for direct viewing - args = ctx.args if hasattr(ctx, 'args') else [] - finding_id = args[0] if args else None - - # Direct viewing: fuzzforge finding [id] - from .commands.findings import get_findings - from .database import get_project_db - from .exceptions import require_project - - try: - require_project() - - # If no ID provided, get the latest - if not finding_id: - db = get_project_db() - if db: - recent_runs = 
db.list_runs(limit=1) - if recent_runs: - finding_id = recent_runs[0].run_id - console.print(f"šŸ” Using most recent execution: {finding_id}") - else: - console.print("āš ļø No findings found in project database", style="yellow") - return - else: - console.print("āŒ No project database found", style="red") - return - - get_findings(run_id=finding_id, save=True, format="table") - except Exception as e: - console.print(f"āŒ Failed to get findings: {e}", style="red") - - -# === Add command groups === - -# Plural commands (for browsing/listing) -app.add_typer(workflows.app, name="workflows", help="šŸ“‹ Browse available workflows") -app.add_typer(findings.app, name="findings", help="šŸ“‹ Browse all findings") - -# Singular commands (for actions) -app.add_typer(workflow_app, name="workflow", help="šŸš€ Execute and manage workflows") -app.add_typer(finding_app, name="finding", help="šŸ” View and analyze findings") - -# Other command groups -app.add_typer(monitor.app, name="monitor", help="šŸ“Š Real-time monitoring") -app.add_typer(ai.app, name="ai", help="šŸ¤– AI integration features") -app.add_typer(ingest.app, name="ingest", help="🧠 Ingest knowledge into AI") -app.add_typer(worker.app, name="worker", help="šŸ”§ Manage Temporal workers") - -# Help and utility commands -@app.command() -def version(): - """ - šŸ“¦ Show version information - """ - from . import __version__ - console.print(f"FuzzForge CLI v{__version__}") - console.print("Short command: ff") - - -@app.callback() -def main_callback( - ctx: typer.Context, - version: Optional[bool] = typer.Option( - None, "--version", "-v", - help="Show version information" - ), -): - """ - šŸ›”ļø FuzzForge CLI - Security testing workflow orchestration platform - - Quick start: - • ff init - Initialize a new project - • ff workflows - See available workflows - • ff workflow - Execute a workflow - """ - if version: - from . 
import __version__
-        console.print(f"FuzzForge CLI v{__version__}")
-        raise typer.Exit()
-
-
-def main():
-    """Main entry point with smart command routing and error handling"""
-    # Smart command routing BEFORE Typer processes arguments
-    if len(sys.argv) > 1:
-        args = sys.argv[1:]
-
-        # Handle finding command with pattern recognition
-        if len(args) >= 2 and args[0] == 'finding':
-            finding_subcommands = ['show']
-            # Skip custom dispatching if help flags are present
-            if not any(arg in ['--help', '-h', '--version', '-v'] for arg in args):
-                if args[1] not in finding_subcommands:
-                    # Direct finding display: ff finding <run-id>
-                    from .commands.findings import get_findings
-
-                    finding_id = args[1]
-                    console.print(f"šŸ” Displaying finding: {finding_id}")
-
-                    try:
-                        get_findings(run_id=finding_id, save=True, format="table")
-                        return
-                    except Exception as e:
-                        console.print(f"āŒ Failed to get finding: {e}", style="red")
-                        sys.exit(1)
-
-    # Default Typer app handling
-    try:
-        app()
-    except SystemExit as e:
-        # Enhanced error handling for command not found
-        if hasattr(e, 'code') and e.code != 0 and len(sys.argv) > 1:
-            command_parts = sys.argv[1:]
-            clean_parts = [part for part in command_parts if not part.startswith('-')]
-
-            if clean_parts:
-                main_cmd = clean_parts[0]
-                valid_commands = [
-                    'init', 'status', 'config', 'clean',
-                    'workflows', 'workflow',
-                    'findings', 'finding',
-                    'monitor', 'ai', 'ingest', 'worker',
-                    'version'
-                ]
-
-                if main_cmd not in valid_commands:
-                    enhanced_command_not_found_handler(clean_parts)
-                    sys.exit(1)
-        raise
-
-
-if __name__ == "__main__":
-    main()
diff --git a/cli/src/fuzzforge_cli/progress.py b/cli/src/fuzzforge_cli/progress.py
deleted file mode 100644
index d9f1696..0000000
--- a/cli/src/fuzzforge_cli/progress.py
+++ /dev/null
@@ -1,370 +0,0 @@
-"""
-Enhanced progress indicators and loading animations for FuzzForge CLI.
-
-Provides rich progress bars, spinners, and status displays for all long-running operations.
-""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - - -import time -from contextlib import contextmanager -from typing import Optional, Any, Dict, List -from datetime import datetime - -from rich.console import Console -from rich.progress import ( - Progress, SpinnerColumn, TextColumn, BarColumn, TaskProgressColumn, - TimeElapsedColumn, TimeRemainingColumn, MofNCompleteColumn -) -from rich.panel import Panel -from rich.live import Live -from rich.table import Table -from rich.text import Text -from rich import box - -console = Console() - - -class ProgressManager: - """Enhanced progress manager with multiple progress types.""" - - def __init__(self): - self.progress = None - self.live = None - - def create_progress(self, show_speed: bool = False, show_eta: bool = False) -> Progress: - """Create a rich progress instance with customizable columns.""" - columns = [ - SpinnerColumn(), - TextColumn("[bold blue]{task.description}"), - BarColumn(bar_width=40), - TaskProgressColumn(), - ] - - if show_speed: - columns.append(TextColumn("[cyan]{task.fields[speed]}/s")) - - columns.extend([ - TimeElapsedColumn(), - ]) - - if show_eta: - columns.append(TimeRemainingColumn()) - - return Progress(*columns, console=console) - - @contextmanager - def workflow_submission(self, workflow_name: str, target_path: str): - """Progress context for workflow submission.""" - with self.create_progress() as progress: - task = progress.add_task( - f"šŸš€ Submitting workflow: [yellow]{workflow_name}[/yellow]", - total=4 - ) - - # Step 1: Validation - progress.update(task, 
description="šŸ” Validating parameters...", advance=1) - yield progress, task - - # Step 2: API Connection - progress.update(task, description="🌐 Connecting to API...", advance=1) - time.sleep(0.5) # Brief pause for visual feedback - - # Step 3: Submission - progress.update(task, description="šŸ“¤ Submitting workflow...", advance=1) - time.sleep(0.3) - - # Step 4: Complete - progress.update(task, description="āœ… Workflow submitted successfully!", advance=1) - - @contextmanager - def data_export(self, format_type: str, record_count: int): - """Progress context for data export operations.""" - with self.create_progress(show_eta=True) as progress: - task = progress.add_task( - f"šŸ“Š Exporting {record_count} records as [yellow]{format_type.upper()}[/yellow]", - total=record_count - ) - yield progress, task - - @contextmanager - def file_operations(self, operation: str, file_count: int): - """Progress context for file operations.""" - with self.create_progress(show_eta=True) as progress: - task = progress.add_task( - f"šŸ“ {operation} {file_count} files...", - total=file_count - ) - yield progress, task - - @contextmanager - def api_requests(self, operation: str, request_count: Optional[int] = None): - """Progress context for API requests.""" - if request_count: - with self.create_progress() as progress: - task = progress.add_task( - f"🌐 {operation}...", - total=request_count - ) - yield progress, task - else: - # Indeterminate progress for unknown request count - with self.create_progress() as progress: - task = progress.add_task( - f"🌐 {operation}...", - total=None - ) - yield progress, task - - def create_live_stats_display(self) -> Dict[str, Any]: - """Create a live statistics display layout.""" - return { - "layout": None, - "stats_table": None, - "progress_bars": None - } - - -@contextmanager -def spinner(text: str, success_text: Optional[str] = None): - """Simple spinner context manager for quick operations.""" - with Progress( - SpinnerColumn(), - 
        TextColumn("[bold blue]{task.description}"),
-        console=console
-    ) as progress:
-        task = progress.add_task(text, total=None)
-        try:
-            yield progress
-            if success_text:
-                progress.update(task, description=f"āœ… {success_text}")
-                time.sleep(0.5)  # Brief pause to show success
-        except Exception as e:
-            progress.update(task, description=f"āŒ Failed: {str(e)}")
-            time.sleep(0.5)
-            raise
-
-
-@contextmanager
-def step_progress(steps: List[str], title: str = "Processing"):
-    """Multi-step progress with predefined steps."""
-    with Progress(
-        SpinnerColumn(),
-        TextColumn("[bold blue]{task.description}"),
-        BarColumn(bar_width=30),
-        MofNCompleteColumn(),
-        console=console
-    ) as progress:
-        task = progress.add_task(f"šŸ”„ {title}", total=len(steps))
-
-        class StepProgressController:
-            def __init__(self, progress_instance, task_id):
-                self.progress = progress_instance
-                self.task = task_id
-                self.current_step = 0
-
-            def next_step(self):
-                if self.current_step < len(steps):
-                    step_text = steps[self.current_step]
-                    self.progress.update(
-                        self.task,
-                        description=f"šŸ”„ {step_text}",
-                        advance=1
-                    )
-                    self.current_step += 1
-
-            def complete(self, success_text: str = "Completed"):
-                self.progress.update(
-                    self.task,
-                    description=f"āœ… {success_text}",
-                    completed=len(steps)
-                )
-
-        yield StepProgressController(progress, task)
-
-
-def create_workflow_monitoring_display(run_id: str, workflow_name: str) -> Panel:
-    """Create a monitoring display for workflow execution."""
-    table = Table(show_header=False, box=box.ROUNDED)
-    table.add_column("Metric", style="bold cyan")
-    table.add_column("Value", justify="right")
-
-    table.add_row("Run ID", f"[dim]{run_id[:12]}...[/dim]")
-    table.add_row("Workflow", f"[yellow]{workflow_name}[/yellow]")
-    table.add_row("Status", "[orange1]Running[/orange1]")
-    table.add_row("Started", datetime.now().strftime("%H:%M:%S"))
-
-    return Panel.fit(
-        table,
-        title="šŸ”„ Workflow Monitoring",
-        border_style="blue"
-    )
-
-
-def
create_fuzzing_progress_display(stats: Dict[str, Any]) -> Panel: - """Create a rich display for fuzzing progress.""" - # Main stats table - stats_table = Table(show_header=False, box=box.SIMPLE) - stats_table.add_column("Metric", style="bold") - stats_table.add_column("Value", justify="right", style="bold white") - - stats_table.add_row("Executions", f"{stats.get('executions', 0):,}") - stats_table.add_row("Exec/sec", f"{stats.get('executions_per_sec', 0):.1f}") - stats_table.add_row("Crashes", f"[red]{stats.get('crashes', 0):,}[/red]") - stats_table.add_row("Coverage", f"{stats.get('coverage', 0):.1f}%") - - # Progress bars - progress_table = Table(show_header=False, box=box.SIMPLE) - progress_table.add_column("Metric", style="bold") - progress_table.add_column("Progress", min_width=25) - - # Execution rate progress (as percentage of target rate) - exec_rate = stats.get('executions_per_sec', 0) - target_rate = 1000 # Target 1000 exec/sec - exec_progress = min(100, (exec_rate / target_rate) * 100) - progress_table.add_row( - "Exec Rate", - create_progress_bar(exec_progress, color="green") - ) - - # Coverage progress - coverage = stats.get('coverage', 0) - progress_table.add_row( - "Coverage", - create_progress_bar(coverage, color="blue") - ) - - # Combine tables - combined = Table(show_header=False, box=None) - combined.add_column("Stats", ratio=1) - combined.add_column("Progress", ratio=1) - combined.add_row(stats_table, progress_table) - - return Panel( - combined, - title="šŸŽÆ Fuzzing Progress", - border_style="green" - ) - - -def create_progress_bar(percentage: float, color: str = "green", width: int = 20) -> Text: - """Create a visual progress bar using Rich Text.""" - filled = int((percentage / 100) * width) - bar = "ā–ˆ" * filled + "ā–‘" * (width - filled) - text = Text(bar, style=color) - text.append(f" {percentage:.1f}%", style="dim") - return text - - -def create_loading_animation(text: str) -> Live: - """Create a loading animation with rotating 
spinner."""
-    frames = ["ā ‹", "ā ™", "ā ¹", "ā ø", "ā ¼", "ā “", "ā ¦", "ā §", "ā ‡", "ā "]
-    frame_index = 0
-
-    def get_spinner_frame():
-        nonlocal frame_index
-        frame = frames[frame_index]
-        frame_index = (frame_index + 1) % len(frames)
-        return frame
-
-    def get_panel():
-        # Rebuild the panel on every refresh so the spinner frame advances;
-        # handing Live a static Panel would freeze the animation on frame one.
-        return Panel(
-            f"{get_spinner_frame()} [bold blue]{text}[/bold blue]",
-            box=box.ROUNDED,
-            border_style="cyan"
-        )
-
-    return Live(get_renderable=get_panel, auto_refresh=True, refresh_per_second=10)
-
-
-class WorkflowProgressTracker:
-    """Advanced progress tracker for workflow execution."""
-
-    def __init__(self, workflow_name: str, run_id: str):
-        self.workflow_name = workflow_name
-        self.run_id = run_id
-        self.start_time = datetime.now()
-        self.phases = []
-        self.current_phase = None
-
-    def add_phase(self, name: str, description: str, estimated_duration: Optional[int] = None):
-        """Add a phase to the workflow progress."""
-        self.phases.append({
-            "name": name,
-            "description": description,
-            "estimated_duration": estimated_duration,
-            "start_time": None,
-            "end_time": None,
-            "status": "pending"
-        })
-
-    def start_phase(self, phase_name: str):
-        """Start a specific phase."""
-        for phase in self.phases:
-            if phase["name"] == phase_name:
-                phase["start_time"] = datetime.now()
-                phase["status"] = "running"
-                self.current_phase = phase_name
-                break
-
-    def complete_phase(self, phase_name: str, success: bool = True):
-        """Complete a specific phase."""
-        for phase in self.phases:
-            if phase["name"] == phase_name:
-                phase["end_time"] = datetime.now()
-                phase["status"] = "completed" if success else "failed"
-                self.current_phase = None
-                break
-
-    def get_progress_display(self) -> Panel:
-        """Get the current progress display."""
-        # Create progress table
-        table = Table(show_header=True, box=box.ROUNDED)
-        table.add_column("Phase", style="bold")
-        table.add_column("Status", justify="center")
-        table.add_column("Duration")
-
-        for phase in self.phases:
-            status_emoji = {
-                "pending": "ā³",
-                "running": "šŸ”„",
-                "completed":
"āœ…", - "failed": "āŒ" - } - - status_text = f"{status_emoji.get(phase['status'], 'ā“')} {phase['status'].title()}" - - # Calculate duration - if phase["start_time"]: - end_time = phase["end_time"] or datetime.now() - duration = end_time - phase["start_time"] - duration_text = f"{duration.seconds}s" - else: - duration_text = "-" - - table.add_row( - phase["description"], - status_text, - duration_text - ) - - total_duration = datetime.now() - self.start_time - title = f"šŸ”„ {self.workflow_name} Progress (Run: {self.run_id[:8]}..., {total_duration.seconds}s)" - - return Panel( - table, - title=title, - border_style="blue" - ) - - -# Global progress manager instance -progress_manager = ProgressManager() \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/validation.py b/cli/src/fuzzforge_cli/validation.py deleted file mode 100644 index b8fdfb7..0000000 --- a/cli/src/fuzzforge_cli/validation.py +++ /dev/null @@ -1,171 +0,0 @@ -""" -Input validation utilities for FuzzForge CLI. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. -# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. 
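Editor's note: the validators deleted below all follow the same raise-on-invalid pattern (check, then raise `ValidationError(field, value, expected)`). A minimal standalone sketch of that pattern, mirroring `validate_run_id`; the `ValidationError` class here is an illustrative stand-in for the CLI's own exception type, not the deleted implementation:

```python
import re

class ValidationError(ValueError):
    """Stand-in for the CLI's ValidationError(field, value, expected)."""
    def __init__(self, field, value, expected):
        super().__init__(f"{field}={value!r}: expected {expected}")

def check_run_id(run_id: str) -> None:
    # Same three checks as the deleted validate_run_id: type, length, charset.
    if not run_id or not isinstance(run_id, str):
        raise ValidationError("run_id", run_id, "a non-empty string")
    if not (8 <= len(run_id) <= 128):
        raise ValidationError("run_id", run_id, "between 8 and 128 characters")
    if not re.match(r'^[a-zA-Z0-9_-]+$', run_id):
        raise ValidationError("run_id", run_id,
                              "alphanumeric characters, hyphens, and underscores only")

check_run_id("run-2024_abc123")  # valid: returns None, raises nothing
```

Validators that return `None` on success and raise on failure keep call sites flat: callers just invoke them and let the CLI's top-level handler render the error.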
- - -import re -from pathlib import Path -from typing import Any, Dict, List, Optional - -from .constants import SUPPORTED_EXPORT_FORMATS -from .exceptions import ValidationError - - -def validate_run_id(run_id: str) -> None: - """Validate a run/execution ID format""" - if not run_id or not isinstance(run_id, str): - raise ValidationError("run_id", run_id, "a non-empty string") - - # Check for reasonable length (UUIDs are typically 36 chars) - if len(run_id) < 8 or len(run_id) > 128: - raise ValidationError("run_id", run_id, "between 8 and 128 characters") - - # Check for valid characters (alphanumeric, hyphens, underscores) - if not re.match(r'^[a-zA-Z0-9_-]+$', run_id): - raise ValidationError("run_id", run_id, "alphanumeric characters, hyphens, and underscores only") - - -def validate_workflow_name(workflow: str) -> None: - """Validate workflow name format""" - if not workflow or not isinstance(workflow, str): - raise ValidationError("workflow_name", workflow, "a non-empty string") - - # Check for reasonable length - if len(workflow) < 2 or len(workflow) > 64: - raise ValidationError("workflow_name", workflow, "between 2 and 64 characters") - - # Check for valid characters (alphanumeric, hyphens, underscores) - if not re.match(r'^[a-zA-Z0-9_-]+$', workflow): - raise ValidationError("workflow_name", workflow, "alphanumeric characters, hyphens, and underscores only") - - -def validate_target_path(target_path: str, must_exist: bool = True) -> Path: - """Validate and normalize a target path""" - if not target_path or not isinstance(target_path, str): - raise ValidationError("target_path", target_path, "a non-empty string") - - try: - path = Path(target_path).resolve() - except Exception as e: - raise ValidationError("target_path", target_path, f"a valid path: {e}") - - if must_exist and not path.exists(): - raise ValidationError("target_path", target_path, "an existing path") - - return path - - -def validate_export_format(export_format: str) -> None: - """Validate 
export format""" - if export_format not in SUPPORTED_EXPORT_FORMATS: - raise ValidationError( - "export_format", export_format, - f"one of: {', '.join(SUPPORTED_EXPORT_FORMATS)}" - ) - - -def validate_parameter_value(key: str, value: str, param_type: str) -> Any: - """Validate and convert a parameter value based on its type""" - if param_type == "integer": - try: - return int(value) - except ValueError: - raise ValidationError(f"parameter '{key}'", value, "an integer") - - elif param_type == "number": - try: - return float(value) - except ValueError: - raise ValidationError(f"parameter '{key}'", value, "a number") - - elif param_type == "boolean": - lower_value = value.lower() - if lower_value in ("true", "yes", "1", "on"): - return True - elif lower_value in ("false", "no", "0", "off"): - return False - else: - raise ValidationError(f"parameter '{key}'", value, "a boolean (true/false, yes/no, 1/0, on/off)") - - elif param_type == "array": - # Split by comma and strip whitespace - items = [item.strip() for item in value.split(",") if item.strip()] - if not items: - raise ValidationError(f"parameter '{key}'", value, "a non-empty comma-separated list") - return items - - else: - # String type - basic validation - if not value: - raise ValidationError(f"parameter '{key}'", value, "a non-empty string") - return value - - -def validate_parameters(params: List[str]) -> Dict[str, Any]: - """Validate and parse parameter list""" - parameters = {} - - for param_str in params: - if "=" not in param_str: - raise ValidationError("parameter format", param_str, "key=value format") - - key, value = param_str.split("=", 1) - key = key.strip() - value = value.strip() - - if not key: - raise ValidationError("parameter key", param_str, "a non-empty key") - - if not value: - raise ValidationError(f"parameter '{key}'", param_str, "a non-empty value") - - # Auto-detect type and convert - try: - if value.lower() in ("true", "false"): - parameters[key] = value.lower() == "true" - elif 
value.isdigit(): - parameters[key] = int(value) - elif re.match(r'^\d+\.\d+$', value): - parameters[key] = float(value) - else: - parameters[key] = value - except ValueError: - parameters[key] = value - - return parameters - - -def validate_config_key(key: str) -> None: - """Validate configuration key format""" - if not key or not isinstance(key, str): - raise ValidationError("config_key", key, "a non-empty string") - - # Check for valid key format (e.g., "api.url", "timeout") - if not re.match(r'^[a-zA-Z0-9._-]+$', key): - raise ValidationError("config_key", key, "alphanumeric characters, dots, hyphens, and underscores only") - - -def validate_positive_integer(value: int, name: str) -> None: - """Validate that a value is a positive integer""" - if not isinstance(value, int) or value <= 0: - raise ValidationError(name, value, "a positive integer") - - -def validate_timeout(timeout: Optional[int]) -> None: - """Validate timeout value""" - if timeout is not None: - if not isinstance(timeout, int) or timeout <= 0: - raise ValidationError("timeout", timeout, "a positive integer (seconds)") - - if timeout > 86400: # 24 hours - raise ValidationError("timeout", timeout, "less than 24 hours (86400 seconds)") \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/worker_manager.py b/cli/src/fuzzforge_cli/worker_manager.py deleted file mode 100644 index a9b3eaf..0000000 --- a/cli/src/fuzzforge_cli/worker_manager.py +++ /dev/null @@ -1,642 +0,0 @@ -""" -Worker lifecycle management for FuzzForge CLI. - -Manages on-demand startup and shutdown of Temporal workers using Docker Compose. -""" -# Copyright (c) 2025 FuzzingLabs -# -# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file -# at the root of this repository for details. -# -# After the Change Date (four years from publication), this version of the -# Licensed Work will be made available under the Apache License, Version 2.0. 
-# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 -# -# Additional attribution and requirements are provided in the NOTICE file. - -import logging -import os -import platform -import subprocess -import time -from pathlib import Path -from typing import Optional, Dict, Any - -import requests -import yaml -from rich.console import Console -from rich.status import Status - -logger = logging.getLogger(__name__) -console = Console() - - -class WorkerManager: - """ - Manages Temporal worker lifecycle using docker-compose. - - This class handles: - - Checking if workers are running - - Starting workers on demand - - Waiting for workers to be ready - - Stopping workers when done - """ - - def __init__( - self, - compose_file: Optional[Path] = None, - startup_timeout: int = 60, - health_check_interval: float = 2.0 - ): - """ - Initialize WorkerManager. - - Args: - compose_file: Path to docker-compose.yml (defaults to auto-detect) - startup_timeout: Maximum seconds to wait for worker startup - health_check_interval: Seconds between health checks - """ - self.compose_file = compose_file or self._find_compose_file() - self.startup_timeout = startup_timeout - self.health_check_interval = health_check_interval - - def _find_compose_file(self) -> Path: - """ - Auto-detect docker-compose.yml location using multiple strategies. - - Strategies (in order): - 1. Query backend API for host path - 2. Search upward for .fuzzforge marker directory - 3. Use FUZZFORGE_ROOT environment variable - 4. 
Fallback to current directory - - Returns: - Path to docker-compose.yml - - Raises: - FileNotFoundError: If docker-compose.yml cannot be located - """ - # Strategy 1: Ask backend for location - try: - backend_url = os.getenv("FUZZFORGE_API_URL", "http://localhost:8000") - response = requests.get(f"{backend_url}/system/info", timeout=2) - if response.ok: - info = response.json() - if compose_path_str := info.get("docker_compose_path"): - compose_path = Path(compose_path_str) - if compose_path.exists(): - logger.debug(f"Found docker-compose.yml via backend API: {compose_path}") - return compose_path - except Exception as e: - logger.debug(f"Backend API not reachable for path lookup: {e}") - - # Strategy 2: Search upward for .fuzzforge marker directory - current = Path.cwd() - for parent in [current] + list(current.parents): - if (parent / ".fuzzforge").exists(): - compose_path = parent / "docker-compose.yml" - if compose_path.exists(): - logger.debug(f"Found docker-compose.yml via .fuzzforge marker: {compose_path}") - return compose_path - - # Strategy 3: Environment variable - if fuzzforge_root := os.getenv("FUZZFORGE_ROOT"): - compose_path = Path(fuzzforge_root) / "docker-compose.yml" - if compose_path.exists(): - logger.debug(f"Found docker-compose.yml via FUZZFORGE_ROOT: {compose_path}") - return compose_path - - # Strategy 4: Fallback to current directory - compose_path = Path("docker-compose.yml") - if compose_path.exists(): - return compose_path - - raise FileNotFoundError( - "Cannot find docker-compose.yml. Ensure backend is running, " - "run from FuzzForge directory, or set FUZZFORGE_ROOT environment variable." - ) - - def _get_workers_dir(self) -> Path: - """ - Get the workers directory path. - - Uses same strategy as _find_compose_file(): - 1. Query backend API - 2. Derive from compose_file location - 3. 
Use FUZZFORGE_ROOT - - Returns: - Path to workers directory - """ - # Strategy 1: Ask backend - try: - backend_url = os.getenv("FUZZFORGE_API_URL", "http://localhost:8000") - response = requests.get(f"{backend_url}/system/info", timeout=2) - if response.ok: - info = response.json() - if workers_dir_str := info.get("workers_dir"): - workers_dir = Path(workers_dir_str) - if workers_dir.exists(): - return workers_dir - except Exception: - pass - - # Strategy 2: Derive from compose file location - if self.compose_file.exists(): - workers_dir = self.compose_file.parent / "workers" - if workers_dir.exists(): - return workers_dir - - # Strategy 3: Use environment variable - if fuzzforge_root := os.getenv("FUZZFORGE_ROOT"): - workers_dir = Path(fuzzforge_root) / "workers" - if workers_dir.exists(): - return workers_dir - - # Fallback - return Path("workers") - - def _detect_platform(self) -> str: - """ - Detect the current platform. - - Returns: - Platform string: "linux/amd64" or "linux/arm64" - """ - machine = platform.machine().lower() - system = platform.system().lower() - - logger.debug(f"Platform detection: machine={machine}, system={system}") - - # Normalize machine architecture - if machine in ["x86_64", "amd64", "x64"]: - detected = "linux/amd64" - elif machine in ["arm64", "aarch64", "armv8", "arm64v8"]: - detected = "linux/arm64" - else: - # Fallback to amd64 for unknown architectures - logger.warning( - f"Unknown architecture '{machine}' detected, falling back to linux/amd64. " - f"Please report this issue if you're experiencing problems." - ) - detected = "linux/amd64" - - logger.info(f"Detected platform: {detected}") - return detected - - def _read_worker_metadata(self, vertical: str) -> dict: - """ - Read worker metadata.yaml for a vertical. 
- - Args: - vertical: Worker vertical name (e.g., "android", "python") - - Returns: - Dictionary containing metadata, or empty dict if not found - """ - try: - workers_dir = self._get_workers_dir() - metadata_file = workers_dir / vertical / "metadata.yaml" - - if not metadata_file.exists(): - logger.debug(f"No metadata.yaml found for {vertical}") - return {} - - with open(metadata_file, 'r') as f: - return yaml.safe_load(f) or {} - except Exception as e: - logger.debug(f"Failed to read metadata for {vertical}: {e}") - return {} - - def _select_dockerfile(self, vertical: str) -> str: - """ - Select the appropriate Dockerfile for the current platform. - - Args: - vertical: Worker vertical name - - Returns: - Dockerfile name (e.g., "Dockerfile.amd64", "Dockerfile.arm64") - """ - detected_platform = self._detect_platform() - metadata = self._read_worker_metadata(vertical) - - if not metadata: - # No metadata: use default Dockerfile - logger.debug(f"No metadata for {vertical}, using Dockerfile") - return "Dockerfile" - - platforms = metadata.get("platforms", {}) - - if not platforms: - # Metadata exists but no platform definitions - logger.debug(f"No platform definitions in metadata for {vertical}, using Dockerfile") - return "Dockerfile" - - # Try detected platform first - if detected_platform in platforms: - dockerfile = platforms[detected_platform].get("dockerfile", "Dockerfile") - logger.info(f"āœ“ Selected {dockerfile} for {vertical} on {detected_platform}") - return dockerfile - - # Fallback to default platform - default_platform = metadata.get("default_platform", "linux/amd64") - logger.warning( - f"Platform {detected_platform} not found in metadata for {vertical}, " - f"falling back to default: {default_platform}" - ) - - if default_platform in platforms: - dockerfile = platforms[default_platform].get("dockerfile", "Dockerfile.amd64") - logger.info(f"Using default platform {default_platform}: {dockerfile}") - return dockerfile - - # Last resort: just use 
Dockerfile - logger.warning(f"No suitable Dockerfile found for {vertical}, using 'Dockerfile'") - return "Dockerfile" - - def _run_docker_compose(self, *args: str, env: Optional[Dict[str, str]] = None) -> subprocess.CompletedProcess: - """ - Run docker compose command with optional environment variables. - - Args: - *args: Arguments to pass to docker compose - env: Optional environment variables to set - - Returns: - CompletedProcess with result - - Raises: - subprocess.CalledProcessError: If command fails - """ - cmd = ["docker", "compose", "-f", str(self.compose_file)] + list(args) - logger.debug(f"Running: {' '.join(cmd)}") - - # Merge with current environment - full_env = os.environ.copy() - if env: - full_env.update(env) - logger.debug(f"Environment overrides: {env}") - - return subprocess.run( - cmd, - capture_output=True, - text=True, - check=True, - env=full_env - ) - - def _service_to_container_name(self, service_name: str) -> str: - """ - Convert service name to container name based on docker-compose naming convention. - - Args: - service_name: Docker Compose service name (e.g., "worker-python") - - Returns: - Container name (e.g., "fuzzforge-worker-python") - """ - return f"fuzzforge-{service_name}" - - def is_worker_running(self, service_name: str) -> bool: - """ - Check if a worker service is running. 
- - Args: - service_name: Name of the Docker Compose service (e.g., "worker-ossfuzz") - - Returns: - True if container is running, False otherwise - """ - try: - container_name = self._service_to_container_name(service_name) - result = subprocess.run( - ["docker", "inspect", "-f", "{{.State.Running}}", container_name], - capture_output=True, - text=True, - check=False - ) - - # Output is "true" or "false" - return result.stdout.strip().lower() == "true" - - except Exception as e: - logger.debug(f"Failed to check worker status: {e}") - return False - - def start_worker(self, service_name: str) -> bool: - """ - Start a worker service using docker-compose with platform-specific Dockerfile. - - Args: - service_name: Name of the Docker Compose service to start (e.g., "worker-android") - - Returns: - True if started successfully, False otherwise - """ - try: - # Extract vertical name from service name - vertical = service_name.replace("worker-", "") - - # Detect platform and select appropriate Dockerfile - detected_platform = self._detect_platform() - dockerfile = self._select_dockerfile(vertical) - - # Set environment variable for docker-compose - env_var_name = f"{vertical.upper()}_DOCKERFILE" - env = {env_var_name: dockerfile} - - console.print( - f"šŸš€ Starting worker: {service_name} " - f"(platform: {detected_platform}, using {dockerfile})" - ) - - # Use docker-compose up with --build to ensure correct Dockerfile is used - result = self._run_docker_compose("up", "-d", "--build", service_name, env=env) - - logger.info(f"Worker {service_name} started with {dockerfile}") - return True - - except subprocess.CalledProcessError as e: - logger.error(f"Failed to start worker {service_name}: {e.stderr}") - console.print(f"āŒ Failed to start worker: {e.stderr}", style="red") - console.print(f"šŸ’” Start the worker manually: docker compose up -d {service_name}", style="yellow") - return False - - except Exception as e: - logger.error(f"Unexpected error starting worker 
{service_name}: {e}")
-            console.print(f"āŒ Unexpected error: {e}", style="red")
-            return False
-
-    def _get_container_state(self, service_name: str) -> str:
-        """
-        Get the current state of a container (running, created, restarting, etc.).
-
-        Args:
-            service_name: Name of the Docker Compose service
-
-        Returns:
-            Container state string (running, created, restarting, exited, etc.) or "unknown"
-        """
-        try:
-            container_name = self._service_to_container_name(service_name)
-            result = subprocess.run(
-                ["docker", "inspect", "-f", "{{.State.Status}}", container_name],
-                capture_output=True,
-                text=True,
-                check=False
-            )
-            if result.returncode == 0:
-                return result.stdout.strip()
-            return "unknown"
-        except Exception as e:
-            logger.debug(f"Failed to get container state: {e}")
-            return "unknown"
-
-    def _get_health_status(self, container_name: str) -> str:
-        """
-        Get container health status.
-
-        Args:
-            container_name: Docker container name
-
-        Returns:
-            Health status: "healthy", "unhealthy", "starting", "none", or "unknown"
-        """
-        try:
-            result = subprocess.run(
-                ["docker", "inspect", "-f", "{{.State.Health.Status}}", container_name],
-                capture_output=True,
-                text=True,
-                check=False
-            )
-
-            if result.returncode != 0:
-                return "unknown"
-
-            health_status = result.stdout.strip()
-
-            if health_status == "" or health_status == "<no value>":
-                return "none"  # No health check defined
-
-            return health_status  # healthy, unhealthy, starting
-
-        except Exception as e:
-            logger.debug(f"Failed to check health: {e}")
-            return "unknown"
-
-    def wait_for_worker_ready(self, service_name: str, timeout: Optional[int] = None) -> bool:
-        """
-        Wait for a worker to be healthy and ready to process tasks.
-        Shows live progress updates during startup.
- - Args: - service_name: Name of the Docker Compose service - timeout: Maximum seconds to wait (uses instance default if not specified) - - Returns: - True if worker is ready, False if timeout reached - """ - timeout = timeout or self.startup_timeout - start_time = time.time() - container_name = self._service_to_container_name(service_name) - last_status_msg = "" - - with Status("[bold cyan]Starting worker...", console=console, spinner="dots") as status: - while time.time() - start_time < timeout: - elapsed = int(time.time() - start_time) - - # Get container state - container_state = self._get_container_state(service_name) - - # Get health status - health_status = self._get_health_status(container_name) - - # Build status message based on current state - if container_state == "created": - status_msg = f"[cyan]Worker starting... ({elapsed}s)[/cyan]" - elif container_state == "restarting": - status_msg = f"[yellow]Worker restarting... ({elapsed}s)[/yellow]" - elif container_state == "running": - if health_status == "starting": - status_msg = f"[cyan]Worker running, health check starting... ({elapsed}s)[/cyan]" - elif health_status == "unhealthy": - status_msg = f"[yellow]Worker running, health check: unhealthy ({elapsed}s)[/yellow]" - elif health_status == "healthy": - status_msg = f"[green]Worker healthy! 
({elapsed}s)[/green]" - status.update(status_msg) - console.print(f"āœ… Worker ready: {service_name} (took {elapsed}s)") - logger.info(f"Worker {service_name} is healthy (took {elapsed}s)") - return True - elif health_status == "none": - # No health check defined, assume ready - status_msg = f"[green]Worker running (no health check) ({elapsed}s)[/green]" - status.update(status_msg) - console.print(f"āœ… Worker ready: {service_name} (took {elapsed}s)") - logger.info(f"Worker {service_name} is running, no health check (took {elapsed}s)") - return True - else: - status_msg = f"[cyan]Worker running ({elapsed}s)[/cyan]" - elif not container_state or container_state == "exited": - status_msg = f"[yellow]Waiting for container to start... ({elapsed}s)[/yellow]" - else: - status_msg = f"[cyan]Worker state: {container_state} ({elapsed}s)[/cyan]" - - # Show helpful hints at certain intervals - if elapsed == 10: - status_msg += " [dim](pulling image if not cached)[/dim]" - elif elapsed == 30: - status_msg += " [dim](large images can take time)[/dim]" - elif elapsed == 60: - status_msg += " [dim](still working...)[/dim]" - - # Update status if changed - if status_msg != last_status_msg: - status.update(status_msg) - last_status_msg = status_msg - logger.debug(f"Worker {service_name} - state: {container_state}, health: {health_status}") - - time.sleep(self.health_check_interval) - - # Timeout reached - elapsed = int(time.time() - start_time) - logger.warning(f"Worker {service_name} did not become ready within {elapsed}s") - console.print(f"āš ļø Worker startup timeout after {elapsed}s", style="yellow") - console.print(f" Last state: {container_state}, health: {health_status}", style="dim") - return False - - def stop_worker(self, service_name: str) -> bool: - """ - Stop a worker service using docker-compose. 
- - Args: - service_name: Name of the Docker Compose service to stop - - Returns: - True if stopped successfully, False otherwise - """ - try: - console.print(f"šŸ›‘ Stopping worker: {service_name}") - - # Use docker-compose stop so the container is preserved and can be restarted later - result = self._run_docker_compose("stop", service_name) - - logger.info(f"Worker {service_name} stopped") - return True - - except subprocess.CalledProcessError as e: - logger.error(f"Failed to stop worker {service_name}: {e.stderr}") - console.print(f"āŒ Failed to stop worker: {e.stderr}", style="red") - return False - - except Exception as e: - logger.error(f"Unexpected error stopping worker {service_name}: {e}") - console.print(f"āŒ Unexpected error: {e}", style="red") - return False - - def stop_all_workers(self) -> bool: - """ - Stop all running FuzzForge worker containers. - - This uses `docker stop` to stop worker containers individually, - avoiding the Docker Compose profile issue and preventing accidental - shutdown of core services. 
- - Returns: - True if all workers stopped successfully, False otherwise - """ - try: - console.print("šŸ›‘ Stopping all FuzzForge workers...") - - # Get list of all running worker containers - result = subprocess.run( - ["docker", "ps", "--filter", "name=fuzzforge-worker-", "--format", "{{.Names}}"], - capture_output=True, - text=True, - check=False - ) - - running_workers = [name.strip() for name in result.stdout.splitlines() if name.strip()] - - if not running_workers: - console.print("āœ“ No workers running") - return True - - console.print(f"Found {len(running_workers)} running worker(s):") - for worker in running_workers: - console.print(f" - {worker}") - - # Stop each worker container individually using docker stop - # This is safer than docker compose down and won't affect core services - failed_workers = [] - for worker in running_workers: - try: - logger.info(f"Stopping {worker}...") - result = subprocess.run( - ["docker", "stop", worker], - capture_output=True, - text=True, - check=True, - timeout=30 - ) - console.print(f" āœ“ Stopped {worker}") - except subprocess.CalledProcessError as e: - logger.error(f"Failed to stop {worker}: {e.stderr}") - failed_workers.append(worker) - console.print(f" āœ— Failed to stop {worker}", style="red") - except subprocess.TimeoutExpired: - logger.error(f"Timeout stopping {worker}") - failed_workers.append(worker) - console.print(f" āœ— Timeout stopping {worker}", style="red") - - if failed_workers: - console.print(f"\nāš ļø {len(failed_workers)} worker(s) failed to stop", style="yellow") - console.print("šŸ’” Try manually: docker stop " + " ".join(failed_workers), style="dim") - return False - - console.print("\nāœ… All workers stopped") - logger.info("All workers stopped successfully") - return True - - except Exception as e: - logger.error(f"Unexpected error stopping workers: {e}") - console.print(f"āŒ Unexpected error: {e}", style="red") - return False - - def ensure_worker_running( - self, - worker_info: Dict[str, 
Any], - auto_start: bool = True - ) -> bool: - """ - Ensure a worker is running, starting it if necessary. - - Args: - worker_info: Worker information dict from API (contains worker_service, etc.) - auto_start: Whether to automatically start the worker if not running - - Returns: - True if worker is running, False otherwise - """ - # Get worker_service (docker-compose service name) - service_name = worker_info.get("worker_service", f"worker-{worker_info['vertical']}") - vertical = worker_info["vertical"] - - # Check if already running - if self.is_worker_running(service_name): - console.print(f"āœ“ Worker already running: {vertical}") - return True - - if not auto_start: - console.print( - f"āš ļø Worker not running: {vertical}. Use --auto-start to start automatically.", - style="yellow" - ) - return False - - # Start the worker - if not self.start_worker(service_name): - return False - - # Wait for it to be ready - return self.wait_for_worker_ready(service_name) diff --git a/cli/uv.lock b/cli/uv.lock deleted file mode 100644 index e83f96f..0000000 --- a/cli/uv.lock +++ /dev/null @@ -1,5256 +0,0 @@ -version = 1 -revision = 3 -requires-python = ">=3.11" -resolution-markers = [ - "python_full_version >= '3.14' and platform_python_implementation != 'PyPy' and sys_platform != 'emscripten'", - "python_full_version >= '3.14' and platform_python_implementation == 'PyPy' and sys_platform != 'emscripten'", - "python_full_version >= '3.14' and sys_platform == 'emscripten'", - "python_full_version == '3.13.*' and platform_python_implementation != 'PyPy' and sys_platform != 'emscripten'", - "python_full_version == '3.12.*' and platform_python_implementation != 'PyPy' and sys_platform != 'emscripten'", - "python_full_version < '3.12' and platform_python_implementation != 'PyPy' and sys_platform != 'emscripten'", - "python_full_version == '3.13.*' and platform_python_implementation == 'PyPy' and sys_platform != 'emscripten'", - "python_full_version == '3.12.*' and 
platform_python_implementation == 'PyPy' and sys_platform != 'emscripten'", - "python_full_version < '3.12' and platform_python_implementation == 'PyPy' and sys_platform != 'emscripten'", - "python_full_version == '3.13.*' and sys_platform == 'emscripten'", - "python_full_version == '3.12.*' and sys_platform == 'emscripten'", - "python_full_version < '3.12' and sys_platform == 'emscripten'", -] - -[[package]] -name = "a2a-sdk" -version = "0.3.7" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core" }, - { name = "httpx" }, - { name = "httpx-sse" }, - { name = "protobuf" }, - { name = "pydantic" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/8d/ad/b6ecb58f44459a24f1c260e91304e1ddbb7a8e213f1f82cc4c074f66e9bb/a2a_sdk-0.3.7.tar.gz", hash = "sha256:795aa2bd2cfb3c9e8654a1352bf5f75d6cf1205b262b1bf8f4003b5308267ea2", size = 223426, upload-time = "2025-09-23T16:27:29.585Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e6/27/9cf8c6de4ae71e9c98ec96b3304449d5d0cd36ec3b95e66b6e7f58a9e571/a2a_sdk-0.3.7-py3-none-any.whl", hash = "sha256:0813b8fd7add427b2b56895cf28cae705303cf6d671b305c0aac69987816e03e", size = 137957, upload-time = "2025-09-23T16:27:27.546Z" }, -] - -[[package]] -name = "absolufy-imports" -version = "0.3.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/74/0f/9da9dc9a12ebf4622ec96d9338d221e0172699e7574929f65ec8fdb30f9c/absolufy_imports-0.3.1.tar.gz", hash = "sha256:c90638a6c0b66826d1fb4880ddc20ef7701af34192c94faf40b95d32b59f9793", size = 4724, upload-time = "2022-01-20T14:48:53.434Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a3/a4/b65c9fbc2c0c09c0ea3008f62d2010fd261e62a4881502f03a6301079182/absolufy_imports-0.3.1-py2.py3-none-any.whl", hash = "sha256:49bf7c753a9282006d553ba99217f48f947e3eef09e18a700f8a82f75dc7fc5c", size = 5937, upload-time = "2022-01-20T14:48:51.718Z" }, -] - -[[package]] -name = 
"agentops" -version = "0.4.21" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "aiohttp" }, - { name = "httpx" }, - { name = "opentelemetry-api" }, - { name = "opentelemetry-exporter-otlp-proto-http" }, - { name = "opentelemetry-instrumentation" }, - { name = "opentelemetry-sdk" }, - { name = "opentelemetry-semantic-conventions" }, - { name = "ordered-set" }, - { name = "packaging" }, - { name = "psutil" }, - { name = "pyyaml" }, - { name = "requests" }, - { name = "termcolor" }, - { name = "wrapt" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/0a/c4/023fe976169c57b1edd71f4c08d6dedaf66814f5b25ecf59b3a8540311ab/agentops-0.4.21.tar.gz", hash = "sha256:47759c6dfd6ea58bad2f7764257e4778cb2e34ae180cef642f60f56adced6510", size = 430861, upload-time = "2025-08-29T06:36:55.323Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d9/63/3e48da56d5121ddcefef8645ad5a3446b0974154111a14bf75ea2b5b3cc3/agentops-0.4.21-py3-none-any.whl", hash = "sha256:93b098ea77bc5f64dcae5031a8292531cb446d9d66e6c7ef2f21a66d4e4fb2f0", size = 309579, upload-time = "2025-08-29T06:36:53.855Z" }, -] - -[[package]] -name = "aiobotocore" -version = "2.24.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "aiohttp" }, - { name = "aioitertools" }, - { name = "botocore" }, - { name = "jmespath" }, - { name = "multidict" }, - { name = "python-dateutil" }, - { name = "wrapt" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/05/93/9f5243c2fd2fc22cff92f8d8a7e98d3080171be60778d49aeabb555a463d/aiobotocore-2.24.2.tar.gz", hash = "sha256:dfb21bdb2610e8de4d22f401e91a24d50f1330a302d03c62c485757becd439a9", size = 119837, upload-time = "2025-09-05T12:13:46.963Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/87/03/2330062ac4ea9fa6447e02b0625f24efd6f05b6c44d61d86610b3555ee66/aiobotocore-2.24.2-py3-none-any.whl", hash = "sha256:808c63b2bd344b91e2f2acb874831118a9f53342d248acd16a68455a226e283a", 
size = 85441, upload-time = "2025-09-05T12:13:45.378Z" }, -] - -[package.optional-dependencies] -boto3 = [ - { name = "boto3" }, -] - -[[package]] -name = "aiofiles" -version = "23.2.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/af/41/cfed10bc64d774f497a86e5ede9248e1d062db675504b41c320954d99641/aiofiles-23.2.1.tar.gz", hash = "sha256:84ec2218d8419404abcb9f0c02df3f34c6e0a68ed41072acfb1cef5cbc29051a", size = 32072, upload-time = "2023-08-09T15:23:11.564Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c5/19/5af6804c4cc0fed83f47bff6e413a98a36618e7d40185cd36e69737f3b0e/aiofiles-23.2.1-py3-none-any.whl", hash = "sha256:19297512c647d4b27a2cf7c34caa7e405c0d60b5560618a29a9fe027b18b0107", size = 15727, upload-time = "2023-08-09T15:23:09.774Z" }, -] - -[[package]] -name = "aiohappyeyeballs" -version = "2.6.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/26/30/f84a107a9c4331c14b2b586036f40965c128aa4fee4dda5d3d51cb14ad54/aiohappyeyeballs-2.6.1.tar.gz", hash = "sha256:c3f9d0113123803ccadfdf3f0faa505bc78e6a72d1cc4806cbd719826e943558", size = 22760, upload-time = "2025-03-12T01:42:48.764Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/0f/15/5bf3b99495fb160b63f95972b81750f18f7f4e02ad051373b669d17d44f2/aiohappyeyeballs-2.6.1-py3-none-any.whl", hash = "sha256:f349ba8f4b75cb25c99c5c2d84e997e485204d2902a9597802b0371f09331fb8", size = 15265, upload-time = "2025-03-12T01:42:47.083Z" }, -] - -[[package]] -name = "aiohttp" -version = "3.12.15" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "aiohappyeyeballs" }, - { name = "aiosignal" }, - { name = "attrs" }, - { name = "frozenlist" }, - { name = "multidict" }, - { name = "propcache" }, - { name = "yarl" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/9b/e7/d92a237d8802ca88483906c388f7c201bbe96cd80a165ffd0ac2f6a8d59f/aiohttp-3.12.15.tar.gz", hash = "sha256:4fc61385e9c98d72fcdf47e6dd81833f47b2f77c114c29cd64a361be57a763a2", size = 7823716, upload-time = "2025-07-29T05:52:32.215Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/20/19/9e86722ec8e835959bd97ce8c1efa78cf361fa4531fca372551abcc9cdd6/aiohttp-3.12.15-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d3ce17ce0220383a0f9ea07175eeaa6aa13ae5a41f30bc61d84df17f0e9b1117", size = 711246, upload-time = "2025-07-29T05:50:15.937Z" }, - { url = "https://files.pythonhosted.org/packages/71/f9/0a31fcb1a7d4629ac9d8f01f1cb9242e2f9943f47f5d03215af91c3c1a26/aiohttp-3.12.15-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:010cc9bbd06db80fe234d9003f67e97a10fe003bfbedb40da7d71c1008eda0fe", size = 483515, upload-time = "2025-07-29T05:50:17.442Z" }, - { url = "https://files.pythonhosted.org/packages/62/6c/94846f576f1d11df0c2e41d3001000527c0fdf63fce7e69b3927a731325d/aiohttp-3.12.15-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3f9d7c55b41ed687b9d7165b17672340187f87a773c98236c987f08c858145a9", size = 471776, upload-time = "2025-07-29T05:50:19.568Z" }, - { url = "https://files.pythonhosted.org/packages/f8/6c/f766d0aaafcee0447fad0328da780d344489c042e25cd58fde566bf40aed/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bc4fbc61bb3548d3b482f9ac7ddd0f18c67e4225aaa4e8552b9f1ac7e6bda9e5", size = 1741977, upload-time = "2025-07-29T05:50:21.665Z" }, - { url = "https://files.pythonhosted.org/packages/17/e5/fb779a05ba6ff44d7bc1e9d24c644e876bfff5abe5454f7b854cace1b9cc/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7fbc8a7c410bb3ad5d595bb7118147dfbb6449d862cc1125cf8867cb337e8728", size = 1690645, upload-time = "2025-07-29T05:50:23.333Z" }, - { url = 
"https://files.pythonhosted.org/packages/37/4e/a22e799c2035f5d6a4ad2cf8e7c1d1bd0923192871dd6e367dafb158b14c/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:74dad41b3458dbb0511e760fb355bb0b6689e0630de8a22b1b62a98777136e16", size = 1789437, upload-time = "2025-07-29T05:50:25.007Z" }, - { url = "https://files.pythonhosted.org/packages/28/e5/55a33b991f6433569babb56018b2fb8fb9146424f8b3a0c8ecca80556762/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b6f0af863cf17e6222b1735a756d664159e58855da99cfe965134a3ff63b0b0", size = 1828482, upload-time = "2025-07-29T05:50:26.693Z" }, - { url = "https://files.pythonhosted.org/packages/c6/82/1ddf0ea4f2f3afe79dffed5e8a246737cff6cbe781887a6a170299e33204/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b5b7fe4972d48a4da367043b8e023fb70a04d1490aa7d68800e465d1b97e493b", size = 1730944, upload-time = "2025-07-29T05:50:28.382Z" }, - { url = "https://files.pythonhosted.org/packages/1b/96/784c785674117b4cb3877522a177ba1b5e4db9ce0fd519430b5de76eec90/aiohttp-3.12.15-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6443cca89553b7a5485331bc9bedb2342b08d073fa10b8c7d1c60579c4a7b9bd", size = 1668020, upload-time = "2025-07-29T05:50:30.032Z" }, - { url = "https://files.pythonhosted.org/packages/12/8a/8b75f203ea7e5c21c0920d84dd24a5c0e971fe1e9b9ebbf29ae7e8e39790/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:6c5f40ec615e5264f44b4282ee27628cea221fcad52f27405b80abb346d9f3f8", size = 1716292, upload-time = "2025-07-29T05:50:31.983Z" }, - { url = "https://files.pythonhosted.org/packages/47/0b/a1451543475bb6b86a5cfc27861e52b14085ae232896a2654ff1231c0992/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:2abbb216a1d3a2fe86dbd2edce20cdc5e9ad0be6378455b05ec7f77361b3ab50", size = 1711451, upload-time = "2025-07-29T05:50:33.989Z" }, 
- { url = "https://files.pythonhosted.org/packages/55/fd/793a23a197cc2f0d29188805cfc93aa613407f07e5f9da5cd1366afd9d7c/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:db71ce547012a5420a39c1b744d485cfb823564d01d5d20805977f5ea1345676", size = 1691634, upload-time = "2025-07-29T05:50:35.846Z" }, - { url = "https://files.pythonhosted.org/packages/ca/bf/23a335a6670b5f5dfc6d268328e55a22651b440fca341a64fccf1eada0c6/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:ced339d7c9b5030abad5854aa5413a77565e5b6e6248ff927d3e174baf3badf7", size = 1785238, upload-time = "2025-07-29T05:50:37.597Z" }, - { url = "https://files.pythonhosted.org/packages/57/4f/ed60a591839a9d85d40694aba5cef86dde9ee51ce6cca0bb30d6eb1581e7/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:7c7dd29c7b5bda137464dc9bfc738d7ceea46ff70309859ffde8c022e9b08ba7", size = 1805701, upload-time = "2025-07-29T05:50:39.591Z" }, - { url = "https://files.pythonhosted.org/packages/85/e0/444747a9455c5de188c0f4a0173ee701e2e325d4b2550e9af84abb20cdba/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:421da6fd326460517873274875c6c5a18ff225b40da2616083c5a34a7570b685", size = 1718758, upload-time = "2025-07-29T05:50:41.292Z" }, - { url = "https://files.pythonhosted.org/packages/36/ab/1006278d1ffd13a698e5dd4bfa01e5878f6bddefc296c8b62649753ff249/aiohttp-3.12.15-cp311-cp311-win32.whl", hash = "sha256:4420cf9d179ec8dfe4be10e7d0fe47d6d606485512ea2265b0d8c5113372771b", size = 428868, upload-time = "2025-07-29T05:50:43.063Z" }, - { url = "https://files.pythonhosted.org/packages/10/97/ad2b18700708452400278039272032170246a1bf8ec5d832772372c71f1a/aiohttp-3.12.15-cp311-cp311-win_amd64.whl", hash = "sha256:edd533a07da85baa4b423ee8839e3e91681c7bfa19b04260a469ee94b778bf6d", size = 453273, upload-time = "2025-07-29T05:50:44.613Z" }, - { url = 
"https://files.pythonhosted.org/packages/63/97/77cb2450d9b35f517d6cf506256bf4f5bda3f93a66b4ad64ba7fc917899c/aiohttp-3.12.15-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:802d3868f5776e28f7bf69d349c26fc0efadb81676d0afa88ed00d98a26340b7", size = 702333, upload-time = "2025-07-29T05:50:46.507Z" }, - { url = "https://files.pythonhosted.org/packages/83/6d/0544e6b08b748682c30b9f65640d006e51f90763b41d7c546693bc22900d/aiohttp-3.12.15-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f2800614cd560287be05e33a679638e586a2d7401f4ddf99e304d98878c29444", size = 476948, upload-time = "2025-07-29T05:50:48.067Z" }, - { url = "https://files.pythonhosted.org/packages/3a/1d/c8c40e611e5094330284b1aea8a4b02ca0858f8458614fa35754cab42b9c/aiohttp-3.12.15-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8466151554b593909d30a0a125d638b4e5f3836e5aecde85b66b80ded1cb5b0d", size = 469787, upload-time = "2025-07-29T05:50:49.669Z" }, - { url = "https://files.pythonhosted.org/packages/38/7d/b76438e70319796bfff717f325d97ce2e9310f752a267bfdf5192ac6082b/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e5a495cb1be69dae4b08f35a6c4579c539e9b5706f606632102c0f855bcba7c", size = 1716590, upload-time = "2025-07-29T05:50:51.368Z" }, - { url = "https://files.pythonhosted.org/packages/79/b1/60370d70cdf8b269ee1444b390cbd72ce514f0d1cd1a715821c784d272c9/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6404dfc8cdde35c69aaa489bb3542fb86ef215fc70277c892be8af540e5e21c0", size = 1699241, upload-time = "2025-07-29T05:50:53.628Z" }, - { url = "https://files.pythonhosted.org/packages/a3/2b/4968a7b8792437ebc12186db31523f541943e99bda8f30335c482bea6879/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3ead1c00f8521a5c9070fcb88f02967b1d8a0544e6d85c253f6968b785e1a2ab", size = 1754335, upload-time = "2025-07-29T05:50:55.394Z" }, - { url = 
"https://files.pythonhosted.org/packages/fb/c1/49524ed553f9a0bec1a11fac09e790f49ff669bcd14164f9fab608831c4d/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6990ef617f14450bc6b34941dba4f12d5613cbf4e33805932f853fbd1cf18bfb", size = 1800491, upload-time = "2025-07-29T05:50:57.202Z" }, - { url = "https://files.pythonhosted.org/packages/de/5e/3bf5acea47a96a28c121b167f5ef659cf71208b19e52a88cdfa5c37f1fcc/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd736ed420f4db2b8148b52b46b88ed038d0354255f9a73196b7bbce3ea97545", size = 1719929, upload-time = "2025-07-29T05:50:59.192Z" }, - { url = "https://files.pythonhosted.org/packages/39/94/8ae30b806835bcd1cba799ba35347dee6961a11bd507db634516210e91d8/aiohttp-3.12.15-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c5092ce14361a73086b90c6efb3948ffa5be2f5b6fbcf52e8d8c8b8848bb97c", size = 1635733, upload-time = "2025-07-29T05:51:01.394Z" }, - { url = "https://files.pythonhosted.org/packages/7a/46/06cdef71dd03acd9da7f51ab3a9107318aee12ad38d273f654e4f981583a/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:aaa2234bb60c4dbf82893e934d8ee8dea30446f0647e024074237a56a08c01bd", size = 1696790, upload-time = "2025-07-29T05:51:03.657Z" }, - { url = "https://files.pythonhosted.org/packages/02/90/6b4cfaaf92ed98d0ec4d173e78b99b4b1a7551250be8937d9d67ecb356b4/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:6d86a2fbdd14192e2f234a92d3b494dd4457e683ba07e5905a0b3ee25389ac9f", size = 1718245, upload-time = "2025-07-29T05:51:05.911Z" }, - { url = "https://files.pythonhosted.org/packages/2e/e6/2593751670fa06f080a846f37f112cbe6f873ba510d070136a6ed46117c6/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a041e7e2612041a6ddf1c6a33b883be6a421247c7afd47e885969ee4cc58bd8d", size = 1658899, upload-time = "2025-07-29T05:51:07.753Z" }, - { url = 
"https://files.pythonhosted.org/packages/8f/28/c15bacbdb8b8eb5bf39b10680d129ea7410b859e379b03190f02fa104ffd/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5015082477abeafad7203757ae44299a610e89ee82a1503e3d4184e6bafdd519", size = 1738459, upload-time = "2025-07-29T05:51:09.56Z" }, - { url = "https://files.pythonhosted.org/packages/00/de/c269cbc4faa01fb10f143b1670633a8ddd5b2e1ffd0548f7aa49cb5c70e2/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:56822ff5ddfd1b745534e658faba944012346184fbfe732e0d6134b744516eea", size = 1766434, upload-time = "2025-07-29T05:51:11.423Z" }, - { url = "https://files.pythonhosted.org/packages/52/b0/4ff3abd81aa7d929b27d2e1403722a65fc87b763e3a97b3a2a494bfc63bc/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b2acbbfff69019d9014508c4ba0401822e8bae5a5fdc3b6814285b71231b60f3", size = 1726045, upload-time = "2025-07-29T05:51:13.689Z" }, - { url = "https://files.pythonhosted.org/packages/71/16/949225a6a2dd6efcbd855fbd90cf476052e648fb011aa538e3b15b89a57a/aiohttp-3.12.15-cp312-cp312-win32.whl", hash = "sha256:d849b0901b50f2185874b9a232f38e26b9b3d4810095a7572eacea939132d4e1", size = 423591, upload-time = "2025-07-29T05:51:15.452Z" }, - { url = "https://files.pythonhosted.org/packages/2b/d8/fa65d2a349fe938b76d309db1a56a75c4fb8cc7b17a398b698488a939903/aiohttp-3.12.15-cp312-cp312-win_amd64.whl", hash = "sha256:b390ef5f62bb508a9d67cb3bba9b8356e23b3996da7062f1a57ce1a79d2b3d34", size = 450266, upload-time = "2025-07-29T05:51:17.239Z" }, - { url = "https://files.pythonhosted.org/packages/f2/33/918091abcf102e39d15aba2476ad9e7bd35ddb190dcdd43a854000d3da0d/aiohttp-3.12.15-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:9f922ffd05034d439dde1c77a20461cf4a1b0831e6caa26151fe7aa8aaebc315", size = 696741, upload-time = "2025-07-29T05:51:19.021Z" }, - { url = 
"https://files.pythonhosted.org/packages/b5/2a/7495a81e39a998e400f3ecdd44a62107254803d1681d9189be5c2e4530cd/aiohttp-3.12.15-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:2ee8a8ac39ce45f3e55663891d4b1d15598c157b4d494a4613e704c8b43112cd", size = 474407, upload-time = "2025-07-29T05:51:21.165Z" }, - { url = "https://files.pythonhosted.org/packages/49/fc/a9576ab4be2dcbd0f73ee8675d16c707cfc12d5ee80ccf4015ba543480c9/aiohttp-3.12.15-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3eae49032c29d356b94eee45a3f39fdf4b0814b397638c2f718e96cfadf4c4e4", size = 466703, upload-time = "2025-07-29T05:51:22.948Z" }, - { url = "https://files.pythonhosted.org/packages/09/2f/d4bcc8448cf536b2b54eed48f19682031ad182faa3a3fee54ebe5b156387/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b97752ff12cc12f46a9b20327104448042fce5c33a624f88c18f66f9368091c7", size = 1705532, upload-time = "2025-07-29T05:51:25.211Z" }, - { url = "https://files.pythonhosted.org/packages/f1/f3/59406396083f8b489261e3c011aa8aee9df360a96ac8fa5c2e7e1b8f0466/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:894261472691d6fe76ebb7fcf2e5870a2ac284c7406ddc95823c8598a1390f0d", size = 1686794, upload-time = "2025-07-29T05:51:27.145Z" }, - { url = "https://files.pythonhosted.org/packages/dc/71/164d194993a8d114ee5656c3b7ae9c12ceee7040d076bf7b32fb98a8c5c6/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5fa5d9eb82ce98959fc1031c28198b431b4d9396894f385cb63f1e2f3f20ca6b", size = 1738865, upload-time = "2025-07-29T05:51:29.366Z" }, - { url = "https://files.pythonhosted.org/packages/1c/00/d198461b699188a93ead39cb458554d9f0f69879b95078dce416d3209b54/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f0fa751efb11a541f57db59c1dd821bec09031e01452b2b6217319b3a1f34f3d", size = 1788238, upload-time = "2025-07-29T05:51:31.285Z" }, - { url = 
"https://files.pythonhosted.org/packages/85/b8/9e7175e1fa0ac8e56baa83bf3c214823ce250d0028955dfb23f43d5e61fd/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5346b93e62ab51ee2a9d68e8f73c7cf96ffb73568a23e683f931e52450e4148d", size = 1710566, upload-time = "2025-07-29T05:51:33.219Z" }, - { url = "https://files.pythonhosted.org/packages/59/e4/16a8eac9df39b48ae102ec030fa9f726d3570732e46ba0c592aeeb507b93/aiohttp-3.12.15-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:049ec0360f939cd164ecbfd2873eaa432613d5e77d6b04535e3d1fbae5a9e645", size = 1624270, upload-time = "2025-07-29T05:51:35.195Z" }, - { url = "https://files.pythonhosted.org/packages/1f/f8/cd84dee7b6ace0740908fd0af170f9fab50c2a41ccbc3806aabcb1050141/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b52dcf013b57464b6d1e51b627adfd69a8053e84b7103a7cd49c030f9ca44461", size = 1677294, upload-time = "2025-07-29T05:51:37.215Z" }, - { url = "https://files.pythonhosted.org/packages/ce/42/d0f1f85e50d401eccd12bf85c46ba84f947a84839c8a1c2c5f6e8ab1eb50/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:9b2af240143dd2765e0fb661fd0361a1b469cab235039ea57663cda087250ea9", size = 1708958, upload-time = "2025-07-29T05:51:39.328Z" }, - { url = "https://files.pythonhosted.org/packages/d5/6b/f6fa6c5790fb602538483aa5a1b86fcbad66244997e5230d88f9412ef24c/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ac77f709a2cde2cc71257ab2d8c74dd157c67a0558a0d2799d5d571b4c63d44d", size = 1651553, upload-time = "2025-07-29T05:51:41.356Z" }, - { url = "https://files.pythonhosted.org/packages/04/36/a6d36ad545fa12e61d11d1932eef273928b0495e6a576eb2af04297fdd3c/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:47f6b962246f0a774fbd3b6b7be25d59b06fdb2f164cf2513097998fc6a29693", size = 1727688, upload-time = "2025-07-29T05:51:43.452Z" }, - { url = 
"https://files.pythonhosted.org/packages/aa/c8/f195e5e06608a97a4e52c5d41c7927301bf757a8e8bb5bbf8cef6c314961/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:760fb7db442f284996e39cf9915a94492e1896baac44f06ae551974907922b64", size = 1761157, upload-time = "2025-07-29T05:51:45.643Z" }, - { url = "https://files.pythonhosted.org/packages/05/6a/ea199e61b67f25ba688d3ce93f63b49b0a4e3b3d380f03971b4646412fc6/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ad702e57dc385cae679c39d318def49aef754455f237499d5b99bea4ef582e51", size = 1710050, upload-time = "2025-07-29T05:51:48.203Z" }, - { url = "https://files.pythonhosted.org/packages/b4/2e/ffeb7f6256b33635c29dbed29a22a723ff2dd7401fff42ea60cf2060abfb/aiohttp-3.12.15-cp313-cp313-win32.whl", hash = "sha256:f813c3e9032331024de2eb2e32a88d86afb69291fbc37a3a3ae81cc9917fb3d0", size = 422647, upload-time = "2025-07-29T05:51:50.718Z" }, - { url = "https://files.pythonhosted.org/packages/1b/8e/78ee35774201f38d5e1ba079c9958f7629b1fd079459aea9467441dbfbf5/aiohttp-3.12.15-cp313-cp313-win_amd64.whl", hash = "sha256:1a649001580bdb37c6fdb1bebbd7e3bc688e8ec2b5c6f52edbb664662b17dc84", size = 449067, upload-time = "2025-07-29T05:51:52.549Z" }, -] - -[[package]] -name = "aioitertools" -version = "0.12.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/06/de/38491a84ab323b47c7f86e94d2830e748780525f7a10c8600b67ead7e9ea/aioitertools-0.12.0.tar.gz", hash = "sha256:c2a9055b4fbb7705f561b9d86053e8af5d10cc845d22c32008c43490b2d8dd6b", size = 19369, upload-time = "2024-09-02T03:33:40.349Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/85/13/58b70a580de00893223d61de8fea167877a3aed97d4a5e1405c9159ef925/aioitertools-0.12.0-py3-none-any.whl", hash = "sha256:fc1f5fac3d737354de8831cbba3eb04f79dd649d8f3afb4c5b114925e662a796", size = 24345, upload-time = "2024-09-02T03:34:59.454Z" }, -] - -[[package]] -name = "aiosignal" -version = "1.4.0" -source 
= { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "frozenlist" }, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/61/62/06741b579156360248d1ec624842ad0edf697050bbaf7c3e46394e106ad1/aiosignal-1.4.0.tar.gz", hash = "sha256:f47eecd9468083c2029cc99945502cb7708b082c232f9aca65da147157b251c7", size = 25007, upload-time = "2025-07-03T22:54:43.528Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/fb/76/641ae371508676492379f16e2fa48f4e2c11741bd63c48be4b12a6b09cba/aiosignal-1.4.0-py3-none-any.whl", hash = "sha256:053243f8b92b990551949e63930a839ff0cf0b0ebbe0597b0f3fb19e1a0fe82e", size = 7490, upload-time = "2025-07-03T22:54:42.156Z" }, -] - -[[package]] -name = "aiosqlite" -version = "0.21.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/13/7d/8bca2bf9a247c2c5dfeec1d7a5f40db6518f88d314b8bca9da29670d2671/aiosqlite-0.21.0.tar.gz", hash = "sha256:131bb8056daa3bc875608c631c678cda73922a2d4ba8aec373b19f18c17e7aa3", size = 13454, upload-time = "2025-02-03T07:30:16.235Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/f5/10/6c25ed6de94c49f88a91fa5018cb4c0f3625f31d5be9f771ebe5cc7cd506/aiosqlite-0.21.0-py3-none-any.whl", hash = "sha256:2549cf4057f95f53dcba16f2b64e8e2791d7e1adedb13197dd8ed77bb226d7d0", size = 15792, upload-time = "2025-02-03T07:30:13.6Z" }, -] - -[[package]] -name = "alembic" -version = "1.16.5" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "mako" }, - { name = "sqlalchemy" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/9a/ca/4dc52902cf3491892d464f5265a81e9dff094692c8a049a3ed6a05fe7ee8/alembic-1.16.5.tar.gz", hash = "sha256:a88bb7f6e513bd4301ecf4c7f2206fe93f9913f9b48dac3b78babde2d6fe765e", size = 1969868, upload-time = 
"2025-08-27T18:02:05.668Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/39/4a/4c61d4c84cfd9befb6fa08a702535b27b21fff08c946bc2f6139decbf7f7/alembic-1.16.5-py3-none-any.whl", hash = "sha256:e845dfe090c5ffa7b92593ae6687c5cb1a101e91fa53868497dbd79847f9dbe3", size = 247355, upload-time = "2025-08-27T18:02:07.37Z" }, -] - -[[package]] -name = "annotated-types" -version = "0.7.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" }, -] - -[[package]] -name = "anyio" -version = "4.10.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "idna" }, - { name = "sniffio" }, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/f1/b4/636b3b65173d3ce9a38ef5f0522789614e590dab6a8d505340a4efe4c567/anyio-4.10.0.tar.gz", hash = "sha256:3f3fae35c96039744587aa5b8371e7e8e603c0702999535961dd336026973ba6", size = 213252, upload-time = "2025-08-04T08:54:26.451Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6f/12/e5e0282d673bb9746bacfb6e2dba8719989d3660cdb2ea79aee9a9651afb/anyio-4.10.0-py3-none-any.whl", hash = "sha256:60e474ac86736bbfd6f210f7a61218939c318f43f9972497381f1c5e930ed3d1", size = 107213, upload-time = "2025-08-04T08:54:24.882Z" }, -] - -[[package]] -name = "argon2-cffi" -version = "23.1.0" -source = { registry = "https://pypi.org/simple" } 
-dependencies = [ - { name = "argon2-cffi-bindings" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/31/fa/57ec2c6d16ecd2ba0cf15f3c7d1c3c2e7b5fcb83555ff56d7ab10888ec8f/argon2_cffi-23.1.0.tar.gz", hash = "sha256:879c3e79a2729ce768ebb7d36d4609e3a78a4ca2ec3a9f12286ca057e3d0db08", size = 42798, upload-time = "2023-08-15T14:13:12.711Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a4/6a/e8a041599e78b6b3752da48000b14c8d1e8a04ded09c88c714ba047f34f5/argon2_cffi-23.1.0-py3-none-any.whl", hash = "sha256:c670642b78ba29641818ab2e68bd4e6a78ba53b7eff7b4c3815ae16abf91c7ea", size = 15124, upload-time = "2023-08-15T14:13:10.752Z" }, -] - -[[package]] -name = "argon2-cffi-bindings" -version = "25.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "cffi" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5c/2d/db8af0df73c1cf454f71b2bbe5e356b8c1f8041c979f505b3d3186e520a9/argon2_cffi_bindings-25.1.0.tar.gz", hash = "sha256:b957f3e6ea4d55d820e40ff76f450952807013d361a65d7f28acc0acbf29229d", size = 1783441, upload-time = "2025-07-30T10:02:05.147Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/60/97/3c0a35f46e52108d4707c44b95cfe2afcafc50800b5450c197454569b776/argon2_cffi_bindings-25.1.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:3d3f05610594151994ca9ccb3c771115bdb4daef161976a266f0dd8aa9996b8f", size = 54393, upload-time = "2025-07-30T10:01:40.97Z" }, - { url = "https://files.pythonhosted.org/packages/9d/f4/98bbd6ee89febd4f212696f13c03ca302b8552e7dbf9c8efa11ea4a388c3/argon2_cffi_bindings-25.1.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:8b8efee945193e667a396cbc7b4fb7d357297d6234d30a489905d96caabde56b", size = 29328, upload-time = "2025-07-30T10:01:41.916Z" }, - { url = "https://files.pythonhosted.org/packages/43/24/90a01c0ef12ac91a6be05969f29944643bc1e5e461155ae6559befa8f00b/argon2_cffi_bindings-25.1.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = 
"sha256:3c6702abc36bf3ccba3f802b799505def420a1b7039862014a65db3205967f5a", size = 31269, upload-time = "2025-07-30T10:01:42.716Z" }, - { url = "https://files.pythonhosted.org/packages/d4/d3/942aa10782b2697eee7af5e12eeff5ebb325ccfb86dd8abda54174e377e4/argon2_cffi_bindings-25.1.0-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a1c70058c6ab1e352304ac7e3b52554daadacd8d453c1752e547c76e9c99ac44", size = 86558, upload-time = "2025-07-30T10:01:43.943Z" }, - { url = "https://files.pythonhosted.org/packages/0d/82/b484f702fec5536e71836fc2dbc8c5267b3f6e78d2d539b4eaa6f0db8bf8/argon2_cffi_bindings-25.1.0-cp314-cp314t-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e2fd3bfbff3c5d74fef31a722f729bf93500910db650c925c2d6ef879a7e51cb", size = 92364, upload-time = "2025-07-30T10:01:44.887Z" }, - { url = "https://files.pythonhosted.org/packages/c9/c1/a606ff83b3f1735f3759ad0f2cd9e038a0ad11a3de3b6c673aa41c24bb7b/argon2_cffi_bindings-25.1.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:c4f9665de60b1b0e99bcd6be4f17d90339698ce954cfd8d9cf4f91c995165a92", size = 85637, upload-time = "2025-07-30T10:01:46.225Z" }, - { url = "https://files.pythonhosted.org/packages/44/b4/678503f12aceb0262f84fa201f6027ed77d71c5019ae03b399b97caa2f19/argon2_cffi_bindings-25.1.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ba92837e4a9aa6a508c8d2d7883ed5a8f6c308c89a4790e1e447a220deb79a85", size = 91934, upload-time = "2025-07-30T10:01:47.203Z" }, - { url = "https://files.pythonhosted.org/packages/f0/c7/f36bd08ef9bd9f0a9cff9428406651f5937ce27b6c5b07b92d41f91ae541/argon2_cffi_bindings-25.1.0-cp314-cp314t-win32.whl", hash = "sha256:84a461d4d84ae1295871329b346a97f68eade8c53b6ed9a7ca2d7467f3c8ff6f", size = 28158, upload-time = "2025-07-30T10:01:48.341Z" }, - { url = "https://files.pythonhosted.org/packages/b3/80/0106a7448abb24a2c467bf7d527fe5413b7fdfa4ad6d6a96a43a62ef3988/argon2_cffi_bindings-25.1.0-cp314-cp314t-win_amd64.whl", hash = 
"sha256:b55aec3565b65f56455eebc9b9f34130440404f27fe21c3b375bf1ea4d8fbae6", size = 32597, upload-time = "2025-07-30T10:01:49.112Z" }, - { url = "https://files.pythonhosted.org/packages/05/b8/d663c9caea07e9180b2cb662772865230715cbd573ba3b5e81793d580316/argon2_cffi_bindings-25.1.0-cp314-cp314t-win_arm64.whl", hash = "sha256:87c33a52407e4c41f3b70a9c2d3f6056d88b10dad7695be708c5021673f55623", size = 28231, upload-time = "2025-07-30T10:01:49.92Z" }, - { url = "https://files.pythonhosted.org/packages/1d/57/96b8b9f93166147826da5f90376e784a10582dd39a393c99bb62cfcf52f0/argon2_cffi_bindings-25.1.0-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:aecba1723ae35330a008418a91ea6cfcedf6d31e5fbaa056a166462ff066d500", size = 54121, upload-time = "2025-07-30T10:01:50.815Z" }, - { url = "https://files.pythonhosted.org/packages/0a/08/a9bebdb2e0e602dde230bdde8021b29f71f7841bd54801bcfd514acb5dcf/argon2_cffi_bindings-25.1.0-cp39-abi3-macosx_10_9_x86_64.whl", hash = "sha256:2630b6240b495dfab90aebe159ff784d08ea999aa4b0d17efa734055a07d2f44", size = 29177, upload-time = "2025-07-30T10:01:51.681Z" }, - { url = "https://files.pythonhosted.org/packages/b6/02/d297943bcacf05e4f2a94ab6f462831dc20158614e5d067c35d4e63b9acb/argon2_cffi_bindings-25.1.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:7aef0c91e2c0fbca6fc68e7555aa60ef7008a739cbe045541e438373bc54d2b0", size = 31090, upload-time = "2025-07-30T10:01:53.184Z" }, - { url = "https://files.pythonhosted.org/packages/c1/93/44365f3d75053e53893ec6d733e4a5e3147502663554b4d864587c7828a7/argon2_cffi_bindings-25.1.0-cp39-abi3-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1e021e87faa76ae0d413b619fe2b65ab9a037f24c60a1e6cc43457ae20de6dc6", size = 81246, upload-time = "2025-07-30T10:01:54.145Z" }, - { url = "https://files.pythonhosted.org/packages/09/52/94108adfdd6e2ddf58be64f959a0b9c7d4ef2fa71086c38356d22dc501ea/argon2_cffi_bindings-25.1.0-cp39-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:d3e924cfc503018a714f94a49a149fdc0b644eaead5d1f089330399134fa028a", size = 87126, upload-time = "2025-07-30T10:01:55.074Z" }, - { url = "https://files.pythonhosted.org/packages/72/70/7a2993a12b0ffa2a9271259b79cc616e2389ed1a4d93842fac5a1f923ffd/argon2_cffi_bindings-25.1.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:c87b72589133f0346a1cb8d5ecca4b933e3c9b64656c9d175270a000e73b288d", size = 80343, upload-time = "2025-07-30T10:01:56.007Z" }, - { url = "https://files.pythonhosted.org/packages/78/9a/4e5157d893ffc712b74dbd868c7f62365618266982b64accab26bab01edc/argon2_cffi_bindings-25.1.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:1db89609c06afa1a214a69a462ea741cf735b29a57530478c06eb81dd403de99", size = 86777, upload-time = "2025-07-30T10:01:56.943Z" }, - { url = "https://files.pythonhosted.org/packages/74/cd/15777dfde1c29d96de7f18edf4cc94c385646852e7c7b0320aa91ccca583/argon2_cffi_bindings-25.1.0-cp39-abi3-win32.whl", hash = "sha256:473bcb5f82924b1becbb637b63303ec8d10e84c8d241119419897a26116515d2", size = 27180, upload-time = "2025-07-30T10:01:57.759Z" }, - { url = "https://files.pythonhosted.org/packages/e2/c6/a759ece8f1829d1f162261226fbfd2c6832b3ff7657384045286d2afa384/argon2_cffi_bindings-25.1.0-cp39-abi3-win_amd64.whl", hash = "sha256:a98cd7d17e9f7ce244c0803cad3c23a7d379c301ba618a5fa76a67d116618b98", size = 31715, upload-time = "2025-07-30T10:01:58.56Z" }, - { url = "https://files.pythonhosted.org/packages/42/b9/f8d6fa329ab25128b7e98fd83a3cb34d9db5b059a9847eddb840a0af45dd/argon2_cffi_bindings-25.1.0-cp39-abi3-win_arm64.whl", hash = "sha256:b0fdbcf513833809c882823f98dc2f931cf659d9a1429616ac3adebb49f5db94", size = 27149, upload-time = "2025-07-30T10:01:59.329Z" }, -] - -[[package]] -name = "attrs" -version = "25.3.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/5a/b0/1367933a8532ee6ff8d63537de4f1177af4bff9f3e829baf7331f595bb24/attrs-25.3.0.tar.gz", hash = 
"sha256:75d7cefc7fb576747b2c81b4442d4d4a1ce0900973527c011d1030fd3bf4af1b", size = 812032, upload-time = "2025-03-13T11:10:22.779Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/77/06/bb80f5f86020c4551da315d78b3ab75e8228f89f0162f2c3a819e407941a/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3", size = 63815, upload-time = "2025-03-13T11:10:21.14Z" }, -] - -[[package]] -name = "authlib" -version = "1.6.4" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "cryptography" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ce/bb/73a1f1c64ee527877f64122422dafe5b87a846ccf4ac933fe21bcbb8fee8/authlib-1.6.4.tar.gz", hash = "sha256:104b0442a43061dc8bc23b133d1d06a2b0a9c2e3e33f34c4338929e816287649", size = 164046, upload-time = "2025-09-17T09:59:23.897Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/0e/aa/91355b5f539caf1b94f0e66ff1e4ee39373b757fce08204981f7829ede51/authlib-1.6.4-py2.py3-none-any.whl", hash = "sha256:39313d2a2caac3ecf6d8f95fbebdfd30ae6ea6ae6a6db794d976405fdd9aa796", size = 243076, upload-time = "2025-09-17T09:59:22.259Z" }, -] - -[[package]] -name = "backoff" -version = "2.2.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/47/d7/5bbeb12c44d7c4f2fb5b56abce497eb5ed9f34d85701de869acedd602619/backoff-2.2.1.tar.gz", hash = "sha256:03f829f5bb1923180821643f8753b0502c3b682293992485b0eef2807afa5cba", size = 17001, upload-time = "2022-10-05T19:19:32.061Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/df/73/b6e24bd22e6720ca8ee9a85a0c4a2971af8497d8f3193fa05390cbd46e09/backoff-2.2.1-py3-none-any.whl", hash = "sha256:63579f9a0628e06278f7e47b7d7d5b6ce20dc65c5e96a6f3ca99a6adca0396e8", size = 15148, upload-time = "2022-10-05T19:19:30.546Z" }, -] - -[[package]] -name = "baml-py" -version = "0.201.0" -source = { registry = "https://pypi.org/simple" } -wheels = 
[ - { url = "https://files.pythonhosted.org/packages/54/54/2b0edb3d22e95ce56f36610391c11108a4ef26ba2837736a32001687ae34/baml_py-0.201.0-cp38-abi3-macosx_10_12_x86_64.whl", hash = "sha256:83228d2af2b0e845bbbb4e14f7cbd3376cec385aee01210ac522ab6076e07bec", size = 17387971, upload-time = "2025-07-03T19:29:05.844Z" }, - { url = "https://files.pythonhosted.org/packages/c9/08/1d48c28c63eadea2c04360cbb7f64968599e99cd6b8fc0ec0bd4424d3cf1/baml_py-0.201.0-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:2a9d016139e3ae5b5ce98c7b05b5fbd53d5d38f04dc810ec4d70fb17dd6c10e4", size = 16191010, upload-time = "2025-07-03T19:29:09.323Z" }, - { url = "https://files.pythonhosted.org/packages/73/1a/20b2d46501e3dd0648af339825106a6ac5eeb5d22d7e6a10cf16b9aa1cb8/baml_py-0.201.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b5058505b1a3c5f04fc1679aec4d730fa9bef2cbd96209b3ed50152f60b96baf", size = 19950249, upload-time = "2025-07-03T19:29:11.974Z" }, - { url = "https://files.pythonhosted.org/packages/38/24/bc871059e905159ae1913c2e3032dd6ef2f5c3d0983999d2c2f1eebb65a4/baml_py-0.201.0-cp38-abi3-manylinux_2_24_aarch64.whl", hash = "sha256:36289d548581ba4accd5eaaab3246872542dd32dc6717e537654fa0cad884071", size = 19231310, upload-time = "2025-07-03T19:29:14.857Z" }, - { url = "https://files.pythonhosted.org/packages/0e/11/4268a0b82b02c7202fe5aa0d7175712158d998c491cac723b2bac3d5d495/baml_py-0.201.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:5ab70e7bd6481d71edca8a33313347b29faccec78b9960138aa437522813ac9a", size = 19490012, upload-time = "2025-07-03T19:29:18.512Z" }, - { url = "https://files.pythonhosted.org/packages/31/21/c9f9aea1adba2a5978ffab11ba0948a9f3f81ec6ed3056067713260e93a1/baml_py-0.201.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:7efc5c693a7142c230a4f3d6700415127fee0b9f5fdbb36db63e04e27ac4c0f1", size = 20090620, upload-time = "2025-07-03T19:29:21.072Z" }, - { url = 
"https://files.pythonhosted.org/packages/99/cf/92123d8d753f1d1473e080c4c182139bfe3b9a6418e891cf1d96b6c33848/baml_py-0.201.0-cp38-abi3-win_amd64.whl", hash = "sha256:56499857b7a27ae61a661c8ce0dddd0fb567a45c0b826157e44048a14cf586f9", size = 17253005, upload-time = "2025-07-03T19:29:23.722Z" }, - { url = "https://files.pythonhosted.org/packages/59/88/5056aa1bc9480f758cd6e210d63bd1f9ad90b44c87f4121285906526495e/baml_py-0.201.0-cp38-abi3-win_arm64.whl", hash = "sha256:1e52dc1151db84a302b746590fe2bc484bdd794f83fa5da7216d9394c559f33a", size = 15612701, upload-time = "2025-07-03T19:29:26.712Z" }, -] - -[[package]] -name = "bcrypt" -version = "4.3.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/bb/5d/6d7433e0f3cd46ce0b43cd65e1db465ea024dbb8216fb2404e919c2ad77b/bcrypt-4.3.0.tar.gz", hash = "sha256:3a3fd2204178b6d2adcf09cb4f6426ffef54762577a7c9b54c159008cb288c18", size = 25697, upload-time = "2025-02-28T01:24:09.174Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/bf/2c/3d44e853d1fe969d229bd58d39ae6902b3d924af0e2b5a60d17d4b809ded/bcrypt-4.3.0-cp313-cp313t-macosx_10_12_universal2.whl", hash = "sha256:f01e060f14b6b57bbb72fc5b4a83ac21c443c9a2ee708e04a10e9192f90a6281", size = 483719, upload-time = "2025-02-28T01:22:34.539Z" }, - { url = "https://files.pythonhosted.org/packages/a1/e2/58ff6e2a22eca2e2cff5370ae56dba29d70b1ea6fc08ee9115c3ae367795/bcrypt-4.3.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c5eeac541cefd0bb887a371ef73c62c3cd78535e4887b310626036a7c0a817bb", size = 272001, upload-time = "2025-02-28T01:22:38.078Z" }, - { url = "https://files.pythonhosted.org/packages/37/1f/c55ed8dbe994b1d088309e366749633c9eb90d139af3c0a50c102ba68a1a/bcrypt-4.3.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:59e1aa0e2cd871b08ca146ed08445038f42ff75968c7ae50d2fdd7860ade2180", size = 277451, upload-time = "2025-02-28T01:22:40.787Z" }, - { url = 
"https://files.pythonhosted.org/packages/d7/1c/794feb2ecf22fe73dcfb697ea7057f632061faceb7dcf0f155f3443b4d79/bcrypt-4.3.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:0042b2e342e9ae3d2ed22727c1262f76cc4f345683b5c1715f0250cf4277294f", size = 272792, upload-time = "2025-02-28T01:22:43.144Z" }, - { url = "https://files.pythonhosted.org/packages/13/b7/0b289506a3f3598c2ae2bdfa0ea66969812ed200264e3f61df77753eee6d/bcrypt-4.3.0-cp313-cp313t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74a8d21a09f5e025a9a23e7c0fd2c7fe8e7503e4d356c0a2c1486ba010619f09", size = 289752, upload-time = "2025-02-28T01:22:45.56Z" }, - { url = "https://files.pythonhosted.org/packages/dc/24/d0fb023788afe9e83cc118895a9f6c57e1044e7e1672f045e46733421fe6/bcrypt-4.3.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:0142b2cb84a009f8452c8c5a33ace5e3dfec4159e7735f5afe9a4d50a8ea722d", size = 277762, upload-time = "2025-02-28T01:22:47.023Z" }, - { url = "https://files.pythonhosted.org/packages/e4/38/cde58089492e55ac4ef6c49fea7027600c84fd23f7520c62118c03b4625e/bcrypt-4.3.0-cp313-cp313t-manylinux_2_34_aarch64.whl", hash = "sha256:12fa6ce40cde3f0b899729dbd7d5e8811cb892d31b6f7d0334a1f37748b789fd", size = 272384, upload-time = "2025-02-28T01:22:49.221Z" }, - { url = "https://files.pythonhosted.org/packages/de/6a/d5026520843490cfc8135d03012a413e4532a400e471e6188b01b2de853f/bcrypt-4.3.0-cp313-cp313t-manylinux_2_34_x86_64.whl", hash = "sha256:5bd3cca1f2aa5dbcf39e2aa13dd094ea181f48959e1071265de49cc2b82525af", size = 277329, upload-time = "2025-02-28T01:22:51.603Z" }, - { url = "https://files.pythonhosted.org/packages/b3/a3/4fc5255e60486466c389e28c12579d2829b28a527360e9430b4041df4cf9/bcrypt-4.3.0-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:335a420cfd63fc5bc27308e929bee231c15c85cc4c496610ffb17923abf7f231", size = 305241, upload-time = "2025-02-28T01:22:53.283Z" }, - { url = 
"https://files.pythonhosted.org/packages/c7/15/2b37bc07d6ce27cc94e5b10fd5058900eb8fb11642300e932c8c82e25c4a/bcrypt-4.3.0-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:0e30e5e67aed0187a1764911af023043b4542e70a7461ad20e837e94d23e1d6c", size = 309617, upload-time = "2025-02-28T01:22:55.461Z" }, - { url = "https://files.pythonhosted.org/packages/5f/1f/99f65edb09e6c935232ba0430c8c13bb98cb3194b6d636e61d93fe60ac59/bcrypt-4.3.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:3b8d62290ebefd49ee0b3ce7500f5dbdcf13b81402c05f6dafab9a1e1b27212f", size = 335751, upload-time = "2025-02-28T01:22:57.81Z" }, - { url = "https://files.pythonhosted.org/packages/00/1b/b324030c706711c99769988fcb694b3cb23f247ad39a7823a78e361bdbb8/bcrypt-4.3.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:2ef6630e0ec01376f59a006dc72918b1bf436c3b571b80fa1968d775fa02fe7d", size = 355965, upload-time = "2025-02-28T01:22:59.181Z" }, - { url = "https://files.pythonhosted.org/packages/aa/dd/20372a0579dd915dfc3b1cd4943b3bca431866fcb1dfdfd7518c3caddea6/bcrypt-4.3.0-cp313-cp313t-win32.whl", hash = "sha256:7a4be4cbf241afee43f1c3969b9103a41b40bcb3a3f467ab19f891d9bc4642e4", size = 155316, upload-time = "2025-02-28T01:23:00.763Z" }, - { url = "https://files.pythonhosted.org/packages/6d/52/45d969fcff6b5577c2bf17098dc36269b4c02197d551371c023130c0f890/bcrypt-4.3.0-cp313-cp313t-win_amd64.whl", hash = "sha256:5c1949bf259a388863ced887c7861da1df681cb2388645766c89fdfd9004c669", size = 147752, upload-time = "2025-02-28T01:23:02.908Z" }, - { url = "https://files.pythonhosted.org/packages/11/22/5ada0b9af72b60cbc4c9a399fdde4af0feaa609d27eb0adc61607997a3fa/bcrypt-4.3.0-cp38-abi3-macosx_10_12_universal2.whl", hash = "sha256:f81b0ed2639568bf14749112298f9e4e2b28853dab50a8b357e31798686a036d", size = 498019, upload-time = "2025-02-28T01:23:05.838Z" }, - { url = 
"https://files.pythonhosted.org/packages/b8/8c/252a1edc598dc1ce57905be173328eda073083826955ee3c97c7ff5ba584/bcrypt-4.3.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:864f8f19adbe13b7de11ba15d85d4a428c7e2f344bac110f667676a0ff84924b", size = 279174, upload-time = "2025-02-28T01:23:07.274Z" }, - { url = "https://files.pythonhosted.org/packages/29/5b/4547d5c49b85f0337c13929f2ccbe08b7283069eea3550a457914fc078aa/bcrypt-4.3.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3e36506d001e93bffe59754397572f21bb5dc7c83f54454c990c74a468cd589e", size = 283870, upload-time = "2025-02-28T01:23:09.151Z" }, - { url = "https://files.pythonhosted.org/packages/be/21/7dbaf3fa1745cb63f776bb046e481fbababd7d344c5324eab47f5ca92dd2/bcrypt-4.3.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:842d08d75d9fe9fb94b18b071090220697f9f184d4547179b60734846461ed59", size = 279601, upload-time = "2025-02-28T01:23:11.461Z" }, - { url = "https://files.pythonhosted.org/packages/6d/64/e042fc8262e971347d9230d9abbe70d68b0a549acd8611c83cebd3eaec67/bcrypt-4.3.0-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7c03296b85cb87db865d91da79bf63d5609284fc0cab9472fdd8367bbd830753", size = 297660, upload-time = "2025-02-28T01:23:12.989Z" }, - { url = "https://files.pythonhosted.org/packages/50/b8/6294eb84a3fef3b67c69b4470fcdd5326676806bf2519cda79331ab3c3a9/bcrypt-4.3.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:62f26585e8b219cdc909b6a0069efc5e4267e25d4a3770a364ac58024f62a761", size = 284083, upload-time = "2025-02-28T01:23:14.5Z" }, - { url = "https://files.pythonhosted.org/packages/62/e6/baff635a4f2c42e8788fe1b1633911c38551ecca9a749d1052d296329da6/bcrypt-4.3.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:beeefe437218a65322fbd0069eb437e7c98137e08f22c4660ac2dc795c31f8bb", size = 279237, upload-time = "2025-02-28T01:23:16.686Z" }, - { url = 
"https://files.pythonhosted.org/packages/39/48/46f623f1b0c7dc2e5de0b8af5e6f5ac4cc26408ac33f3d424e5ad8da4a90/bcrypt-4.3.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:97eea7408db3a5bcce4a55d13245ab3fa566e23b4c67cd227062bb49e26c585d", size = 283737, upload-time = "2025-02-28T01:23:18.897Z" }, - { url = "https://files.pythonhosted.org/packages/49/8b/70671c3ce9c0fca4a6cc3cc6ccbaa7e948875a2e62cbd146e04a4011899c/bcrypt-4.3.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:191354ebfe305e84f344c5964c7cd5f924a3bfc5d405c75ad07f232b6dffb49f", size = 312741, upload-time = "2025-02-28T01:23:21.041Z" }, - { url = "https://files.pythonhosted.org/packages/27/fb/910d3a1caa2d249b6040a5caf9f9866c52114d51523ac2fb47578a27faee/bcrypt-4.3.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:41261d64150858eeb5ff43c753c4b216991e0ae16614a308a15d909503617732", size = 316472, upload-time = "2025-02-28T01:23:23.183Z" }, - { url = "https://files.pythonhosted.org/packages/dc/cf/7cf3a05b66ce466cfb575dbbda39718d45a609daa78500f57fa9f36fa3c0/bcrypt-4.3.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:33752b1ba962ee793fa2b6321404bf20011fe45b9afd2a842139de3011898fef", size = 343606, upload-time = "2025-02-28T01:23:25.361Z" }, - { url = "https://files.pythonhosted.org/packages/e3/b8/e970ecc6d7e355c0d892b7f733480f4aa8509f99b33e71550242cf0b7e63/bcrypt-4.3.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:50e6e80a4bfd23a25f5c05b90167c19030cf9f87930f7cb2eacb99f45d1c3304", size = 362867, upload-time = "2025-02-28T01:23:26.875Z" }, - { url = "https://files.pythonhosted.org/packages/a9/97/8d3118efd8354c555a3422d544163f40d9f236be5b96c714086463f11699/bcrypt-4.3.0-cp38-abi3-win32.whl", hash = "sha256:67a561c4d9fb9465ec866177e7aebcad08fe23aaf6fbd692a6fab69088abfc51", size = 160589, upload-time = "2025-02-28T01:23:28.381Z" }, - { url = "https://files.pythonhosted.org/packages/29/07/416f0b99f7f3997c69815365babbc2e8754181a4b1899d921b3c7d5b6f12/bcrypt-4.3.0-cp38-abi3-win_amd64.whl", hash = 
"sha256:584027857bc2843772114717a7490a37f68da563b3620f78a849bcb54dc11e62", size = 152794, upload-time = "2025-02-28T01:23:30.187Z" }, - { url = "https://files.pythonhosted.org/packages/6e/c1/3fa0e9e4e0bfd3fd77eb8b52ec198fd6e1fd7e9402052e43f23483f956dd/bcrypt-4.3.0-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:0d3efb1157edebfd9128e4e46e2ac1a64e0c1fe46fb023158a407c7892b0f8c3", size = 498969, upload-time = "2025-02-28T01:23:31.945Z" }, - { url = "https://files.pythonhosted.org/packages/ce/d4/755ce19b6743394787fbd7dff6bf271b27ee9b5912a97242e3caf125885b/bcrypt-4.3.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:08bacc884fd302b611226c01014eca277d48f0a05187666bca23aac0dad6fe24", size = 279158, upload-time = "2025-02-28T01:23:34.161Z" }, - { url = "https://files.pythonhosted.org/packages/9b/5d/805ef1a749c965c46b28285dfb5cd272a7ed9fa971f970435a5133250182/bcrypt-4.3.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f6746e6fec103fcd509b96bacdfdaa2fbde9a553245dbada284435173a6f1aef", size = 284285, upload-time = "2025-02-28T01:23:35.765Z" }, - { url = "https://files.pythonhosted.org/packages/ab/2b/698580547a4a4988e415721b71eb45e80c879f0fb04a62da131f45987b96/bcrypt-4.3.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:afe327968aaf13fc143a56a3360cb27d4ad0345e34da12c7290f1b00b8fe9a8b", size = 279583, upload-time = "2025-02-28T01:23:38.021Z" }, - { url = "https://files.pythonhosted.org/packages/f2/87/62e1e426418204db520f955ffd06f1efd389feca893dad7095bf35612eec/bcrypt-4.3.0-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d9af79d322e735b1fc33404b5765108ae0ff232d4b54666d46730f8ac1a43676", size = 297896, upload-time = "2025-02-28T01:23:39.575Z" }, - { url = "https://files.pythonhosted.org/packages/cb/c6/8fedca4c2ada1b6e889c52d2943b2f968d3427e5d65f595620ec4c06fa2f/bcrypt-4.3.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = 
"sha256:f1e3ffa1365e8702dc48c8b360fef8d7afeca482809c5e45e653af82ccd088c1", size = 284492, upload-time = "2025-02-28T01:23:40.901Z" }, - { url = "https://files.pythonhosted.org/packages/4d/4d/c43332dcaaddb7710a8ff5269fcccba97ed3c85987ddaa808db084267b9a/bcrypt-4.3.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:3004df1b323d10021fda07a813fd33e0fd57bef0e9a480bb143877f6cba996fe", size = 279213, upload-time = "2025-02-28T01:23:42.653Z" }, - { url = "https://files.pythonhosted.org/packages/dc/7f/1e36379e169a7df3a14a1c160a49b7b918600a6008de43ff20d479e6f4b5/bcrypt-4.3.0-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:531457e5c839d8caea9b589a1bcfe3756b0547d7814e9ce3d437f17da75c32b0", size = 284162, upload-time = "2025-02-28T01:23:43.964Z" }, - { url = "https://files.pythonhosted.org/packages/1c/0a/644b2731194b0d7646f3210dc4d80c7fee3ecb3a1f791a6e0ae6bb8684e3/bcrypt-4.3.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:17a854d9a7a476a89dcef6c8bd119ad23e0f82557afbd2c442777a16408e614f", size = 312856, upload-time = "2025-02-28T01:23:46.011Z" }, - { url = "https://files.pythonhosted.org/packages/dc/62/2a871837c0bb6ab0c9a88bf54de0fc021a6a08832d4ea313ed92a669d437/bcrypt-4.3.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:6fb1fd3ab08c0cbc6826a2e0447610c6f09e983a281b919ed721ad32236b8b23", size = 316726, upload-time = "2025-02-28T01:23:47.575Z" }, - { url = "https://files.pythonhosted.org/packages/0c/a1/9898ea3faac0b156d457fd73a3cb9c2855c6fd063e44b8522925cdd8ce46/bcrypt-4.3.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:e965a9c1e9a393b8005031ff52583cedc15b7884fce7deb8b0346388837d6cfe", size = 343664, upload-time = "2025-02-28T01:23:49.059Z" }, - { url = "https://files.pythonhosted.org/packages/40/f2/71b4ed65ce38982ecdda0ff20c3ad1b15e71949c78b2c053df53629ce940/bcrypt-4.3.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:79e70b8342a33b52b55d93b3a59223a844962bef479f6a0ea318ebbcadf71505", size = 363128, upload-time = "2025-02-28T01:23:50.399Z" }, - { 
url = "https://files.pythonhosted.org/packages/11/99/12f6a58eca6dea4be992d6c681b7ec9410a1d9f5cf368c61437e31daa879/bcrypt-4.3.0-cp39-abi3-win32.whl", hash = "sha256:b4d4e57f0a63fd0b358eb765063ff661328f69a04494427265950c71b992a39a", size = 160598, upload-time = "2025-02-28T01:23:51.775Z" }, - { url = "https://files.pythonhosted.org/packages/a9/cf/45fb5261ece3e6b9817d3d82b2f343a505fd58674a92577923bc500bd1aa/bcrypt-4.3.0-cp39-abi3-win_amd64.whl", hash = "sha256:e53e074b120f2877a35cc6c736b8eb161377caae8925c17688bd46ba56daaa5b", size = 152799, upload-time = "2025-02-28T01:23:53.139Z" }, - { url = "https://files.pythonhosted.org/packages/4c/b1/1289e21d710496b88340369137cc4c5f6ee036401190ea116a7b4ae6d32a/bcrypt-4.3.0-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a839320bf27d474e52ef8cb16449bb2ce0ba03ca9f44daba6d93fa1d8828e48a", size = 275103, upload-time = "2025-02-28T01:24:00.764Z" }, - { url = "https://files.pythonhosted.org/packages/94/41/19be9fe17e4ffc5d10b7b67f10e459fc4eee6ffe9056a88de511920cfd8d/bcrypt-4.3.0-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:bdc6a24e754a555d7316fa4774e64c6c3997d27ed2d1964d55920c7c227bc4ce", size = 280513, upload-time = "2025-02-28T01:24:02.243Z" }, - { url = "https://files.pythonhosted.org/packages/aa/73/05687a9ef89edebdd8ad7474c16d8af685eb4591c3c38300bb6aad4f0076/bcrypt-4.3.0-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:55a935b8e9a1d2def0626c4269db3fcd26728cbff1e84f0341465c31c4ee56d8", size = 274685, upload-time = "2025-02-28T01:24:04.512Z" }, - { url = "https://files.pythonhosted.org/packages/63/13/47bba97924ebe86a62ef83dc75b7c8a881d53c535f83e2c54c4bd701e05c/bcrypt-4.3.0-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:57967b7a28d855313a963aaea51bf6df89f833db4320da458e5b3c5ab6d4c938", size = 280110, upload-time = "2025-02-28T01:24:05.896Z" }, -] - -[[package]] -name = "black" -version = "25.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = 
"click" }, - { name = "mypy-extensions" }, - { name = "packaging" }, - { name = "pathspec" }, - { name = "platformdirs" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/94/49/26a7b0f3f35da4b5a65f081943b7bcd22d7002f5f0fb8098ec1ff21cb6ef/black-25.1.0.tar.gz", hash = "sha256:33496d5cd1222ad73391352b4ae8da15253c5de89b93a80b3e2c8d9a19ec2666", size = 649449, upload-time = "2025-01-29T04:15:40.373Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7e/4f/87f596aca05c3ce5b94b8663dbfe242a12843caaa82dd3f85f1ffdc3f177/black-25.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a39337598244de4bae26475f77dda852ea00a93bd4c728e09eacd827ec929df0", size = 1614372, upload-time = "2025-01-29T05:37:11.71Z" }, - { url = "https://files.pythonhosted.org/packages/e7/d0/2c34c36190b741c59c901e56ab7f6e54dad8df05a6272a9747ecef7c6036/black-25.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96c1c7cd856bba8e20094e36e0f948718dc688dba4a9d78c3adde52b9e6c2299", size = 1442865, upload-time = "2025-01-29T05:37:14.309Z" }, - { url = "https://files.pythonhosted.org/packages/21/d4/7518c72262468430ead45cf22bd86c883a6448b9eb43672765d69a8f1248/black-25.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce2e264d59c91e52d8000d507eb20a9aca4a778731a08cfff7e5ac4a4bb7096", size = 1749699, upload-time = "2025-01-29T04:18:17.688Z" }, - { url = "https://files.pythonhosted.org/packages/58/db/4f5beb989b547f79096e035c4981ceb36ac2b552d0ac5f2620e941501c99/black-25.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:172b1dbff09f86ce6f4eb8edf9dede08b1fce58ba194c87d7a4f1a5aa2f5b3c2", size = 1428028, upload-time = "2025-01-29T04:18:51.711Z" }, - { url = "https://files.pythonhosted.org/packages/83/71/3fe4741df7adf015ad8dfa082dd36c94ca86bb21f25608eb247b4afb15b2/black-25.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4b60580e829091e6f9238c848ea6750efed72140b91b048770b64e74fe04908b", size = 1650988, upload-time = 
"2025-01-29T05:37:16.707Z" }, - { url = "https://files.pythonhosted.org/packages/13/f3/89aac8a83d73937ccd39bbe8fc6ac8860c11cfa0af5b1c96d081facac844/black-25.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e2978f6df243b155ef5fa7e558a43037c3079093ed5d10fd84c43900f2d8ecc", size = 1453985, upload-time = "2025-01-29T05:37:18.273Z" }, - { url = "https://files.pythonhosted.org/packages/6f/22/b99efca33f1f3a1d2552c714b1e1b5ae92efac6c43e790ad539a163d1754/black-25.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b48735872ec535027d979e8dcb20bf4f70b5ac75a8ea99f127c106a7d7aba9f", size = 1783816, upload-time = "2025-01-29T04:18:33.823Z" }, - { url = "https://files.pythonhosted.org/packages/18/7e/a27c3ad3822b6f2e0e00d63d58ff6299a99a5b3aee69fa77cd4b0076b261/black-25.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:ea0213189960bda9cf99be5b8c8ce66bb054af5e9e861249cd23471bd7b0b3ba", size = 1440860, upload-time = "2025-01-29T04:19:12.944Z" }, - { url = "https://files.pythonhosted.org/packages/98/87/0edf98916640efa5d0696e1abb0a8357b52e69e82322628f25bf14d263d1/black-25.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8f0b18a02996a836cc9c9c78e5babec10930862827b1b724ddfe98ccf2f2fe4f", size = 1650673, upload-time = "2025-01-29T05:37:20.574Z" }, - { url = "https://files.pythonhosted.org/packages/52/e5/f7bf17207cf87fa6e9b676576749c6b6ed0d70f179a3d812c997870291c3/black-25.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:afebb7098bfbc70037a053b91ae8437c3857482d3a690fefc03e9ff7aa9a5fd3", size = 1453190, upload-time = "2025-01-29T05:37:22.106Z" }, - { url = "https://files.pythonhosted.org/packages/e3/ee/adda3d46d4a9120772fae6de454c8495603c37c4c3b9c60f25b1ab6401fe/black-25.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:030b9759066a4ee5e5aca28c3c77f9c64789cdd4de8ac1df642c40b708be6171", size = 1782926, upload-time = "2025-01-29T04:18:58.564Z" }, - { url = 
"https://files.pythonhosted.org/packages/cc/64/94eb5f45dcb997d2082f097a3944cfc7fe87e071907f677e80788a2d7b7a/black-25.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:a22f402b410566e2d1c950708c77ebf5ebd5d0d88a6a2e87c86d9fb48afa0d18", size = 1442613, upload-time = "2025-01-29T04:19:27.63Z" }, - { url = "https://files.pythonhosted.org/packages/09/71/54e999902aed72baf26bca0d50781b01838251a462612966e9fc4891eadd/black-25.1.0-py3-none-any.whl", hash = "sha256:95e8176dae143ba9097f351d174fdaf0ccd29efb414b362ae3fd72bf0f710717", size = 207646, upload-time = "2025-01-29T04:15:38.082Z" }, -] - -[[package]] -name = "boto3" -version = "1.40.18" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "botocore" }, - { name = "jmespath" }, - { name = "s3transfer" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/36/35/a30dc21ca6582358e0ce963f38e85d42ea619f12e7be4101a834c21d749d/boto3-1.40.18.tar.gz", hash = "sha256:64301d39adecc154e3e595eaf0d4f28998ef0a5551f1d033aeac51a9e1a688e5", size = 111994, upload-time = "2025-08-26T19:21:38.61Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ad/b5/3fc1802eb24aef135c3ba69fff2a9bfcc6a7a8258fb396706b1a6a44de36/boto3-1.40.18-py3-none-any.whl", hash = "sha256:daa776ba1251a7458c9d6c7627873d0c2460c8e8272d35759065580e9193700a", size = 140076, upload-time = "2025-08-26T19:21:36.484Z" }, -] - -[[package]] -name = "botocore" -version = "1.40.18" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "jmespath" }, - { name = "python-dateutil" }, - { name = "urllib3" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/6a/91/2e745382793fa7d30810a7d5ca3e05f6817b6db07601ca5aaab12720caf9/botocore-1.40.18.tar.gz", hash = "sha256:afd69bdadd8c55cc89d69de0799829e555193a352d87867f746e19020271cc0f", size = 14375007, upload-time = "2025-08-26T19:21:24.996Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/1a/f5/bd57bf21fdcc4e500cc406ed2c296e626ddd160f0fee2a4932256e5d62d8/botocore-1.40.18-py3-none-any.whl", hash = "sha256:57025c46ca00cf8cec25de07a759521bfbfb3036a0f69b272654a354615dc45f", size = 14039935, upload-time = "2025-08-26T19:21:19.085Z" }, -] - -[[package]] -name = "cachetools" -version = "5.5.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/6c/81/3747dad6b14fa2cf53fcf10548cf5aea6913e96fab41a3c198676f8948a5/cachetools-5.5.2.tar.gz", hash = "sha256:1a661caa9175d26759571b2e19580f9d6393969e5dfca11fdb1f947a23e640d4", size = 28380, upload-time = "2025-02-20T21:01:19.524Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/72/76/20fa66124dbe6be5cafeb312ece67de6b61dd91a0247d1ea13db4ebb33c2/cachetools-5.5.2-py3-none-any.whl", hash = "sha256:d26a22bcc62eb95c3beabd9f1ee5e820d3d2704fe2967cbe350e20c8ffcd3f0a", size = 10080, upload-time = "2025-02-20T21:01:16.647Z" }, -] - -[[package]] -name = "certifi" -version = "2025.8.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/dc/67/960ebe6bf230a96cda2e0abcf73af550ec4f090005363542f0765df162e0/certifi-2025.8.3.tar.gz", hash = "sha256:e564105f78ded564e3ae7c923924435e1daa7463faeab5bb932bc53ffae63407", size = 162386, upload-time = "2025-08-03T03:07:47.08Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e5/48/1549795ba7742c948d2ad169c1c8cdbae65bc450d6cd753d124b17c8cd32/certifi-2025.8.3-py3-none-any.whl", hash = "sha256:f6c12493cfb1b06ba2ff328595af9350c65d6644968e5d3a2ffd78699af217a5", size = 161216, upload-time = "2025-08-03T03:07:45.777Z" }, -] - -[[package]] -name = "cffi" -version = "2.0.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pycparser", marker = "implementation_name != 'PyPy'" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/12/4a/3dfd5f7850cbf0d06dc84ba9aa00db766b52ca38d8b86e3a38314d52498c/cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe", size = 184344, upload-time = "2025-09-08T23:22:26.456Z" }, - { url = "https://files.pythonhosted.org/packages/4f/8b/f0e4c441227ba756aafbe78f117485b25bb26b1c059d01f137fa6d14896b/cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c", size = 180560, upload-time = "2025-09-08T23:22:28.197Z" }, - { url = "https://files.pythonhosted.org/packages/b1/b7/1200d354378ef52ec227395d95c2576330fd22a869f7a70e88e1447eb234/cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92", size = 209613, upload-time = "2025-09-08T23:22:29.475Z" }, - { url = "https://files.pythonhosted.org/packages/b8/56/6033f5e86e8cc9bb629f0077ba71679508bdf54a9a5e112a3c0b91870332/cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93", size = 216476, upload-time = "2025-09-08T23:22:31.063Z" }, - { url = "https://files.pythonhosted.org/packages/dc/7f/55fecd70f7ece178db2f26128ec41430d8720f2d12ca97bf8f0a628207d5/cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5", size = 203374, upload-time = "2025-09-08T23:22:32.507Z" }, - { url = 
"https://files.pythonhosted.org/packages/84/ef/a7b77c8bdc0f77adc3b46888f1ad54be8f3b7821697a7b89126e829e676a/cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664", size = 202597, upload-time = "2025-09-08T23:22:34.132Z" }, - { url = "https://files.pythonhosted.org/packages/d7/91/500d892b2bf36529a75b77958edfcd5ad8e2ce4064ce2ecfeab2125d72d1/cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26", size = 215574, upload-time = "2025-09-08T23:22:35.443Z" }, - { url = "https://files.pythonhosted.org/packages/44/64/58f6255b62b101093d5df22dcb752596066c7e89dd725e0afaed242a61be/cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9", size = 218971, upload-time = "2025-09-08T23:22:36.805Z" }, - { url = "https://files.pythonhosted.org/packages/ab/49/fa72cebe2fd8a55fbe14956f9970fe8eb1ac59e5df042f603ef7c8ba0adc/cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414", size = 211972, upload-time = "2025-09-08T23:22:38.436Z" }, - { url = "https://files.pythonhosted.org/packages/0b/28/dd0967a76aab36731b6ebfe64dec4e981aff7e0608f60c2d46b46982607d/cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743", size = 217078, upload-time = "2025-09-08T23:22:39.776Z" }, - { url = "https://files.pythonhosted.org/packages/2b/c0/015b25184413d7ab0a410775fdb4a50fca20f5589b5dab1dbbfa3baad8ce/cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5", size = 172076, upload-time = "2025-09-08T23:22:40.95Z" }, - { url = 
"https://files.pythonhosted.org/packages/ae/8f/dc5531155e7070361eb1b7e4c1a9d896d0cb21c49f807a6c03fd63fc877e/cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5", size = 182820, upload-time = "2025-09-08T23:22:42.463Z" }, - { url = "https://files.pythonhosted.org/packages/95/5c/1b493356429f9aecfd56bc171285a4c4ac8697f76e9bbbbb105e537853a1/cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d", size = 177635, upload-time = "2025-09-08T23:22:43.623Z" }, - { url = "https://files.pythonhosted.org/packages/ea/47/4f61023ea636104d4f16ab488e268b93008c3d0bb76893b1b31db1f96802/cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d", size = 185271, upload-time = "2025-09-08T23:22:44.795Z" }, - { url = "https://files.pythonhosted.org/packages/df/a2/781b623f57358e360d62cdd7a8c681f074a71d445418a776eef0aadb4ab4/cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c", size = 181048, upload-time = "2025-09-08T23:22:45.938Z" }, - { url = "https://files.pythonhosted.org/packages/ff/df/a4f0fbd47331ceeba3d37c2e51e9dfc9722498becbeec2bd8bc856c9538a/cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe", size = 212529, upload-time = "2025-09-08T23:22:47.349Z" }, - { url = "https://files.pythonhosted.org/packages/d5/72/12b5f8d3865bf0f87cf1404d8c374e7487dcf097a1c91c436e72e6badd83/cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062", size = 220097, upload-time = "2025-09-08T23:22:48.677Z" }, - { url = 
"https://files.pythonhosted.org/packages/c2/95/7a135d52a50dfa7c882ab0ac17e8dc11cec9d55d2c18dda414c051c5e69e/cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e", size = 207983, upload-time = "2025-09-08T23:22:50.06Z" }, - { url = "https://files.pythonhosted.org/packages/3a/c8/15cb9ada8895957ea171c62dc78ff3e99159ee7adb13c0123c001a2546c1/cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037", size = 206519, upload-time = "2025-09-08T23:22:51.364Z" }, - { url = "https://files.pythonhosted.org/packages/78/2d/7fa73dfa841b5ac06c7b8855cfc18622132e365f5b81d02230333ff26e9e/cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba", size = 219572, upload-time = "2025-09-08T23:22:52.902Z" }, - { url = "https://files.pythonhosted.org/packages/07/e0/267e57e387b4ca276b90f0434ff88b2c2241ad72b16d31836adddfd6031b/cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94", size = 222963, upload-time = "2025-09-08T23:22:54.518Z" }, - { url = "https://files.pythonhosted.org/packages/b6/75/1f2747525e06f53efbd878f4d03bac5b859cbc11c633d0fb81432d98a795/cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187", size = 221361, upload-time = "2025-09-08T23:22:55.867Z" }, - { url = "https://files.pythonhosted.org/packages/7b/2b/2b6435f76bfeb6bbf055596976da087377ede68df465419d192acf00c437/cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18", size = 172932, upload-time = "2025-09-08T23:22:57.188Z" }, - { url = 
"https://files.pythonhosted.org/packages/f8/ed/13bd4418627013bec4ed6e54283b1959cf6db888048c7cf4b4c3b5b36002/cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5", size = 183557, upload-time = "2025-09-08T23:22:58.351Z" }, - { url = "https://files.pythonhosted.org/packages/95/31/9f7f93ad2f8eff1dbc1c3656d7ca5bfd8fb52c9d786b4dcf19b2d02217fa/cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6", size = 177762, upload-time = "2025-09-08T23:22:59.668Z" }, - { url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" }, - { url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" }, - { url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload-time = "2025-09-08T23:23:03.472Z" }, - { url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" }, - { url = 
"https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" }, - { url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" }, - { url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" }, - { url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" }, - { url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" }, - { url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" }, - { url = 
"https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" }, - { url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" }, - { url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" }, - { url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" }, - { url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" }, - { url = "https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" }, - { url = 
"https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" }, - { url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" }, - { url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" }, - { url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" }, - { url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload-time = "2025-09-08T23:23:44.61Z" }, - { url = "https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload-time = "2025-09-08T23:23:45.848Z" }, - { url = 
"https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload-time = "2025-09-08T23:23:47.105Z" }, - { url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" }, - { url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" }, - { url = "https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" }, - { url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" }, - { url = "https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" }, - { url = 
"https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" }, - { url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" }, - { url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" }, - { url = "https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload-time = "2025-09-08T23:23:40.423Z" }, - { url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload-time = "2025-09-08T23:23:41.742Z" }, - { url = "https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload-time = "2025-09-08T23:23:43.004Z" }, -] - -[[package]] -name = "cfgv" -version = "3.4.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/11/74/539e56497d9bd1d484fd863dd69cbbfa653cd2aa27abfe35653494d85e94/cfgv-3.4.0.tar.gz", hash = "sha256:e52591d4c5f5dead8e0f673fb16db7949d2cfb3f7da4582893288f0ded8fe560", size = 7114, upload-time = "2023-08-12T20:38:17.776Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c5/55/51844dd50c4fc7a33b653bfaba4c2456f06955289ca770a5dbd5fd267374/cfgv-3.4.0-py2.py3-none-any.whl", hash = "sha256:b7265b1f29fd3316bfcd2b330d63d024f2bfd8bcb8b0272f8e19a504856c48f9", size = 7249, upload-time = "2023-08-12T20:38:16.269Z" }, -] - -[[package]] -name = "charset-normalizer" -version = "3.4.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/83/2d/5fd176ceb9b2fc619e63405525573493ca23441330fcdaee6bef9460e924/charset_normalizer-3.4.3.tar.gz", hash = "sha256:6fce4b8500244f6fcb71465d4a4930d132ba9ab8e71a7859e6a5d59851068d14", size = 122371, upload-time = "2025-08-09T07:57:28.46Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7f/b5/991245018615474a60965a7c9cd2b4efbaabd16d582a5547c47ee1c7730b/charset_normalizer-3.4.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:b256ee2e749283ef3ddcff51a675ff43798d92d746d1a6e4631bf8c707d22d0b", size = 204483, upload-time = "2025-08-09T07:55:53.12Z" }, - { url = "https://files.pythonhosted.org/packages/c7/2a/ae245c41c06299ec18262825c1569c5d3298fc920e4ddf56ab011b417efd/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:13faeacfe61784e2559e690fc53fa4c5ae97c6fcedb8eb6fb8d0a15b475d2c64", size = 145520, upload-time = "2025-08-09T07:55:54.712Z" }, - { url = "https://files.pythonhosted.org/packages/3a/a4/b3b6c76e7a635748c4421d2b92c7b8f90a432f98bda5082049af37ffc8e3/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = 
"sha256:00237675befef519d9af72169d8604a067d92755e84fe76492fef5441db05b91", size = 158876, upload-time = "2025-08-09T07:55:56.024Z" }, - { url = "https://files.pythonhosted.org/packages/e2/e6/63bb0e10f90a8243c5def74b5b105b3bbbfb3e7bb753915fe333fb0c11ea/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:585f3b2a80fbd26b048a0be90c5aae8f06605d3c92615911c3a2b03a8a3b796f", size = 156083, upload-time = "2025-08-09T07:55:57.582Z" }, - { url = "https://files.pythonhosted.org/packages/87/df/b7737ff046c974b183ea9aa111b74185ac8c3a326c6262d413bd5a1b8c69/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0e78314bdc32fa80696f72fa16dc61168fda4d6a0c014e0380f9d02f0e5d8a07", size = 150295, upload-time = "2025-08-09T07:55:59.147Z" }, - { url = "https://files.pythonhosted.org/packages/61/f1/190d9977e0084d3f1dc169acd060d479bbbc71b90bf3e7bf7b9927dec3eb/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:96b2b3d1a83ad55310de8c7b4a2d04d9277d5591f40761274856635acc5fcb30", size = 148379, upload-time = "2025-08-09T07:56:00.364Z" }, - { url = "https://files.pythonhosted.org/packages/4c/92/27dbe365d34c68cfe0ca76f1edd70e8705d82b378cb54ebbaeabc2e3029d/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:939578d9d8fd4299220161fdd76e86c6a251987476f5243e8864a7844476ba14", size = 160018, upload-time = "2025-08-09T07:56:01.678Z" }, - { url = "https://files.pythonhosted.org/packages/99/04/baae2a1ea1893a01635d475b9261c889a18fd48393634b6270827869fa34/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:fd10de089bcdcd1be95a2f73dbe6254798ec1bda9f450d5828c96f93e2536b9c", size = 157430, upload-time = "2025-08-09T07:56:02.87Z" }, - { url = 
"https://files.pythonhosted.org/packages/2f/36/77da9c6a328c54d17b960c89eccacfab8271fdaaa228305330915b88afa9/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:1e8ac75d72fa3775e0b7cb7e4629cec13b7514d928d15ef8ea06bca03ef01cae", size = 151600, upload-time = "2025-08-09T07:56:04.089Z" }, - { url = "https://files.pythonhosted.org/packages/64/d4/9eb4ff2c167edbbf08cdd28e19078bf195762e9bd63371689cab5ecd3d0d/charset_normalizer-3.4.3-cp311-cp311-win32.whl", hash = "sha256:6cf8fd4c04756b6b60146d98cd8a77d0cdae0e1ca20329da2ac85eed779b6849", size = 99616, upload-time = "2025-08-09T07:56:05.658Z" }, - { url = "https://files.pythonhosted.org/packages/f4/9c/996a4a028222e7761a96634d1820de8a744ff4327a00ada9c8942033089b/charset_normalizer-3.4.3-cp311-cp311-win_amd64.whl", hash = "sha256:31a9a6f775f9bcd865d88ee350f0ffb0e25936a7f930ca98995c05abf1faf21c", size = 107108, upload-time = "2025-08-09T07:56:07.176Z" }, - { url = "https://files.pythonhosted.org/packages/e9/5e/14c94999e418d9b87682734589404a25854d5f5d0408df68bc15b6ff54bb/charset_normalizer-3.4.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e28e334d3ff134e88989d90ba04b47d84382a828c061d0d1027b1b12a62b39b1", size = 205655, upload-time = "2025-08-09T07:56:08.475Z" }, - { url = "https://files.pythonhosted.org/packages/7d/a8/c6ec5d389672521f644505a257f50544c074cf5fc292d5390331cd6fc9c3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0cacf8f7297b0c4fcb74227692ca46b4a5852f8f4f24b3c766dd94a1075c4884", size = 146223, upload-time = "2025-08-09T07:56:09.708Z" }, - { url = "https://files.pythonhosted.org/packages/fc/eb/a2ffb08547f4e1e5415fb69eb7db25932c52a52bed371429648db4d84fb1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c6fd51128a41297f5409deab284fecbe5305ebd7e5a1f959bee1c054622b7018", size = 159366, upload-time = "2025-08-09T07:56:11.326Z" }, - 
{ url = "https://files.pythonhosted.org/packages/82/10/0fd19f20c624b278dddaf83b8464dcddc2456cb4b02bb902a6da126b87a1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3cfb2aad70f2c6debfbcb717f23b7eb55febc0bb23dcffc0f076009da10c6392", size = 157104, upload-time = "2025-08-09T07:56:13.014Z" }, - { url = "https://files.pythonhosted.org/packages/16/ab/0233c3231af734f5dfcf0844aa9582d5a1466c985bbed6cedab85af9bfe3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1606f4a55c0fd363d754049cdf400175ee96c992b1f8018b993941f221221c5f", size = 151830, upload-time = "2025-08-09T07:56:14.428Z" }, - { url = "https://files.pythonhosted.org/packages/ae/02/e29e22b4e02839a0e4a06557b1999d0a47db3567e82989b5bb21f3fbbd9f/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:027b776c26d38b7f15b26a5da1044f376455fb3766df8fc38563b4efbc515154", size = 148854, upload-time = "2025-08-09T07:56:16.051Z" }, - { url = "https://files.pythonhosted.org/packages/05/6b/e2539a0a4be302b481e8cafb5af8792da8093b486885a1ae4d15d452bcec/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:42e5088973e56e31e4fa58eb6bd709e42fc03799c11c42929592889a2e54c491", size = 160670, upload-time = "2025-08-09T07:56:17.314Z" }, - { url = "https://files.pythonhosted.org/packages/31/e7/883ee5676a2ef217a40ce0bffcc3d0dfbf9e64cbcfbdf822c52981c3304b/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:cc34f233c9e71701040d772aa7490318673aa7164a0efe3172b2981218c26d93", size = 158501, upload-time = "2025-08-09T07:56:18.641Z" }, - { url = "https://files.pythonhosted.org/packages/c1/35/6525b21aa0db614cf8b5792d232021dca3df7f90a1944db934efa5d20bb1/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:320e8e66157cc4e247d9ddca8e21f427efc7a04bbd0ac8a9faf56583fa543f9f", size = 153173, upload-time = 
"2025-08-09T07:56:20.289Z" }, - { url = "https://files.pythonhosted.org/packages/50/ee/f4704bad8201de513fdc8aac1cabc87e38c5818c93857140e06e772b5892/charset_normalizer-3.4.3-cp312-cp312-win32.whl", hash = "sha256:fb6fecfd65564f208cbf0fba07f107fb661bcd1a7c389edbced3f7a493f70e37", size = 99822, upload-time = "2025-08-09T07:56:21.551Z" }, - { url = "https://files.pythonhosted.org/packages/39/f5/3b3836ca6064d0992c58c7561c6b6eee1b3892e9665d650c803bd5614522/charset_normalizer-3.4.3-cp312-cp312-win_amd64.whl", hash = "sha256:86df271bf921c2ee3818f0522e9a5b8092ca2ad8b065ece5d7d9d0e9f4849bcc", size = 107543, upload-time = "2025-08-09T07:56:23.115Z" }, - { url = "https://files.pythonhosted.org/packages/65/ca/2135ac97709b400c7654b4b764daf5c5567c2da45a30cdd20f9eefe2d658/charset_normalizer-3.4.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:14c2a87c65b351109f6abfc424cab3927b3bdece6f706e4d12faaf3d52ee5efe", size = 205326, upload-time = "2025-08-09T07:56:24.721Z" }, - { url = "https://files.pythonhosted.org/packages/71/11/98a04c3c97dd34e49c7d247083af03645ca3730809a5509443f3c37f7c99/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41d1fc408ff5fdfb910200ec0e74abc40387bccb3252f3f27c0676731df2b2c8", size = 146008, upload-time = "2025-08-09T07:56:26.004Z" }, - { url = "https://files.pythonhosted.org/packages/60/f5/4659a4cb3c4ec146bec80c32d8bb16033752574c20b1252ee842a95d1a1e/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:1bb60174149316da1c35fa5233681f7c0f9f514509b8e399ab70fea5f17e45c9", size = 159196, upload-time = "2025-08-09T07:56:27.25Z" }, - { url = "https://files.pythonhosted.org/packages/86/9e/f552f7a00611f168b9a5865a1414179b2c6de8235a4fa40189f6f79a1753/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:30d006f98569de3459c2fc1f2acde170b7b2bd265dc1943e87e1a4efe1b67c31", size = 156819, upload-time = "2025-08-09T07:56:28.515Z" }, - { url = "https://files.pythonhosted.org/packages/7e/95/42aa2156235cbc8fa61208aded06ef46111c4d3f0de233107b3f38631803/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:416175faf02e4b0810f1f38bcb54682878a4af94059a1cd63b8747244420801f", size = 151350, upload-time = "2025-08-09T07:56:29.716Z" }, - { url = "https://files.pythonhosted.org/packages/c2/a9/3865b02c56f300a6f94fc631ef54f0a8a29da74fb45a773dfd3dcd380af7/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6aab0f181c486f973bc7262a97f5aca3ee7e1437011ef0c2ec04b5a11d16c927", size = 148644, upload-time = "2025-08-09T07:56:30.984Z" }, - { url = "https://files.pythonhosted.org/packages/77/d9/cbcf1a2a5c7d7856f11e7ac2d782aec12bdfea60d104e60e0aa1c97849dc/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:fdabf8315679312cfa71302f9bd509ded4f2f263fb5b765cf1433b39106c3cc9", size = 160468, upload-time = "2025-08-09T07:56:32.252Z" }, - { url = "https://files.pythonhosted.org/packages/f6/42/6f45efee8697b89fda4d50580f292b8f7f9306cb2971d4b53f8914e4d890/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:bd28b817ea8c70215401f657edef3a8aa83c29d447fb0b622c35403780ba11d5", size = 158187, upload-time = "2025-08-09T07:56:33.481Z" }, - { url = "https://files.pythonhosted.org/packages/70/99/f1c3bdcfaa9c45b3ce96f70b14f070411366fa19549c1d4832c935d8e2c3/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:18343b2d246dc6761a249ba1fb13f9ee9a2bcd95decc767319506056ea4ad4dc", size = 152699, upload-time = "2025-08-09T07:56:34.739Z" }, - { url = "https://files.pythonhosted.org/packages/a3/ad/b0081f2f99a4b194bcbb1934ef3b12aa4d9702ced80a37026b7607c72e58/charset_normalizer-3.4.3-cp313-cp313-win32.whl", hash = 
"sha256:6fb70de56f1859a3f71261cbe41005f56a7842cc348d3aeb26237560bfa5e0ce", size = 99580, upload-time = "2025-08-09T07:56:35.981Z" }, - { url = "https://files.pythonhosted.org/packages/9a/8f/ae790790c7b64f925e5c953b924aaa42a243fb778fed9e41f147b2a5715a/charset_normalizer-3.4.3-cp313-cp313-win_amd64.whl", hash = "sha256:cf1ebb7d78e1ad8ec2a8c4732c7be2e736f6e5123a4146c5b89c9d1f585f8cef", size = 107366, upload-time = "2025-08-09T07:56:37.339Z" }, - { url = "https://files.pythonhosted.org/packages/8e/91/b5a06ad970ddc7a0e513112d40113e834638f4ca1120eb727a249fb2715e/charset_normalizer-3.4.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3cd35b7e8aedeb9e34c41385fda4f73ba609e561faedfae0a9e75e44ac558a15", size = 204342, upload-time = "2025-08-09T07:56:38.687Z" }, - { url = "https://files.pythonhosted.org/packages/ce/ec/1edc30a377f0a02689342f214455c3f6c2fbedd896a1d2f856c002fc3062/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b89bc04de1d83006373429975f8ef9e7932534b8cc9ca582e4db7d20d91816db", size = 145995, upload-time = "2025-08-09T07:56:40.048Z" }, - { url = "https://files.pythonhosted.org/packages/17/e5/5e67ab85e6d22b04641acb5399c8684f4d37caf7558a53859f0283a650e9/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2001a39612b241dae17b4687898843f254f8748b796a2e16f1051a17078d991d", size = 158640, upload-time = "2025-08-09T07:56:41.311Z" }, - { url = "https://files.pythonhosted.org/packages/f1/e5/38421987f6c697ee3722981289d554957c4be652f963d71c5e46a262e135/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8dcfc373f888e4fb39a7bc57e93e3b845e7f462dacc008d9749568b1c4ece096", size = 156636, upload-time = "2025-08-09T07:56:43.195Z" }, - { url = 
"https://files.pythonhosted.org/packages/a0/e4/5a075de8daa3ec0745a9a3b54467e0c2967daaaf2cec04c845f73493e9a1/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:18b97b8404387b96cdbd30ad660f6407799126d26a39ca65729162fd810a99aa", size = 150939, upload-time = "2025-08-09T07:56:44.819Z" }, - { url = "https://files.pythonhosted.org/packages/02/f7/3611b32318b30974131db62b4043f335861d4d9b49adc6d57c1149cc49d4/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ccf600859c183d70eb47e05a44cd80a4ce77394d1ac0f79dbd2dd90a69a3a049", size = 148580, upload-time = "2025-08-09T07:56:46.684Z" }, - { url = "https://files.pythonhosted.org/packages/7e/61/19b36f4bd67f2793ab6a99b979b4e4f3d8fc754cbdffb805335df4337126/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:53cd68b185d98dde4ad8990e56a58dea83a4162161b1ea9272e5c9182ce415e0", size = 159870, upload-time = "2025-08-09T07:56:47.941Z" }, - { url = "https://files.pythonhosted.org/packages/06/57/84722eefdd338c04cf3030ada66889298eaedf3e7a30a624201e0cbe424a/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:30a96e1e1f865f78b030d65241c1ee850cdf422d869e9028e2fc1d5e4db73b92", size = 157797, upload-time = "2025-08-09T07:56:49.756Z" }, - { url = "https://files.pythonhosted.org/packages/72/2a/aff5dd112b2f14bcc3462c312dce5445806bfc8ab3a7328555da95330e4b/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d716a916938e03231e86e43782ca7878fb602a125a91e7acb8b5112e2e96ac16", size = 152224, upload-time = "2025-08-09T07:56:51.369Z" }, - { url = "https://files.pythonhosted.org/packages/b7/8c/9839225320046ed279c6e839d51f028342eb77c91c89b8ef2549f951f3ec/charset_normalizer-3.4.3-cp314-cp314-win32.whl", hash = "sha256:c6dbd0ccdda3a2ba7c2ecd9d77b37f3b5831687d8dc1b6ca5f56a4880cc7b7ce", size = 100086, upload-time = "2025-08-09T07:56:52.722Z" }, - { url = 
"https://files.pythonhosted.org/packages/ee/7a/36fbcf646e41f710ce0a563c1c9a343c6edf9be80786edeb15b6f62e17db/charset_normalizer-3.4.3-cp314-cp314-win_amd64.whl", hash = "sha256:73dc19b562516fc9bcf6e5d6e596df0b4eb98d87e4f79f3ae71840e6ed21361c", size = 107400, upload-time = "2025-08-09T07:56:55.172Z" }, - { url = "https://files.pythonhosted.org/packages/8a/1f/f041989e93b001bc4e44bb1669ccdcf54d3f00e628229a85b08d330615c5/charset_normalizer-3.4.3-py3-none-any.whl", hash = "sha256:ce571ab16d890d23b5c278547ba694193a45011ff86a9162a71307ed9f86759a", size = 53175, upload-time = "2025-08-09T07:57:26.864Z" }, -] - -[[package]] -name = "click" -version = "8.2.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "colorama", marker = "sys_platform == 'win32'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" }, -] - -[[package]] -name = "cloudpickle" -version = "3.1.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/52/39/069100b84d7418bc358d81669d5748efb14b9cceacd2f9c75f550424132f/cloudpickle-3.1.1.tar.gz", hash = "sha256:b216fa8ae4019d5482a8ac3c95d8f6346115d8835911fd4aefd1a445e4242c64", size = 22113, upload-time = "2025-01-14T17:02:05.085Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7e/e8/64c37fadfc2816a7701fa8a6ed8d87327c7d54eacfbfb6edab14a2f2be75/cloudpickle-3.1.1-py3-none-any.whl", hash = 
"sha256:c8c5a44295039331ee9dad40ba100a9c7297b6f988e50e87ccdf3765a668350e", size = 20992, upload-time = "2025-01-14T17:02:02.417Z" }, -] - -[[package]] -name = "cognee" -version = "0.3.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "aiofiles" }, - { name = "aiohttp" }, - { name = "aiosqlite" }, - { name = "alembic" }, - { name = "baml-py" }, - { name = "dlt", extra = ["sqlalchemy"] }, - { name = "fastapi" }, - { name = "fastapi-users", extra = ["sqlalchemy"] }, - { name = "filetype" }, - { name = "instructor" }, - { name = "jinja2" }, - { name = "kuzu" }, - { name = "lancedb" }, - { name = "langfuse" }, - { name = "limits" }, - { name = "litellm" }, - { name = "matplotlib" }, - { name = "networkx" }, - { name = "nltk" }, - { name = "numpy" }, - { name = "onnxruntime" }, - { name = "openai" }, - { name = "pandas" }, - { name = "pre-commit" }, - { name = "pydantic" }, - { name = "pydantic-settings" }, - { name = "pylance" }, - { name = "pympler" }, - { name = "pypdf" }, - { name = "python-dotenv" }, - { name = "python-magic-bin", marker = "sys_platform == 'win32'" }, - { name = "python-multipart" }, - { name = "rdflib" }, - { name = "s3fs", extra = ["boto3"] }, - { name = "scikit-learn" }, - { name = "sentry-sdk", extra = ["fastapi"] }, - { name = "sqlalchemy" }, - { name = "structlog" }, - { name = "tiktoken" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/60/b8/d3dff77c5cccf052deb401afcf441507445516eec79423606fb572d4817b/cognee-0.3.3.tar.gz", hash = "sha256:64b301625ab02d9a026fa64798fc075bf7cf6517b14514b4cee5b843454c6520", size = 14395369, upload-time = "2025-09-12T18:23:37.723Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/69/75/1df5946648dc4650771b3912d368e57192910ebdad9217dcd94d46bb4257/cognee-0.3.3-py3-none-any.whl", hash = "sha256:a595754f822b092573ebf655390ce8eb4c22d0500451b8c804dab14a9b61b774", size = 1515870, upload-time = "2025-09-12T18:23:19.031Z" }, 
-] - -[[package]] -name = "colorama" -version = "0.4.6" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, -] - -[[package]] -name = "coloredlogs" -version = "15.0.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "humanfriendly" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/cc/c7/eed8f27100517e8c0e6b923d5f0845d0cb99763da6fdee00478f91db7325/coloredlogs-15.0.1.tar.gz", hash = "sha256:7c991aa71a4577af2f82600d8f8f3a89f936baeaf9b50a9c197da014e5bf16b0", size = 278520, upload-time = "2021-06-11T10:22:45.202Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a7/06/3d6badcf13db419e25b07041d9c7b4a2c331d3f4e7134445ec5df57714cd/coloredlogs-15.0.1-py2.py3-none-any.whl", hash = "sha256:612ee75c546f53e92e70049c9dbfcc18c935a2b9a53b66085ce9ef6a6e5c0934", size = 46018, upload-time = "2021-06-11T10:22:42.561Z" }, -] - -[[package]] -name = "contourpy" -version = "1.3.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "numpy" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/58/01/1253e6698a07380cd31a736d248a3f2a50a7c88779a1813da27503cadc2a/contourpy-1.3.3.tar.gz", hash = "sha256:083e12155b210502d0bca491432bb04d56dc3432f95a979b429f2848c3dbe880", size = 13466174, upload-time = "2025-07-26T12:03:12.549Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/91/2e/c4390a31919d8a78b90e8ecf87cd4b4c4f05a5b48d05ec17db8e5404c6f4/contourpy-1.3.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:709a48ef9a690e1343202916450bc48b9e51c049b089c7f79a267b46cffcdaa1", size = 288773, upload-time = "2025-07-26T12:01:02.277Z" }, - { url = "https://files.pythonhosted.org/packages/0d/44/c4b0b6095fef4dc9c420e041799591e3b63e9619e3044f7f4f6c21c0ab24/contourpy-1.3.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:23416f38bfd74d5d28ab8429cc4d63fa67d5068bd711a85edb1c3fb0c3e2f381", size = 270149, upload-time = "2025-07-26T12:01:04.072Z" }, - { url = "https://files.pythonhosted.org/packages/30/2e/dd4ced42fefac8470661d7cb7e264808425e6c5d56d175291e93890cce09/contourpy-1.3.3-cp311-cp311-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:929ddf8c4c7f348e4c0a5a3a714b5c8542ffaa8c22954862a46ca1813b667ee7", size = 329222, upload-time = "2025-07-26T12:01:05.688Z" }, - { url = "https://files.pythonhosted.org/packages/f2/74/cc6ec2548e3d276c71389ea4802a774b7aa3558223b7bade3f25787fafc2/contourpy-1.3.3-cp311-cp311-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:9e999574eddae35f1312c2b4b717b7885d4edd6cb46700e04f7f02db454e67c1", size = 377234, upload-time = "2025-07-26T12:01:07.054Z" }, - { url = "https://files.pythonhosted.org/packages/03/b3/64ef723029f917410f75c09da54254c5f9ea90ef89b143ccadb09df14c15/contourpy-1.3.3-cp311-cp311-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0bf67e0e3f482cb69779dd3061b534eb35ac9b17f163d851e2a547d56dba0a3a", size = 380555, upload-time = "2025-07-26T12:01:08.801Z" }, - { url = "https://files.pythonhosted.org/packages/5f/4b/6157f24ca425b89fe2eb7e7be642375711ab671135be21e6faa100f7448c/contourpy-1.3.3-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:51e79c1f7470158e838808d4a996fa9bac72c498e93d8ebe5119bc1e6becb0db", size = 355238, upload-time = "2025-07-26T12:01:10.319Z" }, - { url = 
"https://files.pythonhosted.org/packages/98/56/f914f0dd678480708a04cfd2206e7c382533249bc5001eb9f58aa693e200/contourpy-1.3.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:598c3aaece21c503615fd59c92a3598b428b2f01bfb4b8ca9c4edeecc2438620", size = 1326218, upload-time = "2025-07-26T12:01:12.659Z" }, - { url = "https://files.pythonhosted.org/packages/fb/d7/4a972334a0c971acd5172389671113ae82aa7527073980c38d5868ff1161/contourpy-1.3.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:322ab1c99b008dad206d406bb61d014cf0174df491ae9d9d0fac6a6fda4f977f", size = 1392867, upload-time = "2025-07-26T12:01:15.533Z" }, - { url = "https://files.pythonhosted.org/packages/75/3e/f2cc6cd56dc8cff46b1a56232eabc6feea52720083ea71ab15523daab796/contourpy-1.3.3-cp311-cp311-win32.whl", hash = "sha256:fd907ae12cd483cd83e414b12941c632a969171bf90fc937d0c9f268a31cafff", size = 183677, upload-time = "2025-07-26T12:01:17.088Z" }, - { url = "https://files.pythonhosted.org/packages/98/4b/9bd370b004b5c9d8045c6c33cf65bae018b27aca550a3f657cdc99acdbd8/contourpy-1.3.3-cp311-cp311-win_amd64.whl", hash = "sha256:3519428f6be58431c56581f1694ba8e50626f2dd550af225f82fb5f5814d2a42", size = 225234, upload-time = "2025-07-26T12:01:18.256Z" }, - { url = "https://files.pythonhosted.org/packages/d9/b6/71771e02c2e004450c12b1120a5f488cad2e4d5b590b1af8bad060360fe4/contourpy-1.3.3-cp311-cp311-win_arm64.whl", hash = "sha256:15ff10bfada4bf92ec8b31c62bf7c1834c244019b4a33095a68000d7075df470", size = 193123, upload-time = "2025-07-26T12:01:19.848Z" }, - { url = "https://files.pythonhosted.org/packages/be/45/adfee365d9ea3d853550b2e735f9d66366701c65db7855cd07621732ccfc/contourpy-1.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b08a32ea2f8e42cf1d4be3169a98dd4be32bafe4f22b6c4cb4ba810fa9e5d2cb", size = 293419, upload-time = "2025-07-26T12:01:21.16Z" }, - { url = 
"https://files.pythonhosted.org/packages/53/3e/405b59cfa13021a56bba395a6b3aca8cec012b45bf177b0eaf7a202cde2c/contourpy-1.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:556dba8fb6f5d8742f2923fe9457dbdd51e1049c4a43fd3986a0b14a1d815fc6", size = 273979, upload-time = "2025-07-26T12:01:22.448Z" }, - { url = "https://files.pythonhosted.org/packages/d4/1c/a12359b9b2ca3a845e8f7f9ac08bdf776114eb931392fcad91743e2ea17b/contourpy-1.3.3-cp312-cp312-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92d9abc807cf7d0e047b95ca5d957cf4792fcd04e920ca70d48add15c1a90ea7", size = 332653, upload-time = "2025-07-26T12:01:24.155Z" }, - { url = "https://files.pythonhosted.org/packages/63/12/897aeebfb475b7748ea67b61e045accdfcf0d971f8a588b67108ed7f5512/contourpy-1.3.3-cp312-cp312-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b2e8faa0ed68cb29af51edd8e24798bb661eac3bd9f65420c1887b6ca89987c8", size = 379536, upload-time = "2025-07-26T12:01:25.91Z" }, - { url = "https://files.pythonhosted.org/packages/43/8a/a8c584b82deb248930ce069e71576fc09bd7174bbd35183b7943fb1064fd/contourpy-1.3.3-cp312-cp312-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:626d60935cf668e70a5ce6ff184fd713e9683fb458898e4249b63be9e28286ea", size = 384397, upload-time = "2025-07-26T12:01:27.152Z" }, - { url = "https://files.pythonhosted.org/packages/cc/8f/ec6289987824b29529d0dfda0d74a07cec60e54b9c92f3c9da4c0ac732de/contourpy-1.3.3-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4d00e655fcef08aba35ec9610536bfe90267d7ab5ba944f7032549c55a146da1", size = 362601, upload-time = "2025-07-26T12:01:28.808Z" }, - { url = "https://files.pythonhosted.org/packages/05/0a/a3fe3be3ee2dceb3e615ebb4df97ae6f3828aa915d3e10549ce016302bd1/contourpy-1.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:451e71b5a7d597379ef572de31eeb909a87246974d960049a9848c3bc6c41bf7", size = 1331288, upload-time = "2025-07-26T12:01:31.198Z" }, - { url = 
"https://files.pythonhosted.org/packages/33/1d/acad9bd4e97f13f3e2b18a3977fe1b4a37ecf3d38d815333980c6c72e963/contourpy-1.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:459c1f020cd59fcfe6650180678a9993932d80d44ccde1fa1868977438f0b411", size = 1403386, upload-time = "2025-07-26T12:01:33.947Z" }, - { url = "https://files.pythonhosted.org/packages/cf/8f/5847f44a7fddf859704217a99a23a4f6417b10e5ab1256a179264561540e/contourpy-1.3.3-cp312-cp312-win32.whl", hash = "sha256:023b44101dfe49d7d53932be418477dba359649246075c996866106da069af69", size = 185018, upload-time = "2025-07-26T12:01:35.64Z" }, - { url = "https://files.pythonhosted.org/packages/19/e8/6026ed58a64563186a9ee3f29f41261fd1828f527dd93d33b60feca63352/contourpy-1.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:8153b8bfc11e1e4d75bcb0bff1db232f9e10b274e0929de9d608027e0d34ff8b", size = 226567, upload-time = "2025-07-26T12:01:36.804Z" }, - { url = "https://files.pythonhosted.org/packages/d1/e2/f05240d2c39a1ed228d8328a78b6f44cd695f7ef47beb3e684cf93604f86/contourpy-1.3.3-cp312-cp312-win_arm64.whl", hash = "sha256:07ce5ed73ecdc4a03ffe3e1b3e3c1166db35ae7584be76f65dbbe28a7791b0cc", size = 193655, upload-time = "2025-07-26T12:01:37.999Z" }, - { url = "https://files.pythonhosted.org/packages/68/35/0167aad910bbdb9599272bd96d01a9ec6852f36b9455cf2ca67bd4cc2d23/contourpy-1.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:177fb367556747a686509d6fef71d221a4b198a3905fe824430e5ea0fda54eb5", size = 293257, upload-time = "2025-07-26T12:01:39.367Z" }, - { url = "https://files.pythonhosted.org/packages/96/e4/7adcd9c8362745b2210728f209bfbcf7d91ba868a2c5f40d8b58f54c509b/contourpy-1.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d002b6f00d73d69333dac9d0b8d5e84d9724ff9ef044fd63c5986e62b7c9e1b1", size = 274034, upload-time = "2025-07-26T12:01:40.645Z" }, - { url = 
"https://files.pythonhosted.org/packages/73/23/90e31ceeed1de63058a02cb04b12f2de4b40e3bef5e082a7c18d9c8ae281/contourpy-1.3.3-cp313-cp313-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:348ac1f5d4f1d66d3322420f01d42e43122f43616e0f194fc1c9f5d830c5b286", size = 334672, upload-time = "2025-07-26T12:01:41.942Z" }, - { url = "https://files.pythonhosted.org/packages/ed/93/b43d8acbe67392e659e1d984700e79eb67e2acb2bd7f62012b583a7f1b55/contourpy-1.3.3-cp313-cp313-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:655456777ff65c2c548b7c454af9c6f33f16c8884f11083244b5819cc214f1b5", size = 381234, upload-time = "2025-07-26T12:01:43.499Z" }, - { url = "https://files.pythonhosted.org/packages/46/3b/bec82a3ea06f66711520f75a40c8fc0b113b2a75edb36aa633eb11c4f50f/contourpy-1.3.3-cp313-cp313-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:644a6853d15b2512d67881586bd03f462c7ab755db95f16f14d7e238f2852c67", size = 385169, upload-time = "2025-07-26T12:01:45.219Z" }, - { url = "https://files.pythonhosted.org/packages/4b/32/e0f13a1c5b0f8572d0ec6ae2f6c677b7991fafd95da523159c19eff0696a/contourpy-1.3.3-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4debd64f124ca62069f313a9cb86656ff087786016d76927ae2cf37846b006c9", size = 362859, upload-time = "2025-07-26T12:01:46.519Z" }, - { url = "https://files.pythonhosted.org/packages/33/71/e2a7945b7de4e58af42d708a219f3b2f4cff7386e6b6ab0a0fa0033c49a9/contourpy-1.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a15459b0f4615b00bbd1e91f1b9e19b7e63aea7483d03d804186f278c0af2659", size = 1332062, upload-time = "2025-07-26T12:01:48.964Z" }, - { url = "https://files.pythonhosted.org/packages/12/fc/4e87ac754220ccc0e807284f88e943d6d43b43843614f0a8afa469801db0/contourpy-1.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ca0fdcd73925568ca027e0b17ab07aad764be4706d0a925b89227e447d9737b7", size = 1403932, upload-time = "2025-07-26T12:01:51.979Z" }, - { url = 
"https://files.pythonhosted.org/packages/a6/2e/adc197a37443f934594112222ac1aa7dc9a98faf9c3842884df9a9d8751d/contourpy-1.3.3-cp313-cp313-win32.whl", hash = "sha256:b20c7c9a3bf701366556e1b1984ed2d0cedf999903c51311417cf5f591d8c78d", size = 185024, upload-time = "2025-07-26T12:01:53.245Z" }, - { url = "https://files.pythonhosted.org/packages/18/0b/0098c214843213759692cc638fce7de5c289200a830e5035d1791d7a2338/contourpy-1.3.3-cp313-cp313-win_amd64.whl", hash = "sha256:1cadd8b8969f060ba45ed7c1b714fe69185812ab43bd6b86a9123fe8f99c3263", size = 226578, upload-time = "2025-07-26T12:01:54.422Z" }, - { url = "https://files.pythonhosted.org/packages/8a/9a/2f6024a0c5995243cd63afdeb3651c984f0d2bc727fd98066d40e141ad73/contourpy-1.3.3-cp313-cp313-win_arm64.whl", hash = "sha256:fd914713266421b7536de2bfa8181aa8c699432b6763a0ea64195ebe28bff6a9", size = 193524, upload-time = "2025-07-26T12:01:55.73Z" }, - { url = "https://files.pythonhosted.org/packages/c0/b3/f8a1a86bd3298513f500e5b1f5fd92b69896449f6cab6a146a5d52715479/contourpy-1.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:88df9880d507169449d434c293467418b9f6cbe82edd19284aa0409e7fdb933d", size = 306730, upload-time = "2025-07-26T12:01:57.051Z" }, - { url = "https://files.pythonhosted.org/packages/3f/11/4780db94ae62fc0c2053909b65dc3246bd7cecfc4f8a20d957ad43aa4ad8/contourpy-1.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:d06bb1f751ba5d417047db62bca3c8fde202b8c11fb50742ab3ab962c81e8216", size = 287897, upload-time = "2025-07-26T12:01:58.663Z" }, - { url = "https://files.pythonhosted.org/packages/ae/15/e59f5f3ffdd6f3d4daa3e47114c53daabcb18574a26c21f03dc9e4e42ff0/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e4e6b05a45525357e382909a4c1600444e2a45b4795163d3b22669285591c1ae", size = 326751, upload-time = "2025-07-26T12:02:00.343Z" }, - { url = 
"https://files.pythonhosted.org/packages/0f/81/03b45cfad088e4770b1dcf72ea78d3802d04200009fb364d18a493857210/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ab3074b48c4e2cf1a960e6bbeb7f04566bf36b1861d5c9d4d8ac04b82e38ba20", size = 375486, upload-time = "2025-07-26T12:02:02.128Z" }, - { url = "https://files.pythonhosted.org/packages/0c/ba/49923366492ffbdd4486e970d421b289a670ae8cf539c1ea9a09822b371a/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6c3d53c796f8647d6deb1abe867daeb66dcc8a97e8455efa729516b997b8ed99", size = 388106, upload-time = "2025-07-26T12:02:03.615Z" }, - { url = "https://files.pythonhosted.org/packages/9f/52/5b00ea89525f8f143651f9f03a0df371d3cbd2fccd21ca9b768c7a6500c2/contourpy-1.3.3-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:50ed930df7289ff2a8d7afeb9603f8289e5704755c7e5c3bbd929c90c817164b", size = 352548, upload-time = "2025-07-26T12:02:05.165Z" }, - { url = "https://files.pythonhosted.org/packages/32/1d/a209ec1a3a3452d490f6b14dd92e72280c99ae3d1e73da74f8277d4ee08f/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4feffb6537d64b84877da813a5c30f1422ea5739566abf0bd18065ac040e120a", size = 1322297, upload-time = "2025-07-26T12:02:07.379Z" }, - { url = "https://files.pythonhosted.org/packages/bc/9e/46f0e8ebdd884ca0e8877e46a3f4e633f6c9c8c4f3f6e72be3fe075994aa/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:2b7e9480ffe2b0cd2e787e4df64270e3a0440d9db8dc823312e2c940c167df7e", size = 1391023, upload-time = "2025-07-26T12:02:10.171Z" }, - { url = "https://files.pythonhosted.org/packages/b9/70/f308384a3ae9cd2209e0849f33c913f658d3326900d0ff5d378d6a1422d2/contourpy-1.3.3-cp313-cp313t-win32.whl", hash = "sha256:283edd842a01e3dcd435b1c5116798d661378d83d36d337b8dde1d16a5fc9ba3", size = 196157, upload-time = "2025-07-26T12:02:11.488Z" }, - { url = 
"https://files.pythonhosted.org/packages/b2/dd/880f890a6663b84d9e34a6f88cded89d78f0091e0045a284427cb6b18521/contourpy-1.3.3-cp313-cp313t-win_amd64.whl", hash = "sha256:87acf5963fc2b34825e5b6b048f40e3635dd547f590b04d2ab317c2619ef7ae8", size = 240570, upload-time = "2025-07-26T12:02:12.754Z" }, - { url = "https://files.pythonhosted.org/packages/80/99/2adc7d8ffead633234817ef8e9a87115c8a11927a94478f6bb3d3f4d4f7d/contourpy-1.3.3-cp313-cp313t-win_arm64.whl", hash = "sha256:3c30273eb2a55024ff31ba7d052dde990d7d8e5450f4bbb6e913558b3d6c2301", size = 199713, upload-time = "2025-07-26T12:02:14.4Z" }, - { url = "https://files.pythonhosted.org/packages/72/8b/4546f3ab60f78c514ffb7d01a0bd743f90de36f0019d1be84d0a708a580a/contourpy-1.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fde6c716d51c04b1c25d0b90364d0be954624a0ee9d60e23e850e8d48353d07a", size = 292189, upload-time = "2025-07-26T12:02:16.095Z" }, - { url = "https://files.pythonhosted.org/packages/fd/e1/3542a9cb596cadd76fcef413f19c79216e002623158befe6daa03dbfa88c/contourpy-1.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:cbedb772ed74ff5be440fa8eee9bd49f64f6e3fc09436d9c7d8f1c287b121d77", size = 273251, upload-time = "2025-07-26T12:02:17.524Z" }, - { url = "https://files.pythonhosted.org/packages/b1/71/f93e1e9471d189f79d0ce2497007731c1e6bf9ef6d1d61b911430c3db4e5/contourpy-1.3.3-cp314-cp314-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:22e9b1bd7a9b1d652cd77388465dc358dafcd2e217d35552424aa4f996f524f5", size = 335810, upload-time = "2025-07-26T12:02:18.9Z" }, - { url = "https://files.pythonhosted.org/packages/91/f9/e35f4c1c93f9275d4e38681a80506b5510e9327350c51f8d4a5a724d178c/contourpy-1.3.3-cp314-cp314-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a22738912262aa3e254e4f3cb079a95a67132fc5a063890e224393596902f5a4", size = 382871, upload-time = "2025-07-26T12:02:20.418Z" }, - { url = 
"https://files.pythonhosted.org/packages/b5/71/47b512f936f66a0a900d81c396a7e60d73419868fba959c61efed7a8ab46/contourpy-1.3.3-cp314-cp314-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:afe5a512f31ee6bd7d0dda52ec9864c984ca3d66664444f2d72e0dc4eb832e36", size = 386264, upload-time = "2025-07-26T12:02:21.916Z" }, - { url = "https://files.pythonhosted.org/packages/04/5f/9ff93450ba96b09c7c2b3f81c94de31c89f92292f1380261bd7195bea4ea/contourpy-1.3.3-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f64836de09927cba6f79dcd00fdd7d5329f3fccc633468507079c829ca4db4e3", size = 363819, upload-time = "2025-07-26T12:02:23.759Z" }, - { url = "https://files.pythonhosted.org/packages/3e/a6/0b185d4cc480ee494945cde102cb0149ae830b5fa17bf855b95f2e70ad13/contourpy-1.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:1fd43c3be4c8e5fd6e4f2baeae35ae18176cf2e5cced681cca908addf1cdd53b", size = 1333650, upload-time = "2025-07-26T12:02:26.181Z" }, - { url = "https://files.pythonhosted.org/packages/43/d7/afdc95580ca56f30fbcd3060250f66cedbde69b4547028863abd8aa3b47e/contourpy-1.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:6afc576f7b33cf00996e5c1102dc2a8f7cc89e39c0b55df93a0b78c1bd992b36", size = 1404833, upload-time = "2025-07-26T12:02:28.782Z" }, - { url = "https://files.pythonhosted.org/packages/e2/e2/366af18a6d386f41132a48f033cbd2102e9b0cf6345d35ff0826cd984566/contourpy-1.3.3-cp314-cp314-win32.whl", hash = "sha256:66c8a43a4f7b8df8b71ee1840e4211a3c8d93b214b213f590e18a1beca458f7d", size = 189692, upload-time = "2025-07-26T12:02:30.128Z" }, - { url = "https://files.pythonhosted.org/packages/7d/c2/57f54b03d0f22d4044b8afb9ca0e184f8b1afd57b4f735c2fa70883dc601/contourpy-1.3.3-cp314-cp314-win_amd64.whl", hash = "sha256:cf9022ef053f2694e31d630feaacb21ea24224be1c3ad0520b13d844274614fd", size = 232424, upload-time = "2025-07-26T12:02:31.395Z" }, - { url = 
"https://files.pythonhosted.org/packages/18/79/a9416650df9b525737ab521aa181ccc42d56016d2123ddcb7b58e926a42c/contourpy-1.3.3-cp314-cp314-win_arm64.whl", hash = "sha256:95b181891b4c71de4bb404c6621e7e2390745f887f2a026b2d99e92c17892339", size = 198300, upload-time = "2025-07-26T12:02:32.956Z" }, - { url = "https://files.pythonhosted.org/packages/1f/42/38c159a7d0f2b7b9c04c64ab317042bb6952b713ba875c1681529a2932fe/contourpy-1.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:33c82d0138c0a062380332c861387650c82e4cf1747aaa6938b9b6516762e772", size = 306769, upload-time = "2025-07-26T12:02:34.2Z" }, - { url = "https://files.pythonhosted.org/packages/c3/6c/26a8205f24bca10974e77460de68d3d7c63e282e23782f1239f226fcae6f/contourpy-1.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:ea37e7b45949df430fe649e5de8351c423430046a2af20b1c1961cae3afcda77", size = 287892, upload-time = "2025-07-26T12:02:35.807Z" }, - { url = "https://files.pythonhosted.org/packages/66/06/8a475c8ab718ebfd7925661747dbb3c3ee9c82ac834ccb3570be49d129f4/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d304906ecc71672e9c89e87c4675dc5c2645e1f4269a5063b99b0bb29f232d13", size = 326748, upload-time = "2025-07-26T12:02:37.193Z" }, - { url = "https://files.pythonhosted.org/packages/b4/a3/c5ca9f010a44c223f098fccd8b158bb1cb287378a31ac141f04730dc49be/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ca658cd1a680a5c9ea96dc61cdbae1e85c8f25849843aa799dfd3cb370ad4fbe", size = 375554, upload-time = "2025-07-26T12:02:38.894Z" }, - { url = "https://files.pythonhosted.org/packages/80/5b/68bd33ae63fac658a4145088c1e894405e07584a316738710b636c6d0333/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ab2fd90904c503739a75b7c8c5c01160130ba67944a7b77bbf36ef8054576e7f", size = 388118, upload-time = "2025-07-26T12:02:40.642Z" }, - { url = 
"https://files.pythonhosted.org/packages/40/52/4c285a6435940ae25d7410a6c36bda5145839bc3f0beb20c707cda18b9d2/contourpy-1.3.3-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b7301b89040075c30e5768810bc96a8e8d78085b47d8be6e4c3f5a0b4ed478a0", size = 352555, upload-time = "2025-07-26T12:02:42.25Z" }, - { url = "https://files.pythonhosted.org/packages/24/ee/3e81e1dd174f5c7fefe50e85d0892de05ca4e26ef1c9a59c2a57e43b865a/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:2a2a8b627d5cc6b7c41a4beff6c5ad5eb848c88255fda4a8745f7e901b32d8e4", size = 1322295, upload-time = "2025-07-26T12:02:44.668Z" }, - { url = "https://files.pythonhosted.org/packages/3c/b2/6d913d4d04e14379de429057cd169e5e00f6c2af3bb13e1710bcbdb5da12/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fd6ec6be509c787f1caf6b247f0b1ca598bef13f4ddeaa126b7658215529ba0f", size = 1391027, upload-time = "2025-07-26T12:02:47.09Z" }, - { url = "https://files.pythonhosted.org/packages/93/8a/68a4ec5c55a2971213d29a9374913f7e9f18581945a7a31d1a39b5d2dfe5/contourpy-1.3.3-cp314-cp314t-win32.whl", hash = "sha256:e74a9a0f5e3fff48fb5a7f2fd2b9b70a3fe014a67522f79b7cca4c0c7e43c9ae", size = 202428, upload-time = "2025-07-26T12:02:48.691Z" }, - { url = "https://files.pythonhosted.org/packages/fa/96/fd9f641ffedc4fa3ace923af73b9d07e869496c9cc7a459103e6e978992f/contourpy-1.3.3-cp314-cp314t-win_amd64.whl", hash = "sha256:13b68d6a62db8eafaebb8039218921399baf6e47bf85006fd8529f2a08ef33fc", size = 250331, upload-time = "2025-07-26T12:02:50.137Z" }, - { url = "https://files.pythonhosted.org/packages/ae/8c/469afb6465b853afff216f9528ffda78a915ff880ed58813ba4faf4ba0b6/contourpy-1.3.3-cp314-cp314t-win_arm64.whl", hash = "sha256:b7448cb5a725bb1e35ce88771b86fba35ef418952474492cf7c764059933ff8b", size = 203831, upload-time = "2025-07-26T12:02:51.449Z" }, - { url = 
"https://files.pythonhosted.org/packages/a5/29/8dcfe16f0107943fa92388c23f6e05cff0ba58058c4c95b00280d4c75a14/contourpy-1.3.3-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:cd5dfcaeb10f7b7f9dc8941717c6c2ade08f587be2226222c12b25f0483ed497", size = 278809, upload-time = "2025-07-26T12:02:52.74Z" }, - { url = "https://files.pythonhosted.org/packages/85/a9/8b37ef4f7dafeb335daee3c8254645ef5725be4d9c6aa70b50ec46ef2f7e/contourpy-1.3.3-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:0c1fc238306b35f246d61a1d416a627348b5cf0648648a031e14bb8705fcdfe8", size = 261593, upload-time = "2025-07-26T12:02:54.037Z" }, - { url = "https://files.pythonhosted.org/packages/0a/59/ebfb8c677c75605cc27f7122c90313fd2f375ff3c8d19a1694bda74aaa63/contourpy-1.3.3-pp311-pypy311_pp73-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:70f9aad7de812d6541d29d2bbf8feb22ff7e1c299523db288004e3157ff4674e", size = 302202, upload-time = "2025-07-26T12:02:55.947Z" }, - { url = "https://files.pythonhosted.org/packages/3c/37/21972a15834d90bfbfb009b9d004779bd5a07a0ec0234e5ba8f64d5736f4/contourpy-1.3.3-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5ed3657edf08512fc3fe81b510e35c2012fbd3081d2e26160f27ca28affec989", size = 329207, upload-time = "2025-07-26T12:02:57.468Z" }, - { url = "https://files.pythonhosted.org/packages/0c/58/bd257695f39d05594ca4ad60df5bcb7e32247f9951fd09a9b8edb82d1daa/contourpy-1.3.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:3d1a3799d62d45c18bafd41c5fa05120b96a28079f2393af559b843d1a966a77", size = 225315, upload-time = "2025-07-26T12:02:58.801Z" }, -] - -[[package]] -name = "cryptography" -version = "46.0.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "cffi", marker = "platform_python_implementation != 'PyPy'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/a9/62/e3664e6ffd7743e1694b244dde70b43a394f6f7fbcacf7014a8ff5197c73/cryptography-46.0.1.tar.gz", hash = 
"sha256:ed570874e88f213437f5cf758f9ef26cbfc3f336d889b1e592ee11283bb8d1c7", size = 749198, upload-time = "2025-09-17T00:10:35.797Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/4c/8c/44ee01267ec01e26e43ebfdae3f120ec2312aa72fa4c0507ebe41a26739f/cryptography-46.0.1-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:1cd6d50c1a8b79af1a6f703709d8973845f677c8e97b1268f5ff323d38ce8475", size = 7285044, upload-time = "2025-09-17T00:08:36.807Z" }, - { url = "https://files.pythonhosted.org/packages/22/59/9ae689a25047e0601adfcb159ec4f83c0b4149fdb5c3030cc94cd218141d/cryptography-46.0.1-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0ff483716be32690c14636e54a1f6e2e1b7bf8e22ca50b989f88fa1b2d287080", size = 4308182, upload-time = "2025-09-17T00:08:39.388Z" }, - { url = "https://files.pythonhosted.org/packages/c4/ee/ca6cc9df7118f2fcd142c76b1da0f14340d77518c05b1ebfbbabca6b9e7d/cryptography-46.0.1-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9873bf7c1f2a6330bdfe8621e7ce64b725784f9f0c3a6a55c3047af5849f920e", size = 4572393, upload-time = "2025-09-17T00:08:41.663Z" }, - { url = "https://files.pythonhosted.org/packages/7f/a3/0f5296f63815d8e985922b05c31f77ce44787b3127a67c0b7f70f115c45f/cryptography-46.0.1-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:0dfb7c88d4462a0cfdd0d87a3c245a7bc3feb59de101f6ff88194f740f72eda6", size = 4308400, upload-time = "2025-09-17T00:08:43.559Z" }, - { url = "https://files.pythonhosted.org/packages/5d/8c/74fcda3e4e01be1d32775d5b4dd841acaac3c1b8fa4d0774c7ac8d52463d/cryptography-46.0.1-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e22801b61613ebdebf7deb18b507919e107547a1d39a3b57f5f855032dd7cfb8", size = 4015786, upload-time = "2025-09-17T00:08:45.758Z" }, - { url = "https://files.pythonhosted.org/packages/dc/b8/85d23287baeef273b0834481a3dd55bbed3a53587e3b8d9f0898235b8f91/cryptography-46.0.1-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = 
"sha256:757af4f6341ce7a1e47c326ca2a81f41d236070217e5fbbad61bbfe299d55d28", size = 4982606, upload-time = "2025-09-17T00:08:47.602Z" }, - { url = "https://files.pythonhosted.org/packages/e5/d3/de61ad5b52433b389afca0bc70f02a7a1f074651221f599ce368da0fe437/cryptography-46.0.1-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f7a24ea78de345cfa7f6a8d3bde8b242c7fac27f2bd78fa23474ca38dfaeeab9", size = 4604234, upload-time = "2025-09-17T00:08:49.879Z" }, - { url = "https://files.pythonhosted.org/packages/dc/1f/dbd4d6570d84748439237a7478d124ee0134bf166ad129267b7ed8ea6d22/cryptography-46.0.1-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:9e8776dac9e660c22241b6587fae51a67b4b0147daa4d176b172c3ff768ad736", size = 4307669, upload-time = "2025-09-17T00:08:52.321Z" }, - { url = "https://files.pythonhosted.org/packages/ec/fd/ca0a14ce7f0bfe92fa727aacaf2217eb25eb7e4ed513b14d8e03b26e63ed/cryptography-46.0.1-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:9f40642a140c0c8649987027867242b801486865277cbabc8c6059ddef16dc8b", size = 4947579, upload-time = "2025-09-17T00:08:54.697Z" }, - { url = "https://files.pythonhosted.org/packages/89/6b/09c30543bb93401f6f88fce556b3bdbb21e55ae14912c04b7bf355f5f96c/cryptography-46.0.1-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:449ef2b321bec7d97ef2c944173275ebdab78f3abdd005400cc409e27cd159ab", size = 4603669, upload-time = "2025-09-17T00:08:57.16Z" }, - { url = "https://files.pythonhosted.org/packages/23/9a/38cb01cb09ce0adceda9fc627c9cf98eb890fc8d50cacbe79b011df20f8a/cryptography-46.0.1-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:2dd339ba3345b908fa3141ddba4025568fa6fd398eabce3ef72a29ac2d73ad75", size = 4435828, upload-time = "2025-09-17T00:08:59.606Z" }, - { url = "https://files.pythonhosted.org/packages/0f/53/435b5c36a78d06ae0bef96d666209b0ecd8f8181bfe4dda46536705df59e/cryptography-46.0.1-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:7411c910fb2a412053cf33cfad0153ee20d27e256c6c3f14d7d7d1d9fec59fd5", size = 
4709553, upload-time = "2025-09-17T00:09:01.832Z" }, - { url = "https://files.pythonhosted.org/packages/f5/c4/0da6e55595d9b9cd3b6eb5dc22f3a07ded7f116a3ea72629cab595abb804/cryptography-46.0.1-cp311-abi3-win32.whl", hash = "sha256:cbb8e769d4cac884bb28e3ff620ef1001b75588a5c83c9c9f1fdc9afbe7f29b0", size = 3058327, upload-time = "2025-09-17T00:09:03.726Z" }, - { url = "https://files.pythonhosted.org/packages/95/0f/cd29a35e0d6e78a0ee61793564c8cff0929c38391cb0de27627bdc7525aa/cryptography-46.0.1-cp311-abi3-win_amd64.whl", hash = "sha256:92e8cfe8bd7dd86eac0a677499894862cd5cc2fd74de917daa881d00871ac8e7", size = 3523893, upload-time = "2025-09-17T00:09:06.272Z" }, - { url = "https://files.pythonhosted.org/packages/f2/dd/eea390f3e78432bc3d2f53952375f8b37cb4d37783e626faa6a51e751719/cryptography-46.0.1-cp311-abi3-win_arm64.whl", hash = "sha256:db5597a4c7353b2e5fb05a8e6cb74b56a4658a2b7bf3cb6b1821ae7e7fd6eaa0", size = 2932145, upload-time = "2025-09-17T00:09:08.568Z" }, - { url = "https://files.pythonhosted.org/packages/0a/fb/c73588561afcd5e24b089952bd210b14676c0c5bf1213376350ae111945c/cryptography-46.0.1-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:4c49eda9a23019e11d32a0eb51a27b3e7ddedde91e099c0ac6373e3aacc0d2ee", size = 7193928, upload-time = "2025-09-17T00:09:10.595Z" }, - { url = "https://files.pythonhosted.org/packages/26/34/0ff0bb2d2c79f25a2a63109f3b76b9108a906dd2a2eb5c1d460b9938adbb/cryptography-46.0.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:9babb7818fdd71394e576cf26c5452df77a355eac1a27ddfa24096665a27f8fd", size = 4293515, upload-time = "2025-09-17T00:09:12.861Z" }, - { url = "https://files.pythonhosted.org/packages/df/b7/d4f848aee24ecd1be01db6c42c4a270069a4f02a105d9c57e143daf6cf0f/cryptography-46.0.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9f2c4cc63be3ef43c0221861177cee5d14b505cd4d4599a89e2cd273c4d3542a", size = 4545619, upload-time = "2025-09-17T00:09:15.397Z" }, - { url = 
"https://files.pythonhosted.org/packages/44/a5/42fedefc754fd1901e2d95a69815ea4ec8a9eed31f4c4361fcab80288661/cryptography-46.0.1-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:41c281a74df173876da1dc9a9b6953d387f06e3d3ed9284e3baae3ab3f40883a", size = 4299160, upload-time = "2025-09-17T00:09:17.155Z" }, - { url = "https://files.pythonhosted.org/packages/86/a1/cd21174f56e769c831fbbd6399a1b7519b0ff6280acec1b826d7b072640c/cryptography-46.0.1-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0a17377fa52563d730248ba1f68185461fff36e8bc75d8787a7dd2e20a802b7a", size = 3994491, upload-time = "2025-09-17T00:09:18.971Z" }, - { url = "https://files.pythonhosted.org/packages/8d/2f/a8cbfa1c029987ddc746fd966711d4fa71efc891d37fbe9f030fe5ab4eec/cryptography-46.0.1-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:0d1922d9280e08cde90b518a10cd66831f632960a8d08cb3418922d83fce6f12", size = 4960157, upload-time = "2025-09-17T00:09:20.923Z" }, - { url = "https://files.pythonhosted.org/packages/67/ae/63a84e6789e0d5a2502edf06b552bcb0fa9ff16147265d5c44a211942abe/cryptography-46.0.1-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:af84e8e99f1a82cea149e253014ea9dc89f75b82c87bb6c7242203186f465129", size = 4577263, upload-time = "2025-09-17T00:09:23.356Z" }, - { url = "https://files.pythonhosted.org/packages/ef/8f/1b9fa8e92bd9cbcb3b7e1e593a5232f2c1e6f9bd72b919c1a6b37d315f92/cryptography-46.0.1-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:ef648d2c690703501714588b2ba640facd50fd16548133b11b2859e8655a69da", size = 4298703, upload-time = "2025-09-17T00:09:25.566Z" }, - { url = "https://files.pythonhosted.org/packages/c3/af/bb95db070e73fea3fae31d8a69ac1463d89d1c084220f549b00dd01094a8/cryptography-46.0.1-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:e94eb5fa32a8a9f9bf991f424f002913e3dd7c699ef552db9b14ba6a76a6313b", size = 4926363, upload-time = "2025-09-17T00:09:27.451Z" }, - { url = 
"https://files.pythonhosted.org/packages/f5/3b/d8fb17ffeb3a83157a1cc0aa5c60691d062aceecba09c2e5e77ebfc1870c/cryptography-46.0.1-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:534b96c0831855e29fc3b069b085fd185aa5353033631a585d5cd4dd5d40d657", size = 4576958, upload-time = "2025-09-17T00:09:29.924Z" }, - { url = "https://files.pythonhosted.org/packages/d9/46/86bc3a05c10c8aa88c8ae7e953a8b4e407c57823ed201dbcba55c4d655f4/cryptography-46.0.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:f9b55038b5c6c47559aa33626d8ecd092f354e23de3c6975e4bb205df128a2a0", size = 4422507, upload-time = "2025-09-17T00:09:32.222Z" }, - { url = "https://files.pythonhosted.org/packages/a8/4e/387e5a21dfd2b4198e74968a541cfd6128f66f8ec94ed971776e15091ac3/cryptography-46.0.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ec13b7105117dbc9afd023300fb9954d72ca855c274fe563e72428ece10191c0", size = 4683964, upload-time = "2025-09-17T00:09:34.118Z" }, - { url = "https://files.pythonhosted.org/packages/25/a3/f9f5907b166adb8f26762071474b38bbfcf89858a5282f032899075a38a1/cryptography-46.0.1-cp314-cp314t-win32.whl", hash = "sha256:504e464944f2c003a0785b81668fe23c06f3b037e9cb9f68a7c672246319f277", size = 3029705, upload-time = "2025-09-17T00:09:36.381Z" }, - { url = "https://files.pythonhosted.org/packages/12/66/4d3a4f1850db2e71c2b1628d14b70b5e4c1684a1bd462f7fffb93c041c38/cryptography-46.0.1-cp314-cp314t-win_amd64.whl", hash = "sha256:c52fded6383f7e20eaf70a60aeddd796b3677c3ad2922c801be330db62778e05", size = 3502175, upload-time = "2025-09-17T00:09:38.261Z" }, - { url = "https://files.pythonhosted.org/packages/52/c7/9f10ad91435ef7d0d99a0b93c4360bea3df18050ff5b9038c489c31ac2f5/cryptography-46.0.1-cp314-cp314t-win_arm64.whl", hash = "sha256:9495d78f52c804b5ec8878b5b8c7873aa8e63db9cd9ee387ff2db3fffe4df784", size = 2912354, upload-time = "2025-09-17T00:09:40.078Z" }, - { url = 
"https://files.pythonhosted.org/packages/98/e5/fbd632385542a3311915976f88e0dfcf09e62a3fc0aff86fb6762162a24d/cryptography-46.0.1-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:d84c40bdb8674c29fa192373498b6cb1e84f882889d21a471b45d1f868d8d44b", size = 7255677, upload-time = "2025-09-17T00:09:42.407Z" }, - { url = "https://files.pythonhosted.org/packages/56/3e/13ce6eab9ad6eba1b15a7bd476f005a4c1b3f299f4c2f32b22408b0edccf/cryptography-46.0.1-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:9ed64e5083fa806709e74fc5ea067dfef9090e5b7a2320a49be3c9df3583a2d8", size = 4301110, upload-time = "2025-09-17T00:09:45.614Z" }, - { url = "https://files.pythonhosted.org/packages/a2/67/65dc233c1ddd688073cf7b136b06ff4b84bf517ba5529607c9d79720fc67/cryptography-46.0.1-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:341fb7a26bc9d6093c1b124b9f13acc283d2d51da440b98b55ab3f79f2522ead", size = 4562369, upload-time = "2025-09-17T00:09:47.601Z" }, - { url = "https://files.pythonhosted.org/packages/17/db/d64ae4c6f4e98c3dac5bf35dd4d103f4c7c345703e43560113e5e8e31b2b/cryptography-46.0.1-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:6ef1488967e729948d424d09c94753d0167ce59afba8d0f6c07a22b629c557b2", size = 4302126, upload-time = "2025-09-17T00:09:49.335Z" }, - { url = "https://files.pythonhosted.org/packages/3d/19/5f1eea17d4805ebdc2e685b7b02800c4f63f3dd46cfa8d4c18373fea46c8/cryptography-46.0.1-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7823bc7cdf0b747ecfb096d004cc41573c2f5c7e3a29861603a2871b43d3ef32", size = 4009431, upload-time = "2025-09-17T00:09:51.239Z" }, - { url = "https://files.pythonhosted.org/packages/81/b5/229ba6088fe7abccbfe4c5edb96c7a5ad547fac5fdd0d40aa6ea540b2985/cryptography-46.0.1-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:f736ab8036796f5a119ff8211deda416f8c15ce03776db704a7a4e17381cb2ef", size = 4980739, upload-time = "2025-09-17T00:09:54.181Z" }, - { url = 
"https://files.pythonhosted.org/packages/3a/9c/50aa38907b201e74bc43c572f9603fa82b58e831bd13c245613a23cff736/cryptography-46.0.1-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:e46710a240a41d594953012213ea8ca398cd2448fbc5d0f1be8160b5511104a0", size = 4592289, upload-time = "2025-09-17T00:09:56.731Z" }, - { url = "https://files.pythonhosted.org/packages/5a/33/229858f8a5bb22f82468bb285e9f4c44a31978d5f5830bb4ea1cf8a4e454/cryptography-46.0.1-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:84ef1f145de5aee82ea2447224dc23f065ff4cc5791bb3b506615957a6ba8128", size = 4301815, upload-time = "2025-09-17T00:09:58.548Z" }, - { url = "https://files.pythonhosted.org/packages/52/cb/b76b2c87fbd6ed4a231884bea3ce073406ba8e2dae9defad910d33cbf408/cryptography-46.0.1-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:9394c7d5a7565ac5f7d9ba38b2617448eba384d7b107b262d63890079fad77ca", size = 4943251, upload-time = "2025-09-17T00:10:00.475Z" }, - { url = "https://files.pythonhosted.org/packages/94/0f/f66125ecf88e4cb5b8017ff43f3a87ede2d064cb54a1c5893f9da9d65093/cryptography-46.0.1-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:ed957044e368ed295257ae3d212b95456bd9756df490e1ac4538857f67531fcc", size = 4591247, upload-time = "2025-09-17T00:10:02.874Z" }, - { url = "https://files.pythonhosted.org/packages/f6/22/9f3134ae436b63b463cfdf0ff506a0570da6873adb4bf8c19b8a5b4bac64/cryptography-46.0.1-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:f7de12fa0eee6234de9a9ce0ffcfa6ce97361db7a50b09b65c63ac58e5f22fc7", size = 4428534, upload-time = "2025-09-17T00:10:04.994Z" }, - { url = "https://files.pythonhosted.org/packages/89/39/e6042bcb2638650b0005c752c38ea830cbfbcbb1830e4d64d530000aa8dc/cryptography-46.0.1-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:7fab1187b6c6b2f11a326f33b036f7168f5b996aedd0c059f9738915e4e8f53a", size = 4699541, upload-time = "2025-09-17T00:10:06.925Z" }, - { url = 
"https://files.pythonhosted.org/packages/68/46/753d457492d15458c7b5a653fc9a84a1c9c7a83af6ebdc94c3fc373ca6e8/cryptography-46.0.1-cp38-abi3-win32.whl", hash = "sha256:45f790934ac1018adeba46a0f7289b2b8fe76ba774a88c7f1922213a56c98bc1", size = 3043779, upload-time = "2025-09-17T00:10:08.951Z" }, - { url = "https://files.pythonhosted.org/packages/2f/50/b6f3b540c2f6ee712feeb5fa780bb11fad76634e71334718568e7695cb55/cryptography-46.0.1-cp38-abi3-win_amd64.whl", hash = "sha256:7176a5ab56fac98d706921f6416a05e5aff7df0e4b91516f450f8627cda22af3", size = 3517226, upload-time = "2025-09-17T00:10:10.769Z" }, - { url = "https://files.pythonhosted.org/packages/ff/e8/77d17d00981cdd27cc493e81e1749a0b8bbfb843780dbd841e30d7f50743/cryptography-46.0.1-cp38-abi3-win_arm64.whl", hash = "sha256:efc9e51c3e595267ff84adf56e9b357db89ab2279d7e375ffcaf8f678606f3d9", size = 2923149, upload-time = "2025-09-17T00:10:13.236Z" }, - { url = "https://files.pythonhosted.org/packages/27/27/077e09fd92075dd1338ea0ffaf5cfee641535545925768350ad90d8c36ca/cryptography-46.0.1-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:b9c79af2c3058430d911ff1a5b2b96bbfe8da47d5ed961639ce4681886614e70", size = 3722319, upload-time = "2025-09-17T00:10:20.273Z" }, - { url = "https://files.pythonhosted.org/packages/db/32/6fc7250280920418651640d76cee34d91c1e0601d73acd44364570cf041f/cryptography-46.0.1-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:0ca4be2af48c24df689a150d9cd37404f689e2968e247b6b8ff09bff5bcd786f", size = 4249030, upload-time = "2025-09-17T00:10:22.396Z" }, - { url = "https://files.pythonhosted.org/packages/32/33/8d5398b2da15a15110b2478480ab512609f95b45ead3a105c9a9c76f9980/cryptography-46.0.1-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:13e67c4d3fb8b6bc4ef778a7ccdd8df4cd15b4bcc18f4239c8440891a11245cc", size = 4528009, upload-time = "2025-09-17T00:10:24.418Z" }, - { url = 
"https://files.pythonhosted.org/packages/fd/1c/4012edad2a8977ab386c36b6e21f5065974d37afa3eade83a9968cba4855/cryptography-46.0.1-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:15b5fd9358803b0d1cc42505a18d8bca81dabb35b5cfbfea1505092e13a9d96d", size = 4248902, upload-time = "2025-09-17T00:10:26.255Z" }, - { url = "https://files.pythonhosted.org/packages/58/a3/257cd5ae677302de8fa066fca9de37128f6729d1e63c04dd6a15555dd450/cryptography-46.0.1-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:e34da95e29daf8a71cb2841fd55df0511539a6cdf33e6f77c1e95e44006b9b46", size = 4527150, upload-time = "2025-09-17T00:10:28.28Z" }, - { url = "https://files.pythonhosted.org/packages/6a/cd/fe6b65e1117ec7631f6be8951d3db076bac3e1b096e3e12710ed071ffc3c/cryptography-46.0.1-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:34f04b7311174469ab3ac2647469743720f8b6c8b046f238e5cb27905695eb2a", size = 3448210, upload-time = "2025-09-17T00:10:30.145Z" }, -] - -[[package]] -name = "cycler" -version = "0.12.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/a9/95/a3dbbb5028f35eafb79008e7522a75244477d2838f38cbb722248dabc2a8/cycler-0.12.1.tar.gz", hash = "sha256:88bb128f02ba341da8ef447245a9e138fae777f6a23943da4540077d3601eb1c", size = 7615, upload-time = "2023-10-07T05:32:18.335Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e7/05/c19819d5e3d95294a6f5947fb9b9629efb316b96de511b418c53d245aae6/cycler-0.12.1-py3-none-any.whl", hash = "sha256:85cef7cff222d8644161529808465972e51340599459b8ac3ccbac5a854e0d30", size = 8321, upload-time = "2023-10-07T05:32:16.783Z" }, -] - -[[package]] -name = "cyclopts" -version = "3.24.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "attrs" }, - { name = "docstring-parser", marker = "python_full_version < '4'" }, - { name = "rich" }, - { name = "rich-rst" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/30/ca/7782da3b03242d5f0a16c20371dff99d4bd1fedafe26bc48ff82e42be8c9/cyclopts-3.24.0.tar.gz", hash = "sha256:de6964a041dfb3c57bf043b41e68c43548227a17de1bad246e3a0bfc5c4b7417", size = 76131, upload-time = "2025-09-08T15:40:57.75Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/f0/8b/2c95f0645c6f40211896375e6fa51f504b8ccb29c21f6ae661fe87ab044e/cyclopts-3.24.0-py3-none-any.whl", hash = "sha256:809d04cde9108617106091140c3964ee6fceb33cecdd537f7ffa360bde13ed71", size = 86154, upload-time = "2025-09-08T15:40:56.41Z" }, -] - -[[package]] -name = "deprecated" -version = "1.2.18" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "wrapt" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/98/97/06afe62762c9a8a86af0cfb7bfdab22a43ad17138b07af5b1a58442690a2/deprecated-1.2.18.tar.gz", hash = "sha256:422b6f6d859da6f2ef57857761bfb392480502a64c3028ca9bbe86085d72115d", size = 2928744, upload-time = "2025-01-27T10:46:25.7Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6e/c6/ac0b6c1e2d138f1002bcf799d330bd6d85084fece321e662a14223794041/Deprecated-1.2.18-py2.py3-none-any.whl", hash = "sha256:bd5011788200372a32418f888e326a09ff80d0214bd961147cfed01b5c018eec", size = 9998, upload-time = "2025-01-27T10:46:09.186Z" }, -] - -[[package]] -name = "deprecation" -version = "2.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "packaging" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5a/d3/8ae2869247df154b64c1884d7346d412fed0c49df84db635aab2d1c40e62/deprecation-2.1.0.tar.gz", hash = "sha256:72b3bde64e5d778694b0cf68178aed03d15e15477116add3fb773e581f9518ff", size = 173788, upload-time = "2020-04-20T14:23:38.738Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/02/c3/253a89ee03fc9b9682f1541728eb66db7db22148cd94f89ab22528cd1e1b/deprecation-2.1.0-py2.py3-none-any.whl", hash = 
"sha256:a10811591210e1fb0e768a8c25517cabeabcba6f0bf96564f8ff45189f90b14a", size = 11178, upload-time = "2020-04-20T14:23:36.581Z" }, -] - -[[package]] -name = "diskcache" -version = "5.6.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/3f/21/1c1ffc1a039ddcc459db43cc108658f32c57d271d7289a2794e401d0fdb6/diskcache-5.6.3.tar.gz", hash = "sha256:2c3a3fa2743d8535d832ec61c2054a1641f41775aa7c556758a109941e33e4fc", size = 67916, upload-time = "2023-08-31T06:12:00.316Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl", hash = "sha256:5e31b2d5fbad117cc363ebaf6b689474db18a1f6438bc82358b024abd4c2ca19", size = 45550, upload-time = "2023-08-31T06:11:58.822Z" }, -] - -[[package]] -name = "distlib" -version = "0.4.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/96/8e/709914eb2b5749865801041647dc7f4e6d00b549cfe88b65ca192995f07c/distlib-0.4.0.tar.gz", hash = "sha256:feec40075be03a04501a973d81f633735b4b69f98b05450592310c0f401a4e0d", size = 614605, upload-time = "2025-07-17T16:52:00.465Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/33/6b/e0547afaf41bf2c42e52430072fa5658766e3d65bd4b03a563d1b6336f57/distlib-0.4.0-py2.py3-none-any.whl", hash = "sha256:9659f7d87e46584a30b5780e43ac7a2143098441670ff0a49d5f9034c54a6c16", size = 469047, upload-time = "2025-07-17T16:51:58.613Z" }, -] - -[[package]] -name = "distro" -version = "1.9.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/fc/f8/98eea607f65de6527f8a2e8885fc8015d3e6f5775df186e443e0964a11c3/distro-1.9.0.tar.gz", hash = "sha256:2fa77c6fd8940f116ee1d6b94a2f90b13b5ea8d019b98bc8bafdcabcdd9bdbed", size = 60722, upload-time = "2023-12-24T09:54:32.31Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/12/b3/231ffd4ab1fc9d679809f356cebee130ac7daa00d6d6f3206dd4fd137e9e/distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2", size = 20277, upload-time = "2023-12-24T09:54:30.421Z" }, -] - -[[package]] -name = "dlt" -version = "1.16.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "click" }, - { name = "fsspec" }, - { name = "gitpython" }, - { name = "giturlparse" }, - { name = "hexbytes" }, - { name = "humanize" }, - { name = "jsonpath-ng" }, - { name = "orjson", marker = "python_full_version >= '3.14' or sys_platform != 'emscripten'" }, - { name = "packaging" }, - { name = "pathvalidate" }, - { name = "pendulum" }, - { name = "pluggy" }, - { name = "pytz" }, - { name = "pywin32", marker = "sys_platform == 'win32'" }, - { name = "pyyaml" }, - { name = "requests" }, - { name = "requirements-parser" }, - { name = "rich-argparse" }, - { name = "semver" }, - { name = "setuptools" }, - { name = "simplejson" }, - { name = "sqlglot" }, - { name = "tenacity" }, - { name = "tomlkit" }, - { name = "typing-extensions" }, - { name = "tzdata" }, - { name = "win-precise-time", marker = "python_full_version < '3.13' and os_name == 'nt'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/47/45/8f32b8cc4c709c79edc54763ab0e5f62df55a17bfaf8c31e2d2538422e34/dlt-1.16.0.tar.gz", hash = "sha256:113d17a3f27aa4f41c3438b0b032a68d30db195d8415a471ba43a9502e971a21", size = 809187, upload-time = "2025-09-10T06:53:06.365Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c6/1c/0a96ced9fb52e859b44624cc86ace5f59324ca899ac7e5a5cfeb1f1c797c/dlt-1.16.0-py3-none-any.whl", hash = "sha256:882ef281bbdc32eaba3b5ced984a8ed7014d8978fd7ab4a58b198023c8938c9f", size = 1029963, upload-time = "2025-09-10T06:53:04.014Z" }, -] - -[package.optional-dependencies] -sqlalchemy = [ - { name = "alembic" }, - { name = "sqlalchemy" }, -] - -[[package]] -name = 
"dnspython" -version = "2.8.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/8c/8b/57666417c0f90f08bcafa776861060426765fdb422eb10212086fb811d26/dnspython-2.8.0.tar.gz", hash = "sha256:181d3c6996452cb1189c4046c61599b84a5a86e099562ffde77d26984ff26d0f", size = 368251, upload-time = "2025-09-07T18:58:00.022Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ba/5a/18ad964b0086c6e62e2e7500f7edc89e3faa45033c71c1893d34eed2b2de/dnspython-2.8.0-py3-none-any.whl", hash = "sha256:01d9bbc4a2d76bf0db7c1f729812ded6d912bd318d3b1cf81d30c0f845dbf3af", size = 331094, upload-time = "2025-09-07T18:57:58.071Z" }, -] - -[[package]] -name = "docstring-parser" -version = "0.17.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/b2/9d/c3b43da9515bd270df0f80548d9944e389870713cc1fe2b8fb35fe2bcefd/docstring_parser-0.17.0.tar.gz", hash = "sha256:583de4a309722b3315439bb31d64ba3eebada841f2e2cee23b99df001434c912", size = 27442, upload-time = "2025-07-21T07:35:01.868Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/55/e2/2537ebcff11c1ee1ff17d8d0b6f4db75873e3b0fb32c2d4a2ee31ecb310a/docstring_parser-0.17.0-py3-none-any.whl", hash = "sha256:cf2569abd23dce8099b300f9b4fa8191e9582dda731fd533daf54c4551658708", size = 36896, upload-time = "2025-07-21T07:35:00.684Z" }, -] - -[[package]] -name = "docutils" -version = "0.22.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/4a/c0/89fe6215b443b919cb98a5002e107cb5026854ed1ccb6b5833e0768419d1/docutils-0.22.2.tar.gz", hash = "sha256:9fdb771707c8784c8f2728b67cb2c691305933d68137ef95a75db5f4dfbc213d", size = 2289092, upload-time = "2025-09-20T17:55:47.994Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/66/dd/f95350e853a4468ec37478414fc04ae2d61dad7a947b3015c3dcc51a09b9/docutils-0.22.2-py3-none-any.whl", hash = 
"sha256:b0e98d679283fc3bb0ead8a5da7f501baa632654e7056e9c5846842213d674d8", size = 632667, upload-time = "2025-09-20T17:55:43.052Z" }, -] - -[[package]] -name = "email-validator" -version = "2.2.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "dnspython" }, - { name = "idna" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/48/ce/13508a1ec3f8bb981ae4ca79ea40384becc868bfae97fd1c942bb3a001b1/email_validator-2.2.0.tar.gz", hash = "sha256:cb690f344c617a714f22e66ae771445a1ceb46821152df8e165c5f9a364582b7", size = 48967, upload-time = "2024-06-20T11:30:30.034Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d7/ee/bf0adb559ad3c786f12bcbc9296b3f5675f529199bef03e2df281fa1fadb/email_validator-2.2.0-py3-none-any.whl", hash = "sha256:561977c2d73ce3611850a06fa56b414621e0c8faa9d66f2611407d87465da631", size = 33521, upload-time = "2024-06-20T11:30:28.248Z" }, -] - -[[package]] -name = "exceptiongroup" -version = "1.3.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/0b/9f/a65090624ecf468cdca03533906e7c69ed7588582240cfe7cc9e770b50eb/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88", size = 29749, upload-time = "2025-05-10T17:42:51.123Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/36/f4/c6e662dade71f56cd2f3735141b265c3c79293c109549c1e6933b0651ffc/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10", size = 16674, upload-time = "2025-05-10T17:42:49.33Z" }, -] - -[[package]] -name = "fastapi" -version = "0.116.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pydantic" }, - { name = "starlette" }, - { name = "typing-extensions" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/01/64/1296f46d6b9e3b23fb22e5d01af3f104ef411425531376212f1eefa2794d/fastapi-0.116.2.tar.gz", hash = "sha256:231a6af2fe21cfa2c32730170ad8514985fc250bec16c9b242d3b94c835ef529", size = 298595, upload-time = "2025-09-16T18:29:23.058Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/32/e4/c543271a8018874b7f682bf6156863c416e1334b8ed3e51a69495c5d4360/fastapi-0.116.2-py3-none-any.whl", hash = "sha256:c3a7a8fb830b05f7e087d920e0d786ca1fc9892eb4e9a84b227be4c1bc7569db", size = 95670, upload-time = "2025-09-16T18:29:21.329Z" }, -] - -[[package]] -name = "fastapi-users" -version = "14.0.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "email-validator" }, - { name = "fastapi" }, - { name = "makefun" }, - { name = "pwdlib", extra = ["argon2", "bcrypt"] }, - { name = "pyjwt", extra = ["crypto"] }, - { name = "python-multipart" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/e4/26/7fe4e6a4f60d9cde2b95f58ba45ff03219b62bd03bea75d914b723ecfa2a/fastapi_users-14.0.1.tar.gz", hash = "sha256:8c032b3a75c6fb2b1f5eab8ffce5321176e9916efe1fe93e7c15ee55f0b02236", size = 120315, upload-time = "2025-01-04T13:20:05.95Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/2c/52/2821d3e95a92567d38f98a33d1ef89302aa3448866bf45ff19a48a5f28f8/fastapi_users-14.0.1-py3-none-any.whl", hash = "sha256:074df59676dccf79412d2880bdcb661ab1fabc2ecec1f043b4e6a23be97ed9e1", size = 38717, upload-time = "2025-01-04T13:20:04.441Z" }, -] - -[package.optional-dependencies] -sqlalchemy = [ - { name = "fastapi-users-db-sqlalchemy" }, -] - -[[package]] -name = "fastapi-users-db-sqlalchemy" -version = "7.0.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "fastapi-users" }, - { name = "sqlalchemy", extra = ["asyncio"] }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/87/12/bc9e6146ae31564741cefc87ee6e37fa5b566933f0afe8aa030779d60e60/fastapi_users_db_sqlalchemy-7.0.0.tar.gz", hash = "sha256:6823eeedf8a92f819276a2b2210ef1dcfd71fe8b6e37f7b4da8d1c60e3dfd595", size = 10877, upload-time = "2025-01-04T13:09:05.086Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a6/08/9968963c1fb8c34627b7f1fbcdfe9438540f87dc7c9bfb59bb4fd19a4ecf/fastapi_users_db_sqlalchemy-7.0.0-py3-none-any.whl", hash = "sha256:5fceac018e7cfa69efc70834dd3035b3de7988eb4274154a0dbe8b14f5aa001e", size = 6891, upload-time = "2025-01-04T13:09:02.869Z" }, -] - -[[package]] -name = "fastmcp" -version = "2.12.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "authlib" }, - { name = "cyclopts" }, - { name = "exceptiongroup" }, - { name = "httpx" }, - { name = "mcp" }, - { name = "openapi-core" }, - { name = "openapi-pydantic" }, - { name = "pydantic", extra = ["email"] }, - { name = "pyperclip" }, - { name = "python-dotenv" }, - { name = "rich" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/99/5e/035fdfa23646de8811776cd62d93440e334e8a4557b35c63c1bff125c08c/fastmcp-2.12.3.tar.gz", hash = "sha256:541dd569d5b6c083140b04d997ba3dc47f7c10695cee700d0a733ce63b20bb65", size = 5246812, upload-time = "2025-09-12T12:28:07.136Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/96/79/0fd386e61819e205563d4eb15da76564b80dc2edd3c64b46f2706235daec/fastmcp-2.12.3-py3-none-any.whl", hash = "sha256:aee50872923a9cba731861fc0120e7dbe4642a2685ba251b2b202b82fb6c25a9", size = 314031, upload-time = "2025-09-12T12:28:05.024Z" }, -] - -[[package]] -name = "fastuuid" -version = "0.12.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/19/17/13146a1e916bd2971d0a58db5e0a4ad23efdd49f78f33ac871c161f8007b/fastuuid-0.12.0.tar.gz", hash = "sha256:d0bd4e5b35aad2826403f4411937c89e7c88857b1513fe10f696544c03e9bd8e", size = 19180, 
upload-time = "2025-01-27T18:04:14.387Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d4/99/555eab31381c7912103d4c8654082611e5e82a7bb88ad5ab067e36b622d7/fastuuid-0.12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2bced35269315d16fe0c41003f8c9d63f2ee16a59295d90922cad5e6a67d0418", size = 247249, upload-time = "2025-01-27T18:03:23.092Z" }, - { url = "https://files.pythonhosted.org/packages/6d/3b/d62ce7f2af3d50a8e787603d44809770f43a3f2ff708bf10c252bf479109/fastuuid-0.12.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:82106e4b0a24f4f2f73c88f89dadbc1533bb808900740ca5db9bbb17d3b0c824", size = 258369, upload-time = "2025-01-27T18:04:08.903Z" }, - { url = "https://files.pythonhosted.org/packages/86/23/33ec5355036745cf83ea9ca7576d2e0750ff8d268c03b4af40ed26f1a303/fastuuid-0.12.0-cp311-cp311-manylinux_2_34_x86_64.whl", hash = "sha256:4db1bc7b8caa1d7412e1bea29b016d23a8d219131cff825b933eb3428f044dca", size = 278316, upload-time = "2025-01-27T18:04:12.74Z" }, - { url = "https://files.pythonhosted.org/packages/40/91/32ce82a14650148b6979ccd1a0089fd63d92505a90fb7156d2acc3245cbd/fastuuid-0.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:07afc8e674e67ac3d35a608c68f6809da5fab470fb4ef4469094fdb32ba36c51", size = 156643, upload-time = "2025-01-27T18:05:59.266Z" }, - { url = "https://files.pythonhosted.org/packages/f6/28/442e79d6219b90208cb243ac01db05d89cc4fdf8ecd563fb89476baf7122/fastuuid-0.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:328694a573fe9dce556b0b70c9d03776786801e028d82f0b6d9db1cb0521b4d1", size = 247372, upload-time = "2025-01-27T18:03:40.967Z" }, - { url = "https://files.pythonhosted.org/packages/40/eb/e0fd56890970ca7a9ec0d116844580988b692b1a749ac38e0c39e1dbdf23/fastuuid-0.12.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02acaea2c955bb2035a7d8e7b3fba8bd623b03746ae278e5fa932ef54c702f9f", size = 258200, upload-time = "2025-01-27T18:04:12.138Z" }, - { url = 
"https://files.pythonhosted.org/packages/f5/3c/4b30e376e65597a51a3dc929461a0dec77c8aec5d41d930f482b8f43e781/fastuuid-0.12.0-cp312-cp312-manylinux_2_34_x86_64.whl", hash = "sha256:ed9f449cba8cf16cced252521aee06e633d50ec48c807683f21cc1d89e193eb0", size = 278446, upload-time = "2025-01-27T18:04:15.877Z" }, - { url = "https://files.pythonhosted.org/packages/fe/96/cc5975fd23d2197b3e29f650a7a9beddce8993eaf934fa4ac595b77bb71f/fastuuid-0.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:0df2ea4c9db96fd8f4fa38d0e88e309b3e56f8fd03675a2f6958a5b082a0c1e4", size = 157185, upload-time = "2025-01-27T18:06:19.21Z" }, - { url = "https://files.pythonhosted.org/packages/a9/e8/d2bb4f19e5ee15f6f8e3192a54a897678314151aa17d0fb766d2c2cbc03d/fastuuid-0.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7fe2407316a04ee8f06d3dbc7eae396d0a86591d92bafe2ca32fce23b1145786", size = 247512, upload-time = "2025-01-27T18:04:08.115Z" }, - { url = "https://files.pythonhosted.org/packages/bc/53/25e811d92fd60f5c65e098c3b68bd8f1a35e4abb6b77a153025115b680de/fastuuid-0.12.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b9b31dd488d0778c36f8279b306dc92a42f16904cba54acca71e107d65b60b0c", size = 258257, upload-time = "2025-01-27T18:03:56.408Z" }, - { url = "https://files.pythonhosted.org/packages/10/23/73618e7793ea0b619caae2accd9e93e60da38dd78dd425002d319152ef2f/fastuuid-0.12.0-cp313-cp313-manylinux_2_34_x86_64.whl", hash = "sha256:b19361ee649365eefc717ec08005972d3d1eb9ee39908022d98e3bfa9da59e37", size = 278559, upload-time = "2025-01-27T18:03:58.661Z" }, - { url = "https://files.pythonhosted.org/packages/e4/41/6317ecfc4757d5f2a604e5d3993f353ba7aee85fa75ad8b86fce6fc2fa40/fastuuid-0.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:8fc66b11423e6f3e1937385f655bedd67aebe56a3dcec0cb835351cfe7d358c9", size = 157276, upload-time = "2025-01-27T18:06:39.245Z" }, -] - -[[package]] -name = "filelock" -version = "3.19.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/40/bb/0ab3e58d22305b6f5440629d20683af28959bf793d98d11950e305c1c326/filelock-3.19.1.tar.gz", hash = "sha256:66eda1888b0171c998b35be2bcc0f6d75c388a7ce20c3f3f37aa8e96c2dddf58", size = 17687, upload-time = "2025-08-14T16:56:03.016Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/42/14/42b2651a2f46b022ccd948bca9f2d5af0fd8929c4eec235b8d6d844fbe67/filelock-3.19.1-py3-none-any.whl", hash = "sha256:d38e30481def20772f5baf097c122c3babc4fcdb7e14e57049eb9d88c6dc017d", size = 15988, upload-time = "2025-08-14T16:56:01.633Z" }, -] - -[[package]] -name = "filetype" -version = "1.2.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/bb/29/745f7d30d47fe0f251d3ad3dc2978a23141917661998763bebb6da007eb1/filetype-1.2.0.tar.gz", hash = "sha256:66b56cd6474bf41d8c54660347d37afcc3f7d1970648de365c102ef77548aadb", size = 998020, upload-time = "2022-11-02T17:34:04.141Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/18/79/1b8fa1bb3568781e84c9200f951c735f3f157429f44be0495da55894d620/filetype-1.2.0-py2.py3-none-any.whl", hash = "sha256:7ce71b6880181241cf7ac8697a2f1eb6a8bd9b429f7ad6d27b8db9ba5f1c2d25", size = 19970, upload-time = "2022-11-02T17:34:01.425Z" }, -] - -[[package]] -name = "flatbuffers" -version = "25.2.10" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e4/30/eb5dce7994fc71a2f685d98ec33cc660c0a5887db5610137e60d8cbc4489/flatbuffers-25.2.10.tar.gz", hash = "sha256:97e451377a41262f8d9bd4295cc836133415cc03d8cb966410a4af92eb00d26e", size = 22170, upload-time = "2025-02-11T04:26:46.257Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b8/25/155f9f080d5e4bc0082edfda032ea2bc2b8fab3f4d25d46c1e9dd22a1a89/flatbuffers-25.2.10-py2.py3-none-any.whl", hash = "sha256:ebba5f4d5ea615af3f7fd70fc310636fbb2bbd1f566ac0a23d98dd412de50051", size = 30953, upload-time = "2025-02-11T04:26:44.484Z" }, 
-] - -[[package]] -name = "fonttools" -version = "4.60.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/27/d9/4eabd956fe123651a1f0efe29d9758b3837b5ae9a98934bdb571117033bb/fonttools-4.60.0.tar.gz", hash = "sha256:8f5927f049091a0ca74d35cce7f78e8f7775c83a6901a8fbe899babcc297146a", size = 3553671, upload-time = "2025-09-17T11:34:01.504Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/da/3d/c57731fbbf204ef1045caca28d5176430161ead73cd9feac3e9d9ef77ee6/fonttools-4.60.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:a9106c202d68ff5f9b4a0094c4d7ad2eaa7e9280f06427b09643215e706eb016", size = 2830883, upload-time = "2025-09-17T11:32:10.552Z" }, - { url = "https://files.pythonhosted.org/packages/cc/2d/b7a6ebaed464ce441c755252cc222af11edc651d17c8f26482f429cc2c0e/fonttools-4.60.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9da3a4a3f2485b156bb429b4f8faa972480fc01f553f7c8c80d05d48f17eec89", size = 2356005, upload-time = "2025-09-17T11:32:13.248Z" }, - { url = "https://files.pythonhosted.org/packages/ee/c2/ea834e921324e2051403e125c1fe0bfbdde4951a7c1784e4ae6bdbd286cc/fonttools-4.60.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1f84de764c6057b2ffd4feb50ddef481d92e348f0c70f2c849b723118d352bf3", size = 5041201, upload-time = "2025-09-17T11:32:15.373Z" }, - { url = "https://files.pythonhosted.org/packages/93/3c/1c64a338e9aa410d2d0728827d5bb1301463078cb225b94589f27558b427/fonttools-4.60.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:800b3fa0d5c12ddff02179d45b035a23989a6c597a71c8035c010fff3b2ef1bb", size = 4977696, upload-time = "2025-09-17T11:32:17.674Z" }, - { url = "https://files.pythonhosted.org/packages/07/cc/c8c411a0d9732bb886b870e052f20658fec9cf91118314f253950d2c1d65/fonttools-4.60.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = 
"sha256:8dd68f60b030277f292a582d31c374edfadc60bb33d51ec7b6cd4304531819ba", size = 5020386, upload-time = "2025-09-17T11:32:20.089Z" }, - { url = "https://files.pythonhosted.org/packages/13/01/1d3bc07cf92e7f4fc27f06d4494bf6078dc595b2e01b959157a4fd23df12/fonttools-4.60.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:53328e3ca9e5c8660ef6de07c35f8f312c189b757535e12141be7a8ec942de6e", size = 5131575, upload-time = "2025-09-17T11:32:22.582Z" }, - { url = "https://files.pythonhosted.org/packages/5a/16/08db3917ee19e89d2eb0ee637d37cd4136c849dc421ff63f406b9165c1a1/fonttools-4.60.0-cp311-cp311-win32.whl", hash = "sha256:d493c175ddd0b88a5376e61163e3e6fde3be8b8987db9b092e0a84650709c9e7", size = 2229297, upload-time = "2025-09-17T11:32:24.834Z" }, - { url = "https://files.pythonhosted.org/packages/d2/0b/76764da82c0dfcea144861f568d9e83f4b921e84f2be617b451257bb25a7/fonttools-4.60.0-cp311-cp311-win_amd64.whl", hash = "sha256:cc2770c9dc49c2d0366e9683f4d03beb46c98042d7ccc8ddbadf3459ecb051a7", size = 2277193, upload-time = "2025-09-17T11:32:27.094Z" }, - { url = "https://files.pythonhosted.org/packages/2a/9b/706ebf84b55ab03439c1f3a94d6915123c0d96099f4238b254fdacffe03a/fonttools-4.60.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:8c68928a438d60dfde90e2f09aa7f848ed201176ca6652341744ceec4215859f", size = 2831953, upload-time = "2025-09-17T11:32:29.39Z" }, - { url = "https://files.pythonhosted.org/packages/76/40/782f485be450846e4f3aecff1f10e42af414fc6e19d235c70020f64278e1/fonttools-4.60.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b7133821249097cffabf0624eafd37f5a3358d5ce814febe9db688e3673e724e", size = 2351716, upload-time = "2025-09-17T11:32:31.46Z" }, - { url = "https://files.pythonhosted.org/packages/39/77/ad8d2a6ecc19716eb488c8cf118de10f7802e14bdf61d136d7b52358d6b1/fonttools-4.60.0-cp312-cp312-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = 
"sha256:d3638905d3d77ac8791127ce181f7cb434f37e4204d8b2e31b8f1e154320b41f", size = 4922729, upload-time = "2025-09-17T11:32:33.659Z" }, - { url = "https://files.pythonhosted.org/packages/6b/48/aa543037c6e7788e1bc36b3f858ac70a59d32d0f45915263d0b330a35140/fonttools-4.60.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7968a26ef010ae89aabbb2f8e9dec1e2709a2541bb8620790451ee8aeb4f6fbf", size = 4967188, upload-time = "2025-09-17T11:32:35.74Z" }, - { url = "https://files.pythonhosted.org/packages/ac/58/e407d2028adc6387947eff8f2940b31f4ed40b9a83c2c7bbc8b9255126e2/fonttools-4.60.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1ef01ca7847c356b0fe026b7b92304bc31dc60a4218689ee0acc66652c1a36b2", size = 4910043, upload-time = "2025-09-17T11:32:38.054Z" }, - { url = "https://files.pythonhosted.org/packages/16/ef/e78519b3c296ef757a21b792fc6a785aa2ef9a2efb098083d8ed5f6ee2ba/fonttools-4.60.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:f3482d7ed7867edfcf785f77c1dffc876c4b2ddac19539c075712ff2a0703cf5", size = 5061980, upload-time = "2025-09-17T11:32:40.457Z" }, - { url = "https://files.pythonhosted.org/packages/00/4c/ad72444d1e3ef704ee90af8d5abf198016a39908d322bf41235562fb01a0/fonttools-4.60.0-cp312-cp312-win32.whl", hash = "sha256:8c937c4fe8addff575a984c9519433391180bf52cf35895524a07b520f376067", size = 2217750, upload-time = "2025-09-17T11:32:42.586Z" }, - { url = "https://files.pythonhosted.org/packages/46/55/3e8ac21963e130242f5a9ea2ebc57f5726d704bf4dcca89088b5b637b2d3/fonttools-4.60.0-cp312-cp312-win_amd64.whl", hash = "sha256:99b06d5d6f29f32e312adaed0367112f5ff2d300ea24363d377ec917daf9e8c5", size = 2266025, upload-time = "2025-09-17T11:32:44.8Z" }, - { url = "https://files.pythonhosted.org/packages/b4/6b/d090cd54abe88192fe3010f573508b2592cf1d1f98b14bcb799a8ad20525/fonttools-4.60.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:97100ba820936cdb5148b634e0884f0088699c7e2f1302ae7bba3747c7a19fb3", size = 
2824791, upload-time = "2025-09-17T11:32:47.002Z" }, - { url = "https://files.pythonhosted.org/packages/97/8c/7ccb5a27aac9a535623fe04935fb9f469a4f8a1253991af9fbac2fe88c17/fonttools-4.60.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:03fccf84f377f83e99a5328a9ebe6b41e16fcf64a1450c352b6aa7e0deedbc01", size = 2347081, upload-time = "2025-09-17T11:32:49.204Z" }, - { url = "https://files.pythonhosted.org/packages/f8/1a/c14f0bb20b4cb7849dc0519f0ab0da74318d52236dc23168530569958599/fonttools-4.60.0-cp313-cp313-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:a3ef06671f862cd7da78ab105fbf8dce9da3634a8f91b3a64ed5c29c0ac6a9a8", size = 4902095, upload-time = "2025-09-17T11:32:51.848Z" }, - { url = "https://files.pythonhosted.org/packages/c9/a0/c7c91f07c40de5399cbaec7d25e04c9afac6c8f80036a98c125efdb5fe1a/fonttools-4.60.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3f2195faf96594c238462c420c7eff97d1aa51de595434f806ec3952df428616", size = 4959137, upload-time = "2025-09-17T11:32:54.185Z" }, - { url = "https://files.pythonhosted.org/packages/38/d2/169e49498df9f2c721763aa39b0bf3d08cb762864ebc8a8ddb99f5ba7ec8/fonttools-4.60.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:3887008865fa4f56cff58a1878f1300ba81a4e34f76daf9b47234698493072ee", size = 4900467, upload-time = "2025-09-17T11:32:56.664Z" }, - { url = "https://files.pythonhosted.org/packages/cc/9c/bfb56b89c3eab8bcb739c7fd1e8a43285c8dd833e1e1d18d4f54f2f641af/fonttools-4.60.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:5567bd130378f21231d3856d8f0571dcdfcd77e47832978c26dabe572d456daa", size = 5043508, upload-time = "2025-09-17T11:32:58.944Z" }, - { url = "https://files.pythonhosted.org/packages/77/30/2b511c7eb99faee1fd9a0b42e984fb91275da3d681da650af4edf409d0fd/fonttools-4.60.0-cp313-cp313-win32.whl", hash = "sha256:699d0b521ec0b188ac11f2c14ccf6a926367795818ddf2bd00a273e9a052dd20", size = 2216037, 
upload-time = "2025-09-17T11:33:01.192Z" }, - { url = "https://files.pythonhosted.org/packages/3d/73/a2cc5ee4faeb0302cc81942c27f3b516801bf489fdc422a1b20090fff695/fonttools-4.60.0-cp313-cp313-win_amd64.whl", hash = "sha256:24296163268e7c800009711ce5c0e9997be8882c0bd546696c82ef45966163a6", size = 2265190, upload-time = "2025-09-17T11:33:03.935Z" }, - { url = "https://files.pythonhosted.org/packages/86/dd/a126706e45e0ce097cef6de4108b5597795acaa945fdbdd922dbc090d335/fonttools-4.60.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:b6fe3efdc956bdad95145cea906ad9ff345c17b706356dfc1098ce3230591343", size = 2821835, upload-time = "2025-09-17T11:33:06.094Z" }, - { url = "https://files.pythonhosted.org/packages/ac/90/5c17f311bbd983fd614b82a7a06da967b5d3c87e3e61cf34de6029a92ff4/fonttools-4.60.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:764b2aaab839762a3aa3207e5b3f0e0dfa41799e0b091edec5fcbccc584fdab5", size = 2344536, upload-time = "2025-09-17T11:33:08.574Z" }, - { url = "https://files.pythonhosted.org/packages/60/67/48c1a6229b2a5668c4111fbd1694ca417adedc1254c5cd2f9a11834c429d/fonttools-4.60.0-cp314-cp314-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b81c7c47d9e78106a4d70f1dbeb49150513171715e45e0d2661809f2b0e3f710", size = 4842494, upload-time = "2025-09-17T11:33:11.338Z" }, - { url = "https://files.pythonhosted.org/packages/13/3e/83b0b37d02b7e321cbe2b8fcec0aa18571f0a47d3dc222196404371d83b6/fonttools-4.60.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:799ff60ee66b300ebe1fe6632b1cc55a66400fe815cef7b034d076bce6b1d8fc", size = 4943203, upload-time = "2025-09-17T11:33:13.285Z" }, - { url = "https://files.pythonhosted.org/packages/c9/07/11163e49497c53392eaca210a474104e4987c17ca7731f8754ba0d416a67/fonttools-4.60.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f9878abe155ddd1b433bab95d027a686898a6afba961f3c5ca14b27488f2d772", size = 4889233, 
upload-time = "2025-09-17T11:33:15.175Z" }, - { url = "https://files.pythonhosted.org/packages/60/90/e85005d955cb26e7de015d5678778b8cc3293c0f3d717865675bd641fbfc/fonttools-4.60.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:ded432b7133ea4602fdb4731a4a7443a8e9548edad28987b99590cf6da626254", size = 4998335, upload-time = "2025-09-17T11:33:17.217Z" }, - { url = "https://files.pythonhosted.org/packages/2a/82/0374ad53729de6e3788ecdb8a3731ce6592c5ffa9bff823cef2ffe0164af/fonttools-4.60.0-cp314-cp314-win32.whl", hash = "sha256:5d97cf3a9245316d5978628c05642b939809c4f55ca632ca40744cb9de6e8d4a", size = 2219840, upload-time = "2025-09-17T11:33:19.494Z" }, - { url = "https://files.pythonhosted.org/packages/11/c3/804cd47453dcafb7976f9825b43cc0e61a2fe30eddb971b681cd72c4ca65/fonttools-4.60.0-cp314-cp314-win_amd64.whl", hash = "sha256:61b9ef46dd5e9dcb6f437eb0cc5ed83d5049e1bf9348e31974ffee1235db0f8f", size = 2269891, upload-time = "2025-09-17T11:33:21.743Z" }, - { url = "https://files.pythonhosted.org/packages/75/bf/1bd760aca04098e7028b4e0e5f73b41ff74b322275698071454652476a44/fonttools-4.60.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:bba7e3470cf353e1484a36dfb4108f431c2859e3f6097fe10118eeae92166773", size = 2893361, upload-time = "2025-09-17T11:33:23.68Z" }, - { url = "https://files.pythonhosted.org/packages/25/35/7a2c09aa990ed77f34924def383f44fc576a5596cc3df8438071e1baa1ac/fonttools-4.60.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:c5ac6439a38c27b3287063176b3303b34982024b01e2e95bba8ac1e45f6d41c1", size = 2374086, upload-time = "2025-09-17T11:33:25.988Z" }, - { url = "https://files.pythonhosted.org/packages/77/a9/f85ed2493e82837ff73421f3f7a1c3ae8f0b14051307418c916d9563da1f/fonttools-4.60.0-cp314-cp314t-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4acd21e9f125a1257da59edf7a6e9bd4abd76282770715c613f1fe482409e9f9", size = 4848766, upload-time = "2025-09-17T11:33:28.018Z" }, - { url = 
"https://files.pythonhosted.org/packages/d1/91/29830eda31ae9231a06d5246e5d0c686422d03456ed666e13576c24c3f97/fonttools-4.60.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b4a6fc53039ea047e35dc62b958af9cd397eedbc3fa42406d2910ae091b9ae37", size = 5084613, upload-time = "2025-09-17T11:33:30.562Z" }, - { url = "https://files.pythonhosted.org/packages/48/01/615905e7db2568fe1843145077e680443494b7caab2089527b7e112c7606/fonttools-4.60.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:ef34f44eadf133e94e82c775a33ee3091dd37ee0161c5f5ea224b46e3ce0fb8e", size = 4956620, upload-time = "2025-09-17T11:33:32.497Z" }, - { url = "https://files.pythonhosted.org/packages/97/8e/64e65255871ec2f13b6c00b5b12d08b928b504867cfb7e7ed73e5e941832/fonttools-4.60.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:d112cae3e7ad1bb5d7f7a60365fcf6c181374648e064a8c07617b240e7c828ee", size = 4973202, upload-time = "2025-09-17T11:33:34.561Z" }, - { url = "https://files.pythonhosted.org/packages/e0/6d/04d16243eb441e8de61074c7809e92d2e35df4cd11af5632e486bc630dab/fonttools-4.60.0-cp314-cp314t-win32.whl", hash = "sha256:0f7b2c251dc338973e892a1e153016114e7a75f6aac7a49b84d5d1a4c0608d08", size = 2281217, upload-time = "2025-09-17T11:33:36.965Z" }, - { url = "https://files.pythonhosted.org/packages/ab/5f/09bd2f9f28ef0d6f3620fa19699d11c4bc83ff8a2786d8ccdd97c209b19a/fonttools-4.60.0-cp314-cp314t-win_amd64.whl", hash = "sha256:c8a72771106bc7434098db35abecd84d608857f6e116d3ef00366b213c502ce9", size = 2344738, upload-time = "2025-09-17T11:33:39.372Z" }, - { url = "https://files.pythonhosted.org/packages/f9/a4/247d3e54eb5ed59e94e09866cfc4f9567e274fbf310ba390711851f63b3b/fonttools-4.60.0-py3-none-any.whl", hash = "sha256:496d26e4d14dcccdd6ada2e937e4d174d3138e3d73f5c9b6ec6eb2fd1dab4f66", size = 1142186, upload-time = "2025-09-17T11:33:59.287Z" }, -] - -[[package]] -name = "frozenlist" -version = "1.7.0" -source = { registry = "https://pypi.org/simple" 
} -sdist = { url = "https://files.pythonhosted.org/packages/79/b1/b64018016eeb087db503b038296fd782586432b9c077fc5c7839e9cb6ef6/frozenlist-1.7.0.tar.gz", hash = "sha256:2e310d81923c2437ea8670467121cc3e9b0f76d3043cc1d2331d56c7fb7a3a8f", size = 45078, upload-time = "2025-06-09T23:02:35.538Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/34/7e/803dde33760128acd393a27eb002f2020ddb8d99d30a44bfbaab31c5f08a/frozenlist-1.7.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:aa51e147a66b2d74de1e6e2cf5921890de6b0f4820b257465101d7f37b49fb5a", size = 82251, upload-time = "2025-06-09T23:00:16.279Z" }, - { url = "https://files.pythonhosted.org/packages/75/a9/9c2c5760b6ba45eae11334db454c189d43d34a4c0b489feb2175e5e64277/frozenlist-1.7.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9b35db7ce1cd71d36ba24f80f0c9e7cff73a28d7a74e91fe83e23d27c7828750", size = 48183, upload-time = "2025-06-09T23:00:17.698Z" }, - { url = "https://files.pythonhosted.org/packages/47/be/4038e2d869f8a2da165f35a6befb9158c259819be22eeaf9c9a8f6a87771/frozenlist-1.7.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:34a69a85e34ff37791e94542065c8416c1afbf820b68f720452f636d5fb990cd", size = 47107, upload-time = "2025-06-09T23:00:18.952Z" }, - { url = "https://files.pythonhosted.org/packages/79/26/85314b8a83187c76a37183ceed886381a5f992975786f883472fcb6dc5f2/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a646531fa8d82c87fe4bb2e596f23173caec9185bfbca5d583b4ccfb95183e2", size = 237333, upload-time = "2025-06-09T23:00:20.275Z" }, - { url = "https://files.pythonhosted.org/packages/1f/fd/e5b64f7d2c92a41639ffb2ad44a6a82f347787abc0c7df5f49057cf11770/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:79b2ffbba483f4ed36a0f236ccb85fbb16e670c9238313709638167670ba235f", size = 231724, upload-time = "2025-06-09T23:00:21.705Z" }, - { url = 
"https://files.pythonhosted.org/packages/20/fb/03395c0a43a5976af4bf7534759d214405fbbb4c114683f434dfdd3128ef/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a26f205c9ca5829cbf82bb2a84b5c36f7184c4316617d7ef1b271a56720d6b30", size = 245842, upload-time = "2025-06-09T23:00:23.148Z" }, - { url = "https://files.pythonhosted.org/packages/d0/15/c01c8e1dffdac5d9803507d824f27aed2ba76b6ed0026fab4d9866e82f1f/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bcacfad3185a623fa11ea0e0634aac7b691aa925d50a440f39b458e41c561d98", size = 239767, upload-time = "2025-06-09T23:00:25.103Z" }, - { url = "https://files.pythonhosted.org/packages/14/99/3f4c6fe882c1f5514b6848aa0a69b20cb5e5d8e8f51a339d48c0e9305ed0/frozenlist-1.7.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:72c1b0fe8fe451b34f12dce46445ddf14bd2a5bcad7e324987194dc8e3a74c86", size = 224130, upload-time = "2025-06-09T23:00:27.061Z" }, - { url = "https://files.pythonhosted.org/packages/4d/83/220a374bd7b2aeba9d0725130665afe11de347d95c3620b9b82cc2fcab97/frozenlist-1.7.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61d1a5baeaac6c0798ff6edfaeaa00e0e412d49946c53fae8d4b8e8b3566c4ae", size = 235301, upload-time = "2025-06-09T23:00:29.02Z" }, - { url = "https://files.pythonhosted.org/packages/03/3c/3e3390d75334a063181625343e8daab61b77e1b8214802cc4e8a1bb678fc/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7edf5c043c062462f09b6820de9854bf28cc6cc5b6714b383149745e287181a8", size = 234606, upload-time = "2025-06-09T23:00:30.514Z" }, - { url = "https://files.pythonhosted.org/packages/23/1e/58232c19608b7a549d72d9903005e2d82488f12554a32de2d5fb59b9b1ba/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:d50ac7627b3a1bd2dcef6f9da89a772694ec04d9a61b66cf87f7d9446b4a0c31", size = 248372, 
upload-time = "2025-06-09T23:00:31.966Z" }, - { url = "https://files.pythonhosted.org/packages/c0/a4/e4a567e01702a88a74ce8a324691e62a629bf47d4f8607f24bf1c7216e7f/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:ce48b2fece5aeb45265bb7a58259f45027db0abff478e3077e12b05b17fb9da7", size = 229860, upload-time = "2025-06-09T23:00:33.375Z" }, - { url = "https://files.pythonhosted.org/packages/73/a6/63b3374f7d22268b41a9db73d68a8233afa30ed164c46107b33c4d18ecdd/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:fe2365ae915a1fafd982c146754e1de6ab3478def8a59c86e1f7242d794f97d5", size = 245893, upload-time = "2025-06-09T23:00:35.002Z" }, - { url = "https://files.pythonhosted.org/packages/6d/eb/d18b3f6e64799a79673c4ba0b45e4cfbe49c240edfd03a68be20002eaeaa/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:45a6f2fdbd10e074e8814eb98b05292f27bad7d1883afbe009d96abdcf3bc898", size = 246323, upload-time = "2025-06-09T23:00:36.468Z" }, - { url = "https://files.pythonhosted.org/packages/5a/f5/720f3812e3d06cd89a1d5db9ff6450088b8f5c449dae8ffb2971a44da506/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:21884e23cffabb157a9dd7e353779077bf5b8f9a58e9b262c6caad2ef5f80a56", size = 233149, upload-time = "2025-06-09T23:00:37.963Z" }, - { url = "https://files.pythonhosted.org/packages/69/68/03efbf545e217d5db8446acfd4c447c15b7c8cf4dbd4a58403111df9322d/frozenlist-1.7.0-cp311-cp311-win32.whl", hash = "sha256:284d233a8953d7b24f9159b8a3496fc1ddc00f4db99c324bd5fb5f22d8698ea7", size = 39565, upload-time = "2025-06-09T23:00:39.753Z" }, - { url = "https://files.pythonhosted.org/packages/58/17/fe61124c5c333ae87f09bb67186d65038834a47d974fc10a5fadb4cc5ae1/frozenlist-1.7.0-cp311-cp311-win_amd64.whl", hash = "sha256:387cbfdcde2f2353f19c2f66bbb52406d06ed77519ac7ee21be0232147c2592d", size = 44019, upload-time = "2025-06-09T23:00:40.988Z" }, - { url = 
"https://files.pythonhosted.org/packages/ef/a2/c8131383f1e66adad5f6ecfcce383d584ca94055a34d683bbb24ac5f2f1c/frozenlist-1.7.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3dbf9952c4bb0e90e98aec1bd992b3318685005702656bc6f67c1a32b76787f2", size = 81424, upload-time = "2025-06-09T23:00:42.24Z" }, - { url = "https://files.pythonhosted.org/packages/4c/9d/02754159955088cb52567337d1113f945b9e444c4960771ea90eb73de8db/frozenlist-1.7.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:1f5906d3359300b8a9bb194239491122e6cf1444c2efb88865426f170c262cdb", size = 47952, upload-time = "2025-06-09T23:00:43.481Z" }, - { url = "https://files.pythonhosted.org/packages/01/7a/0046ef1bd6699b40acd2067ed6d6670b4db2f425c56980fa21c982c2a9db/frozenlist-1.7.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3dabd5a8f84573c8d10d8859a50ea2dec01eea372031929871368c09fa103478", size = 46688, upload-time = "2025-06-09T23:00:44.793Z" }, - { url = "https://files.pythonhosted.org/packages/d6/a2/a910bafe29c86997363fb4c02069df4ff0b5bc39d33c5198b4e9dd42d8f8/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa57daa5917f1738064f302bf2626281a1cb01920c32f711fbc7bc36111058a8", size = 243084, upload-time = "2025-06-09T23:00:46.125Z" }, - { url = "https://files.pythonhosted.org/packages/64/3e/5036af9d5031374c64c387469bfcc3af537fc0f5b1187d83a1cf6fab1639/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c193dda2b6d49f4c4398962810fa7d7c78f032bf45572b3e04dd5249dff27e08", size = 233524, upload-time = "2025-06-09T23:00:47.73Z" }, - { url = "https://files.pythonhosted.org/packages/06/39/6a17b7c107a2887e781a48ecf20ad20f1c39d94b2a548c83615b5b879f28/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bfe2b675cf0aaa6d61bf8fbffd3c274b3c9b7b1623beb3809df8a81399a4a9c4", size = 248493, upload-time = "2025-06-09T23:00:49.742Z" }, - { url = 
"https://files.pythonhosted.org/packages/be/00/711d1337c7327d88c44d91dd0f556a1c47fb99afc060ae0ef66b4d24793d/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8fc5d5cda37f62b262405cf9652cf0856839c4be8ee41be0afe8858f17f4c94b", size = 244116, upload-time = "2025-06-09T23:00:51.352Z" }, - { url = "https://files.pythonhosted.org/packages/24/fe/74e6ec0639c115df13d5850e75722750adabdc7de24e37e05a40527ca539/frozenlist-1.7.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b0d5ce521d1dd7d620198829b87ea002956e4319002ef0bc8d3e6d045cb4646e", size = 224557, upload-time = "2025-06-09T23:00:52.855Z" }, - { url = "https://files.pythonhosted.org/packages/8d/db/48421f62a6f77c553575201e89048e97198046b793f4a089c79a6e3268bd/frozenlist-1.7.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:488d0a7d6a0008ca0db273c542098a0fa9e7dfaa7e57f70acef43f32b3f69dca", size = 241820, upload-time = "2025-06-09T23:00:54.43Z" }, - { url = "https://files.pythonhosted.org/packages/1d/fa/cb4a76bea23047c8462976ea7b7a2bf53997a0ca171302deae9d6dd12096/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:15a7eaba63983d22c54d255b854e8108e7e5f3e89f647fc854bd77a237e767df", size = 236542, upload-time = "2025-06-09T23:00:56.409Z" }, - { url = "https://files.pythonhosted.org/packages/5d/32/476a4b5cfaa0ec94d3f808f193301debff2ea42288a099afe60757ef6282/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:1eaa7e9c6d15df825bf255649e05bd8a74b04a4d2baa1ae46d9c2d00b2ca2cb5", size = 249350, upload-time = "2025-06-09T23:00:58.468Z" }, - { url = "https://files.pythonhosted.org/packages/8d/ba/9a28042f84a6bf8ea5dbc81cfff8eaef18d78b2a1ad9d51c7bc5b029ad16/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:e4389e06714cfa9d47ab87f784a7c5be91d3934cd6e9a7b85beef808297cc025", size = 225093, upload-time = 
"2025-06-09T23:01:00.015Z" }, - { url = "https://files.pythonhosted.org/packages/bc/29/3a32959e68f9cf000b04e79ba574527c17e8842e38c91d68214a37455786/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:73bd45e1488c40b63fe5a7df892baf9e2a4d4bb6409a2b3b78ac1c6236178e01", size = 245482, upload-time = "2025-06-09T23:01:01.474Z" }, - { url = "https://files.pythonhosted.org/packages/80/e8/edf2f9e00da553f07f5fa165325cfc302dead715cab6ac8336a5f3d0adc2/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:99886d98e1643269760e5fe0df31e5ae7050788dd288947f7f007209b8c33f08", size = 249590, upload-time = "2025-06-09T23:01:02.961Z" }, - { url = "https://files.pythonhosted.org/packages/1c/80/9a0eb48b944050f94cc51ee1c413eb14a39543cc4f760ed12657a5a3c45a/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:290a172aae5a4c278c6da8a96222e6337744cd9c77313efe33d5670b9f65fc43", size = 237785, upload-time = "2025-06-09T23:01:05.095Z" }, - { url = "https://files.pythonhosted.org/packages/f3/74/87601e0fb0369b7a2baf404ea921769c53b7ae00dee7dcfe5162c8c6dbf0/frozenlist-1.7.0-cp312-cp312-win32.whl", hash = "sha256:426c7bc70e07cfebc178bc4c2bf2d861d720c4fff172181eeb4a4c41d4ca2ad3", size = 39487, upload-time = "2025-06-09T23:01:06.54Z" }, - { url = "https://files.pythonhosted.org/packages/0b/15/c026e9a9fc17585a9d461f65d8593d281fedf55fbf7eb53f16c6df2392f9/frozenlist-1.7.0-cp312-cp312-win_amd64.whl", hash = "sha256:563b72efe5da92e02eb68c59cb37205457c977aa7a449ed1b37e6939e5c47c6a", size = 43874, upload-time = "2025-06-09T23:01:07.752Z" }, - { url = "https://files.pythonhosted.org/packages/24/90/6b2cebdabdbd50367273c20ff6b57a3dfa89bd0762de02c3a1eb42cb6462/frozenlist-1.7.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee80eeda5e2a4e660651370ebffd1286542b67e268aa1ac8d6dbe973120ef7ee", size = 79791, upload-time = "2025-06-09T23:01:09.368Z" }, - { url = 
"https://files.pythonhosted.org/packages/83/2e/5b70b6a3325363293fe5fc3ae74cdcbc3e996c2a11dde2fd9f1fb0776d19/frozenlist-1.7.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d1a81c85417b914139e3a9b995d4a1c84559afc839a93cf2cb7f15e6e5f6ed2d", size = 47165, upload-time = "2025-06-09T23:01:10.653Z" }, - { url = "https://files.pythonhosted.org/packages/f4/25/a0895c99270ca6966110f4ad98e87e5662eab416a17e7fd53c364bf8b954/frozenlist-1.7.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cbb65198a9132ebc334f237d7b0df163e4de83fb4f2bdfe46c1e654bdb0c5d43", size = 45881, upload-time = "2025-06-09T23:01:12.296Z" }, - { url = "https://files.pythonhosted.org/packages/19/7c/71bb0bbe0832793c601fff68cd0cf6143753d0c667f9aec93d3c323f4b55/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dab46c723eeb2c255a64f9dc05b8dd601fde66d6b19cdb82b2e09cc6ff8d8b5d", size = 232409, upload-time = "2025-06-09T23:01:13.641Z" }, - { url = "https://files.pythonhosted.org/packages/c0/45/ed2798718910fe6eb3ba574082aaceff4528e6323f9a8570be0f7028d8e9/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6aeac207a759d0dedd2e40745575ae32ab30926ff4fa49b1635def65806fddee", size = 225132, upload-time = "2025-06-09T23:01:15.264Z" }, - { url = "https://files.pythonhosted.org/packages/ba/e2/8417ae0f8eacb1d071d4950f32f229aa6bf68ab69aab797b72a07ea68d4f/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bd8c4e58ad14b4fa7802b8be49d47993182fdd4023393899632c88fd8cd994eb", size = 237638, upload-time = "2025-06-09T23:01:16.752Z" }, - { url = "https://files.pythonhosted.org/packages/f8/b7/2ace5450ce85f2af05a871b8c8719b341294775a0a6c5585d5e6170f2ce7/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:04fb24d104f425da3540ed83cbfc31388a586a7696142004c577fa61c6298c3f", size = 233539, upload-time = "2025-06-09T23:01:18.202Z" }, - { url = 
"https://files.pythonhosted.org/packages/46/b9/6989292c5539553dba63f3c83dc4598186ab2888f67c0dc1d917e6887db6/frozenlist-1.7.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6a5c505156368e4ea6b53b5ac23c92d7edc864537ff911d2fb24c140bb175e60", size = 215646, upload-time = "2025-06-09T23:01:19.649Z" }, - { url = "https://files.pythonhosted.org/packages/72/31/bc8c5c99c7818293458fe745dab4fd5730ff49697ccc82b554eb69f16a24/frozenlist-1.7.0-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8bd7eb96a675f18aa5c553eb7ddc24a43c8c18f22e1f9925528128c052cdbe00", size = 232233, upload-time = "2025-06-09T23:01:21.175Z" }, - { url = "https://files.pythonhosted.org/packages/59/52/460db4d7ba0811b9ccb85af996019f5d70831f2f5f255f7cc61f86199795/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:05579bf020096fe05a764f1f84cd104a12f78eaab68842d036772dc6d4870b4b", size = 227996, upload-time = "2025-06-09T23:01:23.098Z" }, - { url = "https://files.pythonhosted.org/packages/ba/c9/f4b39e904c03927b7ecf891804fd3b4df3db29b9e487c6418e37988d6e9d/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:376b6222d114e97eeec13d46c486facd41d4f43bab626b7c3f6a8b4e81a5192c", size = 242280, upload-time = "2025-06-09T23:01:24.808Z" }, - { url = "https://files.pythonhosted.org/packages/b8/33/3f8d6ced42f162d743e3517781566b8481322be321b486d9d262adf70bfb/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:0aa7e176ebe115379b5b1c95b4096fb1c17cce0847402e227e712c27bdb5a949", size = 217717, upload-time = "2025-06-09T23:01:26.28Z" }, - { url = "https://files.pythonhosted.org/packages/3e/e8/ad683e75da6ccef50d0ab0c2b2324b32f84fc88ceee778ed79b8e2d2fe2e/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3fbba20e662b9c2130dc771e332a99eff5da078b2b2648153a40669a6d0e36ca", size = 236644, upload-time = "2025-06-09T23:01:27.887Z" }, - { 
url = "https://files.pythonhosted.org/packages/b2/14/8d19ccdd3799310722195a72ac94ddc677541fb4bef4091d8e7775752360/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:f3f4410a0a601d349dd406b5713fec59b4cee7e71678d5b17edda7f4655a940b", size = 238879, upload-time = "2025-06-09T23:01:29.524Z" }, - { url = "https://files.pythonhosted.org/packages/ce/13/c12bf657494c2fd1079a48b2db49fa4196325909249a52d8f09bc9123fd7/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e2cdfaaec6a2f9327bf43c933c0319a7c429058e8537c508964a133dffee412e", size = 232502, upload-time = "2025-06-09T23:01:31.287Z" }, - { url = "https://files.pythonhosted.org/packages/d7/8b/e7f9dfde869825489382bc0d512c15e96d3964180c9499efcec72e85db7e/frozenlist-1.7.0-cp313-cp313-win32.whl", hash = "sha256:5fc4df05a6591c7768459caba1b342d9ec23fa16195e744939ba5914596ae3e1", size = 39169, upload-time = "2025-06-09T23:01:35.503Z" }, - { url = "https://files.pythonhosted.org/packages/35/89/a487a98d94205d85745080a37860ff5744b9820a2c9acbcdd9440bfddf98/frozenlist-1.7.0-cp313-cp313-win_amd64.whl", hash = "sha256:52109052b9791a3e6b5d1b65f4b909703984b770694d3eb64fad124c835d7cba", size = 43219, upload-time = "2025-06-09T23:01:36.784Z" }, - { url = "https://files.pythonhosted.org/packages/56/d5/5c4cf2319a49eddd9dd7145e66c4866bdc6f3dbc67ca3d59685149c11e0d/frozenlist-1.7.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:a6f86e4193bb0e235ef6ce3dde5cbabed887e0b11f516ce8a0f4d3b33078ec2d", size = 84345, upload-time = "2025-06-09T23:01:38.295Z" }, - { url = "https://files.pythonhosted.org/packages/a4/7d/ec2c1e1dc16b85bc9d526009961953df9cec8481b6886debb36ec9107799/frozenlist-1.7.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:82d664628865abeb32d90ae497fb93df398a69bb3434463d172b80fc25b0dd7d", size = 48880, upload-time = "2025-06-09T23:01:39.887Z" }, - { url = 
"https://files.pythonhosted.org/packages/69/86/f9596807b03de126e11e7d42ac91e3d0b19a6599c714a1989a4e85eeefc4/frozenlist-1.7.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:912a7e8375a1c9a68325a902f3953191b7b292aa3c3fb0d71a216221deca460b", size = 48498, upload-time = "2025-06-09T23:01:41.318Z" }, - { url = "https://files.pythonhosted.org/packages/5e/cb/df6de220f5036001005f2d726b789b2c0b65f2363b104bbc16f5be8084f8/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9537c2777167488d539bc5de2ad262efc44388230e5118868e172dd4a552b146", size = 292296, upload-time = "2025-06-09T23:01:42.685Z" }, - { url = "https://files.pythonhosted.org/packages/83/1f/de84c642f17c8f851a2905cee2dae401e5e0daca9b5ef121e120e19aa825/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:f34560fb1b4c3e30ba35fa9a13894ba39e5acfc5f60f57d8accde65f46cc5e74", size = 273103, upload-time = "2025-06-09T23:01:44.166Z" }, - { url = "https://files.pythonhosted.org/packages/88/3c/c840bfa474ba3fa13c772b93070893c6e9d5c0350885760376cbe3b6c1b3/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:acd03d224b0175f5a850edc104ac19040d35419eddad04e7cf2d5986d98427f1", size = 292869, upload-time = "2025-06-09T23:01:45.681Z" }, - { url = "https://files.pythonhosted.org/packages/a6/1c/3efa6e7d5a39a1d5ef0abeb51c48fb657765794a46cf124e5aca2c7a592c/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f2038310bc582f3d6a09b3816ab01737d60bf7b1ec70f5356b09e84fb7408ab1", size = 291467, upload-time = "2025-06-09T23:01:47.234Z" }, - { url = "https://files.pythonhosted.org/packages/4f/00/d5c5e09d4922c395e2f2f6b79b9a20dab4b67daaf78ab92e7729341f61f6/frozenlist-1.7.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b8c05e4c8e5f36e5e088caa1bf78a687528f83c043706640a92cb76cd6999384", size = 
266028, upload-time = "2025-06-09T23:01:48.819Z" }, - { url = "https://files.pythonhosted.org/packages/4e/27/72765be905619dfde25a7f33813ac0341eb6b076abede17a2e3fbfade0cb/frozenlist-1.7.0-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:765bb588c86e47d0b68f23c1bee323d4b703218037765dcf3f25c838c6fecceb", size = 284294, upload-time = "2025-06-09T23:01:50.394Z" }, - { url = "https://files.pythonhosted.org/packages/88/67/c94103a23001b17808eb7dd1200c156bb69fb68e63fcf0693dde4cd6228c/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:32dc2e08c67d86d0969714dd484fd60ff08ff81d1a1e40a77dd34a387e6ebc0c", size = 281898, upload-time = "2025-06-09T23:01:52.234Z" }, - { url = "https://files.pythonhosted.org/packages/42/34/a3e2c00c00f9e2a9db5653bca3fec306349e71aff14ae45ecc6d0951dd24/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:c0303e597eb5a5321b4de9c68e9845ac8f290d2ab3f3e2c864437d3c5a30cd65", size = 290465, upload-time = "2025-06-09T23:01:53.788Z" }, - { url = "https://files.pythonhosted.org/packages/bb/73/f89b7fbce8b0b0c095d82b008afd0590f71ccb3dee6eee41791cf8cd25fd/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:a47f2abb4e29b3a8d0b530f7c3598badc6b134562b1a5caee867f7c62fee51e3", size = 266385, upload-time = "2025-06-09T23:01:55.769Z" }, - { url = "https://files.pythonhosted.org/packages/cd/45/e365fdb554159462ca12df54bc59bfa7a9a273ecc21e99e72e597564d1ae/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:3d688126c242a6fabbd92e02633414d40f50bb6002fa4cf995a1d18051525657", size = 288771, upload-time = "2025-06-09T23:01:57.4Z" }, - { url = "https://files.pythonhosted.org/packages/00/11/47b6117002a0e904f004d70ec5194fe9144f117c33c851e3d51c765962d0/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:4e7e9652b3d367c7bd449a727dc79d5043f48b88d0cbfd4f9f1060cf2b414104", size = 288206, upload-time = 
"2025-06-09T23:01:58.936Z" }, - { url = "https://files.pythonhosted.org/packages/40/37/5f9f3c3fd7f7746082ec67bcdc204db72dad081f4f83a503d33220a92973/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:1a85e345b4c43db8b842cab1feb41be5cc0b10a1830e6295b69d7310f99becaf", size = 282620, upload-time = "2025-06-09T23:02:00.493Z" }, - { url = "https://files.pythonhosted.org/packages/0b/31/8fbc5af2d183bff20f21aa743b4088eac4445d2bb1cdece449ae80e4e2d1/frozenlist-1.7.0-cp313-cp313t-win32.whl", hash = "sha256:3a14027124ddb70dfcee5148979998066897e79f89f64b13328595c4bdf77c81", size = 43059, upload-time = "2025-06-09T23:02:02.072Z" }, - { url = "https://files.pythonhosted.org/packages/bb/ed/41956f52105b8dbc26e457c5705340c67c8cc2b79f394b79bffc09d0e938/frozenlist-1.7.0-cp313-cp313t-win_amd64.whl", hash = "sha256:3bf8010d71d4507775f658e9823210b7427be36625b387221642725b515dcf3e", size = 47516, upload-time = "2025-06-09T23:02:03.779Z" }, - { url = "https://files.pythonhosted.org/packages/ee/45/b82e3c16be2182bff01179db177fe144d58b5dc787a7d4492c6ed8b9317f/frozenlist-1.7.0-py3-none-any.whl", hash = "sha256:9a5af342e34f7e97caf8c995864c7a396418ae2859cc6fdf1b1073020d516a7e", size = 13106, upload-time = "2025-06-09T23:02:34.204Z" }, -] - -[[package]] -name = "fsspec" -version = "2025.3.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/45/d8/8425e6ba5fcec61a1d16e41b1b71d2bf9344f1fe48012c2b48b9620feae5/fsspec-2025.3.2.tar.gz", hash = "sha256:e52c77ef398680bbd6a98c0e628fbc469491282981209907bbc8aea76a04fdc6", size = 299281, upload-time = "2025-03-31T15:27:08.524Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/44/4b/e0cfc1a6f17e990f3e64b7d941ddc4acdc7b19d6edd51abf495f32b1a9e4/fsspec-2025.3.2-py3-none-any.whl", hash = "sha256:2daf8dc3d1dfa65b6aa37748d112773a7a08416f6c70d96b264c96476ecaf711", size = 194435, upload-time = "2025-03-31T15:27:07.028Z" }, -] - -[[package]] -name = "fuzzforge-ai" -version = 
"0.7.0" -source = { editable = "../ai" } -dependencies = [ - { name = "a2a-sdk" }, - { name = "agentops" }, - { name = "cognee" }, - { name = "fastmcp" }, - { name = "google-adk" }, - { name = "httpx" }, - { name = "litellm" }, - { name = "mcp" }, - { name = "python-dotenv" }, - { name = "rich" }, - { name = "typing-extensions" }, - { name = "uvicorn" }, -] - -[package.metadata] -requires-dist = [ - { name = "a2a-sdk" }, - { name = "agentops" }, - { name = "black", marker = "extra == 'dev'" }, - { name = "cognee", specifier = ">=0.3.0" }, - { name = "fastmcp" }, - { name = "google-adk" }, - { name = "httpx" }, - { name = "litellm" }, - { name = "mcp" }, - { name = "pytest", marker = "extra == 'dev'" }, - { name = "pytest-asyncio", marker = "extra == 'dev'" }, - { name = "python-dotenv" }, - { name = "rich" }, - { name = "ruff", marker = "extra == 'dev'" }, - { name = "typing-extensions" }, - { name = "uvicorn" }, -] -provides-extras = ["dev"] - -[package.metadata.requires-dev] -dev = [ - { name = "pytest" }, - { name = "pytest-asyncio" }, -] - -[[package]] -name = "fuzzforge-cli" -version = "0.7.0" -source = { editable = "." 
} -dependencies = [ - { name = "fuzzforge-ai" }, - { name = "fuzzforge-sdk" }, - { name = "httpx" }, - { name = "pydantic" }, - { name = "pyyaml" }, - { name = "rich" }, - { name = "sseclient-py" }, - { name = "typer" }, - { name = "websockets" }, -] - -[package.optional-dependencies] -dev = [ - { name = "black" }, - { name = "isort" }, - { name = "mypy" }, - { name = "pytest" }, - { name = "pytest-asyncio" }, -] - -[package.metadata] -requires-dist = [ - { name = "black", marker = "extra == 'dev'", specifier = ">=24.0.0" }, - { name = "fuzzforge-ai", editable = "../ai" }, - { name = "fuzzforge-sdk", editable = "../sdk" }, - { name = "httpx", specifier = ">=0.27.0" }, - { name = "isort", marker = "extra == 'dev'", specifier = ">=5.13.0" }, - { name = "mypy", marker = "extra == 'dev'", specifier = ">=1.11.0" }, - { name = "pydantic", specifier = ">=2.0.0" }, - { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.0.0" }, - { name = "pytest-asyncio", marker = "extra == 'dev'", specifier = ">=0.23.0" }, - { name = "pyyaml", specifier = ">=6.0.0" }, - { name = "rich", specifier = ">=13.0.0" }, - { name = "sseclient-py", specifier = ">=1.8.0" }, - { name = "typer", specifier = ">=0.12.0" }, - { name = "websockets", specifier = ">=13.0" }, -] -provides-extras = ["dev"] - -[[package]] -name = "fuzzforge-sdk" -version = "0.7.0" -source = { editable = "../sdk" } -dependencies = [ - { name = "httpx" }, - { name = "pydantic" }, - { name = "sseclient-py" }, - { name = "websockets" }, -] - -[package.metadata] -requires-dist = [ - { name = "black", marker = "extra == 'dev'", specifier = ">=24.0.0" }, - { name = "httpx", specifier = ">=0.27.0" }, - { name = "isort", marker = "extra == 'dev'", specifier = ">=5.13.0" }, - { name = "mypy", marker = "extra == 'dev'", specifier = ">=1.11.0" }, - { name = "pydantic", specifier = ">=2.0.0" }, - { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.0.0" }, - { name = "pytest-asyncio", marker = "extra == 'dev'", 
specifier = ">=0.23.0" }, - { name = "pytest-mock", marker = "extra == 'dev'", specifier = ">=3.14.0" }, - { name = "sseclient-py", specifier = ">=1.8.0" }, - { name = "websockets", specifier = ">=13.0" }, -] -provides-extras = ["dev"] - -[[package]] -name = "gitdb" -version = "4.0.12" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "smmap" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/72/94/63b0fc47eb32792c7ba1fe1b694daec9a63620db1e313033d18140c2320a/gitdb-4.0.12.tar.gz", hash = "sha256:5ef71f855d191a3326fcfbc0d5da835f26b13fbcba60c32c21091c349ffdb571", size = 394684, upload-time = "2025-01-02T07:20:46.413Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a0/61/5c78b91c3143ed5c14207f463aecfc8f9dbb5092fb2869baf37c273b2705/gitdb-4.0.12-py3-none-any.whl", hash = "sha256:67073e15955400952c6565cc3e707c554a4eea2e428946f7a4c162fab9bd9bcf", size = 62794, upload-time = "2025-01-02T07:20:43.624Z" }, -] - -[[package]] -name = "gitpython" -version = "3.1.45" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "gitdb" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/9a/c8/dd58967d119baab745caec2f9d853297cec1989ec1d63f677d3880632b88/gitpython-3.1.45.tar.gz", hash = "sha256:85b0ee964ceddf211c41b9f27a49086010a190fd8132a24e21f362a4b36a791c", size = 215076, upload-time = "2025-07-24T03:45:54.871Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/01/61/d4b89fec821f72385526e1b9d9a3a0385dda4a72b206d28049e2c7cd39b8/gitpython-3.1.45-py3-none-any.whl", hash = "sha256:8908cb2e02fb3b93b7eb0f2827125cb699869470432cc885f019b8fd0fccff77", size = 208168, upload-time = "2025-07-24T03:45:52.517Z" }, -] - -[[package]] -name = "giturlparse" -version = "0.12.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/37/5f/543dc54c82842376139748226e5aa61eb95093992f63dd495af9c6b4f076/giturlparse-0.12.0.tar.gz", hash = 
"sha256:c0fff7c21acc435491b1779566e038757a205c1ffdcb47e4f81ea52ad8c3859a", size = 14907, upload-time = "2023-09-24T07:22:36.795Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/dd/94/c6ff3388b8e3225a014e55aed957188639aa0966443e0408d38f0c9614a7/giturlparse-0.12.0-py2.py3-none-any.whl", hash = "sha256:412b74f2855f1da2fefa89fd8dde62df48476077a72fc19b62039554d27360eb", size = 15752, upload-time = "2023-09-24T07:22:35.465Z" }, -] - -[[package]] -name = "google-adk" -version = "1.14.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "absolufy-imports" }, - { name = "anyio" }, - { name = "authlib" }, - { name = "click" }, - { name = "fastapi" }, - { name = "google-api-python-client" }, - { name = "google-cloud-aiplatform", extra = ["agent-engines"] }, - { name = "google-cloud-bigtable" }, - { name = "google-cloud-secret-manager" }, - { name = "google-cloud-spanner" }, - { name = "google-cloud-speech" }, - { name = "google-cloud-storage" }, - { name = "google-genai" }, - { name = "graphviz" }, - { name = "mcp" }, - { name = "opentelemetry-api" }, - { name = "opentelemetry-exporter-gcp-trace" }, - { name = "opentelemetry-sdk" }, - { name = "pydantic" }, - { name = "python-dateutil" }, - { name = "python-dotenv" }, - { name = "pyyaml" }, - { name = "requests" }, - { name = "sqlalchemy" }, - { name = "sqlalchemy-spanner" }, - { name = "starlette" }, - { name = "tenacity" }, - { name = "typing-extensions" }, - { name = "tzlocal" }, - { name = "uvicorn" }, - { name = "watchdog" }, - { name = "websockets" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/35/fe/0efba60d22bfcd7ab18f48d23771f0701664fd93be247eddc42592b9b68f/google_adk-1.14.1.tar.gz", hash = "sha256:06caab4599286123eceb9348e4accb6c3c1476b8d9b2b13f078a975c8ace966f", size = 1681879, upload-time = "2025-09-15T00:06:48.823Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/31/74/0b68fab470f13e80fd135bcf890c13bb1154804c1eaaff60dd1f5995027c/google_adk-1.14.1-py3-none-any.whl", hash = "sha256:acb31ed41d3b05b0d3a65cce76f6ef1289385f49a72164a07dae56190b648d50", size = 1922802, upload-time = "2025-09-15T00:06:47.011Z" }, -] - -[[package]] -name = "google-api-core" -version = "2.25.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-auth" }, - { name = "googleapis-common-protos" }, - { name = "proto-plus" }, - { name = "protobuf" }, - { name = "requests" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/dc/21/e9d043e88222317afdbdb567165fdbc3b0aad90064c7e0c9eb0ad9955ad8/google_api_core-2.25.1.tar.gz", hash = "sha256:d2aaa0b13c78c61cb3f4282c464c046e45fbd75755683c9c525e6e8f7ed0a5e8", size = 165443, upload-time = "2025-06-12T20:52:20.439Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/14/4b/ead00905132820b623732b175d66354e9d3e69fcf2a5dcdab780664e7896/google_api_core-2.25.1-py3-none-any.whl", hash = "sha256:8a2a56c1fef82987a524371f99f3bd0143702fecc670c72e600c1cda6bf8dbb7", size = 160807, upload-time = "2025-06-12T20:52:19.334Z" }, -] - -[package.optional-dependencies] -grpc = [ - { name = "grpcio" }, - { name = "grpcio-status" }, -] - -[[package]] -name = "google-api-python-client" -version = "2.182.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core" }, - { name = "google-auth" }, - { name = "google-auth-httplib2" }, - { name = "httplib2" }, - { name = "uritemplate" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/6f/cb/b85b1d7d7fd520739fb70c4878f1f414043c3c34434bc90ba9d4f93366ed/google_api_python_client-2.182.0.tar.gz", hash = "sha256:cb2aa127e33c3a31e89a06f39cf9de982db90a98dee020911b21013afafad35f", size = 13599318, upload-time = "2025-09-16T21:10:57.97Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/c1/29/76dabe97ebb710ca9a308f0415b2206e37d149983ec2becbf66525c52322/google_api_python_client-2.182.0-py3-none-any.whl", hash = "sha256:a9b071036d41a17991d8fbf27bedb61f2888a39ae5696cb5a326bf999b2d5209", size = 14168745, upload-time = "2025-09-16T21:10:54.657Z" }, -] - -[[package]] -name = "google-auth" -version = "2.40.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "cachetools" }, - { name = "pyasn1-modules" }, - { name = "rsa" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/9e/9b/e92ef23b84fa10a64ce4831390b7a4c2e53c0132568d99d4ae61d04c8855/google_auth-2.40.3.tar.gz", hash = "sha256:500c3a29adedeb36ea9cf24b8d10858e152f2412e3ca37829b3fa18e33d63b77", size = 281029, upload-time = "2025-06-04T18:04:57.577Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/17/63/b19553b658a1692443c62bd07e5868adaa0ad746a0751ba62c59568cd45b/google_auth-2.40.3-py2.py3-none-any.whl", hash = "sha256:1370d4593e86213563547f97a92752fc658456fe4514c809544f330fed45a7ca", size = 216137, upload-time = "2025-06-04T18:04:55.573Z" }, -] - -[[package]] -name = "google-auth-httplib2" -version = "0.2.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-auth" }, - { name = "httplib2" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/56/be/217a598a818567b28e859ff087f347475c807a5649296fb5a817c58dacef/google-auth-httplib2-0.2.0.tar.gz", hash = "sha256:38aa7badf48f974f1eb9861794e9c0cb2a0511a4ec0679b1f886d108f5640e05", size = 10842, upload-time = "2023-12-12T17:40:30.722Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/be/8a/fe34d2f3f9470a27b01c9e76226965863f153d5fbe276f83608562e49c04/google_auth_httplib2-0.2.0-py2.py3-none-any.whl", hash = "sha256:b65a0a2123300dd71281a7bf6e64d65a0759287df52729bdd1ae2e47dc311a3d", size = 9253, upload-time = "2023-12-12T17:40:13.055Z" }, -] - -[[package]] -name = "google-cloud-aiplatform" -version = 
"1.114.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "docstring-parser" }, - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-auth" }, - { name = "google-cloud-bigquery" }, - { name = "google-cloud-resource-manager" }, - { name = "google-cloud-storage" }, - { name = "google-genai" }, - { name = "packaging" }, - { name = "proto-plus" }, - { name = "protobuf" }, - { name = "pydantic" }, - { name = "shapely" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/3d/0e/8097231fba8e688993b0b6d371ee298ac3955cdca77fc0731799de1253ca/google_cloud_aiplatform-1.114.0.tar.gz", hash = "sha256:44e5e3da9b23c9316a4d9e7cd6a04258ebf84f3aadf95a725d5d1de179e2c2ce", size = 9650673, upload-time = "2025-09-16T19:47:55.12Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7a/0a/526e70e5cd8e0e96207e201721457dac020d9b8d1bd2ce7326e550b8150d/google_cloud_aiplatform-1.114.0-py2.py3-none-any.whl", hash = "sha256:87386d9364bd0bed4dd33873845afbbe251d1ed83ee25d676c3c0cea630af682", size = 8032171, upload-time = "2025-09-16T19:47:52.725Z" }, -] - -[package.optional-dependencies] -agent-engines = [ - { name = "cloudpickle" }, - { name = "google-cloud-logging" }, - { name = "google-cloud-trace" }, - { name = "opentelemetry-exporter-gcp-trace" }, - { name = "opentelemetry-sdk" }, - { name = "packaging" }, - { name = "pydantic" }, - { name = "typing-extensions" }, -] - -[[package]] -name = "google-cloud-appengine-logging" -version = "1.6.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-auth" }, - { name = "proto-plus" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/e7/ea/85da73d4f162b29d24ad591c4ce02688b44094ee5f3d6c0cc533c2b23b23/google_cloud_appengine_logging-1.6.2.tar.gz", hash = "sha256:4890928464c98da9eecc7bf4e0542eba2551512c0265462c10f3a3d2a6424b90", size 
= 16587, upload-time = "2025-06-11T22:38:53.525Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e4/9e/dc1fd7f838dcaf608c465171b1a25d8ce63f9987e2d5c73bda98792097a9/google_cloud_appengine_logging-1.6.2-py3-none-any.whl", hash = "sha256:2b28ed715e92b67e334c6fcfe1deb523f001919560257b25fc8fcda95fd63938", size = 16889, upload-time = "2025-06-11T22:38:52.26Z" }, -] - -[[package]] -name = "google-cloud-audit-log" -version = "0.3.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "googleapis-common-protos" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/85/af/53b4ef636e492d136b3c217e52a07bee569430dda07b8e515d5f2b701b1e/google_cloud_audit_log-0.3.2.tar.gz", hash = "sha256:2598f1533a7d7cdd6c7bf448c12e5519c1d53162d78784e10bcdd1df67791bc3", size = 33377, upload-time = "2025-03-17T11:27:59.808Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b1/74/38a70339e706b174b3c1117ad931aaa0ff0565b599869317a220d1967e1b/google_cloud_audit_log-0.3.2-py3-none-any.whl", hash = "sha256:daaedfb947a0d77f524e1bd2b560242ab4836fe1afd6b06b92f152b9658554ed", size = 32472, upload-time = "2025-03-17T11:27:58.51Z" }, -] - -[[package]] -name = "google-cloud-bigquery" -version = "3.38.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-auth" }, - { name = "google-cloud-core" }, - { name = "google-resumable-media" }, - { name = "packaging" }, - { name = "python-dateutil" }, - { name = "requests" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/07/b2/a17e40afcf9487e3d17db5e36728ffe75c8d5671c46f419d7b6528a5728a/google_cloud_bigquery-3.38.0.tar.gz", hash = "sha256:8afcb7116f5eac849097a344eb8bfda78b7cfaae128e60e019193dd483873520", size = 503666, upload-time = "2025-09-17T20:33:33.47Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/39/3c/c8cada9ec282b29232ed9aed5a0b5cca6cf5367cb2ffa8ad0d2583d743f1/google_cloud_bigquery-3.38.0-py3-none-any.whl", hash = "sha256:e06e93ff7b245b239945ef59cb59616057598d369edac457ebf292bd61984da6", size = 259257, upload-time = "2025-09-17T20:33:31.404Z" }, -] - -[[package]] -name = "google-cloud-bigtable" -version = "2.32.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-auth" }, - { name = "google-cloud-core" }, - { name = "google-crc32c" }, - { name = "grpc-google-iam-v1" }, - { name = "proto-plus" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/88/18/52eaef1e08b1570a56a74bb909345bfae082b6915e482df10de1fb0b341d/google_cloud_bigtable-2.32.0.tar.gz", hash = "sha256:1dcf8a9fae5801164dc184558cd8e9e930485424655faae254e2c7350fa66946", size = 746803, upload-time = "2025-08-06T17:28:54.589Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/20/89/2e3607c3c6f85954c3351078f3b891e5a2ec6dec9b964e260731818dcaec/google_cloud_bigtable-2.32.0-py3-none-any.whl", hash = "sha256:39881c36a4009703fa046337cf3259da4dd2cbcabe7b95ee5b0b0a8f19c3234e", size = 520438, upload-time = "2025-08-06T17:28:53.27Z" }, -] - -[[package]] -name = "google-cloud-core" -version = "2.4.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core" }, - { name = "google-auth" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/d6/b8/2b53838d2acd6ec6168fd284a990c76695e84c65deee79c9f3a4276f6b4f/google_cloud_core-2.4.3.tar.gz", hash = "sha256:1fab62d7102844b278fe6dead3af32408b1df3eb06f5c7e8634cbd40edc4da53", size = 35861, upload-time = "2025-03-10T21:05:38.948Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/40/86/bda7241a8da2d28a754aad2ba0f6776e35b67e37c36ae0c45d49370f1014/google_cloud_core-2.4.3-py2.py3-none-any.whl", hash = 
"sha256:5130f9f4c14b4fafdff75c79448f9495cfade0d8775facf1b09c3bf67e027f6e", size = 29348, upload-time = "2025-03-10T21:05:37.785Z" }, -] - -[[package]] -name = "google-cloud-logging" -version = "3.12.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-auth" }, - { name = "google-cloud-appengine-logging" }, - { name = "google-cloud-audit-log" }, - { name = "google-cloud-core" }, - { name = "grpc-google-iam-v1" }, - { name = "opentelemetry-api" }, - { name = "proto-plus" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/14/9c/d42ecc94f795a6545930e5f846a7ae59ff685ded8bc086648dd2bee31a1a/google_cloud_logging-3.12.1.tar.gz", hash = "sha256:36efc823985055b203904e83e1c8f9f999b3c64270bcda39d57386ca4effd678", size = 289569, upload-time = "2025-04-22T20:50:24.71Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b1/41/f8a3197d39b773a91f335dee36c92ef26a8ec96efe78d64baad89d367df4/google_cloud_logging-3.12.1-py2.py3-none-any.whl", hash = "sha256:6817878af76ec4e7568976772839ab2c43ddfd18fbbf2ce32b13ef549cd5a862", size = 229466, upload-time = "2025-04-22T20:50:23.294Z" }, -] - -[[package]] -name = "google-cloud-resource-manager" -version = "1.14.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-auth" }, - { name = "grpc-google-iam-v1" }, - { name = "proto-plus" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/6e/ca/a4648f5038cb94af4b3942815942a03aa9398f9fb0bef55b3f1585b9940d/google_cloud_resource_manager-1.14.2.tar.gz", hash = "sha256:962e2d904c550d7bac48372607904ff7bb3277e3bb4a36d80cc9a37e28e6eb74", size = 446370, upload-time = "2025-03-17T11:35:56.343Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/b1/ea/a92631c358da377af34d3a9682c97af83185c2d66363d5939ab4a1169a7f/google_cloud_resource_manager-1.14.2-py3-none-any.whl", hash = "sha256:d0fa954dedd1d2b8e13feae9099c01b8aac515b648e612834f9942d2795a9900", size = 394344, upload-time = "2025-03-17T11:35:54.722Z" }, -] - -[[package]] -name = "google-cloud-secret-manager" -version = "2.24.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-auth" }, - { name = "grpc-google-iam-v1" }, - { name = "proto-plus" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/58/7a/2fa6735ec693d822fe08a76709c4d95d9b5b4c02e83e720497355039d2ee/google_cloud_secret_manager-2.24.0.tar.gz", hash = "sha256:ce573d40ffc2fb7d01719243a94ee17aa243ea642a6ae6c337501e58fbf642b5", size = 269516, upload-time = "2025-06-05T22:22:22.965Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/be/af/db1217cae1809e69a4527ee6293b82a9af2a1fb2313ad110c775e8f3c820/google_cloud_secret_manager-2.24.0-py3-none-any.whl", hash = "sha256:9bea1254827ecc14874bc86c63b899489f8f50bfe1442bfb2517530b30b3a89b", size = 218050, upload-time = "2025-06-10T02:02:19.88Z" }, -] - -[[package]] -name = "google-cloud-spanner" -version = "3.57.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-cloud-core" }, - { name = "grpc-google-iam-v1" }, - { name = "grpc-interceptor" }, - { name = "proto-plus" }, - { name = "protobuf" }, - { name = "sqlparse" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5e/e8/e008f9ffa2dcf596718d2533d96924735110378853c55f730d2527a19e04/google_cloud_spanner-3.57.0.tar.gz", hash = "sha256:73f52f58617449fcff7073274a7f7a798f4f7b2788eda26de3b7f98ad857ab99", size = 701574, upload-time = "2025-08-14T15:24:59.18Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/3a/9f/66fe9118bc0e593b65ade612775e397f596b0bcd75daa3ea63dbe1020f95/google_cloud_spanner-3.57.0-py3-none-any.whl", hash = "sha256:5b10b40bc646091f1b4cbb2e7e2e82ec66bcce52c7105f86b65070d34d6df86f", size = 501380, upload-time = "2025-08-14T15:24:57.683Z" }, -] - -[[package]] -name = "google-cloud-speech" -version = "2.33.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-auth" }, - { name = "proto-plus" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/9a/74/9c5a556f8af19cab461058aa15e1409e7afa453ca2383473a24a12801ef7/google_cloud_speech-2.33.0.tar.gz", hash = "sha256:fd08511b5124fdaa768d71a4054e84a5d8eb02531cb6f84f311c0387ea1314ed", size = 389072, upload-time = "2025-06-11T23:56:37.231Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/12/1d/880342b2541b4bad888ad8ab2ac77d4b5dad25b32a2a1c5f21140c14c8e3/google_cloud_speech-2.33.0-py3-none-any.whl", hash = "sha256:4ba16c8517c24a6abcde877289b0f40b719090504bf06b1adea248198ccd50a5", size = 335681, upload-time = "2025-06-11T23:56:36.026Z" }, -] - -[[package]] -name = "google-cloud-storage" -version = "2.19.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core" }, - { name = "google-auth" }, - { name = "google-cloud-core" }, - { name = "google-crc32c" }, - { name = "google-resumable-media" }, - { name = "requests" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/36/76/4d965702e96bb67976e755bed9828fa50306dca003dbee08b67f41dd265e/google_cloud_storage-2.19.0.tar.gz", hash = "sha256:cd05e9e7191ba6cb68934d8eb76054d9be4562aa89dbc4236feee4d7d51342b2", size = 5535488, upload-time = "2024-12-05T01:35:06.49Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d5/94/6db383d8ee1adf45dc6c73477152b82731fa4c4a46d9c1932cc8757e0fd4/google_cloud_storage-2.19.0-py2.py3-none-any.whl", 
hash = "sha256:aeb971b5c29cf8ab98445082cbfe7b161a1f48ed275822f59ed3f1524ea54fba", size = 131787, upload-time = "2024-12-05T01:35:04.736Z" }, -] - -[[package]] -name = "google-cloud-trace" -version = "1.16.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-api-core", extra = ["grpc"] }, - { name = "google-auth" }, - { name = "proto-plus" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/c5/ea/0e42e2196fb2bc8c7b25f081a0b46b5053d160b34d5322e7eac2d5f7a742/google_cloud_trace-1.16.2.tar.gz", hash = "sha256:89bef223a512465951eb49335be6d60bee0396d576602dbf56368439d303cab4", size = 97826, upload-time = "2025-06-12T00:53:02.12Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/08/96/7a8d271e91effa9ccc2fd7cfd5cf287a2d7900080a475477c2ac0c7a331d/google_cloud_trace-1.16.2-py3-none-any.whl", hash = "sha256:40fb74607752e4ee0f3d7e5fc6b8f6eb1803982254a1507ba918172484131456", size = 103755, upload-time = "2025-06-12T00:53:00.672Z" }, -] - -[[package]] -name = "google-crc32c" -version = "1.7.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/19/ae/87802e6d9f9d69adfaedfcfd599266bf386a54d0be058b532d04c794f76d/google_crc32c-1.7.1.tar.gz", hash = "sha256:2bff2305f98846f3e825dbeec9ee406f89da7962accdb29356e4eadc251bd472", size = 14495, upload-time = "2025-03-26T14:29:13.32Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/f7/94/220139ea87822b6fdfdab4fb9ba81b3fff7ea2c82e2af34adc726085bffc/google_crc32c-1.7.1-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:6fbab4b935989e2c3610371963ba1b86afb09537fd0c633049be82afe153ac06", size = 30468, upload-time = "2025-03-26T14:32:52.215Z" }, - { url = "https://files.pythonhosted.org/packages/94/97/789b23bdeeb9d15dc2904660463ad539d0318286d7633fe2760c10ed0c1c/google_crc32c-1.7.1-cp311-cp311-macosx_12_0_x86_64.whl", hash = 
"sha256:ed66cbe1ed9cbaaad9392b5259b3eba4a9e565420d734e6238813c428c3336c9", size = 30313, upload-time = "2025-03-26T14:57:38.758Z" }, - { url = "https://files.pythonhosted.org/packages/81/b8/976a2b843610c211e7ccb3e248996a61e87dbb2c09b1499847e295080aec/google_crc32c-1.7.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ee6547b657621b6cbed3562ea7826c3e11cab01cd33b74e1f677690652883e77", size = 33048, upload-time = "2025-03-26T14:41:30.679Z" }, - { url = "https://files.pythonhosted.org/packages/c9/16/a3842c2cf591093b111d4a5e2bfb478ac6692d02f1b386d2a33283a19dc9/google_crc32c-1.7.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d68e17bad8f7dd9a49181a1f5a8f4b251c6dbc8cc96fb79f1d321dfd57d66f53", size = 32669, upload-time = "2025-03-26T14:41:31.432Z" }, - { url = "https://files.pythonhosted.org/packages/04/17/ed9aba495916fcf5fe4ecb2267ceb851fc5f273c4e4625ae453350cfd564/google_crc32c-1.7.1-cp311-cp311-win_amd64.whl", hash = "sha256:6335de12921f06e1f774d0dd1fbea6bf610abe0887a1638f64d694013138be5d", size = 33476, upload-time = "2025-03-26T14:29:10.211Z" }, - { url = "https://files.pythonhosted.org/packages/dd/b7/787e2453cf8639c94b3d06c9d61f512234a82e1d12d13d18584bd3049904/google_crc32c-1.7.1-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:2d73a68a653c57281401871dd4aeebbb6af3191dcac751a76ce430df4d403194", size = 30470, upload-time = "2025-03-26T14:34:31.655Z" }, - { url = "https://files.pythonhosted.org/packages/ed/b4/6042c2b0cbac3ec3a69bb4c49b28d2f517b7a0f4a0232603c42c58e22b44/google_crc32c-1.7.1-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:22beacf83baaf59f9d3ab2bbb4db0fb018da8e5aebdce07ef9f09fce8220285e", size = 30315, upload-time = "2025-03-26T15:01:54.634Z" }, - { url = "https://files.pythonhosted.org/packages/29/ad/01e7a61a5d059bc57b702d9ff6a18b2585ad97f720bd0a0dbe215df1ab0e/google_crc32c-1.7.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:19eafa0e4af11b0a4eb3974483d55d2d77ad1911e6cf6f832e1574f6781fd337", size = 33180, upload-time = "2025-03-26T14:41:32.168Z" }, - { url = "https://files.pythonhosted.org/packages/3b/a5/7279055cf004561894ed3a7bfdf5bf90a53f28fadd01af7cd166e88ddf16/google_crc32c-1.7.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b6d86616faaea68101195c6bdc40c494e4d76f41e07a37ffdef270879c15fb65", size = 32794, upload-time = "2025-03-26T14:41:33.264Z" }, - { url = "https://files.pythonhosted.org/packages/0f/d6/77060dbd140c624e42ae3ece3df53b9d811000729a5c821b9fd671ceaac6/google_crc32c-1.7.1-cp312-cp312-win_amd64.whl", hash = "sha256:b7491bdc0c7564fcf48c0179d2048ab2f7c7ba36b84ccd3a3e1c3f7a72d3bba6", size = 33477, upload-time = "2025-03-26T14:29:10.94Z" }, - { url = "https://files.pythonhosted.org/packages/8b/72/b8d785e9184ba6297a8620c8a37cf6e39b81a8ca01bb0796d7cbb28b3386/google_crc32c-1.7.1-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:df8b38bdaf1629d62d51be8bdd04888f37c451564c2042d36e5812da9eff3c35", size = 30467, upload-time = "2025-03-26T14:36:06.909Z" }, - { url = "https://files.pythonhosted.org/packages/34/25/5f18076968212067c4e8ea95bf3b69669f9fc698476e5f5eb97d5b37999f/google_crc32c-1.7.1-cp313-cp313-macosx_12_0_x86_64.whl", hash = "sha256:e42e20a83a29aa2709a0cf271c7f8aefaa23b7ab52e53b322585297bb94d4638", size = 30309, upload-time = "2025-03-26T15:06:15.318Z" }, - { url = "https://files.pythonhosted.org/packages/92/83/9228fe65bf70e93e419f38bdf6c5ca5083fc6d32886ee79b450ceefd1dbd/google_crc32c-1.7.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:905a385140bf492ac300026717af339790921f411c0dfd9aa5a9e69a08ed32eb", size = 33133, upload-time = "2025-03-26T14:41:34.388Z" }, - { url = "https://files.pythonhosted.org/packages/c3/ca/1ea2fd13ff9f8955b85e7956872fdb7050c4ace8a2306a6d177edb9cf7fe/google_crc32c-1.7.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:6b211ddaf20f7ebeec5c333448582c224a7c90a9d98826fbab82c0ddc11348e6", size = 32773, upload-time = "2025-03-26T14:41:35.19Z" }, - { url = "https://files.pythonhosted.org/packages/89/32/a22a281806e3ef21b72db16f948cad22ec68e4bdd384139291e00ff82fe2/google_crc32c-1.7.1-cp313-cp313-win_amd64.whl", hash = "sha256:0f99eaa09a9a7e642a61e06742856eec8b19fc0037832e03f941fe7cf0c8e4db", size = 33475, upload-time = "2025-03-26T14:29:11.771Z" }, - { url = "https://files.pythonhosted.org/packages/b8/c5/002975aff514e57fc084ba155697a049b3f9b52225ec3bc0f542871dd524/google_crc32c-1.7.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:32d1da0d74ec5634a05f53ef7df18fc646666a25efaaca9fc7dcfd4caf1d98c3", size = 33243, upload-time = "2025-03-26T14:41:35.975Z" }, - { url = "https://files.pythonhosted.org/packages/61/cb/c585282a03a0cea70fcaa1bf55d5d702d0f2351094d663ec3be1c6c67c52/google_crc32c-1.7.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e10554d4abc5238823112c2ad7e4560f96c7bf3820b202660373d769d9e6e4c9", size = 32870, upload-time = "2025-03-26T14:41:37.08Z" }, - { url = "https://files.pythonhosted.org/packages/16/1b/1693372bf423ada422f80fd88260dbfd140754adb15cbc4d7e9a68b1cb8e/google_crc32c-1.7.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85fef7fae11494e747c9fd1359a527e5970fc9603c90764843caabd3a16a0a48", size = 28241, upload-time = "2025-03-26T14:41:45.898Z" }, - { url = "https://files.pythonhosted.org/packages/fd/3c/2a19a60a473de48717b4efb19398c3f914795b64a96cf3fbe82588044f78/google_crc32c-1.7.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6efb97eb4369d52593ad6f75e7e10d053cf00c48983f7a973105bc70b0ac4d82", size = 28048, upload-time = "2025-03-26T14:41:46.696Z" }, -] - -[[package]] -name = "google-genai" -version = "1.38.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, - { name = "google-auth" }, - { 
name = "httpx" }, - { name = "pydantic" }, - { name = "requests" }, - { name = "tenacity" }, - { name = "typing-extensions" }, - { name = "websockets" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/b4/11/108ddd3aca8af6a9e2369e59b9646a3a4c64aefb39d154f6467ab8d79f34/google_genai-1.38.0.tar.gz", hash = "sha256:363272fc4f677d0be6a1aed7ebabe8adf45e1626a7011a7886a587e9464ca9ec", size = 244903, upload-time = "2025-09-16T23:25:42.577Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/53/6c/1de711bab3c118284904c3bedf870519e8c63a7a8e0905ac3833f1db9cbc/google_genai-1.38.0-py3-none-any.whl", hash = "sha256:95407425132d42b3fa11bc92b3f5cf61a0fbd8d9add1f0e89aac52c46fbba090", size = 245558, upload-time = "2025-09-16T23:25:41.141Z" }, -] - -[[package]] -name = "google-resumable-media" -version = "2.7.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-crc32c" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/58/5a/0efdc02665dca14e0837b62c8a1a93132c264bd02054a15abb2218afe0ae/google_resumable_media-2.7.2.tar.gz", hash = "sha256:5280aed4629f2b60b847b0d42f9857fd4935c11af266744df33d8074cae92fe0", size = 2163099, upload-time = "2024-08-07T22:20:38.555Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/82/35/b8d3baf8c46695858cb9d8835a53baa1eeb9906ddaf2f728a5f5b640fd1e/google_resumable_media-2.7.2-py2.py3-none-any.whl", hash = "sha256:3ce7551e9fe6d99e9a126101d2536612bb73486721951e9562fee0f90c6ababa", size = 81251, upload-time = "2024-08-07T22:20:36.409Z" }, -] - -[[package]] -name = "googleapis-common-protos" -version = "1.70.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/39/24/33db22342cf4a2ea27c9955e6713140fedd51e8b141b5ce5260897020f1a/googleapis_common_protos-1.70.0.tar.gz", hash = "sha256:0e1b44e0ea153e6594f9f394fef15193a68aaaea2d843f83e2742717ca753257", size = 145903, 
upload-time = "2025-04-14T10:17:02.924Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/86/f1/62a193f0227cf15a920390abe675f386dec35f7ae3ffe6da582d3ade42c7/googleapis_common_protos-1.70.0-py3-none-any.whl", hash = "sha256:b8bfcca8c25a2bb253e0e0b0adaf8c00773e5e6af6fd92397576680b807e0fd8", size = 294530, upload-time = "2025-04-14T10:17:01.271Z" }, -] - -[package.optional-dependencies] -grpc = [ - { name = "grpcio" }, -] - -[[package]] -name = "graphviz" -version = "0.21" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f8/b3/3ac91e9be6b761a4b30d66ff165e54439dcd48b83f4e20d644867215f6ca/graphviz-0.21.tar.gz", hash = "sha256:20743e7183be82aaaa8ad6c93f8893c923bd6658a04c32ee115edb3c8a835f78", size = 200434, upload-time = "2025-06-15T09:35:05.824Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/91/4c/e0ce1ef95d4000ebc1c11801f9b944fa5910ecc15b5e351865763d8657f8/graphviz-0.21-py3-none-any.whl", hash = "sha256:54f33de9f4f911d7e84e4191749cac8cc5653f815b06738c54db9a15ab8b1e42", size = 47300, upload-time = "2025-06-15T09:35:04.433Z" }, -] - -[[package]] -name = "greenlet" -version = "3.2.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/03/b8/704d753a5a45507a7aab61f18db9509302ed3d0a27ac7e0359ec2905b1a6/greenlet-3.2.4.tar.gz", hash = "sha256:0dca0d95ff849f9a364385f36ab49f50065d76964944638be9691e1832e9f86d", size = 188260, upload-time = "2025-08-07T13:24:33.51Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a4/de/f28ced0a67749cac23fecb02b694f6473f47686dff6afaa211d186e2ef9c/greenlet-3.2.4-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:96378df1de302bc38e99c3a9aa311967b7dc80ced1dcc6f171e99842987882a2", size = 272305, upload-time = "2025-08-07T13:15:41.288Z" }, - { url = 
"https://files.pythonhosted.org/packages/09/16/2c3792cba130000bf2a31c5272999113f4764fd9d874fb257ff588ac779a/greenlet-3.2.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1ee8fae0519a337f2329cb78bd7a8e128ec0f881073d43f023c7b8d4831d5246", size = 632472, upload-time = "2025-08-07T13:42:55.044Z" }, - { url = "https://files.pythonhosted.org/packages/ae/8f/95d48d7e3d433e6dae5b1682e4292242a53f22df82e6d3dda81b1701a960/greenlet-3.2.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:94abf90142c2a18151632371140b3dba4dee031633fe614cb592dbb6c9e17bc3", size = 644646, upload-time = "2025-08-07T13:45:26.523Z" }, - { url = "https://files.pythonhosted.org/packages/d5/5e/405965351aef8c76b8ef7ad370e5da58d57ef6068df197548b015464001a/greenlet-3.2.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:4d1378601b85e2e5171b99be8d2dc85f594c79967599328f95c1dc1a40f1c633", size = 640519, upload-time = "2025-08-07T13:53:13.928Z" }, - { url = "https://files.pythonhosted.org/packages/25/5d/382753b52006ce0218297ec1b628e048c4e64b155379331f25a7316eb749/greenlet-3.2.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:0db5594dce18db94f7d1650d7489909b57afde4c580806b8d9203b6e79cdc079", size = 639707, upload-time = "2025-08-07T13:18:27.146Z" }, - { url = "https://files.pythonhosted.org/packages/1f/8e/abdd3f14d735b2929290a018ecf133c901be4874b858dd1c604b9319f064/greenlet-3.2.4-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2523e5246274f54fdadbce8494458a2ebdcdbc7b802318466ac5606d3cded1f8", size = 587684, upload-time = "2025-08-07T13:18:25.164Z" }, - { url = "https://files.pythonhosted.org/packages/5d/65/deb2a69c3e5996439b0176f6651e0052542bb6c8f8ec2e3fba97c9768805/greenlet-3.2.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1987de92fec508535687fb807a5cea1560f6196285a4cde35c100b8cd632cc52", size = 1116647, upload-time = "2025-08-07T13:42:38.655Z" }, - { url = 
"https://files.pythonhosted.org/packages/3f/cc/b07000438a29ac5cfb2194bfc128151d52f333cee74dd7dfe3fb733fc16c/greenlet-3.2.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:55e9c5affaa6775e2c6b67659f3a71684de4c549b3dd9afca3bc773533d284fa", size = 1142073, upload-time = "2025-08-07T13:18:21.737Z" }, - { url = "https://files.pythonhosted.org/packages/d8/0f/30aef242fcab550b0b3520b8e3561156857c94288f0332a79928c31a52cf/greenlet-3.2.4-cp311-cp311-win_amd64.whl", hash = "sha256:9c40adce87eaa9ddb593ccb0fa6a07caf34015a29bf8d344811665b573138db9", size = 299100, upload-time = "2025-08-07T13:44:12.287Z" }, - { url = "https://files.pythonhosted.org/packages/44/69/9b804adb5fd0671f367781560eb5eb586c4d495277c93bde4307b9e28068/greenlet-3.2.4-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:3b67ca49f54cede0186854a008109d6ee71f66bd57bb36abd6d0a0267b540cdd", size = 274079, upload-time = "2025-08-07T13:15:45.033Z" }, - { url = "https://files.pythonhosted.org/packages/46/e9/d2a80c99f19a153eff70bc451ab78615583b8dac0754cfb942223d2c1a0d/greenlet-3.2.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddf9164e7a5b08e9d22511526865780a576f19ddd00d62f8a665949327fde8bb", size = 640997, upload-time = "2025-08-07T13:42:56.234Z" }, - { url = "https://files.pythonhosted.org/packages/3b/16/035dcfcc48715ccd345f3a93183267167cdd162ad123cd93067d86f27ce4/greenlet-3.2.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f28588772bb5fb869a8eb331374ec06f24a83a9c25bfa1f38b6993afe9c1e968", size = 655185, upload-time = "2025-08-07T13:45:27.624Z" }, - { url = "https://files.pythonhosted.org/packages/31/da/0386695eef69ffae1ad726881571dfe28b41970173947e7c558d9998de0f/greenlet-3.2.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:5c9320971821a7cb77cfab8d956fa8e39cd07ca44b6070db358ceb7f8797c8c9", size = 649926, upload-time = "2025-08-07T13:53:15.251Z" }, - { url = 
"https://files.pythonhosted.org/packages/68/88/69bf19fd4dc19981928ceacbc5fd4bb6bc2215d53199e367832e98d1d8fe/greenlet-3.2.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c60a6d84229b271d44b70fb6e5fa23781abb5d742af7b808ae3f6efd7c9c60f6", size = 651839, upload-time = "2025-08-07T13:18:30.281Z" }, - { url = "https://files.pythonhosted.org/packages/19/0d/6660d55f7373b2ff8152401a83e02084956da23ae58cddbfb0b330978fe9/greenlet-3.2.4-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b3812d8d0c9579967815af437d96623f45c0f2ae5f04e366de62a12d83a8fb0", size = 607586, upload-time = "2025-08-07T13:18:28.544Z" }, - { url = "https://files.pythonhosted.org/packages/8e/1a/c953fdedd22d81ee4629afbb38d2f9d71e37d23caace44775a3a969147d4/greenlet-3.2.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:abbf57b5a870d30c4675928c37278493044d7c14378350b3aa5d484fa65575f0", size = 1123281, upload-time = "2025-08-07T13:42:39.858Z" }, - { url = "https://files.pythonhosted.org/packages/3f/c7/12381b18e21aef2c6bd3a636da1088b888b97b7a0362fac2e4de92405f97/greenlet-3.2.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:20fb936b4652b6e307b8f347665e2c615540d4b42b3b4c8a321d8286da7e520f", size = 1151142, upload-time = "2025-08-07T13:18:22.981Z" }, - { url = "https://files.pythonhosted.org/packages/e9/08/b0814846b79399e585f974bbeebf5580fbe59e258ea7be64d9dfb253c84f/greenlet-3.2.4-cp312-cp312-win_amd64.whl", hash = "sha256:a7d4e128405eea3814a12cc2605e0e6aedb4035bf32697f72deca74de4105e02", size = 299899, upload-time = "2025-08-07T13:38:53.448Z" }, - { url = "https://files.pythonhosted.org/packages/49/e8/58c7f85958bda41dafea50497cbd59738c5c43dbbea5ee83d651234398f4/greenlet-3.2.4-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:1a921e542453fe531144e91e1feedf12e07351b1cf6c9e8a3325ea600a715a31", size = 272814, upload-time = "2025-08-07T13:15:50.011Z" }, - { url = 
"https://files.pythonhosted.org/packages/62/dd/b9f59862e9e257a16e4e610480cfffd29e3fae018a68c2332090b53aac3d/greenlet-3.2.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cd3c8e693bff0fff6ba55f140bf390fa92c994083f838fece0f63be121334945", size = 641073, upload-time = "2025-08-07T13:42:57.23Z" }, - { url = "https://files.pythonhosted.org/packages/f7/0b/bc13f787394920b23073ca3b6c4a7a21396301ed75a655bcb47196b50e6e/greenlet-3.2.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:710638eb93b1fa52823aa91bf75326f9ecdfd5e0466f00789246a5280f4ba0fc", size = 655191, upload-time = "2025-08-07T13:45:29.752Z" }, - { url = "https://files.pythonhosted.org/packages/f2/d6/6adde57d1345a8d0f14d31e4ab9c23cfe8e2cd39c3baf7674b4b0338d266/greenlet-3.2.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:c5111ccdc9c88f423426df3fd1811bfc40ed66264d35aa373420a34377efc98a", size = 649516, upload-time = "2025-08-07T13:53:16.314Z" }, - { url = "https://files.pythonhosted.org/packages/7f/3b/3a3328a788d4a473889a2d403199932be55b1b0060f4ddd96ee7cdfcad10/greenlet-3.2.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d76383238584e9711e20ebe14db6c88ddcedc1829a9ad31a584389463b5aa504", size = 652169, upload-time = "2025-08-07T13:18:32.861Z" }, - { url = "https://files.pythonhosted.org/packages/ee/43/3cecdc0349359e1a527cbf2e3e28e5f8f06d3343aaf82ca13437a9aa290f/greenlet-3.2.4-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:23768528f2911bcd7e475210822ffb5254ed10d71f4028387e5a99b4c6699671", size = 610497, upload-time = "2025-08-07T13:18:31.636Z" }, - { url = "https://files.pythonhosted.org/packages/b8/19/06b6cf5d604e2c382a6f31cafafd6f33d5dea706f4db7bdab184bad2b21d/greenlet-3.2.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:00fadb3fedccc447f517ee0d3fd8fe49eae949e1cd0f6a611818f4f6fb7dc83b", size = 1121662, upload-time = "2025-08-07T13:42:41.117Z" }, - { url = 
"https://files.pythonhosted.org/packages/a2/15/0d5e4e1a66fab130d98168fe984c509249c833c1a3c16806b90f253ce7b9/greenlet-3.2.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d25c5091190f2dc0eaa3f950252122edbbadbb682aa7b1ef2f8af0f8c0afefae", size = 1149210, upload-time = "2025-08-07T13:18:24.072Z" }, - { url = "https://files.pythonhosted.org/packages/0b/55/2321e43595e6801e105fcfdee02b34c0f996eb71e6ddffca6b10b7e1d771/greenlet-3.2.4-cp313-cp313-win_amd64.whl", hash = "sha256:554b03b6e73aaabec3745364d6239e9e012d64c68ccd0b8430c64ccc14939a8b", size = 299685, upload-time = "2025-08-07T13:24:38.824Z" }, - { url = "https://files.pythonhosted.org/packages/22/5c/85273fd7cc388285632b0498dbbab97596e04b154933dfe0f3e68156c68c/greenlet-3.2.4-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:49a30d5fda2507ae77be16479bdb62a660fa51b1eb4928b524975b3bde77b3c0", size = 273586, upload-time = "2025-08-07T13:16:08.004Z" }, - { url = "https://files.pythonhosted.org/packages/d1/75/10aeeaa3da9332c2e761e4c50d4c3556c21113ee3f0afa2cf5769946f7a3/greenlet-3.2.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:299fd615cd8fc86267b47597123e3f43ad79c9d8a22bebdce535e53550763e2f", size = 686346, upload-time = "2025-08-07T13:42:59.944Z" }, - { url = "https://files.pythonhosted.org/packages/c0/aa/687d6b12ffb505a4447567d1f3abea23bd20e73a5bed63871178e0831b7a/greenlet-3.2.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:c17b6b34111ea72fc5a4e4beec9711d2226285f0386ea83477cbb97c30a3f3a5", size = 699218, upload-time = "2025-08-07T13:45:30.969Z" }, - { url = "https://files.pythonhosted.org/packages/dc/8b/29aae55436521f1d6f8ff4e12fb676f3400de7fcf27fccd1d4d17fd8fecd/greenlet-3.2.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b4a1870c51720687af7fa3e7cda6d08d801dae660f75a76f3845b642b4da6ee1", size = 694659, upload-time = "2025-08-07T13:53:17.759Z" }, - { url = 
"https://files.pythonhosted.org/packages/92/2e/ea25914b1ebfde93b6fc4ff46d6864564fba59024e928bdc7de475affc25/greenlet-3.2.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:061dc4cf2c34852b052a8620d40f36324554bc192be474b9e9770e8c042fd735", size = 695355, upload-time = "2025-08-07T13:18:34.517Z" }, - { url = "https://files.pythonhosted.org/packages/72/60/fc56c62046ec17f6b0d3060564562c64c862948c9d4bc8aa807cf5bd74f4/greenlet-3.2.4-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44358b9bf66c8576a9f57a590d5f5d6e72fa4228b763d0e43fee6d3b06d3a337", size = 657512, upload-time = "2025-08-07T13:18:33.969Z" }, - { url = "https://files.pythonhosted.org/packages/e3/a5/6ddab2b4c112be95601c13428db1d8b6608a8b6039816f2ba09c346c08fc/greenlet-3.2.4-cp314-cp314-win_amd64.whl", hash = "sha256:e37ab26028f12dbb0ff65f29a8d3d44a765c61e729647bf2ddfbbed621726f01", size = 303425, upload-time = "2025-08-07T13:32:27.59Z" }, -] - -[[package]] -name = "grpc-google-iam-v1" -version = "0.14.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "googleapis-common-protos", extra = ["grpc"] }, - { name = "grpcio" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/b9/4e/8d0ca3b035e41fe0b3f31ebbb638356af720335e5a11154c330169b40777/grpc_google_iam_v1-0.14.2.tar.gz", hash = "sha256:b3e1fc387a1a329e41672197d0ace9de22c78dd7d215048c4c78712073f7bd20", size = 16259, upload-time = "2025-03-17T11:40:23.586Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/66/6f/dd9b178aee7835b96c2e63715aba6516a9d50f6bebbd1cc1d32c82a2a6c3/grpc_google_iam_v1-0.14.2-py3-none-any.whl", hash = "sha256:a3171468459770907926d56a440b2bb643eec1d7ba215f48f3ecece42b4d8351", size = 19242, upload-time = "2025-03-17T11:40:22.648Z" }, -] - -[[package]] -name = "grpc-interceptor" -version = "0.15.4" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "grpcio" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/9f/28/57449d5567adf4c1d3e216aaca545913fbc21a915f2da6790d6734aac76e/grpc-interceptor-0.15.4.tar.gz", hash = "sha256:1f45c0bcb58b6f332f37c637632247c9b02bc6af0fdceb7ba7ce8d2ebbfb0926", size = 19322, upload-time = "2023-11-16T02:05:42.459Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/15/ac/8d53f230a7443401ce81791ec50a3b0e54924bf615ad287654fa4a2f5cdc/grpc_interceptor-0.15.4-py3-none-any.whl", hash = "sha256:0035f33228693ed3767ee49d937bac424318db173fef4d2d0170b3215f254d9d", size = 20848, upload-time = "2023-11-16T02:05:40.913Z" }, -] - -[[package]] -name = "grpcio" -version = "1.75.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/91/88/fe2844eefd3d2188bc0d7a2768c6375b46dfd96469ea52d8aeee8587d7e0/grpcio-1.75.0.tar.gz", hash = "sha256:b989e8b09489478c2d19fecc744a298930f40d8b27c3638afbfe84d22f36ce4e", size = 12722485, upload-time = "2025-09-16T09:20:21.731Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/95/b7/a6f42596fc367656970f5811e5d2d9912ca937aa90621d5468a11680ef47/grpcio-1.75.0-cp311-cp311-linux_armv7l.whl", hash = "sha256:7f89d6d0cd43170a80ebb4605cad54c7d462d21dc054f47688912e8bf08164af", size = 5699769, upload-time = "2025-09-16T09:18:32.536Z" }, - { url = "https://files.pythonhosted.org/packages/c2/42/284c463a311cd2c5f804fd4fdbd418805460bd5d702359148dd062c1685d/grpcio-1.75.0-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:cb6c5b075c2d092f81138646a755f0dad94e4622300ebef089f94e6308155d82", size = 11480362, upload-time = "2025-09-16T09:18:35.562Z" }, - { url = "https://files.pythonhosted.org/packages/0b/10/60d54d5a03062c3ae91bddb6e3acefe71264307a419885f453526d9203ff/grpcio-1.75.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:494dcbade5606128cb9f530ce00331a90ecf5e7c5b243d373aebdb18e503c346", size = 6284753, upload-time = 
"2025-09-16T09:18:38.055Z" }, - { url = "https://files.pythonhosted.org/packages/cf/af/381a4bfb04de5e2527819452583e694df075c7a931e9bf1b2a603b593ab2/grpcio-1.75.0-cp311-cp311-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:050760fd29c8508844a720f06c5827bb00de8f5e02f58587eb21a4444ad706e5", size = 6944103, upload-time = "2025-09-16T09:18:40.844Z" }, - { url = "https://files.pythonhosted.org/packages/16/18/c80dd7e1828bd6700ce242c1616871927eef933ed0c2cee5c636a880e47b/grpcio-1.75.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:266fa6209b68a537b2728bb2552f970e7e78c77fe43c6e9cbbe1f476e9e5c35f", size = 6464036, upload-time = "2025-09-16T09:18:43.351Z" }, - { url = "https://files.pythonhosted.org/packages/79/3f/78520c7ed9ccea16d402530bc87958bbeb48c42a2ec8032738a7864d38f8/grpcio-1.75.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:06d22e1d8645e37bc110f4c589cb22c283fd3de76523065f821d6e81de33f5d4", size = 7097455, upload-time = "2025-09-16T09:18:45.465Z" }, - { url = "https://files.pythonhosted.org/packages/ad/69/3cebe4901a865eb07aefc3ee03a02a632e152e9198dadf482a7faf926f31/grpcio-1.75.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9880c323595d851292785966cadb6c708100b34b163cab114e3933f5773cba2d", size = 8037203, upload-time = "2025-09-16T09:18:47.878Z" }, - { url = "https://files.pythonhosted.org/packages/04/ed/1e483d1eba5032642c10caf28acf07ca8de0508244648947764956db346a/grpcio-1.75.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:55a2d5ae79cd0f68783fb6ec95509be23746e3c239290b2ee69c69a38daa961a", size = 7492085, upload-time = "2025-09-16T09:18:50.907Z" }, - { url = "https://files.pythonhosted.org/packages/ee/65/6ef676aa7dbd9578dfca990bb44d41a49a1e36344ca7d79de6b59733ba96/grpcio-1.75.0-cp311-cp311-win32.whl", hash = "sha256:352dbdf25495eef584c8de809db280582093bc3961d95a9d78f0dfb7274023a2", size = 3944697, upload-time = "2025-09-16T09:18:53.427Z" }, - { url = 
"https://files.pythonhosted.org/packages/0d/83/b753373098b81ec5cb01f71c21dfd7aafb5eb48a1566d503e9fd3c1254fe/grpcio-1.75.0-cp311-cp311-win_amd64.whl", hash = "sha256:678b649171f229fb16bda1a2473e820330aa3002500c4f9fd3a74b786578e90f", size = 4642235, upload-time = "2025-09-16T09:18:56.095Z" }, - { url = "https://files.pythonhosted.org/packages/0d/93/a1b29c2452d15cecc4a39700fbf54721a3341f2ddbd1bd883f8ec0004e6e/grpcio-1.75.0-cp312-cp312-linux_armv7l.whl", hash = "sha256:fa35ccd9501ffdd82b861809cbfc4b5b13f4b4c5dc3434d2d9170b9ed38a9054", size = 5661861, upload-time = "2025-09-16T09:18:58.748Z" }, - { url = "https://files.pythonhosted.org/packages/b8/ce/7280df197e602d14594e61d1e60e89dfa734bb59a884ba86cdd39686aadb/grpcio-1.75.0-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:0fcb77f2d718c1e58cc04ef6d3b51e0fa3b26cf926446e86c7eba105727b6cd4", size = 11459982, upload-time = "2025-09-16T09:19:01.211Z" }, - { url = "https://files.pythonhosted.org/packages/7c/9b/37e61349771f89b543a0a0bbc960741115ea8656a2414bfb24c4de6f3dd7/grpcio-1.75.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:36764a4ad9dc1eb891042fab51e8cdf7cc014ad82cee807c10796fb708455041", size = 6239680, upload-time = "2025-09-16T09:19:04.443Z" }, - { url = "https://files.pythonhosted.org/packages/a6/66/f645d9d5b22ca307f76e71abc83ab0e574b5dfef3ebde4ec8b865dd7e93e/grpcio-1.75.0-cp312-cp312-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:725e67c010f63ef17fc052b261004942763c0b18dcd84841e6578ddacf1f9d10", size = 6908511, upload-time = "2025-09-16T09:19:07.884Z" }, - { url = "https://files.pythonhosted.org/packages/e6/9a/34b11cd62d03c01b99068e257595804c695c3c119596c7077f4923295e19/grpcio-1.75.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:91fbfc43f605c5ee015c9056d580a70dd35df78a7bad97e05426795ceacdb59f", size = 6429105, upload-time = "2025-09-16T09:19:10.085Z" }, - { url = 
"https://files.pythonhosted.org/packages/1a/46/76eaceaad1f42c1e7e6a5b49a61aac40fc5c9bee4b14a1630f056ac3a57e/grpcio-1.75.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7a9337ac4ce61c388e02019d27fa837496c4b7837cbbcec71b05934337e51531", size = 7060578, upload-time = "2025-09-16T09:19:12.283Z" }, - { url = "https://files.pythonhosted.org/packages/3d/82/181a0e3f1397b6d43239e95becbeb448563f236c0db11ce990f073b08d01/grpcio-1.75.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:ee16e232e3d0974750ab5f4da0ab92b59d6473872690b5e40dcec9a22927f22e", size = 8003283, upload-time = "2025-09-16T09:19:15.601Z" }, - { url = "https://files.pythonhosted.org/packages/de/09/a335bca211f37a3239be4b485e3c12bf3da68d18b1f723affdff2b9e9680/grpcio-1.75.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:55dfb9122973cc69520b23d39867726722cafb32e541435707dc10249a1bdbc6", size = 7460319, upload-time = "2025-09-16T09:19:18.409Z" }, - { url = "https://files.pythonhosted.org/packages/aa/59/6330105cdd6bc4405e74c96838cd7e148c3653ae3996e540be6118220c79/grpcio-1.75.0-cp312-cp312-win32.whl", hash = "sha256:fb64dd62face3d687a7b56cd881e2ea39417af80f75e8b36f0f81dfd93071651", size = 3934011, upload-time = "2025-09-16T09:19:21.013Z" }, - { url = "https://files.pythonhosted.org/packages/ff/14/e1309a570b7ebdd1c8ca24c4df6b8d6690009fa8e0d997cb2c026ce850c9/grpcio-1.75.0-cp312-cp312-win_amd64.whl", hash = "sha256:6b365f37a9c9543a9e91c6b4103d68d38d5bcb9965b11d5092b3c157bd6a5ee7", size = 4637934, upload-time = "2025-09-16T09:19:23.19Z" }, - { url = "https://files.pythonhosted.org/packages/00/64/dbce0ffb6edaca2b292d90999dd32a3bd6bc24b5b77618ca28440525634d/grpcio-1.75.0-cp313-cp313-linux_armv7l.whl", hash = "sha256:1bb78d052948d8272c820bb928753f16a614bb2c42fbf56ad56636991b427518", size = 5666860, upload-time = "2025-09-16T09:19:25.417Z" }, - { url = 
"https://files.pythonhosted.org/packages/f3/e6/da02c8fa882ad3a7f868d380bb3da2c24d35dd983dd12afdc6975907a352/grpcio-1.75.0-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:9dc4a02796394dd04de0b9673cb79a78901b90bb16bf99ed8cb528c61ed9372e", size = 11455148, upload-time = "2025-09-16T09:19:28.615Z" }, - { url = "https://files.pythonhosted.org/packages/ba/a0/84f87f6c2cf2a533cfce43b2b620eb53a51428ec0c8fe63e5dd21d167a70/grpcio-1.75.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:437eeb16091d31498585d73b133b825dc80a8db43311e332c08facf820d36894", size = 6243865, upload-time = "2025-09-16T09:19:31.342Z" }, - { url = "https://files.pythonhosted.org/packages/be/12/53da07aa701a4839dd70d16e61ce21ecfcc9e929058acb2f56e9b2dd8165/grpcio-1.75.0-cp313-cp313-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:c2c39984e846bd5da45c5f7bcea8fafbe47c98e1ff2b6f40e57921b0c23a52d0", size = 6915102, upload-time = "2025-09-16T09:19:33.658Z" }, - { url = "https://files.pythonhosted.org/packages/5b/c0/7eaceafd31f52ec4bf128bbcf36993b4bc71f64480f3687992ddd1a6e315/grpcio-1.75.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:38d665f44b980acdbb2f0e1abf67605ba1899f4d2443908df9ec8a6f26d2ed88", size = 6432042, upload-time = "2025-09-16T09:19:36.583Z" }, - { url = "https://files.pythonhosted.org/packages/6b/12/a2ce89a9f4fc52a16ed92951f1b05f53c17c4028b3db6a4db7f08332bee8/grpcio-1.75.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:2e8e752ab5cc0a9c5b949808c000ca7586223be4f877b729f034b912364c3964", size = 7062984, upload-time = "2025-09-16T09:19:39.163Z" }, - { url = "https://files.pythonhosted.org/packages/55/a6/2642a9b491e24482d5685c0f45c658c495a5499b43394846677abed2c966/grpcio-1.75.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:3a6788b30aa8e6f207c417874effe3f79c2aa154e91e78e477c4825e8b431ce0", size = 8001212, upload-time = "2025-09-16T09:19:41.726Z" }, - { url = 
"https://files.pythonhosted.org/packages/19/20/530d4428750e9ed6ad4254f652b869a20a40a276c1f6817b8c12d561f5ef/grpcio-1.75.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ffc33e67cab6141c54e75d85acd5dec616c5095a957ff997b4330a6395aa9b51", size = 7457207, upload-time = "2025-09-16T09:19:44.368Z" }, - { url = "https://files.pythonhosted.org/packages/e2/6f/843670007e0790af332a21468d10059ea9fdf97557485ae633b88bd70efc/grpcio-1.75.0-cp313-cp313-win32.whl", hash = "sha256:c8cfc780b7a15e06253aae5f228e1e84c0d3c4daa90faf5bc26b751174da4bf9", size = 3934235, upload-time = "2025-09-16T09:19:46.815Z" }, - { url = "https://files.pythonhosted.org/packages/4b/92/c846b01b38fdf9e2646a682b12e30a70dc7c87dfe68bd5e009ee1501c14b/grpcio-1.75.0-cp313-cp313-win_amd64.whl", hash = "sha256:0c91d5b16eff3cbbe76b7a1eaaf3d91e7a954501e9d4f915554f87c470475c3d", size = 4637558, upload-time = "2025-09-16T09:19:49.698Z" }, -] - -[[package]] -name = "grpcio-status" -version = "1.75.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "googleapis-common-protos" }, - { name = "grpcio" }, - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ca/8a/2e45ec0512d4ce9afa136c6e4186d063721b5b4c192eec7536ce6b7ba615/grpcio_status-1.75.0.tar.gz", hash = "sha256:69d5b91be1b8b926f086c1c483519a968c14640773a0ccdd6c04282515dbedf7", size = 13646, upload-time = "2025-09-16T09:24:51.069Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/2b/24/d536f0a0fda3a3eeb334893e5fb9d567c2777de6a5384413f71b35cfd0e5/grpcio_status-1.75.0-py3-none-any.whl", hash = "sha256:de62557ef97b7e19c3ce6da19793a12c5f6c1fbbb918d233d9671aba9d9e1d78", size = 14424, upload-time = "2025-09-16T09:23:33.843Z" }, -] - -[[package]] -name = "h11" -version = "0.16.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = 
"sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" }, -] - -[[package]] -name = "hexbytes" -version = "1.3.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/7f/87/adf4635b4b8c050283d74e6db9a81496063229c9263e6acc1903ab79fbec/hexbytes-1.3.1.tar.gz", hash = "sha256:a657eebebdfe27254336f98d8af6e2236f3f83aed164b87466b6cf6c5f5a4765", size = 8633, upload-time = "2025-05-14T16:45:17.5Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/8d/e0/3b31492b1c89da3c5a846680517871455b30c54738486fc57ac79a5761bd/hexbytes-1.3.1-py3-none-any.whl", hash = "sha256:da01ff24a1a9a2b1881c4b85f0e9f9b0f51b526b379ffa23832ae7899d29c2c7", size = 5074, upload-time = "2025-05-14T16:45:16.179Z" }, -] - -[[package]] -name = "hf-xet" -version = "1.1.10" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/74/31/feeddfce1748c4a233ec1aa5b7396161c07ae1aa9b7bdbc9a72c3c7dd768/hf_xet-1.1.10.tar.gz", hash = "sha256:408aef343800a2102374a883f283ff29068055c111f003ff840733d3b715bb97", size = 487910, upload-time = "2025-09-12T20:10:27.12Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/f7/a2/343e6d05de96908366bdc0081f2d8607d61200be2ac802769c4284cc65bd/hf_xet-1.1.10-cp37-abi3-macosx_10_12_x86_64.whl", hash = "sha256:686083aca1a6669bc85c21c0563551cbcdaa5cf7876a91f3d074a030b577231d", size = 2761466, upload-time = "2025-09-12T20:10:22.836Z" }, - { url = 
"https://files.pythonhosted.org/packages/31/f9/6215f948ac8f17566ee27af6430ea72045e0418ce757260248b483f4183b/hf_xet-1.1.10-cp37-abi3-macosx_11_0_arm64.whl", hash = "sha256:71081925383b66b24eedff3013f8e6bbd41215c3338be4b94ba75fd75b21513b", size = 2623807, upload-time = "2025-09-12T20:10:21.118Z" }, - { url = "https://files.pythonhosted.org/packages/15/07/86397573efefff941e100367bbda0b21496ffcdb34db7ab51912994c32a2/hf_xet-1.1.10-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b6bceb6361c80c1cc42b5a7b4e3efd90e64630bcf11224dcac50ef30a47e435", size = 3186960, upload-time = "2025-09-12T20:10:19.336Z" }, - { url = "https://files.pythonhosted.org/packages/01/a7/0b2e242b918cc30e1f91980f3c4b026ff2eedaf1e2ad96933bca164b2869/hf_xet-1.1.10-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:eae7c1fc8a664e54753ffc235e11427ca61f4b0477d757cc4eb9ae374b69f09c", size = 3087167, upload-time = "2025-09-12T20:10:17.255Z" }, - { url = "https://files.pythonhosted.org/packages/4a/25/3e32ab61cc7145b11eee9d745988e2f0f4fafda81b25980eebf97d8cff15/hf_xet-1.1.10-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:0a0005fd08f002180f7a12d4e13b22be277725bc23ed0529f8add5c7a6309c06", size = 3248612, upload-time = "2025-09-12T20:10:24.093Z" }, - { url = "https://files.pythonhosted.org/packages/2c/3d/ab7109e607ed321afaa690f557a9ada6d6d164ec852fd6bf9979665dc3d6/hf_xet-1.1.10-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:f900481cf6e362a6c549c61ff77468bd59d6dd082f3170a36acfef2eb6a6793f", size = 3353360, upload-time = "2025-09-12T20:10:25.563Z" }, - { url = "https://files.pythonhosted.org/packages/ee/0e/471f0a21db36e71a2f1752767ad77e92d8cde24e974e03d662931b1305ec/hf_xet-1.1.10-cp37-abi3-win_amd64.whl", hash = "sha256:5f54b19cc347c13235ae7ee98b330c26dd65ef1df47e5316ffb1e87713ca7045", size = 2804691, upload-time = "2025-09-12T20:10:28.433Z" }, -] - -[[package]] -name = "httpcore" -version = "1.0.9" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { 
name = "certifi" }, - { name = "h11" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" }, -] - -[[package]] -name = "httplib2" -version = "0.31.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pyparsing" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/52/77/6653db69c1f7ecfe5e3f9726fdadc981794656fcd7d98c4209fecfea9993/httplib2-0.31.0.tar.gz", hash = "sha256:ac7ab497c50975147d4f7b1ade44becc7df2f8954d42b38b3d69c515f531135c", size = 250759, upload-time = "2025-09-11T12:16:03.403Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/8c/a2/0d269db0f6163be503775dc8b6a6fa15820cc9fdc866f6ba608d86b721f2/httplib2-0.31.0-py3-none-any.whl", hash = "sha256:b9cd78abea9b4e43a7714c6e0f8b6b8561a6fc1e95d5dbd367f5bf0ef35f5d24", size = 91148, upload-time = "2025-09-11T12:16:01.803Z" }, -] - -[[package]] -name = "httpx" -version = "0.28.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, - { name = "certifi" }, - { name = "httpcore" }, - { name = "idna" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" }, -] - -[[package]] -name = "httpx-sse" -version = "0.4.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/6e/fa/66bd985dd0b7c109a3bcb89272ee0bfb7e2b4d06309ad7b38ff866734b2a/httpx_sse-0.4.1.tar.gz", hash = "sha256:8f44d34414bc7b21bf3602713005c5df4917884f76072479b21f68befa4ea26e", size = 12998, upload-time = "2025-06-24T13:21:05.71Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/25/0a/6269e3473b09aed2dab8aa1a600c70f31f00ae1349bee30658f7e358a159/httpx_sse-0.4.1-py3-none-any.whl", hash = "sha256:cba42174344c3a5b06f255ce65b350880f962d99ead85e776f23c6618a377a37", size = 8054, upload-time = "2025-06-24T13:21:04.772Z" }, -] - -[[package]] -name = "huggingface-hub" -version = "0.35.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "filelock" }, - { name = "fsspec" }, - { name = "hf-xet", marker = "platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'arm64' or platform_machine == 'x86_64'" }, - { name = "packaging" }, - { name = "pyyaml" }, - { name = "requests" }, - { name = "tqdm" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/37/79/d71d40efa058e8c4a075158f8855bc2998037b5ff1c84f249f34435c1df7/huggingface_hub-0.35.0.tar.gz", hash = "sha256:ccadd2a78eef75effff184ad89401413629fabc52cefd76f6bbacb9b1c0676ac", size = 461486, upload-time = "2025-09-16T13:49:33.282Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/fe/85/a18508becfa01f1e4351b5e18651b06d210dbd96debccd48a452acccb901/huggingface_hub-0.35.0-py3-none-any.whl", hash = 
"sha256:f2e2f693bca9a26530b1c0b9bcd4c1495644dad698e6a0060f90e22e772c31e9", size = 563436, upload-time = "2025-09-16T13:49:30.627Z" }, -] - -[[package]] -name = "humanfriendly" -version = "10.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pyreadline3", marker = "sys_platform == 'win32'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/cc/3f/2c29224acb2e2df4d2046e4c73ee2662023c58ff5b113c4c1adac0886c43/humanfriendly-10.0.tar.gz", hash = "sha256:6b0b831ce8f15f7300721aa49829fc4e83921a9a301cc7f606be6686a2288ddc", size = 360702, upload-time = "2021-09-17T21:40:43.31Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/f0/0f/310fb31e39e2d734ccaa2c0fb981ee41f7bd5056ce9bc29b2248bd569169/humanfriendly-10.0-py2.py3-none-any.whl", hash = "sha256:1697e1a8a8f550fd43c2865cd84542fc175a61dcb779b6fee18cf6b6ccba1477", size = 86794, upload-time = "2021-09-17T21:40:39.897Z" }, -] - -[[package]] -name = "humanize" -version = "4.13.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/98/1d/3062fcc89ee05a715c0b9bfe6490c00c576314f27ffee3a704122c6fd259/humanize-4.13.0.tar.gz", hash = "sha256:78f79e68f76f0b04d711c4e55d32bebef5be387148862cb1ef83d2b58e7935a0", size = 81884, upload-time = "2025-08-25T09:39:20.04Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1e/c7/316e7ca04d26695ef0635dc81683d628350810eb8e9b2299fc08ba49f366/humanize-4.13.0-py3-none-any.whl", hash = "sha256:b810820b31891813b1673e8fec7f1ed3312061eab2f26e3fa192c393d11ed25f", size = 128869, upload-time = "2025-08-25T09:39:18.54Z" }, -] - -[[package]] -name = "identify" -version = "2.6.14" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/52/c4/62963f25a678f6a050fb0505a65e9e726996171e6dbe1547f79619eefb15/identify-2.6.14.tar.gz", hash = "sha256:663494103b4f717cb26921c52f8751363dc89db64364cd836a9bf1535f53cd6a", size = 99283, 
upload-time = "2025-09-06T19:30:52.938Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e5/ae/2ad30f4652712c82f1c23423d79136fbce338932ad166d70c1efb86a5998/identify-2.6.14-py2.py3-none-any.whl", hash = "sha256:11a073da82212c6646b1f39bb20d4483bfb9543bd5566fec60053c4bb309bf2e", size = 99172, upload-time = "2025-09-06T19:30:51.759Z" }, -] - -[[package]] -name = "idna" -version = "3.10" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" }, -] - -[[package]] -name = "importlib-metadata" -version = "8.7.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "zipp" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/76/66/650a33bd90f786193e4de4b3ad86ea60b53c89b669a5c7be931fac31cdb0/importlib_metadata-8.7.0.tar.gz", hash = "sha256:d13b81ad223b890aa16c5471f2ac3056cf76c5f10f82d6f9292f0b415f389000", size = 56641, upload-time = "2025-04-27T15:29:01.736Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/20/b0/36bd937216ec521246249be3bf9855081de4c5e06a0c9b4219dbeda50373/importlib_metadata-8.7.0-py3-none-any.whl", hash = "sha256:e5dd1551894c77868a30651cef00984d50e1002d06942a7101d34870c5f02afd", size = 27656, upload-time = "2025-04-27T15:29:00.214Z" }, -] - -[[package]] -name = "iniconfig" -version = "2.1.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793, upload-time = "2025-03-19T20:09:59.721Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" }, -] - -[[package]] -name = "instructor" -version = "1.11.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "aiohttp" }, - { name = "diskcache" }, - { name = "docstring-parser" }, - { name = "jinja2" }, - { name = "jiter" }, - { name = "openai" }, - { name = "pydantic" }, - { name = "pydantic-core" }, - { name = "requests" }, - { name = "rich" }, - { name = "tenacity" }, - { name = "typer" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/6a/af/428b5d7a6a6eca5738c51706795a395099c141779cd1bbb9a6e2b0d3a94d/instructor-1.11.3.tar.gz", hash = "sha256:6f58fea6fadfa228c411ecdedad4662230c456718f4a770a97a806dcb36b3287", size = 69879936, upload-time = "2025-09-09T15:44:31.548Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/4c/5f/54783e5b1a497de204a0a59b5e22549f67f5f1aceaa08e00db21b1107ce4/instructor-1.11.3-py3-none-any.whl", hash = "sha256:9ecd7a3780a045506165debad2ddcc4a30e1057f06997973185f356b0a42c6e3", size = 155501, upload-time = "2025-09-09T15:44:26.139Z" }, -] - -[[package]] -name = "isodate" -version = "0.7.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/54/4d/e940025e2ce31a8ce1202635910747e5a87cc3a6a6bb2d00973375014749/isodate-0.7.2.tar.gz", hash = "sha256:4cd1aa0f43ca76f4a6c6c0292a85f40b35ec2e43e315b59f06e6d32171a953e6", size = 29705, upload-time = "2024-10-08T23:04:11.5Z" } 
-wheels = [ - { url = "https://files.pythonhosted.org/packages/15/aa/0aca39a37d3c7eb941ba736ede56d689e7be91cab5d9ca846bde3999eba6/isodate-0.7.2-py3-none-any.whl", hash = "sha256:28009937d8031054830160fce6d409ed342816b543597cece116d966c6d99e15", size = 22320, upload-time = "2024-10-08T23:04:09.501Z" }, -] - -[[package]] -name = "isort" -version = "6.0.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/b8/21/1e2a441f74a653a144224d7d21afe8f4169e6c7c20bb13aec3a2dc3815e0/isort-6.0.1.tar.gz", hash = "sha256:1cb5df28dfbc742e490c5e41bad6da41b805b0a8be7bc93cd0fb2a8a890ac450", size = 821955, upload-time = "2025-02-26T21:13:16.955Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c1/11/114d0a5f4dabbdcedc1125dee0888514c3c3b16d3e9facad87ed96fad97c/isort-6.0.1-py3-none-any.whl", hash = "sha256:2dc5d7f65c9678d94c88dfc29161a320eec67328bc97aad576874cb4be1e9615", size = 94186, upload-time = "2025-02-26T21:13:14.911Z" }, -] - -[[package]] -name = "jinja2" -version = "3.1.6" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "markupsafe" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115, upload-time = "2025-03-05T20:05:02.478Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" }, -] - -[[package]] -name = "jiter" -version = "0.10.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ee/9d/ae7ddb4b8ab3fb1b51faf4deb36cb48a4fbbd7cb36bad6a5fca4741306f7/jiter-0.10.0.tar.gz", hash = 
"sha256:07a7142c38aacc85194391108dc91b5b57093c978a9932bd86a36862759d9500", size = 162759, upload-time = "2025-05-18T19:04:59.73Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1b/dd/6cefc6bd68b1c3c979cecfa7029ab582b57690a31cd2f346c4d0ce7951b6/jiter-0.10.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:3bebe0c558e19902c96e99217e0b8e8b17d570906e72ed8a87170bc290b1e978", size = 317473, upload-time = "2025-05-18T19:03:25.942Z" }, - { url = "https://files.pythonhosted.org/packages/be/cf/fc33f5159ce132be1d8dd57251a1ec7a631c7df4bd11e1cd198308c6ae32/jiter-0.10.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:558cc7e44fd8e507a236bee6a02fa17199ba752874400a0ca6cd6e2196cdb7dc", size = 321971, upload-time = "2025-05-18T19:03:27.255Z" }, - { url = "https://files.pythonhosted.org/packages/68/a4/da3f150cf1d51f6c472616fb7650429c7ce053e0c962b41b68557fdf6379/jiter-0.10.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4d613e4b379a07d7c8453c5712ce7014e86c6ac93d990a0b8e7377e18505e98d", size = 345574, upload-time = "2025-05-18T19:03:28.63Z" }, - { url = "https://files.pythonhosted.org/packages/84/34/6e8d412e60ff06b186040e77da5f83bc158e9735759fcae65b37d681f28b/jiter-0.10.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f62cf8ba0618eda841b9bf61797f21c5ebd15a7a1e19daab76e4e4b498d515b2", size = 371028, upload-time = "2025-05-18T19:03:30.292Z" }, - { url = "https://files.pythonhosted.org/packages/fb/d9/9ee86173aae4576c35a2f50ae930d2ccb4c4c236f6cb9353267aa1d626b7/jiter-0.10.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:919d139cdfa8ae8945112398511cb7fca58a77382617d279556b344867a37e61", size = 491083, upload-time = "2025-05-18T19:03:31.654Z" }, - { url = "https://files.pythonhosted.org/packages/d9/2c/f955de55e74771493ac9e188b0f731524c6a995dffdcb8c255b89c6fb74b/jiter-0.10.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:13ddbc6ae311175a3b03bd8994881bc4635c923754932918e18da841632349db", size = 388821, upload-time = "2025-05-18T19:03:33.184Z" }, - { url = "https://files.pythonhosted.org/packages/81/5a/0e73541b6edd3f4aada586c24e50626c7815c561a7ba337d6a7eb0a915b4/jiter-0.10.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c440ea003ad10927a30521a9062ce10b5479592e8a70da27f21eeb457b4a9c5", size = 352174, upload-time = "2025-05-18T19:03:34.965Z" }, - { url = "https://files.pythonhosted.org/packages/1c/c0/61eeec33b8c75b31cae42be14d44f9e6fe3ac15a4e58010256ac3abf3638/jiter-0.10.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:dc347c87944983481e138dea467c0551080c86b9d21de6ea9306efb12ca8f606", size = 391869, upload-time = "2025-05-18T19:03:36.436Z" }, - { url = "https://files.pythonhosted.org/packages/41/22/5beb5ee4ad4ef7d86f5ea5b4509f680a20706c4a7659e74344777efb7739/jiter-0.10.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:13252b58c1f4d8c5b63ab103c03d909e8e1e7842d302473f482915d95fefd605", size = 523741, upload-time = "2025-05-18T19:03:38.168Z" }, - { url = "https://files.pythonhosted.org/packages/ea/10/768e8818538e5817c637b0df52e54366ec4cebc3346108a4457ea7a98f32/jiter-0.10.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7d1bbf3c465de4a24ab12fb7766a0003f6f9bce48b8b6a886158c4d569452dc5", size = 514527, upload-time = "2025-05-18T19:03:39.577Z" }, - { url = "https://files.pythonhosted.org/packages/73/6d/29b7c2dc76ce93cbedabfd842fc9096d01a0550c52692dfc33d3cc889815/jiter-0.10.0-cp311-cp311-win32.whl", hash = "sha256:db16e4848b7e826edca4ccdd5b145939758dadf0dc06e7007ad0e9cfb5928ae7", size = 210765, upload-time = "2025-05-18T19:03:41.271Z" }, - { url = "https://files.pythonhosted.org/packages/c2/c9/d394706deb4c660137caf13e33d05a031d734eb99c051142e039d8ceb794/jiter-0.10.0-cp311-cp311-win_amd64.whl", hash = "sha256:9c9c1d5f10e18909e993f9641f12fe1c77b3e9b533ee94ffa970acc14ded3812", size = 209234, upload-time = 
"2025-05-18T19:03:42.918Z" }, - { url = "https://files.pythonhosted.org/packages/6d/b5/348b3313c58f5fbfb2194eb4d07e46a35748ba6e5b3b3046143f3040bafa/jiter-0.10.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:1e274728e4a5345a6dde2d343c8da018b9d4bd4350f5a472fa91f66fda44911b", size = 312262, upload-time = "2025-05-18T19:03:44.637Z" }, - { url = "https://files.pythonhosted.org/packages/9c/4a/6a2397096162b21645162825f058d1709a02965606e537e3304b02742e9b/jiter-0.10.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7202ae396446c988cb2a5feb33a543ab2165b786ac97f53b59aafb803fef0744", size = 320124, upload-time = "2025-05-18T19:03:46.341Z" }, - { url = "https://files.pythonhosted.org/packages/2a/85/1ce02cade7516b726dd88f59a4ee46914bf79d1676d1228ef2002ed2f1c9/jiter-0.10.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23ba7722d6748b6920ed02a8f1726fb4b33e0fd2f3f621816a8b486c66410ab2", size = 345330, upload-time = "2025-05-18T19:03:47.596Z" }, - { url = "https://files.pythonhosted.org/packages/75/d0/bb6b4f209a77190ce10ea8d7e50bf3725fc16d3372d0a9f11985a2b23eff/jiter-0.10.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:371eab43c0a288537d30e1f0b193bc4eca90439fc08a022dd83e5e07500ed026", size = 369670, upload-time = "2025-05-18T19:03:49.334Z" }, - { url = "https://files.pythonhosted.org/packages/a0/f5/a61787da9b8847a601e6827fbc42ecb12be2c925ced3252c8ffcb56afcaf/jiter-0.10.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6c675736059020365cebc845a820214765162728b51ab1e03a1b7b3abb70f74c", size = 489057, upload-time = "2025-05-18T19:03:50.66Z" }, - { url = "https://files.pythonhosted.org/packages/12/e4/6f906272810a7b21406c760a53aadbe52e99ee070fc5c0cb191e316de30b/jiter-0.10.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0c5867d40ab716e4684858e4887489685968a47e3ba222e44cde6e4a2154f959", size = 389372, upload-time = "2025-05-18T19:03:51.98Z" }, - { url = 
"https://files.pythonhosted.org/packages/e2/ba/77013b0b8ba904bf3762f11e0129b8928bff7f978a81838dfcc958ad5728/jiter-0.10.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:395bb9a26111b60141757d874d27fdea01b17e8fac958b91c20128ba8f4acc8a", size = 352038, upload-time = "2025-05-18T19:03:53.703Z" }, - { url = "https://files.pythonhosted.org/packages/67/27/c62568e3ccb03368dbcc44a1ef3a423cb86778a4389e995125d3d1aaa0a4/jiter-0.10.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6842184aed5cdb07e0c7e20e5bdcfafe33515ee1741a6835353bb45fe5d1bd95", size = 391538, upload-time = "2025-05-18T19:03:55.046Z" }, - { url = "https://files.pythonhosted.org/packages/c0/72/0d6b7e31fc17a8fdce76164884edef0698ba556b8eb0af9546ae1a06b91d/jiter-0.10.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:62755d1bcea9876770d4df713d82606c8c1a3dca88ff39046b85a048566d56ea", size = 523557, upload-time = "2025-05-18T19:03:56.386Z" }, - { url = "https://files.pythonhosted.org/packages/2f/09/bc1661fbbcbeb6244bd2904ff3a06f340aa77a2b94e5a7373fd165960ea3/jiter-0.10.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:533efbce2cacec78d5ba73a41756beff8431dfa1694b6346ce7af3a12c42202b", size = 514202, upload-time = "2025-05-18T19:03:57.675Z" }, - { url = "https://files.pythonhosted.org/packages/1b/84/5a5d5400e9d4d54b8004c9673bbe4403928a00d28529ff35b19e9d176b19/jiter-0.10.0-cp312-cp312-win32.whl", hash = "sha256:8be921f0cadd245e981b964dfbcd6fd4bc4e254cdc069490416dd7a2632ecc01", size = 211781, upload-time = "2025-05-18T19:03:59.025Z" }, - { url = "https://files.pythonhosted.org/packages/9b/52/7ec47455e26f2d6e5f2ea4951a0652c06e5b995c291f723973ae9e724a65/jiter-0.10.0-cp312-cp312-win_amd64.whl", hash = "sha256:a7c7d785ae9dda68c2678532a5a1581347e9c15362ae9f6e68f3fdbfb64f2e49", size = 206176, upload-time = "2025-05-18T19:04:00.305Z" }, - { url = 
"https://files.pythonhosted.org/packages/2e/b0/279597e7a270e8d22623fea6c5d4eeac328e7d95c236ed51a2b884c54f70/jiter-0.10.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:e0588107ec8e11b6f5ef0e0d656fb2803ac6cf94a96b2b9fc675c0e3ab5e8644", size = 311617, upload-time = "2025-05-18T19:04:02.078Z" }, - { url = "https://files.pythonhosted.org/packages/91/e3/0916334936f356d605f54cc164af4060e3e7094364add445a3bc79335d46/jiter-0.10.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cafc4628b616dc32530c20ee53d71589816cf385dd9449633e910d596b1f5c8a", size = 318947, upload-time = "2025-05-18T19:04:03.347Z" }, - { url = "https://files.pythonhosted.org/packages/6a/8e/fd94e8c02d0e94539b7d669a7ebbd2776e51f329bb2c84d4385e8063a2ad/jiter-0.10.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:520ef6d981172693786a49ff5b09eda72a42e539f14788124a07530f785c3ad6", size = 344618, upload-time = "2025-05-18T19:04:04.709Z" }, - { url = "https://files.pythonhosted.org/packages/6f/b0/f9f0a2ec42c6e9c2e61c327824687f1e2415b767e1089c1d9135f43816bd/jiter-0.10.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:554dedfd05937f8fc45d17ebdf298fe7e0c77458232bcb73d9fbbf4c6455f5b3", size = 368829, upload-time = "2025-05-18T19:04:06.912Z" }, - { url = "https://files.pythonhosted.org/packages/e8/57/5bbcd5331910595ad53b9fd0c610392ac68692176f05ae48d6ce5c852967/jiter-0.10.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5bc299da7789deacf95f64052d97f75c16d4fc8c4c214a22bf8d859a4288a1c2", size = 491034, upload-time = "2025-05-18T19:04:08.222Z" }, - { url = "https://files.pythonhosted.org/packages/9b/be/c393df00e6e6e9e623a73551774449f2f23b6ec6a502a3297aeeece2c65a/jiter-0.10.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5161e201172de298a8a1baad95eb85db4fb90e902353b1f6a41d64ea64644e25", size = 388529, upload-time = "2025-05-18T19:04:09.566Z" }, - { url = 
"https://files.pythonhosted.org/packages/42/3e/df2235c54d365434c7f150b986a6e35f41ebdc2f95acea3036d99613025d/jiter-0.10.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e2227db6ba93cb3e2bf67c87e594adde0609f146344e8207e8730364db27041", size = 350671, upload-time = "2025-05-18T19:04:10.98Z" }, - { url = "https://files.pythonhosted.org/packages/c6/77/71b0b24cbcc28f55ab4dbfe029f9a5b73aeadaba677843fc6dc9ed2b1d0a/jiter-0.10.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:15acb267ea5e2c64515574b06a8bf393fbfee6a50eb1673614aa45f4613c0cca", size = 390864, upload-time = "2025-05-18T19:04:12.722Z" }, - { url = "https://files.pythonhosted.org/packages/6a/d3/ef774b6969b9b6178e1d1e7a89a3bd37d241f3d3ec5f8deb37bbd203714a/jiter-0.10.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:901b92f2e2947dc6dfcb52fd624453862e16665ea909a08398dde19c0731b7f4", size = 522989, upload-time = "2025-05-18T19:04:14.261Z" }, - { url = "https://files.pythonhosted.org/packages/0c/41/9becdb1d8dd5d854142f45a9d71949ed7e87a8e312b0bede2de849388cb9/jiter-0.10.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d0cb9a125d5a3ec971a094a845eadde2db0de85b33c9f13eb94a0c63d463879e", size = 513495, upload-time = "2025-05-18T19:04:15.603Z" }, - { url = "https://files.pythonhosted.org/packages/9c/36/3468e5a18238bdedae7c4d19461265b5e9b8e288d3f86cd89d00cbb48686/jiter-0.10.0-cp313-cp313-win32.whl", hash = "sha256:48a403277ad1ee208fb930bdf91745e4d2d6e47253eedc96e2559d1e6527006d", size = 211289, upload-time = "2025-05-18T19:04:17.541Z" }, - { url = "https://files.pythonhosted.org/packages/7e/07/1c96b623128bcb913706e294adb5f768fb7baf8db5e1338ce7b4ee8c78ef/jiter-0.10.0-cp313-cp313-win_amd64.whl", hash = "sha256:75f9eb72ecb640619c29bf714e78c9c46c9c4eaafd644bf78577ede459f330d4", size = 205074, upload-time = "2025-05-18T19:04:19.21Z" }, - { url = 
"https://files.pythonhosted.org/packages/54/46/caa2c1342655f57d8f0f2519774c6d67132205909c65e9aa8255e1d7b4f4/jiter-0.10.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:28ed2a4c05a1f32ef0e1d24c2611330219fed727dae01789f4a335617634b1ca", size = 318225, upload-time = "2025-05-18T19:04:20.583Z" }, - { url = "https://files.pythonhosted.org/packages/43/84/c7d44c75767e18946219ba2d703a5a32ab37b0bc21886a97bc6062e4da42/jiter-0.10.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:14a4c418b1ec86a195f1ca69da8b23e8926c752b685af665ce30777233dfe070", size = 350235, upload-time = "2025-05-18T19:04:22.363Z" }, - { url = "https://files.pythonhosted.org/packages/01/16/f5a0135ccd968b480daad0e6ab34b0c7c5ba3bc447e5088152696140dcb3/jiter-0.10.0-cp313-cp313t-win_amd64.whl", hash = "sha256:d7bfed2fe1fe0e4dda6ef682cee888ba444b21e7a6553e03252e4feb6cf0adca", size = 207278, upload-time = "2025-05-18T19:04:23.627Z" }, - { url = "https://files.pythonhosted.org/packages/1c/9b/1d646da42c3de6c2188fdaa15bce8ecb22b635904fc68be025e21249ba44/jiter-0.10.0-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:5e9251a5e83fab8d87799d3e1a46cb4b7f2919b895c6f4483629ed2446f66522", size = 310866, upload-time = "2025-05-18T19:04:24.891Z" }, - { url = "https://files.pythonhosted.org/packages/ad/0e/26538b158e8a7c7987e94e7aeb2999e2e82b1f9d2e1f6e9874ddf71ebda0/jiter-0.10.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:023aa0204126fe5b87ccbcd75c8a0d0261b9abdbbf46d55e7ae9f8e22424eeb8", size = 318772, upload-time = "2025-05-18T19:04:26.161Z" }, - { url = "https://files.pythonhosted.org/packages/7b/fb/d302893151caa1c2636d6574d213e4b34e31fd077af6050a9c5cbb42f6fb/jiter-0.10.0-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3c189c4f1779c05f75fc17c0c1267594ed918996a231593a21a5ca5438445216", size = 344534, upload-time = "2025-05-18T19:04:27.495Z" }, - { url = 
"https://files.pythonhosted.org/packages/01/d8/5780b64a149d74e347c5128d82176eb1e3241b1391ac07935693466d6219/jiter-0.10.0-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:15720084d90d1098ca0229352607cd68256c76991f6b374af96f36920eae13c4", size = 369087, upload-time = "2025-05-18T19:04:28.896Z" }, - { url = "https://files.pythonhosted.org/packages/e8/5b/f235a1437445160e777544f3ade57544daf96ba7e96c1a5b24a6f7ac7004/jiter-0.10.0-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e4f2fb68e5f1cfee30e2b2a09549a00683e0fde4c6a2ab88c94072fc33cb7426", size = 490694, upload-time = "2025-05-18T19:04:30.183Z" }, - { url = "https://files.pythonhosted.org/packages/85/a9/9c3d4617caa2ff89cf61b41e83820c27ebb3f7b5fae8a72901e8cd6ff9be/jiter-0.10.0-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ce541693355fc6da424c08b7edf39a2895f58d6ea17d92cc2b168d20907dee12", size = 388992, upload-time = "2025-05-18T19:04:32.028Z" }, - { url = "https://files.pythonhosted.org/packages/68/b1/344fd14049ba5c94526540af7eb661871f9c54d5f5601ff41a959b9a0bbd/jiter-0.10.0-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31c50c40272e189d50006ad5c73883caabb73d4e9748a688b216e85a9a9ca3b9", size = 351723, upload-time = "2025-05-18T19:04:33.467Z" }, - { url = "https://files.pythonhosted.org/packages/41/89/4c0e345041186f82a31aee7b9d4219a910df672b9fef26f129f0cda07a29/jiter-0.10.0-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fa3402a2ff9815960e0372a47b75c76979d74402448509ccd49a275fa983ef8a", size = 392215, upload-time = "2025-05-18T19:04:34.827Z" }, - { url = "https://files.pythonhosted.org/packages/55/58/ee607863e18d3f895feb802154a2177d7e823a7103f000df182e0f718b38/jiter-0.10.0-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:1956f934dca32d7bb647ea21d06d93ca40868b505c228556d3373cbd255ce853", size = 522762, upload-time = "2025-05-18T19:04:36.19Z" }, - { url = 
"https://files.pythonhosted.org/packages/15/d0/9123fb41825490d16929e73c212de9a42913d68324a8ce3c8476cae7ac9d/jiter-0.10.0-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:fcedb049bdfc555e261d6f65a6abe1d5ad68825b7202ccb9692636c70fcced86", size = 513427, upload-time = "2025-05-18T19:04:37.544Z" }, - { url = "https://files.pythonhosted.org/packages/d8/b3/2bd02071c5a2430d0b70403a34411fc519c2f227da7b03da9ba6a956f931/jiter-0.10.0-cp314-cp314-win32.whl", hash = "sha256:ac509f7eccca54b2a29daeb516fb95b6f0bd0d0d8084efaf8ed5dfc7b9f0b357", size = 210127, upload-time = "2025-05-18T19:04:38.837Z" }, - { url = "https://files.pythonhosted.org/packages/03/0c/5fe86614ea050c3ecd728ab4035534387cd41e7c1855ef6c031f1ca93e3f/jiter-0.10.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:5ed975b83a2b8639356151cef5c0d597c68376fc4922b45d0eb384ac058cfa00", size = 318527, upload-time = "2025-05-18T19:04:40.612Z" }, - { url = "https://files.pythonhosted.org/packages/b3/4a/4175a563579e884192ba6e81725fc0448b042024419be8d83aa8a80a3f44/jiter-0.10.0-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3aa96f2abba33dc77f79b4cf791840230375f9534e5fac927ccceb58c5e604a5", size = 354213, upload-time = "2025-05-18T19:04:41.894Z" }, -] - -[[package]] -name = "jmespath" -version = "1.0.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/00/2a/e867e8531cf3e36b41201936b7fa7ba7b5702dbef42922193f05c8976cd6/jmespath-1.0.1.tar.gz", hash = "sha256:90261b206d6defd58fdd5e85f478bf633a2901798906be2ad389150c5c60edbe", size = 25843, upload-time = "2022-06-17T18:00:12.224Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/31/b4/b9b800c45527aadd64d5b442f9b932b00648617eb5d63d2c7a6587b7cafc/jmespath-1.0.1-py3-none-any.whl", hash = "sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980", size = 20256, upload-time = "2022-06-17T18:00:10.251Z" }, -] - -[[package]] -name = "joblib" -version = "1.5.2" 
-source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e8/5d/447af5ea094b9e4c4054f82e223ada074c552335b9b4b2d14bd9b35a67c4/joblib-1.5.2.tar.gz", hash = "sha256:3faa5c39054b2f03ca547da9b2f52fde67c06240c31853f306aea97f13647b55", size = 331077, upload-time = "2025-08-27T12:15:46.575Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1e/e8/685f47e0d754320684db4425a0967f7d3fa70126bffd76110b7009a0090f/joblib-1.5.2-py3-none-any.whl", hash = "sha256:4e1f0bdbb987e6d843c70cf43714cb276623def372df3c22fe5266b2670bc241", size = 308396, upload-time = "2025-08-27T12:15:45.188Z" }, -] - -[[package]] -name = "jsonpath-ng" -version = "1.7.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "ply" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/6d/86/08646239a313f895186ff0a4573452038eed8c86f54380b3ebac34d32fb2/jsonpath-ng-1.7.0.tar.gz", hash = "sha256:f6f5f7fd4e5ff79c785f1573b394043b39849fb2bb47bcead935d12b00beab3c", size = 37838, upload-time = "2024-10-11T15:41:42.404Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/35/5a/73ecb3d82f8615f32ccdadeb9356726d6cae3a4bbc840b437ceb95708063/jsonpath_ng-1.7.0-py3-none-any.whl", hash = "sha256:f3d7f9e848cba1b6da28c55b1c26ff915dc9e0b1ba7e752a53d6da8d5cbd00b6", size = 30105, upload-time = "2024-11-20T17:58:30.418Z" }, -] - -[[package]] -name = "jsonschema" -version = "4.25.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "attrs" }, - { name = "jsonschema-specifications" }, - { name = "referencing" }, - { name = "rpds-py" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/74/69/f7185de793a29082a9f3c7728268ffb31cb5095131a9c139a74078e27336/jsonschema-4.25.1.tar.gz", hash = "sha256:e4a9655ce0da0c0b67a085847e00a3a51449e1157f4f75e9fb5aa545e122eb85", size = 357342, upload-time = "2025-08-18T17:03:50.038Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/bf/9c/8c95d856233c1f82500c2450b8c68576b4cf1c871db3afac5c34ff84e6fd/jsonschema-4.25.1-py3-none-any.whl", hash = "sha256:3fba0169e345c7175110351d456342c364814cfcf3b964ba4587f22915230a63", size = 90040, upload-time = "2025-08-18T17:03:48.373Z" }, -] - -[[package]] -name = "jsonschema-path" -version = "0.3.4" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pathable" }, - { name = "pyyaml" }, - { name = "referencing" }, - { name = "requests" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/6e/45/41ebc679c2a4fced6a722f624c18d658dee42612b83ea24c1caf7c0eb3a8/jsonschema_path-0.3.4.tar.gz", hash = "sha256:8365356039f16cc65fddffafda5f58766e34bebab7d6d105616ab52bc4297001", size = 11159, upload-time = "2025-01-24T14:33:16.547Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/cb/58/3485da8cb93d2f393bce453adeef16896751f14ba3e2024bc21dc9597646/jsonschema_path-0.3.4-py3-none-any.whl", hash = "sha256:f502191fdc2b22050f9a81c9237be9d27145b9001c55842bece5e94e382e52f8", size = 14810, upload-time = "2025-01-24T14:33:14.652Z" }, -] - -[[package]] -name = "jsonschema-specifications" -version = "2025.9.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "referencing" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/19/74/a633ee74eb36c44aa6d1095e7cc5569bebf04342ee146178e2d36600708b/jsonschema_specifications-2025.9.1.tar.gz", hash = "sha256:b540987f239e745613c7a9176f3edb72b832a4ac465cf02712288397832b5e8d", size = 32855, upload-time = "2025-09-08T01:34:59.186Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/41/45/1a4ed80516f02155c51f51e8cedb3c1902296743db0bbc66608a0db2814f/jsonschema_specifications-2025.9.1-py3-none-any.whl", hash = "sha256:98802fee3a11ee76ecaca44429fda8a41bff98b00a0f2838151b113f210cc6fe", size = 18437, upload-time = "2025-09-08T01:34:57.871Z" }, -] - -[[package]] -name = "kiwisolver" -version = "1.4.9" 
-source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/5c/3c/85844f1b0feb11ee581ac23fe5fce65cd049a200c1446708cc1b7f922875/kiwisolver-1.4.9.tar.gz", hash = "sha256:c3b22c26c6fd6811b0ae8363b95ca8ce4ea3c202d3d0975b2914310ceb1bcc4d", size = 97564, upload-time = "2025-08-10T21:27:49.279Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6f/ab/c80b0d5a9d8a1a65f4f815f2afff9798b12c3b9f31f1d304dd233dd920e2/kiwisolver-1.4.9-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:eb14a5da6dc7642b0f3a18f13654847cd8b7a2550e2645a5bda677862b03ba16", size = 124167, upload-time = "2025-08-10T21:25:53.403Z" }, - { url = "https://files.pythonhosted.org/packages/a0/c0/27fe1a68a39cf62472a300e2879ffc13c0538546c359b86f149cc19f6ac3/kiwisolver-1.4.9-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:39a219e1c81ae3b103643d2aedb90f1ef22650deb266ff12a19e7773f3e5f089", size = 66579, upload-time = "2025-08-10T21:25:54.79Z" }, - { url = "https://files.pythonhosted.org/packages/31/a2/a12a503ac1fd4943c50f9822678e8015a790a13b5490354c68afb8489814/kiwisolver-1.4.9-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2405a7d98604b87f3fc28b1716783534b1b4b8510d8142adca34ee0bc3c87543", size = 65309, upload-time = "2025-08-10T21:25:55.76Z" }, - { url = "https://files.pythonhosted.org/packages/66/e1/e533435c0be77c3f64040d68d7a657771194a63c279f55573188161e81ca/kiwisolver-1.4.9-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:dc1ae486f9abcef254b5618dfb4113dd49f94c68e3e027d03cf0143f3f772b61", size = 1435596, upload-time = "2025-08-10T21:25:56.861Z" }, - { url = "https://files.pythonhosted.org/packages/67/1e/51b73c7347f9aabdc7215aa79e8b15299097dc2f8e67dee2b095faca9cb0/kiwisolver-1.4.9-cp311-cp311-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8a1f570ce4d62d718dce3f179ee78dac3b545ac16c0c04bb363b7607a949c0d1", size = 1246548, upload-time = "2025-08-10T21:25:58.246Z" }, - { url = 
"https://files.pythonhosted.org/packages/21/aa/72a1c5d1e430294f2d32adb9542719cfb441b5da368d09d268c7757af46c/kiwisolver-1.4.9-cp311-cp311-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:cb27e7b78d716c591e88e0a09a2139c6577865d7f2e152488c2cc6257f460872", size = 1263618, upload-time = "2025-08-10T21:25:59.857Z" }, - { url = "https://files.pythonhosted.org/packages/a3/af/db1509a9e79dbf4c260ce0cfa3903ea8945f6240e9e59d1e4deb731b1a40/kiwisolver-1.4.9-cp311-cp311-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:15163165efc2f627eb9687ea5f3a28137217d217ac4024893d753f46bce9de26", size = 1317437, upload-time = "2025-08-10T21:26:01.105Z" }, - { url = "https://files.pythonhosted.org/packages/e0/f2/3ea5ee5d52abacdd12013a94130436e19969fa183faa1e7c7fbc89e9a42f/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:bdee92c56a71d2b24c33a7d4c2856bd6419d017e08caa7802d2963870e315028", size = 2195742, upload-time = "2025-08-10T21:26:02.675Z" }, - { url = "https://files.pythonhosted.org/packages/6f/9b/1efdd3013c2d9a2566aa6a337e9923a00590c516add9a1e89a768a3eb2fc/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:412f287c55a6f54b0650bd9b6dce5aceddb95864a1a90c87af16979d37c89771", size = 2290810, upload-time = "2025-08-10T21:26:04.009Z" }, - { url = "https://files.pythonhosted.org/packages/fb/e5/cfdc36109ae4e67361f9bc5b41323648cb24a01b9ade18784657e022e65f/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2c93f00dcba2eea70af2be5f11a830a742fe6b579a1d4e00f47760ef13be247a", size = 2461579, upload-time = "2025-08-10T21:26:05.317Z" }, - { url = "https://files.pythonhosted.org/packages/62/86/b589e5e86c7610842213994cdea5add00960076bef4ae290c5fa68589cac/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f117e1a089d9411663a3207ba874f31be9ac8eaa5b533787024dc07aeb74f464", size = 2268071, upload-time = "2025-08-10T21:26:06.686Z" }, - { url = 
"https://files.pythonhosted.org/packages/3b/c6/f8df8509fd1eee6c622febe54384a96cfaf4d43bf2ccec7a0cc17e4715c9/kiwisolver-1.4.9-cp311-cp311-win_amd64.whl", hash = "sha256:be6a04e6c79819c9a8c2373317d19a96048e5a3f90bec587787e86a1153883c2", size = 73840, upload-time = "2025-08-10T21:26:07.94Z" }, - { url = "https://files.pythonhosted.org/packages/e2/2d/16e0581daafd147bc11ac53f032a2b45eabac897f42a338d0a13c1e5c436/kiwisolver-1.4.9-cp311-cp311-win_arm64.whl", hash = "sha256:0ae37737256ba2de764ddc12aed4956460277f00c4996d51a197e72f62f5eec7", size = 65159, upload-time = "2025-08-10T21:26:09.048Z" }, - { url = "https://files.pythonhosted.org/packages/86/c9/13573a747838aeb1c76e3267620daa054f4152444d1f3d1a2324b78255b5/kiwisolver-1.4.9-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:ac5a486ac389dddcc5bef4f365b6ae3ffff2c433324fb38dd35e3fab7c957999", size = 123686, upload-time = "2025-08-10T21:26:10.034Z" }, - { url = "https://files.pythonhosted.org/packages/51/ea/2ecf727927f103ffd1739271ca19c424d0e65ea473fbaeea1c014aea93f6/kiwisolver-1.4.9-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f2ba92255faa7309d06fe44c3a4a97efe1c8d640c2a79a5ef728b685762a6fd2", size = 66460, upload-time = "2025-08-10T21:26:11.083Z" }, - { url = "https://files.pythonhosted.org/packages/5b/5a/51f5464373ce2aeb5194508298a508b6f21d3867f499556263c64c621914/kiwisolver-1.4.9-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4a2899935e724dd1074cb568ce7ac0dce28b2cd6ab539c8e001a8578eb106d14", size = 64952, upload-time = "2025-08-10T21:26:12.058Z" }, - { url = "https://files.pythonhosted.org/packages/70/90/6d240beb0f24b74371762873e9b7f499f1e02166a2d9c5801f4dbf8fa12e/kiwisolver-1.4.9-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f6008a4919fdbc0b0097089f67a1eb55d950ed7e90ce2cc3e640abadd2757a04", size = 1474756, upload-time = "2025-08-10T21:26:13.096Z" }, - { url = 
"https://files.pythonhosted.org/packages/12/42/f36816eaf465220f683fb711efdd1bbf7a7005a2473d0e4ed421389bd26c/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:67bb8b474b4181770f926f7b7d2f8c0248cbcb78b660fdd41a47054b28d2a752", size = 1276404, upload-time = "2025-08-10T21:26:14.457Z" }, - { url = "https://files.pythonhosted.org/packages/2e/64/bc2de94800adc830c476dce44e9b40fd0809cddeef1fde9fcf0f73da301f/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2327a4a30d3ee07d2fbe2e7933e8a37c591663b96ce42a00bc67461a87d7df77", size = 1294410, upload-time = "2025-08-10T21:26:15.73Z" }, - { url = "https://files.pythonhosted.org/packages/5f/42/2dc82330a70aa8e55b6d395b11018045e58d0bb00834502bf11509f79091/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:7a08b491ec91b1d5053ac177afe5290adacf1f0f6307d771ccac5de30592d198", size = 1343631, upload-time = "2025-08-10T21:26:17.045Z" }, - { url = "https://files.pythonhosted.org/packages/22/fd/f4c67a6ed1aab149ec5a8a401c323cee7a1cbe364381bb6c9c0d564e0e20/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d8fc5c867c22b828001b6a38d2eaeb88160bf5783c6cb4a5e440efc981ce286d", size = 2224963, upload-time = "2025-08-10T21:26:18.737Z" }, - { url = "https://files.pythonhosted.org/packages/45/aa/76720bd4cb3713314677d9ec94dcc21ced3f1baf4830adde5bb9b2430a5f/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:3b3115b2581ea35bb6d1f24a4c90af37e5d9b49dcff267eeed14c3893c5b86ab", size = 2321295, upload-time = "2025-08-10T21:26:20.11Z" }, - { url = "https://files.pythonhosted.org/packages/80/19/d3ec0d9ab711242f56ae0dc2fc5d70e298bb4a1f9dfab44c027668c673a1/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:858e4c22fb075920b96a291928cb7dea5644e94c0ee4fcd5af7e865655e4ccf2", size = 2487987, upload-time = "2025-08-10T21:26:21.49Z" }, - { url = 
"https://files.pythonhosted.org/packages/39/e9/61e4813b2c97e86b6fdbd4dd824bf72d28bcd8d4849b8084a357bc0dd64d/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ed0fecd28cc62c54b262e3736f8bb2512d8dcfdc2bcf08be5f47f96bf405b145", size = 2291817, upload-time = "2025-08-10T21:26:22.812Z" }, - { url = "https://files.pythonhosted.org/packages/a0/41/85d82b0291db7504da3c2defe35c9a8a5c9803a730f297bd823d11d5fb77/kiwisolver-1.4.9-cp312-cp312-win_amd64.whl", hash = "sha256:f68208a520c3d86ea51acf688a3e3002615a7f0238002cccc17affecc86a8a54", size = 73895, upload-time = "2025-08-10T21:26:24.37Z" }, - { url = "https://files.pythonhosted.org/packages/e2/92/5f3068cf15ee5cb624a0c7596e67e2a0bb2adee33f71c379054a491d07da/kiwisolver-1.4.9-cp312-cp312-win_arm64.whl", hash = "sha256:2c1a4f57df73965f3f14df20b80ee29e6a7930a57d2d9e8491a25f676e197c60", size = 64992, upload-time = "2025-08-10T21:26:25.732Z" }, - { url = "https://files.pythonhosted.org/packages/31/c1/c2686cda909742ab66c7388e9a1a8521a59eb89f8bcfbee28fc980d07e24/kiwisolver-1.4.9-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a5d0432ccf1c7ab14f9949eec60c5d1f924f17c037e9f8b33352fa05799359b8", size = 123681, upload-time = "2025-08-10T21:26:26.725Z" }, - { url = "https://files.pythonhosted.org/packages/ca/f0/f44f50c9f5b1a1860261092e3bc91ecdc9acda848a8b8c6abfda4a24dd5c/kiwisolver-1.4.9-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efb3a45b35622bb6c16dbfab491a8f5a391fe0e9d45ef32f4df85658232ca0e2", size = 66464, upload-time = "2025-08-10T21:26:27.733Z" }, - { url = "https://files.pythonhosted.org/packages/2d/7a/9d90a151f558e29c3936b8a47ac770235f436f2120aca41a6d5f3d62ae8d/kiwisolver-1.4.9-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1a12cf6398e8a0a001a059747a1cbf24705e18fe413bc22de7b3d15c67cffe3f", size = 64961, upload-time = "2025-08-10T21:26:28.729Z" }, - { url = 
"https://files.pythonhosted.org/packages/e9/e9/f218a2cb3a9ffbe324ca29a9e399fa2d2866d7f348ec3a88df87fc248fc5/kiwisolver-1.4.9-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b67e6efbf68e077dd71d1a6b37e43e1a99d0bff1a3d51867d45ee8908b931098", size = 1474607, upload-time = "2025-08-10T21:26:29.798Z" }, - { url = "https://files.pythonhosted.org/packages/d9/28/aac26d4c882f14de59041636292bc838db8961373825df23b8eeb807e198/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5656aa670507437af0207645273ccdfee4f14bacd7f7c67a4306d0dcaeaf6eed", size = 1276546, upload-time = "2025-08-10T21:26:31.401Z" }, - { url = "https://files.pythonhosted.org/packages/8b/ad/8bfc1c93d4cc565e5069162f610ba2f48ff39b7de4b5b8d93f69f30c4bed/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:bfc08add558155345129c7803b3671cf195e6a56e7a12f3dde7c57d9b417f525", size = 1294482, upload-time = "2025-08-10T21:26:32.721Z" }, - { url = "https://files.pythonhosted.org/packages/da/f1/6aca55ff798901d8ce403206d00e033191f63d82dd708a186e0ed2067e9c/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:40092754720b174e6ccf9e845d0d8c7d8e12c3d71e7fc35f55f3813e96376f78", size = 1343720, upload-time = "2025-08-10T21:26:34.032Z" }, - { url = "https://files.pythonhosted.org/packages/d1/91/eed031876c595c81d90d0f6fc681ece250e14bf6998c3d7c419466b523b7/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:497d05f29a1300d14e02e6441cf0f5ee81c1ff5a304b0d9fb77423974684e08b", size = 2224907, upload-time = "2025-08-10T21:26:35.824Z" }, - { url = "https://files.pythonhosted.org/packages/e9/ec/4d1925f2e49617b9cca9c34bfa11adefad49d00db038e692a559454dfb2e/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:bdd1a81a1860476eb41ac4bc1e07b3f07259e6d55bbf739b79c8aaedcf512799", size = 2321334, upload-time = "2025-08-10T21:26:37.534Z" }, - { url = 
"https://files.pythonhosted.org/packages/43/cb/450cd4499356f68802750c6ddc18647b8ea01ffa28f50d20598e0befe6e9/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:e6b93f13371d341afee3be9f7c5964e3fe61d5fa30f6a30eb49856935dfe4fc3", size = 2488313, upload-time = "2025-08-10T21:26:39.191Z" }, - { url = "https://files.pythonhosted.org/packages/71/67/fc76242bd99f885651128a5d4fa6083e5524694b7c88b489b1b55fdc491d/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:d75aa530ccfaa593da12834b86a0724f58bff12706659baa9227c2ccaa06264c", size = 2291970, upload-time = "2025-08-10T21:26:40.828Z" }, - { url = "https://files.pythonhosted.org/packages/75/bd/f1a5d894000941739f2ae1b65a32892349423ad49c2e6d0771d0bad3fae4/kiwisolver-1.4.9-cp313-cp313-win_amd64.whl", hash = "sha256:dd0a578400839256df88c16abddf9ba14813ec5f21362e1fe65022e00c883d4d", size = 73894, upload-time = "2025-08-10T21:26:42.33Z" }, - { url = "https://files.pythonhosted.org/packages/95/38/dce480814d25b99a391abbddadc78f7c117c6da34be68ca8b02d5848b424/kiwisolver-1.4.9-cp313-cp313-win_arm64.whl", hash = "sha256:d4188e73af84ca82468f09cadc5ac4db578109e52acb4518d8154698d3a87ca2", size = 64995, upload-time = "2025-08-10T21:26:43.889Z" }, - { url = "https://files.pythonhosted.org/packages/e2/37/7d218ce5d92dadc5ebdd9070d903e0c7cf7edfe03f179433ac4d13ce659c/kiwisolver-1.4.9-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:5a0f2724dfd4e3b3ac5a82436a8e6fd16baa7d507117e4279b660fe8ca38a3a1", size = 126510, upload-time = "2025-08-10T21:26:44.915Z" }, - { url = "https://files.pythonhosted.org/packages/23/b0/e85a2b48233daef4b648fb657ebbb6f8367696a2d9548a00b4ee0eb67803/kiwisolver-1.4.9-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1b11d6a633e4ed84fc0ddafd4ebfd8ea49b3f25082c04ad12b8315c11d504dc1", size = 67903, upload-time = "2025-08-10T21:26:45.934Z" }, - { url = 
"https://files.pythonhosted.org/packages/44/98/f2425bc0113ad7de24da6bb4dae1343476e95e1d738be7c04d31a5d037fd/kiwisolver-1.4.9-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61874cdb0a36016354853593cffc38e56fc9ca5aa97d2c05d3dcf6922cd55a11", size = 66402, upload-time = "2025-08-10T21:26:47.101Z" }, - { url = "https://files.pythonhosted.org/packages/98/d8/594657886df9f34c4177cc353cc28ca7e6e5eb562d37ccc233bff43bbe2a/kiwisolver-1.4.9-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:60c439763a969a6af93b4881db0eed8fadf93ee98e18cbc35bc8da868d0c4f0c", size = 1582135, upload-time = "2025-08-10T21:26:48.665Z" }, - { url = "https://files.pythonhosted.org/packages/5c/c6/38a115b7170f8b306fc929e166340c24958347308ea3012c2b44e7e295db/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92a2f997387a1b79a75e7803aa7ded2cfbe2823852ccf1ba3bcf613b62ae3197", size = 1389409, upload-time = "2025-08-10T21:26:50.335Z" }, - { url = "https://files.pythonhosted.org/packages/bf/3b/e04883dace81f24a568bcee6eb3001da4ba05114afa622ec9b6fafdc1f5e/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a31d512c812daea6d8b3be3b2bfcbeb091dbb09177706569bcfc6240dcf8b41c", size = 1401763, upload-time = "2025-08-10T21:26:51.867Z" }, - { url = "https://files.pythonhosted.org/packages/9f/80/20ace48e33408947af49d7d15c341eaee69e4e0304aab4b7660e234d6288/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:52a15b0f35dad39862d376df10c5230155243a2c1a436e39eb55623ccbd68185", size = 1453643, upload-time = "2025-08-10T21:26:53.592Z" }, - { url = "https://files.pythonhosted.org/packages/64/31/6ce4380a4cd1f515bdda976a1e90e547ccd47b67a1546d63884463c92ca9/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a30fd6fdef1430fd9e1ba7b3398b5ee4e2887783917a687d86ba69985fb08748", size = 2330818, upload-time = "2025-08-10T21:26:55.051Z" }, - { url = 
"https://files.pythonhosted.org/packages/fa/e9/3f3fcba3bcc7432c795b82646306e822f3fd74df0ee81f0fa067a1f95668/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:cc9617b46837c6468197b5945e196ee9ca43057bb7d9d1ae688101e4e1dddf64", size = 2419963, upload-time = "2025-08-10T21:26:56.421Z" }, - { url = "https://files.pythonhosted.org/packages/99/43/7320c50e4133575c66e9f7dadead35ab22d7c012a3b09bb35647792b2a6d/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:0ab74e19f6a2b027ea4f845a78827969af45ce790e6cb3e1ebab71bdf9f215ff", size = 2594639, upload-time = "2025-08-10T21:26:57.882Z" }, - { url = "https://files.pythonhosted.org/packages/65/d6/17ae4a270d4a987ef8a385b906d2bdfc9fce502d6dc0d3aea865b47f548c/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dba5ee5d3981160c28d5490f0d1b7ed730c22470ff7f6cc26cfcfaacb9896a07", size = 2391741, upload-time = "2025-08-10T21:26:59.237Z" }, - { url = "https://files.pythonhosted.org/packages/2a/8f/8f6f491d595a9e5912971f3f863d81baddccc8a4d0c3749d6a0dd9ffc9df/kiwisolver-1.4.9-cp313-cp313t-win_arm64.whl", hash = "sha256:0749fd8f4218ad2e851e11cc4dc05c7cbc0cbc4267bdfdb31782e65aace4ee9c", size = 68646, upload-time = "2025-08-10T21:27:00.52Z" }, - { url = "https://files.pythonhosted.org/packages/6b/32/6cc0fbc9c54d06c2969faa9c1d29f5751a2e51809dd55c69055e62d9b426/kiwisolver-1.4.9-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:9928fe1eb816d11ae170885a74d074f57af3a0d65777ca47e9aeb854a1fba386", size = 123806, upload-time = "2025-08-10T21:27:01.537Z" }, - { url = "https://files.pythonhosted.org/packages/b2/dd/2bfb1d4a4823d92e8cbb420fe024b8d2167f72079b3bb941207c42570bdf/kiwisolver-1.4.9-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d0005b053977e7b43388ddec89fa567f43d4f6d5c2c0affe57de5ebf290dc552", size = 66605, upload-time = "2025-08-10T21:27:03.335Z" }, - { url = 
"https://files.pythonhosted.org/packages/f7/69/00aafdb4e4509c2ca6064646cba9cd4b37933898f426756adb2cb92ebbed/kiwisolver-1.4.9-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:2635d352d67458b66fd0667c14cb1d4145e9560d503219034a18a87e971ce4f3", size = 64925, upload-time = "2025-08-10T21:27:04.339Z" }, - { url = "https://files.pythonhosted.org/packages/43/dc/51acc6791aa14e5cb6d8a2e28cefb0dc2886d8862795449d021334c0df20/kiwisolver-1.4.9-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:767c23ad1c58c9e827b649a9ab7809fd5fd9db266a9cf02b0e926ddc2c680d58", size = 1472414, upload-time = "2025-08-10T21:27:05.437Z" }, - { url = "https://files.pythonhosted.org/packages/3d/bb/93fa64a81db304ac8a246f834d5094fae4b13baf53c839d6bb6e81177129/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:72d0eb9fba308b8311685c2268cf7d0a0639a6cd027d8128659f72bdd8a024b4", size = 1281272, upload-time = "2025-08-10T21:27:07.063Z" }, - { url = "https://files.pythonhosted.org/packages/70/e6/6df102916960fb8d05069d4bd92d6d9a8202d5a3e2444494e7cd50f65b7a/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f68e4f3eeca8fb22cc3d731f9715a13b652795ef657a13df1ad0c7dc0e9731df", size = 1298578, upload-time = "2025-08-10T21:27:08.452Z" }, - { url = "https://files.pythonhosted.org/packages/7c/47/e142aaa612f5343736b087864dbaebc53ea8831453fb47e7521fa8658f30/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d84cd4061ae292d8ac367b2c3fa3aad11cb8625a95d135fe93f286f914f3f5a6", size = 1345607, upload-time = "2025-08-10T21:27:10.125Z" }, - { url = "https://files.pythonhosted.org/packages/54/89/d641a746194a0f4d1a3670fb900d0dbaa786fb98341056814bc3f058fa52/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:a60ea74330b91bd22a29638940d115df9dc00af5035a9a2a6ad9399ffb4ceca5", size = 2230150, upload-time = "2025-08-10T21:27:11.484Z" }, - { url = 
"https://files.pythonhosted.org/packages/aa/6b/5ee1207198febdf16ac11f78c5ae40861b809cbe0e6d2a8d5b0b3044b199/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:ce6a3a4e106cf35c2d9c4fa17c05ce0b180db622736845d4315519397a77beaf", size = 2325979, upload-time = "2025-08-10T21:27:12.917Z" }, - { url = "https://files.pythonhosted.org/packages/fc/ff/b269eefd90f4ae14dcc74973d5a0f6d28d3b9bb1afd8c0340513afe6b39a/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:77937e5e2a38a7b48eef0585114fe7930346993a88060d0bf886086d2aa49ef5", size = 2491456, upload-time = "2025-08-10T21:27:14.353Z" }, - { url = "https://files.pythonhosted.org/packages/fc/d4/10303190bd4d30de547534601e259a4fbf014eed94aae3e5521129215086/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:24c175051354f4a28c5d6a31c93906dc653e2bf234e8a4bbfb964892078898ce", size = 2294621, upload-time = "2025-08-10T21:27:15.808Z" }, - { url = "https://files.pythonhosted.org/packages/28/e0/a9a90416fce5c0be25742729c2ea52105d62eda6c4be4d803c2a7be1fa50/kiwisolver-1.4.9-cp314-cp314-win_amd64.whl", hash = "sha256:0763515d4df10edf6d06a3c19734e2566368980d21ebec439f33f9eb936c07b7", size = 75417, upload-time = "2025-08-10T21:27:17.436Z" }, - { url = "https://files.pythonhosted.org/packages/1f/10/6949958215b7a9a264299a7db195564e87900f709db9245e4ebdd3c70779/kiwisolver-1.4.9-cp314-cp314-win_arm64.whl", hash = "sha256:0e4e2bf29574a6a7b7f6cb5fa69293b9f96c928949ac4a53ba3f525dffb87f9c", size = 66582, upload-time = "2025-08-10T21:27:18.436Z" }, - { url = "https://files.pythonhosted.org/packages/ec/79/60e53067903d3bc5469b369fe0dfc6b3482e2133e85dae9daa9527535991/kiwisolver-1.4.9-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:d976bbb382b202f71c67f77b0ac11244021cfa3f7dfd9e562eefcea2df711548", size = 126514, upload-time = "2025-08-10T21:27:19.465Z" }, - { url = 
"https://files.pythonhosted.org/packages/25/d1/4843d3e8d46b072c12a38c97c57fab4608d36e13fe47d47ee96b4d61ba6f/kiwisolver-1.4.9-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2489e4e5d7ef9a1c300a5e0196e43d9c739f066ef23270607d45aba368b91f2d", size = 67905, upload-time = "2025-08-10T21:27:20.51Z" }, - { url = "https://files.pythonhosted.org/packages/8c/ae/29ffcbd239aea8b93108de1278271ae764dfc0d803a5693914975f200596/kiwisolver-1.4.9-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:e2ea9f7ab7fbf18fffb1b5434ce7c69a07582f7acc7717720f1d69f3e806f90c", size = 66399, upload-time = "2025-08-10T21:27:21.496Z" }, - { url = "https://files.pythonhosted.org/packages/a1/ae/d7ba902aa604152c2ceba5d352d7b62106bedbccc8e95c3934d94472bfa3/kiwisolver-1.4.9-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b34e51affded8faee0dfdb705416153819d8ea9250bbbf7ea1b249bdeb5f1122", size = 1582197, upload-time = "2025-08-10T21:27:22.604Z" }, - { url = "https://files.pythonhosted.org/packages/f2/41/27c70d427eddb8bc7e4f16420a20fefc6f480312122a59a959fdfe0445ad/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d8aacd3d4b33b772542b2e01beb50187536967b514b00003bdda7589722d2a64", size = 1390125, upload-time = "2025-08-10T21:27:24.036Z" }, - { url = "https://files.pythonhosted.org/packages/41/42/b3799a12bafc76d962ad69083f8b43b12bf4fe78b097b12e105d75c9b8f1/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7cf974dd4e35fa315563ac99d6287a1024e4dc2077b8a7d7cd3d2fb65d283134", size = 1402612, upload-time = "2025-08-10T21:27:25.773Z" }, - { url = "https://files.pythonhosted.org/packages/d2/b5/a210ea073ea1cfaca1bb5c55a62307d8252f531beb364e18aa1e0888b5a0/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:85bd218b5ecfbee8c8a82e121802dcb519a86044c9c3b2e4aef02fa05c6da370", size = 1453990, upload-time = "2025-08-10T21:27:27.089Z" }, - { url = 
"https://files.pythonhosted.org/packages/5f/ce/a829eb8c033e977d7ea03ed32fb3c1781b4fa0433fbadfff29e39c676f32/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:0856e241c2d3df4efef7c04a1e46b1936b6120c9bcf36dd216e3acd84bc4fb21", size = 2331601, upload-time = "2025-08-10T21:27:29.343Z" }, - { url = "https://files.pythonhosted.org/packages/e0/4b/b5e97eb142eb9cd0072dacfcdcd31b1c66dc7352b0f7c7255d339c0edf00/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:9af39d6551f97d31a4deebeac6f45b156f9755ddc59c07b402c148f5dbb6482a", size = 2422041, upload-time = "2025-08-10T21:27:30.754Z" }, - { url = "https://files.pythonhosted.org/packages/40/be/8eb4cd53e1b85ba4edc3a9321666f12b83113a178845593307a3e7891f44/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:bb4ae2b57fc1d8cbd1cf7b1d9913803681ffa903e7488012be5b76dedf49297f", size = 2594897, upload-time = "2025-08-10T21:27:32.803Z" }, - { url = "https://files.pythonhosted.org/packages/99/dd/841e9a66c4715477ea0abc78da039832fbb09dac5c35c58dc4c41a407b8a/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:aedff62918805fb62d43a4aa2ecd4482c380dc76cd31bd7c8878588a61bd0369", size = 2391835, upload-time = "2025-08-10T21:27:34.23Z" }, - { url = "https://files.pythonhosted.org/packages/0c/28/4b2e5c47a0da96896fdfdb006340ade064afa1e63675d01ea5ac222b6d52/kiwisolver-1.4.9-cp314-cp314t-win_amd64.whl", hash = "sha256:1fa333e8b2ce4d9660f2cda9c0e1b6bafcfb2457a9d259faa82289e73ec24891", size = 79988, upload-time = "2025-08-10T21:27:35.587Z" }, - { url = "https://files.pythonhosted.org/packages/80/be/3578e8afd18c88cdf9cb4cffde75a96d2be38c5a903f1ed0ceec061bd09e/kiwisolver-1.4.9-cp314-cp314t-win_arm64.whl", hash = "sha256:4a48a2ce79d65d363597ef7b567ce3d14d68783d2b2263d98db3d9477805ba32", size = 70260, upload-time = "2025-08-10T21:27:36.606Z" }, - { url = 
"https://files.pythonhosted.org/packages/a3/0f/36d89194b5a32c054ce93e586d4049b6c2c22887b0eb229c61c68afd3078/kiwisolver-1.4.9-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:720e05574713db64c356e86732c0f3c5252818d05f9df320f0ad8380641acea5", size = 60104, upload-time = "2025-08-10T21:27:43.287Z" }, - { url = "https://files.pythonhosted.org/packages/52/ba/4ed75f59e4658fd21fe7dde1fee0ac397c678ec3befba3fe6482d987af87/kiwisolver-1.4.9-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:17680d737d5335b552994a2008fab4c851bcd7de33094a82067ef3a576ff02fa", size = 58592, upload-time = "2025-08-10T21:27:44.314Z" }, - { url = "https://files.pythonhosted.org/packages/33/01/a8ea7c5ea32a9b45ceeaee051a04c8ed4320f5add3c51bfa20879b765b70/kiwisolver-1.4.9-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:85b5352f94e490c028926ea567fc569c52ec79ce131dadb968d3853e809518c2", size = 80281, upload-time = "2025-08-10T21:27:45.369Z" }, - { url = "https://files.pythonhosted.org/packages/da/e3/dbd2ecdce306f1d07a1aaf324817ee993aab7aee9db47ceac757deabafbe/kiwisolver-1.4.9-pp311-pypy311_pp73-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:464415881e4801295659462c49461a24fb107c140de781d55518c4b80cb6790f", size = 78009, upload-time = "2025-08-10T21:27:46.376Z" }, - { url = "https://files.pythonhosted.org/packages/da/e9/0d4add7873a73e462aeb45c036a2dead2562b825aa46ba326727b3f31016/kiwisolver-1.4.9-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:fb940820c63a9590d31d88b815e7a3aa5915cad3ce735ab45f0c730b39547de1", size = 73929, upload-time = "2025-08-10T21:27:48.236Z" }, -] - -[[package]] -name = "kuzu" -version = "0.11.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d8/7c/d2c9355054a67a79ec0cc516b3fad68d970245a1a6f5173eaa2bf94d1782/kuzu-0.11.0.tar.gz", hash = "sha256:34b9fe2d9f94421585f921cb0513bd584842a5705ae757c09fd075e23acb42d7", size = 4897335, upload-time = 
"2025-07-13T18:37:37.009Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c7/32/f60c8cd9f3ceb4ff75fb4a2e9c9ea02ad40ae50323e14f71fd8440c4eb70/kuzu-0.11.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8f20ea8c608bb40d6e15d32538a903ace177464c90aa88ee542e99814bf78881", size = 3694199, upload-time = "2025-07-13T18:36:46.867Z" }, - { url = "https://files.pythonhosted.org/packages/a7/33/544e65c08ce49f41e2ee35cd8576df602c87cc58b033cd10f9d7847cc98f/kuzu-0.11.0-cp311-cp311-macosx_11_0_x86_64.whl", hash = "sha256:df94c3beaf57d2c3ac84ce4087fc210c09e9ff5b5c9863a496b274bbc82f0a3f", size = 4092338, upload-time = "2025-07-13T18:36:48.185Z" }, - { url = "https://files.pythonhosted.org/packages/46/da/bd305260c82fe40d1d1e1710cd20a538160c0dd858559568cebe5e3ad5b7/kuzu-0.11.0-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0457283aaf75bcd7912dcdf0292adaabdd615db654b09435387637a70cbae28d", size = 6201525, upload-time = "2025-07-13T18:36:49.601Z" }, - { url = "https://files.pythonhosted.org/packages/40/98/dfc00fca1c126a2eb678cb75cca4d966b902450f5215cab1ca221bb0dbc9/kuzu-0.11.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3a474c74aa7953cca399862dce2098fc5bbc94f4d83b04d891688fe6fb2e14c4", size = 6980556, upload-time = "2025-07-13T18:36:51.402Z" }, - { url = "https://files.pythonhosted.org/packages/4d/58/fe2f00687531c02b6b4a636a4ff2603d161d504ace4ca2d01878db87793a/kuzu-0.11.0-cp311-cp311-win_amd64.whl", hash = "sha256:139c0b52cc2037ee03243335f37734fc30fe20b8d94b7dea66a1ee8ad44e5b16", size = 4289032, upload-time = "2025-07-13T18:36:53.395Z" }, - { url = "https://files.pythonhosted.org/packages/08/ee/c172bd487e6b11734db2febc03f0b5517225bafbfe144a080f265569b010/kuzu-0.11.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f200955e3af6a64ecb3f8db24e88d2620e4f04cfe958f580d614d6fca4b7b73d", size = 3693481, upload-time = "2025-07-13T18:36:54.828Z" }, - { url = 
"https://files.pythonhosted.org/packages/56/c0/1a4f466366454e0657e3f6de8b9fd649a2a12e7c72d86f1d341dc264e927/kuzu-0.11.0-cp312-cp312-macosx_11_0_x86_64.whl", hash = "sha256:6de9af1886401cdec89e41bbe67fdd37b562bdc39ad81b4cc62c4c7e5703e23e", size = 4094896, upload-time = "2025-07-13T18:36:56.204Z" }, - { url = "https://files.pythonhosted.org/packages/b4/02/387ad1d493944f5ab7b2dc521ee5adf45b2e6d1b549e6ed1876192c847bd/kuzu-0.11.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3843f4107c287c9759d34e0082feea84d8f48366033ea191a58518baf3a9e2d8", size = 6201276, upload-time = "2025-07-13T18:36:57.617Z" }, - { url = "https://files.pythonhosted.org/packages/2f/e5/678dab0df8cd47b61d4d82f9ba4fd46e92f98689ec4031c19911880dbce8/kuzu-0.11.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3c0bdb0cbc7be83eb3e5e6c999e8c6add4cb88d26468c67205b3062fe01af859", size = 6979740, upload-time = "2025-07-13T18:36:59.293Z" }, - { url = "https://files.pythonhosted.org/packages/bb/ff/8368ed24f2cd90769b604b6c86ee9f01adcc024adc4a6f0ef4564a484672/kuzu-0.11.0-cp312-cp312-win_amd64.whl", hash = "sha256:a74660da390adb1996b5c8305e442304bfdb84b40424f2b045a7d4977ae22f34", size = 4289700, upload-time = "2025-07-13T18:37:01.023Z" }, - { url = "https://files.pythonhosted.org/packages/e7/22/b1577470c1e142272cc3646cd68ec13dc06b68bfe26869c1339e3ba8a1b0/kuzu-0.11.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d3b928a6646aad0a4284a07918140761f70626e936976c7bc9a1504395029353", size = 3693508, upload-time = "2025-07-13T18:37:02.4Z" }, - { url = "https://files.pythonhosted.org/packages/af/7c/c97de999c782860bff2a223d07afaa71c9ae4e0a214a1d7c3db866cf9157/kuzu-0.11.0-cp313-cp313-macosx_11_0_x86_64.whl", hash = "sha256:5a995172d99e961fe2ff073722a447d335dca608d566fc924520f1bfea4f97cf", size = 4095016, upload-time = "2025-07-13T18:37:03.742Z" }, - { url = 
"https://files.pythonhosted.org/packages/2a/df/c9d63b4a3835b944d042add771bdfbaca5bd61a1490b78492e4e299c948f/kuzu-0.11.0-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:836af97ba5159a59e55cb336869f45987d74d9875bd97caae31af5244f8b99e8", size = 6201752, upload-time = "2025-07-13T18:37:05.756Z" }, - { url = "https://files.pythonhosted.org/packages/e6/8d/55226444b7607d81299e3ff1d47ae4ad76149c0fd266ae7fe04eab52060e/kuzu-0.11.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7ee8559686eac9f874d125708f9a83f1dca09bb165e5b838c6c0ad521cce68ee", size = 6979587, upload-time = "2025-07-13T18:37:07.468Z" }, - { url = "https://files.pythonhosted.org/packages/a7/19/1e19851f7229953cd696df9983b953dcc2c0cc1f0ae81e02be9eddd2b379/kuzu-0.11.0-cp313-cp313-win_amd64.whl", hash = "sha256:7ae94e8add6b5cc25f3cf2a38a07f3c4a4acb9b636078be8a53ac3e8f736d6ba", size = 4289847, upload-time = "2025-07-13T18:37:09.08Z" }, - { url = "https://files.pythonhosted.org/packages/9f/2a/f4579d9b7a8dd205bfc1af89596ed3cbcfea3c0bdf14206083fea509c545/kuzu-0.11.0-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3667b430de2efbc96e45878e460851d1aa8aa94be96fa5d4d82186f19a95889a", size = 6204963, upload-time = "2025-07-13T18:37:10.637Z" }, - { url = "https://files.pythonhosted.org/packages/ff/bd/a827d5eff7a7abd577841bbe71f8df485501ca8f0250ddbe29c7edf67e6e/kuzu-0.11.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4162d80861e606f4d82d6e559fc11c0d7efa7725a6dc811c61bcd266a2963705", size = 6982953, upload-time = "2025-07-13T18:37:12.429Z" }, - { url = "https://files.pythonhosted.org/packages/03/19/6d41056e2d429ddb19396d992dee5f7804cdb3bee160d53c3cbf97c0f251/kuzu-0.11.0-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7da89fb506be064ebb7d3954f9ffb6e9c0f9ef9c10f37be59a347a0bc48efd28", size = 6202100, upload-time = "2025-07-13T18:37:14.156Z" }, - { url = 
"https://files.pythonhosted.org/packages/ea/a7/13585d872b65263da8e83c77100914fbaafe91fea11160151a61cf111e03/kuzu-0.11.0-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b17cc92a925073a3bbd65e05af59a9c0c931e1573755d7ad340705059d849af7", size = 6205072, upload-time = "2025-07-13T18:37:15.907Z" }, -] - -[[package]] -name = "lance-namespace" -version = "0.0.6" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "lance-namespace-urllib3-client" }, - { name = "pyarrow" }, - { name = "pylance" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/55/07/5e809f1053a53bdbe0a8f461a710bbf7e1b3119e1432a60b46b648d51ba3/lance_namespace-0.0.6.tar.gz", hash = "sha256:3eeeba5f6bb8d01504cda33d86e6c22bd9cefb1f6f3aac1f963d46a9ff09b9a0", size = 11973, upload-time = "2025-08-20T19:28:03.213Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/25/c1/35bb590f9a9421f02b5d4440c975b6852becaad8292b5007994a8d3fe0cd/lance_namespace-0.0.6-py3-none-any.whl", hash = "sha256:fd102aec0ca3672b15cae65f4b9bf15086f7a73cedb7f5c12c47b5b48f9090b4", size = 9050, upload-time = "2025-08-20T19:28:02.535Z" }, -] - -[[package]] -name = "lance-namespace-urllib3-client" -version = "0.0.14" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pydantic" }, - { name = "python-dateutil" }, - { name = "typing-extensions" }, - { name = "urllib3" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/43/09/727f5749da387a16ffd342339d859073e950ae451f66554bfba8e8adac71/lance_namespace_urllib3_client-0.0.14.tar.gz", hash = "sha256:911c6a3b5c2c98f4239b6d96609cf840e740c3af5482f5fb22096afb9db1dc1c", size = 134488, upload-time = "2025-09-02T03:48:43.108Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ca/90/ceb58b9a9f3aca0af1c294d71115ee9d44d6d82e0c9dc57d6743574d6358/lance_namespace_urllib3_client-0.0.14-py3-none-any.whl", hash = 
"sha256:40277cfcf7c9084419c2784e7924b3e316f6fe5b8057f4dc62a49f3b40c2d80c", size = 229639, upload-time = "2025-09-02T03:48:41.975Z" }, -] - -[[package]] -name = "lancedb" -version = "0.25.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "deprecation" }, - { name = "lance-namespace" }, - { name = "numpy" }, - { name = "overrides" }, - { name = "packaging" }, - { name = "pyarrow" }, - { name = "pydantic" }, - { name = "tqdm" }, -] -wheels = [ - { url = "https://files.pythonhosted.org/packages/a2/e7/10953deea89b06ae5bc568169d5ae888ff6df314decb92b9b3e453f53f0b/lancedb-0.25.0-cp39-abi3-macosx_10_15_x86_64.whl", hash = "sha256:ae2e80b7b3be3fa4d92fc8d500f47549dd1f8d28ca5092f1c898b92d0cfd4393", size = 34171227, upload-time = "2025-09-04T11:05:31.327Z" }, - { url = "https://files.pythonhosted.org/packages/55/7f/2874a3709f1b8c487e707e171c9004a9240af3af0fd7a247b9187bb6e0f7/lancedb-0.25.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:a9d67ea9edffa596c6f190151fdd535da8e355a4fd1979c1dc19d540a5665916", size = 31552856, upload-time = "2025-09-04T09:46:50.788Z" }, - { url = "https://files.pythonhosted.org/packages/e3/e9/faab70ad918576ed3bb7cb936474137ac265ac3026d3e16e30cd4d3daac2/lancedb-0.25.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8fe20079ed86b1ab75c65dcfc920a9646c835e9c40ef825cadd148c11b0001e", size = 32487962, upload-time = "2025-09-04T08:51:35.358Z" }, - { url = "https://files.pythonhosted.org/packages/ce/40/5471bc8115f287040b5afdf9d7a20c4685ec16cddb4a7da79e7c1f63914e/lancedb-0.25.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b37bc402d85c83e454d9f2e79480b31acc5904bb159a4fc715032c7560494157", size = 35726794, upload-time = "2025-09-04T08:57:30.554Z" }, - { url = "https://files.pythonhosted.org/packages/47/5e/aa3d9d2c7a834a9aa539b2b1c731ab860f7e32e2c87b9086ad233ecb13cd/lancedb-0.25.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = 
"sha256:f9bbc20bd1e64be359ca11c90428c00b0062d26b0291bddf32ab5471a3525c76", size = 32492508, upload-time = "2025-09-04T08:53:54.661Z" }, - { url = "https://files.pythonhosted.org/packages/fa/37/75f4e3ed7fa00a2cd5d321e8bf13441cdb61a83fbbcd0fa0f1a7241affe1/lancedb-0.25.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1306be9c08e208a5bcb5188275f47f962c2eda96369fad5949a3ddaf592afc6d", size = 35776383, upload-time = "2025-09-04T08:57:18.737Z" }, - { url = "https://files.pythonhosted.org/packages/b5/af/eb217ea1daab5c28ce4c764d2f672f4e3a5bcd3d4faf7921a8ee28c6cb5b/lancedb-0.25.0-cp39-abi3-win_amd64.whl", hash = "sha256:f66283e5d63c99c2bfbd4eaa134d9a5c5b0145eb26a972648214f8ba87777e24", size = 37826272, upload-time = "2025-09-04T09:15:23.729Z" }, -] - -[[package]] -name = "langfuse" -version = "2.60.10" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, - { name = "backoff" }, - { name = "httpx" }, - { name = "idna" }, - { name = "packaging" }, - { name = "pydantic" }, - { name = "requests" }, - { name = "wrapt" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/eb/45/77fdf53c9e9f49bb78f72eba3f992f2f3d8343e05976aabfe1fca276a640/langfuse-2.60.10.tar.gz", hash = "sha256:a26d0d927a28ee01b2d12bb5b862590b643cc4e60a28de6e2b0c2cfff5dbfc6a", size = 152648, upload-time = "2025-09-16T15:08:12.426Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/76/69/08584fbd69e14398d3932a77d0c8d7e20389da3e6470210d6719afba2801/langfuse-2.60.10-py3-none-any.whl", hash = "sha256:815c6369194aa5b2a24f88eb9952f7c3fc863272c41e90642a71f3bc76f4a11f", size = 275568, upload-time = "2025-09-16T15:08:10.166Z" }, -] - -[[package]] -name = "lazy-object-proxy" -version = "1.12.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/08/a2/69df9c6ba6d316cfd81fe2381e464db3e6de5db45f8c43c6a23504abf8cb/lazy_object_proxy-1.12.0.tar.gz", hash = 
"sha256:1f5a462d92fd0cfb82f1fab28b51bfb209fabbe6aabf7f0d51472c0c124c0c61", size = 43681, upload-time = "2025-08-22T13:50:06.783Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/01/b3/4684b1e128a87821e485f5a901b179790e6b5bc02f89b7ee19c23be36ef3/lazy_object_proxy-1.12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1cf69cd1a6c7fe2dbcc3edaa017cf010f4192e53796538cc7d5e1fedbfa4bcff", size = 26656, upload-time = "2025-08-22T13:42:30.605Z" }, - { url = "https://files.pythonhosted.org/packages/3a/03/1bdc21d9a6df9ff72d70b2ff17d8609321bea4b0d3cffd2cea92fb2ef738/lazy_object_proxy-1.12.0-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:efff4375a8c52f55a145dc8487a2108c2140f0bec4151ab4e1843e52eb9987ad", size = 68832, upload-time = "2025-08-22T13:42:31.675Z" }, - { url = "https://files.pythonhosted.org/packages/3d/4b/5788e5e8bd01d19af71e50077ab020bc5cce67e935066cd65e1215a09ff9/lazy_object_proxy-1.12.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1192e8c2f1031a6ff453ee40213afa01ba765b3dc861302cd91dbdb2e2660b00", size = 69148, upload-time = "2025-08-22T13:42:32.876Z" }, - { url = "https://files.pythonhosted.org/packages/79/0e/090bf070f7a0de44c61659cb7f74c2fe02309a77ca8c4b43adfe0b695f66/lazy_object_proxy-1.12.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:3605b632e82a1cbc32a1e5034278a64db555b3496e0795723ee697006b980508", size = 67800, upload-time = "2025-08-22T13:42:34.054Z" }, - { url = "https://files.pythonhosted.org/packages/cf/d2/b320325adbb2d119156f7c506a5fbfa37fcab15c26d13cf789a90a6de04e/lazy_object_proxy-1.12.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a61095f5d9d1a743e1e20ec6d6db6c2ca511961777257ebd9b288951b23b44fa", size = 68085, upload-time = "2025-08-22T13:42:35.197Z" }, - { url = 
"https://files.pythonhosted.org/packages/6a/48/4b718c937004bf71cd82af3713874656bcb8d0cc78600bf33bb9619adc6c/lazy_object_proxy-1.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:997b1d6e10ecc6fb6fe0f2c959791ae59599f41da61d652f6c903d1ee58b7370", size = 26535, upload-time = "2025-08-22T13:42:36.521Z" }, - { url = "https://files.pythonhosted.org/packages/0d/1b/b5f5bd6bda26f1e15cd3232b223892e4498e34ec70a7f4f11c401ac969f1/lazy_object_proxy-1.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8ee0d6027b760a11cc18281e702c0309dd92da458a74b4c15025d7fc490deede", size = 26746, upload-time = "2025-08-22T13:42:37.572Z" }, - { url = "https://files.pythonhosted.org/packages/55/64/314889b618075c2bfc19293ffa9153ce880ac6153aacfd0a52fcabf21a66/lazy_object_proxy-1.12.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4ab2c584e3cc8be0dfca422e05ad30a9abe3555ce63e9ab7a559f62f8dbc6ff9", size = 71457, upload-time = "2025-08-22T13:42:38.743Z" }, - { url = "https://files.pythonhosted.org/packages/11/53/857fc2827fc1e13fbdfc0ba2629a7d2579645a06192d5461809540b78913/lazy_object_proxy-1.12.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:14e348185adbd03ec17d051e169ec45686dcd840a3779c9d4c10aabe2ca6e1c0", size = 71036, upload-time = "2025-08-22T13:42:40.184Z" }, - { url = "https://files.pythonhosted.org/packages/2b/24/e581ffed864cd33c1b445b5763d617448ebb880f48675fc9de0471a95cbc/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c4fcbe74fb85df8ba7825fa05eddca764138da752904b378f0ae5ab33a36c308", size = 69329, upload-time = "2025-08-22T13:42:41.311Z" }, - { url = "https://files.pythonhosted.org/packages/78/be/15f8f5a0b0b2e668e756a152257d26370132c97f2f1943329b08f057eff0/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:563d2ec8e4d4b68ee7848c5ab4d6057a6d703cb7963b342968bb8758dda33a23", size = 70690, upload-time = "2025-08-22T13:42:42.51Z" }, - { 
url = "https://files.pythonhosted.org/packages/5d/aa/f02be9bbfb270e13ee608c2b28b8771f20a5f64356c6d9317b20043c6129/lazy_object_proxy-1.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:53c7fd99eb156bbb82cbc5d5188891d8fdd805ba6c1e3b92b90092da2a837073", size = 26563, upload-time = "2025-08-22T13:42:43.685Z" }, - { url = "https://files.pythonhosted.org/packages/f4/26/b74c791008841f8ad896c7f293415136c66cc27e7c7577de4ee68040c110/lazy_object_proxy-1.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:86fd61cb2ba249b9f436d789d1356deae69ad3231dc3c0f17293ac535162672e", size = 26745, upload-time = "2025-08-22T13:42:44.982Z" }, - { url = "https://files.pythonhosted.org/packages/9b/52/641870d309e5d1fb1ea7d462a818ca727e43bfa431d8c34b173eb090348c/lazy_object_proxy-1.12.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:81d1852fb30fab81696f93db1b1e55a5d1ff7940838191062f5f56987d5fcc3e", size = 71537, upload-time = "2025-08-22T13:42:46.141Z" }, - { url = "https://files.pythonhosted.org/packages/47/b6/919118e99d51c5e76e8bf5a27df406884921c0acf2c7b8a3b38d847ab3e9/lazy_object_proxy-1.12.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:be9045646d83f6c2664c1330904b245ae2371b5c57a3195e4028aedc9f999655", size = 71141, upload-time = "2025-08-22T13:42:47.375Z" }, - { url = "https://files.pythonhosted.org/packages/e5/47/1d20e626567b41de085cf4d4fb3661a56c159feaa73c825917b3b4d4f806/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:67f07ab742f1adfb3966c40f630baaa7902be4222a17941f3d85fd1dae5565ff", size = 69449, upload-time = "2025-08-22T13:42:48.49Z" }, - { url = "https://files.pythonhosted.org/packages/58/8d/25c20ff1a1a8426d9af2d0b6f29f6388005fc8cd10d6ee71f48bff86fdd0/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:75ba769017b944fcacbf6a80c18b2761a1795b03f8899acdad1f1c39db4409be", size = 70744, upload-time = "2025-08-22T13:42:49.608Z" 
}, - { url = "https://files.pythonhosted.org/packages/c0/67/8ec9abe15c4f8a4bcc6e65160a2c667240d025cbb6591b879bea55625263/lazy_object_proxy-1.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:7b22c2bbfb155706b928ac4d74c1a63ac8552a55ba7fff4445155523ea4067e1", size = 26568, upload-time = "2025-08-22T13:42:57.719Z" }, - { url = "https://files.pythonhosted.org/packages/23/12/cd2235463f3469fd6c62d41d92b7f120e8134f76e52421413a0ad16d493e/lazy_object_proxy-1.12.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4a79b909aa16bde8ae606f06e6bbc9d3219d2e57fb3e0076e17879072b742c65", size = 27391, upload-time = "2025-08-22T13:42:50.62Z" }, - { url = "https://files.pythonhosted.org/packages/60/9e/f1c53e39bbebad2e8609c67d0830cc275f694d0ea23d78e8f6db526c12d3/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:338ab2f132276203e404951205fe80c3fd59429b3a724e7b662b2eb539bb1be9", size = 80552, upload-time = "2025-08-22T13:42:51.731Z" }, - { url = "https://files.pythonhosted.org/packages/4c/b6/6c513693448dcb317d9d8c91d91f47addc09553613379e504435b4cc8b3e/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8c40b3c9faee2e32bfce0df4ae63f4e73529766893258eca78548bac801c8f66", size = 82857, upload-time = "2025-08-22T13:42:53.225Z" }, - { url = "https://files.pythonhosted.org/packages/12/1c/d9c4aaa4c75da11eb7c22c43d7c90a53b4fca0e27784a5ab207768debea7/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:717484c309df78cedf48396e420fa57fc8a2b1f06ea889df7248fdd156e58847", size = 80833, upload-time = "2025-08-22T13:42:54.391Z" }, - { url = "https://files.pythonhosted.org/packages/0b/ae/29117275aac7d7d78ae4f5a4787f36ff33262499d486ac0bf3e0b97889f6/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a6b7ea5ea1ffe15059eb44bcbcb258f97bcb40e139b88152c40d07b1a1dfc9ac", size = 79516, upload-time = 
"2025-08-22T13:42:55.812Z" }, - { url = "https://files.pythonhosted.org/packages/19/40/b4e48b2c38c69392ae702ae7afa7b6551e0ca5d38263198b7c79de8b3bdf/lazy_object_proxy-1.12.0-cp313-cp313t-win_amd64.whl", hash = "sha256:08c465fb5cd23527512f9bd7b4c7ba6cec33e28aad36fbbe46bf7b858f9f3f7f", size = 27656, upload-time = "2025-08-22T13:42:56.793Z" }, - { url = "https://files.pythonhosted.org/packages/ef/3a/277857b51ae419a1574557c0b12e0d06bf327b758ba94cafc664cb1e2f66/lazy_object_proxy-1.12.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c9defba70ab943f1df98a656247966d7729da2fe9c2d5d85346464bf320820a3", size = 26582, upload-time = "2025-08-22T13:49:49.366Z" }, - { url = "https://files.pythonhosted.org/packages/1a/b6/c5e0fa43535bb9c87880e0ba037cdb1c50e01850b0831e80eb4f4762f270/lazy_object_proxy-1.12.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6763941dbf97eea6b90f5b06eb4da9418cc088fce0e3883f5816090f9afcde4a", size = 71059, upload-time = "2025-08-22T13:49:50.488Z" }, - { url = "https://files.pythonhosted.org/packages/06/8a/7dcad19c685963c652624702f1a968ff10220b16bfcc442257038216bf55/lazy_object_proxy-1.12.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fdc70d81235fc586b9e3d1aeef7d1553259b62ecaae9db2167a5d2550dcc391a", size = 71034, upload-time = "2025-08-22T13:49:54.224Z" }, - { url = "https://files.pythonhosted.org/packages/12/ac/34cbfb433a10e28c7fd830f91c5a348462ba748413cbb950c7f259e67aa7/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:0a83c6f7a6b2bfc11ef3ed67f8cbe99f8ff500b05655d8e7df9aab993a6abc95", size = 69529, upload-time = "2025-08-22T13:49:55.29Z" }, - { url = "https://files.pythonhosted.org/packages/6f/6a/11ad7e349307c3ca4c0175db7a77d60ce42a41c60bcb11800aabd6a8acb8/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:256262384ebd2a77b023ad02fbcc9326282bcfd16484d5531154b02bc304f4c5", size = 70391, 
upload-time = "2025-08-22T13:49:56.35Z" }, - { url = "https://files.pythonhosted.org/packages/59/97/9b410ed8fbc6e79c1ee8b13f8777a80137d4bc189caf2c6202358e66192c/lazy_object_proxy-1.12.0-cp314-cp314-win_amd64.whl", hash = "sha256:7601ec171c7e8584f8ff3f4e440aa2eebf93e854f04639263875b8c2971f819f", size = 26988, upload-time = "2025-08-22T13:49:57.302Z" }, - { url = "https://files.pythonhosted.org/packages/41/a0/b91504515c1f9a299fc157967ffbd2f0321bce0516a3d5b89f6f4cad0355/lazy_object_proxy-1.12.0-pp39.pp310.pp311.graalpy311-none-any.whl", hash = "sha256:c3b2e0af1f7f77c4263759c4824316ce458fabe0fceadcd24ef8ca08b2d1e402", size = 15072, upload-time = "2025-08-22T13:50:05.498Z" }, -] - -[[package]] -name = "limits" -version = "4.8.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "deprecated" }, - { name = "packaging" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/71/c6/18c4676257f78add093babffbe4d101ff943e9b86e4f708ca5b8fad03a9e/limits-4.8.0.tar.gz", hash = "sha256:74a9691f8a2c82c37480ee9305de3490f6cab3df5b8c61dbde670550f2b34510", size = 95679, upload-time = "2025-04-23T21:00:28.166Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6d/c9/556846b9d112a3387397850d5560f5ec63464508c6aa068257f0516159d0/limits-4.8.0-py3-none-any.whl", hash = "sha256:de43d24969a0050b859dd29bbd61bd807a5de3ed9255f666aec1ea3dd3fc407e", size = 62028, upload-time = "2025-04-23T21:00:26.017Z" }, -] - -[[package]] -name = "litellm" -version = "1.77.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "aiohttp" }, - { name = "click" }, - { name = "fastuuid" }, - { name = "httpx" }, - { name = "importlib-metadata" }, - { name = "jinja2" }, - { name = "jsonschema" }, - { name = "openai" }, - { name = "pydantic" }, - { name = "python-dotenv" }, - { name = "tiktoken" }, - { name = "tokenizers" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/8c/65/71fe4851709fa4a612e41b80001a9ad803fea979d21b90970093fd65eded/litellm-1.77.1.tar.gz", hash = "sha256:76bab5203115efb9588244e5bafbfc07a800a239be75d8dc6b1b9d17394c6418", size = 10275745, upload-time = "2025-09-13T21:05:21.377Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/bb/dc/ff4f119cd4d783742c9648a03e0ba5c2b52fc385b2ae9f0d32acf3a78241/litellm-1.77.1-py3-none-any.whl", hash = "sha256:407761dc3c35fbcd41462d3fe65dd3ed70aac705f37cde318006c18940f695a0", size = 9067070, upload-time = "2025-09-13T21:05:18.078Z" }, -] - -[[package]] -name = "makefun" -version = "1.16.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/7b/cf/6780ab8bc3b84a1cce3e4400aed3d64b6db7d5e227a2f75b6ded5674701a/makefun-1.16.0.tar.gz", hash = "sha256:e14601831570bff1f6d7e68828bcd30d2f5856f24bad5de0ccb22921ceebc947", size = 73565, upload-time = "2025-05-09T15:00:42.313Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b7/c0/4bc973defd1270b89ccaae04cef0d5fa3ea85b59b108ad2c08aeea9afb76/makefun-1.16.0-py2.py3-none-any.whl", hash = "sha256:43baa4c3e7ae2b17de9ceac20b669e9a67ceeadff31581007cca20a07bbe42c4", size = 22923, upload-time = "2025-05-09T15:00:41.042Z" }, -] - -[[package]] -name = "mako" -version = "1.3.10" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "markupsafe" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/9e/38/bd5b78a920a64d708fe6bc8e0a2c075e1389d53bef8413725c63ba041535/mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28", size = 392474, upload-time = "2025-04-10T12:44:31.16Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/87/fb/99f81ac72ae23375f22b7afdb7642aba97c00a713c217124420147681a2f/mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59", size = 78509, upload-time = 
"2025-04-10T12:50:53.297Z" }, -] - -[[package]] -name = "markdown-it-py" -version = "4.0.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "mdurl" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5b/f5/4ec618ed16cc4f8fb3b701563655a69816155e79e24a17b651541804721d/markdown_it_py-4.0.0.tar.gz", hash = "sha256:cb0a2b4aa34f932c007117b194e945bd74e0ec24133ceb5bac59009cda1cb9f3", size = 73070, upload-time = "2025-08-11T12:57:52.854Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/94/54/e7d793b573f298e1c9013b8c4dade17d481164aa517d1d7148619c2cedbf/markdown_it_py-4.0.0-py3-none-any.whl", hash = "sha256:87327c59b172c5011896038353a81343b6754500a08cd7a4973bb48c6d578147", size = 87321, upload-time = "2025-08-11T12:57:51.923Z" }, -] - -[[package]] -name = "markupsafe" -version = "3.0.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537, upload-time = "2024-10-18T15:21:54.129Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353, upload-time = "2024-10-18T15:21:02.187Z" }, - { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392, upload-time = "2024-10-18T15:21:02.941Z" }, - { url = 
"https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984, upload-time = "2024-10-18T15:21:03.953Z" }, - { url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120, upload-time = "2024-10-18T15:21:06.495Z" }, - { url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032, upload-time = "2024-10-18T15:21:07.295Z" }, - { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057, upload-time = "2024-10-18T15:21:08.073Z" }, - { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359, upload-time = "2024-10-18T15:21:09.318Z" }, - { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306, upload-time = "2024-10-18T15:21:10.185Z" }, - { url = 
"https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094, upload-time = "2024-10-18T15:21:11.005Z" }, - { url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521, upload-time = "2024-10-18T15:21:12.911Z" }, - { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274, upload-time = "2024-10-18T15:21:13.777Z" }, - { url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348, upload-time = "2024-10-18T15:21:14.822Z" }, - { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149, upload-time = "2024-10-18T15:21:15.642Z" }, - { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118, upload-time = "2024-10-18T15:21:17.133Z" }, - { url = 
"https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993, upload-time = "2024-10-18T15:21:18.064Z" }, - { url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178, upload-time = "2024-10-18T15:21:18.859Z" }, - { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319, upload-time = "2024-10-18T15:21:19.671Z" }, - { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352, upload-time = "2024-10-18T15:21:20.971Z" }, - { url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097, upload-time = "2024-10-18T15:21:22.646Z" }, - { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601, upload-time = "2024-10-18T15:21:23.499Z" }, - { url = 
"https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274, upload-time = "2024-10-18T15:21:24.577Z" }, - { url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352, upload-time = "2024-10-18T15:21:25.382Z" }, - { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122, upload-time = "2024-10-18T15:21:26.199Z" }, - { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085, upload-time = "2024-10-18T15:21:27.029Z" }, - { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978, upload-time = "2024-10-18T15:21:27.846Z" }, - { url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208, upload-time = "2024-10-18T15:21:28.744Z" }, - { url = 
"https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357, upload-time = "2024-10-18T15:21:29.545Z" }, - { url = "https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344, upload-time = "2024-10-18T15:21:30.366Z" }, - { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101, upload-time = "2024-10-18T15:21:31.207Z" }, - { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603, upload-time = "2024-10-18T15:21:32.032Z" }, - { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510, upload-time = "2024-10-18T15:21:33.625Z" }, - { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486, upload-time = "2024-10-18T15:21:34.611Z" }, - { url = 
"https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480, upload-time = "2024-10-18T15:21:35.398Z" }, - { url = "https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914, upload-time = "2024-10-18T15:21:36.231Z" }, - { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796, upload-time = "2024-10-18T15:21:37.073Z" }, - { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473, upload-time = "2024-10-18T15:21:37.932Z" }, - { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114, upload-time = "2024-10-18T15:21:39.799Z" }, - { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098, upload-time = "2024-10-18T15:21:40.813Z" }, - { url = 
"https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208, upload-time = "2024-10-18T15:21:41.814Z" }, - { url = "https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739, upload-time = "2024-10-18T15:21:42.784Z" }, -] - -[[package]] -name = "matplotlib" -version = "3.10.6" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "contourpy" }, - { name = "cycler" }, - { name = "fonttools" }, - { name = "kiwisolver" }, - { name = "numpy" }, - { name = "packaging" }, - { name = "pillow" }, - { name = "pyparsing" }, - { name = "python-dateutil" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/a0/59/c3e6453a9676ffba145309a73c462bb407f4400de7de3f2b41af70720a3c/matplotlib-3.10.6.tar.gz", hash = "sha256:ec01b645840dd1996df21ee37f208cd8ba57644779fa20464010638013d3203c", size = 34804264, upload-time = "2025-08-30T00:14:25.137Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/80/d6/5d3665aa44c49005aaacaa68ddea6fcb27345961cd538a98bb0177934ede/matplotlib-3.10.6-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:905b60d1cb0ee604ce65b297b61cf8be9f4e6cfecf95a3fe1c388b5266bc8f4f", size = 8257527, upload-time = "2025-08-30T00:12:45.31Z" }, - { url = "https://files.pythonhosted.org/packages/8c/af/30ddefe19ca67eebd70047dabf50f899eaff6f3c5e6a1a7edaecaf63f794/matplotlib-3.10.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7bac38d816637343e53d7185d0c66677ff30ffb131044a81898b5792c956ba76", size = 8119583, upload-time = "2025-08-30T00:12:47.236Z" }, - { url = 
"https://files.pythonhosted.org/packages/d3/29/4a8650a3dcae97fa4f375d46efcb25920d67b512186f8a6788b896062a81/matplotlib-3.10.6-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:942a8de2b5bfff1de31d95722f702e2966b8a7e31f4e68f7cd963c7cd8861cf6", size = 8692682, upload-time = "2025-08-30T00:12:48.781Z" }, - { url = "https://files.pythonhosted.org/packages/aa/d3/b793b9cb061cfd5d42ff0f69d1822f8d5dbc94e004618e48a97a8373179a/matplotlib-3.10.6-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a3276c85370bc0dfca051ec65c5817d1e0f8f5ce1b7787528ec8ed2d524bbc2f", size = 9521065, upload-time = "2025-08-30T00:12:50.602Z" }, - { url = "https://files.pythonhosted.org/packages/f7/c5/53de5629f223c1c66668d46ac2621961970d21916a4bc3862b174eb2a88f/matplotlib-3.10.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9df5851b219225731f564e4b9e7f2ac1e13c9e6481f941b5631a0f8e2d9387ce", size = 9576888, upload-time = "2025-08-30T00:12:52.92Z" }, - { url = "https://files.pythonhosted.org/packages/fc/8e/0a18d6d7d2d0a2e66585032a760d13662e5250c784d53ad50434e9560991/matplotlib-3.10.6-cp311-cp311-win_amd64.whl", hash = "sha256:abb5d9478625dd9c9eb51a06d39aae71eda749ae9b3138afb23eb38824026c7e", size = 8115158, upload-time = "2025-08-30T00:12:54.863Z" }, - { url = "https://files.pythonhosted.org/packages/07/b3/1a5107bb66c261e23b9338070702597a2d374e5aa7004b7adfc754fbed02/matplotlib-3.10.6-cp311-cp311-win_arm64.whl", hash = "sha256:886f989ccfae63659183173bb3fced7fd65e9eb793c3cc21c273add368536951", size = 7992444, upload-time = "2025-08-30T00:12:57.067Z" }, - { url = "https://files.pythonhosted.org/packages/ea/1a/7042f7430055d567cc3257ac409fcf608599ab27459457f13772c2d9778b/matplotlib-3.10.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:31ca662df6a80bd426f871105fdd69db7543e28e73a9f2afe80de7e531eb2347", size = 8272404, upload-time = "2025-08-30T00:12:59.112Z" }, - { url = 
"https://files.pythonhosted.org/packages/a9/5d/1d5f33f5b43f4f9e69e6a5fe1fb9090936ae7bc8e2ff6158e7a76542633b/matplotlib-3.10.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1678bb61d897bb4ac4757b5ecfb02bfb3fddf7f808000fb81e09c510712fda75", size = 8128262, upload-time = "2025-08-30T00:13:01.141Z" }, - { url = "https://files.pythonhosted.org/packages/67/c3/135fdbbbf84e0979712df58e5e22b4f257b3f5e52a3c4aacf1b8abec0d09/matplotlib-3.10.6-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:56cd2d20842f58c03d2d6e6c1f1cf5548ad6f66b91e1e48f814e4fb5abd1cb95", size = 8697008, upload-time = "2025-08-30T00:13:03.24Z" }, - { url = "https://files.pythonhosted.org/packages/9c/be/c443ea428fb2488a3ea7608714b1bd85a82738c45da21b447dc49e2f8e5d/matplotlib-3.10.6-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:662df55604a2f9a45435566d6e2660e41efe83cd94f4288dfbf1e6d1eae4b0bb", size = 9530166, upload-time = "2025-08-30T00:13:05.951Z" }, - { url = "https://files.pythonhosted.org/packages/a9/35/48441422b044d74034aea2a3e0d1a49023f12150ebc58f16600132b9bbaf/matplotlib-3.10.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:08f141d55148cd1fc870c3387d70ca4df16dee10e909b3b038782bd4bda6ea07", size = 9593105, upload-time = "2025-08-30T00:13:08.356Z" }, - { url = "https://files.pythonhosted.org/packages/45/c3/994ef20eb4154ab84cc08d033834555319e4af970165e6c8894050af0b3c/matplotlib-3.10.6-cp312-cp312-win_amd64.whl", hash = "sha256:590f5925c2d650b5c9d813c5b3b5fc53f2929c3f8ef463e4ecfa7e052044fb2b", size = 8122784, upload-time = "2025-08-30T00:13:10.367Z" }, - { url = "https://files.pythonhosted.org/packages/57/b8/5c85d9ae0e40f04e71bedb053aada5d6bab1f9b5399a0937afb5d6b02d98/matplotlib-3.10.6-cp312-cp312-win_arm64.whl", hash = "sha256:f44c8d264a71609c79a78d50349e724f5d5fc3684ead7c2a473665ee63d868aa", size = 7992823, upload-time = "2025-08-30T00:13:12.24Z" }, - { url = 
"https://files.pythonhosted.org/packages/a0/db/18380e788bb837e724358287b08e223b32bc8dccb3b0c12fa8ca20bc7f3b/matplotlib-3.10.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:819e409653c1106c8deaf62e6de6b8611449c2cd9939acb0d7d4e57a3d95cc7a", size = 8273231, upload-time = "2025-08-30T00:13:13.881Z" }, - { url = "https://files.pythonhosted.org/packages/d3/0f/38dd49445b297e0d4f12a322c30779df0d43cb5873c7847df8a82e82ec67/matplotlib-3.10.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:59c8ac8382fefb9cb71308dde16a7c487432f5255d8f1fd32473523abecfecdf", size = 8128730, upload-time = "2025-08-30T00:13:15.556Z" }, - { url = "https://files.pythonhosted.org/packages/e5/b8/9eea6630198cb303d131d95d285a024b3b8645b1763a2916fddb44ca8760/matplotlib-3.10.6-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:84e82d9e0fd70c70bc55739defbd8055c54300750cbacf4740c9673a24d6933a", size = 8698539, upload-time = "2025-08-30T00:13:17.297Z" }, - { url = "https://files.pythonhosted.org/packages/71/34/44c7b1f075e1ea398f88aeabcc2907c01b9cc99e2afd560c1d49845a1227/matplotlib-3.10.6-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:25f7a3eb42d6c1c56e89eacd495661fc815ffc08d9da750bca766771c0fd9110", size = 9529702, upload-time = "2025-08-30T00:13:19.248Z" }, - { url = "https://files.pythonhosted.org/packages/b5/7f/e5c2dc9950c7facaf8b461858d1b92c09dd0cf174fe14e21953b3dda06f7/matplotlib-3.10.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f9c862d91ec0b7842920a4cfdaaec29662195301914ea54c33e01f1a28d014b2", size = 9593742, upload-time = "2025-08-30T00:13:21.181Z" }, - { url = "https://files.pythonhosted.org/packages/ff/1d/70c28528794f6410ee2856cd729fa1f1756498b8d3126443b0a94e1a8695/matplotlib-3.10.6-cp313-cp313-win_amd64.whl", hash = "sha256:1b53bd6337eba483e2e7d29c5ab10eee644bc3a2491ec67cc55f7b44583ffb18", size = 8122753, upload-time = "2025-08-30T00:13:23.44Z" }, - { url = 
"https://files.pythonhosted.org/packages/e8/74/0e1670501fc7d02d981564caf7c4df42974464625935424ca9654040077c/matplotlib-3.10.6-cp313-cp313-win_arm64.whl", hash = "sha256:cbd5eb50b7058b2892ce45c2f4e92557f395c9991f5c886d1bb74a1582e70fd6", size = 7992973, upload-time = "2025-08-30T00:13:26.632Z" }, - { url = "https://files.pythonhosted.org/packages/b1/4e/60780e631d73b6b02bd7239f89c451a72970e5e7ec34f621eda55cd9a445/matplotlib-3.10.6-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:acc86dd6e0e695c095001a7fccff158c49e45e0758fdf5dcdbb0103318b59c9f", size = 8316869, upload-time = "2025-08-30T00:13:28.262Z" }, - { url = "https://files.pythonhosted.org/packages/f8/15/baa662374a579413210fc2115d40c503b7360a08e9cc254aa0d97d34b0c1/matplotlib-3.10.6-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e228cd2ffb8f88b7d0b29e37f68ca9aaf83e33821f24a5ccc4f082dd8396bc27", size = 8178240, upload-time = "2025-08-30T00:13:30.007Z" }, - { url = "https://files.pythonhosted.org/packages/c6/3f/3c38e78d2aafdb8829fcd0857d25aaf9e7dd2dfcf7ec742765b585774931/matplotlib-3.10.6-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:658bc91894adeab669cf4bb4a186d049948262987e80f0857216387d7435d833", size = 8711719, upload-time = "2025-08-30T00:13:31.72Z" }, - { url = "https://files.pythonhosted.org/packages/96/4b/2ec2bbf8cefaa53207cc56118d1fa8a0f9b80642713ea9390235d331ede4/matplotlib-3.10.6-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8913b7474f6dd83ac444c9459c91f7f0f2859e839f41d642691b104e0af056aa", size = 9541422, upload-time = "2025-08-30T00:13:33.611Z" }, - { url = "https://files.pythonhosted.org/packages/83/7d/40255e89b3ef11c7871020563b2dd85f6cb1b4eff17c0f62b6eb14c8fa80/matplotlib-3.10.6-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:091cea22e059b89f6d7d1a18e2c33a7376c26eee60e401d92a4d6726c4e12706", size = 9594068, upload-time = "2025-08-30T00:13:35.833Z" }, - { url = 
"https://files.pythonhosted.org/packages/f0/a9/0213748d69dc842537a113493e1c27daf9f96bd7cc316f933dc8ec4de985/matplotlib-3.10.6-cp313-cp313t-win_amd64.whl", hash = "sha256:491e25e02a23d7207629d942c666924a6b61e007a48177fdd231a0097b7f507e", size = 8200100, upload-time = "2025-08-30T00:13:37.668Z" }, - { url = "https://files.pythonhosted.org/packages/be/15/79f9988066ce40b8a6f1759a934ea0cde8dc4adc2262255ee1bc98de6ad0/matplotlib-3.10.6-cp313-cp313t-win_arm64.whl", hash = "sha256:3d80d60d4e54cda462e2cd9a086d85cd9f20943ead92f575ce86885a43a565d5", size = 8042142, upload-time = "2025-08-30T00:13:39.426Z" }, - { url = "https://files.pythonhosted.org/packages/7c/58/e7b6d292beae6fb4283ca6fb7fa47d7c944a68062d6238c07b497dd35493/matplotlib-3.10.6-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:70aaf890ce1d0efd482df969b28a5b30ea0b891224bb315810a3940f67182899", size = 8273802, upload-time = "2025-08-30T00:13:41.006Z" }, - { url = "https://files.pythonhosted.org/packages/9f/f6/7882d05aba16a8cdd594fb9a03a9d3cca751dbb6816adf7b102945522ee9/matplotlib-3.10.6-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1565aae810ab79cb72e402b22facfa6501365e73ebab70a0fdfb98488d2c3c0c", size = 8131365, upload-time = "2025-08-30T00:13:42.664Z" }, - { url = "https://files.pythonhosted.org/packages/94/bf/ff32f6ed76e78514e98775a53715eca4804b12bdcf35902cdd1cf759d324/matplotlib-3.10.6-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f3b23315a01981689aa4e1a179dbf6ef9fbd17143c3eea77548c2ecfb0499438", size = 9533961, upload-time = "2025-08-30T00:13:44.372Z" }, - { url = "https://files.pythonhosted.org/packages/fe/c3/6bf88c2fc2da7708a2ff8d2eeb5d68943130f50e636d5d3dcf9d4252e971/matplotlib-3.10.6-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:30fdd37edf41a4e6785f9b37969de57aea770696cb637d9946eb37470c94a453", size = 9804262, upload-time = "2025-08-30T00:13:46.614Z" }, - { url = 
"https://files.pythonhosted.org/packages/0f/7a/e05e6d9446d2d577b459427ad060cd2de5742d0e435db3191fea4fcc7e8b/matplotlib-3.10.6-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:bc31e693da1c08012c764b053e702c1855378e04102238e6a5ee6a7117c53a47", size = 9595508, upload-time = "2025-08-30T00:13:48.731Z" }, - { url = "https://files.pythonhosted.org/packages/39/fb/af09c463ced80b801629fd73b96f726c9f6124c3603aa2e480a061d6705b/matplotlib-3.10.6-cp314-cp314-win_amd64.whl", hash = "sha256:05be9bdaa8b242bc6ff96330d18c52f1fc59c6fb3a4dd411d953d67e7e1baf98", size = 8252742, upload-time = "2025-08-30T00:13:50.539Z" }, - { url = "https://files.pythonhosted.org/packages/b1/f9/b682f6db9396d9ab8f050c0a3bfbb5f14fb0f6518f08507c04cc02f8f229/matplotlib-3.10.6-cp314-cp314-win_arm64.whl", hash = "sha256:f56a0d1ab05d34c628592435781d185cd99630bdfd76822cd686fb5a0aecd43a", size = 8124237, upload-time = "2025-08-30T00:13:54.3Z" }, - { url = "https://files.pythonhosted.org/packages/b5/d2/b69b4a0923a3c05ab90527c60fdec899ee21ca23ede7f0fb818e6620d6f2/matplotlib-3.10.6-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:94f0b4cacb23763b64b5dace50d5b7bfe98710fed5f0cef5c08135a03399d98b", size = 8316956, upload-time = "2025-08-30T00:13:55.932Z" }, - { url = "https://files.pythonhosted.org/packages/28/e9/dc427b6f16457ffaeecb2fc4abf91e5adb8827861b869c7a7a6d1836fa73/matplotlib-3.10.6-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:cc332891306b9fb39462673d8225d1b824c89783fee82840a709f96714f17a5c", size = 8178260, upload-time = "2025-08-30T00:14:00.942Z" }, - { url = "https://files.pythonhosted.org/packages/c4/89/1fbd5ad611802c34d1c7ad04607e64a1350b7fb9c567c4ec2c19e066ed35/matplotlib-3.10.6-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee1d607b3fb1590deb04b69f02ea1d53ed0b0bf75b2b1a5745f269afcbd3cdd3", size = 9541422, upload-time = "2025-08-30T00:14:02.664Z" }, - { url = 
"https://files.pythonhosted.org/packages/b0/3b/65fec8716025b22c1d72d5a82ea079934c76a547696eaa55be6866bc89b1/matplotlib-3.10.6-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:376a624a218116461696b27b2bbf7a8945053e6d799f6502fc03226d077807bf", size = 9803678, upload-time = "2025-08-30T00:14:04.741Z" }, - { url = "https://files.pythonhosted.org/packages/c7/b0/40fb2b3a1ab9381bb39a952e8390357c8be3bdadcf6d5055d9c31e1b35ae/matplotlib-3.10.6-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:83847b47f6524c34b4f2d3ce726bb0541c48c8e7692729865c3df75bfa0f495a", size = 9594077, upload-time = "2025-08-30T00:14:07.012Z" }, - { url = "https://files.pythonhosted.org/packages/76/34/c4b71b69edf5b06e635eee1ed10bfc73cf8df058b66e63e30e6a55e231d5/matplotlib-3.10.6-cp314-cp314t-win_amd64.whl", hash = "sha256:c7e0518e0d223683532a07f4b512e2e0729b62674f1b3a1a69869f98e6b1c7e3", size = 8342822, upload-time = "2025-08-30T00:14:09.041Z" }, - { url = "https://files.pythonhosted.org/packages/e8/62/aeabeef1a842b6226a30d49dd13e8a7a1e81e9ec98212c0b5169f0a12d83/matplotlib-3.10.6-cp314-cp314t-win_arm64.whl", hash = "sha256:4dd83e029f5b4801eeb87c64efd80e732452781c16a9cf7415b7b63ec8f374d7", size = 8172588, upload-time = "2025-08-30T00:14:11.166Z" }, - { url = "https://files.pythonhosted.org/packages/12/bb/02c35a51484aae5f49bd29f091286e7af5f3f677a9736c58a92b3c78baeb/matplotlib-3.10.6-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:f2d684c3204fa62421bbf770ddfebc6b50130f9cad65531eeba19236d73bb488", size = 8252296, upload-time = "2025-08-30T00:14:19.49Z" }, - { url = "https://files.pythonhosted.org/packages/7d/85/41701e3092005aee9a2445f5ee3904d9dbd4a7df7a45905ffef29b7ef098/matplotlib-3.10.6-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:6f4a69196e663a41d12a728fab8751177215357906436804217d6d9cf0d4d6cf", size = 8116749, upload-time = "2025-08-30T00:14:21.344Z" }, - { url = 
"https://files.pythonhosted.org/packages/16/53/8d8fa0ea32a8c8239e04d022f6c059ee5e1b77517769feccd50f1df43d6d/matplotlib-3.10.6-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4d6ca6ef03dfd269f4ead566ec6f3fb9becf8dab146fb999022ed85ee9f6b3eb", size = 8693933, upload-time = "2025-08-30T00:14:22.942Z" }, -] - -[[package]] -name = "mcp" -version = "1.14.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, - { name = "httpx" }, - { name = "httpx-sse" }, - { name = "jsonschema" }, - { name = "pydantic" }, - { name = "pydantic-settings" }, - { name = "python-multipart" }, - { name = "pywin32", marker = "sys_platform == 'win32'" }, - { name = "sse-starlette" }, - { name = "starlette" }, - { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/95/fd/d6e941a52446198b73e5e4a953441f667f1469aeb06fb382d9f6729d6168/mcp-1.14.0.tar.gz", hash = "sha256:2e7d98b195e08b2abc1dc6191f6f3dc0059604ac13ee6a40f88676274787fac4", size = 454855, upload-time = "2025-09-11T17:40:48.667Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/04/7b/84b0dd4c2c5a499d2c5d63fb7a1224c25fc4c8b6c24623fa7a566471480d/mcp-1.14.0-py3-none-any.whl", hash = "sha256:b2d27feba27b4c53d41b58aa7f4d090ae0cb740cbc4e339af10f8cbe54c4e19d", size = 163805, upload-time = "2025-09-11T17:40:46.891Z" }, -] - -[[package]] -name = "mdurl" -version = "0.1.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = 
"sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" }, -] - -[[package]] -name = "more-itertools" -version = "10.8.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ea/5d/38b681d3fce7a266dd9ab73c66959406d565b3e85f21d5e66e1181d93721/more_itertools-10.8.0.tar.gz", hash = "sha256:f638ddf8a1a0d134181275fb5d58b086ead7c6a72429ad725c67503f13ba30bd", size = 137431, upload-time = "2025-09-02T15:23:11.018Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a4/8e/469e5a4a2f5855992e425f3cb33804cc07bf18d48f2db061aec61ce50270/more_itertools-10.8.0-py3-none-any.whl", hash = "sha256:52d4362373dcf7c52546bc4af9a86ee7c4579df9a8dc268be0a2f949d376cc9b", size = 69667, upload-time = "2025-09-02T15:23:09.635Z" }, -] - -[[package]] -name = "mpmath" -version = "1.3.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e0/47/dd32fa426cc72114383ac549964eecb20ecfd886d1e5ccf5340b55b02f57/mpmath-1.3.0.tar.gz", hash = "sha256:7a28eb2a9774d00c7bc92411c19a89209d5da7c4c9a9e227be8330a23a25b91f", size = 508106, upload-time = "2023-03-07T16:47:11.061Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/43/e3/7d92a15f894aa0c9c4b49b8ee9ac9850d6e63b03c9c32c0367a13ae62209/mpmath-1.3.0-py3-none-any.whl", hash = "sha256:a0b2b9fe80bbcd81a6647ff13108738cfb482d481d826cc0e02f5b35e5c88d2c", size = 536198, upload-time = "2023-03-07T16:47:09.197Z" }, -] - -[[package]] -name = "multidict" -version = "6.6.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/69/7f/0652e6ed47ab288e3756ea9c0df8b14950781184d4bd7883f4d87dd41245/multidict-6.6.4.tar.gz", hash = "sha256:d2d4e4787672911b48350df02ed3fa3fffdc2f2e8ca06dd6afdf34189b76a9dd", size = 101843, upload-time = "2025-08-11T12:08:48.217Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/6b/7f/90a7f01e2d005d6653c689039977f6856718c75c5579445effb7e60923d1/multidict-6.6.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:c7a0e9b561e6460484318a7612e725df1145d46b0ef57c6b9866441bf6e27e0c", size = 76472, upload-time = "2025-08-11T12:06:29.006Z" }, - { url = "https://files.pythonhosted.org/packages/54/a3/bed07bc9e2bb302ce752f1dabc69e884cd6a676da44fb0e501b246031fdd/multidict-6.6.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6bf2f10f70acc7a2446965ffbc726e5fc0b272c97a90b485857e5c70022213eb", size = 44634, upload-time = "2025-08-11T12:06:30.374Z" }, - { url = "https://files.pythonhosted.org/packages/a7/4b/ceeb4f8f33cf81277da464307afeaf164fb0297947642585884f5cad4f28/multidict-6.6.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:66247d72ed62d5dd29752ffc1d3b88f135c6a8de8b5f63b7c14e973ef5bda19e", size = 44282, upload-time = "2025-08-11T12:06:31.958Z" }, - { url = "https://files.pythonhosted.org/packages/03/35/436a5da8702b06866189b69f655ffdb8f70796252a8772a77815f1812679/multidict-6.6.4-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:105245cc6b76f51e408451a844a54e6823bbd5a490ebfe5bdfc79798511ceded", size = 229696, upload-time = "2025-08-11T12:06:33.087Z" }, - { url = "https://files.pythonhosted.org/packages/b6/0e/915160be8fecf1fca35f790c08fb74ca684d752fcba62c11daaf3d92c216/multidict-6.6.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:cbbc54e58b34c3bae389ef00046be0961f30fef7cb0dd9c7756aee376a4f7683", size = 246665, upload-time = "2025-08-11T12:06:34.448Z" }, - { url = "https://files.pythonhosted.org/packages/08/ee/2f464330acd83f77dcc346f0b1a0eaae10230291450887f96b204b8ac4d3/multidict-6.6.4-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:56c6b3652f945c9bc3ac6c8178cd93132b8d82dd581fcbc3a00676c51302bc1a", size = 225485, upload-time = 
"2025-08-11T12:06:35.672Z" }, - { url = "https://files.pythonhosted.org/packages/71/cc/9a117f828b4d7fbaec6adeed2204f211e9caf0a012692a1ee32169f846ae/multidict-6.6.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b95494daf857602eccf4c18ca33337dd2be705bccdb6dddbfc9d513e6addb9d9", size = 257318, upload-time = "2025-08-11T12:06:36.98Z" }, - { url = "https://files.pythonhosted.org/packages/25/77/62752d3dbd70e27fdd68e86626c1ae6bccfebe2bb1f84ae226363e112f5a/multidict-6.6.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e5b1413361cef15340ab9dc61523e653d25723e82d488ef7d60a12878227ed50", size = 254689, upload-time = "2025-08-11T12:06:38.233Z" }, - { url = "https://files.pythonhosted.org/packages/00/6e/fac58b1072a6fc59af5e7acb245e8754d3e1f97f4f808a6559951f72a0d4/multidict-6.6.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e167bf899c3d724f9662ef00b4f7fef87a19c22b2fead198a6f68b263618df52", size = 246709, upload-time = "2025-08-11T12:06:39.517Z" }, - { url = "https://files.pythonhosted.org/packages/01/ef/4698d6842ef5e797c6db7744b0081e36fb5de3d00002cc4c58071097fac3/multidict-6.6.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:aaea28ba20a9026dfa77f4b80369e51cb767c61e33a2d4043399c67bd95fb7c6", size = 243185, upload-time = "2025-08-11T12:06:40.796Z" }, - { url = "https://files.pythonhosted.org/packages/aa/c9/d82e95ae1d6e4ef396934e9b0e942dfc428775f9554acf04393cce66b157/multidict-6.6.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:8c91cdb30809a96d9ecf442ec9bc45e8cfaa0f7f8bdf534e082c2443a196727e", size = 237838, upload-time = "2025-08-11T12:06:42.595Z" }, - { url = "https://files.pythonhosted.org/packages/57/cf/f94af5c36baaa75d44fab9f02e2a6bcfa0cd90acb44d4976a80960759dbc/multidict-6.6.4-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:1a0ccbfe93ca114c5d65a2471d52d8829e56d467c97b0e341cf5ee45410033b3", size = 246368, 
upload-time = "2025-08-11T12:06:44.304Z" }, - { url = "https://files.pythonhosted.org/packages/4a/fe/29f23460c3d995f6a4b678cb2e9730e7277231b981f0b234702f0177818a/multidict-6.6.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:55624b3f321d84c403cb7d8e6e982f41ae233d85f85db54ba6286f7295dc8a9c", size = 253339, upload-time = "2025-08-11T12:06:45.597Z" }, - { url = "https://files.pythonhosted.org/packages/29/b6/fd59449204426187b82bf8a75f629310f68c6adc9559dc922d5abe34797b/multidict-6.6.4-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:4a1fb393a2c9d202cb766c76208bd7945bc194eba8ac920ce98c6e458f0b524b", size = 246933, upload-time = "2025-08-11T12:06:46.841Z" }, - { url = "https://files.pythonhosted.org/packages/19/52/d5d6b344f176a5ac3606f7a61fb44dc746e04550e1a13834dff722b8d7d6/multidict-6.6.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:43868297a5759a845fa3a483fb4392973a95fb1de891605a3728130c52b8f40f", size = 242225, upload-time = "2025-08-11T12:06:48.588Z" }, - { url = "https://files.pythonhosted.org/packages/ec/d3/5b2281ed89ff4d5318d82478a2a2450fcdfc3300da48ff15c1778280ad26/multidict-6.6.4-cp311-cp311-win32.whl", hash = "sha256:ed3b94c5e362a8a84d69642dbeac615452e8af9b8eb825b7bc9f31a53a1051e2", size = 41306, upload-time = "2025-08-11T12:06:49.95Z" }, - { url = "https://files.pythonhosted.org/packages/74/7d/36b045c23a1ab98507aefd44fd8b264ee1dd5e5010543c6fccf82141ccef/multidict-6.6.4-cp311-cp311-win_amd64.whl", hash = "sha256:d8c112f7a90d8ca5d20213aa41eac690bb50a76da153e3afb3886418e61cb22e", size = 46029, upload-time = "2025-08-11T12:06:51.082Z" }, - { url = "https://files.pythonhosted.org/packages/0f/5e/553d67d24432c5cd52b49047f2d248821843743ee6d29a704594f656d182/multidict-6.6.4-cp311-cp311-win_arm64.whl", hash = "sha256:3bb0eae408fa1996d87247ca0d6a57b7fc1dcf83e8a5c47ab82c558c250d4adf", size = 43017, upload-time = "2025-08-11T12:06:52.243Z" }, - { url = 
"https://files.pythonhosted.org/packages/05/f6/512ffd8fd8b37fb2680e5ac35d788f1d71bbaf37789d21a820bdc441e565/multidict-6.6.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0ffb87be160942d56d7b87b0fdf098e81ed565add09eaa1294268c7f3caac4c8", size = 76516, upload-time = "2025-08-11T12:06:53.393Z" }, - { url = "https://files.pythonhosted.org/packages/99/58/45c3e75deb8855c36bd66cc1658007589662ba584dbf423d01df478dd1c5/multidict-6.6.4-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d191de6cbab2aff5de6c5723101705fd044b3e4c7cfd587a1929b5028b9714b3", size = 45394, upload-time = "2025-08-11T12:06:54.555Z" }, - { url = "https://files.pythonhosted.org/packages/fd/ca/e8c4472a93a26e4507c0b8e1f0762c0d8a32de1328ef72fd704ef9cc5447/multidict-6.6.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:38a0956dd92d918ad5feff3db8fcb4a5eb7dba114da917e1a88475619781b57b", size = 43591, upload-time = "2025-08-11T12:06:55.672Z" }, - { url = "https://files.pythonhosted.org/packages/05/51/edf414f4df058574a7265034d04c935aa84a89e79ce90fcf4df211f47b16/multidict-6.6.4-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:6865f6d3b7900ae020b495d599fcf3765653bc927951c1abb959017f81ae8287", size = 237215, upload-time = "2025-08-11T12:06:57.213Z" }, - { url = "https://files.pythonhosted.org/packages/c8/45/8b3d6dbad8cf3252553cc41abea09ad527b33ce47a5e199072620b296902/multidict-6.6.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0a2088c126b6f72db6c9212ad827d0ba088c01d951cee25e758c450da732c138", size = 258299, upload-time = "2025-08-11T12:06:58.946Z" }, - { url = "https://files.pythonhosted.org/packages/3c/e8/8ca2e9a9f5a435fc6db40438a55730a4bf4956b554e487fa1b9ae920f825/multidict-6.6.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0f37bed7319b848097085d7d48116f545985db988e2256b2e6f00563a3416ee6", size = 242357, upload-time = 
"2025-08-11T12:07:00.301Z" }, - { url = "https://files.pythonhosted.org/packages/0f/84/80c77c99df05a75c28490b2af8f7cba2a12621186e0a8b0865d8e745c104/multidict-6.6.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:01368e3c94032ba6ca0b78e7ccb099643466cf24f8dc8eefcfdc0571d56e58f9", size = 268369, upload-time = "2025-08-11T12:07:01.638Z" }, - { url = "https://files.pythonhosted.org/packages/0d/e9/920bfa46c27b05fb3e1ad85121fd49f441492dca2449c5bcfe42e4565d8a/multidict-6.6.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8fe323540c255db0bffee79ad7f048c909f2ab0edb87a597e1c17da6a54e493c", size = 269341, upload-time = "2025-08-11T12:07:02.943Z" }, - { url = "https://files.pythonhosted.org/packages/af/65/753a2d8b05daf496f4a9c367fe844e90a1b2cac78e2be2c844200d10cc4c/multidict-6.6.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b8eb3025f17b0a4c3cd08cda49acf312a19ad6e8a4edd9dbd591e6506d999402", size = 256100, upload-time = "2025-08-11T12:07:04.564Z" }, - { url = "https://files.pythonhosted.org/packages/09/54/655be13ae324212bf0bc15d665a4e34844f34c206f78801be42f7a0a8aaa/multidict-6.6.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bbc14f0365534d35a06970d6a83478b249752e922d662dc24d489af1aa0d1be7", size = 253584, upload-time = "2025-08-11T12:07:05.914Z" }, - { url = "https://files.pythonhosted.org/packages/5c/74/ab2039ecc05264b5cec73eb018ce417af3ebb384ae9c0e9ed42cb33f8151/multidict-6.6.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:75aa52fba2d96bf972e85451b99d8e19cc37ce26fd016f6d4aa60da9ab2b005f", size = 251018, upload-time = "2025-08-11T12:07:08.301Z" }, - { url = "https://files.pythonhosted.org/packages/af/0a/ccbb244ac848e56c6427f2392741c06302bbfba49c0042f1eb3c5b606497/multidict-6.6.4-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:4fefd4a815e362d4f011919d97d7b4a1e566f1dde83dc4ad8cfb5b41de1df68d", size = 
251477, upload-time = "2025-08-11T12:07:10.248Z" }, - { url = "https://files.pythonhosted.org/packages/0e/b0/0ed49bba775b135937f52fe13922bc64a7eaf0a3ead84a36e8e4e446e096/multidict-6.6.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:db9801fe021f59a5b375ab778973127ca0ac52429a26e2fd86aa9508f4d26eb7", size = 263575, upload-time = "2025-08-11T12:07:11.928Z" }, - { url = "https://files.pythonhosted.org/packages/3e/d9/7fb85a85e14de2e44dfb6a24f03c41e2af8697a6df83daddb0e9b7569f73/multidict-6.6.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:a650629970fa21ac1fb06ba25dabfc5b8a2054fcbf6ae97c758aa956b8dba802", size = 259649, upload-time = "2025-08-11T12:07:13.244Z" }, - { url = "https://files.pythonhosted.org/packages/03/9e/b3a459bcf9b6e74fa461a5222a10ff9b544cb1cd52fd482fb1b75ecda2a2/multidict-6.6.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:452ff5da78d4720d7516a3a2abd804957532dd69296cb77319c193e3ffb87e24", size = 251505, upload-time = "2025-08-11T12:07:14.57Z" }, - { url = "https://files.pythonhosted.org/packages/86/a2/8022f78f041dfe6d71e364001a5cf987c30edfc83c8a5fb7a3f0974cff39/multidict-6.6.4-cp312-cp312-win32.whl", hash = "sha256:8c2fcb12136530ed19572bbba61b407f655e3953ba669b96a35036a11a485793", size = 41888, upload-time = "2025-08-11T12:07:15.904Z" }, - { url = "https://files.pythonhosted.org/packages/c7/eb/d88b1780d43a56db2cba24289fa744a9d216c1a8546a0dc3956563fd53ea/multidict-6.6.4-cp312-cp312-win_amd64.whl", hash = "sha256:047d9425860a8c9544fed1b9584f0c8bcd31bcde9568b047c5e567a1025ecd6e", size = 46072, upload-time = "2025-08-11T12:07:17.045Z" }, - { url = "https://files.pythonhosted.org/packages/9f/16/b929320bf5750e2d9d4931835a4c638a19d2494a5b519caaaa7492ebe105/multidict-6.6.4-cp312-cp312-win_arm64.whl", hash = "sha256:14754eb72feaa1e8ae528468f24250dd997b8e2188c3d2f593f9eba259e4b364", size = 43222, upload-time = "2025-08-11T12:07:18.328Z" }, - { url = 
"https://files.pythonhosted.org/packages/3a/5d/e1db626f64f60008320aab00fbe4f23fc3300d75892a3381275b3d284580/multidict-6.6.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f46a6e8597f9bd71b31cc708195d42b634c8527fecbcf93febf1052cacc1f16e", size = 75848, upload-time = "2025-08-11T12:07:19.912Z" }, - { url = "https://files.pythonhosted.org/packages/4c/aa/8b6f548d839b6c13887253af4e29c939af22a18591bfb5d0ee6f1931dae8/multidict-6.6.4-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:22e38b2bc176c5eb9c0a0e379f9d188ae4cd8b28c0f53b52bce7ab0a9e534657", size = 45060, upload-time = "2025-08-11T12:07:21.163Z" }, - { url = "https://files.pythonhosted.org/packages/eb/c6/f5e97e5d99a729bc2aa58eb3ebfa9f1e56a9b517cc38c60537c81834a73f/multidict-6.6.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5df8afd26f162da59e218ac0eefaa01b01b2e6cd606cffa46608f699539246da", size = 43269, upload-time = "2025-08-11T12:07:22.392Z" }, - { url = "https://files.pythonhosted.org/packages/dc/31/d54eb0c62516776f36fe67f84a732f97e0b0e12f98d5685bebcc6d396910/multidict-6.6.4-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:49517449b58d043023720aa58e62b2f74ce9b28f740a0b5d33971149553d72aa", size = 237158, upload-time = "2025-08-11T12:07:23.636Z" }, - { url = "https://files.pythonhosted.org/packages/c4/1c/8a10c1c25b23156e63b12165a929d8eb49a6ed769fdbefb06e6f07c1e50d/multidict-6.6.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ae9408439537c5afdca05edd128a63f56a62680f4b3c234301055d7a2000220f", size = 257076, upload-time = "2025-08-11T12:07:25.049Z" }, - { url = "https://files.pythonhosted.org/packages/ad/86/90e20b5771d6805a119e483fd3d1e8393e745a11511aebca41f0da38c3e2/multidict-6.6.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:87a32d20759dc52a9e850fe1061b6e41ab28e2998d44168a8a341b99ded1dba0", size = 240694, upload-time = 
"2025-08-11T12:07:26.458Z" }, - { url = "https://files.pythonhosted.org/packages/e7/49/484d3e6b535bc0555b52a0a26ba86e4d8d03fd5587d4936dc59ba7583221/multidict-6.6.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:52e3c8d43cdfff587ceedce9deb25e6ae77daba560b626e97a56ddcad3756879", size = 266350, upload-time = "2025-08-11T12:07:27.94Z" }, - { url = "https://files.pythonhosted.org/packages/bf/b4/aa4c5c379b11895083d50021e229e90c408d7d875471cb3abf721e4670d6/multidict-6.6.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ad8850921d3a8d8ff6fbef790e773cecfc260bbfa0566998980d3fa8f520bc4a", size = 267250, upload-time = "2025-08-11T12:07:29.303Z" }, - { url = "https://files.pythonhosted.org/packages/80/e5/5e22c5bf96a64bdd43518b1834c6d95a4922cc2066b7d8e467dae9b6cee6/multidict-6.6.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:497a2954adc25c08daff36f795077f63ad33e13f19bfff7736e72c785391534f", size = 254900, upload-time = "2025-08-11T12:07:30.764Z" }, - { url = "https://files.pythonhosted.org/packages/17/38/58b27fed927c07035abc02befacab42491e7388ca105e087e6e0215ead64/multidict-6.6.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:024ce601f92d780ca1617ad4be5ac15b501cc2414970ffa2bb2bbc2bd5a68fa5", size = 252355, upload-time = "2025-08-11T12:07:32.205Z" }, - { url = "https://files.pythonhosted.org/packages/d0/a1/dad75d23a90c29c02b5d6f3d7c10ab36c3197613be5d07ec49c7791e186c/multidict-6.6.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:a693fc5ed9bdd1c9e898013e0da4dcc640de7963a371c0bd458e50e046bf6438", size = 250061, upload-time = "2025-08-11T12:07:33.623Z" }, - { url = "https://files.pythonhosted.org/packages/b8/1a/ac2216b61c7f116edab6dc3378cca6c70dc019c9a457ff0d754067c58b20/multidict-6.6.4-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:190766dac95aab54cae5b152a56520fd99298f32a1266d66d27fdd1b5ac00f4e", size = 249675, 
upload-time = "2025-08-11T12:07:34.958Z" }, - { url = "https://files.pythonhosted.org/packages/d4/79/1916af833b800d13883e452e8e0977c065c4ee3ab7a26941fbfdebc11895/multidict-6.6.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:34d8f2a5ffdceab9dcd97c7a016deb2308531d5f0fced2bb0c9e1df45b3363d7", size = 261247, upload-time = "2025-08-11T12:07:36.588Z" }, - { url = "https://files.pythonhosted.org/packages/c5/65/d1f84fe08ac44a5fc7391cbc20a7cedc433ea616b266284413fd86062f8c/multidict-6.6.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:59e8d40ab1f5a8597abcef00d04845155a5693b5da00d2c93dbe88f2050f2812", size = 257960, upload-time = "2025-08-11T12:07:39.735Z" }, - { url = "https://files.pythonhosted.org/packages/13/b5/29ec78057d377b195ac2c5248c773703a6b602e132a763e20ec0457e7440/multidict-6.6.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:467fe64138cfac771f0e949b938c2e1ada2b5af22f39692aa9258715e9ea613a", size = 250078, upload-time = "2025-08-11T12:07:41.525Z" }, - { url = "https://files.pythonhosted.org/packages/c4/0e/7e79d38f70a872cae32e29b0d77024bef7834b0afb406ddae6558d9e2414/multidict-6.6.4-cp313-cp313-win32.whl", hash = "sha256:14616a30fe6d0a48d0a48d1a633ab3b8bec4cf293aac65f32ed116f620adfd69", size = 41708, upload-time = "2025-08-11T12:07:43.405Z" }, - { url = "https://files.pythonhosted.org/packages/9d/34/746696dffff742e97cd6a23da953e55d0ea51fa601fa2ff387b3edcfaa2c/multidict-6.6.4-cp313-cp313-win_amd64.whl", hash = "sha256:40cd05eaeb39e2bc8939451f033e57feaa2ac99e07dbca8afe2be450a4a3b6cf", size = 45912, upload-time = "2025-08-11T12:07:45.082Z" }, - { url = "https://files.pythonhosted.org/packages/c7/87/3bac136181e271e29170d8d71929cdeddeb77f3e8b6a0c08da3a8e9da114/multidict-6.6.4-cp313-cp313-win_arm64.whl", hash = "sha256:f6eb37d511bfae9e13e82cb4d1af36b91150466f24d9b2b8a9785816deb16605", size = 43076, upload-time = "2025-08-11T12:07:46.746Z" }, - { url = 
"https://files.pythonhosted.org/packages/64/94/0a8e63e36c049b571c9ae41ee301ada29c3fee9643d9c2548d7d558a1d99/multidict-6.6.4-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:6c84378acd4f37d1b507dfa0d459b449e2321b3ba5f2338f9b085cf7a7ba95eb", size = 82812, upload-time = "2025-08-11T12:07:48.402Z" }, - { url = "https://files.pythonhosted.org/packages/25/1a/be8e369dfcd260d2070a67e65dd3990dd635cbd735b98da31e00ea84cd4e/multidict-6.6.4-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0e0558693063c75f3d952abf645c78f3c5dfdd825a41d8c4d8156fc0b0da6e7e", size = 48313, upload-time = "2025-08-11T12:07:49.679Z" }, - { url = "https://files.pythonhosted.org/packages/26/5a/dd4ade298674b2f9a7b06a32c94ffbc0497354df8285f27317c66433ce3b/multidict-6.6.4-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3f8e2384cb83ebd23fd07e9eada8ba64afc4c759cd94817433ab8c81ee4b403f", size = 46777, upload-time = "2025-08-11T12:07:51.318Z" }, - { url = "https://files.pythonhosted.org/packages/89/db/98aa28bc7e071bfba611ac2ae803c24e96dd3a452b4118c587d3d872c64c/multidict-6.6.4-cp313-cp313t-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:f996b87b420995a9174b2a7c1a8daf7db4750be6848b03eb5e639674f7963773", size = 229321, upload-time = "2025-08-11T12:07:52.965Z" }, - { url = "https://files.pythonhosted.org/packages/c7/bc/01ddda2a73dd9d167bd85d0e8ef4293836a8f82b786c63fb1a429bc3e678/multidict-6.6.4-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:cc356250cffd6e78416cf5b40dc6a74f1edf3be8e834cf8862d9ed5265cf9b0e", size = 249954, upload-time = "2025-08-11T12:07:54.423Z" }, - { url = "https://files.pythonhosted.org/packages/06/78/6b7c0f020f9aa0acf66d0ab4eb9f08375bac9a50ff5e3edb1c4ccd59eafc/multidict-6.6.4-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:dadf95aa862714ea468a49ad1e09fe00fcc9ec67d122f6596a8d40caf6cec7d0", size = 228612, upload-time = 
"2025-08-11T12:07:55.914Z" }, - { url = "https://files.pythonhosted.org/packages/00/44/3faa416f89b2d5d76e9d447296a81521e1c832ad6e40b92f990697b43192/multidict-6.6.4-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7dd57515bebffd8ebd714d101d4c434063322e4fe24042e90ced41f18b6d3395", size = 257528, upload-time = "2025-08-11T12:07:57.371Z" }, - { url = "https://files.pythonhosted.org/packages/05/5f/77c03b89af0fcb16f018f668207768191fb9dcfb5e3361a5e706a11db2c9/multidict-6.6.4-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:967af5f238ebc2eb1da4e77af5492219fbd9b4b812347da39a7b5f5c72c0fa45", size = 256329, upload-time = "2025-08-11T12:07:58.844Z" }, - { url = "https://files.pythonhosted.org/packages/cf/e9/ed750a2a9afb4f8dc6f13dc5b67b514832101b95714f1211cd42e0aafc26/multidict-6.6.4-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2a4c6875c37aae9794308ec43e3530e4aa0d36579ce38d89979bbf89582002bb", size = 247928, upload-time = "2025-08-11T12:08:01.037Z" }, - { url = "https://files.pythonhosted.org/packages/1f/b5/e0571bc13cda277db7e6e8a532791d4403dacc9850006cb66d2556e649c0/multidict-6.6.4-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:7f683a551e92bdb7fac545b9c6f9fa2aebdeefa61d607510b3533286fcab67f5", size = 245228, upload-time = "2025-08-11T12:08:02.96Z" }, - { url = "https://files.pythonhosted.org/packages/f3/a3/69a84b0eccb9824491f06368f5b86e72e4af54c3067c37c39099b6687109/multidict-6.6.4-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:3ba5aaf600edaf2a868a391779f7a85d93bed147854925f34edd24cc70a3e141", size = 235869, upload-time = "2025-08-11T12:08:04.746Z" }, - { url = "https://files.pythonhosted.org/packages/a9/9d/28802e8f9121a6a0804fa009debf4e753d0a59969ea9f70be5f5fdfcb18f/multidict-6.6.4-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:580b643b7fd2c295d83cad90d78419081f53fd532d1f1eb67ceb7060f61cff0d", size = 
243446, upload-time = "2025-08-11T12:08:06.332Z" }, - { url = "https://files.pythonhosted.org/packages/38/ea/6c98add069b4878c1d66428a5f5149ddb6d32b1f9836a826ac764b9940be/multidict-6.6.4-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:37b7187197da6af3ee0b044dbc9625afd0c885f2800815b228a0e70f9a7f473d", size = 252299, upload-time = "2025-08-11T12:08:07.931Z" }, - { url = "https://files.pythonhosted.org/packages/3a/09/8fe02d204473e14c0af3affd50af9078839dfca1742f025cca765435d6b4/multidict-6.6.4-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e1b93790ed0bc26feb72e2f08299691ceb6da5e9e14a0d13cc74f1869af327a0", size = 246926, upload-time = "2025-08-11T12:08:09.467Z" }, - { url = "https://files.pythonhosted.org/packages/37/3d/7b1e10d774a6df5175ecd3c92bff069e77bed9ec2a927fdd4ff5fe182f67/multidict-6.6.4-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a506a77ddee1efcca81ecbeae27ade3e09cdf21a8ae854d766c2bb4f14053f92", size = 243383, upload-time = "2025-08-11T12:08:10.981Z" }, - { url = "https://files.pythonhosted.org/packages/50/b0/a6fae46071b645ae98786ab738447de1ef53742eaad949f27e960864bb49/multidict-6.6.4-cp313-cp313t-win32.whl", hash = "sha256:f93b2b2279883d1d0a9e1bd01f312d6fc315c5e4c1f09e112e4736e2f650bc4e", size = 47775, upload-time = "2025-08-11T12:08:12.439Z" }, - { url = "https://files.pythonhosted.org/packages/b2/0a/2436550b1520091af0600dff547913cb2d66fbac27a8c33bc1b1bccd8d98/multidict-6.6.4-cp313-cp313t-win_amd64.whl", hash = "sha256:6d46a180acdf6e87cc41dc15d8f5c2986e1e8739dc25dbb7dac826731ef381a4", size = 53100, upload-time = "2025-08-11T12:08:13.823Z" }, - { url = "https://files.pythonhosted.org/packages/97/ea/43ac51faff934086db9c072a94d327d71b7d8b40cd5dcb47311330929ef0/multidict-6.6.4-cp313-cp313t-win_arm64.whl", hash = "sha256:756989334015e3335d087a27331659820d53ba432befdef6a718398b0a8493ad", size = 45501, upload-time = "2025-08-11T12:08:15.173Z" }, - { url = 
"https://files.pythonhosted.org/packages/fd/69/b547032297c7e63ba2af494edba695d781af8a0c6e89e4d06cf848b21d80/multidict-6.6.4-py3-none-any.whl", hash = "sha256:27d8f8e125c07cb954e54d75d04905a9bba8a439c1d84aca94949d4d03d8601c", size = 12313, upload-time = "2025-08-11T12:08:46.891Z" }, -] - -[[package]] -name = "mypy" -version = "1.18.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "mypy-extensions" }, - { name = "pathspec" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/14/a3/931e09fc02d7ba96da65266884da4e4a8806adcdb8a57faaacc6edf1d538/mypy-1.18.1.tar.gz", hash = "sha256:9e988c64ad3ac5987f43f5154f884747faf62141b7f842e87465b45299eea5a9", size = 3448447, upload-time = "2025-09-11T23:00:47.067Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/32/28/47709d5d9e7068b26c0d5189c8137c8783e81065ad1102b505214a08b548/mypy-1.18.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6c903857b3e28fc5489e54042684a9509039ea0aedb2a619469438b544ae1961", size = 12734635, upload-time = "2025-09-11T23:00:24.983Z" }, - { url = "https://files.pythonhosted.org/packages/7c/12/ee5c243e52497d0e59316854041cf3b3130131b92266d0764aca4dec3c00/mypy-1.18.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2a0c8392c19934c2b6c65566d3a6abdc6b51d5da7f5d04e43f0eb627d6eeee65", size = 11817287, upload-time = "2025-09-11T22:59:07.38Z" }, - { url = "https://files.pythonhosted.org/packages/48/bd/2aeb950151005fe708ab59725afed7c4aeeb96daf844f86a05d4b8ac34f8/mypy-1.18.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f85eb7efa2ec73ef63fc23b8af89c2fe5bf2a4ad985ed2d3ff28c1bb3c317c92", size = 12430464, upload-time = "2025-09-11T22:58:48.084Z" }, - { url = "https://files.pythonhosted.org/packages/71/e8/7a20407aafb488acb5734ad7fb5e8c2ef78d292ca2674335350fa8ebef67/mypy-1.18.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:82ace21edf7ba8af31c3308a61dc72df30500f4dbb26f99ac36b4b80809d7e94", size = 13164555, upload-time = "2025-09-11T23:00:13.803Z" }, - { url = "https://files.pythonhosted.org/packages/e8/c9/5f39065252e033b60f397096f538fb57c1d9fd70a7a490f314df20dd9d64/mypy-1.18.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a2dfd53dfe632f1ef5d161150a4b1f2d0786746ae02950eb3ac108964ee2975a", size = 13359222, upload-time = "2025-09-11T23:00:33.469Z" }, - { url = "https://files.pythonhosted.org/packages/85/b6/d54111ef3c1e55992cd2ec9b8b6ce9c72a407423e93132cae209f7e7ba60/mypy-1.18.1-cp311-cp311-win_amd64.whl", hash = "sha256:320f0ad4205eefcb0e1a72428dde0ad10be73da9f92e793c36228e8ebf7298c0", size = 9760441, upload-time = "2025-09-11T23:00:44.826Z" }, - { url = "https://files.pythonhosted.org/packages/e7/14/1c3f54d606cb88a55d1567153ef3a8bc7b74702f2ff5eb64d0994f9e49cb/mypy-1.18.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:502cde8896be8e638588b90fdcb4c5d5b8c1b004dfc63fd5604a973547367bb9", size = 12911082, upload-time = "2025-09-11T23:00:41.465Z" }, - { url = "https://files.pythonhosted.org/packages/90/83/235606c8b6d50a8eba99773add907ce1d41c068edb523f81eb0d01603a83/mypy-1.18.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7509549b5e41be279afc1228242d0e397f1af2919a8f2877ad542b199dc4083e", size = 11919107, upload-time = "2025-09-11T22:58:40.903Z" }, - { url = "https://files.pythonhosted.org/packages/ca/25/4e2ce00f8d15b99d0c68a2536ad63e9eac033f723439ef80290ec32c1ff5/mypy-1.18.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5956ecaabb3a245e3f34100172abca1507be687377fe20e24d6a7557e07080e2", size = 12472551, upload-time = "2025-09-11T22:58:37.272Z" }, - { url = "https://files.pythonhosted.org/packages/32/bb/92642a9350fc339dd9dcefcf6862d171b52294af107d521dce075f32f298/mypy-1.18.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:8750ceb014a96c9890421c83f0db53b0f3b8633e2864c6f9bc0a8e93951ed18d", size = 13340554, upload-time = "2025-09-11T22:59:38.756Z" }, - { url = "https://files.pythonhosted.org/packages/cd/ee/38d01db91c198fb6350025d28f9719ecf3c8f2c55a0094bfbf3ef478cc9a/mypy-1.18.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fb89ea08ff41adf59476b235293679a6eb53a7b9400f6256272fb6029bec3ce5", size = 13530933, upload-time = "2025-09-11T22:59:20.228Z" }, - { url = "https://files.pythonhosted.org/packages/da/8d/6d991ae631f80d58edbf9d7066e3f2a96e479dca955d9a968cd6e90850a3/mypy-1.18.1-cp312-cp312-win_amd64.whl", hash = "sha256:2657654d82fcd2a87e02a33e0d23001789a554059bbf34702d623dafe353eabf", size = 9828426, upload-time = "2025-09-11T23:00:21.007Z" }, - { url = "https://files.pythonhosted.org/packages/e4/ec/ef4a7260e1460a3071628a9277a7579e7da1b071bc134ebe909323f2fbc7/mypy-1.18.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d70d2b5baf9b9a20bc9c730015615ae3243ef47fb4a58ad7b31c3e0a59b5ef1f", size = 12918671, upload-time = "2025-09-11T22:58:29.814Z" }, - { url = "https://files.pythonhosted.org/packages/a1/82/0ea6c3953f16223f0b8eda40c1aeac6bd266d15f4902556ae6e91f6fca4c/mypy-1.18.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b8367e33506300f07a43012fc546402f283c3f8bcff1dc338636affb710154ce", size = 11913023, upload-time = "2025-09-11T23:00:29.049Z" }, - { url = "https://files.pythonhosted.org/packages/ae/ef/5e2057e692c2690fc27b3ed0a4dbde4388330c32e2576a23f0302bc8358d/mypy-1.18.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:913f668ec50c3337b89df22f973c1c8f0b29ee9e290a8b7fe01cc1ef7446d42e", size = 12473355, upload-time = "2025-09-11T23:00:04.544Z" }, - { url = "https://files.pythonhosted.org/packages/98/43/b7e429fc4be10e390a167b0cd1810d41cb4e4add4ae50bab96faff695a3b/mypy-1.18.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:1a0e70b87eb27b33209fa4792b051c6947976f6ab829daa83819df5f58330c71", size = 13346944, upload-time = "2025-09-11T22:58:23.024Z" }, - { url = "https://files.pythonhosted.org/packages/89/4e/899dba0bfe36bbd5b7c52e597de4cf47b5053d337b6d201a30e3798e77a6/mypy-1.18.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:c378d946e8a60be6b6ede48c878d145546fb42aad61df998c056ec151bf6c746", size = 13512574, upload-time = "2025-09-11T22:59:52.152Z" }, - { url = "https://files.pythonhosted.org/packages/f5/f8/7661021a5b0e501b76440454d786b0f01bb05d5c4b125fcbda02023d0250/mypy-1.18.1-cp313-cp313-win_amd64.whl", hash = "sha256:2cd2c1e0f3a7465f22731987fff6fc427e3dcbb4ca5f7db5bbeaff2ff9a31f6d", size = 9837684, upload-time = "2025-09-11T22:58:44.454Z" }, - { url = "https://files.pythonhosted.org/packages/bf/87/7b173981466219eccc64c107cf8e5ab9eb39cc304b4c07df8e7881533e4f/mypy-1.18.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:ba24603c58e34dd5b096dfad792d87b304fc6470cbb1c22fd64e7ebd17edcc61", size = 12900265, upload-time = "2025-09-11T22:59:03.4Z" }, - { url = "https://files.pythonhosted.org/packages/ae/cc/b10e65bae75b18a5ac8f81b1e8e5867677e418f0dd2c83b8e2de9ba96ebd/mypy-1.18.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ed36662fb92ae4cb3cacc682ec6656208f323bbc23d4b08d091eecfc0863d4b5", size = 11942890, upload-time = "2025-09-11T23:00:00.607Z" }, - { url = "https://files.pythonhosted.org/packages/39/d4/aeefa07c44d09f4c2102e525e2031bc066d12e5351f66b8a83719671004d/mypy-1.18.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:040ecc95e026f71a9ad7956fea2724466602b561e6a25c2e5584160d3833aaa8", size = 12472291, upload-time = "2025-09-11T22:59:43.425Z" }, - { url = "https://files.pythonhosted.org/packages/c6/07/711e78668ff8e365f8c19735594ea95938bff3639a4c46a905e3ed8ff2d6/mypy-1.18.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:937e3ed86cb731276706e46e03512547e43c391a13f363e08d0fee49a7c38a0d", size = 13318610, upload-time = "2025-09-11T23:00:17.604Z" }, - { url = "https://files.pythonhosted.org/packages/ca/85/df3b2d39339c31d360ce299b418c55e8194ef3205284739b64962f6074e7/mypy-1.18.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:1f95cc4f01c0f1701ca3b0355792bccec13ecb2ec1c469e5b85a6ef398398b1d", size = 13513697, upload-time = "2025-09-11T22:58:59.534Z" }, - { url = "https://files.pythonhosted.org/packages/b1/df/462866163c99ea73bb28f0eb4d415c087e30de5d36ee0f5429d42e28689b/mypy-1.18.1-cp314-cp314-win_amd64.whl", hash = "sha256:e4f16c0019d48941220ac60b893615be2f63afedaba6a0801bdcd041b96991ce", size = 9985739, upload-time = "2025-09-11T22:58:51.644Z" }, - { url = "https://files.pythonhosted.org/packages/e0/1d/4b97d3089b48ef3d904c9ca69fab044475bd03245d878f5f0b3ea1daf7ce/mypy-1.18.1-py3-none-any.whl", hash = "sha256:b76a4de66a0ac01da1be14ecc8ae88ddea33b8380284a9e3eae39d57ebcbe26e", size = 2352212, upload-time = "2025-09-11T22:59:26.576Z" }, -] - -[[package]] -name = "mypy-extensions" -version = "1.1.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/a2/6e/371856a3fb9d31ca8dac321cda606860fa4548858c0cc45d9d1d4ca2628b/mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558", size = 6343, upload-time = "2025-04-22T14:54:24.164Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" }, -] - -[[package]] -name = "networkx" -version = "3.5" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/6c/4f/ccdb8ad3a38e583f214547fd2f7ff1fc160c43a75af88e6aec213404b96a/networkx-3.5.tar.gz", hash = "sha256:d4c6f9cf81f52d69230866796b82afbccdec3db7ae4fbd1b65ea750feed50037", size = 2471065, upload-time = "2025-05-29T11:35:07.804Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/eb/8d/776adee7bbf76365fdd7f2552710282c79a4ead5d2a46408c9043a2b70ba/networkx-3.5-py3-none-any.whl", hash = "sha256:0030d386a9a06dee3565298b4a734b68589749a544acbb6c412dc9e2489ec6ec", size = 2034406, upload-time = "2025-05-29T11:35:04.961Z" }, -] - -[[package]] -name = "nltk" -version = "3.9.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "click" }, - { name = "joblib" }, - { name = "regex" }, - { name = "tqdm" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/3c/87/db8be88ad32c2d042420b6fd9ffd4a149f9a0d7f0e86b3f543be2eeeedd2/nltk-3.9.1.tar.gz", hash = "sha256:87d127bd3de4bd89a4f81265e5fa59cb1b199b27440175370f7417d2bc7ae868", size = 2904691, upload-time = "2024-08-18T19:48:37.769Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/4d/66/7d9e26593edda06e8cb531874633f7c2372279c3b0f46235539fe546df8b/nltk-3.9.1-py3-none-any.whl", hash = "sha256:4fa26829c5b00715afe3061398a8989dc643b92ce7dd93fb4585a70930d168a1", size = 1505442, upload-time = "2024-08-18T19:48:21.909Z" }, -] - -[[package]] -name = "nodeenv" -version = "1.9.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437, upload-time = "2024-06-04T18:44:11.171Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = 
"sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314, upload-time = "2024-06-04T18:44:08.352Z" }, -] - -[[package]] -name = "numpy" -version = "2.3.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d0/19/95b3d357407220ed24c139018d2518fab0a61a948e68286a25f1a4d049ff/numpy-2.3.3.tar.gz", hash = "sha256:ddc7c39727ba62b80dfdbedf400d1c10ddfa8eefbd7ec8dcb118be8b56d31029", size = 20576648, upload-time = "2025-09-09T16:54:12.543Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7a/45/e80d203ef6b267aa29b22714fb558930b27960a0c5ce3c19c999232bb3eb/numpy-2.3.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0ffc4f5caba7dfcbe944ed674b7eef683c7e94874046454bb79ed7ee0236f59d", size = 21259253, upload-time = "2025-09-09T15:56:02.094Z" }, - { url = "https://files.pythonhosted.org/packages/52/18/cf2c648fccf339e59302e00e5f2bc87725a3ce1992f30f3f78c9044d7c43/numpy-2.3.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e7e946c7170858a0295f79a60214424caac2ffdb0063d4d79cb681f9aa0aa569", size = 14450980, upload-time = "2025-09-09T15:56:05.926Z" }, - { url = "https://files.pythonhosted.org/packages/93/fb/9af1082bec870188c42a1c239839915b74a5099c392389ff04215dcee812/numpy-2.3.3-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:cd4260f64bc794c3390a63bf0728220dd1a68170c169088a1e0dfa2fde1be12f", size = 5379709, upload-time = "2025-09-09T15:56:07.95Z" }, - { url = "https://files.pythonhosted.org/packages/75/0f/bfd7abca52bcbf9a4a65abc83fe18ef01ccdeb37bfb28bbd6ad613447c79/numpy-2.3.3-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:f0ddb4b96a87b6728df9362135e764eac3cfa674499943ebc44ce96c478ab125", size = 6913923, upload-time = "2025-09-09T15:56:09.443Z" }, - { url = "https://files.pythonhosted.org/packages/79/55/d69adad255e87ab7afda1caf93ca997859092afeb697703e2f010f7c2e55/numpy-2.3.3-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:afd07d377f478344ec6ca2b8d4ca08ae8bd44706763d1efb56397de606393f48", size = 14589591, upload-time = "2025-09-09T15:56:11.234Z" }, - { url = "https://files.pythonhosted.org/packages/10/a2/010b0e27ddeacab7839957d7a8f00e91206e0c2c47abbb5f35a2630e5387/numpy-2.3.3-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bc92a5dedcc53857249ca51ef29f5e5f2f8c513e22cfb90faeb20343b8c6f7a6", size = 16938714, upload-time = "2025-09-09T15:56:14.637Z" }, - { url = "https://files.pythonhosted.org/packages/1c/6b/12ce8ede632c7126eb2762b9e15e18e204b81725b81f35176eac14dc5b82/numpy-2.3.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7af05ed4dc19f308e1d9fc759f36f21921eb7bbfc82843eeec6b2a2863a0aefa", size = 16370592, upload-time = "2025-09-09T15:56:17.285Z" }, - { url = "https://files.pythonhosted.org/packages/b4/35/aba8568b2593067bb6a8fe4c52babb23b4c3b9c80e1b49dff03a09925e4a/numpy-2.3.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:433bf137e338677cebdd5beac0199ac84712ad9d630b74eceeb759eaa45ddf30", size = 18884474, upload-time = "2025-09-09T15:56:20.943Z" }, - { url = "https://files.pythonhosted.org/packages/45/fa/7f43ba10c77575e8be7b0138d107e4f44ca4a1ef322cd16980ea3e8b8222/numpy-2.3.3-cp311-cp311-win32.whl", hash = "sha256:eb63d443d7b4ffd1e873f8155260d7f58e7e4b095961b01c91062935c2491e57", size = 6599794, upload-time = "2025-09-09T15:56:23.258Z" }, - { url = "https://files.pythonhosted.org/packages/0a/a2/a4f78cb2241fe5664a22a10332f2be886dcdea8784c9f6a01c272da9b426/numpy-2.3.3-cp311-cp311-win_amd64.whl", hash = "sha256:ec9d249840f6a565f58d8f913bccac2444235025bbb13e9a4681783572ee3caa", size = 13088104, upload-time = "2025-09-09T15:56:25.476Z" }, - { url = "https://files.pythonhosted.org/packages/79/64/e424e975adbd38282ebcd4891661965b78783de893b381cbc4832fb9beb2/numpy-2.3.3-cp311-cp311-win_arm64.whl", hash = "sha256:74c2a948d02f88c11a3c075d9733f1ae67d97c6bdb97f2bb542f980458b257e7", size = 10460772, upload-time = "2025-09-09T15:56:27.679Z" }, - { 
url = "https://files.pythonhosted.org/packages/51/5d/bb7fc075b762c96329147799e1bcc9176ab07ca6375ea976c475482ad5b3/numpy-2.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:cfdd09f9c84a1a934cde1eec2267f0a43a7cd44b2cca4ff95b7c0d14d144b0bf", size = 20957014, upload-time = "2025-09-09T15:56:29.966Z" }, - { url = "https://files.pythonhosted.org/packages/6b/0e/c6211bb92af26517acd52125a237a92afe9c3124c6a68d3b9f81b62a0568/numpy-2.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:cb32e3cf0f762aee47ad1ddc6672988f7f27045b0783c887190545baba73aa25", size = 14185220, upload-time = "2025-09-09T15:56:32.175Z" }, - { url = "https://files.pythonhosted.org/packages/22/f2/07bb754eb2ede9073f4054f7c0286b0d9d2e23982e090a80d478b26d35ca/numpy-2.3.3-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:396b254daeb0a57b1fe0ecb5e3cff6fa79a380fa97c8f7781a6d08cd429418fe", size = 5113918, upload-time = "2025-09-09T15:56:34.175Z" }, - { url = "https://files.pythonhosted.org/packages/81/0a/afa51697e9fb74642f231ea36aca80fa17c8fb89f7a82abd5174023c3960/numpy-2.3.3-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:067e3d7159a5d8f8a0b46ee11148fc35ca9b21f61e3c49fbd0a027450e65a33b", size = 6647922, upload-time = "2025-09-09T15:56:36.149Z" }, - { url = "https://files.pythonhosted.org/packages/5d/f5/122d9cdb3f51c520d150fef6e87df9279e33d19a9611a87c0d2cf78a89f4/numpy-2.3.3-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1c02d0629d25d426585fb2e45a66154081b9fa677bc92a881ff1d216bc9919a8", size = 14281991, upload-time = "2025-09-09T15:56:40.548Z" }, - { url = "https://files.pythonhosted.org/packages/51/64/7de3c91e821a2debf77c92962ea3fe6ac2bc45d0778c1cbe15d4fce2fd94/numpy-2.3.3-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d9192da52b9745f7f0766531dcfa978b7763916f158bb63bdb8a1eca0068ab20", size = 16641643, upload-time = "2025-09-09T15:56:43.343Z" }, - { url = 
"https://files.pythonhosted.org/packages/30/e4/961a5fa681502cd0d68907818b69f67542695b74e3ceaa513918103b7e80/numpy-2.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:cd7de500a5b66319db419dc3c345244404a164beae0d0937283b907d8152e6ea", size = 16056787, upload-time = "2025-09-09T15:56:46.141Z" }, - { url = "https://files.pythonhosted.org/packages/99/26/92c912b966e47fbbdf2ad556cb17e3a3088e2e1292b9833be1dfa5361a1a/numpy-2.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:93d4962d8f82af58f0b2eb85daaf1b3ca23fe0a85d0be8f1f2b7bb46034e56d7", size = 18579598, upload-time = "2025-09-09T15:56:49.844Z" }, - { url = "https://files.pythonhosted.org/packages/17/b6/fc8f82cb3520768718834f310c37d96380d9dc61bfdaf05fe5c0b7653e01/numpy-2.3.3-cp312-cp312-win32.whl", hash = "sha256:5534ed6b92f9b7dca6c0a19d6df12d41c68b991cef051d108f6dbff3babc4ebf", size = 6320800, upload-time = "2025-09-09T15:56:52.499Z" }, - { url = "https://files.pythonhosted.org/packages/32/ee/de999f2625b80d043d6d2d628c07d0d5555a677a3cf78fdf868d409b8766/numpy-2.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:497d7cad08e7092dba36e3d296fe4c97708c93daf26643a1ae4b03f6294d30eb", size = 12786615, upload-time = "2025-09-09T15:56:54.422Z" }, - { url = "https://files.pythonhosted.org/packages/49/6e/b479032f8a43559c383acb20816644f5f91c88f633d9271ee84f3b3a996c/numpy-2.3.3-cp312-cp312-win_arm64.whl", hash = "sha256:ca0309a18d4dfea6fc6262a66d06c26cfe4640c3926ceec90e57791a82b6eee5", size = 10195936, upload-time = "2025-09-09T15:56:56.541Z" }, - { url = "https://files.pythonhosted.org/packages/7d/b9/984c2b1ee61a8b803bf63582b4ac4242cf76e2dbd663efeafcb620cc0ccb/numpy-2.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f5415fb78995644253370985342cd03572ef8620b934da27d77377a2285955bf", size = 20949588, upload-time = "2025-09-09T15:56:59.087Z" }, - { url = "https://files.pythonhosted.org/packages/a6/e4/07970e3bed0b1384d22af1e9912527ecbeb47d3b26e9b6a3bced068b3bea/numpy-2.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = 
"sha256:d00de139a3324e26ed5b95870ce63be7ec7352171bc69a4cf1f157a48e3eb6b7", size = 14177802, upload-time = "2025-09-09T15:57:01.73Z" }, - { url = "https://files.pythonhosted.org/packages/35/c7/477a83887f9de61f1203bad89cf208b7c19cc9fef0cebef65d5a1a0619f2/numpy-2.3.3-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:9dc13c6a5829610cc07422bc74d3ac083bd8323f14e2827d992f9e52e22cd6a6", size = 5106537, upload-time = "2025-09-09T15:57:03.765Z" }, - { url = "https://files.pythonhosted.org/packages/52/47/93b953bd5866a6f6986344d045a207d3f1cfbad99db29f534ea9cee5108c/numpy-2.3.3-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:d79715d95f1894771eb4e60fb23f065663b2298f7d22945d66877aadf33d00c7", size = 6640743, upload-time = "2025-09-09T15:57:07.921Z" }, - { url = "https://files.pythonhosted.org/packages/23/83/377f84aaeb800b64c0ef4de58b08769e782edcefa4fea712910b6f0afd3c/numpy-2.3.3-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:952cfd0748514ea7c3afc729a0fc639e61655ce4c55ab9acfab14bda4f402b4c", size = 14278881, upload-time = "2025-09-09T15:57:11.349Z" }, - { url = "https://files.pythonhosted.org/packages/9a/a5/bf3db6e66c4b160d6ea10b534c381a1955dfab34cb1017ea93aa33c70ed3/numpy-2.3.3-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5b83648633d46f77039c29078751f80da65aa64d5622a3cd62aaef9d835b6c93", size = 16636301, upload-time = "2025-09-09T15:57:14.245Z" }, - { url = "https://files.pythonhosted.org/packages/a2/59/1287924242eb4fa3f9b3a2c30400f2e17eb2707020d1c5e3086fe7330717/numpy-2.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b001bae8cea1c7dfdb2ae2b017ed0a6f2102d7a70059df1e338e307a4c78a8ae", size = 16053645, upload-time = "2025-09-09T15:57:16.534Z" }, - { url = "https://files.pythonhosted.org/packages/e6/93/b3d47ed882027c35e94ac2320c37e452a549f582a5e801f2d34b56973c97/numpy-2.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8e9aced64054739037d42fb84c54dd38b81ee238816c948c8f3ed134665dcd86", size = 
18578179, upload-time = "2025-09-09T15:57:18.883Z" }, - { url = "https://files.pythonhosted.org/packages/20/d9/487a2bccbf7cc9d4bfc5f0f197761a5ef27ba870f1e3bbb9afc4bbe3fcc2/numpy-2.3.3-cp313-cp313-win32.whl", hash = "sha256:9591e1221db3f37751e6442850429b3aabf7026d3b05542d102944ca7f00c8a8", size = 6312250, upload-time = "2025-09-09T15:57:21.296Z" }, - { url = "https://files.pythonhosted.org/packages/1b/b5/263ebbbbcede85028f30047eab3d58028d7ebe389d6493fc95ae66c636ab/numpy-2.3.3-cp313-cp313-win_amd64.whl", hash = "sha256:f0dadeb302887f07431910f67a14d57209ed91130be0adea2f9793f1a4f817cf", size = 12783269, upload-time = "2025-09-09T15:57:23.034Z" }, - { url = "https://files.pythonhosted.org/packages/fa/75/67b8ca554bbeaaeb3fac2e8bce46967a5a06544c9108ec0cf5cece559b6c/numpy-2.3.3-cp313-cp313-win_arm64.whl", hash = "sha256:3c7cf302ac6e0b76a64c4aecf1a09e51abd9b01fc7feee80f6c43e3ab1b1dbc5", size = 10195314, upload-time = "2025-09-09T15:57:25.045Z" }, - { url = "https://files.pythonhosted.org/packages/11/d0/0d1ddec56b162042ddfafeeb293bac672de9b0cfd688383590090963720a/numpy-2.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:eda59e44957d272846bb407aad19f89dc6f58fecf3504bd144f4c5cf81a7eacc", size = 21048025, upload-time = "2025-09-09T15:57:27.257Z" }, - { url = "https://files.pythonhosted.org/packages/36/9e/1996ca6b6d00415b6acbdd3c42f7f03ea256e2c3f158f80bd7436a8a19f3/numpy-2.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:823d04112bc85ef5c4fda73ba24e6096c8f869931405a80aa8b0e604510a26bc", size = 14301053, upload-time = "2025-09-09T15:57:30.077Z" }, - { url = "https://files.pythonhosted.org/packages/05/24/43da09aa764c68694b76e84b3d3f0c44cb7c18cdc1ba80e48b0ac1d2cd39/numpy-2.3.3-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:40051003e03db4041aa325da2a0971ba41cf65714e65d296397cc0e32de6018b", size = 5229444, upload-time = "2025-09-09T15:57:32.733Z" }, - { url = 
"https://files.pythonhosted.org/packages/bc/14/50ffb0f22f7218ef8af28dd089f79f68289a7a05a208db9a2c5dcbe123c1/numpy-2.3.3-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:6ee9086235dd6ab7ae75aba5662f582a81ced49f0f1c6de4260a78d8f2d91a19", size = 6738039, upload-time = "2025-09-09T15:57:34.328Z" }, - { url = "https://files.pythonhosted.org/packages/55/52/af46ac0795e09657d45a7f4db961917314377edecf66db0e39fa7ab5c3d3/numpy-2.3.3-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:94fcaa68757c3e2e668ddadeaa86ab05499a70725811e582b6a9858dd472fb30", size = 14352314, upload-time = "2025-09-09T15:57:36.255Z" }, - { url = "https://files.pythonhosted.org/packages/a7/b1/dc226b4c90eb9f07a3fff95c2f0db3268e2e54e5cce97c4ac91518aee71b/numpy-2.3.3-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:da1a74b90e7483d6ce5244053399a614b1d6b7bc30a60d2f570e5071f8959d3e", size = 16701722, upload-time = "2025-09-09T15:57:38.622Z" }, - { url = "https://files.pythonhosted.org/packages/9d/9d/9d8d358f2eb5eced14dba99f110d83b5cd9a4460895230f3b396ad19a323/numpy-2.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:2990adf06d1ecee3b3dcbb4977dfab6e9f09807598d647f04d385d29e7a3c3d3", size = 16132755, upload-time = "2025-09-09T15:57:41.16Z" }, - { url = "https://files.pythonhosted.org/packages/b6/27/b3922660c45513f9377b3fb42240bec63f203c71416093476ec9aa0719dc/numpy-2.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:ed635ff692483b8e3f0fcaa8e7eb8a75ee71aa6d975388224f70821421800cea", size = 18651560, upload-time = "2025-09-09T15:57:43.459Z" }, - { url = "https://files.pythonhosted.org/packages/5b/8e/3ab61a730bdbbc201bb245a71102aa609f0008b9ed15255500a99cd7f780/numpy-2.3.3-cp313-cp313t-win32.whl", hash = "sha256:a333b4ed33d8dc2b373cc955ca57babc00cd6f9009991d9edc5ddbc1bac36bcd", size = 6442776, upload-time = "2025-09-09T15:57:45.793Z" }, - { url = 
"https://files.pythonhosted.org/packages/1c/3a/e22b766b11f6030dc2decdeff5c2fb1610768055603f9f3be88b6d192fb2/numpy-2.3.3-cp313-cp313t-win_amd64.whl", hash = "sha256:4384a169c4d8f97195980815d6fcad04933a7e1ab3b530921c3fef7a1c63426d", size = 12927281, upload-time = "2025-09-09T15:57:47.492Z" }, - { url = "https://files.pythonhosted.org/packages/7b/42/c2e2bc48c5e9b2a83423f99733950fbefd86f165b468a3d85d52b30bf782/numpy-2.3.3-cp313-cp313t-win_arm64.whl", hash = "sha256:75370986cc0bc66f4ce5110ad35aae6d182cc4ce6433c40ad151f53690130bf1", size = 10265275, upload-time = "2025-09-09T15:57:49.647Z" }, - { url = "https://files.pythonhosted.org/packages/6b/01/342ad585ad82419b99bcf7cebe99e61da6bedb89e213c5fd71acc467faee/numpy-2.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:cd052f1fa6a78dee696b58a914b7229ecfa41f0a6d96dc663c1220a55e137593", size = 20951527, upload-time = "2025-09-09T15:57:52.006Z" }, - { url = "https://files.pythonhosted.org/packages/ef/d8/204e0d73fc1b7a9ee80ab1fe1983dd33a4d64a4e30a05364b0208e9a241a/numpy-2.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:414a97499480067d305fcac9716c29cf4d0d76db6ebf0bf3cbce666677f12652", size = 14186159, upload-time = "2025-09-09T15:57:54.407Z" }, - { url = "https://files.pythonhosted.org/packages/22/af/f11c916d08f3a18fb8ba81ab72b5b74a6e42ead4c2846d270eb19845bf74/numpy-2.3.3-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:50a5fe69f135f88a2be9b6ca0481a68a136f6febe1916e4920e12f1a34e708a7", size = 5114624, upload-time = "2025-09-09T15:57:56.5Z" }, - { url = "https://files.pythonhosted.org/packages/fb/11/0ed919c8381ac9d2ffacd63fd1f0c34d27e99cab650f0eb6f110e6ae4858/numpy-2.3.3-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:b912f2ed2b67a129e6a601e9d93d4fa37bef67e54cac442a2f588a54afe5c67a", size = 6642627, upload-time = "2025-09-09T15:57:58.206Z" }, - { url = 
"https://files.pythonhosted.org/packages/ee/83/deb5f77cb0f7ba6cb52b91ed388b47f8f3c2e9930d4665c600408d9b90b9/numpy-2.3.3-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9e318ee0596d76d4cb3d78535dc005fa60e5ea348cd131a51e99d0bdbe0b54fe", size = 14296926, upload-time = "2025-09-09T15:58:00.035Z" }, - { url = "https://files.pythonhosted.org/packages/77/cc/70e59dcb84f2b005d4f306310ff0a892518cc0c8000a33d0e6faf7ca8d80/numpy-2.3.3-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ce020080e4a52426202bdb6f7691c65bb55e49f261f31a8f506c9f6bc7450421", size = 16638958, upload-time = "2025-09-09T15:58:02.738Z" }, - { url = "https://files.pythonhosted.org/packages/b6/5a/b2ab6c18b4257e099587d5b7f903317bd7115333ad8d4ec4874278eafa61/numpy-2.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e6687dc183aa55dae4a705b35f9c0f8cb178bcaa2f029b241ac5356221d5c021", size = 16071920, upload-time = "2025-09-09T15:58:05.029Z" }, - { url = "https://files.pythonhosted.org/packages/b8/f1/8b3fdc44324a259298520dd82147ff648979bed085feeacc1250ef1656c0/numpy-2.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d8f3b1080782469fdc1718c4ed1d22549b5fb12af0d57d35e992158a772a37cf", size = 18577076, upload-time = "2025-09-09T15:58:07.745Z" }, - { url = "https://files.pythonhosted.org/packages/f0/a1/b87a284fb15a42e9274e7fcea0dad259d12ddbf07c1595b26883151ca3b4/numpy-2.3.3-cp314-cp314-win32.whl", hash = "sha256:cb248499b0bc3be66ebd6578b83e5acacf1d6cb2a77f2248ce0e40fbec5a76d0", size = 6366952, upload-time = "2025-09-09T15:58:10.096Z" }, - { url = "https://files.pythonhosted.org/packages/70/5f/1816f4d08f3b8f66576d8433a66f8fa35a5acfb3bbd0bf6c31183b003f3d/numpy-2.3.3-cp314-cp314-win_amd64.whl", hash = "sha256:691808c2b26b0f002a032c73255d0bd89751425f379f7bcd22d140db593a96e8", size = 12919322, upload-time = "2025-09-09T15:58:12.138Z" }, - { url = 
"https://files.pythonhosted.org/packages/8c/de/072420342e46a8ea41c324a555fa90fcc11637583fb8df722936aed1736d/numpy-2.3.3-cp314-cp314-win_arm64.whl", hash = "sha256:9ad12e976ca7b10f1774b03615a2a4bab8addce37ecc77394d8e986927dc0dfe", size = 10478630, upload-time = "2025-09-09T15:58:14.64Z" }, - { url = "https://files.pythonhosted.org/packages/d5/df/ee2f1c0a9de7347f14da5dd3cd3c3b034d1b8607ccb6883d7dd5c035d631/numpy-2.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9cc48e09feb11e1db00b320e9d30a4151f7369afb96bd0e48d942d09da3a0d00", size = 21047987, upload-time = "2025-09-09T15:58:16.889Z" }, - { url = "https://files.pythonhosted.org/packages/d6/92/9453bdc5a4e9e69cf4358463f25e8260e2ffc126d52e10038b9077815989/numpy-2.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:901bf6123879b7f251d3631967fd574690734236075082078e0571977c6a8e6a", size = 14301076, upload-time = "2025-09-09T15:58:20.343Z" }, - { url = "https://files.pythonhosted.org/packages/13/77/1447b9eb500f028bb44253105bd67534af60499588a5149a94f18f2ca917/numpy-2.3.3-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:7f025652034199c301049296b59fa7d52c7e625017cae4c75d8662e377bf487d", size = 5229491, upload-time = "2025-09-09T15:58:22.481Z" }, - { url = "https://files.pythonhosted.org/packages/3d/f9/d72221b6ca205f9736cb4b2ce3b002f6e45cd67cd6a6d1c8af11a2f0b649/numpy-2.3.3-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:533ca5f6d325c80b6007d4d7fb1984c303553534191024ec6a524a4c92a5935a", size = 6737913, upload-time = "2025-09-09T15:58:24.569Z" }, - { url = "https://files.pythonhosted.org/packages/3c/5f/d12834711962ad9c46af72f79bb31e73e416ee49d17f4c797f72c96b6ca5/numpy-2.3.3-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0edd58682a399824633b66885d699d7de982800053acf20be1eaa46d92009c54", size = 14352811, upload-time = "2025-09-09T15:58:26.416Z" }, - { url = 
"https://files.pythonhosted.org/packages/a1/0d/fdbec6629d97fd1bebed56cd742884e4eead593611bbe1abc3eb40d304b2/numpy-2.3.3-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:367ad5d8fbec5d9296d18478804a530f1191e24ab4d75ab408346ae88045d25e", size = 16702689, upload-time = "2025-09-09T15:58:28.831Z" }, - { url = "https://files.pythonhosted.org/packages/9b/09/0a35196dc5575adde1eb97ddfbc3e1687a814f905377621d18ca9bc2b7dd/numpy-2.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8f6ac61a217437946a1fa48d24c47c91a0c4f725237871117dea264982128097", size = 16133855, upload-time = "2025-09-09T15:58:31.349Z" }, - { url = "https://files.pythonhosted.org/packages/7a/ca/c9de3ea397d576f1b6753eaa906d4cdef1bf97589a6d9825a349b4729cc2/numpy-2.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:179a42101b845a816d464b6fe9a845dfaf308fdfc7925387195570789bb2c970", size = 18652520, upload-time = "2025-09-09T15:58:33.762Z" }, - { url = "https://files.pythonhosted.org/packages/fd/c2/e5ed830e08cd0196351db55db82f65bc0ab05da6ef2b72a836dcf1936d2f/numpy-2.3.3-cp314-cp314t-win32.whl", hash = "sha256:1250c5d3d2562ec4174bce2e3a1523041595f9b651065e4a4473f5f48a6bc8a5", size = 6515371, upload-time = "2025-09-09T15:58:36.04Z" }, - { url = "https://files.pythonhosted.org/packages/47/c7/b0f6b5b67f6788a0725f744496badbb604d226bf233ba716683ebb47b570/numpy-2.3.3-cp314-cp314t-win_amd64.whl", hash = "sha256:b37a0b2e5935409daebe82c1e42274d30d9dd355852529eab91dab8dcca7419f", size = 13112576, upload-time = "2025-09-09T15:58:37.927Z" }, - { url = "https://files.pythonhosted.org/packages/06/b9/33bba5ff6fb679aa0b1f8a07e853f002a6b04b9394db3069a1270a7784ca/numpy-2.3.3-cp314-cp314t-win_arm64.whl", hash = "sha256:78c9f6560dc7e6b3990e32df7ea1a50bbd0e2a111e05209963f5ddcab7073b0b", size = 10545953, upload-time = "2025-09-09T15:58:40.576Z" }, - { url = 
"https://files.pythonhosted.org/packages/b8/f2/7e0a37cfced2644c9563c529f29fa28acbd0960dde32ece683aafa6f4949/numpy-2.3.3-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:1e02c7159791cd481e1e6d5ddd766b62a4d5acf8df4d4d1afe35ee9c5c33a41e", size = 21131019, upload-time = "2025-09-09T15:58:42.838Z" }, - { url = "https://files.pythonhosted.org/packages/1a/7e/3291f505297ed63831135a6cc0f474da0c868a1f31b0dd9a9f03a7a0d2ed/numpy-2.3.3-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:dca2d0fc80b3893ae72197b39f69d55a3cd8b17ea1b50aa4c62de82419936150", size = 14376288, upload-time = "2025-09-09T15:58:45.425Z" }, - { url = "https://files.pythonhosted.org/packages/bf/4b/ae02e985bdeee73d7b5abdefeb98aef1207e96d4c0621ee0cf228ddfac3c/numpy-2.3.3-pp311-pypy311_pp73-macosx_14_0_arm64.whl", hash = "sha256:99683cbe0658f8271b333a1b1b4bb3173750ad59c0c61f5bbdc5b318918fffe3", size = 5305425, upload-time = "2025-09-09T15:58:48.6Z" }, - { url = "https://files.pythonhosted.org/packages/8b/eb/9df215d6d7250db32007941500dc51c48190be25f2401d5b2b564e467247/numpy-2.3.3-pp311-pypy311_pp73-macosx_14_0_x86_64.whl", hash = "sha256:d9d537a39cc9de668e5cd0e25affb17aec17b577c6b3ae8a3d866b479fbe88d0", size = 6819053, upload-time = "2025-09-09T15:58:50.401Z" }, - { url = "https://files.pythonhosted.org/packages/57/62/208293d7d6b2a8998a4a1f23ac758648c3c32182d4ce4346062018362e29/numpy-2.3.3-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8596ba2f8af5f93b01d97563832686d20206d303024777f6dfc2e7c7c3f1850e", size = 14420354, upload-time = "2025-09-09T15:58:52.704Z" }, - { url = "https://files.pythonhosted.org/packages/ed/0c/8e86e0ff7072e14a71b4c6af63175e40d1e7e933ce9b9e9f765a95b4e0c3/numpy-2.3.3-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e1ec5615b05369925bd1125f27df33f3b6c8bc10d788d5999ecd8769a1fa04db", size = 16760413, upload-time = "2025-09-09T15:58:55.027Z" }, - { url = 
"https://files.pythonhosted.org/packages/af/11/0cc63f9f321ccf63886ac203336777140011fb669e739da36d8db3c53b98/numpy-2.3.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:2e267c7da5bf7309670523896df97f93f6e469fb931161f483cd6882b3b1a5dc", size = 12971844, upload-time = "2025-09-09T15:58:57.359Z" }, -] - -[[package]] -name = "onnxruntime" -version = "1.22.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "coloredlogs" }, - { name = "flatbuffers" }, - { name = "numpy" }, - { name = "packaging" }, - { name = "protobuf" }, - { name = "sympy" }, -] -wheels = [ - { url = "https://files.pythonhosted.org/packages/82/ff/4a1a6747e039ef29a8d4ee4510060e9a805982b6da906a3da2306b7a3be6/onnxruntime-1.22.1-cp311-cp311-macosx_13_0_universal2.whl", hash = "sha256:f4581bccb786da68725d8eac7c63a8f31a89116b8761ff8b4989dc58b61d49a0", size = 34324148, upload-time = "2025-07-10T19:15:26.584Z" }, - { url = "https://files.pythonhosted.org/packages/0b/05/9f1929723f1cca8c9fb1b2b97ac54ce61362c7201434d38053ea36ee4225/onnxruntime-1.22.1-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7ae7526cf10f93454beb0f751e78e5cb7619e3b92f9fc3bd51aa6f3b7a8977e5", size = 14473779, upload-time = "2025-07-10T19:15:30.183Z" }, - { url = "https://files.pythonhosted.org/packages/59/f3/c93eb4167d4f36ea947930f82850231f7ce0900cb00e1a53dc4995b60479/onnxruntime-1.22.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f6effa1299ac549a05c784d50292e3378dbbf010346ded67400193b09ddc2f04", size = 16460799, upload-time = "2025-07-10T19:15:33.005Z" }, - { url = "https://files.pythonhosted.org/packages/a8/01/e536397b03e4462d3260aee5387e6f606c8fa9d2b20b1728f988c3c72891/onnxruntime-1.22.1-cp311-cp311-win_amd64.whl", hash = "sha256:f28a42bb322b4ca6d255531bb334a2b3e21f172e37c1741bd5e66bc4b7b61f03", size = 12689881, upload-time = "2025-07-10T19:15:35.501Z" }, - { url = 
"https://files.pythonhosted.org/packages/48/70/ca2a4d38a5deccd98caa145581becb20c53684f451e89eb3a39915620066/onnxruntime-1.22.1-cp312-cp312-macosx_13_0_universal2.whl", hash = "sha256:a938d11c0dc811badf78e435daa3899d9af38abee950d87f3ab7430eb5b3cf5a", size = 34342883, upload-time = "2025-07-10T19:15:38.223Z" }, - { url = "https://files.pythonhosted.org/packages/29/e5/00b099b4d4f6223b610421080d0eed9327ef9986785c9141819bbba0d396/onnxruntime-1.22.1-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:984cea2a02fcc5dfea44ade9aca9fe0f7a8a2cd6f77c258fc4388238618f3928", size = 14473861, upload-time = "2025-07-10T19:15:42.911Z" }, - { url = "https://files.pythonhosted.org/packages/0a/50/519828a5292a6ccd8d5cd6d2f72c6b36ea528a2ef68eca69647732539ffa/onnxruntime-1.22.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2d39a530aff1ec8d02e365f35e503193991417788641b184f5b1e8c9a6d5ce8d", size = 16475713, upload-time = "2025-07-10T19:15:45.452Z" }, - { url = "https://files.pythonhosted.org/packages/5d/54/7139d463bb0a312890c9a5db87d7815d4a8cce9e6f5f28d04f0b55fcb160/onnxruntime-1.22.1-cp312-cp312-win_amd64.whl", hash = "sha256:6a64291d57ea966a245f749eb970f4fa05a64d26672e05a83fdb5db6b7d62f87", size = 12690910, upload-time = "2025-07-10T19:15:47.478Z" }, - { url = "https://files.pythonhosted.org/packages/e0/39/77cefa829740bd830915095d8408dce6d731b244e24b1f64fe3df9f18e86/onnxruntime-1.22.1-cp313-cp313-macosx_13_0_universal2.whl", hash = "sha256:d29c7d87b6cbed8fecfd09dca471832384d12a69e1ab873e5effbb94adc3e966", size = 34342026, upload-time = "2025-07-10T19:15:50.266Z" }, - { url = "https://files.pythonhosted.org/packages/d2/a6/444291524cb52875b5de980a6e918072514df63a57a7120bf9dfae3aeed1/onnxruntime-1.22.1-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:460487d83b7056ba98f1f7bac80287224c31d8149b15712b0d6f5078fcc33d0f", size = 14474014, upload-time = "2025-07-10T19:15:53.991Z" }, - { url = 
"https://files.pythonhosted.org/packages/87/9d/45a995437879c18beff26eacc2322f4227224d04c6ac3254dce2e8950190/onnxruntime-1.22.1-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b0c37070268ba4e02a1a9d28560cd00cd1e94f0d4f275cbef283854f861a65fa", size = 16475427, upload-time = "2025-07-10T19:15:56.067Z" }, - { url = "https://files.pythonhosted.org/packages/4c/06/9c765e66ad32a7e709ce4cb6b95d7eaa9cb4d92a6e11ea97c20ffecaf765/onnxruntime-1.22.1-cp313-cp313-win_amd64.whl", hash = "sha256:70980d729145a36a05f74b573435531f55ef9503bcda81fc6c3d6b9306199982", size = 12690841, upload-time = "2025-07-10T19:15:58.337Z" }, - { url = "https://files.pythonhosted.org/packages/52/8c/02af24ee1c8dce4e6c14a1642a7a56cebe323d2fa01d9a360a638f7e4b75/onnxruntime-1.22.1-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:33a7980bbc4b7f446bac26c3785652fe8730ed02617d765399e89ac7d44e0f7d", size = 14479333, upload-time = "2025-07-10T19:16:00.544Z" }, - { url = "https://files.pythonhosted.org/packages/5d/15/d75fd66aba116ce3732bb1050401394c5ec52074c4f7ee18db8838dd4667/onnxruntime-1.22.1-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6e7e823624b015ea879d976cbef8bfaed2f7e2cc233d7506860a76dd37f8f381", size = 16477261, upload-time = "2025-07-10T19:16:03.226Z" }, -] - -[[package]] -name = "openai" -version = "1.108.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, - { name = "distro" }, - { name = "httpx" }, - { name = "jiter" }, - { name = "pydantic" }, - { name = "sniffio" }, - { name = "tqdm" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/07/3c/3ea4c40c62d5f4b11690de13de35554d0d49b5e5780669fad5e83562d635/openai-1.108.0.tar.gz", hash = "sha256:e859c64e4202d7f5956f19280eee92bb281f211c41cdd5be9e63bf51a024ff72", size = 564659, upload-time = "2025-09-17T22:03:23.075Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/af/dc/0a007b7c5a079e13d66eecc5d521bbc67b53c135e2a3131160ef76b5db1f/openai-1.108.0-py3-none-any.whl", hash = "sha256:31f2e58230e2703f13ddbb50c285f39dacf7fca64ab19882fd8a7a0b2bccd781", size = 948114, upload-time = "2025-09-17T22:03:20.972Z" }, -] - -[[package]] -name = "openapi-core" -version = "0.19.5" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "isodate" }, - { name = "jsonschema" }, - { name = "jsonschema-path" }, - { name = "more-itertools" }, - { name = "openapi-schema-validator" }, - { name = "openapi-spec-validator" }, - { name = "parse" }, - { name = "typing-extensions" }, - { name = "werkzeug" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/b1/35/1acaa5f2fcc6e54eded34a2ec74b479439c4e469fc4e8d0e803fda0234db/openapi_core-0.19.5.tar.gz", hash = "sha256:421e753da56c391704454e66afe4803a290108590ac8fa6f4a4487f4ec11f2d3", size = 103264, upload-time = "2025-03-20T20:17:28.193Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/27/6f/83ead0e2e30a90445ee4fc0135f43741aebc30cca5b43f20968b603e30b6/openapi_core-0.19.5-py3-none-any.whl", hash = "sha256:ef7210e83a59394f46ce282639d8d26ad6fc8094aa904c9c16eb1bac8908911f", size = 106595, upload-time = "2025-03-20T20:17:26.77Z" }, -] - -[[package]] -name = "openapi-pydantic" -version = "0.5.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pydantic" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/02/2e/58d83848dd1a79cb92ed8e63f6ba901ca282c5f09d04af9423ec26c56fd7/openapi_pydantic-0.5.1.tar.gz", hash = "sha256:ff6835af6bde7a459fb93eb93bb92b8749b754fc6e51b2f1590a19dc3005ee0d", size = 60892, upload-time = "2025-01-08T19:29:27.083Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/12/cf/03675d8bd8ecbf4445504d8071adab19f5f993676795708e36402ab38263/openapi_pydantic-0.5.1-py3-none-any.whl", hash = 
"sha256:a3a09ef4586f5bd760a8df7f43028b60cafb6d9f61de2acba9574766255ab146", size = 96381, upload-time = "2025-01-08T19:29:25.275Z" }, -] - -[[package]] -name = "openapi-schema-validator" -version = "0.6.3" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "jsonschema" }, - { name = "jsonschema-specifications" }, - { name = "rfc3339-validator" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/8b/f3/5507ad3325169347cd8ced61c232ff3df70e2b250c49f0fe140edb4973c6/openapi_schema_validator-0.6.3.tar.gz", hash = "sha256:f37bace4fc2a5d96692f4f8b31dc0f8d7400fd04f3a937798eaf880d425de6ee", size = 11550, upload-time = "2025-01-10T18:08:22.268Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/21/c6/ad0fba32775ae749016829dace42ed80f4407b171da41313d1a3a5f102e4/openapi_schema_validator-0.6.3-py3-none-any.whl", hash = "sha256:f3b9870f4e556b5a62a1c39da72a6b4b16f3ad9c73dc80084b1b11e74ba148a3", size = 8755, upload-time = "2025-01-10T18:08:19.758Z" }, -] - -[[package]] -name = "openapi-spec-validator" -version = "0.7.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "jsonschema" }, - { name = "jsonschema-path" }, - { name = "lazy-object-proxy" }, - { name = "openapi-schema-validator" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/82/af/fe2d7618d6eae6fb3a82766a44ed87cd8d6d82b4564ed1c7cfb0f6378e91/openapi_spec_validator-0.7.2.tar.gz", hash = "sha256:cc029309b5c5dbc7859df0372d55e9d1ff43e96d678b9ba087f7c56fc586f734", size = 36855, upload-time = "2025-06-07T14:48:56.299Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/27/dd/b3fd642260cb17532f66cc1e8250f3507d1e580483e209dc1e9d13bd980d/openapi_spec_validator-0.7.2-py3-none-any.whl", hash = "sha256:4bbdc0894ec85f1d1bea1d6d9c8b2c3c8d7ccaa13577ef40da9c006c9fd0eb60", size = 39713, upload-time = "2025-06-07T14:48:54.077Z" }, -] - -[[package]] -name = "opentelemetry-api" -version = "1.37.0" -source = { registry = 
"https://pypi.org/simple" } -dependencies = [ - { name = "importlib-metadata" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/63/04/05040d7ce33a907a2a02257e601992f0cdf11c73b33f13c4492bf6c3d6d5/opentelemetry_api-1.37.0.tar.gz", hash = "sha256:540735b120355bd5112738ea53621f8d5edb35ebcd6fe21ada3ab1c61d1cd9a7", size = 64923, upload-time = "2025-09-11T10:29:01.662Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/91/48/28ed9e55dcf2f453128df738210a980e09f4e468a456fa3c763dbc8be70a/opentelemetry_api-1.37.0-py3-none-any.whl", hash = "sha256:accf2024d3e89faec14302213bc39550ec0f4095d1cf5ca688e1bfb1c8612f47", size = 65732, upload-time = "2025-09-11T10:28:41.826Z" }, -] - -[[package]] -name = "opentelemetry-exporter-gcp-trace" -version = "1.9.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "google-cloud-trace" }, - { name = "opentelemetry-api" }, - { name = "opentelemetry-resourcedetector-gcp" }, - { name = "opentelemetry-sdk" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/c3/15/7556d54b01fb894497f69a98d57faa9caa45ffa59896e0bba6847a7f0d15/opentelemetry_exporter_gcp_trace-1.9.0.tar.gz", hash = "sha256:c3fc090342f6ee32a0cc41a5716a6bb716b4422d19facefcb22dc4c6b683ece8", size = 18568, upload-time = "2025-02-04T19:45:08.185Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c0/cd/6d7fbad05771eb3c2bace20f6360ce5dac5ca751c6f2122853e43830c32e/opentelemetry_exporter_gcp_trace-1.9.0-py3-none-any.whl", hash = "sha256:0a8396e8b39f636eeddc3f0ae08ddb40c40f288bc8c5544727c3581545e77254", size = 13973, upload-time = "2025-02-04T19:44:59.148Z" }, -] - -[[package]] -name = "opentelemetry-exporter-otlp-proto-common" -version = "1.37.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "opentelemetry-proto" }, -] -sdist = { url = 
"https://files.pythonhosted.org/packages/dc/6c/10018cbcc1e6fff23aac67d7fd977c3d692dbe5f9ef9bb4db5c1268726cc/opentelemetry_exporter_otlp_proto_common-1.37.0.tar.gz", hash = "sha256:c87a1bdd9f41fdc408d9cc9367bb53f8d2602829659f2b90be9f9d79d0bfe62c", size = 20430, upload-time = "2025-09-11T10:29:03.605Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/08/13/b4ef09837409a777f3c0af2a5b4ba9b7af34872bc43609dda0c209e4060d/opentelemetry_exporter_otlp_proto_common-1.37.0-py3-none-any.whl", hash = "sha256:53038428449c559b0c564b8d718df3314da387109c4d36bd1b94c9a641b0292e", size = 18359, upload-time = "2025-09-11T10:28:44.939Z" }, -] - -[[package]] -name = "opentelemetry-exporter-otlp-proto-http" -version = "1.37.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "googleapis-common-protos" }, - { name = "opentelemetry-api" }, - { name = "opentelemetry-exporter-otlp-proto-common" }, - { name = "opentelemetry-proto" }, - { name = "opentelemetry-sdk" }, - { name = "requests" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5d/e3/6e320aeb24f951449e73867e53c55542bebbaf24faeee7623ef677d66736/opentelemetry_exporter_otlp_proto_http-1.37.0.tar.gz", hash = "sha256:e52e8600f1720d6de298419a802108a8f5afa63c96809ff83becb03f874e44ac", size = 17281, upload-time = "2025-09-11T10:29:04.844Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e9/e9/70d74a664d83976556cec395d6bfedd9b85ec1498b778367d5f93e373397/opentelemetry_exporter_otlp_proto_http-1.37.0-py3-none-any.whl", hash = "sha256:54c42b39945a6cc9d9a2a33decb876eabb9547e0dcb49df090122773447f1aef", size = 19576, upload-time = "2025-09-11T10:28:46.726Z" }, -] - -[[package]] -name = "opentelemetry-instrumentation" -version = "0.58b0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "opentelemetry-api" }, - { name = "opentelemetry-semantic-conventions" }, - { name = "packaging" }, - { name = "wrapt" }, -] 
-sdist = { url = "https://files.pythonhosted.org/packages/f6/36/7c307d9be8ce4ee7beb86d7f1d31027f2a6a89228240405a858d6e4d64f9/opentelemetry_instrumentation-0.58b0.tar.gz", hash = "sha256:df640f3ac715a3e05af145c18f527f4422c6ab6c467e40bd24d2ad75a00cb705", size = 31549, upload-time = "2025-09-11T11:42:14.084Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d4/db/5ff1cd6c5ca1d12ecf1b73be16fbb2a8af2114ee46d4b0e6d4b23f4f4db7/opentelemetry_instrumentation-0.58b0-py3-none-any.whl", hash = "sha256:50f97ac03100676c9f7fc28197f8240c7290ca1baa12da8bfbb9a1de4f34cc45", size = 33019, upload-time = "2025-09-11T11:41:00.624Z" }, -] - -[[package]] -name = "opentelemetry-proto" -version = "1.37.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/dd/ea/a75f36b463a36f3c5a10c0b5292c58b31dbdde74f6f905d3d0ab2313987b/opentelemetry_proto-1.37.0.tar.gz", hash = "sha256:30f5c494faf66f77faeaefa35ed4443c5edb3b0aa46dad073ed7210e1a789538", size = 46151, upload-time = "2025-09-11T10:29:11.04Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c4/25/f89ea66c59bd7687e218361826c969443c4fa15dfe89733f3bf1e2a9e971/opentelemetry_proto-1.37.0-py3-none-any.whl", hash = "sha256:8ed8c066ae8828bbf0c39229979bdf583a126981142378a9cbe9d6fd5701c6e2", size = 72534, upload-time = "2025-09-11T10:28:56.831Z" }, -] - -[[package]] -name = "opentelemetry-resourcedetector-gcp" -version = "1.9.0a0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "opentelemetry-api" }, - { name = "opentelemetry-sdk" }, - { name = "requests" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/e1/86/f0693998817779802525a5bcc885a3cdb68d05b636bc6faae5c9ade4bee4/opentelemetry_resourcedetector_gcp-1.9.0a0.tar.gz", hash = "sha256:6860a6649d1e3b9b7b7f09f3918cc16b72aa0c0c590d2a72ea6e42b67c9a42e7", size = 20730, upload-time = 
"2025-02-04T19:45:10.693Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/12/04/7e33228c88422a5518e1774a836c9ec68f10f51bde0f1d5dd5f3054e612a/opentelemetry_resourcedetector_gcp-1.9.0a0-py3-none-any.whl", hash = "sha256:4e5a0822b0f0d7647b7ceb282d7aa921dd7f45466540bd0a24f954f90db8fde8", size = 20378, upload-time = "2025-02-04T19:45:03.898Z" }, -] - -[[package]] -name = "opentelemetry-sdk" -version = "1.37.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "opentelemetry-api" }, - { name = "opentelemetry-semantic-conventions" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/f4/62/2e0ca80d7fe94f0b193135375da92c640d15fe81f636658d2acf373086bc/opentelemetry_sdk-1.37.0.tar.gz", hash = "sha256:cc8e089c10953ded765b5ab5669b198bbe0af1b3f89f1007d19acd32dc46dda5", size = 170404, upload-time = "2025-09-11T10:29:11.779Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/9f/62/9f4ad6a54126fb00f7ed4bb5034964c6e4f00fcd5a905e115bd22707e20d/opentelemetry_sdk-1.37.0-py3-none-any.whl", hash = "sha256:8f3c3c22063e52475c5dbced7209495c2c16723d016d39287dfc215d1771257c", size = 131941, upload-time = "2025-09-11T10:28:57.83Z" }, -] - -[[package]] -name = "opentelemetry-semantic-conventions" -version = "0.58b0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "opentelemetry-api" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/aa/1b/90701d91e6300d9f2fb352153fb1721ed99ed1f6ea14fa992c756016e63a/opentelemetry_semantic_conventions-0.58b0.tar.gz", hash = "sha256:6bd46f51264279c433755767bb44ad00f1c9e2367e1b42af563372c5a6fa0c25", size = 129867, upload-time = "2025-09-11T10:29:12.597Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/07/90/68152b7465f50285d3ce2481b3aec2f82822e3f52e5152eeeaf516bab841/opentelemetry_semantic_conventions-0.58b0-py3-none-any.whl", hash = 
"sha256:5564905ab1458b96684db1340232729fce3b5375a06e140e8904c78e4f815b28", size = 207954, upload-time = "2025-09-11T10:28:59.218Z" }, -] - -[[package]] -name = "ordered-set" -version = "4.1.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/4c/ca/bfac8bc689799bcca4157e0e0ced07e70ce125193fc2e166d2e685b7e2fe/ordered-set-4.1.0.tar.gz", hash = "sha256:694a8e44c87657c59292ede72891eb91d34131f6531463aab3009191c77364a8", size = 12826, upload-time = "2022-01-26T14:38:56.6Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/33/55/af02708f230eb77084a299d7b08175cff006dea4f2721074b92cdb0296c0/ordered_set-4.1.0-py3-none-any.whl", hash = "sha256:046e1132c71fcf3330438a539928932caf51ddbc582496833e23de611de14562", size = 7634, upload-time = "2022-01-26T14:38:48.677Z" }, -] - -[[package]] -name = "orjson" -version = "3.11.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/be/4d/8df5f83256a809c22c4d6792ce8d43bb503be0fb7a8e4da9025754b09658/orjson-3.11.3.tar.gz", hash = "sha256:1c0603b1d2ffcd43a411d64797a19556ef76958aef1c182f22dc30860152a98a", size = 5482394, upload-time = "2025-08-26T17:46:43.171Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/cd/8b/360674cd817faef32e49276187922a946468579fcaf37afdfb6c07046e92/orjson-3.11.3-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:9d2ae0cc6aeb669633e0124531f342a17d8e97ea999e42f12a5ad4adaa304c5f", size = 238238, upload-time = "2025-08-26T17:44:54.214Z" }, - { url = "https://files.pythonhosted.org/packages/05/3d/5fa9ea4b34c1a13be7d9046ba98d06e6feb1d8853718992954ab59d16625/orjson-3.11.3-cp311-cp311-macosx_15_0_arm64.whl", hash = "sha256:ba21dbb2493e9c653eaffdc38819b004b7b1b246fb77bfc93dc016fe664eac91", size = 127713, upload-time = "2025-08-26T17:44:55.596Z" }, - { url = 
"https://files.pythonhosted.org/packages/e5/5f/e18367823925e00b1feec867ff5f040055892fc474bf5f7875649ecfa586/orjson-3.11.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:00f1a271e56d511d1569937c0447d7dce5a99a33ea0dec76673706360a051904", size = 123241, upload-time = "2025-08-26T17:44:57.185Z" }, - { url = "https://files.pythonhosted.org/packages/0f/bd/3c66b91c4564759cf9f473251ac1650e446c7ba92a7c0f9f56ed54f9f0e6/orjson-3.11.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b67e71e47caa6680d1b6f075a396d04fa6ca8ca09aafb428731da9b3ea32a5a6", size = 127895, upload-time = "2025-08-26T17:44:58.349Z" }, - { url = "https://files.pythonhosted.org/packages/82/b5/dc8dcd609db4766e2967a85f63296c59d4722b39503e5b0bf7fd340d387f/orjson-3.11.3-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d7d012ebddffcce8c85734a6d9e5f08180cd3857c5f5a3ac70185b43775d043d", size = 130303, upload-time = "2025-08-26T17:44:59.491Z" }, - { url = "https://files.pythonhosted.org/packages/48/c2/d58ec5fd1270b2aa44c862171891adc2e1241bd7dab26c8f46eb97c6c6f1/orjson-3.11.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dd759f75d6b8d1b62012b7f5ef9461d03c804f94d539a5515b454ba3a6588038", size = 132366, upload-time = "2025-08-26T17:45:00.654Z" }, - { url = "https://files.pythonhosted.org/packages/73/87/0ef7e22eb8dd1ef940bfe3b9e441db519e692d62ed1aae365406a16d23d0/orjson-3.11.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6890ace0809627b0dff19cfad92d69d0fa3f089d3e359a2a532507bb6ba34efb", size = 135180, upload-time = "2025-08-26T17:45:02.424Z" }, - { url = "https://files.pythonhosted.org/packages/bb/6a/e5bf7b70883f374710ad74faf99bacfc4b5b5a7797c1d5e130350e0e28a3/orjson-3.11.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f9d4a5e041ae435b815e568537755773d05dac031fee6a57b4ba70897a44d9d2", size = 132741, upload-time = "2025-08-26T17:45:03.663Z" }, - { url 
= "https://files.pythonhosted.org/packages/bd/0c/4577fd860b6386ffaa56440e792af01c7882b56d2766f55384b5b0e9d39b/orjson-3.11.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:2d68bf97a771836687107abfca089743885fb664b90138d8761cce61d5625d55", size = 131104, upload-time = "2025-08-26T17:45:04.939Z" }, - { url = "https://files.pythonhosted.org/packages/66/4b/83e92b2d67e86d1c33f2ea9411742a714a26de63641b082bdbf3d8e481af/orjson-3.11.3-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:bfc27516ec46f4520b18ef645864cee168d2a027dbf32c5537cb1f3e3c22dac1", size = 403887, upload-time = "2025-08-26T17:45:06.228Z" }, - { url = "https://files.pythonhosted.org/packages/6d/e5/9eea6a14e9b5ceb4a271a1fd2e1dec5f2f686755c0fab6673dc6ff3433f4/orjson-3.11.3-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:f66b001332a017d7945e177e282a40b6997056394e3ed7ddb41fb1813b83e824", size = 145855, upload-time = "2025-08-26T17:45:08.338Z" }, - { url = "https://files.pythonhosted.org/packages/45/78/8d4f5ad0c80ba9bf8ac4d0fc71f93a7d0dc0844989e645e2074af376c307/orjson-3.11.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:212e67806525d2561efbfe9e799633b17eb668b8964abed6b5319b2f1cfbae1f", size = 135361, upload-time = "2025-08-26T17:45:09.625Z" }, - { url = "https://files.pythonhosted.org/packages/0b/5f/16386970370178d7a9b438517ea3d704efcf163d286422bae3b37b88dbb5/orjson-3.11.3-cp311-cp311-win32.whl", hash = "sha256:6e8e0c3b85575a32f2ffa59de455f85ce002b8bdc0662d6b9c2ed6d80ab5d204", size = 136190, upload-time = "2025-08-26T17:45:10.962Z" }, - { url = "https://files.pythonhosted.org/packages/09/60/db16c6f7a41dd8ac9fb651f66701ff2aeb499ad9ebc15853a26c7c152448/orjson-3.11.3-cp311-cp311-win_amd64.whl", hash = "sha256:6be2f1b5d3dc99a5ce5ce162fc741c22ba9f3443d3dd586e6a1211b7bc87bc7b", size = 131389, upload-time = "2025-08-26T17:45:12.285Z" }, - { url = "https://files.pythonhosted.org/packages/3e/2a/bb811ad336667041dea9b8565c7c9faf2f59b47eb5ab680315eea612ef2e/orjson-3.11.3-cp311-cp311-win_arm64.whl", 
hash = "sha256:fafb1a99d740523d964b15c8db4eabbfc86ff29f84898262bf6e3e4c9e97e43e", size = 126120, upload-time = "2025-08-26T17:45:13.515Z" }, - { url = "https://files.pythonhosted.org/packages/3d/b0/a7edab2a00cdcb2688e1c943401cb3236323e7bfd2839815c6131a3742f4/orjson-3.11.3-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:8c752089db84333e36d754c4baf19c0e1437012242048439c7e80eb0e6426e3b", size = 238259, upload-time = "2025-08-26T17:45:15.093Z" }, - { url = "https://files.pythonhosted.org/packages/e1/c6/ff4865a9cc398a07a83342713b5932e4dc3cb4bf4bc04e8f83dedfc0d736/orjson-3.11.3-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:9b8761b6cf04a856eb544acdd82fc594b978f12ac3602d6374a7edb9d86fd2c2", size = 127633, upload-time = "2025-08-26T17:45:16.417Z" }, - { url = "https://files.pythonhosted.org/packages/6e/e6/e00bea2d9472f44fe8794f523e548ce0ad51eb9693cf538a753a27b8bda4/orjson-3.11.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b13974dc8ac6ba22feaa867fc19135a3e01a134b4f7c9c28162fed4d615008a", size = 123061, upload-time = "2025-08-26T17:45:17.673Z" }, - { url = "https://files.pythonhosted.org/packages/54/31/9fbb78b8e1eb3ac605467cb846e1c08d0588506028b37f4ee21f978a51d4/orjson-3.11.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f83abab5bacb76d9c821fd5c07728ff224ed0e52d7a71b7b3de822f3df04e15c", size = 127956, upload-time = "2025-08-26T17:45:19.172Z" }, - { url = "https://files.pythonhosted.org/packages/36/88/b0604c22af1eed9f98d709a96302006915cfd724a7ebd27d6dd11c22d80b/orjson-3.11.3-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e6fbaf48a744b94091a56c62897b27c31ee2da93d826aa5b207131a1e13d4064", size = 130790, upload-time = "2025-08-26T17:45:20.586Z" }, - { url = "https://files.pythonhosted.org/packages/0e/9d/1c1238ae9fffbfed51ba1e507731b3faaf6b846126a47e9649222b0fd06f/orjson-3.11.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", 
hash = "sha256:bc779b4f4bba2847d0d2940081a7b6f7b5877e05408ffbb74fa1faf4a136c424", size = 132385, upload-time = "2025-08-26T17:45:22.036Z" }, - { url = "https://files.pythonhosted.org/packages/a3/b5/c06f1b090a1c875f337e21dd71943bc9d84087f7cdf8c6e9086902c34e42/orjson-3.11.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd4b909ce4c50faa2192da6bb684d9848d4510b736b0611b6ab4020ea6fd2d23", size = 135305, upload-time = "2025-08-26T17:45:23.4Z" }, - { url = "https://files.pythonhosted.org/packages/a0/26/5f028c7d81ad2ebbf84414ba6d6c9cac03f22f5cd0d01eb40fb2d6a06b07/orjson-3.11.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:524b765ad888dc5518bbce12c77c2e83dee1ed6b0992c1790cc5fb49bb4b6667", size = 132875, upload-time = "2025-08-26T17:45:25.182Z" }, - { url = "https://files.pythonhosted.org/packages/fe/d4/b8df70d9cfb56e385bf39b4e915298f9ae6c61454c8154a0f5fd7efcd42e/orjson-3.11.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:84fd82870b97ae3cdcea9d8746e592b6d40e1e4d4527835fc520c588d2ded04f", size = 130940, upload-time = "2025-08-26T17:45:27.209Z" }, - { url = "https://files.pythonhosted.org/packages/da/5e/afe6a052ebc1a4741c792dd96e9f65bf3939d2094e8b356503b68d48f9f5/orjson-3.11.3-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:fbecb9709111be913ae6879b07bafd4b0785b44c1eb5cac8ac76da048b3885a1", size = 403852, upload-time = "2025-08-26T17:45:28.478Z" }, - { url = "https://files.pythonhosted.org/packages/f8/90/7bbabafeb2ce65915e9247f14a56b29c9334003536009ef5b122783fe67e/orjson-3.11.3-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:9dba358d55aee552bd868de348f4736ca5a4086d9a62e2bfbbeeb5629fe8b0cc", size = 146293, upload-time = "2025-08-26T17:45:29.86Z" }, - { url = "https://files.pythonhosted.org/packages/27/b3/2d703946447da8b093350570644a663df69448c9d9330e5f1d9cce997f20/orjson-3.11.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:eabcf2e84f1d7105f84580e03012270c7e97ecb1fb1618bda395061b2a84a049", size = 
135470, upload-time = "2025-08-26T17:45:31.243Z" }, - { url = "https://files.pythonhosted.org/packages/38/70/b14dcfae7aff0e379b0119c8a812f8396678919c431efccc8e8a0263e4d9/orjson-3.11.3-cp312-cp312-win32.whl", hash = "sha256:3782d2c60b8116772aea8d9b7905221437fdf53e7277282e8d8b07c220f96cca", size = 136248, upload-time = "2025-08-26T17:45:32.567Z" }, - { url = "https://files.pythonhosted.org/packages/35/b8/9e3127d65de7fff243f7f3e53f59a531bf6bb295ebe5db024c2503cc0726/orjson-3.11.3-cp312-cp312-win_amd64.whl", hash = "sha256:79b44319268af2eaa3e315b92298de9a0067ade6e6003ddaef72f8e0bedb94f1", size = 131437, upload-time = "2025-08-26T17:45:34.949Z" }, - { url = "https://files.pythonhosted.org/packages/51/92/a946e737d4d8a7fd84a606aba96220043dcc7d6988b9e7551f7f6d5ba5ad/orjson-3.11.3-cp312-cp312-win_arm64.whl", hash = "sha256:0e92a4e83341ef79d835ca21b8bd13e27c859e4e9e4d7b63defc6e58462a3710", size = 125978, upload-time = "2025-08-26T17:45:36.422Z" }, - { url = "https://files.pythonhosted.org/packages/fc/79/8932b27293ad35919571f77cb3693b5906cf14f206ef17546052a241fdf6/orjson-3.11.3-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:af40c6612fd2a4b00de648aa26d18186cd1322330bd3a3cc52f87c699e995810", size = 238127, upload-time = "2025-08-26T17:45:38.146Z" }, - { url = "https://files.pythonhosted.org/packages/1c/82/cb93cd8cf132cd7643b30b6c5a56a26c4e780c7a145db6f83de977b540ce/orjson-3.11.3-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:9f1587f26c235894c09e8b5b7636a38091a9e6e7fe4531937534749c04face43", size = 127494, upload-time = "2025-08-26T17:45:39.57Z" }, - { url = "https://files.pythonhosted.org/packages/a4/b8/2d9eb181a9b6bb71463a78882bcac1027fd29cf62c38a40cc02fc11d3495/orjson-3.11.3-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:61dcdad16da5bb486d7227a37a2e789c429397793a6955227cedbd7252eb5a27", size = 123017, upload-time = "2025-08-26T17:45:40.876Z" }, - { url = 
"https://files.pythonhosted.org/packages/b4/14/a0e971e72d03b509190232356d54c0f34507a05050bd026b8db2bf2c192c/orjson-3.11.3-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:11c6d71478e2cbea0a709e8a06365fa63da81da6498a53e4c4f065881d21ae8f", size = 127898, upload-time = "2025-08-26T17:45:42.188Z" }, - { url = "https://files.pythonhosted.org/packages/8e/af/dc74536722b03d65e17042cc30ae586161093e5b1f29bccda24765a6ae47/orjson-3.11.3-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ff94112e0098470b665cb0ed06efb187154b63649403b8d5e9aedeb482b4548c", size = 130742, upload-time = "2025-08-26T17:45:43.511Z" }, - { url = "https://files.pythonhosted.org/packages/62/e6/7a3b63b6677bce089fe939353cda24a7679825c43a24e49f757805fc0d8a/orjson-3.11.3-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae8b756575aaa2a855a75192f356bbda11a89169830e1439cfb1a3e1a6dde7be", size = 132377, upload-time = "2025-08-26T17:45:45.525Z" }, - { url = "https://files.pythonhosted.org/packages/fc/cd/ce2ab93e2e7eaf518f0fd15e3068b8c43216c8a44ed82ac2b79ce5cef72d/orjson-3.11.3-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c9416cc19a349c167ef76135b2fe40d03cea93680428efee8771f3e9fb66079d", size = 135313, upload-time = "2025-08-26T17:45:46.821Z" }, - { url = "https://files.pythonhosted.org/packages/d0/b4/f98355eff0bd1a38454209bbc73372ce351ba29933cb3e2eba16c04b9448/orjson-3.11.3-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b822caf5b9752bc6f246eb08124c3d12bf2175b66ab74bac2ef3bbf9221ce1b2", size = 132908, upload-time = "2025-08-26T17:45:48.126Z" }, - { url = "https://files.pythonhosted.org/packages/eb/92/8f5182d7bc2a1bed46ed960b61a39af8389f0ad476120cd99e67182bfb6d/orjson-3.11.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:414f71e3bdd5573893bf5ecdf35c32b213ed20aa15536fe2f588f946c318824f", size = 130905, upload-time = "2025-08-26T17:45:49.414Z" }, - { url = 
"https://files.pythonhosted.org/packages/1a/60/c41ca753ce9ffe3d0f67b9b4c093bdd6e5fdb1bc53064f992f66bb99954d/orjson-3.11.3-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:828e3149ad8815dc14468f36ab2a4b819237c155ee1370341b91ea4c8672d2ee", size = 403812, upload-time = "2025-08-26T17:45:51.085Z" }, - { url = "https://files.pythonhosted.org/packages/dd/13/e4a4f16d71ce1868860db59092e78782c67082a8f1dc06a3788aef2b41bc/orjson-3.11.3-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ac9e05f25627ffc714c21f8dfe3a579445a5c392a9c8ae7ba1d0e9fb5333f56e", size = 146277, upload-time = "2025-08-26T17:45:52.851Z" }, - { url = "https://files.pythonhosted.org/packages/8d/8b/bafb7f0afef9344754a3a0597a12442f1b85a048b82108ef2c956f53babd/orjson-3.11.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e44fbe4000bd321d9f3b648ae46e0196d21577cf66ae684a96ff90b1f7c93633", size = 135418, upload-time = "2025-08-26T17:45:54.806Z" }, - { url = "https://files.pythonhosted.org/packages/60/d4/bae8e4f26afb2c23bea69d2f6d566132584d1c3a5fe89ee8c17b718cab67/orjson-3.11.3-cp313-cp313-win32.whl", hash = "sha256:2039b7847ba3eec1f5886e75e6763a16e18c68a63efc4b029ddf994821e2e66b", size = 136216, upload-time = "2025-08-26T17:45:57.182Z" }, - { url = "https://files.pythonhosted.org/packages/88/76/224985d9f127e121c8cad882cea55f0ebe39f97925de040b75ccd4b33999/orjson-3.11.3-cp313-cp313-win_amd64.whl", hash = "sha256:29be5ac4164aa8bdcba5fa0700a3c9c316b411d8ed9d39ef8a882541bd452fae", size = 131362, upload-time = "2025-08-26T17:45:58.56Z" }, - { url = "https://files.pythonhosted.org/packages/e2/cf/0dce7a0be94bd36d1346be5067ed65ded6adb795fdbe3abd234c8d576d01/orjson-3.11.3-cp313-cp313-win_arm64.whl", hash = "sha256:18bd1435cb1f2857ceb59cfb7de6f92593ef7b831ccd1b9bfb28ca530e539dce", size = 125989, upload-time = "2025-08-26T17:45:59.95Z" }, - { url = 
"https://files.pythonhosted.org/packages/ef/77/d3b1fef1fc6aaeed4cbf3be2b480114035f4df8fa1a99d2dac1d40d6e924/orjson-3.11.3-cp314-cp314-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:cf4b81227ec86935568c7edd78352a92e97af8da7bd70bdfdaa0d2e0011a1ab4", size = 238115, upload-time = "2025-08-26T17:46:01.669Z" }, - { url = "https://files.pythonhosted.org/packages/e4/6d/468d21d49bb12f900052edcfbf52c292022d0a323d7828dc6376e6319703/orjson-3.11.3-cp314-cp314-macosx_15_0_arm64.whl", hash = "sha256:bc8bc85b81b6ac9fc4dae393a8c159b817f4c2c9dee5d12b773bddb3b95fc07e", size = 127493, upload-time = "2025-08-26T17:46:03.466Z" }, - { url = "https://files.pythonhosted.org/packages/67/46/1e2588700d354aacdf9e12cc2d98131fb8ac6f31ca65997bef3863edb8ff/orjson-3.11.3-cp314-cp314-manylinux_2_34_aarch64.whl", hash = "sha256:88dcfc514cfd1b0de038443c7b3e6a9797ffb1b3674ef1fd14f701a13397f82d", size = 122998, upload-time = "2025-08-26T17:46:04.803Z" }, - { url = "https://files.pythonhosted.org/packages/3b/94/11137c9b6adb3779f1b34fd98be51608a14b430dbc02c6d41134fbba484c/orjson-3.11.3-cp314-cp314-manylinux_2_34_x86_64.whl", hash = "sha256:d61cd543d69715d5fc0a690c7c6f8dcc307bc23abef9738957981885f5f38229", size = 132915, upload-time = "2025-08-26T17:46:06.237Z" }, - { url = "https://files.pythonhosted.org/packages/10/61/dccedcf9e9bcaac09fdabe9eaee0311ca92115699500efbd31950d878833/orjson-3.11.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:2b7b153ed90ababadbef5c3eb39549f9476890d339cf47af563aea7e07db2451", size = 130907, upload-time = "2025-08-26T17:46:07.581Z" }, - { url = "https://files.pythonhosted.org/packages/0e/fd/0e935539aa7b08b3ca0f817d73034f7eb506792aae5ecc3b7c6e679cdf5f/orjson-3.11.3-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:7909ae2460f5f494fecbcd10613beafe40381fd0316e35d6acb5f3a05bfda167", size = 403852, upload-time = "2025-08-26T17:46:08.982Z" }, - { url = 
"https://files.pythonhosted.org/packages/4a/2b/50ae1a5505cd1043379132fdb2adb8a05f37b3e1ebffe94a5073321966fd/orjson-3.11.3-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:2030c01cbf77bc67bee7eef1e7e31ecf28649353987775e3583062c752da0077", size = 146309, upload-time = "2025-08-26T17:46:10.576Z" }, - { url = "https://files.pythonhosted.org/packages/cd/1d/a473c158e380ef6f32753b5f39a69028b25ec5be331c2049a2201bde2e19/orjson-3.11.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:a0169ebd1cbd94b26c7a7ad282cf5c2744fce054133f959e02eb5265deae1872", size = 135424, upload-time = "2025-08-26T17:46:12.386Z" }, - { url = "https://files.pythonhosted.org/packages/da/09/17d9d2b60592890ff7382e591aa1d9afb202a266b180c3d4049b1ec70e4a/orjson-3.11.3-cp314-cp314-win32.whl", hash = "sha256:0c6d7328c200c349e3a4c6d8c83e0a5ad029bdc2d417f234152bf34842d0fc8d", size = 136266, upload-time = "2025-08-26T17:46:13.853Z" }, - { url = "https://files.pythonhosted.org/packages/15/58/358f6846410a6b4958b74734727e582ed971e13d335d6c7ce3e47730493e/orjson-3.11.3-cp314-cp314-win_amd64.whl", hash = "sha256:317bbe2c069bbc757b1a2e4105b64aacd3bc78279b66a6b9e51e846e4809f804", size = 131351, upload-time = "2025-08-26T17:46:15.27Z" }, - { url = "https://files.pythonhosted.org/packages/28/01/d6b274a0635be0468d4dbd9cafe80c47105937a0d42434e805e67cd2ed8b/orjson-3.11.3-cp314-cp314-win_arm64.whl", hash = "sha256:e8f6a7a27d7b7bec81bd5924163e9af03d49bbb63013f107b48eb5d16db711bc", size = 125985, upload-time = "2025-08-26T17:46:16.67Z" }, -] - -[[package]] -name = "overrides" -version = "7.7.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/36/86/b585f53236dec60aba864e050778b25045f857e17f6e5ea0ae95fe80edd2/overrides-7.7.0.tar.gz", hash = "sha256:55158fa3d93b98cc75299b1e67078ad9003ca27945c76162c1c0766d6f91820a", size = 22812, upload-time = "2024-01-27T21:01:33.423Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/2c/ab/fc8290c6a4c722e5514d80f62b2dc4c4df1a68a41d1364e625c35990fcf3/overrides-7.7.0-py3-none-any.whl", hash = "sha256:c7ed9d062f78b8e4c1a7b70bd8796b35ead4d9f510227ef9c5dc7626c60d7e49", size = 17832, upload-time = "2024-01-27T21:01:31.393Z" }, -] - -[[package]] -name = "packaging" -version = "24.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950, upload-time = "2024-11-08T09:47:47.202Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451, upload-time = "2024-11-08T09:47:44.722Z" }, -] - -[[package]] -name = "pandas" -version = "2.3.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "numpy" }, - { name = "python-dateutil" }, - { name = "pytz" }, - { name = "tzdata" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/79/8e/0e90233ac205ad182bd6b422532695d2b9414944a280488105d598c70023/pandas-2.3.2.tar.gz", hash = "sha256:ab7b58f8f82706890924ccdfb5f48002b83d2b5a3845976a9fb705d36c34dcdb", size = 4488684, upload-time = "2025-08-21T10:28:29.257Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7a/59/f3e010879f118c2d400902d2d871c2226cef29b08c09fb8dc41111730400/pandas-2.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1333e9c299adcbb68ee89a9bb568fc3f20f9cbb419f1dd5225071e6cddb2a743", size = 11563308, upload-time = "2025-08-21T10:26:56.656Z" }, - { url = "https://files.pythonhosted.org/packages/38/18/48f10f1cc5c397af59571d638d211f494dba481f449c19adbd282aa8f4ca/pandas-2.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:76972bcbd7de8e91ad5f0ca884a9f2c477a2125354af624e022c49e5bd0dfff4", size = 10820319, upload-time = "2025-08-21T10:26:59.162Z" }, - { url = "https://files.pythonhosted.org/packages/95/3b/1e9b69632898b048e223834cd9702052bcf06b15e1ae716eda3196fb972e/pandas-2.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b98bdd7c456a05eef7cd21fd6b29e3ca243591fe531c62be94a2cc987efb5ac2", size = 11790097, upload-time = "2025-08-21T10:27:02.204Z" }, - { url = "https://files.pythonhosted.org/packages/8b/ef/0e2ffb30b1f7fbc9a588bd01e3c14a0d96854d09a887e15e30cc19961227/pandas-2.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1d81573b3f7db40d020983f78721e9bfc425f411e616ef019a10ebf597aedb2e", size = 12397958, upload-time = "2025-08-21T10:27:05.409Z" }, - { url = "https://files.pythonhosted.org/packages/23/82/e6b85f0d92e9afb0e7f705a51d1399b79c7380c19687bfbf3d2837743249/pandas-2.3.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e190b738675a73b581736cc8ec71ae113d6c3768d0bd18bffa5b9a0927b0b6ea", size = 13225600, upload-time = "2025-08-21T10:27:07.791Z" }, - { url = "https://files.pythonhosted.org/packages/e8/f1/f682015893d9ed51611948bd83683670842286a8edd4f68c2c1c3b231eef/pandas-2.3.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:c253828cb08f47488d60f43c5fc95114c771bbfff085da54bfc79cb4f9e3a372", size = 13879433, upload-time = "2025-08-21T10:27:10.347Z" }, - { url = "https://files.pythonhosted.org/packages/a7/e7/ae86261695b6c8a36d6a4c8d5f9b9ede8248510d689a2f379a18354b37d7/pandas-2.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:9467697b8083f9667b212633ad6aa4ab32436dcbaf4cd57325debb0ddef2012f", size = 11336557, upload-time = "2025-08-21T10:27:12.983Z" }, - { url = "https://files.pythonhosted.org/packages/ec/db/614c20fb7a85a14828edd23f1c02db58a30abf3ce76f38806155d160313c/pandas-2.3.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:3fbb977f802156e7a3f829e9d1d5398f6192375a3e2d1a9ee0803e35fe70a2b9", size = 
11587652, upload-time = "2025-08-21T10:27:15.888Z" }, - { url = "https://files.pythonhosted.org/packages/99/b0/756e52f6582cade5e746f19bad0517ff27ba9c73404607c0306585c201b3/pandas-2.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1b9b52693123dd234b7c985c68b709b0b009f4521000d0525f2b95c22f15944b", size = 10717686, upload-time = "2025-08-21T10:27:18.486Z" }, - { url = "https://files.pythonhosted.org/packages/37/4c/dd5ccc1e357abfeee8353123282de17997f90ff67855f86154e5a13b81e5/pandas-2.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0bd281310d4f412733f319a5bc552f86d62cddc5f51d2e392c8787335c994175", size = 11278722, upload-time = "2025-08-21T10:27:21.149Z" }, - { url = "https://files.pythonhosted.org/packages/d3/a4/f7edcfa47e0a88cda0be8b068a5bae710bf264f867edfdf7b71584ace362/pandas-2.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:96d31a6b4354e3b9b8a2c848af75d31da390657e3ac6f30c05c82068b9ed79b9", size = 11987803, upload-time = "2025-08-21T10:27:23.767Z" }, - { url = "https://files.pythonhosted.org/packages/f6/61/1bce4129f93ab66f1c68b7ed1c12bac6a70b1b56c5dab359c6bbcd480b52/pandas-2.3.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:df4df0b9d02bb873a106971bb85d448378ef14b86ba96f035f50bbd3688456b4", size = 12766345, upload-time = "2025-08-21T10:27:26.6Z" }, - { url = "https://files.pythonhosted.org/packages/8e/46/80d53de70fee835531da3a1dae827a1e76e77a43ad22a8cd0f8142b61587/pandas-2.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:213a5adf93d020b74327cb2c1b842884dbdd37f895f42dcc2f09d451d949f811", size = 13439314, upload-time = "2025-08-21T10:27:29.213Z" }, - { url = "https://files.pythonhosted.org/packages/28/30/8114832daff7489f179971dbc1d854109b7f4365a546e3ea75b6516cea95/pandas-2.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:8c13b81a9347eb8c7548f53fd9a4f08d4dfe996836543f805c987bafa03317ae", size = 10983326, upload-time = "2025-08-21T10:27:31.901Z" }, - { url = 
"https://files.pythonhosted.org/packages/27/64/a2f7bf678af502e16b472527735d168b22b7824e45a4d7e96a4fbb634b59/pandas-2.3.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:0c6ecbac99a354a051ef21c5307601093cb9e0f4b1855984a084bfec9302699e", size = 11531061, upload-time = "2025-08-21T10:27:34.647Z" }, - { url = "https://files.pythonhosted.org/packages/54/4c/c3d21b2b7769ef2f4c2b9299fcadd601efa6729f1357a8dbce8dd949ed70/pandas-2.3.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:c6f048aa0fd080d6a06cc7e7537c09b53be6642d330ac6f54a600c3ace857ee9", size = 10668666, upload-time = "2025-08-21T10:27:37.203Z" }, - { url = "https://files.pythonhosted.org/packages/50/e2/f775ba76ecfb3424d7f5862620841cf0edb592e9abd2d2a5387d305fe7a8/pandas-2.3.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0064187b80a5be6f2f9c9d6bdde29372468751dfa89f4211a3c5871854cfbf7a", size = 11332835, upload-time = "2025-08-21T10:27:40.188Z" }, - { url = "https://files.pythonhosted.org/packages/8f/52/0634adaace9be2d8cac9ef78f05c47f3a675882e068438b9d7ec7ef0c13f/pandas-2.3.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4ac8c320bded4718b298281339c1a50fb00a6ba78cb2a63521c39bec95b0209b", size = 12057211, upload-time = "2025-08-21T10:27:43.117Z" }, - { url = "https://files.pythonhosted.org/packages/0b/9d/2df913f14b2deb9c748975fdb2491da1a78773debb25abbc7cbc67c6b549/pandas-2.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:114c2fe4f4328cf98ce5716d1532f3ab79c5919f95a9cfee81d9140064a2e4d6", size = 12749277, upload-time = "2025-08-21T10:27:45.474Z" }, - { url = "https://files.pythonhosted.org/packages/87/af/da1a2417026bd14d98c236dba88e39837182459d29dcfcea510b2ac9e8a1/pandas-2.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:48fa91c4dfb3b2b9bfdb5c24cd3567575f4e13f9636810462ffed8925352be5a", size = 13415256, upload-time = "2025-08-21T10:27:49.885Z" }, - { url = 
"https://files.pythonhosted.org/packages/22/3c/f2af1ce8840ef648584a6156489636b5692c162771918aa95707c165ad2b/pandas-2.3.2-cp313-cp313-win_amd64.whl", hash = "sha256:12d039facec710f7ba305786837d0225a3444af7bbd9c15c32ca2d40d157ed8b", size = 10982579, upload-time = "2025-08-21T10:28:08.435Z" }, - { url = "https://files.pythonhosted.org/packages/f3/98/8df69c4097a6719e357dc249bf437b8efbde808038268e584421696cbddf/pandas-2.3.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:c624b615ce97864eb588779ed4046186f967374185c047070545253a52ab2d57", size = 12028163, upload-time = "2025-08-21T10:27:52.232Z" }, - { url = "https://files.pythonhosted.org/packages/0e/23/f95cbcbea319f349e10ff90db488b905c6883f03cbabd34f6b03cbc3c044/pandas-2.3.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:0cee69d583b9b128823d9514171cabb6861e09409af805b54459bd0c821a35c2", size = 11391860, upload-time = "2025-08-21T10:27:54.673Z" }, - { url = "https://files.pythonhosted.org/packages/ad/1b/6a984e98c4abee22058aa75bfb8eb90dce58cf8d7296f8bc56c14bc330b0/pandas-2.3.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2319656ed81124982900b4c37f0e0c58c015af9a7bbc62342ba5ad07ace82ba9", size = 11309830, upload-time = "2025-08-21T10:27:56.957Z" }, - { url = "https://files.pythonhosted.org/packages/15/d5/f0486090eb18dd8710bf60afeaf638ba6817047c0c8ae5c6a25598665609/pandas-2.3.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b37205ad6f00d52f16b6d09f406434ba928c1a1966e2771006a9033c736d30d2", size = 11883216, upload-time = "2025-08-21T10:27:59.302Z" }, - { url = "https://files.pythonhosted.org/packages/10/86/692050c119696da19e20245bbd650d8dfca6ceb577da027c3a73c62a047e/pandas-2.3.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:837248b4fc3a9b83b9c6214699a13f069dc13510a6a6d7f9ba33145d2841a012", size = 12699743, upload-time = "2025-08-21T10:28:02.447Z" }, - { url = 
"https://files.pythonhosted.org/packages/cd/d7/612123674d7b17cf345aad0a10289b2a384bff404e0463a83c4a3a59d205/pandas-2.3.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:d2c3554bd31b731cd6490d94a28f3abb8dd770634a9e06eb6d2911b9827db370", size = 13186141, upload-time = "2025-08-21T10:28:05.377Z" }, -] - -[[package]] -name = "parse" -version = "1.20.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/4f/78/d9b09ba24bb36ef8b83b71be547e118d46214735b6dfb39e4bfde0e9b9dd/parse-1.20.2.tar.gz", hash = "sha256:b41d604d16503c79d81af5165155c0b20f6c8d6c559efa66b4b695c3e5a0a0ce", size = 29391, upload-time = "2024-06-11T04:41:57.34Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d0/31/ba45bf0b2aa7898d81cbbfac0e88c267befb59ad91a19e36e1bc5578ddb1/parse-1.20.2-py2.py3-none-any.whl", hash = "sha256:967095588cb802add9177d0c0b6133b5ba33b1ea9007ca800e526f42a85af558", size = 20126, upload-time = "2024-06-11T04:41:55.057Z" }, -] - -[[package]] -name = "pathable" -version = "0.4.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/67/93/8f2c2075b180c12c1e9f6a09d1a985bc2036906b13dff1d8917e395f2048/pathable-0.4.4.tar.gz", hash = "sha256:6905a3cd17804edfac7875b5f6c9142a218c7caef78693c2dbbbfbac186d88b2", size = 8124, upload-time = "2025-01-10T18:43:13.247Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7d/eb/b6260b31b1a96386c0a880edebe26f89669098acea8e0318bff6adb378fd/pathable-0.4.4-py3-none-any.whl", hash = "sha256:5ae9e94793b6ef5a4cbe0a7ce9dbbefc1eec38df253763fd0aeeacf2762dbbc2", size = 9592, upload-time = "2025-01-10T18:43:11.88Z" }, -] - -[[package]] -name = "pathspec" -version = "0.12.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = 
"sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043, upload-time = "2023-12-10T22:30:45Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191, upload-time = "2023-12-10T22:30:43.14Z" }, -] - -[[package]] -name = "pathvalidate" -version = "3.3.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/fa/2a/52a8da6fe965dea6192eb716b357558e103aea0a1e9a8352ad575a8406ca/pathvalidate-3.3.1.tar.gz", hash = "sha256:b18c07212bfead624345bb8e1d6141cdcf15a39736994ea0b94035ad2b1ba177", size = 63262, upload-time = "2025-06-15T09:07:20.736Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/9a/70/875f4a23bfc4731703a5835487d0d2fb999031bd415e7d17c0ae615c18b7/pathvalidate-3.3.1-py3-none-any.whl", hash = "sha256:5263baab691f8e1af96092fa5137ee17df5bdfbd6cff1fcac4d6ef4bc2e1735f", size = 24305, upload-time = "2025-06-15T09:07:19.117Z" }, -] - -[[package]] -name = "pendulum" -version = "3.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "python-dateutil" }, - { name = "tzdata" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/23/7c/009c12b86c7cc6c403aec80f8a4308598dfc5995e5c523a5491faaa3952e/pendulum-3.1.0.tar.gz", hash = "sha256:66f96303560f41d097bee7d2dc98ffca716fbb3a832c4b3062034c2d45865015", size = 85930, upload-time = "2025-04-19T14:30:01.675Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/5e/6e/d28d3c22e6708b819a94c05bd05a3dfaed5c685379e8b6dc4b34b473b942/pendulum-3.1.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:61a03d14f8c64d13b2f7d5859e4b4053c4a7d3b02339f6c71f3e4606bfd67423", size = 338596, upload-time = "2025-04-19T14:01:11.306Z" }, - { url = 
"https://files.pythonhosted.org/packages/e1/e6/43324d58021d463c2eeb6146b169d2c935f2f840f9e45ac2d500453d954c/pendulum-3.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e674ed2d158afa5c361e60f1f67872dc55b492a10cacdaa7fcd7b7da5f158f24", size = 325854, upload-time = "2025-04-19T14:01:13.156Z" }, - { url = "https://files.pythonhosted.org/packages/b0/a7/d2ae79b960bfdea94dab67e2f118697b08bc9e98eb6bd8d32c4d99240da3/pendulum-3.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7c75377eb16e58bbe7e03ea89eeea49be6fc5de0934a4aef0e263f8b4fa71bc2", size = 344334, upload-time = "2025-04-19T14:01:15.151Z" }, - { url = "https://files.pythonhosted.org/packages/96/94/941f071212e23c29aae7def891fb636930c648386e059ce09ea0dcd43933/pendulum-3.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:656b8b0ce070f0f2e5e2668247d3c783c55336534aa1f13bd0969535878955e1", size = 382259, upload-time = "2025-04-19T14:01:16.924Z" }, - { url = "https://files.pythonhosted.org/packages/51/ad/a78a701656aec00d16fee636704445c23ca11617a0bfe7c3848d1caa5157/pendulum-3.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:48962903e6c1afe1f13548cb6252666056086c107d59e3d64795c58c9298bc2e", size = 436361, upload-time = "2025-04-19T14:01:18.796Z" }, - { url = "https://files.pythonhosted.org/packages/da/93/83f59ccbf4435c29dca8c63a6560fcbe4783079a468a5f91d9f886fd21f0/pendulum-3.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d364ec3f8e65010fefd4b0aaf7be5eb97e5df761b107a06f5e743b7c3f52c311", size = 353653, upload-time = "2025-04-19T14:01:20.159Z" }, - { url = "https://files.pythonhosted.org/packages/6f/0f/42d6644ec6339b41066f594e52d286162aecd2e9735aaf994d7e00c9e09d/pendulum-3.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:dd52caffc2afb86612ec43bbeb226f204ea12ebff9f3d12f900a7d3097210fcc", size = 524567, upload-time = "2025-04-19T14:01:21.457Z" }, - { url = 
"https://files.pythonhosted.org/packages/de/45/d84d909202755ab9d3379e5481fdf70f53344ebefbd68d6f5803ddde98a6/pendulum-3.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d439fccaa35c91f686bd59d30604dab01e8b5c1d0dd66e81648c432fd3f8a539", size = 525571, upload-time = "2025-04-19T14:01:23.329Z" }, - { url = "https://files.pythonhosted.org/packages/0d/e0/4de160773ce3c2f7843c310db19dd919a0cd02cc1c0384866f63b18a6251/pendulum-3.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:43288773a86d9c5c0ddb645f88f615ff6bd12fd1410b34323662beccb18f3b49", size = 260259, upload-time = "2025-04-19T14:01:24.689Z" }, - { url = "https://files.pythonhosted.org/packages/c1/7f/ffa278f78112c6c6e5130a702042f52aab5c649ae2edf814df07810bbba5/pendulum-3.1.0-cp311-cp311-win_arm64.whl", hash = "sha256:569ea5072ae0f11d625e03b36d865f8037b76e838a3b621f6967314193896a11", size = 253899, upload-time = "2025-04-19T14:01:26.442Z" }, - { url = "https://files.pythonhosted.org/packages/7a/d7/b1bfe15a742f2c2713acb1fdc7dc3594ff46ef9418ac6a96fcb12a6ba60b/pendulum-3.1.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:4dfd53e7583ccae138be86d6c0a0b324c7547df2afcec1876943c4d481cf9608", size = 336209, upload-time = "2025-04-19T14:01:27.815Z" }, - { url = "https://files.pythonhosted.org/packages/eb/87/0392da0c603c828b926d9f7097fbdddaafc01388cb8a00888635d04758c3/pendulum-3.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6a6e06a28f3a7d696546347805536f6f38be458cb79de4f80754430696bea9e6", size = 323130, upload-time = "2025-04-19T14:01:29.336Z" }, - { url = "https://files.pythonhosted.org/packages/c0/61/95f1eec25796be6dddf71440ee16ec1fd0c573fc61a73bd1ef6daacd529a/pendulum-3.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7e68d6a51880708084afd8958af42dc8c5e819a70a6c6ae903b1c4bfc61e0f25", size = 341509, upload-time = "2025-04-19T14:01:31.1Z" }, - { url = 
"https://files.pythonhosted.org/packages/b5/7b/eb0f5e6aa87d5e1b467a1611009dbdc92f0f72425ebf07669bfadd8885a6/pendulum-3.1.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9e3f1e5da39a7ea7119efda1dd96b529748c1566f8a983412d0908455d606942", size = 378674, upload-time = "2025-04-19T14:01:32.974Z" }, - { url = "https://files.pythonhosted.org/packages/29/68/5a4c1b5de3e54e16cab21d2ec88f9cd3f18599e96cc90a441c0b0ab6b03f/pendulum-3.1.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e9af1e5eeddb4ebbe1b1c9afb9fd8077d73416ade42dd61264b3f3b87742e0bb", size = 436133, upload-time = "2025-04-19T14:01:34.349Z" }, - { url = "https://files.pythonhosted.org/packages/87/5d/f7a1d693e5c0f789185117d5c1d5bee104f5b0d9fbf061d715fb61c840a8/pendulum-3.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20f74aa8029a42e327bfc150472e0e4d2358fa5d795f70460160ba81b94b6945", size = 351232, upload-time = "2025-04-19T14:01:35.669Z" }, - { url = "https://files.pythonhosted.org/packages/30/77/c97617eb31f1d0554edb073201a294019b9e0a9bd2f73c68e6d8d048cd6b/pendulum-3.1.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:cf6229e5ee70c2660148523f46c472e677654d0097bec010d6730f08312a4931", size = 521562, upload-time = "2025-04-19T14:01:37.05Z" }, - { url = "https://files.pythonhosted.org/packages/76/22/0d0ef3393303877e757b848ecef8a9a8c7627e17e7590af82d14633b2cd1/pendulum-3.1.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:350cabb23bf1aec7c7694b915d3030bff53a2ad4aeabc8c8c0d807c8194113d6", size = 523221, upload-time = "2025-04-19T14:01:38.444Z" }, - { url = "https://files.pythonhosted.org/packages/99/f3/aefb579aa3cebd6f2866b205fc7a60d33e9a696e9e629024752107dc3cf5/pendulum-3.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:42959341e843077c41d47420f28c3631de054abd64da83f9b956519b5c7a06a7", size = 260502, upload-time = "2025-04-19T14:01:39.814Z" }, - { url = 
"https://files.pythonhosted.org/packages/02/74/4332b5d6e34c63d4df8e8eab2249e74c05513b1477757463f7fdca99e9be/pendulum-3.1.0-cp312-cp312-win_arm64.whl", hash = "sha256:006758e2125da2e624493324dfd5d7d1b02b0c44bc39358e18bf0f66d0767f5f", size = 253089, upload-time = "2025-04-19T14:01:41.171Z" }, - { url = "https://files.pythonhosted.org/packages/8e/1f/af928ba4aa403dac9569f787adcf024005e7654433d71f7a84e608716837/pendulum-3.1.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:28658b0baf4b30eb31d096a375983cfed033e60c0a7bbe94fa23f06cd779b50b", size = 336209, upload-time = "2025-04-19T14:01:42.775Z" }, - { url = "https://files.pythonhosted.org/packages/b6/16/b010643007ba964c397da7fa622924423883c1bbff1a53f9d1022cd7f024/pendulum-3.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b114dcb99ce511cb8f5495c7b6f0056b2c3dba444ef1ea6e48030d7371bd531a", size = 323132, upload-time = "2025-04-19T14:01:44.577Z" }, - { url = "https://files.pythonhosted.org/packages/64/19/c3c47aeecb5d9bceb0e89faafd800d39809b696c5b7bba8ec8370ad5052c/pendulum-3.1.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2404a6a54c80252ea393291f0b7f35525a61abae3d795407f34e118a8f133a18", size = 341509, upload-time = "2025-04-19T14:01:46.084Z" }, - { url = "https://files.pythonhosted.org/packages/38/cf/c06921ff6b860ff7e62e70b8e5d4dc70e36f5abb66d168bd64d51760bc4e/pendulum-3.1.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d06999790d9ee9962a1627e469f98568bf7ad1085553fa3c30ed08b3944a14d7", size = 378674, upload-time = "2025-04-19T14:01:47.727Z" }, - { url = "https://files.pythonhosted.org/packages/62/0b/a43953b9eba11e82612b033ac5133f716f1b76b6108a65da6f408b3cc016/pendulum-3.1.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:94751c52f6b7c306734d1044c2c6067a474237e1e5afa2f665d1fbcbbbcf24b3", size = 436133, upload-time = "2025-04-19T14:01:49.126Z" }, - { url = 
"https://files.pythonhosted.org/packages/eb/a0/ec3d70b3b96e23ae1d039f132af35e17704c22a8250d1887aaefea4d78a6/pendulum-3.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5553ac27be05e997ec26d7f004cf72788f4ce11fe60bb80dda604a64055b29d0", size = 351232, upload-time = "2025-04-19T14:01:50.575Z" }, - { url = "https://files.pythonhosted.org/packages/f4/97/aba23f1716b82f6951ba2b1c9178a2d107d1e66c102762a9bf19988547ea/pendulum-3.1.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:f8dee234ca6142bf0514368d01a72945a44685aaa2fc4c14c98d09da9437b620", size = 521563, upload-time = "2025-04-19T14:01:51.9Z" }, - { url = "https://files.pythonhosted.org/packages/01/33/2c0d5216cc53d16db0c4b3d510f141ee0a540937f8675948541190fbd48b/pendulum-3.1.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:7378084fe54faab4ee481897a00b710876f2e901ded6221671e827a253e643f2", size = 523221, upload-time = "2025-04-19T14:01:53.275Z" }, - { url = "https://files.pythonhosted.org/packages/51/89/8de955c339c31aeae77fd86d3225509b998c81875e9dba28cb88b8cbf4b3/pendulum-3.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:8539db7ae2c8da430ac2515079e288948c8ebf7eb1edd3e8281b5cdf433040d6", size = 260501, upload-time = "2025-04-19T14:01:54.749Z" }, - { url = "https://files.pythonhosted.org/packages/15/c3/226a3837363e94f8722461848feec18bfdd7d5172564d53aa3c3397ff01e/pendulum-3.1.0-cp313-cp313-win_arm64.whl", hash = "sha256:1ce26a608e1f7387cd393fba2a129507c4900958d4f47b90757ec17656856571", size = 253087, upload-time = "2025-04-19T14:01:55.998Z" }, - { url = "https://files.pythonhosted.org/packages/6e/23/e98758924d1b3aac11a626268eabf7f3cf177e7837c28d47bf84c64532d0/pendulum-3.1.0-py3-none-any.whl", hash = "sha256:f9178c2a8e291758ade1e8dd6371b1d26d08371b4c7730a6e9a3ef8b16ebae0f", size = 111799, upload-time = "2025-04-19T14:02:34.739Z" }, -] - -[[package]] -name = "pillow" -version = "11.3.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/f3/0d/d0d6dea55cd152ce3d6767bb38a8fc10e33796ba4ba210cbab9354b6d238/pillow-11.3.0.tar.gz", hash = "sha256:3828ee7586cd0b2091b6209e5ad53e20d0649bbe87164a459d0676e035e8f523", size = 47113069, upload-time = "2025-07-01T09:16:30.666Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/db/26/77f8ed17ca4ffd60e1dcd220a6ec6d71210ba398cfa33a13a1cd614c5613/pillow-11.3.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:1cd110edf822773368b396281a2293aeb91c90a2db00d78ea43e7e861631b722", size = 5316531, upload-time = "2025-07-01T09:13:59.203Z" }, - { url = "https://files.pythonhosted.org/packages/cb/39/ee475903197ce709322a17a866892efb560f57900d9af2e55f86db51b0a5/pillow-11.3.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9c412fddd1b77a75aa904615ebaa6001f169b26fd467b4be93aded278266b288", size = 4686560, upload-time = "2025-07-01T09:14:01.101Z" }, - { url = "https://files.pythonhosted.org/packages/d5/90/442068a160fd179938ba55ec8c97050a612426fae5ec0a764e345839f76d/pillow-11.3.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7d1aa4de119a0ecac0a34a9c8bde33f34022e2e8f99104e47a3ca392fd60e37d", size = 5870978, upload-time = "2025-07-03T13:09:55.638Z" }, - { url = "https://files.pythonhosted.org/packages/13/92/dcdd147ab02daf405387f0218dcf792dc6dd5b14d2573d40b4caeef01059/pillow-11.3.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:91da1d88226663594e3f6b4b8c3c8d85bd504117d043740a8e0ec449087cc494", size = 7641168, upload-time = "2025-07-03T13:10:00.37Z" }, - { url = "https://files.pythonhosted.org/packages/6e/db/839d6ba7fd38b51af641aa904e2960e7a5644d60ec754c046b7d2aee00e5/pillow-11.3.0-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:643f189248837533073c405ec2f0bb250ba54598cf80e8c1e043381a60632f58", size = 5973053, upload-time = "2025-07-01T09:14:04.491Z" }, - { url = 
"https://files.pythonhosted.org/packages/f2/2f/d7675ecae6c43e9f12aa8d58b6012683b20b6edfbdac7abcb4e6af7a3784/pillow-11.3.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:106064daa23a745510dabce1d84f29137a37224831d88eb4ce94bb187b1d7e5f", size = 6640273, upload-time = "2025-07-01T09:14:06.235Z" }, - { url = "https://files.pythonhosted.org/packages/45/ad/931694675ede172e15b2ff03c8144a0ddaea1d87adb72bb07655eaffb654/pillow-11.3.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:cd8ff254faf15591e724dc7c4ddb6bf4793efcbe13802a4ae3e863cd300b493e", size = 6082043, upload-time = "2025-07-01T09:14:07.978Z" }, - { url = "https://files.pythonhosted.org/packages/3a/04/ba8f2b11fc80d2dd462d7abec16351b45ec99cbbaea4387648a44190351a/pillow-11.3.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:932c754c2d51ad2b2271fd01c3d121daaa35e27efae2a616f77bf164bc0b3e94", size = 6715516, upload-time = "2025-07-01T09:14:10.233Z" }, - { url = "https://files.pythonhosted.org/packages/48/59/8cd06d7f3944cc7d892e8533c56b0acb68399f640786313275faec1e3b6f/pillow-11.3.0-cp311-cp311-win32.whl", hash = "sha256:b4b8f3efc8d530a1544e5962bd6b403d5f7fe8b9e08227c6b255f98ad82b4ba0", size = 6274768, upload-time = "2025-07-01T09:14:11.921Z" }, - { url = "https://files.pythonhosted.org/packages/f1/cc/29c0f5d64ab8eae20f3232da8f8571660aa0ab4b8f1331da5c2f5f9a938e/pillow-11.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:1a992e86b0dd7aeb1f053cd506508c0999d710a8f07b4c791c63843fc6a807ac", size = 6986055, upload-time = "2025-07-01T09:14:13.623Z" }, - { url = "https://files.pythonhosted.org/packages/c6/df/90bd886fabd544c25addd63e5ca6932c86f2b701d5da6c7839387a076b4a/pillow-11.3.0-cp311-cp311-win_arm64.whl", hash = "sha256:30807c931ff7c095620fe04448e2c2fc673fcbb1ffe2a7da3fb39613489b1ddd", size = 2423079, upload-time = "2025-07-01T09:14:15.268Z" }, - { url = 
"https://files.pythonhosted.org/packages/40/fe/1bc9b3ee13f68487a99ac9529968035cca2f0a51ec36892060edcc51d06a/pillow-11.3.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdae223722da47b024b867c1ea0be64e0df702c5e0a60e27daad39bf960dd1e4", size = 5278800, upload-time = "2025-07-01T09:14:17.648Z" }, - { url = "https://files.pythonhosted.org/packages/2c/32/7e2ac19b5713657384cec55f89065fb306b06af008cfd87e572035b27119/pillow-11.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:921bd305b10e82b4d1f5e802b6850677f965d8394203d182f078873851dada69", size = 4686296, upload-time = "2025-07-01T09:14:19.828Z" }, - { url = "https://files.pythonhosted.org/packages/8e/1e/b9e12bbe6e4c2220effebc09ea0923a07a6da1e1f1bfbc8d7d29a01ce32b/pillow-11.3.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:eb76541cba2f958032d79d143b98a3a6b3ea87f0959bbe256c0b5e416599fd5d", size = 5871726, upload-time = "2025-07-03T13:10:04.448Z" }, - { url = "https://files.pythonhosted.org/packages/8d/33/e9200d2bd7ba00dc3ddb78df1198a6e80d7669cce6c2bdbeb2530a74ec58/pillow-11.3.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:67172f2944ebba3d4a7b54f2e95c786a3a50c21b88456329314caaa28cda70f6", size = 7644652, upload-time = "2025-07-03T13:10:10.391Z" }, - { url = "https://files.pythonhosted.org/packages/41/f1/6f2427a26fc683e00d985bc391bdd76d8dd4e92fac33d841127eb8fb2313/pillow-11.3.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97f07ed9f56a3b9b5f49d3661dc9607484e85c67e27f3e8be2c7d28ca032fec7", size = 5977787, upload-time = "2025-07-01T09:14:21.63Z" }, - { url = "https://files.pythonhosted.org/packages/e4/c9/06dd4a38974e24f932ff5f98ea3c546ce3f8c995d3f0985f8e5ba48bba19/pillow-11.3.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:676b2815362456b5b3216b4fd5bd89d362100dc6f4945154ff172e206a22c024", size = 6645236, upload-time = "2025-07-01T09:14:23.321Z" }, - { url = 
"https://files.pythonhosted.org/packages/40/e7/848f69fb79843b3d91241bad658e9c14f39a32f71a301bcd1d139416d1be/pillow-11.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3e184b2f26ff146363dd07bde8b711833d7b0202e27d13540bfe2e35a323a809", size = 6086950, upload-time = "2025-07-01T09:14:25.237Z" }, - { url = "https://files.pythonhosted.org/packages/0b/1a/7cff92e695a2a29ac1958c2a0fe4c0b2393b60aac13b04a4fe2735cad52d/pillow-11.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6be31e3fc9a621e071bc17bb7de63b85cbe0bfae91bb0363c893cbe67247780d", size = 6723358, upload-time = "2025-07-01T09:14:27.053Z" }, - { url = "https://files.pythonhosted.org/packages/26/7d/73699ad77895f69edff76b0f332acc3d497f22f5d75e5360f78cbcaff248/pillow-11.3.0-cp312-cp312-win32.whl", hash = "sha256:7b161756381f0918e05e7cb8a371fff367e807770f8fe92ecb20d905d0e1c149", size = 6275079, upload-time = "2025-07-01T09:14:30.104Z" }, - { url = "https://files.pythonhosted.org/packages/8c/ce/e7dfc873bdd9828f3b6e5c2bbb74e47a98ec23cc5c74fc4e54462f0d9204/pillow-11.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:a6444696fce635783440b7f7a9fc24b3ad10a9ea3f0ab66c5905be1c19ccf17d", size = 6986324, upload-time = "2025-07-01T09:14:31.899Z" }, - { url = "https://files.pythonhosted.org/packages/16/8f/b13447d1bf0b1f7467ce7d86f6e6edf66c0ad7cf44cf5c87a37f9bed9936/pillow-11.3.0-cp312-cp312-win_arm64.whl", hash = "sha256:2aceea54f957dd4448264f9bf40875da0415c83eb85f55069d89c0ed436e3542", size = 2423067, upload-time = "2025-07-01T09:14:33.709Z" }, - { url = "https://files.pythonhosted.org/packages/1e/93/0952f2ed8db3a5a4c7a11f91965d6184ebc8cd7cbb7941a260d5f018cd2d/pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:1c627742b539bba4309df89171356fcb3cc5a9178355b2727d1b74a6cf155fbd", size = 2128328, upload-time = "2025-07-01T09:14:35.276Z" }, - { url = 
"https://files.pythonhosted.org/packages/4b/e8/100c3d114b1a0bf4042f27e0f87d2f25e857e838034e98ca98fe7b8c0a9c/pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:30b7c02f3899d10f13d7a48163c8969e4e653f8b43416d23d13d1bbfdc93b9f8", size = 2170652, upload-time = "2025-07-01T09:14:37.203Z" }, - { url = "https://files.pythonhosted.org/packages/aa/86/3f758a28a6e381758545f7cdb4942e1cb79abd271bea932998fc0db93cb6/pillow-11.3.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:7859a4cc7c9295f5838015d8cc0a9c215b77e43d07a25e460f35cf516df8626f", size = 2227443, upload-time = "2025-07-01T09:14:39.344Z" }, - { url = "https://files.pythonhosted.org/packages/01/f4/91d5b3ffa718df2f53b0dc109877993e511f4fd055d7e9508682e8aba092/pillow-11.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ec1ee50470b0d050984394423d96325b744d55c701a439d2bd66089bff963d3c", size = 5278474, upload-time = "2025-07-01T09:14:41.843Z" }, - { url = "https://files.pythonhosted.org/packages/f9/0e/37d7d3eca6c879fbd9dba21268427dffda1ab00d4eb05b32923d4fbe3b12/pillow-11.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7db51d222548ccfd274e4572fdbf3e810a5e66b00608862f947b163e613b67dd", size = 4686038, upload-time = "2025-07-01T09:14:44.008Z" }, - { url = "https://files.pythonhosted.org/packages/ff/b0/3426e5c7f6565e752d81221af9d3676fdbb4f352317ceafd42899aaf5d8a/pillow-11.3.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2d6fcc902a24ac74495df63faad1884282239265c6839a0a6416d33faedfae7e", size = 5864407, upload-time = "2025-07-03T13:10:15.628Z" }, - { url = "https://files.pythonhosted.org/packages/fc/c1/c6c423134229f2a221ee53f838d4be9d82bab86f7e2f8e75e47b6bf6cd77/pillow-11.3.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f0f5d8f4a08090c6d6d578351a2b91acf519a54986c055af27e7a93feae6d3f1", size = 7639094, upload-time = "2025-07-03T13:10:21.857Z" }, - { url = 
"https://files.pythonhosted.org/packages/ba/c9/09e6746630fe6372c67c648ff9deae52a2bc20897d51fa293571977ceb5d/pillow-11.3.0-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c37d8ba9411d6003bba9e518db0db0c58a680ab9fe5179f040b0463644bc9805", size = 5973503, upload-time = "2025-07-01T09:14:45.698Z" }, - { url = "https://files.pythonhosted.org/packages/d5/1c/a2a29649c0b1983d3ef57ee87a66487fdeb45132df66ab30dd37f7dbe162/pillow-11.3.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:13f87d581e71d9189ab21fe0efb5a23e9f28552d5be6979e84001d3b8505abe8", size = 6642574, upload-time = "2025-07-01T09:14:47.415Z" }, - { url = "https://files.pythonhosted.org/packages/36/de/d5cc31cc4b055b6c6fd990e3e7f0f8aaf36229a2698501bcb0cdf67c7146/pillow-11.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:023f6d2d11784a465f09fd09a34b150ea4672e85fb3d05931d89f373ab14abb2", size = 6084060, upload-time = "2025-07-01T09:14:49.636Z" }, - { url = "https://files.pythonhosted.org/packages/d5/ea/502d938cbaeec836ac28a9b730193716f0114c41325db428e6b280513f09/pillow-11.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:45dfc51ac5975b938e9809451c51734124e73b04d0f0ac621649821a63852e7b", size = 6721407, upload-time = "2025-07-01T09:14:51.962Z" }, - { url = "https://files.pythonhosted.org/packages/45/9c/9c5e2a73f125f6cbc59cc7087c8f2d649a7ae453f83bd0362ff7c9e2aee2/pillow-11.3.0-cp313-cp313-win32.whl", hash = "sha256:a4d336baed65d50d37b88ca5b60c0fa9d81e3a87d4a7930d3880d1624d5b31f3", size = 6273841, upload-time = "2025-07-01T09:14:54.142Z" }, - { url = "https://files.pythonhosted.org/packages/23/85/397c73524e0cd212067e0c969aa245b01d50183439550d24d9f55781b776/pillow-11.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:0bce5c4fd0921f99d2e858dc4d4d64193407e1b99478bc5cacecba2311abde51", size = 6978450, upload-time = "2025-07-01T09:14:56.436Z" }, - { url = 
"https://files.pythonhosted.org/packages/17/d2/622f4547f69cd173955194b78e4d19ca4935a1b0f03a302d655c9f6aae65/pillow-11.3.0-cp313-cp313-win_arm64.whl", hash = "sha256:1904e1264881f682f02b7f8167935cce37bc97db457f8e7849dc3a6a52b99580", size = 2423055, upload-time = "2025-07-01T09:14:58.072Z" }, - { url = "https://files.pythonhosted.org/packages/dd/80/a8a2ac21dda2e82480852978416cfacd439a4b490a501a288ecf4fe2532d/pillow-11.3.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4c834a3921375c48ee6b9624061076bc0a32a60b5532b322cc0ea64e639dd50e", size = 5281110, upload-time = "2025-07-01T09:14:59.79Z" }, - { url = "https://files.pythonhosted.org/packages/44/d6/b79754ca790f315918732e18f82a8146d33bcd7f4494380457ea89eb883d/pillow-11.3.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:5e05688ccef30ea69b9317a9ead994b93975104a677a36a8ed8106be9260aa6d", size = 4689547, upload-time = "2025-07-01T09:15:01.648Z" }, - { url = "https://files.pythonhosted.org/packages/49/20/716b8717d331150cb00f7fdd78169c01e8e0c219732a78b0e59b6bdb2fd6/pillow-11.3.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1019b04af07fc0163e2810167918cb5add8d74674b6267616021ab558dc98ced", size = 5901554, upload-time = "2025-07-03T13:10:27.018Z" }, - { url = "https://files.pythonhosted.org/packages/74/cf/a9f3a2514a65bb071075063a96f0a5cf949c2f2fce683c15ccc83b1c1cab/pillow-11.3.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f944255db153ebb2b19c51fe85dd99ef0ce494123f21b9db4877ffdfc5590c7c", size = 7669132, upload-time = "2025-07-03T13:10:33.01Z" }, - { url = "https://files.pythonhosted.org/packages/98/3c/da78805cbdbee9cb43efe8261dd7cc0b4b93f2ac79b676c03159e9db2187/pillow-11.3.0-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1f85acb69adf2aaee8b7da124efebbdb959a104db34d3a2cb0f3793dbae422a8", size = 6005001, upload-time = "2025-07-01T09:15:03.365Z" }, - { url = 
"https://files.pythonhosted.org/packages/6c/fa/ce044b91faecf30e635321351bba32bab5a7e034c60187fe9698191aef4f/pillow-11.3.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:05f6ecbeff5005399bb48d198f098a9b4b6bdf27b8487c7f38ca16eeb070cd59", size = 6668814, upload-time = "2025-07-01T09:15:05.655Z" }, - { url = "https://files.pythonhosted.org/packages/7b/51/90f9291406d09bf93686434f9183aba27b831c10c87746ff49f127ee80cb/pillow-11.3.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a7bc6e6fd0395bc052f16b1a8670859964dbd7003bd0af2ff08342eb6e442cfe", size = 6113124, upload-time = "2025-07-01T09:15:07.358Z" }, - { url = "https://files.pythonhosted.org/packages/cd/5a/6fec59b1dfb619234f7636d4157d11fb4e196caeee220232a8d2ec48488d/pillow-11.3.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:83e1b0161c9d148125083a35c1c5a89db5b7054834fd4387499e06552035236c", size = 6747186, upload-time = "2025-07-01T09:15:09.317Z" }, - { url = "https://files.pythonhosted.org/packages/49/6b/00187a044f98255225f172de653941e61da37104a9ea60e4f6887717e2b5/pillow-11.3.0-cp313-cp313t-win32.whl", hash = "sha256:2a3117c06b8fb646639dce83694f2f9eac405472713fcb1ae887469c0d4f6788", size = 6277546, upload-time = "2025-07-01T09:15:11.311Z" }, - { url = "https://files.pythonhosted.org/packages/e8/5c/6caaba7e261c0d75bab23be79f1d06b5ad2a2ae49f028ccec801b0e853d6/pillow-11.3.0-cp313-cp313t-win_amd64.whl", hash = "sha256:857844335c95bea93fb39e0fa2726b4d9d758850b34075a7e3ff4f4fa3aa3b31", size = 6985102, upload-time = "2025-07-01T09:15:13.164Z" }, - { url = "https://files.pythonhosted.org/packages/f3/7e/b623008460c09a0cb38263c93b828c666493caee2eb34ff67f778b87e58c/pillow-11.3.0-cp313-cp313t-win_arm64.whl", hash = "sha256:8797edc41f3e8536ae4b10897ee2f637235c94f27404cac7297f7b607dd0716e", size = 2424803, upload-time = "2025-07-01T09:15:15.695Z" }, - { url = 
"https://files.pythonhosted.org/packages/73/f4/04905af42837292ed86cb1b1dabe03dce1edc008ef14c473c5c7e1443c5d/pillow-11.3.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d9da3df5f9ea2a89b81bb6087177fb1f4d1c7146d583a3fe5c672c0d94e55e12", size = 5278520, upload-time = "2025-07-01T09:15:17.429Z" }, - { url = "https://files.pythonhosted.org/packages/41/b0/33d79e377a336247df6348a54e6d2a2b85d644ca202555e3faa0cf811ecc/pillow-11.3.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0b275ff9b04df7b640c59ec5a3cb113eefd3795a8df80bac69646ef699c6981a", size = 4686116, upload-time = "2025-07-01T09:15:19.423Z" }, - { url = "https://files.pythonhosted.org/packages/49/2d/ed8bc0ab219ae8768f529597d9509d184fe8a6c4741a6864fea334d25f3f/pillow-11.3.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0743841cabd3dba6a83f38a92672cccbd69af56e3e91777b0ee7f4dba4385632", size = 5864597, upload-time = "2025-07-03T13:10:38.404Z" }, - { url = "https://files.pythonhosted.org/packages/b5/3d/b932bb4225c80b58dfadaca9d42d08d0b7064d2d1791b6a237f87f661834/pillow-11.3.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2465a69cf967b8b49ee1b96d76718cd98c4e925414ead59fdf75cf0fd07df673", size = 7638246, upload-time = "2025-07-03T13:10:44.987Z" }, - { url = "https://files.pythonhosted.org/packages/09/b5/0487044b7c096f1b48f0d7ad416472c02e0e4bf6919541b111efd3cae690/pillow-11.3.0-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41742638139424703b4d01665b807c6468e23e699e8e90cffefe291c5832b027", size = 5973336, upload-time = "2025-07-01T09:15:21.237Z" }, - { url = "https://files.pythonhosted.org/packages/a8/2d/524f9318f6cbfcc79fbc004801ea6b607ec3f843977652fdee4857a7568b/pillow-11.3.0-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:93efb0b4de7e340d99057415c749175e24c8864302369e05914682ba642e5d77", size = 6642699, upload-time = "2025-07-01T09:15:23.186Z" }, - { url = 
"https://files.pythonhosted.org/packages/6f/d2/a9a4f280c6aefedce1e8f615baaa5474e0701d86dd6f1dede66726462bbd/pillow-11.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7966e38dcd0fa11ca390aed7c6f20454443581d758242023cf36fcb319b1a874", size = 6083789, upload-time = "2025-07-01T09:15:25.1Z" }, - { url = "https://files.pythonhosted.org/packages/fe/54/86b0cd9dbb683a9d5e960b66c7379e821a19be4ac5810e2e5a715c09a0c0/pillow-11.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:98a9afa7b9007c67ed84c57c9e0ad86a6000da96eaa638e4f8abe5b65ff83f0a", size = 6720386, upload-time = "2025-07-01T09:15:27.378Z" }, - { url = "https://files.pythonhosted.org/packages/e7/95/88efcaf384c3588e24259c4203b909cbe3e3c2d887af9e938c2022c9dd48/pillow-11.3.0-cp314-cp314-win32.whl", hash = "sha256:02a723e6bf909e7cea0dac1b0e0310be9d7650cd66222a5f1c571455c0a45214", size = 6370911, upload-time = "2025-07-01T09:15:29.294Z" }, - { url = "https://files.pythonhosted.org/packages/2e/cc/934e5820850ec5eb107e7b1a72dd278140731c669f396110ebc326f2a503/pillow-11.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:a418486160228f64dd9e9efcd132679b7a02a5f22c982c78b6fc7dab3fefb635", size = 7117383, upload-time = "2025-07-01T09:15:31.128Z" }, - { url = "https://files.pythonhosted.org/packages/d6/e9/9c0a616a71da2a5d163aa37405e8aced9a906d574b4a214bede134e731bc/pillow-11.3.0-cp314-cp314-win_arm64.whl", hash = "sha256:155658efb5e044669c08896c0c44231c5e9abcaadbc5cd3648df2f7c0b96b9a6", size = 2511385, upload-time = "2025-07-01T09:15:33.328Z" }, - { url = "https://files.pythonhosted.org/packages/1a/33/c88376898aff369658b225262cd4f2659b13e8178e7534df9e6e1fa289f6/pillow-11.3.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:59a03cdf019efbfeeed910bf79c7c93255c3d54bc45898ac2a4140071b02b4ae", size = 5281129, upload-time = "2025-07-01T09:15:35.194Z" }, - { url = "https://files.pythonhosted.org/packages/1f/70/d376247fb36f1844b42910911c83a02d5544ebd2a8bad9efcc0f707ea774/pillow-11.3.0-cp314-cp314t-macosx_11_0_arm64.whl", 
hash = "sha256:f8a5827f84d973d8636e9dc5764af4f0cf2318d26744b3d902931701b0d46653", size = 4689580, upload-time = "2025-07-01T09:15:37.114Z" }, - { url = "https://files.pythonhosted.org/packages/eb/1c/537e930496149fbac69efd2fc4329035bbe2e5475b4165439e3be9cb183b/pillow-11.3.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ee92f2fd10f4adc4b43d07ec5e779932b4eb3dbfbc34790ada5a6669bc095aa6", size = 5902860, upload-time = "2025-07-03T13:10:50.248Z" }, - { url = "https://files.pythonhosted.org/packages/bd/57/80f53264954dcefeebcf9dae6e3eb1daea1b488f0be8b8fef12f79a3eb10/pillow-11.3.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c96d333dcf42d01f47b37e0979b6bd73ec91eae18614864622d9b87bbd5bbf36", size = 7670694, upload-time = "2025-07-03T13:10:56.432Z" }, - { url = "https://files.pythonhosted.org/packages/70/ff/4727d3b71a8578b4587d9c276e90efad2d6fe0335fd76742a6da08132e8c/pillow-11.3.0-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4c96f993ab8c98460cd0c001447bff6194403e8b1d7e149ade5f00594918128b", size = 6005888, upload-time = "2025-07-01T09:15:39.436Z" }, - { url = "https://files.pythonhosted.org/packages/05/ae/716592277934f85d3be51d7256f3636672d7b1abfafdc42cf3f8cbd4b4c8/pillow-11.3.0-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:41342b64afeba938edb034d122b2dda5db2139b9a4af999729ba8818e0056477", size = 6670330, upload-time = "2025-07-01T09:15:41.269Z" }, - { url = "https://files.pythonhosted.org/packages/e7/bb/7fe6cddcc8827b01b1a9766f5fdeb7418680744f9082035bdbabecf1d57f/pillow-11.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:068d9c39a2d1b358eb9f245ce7ab1b5c3246c7c8c7d9ba58cfa5b43146c06e50", size = 6114089, upload-time = "2025-07-01T09:15:43.13Z" }, - { url = "https://files.pythonhosted.org/packages/8b/f5/06bfaa444c8e80f1a8e4bff98da9c83b37b5be3b1deaa43d27a0db37ef84/pillow-11.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = 
"sha256:a1bc6ba083b145187f648b667e05a2534ecc4b9f2784c2cbe3089e44868f2b9b", size = 6748206, upload-time = "2025-07-01T09:15:44.937Z" }, - { url = "https://files.pythonhosted.org/packages/f0/77/bc6f92a3e8e6e46c0ca78abfffec0037845800ea38c73483760362804c41/pillow-11.3.0-cp314-cp314t-win32.whl", hash = "sha256:118ca10c0d60b06d006be10a501fd6bbdfef559251ed31b794668ed569c87e12", size = 6377370, upload-time = "2025-07-01T09:15:46.673Z" }, - { url = "https://files.pythonhosted.org/packages/4a/82/3a721f7d69dca802befb8af08b7c79ebcab461007ce1c18bd91a5d5896f9/pillow-11.3.0-cp314-cp314t-win_amd64.whl", hash = "sha256:8924748b688aa210d79883357d102cd64690e56b923a186f35a82cbc10f997db", size = 7121500, upload-time = "2025-07-01T09:15:48.512Z" }, - { url = "https://files.pythonhosted.org/packages/89/c7/5572fa4a3f45740eaab6ae86fcdf7195b55beac1371ac8c619d880cfe948/pillow-11.3.0-cp314-cp314t-win_arm64.whl", hash = "sha256:79ea0d14d3ebad43ec77ad5272e6ff9bba5b679ef73375ea760261207fa8e0aa", size = 2512835, upload-time = "2025-07-01T09:15:50.399Z" }, - { url = "https://files.pythonhosted.org/packages/9e/e3/6fa84033758276fb31da12e5fb66ad747ae83b93c67af17f8c6ff4cc8f34/pillow-11.3.0-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:7c8ec7a017ad1bd562f93dbd8505763e688d388cde6e4a010ae1486916e713e6", size = 5270566, upload-time = "2025-07-01T09:16:19.801Z" }, - { url = "https://files.pythonhosted.org/packages/5b/ee/e8d2e1ab4892970b561e1ba96cbd59c0d28cf66737fc44abb2aec3795a4e/pillow-11.3.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:9ab6ae226de48019caa8074894544af5b53a117ccb9d3b3dcb2871464c829438", size = 4654618, upload-time = "2025-07-01T09:16:21.818Z" }, - { url = "https://files.pythonhosted.org/packages/f2/6d/17f80f4e1f0761f02160fc433abd4109fa1548dcfdca46cfdadaf9efa565/pillow-11.3.0-pp311-pypy311_pp73-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:fe27fb049cdcca11f11a7bfda64043c37b30e6b91f10cb5bab275806c32f6ab3", size = 4874248, upload-time = 
"2025-07-03T13:11:20.738Z" }, - { url = "https://files.pythonhosted.org/packages/de/5f/c22340acd61cef960130585bbe2120e2fd8434c214802f07e8c03596b17e/pillow-11.3.0-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:465b9e8844e3c3519a983d58b80be3f668e2a7a5db97f2784e7079fbc9f9822c", size = 6583963, upload-time = "2025-07-03T13:11:26.283Z" }, - { url = "https://files.pythonhosted.org/packages/31/5e/03966aedfbfcbb4d5f8aa042452d3361f325b963ebbadddac05b122e47dd/pillow-11.3.0-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5418b53c0d59b3824d05e029669efa023bbef0f3e92e75ec8428f3799487f361", size = 4957170, upload-time = "2025-07-01T09:16:23.762Z" }, - { url = "https://files.pythonhosted.org/packages/cc/2d/e082982aacc927fc2cab48e1e731bdb1643a1406acace8bed0900a61464e/pillow-11.3.0-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:504b6f59505f08ae014f724b6207ff6222662aab5cc9542577fb084ed0676ac7", size = 5581505, upload-time = "2025-07-01T09:16:25.593Z" }, - { url = "https://files.pythonhosted.org/packages/34/e7/ae39f538fd6844e982063c3a5e4598b8ced43b9633baa3a85ef33af8c05c/pillow-11.3.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:c84d689db21a1c397d001aa08241044aa2069e7587b398c8cc63020390b1c1b8", size = 6984598, upload-time = "2025-07-01T09:16:27.732Z" }, -] - -[[package]] -name = "platformdirs" -version = "4.4.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/23/e8/21db9c9987b0e728855bd57bff6984f67952bea55d6f75e055c46b5383e8/platformdirs-4.4.0.tar.gz", hash = "sha256:ca753cf4d81dc309bc67b0ea38fd15dc97bc30ce419a7f58d13eb3bf14c4febf", size = 21634, upload-time = "2025-08-26T14:32:04.268Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/40/4b/2028861e724d3bd36227adfa20d3fd24c3fc6d52032f4a93c133be5d17ce/platformdirs-4.4.0-py3-none-any.whl", hash = 
"sha256:abd01743f24e5287cd7a5db3752faf1a2d65353f38ec26d98e25a6db65958c85", size = 18654, upload-time = "2025-08-26T14:32:02.735Z" }, -] - -[[package]] -name = "pluggy" -version = "1.6.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" }, -] - -[[package]] -name = "ply" -version = "3.11" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e5/69/882ee5c9d017149285cab114ebeab373308ef0f874fcdac9beb90e0ac4da/ply-3.11.tar.gz", hash = "sha256:00c7c1aaa88358b9c765b6d3000c6eec0ba42abca5351b095321aef446081da3", size = 159130, upload-time = "2018-02-15T19:01:31.097Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a3/58/35da89ee790598a0700ea49b2a66594140f44dec458c07e8e3d4979137fc/ply-3.11-py2.py3-none-any.whl", hash = "sha256:096f9b8350b65ebd2fd1346b12452efe5b9607f7482813ffca50c22722a807ce", size = 49567, upload-time = "2018-02-15T19:01:27.172Z" }, -] - -[[package]] -name = "pre-commit" -version = "4.3.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "cfgv" }, - { name = "identify" }, - { name = "nodeenv" }, - { name = "pyyaml" }, - { name = "virtualenv" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ff/29/7cf5bbc236333876e4b41f56e06857a87937ce4bf91e117a6991a2dbb02a/pre_commit-4.3.0.tar.gz", hash = "sha256:499fe450cc9d42e9d58e606262795ecb64dd05438943c62b66f6a8673da30b16", size 
= 193792, upload-time = "2025-08-09T18:56:14.651Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/5b/a5/987a405322d78a73b66e39e4a90e4ef156fd7141bf71df987e50717c321b/pre_commit-4.3.0-py2.py3-none-any.whl", hash = "sha256:2b0747ad7e6e967169136edffee14c16e148a778a54e4f967921aa1ebf2308d8", size = 220965, upload-time = "2025-08-09T18:56:13.192Z" }, -] - -[[package]] -name = "propcache" -version = "0.3.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/a6/16/43264e4a779dd8588c21a70f0709665ee8f611211bdd2c87d952cfa7c776/propcache-0.3.2.tar.gz", hash = "sha256:20d7d62e4e7ef05f221e0db2856b979540686342e7dd9973b815599c7057e168", size = 44139, upload-time = "2025-06-09T22:56:06.081Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/80/8d/e8b436717ab9c2cfc23b116d2c297305aa4cd8339172a456d61ebf5669b8/propcache-0.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:0b8d2f607bd8f80ddc04088bc2a037fdd17884a6fcadc47a96e334d72f3717be", size = 74207, upload-time = "2025-06-09T22:54:05.399Z" }, - { url = "https://files.pythonhosted.org/packages/d6/29/1e34000e9766d112171764b9fa3226fa0153ab565d0c242c70e9945318a7/propcache-0.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:06766d8f34733416e2e34f46fea488ad5d60726bb9481d3cddf89a6fa2d9603f", size = 43648, upload-time = "2025-06-09T22:54:08.023Z" }, - { url = "https://files.pythonhosted.org/packages/46/92/1ad5af0df781e76988897da39b5f086c2bf0f028b7f9bd1f409bb05b6874/propcache-0.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a2dc1f4a1df4fecf4e6f68013575ff4af84ef6f478fe5344317a65d38a8e6dc9", size = 43496, upload-time = "2025-06-09T22:54:09.228Z" }, - { url = "https://files.pythonhosted.org/packages/b3/ce/e96392460f9fb68461fabab3e095cb00c8ddf901205be4eae5ce246e5b7e/propcache-0.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:be29c4f4810c5789cf10ddf6af80b041c724e629fa51e308a7a0fb19ed1ef7bf", size = 
217288, upload-time = "2025-06-09T22:54:10.466Z" }, - { url = "https://files.pythonhosted.org/packages/c5/2a/866726ea345299f7ceefc861a5e782b045545ae6940851930a6adaf1fca6/propcache-0.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:59d61f6970ecbd8ff2e9360304d5c8876a6abd4530cb752c06586849ac8a9dc9", size = 227456, upload-time = "2025-06-09T22:54:11.828Z" }, - { url = "https://files.pythonhosted.org/packages/de/03/07d992ccb6d930398689187e1b3c718339a1c06b8b145a8d9650e4726166/propcache-0.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:62180e0b8dbb6b004baec00a7983e4cc52f5ada9cd11f48c3528d8cfa7b96a66", size = 225429, upload-time = "2025-06-09T22:54:13.823Z" }, - { url = "https://files.pythonhosted.org/packages/5d/e6/116ba39448753b1330f48ab8ba927dcd6cf0baea8a0ccbc512dfb49ba670/propcache-0.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c144ca294a204c470f18cf4c9d78887810d04a3e2fbb30eea903575a779159df", size = 213472, upload-time = "2025-06-09T22:54:15.232Z" }, - { url = "https://files.pythonhosted.org/packages/a6/85/f01f5d97e54e428885a5497ccf7f54404cbb4f906688a1690cd51bf597dc/propcache-0.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c5c2a784234c28854878d68978265617aa6dc0780e53d44b4d67f3651a17a9a2", size = 204480, upload-time = "2025-06-09T22:54:17.104Z" }, - { url = "https://files.pythonhosted.org/packages/e3/79/7bf5ab9033b8b8194cc3f7cf1aaa0e9c3256320726f64a3e1f113a812dce/propcache-0.3.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:5745bc7acdafa978ca1642891b82c19238eadc78ba2aaa293c6863b304e552d7", size = 214530, upload-time = "2025-06-09T22:54:18.512Z" }, - { url = "https://files.pythonhosted.org/packages/31/0b/bd3e0c00509b609317df4a18e6b05a450ef2d9a963e1d8bc9c9415d86f30/propcache-0.3.2-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:c0075bf773d66fa8c9d41f66cc132ecc75e5bb9dd7cce3cfd14adc5ca184cb95", 
size = 205230, upload-time = "2025-06-09T22:54:19.947Z" }, - { url = "https://files.pythonhosted.org/packages/7a/23/fae0ff9b54b0de4e819bbe559508da132d5683c32d84d0dc2ccce3563ed4/propcache-0.3.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5f57aa0847730daceff0497f417c9de353c575d8da3579162cc74ac294c5369e", size = 206754, upload-time = "2025-06-09T22:54:21.716Z" }, - { url = "https://files.pythonhosted.org/packages/b7/7f/ad6a3c22630aaa5f618b4dc3c3598974a72abb4c18e45a50b3cdd091eb2f/propcache-0.3.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:eef914c014bf72d18efb55619447e0aecd5fb7c2e3fa7441e2e5d6099bddff7e", size = 218430, upload-time = "2025-06-09T22:54:23.17Z" }, - { url = "https://files.pythonhosted.org/packages/5b/2c/ba4f1c0e8a4b4c75910742f0d333759d441f65a1c7f34683b4a74c0ee015/propcache-0.3.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2a4092e8549031e82facf3decdbc0883755d5bbcc62d3aea9d9e185549936dcf", size = 223884, upload-time = "2025-06-09T22:54:25.539Z" }, - { url = "https://files.pythonhosted.org/packages/88/e4/ebe30fc399e98572019eee82ad0caf512401661985cbd3da5e3140ffa1b0/propcache-0.3.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:85871b050f174bc0bfb437efbdb68aaf860611953ed12418e4361bc9c392749e", size = 211480, upload-time = "2025-06-09T22:54:26.892Z" }, - { url = "https://files.pythonhosted.org/packages/96/0a/7d5260b914e01d1d0906f7f38af101f8d8ed0dc47426219eeaf05e8ea7c2/propcache-0.3.2-cp311-cp311-win32.whl", hash = "sha256:36c8d9b673ec57900c3554264e630d45980fd302458e4ac801802a7fd2ef7897", size = 37757, upload-time = "2025-06-09T22:54:28.241Z" }, - { url = "https://files.pythonhosted.org/packages/e1/2d/89fe4489a884bc0da0c3278c552bd4ffe06a1ace559db5ef02ef24ab446b/propcache-0.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:e53af8cb6a781b02d2ea079b5b853ba9430fcbe18a8e3ce647d5982a3ff69f39", size = 41500, upload-time = "2025-06-09T22:54:29.4Z" }, - { url = 
"https://files.pythonhosted.org/packages/a8/42/9ca01b0a6f48e81615dca4765a8f1dd2c057e0540f6116a27dc5ee01dfb6/propcache-0.3.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:8de106b6c84506b31c27168582cd3cb3000a6412c16df14a8628e5871ff83c10", size = 73674, upload-time = "2025-06-09T22:54:30.551Z" }, - { url = "https://files.pythonhosted.org/packages/af/6e/21293133beb550f9c901bbece755d582bfaf2176bee4774000bd4dd41884/propcache-0.3.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:28710b0d3975117239c76600ea351934ac7b5ff56e60953474342608dbbb6154", size = 43570, upload-time = "2025-06-09T22:54:32.296Z" }, - { url = "https://files.pythonhosted.org/packages/0c/c8/0393a0a3a2b8760eb3bde3c147f62b20044f0ddac81e9d6ed7318ec0d852/propcache-0.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce26862344bdf836650ed2487c3d724b00fbfec4233a1013f597b78c1cb73615", size = 43094, upload-time = "2025-06-09T22:54:33.929Z" }, - { url = "https://files.pythonhosted.org/packages/37/2c/489afe311a690399d04a3e03b069225670c1d489eb7b044a566511c1c498/propcache-0.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bca54bd347a253af2cf4544bbec232ab982f4868de0dd684246b67a51bc6b1db", size = 226958, upload-time = "2025-06-09T22:54:35.186Z" }, - { url = "https://files.pythonhosted.org/packages/9d/ca/63b520d2f3d418c968bf596839ae26cf7f87bead026b6192d4da6a08c467/propcache-0.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:55780d5e9a2ddc59711d727226bb1ba83a22dd32f64ee15594b9392b1f544eb1", size = 234894, upload-time = "2025-06-09T22:54:36.708Z" }, - { url = "https://files.pythonhosted.org/packages/11/60/1d0ed6fff455a028d678df30cc28dcee7af77fa2b0e6962ce1df95c9a2a9/propcache-0.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:035e631be25d6975ed87ab23153db6a73426a48db688070d925aa27e996fe93c", size = 233672, upload-time = "2025-06-09T22:54:38.062Z" }, - { url = 
"https://files.pythonhosted.org/packages/37/7c/54fd5301ef38505ab235d98827207176a5c9b2aa61939b10a460ca53e123/propcache-0.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ee6f22b6eaa39297c751d0e80c0d3a454f112f5c6481214fcf4c092074cecd67", size = 224395, upload-time = "2025-06-09T22:54:39.634Z" }, - { url = "https://files.pythonhosted.org/packages/ee/1a/89a40e0846f5de05fdc6779883bf46ba980e6df4d2ff8fb02643de126592/propcache-0.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7ca3aee1aa955438c4dba34fc20a9f390e4c79967257d830f137bd5a8a32ed3b", size = 212510, upload-time = "2025-06-09T22:54:41.565Z" }, - { url = "https://files.pythonhosted.org/packages/5e/33/ca98368586c9566a6b8d5ef66e30484f8da84c0aac3f2d9aec6d31a11bd5/propcache-0.3.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7a4f30862869fa2b68380d677cc1c5fcf1e0f2b9ea0cf665812895c75d0ca3b8", size = 222949, upload-time = "2025-06-09T22:54:43.038Z" }, - { url = "https://files.pythonhosted.org/packages/ba/11/ace870d0aafe443b33b2f0b7efdb872b7c3abd505bfb4890716ad7865e9d/propcache-0.3.2-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:b77ec3c257d7816d9f3700013639db7491a434644c906a2578a11daf13176251", size = 217258, upload-time = "2025-06-09T22:54:44.376Z" }, - { url = "https://files.pythonhosted.org/packages/5b/d2/86fd6f7adffcfc74b42c10a6b7db721d1d9ca1055c45d39a1a8f2a740a21/propcache-0.3.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:cab90ac9d3f14b2d5050928483d3d3b8fb6b4018893fc75710e6aa361ecb2474", size = 213036, upload-time = "2025-06-09T22:54:46.243Z" }, - { url = "https://files.pythonhosted.org/packages/07/94/2d7d1e328f45ff34a0a284cf5a2847013701e24c2a53117e7c280a4316b3/propcache-0.3.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:0b504d29f3c47cf6b9e936c1852246c83d450e8e063d50562115a6be6d3a2535", size = 227684, upload-time = "2025-06-09T22:54:47.63Z" }, - { url = 
"https://files.pythonhosted.org/packages/b7/05/37ae63a0087677e90b1d14710e532ff104d44bc1efa3b3970fff99b891dc/propcache-0.3.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:ce2ac2675a6aa41ddb2a0c9cbff53780a617ac3d43e620f8fd77ba1c84dcfc06", size = 234562, upload-time = "2025-06-09T22:54:48.982Z" }, - { url = "https://files.pythonhosted.org/packages/a4/7c/3f539fcae630408d0bd8bf3208b9a647ccad10976eda62402a80adf8fc34/propcache-0.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:62b4239611205294cc433845b914131b2a1f03500ff3c1ed093ed216b82621e1", size = 222142, upload-time = "2025-06-09T22:54:50.424Z" }, - { url = "https://files.pythonhosted.org/packages/7c/d2/34b9eac8c35f79f8a962546b3e97e9d4b990c420ee66ac8255d5d9611648/propcache-0.3.2-cp312-cp312-win32.whl", hash = "sha256:df4a81b9b53449ebc90cc4deefb052c1dd934ba85012aa912c7ea7b7e38b60c1", size = 37711, upload-time = "2025-06-09T22:54:52.072Z" }, - { url = "https://files.pythonhosted.org/packages/19/61/d582be5d226cf79071681d1b46b848d6cb03d7b70af7063e33a2787eaa03/propcache-0.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:7046e79b989d7fe457bb755844019e10f693752d169076138abf17f31380800c", size = 41479, upload-time = "2025-06-09T22:54:53.234Z" }, - { url = "https://files.pythonhosted.org/packages/dc/d1/8c747fafa558c603c4ca19d8e20b288aa0c7cda74e9402f50f31eb65267e/propcache-0.3.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ca592ed634a73ca002967458187109265e980422116c0a107cf93d81f95af945", size = 71286, upload-time = "2025-06-09T22:54:54.369Z" }, - { url = "https://files.pythonhosted.org/packages/61/99/d606cb7986b60d89c36de8a85d58764323b3a5ff07770a99d8e993b3fa73/propcache-0.3.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9ecb0aad4020e275652ba3975740f241bd12a61f1a784df044cf7477a02bc252", size = 42425, upload-time = "2025-06-09T22:54:55.642Z" }, - { url = 
"https://files.pythonhosted.org/packages/8c/96/ef98f91bbb42b79e9bb82bdd348b255eb9d65f14dbbe3b1594644c4073f7/propcache-0.3.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7f08f1cc28bd2eade7a8a3d2954ccc673bb02062e3e7da09bc75d843386b342f", size = 41846, upload-time = "2025-06-09T22:54:57.246Z" }, - { url = "https://files.pythonhosted.org/packages/5b/ad/3f0f9a705fb630d175146cd7b1d2bf5555c9beaed54e94132b21aac098a6/propcache-0.3.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d1a342c834734edb4be5ecb1e9fb48cb64b1e2320fccbd8c54bf8da8f2a84c33", size = 208871, upload-time = "2025-06-09T22:54:58.975Z" }, - { url = "https://files.pythonhosted.org/packages/3a/38/2085cda93d2c8b6ec3e92af2c89489a36a5886b712a34ab25de9fbca7992/propcache-0.3.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8a544caaae1ac73f1fecfae70ded3e93728831affebd017d53449e3ac052ac1e", size = 215720, upload-time = "2025-06-09T22:55:00.471Z" }, - { url = "https://files.pythonhosted.org/packages/61/c1/d72ea2dc83ac7f2c8e182786ab0fc2c7bd123a1ff9b7975bee671866fe5f/propcache-0.3.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:310d11aa44635298397db47a3ebce7db99a4cc4b9bbdfcf6c98a60c8d5261cf1", size = 215203, upload-time = "2025-06-09T22:55:01.834Z" }, - { url = "https://files.pythonhosted.org/packages/af/81/b324c44ae60c56ef12007105f1460d5c304b0626ab0cc6b07c8f2a9aa0b8/propcache-0.3.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c1396592321ac83157ac03a2023aa6cc4a3cc3cfdecb71090054c09e5a7cce3", size = 206365, upload-time = "2025-06-09T22:55:03.199Z" }, - { url = "https://files.pythonhosted.org/packages/09/73/88549128bb89e66d2aff242488f62869014ae092db63ccea53c1cc75a81d/propcache-0.3.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8cabf5b5902272565e78197edb682017d21cf3b550ba0460ee473753f28d23c1", size = 196016, upload-time = 
"2025-06-09T22:55:04.518Z" }, - { url = "https://files.pythonhosted.org/packages/b9/3f/3bdd14e737d145114a5eb83cb172903afba7242f67c5877f9909a20d948d/propcache-0.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0a2f2235ac46a7aa25bdeb03a9e7060f6ecbd213b1f9101c43b3090ffb971ef6", size = 205596, upload-time = "2025-06-09T22:55:05.942Z" }, - { url = "https://files.pythonhosted.org/packages/0f/ca/2f4aa819c357d3107c3763d7ef42c03980f9ed5c48c82e01e25945d437c1/propcache-0.3.2-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:92b69e12e34869a6970fd2f3da91669899994b47c98f5d430b781c26f1d9f387", size = 200977, upload-time = "2025-06-09T22:55:07.792Z" }, - { url = "https://files.pythonhosted.org/packages/cd/4a/e65276c7477533c59085251ae88505caf6831c0e85ff8b2e31ebcbb949b1/propcache-0.3.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:54e02207c79968ebbdffc169591009f4474dde3b4679e16634d34c9363ff56b4", size = 197220, upload-time = "2025-06-09T22:55:09.173Z" }, - { url = "https://files.pythonhosted.org/packages/7c/54/fc7152e517cf5578278b242396ce4d4b36795423988ef39bb8cd5bf274c8/propcache-0.3.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4adfb44cb588001f68c5466579d3f1157ca07f7504fc91ec87862e2b8e556b88", size = 210642, upload-time = "2025-06-09T22:55:10.62Z" }, - { url = "https://files.pythonhosted.org/packages/b9/80/abeb4a896d2767bf5f1ea7b92eb7be6a5330645bd7fb844049c0e4045d9d/propcache-0.3.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:fd3e6019dc1261cd0291ee8919dd91fbab7b169bb76aeef6c716833a3f65d206", size = 212789, upload-time = "2025-06-09T22:55:12.029Z" }, - { url = "https://files.pythonhosted.org/packages/b3/db/ea12a49aa7b2b6d68a5da8293dcf50068d48d088100ac016ad92a6a780e6/propcache-0.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4c181cad81158d71c41a2bce88edce078458e2dd5ffee7eddd6b05da85079f43", size = 205880, upload-time = "2025-06-09T22:55:13.45Z" }, - { url = 
"https://files.pythonhosted.org/packages/d1/e5/9076a0bbbfb65d1198007059c65639dfd56266cf8e477a9707e4b1999ff4/propcache-0.3.2-cp313-cp313-win32.whl", hash = "sha256:8a08154613f2249519e549de2330cf8e2071c2887309a7b07fb56098f5170a02", size = 37220, upload-time = "2025-06-09T22:55:15.284Z" }, - { url = "https://files.pythonhosted.org/packages/d3/f5/b369e026b09a26cd77aa88d8fffd69141d2ae00a2abaaf5380d2603f4b7f/propcache-0.3.2-cp313-cp313-win_amd64.whl", hash = "sha256:e41671f1594fc4ab0a6dec1351864713cb3a279910ae8b58f884a88a0a632c05", size = 40678, upload-time = "2025-06-09T22:55:16.445Z" }, - { url = "https://files.pythonhosted.org/packages/a4/3a/6ece377b55544941a08d03581c7bc400a3c8cd3c2865900a68d5de79e21f/propcache-0.3.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:9a3cf035bbaf035f109987d9d55dc90e4b0e36e04bbbb95af3055ef17194057b", size = 76560, upload-time = "2025-06-09T22:55:17.598Z" }, - { url = "https://files.pythonhosted.org/packages/0c/da/64a2bb16418740fa634b0e9c3d29edff1db07f56d3546ca2d86ddf0305e1/propcache-0.3.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:156c03d07dc1323d8dacaa221fbe028c5c70d16709cdd63502778e6c3ccca1b0", size = 44676, upload-time = "2025-06-09T22:55:18.922Z" }, - { url = "https://files.pythonhosted.org/packages/36/7b/f025e06ea51cb72c52fb87e9b395cced02786610b60a3ed51da8af017170/propcache-0.3.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:74413c0ba02ba86f55cf60d18daab219f7e531620c15f1e23d95563f505efe7e", size = 44701, upload-time = "2025-06-09T22:55:20.106Z" }, - { url = "https://files.pythonhosted.org/packages/a4/00/faa1b1b7c3b74fc277f8642f32a4c72ba1d7b2de36d7cdfb676db7f4303e/propcache-0.3.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f066b437bb3fa39c58ff97ab2ca351db465157d68ed0440abecb21715eb24b28", size = 276934, upload-time = "2025-06-09T22:55:21.5Z" }, - { url = 
"https://files.pythonhosted.org/packages/74/ab/935beb6f1756e0476a4d5938ff44bf0d13a055fed880caf93859b4f1baf4/propcache-0.3.2-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f1304b085c83067914721e7e9d9917d41ad87696bf70f0bc7dee450e9c71ad0a", size = 278316, upload-time = "2025-06-09T22:55:22.918Z" }, - { url = "https://files.pythonhosted.org/packages/f8/9d/994a5c1ce4389610838d1caec74bdf0e98b306c70314d46dbe4fcf21a3e2/propcache-0.3.2-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab50cef01b372763a13333b4e54021bdcb291fc9a8e2ccb9c2df98be51bcde6c", size = 282619, upload-time = "2025-06-09T22:55:24.651Z" }, - { url = "https://files.pythonhosted.org/packages/2b/00/a10afce3d1ed0287cef2e09506d3be9822513f2c1e96457ee369adb9a6cd/propcache-0.3.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fad3b2a085ec259ad2c2842666b2a0a49dea8463579c606426128925af1ed725", size = 265896, upload-time = "2025-06-09T22:55:26.049Z" }, - { url = "https://files.pythonhosted.org/packages/2e/a8/2aa6716ffa566ca57c749edb909ad27884680887d68517e4be41b02299f3/propcache-0.3.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:261fa020c1c14deafd54c76b014956e2f86991af198c51139faf41c4d5e83892", size = 252111, upload-time = "2025-06-09T22:55:27.381Z" }, - { url = "https://files.pythonhosted.org/packages/36/4f/345ca9183b85ac29c8694b0941f7484bf419c7f0fea2d1e386b4f7893eed/propcache-0.3.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:46d7f8aa79c927e5f987ee3a80205c987717d3659f035c85cf0c3680526bdb44", size = 268334, upload-time = "2025-06-09T22:55:28.747Z" }, - { url = "https://files.pythonhosted.org/packages/3e/ca/fcd54f78b59e3f97b3b9715501e3147f5340167733d27db423aa321e7148/propcache-0.3.2-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:6d8f3f0eebf73e3c0ff0e7853f68be638b4043c65a70517bb575eff54edd8dbe", size = 255026, upload-time = "2025-06-09T22:55:30.184Z" }, 
- { url = "https://files.pythonhosted.org/packages/8b/95/8e6a6bbbd78ac89c30c225210a5c687790e532ba4088afb8c0445b77ef37/propcache-0.3.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:03c89c1b14a5452cf15403e291c0ccd7751d5b9736ecb2c5bab977ad6c5bcd81", size = 250724, upload-time = "2025-06-09T22:55:31.646Z" }, - { url = "https://files.pythonhosted.org/packages/ee/b0/0dd03616142baba28e8b2d14ce5df6631b4673850a3d4f9c0f9dd714a404/propcache-0.3.2-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:0cc17efde71e12bbaad086d679ce575268d70bc123a5a71ea7ad76f70ba30bba", size = 268868, upload-time = "2025-06-09T22:55:33.209Z" }, - { url = "https://files.pythonhosted.org/packages/c5/98/2c12407a7e4fbacd94ddd32f3b1e3d5231e77c30ef7162b12a60e2dd5ce3/propcache-0.3.2-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:acdf05d00696bc0447e278bb53cb04ca72354e562cf88ea6f9107df8e7fd9770", size = 271322, upload-time = "2025-06-09T22:55:35.065Z" }, - { url = "https://files.pythonhosted.org/packages/35/91/9cb56efbb428b006bb85db28591e40b7736847b8331d43fe335acf95f6c8/propcache-0.3.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4445542398bd0b5d32df908031cb1b30d43ac848e20470a878b770ec2dcc6330", size = 265778, upload-time = "2025-06-09T22:55:36.45Z" }, - { url = "https://files.pythonhosted.org/packages/9a/4c/b0fe775a2bdd01e176b14b574be679d84fc83958335790f7c9a686c1f468/propcache-0.3.2-cp313-cp313t-win32.whl", hash = "sha256:f86e5d7cd03afb3a1db8e9f9f6eff15794e79e791350ac48a8c924e6f439f394", size = 41175, upload-time = "2025-06-09T22:55:38.436Z" }, - { url = "https://files.pythonhosted.org/packages/a4/ff/47f08595e3d9b5e149c150f88d9714574f1a7cbd89fe2817158a952674bf/propcache-0.3.2-cp313-cp313t-win_amd64.whl", hash = "sha256:9704bedf6e7cbe3c65eca4379a9b53ee6a83749f047808cbb5044d40d7d72198", size = 44857, upload-time = "2025-06-09T22:55:39.687Z" }, - { url = 
"https://files.pythonhosted.org/packages/cc/35/cc0aaecf278bb4575b8555f2b137de5ab821595ddae9da9d3cd1da4072c7/propcache-0.3.2-py3-none-any.whl", hash = "sha256:98f1ec44fb675f5052cccc8e609c46ed23a35a1cfd18545ad4e29002d858a43f", size = 12663, upload-time = "2025-06-09T22:56:04.484Z" }, -] - -[[package]] -name = "proto-plus" -version = "1.26.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "protobuf" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/f4/ac/87285f15f7cce6d4a008f33f1757fb5a13611ea8914eb58c3d0d26243468/proto_plus-1.26.1.tar.gz", hash = "sha256:21a515a4c4c0088a773899e23c7bbade3d18f9c66c73edd4c7ee3816bc96a012", size = 56142, upload-time = "2025-03-10T15:54:38.843Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/4e/6d/280c4c2ce28b1593a19ad5239c8b826871fc6ec275c21afc8e1820108039/proto_plus-1.26.1-py3-none-any.whl", hash = "sha256:13285478c2dcf2abb829db158e1047e2f1e8d63a077d94263c2b88b043c75a66", size = 50163, upload-time = "2025-03-10T15:54:37.335Z" }, -] - -[[package]] -name = "protobuf" -version = "6.32.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/fa/a4/cc17347aa2897568beece2e674674359f911d6fe21b0b8d6268cd42727ac/protobuf-6.32.1.tar.gz", hash = "sha256:ee2469e4a021474ab9baafea6cd070e5bf27c7d29433504ddea1a4ee5850f68d", size = 440635, upload-time = "2025-09-11T21:38:42.935Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c0/98/645183ea03ab3995d29086b8bf4f7562ebd3d10c9a4b14ee3f20d47cfe50/protobuf-6.32.1-cp310-abi3-win32.whl", hash = "sha256:a8a32a84bc9f2aad712041b8b366190f71dde248926da517bde9e832e4412085", size = 424411, upload-time = "2025-09-11T21:38:27.427Z" }, - { url = "https://files.pythonhosted.org/packages/8c/f3/6f58f841f6ebafe076cebeae33fc336e900619d34b1c93e4b5c97a81fdfa/protobuf-6.32.1-cp310-abi3-win_amd64.whl", hash = "sha256:b00a7d8c25fa471f16bc8153d0e53d6c9e827f0953f3c09aaa4331c718cae5e1", size = 
435738, upload-time = "2025-09-11T21:38:30.959Z" }, - { url = "https://files.pythonhosted.org/packages/10/56/a8a3f4e7190837139e68c7002ec749190a163af3e330f65d90309145a210/protobuf-6.32.1-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:d8c7e6eb619ffdf105ee4ab76af5a68b60a9d0f66da3ea12d1640e6d8dab7281", size = 426454, upload-time = "2025-09-11T21:38:34.076Z" }, - { url = "https://files.pythonhosted.org/packages/3f/be/8dd0a927c559b37d7a6c8ab79034fd167dcc1f851595f2e641ad62be8643/protobuf-6.32.1-cp39-abi3-manylinux2014_aarch64.whl", hash = "sha256:2f5b80a49e1eb7b86d85fcd23fe92df154b9730a725c3b38c4e43b9d77018bf4", size = 322874, upload-time = "2025-09-11T21:38:35.509Z" }, - { url = "https://files.pythonhosted.org/packages/5c/f6/88d77011b605ef979aace37b7703e4eefad066f7e84d935e5a696515c2dd/protobuf-6.32.1-cp39-abi3-manylinux2014_x86_64.whl", hash = "sha256:b1864818300c297265c83a4982fd3169f97122c299f56a56e2445c3698d34710", size = 322013, upload-time = "2025-09-11T21:38:37.017Z" }, - { url = "https://files.pythonhosted.org/packages/97/b7/15cc7d93443d6c6a84626ae3258a91f4c6ac8c0edd5df35ea7658f71b79c/protobuf-6.32.1-py3-none-any.whl", hash = "sha256:2601b779fc7d32a866c6b4404f9d42a3f67c5b9f3f15b4db3cccabe06b95c346", size = 169289, upload-time = "2025-09-11T21:38:41.234Z" }, -] - -[[package]] -name = "psutil" -version = "7.0.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/2a/80/336820c1ad9286a4ded7e845b2eccfcb27851ab8ac6abece774a6ff4d3de/psutil-7.0.0.tar.gz", hash = "sha256:7be9c3eba38beccb6495ea33afd982a44074b78f28c434a1f51cc07fd315c456", size = 497003, upload-time = "2025-02-13T21:54:07.946Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ed/e6/2d26234410f8b8abdbf891c9da62bee396583f713fb9f3325a4760875d22/psutil-7.0.0-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:101d71dc322e3cffd7cea0650b09b3d08b8e7c4109dd6809fe452dfd00e58b25", size = 238051, upload-time = "2025-02-13T21:54:12.36Z" }, - { 
url = "https://files.pythonhosted.org/packages/04/8b/30f930733afe425e3cbfc0e1468a30a18942350c1a8816acfade80c005c4/psutil-7.0.0-cp36-abi3-macosx_11_0_arm64.whl", hash = "sha256:39db632f6bb862eeccf56660871433e111b6ea58f2caea825571951d4b6aa3da", size = 239535, upload-time = "2025-02-13T21:54:16.07Z" }, - { url = "https://files.pythonhosted.org/packages/2a/ed/d362e84620dd22876b55389248e522338ed1bf134a5edd3b8231d7207f6d/psutil-7.0.0-cp36-abi3-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1fcee592b4c6f146991ca55919ea3d1f8926497a713ed7faaf8225e174581e91", size = 275004, upload-time = "2025-02-13T21:54:18.662Z" }, - { url = "https://files.pythonhosted.org/packages/bf/b9/b0eb3f3cbcb734d930fdf839431606844a825b23eaf9a6ab371edac8162c/psutil-7.0.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4b1388a4f6875d7e2aff5c4ca1cc16c545ed41dd8bb596cefea80111db353a34", size = 277986, upload-time = "2025-02-13T21:54:21.811Z" }, - { url = "https://files.pythonhosted.org/packages/eb/a2/709e0fe2f093556c17fbafda93ac032257242cabcc7ff3369e2cb76a97aa/psutil-7.0.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a5f098451abc2828f7dc6b58d44b532b22f2088f4999a937557b603ce72b1993", size = 279544, upload-time = "2025-02-13T21:54:24.68Z" }, - { url = "https://files.pythonhosted.org/packages/50/e6/eecf58810b9d12e6427369784efe814a1eec0f492084ce8eb8f4d89d6d61/psutil-7.0.0-cp37-abi3-win32.whl", hash = "sha256:ba3fcef7523064a6c9da440fc4d6bd07da93ac726b5733c29027d7dc95b39d99", size = 241053, upload-time = "2025-02-13T21:54:34.31Z" }, - { url = "https://files.pythonhosted.org/packages/50/1b/6921afe68c74868b4c9fa424dad3be35b095e16687989ebbb50ce4fceb7c/psutil-7.0.0-cp37-abi3-win_amd64.whl", hash = "sha256:4cf3d4eb1aa9b348dec30105c55cd9b7d4629285735a102beb4441e38db90553", size = 244885, upload-time = "2025-02-13T21:54:37.486Z" }, -] - -[[package]] -name = "pwdlib" 
-version = "0.2.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/82/a0/9daed437a6226f632a25d98d65d60ba02bdafa920c90dcb6454c611ead6c/pwdlib-0.2.1.tar.gz", hash = "sha256:9a1d8a8fa09a2f7ebf208265e55d7d008103cbdc82b9e4902ffdd1ade91add5e", size = 11699, upload-time = "2024-08-19T06:48:59.58Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/01/f3/0dae5078a486f0fdf4d4a1121e103bc42694a9da9bea7b0f2c63f29cfbd3/pwdlib-0.2.1-py3-none-any.whl", hash = "sha256:1823dc6f22eae472b540e889ecf57fd424051d6a4023ec0bcf7f0de2d9d7ef8c", size = 8082, upload-time = "2024-08-19T06:49:00.997Z" }, -] - -[package.optional-dependencies] -argon2 = [ - { name = "argon2-cffi" }, -] -bcrypt = [ - { name = "bcrypt" }, -] - -[[package]] -name = "pyarrow" -version = "21.0.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ef/c2/ea068b8f00905c06329a3dfcd40d0fcc2b7d0f2e355bdb25b65e0a0e4cd4/pyarrow-21.0.0.tar.gz", hash = "sha256:5051f2dccf0e283ff56335760cbc8622cf52264d67e359d5569541ac11b6d5bc", size = 1133487, upload-time = "2025-07-18T00:57:31.761Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/94/dc/80564a3071a57c20b7c32575e4a0120e8a330ef487c319b122942d665960/pyarrow-21.0.0-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:c077f48aab61738c237802836fc3844f85409a46015635198761b0d6a688f87b", size = 31243234, upload-time = "2025-07-18T00:55:03.812Z" }, - { url = "https://files.pythonhosted.org/packages/ea/cc/3b51cb2db26fe535d14f74cab4c79b191ed9a8cd4cbba45e2379b5ca2746/pyarrow-21.0.0-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:689f448066781856237eca8d1975b98cace19b8dd2ab6145bf49475478bcaa10", size = 32714370, upload-time = "2025-07-18T00:55:07.495Z" }, - { url = "https://files.pythonhosted.org/packages/24/11/a4431f36d5ad7d83b87146f515c063e4d07ef0b7240876ddb885e6b44f2e/pyarrow-21.0.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = 
"sha256:479ee41399fcddc46159a551705b89c05f11e8b8cb8e968f7fec64f62d91985e", size = 41135424, upload-time = "2025-07-18T00:55:11.461Z" }, - { url = "https://files.pythonhosted.org/packages/74/dc/035d54638fc5d2971cbf1e987ccd45f1091c83bcf747281cf6cc25e72c88/pyarrow-21.0.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:40ebfcb54a4f11bcde86bc586cbd0272bac0d516cfa539c799c2453768477569", size = 42823810, upload-time = "2025-07-18T00:55:16.301Z" }, - { url = "https://files.pythonhosted.org/packages/2e/3b/89fced102448a9e3e0d4dded1f37fa3ce4700f02cdb8665457fcc8015f5b/pyarrow-21.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8d58d8497814274d3d20214fbb24abcad2f7e351474357d552a8d53bce70c70e", size = 43391538, upload-time = "2025-07-18T00:55:23.82Z" }, - { url = "https://files.pythonhosted.org/packages/fb/bb/ea7f1bd08978d39debd3b23611c293f64a642557e8141c80635d501e6d53/pyarrow-21.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:585e7224f21124dd57836b1530ac8f2df2afc43c861d7bf3d58a4870c42ae36c", size = 45120056, upload-time = "2025-07-18T00:55:28.231Z" }, - { url = "https://files.pythonhosted.org/packages/6e/0b/77ea0600009842b30ceebc3337639a7380cd946061b620ac1a2f3cb541e2/pyarrow-21.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:555ca6935b2cbca2c0e932bedd853e9bc523098c39636de9ad4693b5b1df86d6", size = 26220568, upload-time = "2025-07-18T00:55:32.122Z" }, - { url = "https://files.pythonhosted.org/packages/ca/d4/d4f817b21aacc30195cf6a46ba041dd1be827efa4a623cc8bf39a1c2a0c0/pyarrow-21.0.0-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:3a302f0e0963db37e0a24a70c56cf91a4faa0bca51c23812279ca2e23481fccd", size = 31160305, upload-time = "2025-07-18T00:55:35.373Z" }, - { url = "https://files.pythonhosted.org/packages/a2/9c/dcd38ce6e4b4d9a19e1d36914cb8e2b1da4e6003dd075474c4cfcdfe0601/pyarrow-21.0.0-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:b6b27cf01e243871390474a211a7922bfbe3bda21e39bc9160daf0da3fe48876", size = 32684264, upload-time = 
"2025-07-18T00:55:39.303Z" }, - { url = "https://files.pythonhosted.org/packages/4f/74/2a2d9f8d7a59b639523454bec12dba35ae3d0a07d8ab529dc0809f74b23c/pyarrow-21.0.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:e72a8ec6b868e258a2cd2672d91f2860ad532d590ce94cdf7d5e7ec674ccf03d", size = 41108099, upload-time = "2025-07-18T00:55:42.889Z" }, - { url = "https://files.pythonhosted.org/packages/ad/90/2660332eeb31303c13b653ea566a9918484b6e4d6b9d2d46879a33ab0622/pyarrow-21.0.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:b7ae0bbdc8c6674259b25bef5d2a1d6af5d39d7200c819cf99e07f7dfef1c51e", size = 42829529, upload-time = "2025-07-18T00:55:47.069Z" }, - { url = "https://files.pythonhosted.org/packages/33/27/1a93a25c92717f6aa0fca06eb4700860577d016cd3ae51aad0e0488ac899/pyarrow-21.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:58c30a1729f82d201627c173d91bd431db88ea74dcaa3885855bc6203e433b82", size = 43367883, upload-time = "2025-07-18T00:55:53.069Z" }, - { url = "https://files.pythonhosted.org/packages/05/d9/4d09d919f35d599bc05c6950095e358c3e15148ead26292dfca1fb659b0c/pyarrow-21.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:072116f65604b822a7f22945a7a6e581cfa28e3454fdcc6939d4ff6090126623", size = 45133802, upload-time = "2025-07-18T00:55:57.714Z" }, - { url = "https://files.pythonhosted.org/packages/71/30/f3795b6e192c3ab881325ffe172e526499eb3780e306a15103a2764916a2/pyarrow-21.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:cf56ec8b0a5c8c9d7021d6fd754e688104f9ebebf1bf4449613c9531f5346a18", size = 26203175, upload-time = "2025-07-18T00:56:01.364Z" }, - { url = "https://files.pythonhosted.org/packages/16/ca/c7eaa8e62db8fb37ce942b1ea0c6d7abfe3786ca193957afa25e71b81b66/pyarrow-21.0.0-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:e99310a4ebd4479bcd1964dff9e14af33746300cb014aa4a3781738ac63baf4a", size = 31154306, upload-time = "2025-07-18T00:56:04.42Z" }, - { url = 
"https://files.pythonhosted.org/packages/ce/e8/e87d9e3b2489302b3a1aea709aaca4b781c5252fcb812a17ab6275a9a484/pyarrow-21.0.0-cp313-cp313-macosx_12_0_x86_64.whl", hash = "sha256:d2fe8e7f3ce329a71b7ddd7498b3cfac0eeb200c2789bd840234f0dc271a8efe", size = 32680622, upload-time = "2025-07-18T00:56:07.505Z" }, - { url = "https://files.pythonhosted.org/packages/84/52/79095d73a742aa0aba370c7942b1b655f598069489ab387fe47261a849e1/pyarrow-21.0.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:f522e5709379d72fb3da7785aa489ff0bb87448a9dc5a75f45763a795a089ebd", size = 41104094, upload-time = "2025-07-18T00:56:10.994Z" }, - { url = "https://files.pythonhosted.org/packages/89/4b/7782438b551dbb0468892a276b8c789b8bbdb25ea5c5eb27faadd753e037/pyarrow-21.0.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:69cbbdf0631396e9925e048cfa5bce4e8c3d3b41562bbd70c685a8eb53a91e61", size = 42825576, upload-time = "2025-07-18T00:56:15.569Z" }, - { url = "https://files.pythonhosted.org/packages/b3/62/0f29de6e0a1e33518dec92c65be0351d32d7ca351e51ec5f4f837a9aab91/pyarrow-21.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:731c7022587006b755d0bdb27626a1a3bb004bb56b11fb30d98b6c1b4718579d", size = 43368342, upload-time = "2025-07-18T00:56:19.531Z" }, - { url = "https://files.pythonhosted.org/packages/90/c7/0fa1f3f29cf75f339768cc698c8ad4ddd2481c1742e9741459911c9ac477/pyarrow-21.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:dc56bc708f2d8ac71bd1dcb927e458c93cec10b98eb4120206a4091db7b67b99", size = 45131218, upload-time = "2025-07-18T00:56:23.347Z" }, - { url = "https://files.pythonhosted.org/packages/01/63/581f2076465e67b23bc5a37d4a2abff8362d389d29d8105832e82c9c811c/pyarrow-21.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:186aa00bca62139f75b7de8420f745f2af12941595bbbfa7ed3870ff63e25636", size = 26087551, upload-time = "2025-07-18T00:56:26.758Z" }, - { url = 
"https://files.pythonhosted.org/packages/c9/ab/357d0d9648bb8241ee7348e564f2479d206ebe6e1c47ac5027c2e31ecd39/pyarrow-21.0.0-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:a7a102574faa3f421141a64c10216e078df467ab9576684d5cd696952546e2da", size = 31290064, upload-time = "2025-07-18T00:56:30.214Z" }, - { url = "https://files.pythonhosted.org/packages/3f/8a/5685d62a990e4cac2043fc76b4661bf38d06efed55cf45a334b455bd2759/pyarrow-21.0.0-cp313-cp313t-macosx_12_0_x86_64.whl", hash = "sha256:1e005378c4a2c6db3ada3ad4c217b381f6c886f0a80d6a316fe586b90f77efd7", size = 32727837, upload-time = "2025-07-18T00:56:33.935Z" }, - { url = "https://files.pythonhosted.org/packages/fc/de/c0828ee09525c2bafefd3e736a248ebe764d07d0fd762d4f0929dbc516c9/pyarrow-21.0.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:65f8e85f79031449ec8706b74504a316805217b35b6099155dd7e227eef0d4b6", size = 41014158, upload-time = "2025-07-18T00:56:37.528Z" }, - { url = "https://files.pythonhosted.org/packages/6e/26/a2865c420c50b7a3748320b614f3484bfcde8347b2639b2b903b21ce6a72/pyarrow-21.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:3a81486adc665c7eb1a2bde0224cfca6ceaba344a82a971ef059678417880eb8", size = 42667885, upload-time = "2025-07-18T00:56:41.483Z" }, - { url = "https://files.pythonhosted.org/packages/0a/f9/4ee798dc902533159250fb4321267730bc0a107d8c6889e07c3add4fe3a5/pyarrow-21.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:fc0d2f88b81dcf3ccf9a6ae17f89183762c8a94a5bdcfa09e05cfe413acf0503", size = 43276625, upload-time = "2025-07-18T00:56:48.002Z" }, - { url = "https://files.pythonhosted.org/packages/5a/da/e02544d6997037a4b0d22d8e5f66bc9315c3671371a8b18c79ade1cefe14/pyarrow-21.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6299449adf89df38537837487a4f8d3bd91ec94354fdd2a7d30bc11c48ef6e79", size = 44951890, upload-time = "2025-07-18T00:56:52.568Z" }, - { url = 
"https://files.pythonhosted.org/packages/e5/4e/519c1bc1876625fe6b71e9a28287c43ec2f20f73c658b9ae1d485c0c206e/pyarrow-21.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:222c39e2c70113543982c6b34f3077962b44fca38c0bd9e68bb6781534425c10", size = 26371006, upload-time = "2025-07-18T00:56:56.379Z" }, -] - -[[package]] -name = "pyasn1" -version = "0.6.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ba/e9/01f1a64245b89f039897cb0130016d79f77d52669aae6ee7b159a6c4c018/pyasn1-0.6.1.tar.gz", hash = "sha256:6f580d2bdd84365380830acf45550f2511469f673cb4a5ae3857a3170128b034", size = 145322, upload-time = "2024-09-10T22:41:42.55Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c8/f1/d6a797abb14f6283c0ddff96bbdd46937f64122b8c925cab503dd37f8214/pyasn1-0.6.1-py3-none-any.whl", hash = "sha256:0d632f46f2ba09143da3a8afe9e33fb6f92fa2320ab7e886e2d0f7672af84629", size = 83135, upload-time = "2024-09-11T16:00:36.122Z" }, -] - -[[package]] -name = "pyasn1-modules" -version = "0.4.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pyasn1" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/e9/e6/78ebbb10a8c8e4b61a59249394a4a594c1a7af95593dc933a349c8d00964/pyasn1_modules-0.4.2.tar.gz", hash = "sha256:677091de870a80aae844b1ca6134f54652fa2c8c5a52aa396440ac3106e941e6", size = 307892, upload-time = "2025-03-28T02:41:22.17Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/47/8d/d529b5d697919ba8c11ad626e835d4039be708a35b0d22de83a269a6682c/pyasn1_modules-0.4.2-py3-none-any.whl", hash = "sha256:29253a9207ce32b64c3ac6600edc75368f98473906e8fd1043bd6b5b1de2c14a", size = 181259, upload-time = "2025-03-28T02:41:19.028Z" }, -] - -[[package]] -name = "pycparser" -version = "2.23" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/fe/cf/d2d3b9f5699fb1e4615c8e32ff220203e43b248e1dfcc6736ad9057731ca/pycparser-2.23.tar.gz", hash = "sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2", size = 173734, upload-time = "2025-09-09T13:23:47.91Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a0/e3/59cd50310fc9b59512193629e1984c1f95e5c8ae6e5d8c69532ccc65a7fe/pycparser-2.23-py3-none-any.whl", hash = "sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934", size = 118140, upload-time = "2025-09-09T13:23:46.651Z" }, -] - -[[package]] -name = "pydantic" -version = "2.11.9" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "annotated-types" }, - { name = "pydantic-core" }, - { name = "typing-extensions" }, - { name = "typing-inspection" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ff/5d/09a551ba512d7ca404d785072700d3f6727a02f6f3c24ecfd081c7cf0aa8/pydantic-2.11.9.tar.gz", hash = "sha256:6b8ffda597a14812a7975c90b82a8a2e777d9257aba3453f973acd3c032a18e2", size = 788495, upload-time = "2025-09-13T11:26:39.325Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/3e/d3/108f2006987c58e76691d5ae5d200dd3e0f532cb4e5fa3560751c3a1feba/pydantic-2.11.9-py3-none-any.whl", hash = "sha256:c42dd626f5cfc1c6950ce6205ea58c93efa406da65f479dcb4029d5934857da2", size = 444855, upload-time = "2025-09-13T11:26:36.909Z" }, -] - -[package.optional-dependencies] -email = [ - { name = "email-validator" }, -] - -[[package]] -name = "pydantic-core" -version = "2.33.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ad/88/5f2260bdfae97aabf98f1778d43f69574390ad787afb646292a638c923d4/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc", size = 435195, upload-time = "2025-04-23T18:33:52.104Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/3f/8d/71db63483d518cbbf290261a1fc2839d17ff89fce7089e08cad07ccfce67/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7", size = 2028584, upload-time = "2025-04-23T18:31:03.106Z" }, - { url = "https://files.pythonhosted.org/packages/24/2f/3cfa7244ae292dd850989f328722d2aef313f74ffc471184dc509e1e4e5a/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246", size = 1855071, upload-time = "2025-04-23T18:31:04.621Z" }, - { url = "https://files.pythonhosted.org/packages/b3/d3/4ae42d33f5e3f50dd467761304be2fa0a9417fbf09735bc2cce003480f2a/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f", size = 1897823, upload-time = "2025-04-23T18:31:06.377Z" }, - { url = "https://files.pythonhosted.org/packages/f4/f3/aa5976e8352b7695ff808599794b1fba2a9ae2ee954a3426855935799488/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc", size = 1983792, upload-time = "2025-04-23T18:31:07.93Z" }, - { url = "https://files.pythonhosted.org/packages/d5/7a/cda9b5a23c552037717f2b2a5257e9b2bfe45e687386df9591eff7b46d28/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de", size = 2136338, upload-time = "2025-04-23T18:31:09.283Z" }, - { url = "https://files.pythonhosted.org/packages/2b/9f/b8f9ec8dd1417eb9da784e91e1667d58a2a4a7b7b34cf4af765ef663a7e5/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a", size = 2730998, upload-time = "2025-04-23T18:31:11.7Z" }, - { 
url = "https://files.pythonhosted.org/packages/47/bc/cd720e078576bdb8255d5032c5d63ee5c0bf4b7173dd955185a1d658c456/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef", size = 2003200, upload-time = "2025-04-23T18:31:13.536Z" }, - { url = "https://files.pythonhosted.org/packages/ca/22/3602b895ee2cd29d11a2b349372446ae9727c32e78a94b3d588a40fdf187/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e", size = 2113890, upload-time = "2025-04-23T18:31:15.011Z" }, - { url = "https://files.pythonhosted.org/packages/ff/e6/e3c5908c03cf00d629eb38393a98fccc38ee0ce8ecce32f69fc7d7b558a7/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d", size = 2073359, upload-time = "2025-04-23T18:31:16.393Z" }, - { url = "https://files.pythonhosted.org/packages/12/e7/6a36a07c59ebefc8777d1ffdaf5ae71b06b21952582e4b07eba88a421c79/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30", size = 2245883, upload-time = "2025-04-23T18:31:17.892Z" }, - { url = "https://files.pythonhosted.org/packages/16/3f/59b3187aaa6cc0c1e6616e8045b284de2b6a87b027cce2ffcea073adf1d2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf", size = 2241074, upload-time = "2025-04-23T18:31:19.205Z" }, - { url = "https://files.pythonhosted.org/packages/e0/ed/55532bb88f674d5d8f67ab121a2a13c385df382de2a1677f30ad385f7438/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51", size = 1910538, upload-time = "2025-04-23T18:31:20.541Z" }, - { url = 
"https://files.pythonhosted.org/packages/fe/1b/25b7cccd4519c0b23c2dd636ad39d381abf113085ce4f7bec2b0dc755eb1/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab", size = 1952909, upload-time = "2025-04-23T18:31:22.371Z" }, - { url = "https://files.pythonhosted.org/packages/49/a9/d809358e49126438055884c4366a1f6227f0f84f635a9014e2deb9b9de54/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65", size = 1897786, upload-time = "2025-04-23T18:31:24.161Z" }, - { url = "https://files.pythonhosted.org/packages/18/8a/2b41c97f554ec8c71f2a8a5f85cb56a8b0956addfe8b0efb5b3d77e8bdc3/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc", size = 2009000, upload-time = "2025-04-23T18:31:25.863Z" }, - { url = "https://files.pythonhosted.org/packages/a1/02/6224312aacb3c8ecbaa959897af57181fb6cf3a3d7917fd44d0f2917e6f2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7", size = 1847996, upload-time = "2025-04-23T18:31:27.341Z" }, - { url = "https://files.pythonhosted.org/packages/d6/46/6dcdf084a523dbe0a0be59d054734b86a981726f221f4562aed313dbcb49/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025", size = 1880957, upload-time = "2025-04-23T18:31:28.956Z" }, - { url = "https://files.pythonhosted.org/packages/ec/6b/1ec2c03837ac00886ba8160ce041ce4e325b41d06a034adbef11339ae422/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011", size = 1964199, upload-time = "2025-04-23T18:31:31.025Z" }, - { url = 
"https://files.pythonhosted.org/packages/2d/1d/6bf34d6adb9debd9136bd197ca72642203ce9aaaa85cfcbfcf20f9696e83/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f", size = 2120296, upload-time = "2025-04-23T18:31:32.514Z" }, - { url = "https://files.pythonhosted.org/packages/e0/94/2bd0aaf5a591e974b32a9f7123f16637776c304471a0ab33cf263cf5591a/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88", size = 2676109, upload-time = "2025-04-23T18:31:33.958Z" }, - { url = "https://files.pythonhosted.org/packages/f9/41/4b043778cf9c4285d59742281a769eac371b9e47e35f98ad321349cc5d61/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1", size = 2002028, upload-time = "2025-04-23T18:31:39.095Z" }, - { url = "https://files.pythonhosted.org/packages/cb/d5/7bb781bf2748ce3d03af04d5c969fa1308880e1dca35a9bd94e1a96a922e/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b", size = 2100044, upload-time = "2025-04-23T18:31:41.034Z" }, - { url = "https://files.pythonhosted.org/packages/fe/36/def5e53e1eb0ad896785702a5bbfd25eed546cdcf4087ad285021a90ed53/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1", size = 2058881, upload-time = "2025-04-23T18:31:42.757Z" }, - { url = "https://files.pythonhosted.org/packages/01/6c/57f8d70b2ee57fc3dc8b9610315949837fa8c11d86927b9bb044f8705419/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6", size = 2227034, upload-time = "2025-04-23T18:31:44.304Z" }, - { url 
= "https://files.pythonhosted.org/packages/27/b9/9c17f0396a82b3d5cbea4c24d742083422639e7bb1d5bf600e12cb176a13/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea", size = 2234187, upload-time = "2025-04-23T18:31:45.891Z" }, - { url = "https://files.pythonhosted.org/packages/b0/6a/adf5734ffd52bf86d865093ad70b2ce543415e0e356f6cacabbc0d9ad910/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290", size = 1892628, upload-time = "2025-04-23T18:31:47.819Z" }, - { url = "https://files.pythonhosted.org/packages/43/e4/5479fecb3606c1368d496a825d8411e126133c41224c1e7238be58b87d7e/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2", size = 1955866, upload-time = "2025-04-23T18:31:49.635Z" }, - { url = "https://files.pythonhosted.org/packages/0d/24/8b11e8b3e2be9dd82df4b11408a67c61bb4dc4f8e11b5b0fc888b38118b5/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab", size = 1888894, upload-time = "2025-04-23T18:31:51.609Z" }, - { url = "https://files.pythonhosted.org/packages/46/8c/99040727b41f56616573a28771b1bfa08a3d3fe74d3d513f01251f79f172/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f", size = 2015688, upload-time = "2025-04-23T18:31:53.175Z" }, - { url = "https://files.pythonhosted.org/packages/3a/cc/5999d1eb705a6cefc31f0b4a90e9f7fc400539b1a1030529700cc1b51838/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6", size = 1844808, upload-time = "2025-04-23T18:31:54.79Z" }, - { url = 
"https://files.pythonhosted.org/packages/6f/5e/a0a7b8885c98889a18b6e376f344da1ef323d270b44edf8174d6bce4d622/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef", size = 1885580, upload-time = "2025-04-23T18:31:57.393Z" }, - { url = "https://files.pythonhosted.org/packages/3b/2a/953581f343c7d11a304581156618c3f592435523dd9d79865903272c256a/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a", size = 1973859, upload-time = "2025-04-23T18:31:59.065Z" }, - { url = "https://files.pythonhosted.org/packages/e6/55/f1a813904771c03a3f97f676c62cca0c0a4138654107c1b61f19c644868b/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916", size = 2120810, upload-time = "2025-04-23T18:32:00.78Z" }, - { url = "https://files.pythonhosted.org/packages/aa/c3/053389835a996e18853ba107a63caae0b9deb4a276c6b472931ea9ae6e48/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a", size = 2676498, upload-time = "2025-04-23T18:32:02.418Z" }, - { url = "https://files.pythonhosted.org/packages/eb/3c/f4abd740877a35abade05e437245b192f9d0ffb48bbbbd708df33d3cda37/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d", size = 2000611, upload-time = "2025-04-23T18:32:04.152Z" }, - { url = "https://files.pythonhosted.org/packages/59/a7/63ef2fed1837d1121a894d0ce88439fe3e3b3e48c7543b2a4479eb99c2bd/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56", size = 2107924, 
upload-time = "2025-04-23T18:32:06.129Z" }, - { url = "https://files.pythonhosted.org/packages/04/8f/2551964ef045669801675f1cfc3b0d74147f4901c3ffa42be2ddb1f0efc4/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5", size = 2063196, upload-time = "2025-04-23T18:32:08.178Z" }, - { url = "https://files.pythonhosted.org/packages/26/bd/d9602777e77fc6dbb0c7db9ad356e9a985825547dce5ad1d30ee04903918/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e", size = 2236389, upload-time = "2025-04-23T18:32:10.242Z" }, - { url = "https://files.pythonhosted.org/packages/42/db/0e950daa7e2230423ab342ae918a794964b053bec24ba8af013fc7c94846/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162", size = 2239223, upload-time = "2025-04-23T18:32:12.382Z" }, - { url = "https://files.pythonhosted.org/packages/58/4d/4f937099c545a8a17eb52cb67fe0447fd9a373b348ccfa9a87f141eeb00f/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849", size = 1900473, upload-time = "2025-04-23T18:32:14.034Z" }, - { url = "https://files.pythonhosted.org/packages/a0/75/4a0a9bac998d78d889def5e4ef2b065acba8cae8c93696906c3a91f310ca/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9", size = 1955269, upload-time = "2025-04-23T18:32:15.783Z" }, - { url = "https://files.pythonhosted.org/packages/f9/86/1beda0576969592f1497b4ce8e7bc8cbdf614c352426271b1b10d5f0aa64/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9", size = 1893921, upload-time = "2025-04-23T18:32:18.473Z" }, - { url = 
"https://files.pythonhosted.org/packages/a4/7d/e09391c2eebeab681df2b74bfe6c43422fffede8dc74187b2b0bf6fd7571/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac", size = 1806162, upload-time = "2025-04-23T18:32:20.188Z" }, - { url = "https://files.pythonhosted.org/packages/f1/3d/847b6b1fed9f8ed3bb95a9ad04fbd0b212e832d4f0f50ff4d9ee5a9f15cf/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5", size = 1981560, upload-time = "2025-04-23T18:32:22.354Z" }, - { url = "https://files.pythonhosted.org/packages/6f/9a/e73262f6c6656262b5fdd723ad90f518f579b7bc8622e43a942eec53c938/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9", size = 1935777, upload-time = "2025-04-23T18:32:25.088Z" }, - { url = "https://files.pythonhosted.org/packages/7b/27/d4ae6487d73948d6f20dddcd94be4ea43e74349b56eba82e9bdee2d7494c/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8", size = 2025200, upload-time = "2025-04-23T18:33:14.199Z" }, - { url = "https://files.pythonhosted.org/packages/f1/b8/b3cb95375f05d33801024079b9392a5ab45267a63400bf1866e7ce0f0de4/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593", size = 1859123, upload-time = "2025-04-23T18:33:16.555Z" }, - { url = "https://files.pythonhosted.org/packages/05/bc/0d0b5adeda59a261cd30a1235a445bf55c7e46ae44aea28f7bd6ed46e091/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612", size = 1892852, upload-time = "2025-04-23T18:33:18.513Z" }, - { url = 
"https://files.pythonhosted.org/packages/3e/11/d37bdebbda2e449cb3f519f6ce950927b56d62f0b84fd9cb9e372a26a3d5/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7", size = 2067484, upload-time = "2025-04-23T18:33:20.475Z" }, - { url = "https://files.pythonhosted.org/packages/8c/55/1f95f0a05ce72ecb02a8a8a1c3be0579bbc29b1d5ab68f1378b7bebc5057/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e", size = 2108896, upload-time = "2025-04-23T18:33:22.501Z" }, - { url = "https://files.pythonhosted.org/packages/53/89/2b2de6c81fa131f423246a9109d7b2a375e83968ad0800d6e57d0574629b/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8", size = 2069475, upload-time = "2025-04-23T18:33:24.528Z" }, - { url = "https://files.pythonhosted.org/packages/b8/e9/1f7efbe20d0b2b10f6718944b5d8ece9152390904f29a78e68d4e7961159/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf", size = 2239013, upload-time = "2025-04-23T18:33:26.621Z" }, - { url = "https://files.pythonhosted.org/packages/3c/b2/5309c905a93811524a49b4e031e9851a6b00ff0fb668794472ea7746b448/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb", size = 2238715, upload-time = "2025-04-23T18:33:28.656Z" }, - { url = "https://files.pythonhosted.org/packages/32/56/8a7ca5d2cd2cda1d245d34b1c9a942920a718082ae8e54e5f3e5a58b7add/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1", size = 2066757, upload-time = "2025-04-23T18:33:30.645Z" }, -] - -[[package]] 
-name = "pydantic-settings" -version = "2.10.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pydantic" }, - { name = "python-dotenv" }, - { name = "typing-inspection" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/68/85/1ea668bbab3c50071ca613c6ab30047fb36ab0da1b92fa8f17bbc38fd36c/pydantic_settings-2.10.1.tar.gz", hash = "sha256:06f0062169818d0f5524420a360d632d5857b83cffd4d42fe29597807a1614ee", size = 172583, upload-time = "2025-06-24T13:26:46.841Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/58/f0/427018098906416f580e3cf1366d3b1abfb408a0652e9f31600c24a1903c/pydantic_settings-2.10.1-py3-none-any.whl", hash = "sha256:a60952460b99cf661dc25c29c0ef171721f98bfcb52ef8d9ea4c943d7c8cc796", size = 45235, upload-time = "2025-06-24T13:26:45.485Z" }, -] - -[[package]] -name = "pygments" -version = "2.19.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" }, -] - -[[package]] -name = "pyjwt" -version = "2.10.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e7/46/bd74733ff231675599650d3e47f361794b22ef3e3770998dda30d3b63726/pyjwt-2.10.1.tar.gz", hash = "sha256:3cc5772eb20009233caf06e9d8a0577824723b44e6648ee0a2aedb6cf9381953", size = 87785, upload-time = "2024-11-28T03:43:29.933Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/61/ad/689f02752eeec26aed679477e80e632ef1b682313be70793d798c1d5fc8f/PyJWT-2.10.1-py3-none-any.whl", hash = "sha256:dcdd193e30abefd5debf142f9adfcdd2b58004e644f25406ffaebd50bd98dacb", size = 22997, upload-time = "2024-11-28T03:43:27.893Z" }, -] - -[package.optional-dependencies] -crypto = [ - { name = "cryptography" }, -] - -[[package]] -name = "pylance" -version = "0.36.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "numpy" }, - { name = "pyarrow" }, -] -wheels = [ - { url = "https://files.pythonhosted.org/packages/09/13/f7f029d12a3dfdc9f3059d77b3999d40f9cc064ba85fef885a08bf65dcb2/pylance-0.36.0-cp39-abi3-macosx_10_15_x86_64.whl", hash = "sha256:160ed088dc5fb63a71c8c96640d43ea58464f64bca8aa23b0337b1a96fd47b79", size = 43403867, upload-time = "2025-09-12T20:29:25.507Z" }, - { url = "https://files.pythonhosted.org/packages/95/95/defad18786260653b33d5ef8223736c0e481861c8d33311756bd471468ad/pylance-0.36.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:ce43ad002b4e67ffb1a33925d05d472bbde77c57a5e84aca1728faa9ace0c086", size = 39777498, upload-time = "2025-09-12T20:27:02.906Z" }, - { url = "https://files.pythonhosted.org/packages/19/33/7080ed4e45648d8c803a49cd5a206eb95176ef9dc06bff26748ec2109c65/pylance-0.36.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ad7b168b0d4b7864be6040bebaf6d9a3959e76a190ff401a84b165b75eade96", size = 41819489, upload-time = "2025-09-12T20:17:06.37Z" }, - { url = "https://files.pythonhosted.org/packages/29/9a/0c572994d96e03e70481dafb2b062033a9ce24beb5ac6045f00f013ca57c/pylance-0.36.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:353deeb7b19be505db490258b5f2fc897efd4a45255fa0d51455662e01ad59ab", size = 45366480, upload-time = "2025-09-12T20:19:53.924Z" }, - { url = 
"https://files.pythonhosted.org/packages/fe/82/a74f0436b6a983c2798d1f44699352cd98c42bc335781ece98a878cf63fb/pylance-0.36.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:9cd963fc22257591d1daf281fa2369e05299d78950cb11980aa099d7cbacdf00", size = 41833322, upload-time = "2025-09-12T20:17:40.784Z" }, - { url = "https://files.pythonhosted.org/packages/a8/f2/d28fa3487992c3bd46af6838da13cf9a00be24fcf4cf928f77feec52d8d6/pylance-0.36.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:40117569a87379e08ed12eccac658999158f81df946f2ed02693b77776b57597", size = 45347065, upload-time = "2025-09-12T20:19:26.435Z" }, - { url = "https://files.pythonhosted.org/packages/ff/ab/e7fc302950f1c6815a6e832d052d0860130374bfe4bd482b075299dc8384/pylance-0.36.0-cp39-abi3-win_amd64.whl", hash = "sha256:a2930738192e5075220bc38c8a58ff4e48a71d53b3ca2a577ffce0318609cac0", size = 46348996, upload-time = "2025-09-12T20:36:04.663Z" }, -] - -[[package]] -name = "pympler" -version = "1.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pywin32", marker = "sys_platform == 'win32'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/dd/37/c384631908029676d8e7213dd956bb686af303a80db7afbc9be36bc49495/pympler-1.1.tar.gz", hash = "sha256:1eaa867cb8992c218430f1708fdaccda53df064144d1c5656b1e6f1ee6000424", size = 179954, upload-time = "2024-06-28T19:56:06.563Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/79/4f/a6a2e2b202d7fd97eadfe90979845b8706676b41cbd3b42ba75adf329d1f/Pympler-1.1-py3-none-any.whl", hash = "sha256:5b223d6027d0619584116a0cbc28e8d2e378f7a79c1e5e024f9ff3b673c58506", size = 165766, upload-time = "2024-06-28T19:56:05.087Z" }, -] - -[[package]] -name = "pyparsing" -version = "3.2.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/98/c9/b4594e6a81371dfa9eb7a2c110ad682acf985d96115ae8b25a1d63b4bf3b/pyparsing-3.2.4.tar.gz", hash = 
"sha256:fff89494f45559d0f2ce46613b419f632bbb6afbdaed49696d322bcf98a58e99", size = 1098809, upload-time = "2025-09-13T05:47:19.732Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/53/b8/fbab973592e23ae313042d450fc26fa24282ebffba21ba373786e1ce63b4/pyparsing-3.2.4-py3-none-any.whl", hash = "sha256:91d0fcde680d42cd031daf3a6ba20da3107e08a75de50da58360e7d94ab24d36", size = 113869, upload-time = "2025-09-13T05:47:17.863Z" }, -] - -[[package]] -name = "pypdf" -version = "6.0.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/20/ac/a300a03c3b34967c050677ccb16e7a4b65607ee5df9d51e8b6d713de4098/pypdf-6.0.0.tar.gz", hash = "sha256:282a99d2cc94a84a3a3159f0d9358c0af53f85b4d28d76ea38b96e9e5ac2a08d", size = 5033827, upload-time = "2025-08-11T14:22:02.352Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/2c/83/2cacc506eb322bb31b747bc06ccb82cc9aa03e19ee9c1245e538e49d52be/pypdf-6.0.0-py3-none-any.whl", hash = "sha256:56ea60100ce9f11fc3eec4f359da15e9aec3821b036c1f06d2b660d35683abb8", size = 310465, upload-time = "2025-08-11T14:22:00.481Z" }, -] - -[[package]] -name = "pyperclip" -version = "1.10.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/15/99/25f4898cf420efb6f45f519de018f4faea5391114a8618b16736ef3029f1/pyperclip-1.10.0.tar.gz", hash = "sha256:180c8346b1186921c75dfd14d9048a6b5d46bfc499778811952c6dd6eb1ca6be", size = 12193, upload-time = "2025-09-18T00:54:00.384Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1e/bc/22540e73c5f5ae18f02924cd3954a6c9a4aa6b713c841a94c98335d333a1/pyperclip-1.10.0-py3-none-any.whl", hash = "sha256:596fbe55dc59263bff26e61d2afbe10223e2fccb5210c9c96a28d6887cfcc7ec", size = 11062, upload-time = "2025-09-18T00:53:59.252Z" }, -] - -[[package]] -name = "pyreadline3" -version = "3.5.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/0f/49/4cea918a08f02817aabae639e3d0ac046fef9f9180518a3ad394e22da148/pyreadline3-3.5.4.tar.gz", hash = "sha256:8d57d53039a1c75adba8e50dd3d992b28143480816187ea5efbd5c78e6c885b7", size = 99839, upload-time = "2024-09-19T02:40:10.062Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/5a/dc/491b7661614ab97483abf2056be1deee4dc2490ecbf7bff9ab5cdbac86e1/pyreadline3-3.5.4-py3-none-any.whl", hash = "sha256:eaf8e6cc3c49bcccf145fc6067ba8643d1df34d604a1ec0eccbf7a18e6d3fae6", size = 83178, upload-time = "2024-09-19T02:40:08.598Z" }, -] - -[[package]] -name = "pytest" -version = "8.4.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "colorama", marker = "sys_platform == 'win32'" }, - { name = "iniconfig" }, - { name = "packaging" }, - { name = "pluggy" }, - { name = "pygments" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/a3/5c/00a0e072241553e1a7496d638deababa67c5058571567b92a7eaa258397c/pytest-8.4.2.tar.gz", hash = "sha256:86c0d0b93306b961d58d62a4db4879f27fe25513d4b969df351abdddb3c30e01", size = 1519618, upload-time = "2025-09-04T14:34:22.711Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a8/a4/20da314d277121d6534b3a980b29035dcd51e6744bd79075a6ce8fa4eb8d/pytest-8.4.2-py3-none-any.whl", hash = "sha256:872f880de3fc3a5bdc88a11b39c9710c3497a547cfa9320bc3c5e62fbf272e79", size = 365750, upload-time = "2025-09-04T14:34:20.226Z" }, -] - -[[package]] -name = "pytest-asyncio" -version = "1.2.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pytest" }, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/42/86/9e3c5f48f7b7b638b216e4b9e645f54d199d7abbbab7a64a13b4e12ba10f/pytest_asyncio-1.2.0.tar.gz", hash = "sha256:c609a64a2a8768462d0c99811ddb8bd2583c33fd33cf7f21af1c142e824ffb57", size = 50119, upload-time = "2025-09-12T07:33:53.816Z" } -wheels = [ - 
{ url = "https://files.pythonhosted.org/packages/04/93/2fa34714b7a4ae72f2f8dad66ba17dd9a2c793220719e736dda28b7aec27/pytest_asyncio-1.2.0-py3-none-any.whl", hash = "sha256:8e17ae5e46d8e7efe51ab6494dd2010f4ca8dae51652aa3c8d55acf50bfb2e99", size = 15095, upload-time = "2025-09-12T07:33:52.639Z" }, -] - -[[package]] -name = "python-dateutil" -version = "2.9.0.post0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "six" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" }, -] - -[[package]] -name = "python-dotenv" -version = "1.1.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f6/b0/4bc07ccd3572a2f9df7e6782f52b0c6c90dcbb803ac4a167702d7d0dfe1e/python_dotenv-1.1.1.tar.gz", hash = "sha256:a8a6399716257f45be6a007360200409fce5cda2661e3dec71d23dc15f6189ab", size = 41978, upload-time = "2025-06-24T04:21:07.341Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/5f/ed/539768cf28c661b5b068d66d96a2f155c4971a5d55684a514c1a0e0dec2f/python_dotenv-1.1.1-py3-none-any.whl", hash = "sha256:31f23644fe2602f88ff55e1f5c79ba497e01224ee7737937930c448e4d0e24dc", size = 20556, upload-time = "2025-06-24T04:21:06.073Z" }, -] - -[[package]] -name = "python-magic-bin" -version = "0.4.14" -source = { registry = "https://pypi.org/simple" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/5a/5d/10b9ac745d9fd2f7151a2ab901e6bb6983dbd70e87c71111f54859d1ca2e/python_magic_bin-0.4.14-py2.py3-none-win32.whl", hash = "sha256:34a788c03adde7608028203e2dbb208f1f62225ad91518787ae26d603ae68892", size = 397784, upload-time = "2017-10-02T16:30:15.806Z" }, - { url = "https://files.pythonhosted.org/packages/07/c2/094e3d62b906d952537196603a23aec4bcd7c6126bf80eb14e6f9f4be3a2/python_magic_bin-0.4.14-py2.py3-none-win_amd64.whl", hash = "sha256:90be6206ad31071a36065a2fc169c5afb5e0355cbe6030e87641c6c62edc2b69", size = 409299, upload-time = "2017-10-02T16:30:18.545Z" }, -] - -[[package]] -name = "python-multipart" -version = "0.0.20" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f3/87/f44d7c9f274c7ee665a29b885ec97089ec5dc034c7f3fafa03da9e39a09e/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13", size = 37158, upload-time = "2024-12-16T19:45:46.972Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546, upload-time = "2024-12-16T19:45:44.423Z" }, -] - -[[package]] -name = "pytz" -version = "2025.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/f8/bf/abbd3cdfb8fbc7fb3d4d38d320f2441b1e7cbe29be4f23797b4a2b5d8aac/pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3", size = 320884, upload-time = "2025-03-25T02:25:00.538Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/81/c4/34e93fe5f5429d7570ec1fa436f1986fb1f00c3e0f43a589fe2bbcd22c3f/pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00", size = 509225, upload-time 
= "2025-03-25T02:24:58.468Z" }, -] - -[[package]] -name = "pywin32" -version = "311" -source = { registry = "https://pypi.org/simple" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7c/af/449a6a91e5d6db51420875c54f6aff7c97a86a3b13a0b4f1a5c13b988de3/pywin32-311-cp311-cp311-win32.whl", hash = "sha256:184eb5e436dea364dcd3d2316d577d625c0351bf237c4e9a5fabbcfa5a58b151", size = 8697031, upload-time = "2025-07-14T20:13:13.266Z" }, - { url = "https://files.pythonhosted.org/packages/51/8f/9bb81dd5bb77d22243d33c8397f09377056d5c687aa6d4042bea7fbf8364/pywin32-311-cp311-cp311-win_amd64.whl", hash = "sha256:3ce80b34b22b17ccbd937a6e78e7225d80c52f5ab9940fe0506a1a16f3dab503", size = 9508308, upload-time = "2025-07-14T20:13:15.147Z" }, - { url = "https://files.pythonhosted.org/packages/44/7b/9c2ab54f74a138c491aba1b1cd0795ba61f144c711daea84a88b63dc0f6c/pywin32-311-cp311-cp311-win_arm64.whl", hash = "sha256:a733f1388e1a842abb67ffa8e7aad0e70ac519e09b0f6a784e65a136ec7cefd2", size = 8703930, upload-time = "2025-07-14T20:13:16.945Z" }, - { url = "https://files.pythonhosted.org/packages/e7/ab/01ea1943d4eba0f850c3c61e78e8dd59757ff815ff3ccd0a84de5f541f42/pywin32-311-cp312-cp312-win32.whl", hash = "sha256:750ec6e621af2b948540032557b10a2d43b0cee2ae9758c54154d711cc852d31", size = 8706543, upload-time = "2025-07-14T20:13:20.765Z" }, - { url = "https://files.pythonhosted.org/packages/d1/a8/a0e8d07d4d051ec7502cd58b291ec98dcc0c3fff027caad0470b72cfcc2f/pywin32-311-cp312-cp312-win_amd64.whl", hash = "sha256:b8c095edad5c211ff31c05223658e71bf7116daa0ecf3ad85f3201ea3190d067", size = 9495040, upload-time = "2025-07-14T20:13:22.543Z" }, - { url = "https://files.pythonhosted.org/packages/ba/3a/2ae996277b4b50f17d61f0603efd8253cb2d79cc7ae159468007b586396d/pywin32-311-cp312-cp312-win_arm64.whl", hash = "sha256:e286f46a9a39c4a18b319c28f59b61de793654af2f395c102b4f819e584b5852", size = 8710102, upload-time = "2025-07-14T20:13:24.682Z" }, - { url = 
"https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = "sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload-time = "2025-07-14T20:13:26.471Z" }, - { url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload-time = "2025-07-14T20:13:28.243Z" }, - { url = "https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload-time = "2025-07-14T20:13:30.348Z" }, - { url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = "sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload-time = "2025-07-14T20:13:32.449Z" }, - { url = "https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload-time = "2025-07-14T20:13:34.312Z" }, - { url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload-time = "2025-07-14T20:13:36.379Z" }, -] - -[[package]] -name = "pyyaml" -version = "6.0.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631, upload-time = "2024-08-06T20:33:50.674Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612, upload-time = "2024-08-06T20:32:03.408Z" }, - { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040, upload-time = "2024-08-06T20:32:04.926Z" }, - { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829, upload-time = "2024-08-06T20:32:06.459Z" }, - { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167, upload-time = "2024-08-06T20:32:08.338Z" }, - { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952, upload-time = "2024-08-06T20:32:14.124Z" }, - { url = 
"https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301, upload-time = "2024-08-06T20:32:16.17Z" }, - { url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638, upload-time = "2024-08-06T20:32:18.555Z" }, - { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850, upload-time = "2024-08-06T20:32:19.889Z" }, - { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980, upload-time = "2024-08-06T20:32:21.273Z" }, - { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873, upload-time = "2024-08-06T20:32:25.131Z" }, - { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302, upload-time = "2024-08-06T20:32:26.511Z" }, - { url = 
"https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154, upload-time = "2024-08-06T20:32:28.363Z" }, - { url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223, upload-time = "2024-08-06T20:32:30.058Z" }, - { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542, upload-time = "2024-08-06T20:32:31.881Z" }, - { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164, upload-time = "2024-08-06T20:32:37.083Z" }, - { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611, upload-time = "2024-08-06T20:32:38.898Z" }, - { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591, upload-time = "2024-08-06T20:32:40.241Z" }, - { url = 
"https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338, upload-time = "2024-08-06T20:32:41.93Z" }, - { url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309, upload-time = "2024-08-06T20:32:43.4Z" }, - { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679, upload-time = "2024-08-06T20:32:44.801Z" }, - { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428, upload-time = "2024-08-06T20:32:46.432Z" }, - { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361, upload-time = "2024-08-06T20:32:51.188Z" }, - { url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523, upload-time = "2024-08-06T20:32:53.019Z" }, - { url = 
"https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660, upload-time = "2024-08-06T20:32:54.708Z" }, - { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597, upload-time = "2024-08-06T20:32:56.985Z" }, - { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527, upload-time = "2024-08-06T20:33:03.001Z" }, - { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload-time = "2024-08-06T20:33:04.33Z" }, -] - -[[package]] -name = "rdflib" -version = "7.1.4" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pyparsing" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/e8/7e/cb2d74466bd8495051ebe2d241b1cb1d4acf9740d481126aef19ef2697f5/rdflib-7.1.4.tar.gz", hash = "sha256:fed46e24f26a788e2ab8e445f7077f00edcf95abb73bcef4b86cefa8b62dd174", size = 4692745, upload-time = "2025-03-29T02:23:02.386Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/f4/31/e9b6f04288dcd3fa60cb3179260d6dad81b92aef3063d679ac7d80a827ea/rdflib-7.1.4-py3-none-any.whl", hash = "sha256:72f4adb1990fa5241abd22ddaf36d7cafa5d91d9ff2ba13f3086d339b213d997", size = 565051, upload-time = "2025-03-29T02:22:44.987Z" }, -] - -[[package]] -name = "referencing" -version 
= "0.36.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "attrs" }, - { name = "rpds-py" }, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/2f/db/98b5c277be99dd18bfd91dd04e1b759cad18d1a338188c936e92f921c7e2/referencing-0.36.2.tar.gz", hash = "sha256:df2e89862cd09deabbdba16944cc3f10feb6b3e6f18e902f7cc25609a34775aa", size = 74744, upload-time = "2025-01-25T08:48:16.138Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c1/b1/3baf80dc6d2b7bc27a95a67752d0208e410351e3feb4eb78de5f77454d8d/referencing-0.36.2-py3-none-any.whl", hash = "sha256:e8699adbbf8b5c7de96d8ffa0eb5c158b3beafce084968e2ea8bb08c6794dcd0", size = 26775, upload-time = "2025-01-25T08:48:14.241Z" }, -] - -[[package]] -name = "regex" -version = "2025.9.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/b2/5a/4c63457fbcaf19d138d72b2e9b39405954f98c0349b31c601bfcb151582c/regex-2025.9.1.tar.gz", hash = "sha256:88ac07b38d20b54d79e704e38aa3bd2c0f8027432164226bdee201a1c0c9c9ff", size = 400852, upload-time = "2025-09-01T22:10:10.479Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/06/4d/f741543c0c59f96c6625bc6c11fea1da2e378b7d293ffff6f318edc0ce14/regex-2025.9.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:e5bcf112b09bfd3646e4db6bf2e598534a17d502b0c01ea6550ba4eca780c5e6", size = 484811, upload-time = "2025-09-01T22:08:12.834Z" }, - { url = "https://files.pythonhosted.org/packages/c2/bd/27e73e92635b6fbd51afc26a414a3133243c662949cd1cda677fe7bb09bd/regex-2025.9.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:67a0295a3c31d675a9ee0238d20238ff10a9a2fdb7a1323c798fc7029578b15c", size = 288977, upload-time = "2025-09-01T22:08:14.499Z" }, - { url = 
"https://files.pythonhosted.org/packages/eb/7d/7dc0c6efc8bc93cd6e9b947581f5fde8a5dbaa0af7c4ec818c5729fdc807/regex-2025.9.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ea8267fbadc7d4bd7c1301a50e85c2ff0de293ff9452a1a9f8d82c6cafe38179", size = 286606, upload-time = "2025-09-01T22:08:15.881Z" }, - { url = "https://files.pythonhosted.org/packages/d1/01/9b5c6dd394f97c8f2c12f6e8f96879c9ac27292a718903faf2e27a0c09f6/regex-2025.9.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6aeff21de7214d15e928fb5ce757f9495214367ba62875100d4c18d293750cc1", size = 792436, upload-time = "2025-09-01T22:08:17.38Z" }, - { url = "https://files.pythonhosted.org/packages/fc/24/b7430cfc6ee34bbb3db6ff933beb5e7692e5cc81e8f6f4da63d353566fb0/regex-2025.9.1-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:d89f1bbbbbc0885e1c230f7770d5e98f4f00b0ee85688c871d10df8b184a6323", size = 858705, upload-time = "2025-09-01T22:08:19.037Z" }, - { url = "https://files.pythonhosted.org/packages/d6/98/155f914b4ea6ae012663188545c4f5216c11926d09b817127639d618b003/regex-2025.9.1-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ca3affe8ddea498ba9d294ab05f5f2d3b5ad5d515bc0d4a9016dd592a03afe52", size = 905881, upload-time = "2025-09-01T22:08:20.377Z" }, - { url = "https://files.pythonhosted.org/packages/8a/a7/a470e7bc8259c40429afb6d6a517b40c03f2f3e455c44a01abc483a1c512/regex-2025.9.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:91892a7a9f0a980e4c2c85dd19bc14de2b219a3a8867c4b5664b9f972dcc0c78", size = 798968, upload-time = "2025-09-01T22:08:22.081Z" }, - { url = "https://files.pythonhosted.org/packages/1d/fa/33f6fec4d41449fea5f62fdf5e46d668a1c046730a7f4ed9f478331a8e3a/regex-2025.9.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e1cb40406f4ae862710615f9f636c1e030fd6e6abe0e0f65f6a695a2721440c6", size = 781884, 
upload-time = "2025-09-01T22:08:23.832Z" }, - { url = "https://files.pythonhosted.org/packages/42/de/2b45f36ab20da14eedddf5009d370625bc5942d9953fa7e5037a32d66843/regex-2025.9.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:94f6cff6f7e2149c7e6499a6ecd4695379eeda8ccbccb9726e8149f2fe382e92", size = 852935, upload-time = "2025-09-01T22:08:25.536Z" }, - { url = "https://files.pythonhosted.org/packages/1e/f9/878f4fc92c87e125e27aed0f8ee0d1eced9b541f404b048f66f79914475a/regex-2025.9.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:6c0226fb322b82709e78c49cc33484206647f8a39954d7e9de1567f5399becd0", size = 844340, upload-time = "2025-09-01T22:08:27.141Z" }, - { url = "https://files.pythonhosted.org/packages/90/c2/5b6f2bce6ece5f8427c718c085eca0de4bbb4db59f54db77aa6557aef3e9/regex-2025.9.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a12f59c7c380b4fcf7516e9cbb126f95b7a9518902bcf4a852423ff1dcd03e6a", size = 787238, upload-time = "2025-09-01T22:08:28.75Z" }, - { url = "https://files.pythonhosted.org/packages/47/66/1ef1081c831c5b611f6f55f6302166cfa1bc9574017410ba5595353f846a/regex-2025.9.1-cp311-cp311-win32.whl", hash = "sha256:49865e78d147a7a4f143064488da5d549be6bfc3f2579e5044cac61f5c92edd4", size = 264118, upload-time = "2025-09-01T22:08:30.388Z" }, - { url = "https://files.pythonhosted.org/packages/ad/e0/8adc550d7169df1d6b9be8ff6019cda5291054a0107760c2f30788b6195f/regex-2025.9.1-cp311-cp311-win_amd64.whl", hash = "sha256:d34b901f6f2f02ef60f4ad3855d3a02378c65b094efc4b80388a3aeb700a5de7", size = 276151, upload-time = "2025-09-01T22:08:32.073Z" }, - { url = "https://files.pythonhosted.org/packages/cb/bd/46fef29341396d955066e55384fb93b0be7d64693842bf4a9a398db6e555/regex-2025.9.1-cp311-cp311-win_arm64.whl", hash = "sha256:47d7c2dab7e0b95b95fd580087b6ae196039d62306a592fa4e162e49004b6299", size = 268460, upload-time = "2025-09-01T22:08:33.281Z" }, - { url = 
"https://files.pythonhosted.org/packages/39/ef/a0372febc5a1d44c1be75f35d7e5aff40c659ecde864d7fa10e138f75e74/regex-2025.9.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:84a25164bd8dcfa9f11c53f561ae9766e506e580b70279d05a7946510bdd6f6a", size = 486317, upload-time = "2025-09-01T22:08:34.529Z" }, - { url = "https://files.pythonhosted.org/packages/b5/25/d64543fb7eb41a1024786d518cc57faf1ce64aa6e9ddba097675a0c2f1d2/regex-2025.9.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:645e88a73861c64c1af558dd12294fb4e67b5c1eae0096a60d7d8a2143a611c7", size = 289698, upload-time = "2025-09-01T22:08:36.162Z" }, - { url = "https://files.pythonhosted.org/packages/d8/dc/fbf31fc60be317bd9f6f87daa40a8a9669b3b392aa8fe4313df0a39d0722/regex-2025.9.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:10a450cba5cd5409526ee1d4449f42aad38dd83ac6948cbd6d7f71ca7018f7db", size = 287242, upload-time = "2025-09-01T22:08:37.794Z" }, - { url = "https://files.pythonhosted.org/packages/0f/74/f933a607a538f785da5021acf5323961b4620972e2c2f1f39b6af4b71db7/regex-2025.9.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e9dc5991592933a4192c166eeb67b29d9234f9c86344481173d1bc52f73a7104", size = 797441, upload-time = "2025-09-01T22:08:39.108Z" }, - { url = "https://files.pythonhosted.org/packages/89/d0/71fc49b4f20e31e97f199348b8c4d6e613e7b6a54a90eb1b090c2b8496d7/regex-2025.9.1-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a32291add816961aab472f4fad344c92871a2ee33c6c219b6598e98c1f0108f2", size = 862654, upload-time = "2025-09-01T22:08:40.586Z" }, - { url = "https://files.pythonhosted.org/packages/59/05/984edce1411a5685ba9abbe10d42cdd9450aab4a022271f9585539788150/regex-2025.9.1-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:588c161a68a383478e27442a678e3b197b13c5ba51dbba40c1ccb8c4c7bee9e9", size = 910862, upload-time = "2025-09-01T22:08:42.416Z" 
}, - { url = "https://files.pythonhosted.org/packages/b2/02/5c891bb5fe0691cc1bad336e3a94b9097fbcf9707ec8ddc1dce9f0397289/regex-2025.9.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:47829ffaf652f30d579534da9085fe30c171fa2a6744a93d52ef7195dc38218b", size = 801991, upload-time = "2025-09-01T22:08:44.072Z" }, - { url = "https://files.pythonhosted.org/packages/f1/ae/fd10d6ad179910f7a1b3e0a7fde1ef8bb65e738e8ac4fd6ecff3f52252e4/regex-2025.9.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1e978e5a35b293ea43f140c92a3269b6ab13fe0a2bf8a881f7ac740f5a6ade85", size = 786651, upload-time = "2025-09-01T22:08:46.079Z" }, - { url = "https://files.pythonhosted.org/packages/30/cf/9d686b07bbc5bf94c879cc168db92542d6bc9fb67088d03479fef09ba9d3/regex-2025.9.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4cf09903e72411f4bf3ac1eddd624ecfd423f14b2e4bf1c8b547b72f248b7bf7", size = 856556, upload-time = "2025-09-01T22:08:48.376Z" }, - { url = "https://files.pythonhosted.org/packages/91/9d/302f8a29bb8a49528abbab2d357a793e2a59b645c54deae0050f8474785b/regex-2025.9.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:d016b0f77be63e49613c9e26aaf4a242f196cd3d7a4f15898f5f0ab55c9b24d2", size = 849001, upload-time = "2025-09-01T22:08:50.067Z" }, - { url = "https://files.pythonhosted.org/packages/93/fa/b4c6dbdedc85ef4caec54c817cd5f4418dbfa2453214119f2538082bf666/regex-2025.9.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:656563e620de6908cd1c9d4f7b9e0777e3341ca7db9d4383bcaa44709c90281e", size = 788138, upload-time = "2025-09-01T22:08:51.933Z" }, - { url = "https://files.pythonhosted.org/packages/4a/1b/91ee17a3cbf87f81e8c110399279d0e57f33405468f6e70809100f2ff7d8/regex-2025.9.1-cp312-cp312-win32.whl", hash = "sha256:df33f4ef07b68f7ab637b1dbd70accbf42ef0021c201660656601e8a9835de45", size = 264524, upload-time = "2025-09-01T22:08:53.75Z" }, - { url = 
"https://files.pythonhosted.org/packages/92/28/6ba31cce05b0f1ec6b787921903f83bd0acf8efde55219435572af83c350/regex-2025.9.1-cp312-cp312-win_amd64.whl", hash = "sha256:5aba22dfbc60cda7c0853516104724dc904caa2db55f2c3e6e984eb858d3edf3", size = 275489, upload-time = "2025-09-01T22:08:55.037Z" }, - { url = "https://files.pythonhosted.org/packages/bd/ed/ea49f324db00196e9ef7fe00dd13c6164d5173dd0f1bbe495e61bb1fb09d/regex-2025.9.1-cp312-cp312-win_arm64.whl", hash = "sha256:ec1efb4c25e1849c2685fa95da44bfde1b28c62d356f9c8d861d4dad89ed56e9", size = 268589, upload-time = "2025-09-01T22:08:56.369Z" }, - { url = "https://files.pythonhosted.org/packages/98/25/b2959ce90c6138c5142fe5264ee1f9b71a0c502ca4c7959302a749407c79/regex-2025.9.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:bc6834727d1b98d710a63e6c823edf6ffbf5792eba35d3fa119531349d4142ef", size = 485932, upload-time = "2025-09-01T22:08:57.913Z" }, - { url = "https://files.pythonhosted.org/packages/49/2e/6507a2a85f3f2be6643438b7bd976e67ad73223692d6988eb1ff444106d3/regex-2025.9.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c3dc05b6d579875719bccc5f3037b4dc80433d64e94681a0061845bd8863c025", size = 289568, upload-time = "2025-09-01T22:08:59.258Z" }, - { url = "https://files.pythonhosted.org/packages/c7/d8/de4a4b57215d99868f1640e062a7907e185ec7476b4b689e2345487c1ff4/regex-2025.9.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:22213527df4c985ec4a729b055a8306272d41d2f45908d7bacb79be0fa7a75ad", size = 286984, upload-time = "2025-09-01T22:09:00.835Z" }, - { url = "https://files.pythonhosted.org/packages/03/15/e8cb403403a57ed316e80661db0e54d7aa2efcd85cb6156f33cc18746922/regex-2025.9.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8e3f6e3c5a5a1adc3f7ea1b5aec89abfc2f4fbfba55dafb4343cd1d084f715b2", size = 797514, upload-time = "2025-09-01T22:09:02.538Z" }, - { url = 
"https://files.pythonhosted.org/packages/e4/26/2446f2b9585fed61faaa7e2bbce3aca7dd8df6554c32addee4c4caecf24a/regex-2025.9.1-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:bcb89c02a0d6c2bec9b0bb2d8c78782699afe8434493bfa6b4021cc51503f249", size = 862586, upload-time = "2025-09-01T22:09:04.322Z" }, - { url = "https://files.pythonhosted.org/packages/fd/b8/82ffbe9c0992c31bbe6ae1c4b4e21269a5df2559102b90543c9b56724c3c/regex-2025.9.1-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b0e2f95413eb0c651cd1516a670036315b91b71767af83bc8525350d4375ccba", size = 910815, upload-time = "2025-09-01T22:09:05.978Z" }, - { url = "https://files.pythonhosted.org/packages/2f/d8/7303ea38911759c1ee30cc5bc623ee85d3196b733c51fd6703c34290a8d9/regex-2025.9.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:09a41dc039e1c97d3c2ed3e26523f748e58c4de3ea7a31f95e1cf9ff973fff5a", size = 802042, upload-time = "2025-09-01T22:09:07.865Z" }, - { url = "https://files.pythonhosted.org/packages/fc/0e/6ad51a55ed4b5af512bb3299a05d33309bda1c1d1e1808fa869a0bed31bc/regex-2025.9.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4f0b4258b161094f66857a26ee938d3fe7b8a5063861e44571215c44fbf0e5df", size = 786764, upload-time = "2025-09-01T22:09:09.362Z" }, - { url = "https://files.pythonhosted.org/packages/8d/d5/394e3ffae6baa5a9217bbd14d96e0e5da47bb069d0dbb8278e2681a2b938/regex-2025.9.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:bf70e18ac390e6977ea7e56f921768002cb0fa359c4199606c7219854ae332e0", size = 856557, upload-time = "2025-09-01T22:09:11.129Z" }, - { url = "https://files.pythonhosted.org/packages/cd/80/b288d3910c41194ad081b9fb4b371b76b0bbfdce93e7709fc98df27b37dc/regex-2025.9.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:b84036511e1d2bb0a4ff1aec26951caa2dea8772b223c9e8a19ed8885b32dbac", size = 849108, upload-time = "2025-09-01T22:09:12.877Z" 
}, - { url = "https://files.pythonhosted.org/packages/d1/cd/5ec76bf626d0d5abdc277b7a1734696f5f3d14fbb4a3e2540665bc305d85/regex-2025.9.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:c2e05dcdfe224047f2a59e70408274c325d019aad96227ab959403ba7d58d2d7", size = 788201, upload-time = "2025-09-01T22:09:14.561Z" }, - { url = "https://files.pythonhosted.org/packages/b5/36/674672f3fdead107565a2499f3007788b878188acec6d42bc141c5366c2c/regex-2025.9.1-cp313-cp313-win32.whl", hash = "sha256:3b9a62107a7441b81ca98261808fed30ae36ba06c8b7ee435308806bd53c1ed8", size = 264508, upload-time = "2025-09-01T22:09:16.193Z" }, - { url = "https://files.pythonhosted.org/packages/83/ad/931134539515eb64ce36c24457a98b83c1b2e2d45adf3254b94df3735a76/regex-2025.9.1-cp313-cp313-win_amd64.whl", hash = "sha256:b38afecc10c177eb34cfae68d669d5161880849ba70c05cbfbe409f08cc939d7", size = 275469, upload-time = "2025-09-01T22:09:17.462Z" }, - { url = "https://files.pythonhosted.org/packages/24/8c/96d34e61c0e4e9248836bf86d69cb224fd222f270fa9045b24e218b65604/regex-2025.9.1-cp313-cp313-win_arm64.whl", hash = "sha256:ec329890ad5e7ed9fc292858554d28d58d56bf62cf964faf0aa57964b21155a0", size = 268586, upload-time = "2025-09-01T22:09:18.948Z" }, - { url = "https://files.pythonhosted.org/packages/21/b1/453cbea5323b049181ec6344a803777914074b9726c9c5dc76749966d12d/regex-2025.9.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:72fb7a016467d364546f22b5ae86c45680a4e0de6b2a6f67441d22172ff641f1", size = 486111, upload-time = "2025-09-01T22:09:20.734Z" }, - { url = "https://files.pythonhosted.org/packages/f6/0e/92577f197bd2f7652c5e2857f399936c1876978474ecc5b068c6d8a79c86/regex-2025.9.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:c9527fa74eba53f98ad86be2ba003b3ebe97e94b6eb2b916b31b5f055622ef03", size = 289520, upload-time = "2025-09-01T22:09:22.249Z" }, - { url = 
"https://files.pythonhosted.org/packages/af/c6/b472398116cca7ea5a6c4d5ccd0fc543f7fd2492cb0c48d2852a11972f73/regex-2025.9.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c905d925d194c83a63f92422af7544ec188301451b292c8b487f0543726107ca", size = 287215, upload-time = "2025-09-01T22:09:23.657Z" }, - { url = "https://files.pythonhosted.org/packages/cf/11/f12ecb0cf9ca792a32bb92f758589a84149017467a544f2f6bfb45c0356d/regex-2025.9.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:74df7c74a63adcad314426b1f4ea6054a5ab25d05b0244f0c07ff9ce640fa597", size = 797855, upload-time = "2025-09-01T22:09:25.197Z" }, - { url = "https://files.pythonhosted.org/packages/46/88/bbb848f719a540fb5997e71310f16f0b33a92c5d4b4d72d4311487fff2a3/regex-2025.9.1-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4f6e935e98ea48c7a2e8be44494de337b57a204470e7f9c9c42f912c414cd6f5", size = 863363, upload-time = "2025-09-01T22:09:26.705Z" }, - { url = "https://files.pythonhosted.org/packages/54/a9/2321eb3e2838f575a78d48e03c1e83ea61bd08b74b7ebbdeca8abc50fc25/regex-2025.9.1-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:4a62d033cd9ebefc7c5e466731a508dfabee827d80b13f455de68a50d3c2543d", size = 910202, upload-time = "2025-09-01T22:09:28.906Z" }, - { url = "https://files.pythonhosted.org/packages/33/07/d1d70835d7d11b7e126181f316f7213c4572ecf5c5c97bdbb969fb1f38a2/regex-2025.9.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ef971ebf2b93bdc88d8337238be4dfb851cc97ed6808eb04870ef67589415171", size = 801808, upload-time = "2025-09-01T22:09:30.733Z" }, - { url = "https://files.pythonhosted.org/packages/13/d1/29e4d1bed514ef2bf3a4ead3cb8bb88ca8af94130239a4e68aa765c35b1c/regex-2025.9.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:d936a1db208bdca0eca1f2bb2c1ba1d8370b226785c1e6db76e32a228ffd0ad5", size = 
786824, upload-time = "2025-09-01T22:09:32.61Z" }, - { url = "https://files.pythonhosted.org/packages/33/27/20d8ccb1bee460faaa851e6e7cc4cfe852a42b70caa1dca22721ba19f02f/regex-2025.9.1-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:7e786d9e4469698fc63815b8de08a89165a0aa851720eb99f5e0ea9d51dd2b6a", size = 857406, upload-time = "2025-09-01T22:09:34.117Z" }, - { url = "https://files.pythonhosted.org/packages/74/fe/60c6132262dc36430d51e0c46c49927d113d3a38c1aba6a26c7744c84cf3/regex-2025.9.1-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:6b81d7dbc5466ad2c57ce3a0ddb717858fe1a29535c8866f8514d785fdb9fc5b", size = 848593, upload-time = "2025-09-01T22:09:35.598Z" }, - { url = "https://files.pythonhosted.org/packages/cc/ae/2d4ff915622fabbef1af28387bf71e7f2f4944a348b8460d061e85e29bf0/regex-2025.9.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:cd4890e184a6feb0ef195338a6ce68906a8903a0f2eb7e0ab727dbc0a3156273", size = 787951, upload-time = "2025-09-01T22:09:37.139Z" }, - { url = "https://files.pythonhosted.org/packages/85/37/dc127703a9e715a284cc2f7dbdd8a9776fd813c85c126eddbcbdd1ca5fec/regex-2025.9.1-cp314-cp314-win32.whl", hash = "sha256:34679a86230e46164c9e0396b56cab13c0505972343880b9e705083cc5b8ec86", size = 269833, upload-time = "2025-09-01T22:09:39.245Z" }, - { url = "https://files.pythonhosted.org/packages/83/bf/4bed4d3d0570e16771defd5f8f15f7ea2311edcbe91077436d6908956c4a/regex-2025.9.1-cp314-cp314-win_amd64.whl", hash = "sha256:a1196e530a6bfa5f4bde029ac5b0295a6ecfaaffbfffede4bbaf4061d9455b70", size = 278742, upload-time = "2025-09-01T22:09:40.651Z" }, - { url = "https://files.pythonhosted.org/packages/cf/3e/7d7ac6fd085023312421e0d69dfabdfb28e116e513fadbe9afe710c01893/regex-2025.9.1-cp314-cp314-win_arm64.whl", hash = "sha256:f46d525934871ea772930e997d577d48c6983e50f206ff7b66d4ac5f8941e993", size = 271860, upload-time = "2025-09-01T22:09:42.413Z" }, -] - -[[package]] -name = "requests" -version = "2.32.5" -source = { registry = "https://pypi.org/simple" } 
-dependencies = [ - { name = "certifi" }, - { name = "charset-normalizer" }, - { name = "idna" }, - { name = "urllib3" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" }, -] - -[[package]] -name = "requirements-parser" -version = "0.13.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "packaging" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/95/96/fb6dbfebb524d5601d359a47c78fe7ba1eef90fc4096404aa60c9a906fbb/requirements_parser-0.13.0.tar.gz", hash = "sha256:0843119ca2cb2331de4eb31b10d70462e39ace698fd660a915c247d2301a4418", size = 22630, upload-time = "2025-05-21T13:42:05.464Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/bd/60/50fbb6ffb35f733654466f1a90d162bcbea358adc3b0871339254fbc37b2/requirements_parser-0.13.0-py3-none-any.whl", hash = "sha256:2b3173faecf19ec5501971b7222d38f04cb45bb9d87d0ad629ca71e2e62ded14", size = 14782, upload-time = "2025-05-21T13:42:04.007Z" }, -] - -[[package]] -name = "rfc3339-validator" -version = "0.1.4" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "six" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/28/ea/a9387748e2d111c3c2b275ba970b735e04e15cdb1eb30693b6b5708c4dbd/rfc3339_validator-0.1.4.tar.gz", hash = "sha256:138a2abdf93304ad60530167e51d2dfb9549521a836871b88d7f4695d0022f6b", size = 5513, upload-time = "2021-05-12T16:37:54.178Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl", hash = "sha256:24f6ec1eda14ef823da9e36ec7113124b39c04d50a4d3d3a3c2859577e7791fa", size = 3490, upload-time = "2021-05-12T16:37:52.536Z" }, -] - -[[package]] -name = "rich" -version = "14.1.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "markdown-it-py" }, - { name = "pygments" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/fe/75/af448d8e52bf1d8fa6a9d089ca6c07ff4453d86c65c145d0a300bb073b9b/rich-14.1.0.tar.gz", hash = "sha256:e497a48b844b0320d45007cdebfeaeed8db2a4f4bcf49f15e455cfc4af11eaa8", size = 224441, upload-time = "2025-07-25T07:32:58.125Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e3/30/3c4d035596d3cf444529e0b2953ad0466f6049528a879d27534700580395/rich-14.1.0-py3-none-any.whl", hash = "sha256:536f5f1785986d6dbdea3c75205c473f970777b4a0d6c6dd1b696aa05a3fa04f", size = 243368, upload-time = "2025-07-25T07:32:56.73Z" }, -] - -[[package]] -name = "rich-argparse" -version = "1.7.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "rich" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/71/a6/34460d81e5534f6d2fc8e8d91ff99a5835fdca53578eac89e4f37b3a7c6d/rich_argparse-1.7.1.tar.gz", hash = "sha256:d7a493cde94043e41ea68fb43a74405fa178de981bf7b800f7a3bd02ac5c27be", size = 38094, upload-time = "2025-05-25T20:20:35.335Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/31/f6/5fc0574af5379606ffd57a4b68ed88f9b415eb222047fe023aefcc00a648/rich_argparse-1.7.1-py3-none-any.whl", hash = "sha256:a8650b42e4a4ff72127837632fba6b7da40784842f08d7395eb67a9cbd7b4bf9", size = 25357, upload-time = "2025-05-25T20:20:33.793Z" }, -] - -[[package]] -name = "rich-rst" -version = "1.3.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "docutils" }, - { name = "rich" }, -] -sdist 
= { url = "https://files.pythonhosted.org/packages/b0/69/5514c3a87b5f10f09a34bb011bc0927bc12c596c8dae5915604e71abc386/rich_rst-1.3.1.tar.gz", hash = "sha256:fad46e3ba42785ea8c1785e2ceaa56e0ffa32dbe5410dec432f37e4107c4f383", size = 13839, upload-time = "2024-04-30T04:40:38.125Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/fd/bc/cc4e3dbc5e7992398dcb7a8eda0cbcf4fb792a0cdb93f857b478bf3cf884/rich_rst-1.3.1-py3-none-any.whl", hash = "sha256:498a74e3896507ab04492d326e794c3ef76e7cda078703aa592d1853d91098c1", size = 11621, upload-time = "2024-04-30T04:40:32.619Z" }, -] - -[[package]] -name = "rpds-py" -version = "0.27.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e9/dd/2c0cbe774744272b0ae725f44032c77bdcab6e8bcf544bffa3b6e70c8dba/rpds_py-0.27.1.tar.gz", hash = "sha256:26a1c73171d10b7acccbded82bf6a586ab8203601e565badc74bbbf8bc5a10f8", size = 27479, upload-time = "2025-08-27T12:16:36.024Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b5/c1/7907329fbef97cbd49db6f7303893bd1dd5a4a3eae415839ffdfb0762cae/rpds_py-0.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:be898f271f851f68b318872ce6ebebbc62f303b654e43bf72683dbdc25b7c881", size = 371063, upload-time = "2025-08-27T12:12:47.856Z" }, - { url = "https://files.pythonhosted.org/packages/11/94/2aab4bc86228bcf7c48760990273653a4900de89c7537ffe1b0d6097ed39/rpds_py-0.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:62ac3d4e3e07b58ee0ddecd71d6ce3b1637de2d373501412df395a0ec5f9beb5", size = 353210, upload-time = "2025-08-27T12:12:49.187Z" }, - { url = "https://files.pythonhosted.org/packages/3a/57/f5eb3ecf434342f4f1a46009530e93fd201a0b5b83379034ebdb1d7c1a58/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4708c5c0ceb2d034f9991623631d3d23cb16e65c83736ea020cdbe28d57c0a0e", size = 381636, upload-time = "2025-08-27T12:12:50.492Z" }, - { url = 
"https://files.pythonhosted.org/packages/ae/f4/ef95c5945e2ceb5119571b184dd5a1cc4b8541bbdf67461998cfeac9cb1e/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:abfa1171a9952d2e0002aba2ad3780820b00cc3d9c98c6630f2e93271501f66c", size = 394341, upload-time = "2025-08-27T12:12:52.024Z" }, - { url = "https://files.pythonhosted.org/packages/5a/7e/4bd610754bf492d398b61725eb9598ddd5eb86b07d7d9483dbcd810e20bc/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4b507d19f817ebaca79574b16eb2ae412e5c0835542c93fe9983f1e432aca195", size = 523428, upload-time = "2025-08-27T12:12:53.779Z" }, - { url = "https://files.pythonhosted.org/packages/9f/e5/059b9f65a8c9149361a8b75094864ab83b94718344db511fd6117936ed2a/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:168b025f8fd8d8d10957405f3fdcef3dc20f5982d398f90851f4abc58c566c52", size = 402923, upload-time = "2025-08-27T12:12:55.15Z" }, - { url = "https://files.pythonhosted.org/packages/f5/48/64cabb7daced2968dd08e8a1b7988bf358d7bd5bcd5dc89a652f4668543c/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cb56c6210ef77caa58e16e8c17d35c63fe3f5b60fd9ba9d424470c3400bcf9ed", size = 384094, upload-time = "2025-08-27T12:12:57.194Z" }, - { url = "https://files.pythonhosted.org/packages/ae/e1/dc9094d6ff566bff87add8a510c89b9e158ad2ecd97ee26e677da29a9e1b/rpds_py-0.27.1-cp311-cp311-manylinux_2_31_riscv64.whl", hash = "sha256:d252f2d8ca0195faa707f8eb9368955760880b2b42a8ee16d382bf5dd807f89a", size = 401093, upload-time = "2025-08-27T12:12:58.985Z" }, - { url = "https://files.pythonhosted.org/packages/37/8e/ac8577e3ecdd5593e283d46907d7011618994e1d7ab992711ae0f78b9937/rpds_py-0.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6e5e54da1e74b91dbc7996b56640f79b195d5925c2b78efaa8c5d53e1d88edde", size = 417969, upload-time = "2025-08-27T12:13:00.367Z" }, - { url = 
"https://files.pythonhosted.org/packages/66/6d/87507430a8f74a93556fe55c6485ba9c259949a853ce407b1e23fea5ba31/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ffce0481cc6e95e5b3f0a47ee17ffbd234399e6d532f394c8dce320c3b089c21", size = 558302, upload-time = "2025-08-27T12:13:01.737Z" }, - { url = "https://files.pythonhosted.org/packages/3a/bb/1db4781ce1dda3eecc735e3152659a27b90a02ca62bfeea17aee45cc0fbc/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:a205fdfe55c90c2cd8e540ca9ceba65cbe6629b443bc05db1f590a3db8189ff9", size = 589259, upload-time = "2025-08-27T12:13:03.127Z" }, - { url = "https://files.pythonhosted.org/packages/7b/0e/ae1c8943d11a814d01b482e1f8da903f88047a962dff9bbdadf3bd6e6fd1/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:689fb5200a749db0415b092972e8eba85847c23885c8543a8b0f5c009b1a5948", size = 554983, upload-time = "2025-08-27T12:13:04.516Z" }, - { url = "https://files.pythonhosted.org/packages/b2/d5/0b2a55415931db4f112bdab072443ff76131b5ac4f4dc98d10d2d357eb03/rpds_py-0.27.1-cp311-cp311-win32.whl", hash = "sha256:3182af66048c00a075010bc7f4860f33913528a4b6fc09094a6e7598e462fe39", size = 217154, upload-time = "2025-08-27T12:13:06.278Z" }, - { url = "https://files.pythonhosted.org/packages/24/75/3b7ffe0d50dc86a6a964af0d1cc3a4a2cdf437cb7b099a4747bbb96d1819/rpds_py-0.27.1-cp311-cp311-win_amd64.whl", hash = "sha256:b4938466c6b257b2f5c4ff98acd8128ec36b5059e5c8f8372d79316b1c36bb15", size = 228627, upload-time = "2025-08-27T12:13:07.625Z" }, - { url = "https://files.pythonhosted.org/packages/8d/3f/4fd04c32abc02c710f09a72a30c9a55ea3cc154ef8099078fd50a0596f8e/rpds_py-0.27.1-cp311-cp311-win_arm64.whl", hash = "sha256:2f57af9b4d0793e53266ee4325535a31ba48e2f875da81a9177c9926dfa60746", size = 220998, upload-time = "2025-08-27T12:13:08.972Z" }, - { url = 
"https://files.pythonhosted.org/packages/bd/fe/38de28dee5df58b8198c743fe2bea0c785c6d40941b9950bac4cdb71a014/rpds_py-0.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:ae2775c1973e3c30316892737b91f9283f9908e3cc7625b9331271eaaed7dc90", size = 361887, upload-time = "2025-08-27T12:13:10.233Z" }, - { url = "https://files.pythonhosted.org/packages/7c/9a/4b6c7eedc7dd90986bf0fab6ea2a091ec11c01b15f8ba0a14d3f80450468/rpds_py-0.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2643400120f55c8a96f7c9d858f7be0c88d383cd4653ae2cf0d0c88f668073e5", size = 345795, upload-time = "2025-08-27T12:13:11.65Z" }, - { url = "https://files.pythonhosted.org/packages/6f/0e/e650e1b81922847a09cca820237b0edee69416a01268b7754d506ade11ad/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16323f674c089b0360674a4abd28d5042947d54ba620f72514d69be4ff64845e", size = 385121, upload-time = "2025-08-27T12:13:13.008Z" }, - { url = "https://files.pythonhosted.org/packages/1b/ea/b306067a712988e2bff00dcc7c8f31d26c29b6d5931b461aa4b60a013e33/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9a1f4814b65eacac94a00fc9a526e3fdafd78e439469644032032d0d63de4881", size = 398976, upload-time = "2025-08-27T12:13:14.368Z" }, - { url = "https://files.pythonhosted.org/packages/2c/0a/26dc43c8840cb8fe239fe12dbc8d8de40f2365e838f3d395835dde72f0e5/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ba32c16b064267b22f1850a34051121d423b6f7338a12b9459550eb2096e7ec", size = 525953, upload-time = "2025-08-27T12:13:15.774Z" }, - { url = "https://files.pythonhosted.org/packages/22/14/c85e8127b573aaf3a0cbd7fbb8c9c99e735a4a02180c84da2a463b766e9e/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e5c20f33fd10485b80f65e800bbe5f6785af510b9f4056c5a3c612ebc83ba6cb", size = 407915, upload-time = "2025-08-27T12:13:17.379Z" }, - { url = 
"https://files.pythonhosted.org/packages/ed/7b/8f4fee9ba1fb5ec856eb22d725a4efa3deb47f769597c809e03578b0f9d9/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:466bfe65bd932da36ff279ddd92de56b042f2266d752719beb97b08526268ec5", size = 386883, upload-time = "2025-08-27T12:13:18.704Z" }, - { url = "https://files.pythonhosted.org/packages/86/47/28fa6d60f8b74fcdceba81b272f8d9836ac0340570f68f5df6b41838547b/rpds_py-0.27.1-cp312-cp312-manylinux_2_31_riscv64.whl", hash = "sha256:41e532bbdcb57c92ba3be62c42e9f096431b4cf478da9bc3bc6ce5c38ab7ba7a", size = 405699, upload-time = "2025-08-27T12:13:20.089Z" }, - { url = "https://files.pythonhosted.org/packages/d0/fd/c5987b5e054548df56953a21fe2ebed51fc1ec7c8f24fd41c067b68c4a0a/rpds_py-0.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f149826d742b406579466283769a8ea448eed82a789af0ed17b0cd5770433444", size = 423713, upload-time = "2025-08-27T12:13:21.436Z" }, - { url = "https://files.pythonhosted.org/packages/ac/ba/3c4978b54a73ed19a7d74531be37a8bcc542d917c770e14d372b8daea186/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:80c60cfb5310677bd67cb1e85a1e8eb52e12529545441b43e6f14d90b878775a", size = 562324, upload-time = "2025-08-27T12:13:22.789Z" }, - { url = "https://files.pythonhosted.org/packages/b5/6c/6943a91768fec16db09a42b08644b960cff540c66aab89b74be6d4a144ba/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:7ee6521b9baf06085f62ba9c7a3e5becffbc32480d2f1b351559c001c38ce4c1", size = 593646, upload-time = "2025-08-27T12:13:24.122Z" }, - { url = "https://files.pythonhosted.org/packages/11/73/9d7a8f4be5f4396f011a6bb7a19fe26303a0dac9064462f5651ced2f572f/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a512c8263249a9d68cac08b05dd59d2b3f2061d99b322813cbcc14c3c7421998", size = 558137, upload-time = "2025-08-27T12:13:25.557Z" }, - { url = 
"https://files.pythonhosted.org/packages/6e/96/6772cbfa0e2485bcceef8071de7821f81aeac8bb45fbfd5542a3e8108165/rpds_py-0.27.1-cp312-cp312-win32.whl", hash = "sha256:819064fa048ba01b6dadc5116f3ac48610435ac9a0058bbde98e569f9e785c39", size = 221343, upload-time = "2025-08-27T12:13:26.967Z" }, - { url = "https://files.pythonhosted.org/packages/67/b6/c82f0faa9af1c6a64669f73a17ee0eeef25aff30bb9a1c318509efe45d84/rpds_py-0.27.1-cp312-cp312-win_amd64.whl", hash = "sha256:d9199717881f13c32c4046a15f024971a3b78ad4ea029e8da6b86e5aa9cf4594", size = 232497, upload-time = "2025-08-27T12:13:28.326Z" }, - { url = "https://files.pythonhosted.org/packages/e1/96/2817b44bd2ed11aebacc9251da03689d56109b9aba5e311297b6902136e2/rpds_py-0.27.1-cp312-cp312-win_arm64.whl", hash = "sha256:33aa65b97826a0e885ef6e278fbd934e98cdcfed80b63946025f01e2f5b29502", size = 222790, upload-time = "2025-08-27T12:13:29.71Z" }, - { url = "https://files.pythonhosted.org/packages/cc/77/610aeee8d41e39080c7e14afa5387138e3c9fa9756ab893d09d99e7d8e98/rpds_py-0.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:e4b9fcfbc021633863a37e92571d6f91851fa656f0180246e84cbd8b3f6b329b", size = 361741, upload-time = "2025-08-27T12:13:31.039Z" }, - { url = "https://files.pythonhosted.org/packages/3a/fc/c43765f201c6a1c60be2043cbdb664013def52460a4c7adace89d6682bf4/rpds_py-0.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1441811a96eadca93c517d08df75de45e5ffe68aa3089924f963c782c4b898cf", size = 345574, upload-time = "2025-08-27T12:13:32.902Z" }, - { url = "https://files.pythonhosted.org/packages/20/42/ee2b2ca114294cd9847d0ef9c26d2b0851b2e7e00bf14cc4c0b581df0fc3/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:55266dafa22e672f5a4f65019015f90336ed31c6383bd53f5e7826d21a0e0b83", size = 385051, upload-time = "2025-08-27T12:13:34.228Z" }, - { url = 
"https://files.pythonhosted.org/packages/fd/e8/1e430fe311e4799e02e2d1af7c765f024e95e17d651612425b226705f910/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d78827d7ac08627ea2c8e02c9e5b41180ea5ea1f747e9db0915e3adf36b62dcf", size = 398395, upload-time = "2025-08-27T12:13:36.132Z" }, - { url = "https://files.pythonhosted.org/packages/82/95/9dc227d441ff2670651c27a739acb2535ccaf8b351a88d78c088965e5996/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae92443798a40a92dc5f0b01d8a7c93adde0c4dc965310a29ae7c64d72b9fad2", size = 524334, upload-time = "2025-08-27T12:13:37.562Z" }, - { url = "https://files.pythonhosted.org/packages/87/01/a670c232f401d9ad461d9a332aa4080cd3cb1d1df18213dbd0d2a6a7ab51/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c46c9dd2403b66a2a3b9720ec4b74d4ab49d4fabf9f03dfdce2d42af913fe8d0", size = 407691, upload-time = "2025-08-27T12:13:38.94Z" }, - { url = "https://files.pythonhosted.org/packages/03/36/0a14aebbaa26fe7fab4780c76f2239e76cc95a0090bdb25e31d95c492fcd/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2efe4eb1d01b7f5f1939f4ef30ecea6c6b3521eec451fb93191bf84b2a522418", size = 386868, upload-time = "2025-08-27T12:13:40.192Z" }, - { url = "https://files.pythonhosted.org/packages/3b/03/8c897fb8b5347ff6c1cc31239b9611c5bf79d78c984430887a353e1409a1/rpds_py-0.27.1-cp313-cp313-manylinux_2_31_riscv64.whl", hash = "sha256:15d3b4d83582d10c601f481eca29c3f138d44c92187d197aff663a269197c02d", size = 405469, upload-time = "2025-08-27T12:13:41.496Z" }, - { url = "https://files.pythonhosted.org/packages/da/07/88c60edc2df74850d496d78a1fdcdc7b54360a7f610a4d50008309d41b94/rpds_py-0.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4ed2e16abbc982a169d30d1a420274a709949e2cbdef119fe2ec9d870b42f274", size = 422125, upload-time = "2025-08-27T12:13:42.802Z" }, - { url = 
"https://files.pythonhosted.org/packages/6b/86/5f4c707603e41b05f191a749984f390dabcbc467cf833769b47bf14ba04f/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a75f305c9b013289121ec0f1181931975df78738cdf650093e6b86d74aa7d8dd", size = 562341, upload-time = "2025-08-27T12:13:44.472Z" }, - { url = "https://files.pythonhosted.org/packages/b2/92/3c0cb2492094e3cd9baf9e49bbb7befeceb584ea0c1a8b5939dca4da12e5/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:67ce7620704745881a3d4b0ada80ab4d99df390838839921f99e63c474f82cf2", size = 592511, upload-time = "2025-08-27T12:13:45.898Z" }, - { url = "https://files.pythonhosted.org/packages/10/bb/82e64fbb0047c46a168faa28d0d45a7851cd0582f850b966811d30f67ad8/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9d992ac10eb86d9b6f369647b6a3f412fc0075cfd5d799530e84d335e440a002", size = 557736, upload-time = "2025-08-27T12:13:47.408Z" }, - { url = "https://files.pythonhosted.org/packages/00/95/3c863973d409210da7fb41958172c6b7dbe7fc34e04d3cc1f10bb85e979f/rpds_py-0.27.1-cp313-cp313-win32.whl", hash = "sha256:4f75e4bd8ab8db624e02c8e2fc4063021b58becdbe6df793a8111d9343aec1e3", size = 221462, upload-time = "2025-08-27T12:13:48.742Z" }, - { url = "https://files.pythonhosted.org/packages/ce/2c/5867b14a81dc217b56d95a9f2a40fdbc56a1ab0181b80132beeecbd4b2d6/rpds_py-0.27.1-cp313-cp313-win_amd64.whl", hash = "sha256:f9025faafc62ed0b75a53e541895ca272815bec18abe2249ff6501c8f2e12b83", size = 232034, upload-time = "2025-08-27T12:13:50.11Z" }, - { url = "https://files.pythonhosted.org/packages/c7/78/3958f3f018c01923823f1e47f1cc338e398814b92d83cd278364446fac66/rpds_py-0.27.1-cp313-cp313-win_arm64.whl", hash = "sha256:ed10dc32829e7d222b7d3b93136d25a406ba9788f6a7ebf6809092da1f4d279d", size = 222392, upload-time = "2025-08-27T12:13:52.587Z" }, - { url = 
"https://files.pythonhosted.org/packages/01/76/1cdf1f91aed5c3a7bf2eba1f1c4e4d6f57832d73003919a20118870ea659/rpds_py-0.27.1-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:92022bbbad0d4426e616815b16bc4127f83c9a74940e1ccf3cfe0b387aba0228", size = 358355, upload-time = "2025-08-27T12:13:54.012Z" }, - { url = "https://files.pythonhosted.org/packages/c3/6f/bf142541229374287604caf3bb2a4ae17f0a580798fd72d3b009b532db4e/rpds_py-0.27.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:47162fdab9407ec3f160805ac3e154df042e577dd53341745fc7fb3f625e6d92", size = 342138, upload-time = "2025-08-27T12:13:55.791Z" }, - { url = "https://files.pythonhosted.org/packages/1a/77/355b1c041d6be40886c44ff5e798b4e2769e497b790f0f7fd1e78d17e9a8/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb89bec23fddc489e5d78b550a7b773557c9ab58b7946154a10a6f7a214a48b2", size = 380247, upload-time = "2025-08-27T12:13:57.683Z" }, - { url = "https://files.pythonhosted.org/packages/d6/a4/d9cef5c3946ea271ce2243c51481971cd6e34f21925af2783dd17b26e815/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e48af21883ded2b3e9eb48cb7880ad8598b31ab752ff3be6457001d78f416723", size = 390699, upload-time = "2025-08-27T12:13:59.137Z" }, - { url = "https://files.pythonhosted.org/packages/3a/06/005106a7b8c6c1a7e91b73169e49870f4af5256119d34a361ae5240a0c1d/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6f5b7bd8e219ed50299e58551a410b64daafb5017d54bbe822e003856f06a802", size = 521852, upload-time = "2025-08-27T12:14:00.583Z" }, - { url = "https://files.pythonhosted.org/packages/e5/3e/50fb1dac0948e17a02eb05c24510a8fe12d5ce8561c6b7b7d1339ab7ab9c/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:08f1e20bccf73b08d12d804d6e1c22ca5530e71659e6673bce31a6bb71c1e73f", size = 402582, upload-time = "2025-08-27T12:14:02.034Z" }, - { url = 
"https://files.pythonhosted.org/packages/cb/b0/f4e224090dc5b0ec15f31a02d746ab24101dd430847c4d99123798661bfc/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dc5dceeaefcc96dc192e3a80bbe1d6c410c469e97bdd47494a7d930987f18b2", size = 384126, upload-time = "2025-08-27T12:14:03.437Z" }, - { url = "https://files.pythonhosted.org/packages/54/77/ac339d5f82b6afff1df8f0fe0d2145cc827992cb5f8eeb90fc9f31ef7a63/rpds_py-0.27.1-cp313-cp313t-manylinux_2_31_riscv64.whl", hash = "sha256:d76f9cc8665acdc0c9177043746775aa7babbf479b5520b78ae4002d889f5c21", size = 399486, upload-time = "2025-08-27T12:14:05.443Z" }, - { url = "https://files.pythonhosted.org/packages/d6/29/3e1c255eee6ac358c056a57d6d6869baa00a62fa32eea5ee0632039c50a3/rpds_py-0.27.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:134fae0e36022edad8290a6661edf40c023562964efea0cc0ec7f5d392d2aaef", size = 414832, upload-time = "2025-08-27T12:14:06.902Z" }, - { url = "https://files.pythonhosted.org/packages/3f/db/6d498b844342deb3fa1d030598db93937a9964fcf5cb4da4feb5f17be34b/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:eb11a4f1b2b63337cfd3b4d110af778a59aae51c81d195768e353d8b52f88081", size = 557249, upload-time = "2025-08-27T12:14:08.37Z" }, - { url = "https://files.pythonhosted.org/packages/60/f3/690dd38e2310b6f68858a331399b4d6dbb9132c3e8ef8b4333b96caf403d/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:13e608ac9f50a0ed4faec0e90ece76ae33b34c0e8656e3dceb9a7db994c692cd", size = 587356, upload-time = "2025-08-27T12:14:10.034Z" }, - { url = "https://files.pythonhosted.org/packages/86/e3/84507781cccd0145f35b1dc32c72675200c5ce8d5b30f813e49424ef68fc/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dd2135527aa40f061350c3f8f89da2644de26cd73e4de458e79606384f4f68e7", size = 555300, upload-time = "2025-08-27T12:14:11.783Z" }, - { url = 
"https://files.pythonhosted.org/packages/e5/ee/375469849e6b429b3516206b4580a79e9ef3eb12920ddbd4492b56eaacbe/rpds_py-0.27.1-cp313-cp313t-win32.whl", hash = "sha256:3020724ade63fe320a972e2ffd93b5623227e684315adce194941167fee02688", size = 216714, upload-time = "2025-08-27T12:14:13.629Z" }, - { url = "https://files.pythonhosted.org/packages/21/87/3fc94e47c9bd0742660e84706c311a860dcae4374cf4a03c477e23ce605a/rpds_py-0.27.1-cp313-cp313t-win_amd64.whl", hash = "sha256:8ee50c3e41739886606388ba3ab3ee2aae9f35fb23f833091833255a31740797", size = 228943, upload-time = "2025-08-27T12:14:14.937Z" }, - { url = "https://files.pythonhosted.org/packages/70/36/b6e6066520a07cf029d385de869729a895917b411e777ab1cde878100a1d/rpds_py-0.27.1-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:acb9aafccaae278f449d9c713b64a9e68662e7799dbd5859e2c6b3c67b56d334", size = 362472, upload-time = "2025-08-27T12:14:16.333Z" }, - { url = "https://files.pythonhosted.org/packages/af/07/b4646032e0dcec0df9c73a3bd52f63bc6c5f9cda992f06bd0e73fe3fbebd/rpds_py-0.27.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:b7fb801aa7f845ddf601c49630deeeccde7ce10065561d92729bfe81bd21fb33", size = 345676, upload-time = "2025-08-27T12:14:17.764Z" }, - { url = "https://files.pythonhosted.org/packages/b0/16/2f1003ee5d0af4bcb13c0cf894957984c32a6751ed7206db2aee7379a55e/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fe0dd05afb46597b9a2e11c351e5e4283c741237e7f617ffb3252780cca9336a", size = 385313, upload-time = "2025-08-27T12:14:19.829Z" }, - { url = "https://files.pythonhosted.org/packages/05/cd/7eb6dd7b232e7f2654d03fa07f1414d7dfc980e82ba71e40a7c46fd95484/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b6dfb0e058adb12d8b1d1b25f686e94ffa65d9995a5157afe99743bf7369d62b", size = 399080, upload-time = "2025-08-27T12:14:21.531Z" }, - { url = 
"https://files.pythonhosted.org/packages/20/51/5829afd5000ec1cb60f304711f02572d619040aa3ec033d8226817d1e571/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ed090ccd235f6fa8bb5861684567f0a83e04f52dfc2e5c05f2e4b1309fcf85e7", size = 523868, upload-time = "2025-08-27T12:14:23.485Z" }, - { url = "https://files.pythonhosted.org/packages/05/2c/30eebca20d5db95720ab4d2faec1b5e4c1025c473f703738c371241476a2/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bf876e79763eecf3e7356f157540d6a093cef395b65514f17a356f62af6cc136", size = 408750, upload-time = "2025-08-27T12:14:24.924Z" }, - { url = "https://files.pythonhosted.org/packages/90/1a/cdb5083f043597c4d4276eae4e4c70c55ab5accec078da8611f24575a367/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:12ed005216a51b1d6e2b02a7bd31885fe317e45897de81d86dcce7d74618ffff", size = 387688, upload-time = "2025-08-27T12:14:27.537Z" }, - { url = "https://files.pythonhosted.org/packages/7c/92/cf786a15320e173f945d205ab31585cc43969743bb1a48b6888f7a2b0a2d/rpds_py-0.27.1-cp314-cp314-manylinux_2_31_riscv64.whl", hash = "sha256:ee4308f409a40e50593c7e3bb8cbe0b4d4c66d1674a316324f0c2f5383b486f9", size = 407225, upload-time = "2025-08-27T12:14:28.981Z" }, - { url = "https://files.pythonhosted.org/packages/33/5c/85ee16df5b65063ef26017bef33096557a4c83fbe56218ac7cd8c235f16d/rpds_py-0.27.1-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0b08d152555acf1f455154d498ca855618c1378ec810646fcd7c76416ac6dc60", size = 423361, upload-time = "2025-08-27T12:14:30.469Z" }, - { url = "https://files.pythonhosted.org/packages/4b/8e/1c2741307fcabd1a334ecf008e92c4f47bb6f848712cf15c923becfe82bb/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:dce51c828941973a5684d458214d3a36fcd28da3e1875d659388f4f9f12cc33e", size = 562493, upload-time = "2025-08-27T12:14:31.987Z" }, - { url = 
"https://files.pythonhosted.org/packages/04/03/5159321baae9b2222442a70c1f988cbbd66b9be0675dd3936461269be360/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:c1476d6f29eb81aa4151c9a31219b03f1f798dc43d8af1250a870735516a1212", size = 592623, upload-time = "2025-08-27T12:14:33.543Z" }, - { url = "https://files.pythonhosted.org/packages/ff/39/c09fd1ad28b85bc1d4554a8710233c9f4cefd03d7717a1b8fbfd171d1167/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:3ce0cac322b0d69b63c9cdb895ee1b65805ec9ffad37639f291dd79467bee675", size = 558800, upload-time = "2025-08-27T12:14:35.436Z" }, - { url = "https://files.pythonhosted.org/packages/c5/d6/99228e6bbcf4baa764b18258f519a9035131d91b538d4e0e294313462a98/rpds_py-0.27.1-cp314-cp314-win32.whl", hash = "sha256:dfbfac137d2a3d0725758cd141f878bf4329ba25e34979797c89474a89a8a3a3", size = 221943, upload-time = "2025-08-27T12:14:36.898Z" }, - { url = "https://files.pythonhosted.org/packages/be/07/c802bc6b8e95be83b79bdf23d1aa61d68324cb1006e245d6c58e959e314d/rpds_py-0.27.1-cp314-cp314-win_amd64.whl", hash = "sha256:a6e57b0abfe7cc513450fcf529eb486b6e4d3f8aee83e92eb5f1ef848218d456", size = 233739, upload-time = "2025-08-27T12:14:38.386Z" }, - { url = "https://files.pythonhosted.org/packages/c8/89/3e1b1c16d4c2d547c5717377a8df99aee8099ff050f87c45cb4d5fa70891/rpds_py-0.27.1-cp314-cp314-win_arm64.whl", hash = "sha256:faf8d146f3d476abfee026c4ae3bdd9ca14236ae4e4c310cbd1cf75ba33d24a3", size = 223120, upload-time = "2025-08-27T12:14:39.82Z" }, - { url = "https://files.pythonhosted.org/packages/62/7e/dc7931dc2fa4a6e46b2a4fa744a9fe5c548efd70e0ba74f40b39fa4a8c10/rpds_py-0.27.1-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:ba81d2b56b6d4911ce735aad0a1d4495e808b8ee4dc58715998741a26874e7c2", size = 358944, upload-time = "2025-08-27T12:14:41.199Z" }, - { url = "https://files.pythonhosted.org/packages/e6/22/4af76ac4e9f336bfb1a5f240d18a33c6b2fcaadb7472ac7680576512b49a/rpds_py-0.27.1-cp314-cp314t-macosx_11_0_arm64.whl", 
hash = "sha256:84f7d509870098de0e864cad0102711c1e24e9b1a50ee713b65928adb22269e4", size = 342283, upload-time = "2025-08-27T12:14:42.699Z" }, - { url = "https://files.pythonhosted.org/packages/1c/15/2a7c619b3c2272ea9feb9ade67a45c40b3eeb500d503ad4c28c395dc51b4/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a9e960fc78fecd1100539f14132425e1d5fe44ecb9239f8f27f079962021523e", size = 380320, upload-time = "2025-08-27T12:14:44.157Z" }, - { url = "https://files.pythonhosted.org/packages/a2/7d/4c6d243ba4a3057e994bb5bedd01b5c963c12fe38dde707a52acdb3849e7/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:62f85b665cedab1a503747617393573995dac4600ff51869d69ad2f39eb5e817", size = 391760, upload-time = "2025-08-27T12:14:45.845Z" }, - { url = "https://files.pythonhosted.org/packages/b4/71/b19401a909b83bcd67f90221330bc1ef11bc486fe4e04c24388d28a618ae/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fed467af29776f6556250c9ed85ea5a4dd121ab56a5f8b206e3e7a4c551e48ec", size = 522476, upload-time = "2025-08-27T12:14:47.364Z" }, - { url = "https://files.pythonhosted.org/packages/e4/44/1a3b9715c0455d2e2f0f6df5ee6d6f5afdc423d0773a8a682ed2b43c566c/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f2729615f9d430af0ae6b36cf042cb55c0936408d543fb691e1a9e36648fd35a", size = 403418, upload-time = "2025-08-27T12:14:49.991Z" }, - { url = "https://files.pythonhosted.org/packages/1c/4b/fb6c4f14984eb56673bc868a66536f53417ddb13ed44b391998100a06a96/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1b207d881a9aef7ba753d69c123a35d96ca7cb808056998f6b9e8747321f03b8", size = 384771, upload-time = "2025-08-27T12:14:52.159Z" }, - { url = "https://files.pythonhosted.org/packages/c0/56/d5265d2d28b7420d7b4d4d85cad8ef891760f5135102e60d5c970b976e41/rpds_py-0.27.1-cp314-cp314t-manylinux_2_31_riscv64.whl", 
hash = "sha256:639fd5efec029f99b79ae47e5d7e00ad8a773da899b6309f6786ecaf22948c48", size = 400022, upload-time = "2025-08-27T12:14:53.859Z" }, - { url = "https://files.pythonhosted.org/packages/8f/e9/9f5fc70164a569bdd6ed9046486c3568d6926e3a49bdefeeccfb18655875/rpds_py-0.27.1-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fecc80cb2a90e28af8a9b366edacf33d7a91cbfe4c2c4544ea1246e949cfebeb", size = 416787, upload-time = "2025-08-27T12:14:55.673Z" }, - { url = "https://files.pythonhosted.org/packages/d4/64/56dd03430ba491db943a81dcdef115a985aac5f44f565cd39a00c766d45c/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:42a89282d711711d0a62d6f57d81aa43a1368686c45bc1c46b7f079d55692734", size = 557538, upload-time = "2025-08-27T12:14:57.245Z" }, - { url = "https://files.pythonhosted.org/packages/3f/36/92cc885a3129993b1d963a2a42ecf64e6a8e129d2c7cc980dbeba84e55fb/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:cf9931f14223de59551ab9d38ed18d92f14f055a5f78c1d8ad6493f735021bbb", size = 588512, upload-time = "2025-08-27T12:14:58.728Z" }, - { url = "https://files.pythonhosted.org/packages/dd/10/6b283707780a81919f71625351182b4f98932ac89a09023cb61865136244/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:f39f58a27cc6e59f432b568ed8429c7e1641324fbe38131de852cd77b2d534b0", size = 555813, upload-time = "2025-08-27T12:15:00.334Z" }, - { url = "https://files.pythonhosted.org/packages/04/2e/30b5ea18c01379da6272a92825dd7e53dc9d15c88a19e97932d35d430ef7/rpds_py-0.27.1-cp314-cp314t-win32.whl", hash = "sha256:d5fa0ee122dc09e23607a28e6d7b150da16c662e66409bbe85230e4c85bb528a", size = 217385, upload-time = "2025-08-27T12:15:01.937Z" }, - { url = "https://files.pythonhosted.org/packages/32/7d/97119da51cb1dd3f2f3c0805f155a3aa4a95fa44fe7d78ae15e69edf4f34/rpds_py-0.27.1-cp314-cp314t-win_amd64.whl", hash = "sha256:6567d2bb951e21232c2f660c24cf3470bb96de56cdcb3f071a83feeaff8a2772", size = 230097, upload-time = 
"2025-08-27T12:15:03.961Z" }, - { url = "https://files.pythonhosted.org/packages/0c/ed/e1fba02de17f4f76318b834425257c8ea297e415e12c68b4361f63e8ae92/rpds_py-0.27.1-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:cdfe4bb2f9fe7458b7453ad3c33e726d6d1c7c0a72960bcc23800d77384e42df", size = 371402, upload-time = "2025-08-27T12:15:51.561Z" }, - { url = "https://files.pythonhosted.org/packages/af/7c/e16b959b316048b55585a697e94add55a4ae0d984434d279ea83442e460d/rpds_py-0.27.1-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:8fabb8fd848a5f75a2324e4a84501ee3a5e3c78d8603f83475441866e60b94a3", size = 354084, upload-time = "2025-08-27T12:15:53.219Z" }, - { url = "https://files.pythonhosted.org/packages/de/c1/ade645f55de76799fdd08682d51ae6724cb46f318573f18be49b1e040428/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eda8719d598f2f7f3e0f885cba8646644b55a187762bec091fa14a2b819746a9", size = 383090, upload-time = "2025-08-27T12:15:55.158Z" }, - { url = "https://files.pythonhosted.org/packages/1f/27/89070ca9b856e52960da1472efcb6c20ba27cfe902f4f23ed095b9cfc61d/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3c64d07e95606ec402a0a1c511fe003873fa6af630bda59bac77fac8b4318ebc", size = 394519, upload-time = "2025-08-27T12:15:57.238Z" }, - { url = "https://files.pythonhosted.org/packages/b3/28/be120586874ef906aa5aeeae95ae8df4184bc757e5b6bd1c729ccff45ed5/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:93a2ed40de81bcff59aabebb626562d48332f3d028ca2036f1d23cbb52750be4", size = 523817, upload-time = "2025-08-27T12:15:59.237Z" }, - { url = "https://files.pythonhosted.org/packages/a8/ef/70cc197bc11cfcde02a86f36ac1eed15c56667c2ebddbdb76a47e90306da/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:387ce8c44ae94e0ec50532d9cb0edce17311024c9794eb196b90e1058aadeb66", size = 403240, 
upload-time = "2025-08-27T12:16:00.923Z" }, - { url = "https://files.pythonhosted.org/packages/cf/35/46936cca449f7f518f2f4996e0e8344db4b57e2081e752441154089d2a5f/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aaf94f812c95b5e60ebaf8bfb1898a7d7cb9c1af5744d4a67fa47796e0465d4e", size = 385194, upload-time = "2025-08-27T12:16:02.802Z" }, - { url = "https://files.pythonhosted.org/packages/e1/62/29c0d3e5125c3270b51415af7cbff1ec587379c84f55a5761cc9efa8cd06/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_31_riscv64.whl", hash = "sha256:4848ca84d6ded9b58e474dfdbad4b8bfb450344c0551ddc8d958bf4b36aa837c", size = 402086, upload-time = "2025-08-27T12:16:04.806Z" }, - { url = "https://files.pythonhosted.org/packages/8f/66/03e1087679227785474466fdd04157fb793b3b76e3fcf01cbf4c693c1949/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2bde09cbcf2248b73c7c323be49b280180ff39fadcfe04e7b6f54a678d02a7cf", size = 419272, upload-time = "2025-08-27T12:16:06.471Z" }, - { url = "https://files.pythonhosted.org/packages/6a/24/e3e72d265121e00b063aef3e3501e5b2473cf1b23511d56e529531acf01e/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:94c44ee01fd21c9058f124d2d4f0c9dc7634bec93cd4b38eefc385dabe71acbf", size = 560003, upload-time = "2025-08-27T12:16:08.06Z" }, - { url = "https://files.pythonhosted.org/packages/26/ca/f5a344c534214cc2d41118c0699fffbdc2c1bc7046f2a2b9609765ab9c92/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_i686.whl", hash = "sha256:df8b74962e35c9249425d90144e721eed198e6555a0e22a563d29fe4486b51f6", size = 590482, upload-time = "2025-08-27T12:16:10.137Z" }, - { url = "https://files.pythonhosted.org/packages/ce/08/4349bdd5c64d9d193c360aa9db89adeee6f6682ab8825dca0a3f535f434f/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:dc23e6820e3b40847e2f4a7726462ba0cf53089512abe9ee16318c366494c17a", size = 556523, upload-time = "2025-08-27T12:16:12.188Z" }, 
-] - -[[package]] -name = "rsa" -version = "4.9.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "pyasn1" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/da/8a/22b7beea3ee0d44b1916c0c1cb0ee3af23b700b6da9f04991899d0c555d4/rsa-4.9.1.tar.gz", hash = "sha256:e7bdbfdb5497da4c07dfd35530e1a902659db6ff241e39d9953cad06ebd0ae75", size = 29034, upload-time = "2025-04-16T09:51:18.218Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/64/8d/0133e4eb4beed9e425d9a98ed6e081a55d195481b7632472be1af08d2f6b/rsa-4.9.1-py3-none-any.whl", hash = "sha256:68635866661c6836b8d39430f97a996acbd61bfa49406748ea243539fe239762", size = 34696, upload-time = "2025-04-16T09:51:17.142Z" }, -] - -[[package]] -name = "s3fs" -version = "2025.3.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "aiobotocore" }, - { name = "aiohttp" }, - { name = "fsspec" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/72/df/559dc6d796c38f1b8a09a5f6dcf62a467a84f3c87a837ee07c59f60a26ad/s3fs-2025.3.2.tar.gz", hash = "sha256:6798f896ec76dd3bfd8beb89f0bb7c5263cb2760e038bae0978505cd172a307c", size = 77280, upload-time = "2025-03-31T15:35:18.881Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/66/e1/4db0388df5655de92ce5f9b60d2bef220a58dde130e0453e5433c579986e/s3fs-2025.3.2-py3-none-any.whl", hash = "sha256:81eae3f37b4b04bcc08845d7bcc607c6ca45878813ef7e6a28d77b2688417130", size = 30485, upload-time = "2025-03-31T15:35:17.384Z" }, -] - -[package.optional-dependencies] -boto3 = [ - { name = "aiobotocore", extra = ["boto3"] }, -] - -[[package]] -name = "s3transfer" -version = "0.13.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "botocore" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/6d/05/d52bf1e65044b4e5e27d4e63e8d1579dbdec54fce685908ae09bc3720030/s3transfer-0.13.1.tar.gz", hash = 
"sha256:c3fdba22ba1bd367922f27ec8032d6a1cf5f10c934fb5d68cf60fd5a23d936cf", size = 150589, upload-time = "2025-07-18T19:22:42.31Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6d/4f/d073e09df851cfa251ef7840007d04db3293a0482ce607d2b993926089be/s3transfer-0.13.1-py3-none-any.whl", hash = "sha256:a981aa7429be23fe6dfc13e80e4020057cbab622b08c0315288758d67cabc724", size = 85308, upload-time = "2025-07-18T19:22:40.947Z" }, -] - -[[package]] -name = "scikit-learn" -version = "1.7.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "joblib" }, - { name = "numpy" }, - { name = "scipy" }, - { name = "threadpoolctl" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/98/c2/a7855e41c9d285dfe86dc50b250978105dce513d6e459ea66a6aeb0e1e0c/scikit_learn-1.7.2.tar.gz", hash = "sha256:20e9e49ecd130598f1ca38a1d85090e1a600147b9c02fa6f15d69cb53d968fda", size = 7193136, upload-time = "2025-09-09T08:21:29.075Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/43/83/564e141eef908a5863a54da8ca342a137f45a0bfb71d1d79704c9894c9d1/scikit_learn-1.7.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c7509693451651cd7361d30ce4e86a1347493554f172b1c72a39300fa2aea79e", size = 9331967, upload-time = "2025-09-09T08:20:32.421Z" }, - { url = "https://files.pythonhosted.org/packages/18/d6/ba863a4171ac9d7314c4d3fc251f015704a2caeee41ced89f321c049ed83/scikit_learn-1.7.2-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:0486c8f827c2e7b64837c731c8feff72c0bd2b998067a8a9cbc10643c31f0fe1", size = 8648645, upload-time = "2025-09-09T08:20:34.436Z" }, - { url = "https://files.pythonhosted.org/packages/ef/0e/97dbca66347b8cf0ea8b529e6bb9367e337ba2e8be0ef5c1a545232abfde/scikit_learn-1.7.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:89877e19a80c7b11a2891a27c21c4894fb18e2c2e077815bcade10d34287b20d", size = 9715424, upload-time = "2025-09-09T08:20:36.776Z" }, - { url = 
"https://files.pythonhosted.org/packages/f7/32/1f3b22e3207e1d2c883a7e09abb956362e7d1bd2f14458c7de258a26ac15/scikit_learn-1.7.2-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8da8bf89d4d79aaec192d2bda62f9b56ae4e5b4ef93b6a56b5de4977e375c1f1", size = 9509234, upload-time = "2025-09-09T08:20:38.957Z" }, - { url = "https://files.pythonhosted.org/packages/9f/71/34ddbd21f1da67c7a768146968b4d0220ee6831e4bcbad3e03dd3eae88b6/scikit_learn-1.7.2-cp311-cp311-win_amd64.whl", hash = "sha256:9b7ed8d58725030568523e937c43e56bc01cadb478fc43c042a9aca1dacb3ba1", size = 8894244, upload-time = "2025-09-09T08:20:41.166Z" }, - { url = "https://files.pythonhosted.org/packages/a7/aa/3996e2196075689afb9fce0410ebdb4a09099d7964d061d7213700204409/scikit_learn-1.7.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:8d91a97fa2b706943822398ab943cde71858a50245e31bc71dba62aab1d60a96", size = 9259818, upload-time = "2025-09-09T08:20:43.19Z" }, - { url = "https://files.pythonhosted.org/packages/43/5d/779320063e88af9c4a7c2cf463ff11c21ac9c8bd730c4a294b0000b666c9/scikit_learn-1.7.2-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:acbc0f5fd2edd3432a22c69bed78e837c70cf896cd7993d71d51ba6708507476", size = 8636997, upload-time = "2025-09-09T08:20:45.468Z" }, - { url = "https://files.pythonhosted.org/packages/5c/d0/0c577d9325b05594fdd33aa970bf53fb673f051a45496842caee13cfd7fe/scikit_learn-1.7.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e5bf3d930aee75a65478df91ac1225ff89cd28e9ac7bd1196853a9229b6adb0b", size = 9478381, upload-time = "2025-09-09T08:20:47.982Z" }, - { url = "https://files.pythonhosted.org/packages/82/70/8bf44b933837ba8494ca0fc9a9ab60f1c13b062ad0197f60a56e2fc4c43e/scikit_learn-1.7.2-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b4d6e9deed1a47aca9fe2f267ab8e8fe82ee20b4526b2c0cd9e135cea10feb44", size = 9300296, upload-time = "2025-09-09T08:20:50.366Z" }, - { url = 
"https://files.pythonhosted.org/packages/c6/99/ed35197a158f1fdc2fe7c3680e9c70d0128f662e1fee4ed495f4b5e13db0/scikit_learn-1.7.2-cp312-cp312-win_amd64.whl", hash = "sha256:6088aa475f0785e01bcf8529f55280a3d7d298679f50c0bb70a2364a82d0b290", size = 8731256, upload-time = "2025-09-09T08:20:52.627Z" }, - { url = "https://files.pythonhosted.org/packages/ae/93/a3038cb0293037fd335f77f31fe053b89c72f17b1c8908c576c29d953e84/scikit_learn-1.7.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:0b7dacaa05e5d76759fb071558a8b5130f4845166d88654a0f9bdf3eb57851b7", size = 9212382, upload-time = "2025-09-09T08:20:54.731Z" }, - { url = "https://files.pythonhosted.org/packages/40/dd/9a88879b0c1104259136146e4742026b52df8540c39fec21a6383f8292c7/scikit_learn-1.7.2-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:abebbd61ad9e1deed54cca45caea8ad5f79e1b93173dece40bb8e0c658dbe6fe", size = 8592042, upload-time = "2025-09-09T08:20:57.313Z" }, - { url = "https://files.pythonhosted.org/packages/46/af/c5e286471b7d10871b811b72ae794ac5fe2989c0a2df07f0ec723030f5f5/scikit_learn-1.7.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:502c18e39849c0ea1a5d681af1dbcf15f6cce601aebb657aabbfe84133c1907f", size = 9434180, upload-time = "2025-09-09T08:20:59.671Z" }, - { url = "https://files.pythonhosted.org/packages/f1/fd/df59faa53312d585023b2da27e866524ffb8faf87a68516c23896c718320/scikit_learn-1.7.2-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7a4c328a71785382fe3fe676a9ecf2c86189249beff90bf85e22bdb7efaf9ae0", size = 9283660, upload-time = "2025-09-09T08:21:01.71Z" }, - { url = "https://files.pythonhosted.org/packages/a7/c7/03000262759d7b6f38c836ff9d512f438a70d8a8ddae68ee80de72dcfb63/scikit_learn-1.7.2-cp313-cp313-win_amd64.whl", hash = "sha256:63a9afd6f7b229aad94618c01c252ce9e6fa97918c5ca19c9a17a087d819440c", size = 8702057, upload-time = "2025-09-09T08:21:04.234Z" }, - { url = 
"https://files.pythonhosted.org/packages/55/87/ef5eb1f267084532c8e4aef98a28b6ffe7425acbfd64b5e2f2e066bc29b3/scikit_learn-1.7.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:9acb6c5e867447b4e1390930e3944a005e2cb115922e693c08a323421a6966e8", size = 9558731, upload-time = "2025-09-09T08:21:06.381Z" }, - { url = "https://files.pythonhosted.org/packages/93/f8/6c1e3fc14b10118068d7938878a9f3f4e6d7b74a8ddb1e5bed65159ccda8/scikit_learn-1.7.2-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:2a41e2a0ef45063e654152ec9d8bcfc39f7afce35b08902bfe290c2498a67a6a", size = 9038852, upload-time = "2025-09-09T08:21:08.628Z" }, - { url = "https://files.pythonhosted.org/packages/83/87/066cafc896ee540c34becf95d30375fe5cbe93c3b75a0ee9aa852cd60021/scikit_learn-1.7.2-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:98335fb98509b73385b3ab2bd0639b1f610541d3988ee675c670371d6a87aa7c", size = 9527094, upload-time = "2025-09-09T08:21:11.486Z" }, - { url = "https://files.pythonhosted.org/packages/9c/2b/4903e1ccafa1f6453b1ab78413938c8800633988c838aa0be386cbb33072/scikit_learn-1.7.2-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:191e5550980d45449126e23ed1d5e9e24b2c68329ee1f691a3987476e115e09c", size = 9367436, upload-time = "2025-09-09T08:21:13.602Z" }, - { url = "https://files.pythonhosted.org/packages/b5/aa/8444be3cfb10451617ff9d177b3c190288f4563e6c50ff02728be67ad094/scikit_learn-1.7.2-cp313-cp313t-win_amd64.whl", hash = "sha256:57dc4deb1d3762c75d685507fbd0bc17160144b2f2ba4ccea5dc285ab0d0e973", size = 9275749, upload-time = "2025-09-09T08:21:15.96Z" }, - { url = "https://files.pythonhosted.org/packages/d9/82/dee5acf66837852e8e68df6d8d3a6cb22d3df997b733b032f513d95205b7/scikit_learn-1.7.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fa8f63940e29c82d1e67a45d5297bdebbcb585f5a5a50c4914cc2e852ab77f33", size = 9208906, upload-time = "2025-09-09T08:21:18.557Z" }, - { url = 
"https://files.pythonhosted.org/packages/3c/30/9029e54e17b87cb7d50d51a5926429c683d5b4c1732f0507a6c3bed9bf65/scikit_learn-1.7.2-cp314-cp314-macosx_12_0_arm64.whl", hash = "sha256:f95dc55b7902b91331fa4e5845dd5bde0580c9cd9612b1b2791b7e80c3d32615", size = 8627836, upload-time = "2025-09-09T08:21:20.695Z" }, - { url = "https://files.pythonhosted.org/packages/60/18/4a52c635c71b536879f4b971c2cedf32c35ee78f48367885ed8025d1f7ee/scikit_learn-1.7.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9656e4a53e54578ad10a434dc1f993330568cfee176dff07112b8785fb413106", size = 9426236, upload-time = "2025-09-09T08:21:22.645Z" }, - { url = "https://files.pythonhosted.org/packages/99/7e/290362f6ab582128c53445458a5befd471ed1ea37953d5bcf80604619250/scikit_learn-1.7.2-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:96dc05a854add0e50d3f47a1ef21a10a595016da5b007c7d9cd9d0bffd1fcc61", size = 9312593, upload-time = "2025-09-09T08:21:24.65Z" }, - { url = "https://files.pythonhosted.org/packages/8e/87/24f541b6d62b1794939ae6422f8023703bbf6900378b2b34e0b4384dfefd/scikit_learn-1.7.2-cp314-cp314-win_amd64.whl", hash = "sha256:bb24510ed3f9f61476181e4db51ce801e2ba37541def12dc9333b946fc7a9cf8", size = 8820007, upload-time = "2025-09-09T08:21:26.713Z" }, -] - -[[package]] -name = "scipy" -version = "1.16.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "numpy" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/4c/3b/546a6f0bfe791bbb7f8d591613454d15097e53f906308ec6f7c1ce588e8e/scipy-1.16.2.tar.gz", hash = "sha256:af029b153d243a80afb6eabe40b0a07f8e35c9adc269c019f364ad747f826a6b", size = 30580599, upload-time = "2025-09-11T17:48:08.271Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/0b/ef/37ed4b213d64b48422df92560af7300e10fe30b5d665dd79932baebee0c6/scipy-1.16.2-cp311-cp311-macosx_10_14_x86_64.whl", hash = "sha256:6ab88ea43a57da1af33292ebd04b417e8e2eaf9d5aa05700be8d6e1b6501cd92", size 
= 36619956, upload-time = "2025-09-11T17:39:20.5Z" }, - { url = "https://files.pythonhosted.org/packages/85/ab/5c2eba89b9416961a982346a4d6a647d78c91ec96ab94ed522b3b6baf444/scipy-1.16.2-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:c95e96c7305c96ede73a7389f46ccd6c659c4da5ef1b2789466baeaed3622b6e", size = 28931117, upload-time = "2025-09-11T17:39:29.06Z" }, - { url = "https://files.pythonhosted.org/packages/80/d1/eed51ab64d227fe60229a2d57fb60ca5898cfa50ba27d4f573e9e5f0b430/scipy-1.16.2-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:87eb178db04ece7c698220d523c170125dbffebb7af0345e66c3554f6f60c173", size = 20921997, upload-time = "2025-09-11T17:39:34.892Z" }, - { url = "https://files.pythonhosted.org/packages/be/7c/33ea3e23bbadde96726edba6bf9111fb1969d14d9d477ffa202c67bec9da/scipy-1.16.2-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:4e409eac067dcee96a57fbcf424c13f428037827ec7ee3cb671ff525ca4fc34d", size = 23523374, upload-time = "2025-09-11T17:39:40.846Z" }, - { url = "https://files.pythonhosted.org/packages/96/0b/7399dc96e1e3f9a05e258c98d716196a34f528eef2ec55aad651ed136d03/scipy-1.16.2-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:e574be127bb760f0dad24ff6e217c80213d153058372362ccb9555a10fc5e8d2", size = 33583702, upload-time = "2025-09-11T17:39:49.011Z" }, - { url = "https://files.pythonhosted.org/packages/1a/bc/a5c75095089b96ea72c1bd37a4497c24b581ec73db4ef58ebee142ad2d14/scipy-1.16.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f5db5ba6188d698ba7abab982ad6973265b74bb40a1efe1821b58c87f73892b9", size = 35883427, upload-time = "2025-09-11T17:39:57.406Z" }, - { url = "https://files.pythonhosted.org/packages/ab/66/e25705ca3d2b87b97fe0a278a24b7f477b4023a926847935a1a71488a6a6/scipy-1.16.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ec6e74c4e884104ae006d34110677bfe0098203a3fec2f3faf349f4cb05165e3", size = 36212940, upload-time = "2025-09-11T17:40:06.013Z" }, - { url = 
"https://files.pythonhosted.org/packages/d6/fd/0bb911585e12f3abdd603d721d83fc1c7492835e1401a0e6d498d7822b4b/scipy-1.16.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:912f46667d2d3834bc3d57361f854226475f695eb08c08a904aadb1c936b6a88", size = 38865092, upload-time = "2025-09-11T17:40:15.143Z" }, - { url = "https://files.pythonhosted.org/packages/d6/73/c449a7d56ba6e6f874183759f8483cde21f900a8be117d67ffbb670c2958/scipy-1.16.2-cp311-cp311-win_amd64.whl", hash = "sha256:91e9e8a37befa5a69e9cacbe0bcb79ae5afb4a0b130fd6db6ee6cc0d491695fa", size = 38687626, upload-time = "2025-09-11T17:40:24.041Z" }, - { url = "https://files.pythonhosted.org/packages/68/72/02f37316adf95307f5d9e579023c6899f89ff3a051fa079dbd6faafc48e5/scipy-1.16.2-cp311-cp311-win_arm64.whl", hash = "sha256:f3bf75a6dcecab62afde4d1f973f1692be013110cad5338007927db8da73249c", size = 25503506, upload-time = "2025-09-11T17:40:30.703Z" }, - { url = "https://files.pythonhosted.org/packages/b7/8d/6396e00db1282279a4ddd507c5f5e11f606812b608ee58517ce8abbf883f/scipy-1.16.2-cp312-cp312-macosx_10_14_x86_64.whl", hash = "sha256:89d6c100fa5c48472047632e06f0876b3c4931aac1f4291afc81a3644316bb0d", size = 36646259, upload-time = "2025-09-11T17:40:39.329Z" }, - { url = "https://files.pythonhosted.org/packages/3b/93/ea9edd7e193fceb8eef149804491890bde73fb169c896b61aa3e2d1e4e77/scipy-1.16.2-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:ca748936cd579d3f01928b30a17dc474550b01272d8046e3e1ee593f23620371", size = 28888976, upload-time = "2025-09-11T17:40:46.82Z" }, - { url = "https://files.pythonhosted.org/packages/91/4d/281fddc3d80fd738ba86fd3aed9202331180b01e2c78eaae0642f22f7e83/scipy-1.16.2-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:fac4f8ce2ddb40e2e3d0f7ec36d2a1e7f92559a2471e59aec37bd8d9de01fec0", size = 20879905, upload-time = "2025-09-11T17:40:52.545Z" }, - { url = 
"https://files.pythonhosted.org/packages/69/40/b33b74c84606fd301b2915f0062e45733c6ff5708d121dd0deaa8871e2d0/scipy-1.16.2-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:033570f1dcefd79547a88e18bccacff025c8c647a330381064f561d43b821232", size = 23553066, upload-time = "2025-09-11T17:40:59.014Z" }, - { url = "https://files.pythonhosted.org/packages/55/a7/22c739e2f21a42cc8f16bc76b47cff4ed54fbe0962832c589591c2abec34/scipy-1.16.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ea3421209bf00c8a5ef2227de496601087d8f638a2363ee09af059bd70976dc1", size = 33336407, upload-time = "2025-09-11T17:41:06.796Z" }, - { url = "https://files.pythonhosted.org/packages/53/11/a0160990b82999b45874dc60c0c183d3a3a969a563fffc476d5a9995c407/scipy-1.16.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f66bd07ba6f84cd4a380b41d1bf3c59ea488b590a2ff96744845163309ee8e2f", size = 35673281, upload-time = "2025-09-11T17:41:15.055Z" }, - { url = "https://files.pythonhosted.org/packages/96/53/7ef48a4cfcf243c3d0f1643f5887c81f29fdf76911c4e49331828e19fc0a/scipy-1.16.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:5e9feab931bd2aea4a23388c962df6468af3d808ddf2d40f94a81c5dc38f32ef", size = 36004222, upload-time = "2025-09-11T17:41:23.868Z" }, - { url = "https://files.pythonhosted.org/packages/49/7f/71a69e0afd460049d41c65c630c919c537815277dfea214031005f474d78/scipy-1.16.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:03dfc75e52f72cf23ec2ced468645321407faad8f0fe7b1f5b49264adbc29cb1", size = 38664586, upload-time = "2025-09-11T17:41:31.021Z" }, - { url = "https://files.pythonhosted.org/packages/34/95/20e02ca66fb495a95fba0642fd48e0c390d0ece9b9b14c6e931a60a12dea/scipy-1.16.2-cp312-cp312-win_amd64.whl", hash = "sha256:0ce54e07bbb394b417457409a64fd015be623f36e330ac49306433ffe04bc97e", size = 38550641, upload-time = "2025-09-11T17:41:36.61Z" }, - { url = 
"https://files.pythonhosted.org/packages/92/ad/13646b9beb0a95528ca46d52b7babafbe115017814a611f2065ee4e61d20/scipy-1.16.2-cp312-cp312-win_arm64.whl", hash = "sha256:2a8ffaa4ac0df81a0b94577b18ee079f13fecdb924df3328fc44a7dc5ac46851", size = 25456070, upload-time = "2025-09-11T17:41:41.3Z" }, - { url = "https://files.pythonhosted.org/packages/c1/27/c5b52f1ee81727a9fc457f5ac1e9bf3d6eab311805ea615c83c27ba06400/scipy-1.16.2-cp313-cp313-macosx_10_14_x86_64.whl", hash = "sha256:84f7bf944b43e20b8a894f5fe593976926744f6c185bacfcbdfbb62736b5cc70", size = 36604856, upload-time = "2025-09-11T17:41:47.695Z" }, - { url = "https://files.pythonhosted.org/packages/32/a9/15c20d08e950b540184caa8ced675ba1128accb0e09c653780ba023a4110/scipy-1.16.2-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:5c39026d12edc826a1ef2ad35ad1e6d7f087f934bb868fc43fa3049c8b8508f9", size = 28864626, upload-time = "2025-09-11T17:41:52.642Z" }, - { url = "https://files.pythonhosted.org/packages/4c/fc/ea36098df653cca26062a627c1a94b0de659e97127c8491e18713ca0e3b9/scipy-1.16.2-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:e52729ffd45b68777c5319560014d6fd251294200625d9d70fd8626516fc49f5", size = 20855689, upload-time = "2025-09-11T17:41:57.886Z" }, - { url = "https://files.pythonhosted.org/packages/dc/6f/d0b53be55727f3e6d7c72687ec18ea6d0047cf95f1f77488b99a2bafaee1/scipy-1.16.2-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:024dd4a118cccec09ca3209b7e8e614931a6ffb804b2a601839499cb88bdf925", size = 23512151, upload-time = "2025-09-11T17:42:02.303Z" }, - { url = "https://files.pythonhosted.org/packages/11/85/bf7dab56e5c4b1d3d8eef92ca8ede788418ad38a7dc3ff50262f00808760/scipy-1.16.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7a5dc7ee9c33019973a470556081b0fd3c9f4c44019191039f9769183141a4d9", size = 33329824, upload-time = "2025-09-11T17:42:07.549Z" }, - { url = 
"https://files.pythonhosted.org/packages/da/6a/1a927b14ddc7714111ea51f4e568203b2bb6ed59bdd036d62127c1a360c8/scipy-1.16.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c2275ff105e508942f99d4e3bc56b6ef5e4b3c0af970386ca56b777608ce95b7", size = 35681881, upload-time = "2025-09-11T17:42:13.255Z" }, - { url = "https://files.pythonhosted.org/packages/c1/5f/331148ea5780b4fcc7007a4a6a6ee0a0c1507a796365cc642d4d226e1c3a/scipy-1.16.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:af80196eaa84f033e48444d2e0786ec47d328ba00c71e4299b602235ffef9acb", size = 36006219, upload-time = "2025-09-11T17:42:18.765Z" }, - { url = "https://files.pythonhosted.org/packages/46/3a/e991aa9d2aec723b4a8dcfbfc8365edec5d5e5f9f133888067f1cbb7dfc1/scipy-1.16.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9fb1eb735fe3d6ed1f89918224e3385fbf6f9e23757cacc35f9c78d3b712dd6e", size = 38682147, upload-time = "2025-09-11T17:42:25.177Z" }, - { url = "https://files.pythonhosted.org/packages/a1/57/0f38e396ad19e41b4c5db66130167eef8ee620a49bc7d0512e3bb67e0cab/scipy-1.16.2-cp313-cp313-win_amd64.whl", hash = "sha256:fda714cf45ba43c9d3bae8f2585c777f64e3f89a2e073b668b32ede412d8f52c", size = 38520766, upload-time = "2025-09-11T17:43:25.342Z" }, - { url = "https://files.pythonhosted.org/packages/1b/a5/85d3e867b6822d331e26c862a91375bb7746a0b458db5effa093d34cdb89/scipy-1.16.2-cp313-cp313-win_arm64.whl", hash = "sha256:2f5350da923ccfd0b00e07c3e5cfb316c1c0d6c1d864c07a72d092e9f20db104", size = 25451169, upload-time = "2025-09-11T17:43:30.198Z" }, - { url = "https://files.pythonhosted.org/packages/09/d9/60679189bcebda55992d1a45498de6d080dcaf21ce0c8f24f888117e0c2d/scipy-1.16.2-cp313-cp313t-macosx_10_14_x86_64.whl", hash = "sha256:53d8d2ee29b925344c13bda64ab51785f016b1b9617849dac10897f0701b20c1", size = 37012682, upload-time = "2025-09-11T17:42:30.677Z" }, - { url = 
"https://files.pythonhosted.org/packages/83/be/a99d13ee4d3b7887a96f8c71361b9659ba4ef34da0338f14891e102a127f/scipy-1.16.2-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:9e05e33657efb4c6a9d23bd8300101536abd99c85cca82da0bffff8d8764d08a", size = 29389926, upload-time = "2025-09-11T17:42:35.845Z" }, - { url = "https://files.pythonhosted.org/packages/bf/0a/130164a4881cec6ca8c00faf3b57926f28ed429cd6001a673f83c7c2a579/scipy-1.16.2-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:7fe65b36036357003b3ef9d37547abeefaa353b237e989c21027b8ed62b12d4f", size = 21381152, upload-time = "2025-09-11T17:42:40.07Z" }, - { url = "https://files.pythonhosted.org/packages/47/a6/503ffb0310ae77fba874e10cddfc4a1280bdcca1d13c3751b8c3c2996cf8/scipy-1.16.2-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:6406d2ac6d40b861cccf57f49592f9779071655e9f75cd4f977fa0bdd09cb2e4", size = 23914410, upload-time = "2025-09-11T17:42:44.313Z" }, - { url = "https://files.pythonhosted.org/packages/fa/c7/1147774bcea50d00c02600aadaa919facbd8537997a62496270133536ed6/scipy-1.16.2-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ff4dc42bd321991fbf611c23fc35912d690f731c9914bf3af8f417e64aca0f21", size = 33481880, upload-time = "2025-09-11T17:42:49.325Z" }, - { url = "https://files.pythonhosted.org/packages/6a/74/99d5415e4c3e46b2586f30cdbecb95e101c7192628a484a40dd0d163811a/scipy-1.16.2-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:654324826654d4d9133e10675325708fb954bc84dae6e9ad0a52e75c6b1a01d7", size = 35791425, upload-time = "2025-09-11T17:42:54.711Z" }, - { url = "https://files.pythonhosted.org/packages/1b/ee/a6559de7c1cc710e938c0355d9d4fbcd732dac4d0d131959d1f3b63eb29c/scipy-1.16.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:63870a84cd15c44e65220eaed2dac0e8f8b26bbb991456a033c1d9abfe8a94f8", size = 36178622, upload-time = "2025-09-11T17:43:00.375Z" }, - { url = 
"https://files.pythonhosted.org/packages/4e/7b/f127a5795d5ba8ece4e0dce7d4a9fb7cb9e4f4757137757d7a69ab7d4f1a/scipy-1.16.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:fa01f0f6a3050fa6a9771a95d5faccc8e2f5a92b4a2e5440a0fa7264a2398472", size = 38783985, upload-time = "2025-09-11T17:43:06.661Z" }, - { url = "https://files.pythonhosted.org/packages/3e/9f/bc81c1d1e033951eb5912cd3750cc005943afa3e65a725d2443a3b3c4347/scipy-1.16.2-cp313-cp313t-win_amd64.whl", hash = "sha256:116296e89fba96f76353a8579820c2512f6e55835d3fad7780fece04367de351", size = 38631367, upload-time = "2025-09-11T17:43:14.44Z" }, - { url = "https://files.pythonhosted.org/packages/d6/5e/2cc7555fd81d01814271412a1d59a289d25f8b63208a0a16c21069d55d3e/scipy-1.16.2-cp313-cp313t-win_arm64.whl", hash = "sha256:98e22834650be81d42982360382b43b17f7ba95e0e6993e2a4f5b9ad9283a94d", size = 25787992, upload-time = "2025-09-11T17:43:19.745Z" }, - { url = "https://files.pythonhosted.org/packages/8b/ac/ad8951250516db71619f0bd3b2eb2448db04b720a003dd98619b78b692c0/scipy-1.16.2-cp314-cp314-macosx_10_14_x86_64.whl", hash = "sha256:567e77755019bb7461513c87f02bb73fb65b11f049aaaa8ca17cfaa5a5c45d77", size = 36595109, upload-time = "2025-09-11T17:43:35.713Z" }, - { url = "https://files.pythonhosted.org/packages/ff/f6/5779049ed119c5b503b0f3dc6d6f3f68eefc3a9190d4ad4c276f854f051b/scipy-1.16.2-cp314-cp314-macosx_12_0_arm64.whl", hash = "sha256:17d9bb346194e8967296621208fcdfd39b55498ef7d2f376884d5ac47cec1a70", size = 28859110, upload-time = "2025-09-11T17:43:40.814Z" }, - { url = "https://files.pythonhosted.org/packages/82/09/9986e410ae38bf0a0c737ff8189ac81a93b8e42349aac009891c054403d7/scipy-1.16.2-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:0a17541827a9b78b777d33b623a6dcfe2ef4a25806204d08ead0768f4e529a88", size = 20850110, upload-time = "2025-09-11T17:43:44.981Z" }, - { url = 
"https://files.pythonhosted.org/packages/0d/ad/485cdef2d9215e2a7df6d61b81d2ac073dfacf6ae24b9ae87274c4e936ae/scipy-1.16.2-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:d7d4c6ba016ffc0f9568d012f5f1eb77ddd99412aea121e6fa8b4c3b7cbad91f", size = 23497014, upload-time = "2025-09-11T17:43:49.074Z" }, - { url = "https://files.pythonhosted.org/packages/a7/74/f6a852e5d581122b8f0f831f1d1e32fb8987776ed3658e95c377d308ed86/scipy-1.16.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:9702c4c023227785c779cba2e1d6f7635dbb5b2e0936cdd3a4ecb98d78fd41eb", size = 33401155, upload-time = "2025-09-11T17:43:54.661Z" }, - { url = "https://files.pythonhosted.org/packages/d9/f5/61d243bbc7c6e5e4e13dde9887e84a5cbe9e0f75fd09843044af1590844e/scipy-1.16.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d1cdf0ac28948d225decdefcc45ad7dd91716c29ab56ef32f8e0d50657dffcc7", size = 35691174, upload-time = "2025-09-11T17:44:00.101Z" }, - { url = "https://files.pythonhosted.org/packages/03/99/59933956331f8cc57e406cdb7a483906c74706b156998f322913e789c7e1/scipy-1.16.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:70327d6aa572a17c2941cdfb20673f82e536e91850a2e4cb0c5b858b690e1548", size = 36070752, upload-time = "2025-09-11T17:44:05.619Z" }, - { url = "https://files.pythonhosted.org/packages/c6/7d/00f825cfb47ee19ef74ecf01244b43e95eae74e7e0ff796026ea7cd98456/scipy-1.16.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5221c0b2a4b58aa7c4ed0387d360fd90ee9086d383bb34d9f2789fafddc8a936", size = 38701010, upload-time = "2025-09-11T17:44:11.322Z" }, - { url = "https://files.pythonhosted.org/packages/e4/9f/b62587029980378304ba5a8563d376c96f40b1e133daacee76efdcae32de/scipy-1.16.2-cp314-cp314-win_amd64.whl", hash = "sha256:f5a85d7b2b708025af08f060a496dd261055b617d776fc05a1a1cc69e09fe9ff", size = 39360061, upload-time = "2025-09-11T17:45:09.814Z" }, - { url = 
"https://files.pythonhosted.org/packages/82/04/7a2f1609921352c7fbee0815811b5050582f67f19983096c4769867ca45f/scipy-1.16.2-cp314-cp314-win_arm64.whl", hash = "sha256:2cc73a33305b4b24556957d5857d6253ce1e2dcd67fa0ff46d87d1670b3e1e1d", size = 26126914, upload-time = "2025-09-11T17:45:14.73Z" }, - { url = "https://files.pythonhosted.org/packages/51/b9/60929ce350c16b221928725d2d1d7f86cf96b8bc07415547057d1196dc92/scipy-1.16.2-cp314-cp314t-macosx_10_14_x86_64.whl", hash = "sha256:9ea2a3fed83065d77367775d689401a703d0f697420719ee10c0780bcab594d8", size = 37013193, upload-time = "2025-09-11T17:44:16.757Z" }, - { url = "https://files.pythonhosted.org/packages/2a/41/ed80e67782d4bc5fc85a966bc356c601afddd175856ba7c7bb6d9490607e/scipy-1.16.2-cp314-cp314t-macosx_12_0_arm64.whl", hash = "sha256:7280d926f11ca945c3ef92ba960fa924e1465f8d07ce3a9923080363390624c4", size = 29390172, upload-time = "2025-09-11T17:44:21.783Z" }, - { url = "https://files.pythonhosted.org/packages/c4/a3/2f673ace4090452696ccded5f5f8efffb353b8f3628f823a110e0170b605/scipy-1.16.2-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:8afae1756f6a1fe04636407ef7dbece33d826a5d462b74f3d0eb82deabefd831", size = 21381326, upload-time = "2025-09-11T17:44:25.982Z" }, - { url = "https://files.pythonhosted.org/packages/42/bf/59df61c5d51395066c35836b78136accf506197617c8662e60ea209881e1/scipy-1.16.2-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:5c66511f29aa8d233388e7416a3f20d5cae7a2744d5cee2ecd38c081f4e861b3", size = 23915036, upload-time = "2025-09-11T17:44:30.527Z" }, - { url = "https://files.pythonhosted.org/packages/91/c3/edc7b300dc16847ad3672f1a6f3f7c5d13522b21b84b81c265f4f2760d4a/scipy-1.16.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:efe6305aeaa0e96b0ccca5ff647a43737d9a092064a3894e46c414db84bc54ac", size = 33484341, upload-time = "2025-09-11T17:44:35.981Z" }, - { url = 
"https://files.pythonhosted.org/packages/26/c7/24d1524e72f06ff141e8d04b833c20db3021020563272ccb1b83860082a9/scipy-1.16.2-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7f3a337d9ae06a1e8d655ee9d8ecb835ea5ddcdcbd8d23012afa055ab014f374", size = 35790840, upload-time = "2025-09-11T17:44:41.76Z" }, - { url = "https://files.pythonhosted.org/packages/aa/b7/5aaad984eeedd56858dc33d75efa59e8ce798d918e1033ef62d2708f2c3d/scipy-1.16.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:bab3605795d269067d8ce78a910220262711b753de8913d3deeaedb5dded3bb6", size = 36174716, upload-time = "2025-09-11T17:44:47.316Z" }, - { url = "https://files.pythonhosted.org/packages/fd/c2/e276a237acb09824822b0ada11b028ed4067fdc367a946730979feacb870/scipy-1.16.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:b0348d8ddb55be2a844c518cd8cc8deeeb8aeba707cf834db5758fc89b476a2c", size = 38790088, upload-time = "2025-09-11T17:44:53.011Z" }, - { url = "https://files.pythonhosted.org/packages/c6/b4/5c18a766e8353015439f3780f5fc473f36f9762edc1a2e45da3ff5a31b21/scipy-1.16.2-cp314-cp314t-win_amd64.whl", hash = "sha256:26284797e38b8a75e14ea6631d29bda11e76ceaa6ddb6fdebbfe4c4d90faf2f9", size = 39457455, upload-time = "2025-09-11T17:44:58.899Z" }, - { url = "https://files.pythonhosted.org/packages/97/30/2f9a5243008f76dfc5dee9a53dfb939d9b31e16ce4bd4f2e628bfc5d89d2/scipy-1.16.2-cp314-cp314t-win_arm64.whl", hash = "sha256:d2a4472c231328d4de38d5f1f68fdd6d28a615138f842580a8a321b5845cf779", size = 26448374, upload-time = "2025-09-11T17:45:03.45Z" }, -] - -[[package]] -name = "semver" -version = "3.0.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/72/d1/d3159231aec234a59dd7d601e9dd9fe96f3afff15efd33c1070019b26132/semver-3.0.4.tar.gz", hash = "sha256:afc7d8c584a5ed0a11033af086e8af226a9c0b206f313e0301f8dd7b6b589602", size = 269730, upload-time = "2025-01-24T13:19:27.617Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/a6/24/4d91e05817e92e3a61c8a21e08fd0f390f5301f1c448b137c57c4bc6e543/semver-3.0.4-py3-none-any.whl", hash = "sha256:9c824d87ba7f7ab4a1890799cec8596f15c1241cb473404ea1cb0c55e4b04746", size = 17912, upload-time = "2025-01-24T13:19:24.949Z" }, -] - -[[package]] -name = "sentry-sdk" -version = "2.38.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "certifi" }, - { name = "urllib3" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/b2/22/60fd703b34d94d216b2387e048ac82de3e86b63bc28869fb076f8bb0204a/sentry_sdk-2.38.0.tar.gz", hash = "sha256:792d2af45e167e2f8a3347143f525b9b6bac6f058fb2014720b40b84ccbeb985", size = 348116, upload-time = "2025-09-15T15:00:37.846Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/7a/84/bde4c4bbb269b71bc09316af8eb00da91f67814d40337cc12ef9c8742541/sentry_sdk-2.38.0-py2.py3-none-any.whl", hash = "sha256:2324aea8573a3fa1576df7fb4d65c4eb8d9929c8fa5939647397a07179eef8d0", size = 370346, upload-time = "2025-09-15T15:00:35.821Z" }, -] - -[package.optional-dependencies] -fastapi = [ - { name = "fastapi" }, -] - -[[package]] -name = "setuptools" -version = "80.9.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/18/5d/3bf57dcd21979b887f014ea83c24ae194cfcd12b9e0fda66b957c69d1fca/setuptools-80.9.0.tar.gz", hash = "sha256:f36b47402ecde768dbfafc46e8e4207b4360c654f1f3bb84475f0a28628fb19c", size = 1319958, upload-time = "2025-05-27T00:56:51.443Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a3/dc/17031897dae0efacfea57dfd3a82fdd2a2aeb58e0ff71b77b87e44edc772/setuptools-80.9.0-py3-none-any.whl", hash = "sha256:062d34222ad13e0cc312a4c02d73f059e86a4acbfbdea8f8f76b28c99f306922", size = 1201486, upload-time = "2025-05-27T00:56:49.664Z" }, -] - -[[package]] -name = "shapely" -version = "2.1.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "numpy" }, -] 
-sdist = { url = "https://files.pythonhosted.org/packages/ca/3c/2da625233f4e605155926566c0e7ea8dda361877f48e8b1655e53456f252/shapely-2.1.1.tar.gz", hash = "sha256:500621967f2ffe9642454808009044c21e5b35db89ce69f8a2042c2ffd0e2772", size = 315422, upload-time = "2025-05-19T11:04:41.265Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/19/97/2df985b1e03f90c503796ad5ecd3d9ed305123b64d4ccb54616b30295b29/shapely-2.1.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:587a1aa72bc858fab9b8c20427b5f6027b7cbc92743b8e2c73b9de55aa71c7a7", size = 1819368, upload-time = "2025-05-19T11:03:55.937Z" }, - { url = "https://files.pythonhosted.org/packages/56/17/504518860370f0a28908b18864f43d72f03581e2b6680540ca668f07aa42/shapely-2.1.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9fa5c53b0791a4b998f9ad84aad456c988600757a96b0a05e14bba10cebaaaea", size = 1625362, upload-time = "2025-05-19T11:03:57.06Z" }, - { url = "https://files.pythonhosted.org/packages/36/a1/9677337d729b79fce1ef3296aac6b8ef4743419086f669e8a8070eff8f40/shapely-2.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aabecd038841ab5310d23495253f01c2a82a3aedae5ab9ca489be214aa458aa7", size = 2999005, upload-time = "2025-05-19T11:03:58.692Z" }, - { url = "https://files.pythonhosted.org/packages/a2/17/e09357274699c6e012bbb5a8ea14765a4d5860bb658df1931c9f90d53bd3/shapely-2.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:586f6aee1edec04e16227517a866df3e9a2e43c1f635efc32978bb3dc9c63753", size = 3108489, upload-time = "2025-05-19T11:04:00.059Z" }, - { url = "https://files.pythonhosted.org/packages/17/5d/93a6c37c4b4e9955ad40834f42b17260ca74ecf36df2e81bb14d12221b90/shapely-2.1.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b9878b9e37ad26c72aada8de0c9cfe418d9e2ff36992a1693b7f65a075b28647", size = 3945727, upload-time = "2025-05-19T11:04:01.786Z" }, - { url = 
"https://files.pythonhosted.org/packages/a3/1a/ad696648f16fd82dd6bfcca0b3b8fbafa7aacc13431c7fc4c9b49e481681/shapely-2.1.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d9a531c48f289ba355e37b134e98e28c557ff13965d4653a5228d0f42a09aed0", size = 4109311, upload-time = "2025-05-19T11:04:03.134Z" }, - { url = "https://files.pythonhosted.org/packages/d4/38/150dd245beab179ec0d4472bf6799bf18f21b1efbef59ac87de3377dbf1c/shapely-2.1.1-cp311-cp311-win32.whl", hash = "sha256:4866de2673a971820c75c0167b1f1cd8fb76f2d641101c23d3ca021ad0449bab", size = 1522982, upload-time = "2025-05-19T11:04:05.217Z" }, - { url = "https://files.pythonhosted.org/packages/93/5b/842022c00fbb051083c1c85430f3bb55565b7fd2d775f4f398c0ba8052ce/shapely-2.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:20a9d79958b3d6c70d8a886b250047ea32ff40489d7abb47d01498c704557a93", size = 1703872, upload-time = "2025-05-19T11:04:06.791Z" }, - { url = "https://files.pythonhosted.org/packages/fb/64/9544dc07dfe80a2d489060791300827c941c451e2910f7364b19607ea352/shapely-2.1.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:2827365b58bf98efb60affc94a8e01c56dd1995a80aabe4b701465d86dcbba43", size = 1833021, upload-time = "2025-05-19T11:04:08.022Z" }, - { url = "https://files.pythonhosted.org/packages/07/aa/fb5f545e72e89b6a0f04a0effda144f5be956c9c312c7d4e00dfddbddbcf/shapely-2.1.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a9c551f7fa7f1e917af2347fe983f21f212863f1d04f08eece01e9c275903fad", size = 1643018, upload-time = "2025-05-19T11:04:09.343Z" }, - { url = "https://files.pythonhosted.org/packages/03/46/61e03edba81de729f09d880ce7ae5c1af873a0814206bbfb4402ab5c3388/shapely-2.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:78dec4d4fbe7b1db8dc36de3031767e7ece5911fb7782bc9e95c5cdec58fb1e9", size = 2986417, upload-time = "2025-05-19T11:04:10.56Z" }, - { url = 
"https://files.pythonhosted.org/packages/1f/1e/83ec268ab8254a446b4178b45616ab5822d7b9d2b7eb6e27cf0b82f45601/shapely-2.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:872d3c0a7b8b37da0e23d80496ec5973c4692920b90de9f502b5beb994bbaaef", size = 3098224, upload-time = "2025-05-19T11:04:11.903Z" }, - { url = "https://files.pythonhosted.org/packages/f1/44/0c21e7717c243e067c9ef8fa9126de24239f8345a5bba9280f7bb9935959/shapely-2.1.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2e2b9125ebfbc28ecf5353511de62f75a8515ae9470521c9a693e4bb9fbe0cf1", size = 3925982, upload-time = "2025-05-19T11:04:13.224Z" }, - { url = "https://files.pythonhosted.org/packages/15/50/d3b4e15fefc103a0eb13d83bad5f65cd6e07a5d8b2ae920e767932a247d1/shapely-2.1.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:4b96cea171b3d7f6786976a0520f178c42792897653ecca0c5422fb1e6946e6d", size = 4089122, upload-time = "2025-05-19T11:04:14.477Z" }, - { url = "https://files.pythonhosted.org/packages/bd/05/9a68f27fc6110baeedeeebc14fd86e73fa38738c5b741302408fb6355577/shapely-2.1.1-cp312-cp312-win32.whl", hash = "sha256:39dca52201e02996df02e447f729da97cfb6ff41a03cb50f5547f19d02905af8", size = 1522437, upload-time = "2025-05-19T11:04:16.203Z" }, - { url = "https://files.pythonhosted.org/packages/bc/e9/a4560e12b9338842a1f82c9016d2543eaa084fce30a1ca11991143086b57/shapely-2.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:13d643256f81d55a50013eff6321142781cf777eb6a9e207c2c9e6315ba6044a", size = 1703479, upload-time = "2025-05-19T11:04:18.497Z" }, - { url = "https://files.pythonhosted.org/packages/71/8e/2bc836437f4b84d62efc1faddce0d4e023a5d990bbddd3c78b2004ebc246/shapely-2.1.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:3004a644d9e89e26c20286d5fdc10f41b1744c48ce910bd1867fdff963fe6c48", size = 1832107, upload-time = "2025-05-19T11:04:19.736Z" }, - { url = 
"https://files.pythonhosted.org/packages/12/a2/12c7cae5b62d5d851c2db836eadd0986f63918a91976495861f7c492f4a9/shapely-2.1.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1415146fa12d80a47d13cfad5310b3c8b9c2aa8c14a0c845c9d3d75e77cb54f6", size = 1642355, upload-time = "2025-05-19T11:04:21.035Z" }, - { url = "https://files.pythonhosted.org/packages/5b/7e/6d28b43d53fea56de69c744e34c2b999ed4042f7a811dc1bceb876071c95/shapely-2.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21fcab88b7520820ec16d09d6bea68652ca13993c84dffc6129dc3607c95594c", size = 2968871, upload-time = "2025-05-19T11:04:22.167Z" }, - { url = "https://files.pythonhosted.org/packages/dd/87/1017c31e52370b2b79e4d29e07cbb590ab9e5e58cf7e2bdfe363765d6251/shapely-2.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e5ce6a5cc52c974b291237a96c08c5592e50f066871704fb5b12be2639d9026a", size = 3080830, upload-time = "2025-05-19T11:04:23.997Z" }, - { url = "https://files.pythonhosted.org/packages/1d/fe/f4a03d81abd96a6ce31c49cd8aaba970eaaa98e191bd1e4d43041e57ae5a/shapely-2.1.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:04e4c12a45a1d70aeb266618d8cf81a2de9c4df511b63e105b90bfdfb52146de", size = 3908961, upload-time = "2025-05-19T11:04:25.702Z" }, - { url = "https://files.pythonhosted.org/packages/ef/59/7605289a95a6844056a2017ab36d9b0cb9d6a3c3b5317c1f968c193031c9/shapely-2.1.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6ca74d851ca5264aae16c2b47e96735579686cb69fa93c4078070a0ec845b8d8", size = 4079623, upload-time = "2025-05-19T11:04:27.171Z" }, - { url = "https://files.pythonhosted.org/packages/bc/4d/9fea036eff2ef4059d30247128b2d67aaa5f0b25e9fc27e1d15cc1b84704/shapely-2.1.1-cp313-cp313-win32.whl", hash = "sha256:fd9130501bf42ffb7e0695b9ea17a27ae8ce68d50b56b6941c7f9b3d3453bc52", size = 1521916, upload-time = "2025-05-19T11:04:28.405Z" }, - { url = 
"https://files.pythonhosted.org/packages/12/d9/6d13b8957a17c95794f0c4dfb65ecd0957e6c7131a56ce18d135c1107a52/shapely-2.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:ab8d878687b438a2f4c138ed1a80941c6ab0029e0f4c785ecfe114413b498a97", size = 1702746, upload-time = "2025-05-19T11:04:29.643Z" }, - { url = "https://files.pythonhosted.org/packages/60/36/b1452e3e7f35f5f6454d96f3be6e2bb87082720ff6c9437ecc215fa79be0/shapely-2.1.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0c062384316a47f776305ed2fa22182717508ffdeb4a56d0ff4087a77b2a0f6d", size = 1833482, upload-time = "2025-05-19T11:04:30.852Z" }, - { url = "https://files.pythonhosted.org/packages/ce/ca/8e6f59be0718893eb3e478141285796a923636dc8f086f83e5b0ec0036d0/shapely-2.1.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4ecf6c196b896e8f1360cc219ed4eee1c1e5f5883e505d449f263bd053fb8c05", size = 1642256, upload-time = "2025-05-19T11:04:32.068Z" }, - { url = "https://files.pythonhosted.org/packages/ab/78/0053aea449bb1d4503999525fec6232f049abcdc8df60d290416110de943/shapely-2.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb00070b4c4860f6743c600285109c273cca5241e970ad56bb87bef0be1ea3a0", size = 3016614, upload-time = "2025-05-19T11:04:33.7Z" }, - { url = "https://files.pythonhosted.org/packages/ee/53/36f1b1de1dfafd1b457dcbafa785b298ce1b8a3e7026b79619e708a245d5/shapely-2.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d14a9afa5fa980fbe7bf63706fdfb8ff588f638f145a1d9dbc18374b5b7de913", size = 3093542, upload-time = "2025-05-19T11:04:34.952Z" }, - { url = "https://files.pythonhosted.org/packages/b9/bf/0619f37ceec6b924d84427c88835b61f27f43560239936ff88915c37da19/shapely-2.1.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:b640e390dabde790e3fb947198b466e63223e0a9ccd787da5f07bcb14756c28d", size = 3945961, upload-time = "2025-05-19T11:04:36.32Z" }, - { url = 
"https://files.pythonhosted.org/packages/93/c9/20ca4afeb572763b07a7997f00854cb9499df6af85929e93012b189d8917/shapely-2.1.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:69e08bf9697c1b73ec6aa70437db922bafcea7baca131c90c26d59491a9760f9", size = 4089514, upload-time = "2025-05-19T11:04:37.683Z" }, - { url = "https://files.pythonhosted.org/packages/33/6a/27036a5a560b80012a544366bceafd491e8abb94a8db14047b5346b5a749/shapely-2.1.1-cp313-cp313t-win32.whl", hash = "sha256:ef2d09d5a964cc90c2c18b03566cf918a61c248596998a0301d5b632beadb9db", size = 1540607, upload-time = "2025-05-19T11:04:38.925Z" }, - { url = "https://files.pythonhosted.org/packages/ea/f1/5e9b3ba5c7aa7ebfaf269657e728067d16a7c99401c7973ddf5f0cf121bd/shapely-2.1.1-cp313-cp313t-win_amd64.whl", hash = "sha256:8cb8f17c377260452e9d7720eeaf59082c5f8ea48cf104524d953e5d36d4bdb7", size = 1723061, upload-time = "2025-05-19T11:04:40.082Z" }, -] - -[[package]] -name = "shellingham" -version = "1.5.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, upload-time = "2023-10-24T04:13:40.426Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" }, -] - -[[package]] -name = "simplejson" -version = "3.20.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/af/92/51b417685abd96b31308b61b9acce7ec50d8e1de8fbc39a7fd4962c60689/simplejson-3.20.1.tar.gz", hash = "sha256:e64139b4ec4f1f24c142ff7dcafe55a22b811a74d86d66560c8815687143037d", size = 85591, upload-time = 
"2025-02-15T05:18:53.15Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/76/59/74bc90d1c051bc2432c96b34bd4e8036875ab58b4fcbe4d6a5a76985f853/simplejson-3.20.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:325b8c107253d3217e89d7b50c71015b5b31e2433e6c5bf38967b2f80630a8ca", size = 92132, upload-time = "2025-02-15T05:16:15.743Z" }, - { url = "https://files.pythonhosted.org/packages/71/c7/1970916e0c51794fff89f76da2f632aaf0b259b87753c88a8c409623d3e1/simplejson-3.20.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:88a7baa8211089b9e58d78fbc1b0b322103f3f3d459ff16f03a36cece0d0fcf0", size = 74956, upload-time = "2025-02-15T05:16:17.062Z" }, - { url = "https://files.pythonhosted.org/packages/c8/0d/98cc5909180463f1d75fac7180de62d4cdb4e82c4fef276b9e591979372c/simplejson-3.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:299b1007b8101d50d95bc0db1bf5c38dc372e85b504cf77f596462083ee77e3f", size = 74772, upload-time = "2025-02-15T05:16:19.204Z" }, - { url = "https://files.pythonhosted.org/packages/e1/94/a30a5211a90d67725a3e8fcc1c788189f2ae2ed2b96b63ed15d0b7f5d6bb/simplejson-3.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:03ec618ed65caab48e81e3ed29586236a8e57daef792f1f3bb59504a7e98cd10", size = 143575, upload-time = "2025-02-15T05:16:21.337Z" }, - { url = "https://files.pythonhosted.org/packages/ee/08/cdb6821f1058eb5db46d252de69ff7e6c53f05f1bae6368fe20d5b51d37e/simplejson-3.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cd2cdead1d3197f0ff43373cf4730213420523ba48697743e135e26f3d179f38", size = 153241, upload-time = "2025-02-15T05:16:22.859Z" }, - { url = "https://files.pythonhosted.org/packages/4c/2d/ca3caeea0bdc5efc5503d5f57a2dfb56804898fb196dfada121323ee0ccb/simplejson-3.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3466d2839fdc83e1af42e07b90bc8ff361c4e8796cd66722a40ba14e458faddd", size = 141500, upload-time 
= "2025-02-15T05:16:25.068Z" }, - { url = "https://files.pythonhosted.org/packages/e1/33/d3e0779d5c58245e7370c98eb969275af6b7a4a5aec3b97cbf85f09ad328/simplejson-3.20.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d492ed8e92f3a9f9be829205f44b1d0a89af6582f0cf43e0d129fa477b93fe0c", size = 144757, upload-time = "2025-02-15T05:16:28.301Z" }, - { url = "https://files.pythonhosted.org/packages/54/53/2d93128bb55861b2fa36c5944f38da51a0bc6d83e513afc6f7838440dd15/simplejson-3.20.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:f924b485537b640dc69434565463fd6fc0c68c65a8c6e01a823dd26c9983cf79", size = 144409, upload-time = "2025-02-15T05:16:29.687Z" }, - { url = "https://files.pythonhosted.org/packages/99/4c/dac310a98f897ad3435b4bdc836d92e78f09e38c5dbf28211ed21dc59fa2/simplejson-3.20.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9e8eacf6a3491bf76ea91a8d46726368a6be0eb94993f60b8583550baae9439e", size = 146082, upload-time = "2025-02-15T05:16:31.064Z" }, - { url = "https://files.pythonhosted.org/packages/ee/22/d7ba958cfed39827335b82656b1c46f89678faecda9a7677b47e87b48ee6/simplejson-3.20.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:d34d04bf90b4cea7c22d8b19091633908f14a096caa301b24c2f3d85b5068fb8", size = 154339, upload-time = "2025-02-15T05:16:32.719Z" }, - { url = "https://files.pythonhosted.org/packages/b8/c8/b072b741129406a7086a0799c6f5d13096231bf35fdd87a0cffa789687fc/simplejson-3.20.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:69dd28d4ce38390ea4aaf212902712c0fd1093dc4c1ff67e09687c3c3e15a749", size = 147915, upload-time = "2025-02-15T05:16:34.291Z" }, - { url = "https://files.pythonhosted.org/packages/6c/46/8347e61e9cf3db5342a42f7fd30a81b4f5cf85977f916852d7674a540907/simplejson-3.20.1-cp311-cp311-win32.whl", hash = "sha256:dfe7a9da5fd2a3499436cd350f31539e0a6ded5da6b5b3d422df016444d65e43", size = 73972, upload-time = "2025-02-15T05:16:35.712Z" }, - { url = 
"https://files.pythonhosted.org/packages/01/85/b52f24859237b4e9d523d5655796d911ba3d46e242eb1959c45b6af5aedd/simplejson-3.20.1-cp311-cp311-win_amd64.whl", hash = "sha256:896a6c04d7861d507d800da7642479c3547060bf97419d9ef73d98ced8258766", size = 75595, upload-time = "2025-02-15T05:16:36.957Z" }, - { url = "https://files.pythonhosted.org/packages/8d/eb/34c16a1ac9ba265d024dc977ad84e1659d931c0a700967c3e59a98ed7514/simplejson-3.20.1-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f31c4a3a7ab18467ee73a27f3e59158255d1520f3aad74315edde7a940f1be23", size = 93100, upload-time = "2025-02-15T05:16:38.801Z" }, - { url = "https://files.pythonhosted.org/packages/41/fc/2c2c007d135894971e6814e7c0806936e5bade28f8db4dd7e2a58b50debd/simplejson-3.20.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:884e6183d16b725e113b83a6fc0230152ab6627d4d36cb05c89c2c5bccfa7bc6", size = 75464, upload-time = "2025-02-15T05:16:40.905Z" }, - { url = "https://files.pythonhosted.org/packages/0f/05/2b5ecb33b776c34bb5cace5de5d7669f9b60e3ca13c113037b2ca86edfbd/simplejson-3.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:03d7a426e416fe0d3337115f04164cd9427eb4256e843a6b8751cacf70abc832", size = 75112, upload-time = "2025-02-15T05:16:42.246Z" }, - { url = "https://files.pythonhosted.org/packages/fe/36/1f3609a2792f06cd4b71030485f78e91eb09cfd57bebf3116bf2980a8bac/simplejson-3.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:000602141d0bddfcff60ea6a6e97d5e10c9db6b17fd2d6c66199fa481b6214bb", size = 150182, upload-time = "2025-02-15T05:16:43.557Z" }, - { url = "https://files.pythonhosted.org/packages/2f/b0/053fbda38b8b602a77a4f7829def1b4f316cd8deb5440a6d3ee90790d2a4/simplejson-3.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:af8377a8af78226e82e3a4349efdde59ffa421ae88be67e18cef915e4023a595", size = 158363, upload-time = "2025-02-15T05:16:45.748Z" }, - { url = 
"https://files.pythonhosted.org/packages/d1/4b/2eb84ae867539a80822e92f9be4a7200dffba609275faf99b24141839110/simplejson-3.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:15c7de4c88ab2fbcb8781a3b982ef883696736134e20b1210bca43fb42ff1acf", size = 148415, upload-time = "2025-02-15T05:16:47.861Z" }, - { url = "https://files.pythonhosted.org/packages/e0/bd/400b0bd372a5666addf2540c7358bfc3841b9ce5cdbc5cc4ad2f61627ad8/simplejson-3.20.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:455a882ff3f97d810709f7b620007d4e0aca8da71d06fc5c18ba11daf1c4df49", size = 152213, upload-time = "2025-02-15T05:16:49.25Z" }, - { url = "https://files.pythonhosted.org/packages/50/12/143f447bf6a827ee9472693768dc1a5eb96154f8feb140a88ce6973a3cfa/simplejson-3.20.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:fc0f523ce923e7f38eb67804bc80e0a028c76d7868500aa3f59225574b5d0453", size = 150048, upload-time = "2025-02-15T05:16:51.5Z" }, - { url = "https://files.pythonhosted.org/packages/5e/ea/dd9b3e8e8ed710a66f24a22c16a907c9b539b6f5f45fd8586bd5c231444e/simplejson-3.20.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:76461ec929282dde4a08061071a47281ad939d0202dc4e63cdd135844e162fbc", size = 151668, upload-time = "2025-02-15T05:16:53Z" }, - { url = "https://files.pythonhosted.org/packages/99/af/ee52a8045426a0c5b89d755a5a70cc821815ef3c333b56fbcad33c4435c0/simplejson-3.20.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:ab19c2da8c043607bde4d4ef3a6b633e668a7d2e3d56f40a476a74c5ea71949f", size = 158840, upload-time = "2025-02-15T05:16:54.851Z" }, - { url = "https://files.pythonhosted.org/packages/68/db/ab32869acea6b5de7d75fa0dac07a112ded795d41eaa7e66c7813b17be95/simplejson-3.20.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b2578bedaedf6294415197b267d4ef678fea336dd78ee2a6d2f4b028e9d07be3", size = 154212, upload-time = "2025-02-15T05:16:56.318Z" }, - { 
url = "https://files.pythonhosted.org/packages/fa/7a/e3132d454977d75a3bf9a6d541d730f76462ebf42a96fea2621498166f41/simplejson-3.20.1-cp312-cp312-win32.whl", hash = "sha256:339f407373325a36b7fd744b688ba5bae0666b5d340ec6d98aebc3014bf3d8ea", size = 74101, upload-time = "2025-02-15T05:16:57.746Z" }, - { url = "https://files.pythonhosted.org/packages/bc/5d/4e243e937fa3560107c69f6f7c2eed8589163f5ed14324e864871daa2dd9/simplejson-3.20.1-cp312-cp312-win_amd64.whl", hash = "sha256:627d4486a1ea7edf1f66bb044ace1ce6b4c1698acd1b05353c97ba4864ea2e17", size = 75736, upload-time = "2025-02-15T05:16:59.017Z" }, - { url = "https://files.pythonhosted.org/packages/c4/03/0f453a27877cb5a5fff16a975925f4119102cc8552f52536b9a98ef0431e/simplejson-3.20.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:71e849e7ceb2178344998cbe5ade101f1b329460243c79c27fbfc51c0447a7c3", size = 93109, upload-time = "2025-02-15T05:17:00.377Z" }, - { url = "https://files.pythonhosted.org/packages/74/1f/a729f4026850cabeaff23e134646c3f455e86925d2533463420635ae54de/simplejson-3.20.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b63fdbab29dc3868d6f009a59797cefaba315fd43cd32ddd998ee1da28e50e29", size = 75475, upload-time = "2025-02-15T05:17:02.544Z" }, - { url = "https://files.pythonhosted.org/packages/e2/14/50a2713fee8ff1f8d655b1a14f4a0f1c0c7246768a1b3b3d12964a4ed5aa/simplejson-3.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1190f9a3ce644fd50ec277ac4a98c0517f532cfebdcc4bd975c0979a9f05e1fb", size = 75112, upload-time = "2025-02-15T05:17:03.875Z" }, - { url = "https://files.pythonhosted.org/packages/45/86/ea9835abb646755140e2d482edc9bc1e91997ed19a59fd77ae4c6a0facea/simplejson-3.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c1336ba7bcb722ad487cd265701ff0583c0bb6de638364ca947bb84ecc0015d1", size = 150245, upload-time = "2025-02-15T05:17:06.899Z" }, - { url = 
"https://files.pythonhosted.org/packages/12/b4/53084809faede45da829fe571c65fbda8479d2a5b9c633f46b74124d56f5/simplejson-3.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e975aac6a5acd8b510eba58d5591e10a03e3d16c1cf8a8624ca177491f7230f0", size = 158465, upload-time = "2025-02-15T05:17:08.707Z" }, - { url = "https://files.pythonhosted.org/packages/a9/7d/d56579468d1660b3841e1f21c14490d103e33cf911886b22652d6e9683ec/simplejson-3.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6a6dd11ee282937ad749da6f3b8d87952ad585b26e5edfa10da3ae2536c73078", size = 148514, upload-time = "2025-02-15T05:17:11.323Z" }, - { url = "https://files.pythonhosted.org/packages/19/e3/874b1cca3d3897b486d3afdccc475eb3a09815bf1015b01cf7fcb52a55f0/simplejson-3.20.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ab980fcc446ab87ea0879edad41a5c28f2d86020014eb035cf5161e8de4474c6", size = 152262, upload-time = "2025-02-15T05:17:13.543Z" }, - { url = "https://files.pythonhosted.org/packages/32/84/f0fdb3625292d945c2bd13a814584603aebdb38cfbe5fe9be6b46fe598c4/simplejson-3.20.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f5aee2a4cb6b146bd17333ac623610f069f34e8f31d2f4f0c1a2186e50c594f0", size = 150164, upload-time = "2025-02-15T05:17:15.021Z" }, - { url = "https://files.pythonhosted.org/packages/95/51/6d625247224f01eaaeabace9aec75ac5603a42f8ebcce02c486fbda8b428/simplejson-3.20.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:652d8eecbb9a3b6461b21ec7cf11fd0acbab144e45e600c817ecf18e4580b99e", size = 151795, upload-time = "2025-02-15T05:17:16.542Z" }, - { url = "https://files.pythonhosted.org/packages/7f/d9/bb921df6b35be8412f519e58e86d1060fddf3ad401b783e4862e0a74c4c1/simplejson-3.20.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:8c09948f1a486a89251ee3a67c9f8c969b379f6ffff1a6064b41fea3bce0a112", size = 159027, upload-time = 
"2025-02-15T05:17:18.083Z" }, - { url = "https://files.pythonhosted.org/packages/03/c5/5950605e4ad023a6621cf4c931b29fd3d2a9c1f36be937230bfc83d7271d/simplejson-3.20.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:cbbd7b215ad4fc6f058b5dd4c26ee5c59f72e031dfda3ac183d7968a99e4ca3a", size = 154380, upload-time = "2025-02-15T05:17:20.334Z" }, - { url = "https://files.pythonhosted.org/packages/66/ad/b74149557c5ec1e4e4d55758bda426f5d2ec0123cd01a53ae63b8de51fa3/simplejson-3.20.1-cp313-cp313-win32.whl", hash = "sha256:ae81e482476eaa088ef9d0120ae5345de924f23962c0c1e20abbdff597631f87", size = 74102, upload-time = "2025-02-15T05:17:22.475Z" }, - { url = "https://files.pythonhosted.org/packages/db/a9/25282fdd24493e1022f30b7f5cdf804255c007218b2bfaa655bd7ad34b2d/simplejson-3.20.1-cp313-cp313-win_amd64.whl", hash = "sha256:1b9fd15853b90aec3b1739f4471efbf1ac05066a2c7041bf8db821bb73cd2ddc", size = 75736, upload-time = "2025-02-15T05:17:24.122Z" }, - { url = "https://files.pythonhosted.org/packages/4b/30/00f02a0a921556dd5a6db1ef2926a1bc7a8bbbfb1c49cfed68a275b8ab2b/simplejson-3.20.1-py3-none-any.whl", hash = "sha256:8a6c1bbac39fa4a79f83cbf1df6ccd8ff7069582a9fd8db1e52cea073bc2c697", size = 57121, upload-time = "2025-02-15T05:18:51.243Z" }, -] - -[[package]] -name = "six" -version = "1.17.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" }, -] - -[[package]] -name = "smmap" -version = "5.0.2" -source = 
{ registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/44/cd/a040c4b3119bbe532e5b0732286f805445375489fceaec1f48306068ee3b/smmap-5.0.2.tar.gz", hash = "sha256:26ea65a03958fa0c8a1c7e8c7a58fdc77221b8910f6be2131affade476898ad5", size = 22329, upload-time = "2025-01-02T07:14:40.909Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/04/be/d09147ad1ec7934636ad912901c5fd7667e1c858e19d355237db0d0cd5e4/smmap-5.0.2-py3-none-any.whl", hash = "sha256:b30115f0def7d7531d22a0fb6502488d879e75b260a9db4d0819cfb25403af5e", size = 24303, upload-time = "2025-01-02T07:14:38.724Z" }, -] - -[[package]] -name = "sniffio" -version = "1.3.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" }, -] - -[[package]] -name = "sqlalchemy" -version = "2.0.43" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "greenlet", marker = "(python_full_version < '3.14' and platform_machine == 'AMD64') or (python_full_version < '3.14' and platform_machine == 'WIN32') or (python_full_version < '3.14' and platform_machine == 'aarch64') or (python_full_version < '3.14' and platform_machine == 'amd64') or (python_full_version < '3.14' and platform_machine == 'ppc64le') or (python_full_version < '3.14' and platform_machine == 'win32') or (python_full_version < '3.14' and platform_machine == 'x86_64')" }, - { name = "typing-extensions" }, -] -sdist = { 
url = "https://files.pythonhosted.org/packages/d7/bc/d59b5d97d27229b0e009bd9098cd81af71c2fa5549c580a0a67b9bed0496/sqlalchemy-2.0.43.tar.gz", hash = "sha256:788bfcef6787a7764169cfe9859fe425bf44559619e1d9f56f5bddf2ebf6f417", size = 9762949, upload-time = "2025-08-11T14:24:58.438Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/9d/77/fa7189fe44114658002566c6fe443d3ed0ec1fa782feb72af6ef7fbe98e7/sqlalchemy-2.0.43-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:52d9b73b8fb3e9da34c2b31e6d99d60f5f99fd8c1225c9dad24aeb74a91e1d29", size = 2136472, upload-time = "2025-08-11T15:52:21.789Z" }, - { url = "https://files.pythonhosted.org/packages/99/ea/92ac27f2fbc2e6c1766bb807084ca455265707e041ba027c09c17d697867/sqlalchemy-2.0.43-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f42f23e152e4545157fa367b2435a1ace7571cab016ca26038867eb7df2c3631", size = 2126535, upload-time = "2025-08-11T15:52:23.109Z" }, - { url = "https://files.pythonhosted.org/packages/94/12/536ede80163e295dc57fff69724caf68f91bb40578b6ac6583a293534849/sqlalchemy-2.0.43-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4fb1a8c5438e0c5ea51afe9c6564f951525795cf432bed0c028c1cb081276685", size = 3297521, upload-time = "2025-08-11T15:50:33.536Z" }, - { url = "https://files.pythonhosted.org/packages/03/b5/cacf432e6f1fc9d156eca0560ac61d4355d2181e751ba8c0cd9cb232c8c1/sqlalchemy-2.0.43-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db691fa174e8f7036afefe3061bc40ac2b770718be2862bfb03aabae09051aca", size = 3297343, upload-time = "2025-08-11T15:57:51.186Z" }, - { url = "https://files.pythonhosted.org/packages/ca/ba/d4c9b526f18457667de4c024ffbc3a0920c34237b9e9dd298e44c7c00ee5/sqlalchemy-2.0.43-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:fe2b3b4927d0bc03d02ad883f402d5de201dbc8894ac87d2e981e7d87430e60d", size = 3232113, upload-time = "2025-08-11T15:50:34.949Z" }, - { url = 
"https://files.pythonhosted.org/packages/aa/79/c0121b12b1b114e2c8a10ea297a8a6d5367bc59081b2be896815154b1163/sqlalchemy-2.0.43-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4d3d9b904ad4a6b175a2de0738248822f5ac410f52c2fd389ada0b5262d6a1e3", size = 3258240, upload-time = "2025-08-11T15:57:52.983Z" }, - { url = "https://files.pythonhosted.org/packages/79/99/a2f9be96fb382f3ba027ad42f00dbe30fdb6ba28cda5f11412eee346bec5/sqlalchemy-2.0.43-cp311-cp311-win32.whl", hash = "sha256:5cda6b51faff2639296e276591808c1726c4a77929cfaa0f514f30a5f6156921", size = 2101248, upload-time = "2025-08-11T15:55:01.855Z" }, - { url = "https://files.pythonhosted.org/packages/ee/13/744a32ebe3b4a7a9c7ea4e57babae7aa22070d47acf330d8e5a1359607f1/sqlalchemy-2.0.43-cp311-cp311-win_amd64.whl", hash = "sha256:c5d1730b25d9a07727d20ad74bc1039bbbb0a6ca24e6769861c1aa5bf2c4c4a8", size = 2126109, upload-time = "2025-08-11T15:55:04.092Z" }, - { url = "https://files.pythonhosted.org/packages/61/db/20c78f1081446095450bdc6ee6cc10045fce67a8e003a5876b6eaafc5cc4/sqlalchemy-2.0.43-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:20d81fc2736509d7a2bd33292e489b056cbae543661bb7de7ce9f1c0cd6e7f24", size = 2134891, upload-time = "2025-08-11T15:51:13.019Z" }, - { url = "https://files.pythonhosted.org/packages/45/0a/3d89034ae62b200b4396f0f95319f7d86e9945ee64d2343dcad857150fa2/sqlalchemy-2.0.43-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:25b9fc27650ff5a2c9d490c13c14906b918b0de1f8fcbb4c992712d8caf40e83", size = 2123061, upload-time = "2025-08-11T15:51:14.319Z" }, - { url = "https://files.pythonhosted.org/packages/cb/10/2711f7ff1805919221ad5bee205971254845c069ee2e7036847103ca1e4c/sqlalchemy-2.0.43-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6772e3ca8a43a65a37c88e2f3e2adfd511b0b1da37ef11ed78dea16aeae85bd9", size = 3320384, upload-time = "2025-08-11T15:52:35.088Z" }, - { url = 
"https://files.pythonhosted.org/packages/6e/0e/3d155e264d2ed2778484006ef04647bc63f55b3e2d12e6a4f787747b5900/sqlalchemy-2.0.43-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a113da919c25f7f641ffbd07fbc9077abd4b3b75097c888ab818f962707eb48", size = 3329648, upload-time = "2025-08-11T15:56:34.153Z" }, - { url = "https://files.pythonhosted.org/packages/5b/81/635100fb19725c931622c673900da5efb1595c96ff5b441e07e3dd61f2be/sqlalchemy-2.0.43-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:4286a1139f14b7d70141c67a8ae1582fc2b69105f1b09d9573494eb4bb4b2687", size = 3258030, upload-time = "2025-08-11T15:52:36.933Z" }, - { url = "https://files.pythonhosted.org/packages/0c/ed/a99302716d62b4965fded12520c1cbb189f99b17a6d8cf77611d21442e47/sqlalchemy-2.0.43-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:529064085be2f4d8a6e5fab12d36ad44f1909a18848fcfbdb59cc6d4bbe48efe", size = 3294469, upload-time = "2025-08-11T15:56:35.553Z" }, - { url = "https://files.pythonhosted.org/packages/5d/a2/3a11b06715149bf3310b55a98b5c1e84a42cfb949a7b800bc75cb4e33abc/sqlalchemy-2.0.43-cp312-cp312-win32.whl", hash = "sha256:b535d35dea8bbb8195e7e2b40059e2253acb2b7579b73c1b432a35363694641d", size = 2098906, upload-time = "2025-08-11T15:55:00.645Z" }, - { url = "https://files.pythonhosted.org/packages/bc/09/405c915a974814b90aa591280623adc6ad6b322f61fd5cff80aeaef216c9/sqlalchemy-2.0.43-cp312-cp312-win_amd64.whl", hash = "sha256:1c6d85327ca688dbae7e2b06d7d84cfe4f3fffa5b5f9e21bb6ce9d0e1a0e0e0a", size = 2126260, upload-time = "2025-08-11T15:55:02.965Z" }, - { url = "https://files.pythonhosted.org/packages/41/1c/a7260bd47a6fae7e03768bf66451437b36451143f36b285522b865987ced/sqlalchemy-2.0.43-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e7c08f57f75a2bb62d7ee80a89686a5e5669f199235c6d1dac75cd59374091c3", size = 2130598, upload-time = "2025-08-11T15:51:15.903Z" }, - { url = 
"https://files.pythonhosted.org/packages/8e/84/8a337454e82388283830b3586ad7847aa9c76fdd4f1df09cdd1f94591873/sqlalchemy-2.0.43-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:14111d22c29efad445cd5021a70a8b42f7d9152d8ba7f73304c4d82460946aaa", size = 2118415, upload-time = "2025-08-11T15:51:17.256Z" }, - { url = "https://files.pythonhosted.org/packages/cf/ff/22ab2328148492c4d71899d62a0e65370ea66c877aea017a244a35733685/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21b27b56eb2f82653168cefe6cb8e970cdaf4f3a6cb2c5e3c3c1cf3158968ff9", size = 3248707, upload-time = "2025-08-11T15:52:38.444Z" }, - { url = "https://files.pythonhosted.org/packages/dc/29/11ae2c2b981de60187f7cbc84277d9d21f101093d1b2e945c63774477aba/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c5a9da957c56e43d72126a3f5845603da00e0293720b03bde0aacffcf2dc04f", size = 3253602, upload-time = "2025-08-11T15:56:37.348Z" }, - { url = "https://files.pythonhosted.org/packages/b8/61/987b6c23b12c56d2be451bc70900f67dd7d989d52b1ee64f239cf19aec69/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5d79f9fdc9584ec83d1b3c75e9f4595c49017f5594fee1a2217117647225d738", size = 3183248, upload-time = "2025-08-11T15:52:39.865Z" }, - { url = "https://files.pythonhosted.org/packages/86/85/29d216002d4593c2ce1c0ec2cec46dda77bfbcd221e24caa6e85eff53d89/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9df7126fd9db49e3a5a3999442cc67e9ee8971f3cb9644250107d7296cb2a164", size = 3219363, upload-time = "2025-08-11T15:56:39.11Z" }, - { url = "https://files.pythonhosted.org/packages/b6/e4/bd78b01919c524f190b4905d47e7630bf4130b9f48fd971ae1c6225b6f6a/sqlalchemy-2.0.43-cp313-cp313-win32.whl", hash = "sha256:7f1ac7828857fcedb0361b48b9ac4821469f7694089d15550bbcf9ab22564a1d", size = 2096718, upload-time = "2025-08-11T15:55:05.349Z" }, - { url = 
"https://files.pythonhosted.org/packages/ac/a5/ca2f07a2a201f9497de1928f787926613db6307992fe5cda97624eb07c2f/sqlalchemy-2.0.43-cp313-cp313-win_amd64.whl", hash = "sha256:971ba928fcde01869361f504fcff3b7143b47d30de188b11c6357c0505824197", size = 2123200, upload-time = "2025-08-11T15:55:07.932Z" }, - { url = "https://files.pythonhosted.org/packages/b8/d9/13bdde6521f322861fab67473cec4b1cc8999f3871953531cf61945fad92/sqlalchemy-2.0.43-py3-none-any.whl", hash = "sha256:1681c21dd2ccee222c2fe0bef671d1aef7c504087c9c4e800371cfcc8ac966fc", size = 1924759, upload-time = "2025-08-11T15:39:53.024Z" }, -] - -[package.optional-dependencies] -asyncio = [ - { name = "greenlet" }, -] - -[[package]] -name = "sqlalchemy-spanner" -version = "1.16.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "alembic" }, - { name = "google-cloud-spanner" }, - { name = "sqlalchemy" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/bf/6c/d9a2e05d839ec4d00d11887f18e66de331f696b162159dc2655e3910bb55/sqlalchemy_spanner-1.16.0.tar.gz", hash = "sha256:5143d5d092f2f1fef66b332163291dc7913a58292580733a601ff5fae160515a", size = 82748, upload-time = "2025-09-02T08:26:00.645Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/94/74/a9c88abddfeca46c253000e87aad923014c1907953e06b39a0cbec229a86/sqlalchemy_spanner-1.16.0-py3-none-any.whl", hash = "sha256:e53cadb2b973e88936c0a9874e133ee9a0829ea3261f328b4ca40bdedf2016c1", size = 32069, upload-time = "2025-09-02T08:25:59.264Z" }, -] - -[[package]] -name = "sqlglot" -version = "27.16.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/2d/a3/b29fd2d07ee1b0267b3bbe7d38610d05844daa089bba8657c5321a24fd79/sqlglot-27.16.1.tar.gz", hash = "sha256:b89d2b4dba879e40aff6a1c805d68e8c33d53a821c67242f373d361a727181e8", size = 5471043, upload-time = "2025-09-18T13:01:49.85Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/4c/c2/161ed0fc376c55a4e48074f14dbf3c3dae44abcd9021725584443049d60a/sqlglot-27.16.1-py3-none-any.whl", hash = "sha256:9a080a4ce3bebe5a38b1f84c38c2fb5207828ab8ca09871102ad5ad231f58571", size = 517887, upload-time = "2025-09-18T13:01:47.413Z" }, -] - -[[package]] -name = "sqlparse" -version = "0.5.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e5/40/edede8dd6977b0d3da179a342c198ed100dd2aba4be081861ee5911e4da4/sqlparse-0.5.3.tar.gz", hash = "sha256:09f67787f56a0b16ecdbde1bfc7f5d9c3371ca683cfeaa8e6ff60b4807ec9272", size = 84999, upload-time = "2024-12-10T12:05:30.728Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a9/5c/bfd6bd0bf979426d405cc6e71eceb8701b148b16c21d2dc3c261efc61c7b/sqlparse-0.5.3-py3-none-any.whl", hash = "sha256:cf2196ed3418f3ba5de6af7e82c694a9fbdbfecccdfc72e281548517081f16ca", size = 44415, upload-time = "2024-12-10T12:05:27.824Z" }, -] - -[[package]] -name = "sse-starlette" -version = "3.0.2" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/42/6f/22ed6e33f8a9e76ca0a412405f31abb844b779d52c5f96660766edcd737c/sse_starlette-3.0.2.tar.gz", hash = "sha256:ccd60b5765ebb3584d0de2d7a6e4f745672581de4f5005ab31c3a25d10b52b3a", size = 20985, upload-time = "2025-07-27T09:07:44.565Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ef/10/c78f463b4ef22eef8491f218f692be838282cd65480f6e423d7730dfd1fb/sse_starlette-3.0.2-py3-none-any.whl", hash = "sha256:16b7cbfddbcd4eaca11f7b586f3b8a080f1afe952c15813455b162edea619e5a", size = 11297, upload-time = "2025-07-27T09:07:43.268Z" }, -] - -[[package]] -name = "sseclient-py" -version = "1.8.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/e8/ed/3df5ab8bb0c12f86c28d0cadb11ed1de44a92ed35ce7ff4fd5518a809325/sseclient-py-1.8.0.tar.gz", hash = "sha256:c547c5c1a7633230a38dc599a21a2dc638f9b5c297286b48b46b935c71fac3e8", size = 7791, upload-time = "2023-09-01T19:39:20.45Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/49/58/97655efdfeb5b4eeab85b1fc5d3fa1023661246c2ab2a26ea8e47402d4f2/sseclient_py-1.8.0-py2.py3-none-any.whl", hash = "sha256:4ecca6dc0b9f963f8384e9d7fd529bf93dd7d708144c4fb5da0e0a1a926fee83", size = 8828, upload-time = "2023-09-01T19:39:17.627Z" }, -] - -[[package]] -name = "starlette" -version = "0.48.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "anyio" }, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/a7/a5/d6f429d43394057b67a6b5bbe6eae2f77a6bf7459d961fdb224bf206eee6/starlette-0.48.0.tar.gz", hash = "sha256:7e8cee469a8ab2352911528110ce9088fdc6a37d9876926e73da7ce4aa4c7a46", size = 2652949, upload-time = "2025-09-13T08:41:05.699Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/be/72/2db2f49247d0a18b4f1bb9a5a39a0162869acf235f3a96418363947b3d46/starlette-0.48.0-py3-none-any.whl", hash = "sha256:0764ca97b097582558ecb498132ed0c7d942f233f365b86ba37770e026510659", size = 73736, upload-time = "2025-09-13T08:41:03.869Z" }, -] - -[[package]] -name = "structlog" -version = "25.4.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/79/b9/6e672db4fec07349e7a8a8172c1a6ae235c58679ca29c3f86a61b5e59ff3/structlog-25.4.0.tar.gz", hash = "sha256:186cd1b0a8ae762e29417095664adf1d6a31702160a46dacb7796ea82f7409e4", size = 1369138, upload-time = "2025-06-02T08:21:12.971Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a0/4a/97ee6973e3a73c74c8120d59829c3861ea52210667ec3e7a16045c62b64d/structlog-25.4.0-py3-none-any.whl", hash = 
"sha256:fe809ff5c27e557d14e613f45ca441aabda051d119ee5a0102aaba6ce40eed2c", size = 68720, upload-time = "2025-06-02T08:21:11.43Z" }, -] - -[[package]] -name = "sympy" -version = "1.14.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "mpmath" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/83/d3/803453b36afefb7c2bb238361cd4ae6125a569b4db67cd9e79846ba2d68c/sympy-1.14.0.tar.gz", hash = "sha256:d3d3fe8df1e5a0b42f0e7bdf50541697dbe7d23746e894990c030e2b05e72517", size = 7793921, upload-time = "2025-04-27T18:05:01.611Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a2/09/77d55d46fd61b4a135c444fc97158ef34a095e5681d0a6c10b75bf356191/sympy-1.14.0-py3-none-any.whl", hash = "sha256:e091cc3e99d2141a0ba2847328f5479b05d94a6635cb96148ccb3f34671bd8f5", size = 6299353, upload-time = "2025-04-27T18:04:59.103Z" }, -] - -[[package]] -name = "tenacity" -version = "8.5.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/a3/4d/6a19536c50b849338fcbe9290d562b52cbdcf30d8963d3588a68a4107df1/tenacity-8.5.0.tar.gz", hash = "sha256:8bc6c0c8a09b31e6cad13c47afbed1a567518250a9a171418582ed8d9c20ca78", size = 47309, upload-time = "2024-07-05T07:25:31.836Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d2/3f/8ba87d9e287b9d385a02a7114ddcef61b26f86411e121c9003eb509a1773/tenacity-8.5.0-py3-none-any.whl", hash = "sha256:b594c2a5945830c267ce6b79a166228323ed52718f30302c1359836112346687", size = 28165, upload-time = "2024-07-05T07:25:29.591Z" }, -] - -[[package]] -name = "termcolor" -version = "2.4.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/10/56/d7d66a84f96d804155f6ff2873d065368b25a07222a6fd51c4f24ef6d764/termcolor-2.4.0.tar.gz", hash = "sha256:aab9e56047c8ac41ed798fa36d892a37aca6b3e9159f3e0c24bc64a9b3ac7b7a", size = 12664, upload-time = "2023-12-01T11:04:51.66Z" } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/d9/5f/8c716e47b3a50cbd7c146f45881e11d9414def768b7cd9c5e6650ec2a80a/termcolor-2.4.0-py3-none-any.whl", hash = "sha256:9297c0df9c99445c2412e832e882a7884038a25617c60cea2ad69488d4040d63", size = 7719, upload-time = "2023-12-01T11:04:50.019Z" }, -] - -[[package]] -name = "threadpoolctl" -version = "3.6.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/b7/4d/08c89e34946fce2aec4fbb45c9016efd5f4d7f24af8e5d93296e935631d8/threadpoolctl-3.6.0.tar.gz", hash = "sha256:8ab8b4aa3491d812b623328249fab5302a68d2d71745c8a4c719a2fcaba9f44e", size = 21274, upload-time = "2025-03-13T13:49:23.031Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/32/d5/f9a850d79b0851d1d4ef6456097579a9005b31fea68726a4ae5f2d82ddd9/threadpoolctl-3.6.0-py3-none-any.whl", hash = "sha256:43a0b8fd5a2928500110039e43a5eed8480b918967083ea48dc3ab9f13c4a7fb", size = 18638, upload-time = "2025-03-13T13:49:21.846Z" }, -] - -[[package]] -name = "tiktoken" -version = "0.11.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "regex" }, - { name = "requests" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/a7/86/ad0155a37c4f310935d5ac0b1ccf9bdb635dcb906e0a9a26b616dd55825a/tiktoken-0.11.0.tar.gz", hash = "sha256:3c518641aee1c52247c2b97e74d8d07d780092af79d5911a6ab5e79359d9b06a", size = 37648, upload-time = "2025-08-08T23:58:08.495Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/8a/91/912b459799a025d2842566fe1e902f7f50d54a1ce8a0f236ab36b5bd5846/tiktoken-0.11.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4ae374c46afadad0f501046db3da1b36cd4dfbfa52af23c998773682446097cf", size = 1059743, upload-time = "2025-08-08T23:57:37.516Z" }, - { url = "https://files.pythonhosted.org/packages/8c/e9/6faa6870489ce64f5f75dcf91512bf35af5864583aee8fcb0dcb593121f5/tiktoken-0.11.0-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:25a512ff25dc6c85b58f5dd4f3d8c674dc05f96b02d66cdacf628d26a4e4866b", size = 999334, upload-time = "2025-08-08T23:57:38.595Z" }, - { url = "https://files.pythonhosted.org/packages/a1/3e/a05d1547cf7db9dc75d1461cfa7b556a3b48e0516ec29dfc81d984a145f6/tiktoken-0.11.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2130127471e293d385179c1f3f9cd445070c0772be73cdafb7cec9a3684c0458", size = 1129402, upload-time = "2025-08-08T23:57:39.627Z" }, - { url = "https://files.pythonhosted.org/packages/34/9a/db7a86b829e05a01fd4daa492086f708e0a8b53952e1dbc9d380d2b03677/tiktoken-0.11.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21e43022bf2c33f733ea9b54f6a3f6b4354b909f5a73388fb1b9347ca54a069c", size = 1184046, upload-time = "2025-08-08T23:57:40.689Z" }, - { url = "https://files.pythonhosted.org/packages/9d/bb/52edc8e078cf062ed749248f1454e9e5cfd09979baadb830b3940e522015/tiktoken-0.11.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:adb4e308eb64380dc70fa30493e21c93475eaa11669dea313b6bbf8210bfd013", size = 1244691, upload-time = "2025-08-08T23:57:42.251Z" }, - { url = "https://files.pythonhosted.org/packages/60/d9/884b6cd7ae2570ecdcaffa02b528522b18fef1cbbfdbcaa73799807d0d3b/tiktoken-0.11.0-cp311-cp311-win_amd64.whl", hash = "sha256:ece6b76bfeeb61a125c44bbefdfccc279b5288e6007fbedc0d32bfec602df2f2", size = 884392, upload-time = "2025-08-08T23:57:43.628Z" }, - { url = "https://files.pythonhosted.org/packages/e7/9e/eceddeffc169fc75fe0fd4f38471309f11cb1906f9b8aa39be4f5817df65/tiktoken-0.11.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fd9e6b23e860973cf9526544e220b223c60badf5b62e80a33509d6d40e6c8f5d", size = 1055199, upload-time = "2025-08-08T23:57:45.076Z" }, - { url = "https://files.pythonhosted.org/packages/4f/cf/5f02bfefffdc6b54e5094d2897bc80efd43050e5b09b576fd85936ee54bf/tiktoken-0.11.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6a76d53cee2da71ee2731c9caa747398762bda19d7f92665e882fef229cb0b5b", size 
= 996655, upload-time = "2025-08-08T23:57:46.304Z" }, - { url = "https://files.pythonhosted.org/packages/65/8e/c769b45ef379bc360c9978c4f6914c79fd432400a6733a8afc7ed7b0726a/tiktoken-0.11.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ef72aab3ea240646e642413cb363b73869fed4e604dcfd69eec63dc54d603e8", size = 1128867, upload-time = "2025-08-08T23:57:47.438Z" }, - { url = "https://files.pythonhosted.org/packages/d5/2d/4d77f6feb9292bfdd23d5813e442b3bba883f42d0ac78ef5fdc56873f756/tiktoken-0.11.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f929255c705efec7a28bf515e29dc74220b2f07544a8c81b8d69e8efc4578bd", size = 1183308, upload-time = "2025-08-08T23:57:48.566Z" }, - { url = "https://files.pythonhosted.org/packages/7a/65/7ff0a65d3bb0fc5a1fb6cc71b03e0f6e71a68c5eea230d1ff1ba3fd6df49/tiktoken-0.11.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:61f1d15822e4404953d499fd1dcc62817a12ae9fb1e4898033ec8fe3915fdf8e", size = 1244301, upload-time = "2025-08-08T23:57:49.642Z" }, - { url = "https://files.pythonhosted.org/packages/f5/6e/5b71578799b72e5bdcef206a214c3ce860d999d579a3b56e74a6c8989ee2/tiktoken-0.11.0-cp312-cp312-win_amd64.whl", hash = "sha256:45927a71ab6643dfd3ef57d515a5db3d199137adf551f66453be098502838b0f", size = 884282, upload-time = "2025-08-08T23:57:50.759Z" }, - { url = "https://files.pythonhosted.org/packages/cc/cd/a9034bcee638716d9310443818d73c6387a6a96db93cbcb0819b77f5b206/tiktoken-0.11.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a5f3f25ffb152ee7fec78e90a5e5ea5b03b4ea240beed03305615847f7a6ace2", size = 1055339, upload-time = "2025-08-08T23:57:51.802Z" }, - { url = "https://files.pythonhosted.org/packages/f1/91/9922b345f611b4e92581f234e64e9661e1c524875c8eadd513c4b2088472/tiktoken-0.11.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7dc6e9ad16a2a75b4c4be7208055a1f707c9510541d94d9cc31f7fbdc8db41d8", size = 997080, upload-time = "2025-08-08T23:57:53.442Z" }, - { url = 
"https://files.pythonhosted.org/packages/d0/9d/49cd047c71336bc4b4af460ac213ec1c457da67712bde59b892e84f1859f/tiktoken-0.11.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5a0517634d67a8a48fd4a4ad73930c3022629a85a217d256a6e9b8b47439d1e4", size = 1128501, upload-time = "2025-08-08T23:57:54.808Z" }, - { url = "https://files.pythonhosted.org/packages/52/d5/a0dcdb40dd2ea357e83cb36258967f0ae96f5dd40c722d6e382ceee6bba9/tiktoken-0.11.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7fb4effe60574675118b73c6fbfd3b5868e5d7a1f570d6cc0d18724b09ecf318", size = 1182743, upload-time = "2025-08-08T23:57:56.307Z" }, - { url = "https://files.pythonhosted.org/packages/3b/17/a0fc51aefb66b7b5261ca1314afa83df0106b033f783f9a7bcbe8e741494/tiktoken-0.11.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:94f984c9831fd32688aef4348803b0905d4ae9c432303087bae370dc1381a2b8", size = 1244057, upload-time = "2025-08-08T23:57:57.628Z" }, - { url = "https://files.pythonhosted.org/packages/50/79/bcf350609f3a10f09fe4fc207f132085e497fdd3612f3925ab24d86a0ca0/tiktoken-0.11.0-cp313-cp313-win_amd64.whl", hash = "sha256:2177ffda31dec4023356a441793fed82f7af5291120751dee4d696414f54db0c", size = 883901, upload-time = "2025-08-08T23:57:59.359Z" }, -] - -[[package]] -name = "tokenizers" -version = "0.22.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "huggingface-hub" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5e/b4/c1ce3699e81977da2ace8b16d2badfd42b060e7d33d75c4ccdbf9dc920fa/tokenizers-0.22.0.tar.gz", hash = "sha256:2e33b98525be8453f355927f3cab312c36cd3e44f4d7e9e97da2fa94d0a49dcb", size = 362771, upload-time = "2025-08-29T10:25:33.914Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6d/b1/18c13648edabbe66baa85fe266a478a7931ddc0cd1ba618802eb7b8d9865/tokenizers-0.22.0-cp39-abi3-macosx_10_12_x86_64.whl", hash = 
"sha256:eaa9620122a3fb99b943f864af95ed14c8dfc0f47afa3b404ac8c16b3f2bb484", size = 3081954, upload-time = "2025-08-29T10:25:24.993Z" }, - { url = "https://files.pythonhosted.org/packages/c2/02/c3c454b641bd7c4f79e4464accfae9e7dfc913a777d2e561e168ae060362/tokenizers-0.22.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:71784b9ab5bf0ff3075bceeb198149d2c5e068549c0d18fe32d06ba0deb63f79", size = 2945644, upload-time = "2025-08-29T10:25:23.405Z" }, - { url = "https://files.pythonhosted.org/packages/55/02/d10185ba2fd8c2d111e124c9d92de398aee0264b35ce433f79fb8472f5d0/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ec5b71f668a8076802b0241a42387d48289f25435b86b769ae1837cad4172a17", size = 3254764, upload-time = "2025-08-29T10:25:12.445Z" }, - { url = "https://files.pythonhosted.org/packages/13/89/17514bd7ef4bf5bfff58e2b131cec0f8d5cea2b1c8ffe1050a2c8de88dbb/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ea8562fa7498850d02a16178105b58803ea825b50dc9094d60549a7ed63654bb", size = 3161654, upload-time = "2025-08-29T10:25:15.493Z" }, - { url = "https://files.pythonhosted.org/packages/5a/d8/bac9f3a7ef6dcceec206e3857c3b61bb16c6b702ed7ae49585f5bd85c0ef/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4136e1558a9ef2e2f1de1555dcd573e1cbc4a320c1a06c4107a3d46dc8ac6e4b", size = 3511484, upload-time = "2025-08-29T10:25:20.477Z" }, - { url = "https://files.pythonhosted.org/packages/aa/27/9c9800eb6763683010a4851db4d1802d8cab9cec114c17056eccb4d4a6e0/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cdf5954de3962a5fd9781dc12048d24a1a6f1f5df038c6e95db328cd22964206", size = 3712829, upload-time = "2025-08-29T10:25:17.154Z" }, - { url = "https://files.pythonhosted.org/packages/10/e3/b1726dbc1f03f757260fa21752e1921445b5bc350389a8314dd3338836db/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:8337ca75d0731fc4860e6204cc24bb36a67d9736142aa06ed320943b50b1e7ed", size = 3408934, upload-time = "2025-08-29T10:25:18.76Z" }, - { url = "https://files.pythonhosted.org/packages/d4/61/aeab3402c26874b74bb67a7f2c4b569dde29b51032c5384db592e7b216f4/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a89264e26f63c449d8cded9061adea7b5de53ba2346fc7e87311f7e4117c1cc8", size = 3345585, upload-time = "2025-08-29T10:25:22.08Z" }, - { url = "https://files.pythonhosted.org/packages/bc/d3/498b4a8a8764cce0900af1add0f176ff24f475d4413d55b760b8cdf00893/tokenizers-0.22.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:790bad50a1b59d4c21592f9c3cf5e5cf9c3c7ce7e1a23a739f13e01fb1be377a", size = 9322986, upload-time = "2025-08-29T10:25:26.607Z" }, - { url = "https://files.pythonhosted.org/packages/a2/62/92378eb1c2c565837ca3cb5f9569860d132ab9d195d7950c1ea2681dffd0/tokenizers-0.22.0-cp39-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:76cf6757c73a10ef10bf06fa937c0ec7393d90432f543f49adc8cab3fb6f26cb", size = 9276630, upload-time = "2025-08-29T10:25:28.349Z" }, - { url = "https://files.pythonhosted.org/packages/eb/f0/342d80457aa1cda7654327460f69db0d69405af1e4c453f4dc6ca7c4a76e/tokenizers-0.22.0-cp39-abi3-musllinux_1_2_i686.whl", hash = "sha256:1626cb186e143720c62c6c6b5371e62bbc10af60481388c0da89bc903f37ea0c", size = 9547175, upload-time = "2025-08-29T10:25:29.989Z" }, - { url = "https://files.pythonhosted.org/packages/14/84/8aa9b4adfc4fbd09381e20a5bc6aa27040c9c09caa89988c01544e008d18/tokenizers-0.22.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:da589a61cbfea18ae267723d6b029b84598dc8ca78db9951d8f5beff72d8507c", size = 9692735, upload-time = "2025-08-29T10:25:32.089Z" }, - { url = "https://files.pythonhosted.org/packages/bf/24/83ee2b1dc76bfe05c3142e7d0ccdfe69f0ad2f1ebf6c726cea7f0874c0d0/tokenizers-0.22.0-cp39-abi3-win32.whl", hash = "sha256:dbf9d6851bddae3e046fedfb166f47743c1c7bd11c640f0691dd35ef0bcad3be", size = 2471915, upload-time = 
"2025-08-29T10:25:36.411Z" }, - { url = "https://files.pythonhosted.org/packages/d1/9b/0e0bf82214ee20231845b127aa4a8015936ad5a46779f30865d10e404167/tokenizers-0.22.0-cp39-abi3-win_amd64.whl", hash = "sha256:c78174859eeaee96021f248a56c801e36bfb6bd5b067f2e95aa82445ca324f00", size = 2680494, upload-time = "2025-08-29T10:25:35.14Z" }, -] - -[[package]] -name = "tomlkit" -version = "0.13.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/cc/18/0bbf3884e9eaa38819ebe46a7bd25dcd56b67434402b66a58c4b8e552575/tomlkit-0.13.3.tar.gz", hash = "sha256:430cf247ee57df2b94ee3fbe588e71d362a941ebb545dec29b53961d61add2a1", size = 185207, upload-time = "2025-06-05T07:13:44.947Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/bd/75/8539d011f6be8e29f339c42e633aae3cb73bffa95dd0f9adec09b9c58e85/tomlkit-0.13.3-py3-none-any.whl", hash = "sha256:c89c649d79ee40629a9fda55f8ace8c6a1b42deb912b2a8fd8d942ddadb606b0", size = 38901, upload-time = "2025-06-05T07:13:43.546Z" }, -] - -[[package]] -name = "tqdm" -version = "4.67.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "colorama", marker = "sys_platform == 'win32'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/a8/4b/29b4ef32e036bb34e4ab51796dd745cdba7ed47ad142a9f4a1eb8e0c744d/tqdm-4.67.1.tar.gz", hash = "sha256:f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2", size = 169737, upload-time = "2024-11-24T20:12:22.481Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2", size = 78540, upload-time = "2024-11-24T20:12:19.698Z" }, -] - -[[package]] -name = "typer" -version = "0.17.4" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "click" }, - { name = "rich" }, - { name = "shellingham" }, 
- { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/92/e8/2a73ccf9874ec4c7638f172efc8972ceab13a0e3480b389d6ed822f7a822/typer-0.17.4.tar.gz", hash = "sha256:b77dc07d849312fd2bb5e7f20a7af8985c7ec360c45b051ed5412f64d8dc1580", size = 103734, upload-time = "2025-09-05T18:14:40.746Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/93/72/6b3e70d32e89a5cbb6a4513726c1ae8762165b027af569289e19ec08edd8/typer-0.17.4-py3-none-any.whl", hash = "sha256:015534a6edaa450e7007eba705d5c18c3349dcea50a6ad79a5ed530967575824", size = 46643, upload-time = "2025-09-05T18:14:39.166Z" }, -] - -[[package]] -name = "typing-extensions" -version = "4.15.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" }, -] - -[[package]] -name = "typing-inspection" -version = "0.4.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "typing-extensions" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/f8/b1/0c11f5058406b3af7609f121aaa6b609744687f1d158b3c3a5bf4cc94238/typing_inspection-0.4.1.tar.gz", hash = "sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28", size = 75726, upload-time = "2025-05-21T18:55:23.885Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/17/69/cd203477f944c353c31bade965f880aa1061fd6bf05ded0726ca845b6ff7/typing_inspection-0.4.1-py3-none-any.whl", 
hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51", size = 14552, upload-time = "2025-05-21T18:55:22.152Z" }, -] - -[[package]] -name = "tzdata" -version = "2025.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/95/32/1a225d6164441be760d75c2c42e2780dc0873fe382da3e98a2e1e48361e5/tzdata-2025.2.tar.gz", hash = "sha256:b60a638fcc0daffadf82fe0f57e53d06bdec2f36c4df66280ae79bce6bd6f2b9", size = 196380, upload-time = "2025-03-23T13:54:43.652Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/5c/23/c7abc0ca0a1526a0774eca151daeb8de62ec457e77262b66b359c3c7679e/tzdata-2025.2-py2.py3-none-any.whl", hash = "sha256:1a403fada01ff9221ca8044d701868fa132215d84beb92242d9acd2147f667a8", size = 347839, upload-time = "2025-03-23T13:54:41.845Z" }, -] - -[[package]] -name = "tzlocal" -version = "5.3.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "tzdata", marker = "sys_platform == 'win32'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/8b/2e/c14812d3d4d9cd1773c6be938f89e5735a1f11a9f184ac3639b93cef35d5/tzlocal-5.3.1.tar.gz", hash = "sha256:cceffc7edecefea1f595541dbd6e990cb1ea3d19bf01b2809f362a03dd7921fd", size = 30761, upload-time = "2025-03-05T21:17:41.549Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c2/14/e2a54fabd4f08cd7af1c07030603c3356b74da07f7cc056e600436edfa17/tzlocal-5.3.1-py3-none-any.whl", hash = "sha256:eb1a66c3ef5847adf7a834f1be0800581b683b5608e74f86ecbcef8ab91bb85d", size = 18026, upload-time = "2025-03-05T21:17:39.857Z" }, -] - -[[package]] -name = "uritemplate" -version = "4.2.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/98/60/f174043244c5306c9988380d2cb10009f91563fc4b31293d27e17201af56/uritemplate-4.2.0.tar.gz", hash = "sha256:480c2ed180878955863323eea31b0ede668795de182617fef9c6ca09e6ec9d0e", size = 33267, upload-time = 
"2025-06-02T15:12:06.318Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a9/99/3ae339466c9183ea5b8ae87b34c0b897eda475d2aec2307cae60e5cd4f29/uritemplate-4.2.0-py3-none-any.whl", hash = "sha256:962201ba1c4edcab02e60f9a0d3821e82dfc5d2d6662a21abd533879bdb8a686", size = 11488, upload-time = "2025-06-02T15:12:03.405Z" }, -] - -[[package]] -name = "urllib3" -version = "2.5.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" }, -] - -[[package]] -name = "uvicorn" -version = "0.35.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "click" }, - { name = "h11" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/5e/42/e0e305207bb88c6b8d3061399c6a961ffe5fbb7e2aa63c9234df7259e9cd/uvicorn-0.35.0.tar.gz", hash = "sha256:bc662f087f7cf2ce11a1d7fd70b90c9f98ef2e2831556dd078d131b96cc94a01", size = 78473, upload-time = "2025-06-28T16:15:46.058Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/d2/e2/dc81b1bd1dcfe91735810265e9d26bc8ec5da45b4c0f6237e286819194c3/uvicorn-0.35.0-py3-none-any.whl", hash = "sha256:197535216b25ff9b785e29a0b79199f55222193d47f820816e7da751e9bc8d4a", size = 66406, upload-time = "2025-06-28T16:15:44.816Z" }, -] - -[[package]] -name = "virtualenv" -version = "20.34.0" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "distlib" }, - { name = "filelock" }, - { name = "platformdirs" 
}, -] -sdist = { url = "https://files.pythonhosted.org/packages/1c/14/37fcdba2808a6c615681cd216fecae00413c9dab44fb2e57805ecf3eaee3/virtualenv-20.34.0.tar.gz", hash = "sha256:44815b2c9dee7ed86e387b842a84f20b93f7f417f95886ca1996a72a4138eb1a", size = 6003808, upload-time = "2025-08-13T14:24:07.464Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/76/06/04c8e804f813cf972e3262f3f8584c232de64f0cde9f703b46cf53a45090/virtualenv-20.34.0-py3-none-any.whl", hash = "sha256:341f5afa7eee943e4984a9207c025feedd768baff6753cd660c857ceb3e36026", size = 5983279, upload-time = "2025-08-13T14:24:05.111Z" }, -] - -[[package]] -name = "watchdog" -version = "6.0.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/db/7d/7f3d619e951c88ed75c6037b246ddcf2d322812ee8ea189be89511721d54/watchdog-6.0.0.tar.gz", hash = "sha256:9ddf7c82fda3ae8e24decda1338ede66e1c99883db93711d8fb941eaa2d8c282", size = 131220, upload-time = "2024-11-01T14:07:13.037Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e0/24/d9be5cd6642a6aa68352ded4b4b10fb0d7889cb7f45814fb92cecd35f101/watchdog-6.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6eb11feb5a0d452ee41f824e271ca311a09e250441c262ca2fd7ebcf2461a06c", size = 96393, upload-time = "2024-11-01T14:06:31.756Z" }, - { url = "https://files.pythonhosted.org/packages/63/7a/6013b0d8dbc56adca7fdd4f0beed381c59f6752341b12fa0886fa7afc78b/watchdog-6.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ef810fbf7b781a5a593894e4f439773830bdecb885e6880d957d5b9382a960d2", size = 88392, upload-time = "2024-11-01T14:06:32.99Z" }, - { url = "https://files.pythonhosted.org/packages/d1/40/b75381494851556de56281e053700e46bff5b37bf4c7267e858640af5a7f/watchdog-6.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:afd0fe1b2270917c5e23c2a65ce50c2a4abb63daafb0d419fde368e272a76b7c", size = 89019, upload-time = "2024-11-01T14:06:34.963Z" }, - { url = 
"https://files.pythonhosted.org/packages/39/ea/3930d07dafc9e286ed356a679aa02d777c06e9bfd1164fa7c19c288a5483/watchdog-6.0.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdd4e6f14b8b18c334febb9c4425a878a2ac20efd1e0b231978e7b150f92a948", size = 96471, upload-time = "2024-11-01T14:06:37.745Z" }, - { url = "https://files.pythonhosted.org/packages/12/87/48361531f70b1f87928b045df868a9fd4e253d9ae087fa4cf3f7113be363/watchdog-6.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c7c15dda13c4eb00d6fb6fc508b3c0ed88b9d5d374056b239c4ad1611125c860", size = 88449, upload-time = "2024-11-01T14:06:39.748Z" }, - { url = "https://files.pythonhosted.org/packages/5b/7e/8f322f5e600812e6f9a31b75d242631068ca8f4ef0582dd3ae6e72daecc8/watchdog-6.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6f10cb2d5902447c7d0da897e2c6768bca89174d0c6e1e30abec5421af97a5b0", size = 89054, upload-time = "2024-11-01T14:06:41.009Z" }, - { url = "https://files.pythonhosted.org/packages/68/98/b0345cabdce2041a01293ba483333582891a3bd5769b08eceb0d406056ef/watchdog-6.0.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:490ab2ef84f11129844c23fb14ecf30ef3d8a6abafd3754a6f75ca1e6654136c", size = 96480, upload-time = "2024-11-01T14:06:42.952Z" }, - { url = "https://files.pythonhosted.org/packages/85/83/cdf13902c626b28eedef7ec4f10745c52aad8a8fe7eb04ed7b1f111ca20e/watchdog-6.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:76aae96b00ae814b181bb25b1b98076d5fc84e8a53cd8885a318b42b6d3a5134", size = 88451, upload-time = "2024-11-01T14:06:45.084Z" }, - { url = "https://files.pythonhosted.org/packages/fe/c4/225c87bae08c8b9ec99030cd48ae9c4eca050a59bf5c2255853e18c87b50/watchdog-6.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a175f755fc2279e0b7312c0035d52e27211a5bc39719dd529625b1930917345b", size = 89057, upload-time = "2024-11-01T14:06:47.324Z" }, - { url = 
"https://files.pythonhosted.org/packages/a9/c7/ca4bf3e518cb57a686b2feb4f55a1892fd9a3dd13f470fca14e00f80ea36/watchdog-6.0.0-py3-none-manylinux2014_aarch64.whl", hash = "sha256:7607498efa04a3542ae3e05e64da8202e58159aa1fa4acddf7678d34a35d4f13", size = 79079, upload-time = "2024-11-01T14:06:59.472Z" }, - { url = "https://files.pythonhosted.org/packages/5c/51/d46dc9332f9a647593c947b4b88e2381c8dfc0942d15b8edc0310fa4abb1/watchdog-6.0.0-py3-none-manylinux2014_armv7l.whl", hash = "sha256:9041567ee8953024c83343288ccc458fd0a2d811d6a0fd68c4c22609e3490379", size = 79078, upload-time = "2024-11-01T14:07:01.431Z" }, - { url = "https://files.pythonhosted.org/packages/d4/57/04edbf5e169cd318d5f07b4766fee38e825d64b6913ca157ca32d1a42267/watchdog-6.0.0-py3-none-manylinux2014_i686.whl", hash = "sha256:82dc3e3143c7e38ec49d61af98d6558288c415eac98486a5c581726e0737c00e", size = 79076, upload-time = "2024-11-01T14:07:02.568Z" }, - { url = "https://files.pythonhosted.org/packages/ab/cc/da8422b300e13cb187d2203f20b9253e91058aaf7db65b74142013478e66/watchdog-6.0.0-py3-none-manylinux2014_ppc64.whl", hash = "sha256:212ac9b8bf1161dc91bd09c048048a95ca3a4c4f5e5d4a7d1b1a7d5752a7f96f", size = 79077, upload-time = "2024-11-01T14:07:03.893Z" }, - { url = "https://files.pythonhosted.org/packages/2c/3b/b8964e04ae1a025c44ba8e4291f86e97fac443bca31de8bd98d3263d2fcf/watchdog-6.0.0-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:e3df4cbb9a450c6d49318f6d14f4bbc80d763fa587ba46ec86f99f9e6876bb26", size = 79078, upload-time = "2024-11-01T14:07:05.189Z" }, - { url = "https://files.pythonhosted.org/packages/62/ae/a696eb424bedff7407801c257d4b1afda455fe40821a2be430e173660e81/watchdog-6.0.0-py3-none-manylinux2014_s390x.whl", hash = "sha256:2cce7cfc2008eb51feb6aab51251fd79b85d9894e98ba847408f662b3395ca3c", size = 79077, upload-time = "2024-11-01T14:07:06.376Z" }, - { url = 
"https://files.pythonhosted.org/packages/b5/e8/dbf020b4d98251a9860752a094d09a65e1b436ad181faf929983f697048f/watchdog-6.0.0-py3-none-manylinux2014_x86_64.whl", hash = "sha256:20ffe5b202af80ab4266dcd3e91aae72bf2da48c0d33bdb15c66658e685e94e2", size = 79078, upload-time = "2024-11-01T14:07:07.547Z" }, - { url = "https://files.pythonhosted.org/packages/07/f6/d0e5b343768e8bcb4cda79f0f2f55051bf26177ecd5651f84c07567461cf/watchdog-6.0.0-py3-none-win32.whl", hash = "sha256:07df1fdd701c5d4c8e55ef6cf55b8f0120fe1aef7ef39a1c6fc6bc2e606d517a", size = 79065, upload-time = "2024-11-01T14:07:09.525Z" }, - { url = "https://files.pythonhosted.org/packages/db/d9/c495884c6e548fce18a8f40568ff120bc3a4b7b99813081c8ac0c936fa64/watchdog-6.0.0-py3-none-win_amd64.whl", hash = "sha256:cbafb470cf848d93b5d013e2ecb245d4aa1c8fd0504e863ccefa32445359d680", size = 79070, upload-time = "2024-11-01T14:07:10.686Z" }, - { url = "https://files.pythonhosted.org/packages/33/e8/e40370e6d74ddba47f002a32919d91310d6074130fe4e17dabcafc15cbf1/watchdog-6.0.0-py3-none-win_ia64.whl", hash = "sha256:a1914259fa9e1454315171103c6a30961236f508b9b623eae470268bbcc6a22f", size = 79067, upload-time = "2024-11-01T14:07:11.845Z" }, -] - -[[package]] -name = "websockets" -version = "15.0.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016, upload-time = "2025-03-05T20:03:41.606Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/9f/32/18fcd5919c293a398db67443acd33fde142f283853076049824fc58e6f75/websockets-15.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:823c248b690b2fd9303ba00c4f66cd5e2d8c3ba4aa968b2779be9532a4dad431", size = 175423, upload-time = "2025-03-05T20:01:56.276Z" }, - { url = 
"https://files.pythonhosted.org/packages/76/70/ba1ad96b07869275ef42e2ce21f07a5b0148936688c2baf7e4a1f60d5058/websockets-15.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678999709e68425ae2593acf2e3ebcbcf2e69885a5ee78f9eb80e6e371f1bf57", size = 173082, upload-time = "2025-03-05T20:01:57.563Z" }, - { url = "https://files.pythonhosted.org/packages/86/f2/10b55821dd40eb696ce4704a87d57774696f9451108cff0d2824c97e0f97/websockets-15.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d50fd1ee42388dcfb2b3676132c78116490976f1300da28eb629272d5d93e905", size = 173330, upload-time = "2025-03-05T20:01:59.063Z" }, - { url = "https://files.pythonhosted.org/packages/a5/90/1c37ae8b8a113d3daf1065222b6af61cc44102da95388ac0018fcb7d93d9/websockets-15.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d99e5546bf73dbad5bf3547174cd6cb8ba7273062a23808ffea025ecb1cf8562", size = 182878, upload-time = "2025-03-05T20:02:00.305Z" }, - { url = "https://files.pythonhosted.org/packages/8e/8d/96e8e288b2a41dffafb78e8904ea7367ee4f891dafc2ab8d87e2124cb3d3/websockets-15.0.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:66dd88c918e3287efc22409d426c8f729688d89a0c587c88971a0faa2c2f3792", size = 181883, upload-time = "2025-03-05T20:02:03.148Z" }, - { url = "https://files.pythonhosted.org/packages/93/1f/5d6dbf551766308f6f50f8baf8e9860be6182911e8106da7a7f73785f4c4/websockets-15.0.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dd8327c795b3e3f219760fa603dcae1dcc148172290a8ab15158cf85a953413", size = 182252, upload-time = "2025-03-05T20:02:05.29Z" }, - { url = "https://files.pythonhosted.org/packages/d4/78/2d4fed9123e6620cbf1706c0de8a1632e1a28e7774d94346d7de1bba2ca3/websockets-15.0.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fdc51055e6ff4adeb88d58a11042ec9a5eae317a0a53d12c062c8a8865909e8", size = 182521, upload-time = 
"2025-03-05T20:02:07.458Z" }, - { url = "https://files.pythonhosted.org/packages/e7/3b/66d4c1b444dd1a9823c4a81f50231b921bab54eee2f69e70319b4e21f1ca/websockets-15.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:693f0192126df6c2327cce3baa7c06f2a117575e32ab2308f7f8216c29d9e2e3", size = 181958, upload-time = "2025-03-05T20:02:09.842Z" }, - { url = "https://files.pythonhosted.org/packages/08/ff/e9eed2ee5fed6f76fdd6032ca5cd38c57ca9661430bb3d5fb2872dc8703c/websockets-15.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54479983bd5fb469c38f2f5c7e3a24f9a4e70594cd68cd1fa6b9340dadaff7cf", size = 181918, upload-time = "2025-03-05T20:02:11.968Z" }, - { url = "https://files.pythonhosted.org/packages/d8/75/994634a49b7e12532be6a42103597b71098fd25900f7437d6055ed39930a/websockets-15.0.1-cp311-cp311-win32.whl", hash = "sha256:16b6c1b3e57799b9d38427dda63edcbe4926352c47cf88588c0be4ace18dac85", size = 176388, upload-time = "2025-03-05T20:02:13.32Z" }, - { url = "https://files.pythonhosted.org/packages/98/93/e36c73f78400a65f5e236cd376713c34182e6663f6889cd45a4a04d8f203/websockets-15.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:27ccee0071a0e75d22cb35849b1db43f2ecd3e161041ac1ee9d2352ddf72f065", size = 176828, upload-time = "2025-03-05T20:02:14.585Z" }, - { url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437, upload-time = "2025-03-05T20:02:16.706Z" }, - { url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096, upload-time = "2025-03-05T20:02:18.832Z" }, - { url = 
"https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332, upload-time = "2025-03-05T20:02:20.187Z" }, - { url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152, upload-time = "2025-03-05T20:02:22.286Z" }, - { url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096, upload-time = "2025-03-05T20:02:24.368Z" }, - { url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523, upload-time = "2025-03-05T20:02:25.669Z" }, - { url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790, upload-time = "2025-03-05T20:02:26.99Z" }, - { url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165, upload-time = 
"2025-03-05T20:02:30.291Z" }, - { url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160, upload-time = "2025-03-05T20:02:31.634Z" }, - { url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395, upload-time = "2025-03-05T20:02:33.017Z" }, - { url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841, upload-time = "2025-03-05T20:02:34.498Z" }, - { url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440, upload-time = "2025-03-05T20:02:36.695Z" }, - { url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098, upload-time = "2025-03-05T20:02:37.985Z" }, - { url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329, upload-time = "2025-03-05T20:02:39.298Z" }, - { url = 
"https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111, upload-time = "2025-03-05T20:02:40.595Z" }, - { url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054, upload-time = "2025-03-05T20:02:41.926Z" }, - { url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496, upload-time = "2025-03-05T20:02:43.304Z" }, - { url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829, upload-time = "2025-03-05T20:02:48.812Z" }, - { url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217, upload-time = "2025-03-05T20:02:50.14Z" }, - { url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195, upload-time = 
"2025-03-05T20:02:51.561Z" }, - { url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393, upload-time = "2025-03-05T20:02:53.814Z" }, - { url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837, upload-time = "2025-03-05T20:02:55.237Z" }, - { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743, upload-time = "2025-03-05T20:03:39.41Z" }, -] - -[[package]] -name = "werkzeug" -version = "3.1.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "markupsafe" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/32/af/d4502dc713b4ccea7175d764718d5183caf8d0867a4f0190d5d4a45cea49/werkzeug-3.1.1.tar.gz", hash = "sha256:8cd39dfbdfc1e051965f156163e2974e52c210f130810e9ad36858f0fd3edad4", size = 806453, upload-time = "2024-11-01T16:40:45.462Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ee/ea/c67e1dee1ba208ed22c06d1d547ae5e293374bfc43e0eb0ef5e262b68561/werkzeug-3.1.1-py3-none-any.whl", hash = "sha256:a71124d1ef06008baafa3d266c02f56e1836a5984afd6dd6c9230669d60d9fb5", size = 224371, upload-time = "2024-11-01T16:40:43.994Z" }, -] - -[[package]] -name = "win-precise-time" -version = "1.4.2" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/9e/b0/21547e16a47206ccdd15769bf65e143ade1ffae67f0881c855f76e44e9fa/win-precise-time-1.4.2.tar.gz", hash = 
"sha256:89274785cbc5f2997e01675206da3203835a442c60fd97798415c6b3c179c0b9", size = 7982, upload-time = "2023-10-08T17:08:18.618Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/bb/d6/a48717649fea2d7a6679db86dae9ae4b12078c7a48aa89a8f14a360f29d0/win_precise_time-1.4.2-cp311-cp311-win32.whl", hash = "sha256:59272655ad6f36910d0b585969402386fa627fca3be24acc9a21be1d550e5db8", size = 14703, upload-time = "2023-10-08T17:08:06.945Z" }, - { url = "https://files.pythonhosted.org/packages/f9/9c/46d69220d468c82ca2044284c5a8089705c5eb66be416abcbba156365a14/win_precise_time-1.4.2-cp311-cp311-win_amd64.whl", hash = "sha256:0897bb055f19f3b4336e2ba6bee0115ac20fd7ec615a6d736632e2df77f8851a", size = 14912, upload-time = "2023-10-08T17:08:07.896Z" }, - { url = "https://files.pythonhosted.org/packages/2e/96/55a14b5c0e90439951f4a72672223bba81a5f882033c5850f8a6c7f4308b/win_precise_time-1.4.2-cp312-cp312-win32.whl", hash = "sha256:0210dcea88a520c91de1708ae4c881e3c0ddc956daa08b9eabf2b7c35f3109f5", size = 14694, upload-time = "2023-10-08T17:08:09.275Z" }, - { url = "https://files.pythonhosted.org/packages/17/19/7ea9a22a69fc23d5ca02e8edf65e4a335a210497794af1af0ef8fda91fa0/win_precise_time-1.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:85670f77cc8accd8f1e6d05073999f77561c23012a9ee988cbd44bb7ce655062", size = 14913, upload-time = "2023-10-08T17:08:10.677Z" }, -] - -[[package]] -name = "wrapt" -version = "1.17.3" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/95/8f/aeb76c5b46e273670962298c23e7ddde79916cb74db802131d49a85e4b7d/wrapt-1.17.3.tar.gz", hash = "sha256:f66eb08feaa410fe4eebd17f2a2c8e2e46d3476e9f8c783daa8e09e0faa666d0", size = 55547, upload-time = "2025-08-12T05:53:21.714Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/52/db/00e2a219213856074a213503fdac0511203dceefff26e1daa15250cc01a0/wrapt-1.17.3-cp311-cp311-macosx_10_9_universal2.whl", hash = 
"sha256:273a736c4645e63ac582c60a56b0acb529ef07f78e08dc6bfadf6a46b19c0da7", size = 53482, upload-time = "2025-08-12T05:51:45.79Z" }, - { url = "https://files.pythonhosted.org/packages/5e/30/ca3c4a5eba478408572096fe9ce36e6e915994dd26a4e9e98b4f729c06d9/wrapt-1.17.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5531d911795e3f935a9c23eb1c8c03c211661a5060aab167065896bbf62a5f85", size = 38674, upload-time = "2025-08-12T05:51:34.629Z" }, - { url = "https://files.pythonhosted.org/packages/31/25/3e8cc2c46b5329c5957cec959cb76a10718e1a513309c31399a4dad07eb3/wrapt-1.17.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0610b46293c59a3adbae3dee552b648b984176f8562ee0dba099a56cfbe4df1f", size = 38959, upload-time = "2025-08-12T05:51:56.074Z" }, - { url = "https://files.pythonhosted.org/packages/5d/8f/a32a99fc03e4b37e31b57cb9cefc65050ea08147a8ce12f288616b05ef54/wrapt-1.17.3-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b32888aad8b6e68f83a8fdccbf3165f5469702a7544472bdf41f582970ed3311", size = 82376, upload-time = "2025-08-12T05:52:32.134Z" }, - { url = "https://files.pythonhosted.org/packages/31/57/4930cb8d9d70d59c27ee1332a318c20291749b4fba31f113c2f8ac49a72e/wrapt-1.17.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8cccf4f81371f257440c88faed6b74f1053eef90807b77e31ca057b2db74edb1", size = 83604, upload-time = "2025-08-12T05:52:11.663Z" }, - { url = "https://files.pythonhosted.org/packages/a8/f3/1afd48de81d63dd66e01b263a6fbb86e1b5053b419b9b33d13e1f6d0f7d0/wrapt-1.17.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8a210b158a34164de8bb68b0e7780041a903d7b00c87e906fb69928bf7890d5", size = 82782, upload-time = "2025-08-12T05:52:12.626Z" }, - { url = "https://files.pythonhosted.org/packages/1e/d7/4ad5327612173b144998232f98a85bb24b60c352afb73bc48e3e0d2bdc4e/wrapt-1.17.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = 
"sha256:79573c24a46ce11aab457b472efd8d125e5a51da2d1d24387666cd85f54c05b2", size = 82076, upload-time = "2025-08-12T05:52:33.168Z" }, - { url = "https://files.pythonhosted.org/packages/bb/59/e0adfc831674a65694f18ea6dc821f9fcb9ec82c2ce7e3d73a88ba2e8718/wrapt-1.17.3-cp311-cp311-win32.whl", hash = "sha256:c31eebe420a9a5d2887b13000b043ff6ca27c452a9a22fa71f35f118e8d4bf89", size = 36457, upload-time = "2025-08-12T05:53:03.936Z" }, - { url = "https://files.pythonhosted.org/packages/83/88/16b7231ba49861b6f75fc309b11012ede4d6b0a9c90969d9e0db8d991aeb/wrapt-1.17.3-cp311-cp311-win_amd64.whl", hash = "sha256:0b1831115c97f0663cb77aa27d381237e73ad4f721391a9bfb2fe8bc25fa6e77", size = 38745, upload-time = "2025-08-12T05:53:02.885Z" }, - { url = "https://files.pythonhosted.org/packages/9a/1e/c4d4f3398ec073012c51d1c8d87f715f56765444e1a4b11e5180577b7e6e/wrapt-1.17.3-cp311-cp311-win_arm64.whl", hash = "sha256:5a7b3c1ee8265eb4c8f1b7d29943f195c00673f5ab60c192eba2d4a7eae5f46a", size = 36806, upload-time = "2025-08-12T05:52:53.368Z" }, - { url = "https://files.pythonhosted.org/packages/9f/41/cad1aba93e752f1f9268c77270da3c469883d56e2798e7df6240dcb2287b/wrapt-1.17.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:ab232e7fdb44cdfbf55fc3afa31bcdb0d8980b9b95c38b6405df2acb672af0e0", size = 53998, upload-time = "2025-08-12T05:51:47.138Z" }, - { url = "https://files.pythonhosted.org/packages/60/f8/096a7cc13097a1869fe44efe68dace40d2a16ecb853141394047f0780b96/wrapt-1.17.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:9baa544e6acc91130e926e8c802a17f3b16fbea0fd441b5a60f5cf2cc5c3deba", size = 39020, upload-time = "2025-08-12T05:51:35.906Z" }, - { url = "https://files.pythonhosted.org/packages/33/df/bdf864b8997aab4febb96a9ae5c124f700a5abd9b5e13d2a3214ec4be705/wrapt-1.17.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6b538e31eca1a7ea4605e44f81a48aa24c4632a277431a6ed3f328835901f4fd", size = 39098, upload-time = "2025-08-12T05:51:57.474Z" }, - { url = 
"https://files.pythonhosted.org/packages/9f/81/5d931d78d0eb732b95dc3ddaeeb71c8bb572fb01356e9133916cd729ecdd/wrapt-1.17.3-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:042ec3bb8f319c147b1301f2393bc19dba6e176b7da446853406d041c36c7828", size = 88036, upload-time = "2025-08-12T05:52:34.784Z" }, - { url = "https://files.pythonhosted.org/packages/ca/38/2e1785df03b3d72d34fc6252d91d9d12dc27a5c89caef3335a1bbb8908ca/wrapt-1.17.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3af60380ba0b7b5aeb329bc4e402acd25bd877e98b3727b0135cb5c2efdaefe9", size = 88156, upload-time = "2025-08-12T05:52:13.599Z" }, - { url = "https://files.pythonhosted.org/packages/b3/8b/48cdb60fe0603e34e05cffda0b2a4adab81fd43718e11111a4b0100fd7c1/wrapt-1.17.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:0b02e424deef65c9f7326d8c19220a2c9040c51dc165cddb732f16198c168396", size = 87102, upload-time = "2025-08-12T05:52:14.56Z" }, - { url = "https://files.pythonhosted.org/packages/3c/51/d81abca783b58f40a154f1b2c56db1d2d9e0d04fa2d4224e357529f57a57/wrapt-1.17.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:74afa28374a3c3a11b3b5e5fca0ae03bef8450d6aa3ab3a1e2c30e3a75d023dc", size = 87732, upload-time = "2025-08-12T05:52:36.165Z" }, - { url = "https://files.pythonhosted.org/packages/9e/b1/43b286ca1392a006d5336412d41663eeef1ad57485f3e52c767376ba7e5a/wrapt-1.17.3-cp312-cp312-win32.whl", hash = "sha256:4da9f45279fff3543c371d5ababc57a0384f70be244de7759c85a7f989cb4ebe", size = 36705, upload-time = "2025-08-12T05:53:07.123Z" }, - { url = "https://files.pythonhosted.org/packages/28/de/49493f962bd3c586ab4b88066e967aa2e0703d6ef2c43aa28cb83bf7b507/wrapt-1.17.3-cp312-cp312-win_amd64.whl", hash = "sha256:e71d5c6ebac14875668a1e90baf2ea0ef5b7ac7918355850c0908ae82bcb297c", size = 38877, upload-time = "2025-08-12T05:53:05.436Z" }, - { url = 
"https://files.pythonhosted.org/packages/f1/48/0f7102fe9cb1e8a5a77f80d4f0956d62d97034bbe88d33e94699f99d181d/wrapt-1.17.3-cp312-cp312-win_arm64.whl", hash = "sha256:604d076c55e2fdd4c1c03d06dc1a31b95130010517b5019db15365ec4a405fc6", size = 36885, upload-time = "2025-08-12T05:52:54.367Z" }, - { url = "https://files.pythonhosted.org/packages/fc/f6/759ece88472157acb55fc195e5b116e06730f1b651b5b314c66291729193/wrapt-1.17.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a47681378a0439215912ef542c45a783484d4dd82bac412b71e59cf9c0e1cea0", size = 54003, upload-time = "2025-08-12T05:51:48.627Z" }, - { url = "https://files.pythonhosted.org/packages/4f/a9/49940b9dc6d47027dc850c116d79b4155f15c08547d04db0f07121499347/wrapt-1.17.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:54a30837587c6ee3cd1a4d1c2ec5d24e77984d44e2f34547e2323ddb4e22eb77", size = 39025, upload-time = "2025-08-12T05:51:37.156Z" }, - { url = "https://files.pythonhosted.org/packages/45/35/6a08de0f2c96dcdd7fe464d7420ddb9a7655a6561150e5fc4da9356aeaab/wrapt-1.17.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:16ecf15d6af39246fe33e507105d67e4b81d8f8d2c6598ff7e3ca1b8a37213f7", size = 39108, upload-time = "2025-08-12T05:51:58.425Z" }, - { url = "https://files.pythonhosted.org/packages/0c/37/6faf15cfa41bf1f3dba80cd3f5ccc6622dfccb660ab26ed79f0178c7497f/wrapt-1.17.3-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6fd1ad24dc235e4ab88cda009e19bf347aabb975e44fd5c2fb22a3f6e4141277", size = 88072, upload-time = "2025-08-12T05:52:37.53Z" }, - { url = "https://files.pythonhosted.org/packages/78/f2/efe19ada4a38e4e15b6dff39c3e3f3f73f5decf901f66e6f72fe79623a06/wrapt-1.17.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0ed61b7c2d49cee3c027372df5809a59d60cf1b6c2f81ee980a091f3afed6a2d", size = 88214, upload-time = "2025-08-12T05:52:15.886Z" }, - { url = 
"https://files.pythonhosted.org/packages/40/90/ca86701e9de1622b16e09689fc24b76f69b06bb0150990f6f4e8b0eeb576/wrapt-1.17.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:423ed5420ad5f5529db9ce89eac09c8a2f97da18eb1c870237e84c5a5c2d60aa", size = 87105, upload-time = "2025-08-12T05:52:17.914Z" }, - { url = "https://files.pythonhosted.org/packages/fd/e0/d10bd257c9a3e15cbf5523025252cc14d77468e8ed644aafb2d6f54cb95d/wrapt-1.17.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e01375f275f010fcbf7f643b4279896d04e571889b8a5b3f848423d91bf07050", size = 87766, upload-time = "2025-08-12T05:52:39.243Z" }, - { url = "https://files.pythonhosted.org/packages/e8/cf/7d848740203c7b4b27eb55dbfede11aca974a51c3d894f6cc4b865f42f58/wrapt-1.17.3-cp313-cp313-win32.whl", hash = "sha256:53e5e39ff71b3fc484df8a522c933ea2b7cdd0d5d15ae82e5b23fde87d44cbd8", size = 36711, upload-time = "2025-08-12T05:53:10.074Z" }, - { url = "https://files.pythonhosted.org/packages/57/54/35a84d0a4d23ea675994104e667ceff49227ce473ba6a59ba2c84f250b74/wrapt-1.17.3-cp313-cp313-win_amd64.whl", hash = "sha256:1f0b2f40cf341ee8cc1a97d51ff50dddb9fcc73241b9143ec74b30fc4f44f6cb", size = 38885, upload-time = "2025-08-12T05:53:08.695Z" }, - { url = "https://files.pythonhosted.org/packages/01/77/66e54407c59d7b02a3c4e0af3783168fff8e5d61def52cda8728439d86bc/wrapt-1.17.3-cp313-cp313-win_arm64.whl", hash = "sha256:7425ac3c54430f5fc5e7b6f41d41e704db073309acfc09305816bc6a0b26bb16", size = 36896, upload-time = "2025-08-12T05:52:55.34Z" }, - { url = "https://files.pythonhosted.org/packages/02/a2/cd864b2a14f20d14f4c496fab97802001560f9f41554eef6df201cd7f76c/wrapt-1.17.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:cf30f6e3c077c8e6a9a7809c94551203c8843e74ba0c960f4a98cd80d4665d39", size = 54132, upload-time = "2025-08-12T05:51:49.864Z" }, - { url = "https://files.pythonhosted.org/packages/d5/46/d011725b0c89e853dc44cceb738a307cde5d240d023d6d40a82d1b4e1182/wrapt-1.17.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = 
"sha256:e228514a06843cae89621384cfe3a80418f3c04aadf8a3b14e46a7be704e4235", size = 39091, upload-time = "2025-08-12T05:51:38.935Z" }, - { url = "https://files.pythonhosted.org/packages/2e/9e/3ad852d77c35aae7ddebdbc3b6d35ec8013af7d7dddad0ad911f3d891dae/wrapt-1.17.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:5ea5eb3c0c071862997d6f3e02af1d055f381b1d25b286b9d6644b79db77657c", size = 39172, upload-time = "2025-08-12T05:51:59.365Z" }, - { url = "https://files.pythonhosted.org/packages/c3/f7/c983d2762bcce2326c317c26a6a1e7016f7eb039c27cdf5c4e30f4160f31/wrapt-1.17.3-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:281262213373b6d5e4bb4353bc36d1ba4084e6d6b5d242863721ef2bf2c2930b", size = 87163, upload-time = "2025-08-12T05:52:40.965Z" }, - { url = "https://files.pythonhosted.org/packages/e4/0f/f673f75d489c7f22d17fe0193e84b41540d962f75fce579cf6873167c29b/wrapt-1.17.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:dc4a8d2b25efb6681ecacad42fca8859f88092d8732b170de6a5dddd80a1c8fa", size = 87963, upload-time = "2025-08-12T05:52:20.326Z" }, - { url = "https://files.pythonhosted.org/packages/df/61/515ad6caca68995da2fac7a6af97faab8f78ebe3bf4f761e1b77efbc47b5/wrapt-1.17.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:373342dd05b1d07d752cecbec0c41817231f29f3a89aa8b8843f7b95992ed0c7", size = 86945, upload-time = "2025-08-12T05:52:21.581Z" }, - { url = "https://files.pythonhosted.org/packages/d3/bd/4e70162ce398462a467bc09e768bee112f1412e563620adc353de9055d33/wrapt-1.17.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d40770d7c0fd5cbed9d84b2c3f2e156431a12c9a37dc6284060fb4bec0b7ffd4", size = 86857, upload-time = "2025-08-12T05:52:43.043Z" }, - { url = "https://files.pythonhosted.org/packages/2b/b8/da8560695e9284810b8d3df8a19396a6e40e7518059584a1a394a2b35e0a/wrapt-1.17.3-cp314-cp314-win32.whl", hash = 
"sha256:fbd3c8319de8e1dc79d346929cd71d523622da527cca14e0c1d257e31c2b8b10", size = 37178, upload-time = "2025-08-12T05:53:12.605Z" }, - { url = "https://files.pythonhosted.org/packages/db/c8/b71eeb192c440d67a5a0449aaee2310a1a1e8eca41676046f99ed2487e9f/wrapt-1.17.3-cp314-cp314-win_amd64.whl", hash = "sha256:e1a4120ae5705f673727d3253de3ed0e016f7cd78dc463db1b31e2463e1f3cf6", size = 39310, upload-time = "2025-08-12T05:53:11.106Z" }, - { url = "https://files.pythonhosted.org/packages/45/20/2cda20fd4865fa40f86f6c46ed37a2a8356a7a2fde0773269311f2af56c7/wrapt-1.17.3-cp314-cp314-win_arm64.whl", hash = "sha256:507553480670cab08a800b9463bdb881b2edeed77dc677b0a5915e6106e91a58", size = 37266, upload-time = "2025-08-12T05:52:56.531Z" }, - { url = "https://files.pythonhosted.org/packages/77/ed/dd5cf21aec36c80443c6f900449260b80e2a65cf963668eaef3b9accce36/wrapt-1.17.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:ed7c635ae45cfbc1a7371f708727bf74690daedc49b4dba310590ca0bd28aa8a", size = 56544, upload-time = "2025-08-12T05:51:51.109Z" }, - { url = "https://files.pythonhosted.org/packages/8d/96/450c651cc753877ad100c7949ab4d2e2ecc4d97157e00fa8f45df682456a/wrapt-1.17.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:249f88ed15503f6492a71f01442abddd73856a0032ae860de6d75ca62eed8067", size = 40283, upload-time = "2025-08-12T05:51:39.912Z" }, - { url = "https://files.pythonhosted.org/packages/d1/86/2fcad95994d9b572db57632acb6f900695a648c3e063f2cd344b3f5c5a37/wrapt-1.17.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:5a03a38adec8066d5a37bea22f2ba6bbf39fcdefbe2d91419ab864c3fb515454", size = 40366, upload-time = "2025-08-12T05:52:00.693Z" }, - { url = "https://files.pythonhosted.org/packages/64/0e/f4472f2fdde2d4617975144311f8800ef73677a159be7fe61fa50997d6c0/wrapt-1.17.3-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:5d4478d72eb61c36e5b446e375bbc49ed002430d17cdec3cecb36993398e1a9e", size = 108571, upload-time = 
"2025-08-12T05:52:44.521Z" }, - { url = "https://files.pythonhosted.org/packages/cc/01/9b85a99996b0a97c8a17484684f206cbb6ba73c1ce6890ac668bcf3838fb/wrapt-1.17.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:223db574bb38637e8230eb14b185565023ab624474df94d2af18f1cdb625216f", size = 113094, upload-time = "2025-08-12T05:52:22.618Z" }, - { url = "https://files.pythonhosted.org/packages/25/02/78926c1efddcc7b3aa0bc3d6b33a822f7d898059f7cd9ace8c8318e559ef/wrapt-1.17.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e405adefb53a435f01efa7ccdec012c016b5a1d3f35459990afc39b6be4d5056", size = 110659, upload-time = "2025-08-12T05:52:24.057Z" }, - { url = "https://files.pythonhosted.org/packages/dc/ee/c414501ad518ac3e6fe184753632fe5e5ecacdcf0effc23f31c1e4f7bfcf/wrapt-1.17.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:88547535b787a6c9ce4086917b6e1d291aa8ed914fdd3a838b3539dc95c12804", size = 106946, upload-time = "2025-08-12T05:52:45.976Z" }, - { url = "https://files.pythonhosted.org/packages/be/44/a1bd64b723d13bb151d6cc91b986146a1952385e0392a78567e12149c7b4/wrapt-1.17.3-cp314-cp314t-win32.whl", hash = "sha256:41b1d2bc74c2cac6f9074df52b2efbef2b30bdfe5f40cb78f8ca22963bc62977", size = 38717, upload-time = "2025-08-12T05:53:15.214Z" }, - { url = "https://files.pythonhosted.org/packages/79/d9/7cfd5a312760ac4dd8bf0184a6ee9e43c33e47f3dadc303032ce012b8fa3/wrapt-1.17.3-cp314-cp314t-win_amd64.whl", hash = "sha256:73d496de46cd2cdbdbcce4ae4bcdb4afb6a11234a1df9c085249d55166b95116", size = 41334, upload-time = "2025-08-12T05:53:14.178Z" }, - { url = "https://files.pythonhosted.org/packages/46/78/10ad9781128ed2f99dbc474f43283b13fea8ba58723e98844367531c18e9/wrapt-1.17.3-cp314-cp314t-win_arm64.whl", hash = "sha256:f38e60678850c42461d4202739f9bf1e3a737c7ad283638251e79cc49effb6b6", size = 38471, upload-time = "2025-08-12T05:52:57.784Z" }, - { url = 
"https://files.pythonhosted.org/packages/1f/f6/a933bd70f98e9cf3e08167fc5cd7aaaca49147e48411c0bd5ae701bb2194/wrapt-1.17.3-py3-none-any.whl", hash = "sha256:7171ae35d2c33d326ac19dd8facb1e82e5fd04ef8c6c0e394d7af55a55051c22", size = 23591, upload-time = "2025-08-12T05:53:20.674Z" }, -] - -[[package]] -name = "yarl" -version = "1.20.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "idna" }, - { name = "multidict" }, - { name = "propcache" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/3c/fb/efaa23fa4e45537b827620f04cf8f3cd658b76642205162e072703a5b963/yarl-1.20.1.tar.gz", hash = "sha256:d017a4997ee50c91fd5466cef416231bb82177b93b029906cefc542ce14c35ac", size = 186428, upload-time = "2025-06-10T00:46:09.923Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/b1/18/893b50efc2350e47a874c5c2d67e55a0ea5df91186b2a6f5ac52eff887cd/yarl-1.20.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:47ee6188fea634bdfaeb2cc420f5b3b17332e6225ce88149a17c413c77ff269e", size = 133833, upload-time = "2025-06-10T00:43:07.393Z" }, - { url = "https://files.pythonhosted.org/packages/89/ed/b8773448030e6fc47fa797f099ab9eab151a43a25717f9ac043844ad5ea3/yarl-1.20.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d0f6500f69e8402d513e5eedb77a4e1818691e8f45e6b687147963514d84b44b", size = 91070, upload-time = "2025-06-10T00:43:09.538Z" }, - { url = "https://files.pythonhosted.org/packages/e3/e3/409bd17b1e42619bf69f60e4f031ce1ccb29bd7380117a55529e76933464/yarl-1.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7a8900a42fcdaad568de58887c7b2f602962356908eedb7628eaf6021a6e435b", size = 89818, upload-time = "2025-06-10T00:43:11.575Z" }, - { url = "https://files.pythonhosted.org/packages/f8/77/64d8431a4d77c856eb2d82aa3de2ad6741365245a29b3a9543cd598ed8c5/yarl-1.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bad6d131fda8ef508b36be3ece16d0902e80b88ea7200f030a0f6c11d9e508d4", size = 347003, 
upload-time = "2025-06-10T00:43:14.088Z" }, - { url = "https://files.pythonhosted.org/packages/8d/d2/0c7e4def093dcef0bd9fa22d4d24b023788b0a33b8d0088b51aa51e21e99/yarl-1.20.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:df018d92fe22aaebb679a7f89fe0c0f368ec497e3dda6cb81a567610f04501f1", size = 336537, upload-time = "2025-06-10T00:43:16.431Z" }, - { url = "https://files.pythonhosted.org/packages/f0/f3/fc514f4b2cf02cb59d10cbfe228691d25929ce8f72a38db07d3febc3f706/yarl-1.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8f969afbb0a9b63c18d0feecf0db09d164b7a44a053e78a7d05f5df163e43833", size = 362358, upload-time = "2025-06-10T00:43:18.704Z" }, - { url = "https://files.pythonhosted.org/packages/ea/6d/a313ac8d8391381ff9006ac05f1d4331cee3b1efaa833a53d12253733255/yarl-1.20.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:812303eb4aa98e302886ccda58d6b099e3576b1b9276161469c25803a8db277d", size = 357362, upload-time = "2025-06-10T00:43:20.888Z" }, - { url = "https://files.pythonhosted.org/packages/00/70/8f78a95d6935a70263d46caa3dd18e1f223cf2f2ff2037baa01a22bc5b22/yarl-1.20.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98c4a7d166635147924aa0bf9bfe8d8abad6fffa6102de9c99ea04a1376f91e8", size = 348979, upload-time = "2025-06-10T00:43:23.169Z" }, - { url = "https://files.pythonhosted.org/packages/cb/05/42773027968968f4f15143553970ee36ead27038d627f457cc44bbbeecf3/yarl-1.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:12e768f966538e81e6e7550f9086a6236b16e26cd964cf4df35349970f3551cf", size = 337274, upload-time = "2025-06-10T00:43:27.111Z" }, - { url = "https://files.pythonhosted.org/packages/05/be/665634aa196954156741ea591d2f946f1b78ceee8bb8f28488bf28c0dd62/yarl-1.20.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = 
"sha256:fe41919b9d899661c5c28a8b4b0acf704510b88f27f0934ac7a7bebdd8938d5e", size = 363294, upload-time = "2025-06-10T00:43:28.96Z" }, - { url = "https://files.pythonhosted.org/packages/eb/90/73448401d36fa4e210ece5579895731f190d5119c4b66b43b52182e88cd5/yarl-1.20.1-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:8601bc010d1d7780592f3fc1bdc6c72e2b6466ea34569778422943e1a1f3c389", size = 358169, upload-time = "2025-06-10T00:43:30.701Z" }, - { url = "https://files.pythonhosted.org/packages/c3/b0/fce922d46dc1eb43c811f1889f7daa6001b27a4005587e94878570300881/yarl-1.20.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:daadbdc1f2a9033a2399c42646fbd46da7992e868a5fe9513860122d7fe7a73f", size = 362776, upload-time = "2025-06-10T00:43:32.51Z" }, - { url = "https://files.pythonhosted.org/packages/f1/0d/b172628fce039dae8977fd22caeff3eeebffd52e86060413f5673767c427/yarl-1.20.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:03aa1e041727cb438ca762628109ef1333498b122e4c76dd858d186a37cec845", size = 381341, upload-time = "2025-06-10T00:43:34.543Z" }, - { url = "https://files.pythonhosted.org/packages/6b/9b/5b886d7671f4580209e855974fe1cecec409aa4a89ea58b8f0560dc529b1/yarl-1.20.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:642980ef5e0fa1de5fa96d905c7e00cb2c47cb468bfcac5a18c58e27dbf8d8d1", size = 379988, upload-time = "2025-06-10T00:43:36.489Z" }, - { url = "https://files.pythonhosted.org/packages/73/be/75ef5fd0fcd8f083a5d13f78fd3f009528132a1f2a1d7c925c39fa20aa79/yarl-1.20.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:86971e2795584fe8c002356d3b97ef6c61862720eeff03db2a7c86b678d85b3e", size = 371113, upload-time = "2025-06-10T00:43:38.592Z" }, - { url = "https://files.pythonhosted.org/packages/50/4f/62faab3b479dfdcb741fe9e3f0323e2a7d5cd1ab2edc73221d57ad4834b2/yarl-1.20.1-cp311-cp311-win32.whl", hash = "sha256:597f40615b8d25812f14562699e287f0dcc035d25eb74da72cae043bb884d773", size = 81485, upload-time = "2025-06-10T00:43:41.038Z" }, - { url = 
"https://files.pythonhosted.org/packages/f0/09/d9c7942f8f05c32ec72cd5c8e041c8b29b5807328b68b4801ff2511d4d5e/yarl-1.20.1-cp311-cp311-win_amd64.whl", hash = "sha256:26ef53a9e726e61e9cd1cda6b478f17e350fb5800b4bd1cd9fe81c4d91cfeb2e", size = 86686, upload-time = "2025-06-10T00:43:42.692Z" }, - { url = "https://files.pythonhosted.org/packages/5f/9a/cb7fad7d73c69f296eda6815e4a2c7ed53fc70c2f136479a91c8e5fbdb6d/yarl-1.20.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdcc4cd244e58593a4379fe60fdee5ac0331f8eb70320a24d591a3be197b94a9", size = 133667, upload-time = "2025-06-10T00:43:44.369Z" }, - { url = "https://files.pythonhosted.org/packages/67/38/688577a1cb1e656e3971fb66a3492501c5a5df56d99722e57c98249e5b8a/yarl-1.20.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b29a2c385a5f5b9c7d9347e5812b6f7ab267193c62d282a540b4fc528c8a9d2a", size = 91025, upload-time = "2025-06-10T00:43:46.295Z" }, - { url = "https://files.pythonhosted.org/packages/50/ec/72991ae51febeb11a42813fc259f0d4c8e0507f2b74b5514618d8b640365/yarl-1.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1112ae8154186dfe2de4732197f59c05a83dc814849a5ced892b708033f40dc2", size = 89709, upload-time = "2025-06-10T00:43:48.22Z" }, - { url = "https://files.pythonhosted.org/packages/99/da/4d798025490e89426e9f976702e5f9482005c548c579bdae792a4c37769e/yarl-1.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:90bbd29c4fe234233f7fa2b9b121fb63c321830e5d05b45153a2ca68f7d310ee", size = 352287, upload-time = "2025-06-10T00:43:49.924Z" }, - { url = "https://files.pythonhosted.org/packages/1a/26/54a15c6a567aac1c61b18aa0f4b8aa2e285a52d547d1be8bf48abe2b3991/yarl-1.20.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:680e19c7ce3710ac4cd964e90dad99bf9b5029372ba0c7cbfcd55e54d90ea819", size = 345429, upload-time = "2025-06-10T00:43:51.7Z" }, - { url = 
"https://files.pythonhosted.org/packages/d6/95/9dcf2386cb875b234353b93ec43e40219e14900e046bf6ac118f94b1e353/yarl-1.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4a979218c1fdb4246a05efc2cc23859d47c89af463a90b99b7c56094daf25a16", size = 365429, upload-time = "2025-06-10T00:43:53.494Z" }, - { url = "https://files.pythonhosted.org/packages/91/b2/33a8750f6a4bc224242a635f5f2cff6d6ad5ba651f6edcccf721992c21a0/yarl-1.20.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255b468adf57b4a7b65d8aad5b5138dce6a0752c139965711bdcb81bc370e1b6", size = 363862, upload-time = "2025-06-10T00:43:55.766Z" }, - { url = "https://files.pythonhosted.org/packages/98/28/3ab7acc5b51f4434b181b0cee8f1f4b77a65919700a355fb3617f9488874/yarl-1.20.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a97d67108e79cfe22e2b430d80d7571ae57d19f17cda8bb967057ca8a7bf5bfd", size = 355616, upload-time = "2025-06-10T00:43:58.056Z" }, - { url = "https://files.pythonhosted.org/packages/36/a3/f666894aa947a371724ec7cd2e5daa78ee8a777b21509b4252dd7bd15e29/yarl-1.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8570d998db4ddbfb9a590b185a0a33dbf8aafb831d07a5257b4ec9948df9cb0a", size = 339954, upload-time = "2025-06-10T00:43:59.773Z" }, - { url = "https://files.pythonhosted.org/packages/f1/81/5f466427e09773c04219d3450d7a1256138a010b6c9f0af2d48565e9ad13/yarl-1.20.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:97c75596019baae7c71ccf1d8cc4738bc08134060d0adfcbe5642f778d1dca38", size = 365575, upload-time = "2025-06-10T00:44:02.051Z" }, - { url = "https://files.pythonhosted.org/packages/2e/e3/e4b0ad8403e97e6c9972dd587388940a032f030ebec196ab81a3b8e94d31/yarl-1.20.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:1c48912653e63aef91ff988c5432832692ac5a1d8f0fb8a33091520b5bbe19ef", size = 365061, upload-time = "2025-06-10T00:44:04.196Z" }, - { url = 
"https://files.pythonhosted.org/packages/ac/99/b8a142e79eb86c926f9f06452eb13ecb1bb5713bd01dc0038faf5452e544/yarl-1.20.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:4c3ae28f3ae1563c50f3d37f064ddb1511ecc1d5584e88c6b7c63cf7702a6d5f", size = 364142, upload-time = "2025-06-10T00:44:06.527Z" }, - { url = "https://files.pythonhosted.org/packages/34/f2/08ed34a4a506d82a1a3e5bab99ccd930a040f9b6449e9fd050320e45845c/yarl-1.20.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c5e9642f27036283550f5f57dc6156c51084b458570b9d0d96100c8bebb186a8", size = 381894, upload-time = "2025-06-10T00:44:08.379Z" }, - { url = "https://files.pythonhosted.org/packages/92/f8/9a3fbf0968eac704f681726eff595dce9b49c8a25cd92bf83df209668285/yarl-1.20.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:2c26b0c49220d5799f7b22c6838409ee9bc58ee5c95361a4d7831f03cc225b5a", size = 383378, upload-time = "2025-06-10T00:44:10.51Z" }, - { url = "https://files.pythonhosted.org/packages/af/85/9363f77bdfa1e4d690957cd39d192c4cacd1c58965df0470a4905253b54f/yarl-1.20.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:564ab3d517e3d01c408c67f2e5247aad4019dcf1969982aba3974b4093279004", size = 374069, upload-time = "2025-06-10T00:44:12.834Z" }, - { url = "https://files.pythonhosted.org/packages/35/99/9918c8739ba271dcd935400cff8b32e3cd319eaf02fcd023d5dcd487a7c8/yarl-1.20.1-cp312-cp312-win32.whl", hash = "sha256:daea0d313868da1cf2fac6b2d3a25c6e3a9e879483244be38c8e6a41f1d876a5", size = 81249, upload-time = "2025-06-10T00:44:14.731Z" }, - { url = "https://files.pythonhosted.org/packages/eb/83/5d9092950565481b413b31a23e75dd3418ff0a277d6e0abf3729d4d1ce25/yarl-1.20.1-cp312-cp312-win_amd64.whl", hash = "sha256:48ea7d7f9be0487339828a4de0360d7ce0efc06524a48e1810f945c45b813698", size = 86710, upload-time = "2025-06-10T00:44:16.716Z" }, - { url = "https://files.pythonhosted.org/packages/8a/e1/2411b6d7f769a07687acee88a062af5833cf1966b7266f3d8dfb3d3dc7d3/yarl-1.20.1-cp313-cp313-macosx_10_13_universal2.whl", hash = 
"sha256:0b5ff0fbb7c9f1b1b5ab53330acbfc5247893069e7716840c8e7d5bb7355038a", size = 131811, upload-time = "2025-06-10T00:44:18.933Z" }, - { url = "https://files.pythonhosted.org/packages/b2/27/584394e1cb76fb771371770eccad35de400e7b434ce3142c2dd27392c968/yarl-1.20.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:14f326acd845c2b2e2eb38fb1346c94f7f3b01a4f5c788f8144f9b630bfff9a3", size = 90078, upload-time = "2025-06-10T00:44:20.635Z" }, - { url = "https://files.pythonhosted.org/packages/bf/9a/3246ae92d4049099f52d9b0fe3486e3b500e29b7ea872d0f152966fc209d/yarl-1.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f60e4ad5db23f0b96e49c018596707c3ae89f5d0bd97f0ad3684bcbad899f1e7", size = 88748, upload-time = "2025-06-10T00:44:22.34Z" }, - { url = "https://files.pythonhosted.org/packages/a3/25/35afe384e31115a1a801fbcf84012d7a066d89035befae7c5d4284df1e03/yarl-1.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:49bdd1b8e00ce57e68ba51916e4bb04461746e794e7c4d4bbc42ba2f18297691", size = 349595, upload-time = "2025-06-10T00:44:24.314Z" }, - { url = "https://files.pythonhosted.org/packages/28/2d/8aca6cb2cabc8f12efcb82749b9cefecbccfc7b0384e56cd71058ccee433/yarl-1.20.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:66252d780b45189975abfed839616e8fd2dbacbdc262105ad7742c6ae58f3e31", size = 342616, upload-time = "2025-06-10T00:44:26.167Z" }, - { url = "https://files.pythonhosted.org/packages/0b/e9/1312633d16b31acf0098d30440ca855e3492d66623dafb8e25b03d00c3da/yarl-1.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:59174e7332f5d153d8f7452a102b103e2e74035ad085f404df2e40e663a22b28", size = 361324, upload-time = "2025-06-10T00:44:27.915Z" }, - { url = "https://files.pythonhosted.org/packages/bc/a0/688cc99463f12f7669eec7c8acc71ef56a1521b99eab7cd3abb75af887b0/yarl-1.20.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:e3968ec7d92a0c0f9ac34d5ecfd03869ec0cab0697c91a45db3fbbd95fe1b653", size = 359676, upload-time = "2025-06-10T00:44:30.041Z" }, - { url = "https://files.pythonhosted.org/packages/af/44/46407d7f7a56e9a85a4c207724c9f2c545c060380718eea9088f222ba697/yarl-1.20.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d1a4fbb50e14396ba3d375f68bfe02215d8e7bc3ec49da8341fe3157f59d2ff5", size = 352614, upload-time = "2025-06-10T00:44:32.171Z" }, - { url = "https://files.pythonhosted.org/packages/b1/91/31163295e82b8d5485d31d9cf7754d973d41915cadce070491778d9c9825/yarl-1.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11a62c839c3a8eac2410e951301309426f368388ff2f33799052787035793b02", size = 336766, upload-time = "2025-06-10T00:44:34.494Z" }, - { url = "https://files.pythonhosted.org/packages/b4/8e/c41a5bc482121f51c083c4c2bcd16b9e01e1cf8729e380273a952513a21f/yarl-1.20.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:041eaa14f73ff5a8986b4388ac6bb43a77f2ea09bf1913df7a35d4646db69e53", size = 364615, upload-time = "2025-06-10T00:44:36.856Z" }, - { url = "https://files.pythonhosted.org/packages/e3/5b/61a3b054238d33d70ea06ebba7e58597891b71c699e247df35cc984ab393/yarl-1.20.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:377fae2fef158e8fd9d60b4c8751387b8d1fb121d3d0b8e9b0be07d1b41e83dc", size = 360982, upload-time = "2025-06-10T00:44:39.141Z" }, - { url = "https://files.pythonhosted.org/packages/df/a3/6a72fb83f8d478cb201d14927bc8040af901811a88e0ff2da7842dd0ed19/yarl-1.20.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:1c92f4390e407513f619d49319023664643d3339bd5e5a56a3bebe01bc67ec04", size = 369792, upload-time = "2025-06-10T00:44:40.934Z" }, - { url = "https://files.pythonhosted.org/packages/7c/af/4cc3c36dfc7c077f8dedb561eb21f69e1e9f2456b91b593882b0b18c19dc/yarl-1.20.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = 
"sha256:d25ddcf954df1754ab0f86bb696af765c5bfaba39b74095f27eececa049ef9a4", size = 382049, upload-time = "2025-06-10T00:44:42.854Z" }, - { url = "https://files.pythonhosted.org/packages/19/3a/e54e2c4752160115183a66dc9ee75a153f81f3ab2ba4bf79c3c53b33de34/yarl-1.20.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:909313577e9619dcff8c31a0ea2aa0a2a828341d92673015456b3ae492e7317b", size = 384774, upload-time = "2025-06-10T00:44:45.275Z" }, - { url = "https://files.pythonhosted.org/packages/9c/20/200ae86dabfca89060ec6447649f219b4cbd94531e425e50d57e5f5ac330/yarl-1.20.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:793fd0580cb9664548c6b83c63b43c477212c0260891ddf86809e1c06c8b08f1", size = 374252, upload-time = "2025-06-10T00:44:47.31Z" }, - { url = "https://files.pythonhosted.org/packages/83/75/11ee332f2f516b3d094e89448da73d557687f7d137d5a0f48c40ff211487/yarl-1.20.1-cp313-cp313-win32.whl", hash = "sha256:468f6e40285de5a5b3c44981ca3a319a4b208ccc07d526b20b12aeedcfa654b7", size = 81198, upload-time = "2025-06-10T00:44:49.164Z" }, - { url = "https://files.pythonhosted.org/packages/ba/ba/39b1ecbf51620b40ab402b0fc817f0ff750f6d92712b44689c2c215be89d/yarl-1.20.1-cp313-cp313-win_amd64.whl", hash = "sha256:495b4ef2fea40596bfc0affe3837411d6aa3371abcf31aac0ccc4bdd64d4ef5c", size = 86346, upload-time = "2025-06-10T00:44:51.182Z" }, - { url = "https://files.pythonhosted.org/packages/43/c7/669c52519dca4c95153c8ad96dd123c79f354a376346b198f438e56ffeb4/yarl-1.20.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:f60233b98423aab21d249a30eb27c389c14929f47be8430efa7dbd91493a729d", size = 138826, upload-time = "2025-06-10T00:44:52.883Z" }, - { url = "https://files.pythonhosted.org/packages/6a/42/fc0053719b44f6ad04a75d7f05e0e9674d45ef62f2d9ad2c1163e5c05827/yarl-1.20.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:6f3eff4cc3f03d650d8755c6eefc844edde99d641d0dcf4da3ab27141a5f8ddf", size = 93217, upload-time = "2025-06-10T00:44:54.658Z" }, - { url = 
"https://files.pythonhosted.org/packages/4f/7f/fa59c4c27e2a076bba0d959386e26eba77eb52ea4a0aac48e3515c186b4c/yarl-1.20.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:69ff8439d8ba832d6bed88af2c2b3445977eba9a4588b787b32945871c2444e3", size = 92700, upload-time = "2025-06-10T00:44:56.784Z" }, - { url = "https://files.pythonhosted.org/packages/2f/d4/062b2f48e7c93481e88eff97a6312dca15ea200e959f23e96d8ab898c5b8/yarl-1.20.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3cf34efa60eb81dd2645a2e13e00bb98b76c35ab5061a3989c7a70f78c85006d", size = 347644, upload-time = "2025-06-10T00:44:59.071Z" }, - { url = "https://files.pythonhosted.org/packages/89/47/78b7f40d13c8f62b499cc702fdf69e090455518ae544c00a3bf4afc9fc77/yarl-1.20.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:8e0fe9364ad0fddab2688ce72cb7a8e61ea42eff3c7caeeb83874a5d479c896c", size = 323452, upload-time = "2025-06-10T00:45:01.605Z" }, - { url = "https://files.pythonhosted.org/packages/eb/2b/490d3b2dc66f52987d4ee0d3090a147ea67732ce6b4d61e362c1846d0d32/yarl-1.20.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8f64fbf81878ba914562c672024089e3401974a39767747691c65080a67b18c1", size = 346378, upload-time = "2025-06-10T00:45:03.946Z" }, - { url = "https://files.pythonhosted.org/packages/66/ad/775da9c8a94ce925d1537f939a4f17d782efef1f973039d821cbe4bcc211/yarl-1.20.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f6342d643bf9a1de97e512e45e4b9560a043347e779a173250824f8b254bd5ce", size = 353261, upload-time = "2025-06-10T00:45:05.992Z" }, - { url = "https://files.pythonhosted.org/packages/4b/23/0ed0922b47a4f5c6eb9065d5ff1e459747226ddce5c6a4c111e728c9f701/yarl-1.20.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:56dac5f452ed25eef0f6e3c6a066c6ab68971d96a9fb441791cad0efba6140d3", size = 335987, upload-time = "2025-06-10T00:45:08.227Z" }, - { url = 
"https://files.pythonhosted.org/packages/3e/49/bc728a7fe7d0e9336e2b78f0958a2d6b288ba89f25a1762407a222bf53c3/yarl-1.20.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7d7f497126d65e2cad8dc5f97d34c27b19199b6414a40cb36b52f41b79014be", size = 329361, upload-time = "2025-06-10T00:45:10.11Z" }, - { url = "https://files.pythonhosted.org/packages/93/8f/b811b9d1f617c83c907e7082a76e2b92b655400e61730cd61a1f67178393/yarl-1.20.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:67e708dfb8e78d8a19169818eeb5c7a80717562de9051bf2413aca8e3696bf16", size = 346460, upload-time = "2025-06-10T00:45:12.055Z" }, - { url = "https://files.pythonhosted.org/packages/70/fd/af94f04f275f95da2c3b8b5e1d49e3e79f1ed8b6ceb0f1664cbd902773ff/yarl-1.20.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:595c07bc79af2494365cc96ddeb772f76272364ef7c80fb892ef9d0649586513", size = 334486, upload-time = "2025-06-10T00:45:13.995Z" }, - { url = "https://files.pythonhosted.org/packages/84/65/04c62e82704e7dd0a9b3f61dbaa8447f8507655fd16c51da0637b39b2910/yarl-1.20.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:7bdd2f80f4a7df852ab9ab49484a4dee8030023aa536df41f2d922fd57bf023f", size = 342219, upload-time = "2025-06-10T00:45:16.479Z" }, - { url = "https://files.pythonhosted.org/packages/91/95/459ca62eb958381b342d94ab9a4b6aec1ddec1f7057c487e926f03c06d30/yarl-1.20.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:c03bfebc4ae8d862f853a9757199677ab74ec25424d0ebd68a0027e9c639a390", size = 350693, upload-time = "2025-06-10T00:45:18.399Z" }, - { url = "https://files.pythonhosted.org/packages/a6/00/d393e82dd955ad20617abc546a8f1aee40534d599ff555ea053d0ec9bf03/yarl-1.20.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:344d1103e9c1523f32a5ed704d576172d2cabed3122ea90b1d4e11fe17c66458", size = 355803, upload-time = "2025-06-10T00:45:20.677Z" }, - { url = 
"https://files.pythonhosted.org/packages/9e/ed/c5fb04869b99b717985e244fd93029c7a8e8febdfcffa06093e32d7d44e7/yarl-1.20.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:88cab98aa4e13e1ade8c141daeedd300a4603b7132819c484841bb7af3edce9e", size = 341709, upload-time = "2025-06-10T00:45:23.221Z" }, - { url = "https://files.pythonhosted.org/packages/24/fd/725b8e73ac2a50e78a4534ac43c6addf5c1c2d65380dd48a9169cc6739a9/yarl-1.20.1-cp313-cp313t-win32.whl", hash = "sha256:b121ff6a7cbd4abc28985b6028235491941b9fe8fe226e6fdc539c977ea1739d", size = 86591, upload-time = "2025-06-10T00:45:25.793Z" }, - { url = "https://files.pythonhosted.org/packages/94/c3/b2e9f38bc3e11191981d57ea08cab2166e74ea770024a646617c9cddd9f6/yarl-1.20.1-cp313-cp313t-win_amd64.whl", hash = "sha256:541d050a355bbbc27e55d906bc91cb6fe42f96c01413dd0f4ed5a5240513874f", size = 93003, upload-time = "2025-06-10T00:45:27.752Z" }, - { url = "https://files.pythonhosted.org/packages/b4/2d/2345fce04cfd4bee161bf1e7d9cdc702e3e16109021035dbb24db654a622/yarl-1.20.1-py3-none-any.whl", hash = "sha256:83b8eb083fe4683c6115795d9fc1cfaf2cbbefb19b3a1cb68f6527460f483a77", size = 46542, upload-time = "2025-06-10T00:46:07.521Z" }, -] - -[[package]] -name = "zipp" -version = "3.23.0" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e3/02/0f2892c661036d50ede074e376733dca2ae7c6eb617489437771209d4180/zipp-3.23.0.tar.gz", hash = "sha256:a07157588a12518c9d4034df3fbbee09c814741a33ff63c05fa29d26a2404166", size = 25547, upload-time = "2025-06-08T17:06:39.4Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/2e/54/647ade08bf0db230bfea292f893923872fd20be6ac6f53b2b936ba839d75/zipp-3.23.0-py3-none-any.whl", hash = "sha256:071652d6115ed432f5ce1d34c336c0adfd6a884660d1e9712a256d3d3bd4b14e", size = 10276, upload-time = "2025-06-08T17:06:38.034Z" }, -] diff --git a/docker-compose.ci.yml b/docker-compose.ci.yml deleted file mode 100644 index a285b2a..0000000 --- 
a/docker-compose.ci.yml +++ /dev/null @@ -1,110 +0,0 @@ -# Docker Compose Override for CI/CD Environments -# -# This file optimizes FuzzForge for ephemeral CI/CD environments where: -# - Data persistence is not needed -# - Fast startup is critical -# - Disk I/O can be bypassed -# -# Usage: -# docker-compose -f docker-compose.yml -f docker-compose.ci.yml up -d -# -# Benefits: -# - Faster startup (tmpfs instead of volumes) -# - Reduced disk I/O -# - Automatic cleanup (no persistent data) -# -# WARNING: All data is lost when containers stop! - -version: '3.8' - -services: - # Temporal - Use in-memory storage and faster health checks - temporal: - environment: - # Skip some init steps for faster startup - - SKIP_DEFAULT_NAMESPACE_CREATION=false - healthcheck: - # More aggressive health checking for faster feedback - interval: 5s - timeout: 3s - retries: 15 - restart: "no" # Don't restart in CI - - # PostgreSQL - Use in-memory storage and disable durability features - postgresql: - command: > - postgres - -c fsync=off - -c full_page_writes=off - -c synchronous_commit=off - -c wal_level=minimal - -c max_wal_senders=0 - tmpfs: - # Store database in RAM (fast, but ephemeral) - - /var/lib/postgresql/data - healthcheck: - interval: 3s - timeout: 3s - retries: 10 - restart: "no" - - # MinIO - Use in-memory storage - minio: - environment: - # Already set in main compose, but ensure CI mode is enabled - - MINIO_CI_CD=true - tmpfs: - # Store objects in RAM - - /data - healthcheck: - interval: 3s - timeout: 3s - retries: 10 - restart: "no" - - # Backend - Optimize for CI - backend: - environment: - # Add CI-specific environment variables if needed - - CI=true - - LOG_LEVEL=WARNING # Reduce log noise - healthcheck: - interval: 5s - timeout: 3s - retries: 15 - restart: "no" - - # Temporal UI - Disable in CI (not needed, saves resources) - temporal-ui: - profiles: - - ui # Don't start unless explicitly requested - - # MinIO Setup - Speed up bucket creation - minio-setup: - restart: 
"no" - -# Volumes - Use tmpfs for all persistent data in CI -# Note: This overrides the named volumes with in-memory storage -volumes: - temporal_data: - driver: local - driver_opts: - type: tmpfs - device: tmpfs - - temporal_postgres: - driver: local - driver_opts: - type: tmpfs - device: tmpfs - - minio_data: - driver: local - driver_opts: - type: tmpfs - device: tmpfs - -# Networks - Keep the same -networks: - fuzzforge-network: - driver: bridge diff --git a/docker-compose.yml b/docker-compose.yml deleted file mode 100644 index aae0fb5..0000000 --- a/docker-compose.yml +++ /dev/null @@ -1,710 +0,0 @@ -# FuzzForge AI - Temporal Architecture with Vertical Workers -# -# This is the new architecture using: -# - Temporal for workflow orchestration -# - MinIO for unified storage (dev + prod) -# - Vertical workers with pre-built toolchains -# -# Usage: -# Development: docker-compose -f docker-compose.temporal.yaml up -# Production: docker-compose -f docker-compose.temporal.yaml -f docker-compose.temporal.prod.yaml up - -services: - # ============================================================================ - # Temporal Server - Workflow Orchestration - # ============================================================================ - temporal: - image: temporalio/auto-setup:latest - container_name: fuzzforge-temporal - depends_on: - - postgresql - ports: - - "7233:7233" # gRPC API - environment: - # Database configuration - - DB=postgres12 - - DB_PORT=5432 - - POSTGRES_USER=temporal - - POSTGRES_PWD=temporal - - POSTGRES_SEEDS=postgresql - # Temporal configuration (no custom dynamic config) - - ENABLE_ES=false - - ES_SEEDS= - # Address configuration - - TEMPORAL_ADDRESS=temporal:7233 - - TEMPORAL_CLI_ADDRESS=temporal:7233 - volumes: - - temporal_data:/etc/temporal - networks: - - fuzzforge-network - healthcheck: - test: ["CMD", "tctl", "--address", "temporal:7233", "cluster", "health"] - interval: 10s - timeout: 5s - retries: 5 - restart: unless-stopped - - # 
============================================================================ - # Temporal UI - Web Interface - # ============================================================================ - temporal-ui: - image: temporalio/ui:latest - container_name: fuzzforge-temporal-ui - depends_on: - - temporal - ports: - - "8080:8080" # Web UI (http://localhost:8080) - environment: - - TEMPORAL_ADDRESS=temporal:7233 - - TEMPORAL_CORS_ORIGINS=http://localhost:8080 - networks: - - fuzzforge-network - restart: unless-stopped - - # ============================================================================ - # Temporal Database - PostgreSQL (lightweight for dev) - # ============================================================================ - postgresql: - image: postgres:14-alpine - container_name: fuzzforge-temporal-postgresql - environment: - POSTGRES_USER: temporal - POSTGRES_PASSWORD: temporal - POSTGRES_DB: temporal - volumes: - - temporal_postgres:/var/lib/postgresql/data - networks: - - fuzzforge-network - healthcheck: - test: ["CMD-SHELL", "pg_isready -U temporal"] - interval: 5s - timeout: 5s - retries: 5 - restart: unless-stopped - - # ============================================================================ - # MinIO - S3-Compatible Object Storage - # ============================================================================ - minio: - image: minio/minio:latest - container_name: fuzzforge-minio - command: server /data --console-address ":9001" - ports: - - "9000:9000" # S3 API - - "9001:9001" # Web Console (http://localhost:9001) - environment: - MINIO_ROOT_USER: fuzzforge - MINIO_ROOT_PASSWORD: fuzzforge123 - # Lightweight mode for development (reduces memory to 256MB) - MINIO_CI_CD: "true" - volumes: - - minio_data:/data - networks: - - fuzzforge-network - healthcheck: - test: ["CMD", "curl", "-f", "http://localhost:9000/minio/health/live"] - interval: 5s - timeout: 5s - retries: 5 - restart: unless-stopped - - # 
============================================================================ - # MinIO Setup - Create Buckets and Lifecycle Policies - # ============================================================================ - minio-setup: - image: minio/mc:latest - container_name: fuzzforge-minio-setup - depends_on: - minio: - condition: service_healthy - entrypoint: > - /bin/sh -c " - echo 'Waiting for MinIO to be ready...'; - sleep 2; - - echo 'Setting up MinIO alias...'; - mc alias set fuzzforge http://minio:9000 fuzzforge fuzzforge123; - - echo 'Creating buckets...'; - mc mb fuzzforge/targets --ignore-existing; - mc mb fuzzforge/results --ignore-existing; - mc mb fuzzforge/cache --ignore-existing; - - echo 'Setting lifecycle policies...'; - mc ilm add fuzzforge/targets --expiry-days 7; - mc ilm add fuzzforge/results --expiry-days 30; - mc ilm add fuzzforge/cache --expiry-days 3; - - echo 'Setting access policies...'; - mc anonymous set download fuzzforge/results; - - echo 'MinIO setup complete!'; - exit 0; - " - networks: - - fuzzforge-network - - # ============================================================================ - # LLM Proxy - LiteLLM Gateway - # ============================================================================ - llm-proxy: - image: ghcr.io/berriai/litellm:main-stable - container_name: fuzzforge-llm-proxy - depends_on: - llm-proxy-db: - condition: service_healthy - otel-collector: - condition: service_started - env_file: - - ./volumes/env/.env - environment: - PORT: 4000 - DATABASE_URL: postgresql://litellm:litellm@llm-proxy-db:5432/litellm - STORE_MODEL_IN_DB: "True" - UI_USERNAME: ${UI_USERNAME:-fuzzforge} - UI_PASSWORD: ${UI_PASSWORD:-fuzzforge123} - OTEL_EXPORTER_OTLP_ENDPOINT: http://otel-collector:4317 - OTEL_EXPORTER_OTLP_PROTOCOL: grpc - ANTHROPIC_API_KEY: ${LITELLM_ANTHROPIC_API_KEY:-} - OPENAI_API_KEY: ${LITELLM_OPENAI_API_KEY:-} - command: - - "--config" - - "/etc/litellm/proxy_config.yaml" - ports: - - "10999:4000" # Web UI + 
OpenAI-compatible API - volumes: - - litellm_proxy_data:/var/lib/litellm - - ./volumes/litellm/proxy_config.yaml:/etc/litellm/proxy_config.yaml:ro - networks: - - fuzzforge-network - healthcheck: - test: ["CMD-SHELL", "wget --no-verbose --tries=1 http://localhost:4000/health/liveliness || exit 1"] - interval: 30s - timeout: 10s - retries: 3 - start_period: 40s - restart: unless-stopped - - otel-collector: - image: otel/opentelemetry-collector:latest - container_name: fuzzforge-otel-collector - command: ["--config=/etc/otel-collector/config.yaml"] - volumes: - - ./volumes/otel/collector-config.yaml:/etc/otel-collector/config.yaml:ro - ports: - - "4317:4317" - - "4318:4318" - networks: - - fuzzforge-network - restart: unless-stopped - - llm-proxy-db: - image: postgres:16 - container_name: fuzzforge-llm-proxy-db - environment: - POSTGRES_DB: litellm - POSTGRES_USER: litellm - POSTGRES_PASSWORD: litellm - healthcheck: - test: ["CMD-SHELL", "pg_isready -d litellm -U litellm"] - interval: 5s - timeout: 5s - retries: 12 - volumes: - - litellm_proxy_db:/var/lib/postgresql/data - networks: - - fuzzforge-network - restart: unless-stopped - - # ============================================================================ - # LLM Proxy Bootstrap - Seed providers and virtual keys - # ============================================================================ - llm-proxy-bootstrap: - image: python:3.11-slim - container_name: fuzzforge-llm-proxy-bootstrap - depends_on: - llm-proxy: - condition: service_started - env_file: - - ./volumes/env/.env - environment: - PROXY_BASE_URL: http://llm-proxy:4000 - ENV_FILE_PATH: /bootstrap/env/.env - UI_USERNAME: ${UI_USERNAME:-fuzzforge} - UI_PASSWORD: ${UI_PASSWORD:-fuzzforge123} - volumes: - - ./docker/scripts/bootstrap_llm_proxy.py:/app/bootstrap.py:ro - - ./volumes/env:/bootstrap/env - - litellm_proxy_data:/bootstrap/data - networks: - - fuzzforge-network - command: ["python", "/app/bootstrap.py"] - restart: "no" - - # 
============================================================================ - # Vertical Worker: Rust/Native Security - # ============================================================================ - # This is a template/example worker. In production, you'll have multiple - # vertical workers (android, rust, web, ios, blockchain, etc.) - worker-rust: - build: - context: ./workers/rust - dockerfile: Dockerfile - container_name: fuzzforge-worker-rust - profiles: - - workers - - rust - depends_on: - postgresql: - condition: service_healthy - temporal: - condition: service_healthy - minio: - condition: service_healthy - environment: - # Temporal configuration - TEMPORAL_ADDRESS: temporal:7233 - TEMPORAL_NAMESPACE: default - - # Worker configuration - WORKER_VERTICAL: rust - WORKER_TASK_QUEUE: rust-queue - MAX_CONCURRENT_ACTIVITIES: 5 - - # Storage configuration (MinIO) - STORAGE_BACKEND: s3 - S3_ENDPOINT: http://minio:9000 - S3_ACCESS_KEY: fuzzforge - S3_SECRET_KEY: fuzzforge123 - S3_BUCKET: targets - S3_REGION: us-east-1 - S3_USE_SSL: "false" - - # Cache configuration - CACHE_DIR: /cache - CACHE_MAX_SIZE: 10GB - CACHE_TTL: 7d - - # Logging - LOG_LEVEL: INFO - PYTHONUNBUFFERED: 1 - volumes: - # Mount workflow code (read-only) for dynamic discovery - - ./backend/toolbox:/app/toolbox:ro - # Worker cache for downloaded targets - - worker_rust_cache:/cache - networks: - - fuzzforge-network - restart: "no" - # Resource limits (adjust based on vertical needs) - deploy: - resources: - limits: - cpus: '2' - memory: 2G - reservations: - cpus: '1' - memory: 512M - - # ============================================================================ - # Vertical Worker: Python Fuzzing - # ============================================================================ - worker-python: - build: - context: ./workers/python - dockerfile: Dockerfile - container_name: fuzzforge-worker-python - depends_on: - postgresql: - condition: service_healthy - temporal: - condition: service_healthy - 
minio: - condition: service_healthy - environment: - # Temporal configuration - TEMPORAL_ADDRESS: temporal:7233 - TEMPORAL_NAMESPACE: default - - # Worker configuration - WORKER_VERTICAL: python - WORKER_TASK_QUEUE: python-queue - MAX_CONCURRENT_ACTIVITIES: 5 - - # Storage configuration (MinIO) - STORAGE_BACKEND: s3 - S3_ENDPOINT: http://minio:9000 - S3_ACCESS_KEY: fuzzforge - S3_SECRET_KEY: fuzzforge123 - S3_BUCKET: targets - S3_REGION: us-east-1 - S3_USE_SSL: "false" - - # Cache configuration - CACHE_DIR: /cache - CACHE_MAX_SIZE: 10GB - CACHE_TTL: 7d - - # Logging - LOG_LEVEL: INFO - PYTHONUNBUFFERED: 1 - volumes: - # Mount workflow code (read-only) for dynamic discovery - - ./backend/toolbox:/app/toolbox:ro - # Mount AI module for A2A wrapper access - - ./ai/src:/app/ai_src:ro - # Worker cache for downloaded targets - - worker_python_cache:/cache - networks: - - fuzzforge-network - restart: "no" - # Resource limits (lighter than rust) - deploy: - resources: - limits: - cpus: '1' - memory: 1G - reservations: - cpus: '0.5' - memory: 256M - - # ============================================================================ - # Vertical Worker: Secret Detection - # ============================================================================ - worker-secrets: - build: - context: ./workers/secrets - dockerfile: Dockerfile - container_name: fuzzforge-worker-secrets - profiles: - - workers - - secrets - depends_on: - postgresql: - condition: service_healthy - temporal: - condition: service_healthy - minio: - condition: service_healthy - environment: - # Temporal configuration - TEMPORAL_ADDRESS: temporal:7233 - TEMPORAL_NAMESPACE: default - - # Worker configuration - WORKER_VERTICAL: secrets - WORKER_TASK_QUEUE: secrets-queue - MAX_CONCURRENT_ACTIVITIES: 5 - - # Storage configuration (MinIO) - STORAGE_BACKEND: s3 - S3_ENDPOINT: http://minio:9000 - S3_ACCESS_KEY: fuzzforge - S3_SECRET_KEY: fuzzforge123 - S3_BUCKET: targets - S3_REGION: us-east-1 - S3_USE_SSL: "false" - - # 
Cache configuration - CACHE_DIR: /cache - CACHE_MAX_SIZE: 10GB - CACHE_TTL: 7d - - # Logging - LOG_LEVEL: INFO - PYTHONUNBUFFERED: 1 - volumes: - # Mount workflow code (read-only) for dynamic discovery - - ./backend/toolbox:/app/toolbox:ro - # Mount AI module for A2A wrapper access - - ./ai/src:/app/ai_src:ro - # Worker cache for downloaded targets - - worker_secrets_cache:/cache - networks: - - fuzzforge-network - restart: "no" - # Resource limits (lighter than rust) - deploy: - resources: - limits: - cpus: '1' - memory: 1G - reservations: - cpus: '0.5' - memory: 256M - - # ============================================================================ - # Vertical Worker: Android Security - # ============================================================================ - worker-android: - build: - context: ./workers/android - dockerfile: ${ANDROID_DOCKERFILE:-Dockerfile.amd64} - container_name: fuzzforge-worker-android - profiles: - - workers - - android - - full - depends_on: - postgresql: - condition: service_healthy - temporal: - condition: service_healthy - minio: - condition: service_healthy - environment: - # Temporal configuration - TEMPORAL_ADDRESS: temporal:7233 - TEMPORAL_NAMESPACE: default - - # Worker configuration - WORKER_VERTICAL: android - WORKER_TASK_QUEUE: android-queue - MAX_CONCURRENT_ACTIVITIES: 5 - - # Storage configuration (MinIO) - STORAGE_BACKEND: s3 - S3_ENDPOINT: http://minio:9000 - S3_ACCESS_KEY: fuzzforge - S3_SECRET_KEY: fuzzforge123 - S3_BUCKET: targets - S3_REGION: us-east-1 - S3_USE_SSL: "false" - - # Cache configuration - CACHE_DIR: /cache - CACHE_MAX_SIZE: 10GB - CACHE_TTL: 7d - - # Logging - LOG_LEVEL: INFO - PYTHONUNBUFFERED: 1 - volumes: - # Mount workflow code (read-only) for dynamic discovery - - ./backend/toolbox:/app/toolbox:ro - # Worker cache for downloaded targets - - worker_android_cache:/cache - networks: - - fuzzforge-network - restart: "no" - # Resource limits (Android tools need more memory) - deploy: - resources: - 
limits: - cpus: '2' - memory: 3G - reservations: - cpus: '1' - memory: 1G - - # ============================================================================ - # FuzzForge Backend API - # ============================================================================ - backend: - build: - context: ./backend - dockerfile: Dockerfile - container_name: fuzzforge-backend - depends_on: - temporal: - condition: service_healthy - minio: - condition: service_healthy - environment: - # Temporal configuration - TEMPORAL_ADDRESS: temporal:7233 - TEMPORAL_NAMESPACE: default - - # Storage configuration (MinIO) - S3_ENDPOINT: http://minio:9000 - S3_ACCESS_KEY: fuzzforge - S3_SECRET_KEY: fuzzforge123 - S3_BUCKET: targets - S3_REGION: us-east-1 - S3_USE_SSL: "false" - - # Python configuration - PYTHONPATH: /app - PYTHONUNBUFFERED: 1 - - # Host filesystem paths (for CLI worker management) - FUZZFORGE_HOST_ROOT: ${PWD} - - # Logging - LOG_LEVEL: INFO - ports: - - "8000:8000" # FastAPI REST API - - "8010:8010" # MCP (Model Context Protocol) - volumes: - # Mount toolbox for workflow discovery (read-only) - - ./backend/toolbox:/app/toolbox:ro - networks: - - fuzzforge-network - restart: unless-stopped - healthcheck: - test: ["CMD", "curl", "-f", "http://localhost:8000/health"] - interval: 30s - timeout: 10s - retries: 3 - - # ============================================================================ - # Task Agent - A2A LiteLLM Agent - # ============================================================================ - task-agent: - build: - context: ./ai/agents/task_agent - dockerfile: Dockerfile - container_name: fuzzforge-task-agent - depends_on: - llm-proxy-bootstrap: - condition: service_completed_successfully - ports: - - "10900:8000" - environment: - - PORT=8000 - - PYTHONUNBUFFERED=1 - volumes: - - ./volumes/env:/app/config:ro - networks: - - fuzzforge-network - restart: unless-stopped - - # ============================================================================ - # Vertical 
Worker: OSS-Fuzz Campaigns - # ============================================================================ - worker-ossfuzz: - build: - context: ./workers/ossfuzz - dockerfile: Dockerfile - container_name: fuzzforge-worker-ossfuzz - profiles: - - workers - - ossfuzz - depends_on: - postgresql: - condition: service_healthy - temporal: - condition: service_healthy - minio: - condition: service_healthy - environment: - # Temporal configuration - TEMPORAL_ADDRESS: temporal:7233 - TEMPORAL_NAMESPACE: default - - # Worker configuration - WORKER_VERTICAL: ossfuzz - WORKER_TASK_QUEUE: ossfuzz-queue - MAX_CONCURRENT_ACTIVITIES: 2 # Lower concurrency for resource-intensive fuzzing - - # Storage configuration (MinIO) - STORAGE_BACKEND: s3 - S3_ENDPOINT: http://minio:9000 - S3_ACCESS_KEY: fuzzforge - S3_SECRET_KEY: fuzzforge123 - S3_BUCKET: targets - S3_REGION: us-east-1 - S3_USE_SSL: "false" - - # Cache configuration (larger for OSS-Fuzz builds) - CACHE_DIR: /cache - CACHE_MAX_SIZE: 50GB - CACHE_TTL: 30d - - # Logging - LOG_LEVEL: INFO - PYTHONUNBUFFERED: 1 - volumes: - # Mount workflow code (read-only) for dynamic discovery - - ./backend/toolbox:/app/toolbox:ro - # Worker cache for OSS-Fuzz builds and corpus - - worker_ossfuzz_cache:/cache - # OSS-Fuzz build output - - worker_ossfuzz_build:/opt/oss-fuzz/build - networks: - - fuzzforge-network - restart: "no" - # Higher resource limits for fuzzing campaigns - deploy: - resources: - limits: - cpus: '4' - memory: 8G - reservations: - cpus: '2' - memory: 2G - -# ============================================================================ -# Volumes -# ============================================================================ -volumes: - temporal_data: - name: fuzzforge_temporal_data - temporal_postgres: - name: fuzzforge_temporal_postgres - minio_data: - name: fuzzforge_minio_data - worker_rust_cache: - name: fuzzforge_worker_rust_cache - worker_python_cache: - name: fuzzforge_worker_python_cache - worker_secrets_cache: - 
name: fuzzforge_worker_secrets_cache - worker_android_cache: - name: fuzzforge_worker_android_cache - worker_ossfuzz_cache: - name: fuzzforge_worker_ossfuzz_cache - worker_ossfuzz_build: - name: fuzzforge_worker_ossfuzz_build - litellm_proxy_data: - name: fuzzforge_litellm_proxy_data - litellm_proxy_db: - name: fuzzforge_litellm_proxy_db - # Add more worker caches as you add verticals: - # worker_web_cache: - # worker_ios_cache: - -# ============================================================================ -# Networks -# ============================================================================ -networks: - fuzzforge-network: - name: fuzzforge_temporal_network - driver: bridge - -# ============================================================================ -# Notes: -# ============================================================================ -# -# 1. First Startup: -# - Creates all buckets and policies automatically -# - Temporal auto-setup creates database schema -# - Takes ~30-60 seconds for all health checks -# -# 2. Adding Vertical Workers: -# - Copy worker-rust section -# - Update: container_name, build.context, WORKER_VERTICAL, volumes -# - Add corresponding cache volume -# -# 3. Scaling Workers: -# - Horizontal: docker-compose up -d --scale worker-rust=3 -# - Vertical: Increase MAX_CONCURRENT_ACTIVITIES env var -# -# 4. Web UIs: -# - Temporal UI: http://localhost:8080 -# - MinIO Console: http://localhost:9001 (user: fuzzforge, pass: fuzzforge123) -# - LiteLLM Proxy: http://localhost:10999 -# -# 5. Resource Usage (Baseline): -# - Temporal: ~500MB -# - Temporal DB: ~100MB -# - MinIO: ~256MB (with CI_CD=true) -# - Worker-rust: ~512MB (varies by toolchain) -# - Total: ~1.4GB baseline -# -# 6. 
Production Overrides: -# - Use docker-compose.temporal.prod.yaml for: -# - Disable CI_CD mode (more memory but better performance) -# - Add more workers -# - Increase resource limits -# - Add monitoring/logging diff --git a/docker/scripts/bootstrap_llm_proxy.py b/docker/scripts/bootstrap_llm_proxy.py deleted file mode 100644 index 68f6745..0000000 --- a/docker/scripts/bootstrap_llm_proxy.py +++ /dev/null @@ -1,636 +0,0 @@ -"""Bootstrap the LiteLLM proxy with provider secrets and default virtual keys. - -The bootstrapper runs as a one-shot container during docker-compose startup. -It performs the following actions: - - 1. Waits for the proxy health endpoint to respond. - 2. Collects upstream provider API keys from the shared .env file (plus any - legacy copies) and mirrors them into a proxy-specific env file - (volumes/env/.env.litellm) so only the proxy container can access them. - 3. Emits a default virtual key for the task agent by calling /key/generate, - persisting the generated token back into volumes/env/.env so the agent can - authenticate through the proxy instead of using raw provider secrets. - 4. Keeps the process idempotent: existing keys are reused and their allowed - model list is refreshed instead of issuing duplicates on every run. 
-""" - -from __future__ import annotations - -import json -import os -import sys -import time -import urllib.error -import urllib.parse -import urllib.request -from dataclasses import dataclass -from pathlib import Path -from typing import Iterable, Mapping - -PROXY_BASE_URL = os.getenv("PROXY_BASE_URL", "http://llm-proxy:4000").rstrip("/") -ENV_FILE_PATH = Path(os.getenv("ENV_FILE_PATH", "/bootstrap/env/.env")) -LITELLM_ENV_FILE_PATH = Path( - os.getenv("LITELLM_ENV_FILE_PATH", "/bootstrap/env/.env.litellm") -) -LEGACY_ENV_FILE_PATH = Path( - os.getenv("LEGACY_ENV_FILE_PATH", "/bootstrap/env/.env.bifrost") -) -MAX_WAIT_SECONDS = int(os.getenv("LITELLM_PROXY_WAIT_SECONDS", "120")) - - -@dataclass(frozen=True) -class VirtualKeySpec: - """Configuration for a virtual key to be provisioned.""" - env_var: str - alias: str - user_id: str - budget_env_var: str - duration_env_var: str - default_budget: float - default_duration: str - - -# Multiple virtual keys for different services -VIRTUAL_KEYS: tuple[VirtualKeySpec, ...] = ( - VirtualKeySpec( - env_var="OPENAI_API_KEY", - alias="fuzzforge-cli", - user_id="fuzzforge-cli", - budget_env_var="CLI_BUDGET", - duration_env_var="CLI_DURATION", - default_budget=100.0, - default_duration="30d", - ), - VirtualKeySpec( - env_var="TASK_AGENT_API_KEY", - alias="fuzzforge-task-agent", - user_id="fuzzforge-task-agent", - budget_env_var="TASK_AGENT_BUDGET", - duration_env_var="TASK_AGENT_DURATION", - default_budget=25.0, - default_duration="30d", - ), - VirtualKeySpec( - env_var="COGNEE_API_KEY", - alias="fuzzforge-cognee", - user_id="fuzzforge-cognee", - budget_env_var="COGNEE_BUDGET", - duration_env_var="COGNEE_DURATION", - default_budget=50.0, - default_duration="30d", - ), -) - - -@dataclass(frozen=True) -class ProviderSpec: - name: str - litellm_env_var: str - alias_env_var: str - source_env_vars: tuple[str, ...] - - -# Support fresh LiteLLM variables while gracefully migrating legacy env -# aliases on first boot. 
-PROVIDERS: tuple[ProviderSpec, ...] = ( - ProviderSpec( - "openai", - "OPENAI_API_KEY", - "LITELLM_OPENAI_API_KEY", - ("LITELLM_OPENAI_API_KEY", "BIFROST_OPENAI_KEY"), - ), - ProviderSpec( - "anthropic", - "ANTHROPIC_API_KEY", - "LITELLM_ANTHROPIC_API_KEY", - ("LITELLM_ANTHROPIC_API_KEY", "BIFROST_ANTHROPIC_KEY"), - ), - ProviderSpec( - "gemini", - "GEMINI_API_KEY", - "LITELLM_GEMINI_API_KEY", - ("LITELLM_GEMINI_API_KEY", "BIFROST_GEMINI_KEY"), - ), - ProviderSpec( - "mistral", - "MISTRAL_API_KEY", - "LITELLM_MISTRAL_API_KEY", - ("LITELLM_MISTRAL_API_KEY", "BIFROST_MISTRAL_KEY"), - ), - ProviderSpec( - "openrouter", - "OPENROUTER_API_KEY", - "LITELLM_OPENROUTER_API_KEY", - ("LITELLM_OPENROUTER_API_KEY", "BIFROST_OPENROUTER_KEY"), - ), -) - -PROVIDER_LOOKUP: dict[str, ProviderSpec] = {spec.name: spec for spec in PROVIDERS} - - -def log(message: str) -> None: - print(f"[litellm-bootstrap] {message}", flush=True) - - -def read_lines(path: Path) -> list[str]: - if not path.exists(): - return [] - return path.read_text().splitlines() - - -def write_lines(path: Path, lines: Iterable[str]) -> None: - material = "\n".join(lines) - if material and not material.endswith("\n"): - material += "\n" - path.parent.mkdir(parents=True, exist_ok=True) - path.write_text(material) - - -def read_env_file() -> list[str]: - if not ENV_FILE_PATH.exists(): - raise FileNotFoundError( - f"Expected env file at {ENV_FILE_PATH}. Copy volumes/env/.env.template first." 
- ) - return read_lines(ENV_FILE_PATH) - - -def write_env_file(lines: Iterable[str]) -> None: - write_lines(ENV_FILE_PATH, lines) - - -def read_litellm_env_file() -> list[str]: - return read_lines(LITELLM_ENV_FILE_PATH) - - -def write_litellm_env_file(lines: Iterable[str]) -> None: - write_lines(LITELLM_ENV_FILE_PATH, lines) - - -def read_legacy_env_file() -> Mapping[str, str]: - lines = read_lines(LEGACY_ENV_FILE_PATH) - return parse_env_lines(lines) - - -def set_env_value(lines: list[str], key: str, value: str) -> tuple[list[str], bool]: - prefix = f"{key}=" - new_line = f"{prefix}{value}" - for idx, line in enumerate(lines): - stripped = line.lstrip() - if not stripped or stripped.startswith("#"): - continue - if stripped.startswith(prefix): - if stripped == new_line: - return lines, False - indent = line[: len(line) - len(stripped)] - lines[idx] = f"{indent}{new_line}" - return lines, True - lines.append(new_line) - return lines, True - - -def parse_env_lines(lines: list[str]) -> dict[str, str]: - mapping: dict[str, str] = {} - for raw_line in lines: - stripped = raw_line.strip() - if not stripped or stripped.startswith("#") or "=" not in stripped: - continue - key, value = stripped.split("=", 1) - mapping[key] = value - return mapping - - -def wait_for_proxy() -> None: - health_paths = ("/health/liveliness", "/health", "/") - deadline = time.time() + MAX_WAIT_SECONDS - attempt = 0 - while time.time() < deadline: - attempt += 1 - for path in health_paths: - url = f"{PROXY_BASE_URL}{path}" - try: - with urllib.request.urlopen(url) as response: # noqa: S310 - if response.status < 400: - log(f"Proxy responded on {path} (attempt {attempt})") - return - except urllib.error.URLError as exc: - log(f"Proxy not ready yet ({path}): {exc}") - time.sleep(3) - raise TimeoutError(f"Timed out waiting for proxy at {PROXY_BASE_URL}") - - -def request_json( - path: str, - *, - method: str = "GET", - payload: Mapping[str, object] | None = None, - auth_token: str | None = None, -) 
-> tuple[int, str]: - url = f"{PROXY_BASE_URL}{path}" - data = None - headers = {"Accept": "application/json"} - if auth_token: - headers["Authorization"] = f"Bearer {auth_token}" - if payload is not None: - data = json.dumps(payload).encode("utf-8") - headers["Content-Type"] = "application/json" - request = urllib.request.Request(url, data=data, headers=headers, method=method) - try: - with urllib.request.urlopen(request) as response: # noqa: S310 - body = response.read().decode("utf-8") - return response.status, body - except urllib.error.HTTPError as exc: - body = exc.read().decode("utf-8") - return exc.code, body - - -def get_master_key(env_map: Mapping[str, str]) -> str: - candidate = os.getenv("LITELLM_MASTER_KEY") or env_map.get("LITELLM_MASTER_KEY") - if not candidate: - raise RuntimeError( - "LITELLM_MASTER_KEY is not set. Add it to volumes/env/.env before starting Docker." - ) - value = candidate.strip() - if not value: - raise RuntimeError( - "LITELLM_MASTER_KEY is blank. Provide a non-empty value in the env file." 
- ) - return value - - -def gather_provider_keys( - env_lines: list[str], - env_map: dict[str, str], - legacy_map: Mapping[str, str], -) -> tuple[dict[str, str], list[str], bool]: - updated_lines = list(env_lines) - discovered: dict[str, str] = {} - changed = False - - for spec in PROVIDERS: - value: str | None = None - for source_var in spec.source_env_vars: - candidate = env_map.get(source_var) or legacy_map.get(source_var) or os.getenv( - source_var - ) - if not candidate: - continue - stripped = candidate.strip() - if stripped: - value = stripped - break - if not value: - continue - - discovered[spec.litellm_env_var] = value - updated_lines, alias_changed = set_env_value( - updated_lines, spec.alias_env_var, value - ) - if alias_changed: - env_map[spec.alias_env_var] = value - changed = True - - return discovered, updated_lines, changed - - -def ensure_litellm_env(provider_values: Mapping[str, str]) -> None: - if not provider_values: - log("No provider secrets discovered; skipping LiteLLM env update") - return - lines = read_litellm_env_file() - updated_lines = list(lines) - changed = False - for env_var, value in provider_values.items(): - updated_lines, var_changed = set_env_value(updated_lines, env_var, value) - if var_changed: - changed = True - if changed or not lines: - write_litellm_env_file(updated_lines) - log(f"Wrote provider secrets to {LITELLM_ENV_FILE_PATH}") - - -def current_env_key(env_map: Mapping[str, str], env_var: str) -> str | None: - candidate = os.getenv(env_var) or env_map.get(env_var) - if not candidate: - return None - value = candidate.strip() - if not value or value.startswith("sk-proxy-"): - return None - return value - - -def collect_default_models(env_map: Mapping[str, str]) -> list[str]: - explicit = ( - os.getenv("LITELLM_DEFAULT_MODELS") - or env_map.get("LITELLM_DEFAULT_MODELS") - or "" - ) - models: list[str] = [] - if explicit: - models.extend( - model.strip() - for model in explicit.split(",") - if model.strip() - ) - if 
models: - return sorted(dict.fromkeys(models)) - - configured_model = ( - os.getenv("LITELLM_MODEL") or env_map.get("LITELLM_MODEL") or "" - ).strip() - configured_provider = ( - os.getenv("LITELLM_PROVIDER") or env_map.get("LITELLM_PROVIDER") or "" - ).strip() - - if configured_model: - if "/" in configured_model: - models.append(configured_model) - elif configured_provider: - models.append(f"{configured_provider}/{configured_model}") - else: - log( - "LITELLM_MODEL is set without a provider; configure LITELLM_PROVIDER or " - "use the provider/model format (e.g. openai/gpt-4o-mini)." - ) - elif configured_provider: - log( - "LITELLM_PROVIDER configured without a default model. Bootstrap will issue an " - "unrestricted virtual key allowing any proxy-registered model." - ) - - return sorted(dict.fromkeys(models)) - - -def fetch_existing_key_record(master_key: str, key_value: str) -> Mapping[str, object] | None: - encoded = urllib.parse.quote_plus(key_value) - status, body = request_json(f"/key/info?key={encoded}", auth_token=master_key) - if status != 200: - log(f"Key lookup failed ({status}); treating OPENAI_API_KEY as new") - return None - try: - data = json.loads(body) - except json.JSONDecodeError: - log("Key info response was not valid JSON; ignoring") - return None - if isinstance(data, Mapping) and data.get("key"): - return data - return None - - -def fetch_key_by_alias(master_key: str, alias: str) -> str | None: - """Fetch existing key value by alias from LiteLLM proxy.""" - status, body = request_json("/key/info", auth_token=master_key) - if status != 200: - return None - try: - data = json.loads(body) - except json.JSONDecodeError: - return None - if isinstance(data, dict) and "keys" in data: - for key_info in data.get("keys", []): - if isinstance(key_info, dict) and key_info.get("key_alias") == alias: - return str(key_info.get("key", "")).strip() or None - return None - - -def generate_virtual_key( - master_key: str, - models: list[str], - spec: 
VirtualKeySpec, - env_map: Mapping[str, str], -) -> str: - budget_str = os.getenv(spec.budget_env_var) or env_map.get(spec.budget_env_var) or str(spec.default_budget) - try: - budget = float(budget_str) - except ValueError: - budget = spec.default_budget - - duration = os.getenv(spec.duration_env_var) or env_map.get(spec.duration_env_var) or spec.default_duration - - payload: dict[str, object] = { - "key_alias": spec.alias, - "user_id": spec.user_id, - "duration": duration, - "max_budget": budget, - "metadata": { - "provisioned_by": "bootstrap", - "service": spec.alias, - "default_models": models, - }, - "key_type": "llm_api", - } - if models: - payload["models"] = models - status, body = request_json( - "/key/generate", method="POST", payload=payload, auth_token=master_key - ) - if status == 400 and "already exists" in body.lower(): - # Key alias already exists but .env is out of sync (e.g., after docker prune) - # Delete the old key and regenerate - log(f"Key alias '{spec.alias}' already exists in database but not in .env; deleting and regenerating") - # Try to delete by alias using POST /key/delete with key_aliases array - delete_payload = {"key_aliases": [spec.alias]} - delete_status, delete_body = request_json( - "/key/delete", method="POST", payload=delete_payload, auth_token=master_key - ) - if delete_status not in {200, 201}: - log(f"Warning: Could not delete existing key alias {spec.alias} ({delete_status}): {delete_body}") - # Continue anyway and try to generate - else: - log(f"Deleted existing key alias {spec.alias}") - - # Retry generation - status, body = request_json( - "/key/generate", method="POST", payload=payload, auth_token=master_key - ) - if status not in {200, 201}: - raise RuntimeError(f"Failed to generate virtual key for {spec.alias} ({status}): {body}") - try: - data = json.loads(body) - except json.JSONDecodeError as exc: - raise RuntimeError(f"Virtual key response for {spec.alias} was not valid JSON") from exc - if isinstance(data, 
Mapping): - key_value = str(data.get("key") or data.get("token") or "").strip() - if key_value: - log(f"Generated new LiteLLM virtual key for {spec.alias} (budget: ${budget}, duration: {duration})") - return key_value - raise RuntimeError(f"Virtual key response for {spec.alias} did not include a key field") - - -def update_virtual_key( - master_key: str, - key_value: str, - models: list[str], - spec: VirtualKeySpec, -) -> None: - if not models: - return - payload: dict[str, object] = { - "key": key_value, - "models": models, - } - status, body = request_json( - "/key/update", method="POST", payload=payload, auth_token=master_key - ) - if status != 200: - log(f"Virtual key update for {spec.alias} skipped ({status}): {body}") - else: - log(f"Refreshed allowed models for {spec.alias}") - - -def persist_key_to_env(new_key: str, env_var: str) -> None: - lines = read_env_file() - updated_lines, changed = set_env_value(lines, env_var, new_key) - # Always update the environment variable, even if file wasn't changed - os.environ[env_var] = new_key - if changed: - write_env_file(updated_lines) - log(f"Persisted {env_var} to {ENV_FILE_PATH}") - else: - log(f"{env_var} already up-to-date in env file") - - -def ensure_virtual_key( - master_key: str, - models: list[str], - env_map: Mapping[str, str], - spec: VirtualKeySpec, -) -> str: - allowed_models: list[str] = [] - sync_flag = os.getenv("LITELLM_SYNC_VIRTUAL_KEY_MODELS", "").strip().lower() - if models and (sync_flag in {"1", "true", "yes", "on"} or models == ["*"]): - allowed_models = models - existing_key = current_env_key(env_map, spec.env_var) - if existing_key: - record = fetch_existing_key_record(master_key, existing_key) - if record: - log(f"Reusing existing LiteLLM virtual key for {spec.alias}") - if allowed_models: - update_virtual_key(master_key, existing_key, allowed_models, spec) - return existing_key - log(f"Existing {spec.env_var} not registered with proxy; generating new key") - - new_key = 
generate_virtual_key(master_key, models, spec, env_map) - if allowed_models: - update_virtual_key(master_key, new_key, allowed_models, spec) - return new_key - - -def _split_model_identifier(model: str) -> tuple[str | None, str]: - if "/" in model: - provider, short_name = model.split("/", 1) - return provider.lower().strip() or None, short_name.strip() - return None, model.strip() - - -def ensure_models_registered( - master_key: str, - models: list[str], - env_map: Mapping[str, str], -) -> None: - if not models: - return - for model in models: - provider, short_name = _split_model_identifier(model) - if not provider or not short_name: - log(f"Skipping model '{model}' (no provider segment)") - continue - spec = PROVIDER_LOOKUP.get(provider) - if not spec: - log(f"No provider spec registered for '{provider}'; skipping model '{model}'") - continue - provider_secret = ( - env_map.get(spec.alias_env_var) - or env_map.get(spec.litellm_env_var) - or os.getenv(spec.alias_env_var) - or os.getenv(spec.litellm_env_var) - ) - if not provider_secret: - log( - f"Provider secret for '{provider}' not found; skipping model registration" - ) - continue - - api_key_reference = f"os.environ/{spec.alias_env_var}" - payload: dict[str, object] = { - "model_name": model, - "litellm_params": { - "model": short_name, - "custom_llm_provider": provider, - "api_key": api_key_reference, - }, - "model_info": { - "provider": provider, - "description": "Auto-registered during bootstrap", - }, - } - - status, body = request_json( - "/model/new", method="POST", payload=payload, auth_token=master_key - ) - if status in {200, 201}: - log(f"Registered LiteLLM model '{model}'") - continue - try: - data = json.loads(body) - except json.JSONDecodeError: - data = body - error_message = ( - data.get("error") if isinstance(data, Mapping) else str(data) - ) - if status == 409 or ( - isinstance(error_message, str) - and "already" in error_message.lower() - ): - log(f"Model '{model}' already present; 
skipping") - continue - log(f"Failed to register model '{model}' ({status}): {error_message}") - - -def main() -> int: - log("Bootstrapping LiteLLM proxy") - try: - wait_for_proxy() - env_lines = read_env_file() - env_map = parse_env_lines(env_lines) - legacy_map = read_legacy_env_file() - master_key = get_master_key(env_map) - - provider_values, updated_env_lines, env_changed = gather_provider_keys( - env_lines, env_map, legacy_map - ) - if env_changed: - write_env_file(updated_env_lines) - env_map = parse_env_lines(updated_env_lines) - log("Updated LiteLLM provider aliases in shared env file") - - ensure_litellm_env(provider_values) - - models = collect_default_models(env_map) - if models: - log("Default models for virtual keys: %s" % ", ".join(models)) - models_for_key = models - else: - log( - "No default models configured; provisioning virtual keys without model " - "restrictions (model-agnostic)." - ) - models_for_key = ["*"] - - # Generate virtual keys for each service - for spec in VIRTUAL_KEYS: - virtual_key = ensure_virtual_key(master_key, models_for_key, env_map, spec) - persist_key_to_env(virtual_key, spec.env_var) - - # Register models if any were specified - if models: - ensure_models_registered(master_key, models, env_map) - - log("Bootstrap complete") - return 0 - except Exception as exc: # pragma: no cover - startup failure reported to logs - log(f"Bootstrap failed: {exc}") - return 1 - - -if __name__ == "__main__": - sys.exit(main()) diff --git a/docs/.gitignore b/docs/.gitignore deleted file mode 100644 index b2d6de3..0000000 --- a/docs/.gitignore +++ /dev/null @@ -1,20 +0,0 @@ -# Dependencies -/node_modules - -# Production -/build - -# Generated files -.docusaurus -.cache-loader - -# Misc -.DS_Store -.env.local -.env.development.local -.env.test.local -.env.production.local - -npm-debug.log* -yarn-debug.log* -yarn-error.log* diff --git a/docs/README.md b/docs/README.md deleted file mode 100644 index 5f6c5d6..0000000 --- a/docs/README.md +++ 
/dev/null @@ -1,25 +0,0 @@ -# FuzzForge Documentation - -This website is built using [Docusaurus](https://docusaurus.io/), a modern static website generator. - -## Installation - -```bash -yarn -``` - -## Local Development - -```bash -yarn start -``` - -This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server. - -## Build - -```bash -yarn build -``` - -This command generates static content into the `build` directory and can be served using any static content hosting service. diff --git a/docs/blog/2025-01-16-v0.7.0-temporal-workers-release.md b/docs/blog/2025-01-16-v0.7.0-temporal-workers-release.md deleted file mode 100644 index 329ca7a..0000000 --- a/docs/blog/2025-01-16-v0.7.0-temporal-workers-release.md +++ /dev/null @@ -1,321 +0,0 @@ -# FuzzForge v0.7.0: Temporal Orchestration & Vertical Workers Architecture - -We're excited to announce **FuzzForge v0.7.0**, a major release featuring two significant improvements: - -1. **Architectural Foundation**: Complete migration from Prefect to **Temporal** orchestration with **vertical workers** - pre-built containers for instant deployment - -2. **AI-Powered Secret Detection**: New workflows achieving 84% recall on obfuscated secrets using LLM semantic analysis - -This release transforms how security workflows are built, deployed, and scaled.
- - - -## šŸš€ Flagship Features - -### Temporal Orchestration: Production-Ready Workflow Engine - -We've fully migrated from Prefect to [Temporal](https://temporal.io), bringing enterprise-grade workflow orchestration to FuzzForge: - -**Why Temporal?** - -- āœ… **Reliability**: Automatic retries, timeouts, and failure handling built-in -- āœ… **Observability**: World-class UI for monitoring workflow execution, logs, and debugging -- āœ… **Scalability**: Horizontal scaling across workers with intelligent load balancing -- āœ… **Developer Experience**: Type-safe workflows, versioning, and zero downtime deployments - -**What This Means for You:** - -```bash -# Start FuzzForge with Temporal -docker compose up -d - -# Monitor workflows in real-time -open http://localhost:8080 # Temporal UI - -# Submit workflows - everything just works -cd your_project/ -ff workflow run security_assessment . -``` - -The Temporal UI gives you complete visibility into workflow execution: - -- Live activity timelines -- Detailed logs for every step -- Retry history and failure analysis -- Performance metrics and bottleneck detection - -### Vertical Workers: Pre-Built Security Toolchains - -FuzzForge now uses **vertical workers** - long-lived containers pre-built with security toolchains for different languages and platforms: - -| Worker | Toolchain | Status | Available Workflows | -| ----------- | ----------------------------- | ------------- | ------------------------------------- | -| **python** | Gitleaks, TruffleHog, Atheris | āœ… Production | Secret detection, security assessment | -| **rust** | cargo-fuzz | āš ļø Early Dev | Rust fuzzing | -| **ossfuzz** | OSS-Fuzz infrastructure | āš ļø Heavy Dev | Continuous fuzzing campaigns | - -**Note:** Additional workers (web, android, Go) are planned but not yet available. - -**Key Benefits:** - -1. **Zero Build Time**: Workflows start instantly - no container builds per workflow -2. 
**Instant Code Changes**: Modify workflow code, restart worker, done -3. **Consistent Environment**: Same toolchain versions across all runs -4. **Resource Efficiency**: Share workers across multiple concurrent workflows - -**Example: Running Secret Detection** - -```bash -# Worker is already running with Gitleaks, TruffleHog installed -ff workflow run gitleaks_detection . - -# Behind the scenes: -# 1. CLI uploads project to MinIO -# 2. Temporal schedules on python-worker -# 3. Worker downloads from MinIO -# 4. Gitleaks runs (already installed!) -# 5. Results returned as SARIF -``` - -### MinIO Storage: Unified File Handling - -We've replaced volume mounts with **MinIO** (S3-compatible object storage): - -**Old Way (Volume Mounts):** - -```yaml -# Had to mount directories, manage paths, cleanup manually -volumes: - - ./my_project:/target -``` - -**New Way (MinIO):** - -```bash -# CLI handles everything automatically -ff workflow run security_assessment . -# āœ“ Creates tarball -# āœ“ Uploads to MinIO -# āœ“ Passes target_id to workflow -# āœ“ Worker downloads and extracts -# āœ“ Cleanup handled automatically -``` - -**Benefits:** - -- āœ… No path conflicts or permissions issues -- āœ… Works seamlessly with remote Temporal clusters -- āœ… Automatic cleanup and caching -- āœ… Supports large targets (GB+) - -## šŸ” AI-Powered Secret Detection: Also in v0.7.0 - -Alongside the architectural improvements, we're releasing a comprehensive **secret detection** system with three workflows: - -### Benchmark Results - -We tested on a controlled dataset of **32 documented secrets** (12 Easy, 10 Medium, 10 Hard): - -| Tool | Recall | Secrets Found | Speed | Best For | -| --------------------- | --------- | ------------- | ----- | --------------------------- | -| **LLM (gpt-5-mini)** | **84.4%** | 41 | 618s | Obfuscated & hidden secrets | -| **LLM (gpt-4o-mini)** | 56.2% | 30 | 297s | Balanced speed/accuracy | -| **Gitleaks** | 37.5% | 12 | 5s | Fast pattern-based scanning | -| 
**TruffleHog** | 0.0% | 1 | 5s | Entropy analysis | - -šŸ“Š [Full benchmark methodology and results →](https://github.com/FuzzingLabs/fuzzforge_ai/blob/dev/backend/benchmarks/by_category/secret_detection/results/comparison_report.md) - -### Why LLM-Based Detection Wins - -**Obfuscated Secrets (Medium Difficulty):** - -```python -# Gitleaks: āŒ Missed (no pattern match) -# LLM: āœ… Found (semantic understanding) -aws_key = base64.b64decode("QUtJQUlPU0ZPRE5ON0VYQU1QTEU=").decode() -``` - -**Well-Hidden Secrets (Hard Difficulty):** - -```python -# Gitleaks: āŒ Missed (no pattern) -# LLM: āœ… Found (understands XOR + join) -secret = ''.join(chr(ord(c) ^ 0x42) for c in "\x0b\x15\x04\x1b...") -``` - -**Standard Secrets (Easy Difficulty):** - -```python -# Both find these: -AWS_ACCESS_KEY = "AKIAIOSFODNN7EXAMPLE" -``` - -### Try It Yourself - -```bash -# Start FuzzForge -docker compose up -d - -# Run secret detection on your code -cd your_project/ -ff workflow run gitleaks_detection . # Fast pattern-based -ff workflow run trufflehog_detection . # Entropy analysis -ff workflow run llm_secret_detection . 
# AI semantic analysis - -# Get SARIF output -ff finding -``` - -## šŸ“Š Real-World Impact - -**Before v0.7.0 (Pattern-Only Detection):** - -- Found: Standard API keys, simple patterns -- Missed: Base64-encoded secrets, obfuscated credentials, split secrets - -**After v0.7.0 (LLM + Patterns):** - -- **84% recall** on comprehensive benchmark -- Detects novel obfuscation techniques -- Understands code context (not just regex) -- Catches secrets in: - - Base64/hex encoding - - String concatenation - - XOR/ROT13 obfuscation - - Template strings - - Binary literals - -## šŸ”„ Migration Guide - -### What Changed - -**Docker Compose:** - -```bash -# Old (Prefect) -docker-compose up - -# New (Temporal) -docker compose up -d -``` - -**Workflow Submission:** - -```bash -# Old (volume mounts) -ff workflow run security_assessment --volume ./project - -# New (automatic upload) -ff workflow run security_assessment . -# CLI handles upload automatically! -``` - -**Worker Management:** - -```bash -# Old (per-workflow containers) -# Each workflow built its own container - -# New (vertical workers) -docker compose up -d # All workers start -# Workflows share workers - much faster! -``` - -### Configuration - -Set up AI workflows with API keys: - -```bash -cp volumes/env/.env.template volumes/env/.env -# Edit .env and add your API keys (OpenAI, Anthropic, etc.) -``` - -Required for: - -- `llm_secret_detection` workflow -- AI agent features (`ff ai agent`) - -Basic security workflows (gitleaks, trufflehog, security_assessment) work without this. 
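As a concrete reference, a populated env file might contain entries like these. This is an illustrative fragment only — the variable names match those read by the bootstrap script, and every value shown is a placeholder, not a real key:

```
# volumes/env/.env (illustrative placeholders, not real keys)
LITELLM_MASTER_KEY=sk-change-me
LITELLM_OPENAI_API_KEY=sk-your-openai-key
LITELLM_ANTHROPIC_API_KEY=your-anthropic-key
# Optional: default provider/model applied to provisioned virtual keys
LITELLM_PROVIDER=openai
LITELLM_MODEL=gpt-4o-mini
```

Leaving `LITELLM_PROVIDER`/`LITELLM_MODEL` unset provisions model-agnostic virtual keys instead.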
- -## šŸ—ļø Architecture Overview - -``` -ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” -│ User CLI │ Upload → MinIO -ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ - ↓ Submit -ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” -│ Temporal │ Schedule → Task Queue -ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ - ↓ Execute -ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā” -│ Vertical │ Download from MinIO → Run Tools → Upload Results -│ Workers │ -ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜ - rust, python, web, android, ossfuzz -``` - -**Benefits:** - -- šŸ”„ Automatic retries and timeouts (Temporal) -- šŸ“¦ No file path management (MinIO) -- ⚔ Zero container build time (Vertical Workers) -- šŸ“ˆ Horizontal scaling ready (Temporal + Workers) - -## šŸŽÆ Workflow Stability Status - -### āœ… Stable & Production-Ready - -- **Secret Detection**: `gitleaks_detection`, `trufflehog_detection`, `llm_secret_detection` -- **Security Assessment**: `security_assessment` -- Temporal orchestration with python worker -- MinIO file storage - -### āš ļø Early Development (Functional but not production-ready) - -- **Fuzzing workflows**: - - `atheris_fuzzing` - Python fuzzing with Atheris - - `cargo_fuzzing` - Rust fuzzing with cargo-fuzz -- **OSS-Fuzz integration**: `ossfuzz_campaign` (under heavy active development) - -**Important:** Fuzzing workflows are functional for testing but not recommended for production use yet. 
- -## šŸ“š Resources - -- 🌐 [Website](https://fuzzforge.ai) -- šŸ“– [Documentation](https://docs.fuzzforge.ai) -- šŸ’¬ [Discord Community](https://discord.gg/8XEX33UUwZ) -- šŸŽ“ [FuzzingLabs Academy](https://academy.fuzzinglabs.com/?coupon=GITHUB_FUZZFORGE) -- šŸ“Š [Secret Detection Benchmarks](https://github.com/FuzzingLabs/fuzzforge_ai/blob/dev/backend/benchmarks/by_category/secret_detection/results/comparison_report.md) - -## šŸ™ Acknowledgments - -Special thanks to: - -- [Temporal](https://temporal.io) for the amazing workflow engine -- Our community for feedback during the migration - -## šŸš€ Get Started - -```bash -# Clone and install -git clone https://github.com/fuzzinglabs/fuzzforge_ai.git -cd fuzzforge_ai -uv tool install --python python3.12 . - -# Start FuzzForge with Temporal -docker compose up -d - -# Run your first workflow -cd test_projects/vulnerable_app/ -fuzzforge init -ff workflow run security_assessment . - -# Check Temporal UI -open http://localhost:8080 -``` - ---- - -**FuzzForge v0.7.0** is a foundational release that sets the stage for scalable, production-ready security automation. Try it today and let us know what you think! - -Star us on [GitHub](https://github.com/FuzzingLabs/fuzzforge_ai) ⭐ diff --git a/docs/docs/ai/a2a-services.md b/docs/docs/ai/a2a-services.md deleted file mode 100644 index 694be54..0000000 --- a/docs/docs/ai/a2a-services.md +++ /dev/null @@ -1,150 +0,0 @@ -# A2A Services - -The FuzzForge AI module can expose itself as an Agent-to-Agent (A2A) server so downstream systems can register the agent, inspect its card, and call tools over HTTP. - -## Starting the Server - -```bash -fuzzforge ai server -``` - -Run the command from a project directory that already contains `.fuzzforge/`. The server reads the project configuration and reuses the same environment variables as the CLI shell. 
- -**Default directories** -- Logs: `.fuzzforge/logs/cognee.log` -- Cognee datasets: `.fuzzforge/cognee/project_/{data,system}` -- Artifact cache: `.fuzzforge/artifacts` - -## HTTP Endpoints - -| Method | Path | Purpose | -| --- | --- | --- | -| `GET` | `/artifacts/{id}` | Download artifacts created by the agent, workflows, or remote collaborators | -| `POST` | `/graph/query` | Query the Cognee project graph using `query`, optional `dataset`, and optional `search_type` | -| `POST` | `/project/files` | Mirror a file from the project workspace as a downloadable artifact | - -### `POST /graph/query` - -Request body: -- `query` *(str, required)* – Natural language question for the graph -- `search_type` *(str, optional)* – e.g. `GRAPH_COMPLETION`, `INSIGHTS`, `CHUNKS` -- `dataset` *(str, optional)* – Defaults to `_codebase` - -Example: - -```bash -curl -s http://localhost:10100/graph/query \ - -H 'Content-Type: application/json' \ - -d '{"query":"unsafe Rust", "search_type":"GRAPH_COMPLETION"}' | jq -``` - -### `POST /project/files` - -Registers a source file and returns an artifact descriptor. - -```bash -curl -s http://localhost:10100/project/files \ - -H 'Content-Type: application/json' \ - -d '{"path":"src/lib.rs"}' | jq -``` - -Response excerpt: - -```json -{ - "id": "project_file_4325a8a6", - "file_uri": "http://127.0.0.1:10100/artifacts/project_file_4325a8a6", - "name": "src/lib.rs", - "mime_type": "text/x-c", - "size": 160 -} -``` - -## Typical Collaboration Flow - -1. Ingest project knowledge with `fuzzforge rag ingest --path . --recursive`. -2. Start the A2A server: `fuzzforge ai server`. -3. Downstream agents: - - Call `POST /graph/query` to explore project knowledge. - - Call `POST /project/files` to fetch raw files from the repository. - - Download finished scan summaries with `GET /artifacts/{id}`. -4. The AI module pushes Temporal workflow results into artifacts automatically, so remote agents can poll without re-running scans. 
- -## Registration Flow - -```mermaid -sequenceDiagram - participant Client as Remote Agent - participant HTTP as A2A HTTP Server - participant Exec as FuzzForgeExecutor - participant Registry as Agent Registry - - Client->>HTTP: GET /.well-known/agent-card.json - HTTP-->>Client: Agent card (skills, protocol version) - Client->>HTTP: POST / (register) - HTTP->>Exec: Register request - Exec->>Registry: Persist remote agent metadata - Exec-->>HTTP: Confirmation + assigned agent ID - HTTP-->>Client: Success response - Client->>Exec: Subsequent messages routed via HTTP endpoints - Exec->>Registry: Update capability cache per message -``` - -### How registration works - -1. **Discovery** – A remote agent fetches `/.well-known/agent-card.json` to confirm skills, protocol version, and message schemas. -2. **Handshake** – The remote agent issues `POST /` to start the A2A session. The payload includes its agent card and callback URL. -3. **Persistence** – `FuzzForgeExecutor` stores the remote agent in the registry (`agents.yaml` when run via the CLI). Auto-registration on future boots replays these entries. -4. **Capability cache** – Each inbound message updates the capability cache so the router can route `ROUTE_TO AgentName:` commands without another round-trip. -5. **Teardown** – Removing an agent via `/unregister` purges it from the registry; restart the server to drop any lingering connections. - -### Where agent metadata lives - -- The AI CLI and server share the same registry file. When a project is initialised, registrations are written to `.fuzzforge/agents.yaml` (see `ai/src/fuzzforge_ai/config_manager.py`). -- If the project file is absent, the executor falls back to the packaged default at `ai/src/fuzzforge_ai/config.yaml`. -- Each entry records `name`, `url`, and description. On startup `_auto_register_agents()` replays that list so both the CLI and the A2A server automatically reconnect to known peers. 
-- Editing `.fuzzforge/agents.yaml` manually is supported; the CLI `/register` and `/unregister` commands update it for you. - -## Agent Card - -The server exposes its agent card at `/.well-known/agent-card.json`. Clients can read that metadata to confirm skills, supported message schemas, and protocol version (`0.3.0`). - -## Artifacts in A2A mode - -- **Creation** – Conversations generate artifacts automatically when the executor produces code, reports, or workflow summaries. The `/artifacts` CLI command lists them; over HTTP they are addressed by `GET /artifacts/{id}`. -- **Distribution** – Use `/sendfile [note]` in the CLI or call `POST /project/files` programmatically to turn a project file into an artifact that downstream agents can fetch. -- **Download** – Remote agents receive the artifact descriptor (including `file_uri`) in A2A responses or via polling. Retrieve the content with `GET /artifacts/{id}`; the cache lives under `.fuzzforge/artifacts/`. -- **Lifecycle** – Artifacts persist for the life of the project workspace. Clean the directory if you need to reclaim space; the executor recreates entries on demand. - -## Running the server vs. CLI-only mode - -- Launch the server with `fuzzforge ai server`. It loads `.fuzzforge/.env`, sets up Cognee directories via `ProjectConfigManager`, and exposes HTTP endpoints on `127.0.0.1:${FUZZFORGE_PORT:-10100}`. -- Without the server, the `fuzzforge ai agent` CLI still supports A2A-style routing for locally registered peers, but external agents cannot connect because the HTTP surface is absent. -- When the server is running, both the CLI and remote agents share the same executor, task store, and artifact cache. Stopping the server returns the module to CLI-only operation without altering persisted registrations. 
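For illustration, an entry in the `.fuzzforge/agents.yaml` registry described earlier might look like this. This is a hypothetical sketch — the recorded fields (`name`, `url`, description) come from the notes above, but the exact file layout is defined by `config_manager.py` and may differ:

```yaml
# .fuzzforge/agents.yaml (hypothetical entry; top-level key is assumed)
agents:
  - name: recon-agent
    url: http://127.0.0.1:10200
    description: Example remote A2A peer replayed by _auto_register_agents() on startup
```

Entries added here by hand are picked up the same way as those written by the `/register` command.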
- -## Communication Patterns - -```mermaid -sequenceDiagram - participant Remote as Remote Agent - participant HTTP as A2A Server - participant Exec as Executor - participant Workflow as Temporal Backend - - Remote->>HTTP: POST / (message with tool request) - HTTP->>Exec: Forward message - Exec->>Workflow: (optional) submit_security_scan_mcp - Workflow-->>Exec: Status / findings - Exec->>HTTP: Response + artifact metadata - HTTP-->>Remote: A2A response with artifacts/tasks - Remote->>HTTP: GET /artifacts/{id} - HTTP-->>Remote: Artifact bytes -``` - -This pattern repeats for subsequent tool invocations. Remote agents can also call the helper endpoints (`/graph/query`, `/project/files`) directly while the conversation is active. - -## Related Files - -- Runtime entry point: `ai/src/fuzzforge_ai/__main__.py` -- HTTP implementation: `ai/src/fuzzforge_ai/a2a_server.py` -- Agent metadata: `ai/src/fuzzforge_ai/agent_card.py` diff --git a/docs/docs/ai/architecture.md b/docs/docs/ai/architecture.md deleted file mode 100644 index eea821b..0000000 --- a/docs/docs/ai/architecture.md +++ /dev/null @@ -1,146 +0,0 @@ -# AI Architecture - -FuzzForge AI is the orchestration layer that lets large language models drive the broader security platform. Built on the Google ADK runtime, the module coordinates local tools, remote Agent-to-Agent (A2A) peers, and Temporal-backed workflows while persisting long-running context for every project. 
- -## System Diagram - -```mermaid -graph TB - subgraph Surfaces - CLI[CLI Shell] - HTTP[A2A HTTP Server] - end - - CLI --> AgentCore - HTTP --> AgentCore - - subgraph AgentCore [Agent Core] - AgentCoreNode[FuzzForgeAgent] - AgentCoreNode --> Executor[Executor] - AgentCoreNode --> Memory[Memory Services] - AgentCoreNode --> Registry[Agent Registry] - end - - Executor --> MCP[MCP Workflow Bridge] - Executor --> Router[Capability Router] - Executor --> Files[Artifact Manager] - Executor --> Prompts[Prompt Templates] - - Router --> RemoteAgents[Registered A2A Agents] - MCP --> Temporal[FuzzForge Backend] - Memory --> SessionDB[Session Store] - Memory --> Semantic[Semantic Recall] - Memory --> Graphs[Cognee Graph] - Files --> Artifacts[Artifact Cache] - - -``` - -## Detailed Data Flow - -```mermaid -sequenceDiagram - participant User as User / Remote Agent - participant CLI as CLI / HTTP Surface - participant Exec as FuzzForgeExecutor - participant ADK as ADK Runner - participant Temporal as Temporal Backend - participant Cognee as Cognee - participant Artifact as Artifact Cache - - User->>CLI: Prompt or slash command - CLI->>Exec: Normalised request + context ID - Exec->>ADK: Tool invocation (LiteLLM) - ADK-->>Exec: Structured response / tool result - Exec->>Temporal: (optional) submit workflow via MCP - Temporal-->>Exec: Run status updates - Exec->>Cognee: (optional) knowledge query / ingestion - Cognee-->>Exec: Graph results - Exec->>Artifact: Persist generated files - Exec-->>CLI: Final response + artifact links + task events - CLI-->>User: Rendered answer -``` - -## Entry Points - -- **CLI shell** (`ai/src/fuzzforge_ai/cli.py`) provides the interactive `fuzzforge ai agent` loop. It streams user messages through the executor, wires slash commands for listing agents, sending files, and launching workflows, and keeps session IDs in sync with ADK’s session service. -- **A2A HTTP server** (`ai/src/fuzzforge_ai/a2a_server.py`) wraps the same agent in Starlette. 
It exposes RPC-compatible endpoints plus helper routes (`/artifacts/{id}`, `/graph/query`, `/project/files`) and reuses the executor’s task store so downstream agents can poll status updates. - -## Core Components - -- **FuzzForgeAgent** (`ai/src/fuzzforge_ai/agent.py`) assembles the runtime: it loads environment variables, constructs the executor, and builds an ADK `Agent` backed by `LiteLlm`. The singleton accessor `get_fuzzforge_agent()` keeps CLI and server instances aligned and shares the generated agent card. -- **FuzzForgeExecutor** (`ai/src/fuzzforge_ai/agent_executor.py`) is the brain. It registers tools, manages session storage (SQLite or in-memory via `DatabaseSessionService` / `InMemorySessionService`), and coordinates artifact storage. The executor also tracks long-running Temporal workflows inside `pending_runs`, produces `TaskStatusUpdateEvent` objects, and funnels every response through ADK’s `Runner` so traces include tool metadata. -- **Remote agent registry** (`ai/src/fuzzforge_ai/remote_agent.py`) holds metadata for downstream agents and handles capability discovery over HTTP. Auto-registration is configured by `ConfigManager` so known agents attach on startup. -- **Memory services**: - - `FuzzForgeMemoryService` and `HybridMemoryManager` (`ai/src/fuzzforge_ai/memory_service.py`) provide conversation recall and bridge to Cognee datasets when configured. - - Cognee bootstrap (`ai/src/fuzzforge_ai/cognee_service.py`) ensures ingestion and knowledge queries stay scoped to the current project. 
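The singleton accessor mentioned above can be sketched roughly as follows. Everything inside the class is a placeholder rather than the real implementation in `agent.py`; the point is only that the CLI and the A2A server resolve to one shared instance.

```python
# Rough sketch of a module-level singleton accessor in the style of
# get_fuzzforge_agent(). The class body and names are placeholders; in the
# real module this builds the executor, memory services, and agent card.
_agent_instance = None

class Agent:
    def __init__(self) -> None:
        self.card = {"protocolVersion": "0.3.0"}  # protocol version from the docs

def get_agent() -> Agent:
    global _agent_instance
    if _agent_instance is None:
        _agent_instance = Agent()
    return _agent_instance

# CLI and server surfaces share executor state because they share the instance
assert get_agent() is get_agent()
```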
- -## Workflow Automation - -The executor wraps Temporal MCP actions exposed by the backend: - -| Tool | Source | Purpose | -| --- | --- | --- | -| `list_workflows_mcp` | `ai/src/fuzzforge_ai/agent_executor.py` | Enumerate available scans | -| `submit_security_scan_mcp` | `agent_executor.py` | Launch a scan and persist run metadata | -| `get_run_status_mcp` | `agent_executor.py` | Poll Temporal for status and push task events | -| `get_comprehensive_scan_summary` | `agent_executor.py` | Collect findings and bundle artifacts | -| `get_backend_status_mcp` | `agent_executor.py` | Block submissions until Temporal reports `ready` | - -The CLI surface mirrors these helpers as natural-language prompts (`You> run fuzzforge workflow …`). ADK’s `Runner` handles retries and ensures each tool call yields structured `Event` objects for downstream instrumentation. - -## Knowledge & Ingestion - -- The `fuzzforge ingest` and `fuzzforge rag ingest` commands call into `ai/src/fuzzforge_ai/ingest_utils.py`, which filters file types, ignores caches, and populates Cognee datasets under `.fuzzforge/cognee/project_/`. -- Runtime queries hit `query_project_knowledge_api` on the executor, which defers to `cognee_service` for dataset lookup and semantic search. When Cognee credentials are absent the tools return a friendly "not configured" response. - -## Artifact Pipeline - -Artifacts generated during conversations or workflow runs are written to `.fuzzforge/artifacts/`: - -1. The executor creates a unique directory per artifact ID and writes the payload (text, JSON, or binary). -2. Metadata is stored in-memory and, when running under the A2A server, surfaced via `GET /artifacts/{id}`. -3. File uploads from `/project/files` reuse the same pipeline so remote agents see a consistent interface. - -## Task & Event Wiring - -- In CLI mode, `FuzzForgeExecutor` bootstraps shared `InMemoryTaskStore` and `InMemoryQueueManager` instances (see `agent_executor.py`). 
They allow the agent to emit `TaskStatusUpdateEvent` objects even when the standalone server is not running. -- The A2A HTTP wrapper reuses those handles, so any active workflow is visible to both the local shell and remote peers. - -Use the complementary docs for step-by-step instructions: - -- [Ingestion & Knowledge Graphs](ingestion.md) -- [LLM & Environment Configuration](configuration.md) -- [Prompt Patterns & Examples](prompts.md) -- [A2A Services](a2a-services.md) - -## Memory & Persistence - -```mermaid -graph LR - subgraph ADK Memory Layer - SessionDB[(DatabaseSessionService)] - Semantic[Semantic Recall Index] - end - - subgraph Project Knowledge - CogneeDataset[(Cognee Dataset)] - HybridManager[HybridMemoryManager] - end - - Prompts[Prompts & Tool Outputs] --> SessionDB - SessionDB --> Semantic - Ingestion[Ingestion Pipeline] --> CogneeDataset - CogneeDataset --> HybridManager - HybridManager --> Semantic - HybridManager --> Exec[Executor] - Exec --> Responses[Responses with Context] -``` - -- **Session persistence** is controlled by `SESSION_PERSISTENCE`. When set to `sqlite`, ADK’s `DatabaseSessionService` writes transcripts to the path configured by `SESSION_DB_PATH` (defaults to `./fuzzforge_sessions.db`). With `inmemory`, the context is scoped to the current process. -- **Semantic recall** stores vector embeddings so `/recall` queries can surface earlier prompts, even after restarts when using SQLite. -- **Hybrid memory manager** (`HybridMemoryManager`) stitches Cognee results into the ADK session. When a knowledge query hits Cognee, the relevant nodes are appended back into the session context so follow-up prompts can reference them naturally. -- **Cognee datasets** are unique per project. Ingestion runs populate `_codebase` while custom calls to `ingest_to_dataset` let you maintain dedicated buckets (e.g., `insights`). Data is persisted inside `.fuzzforge/cognee/project_/` and shared across CLI and A2A modes. 
-- **Task metadata** (workflow runs, artifact descriptors) lives in the executor’s in-memory caches but is also mirrored through A2A task events so remote agents can resubscribe if the CLI restarts. -- **Operational check**: Run `/recall <topic>` or `You> search project knowledge for "topic" using INSIGHTS` after ingestion to confirm both ADK session recall and Cognee graph access are active. -- **CLI quick check**: `/memory status` summarises the current memory type, session persistence, and Cognee dataset directories from inside the agent shell. diff --git a/docs/docs/ai/configuration.md b/docs/docs/ai/configuration.md deleted file mode 100644 index 2da0c11..0000000 --- a/docs/docs/ai/configuration.md +++ /dev/null @@ -1,122 +0,0 @@ -# LLM & Environment Configuration - -FuzzForge AI relies on LiteLLM adapters embedded in the Google ADK runtime, so you can swap between providers without touching code. Configuration is driven by environment variables inside `.fuzzforge/.env`. - -## Minimal Setup - -```env -LLM_PROVIDER=openai -LITELLM_MODEL=gpt-5-mini -OPENAI_API_KEY=sk-your-key -``` - -Set these values before launching `fuzzforge ai agent` or `python -m fuzzforge_ai`. - -## .env Template - -`fuzzforge init` creates `.fuzzforge/.env.template` alongside the real secrets file. Keep the template under version control so teammates can copy it to `.fuzzforge/.env` and fill in provider credentials locally. The template includes commented examples for Cognee, AgentOps, and alternative LLM providers—extend it with any project-specific overrides you expect collaborators to set.
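The template uses plain `KEY=VALUE` lines with `#` comments. A simplified sketch of how such a file maps to a settings dictionary (the real loading is handled by `ConfigManager` in `ai/src/fuzzforge_ai/config_manager.py`, not this function):

```python
# Minimal .env-style parser, for illustration only.
def parse_env(text: str) -> dict[str, str]:
    settings = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and the template's commented examples
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings

template = """
# Core LLM settings
LLM_PROVIDER=openai
LITELLM_MODEL=gpt-5-mini
OPENAI_API_KEY=sk-your-key
"""
print(parse_env(template)["LLM_PROVIDER"])  # openai
```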
- -## Provider Examples - -**OpenAI-compatible (Azure, etc.)** -```env -LLM_PROVIDER=azure_openai -LITELLM_MODEL=gpt-4o-mini -LLM_API_KEY=sk-your-azure-key -LLM_ENDPOINT=https://your-resource.openai.azure.com -``` - -**Anthropic** -```env -LLM_PROVIDER=anthropic -LITELLM_MODEL=claude-3-haiku-20240307 -ANTHROPIC_API_KEY=sk-your-key -``` - -**Ollama (local models)** -```env -LLM_PROVIDER=ollama_chat -LITELLM_MODEL=codellama:latest -OLLAMA_API_BASE=http://localhost:11434 -``` -Run `ollama pull codellama:latest` ahead of time so the adapter can stream tokens immediately. Any Ollama-hosted model works; set `LITELLM_MODEL` to match the image tag. - -**Vertex AI** -```env -LLM_PROVIDER=vertex_ai -LITELLM_MODEL=gemini-1.5-pro -GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json -``` - -## Additional LiteLLM Providers - -LiteLLM exposes dozens of adapters. Popular additions include: - -- `LLM_PROVIDER=anthropic_messages` for Claude 3.5. -- `LLM_PROVIDER=azure_openai` for Azure-hosted GPT variants. -- `LLM_PROVIDER=groq` for Groq LPU-backed models (`GROQ_API_KEY` required). -- `LLM_PROVIDER=ollama_chat` for any local Ollama model. -- `LLM_PROVIDER=vertex_ai` for Gemini. - -Refer to the [LiteLLM provider catalog](https://docs.litellm.ai/docs/providers) when mapping environment variables; each adapter lists the exact keys the ADK runtime expects. - -## Session Persistence - -``` -SESSION_PERSISTENCE=sqlite # sqlite | inmemory -MEMORY_SERVICE=inmemory # ADK memory backend -``` - -Set `SESSION_PERSISTENCE=sqlite` to preserve conversational history across restarts. For ephemeral sessions, switch to `inmemory`. - -## Knowledge Graph Settings - -To enable Cognee-backed graphs: - -```env -LLM_COGNEE_PROVIDER=openai -LLM_COGNEE_MODEL=gpt-5-mini -LLM_COGNEE_API_KEY=sk-your-key -``` - -If the Cognee variables are omitted, graph-specific tools remain available but return a friendly "not configured" response. 
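The graceful-degradation behaviour described above can be sketched as a guard around the graph tools. The variable names come from this page; the function and message wording are illustrative, not the actual FuzzForge code.

```python
import os

# Sketch of the "not configured" fallback: graph tools stay callable but
# return a friendly notice when Cognee variables are missing.
COGNEE_VARS = ("LLM_COGNEE_PROVIDER", "LLM_COGNEE_MODEL", "LLM_COGNEE_API_KEY")

def query_graph(question: str) -> str:
    missing = [name for name in COGNEE_VARS if not os.environ.get(name)]
    if missing:
        # Degrade gracefully instead of raising
        return f"Knowledge graph not configured; set {', '.join(missing)}"
    return f"searching graph for: {question}"
```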
- -## MCP / Backend Integration - -```env -FUZZFORGE_MCP_URL=http://localhost:8010/mcp -``` - -The agent uses this endpoint to list, launch, and monitor Temporal workflows. - -## Tracing & Observability - -The executor ships with optional AgentOps tracing. Provide an API key to record conversations, tool calls, and workflow updates: - -```env -AGENTOPS_API_KEY=sk-your-agentops-key -AGENTOPS_ENVIRONMENT=local # Optional tag for dashboards -``` - -Set `FUZZFORGE_DEBUG=1` to surface verbose executor logging and enable additional stdout in the CLI. For HTTP deployments, combine that with: - -```env -LOG_LEVEL=DEBUG -``` - -The ADK runtime also honours `GOOGLE_ADK_TRACE_DIR=/path/to/logs` if you want JSONL traces without an external service. - -## Debugging Flags - -```env -FUZZFORGE_DEBUG=1 # Enables verbose logging -LOG_LEVEL=DEBUG # Applies to the A2A server and CLI -``` - -These flags surface additional insight when diagnosing routing or ingestion issues. Combine them with AgentOps tracing to get full timelines of tool usage. - -## Related Code - -- Env bootstrap: `ai/src/fuzzforge_ai/config_manager.py` -- LiteLLM glue: `ai/src/fuzzforge_ai/agent.py` -- Cognee integration: `ai/src/fuzzforge_ai/cognee_service.py` diff --git a/docs/docs/ai/ingestion.md b/docs/docs/ai/ingestion.md deleted file mode 100644 index 8e7ad58..0000000 --- a/docs/docs/ai/ingestion.md +++ /dev/null @@ -1,88 +0,0 @@ -# Ingestion & Knowledge Graphs - -The AI module keeps long-running context by mirroring your repository into a Cognee-powered knowledge graph and persisting conversations in local storage. - -## CLI Commands - -```bash -# Scan the current project (skips .git/, .fuzzforge/, virtualenvs, caches) -fuzzforge ingest --path . --recursive - -# Alias - identical behaviour -fuzzforge rag ingest --path . --recursive -``` - -The command gathers files using the filters defined in `ai/src/fuzzforge_ai/ingest_utils.py`. 
By default it includes common source, configuration, and documentation file types while skipping temporary and dependency directories. - -### Customising the File Set - -Use CLI flags to override the defaults: - -```bash -fuzzforge ingest --path backend --file-types .py --file-types .yaml --exclude node_modules --exclude dist -``` - -## Command Options - -`fuzzforge ingest` exposes several flags (see `cli/src/fuzzforge_cli/commands/ingest.py`): - -- `--recursive / -r` – Traverse sub-directories. -- `--file-types / -t` – Repeatable flag to whitelist extensions (`-t .py -t .rs`). -- `--exclude / -e` – Repeatable glob patterns to skip (`-e tests/**`). -- `--dataset / -d` – Write into a named dataset instead of `_codebase`. -- `--force / -f` – Clear previous Cognee data before ingesting (the CLI prompts for confirmation unless this flag is supplied). - -All runs automatically skip `.fuzzforge/**` and `.git/**` to avoid recursive ingestion of cache folders. - -## Dataset Layout - -- Primary dataset: `_codebase` -- Additional datasets: create ad-hoc buckets such as `insights` via the `ingest_to_dataset` tool -- Storage location: `.fuzzforge/cognee/project_/` - -### Persistence Details - -- Every dataset lives under `.fuzzforge/cognee/project_/{data,system}`. These directories are safe to commit to long-lived storage (they only contain embeddings and metadata). -- Cognee assigns deterministic IDs per project; if you move the repository, copy the entire `.fuzzforge/cognee/` tree to retain graph history. -- `HybridMemoryManager` ensures answers from Cognee are written back into the ADK session store so future prompts can refer to the same nodes without repeating the query. -- All Cognee processing runs locally against the files you ingest. No external service calls are made unless you configure a remote Cognee endpoint.
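The include/exclude filtering described in the command options above can be sketched as a single predicate. The defaults and helper below are simplified stand-ins for the real logic in `ingest_utils.py`, which covers more cases.

```python
from fnmatch import fnmatch
from pathlib import PurePosixPath

# Simplified stand-in for the ingest filters (illustrative, not the real code).
ALWAYS_SKIP = (".git/*", ".fuzzforge/*")  # skipped on every run, per the docs

def should_ingest(path: str, file_types: set[str], excludes: list[str]) -> bool:
    candidate = PurePosixPath(path)
    if candidate.suffix not in file_types:
        return False  # extension not whitelisted via --file-types
    patterns = list(excludes) + list(ALWAYS_SKIP)
    # fnmatch's * also crosses path separators, so these act as prefix skips
    return not any(fnmatch(path, pattern) for pattern in patterns)

print(should_ingest("backend/app.py", {".py", ".yaml"}, ["node_modules/*"]))  # True
print(should_ingest(".fuzzforge/cache/tool.py", {".py"}, []))                 # False
```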
- -## Prompt Examples - -``` -You> refresh the project knowledge graph for ./backend -Assistant> Kicks off `fuzzforge ingest` with recursive scan - -You> search project knowledge for "temporal workflow" using INSIGHTS -Assistant> Routes to Cognee `search_project_knowledge` - -You> ingest_to_dataset("Design doc for new scanner", "insights") -Assistant> Adds the provided text block to the `insights` dataset -``` - -## Environment Template - -The CLI writes a template at `.fuzzforge/.env.template` when you initialise a project. Keep it in source control so collaborators can copy it to `.env` and fill in secrets. - -```env -# Core LLM settings -LLM_PROVIDER=openai -LITELLM_MODEL=gpt-5-mini -OPENAI_API_KEY=sk-your-key - -# FuzzForge backend (Temporal-powered) -FUZZFORGE_MCP_URL=http://localhost:8010/mcp - -# Optional: knowledge graph provider -LLM_COGNEE_PROVIDER=openai -LLM_COGNEE_MODEL=gpt-5-mini -LLM_COGNEE_API_KEY=sk-your-key -``` - -Add comments or project-specific overrides as needed; the agent reads these variables on startup. - -## Tips - -- Re-run ingestion after significant code changes to keep the knowledge graph fresh. -- Large binary assets are skipped automatically—store summaries or documentation if you need them searchable. -- Set `FUZZFORGE_DEBUG=1` to surface verbose ingest logs during troubleshooting. diff --git a/docs/docs/ai/intro.md b/docs/docs/ai/intro.md deleted file mode 100644 index 491e200..0000000 --- a/docs/docs/ai/intro.md +++ /dev/null @@ -1,114 +0,0 @@ ---- -sidebar_position: 1 ---- - -# FuzzForge AI Module - -FuzzForge AI is the multi-agent layer that lets you operate the FuzzForge security platform through natural language. It orchestrates local tooling, registered Agent-to-Agent (A2A) peers, and the Temporal-powered backend while keeping long-running context in memory and project knowledge graphs. - -## Quick Start - -1. **Initialise a project** - ```bash - cd /path/to/project - fuzzforge init - ``` -2. 
**Review environment settings** – copy `.fuzzforge/.env.template` to `.fuzzforge/.env`, then edit the values to match your provider. The template ships with commented defaults for OpenAI-style usage and placeholders for Cognee keys. - ```env - LLM_PROVIDER=openai - LITELLM_MODEL=gpt-5-mini - OPENAI_API_KEY=sk-your-key - FUZZFORGE_MCP_URL=http://localhost:8010/mcp - SESSION_PERSISTENCE=sqlite - ``` - Optional flags you may want to enable early: - ```env - MEMORY_SERVICE=inmemory - AGENTOPS_API_KEY=sk-your-agentops-key # Enable hosted tracing - LOG_LEVEL=INFO # CLI / server log level - ``` -3. **Populate the knowledge graph** - ```bash - fuzzforge ingest --path . --recursive - # alias: fuzzforge rag ingest --path . --recursive - ``` -4. **Launch the agent shell** - ```bash - fuzzforge ai agent - ``` - Keep the backend running (Temporal API at `FUZZFORGE_MCP_URL`) so workflow commands succeed. - -## Everyday Workflow - -- Run `fuzzforge ai agent` and start with `list available fuzzforge workflows` or `/memory status` to confirm everything is wired. -- Use natural prompts for automation (`run fuzzforge workflow …`, `search project knowledge for …`) and fall back to slash commands for precision (`/recall`, `/sendfile`). -- Keep `/memory datasets` handy to see which Cognee datasets are available after each ingest. -- Start the HTTP surface with `python -m fuzzforge_ai` when external agents need access to artifacts or graph queries. The CLI stays usable at the same time. -- Refresh the knowledge graph regularly: `fuzzforge ingest --path . --recursive --force` keeps responses aligned with recent code changes. - -## What the Agent Can Do - -- **Route requests** – automatically selects the right local tool or remote agent using the A2A capability registry. -- **Run security workflows** – list, submit, and monitor FuzzForge workflows via MCP wrappers. -- **Manage artifacts** – create downloadable files for reports, code edits, and shared attachments. 
-- **Maintain context** – stores session history, semantic recall, and Cognee project graphs. -- **Serve over HTTP** – expose the same agent as an A2A server using `python -m fuzzforge_ai`. - -## Essential Commands - -Inside `fuzzforge ai agent` you can mix slash commands and free-form prompts: - -```text -/list # Show registered A2A agents -/register http://<host>:10201 # Add a remote agent -/artifacts # List generated files -/sendfile SecurityAgent src/report.md "Please review" -You> route_to SecurityAnalyzer: scan ./backend for secrets -You> run fuzzforge workflow static_analysis_scan on ./test_projects/demo -You> search project knowledge for "temporal status" using INSIGHTS -``` - -Artifacts created during the conversation are served from `.fuzzforge/artifacts/` and exposed through the A2A HTTP API. - -## Memory & Knowledge - -The module layers three storage systems: - -- **Session persistence** (SQLite or in-memory) for chat transcripts. -- **Semantic recall** via the ADK memory service for fuzzy search. -- **Cognee graphs** for project-wide knowledge built from ingestion runs. - -Re-run ingestion after major code changes to keep graph answers relevant. If Cognee variables are not set, graph-specific tools automatically respond with a polite "not configured" message. - -## Sample Prompts - -Use these to validate the setup once the agent shell is running: - -- `list available fuzzforge workflows` -- `run fuzzforge workflow static_analysis_scan on ./backend with target_branch=main` -- `show findings for that run once it finishes` -- `refresh the project knowledge graph for ./backend` -- `search project knowledge for "temporal readiness" using INSIGHTS` -- `/recall terraform secrets` -- `/memory status` -- `ROUTE_TO SecurityAnalyzer: audit infrastructure_vulnerable` - -## Need More Detail? - -Dive into the dedicated guides in this category: - -- [Architecture](./architecture.md) – High-level architecture with diagrams and component breakdowns.
-- [Ingestion](./ingestion.md) – Command options, Cognee persistence, and prompt examples. -- [Configuration](./configuration.md) – LLM provider matrix, local model setup, and tracing options. -- [Prompts](./prompts.md) – Slash commands, workflow prompts, and routing tips. -- [A2A Services](./a2a-services.md) – HTTP endpoints, agent card, and collaboration flow. -- [Memory Persistence](./architecture.md#memory--persistence) – Deep dive on memory storage, datasets, and how `/memory status` inspects them. - -## Development Notes - -- Entry point for the CLI: `ai/src/fuzzforge_ai/cli.py` -- A2A HTTP server: `ai/src/fuzzforge_ai/a2a_server.py` -- Tool routing & workflow glue: `ai/src/fuzzforge_ai/agent_executor.py` -- Ingestion helpers: `ai/src/fuzzforge_ai/ingest_utils.py` - -Install the module in editable mode (`pip install -e ai`) while iterating so CLI changes are picked up immediately. diff --git a/docs/docs/ai/prompts.md b/docs/docs/ai/prompts.md deleted file mode 100644 index 5669cad..0000000 --- a/docs/docs/ai/prompts.md +++ /dev/null @@ -1,60 +0,0 @@ -# Prompt Patterns & Examples - -Use the `fuzzforge ai agent` shell to mix structured slash commands with natural requests. The Google ADK runtime keeps conversation context, so follow-ups automatically reuse earlier answers, retrieved files, and workflow IDs. 
- -## Slash Commands - -| Command | Purpose | Example | -| --- | --- | --- | -| `/list` | Show registered A2A agents | `/list` | -| `/register <url>` | Register a remote agent card | `/register http://localhost:10201` | -| `/artifacts` | List generated artifacts with download links | `/artifacts` | -| `/sendfile <agent> <file> [note]` | Ship a file as an artifact to a remote peer | `/sendfile SecurityAnalyzer reports/latest.md "Please review"` | -| `/memory status` | Summarise conversational memory, session store, and Cognee directories | `/memory status` | -| `/memory datasets` | List available Cognee datasets | `/memory datasets` | -| `/recall <query>` | Search prior conversation context using semantic vectors | `/recall dependency updates` | - -## Workflow Automation - -``` -You> list available fuzzforge workflows -Assistant> [returns workflow names, descriptions, and required parameters] - -You> run fuzzforge workflow security_assessment on ./backend -Assistant> Submits the run, emits TaskStatusUpdateEvent entries, and links the SARIF artifact when complete. - -You> show findings for that run once it finishes -Assistant> Streams the `get_comprehensive_scan_summary` output and attaches the artifact URI. -``` - -## Knowledge Graph & Memory Prompts - -``` -You> refresh the project knowledge graph for ./backend -Assistant> Launches `fuzzforge ingest --path ./backend --recursive` and reports file counts. - -You> search project knowledge for "temporal readiness" using INSIGHTS -Assistant> Routes to Cognee via `query_project_knowledge_api` and returns the top matches. - -You> recall "api key rotation" -Assistant> Uses the ADK semantic memory service to surface earlier chat snippets. -``` - -## Routing to Specialist Agents - -``` -You> ROUTE_TO SecurityAnalyzer: audit this Terraform module for secrets -Assistant> Delegates the request to `SecurityAnalyzer` using the A2A capability map.
- -You> sendfile DocumentationAgent docs/runbook.md "Incorporate latest workflow" -Assistant> Uploads the file as an artifact and notifies the remote agent. -``` - -## Prompt Tips - -- Use explicit verbs (`list`, `run`, `search`) to trigger the Temporal workflow helpers. -- Include parameter names inline (`with target_branch=main`) so the executor maps values to MCP tool inputs without additional clarification. -- When referencing prior runs, reuse the assistant’s run IDs or ask for "the last run"—the session store tracks them per context ID. -- If Cognee is not configured, graph queries return a friendly notice; set `LLM_COGNEE_*` variables to enable full answers. -- Combine slash commands and natural prompts in the same session; the ADK session service keeps them in a single context thread. -- `/memory search <query>` is a shortcut for `/recall <query>` if you want status plus recall in one place. diff --git a/docs/docs/concept/_category_.json b/docs/docs/concept/_category_.json deleted file mode 100644 index 102bb11..0000000 --- a/docs/docs/concept/_category_.json +++ /dev/null @@ -1,8 +0,0 @@ -{ - "label": "Concept", - "position": 2, - "link": { - "type": "generated-index", - "description": "Concept pages that are understanding-oriented." - } -} diff --git a/docs/docs/concept/architecture.md b/docs/docs/concept/architecture.md deleted file mode 100644 index fcd922f..0000000 --- a/docs/docs/concept/architecture.md +++ /dev/null @@ -1,226 +0,0 @@ -# Architecture - -FuzzForge is a distributed, containerized platform for security analysis workflows. Its architecture is designed for scalability, isolation, and reliability, drawing on modern patterns like microservices and orchestration. This page explains the core architectural concepts behind FuzzForge: what the main components are, how they interact, and why the system is structured this way. - -:::warning - -FuzzForge’s architecture is evolving.
While the long-term goal is a hexagonal architecture, the current implementation is still in transition. Expect changes as the platform matures. - -::: - ---- - -## Why This Architecture? - -FuzzForge’s architecture is shaped by several key goals: - -- **Scalability:** Handle many workflows in parallel, scaling up or down as needed. -- **Isolation:** Run each workflow in its own secure environment, minimizing risk. -- **Reliability:** Ensure that failures in one part of the system don’t bring down the whole platform. -- **Extensibility:** Make it easy to add new workflows, tools, or integrations. - -## High-Level System Overview - -At a glance, FuzzForge is organized into several layers, each with a clear responsibility: - -- **Client Layer:** Where users and external systems interact (CLI, API clients, MCP server). -- **API Layer:** The FastAPI backend, which exposes REST endpoints and manages requests. -- **Orchestration Layer:** Temporal server and vertical workers, which schedule and execute workflows. -- **Execution Layer:** Long-lived vertical worker containers with pre-installed toolchains, where workflows run. -- **Storage Layer:** PostgreSQL database, MinIO (S3-compatible storage), and worker cache for persistence. 
- -Here’s a simplified view of how these layers fit together: - -```mermaid -graph TB - subgraph "Client Layer" - CLI[CLI Client] - API_Client[API Client] - MCP[MCP Server] - end - - subgraph "API Layer" - FastAPI[FastAPI Backend] - Router[Route Handlers] - Middleware[Middleware Stack] - end - - subgraph "Orchestration Layer" - Temporal[Temporal Server] - Workers[Vertical Workers] - Scheduler[Workflow Scheduler] - end - - subgraph "Execution Layer" - VerticalWorkers[Vertical Worker Containers] - Tools[Pre-installed Toolchains] - WorkerCache[Worker Cache /cache] - end - - subgraph "Storage Layer" - PostgreSQL[PostgreSQL Database] - MinIO[MinIO S3 Storage] - Cache[Result Cache] - end - - CLI --> FastAPI - API_Client --> FastAPI - MCP --> FastAPI - - FastAPI --> Router - Router --> Middleware - Middleware --> Temporal - - Temporal --> Workers - Workers --> Scheduler - Scheduler --> VerticalWorkers - - VerticalWorkers --> Tools - VerticalWorkers --> WorkerCache - VerticalWorkers --> MinIO - - FastAPI --> PostgreSQL - Workers --> PostgreSQL - FastAPI --> MinIO -``` - -## What Are the Main Components? - -### API Layer - -- **FastAPI Backend:** The main entry point for users and clients. Handles authentication, request validation, and exposes endpoints for workflow management, results, and health checks. -- **Middleware Stack:** Manages API keys, user authentication, CORS, logging, and error handling. - -### Orchestration Layer - -- **Temporal Server:** Schedules and tracks workflows, backed by PostgreSQL. -- **Vertical Workers:** Long-lived workers pre-built with domain-specific toolchains (Android, Rust, Web, etc.). Can be scaled horizontally. -- **Task Queues:** Route workflows to appropriate vertical workers based on workflow metadata. - -### Execution Layer - -- **Vertical Workers:** Long-lived processes with pre-installed security tools for specific domains. -- **MinIO Storage:** S3-compatible storage for uploaded targets and results. 
-- **Worker Cache:** Local cache for downloaded targets, with LRU eviction. - -### Storage Layer - -- **PostgreSQL Database:** Stores Temporal workflow state and metadata. -- **MinIO (S3):** Persistent storage for uploaded targets and workflow results. -- **Worker Cache:** Local filesystem cache for downloaded targets with workspace isolation: - - **Isolated mode**: Each run gets `/cache/{target_id}/{run_id}/workspace/` - - **Shared mode**: All runs share `/cache/{target_id}/workspace/` - - **Copy-on-write mode**: Download once, copy per run - - **LRU eviction** when cache exceeds configured size - -## How Does Data Flow Through the System? - -### Submitting a Workflow - -1. **User submits a workflow** via CLI or API client (with optional file upload). -2. **If file provided, API uploads** to MinIO and gets a `target_id`. -3. **API validates** the request and submits to Temporal. -4. **Temporal routes** the workflow to the appropriate vertical worker queue. -5. **Worker downloads target** from MinIO to local cache (if needed). -6. **Worker executes workflow** with pre-installed tools. -7. **Results are stored** in MinIO and metadata in PostgreSQL. -8. **Status updates** flow back through Temporal and the API to the user. - -```mermaid -sequenceDiagram - participant User - participant API - participant MinIO - participant Temporal - participant Worker - participant Cache - - User->>API: Submit workflow + file - API->>API: Validate parameters - API->>MinIO: Upload target file - MinIO-->>API: Return target_id - API->>Temporal: Submit workflow(target_id) - Temporal->>Worker: Route to vertical queue - Worker->>MinIO: Download target - MinIO-->>Worker: Stream file - Worker->>Cache: Store in local cache - Worker->>Worker: Execute security tools - Worker->>MinIO: Upload SARIF results - Worker->>Temporal: Update status - Temporal->>API: Workflow complete - API->>User: Return results -``` - -### Retrieving Results - -1. **User requests status or results** via the API. -2. 
**API queries the database** for workflow metadata. -3. **If complete,** results are fetched from storage and returned to the user. - -## How Do Services Communicate? - -- **Internally:** FastAPI talks to Temporal via gRPC; Temporal coordinates with workers over gRPC; workers access MinIO via S3 API. All core services use pooled connections to PostgreSQL. -- **Externally:** Users interact via CLI or API clients (HTTP REST). - -## How Is Security Enforced? - -- **Worker Isolation:** Each workflow runs in isolated vertical workers with pre-defined toolchains. -- **Storage Security:** Uploaded files stored in MinIO with lifecycle policies; read-only access by default. -- **API Security:** All endpoints validate inputs, enforce rate limits, and log requests for auditing. -- **No Host Access:** Workers access targets via MinIO, not host filesystem. - -## How Does FuzzForge Scale? - -- **Horizontally:** Add more vertical workers to handle more workflows in parallel. Scale specific worker types based on demand. -- **Vertically:** Adjust CPU and memory limits for workers and adjust concurrent activity limits. - -Example Docker Compose scaling: -```yaml -services: - worker-rust: - deploy: - replicas: 3 # Scale rust workers - resources: - limits: - memory: 4G - cpus: '2.0' - reservations: - memory: 1G - cpus: '0.5' -``` - -## How Is It Deployed? - -- **Development:** All services run via Docker Compose—backend, Temporal, vertical workers, database, and MinIO. -- **Production:** Add load balancers, Temporal clustering, database replication, and multiple worker instances for high availability. Health checks, metrics, and centralized logging support monitoring and troubleshooting. - -## How Is Configuration Managed? - -- **Environment Variables:** Control core settings like database URLs, MinIO endpoints, and Temporal addresses. -- **Service Discovery:** Docker Compose's internal DNS lets services find each other by name, with consistent port mapping and health check endpoints. 
- -Example configuration: -```bash -DATABASE_URL=postgresql://postgres:postgres@postgres:5432/fuzzforge -TEMPORAL_ADDRESS=temporal:7233 -S3_ENDPOINT=http://minio:9000 -S3_ACCESS_KEY=fuzzforge -S3_SECRET_KEY=fuzzforge123 -``` - -## How Are Failures Handled? - -- **Failure Isolation:** Each service is independent; failures don’t cascade. Circuit breakers and graceful degradation keep the system stable. -- **Recovery:** Automatic retries with backoff for transient errors, dead letter queues for persistent failures, and workflow state recovery after restarts. - -## Implementation Details - -- **Tech Stack:** FastAPI (Python async), Temporal, MinIO, Docker, Docker Compose, PostgreSQL (asyncpg), and boto3 (S3 client). -- **Performance:** Workflows start immediately (workers are long-lived); results are retrieved quickly thanks to MinIO caching and database indexing. -- **Extensibility:** Add new workflows by mounting code; add new vertical workers with specialized toolchains; extend the API with new endpoints. - ---- - -## In Summary - -FuzzForge’s architecture is designed to be robust, scalable, and secure—ready to handle demanding security analysis workflows in a modern, distributed environment. As the platform evolves, expect even more modularity and flexibility, making it easier to adapt to new requirements and technologies. 
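The automatic retries with backoff mentioned under failure handling can be sketched generically. This is an illustrative pattern only, not FuzzForge's actual implementation (in FuzzForge, retries are delegated to Temporal's retry policies); `TransientError` and the parameter names are made up for the example:

```python
import random
import time

class TransientError(Exception):
    """An error worth retrying (e.g., a dropped connection)."""

def retry_with_backoff(op, max_attempts=5, base_delay=0.5, max_delay=30.0):
    """Run op(), retrying transient failures with exponential backoff and jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return op()
        except TransientError:
            if attempt == max_attempts:
                raise  # persistent failure: a real system would route this to a dead letter queue
            delay = min(max_delay, base_delay * 2 ** (attempt - 1))
            time.sleep(delay * random.uniform(0.5, 1.0))  # jitter avoids thundering herds
```

The jitter factor matters in a distributed setup: without it, many workers retrying the same failed dependency would hammer it in lockstep.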
diff --git a/docs/docs/concept/concept.tmpl b/docs/docs/concept/concept.tmpl deleted file mode 100644 index b6a7f86..0000000 --- a/docs/docs/concept/concept.tmpl +++ /dev/null @@ -1,20 +0,0 @@ -# {Concept Title} - -{Brief introduction of the concept, including its origin and general purpose.} - -## Purpose - -- {The primary purpose and its relevance in its field.} - -## Common Usage - -- {Usage 1}: {Brief description.} -- {Usage 2}: {Brief description.} - -## Benefits - -- {Key benefit and why it's preferred in certain scenarios.} - -## Conclusion - -{Summary of its importance and role in its respective field.} diff --git a/docs/docs/concept/docker-containers.md b/docs/docs/concept/docker-containers.md deleted file mode 100644 index e5a8bc6..0000000 --- a/docs/docs/concept/docker-containers.md +++ /dev/null @@ -1,229 +0,0 @@ -# Docker Containers in FuzzForge: Concept and Design - -Docker containers are at the heart of FuzzForge’s execution model. They provide the isolation, consistency, and flexibility needed to run security workflows reliably—no matter where FuzzForge is deployed. This page explains the core concepts behind container usage in FuzzForge, why containers are used, and how they shape the platform’s behavior. - ---- - -## Why Use Docker Containers? - -FuzzForge relies on Docker containers for several key reasons: - -- **Isolation:** Each workflow runs in its own container, so tools and processes can’t interfere with each other or the host. -- **Consistency:** The environment inside a container is always the same, regardless of the underlying system. -- **Security:** Containers restrict access to host resources and run as non-root users. -- **Reproducibility:** Results are deterministic, since the environment is controlled and versioned. -- **Scalability:** Containers can be started, stopped, and scaled up or down as needed. - ---- - -## How Does FuzzForge Use Containers? 
- -### The Container Model - -Every workflow in FuzzForge is executed inside a Docker container. Here’s what that means in practice: - -- **Vertical worker containers** are built from language-specific base images with domain-specific security toolchains pre-installed (Android, Rust, Web, etc.). -- **Infrastructure containers** (API server, Temporal, MinIO, database) use official images and are configured for the platform's needs. - -### Worker Lifecycle: From Build to Long-Running - -The lifecycle of a vertical worker looks like this: - -1. **Image Build:** A Docker image is built with all required toolchains for the vertical. -2. **Worker Start:** The worker container starts as a long-lived process. -3. **Workflow Discovery:** Worker scans mounted `/app/toolbox` for workflows matching its vertical. -4. **Registration:** Workflows are registered with Temporal on the worker's task queue. -5. **Execution:** When a workflow is submitted, the worker downloads the target from MinIO and executes. -6. **Continuous Running:** Worker remains running, ready for the next workflow. - -```mermaid -graph TB - Build[Build Worker Image] --> Start[Start Worker Container] - Start --> Mount[Mount Toolbox Volume] - Mount --> Discover[Discover Workflows] - Discover --> Register[Register with Temporal] - Register --> Ready[Worker Ready] - Ready --> Workflow[Workflow Submitted] - Workflow --> Download[Download Target from MinIO] - Download --> Execute[Execute Workflow] - Execute --> Upload[Upload Results to MinIO] - Upload --> Ready -``` - ---- - -## What's Inside a Vertical Worker Container? - -A typical vertical worker container is structured like this: - -- **Base Image:** Language-specific image (e.g., `python:3.11-slim`). -- **System Dependencies:** Installed as needed (e.g., `git`, `curl`). -- **Domain-Specific Toolchains:** Pre-installed (e.g., Rust: `AFL++`, `cargo-fuzz`; Android: `apktool`, `Frida`). -- **Temporal Python SDK:** For workflow execution. 
-- **Boto3:** For MinIO/S3 access. -- **Worker Script:** Discovers and registers workflows. -- **Non-root User:** Created for execution. -- **Entrypoint:** Runs the worker discovery and registration loop. - -Example Dockerfile snippet for Rust worker: - -```dockerfile -FROM python:3.11-slim -RUN apt-get update && apt-get install -y git curl build-essential && rm -rf /var/lib/apt/lists/* -# Install AFL++, cargo, etc. -RUN pip install temporalio boto3 pydantic -COPY worker.py /app/ -WORKDIR /app -RUN useradd -m -u 1000 fuzzforge -USER fuzzforge -# Toolbox will be mounted as volume at /app/toolbox -CMD ["python", "worker.py"] -``` - ---- - -## How Are Containers Networked and Connected? - -- **Docker Compose Network:** All containers are attached to a custom bridge network for internal communication. -- **Internal DNS:** Services communicate using Docker Compose service names. -- **Port Exposure:** Only necessary ports are exposed to the host. -- **Network Isolation:** Workflow containers are isolated from infrastructure containers when possible. - -Example network config: - -```yaml -networks: - fuzzforge: - driver: bridge - ipam: - config: - - subnet: 172.20.0.0/16 -``` - ---- - -## How Is Data Managed with Volumes? - -### Volume Types - -- **Toolbox Volume:** Mounts the workflow code directory, read-only, for dynamic discovery. -- **Worker Cache:** Local cache for downloaded MinIO targets, with LRU eviction. -- **MinIO Data:** Persistent storage for uploaded targets and results (S3-compatible). - -Example volume mount: - -```yaml -volumes: - - "./toolbox:/app/toolbox:ro" # Workflow code - - "worker_cache:/cache" # Local cache - - "minio_data:/data" # MinIO storage -``` - -### Volume Security - -- **Read-only Toolbox:** Workflows cannot modify the mounted toolbox code. -- **Isolated Storage:** Each workflow's target is stored with a unique `target_id` in MinIO. -- **No Host Filesystem Access:** Workers access targets via MinIO, not host paths. 
-- **Automatic Cleanup:** MinIO lifecycle policies delete old targets after 7 days. - ---- - -## How Are Worker Images Built and Managed? - -- **Automated Builds:** Vertical worker images are built with specialized toolchains. -- **Build Optimization:** Use layer caching, multi-stage builds, and minimal base images. -- **Versioning:** Use tags (`latest`, semantic versions) to track worker images. -- **Long-Lived:** Workers run continuously, not ephemeral per-workflow. - -Example build: - -```bash -cd workers/rust -docker build -t fuzzforge-worker-rust:latest . -# Or via docker-compose -docker-compose -f docker-compose.yml build worker-rust -``` - ---- - -## How Are Resources Controlled? - -- **Memory and CPU Limits:** Set per container to prevent resource exhaustion. -- **Resource Monitoring:** Use `docker stats` and platform APIs to track usage. -- **Alerts:** Detect and handle out-of-memory or CPU throttling events. - -Example resource config: - -```yaml -services: - worker-rust: - deploy: - resources: - limits: - memory: 4G - cpus: '2.0' - reservations: - memory: 1G - cpus: '0.5' - environment: - MAX_CONCURRENT_ACTIVITIES: 5 -``` - ---- - -## How Is Security Enforced? - -- **Non-root Execution:** Containers run as a dedicated, non-root user. -- **Capability Restrictions:** Drop unnecessary Linux capabilities. -- **Filesystem Protection:** Use read-only filesystems and tmpfs for temporary data. -- **Network Isolation:** Restrict network access to only what’s needed. -- **No Privileged Mode:** Containers never run with elevated privileges. - -Example security options: - -```yaml -services: - worker-rust: - security_opt: - - no-new-privileges:true - cap_drop: - - ALL - cap_add: - - CHOWN - - SETGID - - SETUID -``` - ---- - -## How Is Performance Optimized? - -- **Image Layering:** Structure Dockerfiles for efficient caching. -- **Pre-installed Toolchains:** All tools installed in worker image, zero setup time per workflow. 
-- **Long-Lived Workers:** Eliminate container startup overhead entirely. -- **Local Caching:** MinIO targets cached locally for repeated workflows. -- **Horizontal Scaling:** Scale worker containers to handle more workflows in parallel. - ---- - -## How Are Containers Monitored and Debugged? - -- **Health Checks:** Each service and workflow container has a health endpoint or check. -- **Logging:** All container logs are collected and can be accessed via `docker logs` or the FuzzForge API. -- **Debug Access:** Use `docker exec` to access running containers for troubleshooting. -- **Resource Stats:** Monitor with `docker stats` or platform dashboards. - ---- - -## How Does This All Fit Into FuzzForge? - -- **Temporal Workers:** Long-lived vertical workers execute workflows with pre-installed toolchains. -- **API Integration:** Exposes workflow status, logs, and resource metrics via Temporal. -- **MinIO Storage:** Ensures targets and results are stored, cached, and cleaned up automatically. -- **Security and Resource Controls:** Enforced automatically for every worker and workflow. - ---- - -## In Summary - -Docker containers are the foundation of FuzzForge’s execution model. They provide the isolation, security, and reproducibility needed for robust security analysis workflows—while making it easy to scale, monitor, and extend the platform. diff --git a/docs/docs/concept/fuzzforge-ai.md b/docs/docs/concept/fuzzforge-ai.md deleted file mode 100644 index 5ea3127..0000000 --- a/docs/docs/concept/fuzzforge-ai.md +++ /dev/null @@ -1,83 +0,0 @@ -# FuzzForge AI: Conceptual Overview - -Welcome to FuzzForge AI—a multi-agent orchestration platform designed to supercharge your intelligent automation, security workflows, and project knowledge management. This document provides a high-level conceptual introduction to what FuzzForge AI is, what problems it solves, and how its architecture enables powerful, context-aware agent collaboration. - ---- - -## What is FuzzForge AI? 
- -FuzzForge AI is a multi-agent orchestration system that implements the A2A (Agent-to-Agent) protocol for intelligent agent routing, persistent memory management, and project-scoped knowledge graphs. Think of it as an intelligent hub that coordinates a team of specialized agents, each with their own skills, while maintaining context and knowledge across sessions and projects. - -**Key Goals:** -- Seamlessly route requests to the right agent for the job -- Preserve and leverage project-specific knowledge -- Enable secure, auditable, and extensible automation workflows -- Make multi-agent collaboration as easy as talking to a single assistant - ---- - -## Core Concepts - -### 1. **Agent Orchestration** -FuzzForge AI acts as a conductor, automatically routing your requests to the most capable registered agent. Agents can be local or remote, and each advertises its skills and capabilities via the A2A protocol. - -### 2. **Memory & Knowledge Management** -The system features a three-layer memory architecture: -- **Session Persistence:** Keeps track of ongoing sessions and conversations. -- **Semantic Memory:** Archives conversations and enables semantic search. -- **Knowledge Graphs:** Maintains structured, project-scoped knowledge for deep context. - -### 3. **Artifact System** -Artifacts are files or structured content generated, processed, or shared by agents. The artifact system supports creation, storage, and secure sharing of code, configs, reports, and more—enabling reproducible, auditable workflows. - -### 4. **A2A Protocol Compliance** -FuzzForge AI fully implements the A2A (Agent-to-Agent) protocol (spec 0.3.0), ensuring standardized, interoperable communication between agents—whether they're running locally or across the network. 
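The three memory layers above can be pictured with a toy sketch. Every name here is hypothetical and chosen for illustration; it is not FuzzForge's actual API, and the keyword lookup is a naive stand-in for real semantic search:

```python
# Hypothetical sketch of the three memory layers working together.
class MemoryManager:
    def __init__(self):
        self.sessions = {}  # layer 1: session persistence
        self.archive = []   # layer 2: semantic memory (archived messages)
        self.graph = {}     # layer 3: project-scoped knowledge graph edges

    def record(self, session_id, message):
        """Track the ongoing session and archive the message for later recall."""
        self.sessions.setdefault(session_id, []).append(message)
        self.archive.append(message)

    def add_fact(self, subject, relation, obj):
        """Store a structured fact as a knowledge-graph edge."""
        self.graph.setdefault(subject, []).append((relation, obj))

    def recall(self, keyword):
        """Naive stand-in for semantic search over archived conversations."""
        return [m for m in self.archive if keyword.lower() in m.lower()]

mem = MemoryManager()
mem.record("s1", "Fuzzing target X crashed at offset 0x40")
mem.add_fact("target X", "has_vulnerability", "heap overflow")
print(mem.recall("crashed"))   # ['Fuzzing target X crashed at offset 0x40']
print(mem.graph["target X"])   # [('has_vulnerability', 'heap overflow')]
```

The point of the layering is that each store answers a different question: "what is this session doing right now", "what have we discussed before", and "what do we know about this project".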
- --- - -## High-Level Architecture - -Here's how the main components fit together: - -``` -FuzzForge AI System -ā”œā”€ā”€ CLI Interface (cli.py) -│ ā”œā”€ā”€ Commands & Session Management -│ └── Agent Registry Persistence -ā”œā”€ā”€ Agent Core (agent.py) -│ ā”œā”€ā”€ Main Coordinator -│ └── Memory Manager Integration -ā”œā”€ā”€ Agent Executor (agent_executor.py) -│ ā”œā”€ā”€ Tool Management & Orchestration -│ ā”œā”€ā”€ ROUTE_TO Pattern Implementation -│ └── Artifact Creation & Management -ā”œā”€ā”€ Memory Architecture (Three Layers) -│ ā”œā”€ā”€ Session Persistence -│ ā”œā”€ā”€ Semantic Memory -│ └── Knowledge Graphs -ā”œā”€ā”€ A2A Communication Layer -│ ā”œā”€ā”€ Remote Agent Connection -│ ā”œā”€ā”€ Agent Card Management -│ └── Protocol Compliance -└── A2A Server (a2a_server.py) - ā”œā”€ā”€ HTTP/SSE Server - ā”œā”€ā”€ Artifact HTTP Serving - └── Task Store & Queue Management -``` - -**How it works:** -1. **User Input:** You interact via CLI or API, using natural language or commands. -2. **Agent Routing:** The system decides whether to handle the request itself or route it to a specialist agent. -3. **Tool Execution:** Built-in and agent-provided tools perform operations. -4. **Memory Integration:** Results and context are stored for future use. -5. **Response Generation:** The system returns results, often with artifacts or actionable insights. - ---- - -## Why FuzzForge AI? - -- **Extensible:** Easily add new agents, tools, and workflows. -- **Context-Aware:** Remembers project history, conversations, and knowledge. -- **Secure:** Project isolation, input validation, and artifact management. -- **Collaborative:** Enables multi-agent workflows and knowledge sharing. -- **Fun & Productive:** Designed to make automation and security tasks less tedious and more interactive.
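The agent-routing step in the flow above boils down to a matching problem: pick the registered agent whose advertised skills best cover the request. A toy version of that idea follows; the agent names and skill sets are invented for illustration, and real routing in FuzzForge is model-driven and A2A-based rather than a fixed lookup:

```python
# Toy skill-based router; agents and skills are made up for illustration.
AGENTS = {
    "orchestrator": {"general"},
    "fuzzing-agent": {"fuzzing", "crash-triage"},
    "recon-agent": {"scanning", "enumeration"},
}

def route(request_skills):
    """Return the agent covering the most requested skills, else the orchestrator."""
    best, overlap = "orchestrator", 0
    for name, skills in AGENTS.items():
        score = len(skills & request_skills)
        if score > overlap:
            best, overlap = name, score
    return best

print(route({"fuzzing"}))          # fuzzing-agent
print(route({"poetry-writing"}))   # orchestrator (no specialist matches)
```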
diff --git a/docs/docs/concept/resource-management.md b/docs/docs/concept/resource-management.md deleted file mode 100644 index e189fa7..0000000 --- a/docs/docs/concept/resource-management.md +++ /dev/null @@ -1,594 +0,0 @@ -# Resource Management in FuzzForge - -FuzzForge uses a multi-layered approach to manage CPU, memory, and concurrency for workflow execution. This ensures stable operation, prevents resource exhaustion, and allows predictable performance. - ---- - -## Overview - -Resource limiting in FuzzForge operates at three levels: - -1. **Docker Container Limits** (Primary Enforcement) - Hard limits enforced by Docker -2. **Worker Concurrency Limits** - Controls parallel workflow execution -3. **Workflow Metadata** (Advisory) - Documents resource requirements - ---- - -## Worker Lifecycle Management (On-Demand Startup) - -**New in v0.7.0**: Workers now support on-demand startup/shutdown for optimal resource usage. - -### Architecture - -Workers are **pre-built** but **not auto-started**: - -``` -┌─────────────┐ -│ docker- │ Pre-built worker images -│ compose │ with profiles: ["workers", "ossfuzz"] -│ build │ restart: "no" -└─────────────┘ - ↓ -┌─────────────┐ -│ Workers │ Status: Exited (not running) -│ Pre-built │ RAM Usage: 0 MB -└─────────────┘ - ↓ -┌─────────────┐ -│ ff workflow │ CLI detects required worker -│ run │ via /workflows/{name}/worker-info API -└─────────────┘ - ↓ -┌─────────────┐ -│ docker │ docker start fuzzforge-worker-ossfuzz -│ start │ Wait for healthy status -└─────────────┘ - ↓ -┌─────────────┐ -│ Worker │ Status: Up -│ Running │ RAM Usage: ~1-2 GB -└─────────────┘ -``` - -### Resource Savings - -| State | Services Running | RAM Usage |
-|-------|-----------------|-----------| -| **Idle** (no workflows) | Temporal, PostgreSQL, MinIO, Backend | ~1.2 GB | -| **Active** (1 workflow) | Core + 1 worker | ~3-5 GB | -| **Legacy** (all workers) | Core + all 5 workers | ~8 GB | - -**Savings: ~6-7GB RAM when idle** ✨ - -### Configuration - -Control via `.fuzzforge/config.yaml`: - -```yaml -workers: - auto_start_workers: true # Auto-start when needed - auto_stop_workers: false # Auto-stop after completion - worker_startup_timeout: 60 # Startup timeout (seconds) - docker_compose_file: null # Custom compose file path -``` - -Or via CLI flags: - -```bash -# Auto-start disabled -ff workflow run ossfuzz_campaign . --no-auto-start - -# Auto-stop enabled -ff workflow run ossfuzz_campaign . --wait --auto-stop -``` - -### Backend API - -New endpoint: `GET /workflows/{workflow_name}/worker-info` - -**Response**: -```json -{ - "workflow": "ossfuzz_campaign", - "vertical": "ossfuzz", - "worker_container": "fuzzforge-worker-ossfuzz", - "task_queue": "ossfuzz-queue", - "required": true -} -``` - -### SDK Integration - -```python -from fuzzforge_sdk import FuzzForgeClient - -client = FuzzForgeClient() -worker_info = client.get_workflow_worker_info("ossfuzz_campaign") -# Returns: {"vertical": "ossfuzz", "worker_container": "fuzzforge-worker-ossfuzz", ...} -``` - -### Manual Control - -```bash -# Start worker manually -docker start fuzzforge-worker-ossfuzz - -# Stop worker manually -docker stop fuzzforge-worker-ossfuzz - -# Check all worker statuses -docker ps -a --filter "name=fuzzforge-worker" -``` - ---- - -## Level 1: Docker Container Limits (Primary) - -Docker container limits are the **primary enforcement mechanism** for CPU and memory resources. These are configured in `docker-compose.yml` and enforced by the Docker runtime. 
- -### Configuration - -```yaml -services: - worker-rust: - deploy: - resources: - limits: - cpus: '2.0' # Maximum 2 CPU cores - memory: 2G # Maximum 2GB RAM - reservations: - cpus: '0.5' # Minimum 0.5 CPU cores reserved - memory: 512M # Minimum 512MB RAM reserved -``` - -### How It Works - -- **CPU Limit**: Docker throttles CPU usage when the container exceeds the limit -- **Memory Limit**: Docker kills the container (OOM) if it exceeds the memory limit -- **Reservations**: Guarantees minimum resources are available to the worker - -### Example Configuration by Vertical - -Different verticals have different resource needs: - -**Rust Worker** (CPU-intensive fuzzing): -```yaml -worker-rust: - deploy: - resources: - limits: - cpus: '4.0' - memory: 4G -``` - -**Android Worker** (Memory-intensive emulation): -```yaml -worker-android: - deploy: - resources: - limits: - cpus: '2.0' - memory: 8G -``` - -**Web Worker** (Lightweight analysis): -```yaml -worker-web: - deploy: - resources: - limits: - cpus: '1.0' - memory: 1G -``` - -### Monitoring Container Resources - -Check real-time resource usage: - -```bash -# Monitor all workers -docker stats - -# Monitor specific worker -docker stats fuzzforge-worker-rust - -# Output: -# CONTAINER CPU % MEM USAGE / LIMIT MEM % -# fuzzforge-worker-rust 85% 1.5GiB / 2GiB 75% -``` - ---- - -## Level 2: Worker Concurrency Limits - -The `MAX_CONCURRENT_ACTIVITIES` environment variable controls how many workflows can execute **simultaneously** on a single worker. - -### Configuration - -```yaml -services: - worker-rust: - environment: - MAX_CONCURRENT_ACTIVITIES: 5 - deploy: - resources: - limits: - memory: 2G -``` - -### How It Works - -- **Total Container Memory**: 2GB -- **Concurrent Workflows**: 5 -- **Memory per Workflow**: ~400MB (2GB ÷ 5) - -If a 6th workflow is submitted, it **waits in the Temporal queue** until one of the 5 running workflows completes.
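That queueing behavior is plain bounded concurrency. A generic asyncio sketch (not FuzzForge code; the names are illustrative) shows the sixth task waiting until a slot frees up:

```python
import asyncio

MAX_CONCURRENT_ACTIVITIES = 5  # mirrors the worker setting above

async def workflow(sem, name, log):
    async with sem:                # slot acquired: workflow starts
        log.append(f"start {name}")
        await asyncio.sleep(0.01)  # stand-in for real work
        log.append(f"done {name}") # slot released when the block exits

async def main():
    sem = asyncio.Semaphore(MAX_CONCURRENT_ACTIVITIES)
    log = []
    await asyncio.gather(*(workflow(sem, f"wf-{i}", log) for i in range(6)))
    return log

log = asyncio.run(main())
# The first five workflows start immediately; wf-5 only starts
# after one of them finishes and frees a slot.
print(all(e.startswith("start") for e in log[:5]))  # True
print(log.index("start wf-5") > 5)                  # True
```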
- -### Calculating Concurrency - -Use this formula to determine `MAX_CONCURRENT_ACTIVITIES`: - -``` -MAX_CONCURRENT_ACTIVITIES = Container Memory Limit / Estimated Workflow Memory -``` - -**Example:** -- Container limit: 4GB -- Workflow memory: ~800MB -- Concurrency: 4GB ÷ 800MB = **5 concurrent workflows** - -### Configuration Examples - -**High Concurrency (Lightweight Workflows)**: -```yaml -worker-web: - environment: - MAX_CONCURRENT_ACTIVITIES: 10 # Many small workflows - deploy: - resources: - limits: - memory: 2G # ~200MB per workflow -``` - -**Low Concurrency (Heavy Workflows)**: -```yaml -worker-rust: - environment: - MAX_CONCURRENT_ACTIVITIES: 2 # Few large workflows - deploy: - resources: - limits: - memory: 4G # ~2GB per workflow -``` - -### Monitoring Concurrency - -Check how many workflows are running: - -```bash -# View worker logs -docker-compose -f docker-compose.yml logs worker-rust | grep "Starting" - -# Check Temporal UI -# Open http://localhost:8080 -# Navigate to "Task Queues" → "rust" → See pending/running counts -``` - ---- - -## Level 3: Workflow Metadata (Advisory) - -Workflow metadata in `metadata.yaml` documents resource requirements, but these are **advisory only** (except for timeout). - -### Configuration - -```yaml -# backend/toolbox/workflows/security_assessment/metadata.yaml -requirements: - resources: - memory: "512Mi" # Estimated memory usage (advisory) - cpu: "500m" # Estimated CPU usage (advisory) - timeout: 1800 # Execution timeout in seconds (ENFORCED) -``` - -### What's Enforced vs Advisory - -| Field | Enforcement | Description | -|-------|-------------|-------------| -| `timeout` | ✅ **Enforced by Temporal** | Workflow killed if exceeds timeout | -| `memory` | ⚠ļø Advisory only | Documents expected memory usage | -| `cpu` | ⚠ļø Advisory only | Documents expected CPU usage | - -### Why Metadata Is Useful - -Even though `memory` and `cpu` are advisory, they're valuable for: - -1.
**Capacity Planning**: Determine appropriate container limits -2. **Concurrency Tuning**: Calculate `MAX_CONCURRENT_ACTIVITIES` -3. **Documentation**: Communicate resource needs to users -4. **Scheduling Hints**: Future horizontal scaling logic - -### Timeout Enforcement - -The `timeout` field is **enforced by Temporal**: - -```python -# Temporal automatically cancels workflow after timeout -@workflow.defn -class SecurityAssessmentWorkflow: - @workflow.run - async def run(self, target_id: str): - # If this takes longer than metadata.timeout (1800s), - # Temporal will cancel the workflow - ... -``` - -**Check timeout in Temporal UI:** -1. Open http://localhost:8080 -2. Navigate to workflow execution -3. See "Timeout" in workflow details -4. If exceeded, status shows "TIMED_OUT" - ---- - -## Resource Management Best Practices - -### 1. Set Conservative Container Limits - -Start with lower limits and increase based on actual usage: - -```yaml -# Start conservative -worker-rust: - deploy: - resources: - limits: - cpus: '2.0' - memory: 2G - -# Monitor with: docker stats -# Increase if consistently hitting limits -``` - -### 2. Calculate Concurrency from Profiling - -Profile a single workflow first: - -```bash -# Run single workflow and monitor -docker stats fuzzforge-worker-rust - -# Note peak memory usage (e.g., 800MB) -# Calculate concurrency: 4GB ÷ 800MB = 5 -``` - -### 3. Set Realistic Timeouts - -Base timeouts on actual workflow duration: - -```yaml -# Static analysis: 5-10 minutes -timeout: 600 - -# Fuzzing: 1-24 hours -timeout: 86400 - -# Quick scans: 1-2 minutes -timeout: 120 -``` - -### 4.
Monitor Resource Exhaustion - -Watch for these warning signs: - -```bash -# Check for OOM kills -docker-compose -f docker-compose.yml logs worker-rust | grep -i "oom\|killed" - -# Check for CPU throttling -docker stats fuzzforge-worker-rust -# If CPU% consistently at limit → increase cpus - -# Check for memory pressure -docker stats fuzzforge-worker-rust -# If MEM% consistently >90% → increase memory -``` - -### 5. Use Vertical-Specific Configuration - -Different verticals have different needs: - -| Vertical | CPU Priority | Memory Priority | Typical Config | -|----------|--------------|-----------------|----------------| -| Rust Fuzzing | High | Medium | 4 CPUs, 4GB RAM | -| Android Analysis | Medium | High | 2 CPUs, 8GB RAM | -| Web Scanning | Low | Low | 1 CPU, 1GB RAM | -| Static Analysis | Medium | Medium | 2 CPUs, 2GB RAM | - ---- - -## Horizontal Scaling - -To handle more workflows, scale worker containers horizontally: - -```bash -# Scale rust worker to 3 instances -docker-compose -f docker-compose.yml up -d --scale worker-rust=3 - -# Now you can run: -# - 3 workers Ɨ 5 concurrent activities = 15 workflows simultaneously -``` - -**How it works:** -- Temporal load balances across all workers on the same task queue -- Each worker has independent resource limits -- No shared state between workers - ---- - -## Troubleshooting Resource Issues - -### Issue: Workflows Stuck in "Running" State - -**Symptom:** Workflow shows RUNNING but makes no progress - -**Diagnosis:** -```bash -# Check worker is alive -docker-compose -f docker-compose.yml ps worker-rust - -# Check worker resource usage -docker stats fuzzforge-worker-rust - -# Check for OOM kills -docker-compose -f docker-compose.yml logs worker-rust | grep -i oom -``` - -**Solution:** -- Increase memory limit if worker was killed -- Reduce `MAX_CONCURRENT_ACTIVITIES` if overloaded -- Check worker logs for errors - -### Issue: "Too Many Pending Tasks" - -**Symptom:** Temporal shows many queued workflows -
-**Diagnosis:** -```bash -# Check concurrent activities setting -docker exec fuzzforge-worker-rust env | grep MAX_CONCURRENT_ACTIVITIES - -# Check current workload -docker-compose -f docker-compose.yml logs worker-rust | grep "Starting" -``` - -**Solution:** -- Increase `MAX_CONCURRENT_ACTIVITIES` if resources allow -- Add more worker instances (horizontal scaling) -- Increase container resource limits - -### Issue: Workflow Timeout - -**Symptom:** Workflow shows "TIMED_OUT" in Temporal UI - -**Diagnosis:** -1. Check `metadata.yaml` timeout setting -2. Check Temporal UI for execution duration -3. Determine if timeout is appropriate - -**Solution:** -```yaml -# Increase timeout in metadata.yaml -requirements: - resources: - timeout: 3600 # Increased from 1800 -``` - ---- - -## Workspace Isolation and Cache Management - -FuzzForge uses workspace isolation to prevent concurrent workflows from interfering with each other. Each workflow run can have its own isolated workspace or share a common workspace based on the isolation mode. - -### Cache Directory Structure - -Workers cache downloaded targets locally to avoid repeated downloads: - -``` -/cache/ -ā”œā”€ā”€ {target_id_1}/ -│ ā”œā”€ā”€ {run_id_1}/ # Isolated mode -│ │ ā”œā”€ā”€ target # Downloaded tarball -│ │ └── workspace/ # Extracted files -│ ā”œā”€ā”€ {run_id_2}/ -│ │ ā”œā”€ā”€ target -│ │ └── workspace/ -│ └── workspace/ # Shared mode (no run_id) -│ └── ...
-ā”œā”€ā”€ {target_id_2}/ -│ └── shared/ # Copy-on-write shared download -│ ā”œā”€ā”€ target -│ └── workspace/ -``` - -### Isolation Modes - -**Isolated Mode** (default for fuzzing): -- Each run gets `/cache/{target_id}/{run_id}/workspace/` -- Safe for concurrent execution -- Cleanup removes entire run directory - -**Shared Mode** (for read-only workflows): -- All runs share `/cache/{target_id}/workspace/` -- Efficient (downloads once) -- No cleanup (cache persists) - -**Copy-on-Write Mode**: -- Downloads to `/cache/{target_id}/shared/` -- Copies to `/cache/{target_id}/{run_id}/` per run -- Balances performance and isolation - -### Cache Limits - -Configure cache limits via environment variables: - -```yaml -worker-rust: - environment: - CACHE_DIR: /cache - CACHE_MAX_SIZE: 10GB # Maximum cache size before LRU eviction - CACHE_TTL: 7d # Time-to-live for cached files -``` - -### LRU Eviction - -When cache exceeds `CACHE_MAX_SIZE`, the least-recently-used files are automatically evicted: - -1. Worker tracks last access time for each cached target -2. When cache is full, oldest accessed files are removed first -3. Eviction runs periodically (every 30 minutes) - -### Monitoring Cache Usage - -Check cache size and cleanup logs: - -```bash -# Check cache size -docker exec fuzzforge-worker-rust du -sh /cache - -# Monitor cache evictions -docker-compose -f docker-compose.yml logs worker-rust | grep "Evicted from cache" - -# Check download vs cache hit rate -docker-compose -f docker-compose.yml logs worker-rust | grep -E "Cache (HIT|MISS)" -``` - -See the [Workspace Isolation](/docs/concept/workspace-isolation) guide for complete details on isolation modes and when to use each. - ---- - -## Summary - -FuzzForge's resource management strategy: - -1. **Docker Container Limits**: Primary enforcement (CPU/memory hard limits) -2. **Concurrency Limits**: Controls parallel workflows per worker -3. **Workflow Metadata**: Advisory resource hints + enforced timeout -4.
**Workspace Isolation**: Controls cache sharing and cleanup behavior - -**Key Takeaways:** -- Set conservative Docker limits and adjust based on monitoring -- Calculate `MAX_CONCURRENT_ACTIVITIES` from container memory ÷ workflow memory -- Use `docker stats` and Temporal UI to monitor resource usage -- Scale horizontally by adding more worker instances -- Set realistic timeouts based on actual workflow duration -- Choose appropriate isolation mode (isolated for fuzzing, shared for analysis) -- Monitor cache usage and adjust `CACHE_MAX_SIZE` as needed - ---- - -**Next Steps:** -- Review `docker-compose.yml` resource configuration -- Profile your workflows to determine actual resource usage -- Adjust limits based on monitoring data -- Set up alerts for resource exhaustion diff --git a/docs/docs/concept/sarif-format.md b/docs/docs/concept/sarif-format.md deleted file mode 100644 index 05ef8ca..0000000 --- a/docs/docs/concept/sarif-format.md +++ /dev/null @@ -1,618 +0,0 @@ -# SARIF Format - -FuzzForge uses the Static Analysis Results Interchange Format (SARIF) as the standardized output format for all security analysis results. SARIF provides a consistent, machine-readable format that enables tool interoperability and comprehensive result analysis. - -## What is SARIF? - -### Overview - -SARIF (Static Analysis Results Interchange Format) is an OASIS-approved standard (SARIF 2.1.0) designed to standardize the output of static analysis tools. FuzzForge extends this standard to cover dynamic analysis, secret detection, infrastructure analysis, and fuzzing results.
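Because SARIF is plain JSON, consuming it takes only a few lines. The sketch below uses a trimmed, illustrative document (only the fields the function reads) to tally findings by severity level:

```python
import json
from collections import Counter

# Trimmed, illustrative SARIF 2.1.0 document; rule IDs are made up.
sarif_text = json.dumps({
    "version": "2.1.0",
    "runs": [{
        "tool": {"driver": {"name": "FuzzForge"}},
        "results": [
            {"ruleId": "sqli-check", "level": "error"},
            {"ruleId": "xss-check", "level": "warning"},
            {"ruleId": "hardcoded-key", "level": "error"},
        ],
    }],
})

def count_by_level(text):
    """Tally SARIF results across all runs by their severity level."""
    counts = Counter()
    for run in json.loads(text).get("runs", []):
        for result in run.get("results", []):
            counts[result.get("level", "none")] += 1
    return dict(counts)

print(count_by_level(sarif_text))  # {'error': 2, 'warning': 1}
```

The same pattern generalizes to filtering by `ruleId`, file, or any of the `properties` fields shown in the examples below.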
- -### Key Benefits - -- **Standardization**: Consistent format across all security tools and workflows -- **Interoperability**: Integration with existing security tools and platforms -- **Rich Metadata**: Comprehensive information about findings, tools, and analysis runs -- **Tool Agnostic**: Works with any security tool that produces structured results -- **IDE Integration**: Native support in modern development environments - -### SARIF Structure - -```json -{ - "version": "2.1.0", - "schema": "https://json.schemastore.org/sarif-2.1.0.json", - "runs": [ - { - "tool": { /* Tool information */ }, - "invocations": [ /* How the tool was run */ ], - "artifacts": [ /* Files analyzed */ ], - "results": [ /* Security findings */ ] - } - ] -} -``` - -## FuzzForge SARIF Implementation - -### Run Structure - -Each FuzzForge workflow produces a SARIF "run" containing: - -```json -{ - "tool": { - "driver": { - "name": "FuzzForge", - "version": "1.0.0", - "informationUri": "https://github.com/FuzzingLabs/fuzzforge_ai", - "organization": "FuzzingLabs", - "rules": [ /* Security rules applied */ ] - }, - "extensions": [ - { - "name": "semgrep", - "version": "1.45.0", - "rules": [ /* Semgrep-specific rules */ ] - } - ] - }, - "invocations": [ - { - "executionSuccessful": true, - "startTimeUtc": "2025-09-25T12:00:00.000Z", - "endTimeUtc": "2025-09-25T12:05:30.000Z", - "workingDirectory": { - "uri": "file:///app/target/" - }, - "commandLine": "python -m toolbox.workflows.static_analysis", - "environmentVariables": { - "WORKFLOW_TYPE": "static_analysis_scan" - } - } - ] -} -``` - -### Result Structure - -Each security finding is represented as a SARIF result: - -```json -{ - "ruleId": "semgrep.security.audit.sqli.pg-sqli", - "ruleIndex": 42, - "level": "error", - "message": { - "text": "Potential SQL injection vulnerability detected" - }, - "locations": [ - { - "physicalLocation": { - "artifactLocation": { - "uri": "src/database/queries.py", - "uriBaseId": "SRCROOT" - }, - "region": 
{ - "startLine": 156, - "startColumn": 20, - "endLine": 156, - "endColumn": 45, - "snippet": { - "text": "cursor.execute(query)" - } - } - } - } - ], - "properties": { - "tool": "semgrep", - "confidence": "high", - "severity": "high", - "cwe": ["CWE-89"], - "owasp": ["A03:2021"], - "references": [ - "https://owasp.org/Top10/A03_2021-Injection/" - ] - } -} -``` - -## Finding Categories and Severity - -### Severity Levels - -FuzzForge maps tool-specific severity levels to SARIF standard levels: - -#### SARIF Level Mapping -- **error**: Critical and High severity findings -- **warning**: Medium severity findings -- **note**: Low severity findings -- **info**: Informational findings - -#### Extended Severity Properties -```json -{ - "properties": { - "severity": "high", // FuzzForge severity - "confidence": "medium", // Tool confidence - "exploitability": "high", // Likelihood of exploitation - "impact": "data_breach" // Potential impact - } -} -``` - -### Vulnerability Classification - -#### CWE (Common Weakness Enumeration) -```json -{ - "properties": { - "cwe": ["CWE-89", "CWE-79"], - "cwe_category": "Injection" - } -} -``` - -#### OWASP Top 10 Mapping -```json -{ - "properties": { - "owasp": ["A03:2021", "A06:2021"], - "owasp_category": "Injection" - } -} -``` - -#### Tool-Specific Classifications -```json -{ - "properties": { - "tool_category": "security", - "rule_type": "semantic_grep", - "finding_type": "sql_injection" - } -} -``` - -## Multi-Tool Result Aggregation - -### Tool Extension Model - -FuzzForge aggregates results from multiple tools using SARIF's extension model: - -```json -{ - "tool": { - "driver": { - "name": "FuzzForge", - "version": "1.0.0" - }, - "extensions": [ - { - "name": "semgrep", - "version": "1.45.0", - "guid": "semgrep-extension-guid" - }, - { - "name": "bandit", - "version": "1.7.5", - "guid": "bandit-extension-guid" - } - ] - } -} -``` - -### Result Correlation - -#### Cross-Tool Finding Correlation -```json -{ - "ruleId": 
"fuzzforge.correlation.sql-injection", - "level": "error", - "message": { - "text": "SQL injection vulnerability confirmed by multiple tools" - }, - "locations": [ /* Primary location */ ], - "relatedLocations": [ /* Additional contexts */ ], - "properties": { - "correlation_id": "corr-001", - "confirming_tools": ["semgrep", "bandit"], - "confidence_score": 0.95, - "aggregated_severity": "critical" - } -} -``` - -#### Finding Relationships -```json -{ - "ruleId": "semgrep.security.audit.xss.direct-use-of-jinja2", - "properties": { - "related_findings": [ - { - "correlation_type": "same_vulnerability_class", - "related_rule": "bandit.B703", - "relationship": "confirms" - }, - { - "correlation_type": "attack_chain", - "related_rule": "nuclei.xss.reflected", - "relationship": "exploits" - } - ] - } -} -``` - -## Workflow-Specific Extensions - -### Static Analysis Results -```json -{ - "properties": { - "analysis_type": "static", - "language": "python", - "complexity_score": 3.2, - "coverage": { - "lines_analyzed": 15420, - "functions_analyzed": 892, - "classes_analyzed": 156 - } - } -} -``` - -### Dynamic Analysis Results -```json -{ - "properties": { - "analysis_type": "dynamic", - "test_method": "web_application_scan", - "target_url": "https://example.com", - "http_method": "POST", - "request_payload": "user_input=", - "response_code": 200, - "exploitation_proof": "alert_box_displayed" - } -} -``` - -### Secret Detection Results -```json -{ - "properties": { - "analysis_type": "secret_detection", - "secret_type": "api_key", - "entropy_score": 4.2, - "commit_hash": "abc123def456", - "commit_date": "2025-09-20T10:30:00Z", - "author": "developer@example.com", - "exposure_duration": "30_days" - } -} -``` - -### Infrastructure Analysis Results -```json -{ - "properties": { - "analysis_type": "infrastructure", - "resource_type": "docker_container", - "policy_violation": "privileged_container", - "compliance_framework": ["CIS", "NIST"], - "remediation_effort": "low", - 
"deployment_risk": "high" - } -} -``` - -### Fuzzing Results -```json -{ - "properties": { - "analysis_type": "fuzzing", - "fuzzer": "afl++", - "crash_type": "segmentation_fault", - "crash_address": "0x7fff8b2a1000", - "exploitability": "likely_exploitable", - "test_case": "base64:SGVsbG8gV29ybGQ=", - "coverage_achieved": "85%" - } -} -``` - -## SARIF Processing and Analysis - -### Result Filtering - -#### Severity-Based Filtering -```python -def filter_by_severity(sarif_results, min_severity="medium"): - """Filter SARIF results by minimum severity level""" - severity_order = {"info": 0, "note": 1, "warning": 2, "error": 3} - min_level = severity_order.get(min_severity, 1) - - filtered_results = [] - for result in sarif_results["runs"][0]["results"]: - result_level = severity_order.get(result.get("level", "note"), 1) - if result_level >= min_level: - filtered_results.append(result) - - return filtered_results -``` - -#### Rule-Based Filtering -```python -def filter_by_rules(sarif_results, rule_patterns): - """Filter results by rule ID patterns""" - import re - - filtered_results = [] - for result in sarif_results["runs"][0]["results"]: - rule_id = result.get("ruleId", "") - for pattern in rule_patterns: - if re.match(pattern, rule_id): - filtered_results.append(result) - break - - return filtered_results -``` - -### Statistical Analysis - -#### Severity Distribution -```python -def analyze_severity_distribution(sarif_results): - """Analyze distribution of findings by severity""" - distribution = {"error": 0, "warning": 0, "note": 0, "info": 0} - - for result in sarif_results["runs"][0]["results"]: - level = result.get("level", "note") - distribution[level] += 1 - - return distribution -``` - -#### Tool Coverage Analysis -```python -def analyze_tool_coverage(sarif_results): - """Analyze which tools contributed findings""" - tool_stats = {} - - for result in sarif_results["runs"][0]["results"]: - tool = result.get("properties", {}).get("tool", "unknown") - if tool 
not in tool_stats: - tool_stats[tool] = {"count": 0, "severities": {"error": 0, "warning": 0, "note": 0, "info": 0}} - - tool_stats[tool]["count"] += 1 - level = result.get("level", "note") - tool_stats[tool]["severities"][level] += 1 - - return tool_stats -``` - -## SARIF Export and Integration - -### Export Formats - -#### JSON Export -```python -def export_sarif_json(sarif_results, output_path): - """Export SARIF results as JSON""" - import json - - with open(output_path, 'w') as f: - json.dump(sarif_results, f, indent=2, ensure_ascii=False) -``` - -#### CSV Export for Spreadsheets -```python -def export_sarif_csv(sarif_results, output_path): - """Export SARIF results as CSV for spreadsheet analysis""" - import csv - - with open(output_path, 'w', newline='') as f: - writer = csv.writer(f) - writer.writerow(['Rule ID', 'Severity', 'Message', 'File', 'Line', 'Tool']) - - for result in sarif_results["runs"][0]["results"]: - rule_id = result.get("ruleId", "unknown") - level = result.get("level", "note") - message = result.get("message", {}).get("text", "") - tool = result.get("properties", {}).get("tool", "unknown") - - for location in result.get("locations", []): - physical_location = location.get("physicalLocation", {}) - file_path = physical_location.get("artifactLocation", {}).get("uri", "") - line = physical_location.get("region", {}).get("startLine", "") - - writer.writerow([rule_id, level, message, file_path, line, tool]) -``` - -### IDE Integration - -#### Visual Studio Code -SARIF files can be opened directly in VS Code with the SARIF extension: - -```json -{ - "recommendations": ["ms-sarif.sarif-viewer"], - "sarif.viewer.connectToGitHub": true, - "sarif.viewer.showResultsInExplorer": true -} -``` - -#### GitHub Integration -GitHub automatically processes SARIF files uploaded through Actions: - -```yaml -- name: Upload SARIF results - uses: github/codeql-action/upload-sarif@v2 - with: - sarif_file: fuzzforge-results.sarif - category: security-analysis -``` 
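When several FuzzForge workflows each produce their own SARIF file, it can be convenient to combine them into a single log before uploading or archiving. A minimal sketch of such a merge, concatenating the `runs` arrays so each tool keeps its own metadata (the function name and file layout are illustrative, not part of the FuzzForge API):

```python
import json

def merge_sarif_logs(input_paths, output_path):
    """Merge several SARIF logs into one by concatenating their runs.

    Each input file keeps its own run (tool metadata, invocations,
    results), so no per-finding rewriting is needed.
    """
    merged = {
        "version": "2.1.0",
        "$schema": "https://json.schemastore.org/sarif-2.1.0.json",
        "runs": [],
    }
    for path in input_paths:
        with open(path) as f:
            log = json.load(f)
        # A log without a "runs" array contributes nothing to the merge
        merged["runs"].extend(log.get("runs", []))
    with open(output_path, "w") as f:
        json.dump(merged, f, indent=2)
    return merged
```

Because runs are kept separate rather than flattened, result-to-rule references (`ruleIndex`, extension GUIDs) inside each run remain valid after the merge.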
-
-### API Integration
-
-#### SARIF Result Access
-```python
-# Example: Accessing SARIF results via FuzzForge API
-async with FuzzForgeClient() as client:
-    result = await client.get_workflow_result(run_id)
-
-    # Access SARIF data
-    sarif_data = result["sarif"]
-    findings = sarif_data["runs"][0]["results"]
-
-    # Filter critical findings
-    critical_findings = [
-        f for f in findings
-        if f.get("level") == "error" and
-        f.get("properties", {}).get("severity") == "critical"
-    ]
-```
-
-## SARIF Validation and Quality
-
-### Schema Validation
-```python
-import jsonschema
-import requests
-
-def validate_sarif(sarif_data):
-    """Validate SARIF data against the official schema"""
-    schema_url = "https://json.schemastore.org/sarif-2.1.0.json"
-    schema = requests.get(schema_url, timeout=30).json()
-
-    try:
-        jsonschema.validate(sarif_data, schema)
-        return True, "Valid SARIF 2.1.0 format"
-    except jsonschema.ValidationError as e:
-        return False, f"SARIF validation error: {e.message}"
-```
-
-### Quality Metrics
-```python
-def calculate_sarif_quality_metrics(sarif_data):
-    """Calculate quality metrics for SARIF results"""
-    results = sarif_data["runs"][0]["results"]
-
-    metrics = {
-        "total_findings": len(results),
-        "findings_with_location": len([r for r in results if r.get("locations")]),
-        "findings_with_message": len([r for r in results if r.get("message", {}).get("text")]),
-        "findings_with_remediation": len([r for r in results if r.get("fixes")]),
-        "unique_rules": len(set(r.get("ruleId") for r in results)),
-        # calculate_coverage() is a FuzzForge helper assumed to be defined elsewhere
-        "coverage_percentage": calculate_coverage(sarif_data)
-    }
-
-    metrics["quality_score"] = (
-        metrics["findings_with_location"] / max(metrics["total_findings"], 1) * 0.3 +
-        metrics["findings_with_message"] / max(metrics["total_findings"], 1) * 0.3 +
-        metrics["findings_with_remediation"] / max(metrics["total_findings"], 1) * 0.2 +
-        min(metrics["coverage_percentage"] / 100, 1.0) * 0.2
-    )
-
-    return metrics
-```
-
-## Advanced SARIF Features
-
-### Fixes and Remediation
-```json -{ - "ruleId": "semgrep.security.audit.sqli.pg-sqli", - "fixes": [ - { - "description": { - "text": "Use parameterized queries to prevent SQL injection" - }, - "artifactChanges": [ - { - "artifactLocation": { - "uri": "src/database/queries.py" - }, - "replacements": [ - { - "deletedRegion": { - "startLine": 156, - "startColumn": 20, - "endLine": 156, - "endColumn": 45 - }, - "insertedContent": { - "text": "cursor.execute(query, params)" - } - } - ] - } - ] - } - ] -} -``` - -### Code Flows for Complex Vulnerabilities -```json -{ - "ruleId": "dataflow.taint.sql-injection", - "codeFlows": [ - { - "message": { - "text": "Tainted data flows from user input to SQL query" - }, - "threadFlows": [ - { - "locations": [ - { - "location": { - "physicalLocation": { - "artifactLocation": {"uri": "src/api/handlers.py"}, - "region": {"startLine": 45} - } - }, - "state": {"source": "user_input"}, - "nestingLevel": 0 - }, - { - "location": { - "physicalLocation": { - "artifactLocation": {"uri": "src/database/queries.py"}, - "region": {"startLine": 156} - } - }, - "state": {"sink": "sql_query"}, - "nestingLevel": 0 - } - ] - } - ] - } - ] -} -``` - ---- - -## SARIF Best Practices - -### Result Quality -- **Precise Locations**: Always include accurate file paths and line numbers -- **Clear Messages**: Write descriptive, actionable finding messages -- **Remediation Guidance**: Include fix suggestions when possible -- **Severity Consistency**: Use consistent severity mappings across tools - -### Performance -- **Efficient Processing**: Process SARIF results efficiently for large result sets -- **Streaming**: Use streaming for very large SARIF files -- **Caching**: Cache processed results for faster repeated access -- **Compression**: Compress SARIF files for storage and transmission - -### Integration -- **Tool Interoperability**: Ensure SARIF compatibility with existing tools -- **Standard Compliance**: Follow SARIF 2.1.0 specification precisely -- **Extension 
Documentation**: Document any custom extensions clearly -- **Version Management**: Handle SARIF schema version differences diff --git a/docs/docs/concept/security-analysis.md b/docs/docs/concept/security-analysis.md deleted file mode 100644 index a10033b..0000000 --- a/docs/docs/concept/security-analysis.md +++ /dev/null @@ -1,174 +0,0 @@ -# Security Analysis in FuzzForge: Concepts and Approach - -Security analysis is at the core of FuzzForge’s mission. This page explains the philosophy, methodologies, and integration patterns that shape how FuzzForge discovers vulnerabilities and helps teams secure their software. If you’re curious about what ā€œsecurity analysisā€ really means in this platform—and why it’s designed this way—read on. - ---- - -## Why Does FuzzForge Approach Security Analysis This Way? - -FuzzForge’s security analysis is built on a few guiding principles: - -- **Defense in Depth:** No single tool or method catches everything. FuzzForge layers multiple analysis types—static, dynamic, secret detection, infrastructure checks, and fuzzing—to maximize coverage. -- **Tool Diversity:** Different tools find different issues. Running several tools for each analysis type reduces blind spots and increases confidence in results. -- **Standardized Results:** All findings are normalized into SARIF, a widely adopted format. This makes results easy to aggregate, review, and integrate with other tools. -- **Automation and Integration:** Security analysis is only useful if it fits into real-world workflows. FuzzForge is designed for CI/CD, developer feedback, and automated reporting. - ---- - -## What Types of Security Analysis Does FuzzForge Perform? - -### Static Analysis - -- **What it is:** Examines source code without running it, looking for vulnerabilities, anti-patterns, and risky constructs. -- **How it works:** Parses code, analyzes control and data flow, and matches patterns against known vulnerabilities. 
-- **Tools:** Semgrep, Bandit, CodeQL, ESLint, and more. -- **Strengths:** Fast, broad coverage, no runtime needed. -- **Limitations:** Can’t see runtime issues, may produce false positives. - -### Dynamic Analysis - -- **What it is:** Tests running applications to find vulnerabilities that only appear at runtime. -- **How it works:** Deploys the app in a test environment, probes entry points, and observes behavior under attack. -- **Tools:** Nuclei, OWASP ZAP, Nmap, SQLMap. -- **Strengths:** Finds real, exploitable issues; validates actual behavior. -- **Limitations:** Needs a working environment; slower; may not cover all code. - -### Secret Detection - -- **What it is:** Scans code and configuration for exposed credentials, API keys, and sensitive data. -- **How it works:** Uses pattern matching, entropy analysis, and context checks—sometimes even scanning git history. -- **Tools:** TruffleHog, Gitleaks, detect-secrets, GitGuardian. -- **Strengths:** Fast, critical for preventing leaks. -- **Limitations:** Can’t find encrypted/encoded secrets; needs regular pattern updates. - -### Infrastructure Analysis - -- **What it is:** Analyzes infrastructure-as-code, container configs, and deployment manifests for security misconfigurations. -- **How it works:** Parses config files, applies security policies, checks compliance, and assesses risk. -- **Tools:** Checkov, Hadolint, Kubesec, Terrascan. -- **Strengths:** Prevents misconfigurations before deployment; automates compliance. -- **Limitations:** Can’t see runtime changes; depends on up-to-date policies. - -### Fuzzing - -- **What it is:** Automatically generates and sends unexpected or random inputs to code, looking for crashes or unexpected behavior. -- **How it works:** Identifies targets, generates inputs, monitors execution, and analyzes crashes. -- **Tools:** AFL++, libFuzzer, Cargo Fuzz, Jazzer. -- **Strengths:** Finds deep, complex bugs; great for memory safety. 
-- **Limitations:** Resource-intensive; may need manual setup. - -### Comprehensive Assessment - -- **What it is:** Combines all the above for a holistic view, correlating findings and prioritizing risks. -- **How it works:** Runs multiple analyses, aggregates and correlates results, and generates unified reports. -- **Benefits:** Complete coverage, better context, prioritized remediation, and compliance support. - ---- - -## How Does FuzzForge Integrate and Orchestrate Analysis? - -### Workflow Composition - -FuzzForge composes analysis workflows by combining different analysis types, each running in its own containerized environment. Inputs (code, configs, parameters) are fed into the appropriate tools, and results are normalized and aggregated. - -```mermaid -graph TB - subgraph "Input" - Target[Target Codebase] - Config[Analysis Configuration] - end - - subgraph "Analysis Workflows" - Static[Static Analysis] - Dynamic[Dynamic Analysis] - Secrets[Secret Detection] - Infra[Infrastructure Analysis] - Fuzz[Fuzzing Analysis] - end - - subgraph "Processing" - Normalize[Result Normalization] - Merge[Finding Aggregation] - Correlate[Cross-Tool Correlation] - end - - subgraph "Output" - SARIF[SARIF Results] - Report[Security Report] - Metrics[Analysis Metrics] - end - - Target --> Static - Target --> Dynamic - Target --> Secrets - Target --> Infra - Target --> Fuzz - Config --> Static - Config --> Dynamic - Config --> Secrets - Config --> Infra - Config --> Fuzz - - Static --> Normalize - Dynamic --> Normalize - Secrets --> Normalize - Infra --> Normalize - Fuzz --> Normalize - - Normalize --> Merge - Merge --> Correlate - Correlate --> SARIF - Correlate --> Report - Correlate --> Metrics -``` - -### Orchestration Patterns - -- **Parallel Execution:** Tools of the same type (e.g., multiple static analyzers) run in parallel for speed and redundancy. 
-- **Sequential Execution:** Some analyses depend on previous results (e.g., dynamic analysis using endpoints found by static analysis). -- **Result Normalization:** All findings are converted to SARIF for consistency. -- **Correlation:** Related findings from different tools are grouped and prioritized. - ---- - -## How Is Quality Ensured? - -### Metrics and Measurement - -- **Coverage:** How much code, how many rules, and how many vulnerability types are analyzed. -- **Accuracy:** False positive/negative rates, confidence scores, and validation rates. -- **Performance:** Analysis duration, resource usage, and scalability. - -### Quality Assurance - -- **Cross-Tool Validation:** Findings are confirmed by multiple tools when possible. -- **Manual Review:** High-severity findings can be flagged for expert review. -- **Continuous Improvement:** Tools and rules are updated regularly, and user feedback is incorporated. - ---- - -## How Does Security Analysis Fit Into Development Workflows? - -### CI/CD Integration - -- **Pre-commit Hooks:** Run security checks before code is committed. -- **Pipeline Integration:** Block deployments if high/critical issues are found. -- **Quality Gates:** Enforce severity thresholds and track trends over time. - -### Developer Experience - -- **IDE Integration:** Import SARIF findings into supported IDEs for inline feedback. -- **Real-Time Analysis:** Optionally run background checks during development. -- **Reporting:** Executive dashboards, technical reports, and compliance summaries. - ---- - -## What’s Next for Security Analysis in FuzzForge? - -FuzzForge is designed to evolve. Advanced techniques like machine learning for pattern recognition, contextual analysis, and business logic checks are on the roadmap. The goal: keep raising the bar for automated, actionable, and developer-friendly security analysis. - ---- - -## In Summary - -FuzzForge’s security analysis is comprehensive, layered, and designed for real-world integration. 
By combining multiple analysis types, normalizing results, and focusing on automation and developer experience, FuzzForge helps teams find and fix vulnerabilities—before attackers do. diff --git a/docs/docs/concept/workflow.md b/docs/docs/concept/workflow.md deleted file mode 100644 index 854c31c..0000000 --- a/docs/docs/concept/workflow.md +++ /dev/null @@ -1,129 +0,0 @@ -# Understanding Workflows in FuzzForge - -Workflows are the backbone of FuzzForge’s security analysis platform. If you want to get the most out of FuzzForge, it’s essential to understand what workflows are, how they’re designed, and how they operate from start to finish. This page explains the core concepts, design principles, and execution models behind FuzzForge workflows—so you can use them confidently and effectively. - ---- - -## What Is a Workflow? - -A **workflow** in FuzzForge is a containerized process that orchestrates one or more security tools to analyze a target codebase or application. Each workflow is tailored for a specific type of security analysis (like static analysis, secret detection, or fuzzing) and is designed to be: - -- **Isolated:** Runs in its own Docker container for security and reproducibility. -- **Integrated:** Can combine multiple tools for comprehensive results. -- **Standardized:** Always produces SARIF-compliant output. -- **Configurable:** Accepts parameters to customize analysis. -- **Scalable:** Can run in parallel and scale horizontally. - ---- - -## How Does a Workflow Operate? - -### High-Level Architecture - -Here’s how a workflow moves through the FuzzForge system: - -```mermaid -graph TB - User[User/CLI/API] --> API[FuzzForge API] - API --> MinIO[MinIO Storage] - API --> Temporal[Temporal Orchestrator] - Temporal --> Worker[Vertical Worker] - Worker --> MinIO - Worker --> Tools[Security Tools] - Tools --> Results[SARIF Results] - Results --> MinIO -``` - -**Key roles:** -- **User/CLI/API:** Submits workflows and uploads files. 
-- **FuzzForge API:** Validates, uploads targets, and tracks workflows. -- **Temporal Orchestrator:** Schedules and manages workflow execution. -- **Vertical Worker:** Long-lived worker with pre-installed security tools. -- **MinIO Storage:** Stores uploaded targets and results. -- **Security Tools:** Perform the actual analysis. - ---- - -## Workflow Lifecycle: From Idea to Results - -1. **Design:** Choose tools, define integration logic, set up parameters, and specify the vertical worker. -2. **Deployment:** Create workflow code, add metadata with `vertical` field, mount as volume in worker. -3. **Execution:** User submits a workflow with file upload; file is stored in MinIO; workflow is routed to vertical worker; worker downloads target and executes; tools run as designed. -4. **Completion:** Results are collected, normalized, and stored in MinIO; status is updated; MinIO lifecycle policies clean up old files; results are made available via API/CLI. - ---- - -## Types of Workflows - -FuzzForge supports several workflow types, each optimized for a specific security need: - -- **Static Analysis:** Examines source code without running it (e.g., Semgrep, Bandit). -- **Dynamic Analysis:** Tests running applications for runtime vulnerabilities (e.g., OWASP ZAP, Nuclei). -- **Secret Detection:** Finds exposed credentials and sensitive data (e.g., TruffleHog, Gitleaks). -- **Infrastructure Analysis:** Checks infrastructure-as-code and configs for misconfigurations (e.g., Checkov, Hadolint). -- **Fuzzing:** Generates unexpected inputs to find crashes and edge cases (e.g., AFL++, libFuzzer). -- **Comprehensive Assessment:** Combines multiple analysis types for full coverage. - ---- - -## Workflow Design Principles - -- **Tool Agnostic:** Workflows abstract away the specifics of underlying tools, providing a consistent interface. -- **Fail-Safe Execution:** If one tool fails, others continue—partial results are still valuable. 
-- **Configurable:** Users can adjust parameters to control tool behavior, output, and execution. -- **Resource-Aware:** Workflows specify and respect resource limits (CPU, memory). -- **Standardized Output:** All results are normalized to SARIF for easy integration and reporting. - ---- - -## Execution Models - -- **Synchronous:** Wait for the workflow to finish and get results immediately—great for interactive use. -- **Asynchronous:** Submit a workflow and check back later for results—ideal for long-running or batch jobs. -- **Parallel:** Run multiple workflows at once for comprehensive or time-sensitive analysis. - ---- - -## Data Flow and Storage - -- **Input:** Target files uploaded via HTTP to MinIO; parameters validated and passed to Temporal. -- **Processing:** Worker downloads target from MinIO to local cache; tools are initialized and run (often in parallel); outputs are collected and normalized. -- **Output:** Results are stored in MinIO and indexed for fast retrieval; metadata is saved in PostgreSQL; targets cached locally for repeated workflows; lifecycle policies clean up after 7 days. - ---- - -## Error Handling and Recovery - -- **Tool-Level:** Timeouts, resource exhaustion, and crashes are handled gracefully; failed tools don't stop the workflow. -- **Workflow-Level:** Worker failures, storage issues, and network problems are detected and reported by Temporal. -- **Recovery:** Automatic retries for transient errors via Temporal; partial results are returned when possible; workflows degrade gracefully if some tools are unavailable; MinIO ensures targets remain accessible. - ---- - -## Performance and Optimization - -- **Worker Efficiency:** Long-lived workers eliminate container startup overhead; pre-installed toolchains reduce setup time. -- **Parallel Processing:** Independent tools run concurrently to maximize CPU usage and minimize wait times. 
-**Caching:** MinIO targets are cached locally; repeated workflows reuse cached targets; worker cache uses LRU eviction.
-
----
-
-## Monitoring and Observability
-
-- **Metrics:** Track execution time, resource usage, and success/failure rates.
-- **Logging:** Structured logs and tool outputs are captured for debugging and analysis.
-- **Real-Time Monitoring:** Live status updates and progress indicators are available via API/WebSocket.
-
----
-
-## Integration Patterns
-
-- **CI/CD:** Integrate workflows into pipelines to block deployments on critical findings.
-- **API:** Programmatically submit and track workflows from your own tools or scripts.
-- **Event-Driven:** Use webhooks or event listeners to trigger actions on workflow completion.
-
----
-
-## In Summary
-
-Workflows in FuzzForge are designed to be robust, flexible, and easy to integrate into your security and development processes. By combining containerization, orchestration, and a standardized interface, FuzzForge workflows help you automate and scale security analysis—so you can focus on fixing issues, not just finding them.
diff --git a/docs/docs/concept/working-with-documentation.md b/docs/docs/concept/working-with-documentation.md
deleted file mode 100644
index 7e72ed8..0000000
--- a/docs/docs/concept/working-with-documentation.md
+++ /dev/null
@@ -1,72 +0,0 @@
-# Working with documentation
-
-To update the documentation for any of the sections, just add a new markdown file to the designated subfolder below:
-
-```
-ā”œā”€concepts
-ā”œā”€tutorials
-ā”œā”€how-to
-│ └─troubleshooting
-└─reference
- ā”œā”€architecture
- ā”œā”€decisions
- └─faq
-```
-
-:::note Templates
-
-Each folder contains templates that can be used as quickstarts. Those are named `