diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 0000000..4e948db --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,173 @@

# Contributing to FuzzForge 🤝

Thank you for your interest in contributing to FuzzForge! We welcome contributions from the community and are excited to collaborate with you.

## 🌟 Ways to Contribute

- 🐛 **Bug Reports** - Help us identify and fix issues
- 💡 **Feature Requests** - Suggest new capabilities and improvements
- 🔧 **Code Contributions** - Submit bug fixes, features, and enhancements
- 📚 **Documentation** - Improve guides, tutorials, and API documentation
- 🧪 **Testing** - Help test new features and report issues
- 🛡️ **Security Workflows** - Contribute new security analysis workflows

## 📋 Contribution Guidelines

### Code Style

- Follow [PEP 8](https://pep8.org/) for Python code
- Use type hints where applicable
- Write clear, descriptive commit messages
- Include docstrings for all public functions and classes
- Add tests for new functionality

### Commit Message Format

We use conventional commits for a clear history:

```
<type>(<scope>): <subject>

[optional body]

[optional footer]
```

**Types:**
- `feat:` New feature
- `fix:` Bug fix
- `docs:` Documentation changes
- `style:` Code formatting (no logic changes)
- `refactor:` Code restructuring without changing functionality
- `test:` Adding or updating tests
- `chore:` Maintenance tasks

**Examples:**
```
feat(workflows): add new static analysis workflow for Go
fix(api): resolve authentication timeout issue
docs(readme): update installation instructions
```

### Pull Request Process

1. **Create a Branch**
   ```bash
   git checkout -b feature/your-feature-name
   # or
   git checkout -b fix/issue-description
   ```

2. **Make Your Changes**
   - Write clean, well-documented code
   - Add tests for new functionality
   - Update documentation as needed

3. **Test Your Changes**
   ```bash
   # Test workflows
   cd test_projects/vulnerable_app/
   ff workflow security_assessment .
   ```

4. **Submit Pull Request**
   - Use a clear, descriptive title
   - Provide a detailed description of the changes
   - Link related issues using `Fixes #123` or `Closes #123`
   - Ensure all CI checks pass

## 🛡️ Security Workflow Development

### Creating New Workflows

1. **Workflow Structure**
   ```
   backend/toolbox/workflows/your_workflow/
   ├── __init__.py
   ├── workflow.py       # Main Prefect flow
   ├── metadata.yaml     # Workflow metadata
   └── Dockerfile        # Container definition
   ```

2. **Register Your Workflow**
   Add your workflow to `backend/toolbox/workflows/registry.py`:
   ```python
   # Import your workflow
   from .your_workflow.workflow import main_flow as your_workflow_flow

   # Add to registry
   WORKFLOW_REGISTRY["your_workflow"] = {
       "flow": your_workflow_flow,
       "module_path": "toolbox.workflows.your_workflow.workflow",
       "function_name": "main_flow",
       "description": "Description of your workflow",
       "version": "1.0.0",
       "author": "Your Name",
       "tags": ["tag1", "tag2"]
   }
   ```
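   The `main_flow` you register lives in `workflow.py`. The sketch below is illustrative only: the scanner task and SARIF helper are hypothetical stand-ins, and only the `main_flow` entry point needs to match the registry's `function_name`:

   ```python
   # backend/toolbox/workflows/your_workflow/workflow.py (illustrative sketch)
   from prefect import flow, task


   @task
   def run_scanner(target_path: str) -> list[dict]:
       """Hypothetical helper: run your analysis tool and return raw findings."""
       return [{"rule": "EXAMPLE-001", "file": f"{target_path}/app.py", "line": 1}]


   def to_sarif(findings: list[dict]) -> dict:
       """Hypothetical helper: wrap findings in a minimal SARIF 2.1.0 envelope."""
       return {
           "version": "2.1.0",
           "runs": [{
               "tool": {"driver": {"name": "your_workflow"}},
               "results": [{
                   "ruleId": f["rule"],
                   "message": {"text": "Example finding"},
                   "locations": [{"physicalLocation": {
                       "artifactLocation": {"uri": f["file"]},
                       "region": {"startLine": f["line"]},
                   }}],
               } for f in findings],
           }],
       }


   @flow(name="your_workflow")
   def main_flow(target_path: str = ".") -> dict:
       """Entry point referenced by the registry entry above."""
       return to_sarif(run_scanner(target_path))
   ```

3. **Testing Workflows**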
   - Create test cases in `test_projects/vulnerable_app/`
   - Ensure SARIF output format compliance
   - Test with various input scenarios

### Security Guidelines

- 🔐 Never commit secrets, API keys, or credentials
- 🛡️ Focus on **defensive security** tools and analysis
- ⚠️ Do not create tools for malicious purposes
- 🧪 Test workflows thoroughly before submission
- 📋 Follow responsible disclosure for security issues

## 🐛 Bug Reports

When reporting bugs, please include:

- **Environment**: OS, Python version, Docker version
- **Steps to Reproduce**: Clear steps to recreate the issue
- **Expected Behavior**: What should happen
- **Actual Behavior**: What actually happens
- **Logs**: Relevant error messages and stack traces
- **Screenshots**: If applicable

Use our [Bug Report Template](.github/ISSUE_TEMPLATE/bug_report.md).

## 💡 Feature Requests

For new features, please provide:

- **Use Case**: Why is this feature needed?
- **Proposed Solution**: How should it work?
- **Alternatives**: Other approaches considered
- **Implementation**: Technical considerations (optional)

Use our [Feature Request Template](.github/ISSUE_TEMPLATE/feature_request.md).

## 📚 Documentation

Help improve our documentation:

- **API Documentation**: Update docstrings and type hints
- **User Guides**: Create tutorials and how-to guides
- **Workflow Documentation**: Document new security workflows
- **Examples**: Add practical usage examples

## 🙏 Recognition

Contributors will be:

- Listed in our [Contributors](CONTRIBUTORS.md) file
- Mentioned in release notes for significant contributions
- Invited to join our Discord community
- Eligible for FuzzingLabs Academy courses and swag

## 📜 License

By contributing to FuzzForge, you agree that your contributions will be licensed under the same [Business Source License 1.1](LICENSE) as the project.

---

**Thank you for making FuzzForge better! 🚀**

Every contribution, no matter how small, helps build a stronger security community.

diff --git a/LICENSE b/LICENSE new file mode 100644 index 0000000..5e32dfd --- /dev/null +++ b/LICENSE @@ -0,0 +1,61 @@

License text copyright (c) 2025 FuzzingLabs, All Rights Reserved.
"Business Source License" is a trademark of MariaDB Corporation Ab.

Parameters

Licensor: FuzzingLabs
Licensed Work: FuzzForge version 0.6.0 or later. The Licensed Work is (c) 2025 FuzzingLabs.
Additional Use Grant: You may make non-production use of the Licensed Work, including
  research, academic, educational, personal, or internal evaluation purposes.
  Production use of the Licensed Work requires a commercial license from FuzzingLabs.
Change Date: Four years from the date the Licensed Work is published.
Change License: Apache License, Version 2.0

For information about alternative licensing arrangements for the Licensed Work,
please contact licensing@fuzzinglabs.com.

Notice

Business Source License 1.1

Terms

The Licensor hereby grants you the right to copy, modify, create derivative
works, redistribute, and make non-production use of the Licensed Work. The
Licensor may make an Additional Use Grant, above, permitting limited production use.
+ +Effective on the Change Date, or the fourth anniversary of the first publicly +available distribution of a specific version of the Licensed Work under this +License, whichever comes first, the Licensor hereby grants you rights under +the terms of the Change License, and the rights granted in the paragraph +above terminate. + +If your use of the Licensed Work does not comply with the requirements +currently in effect as described in this License, you must purchase a +commercial license from the Licensor, its affiliated entities, or authorized +resellers, or you must refrain from using the Licensed Work. + +All copies of the original and modified Licensed Work, and derivative works +of the Licensed Work, are subject to this License. This License applies +separately for each version of the Licensed Work and the Change Date may vary +for each version of the Licensed Work released by Licensor. + +You must conspicuously display this License on each original or modified copy +of the Licensed Work. If you receive the Licensed Work in original or +modified form from a third party, the terms and conditions set forth in this +License apply to your use of that work. + +Any use of the Licensed Work in violation of this License will automatically +terminate your rights under this License for the current and all other +versions of the Licensed Work. + +This License does not grant you any right in any trademark or logo of +Licensor or its affiliates (provided that you may use a trademark or logo of +Licensor as expressly required by this License). + +TO THE EXTENT PERMITTED BY APPLICABLE LAW, THE LICENSED WORK IS PROVIDED ON +AN "AS IS" BASIS. LICENSOR HEREBY DISCLAIMS ALL WARRANTIES AND CONDITIONS, +EXPRESS OR IMPLIED, INCLUDING (WITHOUT LIMITATION) WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, NON-INFRINGEMENT, AND +TITLE. + diff --git a/LICENSE-APACHE b/LICENSE-APACHE new file mode 100644 index 0000000..cebda9f --- /dev/null +++ b/LICENSE-APACHE @@ -0,0 +1,201 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + +TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + +1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. 
+ + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + +2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + +3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a cross- + claim or counterclaim in a lawsuit) alleging that the Work or a + Contribution incorporated within the Work constitutes direct or + contributory patent infringement, then any patent licenses granted + to You under this License for that Work shall terminate as of the + date such litigation is filed. + +4. Redistribution. 
You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or Derivative + Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + +5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + +6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + +7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + +8. Limitation of Liability. 
In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + +9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + +END OF TERMS AND CONDITIONS + +APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + +Copyright [yyyy] [name of copyright owner] + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. diff --git a/NOTICE b/NOTICE new file mode 100644 index 0000000..26928a0 --- /dev/null +++ b/NOTICE @@ -0,0 +1,12 @@ +FuzzForge +Copyright (c) 2025 FuzzingLabs + +This product includes software developed by FuzzingLabs (https://fuzzforge.ai). + +Licensed under the Business Source License 1.1 (BSL). +After the Change Date (four years from the date of publication), this version +of the Licensed Work will be made available under the Apache License, Version 2.0. + +You may not use the name "FuzzingLabs" or "FuzzForge" nor the names of its +contributors to endorse or promote products derived from this software +without specific prior written permission. diff --git a/README.md b/README.md index 8c2736a..3fd9a34 100644 --- a/README.md +++ b/README.md @@ -1,13 +1,28 @@ -# FuzzForge +

+ FuzzForge Banner +

+

FuzzForge ๐Ÿšง

-![FuzzForge Logo](docs/assets/fuzzforge-logo.png) +

AI-powered workflow automation and AI Agents for AppSec, Fuzzing & Offensive Security

-**AI-powered workflow automation and AI Agents for AppSec, Fuzzing & Offensive Security** +

+ Discord + License: BSL + Apache + Python 3.11+ + Website + Version +

-[![Discord](https://img.shields.io/discord/0000000000000?logo=discord&label=Discord&color=7289da)](https://discord.com/invite/acqv9FVG) -[![Website](https://img.shields.io/badge/Website-fuzzforge.ai-blue?logo=vercel)](https://fuzzforge.ai) -[![License](https://img.shields.io/badge/license-BSL%20%2B%20Apache-orange)](LICENSE) -![Version](https://img.shields.io/badge/version-0.6.0-green) +

+ + Overview + โ€ข Features + โ€ข Installation + โ€ข Quickstart + โ€ข Demo + โ€ข Contributing + +

 ---

@@ -22,51 +37,7 @@ FuzzForge is **open source**, built to empower security teams, researchers, and the community.

 ---
-
-## ⚡ Quickstart
-
-Run your first workflow in **3 steps**:
-
-```bash
-# 1. Clone the repo
-git clone https://github.com/fuzzinglabs/fuzzforge.git
-cd fuzzforge
-
-# 2. Build & run with Docker
-docker compose up
-
-# 3. Access the UI
-open http://localhost:3000
-```
-
-👉 More installation options in the [Documentation](https://fuzzforge.ai/docs).
-
----
-
-## 🔍 Example Workflow
-
-Example: Run a workflow that audits an Android APK with AI agents:
-
-```bash
-fuzzforge run workflows/android_apk_audit.yaml
-```
-
-FuzzForge automatically orchestrates static analysis, AI-assisted reversing, and vulnerability triage.
-
----
-
-## 🎥 Demos
-
-### AI-Powered Workflow Execution
-![LLM Workflow Demo](docs/static/videos/llm_workflow.gif)
-
-*AI agents automatically analyzing code and providing security insights*
-
-### Manual Workflow Setup
-![Manual Workflow Demo](docs/static/videos/manual_workflow.gif)
-
-*Setting up and running security workflows through the interface*
+> 🚧 FuzzForge is still a work in progress; you can [subscribe]() to get the latest news.

 ---

@@ -81,12 +52,83 @@ FuzzForge automatically orchestrates static analysis, AI-assisted reversing, and vulnerability triage.

 ---

## 📦 Installation

### Requirements

**Python 3.11+**
Python 3.11 or higher is required.

**uv Package Manager**
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

**Docker**
For containerized workflows, see the [Docker Installation Guide](https://docs.docker.com/get-docker/).

### CLI Installation

After installing the requirements, install the FuzzForge CLI:

```bash
# Clone the repository
git clone https://github.com/fuzzinglabs/fuzzforge_ai.git
cd fuzzforge_ai

# Install the CLI with uv (from the root directory)
uv tool install --python python3.12 .
```

---

## ⚡ Quickstart

Run your first workflow in **3 steps**:

```bash
# 1. Clone the repo
git clone https://github.com/fuzzinglabs/fuzzforge.git
cd fuzzforge

# 2. Build & run with Docker
# Set the registry host for your OS (a local registry is mandatory)
# macOS/Windows (Docker Desktop):
export REGISTRY_HOST=host.docker.internal
# Linux (default):
# export REGISTRY_HOST=localhost
docker compose up -d
```

> The first launch can take 5-10 minutes while Docker images build, so it's a good time for a coffee break ☕

```bash
# 3. Run your first workflow
cd test_projects/vulnerable_app/   # Go into the test directory
fuzzforge init                     # Initialize a FuzzForge project
ff workflow security_assessment .  # Start a workflow (`ff` is an alias for `fuzzforge`)
```

### Manual Workflow Setup
![Manual Workflow Demo](docs/static/videos/manual_workflow.gif)

*Setting up and running security workflows through the interface*

👉 More installation options in the [Documentation](https://fuzzforge.ai/docs).
---

## AI-Powered Workflow Execution
![LLM Workflow Demo](docs/static/videos/llm_workflow.gif)

*AI agents automatically analyzing code and providing security insights*

## 📚 Resources

- 🌐 [Website](https://fuzzforge.ai)
- 📖 [Documentation](https://fuzzforge.ai/docs)
- 💬 [Community Discord](https://discord.com/invite/acqv9FVG)
-- 🎓 [FuzzingLabs Academy](https://academy.fuzzinglabs.com)
+- 🎓 [FuzzingLabs Academy](https://academy.fuzzinglabs.com/?coupon=GITHUB_FUZZFORGE)

---

diff --git a/ai/.gitignore b/ai/.gitignore new file mode 100644 index 0000000..5f018c8 --- /dev/null +++ b/ai/.gitignore @@ -0,0 +1,6 @@

.env
__pycache__/
*.pyc
fuzzforge_sessions.db
agentops.log
*.log

diff --git a/ai/README.md b/ai/README.md new file mode 100644 index 0000000..36f1f2f --- /dev/null +++ b/ai/README.md @@ -0,0 +1,110 @@

# FuzzForge AI Module

FuzzForge AI is the multi-agent layer that lets you operate the FuzzForge security platform through natural language. It orchestrates local tooling, registered Agent-to-Agent (A2A) peers, and the Prefect-powered backend while keeping long-running context in memory and project knowledge graphs.

## Quick Start

1. **Initialise a project**
   ```bash
   cd /path/to/project
   fuzzforge init
   ```
2. **Review environment settings** – copy `.fuzzforge/.env.template` to `.fuzzforge/.env`, then edit the values to match your provider. The template ships with commented defaults for OpenAI-style usage and placeholders for Cognee keys.
   ```env
   LLM_PROVIDER=openai
   LITELLM_MODEL=gpt-5-mini
   OPENAI_API_KEY=sk-your-key
   FUZZFORGE_MCP_URL=http://localhost:8010/mcp
   SESSION_PERSISTENCE=sqlite
   ```
   Optional flags you may want to enable early:
   ```env
   MEMORY_SERVICE=inmemory
   AGENTOPS_API_KEY=sk-your-agentops-key  # Enable hosted tracing
   LOG_LEVEL=INFO                         # CLI / server log level
   ```
3. **Populate the knowledge graph**
   ```bash
   fuzzforge ingest --path . --recursive
   # alias: fuzzforge rag ingest --path . --recursive
   ```
4. **Launch the agent shell**
   ```bash
   fuzzforge ai agent
   ```
   Keep the backend running (the Prefect-powered API, reached through the MCP endpoint in `FUZZFORGE_MCP_URL`) so workflow commands succeed.

## Everyday Workflow

- Run `fuzzforge ai agent` and start with `list available fuzzforge workflows` or `/memory status` to confirm everything is wired.
- Use natural prompts for automation (`run fuzzforge workflow …`, `search project knowledge for …`) and fall back to slash commands for precision (`/recall`, `/sendfile`).
- Keep `/memory datasets` handy to see which Cognee datasets are available after each ingest.
- Start the HTTP surface with `python -m fuzzforge_ai` when external agents need access to artifacts or graph queries. The CLI stays usable at the same time.
- Refresh the knowledge graph regularly: `fuzzforge ingest --path . --recursive --force` keeps responses aligned with recent code changes.

## What the Agent Can Do

- **Route requests** – automatically selects the right local tool or remote agent using the A2A capability registry.
- **Run security workflows** – list, submit, and monitor FuzzForge workflows via MCP wrappers.
- **Manage artifacts** – create downloadable files for reports, code edits, and shared attachments.
- **Maintain context** – stores session history, semantic recall, and Cognee project graphs.
- **Serve over HTTP** – expose the same agent as an A2A server using `python -m fuzzforge_ai` (see the example below).
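Once the server is up, a peer (or you) can verify that HTTP surface by fetching the agent card it publishes. A minimal sketch using `httpx` (already a dependency of this module), assuming the server runs locally on the default port 10100:

```python
# Sketch: discover a locally running FuzzForge A2A server.
import httpx

card = httpx.get("http://localhost:10100/.well-known/agent-card.json").json()
print(card["name"])  # e.g. "ProjectOrchestrator"
for skill in card.get("skills", []):
    print(f"- {skill['id']}: {skill['description']}")
```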
## Essential Commands

Inside `fuzzforge ai agent` you can mix slash commands and free-form prompts:

```text
/list                          # Show registered A2A agents
/register http://<host>:10201  # Add a remote agent
/artifacts                     # List generated files
/sendfile SecurityAgent src/report.md "Please review"
You> route_to SecurityAnalyzer: scan ./backend for secrets
You> run fuzzforge workflow static_analysis_scan on ./test_projects/demo
You> search project knowledge for "prefect status" using INSIGHTS
```

Artifacts created during the conversation are served from `.fuzzforge/artifacts/` and exposed through the A2A HTTP API.

## Memory & Knowledge

The module layers three storage systems:

- **Session persistence** (SQLite or in-memory) for chat transcripts.
- **Semantic recall** via the ADK memory service for fuzzy search.
- **Cognee graphs** for project-wide knowledge built from ingestion runs.

Re-run ingestion after major code changes to keep graph answers relevant. If Cognee variables are not set, graph-specific tools automatically respond with a polite "not configured" message.

## Sample Prompts

Use these to validate the setup once the agent shell is running:

- `list available fuzzforge workflows`
- `run fuzzforge workflow static_analysis_scan on ./backend with target_branch=main`
- `show findings for that run once it finishes`
- `refresh the project knowledge graph for ./backend`
- `search project knowledge for "prefect readiness" using INSIGHTS`
- `/recall terraform secrets`
- `/memory status`
- `ROUTE_TO SecurityAnalyzer: audit infrastructure_vulnerable`

## Need More Detail?

Dive into the dedicated guides under `ai/docs/advanced/`:

- [Architecture](https://docs.fuzzforge.ai/docs/ai/intro) – High-level architecture with diagrams and component breakdowns.
- [Ingestion](https://docs.fuzzforge.ai/docs/ai/ingestion.md) – Command options, Cognee persistence, and prompt examples.
- [Configuration](https://docs.fuzzforge.ai/docs/ai/configuration.md) – LLM provider matrix, local model setup, and tracing options.
- [Prompts](https://docs.fuzzforge.ai/docs/ai/prompts.md) – Slash commands, workflow prompts, and routing tips.
- [A2A Services](https://docs.fuzzforge.ai/docs/ai/a2a-services.md) – HTTP endpoints, agent card, and collaboration flow.
- [Memory Persistence](https://docs.fuzzforge.ai/docs/ai/architecture.md#memory--persistence) – Deep dive on memory storage, datasets, and how `/memory status` inspects them.

## Development Notes

- Entry point for the CLI: `ai/src/fuzzforge_ai/cli.py`
- A2A HTTP server: `ai/src/fuzzforge_ai/a2a_server.py`
- Tool routing & workflow glue: `ai/src/fuzzforge_ai/agent_executor.py`
- Ingestion helpers: `ai/src/fuzzforge_ai/ingest_utils.py`

Install the module in editable mode (`pip install -e ai`) while iterating so CLI changes are picked up immediately.

diff --git a/ai/llm.txt b/ai/llm.txt new file mode 100644 index 0000000..4c54800 --- /dev/null +++ b/ai/llm.txt @@ -0,0 +1,93 @@

FuzzForge AI LLM Configuration Guide
====================================

This note summarises the environment variables and libraries that drive LiteLLM (via the Google ADK runtime) inside the FuzzForge AI module. For complete matrices and advanced examples, read `docs/advanced/configuration.md`.

Core Libraries
--------------
- `google-adk` – hosts the agent runtime, memory services, and LiteLLM bridge.
- `litellm` – provider-agnostic LLM client used by ADK and the executor.
- Provider SDKs – install the SDK that matches your target backend (`openai`, `anthropic`, `google-cloud-aiplatform`, `groq`, etc.).
- Optional extras: `agentops` for tracing, `cognee[all]` for knowledge-graph ingestion, `ollama` CLI for running local models.

Quick install foundation::

```
pip install google-adk litellm openai
```

Add any provider-specific SDKs (for example `pip install anthropic groq`) on top of that base.

Baseline Setup
--------------
Copy `.fuzzforge/.env.template` to `.fuzzforge/.env` and set the core fields:

```
LLM_PROVIDER=openai
LITELLM_MODEL=gpt-5-mini
OPENAI_API_KEY=sk-your-key
FUZZFORGE_MCP_URL=http://localhost:8010/mcp
SESSION_PERSISTENCE=sqlite
MEMORY_SERVICE=inmemory
```

LiteLLM Provider Examples
-------------------------

OpenAI-compatible (Azure, etc.)::
```
LLM_PROVIDER=azure_openai
LITELLM_MODEL=gpt-4o-mini
LLM_API_KEY=sk-your-azure-key
LLM_ENDPOINT=https://your-resource.openai.azure.com
```

Anthropic::
```
LLM_PROVIDER=anthropic
LITELLM_MODEL=claude-3-haiku-20240307
ANTHROPIC_API_KEY=sk-your-key
```

Ollama (local)::
```
LLM_PROVIDER=ollama_chat
LITELLM_MODEL=codellama:latest
OLLAMA_API_BASE=http://localhost:11434
```
Run `ollama pull codellama:latest` so the adapter can respond immediately.

Vertex AI::
```
LLM_PROVIDER=vertex_ai
LITELLM_MODEL=gemini-1.5-pro
GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
```

Provider Checklist
------------------
- **OpenAI / Azure OpenAI**: `LLM_PROVIDER`, `LITELLM_MODEL`, API key, optional endpoint + API version (Azure).
- **Anthropic**: `LLM_PROVIDER=anthropic`, `LITELLM_MODEL`, `ANTHROPIC_API_KEY`.
- **Google Vertex AI**: `LLM_PROVIDER=vertex_ai`, `LITELLM_MODEL`, `GOOGLE_APPLICATION_CREDENTIALS`, `GOOGLE_CLOUD_PROJECT`.
- **Groq**: `LLM_PROVIDER=groq`, `LITELLM_MODEL`, `GROQ_API_KEY`.
- **Ollama / Local**: `LLM_PROVIDER=ollama_chat`, `LITELLM_MODEL`, `OLLAMA_API_BASE`, and the model pulled locally (`ollama pull <model>`).

Knowledge Graph Add-ons
-----------------------
Set these only if you plan to use Cognee project graphs:

```
LLM_COGNEE_PROVIDER=openai
LLM_COGNEE_MODEL=gpt-5-mini
LLM_COGNEE_API_KEY=sk-your-key
```

Tracing & Debugging
-------------------
- Provide `AGENTOPS_API_KEY` to enable hosted traces for every conversation.
- Set `FUZZFORGE_DEBUG=1` (and optionally `LOG_LEVEL=DEBUG`) for verbose executor output.
- Restart the agent after changing environment variables; LiteLLM loads configuration on boot.

Further Reading
---------------
`docs/advanced/configuration.md` – provider comparison, debugging flags, and referenced modules.
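Quick Verification
------------------
To sanity-check a provider setup outside the agent, you can call LiteLLM directly. The sketch below joins `LLM_PROVIDER` and `LITELLM_MODEL` into LiteLLM's usual `provider/model` string; treat that composition as illustrative rather than a guaranteed FuzzForge internal:

```
# sketch.py (Python): verify the configured provider/model pair responds
import os
import litellm

# Illustrative composition; LiteLLM accepts "provider/model" identifiers.
model = f"{os.environ['LLM_PROVIDER']}/{os.environ['LITELLM_MODEL']}"
response = litellm.completion(
    model=model,  # e.g. "ollama_chat/codellama:latest"
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```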
diff --git a/ai/pyproject.toml b/ai/pyproject.toml new file mode 100644 index 0000000..ef62383 --- /dev/null +++ b/ai/pyproject.toml @@ -0,0 +1,44 @@ +[project] +name = "fuzzforge-ai" +version = "0.6.0" +description = "FuzzForge AI orchestration module" +readme = "README.md" +requires-python = ">=3.11" +dependencies = [ + "google-adk", + "a2a-sdk", + "litellm", + "python-dotenv", + "httpx", + "uvicorn", + "rich", + "agentops", + "fastmcp", + "mcp", + "typing-extensions", + "cognee>=0.3.0", +] + +[project.optional-dependencies] +dev = [ + "pytest", + "pytest-asyncio", + "black", + "ruff", +] + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["src/fuzzforge_ai"] + +[tool.hatch.metadata] +allow-direct-references = true + +[tool.uv] +dev-dependencies = [ + "pytest", + "pytest-asyncio", +] diff --git a/ai/src/fuzzforge_ai/__init__.py b/ai/src/fuzzforge_ai/__init__.py new file mode 100644 index 0000000..5b343a2 --- /dev/null +++ b/ai/src/fuzzforge_ai/__init__.py @@ -0,0 +1,24 @@ +""" +FuzzForge AI Module - Agent-to-Agent orchestration system + +This module integrates the fuzzforge_ai components into FuzzForge, +providing intelligent AI agent capabilities for security analysis. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +__version__ = "0.6.0" + +from .agent import FuzzForgeAgent +from .config_manager import ConfigManager + +__all__ = ['FuzzForgeAgent', 'ConfigManager'] \ No newline at end of file diff --git a/ai/src/fuzzforge_ai/__main__.py b/ai/src/fuzzforge_ai/__main__.py new file mode 100644 index 0000000..9a3e73b --- /dev/null +++ b/ai/src/fuzzforge_ai/__main__.py @@ -0,0 +1,109 @@ +""" +FuzzForge A2A Server +Run this to expose FuzzForge as an A2A-compatible agent +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
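# Usage (see ai/README.md): run `python -m fuzzforge_ai` from an initialized
# project directory; FUZZFORGE_PORT (default 10100) selects the listen port.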
+ + +import os +import warnings +import logging +from dotenv import load_dotenv + +from fuzzforge_ai.config_bridge import ProjectConfigManager + +# Suppress warnings +warnings.filterwarnings("ignore") +logging.getLogger("google.adk").setLevel(logging.ERROR) +logging.getLogger("google.adk.tools.base_authenticated_tool").setLevel(logging.ERROR) + +# Load .env from .fuzzforge directory first, then fallback +from pathlib import Path + +# Ensure Cognee logs stay inside the project workspace +project_root = Path.cwd() +default_log_dir = project_root / ".fuzzforge" / "logs" +default_log_dir.mkdir(parents=True, exist_ok=True) +log_path = default_log_dir / "cognee.log" +os.environ.setdefault("COGNEE_LOG_PATH", str(log_path)) +fuzzforge_env = Path.cwd() / ".fuzzforge" / ".env" +if fuzzforge_env.exists(): + load_dotenv(fuzzforge_env, override=True) +else: + load_dotenv(override=True) + +# Ensure Cognee uses the project-specific storage paths when available +try: + project_config = ProjectConfigManager() + project_config.setup_cognee_environment() +except Exception: + # Project may not be initialized; fall through with default settings + pass + +# Check configuration +if not os.getenv('LITELLM_MODEL'): + print("[ERROR] LITELLM_MODEL not set in .env file") + print("Please set LITELLM_MODEL to your desired model (e.g., gpt-4o-mini)") + exit(1) + +from .agent import get_fuzzforge_agent +from .a2a_server import create_a2a_app as create_custom_a2a_app + + +def create_a2a_app(): + """Create the A2A application""" + # Get configuration + port = int(os.getenv('FUZZFORGE_PORT', 10100)) + + # Get the FuzzForge agent + fuzzforge = get_fuzzforge_agent() + + # Print ASCII banner + print("\033[95m") # Purple color + print(" โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•—") + print(" โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ•โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ•šโ•โ•โ–ˆโ–ˆโ–ˆโ•”โ•โ•šโ•โ•โ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ•โ–ˆโ–ˆโ•”โ•โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ• โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ• โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•‘") + print(" โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•‘") + print(" โ–ˆโ–ˆโ•”โ•โ•โ• โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ•”โ•โ•โ• โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•”โ•โ•โ• โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•‘") + print(" โ–ˆโ–ˆโ•‘ โ•šโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•‘ โ•šโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ•šโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•‘") + print(" โ•šโ•โ• โ•šโ•โ•โ•โ•โ•โ• โ•šโ•โ•โ•โ•โ•โ•โ•โ•šโ•โ•โ•โ•โ•โ•โ•โ•šโ•โ• โ•šโ•โ•โ•โ•โ•โ• โ•šโ•โ• โ•šโ•โ• โ•šโ•โ•โ•โ•โ•โ• โ•šโ•โ•โ•โ•โ•โ•โ• โ•šโ•โ• โ•šโ•โ•โ•šโ•โ•") + print("\033[0m") # Reset color + + # Create A2A app + print(f"๐Ÿš€ Starting FuzzForge A2A Server") + print(f" Model: {fuzzforge.model}") + if fuzzforge.cognee_url: + print(f" Memory: Cognee at {fuzzforge.cognee_url}") + print(f" Port: {port}") + + app = create_custom_a2a_app(fuzzforge.adk_agent, port=port, executor=fuzzforge.executor) + + print(f"\nโœ… FuzzForge A2A Server ready!") + print(f" Agent card: 
http://localhost:{port}/.well-known/agent-card.json") + print(f" A2A endpoint: http://localhost:{port}/") + print(f"\n๐Ÿ“ก Other agents can register FuzzForge at: http://localhost:{port}") + + return app + + +def main(): + """Start the A2A server using uvicorn.""" + import uvicorn + + app = create_a2a_app() + port = int(os.getenv('FUZZFORGE_PORT', 10100)) + + print(f"\n๐ŸŽฏ Starting server with uvicorn...") + uvicorn.run(app, host="127.0.0.1", port=port) + + +if __name__ == "__main__": + main() diff --git a/ai/src/fuzzforge_ai/a2a_server.py b/ai/src/fuzzforge_ai/a2a_server.py new file mode 100644 index 0000000..310451c --- /dev/null +++ b/ai/src/fuzzforge_ai/a2a_server.py @@ -0,0 +1,230 @@ +"""Custom A2A wiring so we can access task store and queue manager.""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +from __future__ import annotations + +import logging +from typing import Optional, Union + +from starlette.applications import Starlette +from starlette.responses import Response, FileResponse +from starlette.routing import Route + +from google.adk.a2a.executor.a2a_agent_executor import A2aAgentExecutor +from google.adk.a2a.utils.agent_card_builder import AgentCardBuilder +from google.adk.a2a.experimental import a2a_experimental +from google.adk.agents.base_agent import BaseAgent +from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService +from google.adk.auth.credential_service.in_memory_credential_service import InMemoryCredentialService +from google.adk.cli.utils.logs import setup_adk_logger +from google.adk.memory.in_memory_memory_service import InMemoryMemoryService +from google.adk.runners import Runner +from google.adk.sessions.in_memory_session_service import InMemorySessionService + +from a2a.server.apps import A2AStarletteApplication +from a2a.server.request_handlers.default_request_handler import DefaultRequestHandler +from a2a.server.tasks.inmemory_task_store import InMemoryTaskStore +from a2a.server.events.in_memory_queue_manager import InMemoryQueueManager +from a2a.types import AgentCard + +from .agent_executor import FuzzForgeExecutor + + +import json + + +async def serve_artifact(request): + """Serve artifact files via HTTP for A2A agents""" + artifact_id = request.path_params["artifact_id"] + + # Try to get the executor instance to access artifact cache + # We'll store a reference to it during app creation + executor = getattr(serve_artifact, '_executor', None) + if not executor: + return Response("Artifact service not available", status_code=503) + + try: + # Look in the artifact cache directory + artifact_cache_dir = executor._artifact_cache_dir + artifact_dir = artifact_cache_dir / artifact_id + + if not artifact_dir.exists(): + return Response("Artifact not found", status_code=404) + + # Find the artifact file (should be only one file in the directory) + artifact_files = list(artifact_dir.glob("*")) + if not artifact_files: + return Response("Artifact file not found", status_code=404) + + artifact_file = artifact_files[0] # Take the first (and should be only) file + + # Determine mime type from file extension or 
default to octet-stream + import mimetypes + mime_type, _ = mimetypes.guess_type(str(artifact_file)) + if not mime_type: + mime_type = 'application/octet-stream' + + return FileResponse( + path=str(artifact_file), + media_type=mime_type, + filename=artifact_file.name + ) + + except Exception as e: + return Response(f"Error serving artifact: {str(e)}", status_code=500) + + +async def knowledge_query(request): + """Expose knowledge graph search over HTTP for external agents.""" + executor = getattr(knowledge_query, '_executor', None) + if not executor: + return Response("Knowledge service not available", status_code=503) + + try: + payload = await request.json() + except Exception: + return Response("Invalid JSON body", status_code=400) + + query = payload.get("query") + if not query: + return Response("'query' is required", status_code=400) + + search_type = payload.get("search_type", "INSIGHTS") + dataset = payload.get("dataset") + + result = await executor.query_project_knowledge_api( + query=query, + search_type=search_type, + dataset=dataset, + ) + + status = 200 if not isinstance(result, dict) or "error" not in result else 400 + return Response( + json.dumps(result, default=str), + status_code=status, + media_type="application/json", + ) + + +async def create_file_artifact(request): + """Create an artifact from a project file via HTTP.""" + executor = getattr(create_file_artifact, '_executor', None) + if not executor: + return Response("File service not available", status_code=503) + + try: + payload = await request.json() + except Exception: + return Response("Invalid JSON body", status_code=400) + + path = payload.get("path") + if not path: + return Response("'path' is required", status_code=400) + + result = await executor.create_project_file_artifact_api(path) + status = 200 if not isinstance(result, dict) or "error" not in result else 400 + return Response( + json.dumps(result, default=str), + status_code=status, + media_type="application/json", + ) + + +def _load_agent_card(agent_card: Optional[Union[AgentCard, str]]) -> Optional[AgentCard]: + if agent_card is None: + return None + if isinstance(agent_card, AgentCard): + return agent_card + + import json + from pathlib import Path + + path = Path(agent_card) + with path.open('r', encoding='utf-8') as handle: + data = json.load(handle) + return AgentCard(**data) + + +@a2a_experimental +def create_a2a_app( + agent: BaseAgent, + *, + host: str = "localhost", + port: int = 8000, + protocol: str = "http", + agent_card: Optional[Union[AgentCard, str]] = None, + executor=None, # Accept executor reference +) -> Starlette: + """Variant of google.adk.a2a.utils.to_a2a that exposes task-store handles.""" + + setup_adk_logger(logging.INFO) + + async def create_runner() -> Runner: + return Runner( + agent=agent, + app_name=agent.name or "fuzzforge", + artifact_service=InMemoryArtifactService(), + session_service=InMemorySessionService(), + memory_service=InMemoryMemoryService(), + credential_service=InMemoryCredentialService(), + ) + + task_store = InMemoryTaskStore() + queue_manager = InMemoryQueueManager() + + agent_executor = A2aAgentExecutor(runner=create_runner) + request_handler = DefaultRequestHandler( + agent_executor=agent_executor, + task_store=task_store, + queue_manager=queue_manager, + ) + + rpc_url = f"{protocol}://{host}:{port}/" + provided_card = _load_agent_card(agent_card) + + card_builder = AgentCardBuilder(agent=agent, rpc_url=rpc_url) + + app = Starlette() + + async def setup() -> None: + if provided_card is not None: + 
final_card = provided_card + else: + final_card = await card_builder.build() + + a2a_app = A2AStarletteApplication( + agent_card=final_card, + http_handler=request_handler, + ) + a2a_app.add_routes_to_app(app) + + # Add artifact serving route + app.router.add_route("/artifacts/{artifact_id}", serve_artifact, methods=["GET"]) + app.router.add_route("/graph/query", knowledge_query, methods=["POST"]) + app.router.add_route("/project/files", create_file_artifact, methods=["POST"]) + + app.add_event_handler("startup", setup) + + # Expose handles so the executor can emit task updates later + FuzzForgeExecutor.task_store = task_store + FuzzForgeExecutor.queue_manager = queue_manager + + # Store reference to executor for artifact serving + serve_artifact._executor = executor + knowledge_query._executor = executor + create_file_artifact._executor = executor + + return app + + +__all__ = ["create_a2a_app"] diff --git a/ai/src/fuzzforge_ai/agent.py b/ai/src/fuzzforge_ai/agent.py new file mode 100644 index 0000000..0cedc7a --- /dev/null +++ b/ai/src/fuzzforge_ai/agent.py @@ -0,0 +1,133 @@ +""" +FuzzForge Agent Definition +The core agent that combines all components +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import os +from pathlib import Path +from typing import Dict, Any, List +from google.adk import Agent +from google.adk.models.lite_llm import LiteLlm +from .agent_card import get_fuzzforge_agent_card +from .agent_executor import FuzzForgeExecutor +from .memory_service import FuzzForgeMemoryService, HybridMemoryManager + +# Load environment variables from the AI module's .env file +try: + from dotenv import load_dotenv + _ai_dir = Path(__file__).parent + _env_file = _ai_dir / ".env" + if _env_file.exists(): + load_dotenv(_env_file, override=False) # Don't override existing env vars +except ImportError: + # dotenv not available, skip loading + pass + + +class FuzzForgeAgent: + """The main FuzzForge agent that combines card, executor, and ADK agent""" + + def __init__( + self, + model: str = None, + cognee_url: str = None, + port: int = 10100, + ): + """Initialize FuzzForge agent with configuration""" + self.model = model or os.getenv('LITELLM_MODEL', 'gpt-4o-mini') + self.cognee_url = cognee_url or os.getenv('COGNEE_MCP_URL') + self.port = port + + # Initialize ADK Memory Service for conversational memory + memory_type = os.getenv('MEMORY_SERVICE', 'inmemory') + self.memory_service = FuzzForgeMemoryService(memory_type=memory_type) + + # Create the executor (the brain) with memory and session services + self.executor = FuzzForgeExecutor( + model=self.model, + cognee_url=self.cognee_url, + debug=os.getenv('FUZZFORGE_DEBUG', '0') == '1', + memory_service=self.memory_service, + session_persistence=os.getenv('SESSION_PERSISTENCE', 'inmemory'), + fuzzforge_mcp_url=os.getenv('FUZZFORGE_MCP_URL'), + ) + + # Create Hybrid Memory Manager (ADK + Cognee direct integration) + # MCP tools removed - using direct Cognee integration only + self.memory_manager = HybridMemoryManager( + memory_service=self.memory_service, + cognee_tools=None # No MCP tools, direct integration used 
instead + ) + + # Get the agent card (the identity) + self.agent_card = get_fuzzforge_agent_card(f"http://localhost:{self.port}") + + # Create the ADK agent (for A2A server mode) + self.adk_agent = self._create_adk_agent() + + def _create_adk_agent(self) -> Agent: + """Create the ADK agent for A2A server mode""" + # Build instruction + instruction = f"""You are {self.agent_card.name}, {self.agent_card.description} + +Your capabilities include: +""" + for skill in self.agent_card.skills: + instruction += f"\n- {skill.name}: {skill.description}" + + instruction += """ + +When responding to requests: +1. Use your registered agents when appropriate +2. Use Cognee memory tools when available +3. Provide helpful, concise responses +4. Maintain context across conversations +""" + + # Create ADK agent + return Agent( + model=LiteLlm(model=self.model), + name=self.agent_card.name, + description=self.agent_card.description, + instruction=instruction, + tools=self.executor.agent.tools if hasattr(self.executor.agent, 'tools') else [] + ) + + async def process_message(self, message: str, context_id: str = None) -> str: + """Process a message using the executor""" + result = await self.executor.execute(message, context_id or "default") + return result.get("response", "No response generated") + + async def register_agent(self, url: str) -> Dict[str, Any]: + """Register a new agent""" + return await self.executor.register_agent(url) + + def list_agents(self) -> List[Dict[str, Any]]: + """List registered agents""" + return self.executor.list_agents() + + async def cleanup(self): + """Clean up resources""" + await self.executor.cleanup() + + +# Create a singleton instance for import +_instance = None + +def get_fuzzforge_agent() -> FuzzForgeAgent: + """Get the singleton FuzzForge agent instance""" + global _instance + if _instance is None: + _instance = FuzzForgeAgent() + return _instance diff --git a/ai/src/fuzzforge_ai/agent_card.py b/ai/src/fuzzforge_ai/agent_card.py new file mode 100644 index 0000000..9150092 --- /dev/null +++ b/ai/src/fuzzforge_ai/agent_card.py @@ -0,0 +1,183 @@ +""" +FuzzForge Agent Card and Skills Definition +Defines what FuzzForge can do and how others can discover it +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
+ + +from dataclasses import dataclass +from typing import List, Optional, Dict, Any + +@dataclass +class AgentSkill: + """Represents a specific capability of the agent""" + id: str + name: str + description: str + tags: List[str] + examples: List[str] + input_modes: List[str] = None + output_modes: List[str] = None + + def to_dict(self) -> Dict[str, Any]: + """Convert to dictionary for JSON serialization""" + return { + "id": self.id, + "name": self.name, + "description": self.description, + "tags": self.tags, + "examples": self.examples, + "inputModes": self.input_modes or ["text/plain"], + "outputModes": self.output_modes or ["text/plain"] + } + + +@dataclass +class AgentCapabilities: + """Defines agent capabilities for A2A protocol""" + streaming: bool = False + push_notifications: bool = False + multi_turn: bool = True + context_retention: bool = True + + def to_dict(self) -> Dict[str, Any]: + return { + "streaming": self.streaming, + "pushNotifications": self.push_notifications, + "multiTurn": self.multi_turn, + "contextRetention": self.context_retention + } + + +@dataclass +class AgentCard: + """The agent's business card - tells others what this agent can do""" + name: str + description: str + version: str + url: str + skills: List[AgentSkill] + capabilities: AgentCapabilities + default_input_modes: List[str] = None + default_output_modes: List[str] = None + preferred_transport: str = "JSONRPC" + protocol_version: str = "0.3.0" + + def to_dict(self) -> Dict[str, Any]: + """Convert to A2A-compliant agent card JSON""" + return { + "name": self.name, + "description": self.description, + "version": self.version, + "url": self.url, + "protocolVersion": self.protocol_version, + "preferredTransport": self.preferred_transport, + "defaultInputModes": self.default_input_modes or ["text/plain"], + "defaultOutputModes": self.default_output_modes or ["text/plain"], + "capabilities": self.capabilities.to_dict(), + "skills": [skill.to_dict() for skill in self.skills] + } + + +# Define FuzzForge's skills +orchestration_skill = AgentSkill( + id="orchestration", + name="Agent Orchestration", + description="Route requests to appropriate registered agents based on their capabilities", + tags=["orchestration", "routing", "coordination"], + examples=[ + "Route this to the calculator", + "Send this to the appropriate agent", + "Which agent should handle this?" 
+ ] +) + +memory_skill = AgentSkill( + id="memory", + name="Memory Management", + description="Store and retrieve information using Cognee knowledge graph", + tags=["memory", "knowledge", "storage", "cognee"], + examples=[ + "Remember that my favorite color is blue", + "What do you remember about me?", + "Search your memory for project details" + ] +) + +conversation_skill = AgentSkill( + id="conversation", + name="General Conversation", + description="Engage in general conversation and answer questions using LLM", + tags=["chat", "conversation", "qa", "llm"], + examples=[ + "What is the meaning of life?", + "Explain quantum computing", + "Help me understand this concept" + ] +) + +workflow_automation_skill = AgentSkill( + id="workflow_automation", + name="Workflow Automation", + description="Operate project workflows via MCP, monitor runs, and share results", + tags=["workflow", "automation", "mcp", "orchestration"], + examples=[ + "Submit the security assessment workflow", + "Kick off the infrastructure scan and monitor it", + "Summarise findings for run abc123" + ] +) + +agent_management_skill = AgentSkill( + id="agent_management", + name="Agent Registry Management", + description="Register, list, and manage connections to other A2A agents", + tags=["registry", "management", "discovery"], + examples=[ + "Register agent at http://localhost:10201", + "List all registered agents", + "Show agent capabilities" + ] +) + +# Define FuzzForge's capabilities +fuzzforge_capabilities = AgentCapabilities( + streaming=False, + push_notifications=True, + multi_turn=True, # We support multi-turn conversations + context_retention=True # We maintain context across turns +) + +# Create the public agent card +def get_fuzzforge_agent_card(url: str = "http://localhost:10100") -> AgentCard: + """Get FuzzForge's agent card with current configuration""" + return AgentCard( + name="ProjectOrchestrator", + description=( + "An A2A-capable project agent that can launch and monitor FuzzForge workflows, " + "consult the project knowledge graph, and coordinate with speciality agents." + ), + version="project-agent", + url=url, + skills=[ + orchestration_skill, + memory_skill, + conversation_skill, + workflow_automation_skill, + agent_management_skill + ], + capabilities=fuzzforge_capabilities, + default_input_modes=["text/plain", "application/json"], + default_output_modes=["text/plain", "application/json"], + preferred_transport="JSONRPC", + protocol_version="0.3.0" + ) diff --git a/ai/src/fuzzforge_ai/agent_executor.py b/ai/src/fuzzforge_ai/agent_executor.py new file mode 100644 index 0000000..6c0be70 --- /dev/null +++ b/ai/src/fuzzforge_ai/agent_executor.py @@ -0,0 +1,2319 @@ +"""FuzzForge Agent Executor - orchestrates workflows and delegation.""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
+ + +import asyncio +import base64 +import time +import uuid +import json +from typing import Dict, Any, List, Union +from datetime import datetime +import os +import warnings +import logging +from pathlib import Path +import mimetypes +import hashlib +import tempfile + +# Suppress warnings +warnings.filterwarnings("ignore") +logging.getLogger("google.adk").setLevel(logging.ERROR) +logging.getLogger("google.adk.tools.base_authenticated_tool").setLevel(logging.ERROR) +logging.getLogger("agentops").setLevel(logging.ERROR) + +from google.genai import types +from google.adk.runners import Runner +from google.adk.sessions import DatabaseSessionService, InMemorySessionService +from google.adk.agents import LlmAgent +from google.adk.models.lite_llm import LiteLlm +from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService +from google.adk.artifacts.gcs_artifact_service import GcsArtifactService +from google.adk.events.event import Event +from google.adk.events.event_actions import EventActions +from google.adk.tools import FunctionTool +from google.adk.tools.long_running_tool import LongRunningFunctionTool +from google.adk.tools.tool_context import ToolContext + +# Optional AgentOps +try: + import agentops + AGENTOPS_AVAILABLE = True +except ImportError: + AGENTOPS_AVAILABLE = False + +# MCP functionality removed - keeping direct Cognee integration only + +from google.genai.types import Part +from a2a.types import ( + Task, + TaskStatus, + TaskState, + TaskStatusUpdateEvent, + Message, + Part as A2APart, +) + +from .remote_agent import RemoteAgentConnection +from .config_bridge import ProjectConfigManager + + +class FuzzForgeExecutor: + """Executes tasks for FuzzForge - the brain of the operation""" + + task_store = None + queue_manager = None + + def __init__( + self, + model: str = None, + cognee_url: str = None, + debug: bool = False, + memory_service=None, + session_persistence: str = None, + fuzzforge_mcp_url: str = None, + ): + """Initialize the executor with configuration""" + self.model = model or os.getenv('LITELLM_MODEL', 'gpt-5-mini') + self.cognee_url = cognee_url or os.getenv('COGNEE_MCP_URL') + self.debug = debug + self.memory_service = memory_service # ADK memory service + self.session_persistence = session_persistence or os.getenv('SESSION_PERSISTENCE', 'inmemory') + self.fuzzforge_mcp_url = fuzzforge_mcp_url or os.getenv('FUZZFORGE_MCP_URL') + self._background_tasks: set[asyncio.Task] = set() + self.pending_runs: Dict[str, Dict[str, Any]] = {} + self.session_metadata: Dict[str, Dict[str, Any]] = {} + self._artifact_cache_dir = Path(os.getenv('FUZZFORGE_ARTIFACT_DIR', Path.cwd() / '.fuzzforge' / 'artifacts')) + self._knowledge_integration = None + + # Initialize Cognee service if available + self.cognee_service = None + self._cognee_initialized = False + + # Agent registry - stores registered agents + self.agents: Dict[str, Dict[str, Any]] = {} + + # Session management + self.sessions: Dict[str, Any] = {} + self.session_lookup: Dict[str, str] = {} + + # Create session service based on persistence setting + self.session_service = self._create_session_service() + + # Initialize artifact service (A2A compliant) + self.artifact_service = self._create_artifact_service() + # Local artifact cache for quick access + self.artifacts: Dict[str, List[Dict[str, Any]]] = {} + + # Initialize AgentOps if available + self.agentops_trace = None + if AGENTOPS_AVAILABLE and os.getenv('AGENTOPS_API_KEY'): + try: + agentops.init(api_key=os.getenv('AGENTOPS_API_KEY')) + 
self.agentops_trace = agentops.start_trace() + if self.debug: + print("[DEBUG] AgentOps tracking enabled") + except Exception as e: + if self.debug: + print(f"[DEBUG] AgentOps init failed: {e}") + + # Initialize the core agent + self._initialize_agent() + + # Auto-register agents from config + self._auto_register_agents() + + # Ensure task store/queue manager exist for CLI usage even without A2A server + if getattr(FuzzForgeExecutor, "task_store", None) is None: + try: + from a2a.server.tasks.inmemory_task_store import InMemoryTaskStore + FuzzForgeExecutor.task_store = InMemoryTaskStore() + except Exception: + FuzzForgeExecutor.task_store = None + if getattr(FuzzForgeExecutor, "queue_manager", None) is None: + try: + from a2a.server.events.in_memory_queue_manager import InMemoryQueueManager + FuzzForgeExecutor.queue_manager = InMemoryQueueManager() + except Exception: + FuzzForgeExecutor.queue_manager = None + + self.task_store = FuzzForgeExecutor.task_store + self.queue_manager = FuzzForgeExecutor.queue_manager + + def _auto_register_agents(self): + """Auto-register agents from config file""" + try: + from .config_manager import ConfigManager + config_mgr = ConfigManager() + registered = config_mgr.get_registered_agents() + + if registered and self.debug: + print(f"[DEBUG] Auto-registering {len(registered)} agents from config") + + for agent_config in registered: + url = agent_config.get('url') + name = agent_config.get('name', '') + if url: + # Register silently (don't wait for async) + import asyncio + try: + loop = asyncio.get_event_loop() + if loop.is_running(): + # Schedule for later if loop is already running + asyncio.create_task(self._register_agent_async(url, name)) + else: + # Run now if no loop is running + loop.run_until_complete(self._register_agent_async(url, name)) + except: + # Ignore auto-registration failures + pass + except Exception as e: + if self.debug: + print(f"[DEBUG] Auto-registration failed: {e}") + + async def _register_agent_async(self, url: str, name: str): + """Async helper for auto-registration""" + try: + result = await self.register_agent(url) + if self.debug: + if result.get('success'): + print(f"[DEBUG] Auto-registered: {name or result.get('name')} at {url} as RemoteA2aAgent sub-agent") + else: + print(f"[DEBUG] Failed to auto-register {url}: {result.get('error')}") + except Exception as e: + if self.debug: + print(f"[DEBUG] Auto-registration error for {url}: {e}") + + def _create_artifact_service(self): + """Create artifact service based on configuration""" + artifact_storage = os.getenv('ARTIFACT_STORAGE', 'inmemory') + + if artifact_storage.lower() == 'gcs': + # Use Google Cloud Storage for artifacts + bucket_name = os.getenv('GCS_ARTIFACT_BUCKET', 'fuzzforge-artifacts') + if self.debug: + print(f"[DEBUG] Using GCS artifact storage: {bucket_name}") + try: + return GcsArtifactService(bucket_name=bucket_name) + except Exception as e: + if self.debug: + print(f"[DEBUG] GCS artifact service failed: {e}, falling back to in-memory") + return InMemoryArtifactService() + else: + # Default to in-memory artifacts + if self.debug: + print("[DEBUG] Using in-memory artifact service") + return InMemoryArtifactService() + + def _prepare_artifact_cache_dir(self) -> Path: + """Ensure a shared directory exists for delegated artifacts.""" + try: + self._artifact_cache_dir.mkdir(parents=True, exist_ok=True) + return self._artifact_cache_dir + except Exception: + fallback = Path(tempfile.gettempdir()) / "fuzzforge_artifacts" + fallback.mkdir(parents=True, exist_ok=True) 
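+            # Cache the fallback path so subsequent artifact writes reuse it
+            # instead of retrying the preferred directory on every call.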
+ self._artifact_cache_dir = fallback + if self.debug: + print(f"[DEBUG] Falling back to artifact cache dir {fallback}") + return fallback + + def _register_artifact_bytes( + self, + *, + name: str, + data: bytes, + mime_type: str, + sha256_digest: str, + size: int, + artifact_id: str = None, # Optional: use provided ID instead of generating new one + ) -> Dict[str, Any]: + """Persist artifact bytes to cache directory and return metadata.""" + base_dir = self._prepare_artifact_cache_dir() + if artifact_id is None: + artifact_id = uuid.uuid4().hex + artifact_dir = base_dir / artifact_id + artifact_dir.mkdir(parents=True, exist_ok=True) + file_path = artifact_dir / name + file_path.write_bytes(data) + + # Create HTTP URL for A2A artifact serving instead of file:// URI + port = int(os.getenv('FUZZFORGE_PORT', 10100)) + http_uri = f"http://127.0.0.1:{port}/artifacts/{artifact_id}" + + return { + "id": artifact_id, + "file_uri": http_uri, + "path": str(file_path), + "name": name, + "mime_type": mime_type, + "sha256": sha256_digest, + "size": size, + } + + def _create_session_service(self): + """Create session service based on persistence setting""" + if self.session_persistence.lower() == 'sqlite': + # Use SQLite for persistent sessions + db_path = os.getenv('SESSION_DB_PATH', './fuzzforge_sessions.db') + # Convert to absolute path for SQLite URL + abs_db_path = os.path.abspath(db_path) + db_url = f"sqlite:///{abs_db_path}" + if self.debug: + print(f"[DEBUG] Using SQLite session persistence: {db_url}") + return DatabaseSessionService(db_url=db_url) + else: + # Default to in-memory sessions + if self.debug: + print("[DEBUG] Using in-memory session service (non-persistent)") + return InMemorySessionService() + + async def _get_cognee_service(self): + """Get or initialize shared Cognee service""" + if self.cognee_service is None or not self._cognee_initialized: + try: + from .cognee_service import CogneeService + + config = ProjectConfigManager() + if not config.is_initialized(): + raise ValueError("FuzzForge project not initialized. 
Run 'fuzzforge init' first.") + + self.cognee_service = CogneeService(config) + await self.cognee_service.initialize() + self._cognee_initialized = True + + if self.debug: + print("[DEBUG] Shared Cognee service initialized") + + except Exception as e: + if self.debug: + print(f"[DEBUG] Failed to initialize Cognee service: {e}") + raise + + return self.cognee_service + + async def _get_knowledge_integration(self): + """Get reusable Cognee project integration for structured queries.""" + if self._knowledge_integration is not None: + return self._knowledge_integration + + try: + from .cognee_integration import CogneeProjectIntegration + + integration = CogneeProjectIntegration() + initialised = await integration.initialize() + if not initialised: + if self.debug: + print("[DEBUG] CogneeProjectIntegration initialization failed") + return None + + self._knowledge_integration = integration + return integration + except Exception as exc: + if self.debug: + print(f"[DEBUG] Knowledge integration unavailable: {exc}") + return None + + def _initialize_agent(self): + """Initialize the LLM agent with tools""" + # Build tools list + tools = [] + + # Add custom function tools for Cognee operations (making it callable as a tool) + + # Define Cognee tool functions + async def cognee_add(text: str) -> str: + """Add information to Cognee knowledge graph memory""" + try: + if self.cognee_service: + result = await self.cognee_service.add_to_memory(text) + return f"Added to Cognee: {result}" + return "Cognee service not available" + except Exception as e: + return f"Error adding to Cognee: {e}" + + async def cognee_search(query: str) -> str: + """Search Cognee knowledge graph memory""" + try: + if self.cognee_service: + results = await self.cognee_service.search_memory(query) + return f"Cognee search results: {results}" + return "Cognee service not available" + except Exception as e: + return f"Error searching Cognee: {e}" + + # Add Cognee project integration tools + async def search_project_knowledge(query: str, dataset: str, search_type: str) -> str: + """Search the project's knowledge graph (codebase, documentation, specs, etc.) + + Args: + query: Search query + dataset: Specific dataset to search (optional, searches all if empty) + search_type: Type of search - any SearchType: INSIGHTS, CHUNKS, GRAPH_COMPLETION, CODE, SUMMARIES, RAG_COMPLETION, NATURAL_LANGUAGE, etc. 
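+
+            Example (illustrative call; an empty dataset string searches all
+            datasets, and dataset names depend on what has been ingested):
+                search_project_knowledge("authentication flow", "", "CHUNKS")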
+ """ + try: + from cognee.modules.search.types import SearchType + + # Use shared cognee service + cognee_service = await self._get_cognee_service() + config = cognee_service.config + + # Get SearchType enum value dynamically + try: + search_type_enum = getattr(SearchType, search_type.upper()) + except AttributeError: + # Fallback to INSIGHTS if invalid search type + search_type_enum = SearchType.INSIGHTS + search_type = "INSIGHTS" + + # Handle empty/default values + if not dataset: + dataset = None + if not search_type: + search_type = "INSIGHTS" + search_type_enum = SearchType.INSIGHTS + + # Use direct cognee import like ingest command + import cognee + + # Set up user context + try: + from cognee.modules.users.methods import get_user + user_email = f"project_{config.get_project_context()['project_id']}@fuzzforge.example" + user = await get_user(user_email) + cognee.set_user(user) + except Exception as e: + pass # User context not critical + + # Use cognee search directly for maximum flexibility + search_kwargs = { + "query_type": search_type_enum, + "query_text": query + } + + if dataset: + search_kwargs["datasets"] = [dataset] + + results = await cognee.search(**search_kwargs) + + if not results: + return f"No results found for '{query}'" + (f" in dataset '{dataset}'" if dataset else "") + + project_context = config.get_project_context() + output = f"Search results for '{query}' in project {project_context['project_name']} (search_type: {search_type}):\n\n" + + for i, result in enumerate(results[:5], 1): # Top 5 results + if isinstance(result, str): + preview = result[:200] + "..." if len(result) > 200 else result + output += f"{i}. {preview}\n\n" + else: + output += f"{i}. {str(result)[:200]}...\n\n" + + return output + + except Exception as e: + return f"Error searching project knowledge: {e}" + + async def list_project_knowledge() -> str: + """List available knowledge and datasets in the project's knowledge graph""" + try: + import logging + logger = logging.getLogger(__name__) + + # Use shared cognee service + cognee_service = await self._get_cognee_service() + config = cognee_service.config + + project_context = config.get_project_context() + result = f"Available knowledge in project {project_context['project_name']}:\n\n" + + # Use direct cognee import like ingest command does + try: + import cognee + from cognee.modules.search.types import SearchType + + # Set up user context like ingest command + try: + from cognee.modules.users.methods import create_user, get_user + + user_email = f"project_{project_context['project_id']}@fuzzforge.example" + user_tenant = project_context['tenant_id'] + + try: + user = await get_user(user_email) + logger.info(f"Using existing user: {user_email}") + except: + try: + user = await create_user(user_email, user_tenant) + logger.info(f"Created new user: {user_email}") + except: + user = None + + if user: + cognee.set_user(user) + except Exception as e: + logger.warning(f"User context setup failed: {e}") + + # List available datasets + datasets = await cognee.datasets.list_datasets() + logger.info(f"Found datasets: {datasets}") + + if datasets and len(datasets) > 0: + dataset_name = f"{project_context['project_name']}_codebase" + + # Try to search for some basic info to show data exists + try: + sample_results = await cognee.search( + query_type=SearchType.INSIGHTS, + query_text="project overview files functions", + datasets=[dataset_name] + ) + + if sample_results: + data = [f"Dataset '{dataset_name}' contains {len(sample_results)} insights"] + 
sample_results[:3] + else: + data = [f"Dataset '{dataset_name}' exists but no insights found"] + except Exception as search_e: + logger.info(f"Search failed: {search_e}") + data = [f"Dataset '{dataset_name}' exists in: {[str(ds) for ds in datasets]}"] + else: + data = None + + except Exception as e: + data = None + logger.warning(f"Error accessing cognee: {e}") + + if not data: + result += "No data available in knowledge graph\n" + result += "Use 'fuzzforge ingest' to ingest code, documentation, or other project files\n" + else: + # Extract datasets from data + datasets = set() + if isinstance(data, list): + for item in data: + if isinstance(item, dict) and 'dataset_name' in item: + datasets.add(item['dataset_name']) + + if datasets: + result += f"Available Datasets ({len(datasets)}):\n" + for i, dataset in enumerate(sorted(datasets), 1): + result += f" {i}. {dataset}\n" + result += "\n" + + result += f"Total data items: {len(data)}\n" + + # Show sample of available data + result += "\nSample content:\n" + for i, item in enumerate(data[:3], 1): + if isinstance(item, dict): + item_str = str(item)[:100] + "..." if len(str(item)) > 100 else str(item) + result += f" {i}. {item_str}\n" + else: + item_str = str(item)[:100] + "..." if len(str(item)) > 100 else str(item) + result += f" {i}. {item_str}\n" + + return result + + except Exception as e: + return f"Error listing knowledge: {e}" + + async def ingest_to_dataset(content: str, dataset: str) -> str: + """Ingest text content (code, documentation, notes) into a specific project dataset + + Args: + content: Text content to ingest (code, docs, specs, research, etc.) + dataset: Dataset name to ingest into + """ + try: + # Use shared cognee service + cognee_service = await self._get_cognee_service() + config = cognee_service.config + + # Ingest the content + success = await cognee_service.ingest_text(content, dataset) + + if success: + project_context = config.get_project_context() + return f"Successfully ingested {len(content)} characters into dataset '{dataset}' for project {project_context['project_name']}" + else: + return f"Failed to ingest content into dataset '{dataset}'" + + except Exception as e: + return f"Error ingesting to dataset: {e}" + + async def cognify_information(text: str) -> str: + """Transform information into knowledge graph format""" + try: + from .cognee_integration import CogneeProjectIntegration + integration = CogneeProjectIntegration() + result = await integration.cognify_text(text) + + if "error" in result: + return f"Error cognifying information: {result['error']}" + + project = result.get('project', 'Unknown') + return f"Successfully transformed information into knowledge graph for project {project}" + except Exception as e: + return f"Error cognifying information: {e}" + + tools.extend([ + FunctionTool(search_project_knowledge), + FunctionTool(list_project_knowledge), + FunctionTool(ingest_to_dataset), + FunctionTool(cognify_information), + FunctionTool(self.query_project_knowledge_api) + ]) + + # Add project-local filesystem tools + async def list_project_files(path: str, pattern: str) -> str: + """List files in the current project directory with optional pattern + + Args: + path: Relative path within project (e.g. '.' for root, 'src', 'tests') + pattern: Glob pattern (e.g. '*.py', '**/*.js', '') + """ + try: + from pathlib import Path + + # Get project root from config + config = ProjectConfigManager() + if not config.is_initialized(): + return "Project not initialized. Run 'fuzzforge init' first." 
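+
+            # Path containment: build the candidate path under the project
+            # root, canonicalise it with resolve() (which follows symlinks and
+            # '..' segments), then require relative_to(project_root) to
+            # succeed so listings can never escape the project directory.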
+ + project_root = config.config_path.parent # Parent of .fuzzforge + requested_path = project_root / path + + # Security check - ensure we stay within project + try: + requested_path = requested_path.resolve() + project_root = project_root.resolve() + requested_path.relative_to(project_root) + except ValueError: + return f"Access denied: Path '{path}' is outside project directory" + + if not requested_path.exists(): + return f"Path does not exist: {path}" + + if not requested_path.is_dir(): + return f"Not a directory: {path}" + + # List contents + if not pattern: + # Simple directory listing + items = [] + for item in sorted(requested_path.iterdir()): + relative = item.relative_to(project_root) + if item.is_dir(): + items.append(f"๐Ÿ“ {relative}/") + else: + size = item.stat().st_size + size_str = f"({size} bytes)" if size < 1024 else f"({size//1024}KB)" + items.append(f"๐Ÿ“„ {relative} {size_str}") + + return f"Project files in '{path}':\n" + "\n".join(items) if items else "Empty directory" + else: + # Pattern matching + matches = list(requested_path.glob(pattern)) + if matches: + files = [] + for f in sorted(matches): + if f.is_file(): + relative = f.relative_to(project_root) + size = f.stat().st_size + size_str = f" ({size//1024}KB)" if size >= 1024 else f" ({size}B)" + files.append(f"๐Ÿ“„ {relative}{size_str}") + + return f"Found {len(files)} files matching '{pattern}' in project:\n" + "\n".join(files[:100]) + else: + return f"No files found matching '{pattern}' in project path '{path}'" + + except Exception as e: + return f"Error listing project files: {e}" + + async def read_project_file(file_path: str, max_lines: int) -> str: + """Read a file from the current project + + Args: + file_path: Relative path to file within project + max_lines: Maximum lines to read (0 for all, default 200 for large files) + """ + try: + from pathlib import Path + + # Get project root from config + config = ProjectConfigManager() + if not config.is_initialized(): + return "Project not initialized. Run 'fuzzforge init' first." + + project_root = config.config_path.parent + requested_file = project_root / file_path + + # Security check - ensure we stay within project + try: + requested_file = requested_file.resolve() + project_root = project_root.resolve() + requested_file.relative_to(project_root) + except ValueError: + return f"Access denied: File '{file_path}' is outside project directory" + + if not requested_file.exists(): + return f"File does not exist: {file_path}" + + if not requested_file.is_file(): + return f"Not a file: {file_path}" + + # Check file size + size_mb = requested_file.stat().st_size / (1024 * 1024) + if size_mb > 5: + return f"File too large ({size_mb:.1f} MB). Use max_lines parameter to read portions." + + # Set reasonable default for max_lines + if max_lines == 0: + max_lines = 200 if size_mb > 0.1 else 0 # Default limit for larger files + + with open(requested_file, 'r', encoding='utf-8', errors='replace') as f: + if max_lines == 0: + content = f.read() + else: + lines = [] + for i, line in enumerate(f, 1): + if i > max_lines: + lines.append(f"... 
(truncated at {max_lines} lines)") + break + lines.append(f"{i:4d}: {line.rstrip()}") + content = "\n".join(lines) + + relative_path = requested_file.relative_to(project_root) + return f"Contents of {relative_path}:\n{content}" + + except UnicodeDecodeError: + return f"Cannot read file (binary or encoding issue): {file_path}" + except Exception as e: + return f"Error reading file: {e}" + + async def search_project_files(search_pattern: str, file_pattern: str, path: str) -> str: + """Search for text patterns in project files + + Args: + search_pattern: Text/regex pattern to find + file_pattern: File pattern to search in (e.g. '*.py', '**/*.js') + path: Relative project path to search in (e.g. '.', 'src') + """ + try: + import re + from pathlib import Path + + # Get project root from config + config = ProjectConfigManager() + if not config.is_initialized(): + return "Project not initialized. Run 'fuzzforge init' first." + + project_root = config.config_path.parent + search_path = project_root / path + + # Security check + try: + search_path = search_path.resolve() + project_root = project_root.resolve() + search_path.relative_to(project_root) + except ValueError: + return f"Access denied: Path '{path}' is outside project directory" + + if not search_path.exists(): + return f"Search path does not exist: {path}" + + matches = [] + files_searched = 0 + + # Search in files + for file_path in search_path.glob(file_pattern): + if file_path.is_file(): + files_searched += 1 + try: + with open(file_path, 'r', encoding='utf-8', errors='replace') as f: + for line_num, line in enumerate(f, 1): + if re.search(search_pattern, line, re.IGNORECASE): + relative = file_path.relative_to(project_root) + matches.append(f"{relative}:{line_num}: {line.strip()}") + if len(matches) >= 50: # Limit results + break + except (PermissionError, OSError): + continue + + if len(matches) >= 50: + break + + if matches: + result = f"Found '{search_pattern}' in {len(matches)} locations (searched {files_searched} files):\n" + result += "\n".join(matches[:50]) + if len(matches) >= 50: + result += f"\n... 
(showing first 50 matches)" + return result + else: + return f"No matches found for '{search_pattern}' in {files_searched} files matching '{file_pattern}'" + + except Exception as e: + return f"Error searching project files: {e}" + + tools.extend([ + FunctionTool(list_project_files), + FunctionTool(read_project_file), + FunctionTool(search_project_files), + FunctionTool(self.create_project_file_artifact_api) + ]) + + async def send_file_to_agent(agent_name: str, file_path: str, note: str, tool_context: ToolContext) -> str: + """Send a local file to a registered agent (agent_name, file_path, note).""" + # Handle empty note parameter + if not note: + note = "" + + session = None + context_id = None + if tool_context and getattr(tool_context, "invocation_context", None): + invocation = tool_context.invocation_context + session = invocation.session + context_id = self.session_lookup.get(getattr(session, 'id', None)) + return await self.delegate_file_to_agent(agent_name, file_path, note, session=session, context_id=context_id) + + tools.append(FunctionTool(send_file_to_agent)) + + if self.debug: + print("[DEBUG] Added Cognee project integration tools") + + # Add FuzzForge backend workflow tools if MCP endpoint configured + if self.fuzzforge_mcp_url: + if self.debug: + print(f"[DEBUG] FuzzForge MCP endpoint configured at {self.fuzzforge_mcp_url}") + + async def _call_fuzzforge_mcp(tool_name: str, payload: Dict[str, Any] | None = None) -> Any: + return await self._call_mcp_generic(tool_name, payload or {}) + + async def list_fuzzforge_workflows(tool_context: ToolContext | None = None) -> Any: + return await _call_fuzzforge_mcp("list_workflows_mcp") + + async def get_fuzzforge_workflow_metadata(workflow_name: str, tool_context: ToolContext | None = None) -> Any: + return await _call_fuzzforge_mcp("get_workflow_metadata_mcp", {"workflow_name": workflow_name}) + + async def get_fuzzforge_workflow_parameters(workflow_name: str, tool_context: ToolContext | None = None) -> Any: + return await _call_fuzzforge_mcp("get_workflow_parameters_mcp", {"workflow_name": workflow_name}) + + async def get_fuzzforge_workflow_schema(tool_context: ToolContext | None = None) -> Any: + return await _call_fuzzforge_mcp("get_workflow_metadata_schema_mcp") + + async def list_fuzzforge_runs( + limit: int = 10, + workflow_name: str = "", + states: str = "", + tool_context: ToolContext | None = None, + ) -> Any: + payload: Dict[str, Any] = {"limit": limit} + workflow_name = (workflow_name or "").strip() + if workflow_name: + payload["workflow_name"] = workflow_name + + state_tokens = [ + token.strip() + for token in (states or "").split(",") + if token.strip() + ] + if state_tokens: + payload["states"] = state_tokens + return await _call_fuzzforge_mcp("list_recent_runs_mcp", payload) + + async def submit_security_scan_mcp( + workflow_name: str, + target_path: str = "", + volume_mode: str = "", + parameters: Dict[str, Any] | None = None, + tool_context: ToolContext | None = None, + ) -> Any: + # Normalise volume mode to supported values + normalised_mode = (volume_mode or "ro").strip().lower().replace("-", "_") + if normalised_mode in {"read_only", "readonly", "ro"}: + normalised_mode = "ro" + elif normalised_mode in {"read_write", "readwrite", "rw"}: + normalised_mode = "rw" + else: + # Fall back to Prefect defaults if we can't recognise the input + normalised_mode = "ro" + + # Resolve the target path to an absolute path for Prefect's validation + resolved_path = target_path or "." 
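+            # An empty target falls back to the current directory; the
+            # expanduser()/resolve() pass below turns it into an absolute
+            # path for the backend's mount validation.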
+ try: + resolved_path = str(Path(resolved_path).expanduser().resolve()) + except Exception: + # If resolution fails, Prefect will surface the validation error โ€“ use the raw value + resolved_path = target_path + + # Ensure configuration objects default to dictionaries instead of None + cleaned_parameters: Dict[str, Any] = {} + if parameters: + for key, value in parameters.items(): + if isinstance(key, str) and key.endswith("_config") and value is None: + cleaned_parameters[key] = {} + else: + cleaned_parameters[key] = value + + # Merge in default parameter schema for known workflows to avoid missing dicts + try: + param_info = await get_fuzzforge_workflow_parameters(workflow_name) + if isinstance(param_info, dict): + defaults = param_info.get("defaults") or {} + if isinstance(defaults, dict): + for key, value in defaults.items(): + if key.endswith("_config") and key not in cleaned_parameters: + cleaned_parameters[key] = value or {} + except Exception: + # Defaults fetch is best-effort โ€“ continue with whatever we have + pass + + # Final pass โ€“ replace any lingering None configs with empty dicts + for key, value in list(cleaned_parameters.items()): + if isinstance(key, str) and key.endswith("_config") and value is None: + cleaned_parameters[key] = {} + + payload = { + "workflow_name": workflow_name, + "target_path": resolved_path, + "volume_mode": normalised_mode, + "parameters": cleaned_parameters, + } + result = await _call_fuzzforge_mcp("submit_security_scan_mcp", payload) + + if isinstance(result, dict): + run_id = result.get("run_id") or result.get("id") + if run_id and tool_context: + context_id = tool_context.invocation_context.session.id + session_meta = self.session_metadata.get(context_id, {}) + self.pending_runs[run_id] = { + "context_id": context_id, + "session_id": session_meta.get("session_id"), + "user_id": session_meta.get("user_id"), + "app_name": session_meta.get("app_name", "fuzzforge"), + "workflow_name": workflow_name, + "submitted_at": datetime.now().isoformat(), + } + tool_context.actions.state_delta[ + f"fuzzforge.run.{run_id}.status" + ] = "submitted" + await self._publish_task_pending(run_id, context_id, workflow_name) + self._schedule_run_followup(run_id) + + return result + + async def get_fuzzforge_run_status(run_id: str, tool_context: ToolContext | None = None) -> Any: + return await _call_fuzzforge_mcp("get_run_status_mcp", {"run_id": run_id}) + + async def get_fuzzforge_summary(run_id: str, tool_context: ToolContext | None = None) -> Any: + return await _call_fuzzforge_mcp("get_comprehensive_scan_summary", {"run_id": run_id}) + + async def get_fuzzforge_findings(run_id: str, tool_context: ToolContext | None = None) -> Any: + return await _call_fuzzforge_mcp("get_run_findings_mcp", {"run_id": run_id}) + + async def get_fuzzforge_fuzzing_stats(run_id: str, tool_context: ToolContext | None = None) -> Any: + return await _call_fuzzforge_mcp("get_fuzzing_stats_mcp", {"run_id": run_id}) + + tools.extend([ + FunctionTool(list_fuzzforge_workflows), + FunctionTool(get_fuzzforge_workflow_metadata), + FunctionTool(get_fuzzforge_workflow_parameters), + FunctionTool(get_fuzzforge_workflow_schema), + FunctionTool(list_fuzzforge_runs), + LongRunningFunctionTool(submit_security_scan_mcp), + FunctionTool(get_fuzzforge_run_status), + FunctionTool(get_fuzzforge_summary), + FunctionTool(get_fuzzforge_findings), + FunctionTool(get_fuzzforge_fuzzing_stats), + ]) + + # Add agent introspection tools + async def get_agent_capabilities(agent_name: str) -> str: + """Get detailed 
capabilities and tools of a registered agent""" + # Handle empty agent_name + if not agent_name or agent_name.strip() == "": + # List all agents with their capabilities + if not self.agents: + return "No agents are currently registered" + + result = "Registered agents and their capabilities:\n\n" + for name, info in self.agents.items(): + card = info.get("card", {}) + result += f"{name}\n" + result += f" Description: {card.get('description', 'No description')}\n" + + # Get skills/tools from agent card + skills = card.get('skills', []) + if skills: + result += f" Tools ({len(skills)}):\n" + for skill in skills: + skill_name = skill.get('name', 'Unknown') + skill_desc = skill.get('description', 'No description') + result += f" - {skill_name}: {skill_desc}\n" + else: + result += " Tools: Not specified in agent card\n" + result += "\n" + return result + else: + # Get specific agent details + if agent_name not in self.agents: + return f"Agent '{agent_name}' not found. Available agents: {', '.join(self.agents.keys())}" + + info = self.agents[agent_name] + card = info.get("card", {}) + + result = f"{agent_name} - Detailed Capabilities\n\n" + result += f"URL: {info.get('url')}\n" + result += f"Description: {card.get('description', 'No description')}\n\n" + + # Detailed skills/tools + skills = card.get('skills', []) + if skills: + result += f"Available Tools ({len(skills)}):\n" + for i, skill in enumerate(skills, 1): + skill_name = skill.get('name', 'Unknown') + skill_desc = skill.get('description', 'No description') + result += f"{i}. {skill_name}\n {skill_desc}\n\n" + else: + result += "Tools: Not specified in agent card\n\n" + + # Additional capabilities + capabilities = card.get('capabilities', {}) + if capabilities: + result += "Capabilities:\n" + for key, value in capabilities.items(): + result += f" - {key}: {value}\n" + result += "\n" + + # Input/Output modes + input_modes = card.get('defaultInputModes', card.get('default_input_modes', [])) + output_modes = card.get('defaultOutputModes', card.get('default_output_modes', [])) + + if input_modes: + result += f"Supported Input Modes: {', '.join(input_modes)}\n" + if output_modes: + result += f"Supported Output Modes: {', '.join(output_modes)}\n" + + return result + + # Add task tracking tools + async def create_task_list(tasks: List[str]) -> str: + """Create a task list for tracking project progress""" + if not hasattr(self, 'task_lists'): + self.task_lists = {} + + task_id = f"task_list_{len(self.task_lists)}" + self.task_lists[task_id] = { + 'tasks': [{'id': i, 'description': task, 'status': 'pending'} for i, task in enumerate(tasks)], + 'created_at': datetime.now().isoformat() + } + return f"Created task list {task_id} with {len(tasks)} tasks" + + async def update_task_status(task_list_id: str, task_id: int, status: str) -> str: + """Update the status of a task (pending, in_progress, completed)""" + if not hasattr(self, 'task_lists') or task_list_id not in self.task_lists: + return f"Task list {task_list_id} not found" + + tasks = self.task_lists[task_list_id]['tasks'] + for task in tasks: + if task['id'] == task_id: + task['status'] = status + return f"Updated task {task_id} to {status}" + return f"Task {task_id} not found" + + async def get_task_list(task_list_id: str) -> str: + """Get current task list status""" + # Handle empty task_list_id + if not task_list_id or task_list_id.strip() == "": + task_list_id = "default" + + if not hasattr(self, 'task_lists'): + return "No task lists created" + + if task_list_id: + if task_list_id in 
self.task_lists: + tasks = self.task_lists[task_list_id]['tasks'] + result = f"Task List {task_list_id}:\n" + for task in tasks: + result += f" [{task['status']}] {task['id']}: {task['description']}\n" + return result + return f"Task list {task_list_id} not found" + else: + # Return all task lists + result = "All task lists:\n" + for list_id, list_data in self.task_lists.items(): + completed = sum(1 for t in list_data['tasks'] if t['status'] == 'completed') + total = len(list_data['tasks']) + result += f" {list_id}: {completed}/{total} completed\n" + return result + + tools.extend([ + FunctionTool(get_agent_capabilities), + FunctionTool(create_task_list), + FunctionTool(update_task_status), + FunctionTool(get_task_list) + ]) + + + # Create the agent + self.agent = LlmAgent( + model=LiteLlm(model=self.model), + name="fuzzforge_executor", + description="Intelligent A2A orchestrator with memory", + instruction=self._build_instruction(), + tools=tools # Always pass tools list (empty list is fine) + ) + + # Create runner with our session service + self.runner = Runner( + agent=self.agent, + session_service=self.session_service, # Use our configured session service + app_name="fuzzforge" + ) + + # Connect runner to our artifact service + if hasattr(self.runner, 'artifact_service'): + # Override with our configured artifact service + self.runner.artifact_service = self.artifact_service + + def _build_instruction(self) -> str: + """Build the agent's instruction prompt""" + instruction = f"""You are FuzzForge, an intelligent A2A orchestrator with dual memory systems. + +## Your Core Responsibilities: + +1. **Agent Orchestration (Primary)** + - Always use get_agent_capabilities() tool to check available agents + - When users ask about agent tools/capabilities, use get_agent_capabilities(agent_name) + - When a user mentions any registered agent by name, delegate to that agent + - When a request matches an agent's capabilities, route to it + - To route to an agent, format your response as: "ROUTE_TO: [agent_name] [message]" + - The system follows A2A protocol standards for agent communication + - Be agent-agnostic - work with whatever agents are registered + - Prefer using your built-in FuzzForge workflow tools directly unless the user explicitly requests delegation + +2. **FuzzForge Platform Tools (Secondary)** + - Use your FuzzForge MCP tools by default for workflow submission, monitoring, and findings retrieval + - Use the appropriate tool for the user's request + - You can submit and monitor FuzzForge workflows via MCP tools (list_workflows_mcp, submit_security_scan_mcp, list_recent_runs_mcp, get_run_status_mcp, get_comprehensive_scan_summary) + - Treat any absolute path the user provides as mountable; the backend handles volume access. Do NOT ask the user to upload, move, or zip projectsโ€”just call submit_security_scan_mcp with the supplied path and options. + - When asked to send local files or binaries to another agent, call send_file_to_agent(agent_name, file_path, note="...") + +3. 
**Dual Memory Systems**: + + a) **Conversational Memory** (ADK MemoryService - for past conversations) + - Automatically ingests completed sessions + - Search with "recall from past conversations about X" + - Uses semantic search (VertexAI) or keyword matching (InMemory) + + b) **Project Knowledge Graph** (Cognee - for ingested code, documentation, specs, and structured data) + - Use search_project_knowledge(query, dataset="", search_type="INSIGHTS") to search project knowledge + - Available search_type options: INSIGHTS, CHUNKS, GRAPH_COMPLETION, CODE, SUMMARIES, RAG_COMPLETION, NATURAL_LANGUAGE, CYPHER, TEMPORAL, FEELING_LUCKY + - Use list_project_knowledge() to see available datasets and knowledge + - Use ingest_to_dataset(content, dataset) to add content to specific datasets + - Use cognify_information(text) to add new information to knowledge graph + - Automatically uses current project context and directory + - Example: "what functions are in the codebase?" -> use search_project_knowledge("functions classes methods", search_type="CHUNKS") + - Example: "what documentation exists?" -> use search_project_knowledge("documentation specs readme", search_type="INSIGHTS") + - Example: "search security docs" -> use search_project_knowledge("security vulnerabilities", dataset="security_docs") + + c) **Project Filesystem Access** (Project-local file operations) + - Use list_project_files(path, pattern) to explore project structure + - Use read_project_file(file_path, max_lines) to examine file contents + - Use search_project_files(search_pattern, file_pattern, path) to find text in files + - All file operations are restricted to the current project directory for security + - Example: "show me all Python files" -> use list_project_files(".", "*.py") + - Example: "read the main agent file" -> use read_project_file("agent.py", 0) + - Example: "find TODO comments" -> use search_project_files("TODO", "**/*.py", ".") + +4. **Artifact Creation** + - When generating code, configurations, or documents, create an artifact + - Format: "ARTIFACT: [type] [title]\n```\n[content]\n```" + - Types: code, config, document, data, diagram + +5. **Multi-Step Task Execution with Graph Building** + - Chain multiple actions together + - When user says "ask agent X and then save to memory": + a) Route to agent X + b) Use `cognify` to structure the response as a knowledge graph + c) This automatically creates searchable nodes and relationships + - Build a growing knowledge graph from all interactions + - Connect new information to existing graph nodes + +6. **General Assistance** + - Only answer directly if no suitable agent is registered AND no FuzzForge tool can help + - Provide helpful responses + - Maintain conversation context + +## Tool Usage Protocol: +- ALWAYS use get_agent_capabilities() tool when asked about agents or their tools +- Use get_agent_capabilities(agent_name) for specific agent details +- Use get_agent_capabilities() without parameters to list all agents +- If an agent's skills/description match the request, use "ROUTE_TO: [name] [message]" +- After receiving agent response: + - If user wants to save/store: Use `cognify` to create knowledge graph + - Structure the data as: entities (nodes) and relationships (edges) + - Example cognify text: "Entity: 1001 (Number). Property: is_prime=false. Relationship: 1001 CHECKED_BY CalculatorAgent. 
Relationship: 1001 HAS_FACTORS [7, 11, 13]" +- When searching memory, use GRAPH_COMPLETION mode to traverse relationships + +## Important Rules: +- NEVER mention specific types of agents or tasks in greetings +- Do NOT say things like "I can run calculations" or mention specific capabilities +- Keep greetings generic: just say you're an orchestrator that can help +- When user asks for chained actions, acknowledge and execute all steps + +Be concise and intelligent in your responses.""" + + + return instruction + + async def execute(self, message: str, context_id: str = None) -> Dict[str, Any]: + """Execute a task/message and return the result""" + + # Use default context if none provided + if not context_id: + context_id = "default" + + # Get or create session + if context_id not in self.sessions: + session_obj = await self._create_session() + self.sessions[context_id] = session_obj + self.session_metadata[context_id] = { + "session_id": getattr(session_obj, 'id', context_id), + "user_id": getattr(session_obj, 'user_id', 'user'), + "app_name": getattr(session_obj, 'app_name', 'fuzzforge'), + } + if self.debug: + print(f"[DEBUG] Created new session for context: {context_id}") + + session = self.sessions[context_id] + session_id = getattr(session, 'id', context_id) + self.session_lookup[session_id] = context_id + if context_id not in self.session_metadata: + self.session_metadata[context_id] = { + "session_id": getattr(session, 'id', context_id), + "user_id": getattr(session, 'user_id', 'user'), + "app_name": getattr(session, 'app_name', 'fuzzforge'), + } + + # Search conversational memory if relevant + if self.memory_service and any(word in message.lower() for word in ['recall', 'remember', 'past conversation', 'previously']): + try: + memory_results = await self.memory_service.search_memory( + query=message, + app_name="fuzzforge", + user_id=getattr(session, 'user_id', 'user') + ) + if memory_results and memory_results.memories: + # Add memory context to session state + # MemoryEntry has 'text' field + session.state["memory_context"] = [ + {"text": getattr(m, 'text', str(m))} + for m in memory_results.memories + ] + if self.debug: + print(f"[DEBUG] Found {len(memory_results.memories)} memories") + except Exception as e: + if self.debug: + print(f"[DEBUG] Memory search failed: {e}") + + # Update session with registered agents following A2A AgentCard standard + registered_agents = [] + for name, info in self.agents.items(): + card = info.get("card", {}) + skills = card.get("skills", []) + + # Format according to A2A AgentSkill standard + agent_info = { + "name": name, + "url": info["url"], + "description": card.get("description", ""), + "skills": [ + { + "id": skill.get("id", ""), + "name": skill.get("name", ""), + "description": skill.get("description", ""), + "tags": skill.get("tags", []) + } + for skill in skills + ], + "skill_count": len(skills), + "default_input_modes": card.get("defaultInputModes", card.get("default_input_modes", [])), + "default_output_modes": card.get("defaultOutputModes", card.get("default_output_modes", [])) + } + registered_agents.append(agent_info) + + session.state["registered_agents"] = registered_agents + session.state["agent_names"] = list(self.agents.keys()) + + # Track if this is a multi-step request + multi_step_keywords = ["and then", "then save", "and save", "store the", "save the result", "save to memory", "remember"] + is_multi_step = any(keyword in message.lower() for keyword in multi_step_keywords) + + if is_multi_step: + 
session.state["multi_step_request"] = message + session.state["pending_actions"] = [] + + # Process with LLM + content = types.Content( + role='user', + parts=[types.Part.from_text(text=message)] + ) + + response = "" + try: + # Try to use existing session ID or create a new one + session_id = getattr(session, 'id', context_id) + user_id = getattr(session, 'user_id', 'user') + + if self.debug: + print(f"[DEBUG] Running with session_id: {session_id}, user_id: {user_id}") + + async for event in self.runner.run_async( + user_id=user_id, + session_id=session_id, + new_message=content + ): + # Check if event has content before accessing parts + if event and event.content: + # Normal content handling + if event.content: + if hasattr(event.content, 'parts') and event.content.parts: + # Get text from the first part that has text + for part in event.content.parts: + if hasattr(part, 'text') and part.text: + response = part.text + break + if not response and len(event.content.parts) > 0: + # Fallback to string representation + response = str(event.content.parts[0]) + elif hasattr(event.content, 'text'): + # Direct text content + response = event.content.text + else: + # Log for debugging + if self.debug: + print(f"[DEBUG] Event content type: {type(event.content)}, has parts: {hasattr(event.content, 'parts')}") + + # Check if LLM wants to route to an agent + if "ROUTE_TO:" in response: + # Extract routing command from response + route_line = None + for line in response.split('\n'): + if line.strip().startswith("ROUTE_TO:"): + route_line = line.strip() + break + + if route_line: + # Parse routing command more robustly + route_content = route_line[9:].strip() # Remove "ROUTE_TO:" + + # Try to match against registered agents + agent_name = None + agent_message = route_content + + # Check each registered agent name + for registered_name in self.agents.keys(): + if route_content.lower().startswith(registered_name.lower()): + agent_name = registered_name + # Extract message after agent name + agent_message = route_content[len(registered_name):].strip() + break + + if not agent_name: + # Fallback: try first word as agent name + parts = route_content.split(None, 1) + if parts: + agent_name = parts[0] + agent_message = parts[1] if len(parts) > 1 else message + + # Route to the agent + if agent_name in self.agents: + try: + connection = self.agents[agent_name]["connection"] + routed_response = await connection.send_message(agent_message) + agent_result = f"[{agent_name}]: {routed_response}" + + # If this was a multi-step request, process next steps + if is_multi_step: + # Store the agent response for next action + session.state["last_agent_response"] = routed_response + + # Ask LLM to continue with next steps + followup_content = types.Content( + role='user', + parts=[types.Part.from_text( + text=f"The agent responded: {routed_response}\n\nNow complete the remaining actions from the original request: {message}" + )] + ) + + # Process followup + async for followup_event in self.runner.run_async( + user_id=user_id, + session_id=session_id, + new_message=followup_content + ): + if followup_event.content.parts and followup_event.content.parts[0].text: + followup_response = followup_event.content.parts[0].text + response = f"{agent_result}\n\n{followup_response}" + break + else: + response = agent_result + + except Exception as e: + response = f"Error routing to {agent_name}: {e}" + else: + response = f"Agent {agent_name} not found. 
Available agents: {', '.join(self.agents.keys())}" + + # Check for artifacts in response + elif "ARTIFACT:" in response: + response = await self._extract_and_store_artifact(response, session, context_id) + except Exception as e: + if self.debug: + print(f"[DEBUG] Runner error: {e}") + print(f"[DEBUG] Error type: {type(e).__name__}") + import traceback + print(f"[DEBUG] Traceback: {traceback.format_exc()}") + # Fallback to direct agent response + response = f"I encountered an issue processing your request: {str(e) if self.debug else 'Please try again.'}" + + try: + save_session = getattr(self.runner.session_service, "save_session", None) + if callable(save_session): + await save_session(session) + except Exception as exc: + if self.debug: + print(f"[DEBUG] Failed to save session: {exc}") + + return { + "response": response or "No response generated", + "context_id": context_id, + "routed": False + } + + async def _create_session(self) -> Any: + """Create a new session""" + try: + # Create session with proper parameters + session = await self.runner.session_service.create_session( + app_name="fuzzforge", + user_id=f"user_{datetime.now().strftime('%Y%m%d_%H%M%S')}" + ) + return session + except Exception as e: + # If session service fails, create a simple mock session + if self.debug: + print(f"[DEBUG] Session creation failed: {e}, using mock session") + + # Return a simple session object + from types import SimpleNamespace + return SimpleNamespace( + id=f"session_{datetime.now().strftime('%Y%m%d_%H%M%S')}", + state={}, + app_name="fuzzforge", + user_id="user" + ) + + + async def _extract_and_store_artifact(self, response: str, session: Any, context_id: str) -> str: + """Extract and store artifacts from response using ADK artifact service (A2A compliant)""" + import re + + # Pattern to match artifact format - handle both inline and multiline formats + # Format: ARTIFACT: type filename\n```content``` (with possible extra newlines) + pattern = r'ARTIFACT:\s*(\w+)\s+(.+?)\s*\n```([^`]*?)```' + matches = re.findall(pattern, response, re.DOTALL) + + if self.debug: + print(f"[DEBUG] Looking for artifacts in response. Found {len(matches)} matches.") + if matches: + for i, (artifact_type, title, content) in enumerate(matches): + print(f"[DEBUG] Artifact {i+1}: type={artifact_type}, title={title.strip()}, content_length={len(content)}") + else: + # Show first 500 chars of response to debug regex issues + print(f"[DEBUG] No artifacts found. Response preview: {response[:500]}...") + + if matches: + artifacts_created = [] + + for artifact_type, title, content in matches: + # Determine MIME type based on artifact type + mime_type_map = { + "code": "text/plain", + "c": "text/x-c", + "cpp": "text/x-c++", + "python": "text/x-python", + "javascript": "text/javascript", + "json": "application/json", + "config": "text/plain", + "document": "text/markdown", + "data": "application/json", + "diagram": "text/plain", + "yaml": "text/yaml", + "xml": "text/xml", + "html": "text/html" + } + mime_type = mime_type_map.get(artifact_type, "text/plain") + + # Create proper A2A artifact format + title_clean = title.strip().replace(' ', '_') + # If title already has extension, use it as-is, otherwise add artifact_type as extension + if '.' 
in title_clean: + filename = title_clean + else: + filename = f"{title_clean}.{artifact_type}" + artifact_id = f"artifact_{uuid.uuid4().hex[:8]}" + + try: + # Store using ADK artifact service if available + if self.artifact_service: + # Create artifact metadata for A2A + artifact_metadata = { + "id": artifact_id, + "name": title.strip(), + "type": artifact_type, + "mimeType": mime_type, + "filename": filename, + "size": len(content), + "createdAt": datetime.now().isoformat() + } + + # Store content in artifact service + # Save to ADK artifact service using correct API + try: + from google.genai import types + + # Detect content type and extension from artifact metadata + filename = artifact_metadata.get("filename", f"{artifact_id}.txt") + mime_type = artifact_metadata.get("mimeType", "text/plain") + + # Handle different content types + if isinstance(content, str): + content_bytes = content.encode('utf-8') + elif isinstance(content, bytes): + content_bytes = content + else: + content_bytes = str(content).encode('utf-8') + + # Create ADK artifact using correct API + artifact_part = types.Part( + inline_data=types.Blob( + mime_type=mime_type, + data=content_bytes + ) + ) + + # Save using ADK artifact service + await self.artifact_service.save_artifact( + filename=filename, + artifact=artifact_part + ) + + if self.debug: + print(f"[DEBUG] Saved artifact to ADK service: {filename}") + + except ImportError as e: + # Fallback: just store in local cache if ADK not available + if self.debug: + print(f"[DEBUG] ADK types not available ({e}), using local storage only") + except Exception as e: + if self.debug: + print(f"[DEBUG] ADK artifact service error: {e}, using local storage only") + + if self.debug: + print(f"[DEBUG] Saved artifact to service: {artifact_id}") + + # Store to file system cache for HTTP serving + try: + content_bytes = content.encode('utf-8') if isinstance(content, str) else content + sha256_digest = hashlib.sha256(content_bytes).hexdigest() + + file_cache_result = self._register_artifact_bytes( + name=filename, + data=content_bytes, + mime_type=mime_type, + sha256_digest=sha256_digest, + size=len(content_bytes), + artifact_id=artifact_id # Use the display ID for file system + ) + + if self.debug: + print(f"[DEBUG] Stored artifact to file cache: {file_cache_result['file_uri']}") + except Exception as e: + if self.debug: + print(f"[DEBUG] Failed to store to file cache: {e}") + + # Also store in local cache for quick access + if context_id not in self.artifacts: + self.artifacts[context_id] = [] + + artifact = { + "id": artifact_id, + "type": artifact_type, + "title": title.strip(), + "filename": filename, + "mimeType": mime_type, + "content": content.strip(), + "size": len(content), + "created_at": datetime.now().isoformat() + } + + self.artifacts[context_id].append(artifact) + artifacts_created.append(f"{title.strip()} ({artifact_type})") + + if self.debug: + print(f"[DEBUG] Stored artifact: {artifact['id']} - {artifact['title']}") + + except Exception as e: + if self.debug: + print(f"[DEBUG] Failed to store artifact: {e}") + + # Create A2A compliant response with artifact references + artifact_list = ", ".join(artifacts_created) + clean_response = re.sub(pattern, "", response) + + # Add artifact notification in A2A format + artifact_response = f"{clean_response}\n\n๐Ÿ“Ž Created artifacts: {artifact_list}" + + return artifact_response + + return response + + async def get_artifacts(self, context_id: str = None) -> List[Dict[str, Any]]: + """Get artifacts for a context or all 
artifacts""" + if self.debug: + print(f"[DEBUG] get_artifacts called with context_id: {context_id}") + print(f"[DEBUG] Available artifact contexts: {list(self.artifacts.keys())}") + print(f"[DEBUG] Total artifacts stored: {sum(len(artifacts) for artifacts in self.artifacts.values())}") + + if context_id: + result = self.artifacts.get(context_id, []) + if self.debug: + print(f"[DEBUG] Returning {len(result)} artifacts for context {context_id}") + return result + + # Return all artifacts + all_artifacts = [] + for ctx_id, artifacts in self.artifacts.items(): + for artifact in artifacts: + artifact_copy = artifact.copy() + artifact_copy['context_id'] = ctx_id + all_artifacts.append(artifact_copy) + + if self.debug: + print(f"[DEBUG] Returning {len(all_artifacts)} total artifacts") + return all_artifacts + + def format_artifacts_for_a2a(self, context_id: str) -> List[Dict[str, Any]]: + """Format artifacts for A2A protocol response""" + artifacts = self.artifacts.get(context_id, []) + a2a_artifacts = [] + + for artifact in artifacts: + # Create A2A compliant artifact format + a2a_artifact = { + "id": artifact["id"], + "type": "artifact", + "mimeType": artifact.get("mimeType", "text/plain"), + "name": artifact.get("title", artifact.get("filename", "untitled")), + "parts": [ + { + "type": "text", + "text": artifact.get("content", "") + } + ], + "metadata": { + "filename": artifact.get("filename"), + "size": artifact.get("size", 0), + "createdAt": artifact.get("created_at") + } + } + a2a_artifacts.append(a2a_artifact) + + return a2a_artifacts + + async def register_agent(self, url: str) -> Dict[str, Any]: + """Register a new A2A agent with persistence""" + try: + conn = RemoteAgentConnection(url) + card = await conn.get_agent_card() + + if not card: + return {"success": False, "error": "Failed to get agent card"} + + name = card.get("name", f"agent_{len(self.agents)}") + description = card.get("description", "") + + self.agents[name] = { + "url": url, + "card": card, + "connection": conn + } + + if self.debug: + print(f"[DEBUG] Registered agent {name} for ROUTE_TO delegation") + + # Update session state with registered agents for the LLM + if hasattr(self, 'sessions'): + for session in self.sessions.values(): + if hasattr(session, 'state'): + session.state["registered_agents"] = list(self.agents.keys()) + + # Persist to config + from .config_manager import ConfigManager + config_mgr = ConfigManager() + config_mgr.add_registered_agent(name, url, description) + + return { + "success": True, + "name": name, + "capabilities": len(card.get("skills", [])), + "description": description + } + + except Exception as e: + return {"success": False, "error": str(e)} + + def list_agents(self) -> List[Dict[str, Any]]: + """List all registered agents""" + return [ + { + "name": name, + "url": info["url"], + "description": info.get("card", {}).get("description", ""), + "skills": len(info.get("card", {}).get("skills", [])) + } + for name, info in self.agents.items() + ] + + async def cleanup(self): + """Clean up resources""" + # Close agent connections + for agent in self.agents.values(): + conn = agent.get("connection") + if conn: + await conn.close() + + # End AgentOps trace + if self.agentops_trace: + try: + agentops.end_trace() + except: + pass + + # Cancel background monitors + for task in list(self._background_tasks): + task.cancel() + self._background_tasks.clear() + + def _schedule_run_followup(self, run_id: str) -> None: + if run_id not in self.pending_runs: + return + + try: + task = 
asyncio.create_task(self._monitor_run_and_notify(run_id), name=f"fuzzforge_run_{run_id}") + self._background_tasks.add(task) + + def _cleanup(t: asyncio.Task) -> None: + self._background_tasks.discard(t) + try: + t.result() + except asyncio.CancelledError: + if self.debug: + print(f"[DEBUG] Run monitor for {run_id} cancelled") + except Exception as exc: + if self.debug: + print(f"[DEBUG] Run monitor for {run_id} failed: {exc}") + + task.add_done_callback(_cleanup) + except RuntimeError as exc: + if self.debug: + print(f"[DEBUG] Unable to schedule run follow-up: {exc}") + + async def _monitor_run_and_notify(self, run_id: str) -> None: + try: + run_meta = self.pending_runs.get(run_id) + if not run_meta: + return + context_id = run_meta.get("context_id") + while True: + status = await self._call_mcp_status(run_id) + if isinstance(status, dict) and status.get("is_completed"): + break + await asyncio.sleep(5) + + summary = await self._call_mcp_summary(run_id) + findings: Any | None = None + try: + findings = await self._call_mcp_generic( + "get_run_findings_mcp", {"run_id": run_id} + ) + except Exception as exc: + if self.debug: + print(f"[DEBUG] Unable to fetch findings for {run_id}: {exc}") + + artifact_info = None + try: + artifact_info = await self._create_run_artifact( + run_id=run_id, + run_meta=run_meta, + status=status, + summary=summary, + findings=findings, + ) + if artifact_info: + run_meta["artifact"] = artifact_info + except Exception as exc: + if self.debug: + print(f"[DEBUG] Failed to create artifact for {run_id}: {exc}") + + message = self._format_run_summary(run_id, status, summary) + if artifact_info and artifact_info.get("file_uri"): + message += ( + f"\nArtifact: {artifact_info['file_uri']}" + f" ({artifact_info.get('name', 'run-summary')})" + ) + if context_id: + await self._append_session_message(context_id, message, run_id) + await self._publish_task_update( + run_id, + context_id, + status, + summary, + message, + artifact_info, + ) + self.pending_runs.pop(run_id, None) + except asyncio.CancelledError: + raise + except Exception as exc: + if self.debug: + print(f"[DEBUG] Follow-up notification failed for {run_id}: {exc}") + + async def _call_mcp_status(self, run_id: str) -> Any: + return await self._call_mcp_generic("get_run_status_mcp", {"run_id": run_id}) + + async def _call_mcp_summary(self, run_id: str) -> Any: + return await self._call_mcp_generic("get_comprehensive_scan_summary", {"run_id": run_id}) + + async def _call_mcp_generic(self, tool_name: str, payload: Dict[str, Any]) -> Any: + if not self.fuzzforge_mcp_url: + return {"error": "FUZZFORGE_MCP_URL not configured"} + + try: + from fastmcp.client import Client + except ImportError as exc: + return {"error": f"fastmcp not installed: {exc}"} + + async with Client(self.fuzzforge_mcp_url) as client: + result = await client.call_tool(tool_name, payload) + + if hasattr(result, "content") and result.content: + raw = result.content[0] if isinstance(result.content, list) else result.content + if isinstance(raw, dict) and "text" in raw: + raw = raw["text"] + if isinstance(raw, str): + stripped = raw.strip() + if stripped.startswith("{") or stripped.startswith("["): + try: + return json.loads(stripped) + except json.JSONDecodeError: + return raw + return raw + return raw + + if isinstance(result, (dict, list)): + return result + return str(result) + + def _format_run_summary(self, run_id: str, status: Any, summary: Any) -> str: + lines = [f"FuzzForge workflow {run_id} completed."] + if isinstance(status, dict): + state = 
status.get("status") or status.get("state") + if state: + lines.append(f"Status: {state}") + updated = status.get("updated_at") or status.get("completed_at") + if updated: + lines.append(f"Completed at: {updated}") + if isinstance(summary, dict): + total = summary.get("total_findings") + if total is not None: + lines.append(f"Total findings: {total}") + severity = summary.get("severity_summary") + if isinstance(severity, dict): + lines.append("Severity breakdown: " + ", ".join(f"{k}={v}" for k, v in severity.items())) + recommendations = summary.get("recommendations") + if recommendations: + if isinstance(recommendations, list): + lines.append("Recommendations:") + lines.extend(f"- {item}" for item in recommendations) + else: + lines.append(f"Recommendations: {recommendations}") + else: + lines.append(str(summary)) + lines.append("You can request more detail with get_run_findings_mcp(run_id) or get_run_status_mcp(run_id).") + return "\n".join(lines) + + async def query_project_knowledge_api( + self, + query: str, + search_type: str = "INSIGHTS", + dataset: str = "", + ) -> Dict[str, Any]: + integration = await self._get_knowledge_integration() + if integration is None: + return {"error": "Knowledge graph integration unavailable"} + + try: + result = await integration.search_knowledge_graph( + query=query, + search_type=search_type, + dataset=dataset or None, + ) + return json.loads(json.dumps(result, default=str)) + except Exception as exc: + return {"error": f"Knowledge graph query failed: {exc}"} + + async def create_project_file_artifact_api(self, file_path: str) -> Dict[str, Any]: + try: + config = ProjectConfigManager() + if not config.is_initialized(): + return {"error": "Project not initialized. Run 'fuzzforge init' first."} + + project_root = config.config_path.parent.resolve() + requested_file = (project_root / file_path).resolve() + + try: + requested_file.relative_to(project_root) + except ValueError: + return {"error": f"Access denied: '{file_path}' is outside the project"} + + if not requested_file.exists() or not requested_file.is_file(): + return {"error": f"File not found: {file_path}"} + + size = requested_file.stat().st_size + max_bytes = int(os.getenv("FUZZFORGE_ARTIFACT_MAX_BYTES", str(25 * 1024 * 1024))) + if size > max_bytes: + return { + "error": ( + f"File {file_path} is {size} bytes, exceeding the limit of {max_bytes} bytes" + ) + } + + data = requested_file.read_bytes() + mime_type, _ = mimetypes.guess_type(str(requested_file)) + if not mime_type: + mime_type = "application/octet-stream" + + artifact_id = f"project_file_{uuid.uuid4().hex[:8]}" + sha256_digest = hashlib.sha256(data).hexdigest() + + if self.artifact_service: + try: + artifact_part = types.Part( + inline_data=types.Blob( + mime_type=mime_type, + data=data, + ) + ) + await self.artifact_service.save_artifact( + filename=requested_file.name, + artifact=artifact_part, + ) + if self.debug: + print( + f"[DEBUG] Saved project file artifact to service: {requested_file.name}" + ) + except Exception as exc: + if self.debug: + print(f"[DEBUG] Artifact service save failed: {exc}") + + local_meta = self._register_artifact_bytes( + name=requested_file.name, + data=data, + mime_type=mime_type, + sha256_digest=sha256_digest, + size=size, + artifact_id=artifact_id, + ) + + local_meta.update( + { + "path": str(requested_file), + "size": size, + "name": requested_file.name, + "mime_type": mime_type, + } + ) + return local_meta + except Exception as exc: + return {"error": f"Failed to create artifact: {exc}"} + + async 
def _create_run_artifact( + self, + *, + run_id: str, + run_meta: Dict[str, Any], + status: Any, + summary: Any, + findings: Any | None = None, + ) -> Dict[str, Any] | None: + workflow_name = run_meta.get("workflow_name") or "workflow" + safe_workflow = "".join( + ch if ch.isalnum() or ch in {"-", "_"} else "_" for ch in workflow_name + ) or "workflow" + artifact_filename = f"{safe_workflow}_{run_id}_summary.json" + + payload: Dict[str, Any] = { + "run_id": run_id, + "workflow": workflow_name, + "submitted_at": run_meta.get("submitted_at"), + "status": status, + "summary": summary, + } + + if isinstance(findings, dict) and not findings.get("error"): + payload["findings"] = findings + + artifact_bytes = json.dumps(payload, indent=2, default=str).encode("utf-8") + + if self.artifact_service: + try: + artifact_part = types.Part( + inline_data=types.Blob( + mime_type="application/json", + data=artifact_bytes, + ) + ) + await self.artifact_service.save_artifact( + filename=artifact_filename, + artifact=artifact_part, + ) + if self.debug: + print( + f"[DEBUG] Saved run artifact to artifact service: {artifact_filename}" + ) + except Exception as exc: + if self.debug: + print(f"[DEBUG] Artifact service save failed: {exc}") + + sha256_digest = hashlib.sha256(artifact_bytes).hexdigest() + local_meta = self._register_artifact_bytes( + name=artifact_filename, + data=artifact_bytes, + mime_type="application/json", + sha256_digest=sha256_digest, + size=len(artifact_bytes), + artifact_id=f"fuzzforge_run_{run_id}", + ) + + return local_meta + + async def _append_session_message(self, context_id: str, message: str, run_id: str) -> None: + meta = self.session_metadata.get(context_id) + if not meta: + return + service = self.runner.session_service + session_obj = None + if hasattr(service, "sessions"): + session_obj = ( + service.sessions + .get(meta.get("app_name", "fuzzforge"), {}) + .get(meta.get("user_id"), {}) + .get(meta.get("session_id")) + ) + if not session_obj: + if self.debug: + print(f"[DEBUG] Could not locate session for context {context_id}") + return + + event = Event( + invocationId=str(uuid.uuid4()), + id=str(uuid.uuid4()), + author=getattr(self.agent, 'name', 'FuzzForge'), + content=types.Content( + role='assistant', + parts=[Part.from_text(text=message)] + ), + actions=EventActions(), + ) + event.actions.state_delta[f"fuzzforge.run.{run_id}.status"] = "completed" + event.actions.state_delta[f"fuzzforge.run.{run_id}.timestamp"] = datetime.now().isoformat() + + await service.append_event(session_obj, event) + session_obj.last_update_time = time.time() + + cached_session = self.sessions.get(context_id) + if cached_session and hasattr(cached_session, 'events'): + cached_session.events.append(event) + elif cached_session: + cached_session.events = [event] + + async def _append_external_event(self, session: Any, agent_name: str, message_text: str) -> None: + if session is None: + return + event = Event( + invocationId=str(uuid.uuid4()), + id=str(uuid.uuid4()), + author=agent_name, + content=types.Content( + role='assistant', + parts=[Part.from_text(text=message_text)] + ), + actions=EventActions(), + ) + await self.runner.session_service.append_event(session, event) + if hasattr(session, 'events'): + session.events.append(event) + else: + session.events = [event] + + async def _send_to_agent( + self, + agent_name: str, + message: Union[str, Dict[str, Any], List[Dict[str, Any]]], + session: Any, + context_id: str, + ) -> str: + agent_entry = self.agents.get(agent_name) + if not agent_entry: + 
return f"Agent '{agent_name}' is not registered." + + conn = agent_entry.get('connection') + if conn is None: + conn = RemoteAgentConnection(agent_entry['url']) + await conn.get_agent_card() + agent_entry['connection'] = conn + + conn.context_id = context_id + response = await conn.send_message(message) + response_text = response if isinstance(response, str) else str(response) + await self._append_external_event(session, agent_name, response_text) + return response_text + + async def delegate_file_to_agent( + self, + agent_name: str, + file_path: str, + note: str = "", + session: Any = None, + context_id: str | None = None, + ) -> str: + try: + project_root = None + try: + config = ProjectConfigManager() + if config.is_initialized(): + project_root = config.config_path.parent + except Exception: + project_root = None + + path_obj = Path(file_path).expanduser() + if not path_obj.is_absolute() and project_root: + path_obj = (project_root / path_obj).resolve() + else: + path_obj = path_obj.resolve() + + if not path_obj.is_file(): + return f"File not found: {path_obj}" + + data = path_obj.read_bytes() + except Exception as exc: + return f"Failed to read file '{file_path}': {exc}" + + message_text = note or f"Please analyse the artifact {path_obj.name}." + + if session is None: + if not self.sessions: + return "No active session available for delegation." + default_context = next(iter(self.sessions.keys())) + session = self.sessions[default_context] + context_id = default_context + + if context_id is None: + session_id = getattr(session, 'id', None) + context_id = self.session_lookup.get(session_id, session_id or 'default') + + app_name = getattr(session, 'app_name', 'fuzzforge') + user_id = getattr(session, 'user_id', 'user') + session_id = getattr(session, 'id', context_id) + + mime_type, _ = mimetypes.guess_type(str(path_obj)) + if not mime_type: + mime_type = 'application/octet-stream' + + sha256_digest = hashlib.sha256(data).hexdigest() + size = len(data) + + artifact_version = None + if self.artifact_service: + try: + artifact_part = types.Part( + inline_data=types.Blob(data=data, mime_type=mime_type) + ) + artifact_version = await self.artifact_service.save_artifact( + app_name=app_name, + user_id=user_id, + session_id=session_id, + filename=path_obj.name, + artifact=artifact_part, + ) + except Exception as exc: + artifact_version = None + if self.debug: + print(f"[DEBUG] Failed to persist artifact in service: {exc}") + + artifact_meta = self._register_artifact_bytes( + name=path_obj.name, + data=data, + mime_type=mime_type, + sha256_digest=sha256_digest, + size=size, + ) + + artifact_info = { + "file_uri": artifact_meta["file_uri"], # HTTP URL for download + "artifact_url": artifact_meta["file_uri"], # Alias for reverse agent compatibility + "cache_path": artifact_meta["path"], + "filename": path_obj.name, + "mime_type": mime_type, + "sha256": sha256_digest, + "size": size, + "session": { + "app_name": app_name, + "user_id": user_id, + "session_id": session_id, + }, + } + if artifact_version is not None: + artifact_info["artifact_version"] = artifact_version + + parts: List[Dict[str, Any]] = [ + {"type": "text", "text": message_text}, + { + "type": "file", + "file": { + "uri": artifact_meta["file_uri"], + "name": path_obj.name, + "mime_type": mime_type, + }, + }, + { + "type": "text", + "text": f"artifact_metadata: {json.dumps(artifact_info)}", + }, + ] + + return await self._send_to_agent(agent_name, {"parts": parts}, session, context_id) + + async def _publish_task_pending(self, 
run_id: str, context_id: str, workflow_name: str) -> None: + task_store = self.task_store + queue_manager = self.queue_manager + if not task_store or not queue_manager: + return + + context_identifier = context_id or "default" + + status_obj = TaskStatus( + state=TaskState.working, + timestamp=datetime.now().isoformat(), + ) + + task = Task( + id=run_id, + context_id=context_identifier, + status=status_obj, + metadata={"workflow": workflow_name}, + ) + await task_store.save(task) + + status_event = TaskStatusUpdateEvent( + taskId=run_id, + contextId=context_identifier, + status=status_obj, + final=False, + metadata={"workflow": workflow_name}, + ) + + queue = await queue_manager.create_or_tap(run_id) + await queue.enqueue_event(status_event) # type: ignore[arg-type] + + async def _publish_task_update( + self, + run_id: str, + context_id: str | None, + status_payload: Any, + summary_payload: Any, + message_text: str, + artifact_info: Dict[str, Any] | None = None, + ) -> None: + if not FuzzForgeExecutor.task_store or not FuzzForgeExecutor.queue_manager: + return + + task_store = self.task_store + queue_manager = self.queue_manager + + context_identifier = context_id or "default" + existing_task = await task_store.get(run_id) + + message_obj = Message( + messageId=str(uuid.uuid4()), + role="agent", + parts=[A2APart.model_validate({"type": "text", "text": message_text})], + contextId=context_identifier, + taskId=run_id, + ) + + status_obj = TaskStatus( + state=TaskState.completed, + timestamp=datetime.now().isoformat(), + message=message_obj, + ) + + metadata = { + "status": status_payload, + "summary": summary_payload, + } + if artifact_info: + metadata["artifact"] = artifact_info + + status_event = TaskStatusUpdateEvent( + taskId=run_id, + contextId=context_identifier, + status=status_obj, + final=True, + metadata=metadata, + ) + + if existing_task: + existing_task.status = status_obj + if existing_task.metadata is None: + existing_task.metadata = {} + existing_task.metadata.update(metadata) + if existing_task.history: + existing_task.history.append(message_obj) + else: + existing_task.history = [message_obj] + await task_store.save(existing_task) + else: + new_task = Task( + id=run_id, + context_id=context_identifier, + status=status_obj, + metadata=metadata, + history=[message_obj], + ) + await task_store.save(new_task) + + queue = await queue_manager.create_or_tap(run_id) + await queue.enqueue_event(status_event) # type: ignore[arg-type] diff --git a/ai/src/fuzzforge_ai/cli.py b/ai/src/fuzzforge_ai/cli.py new file mode 100755 index 0000000..b63f7bd --- /dev/null +++ b/ai/src/fuzzforge_ai/cli.py @@ -0,0 +1,977 @@ +#!/usr/bin/env python3 +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
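+# A hypothetical launch sketch: the canonical entry point is the `main()`
+# function at the bottom of this module, so (assuming the package is
+# importable as `fuzzforge_ai`) the CLI can also be started programmatically:
+#
+#     from fuzzforge_ai.cli import main
+#     main()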
+ +""" +FuzzForge CLI - Clean modular version +Uses the separated agent components +""" + +import asyncio +import shlex +import os +import sys +import signal +import warnings +import logging +import random +from datetime import datetime +from contextlib import contextmanager +from pathlib import Path +from typing import Any + +from dotenv import load_dotenv + +# Ensure Cognee writes logs inside the project workspace +project_root = Path.cwd() +default_log_dir = project_root / ".fuzzforge" / "logs" +default_log_dir.mkdir(parents=True, exist_ok=True) +log_path = default_log_dir / "cognee.log" +os.environ.setdefault("COGNEE_LOG_PATH", str(log_path)) + +# Suppress warnings +warnings.filterwarnings("ignore") +logging.basicConfig(level=logging.ERROR) + +# Load .env file with explicit path handling +# 1. First check current working directory for .fuzzforge/.env +fuzzforge_env = Path.cwd() / ".fuzzforge" / ".env" +if fuzzforge_env.exists(): + load_dotenv(fuzzforge_env, override=True) +else: + # 2. Then check parent directories for .fuzzforge projects + current_path = Path.cwd() + for parent in [current_path] + list(current_path.parents): + fuzzforge_dir = parent / ".fuzzforge" + if fuzzforge_dir.exists(): + project_env = fuzzforge_dir / ".env" + if project_env.exists(): + load_dotenv(project_env, override=True) + break + else: + # 3. Fallback to generic load_dotenv + load_dotenv(override=True) + +# Enhanced readline configuration for Rich Console input compatibility +try: + import readline + # Enable Rich-compatible input features + readline.parse_and_bind("tab: complete") + readline.parse_and_bind("set editing-mode emacs") + readline.parse_and_bind("set show-all-if-ambiguous on") + readline.parse_and_bind("set completion-ignore-case on") + readline.parse_and_bind("set colored-completion-prefix on") + readline.parse_and_bind("set enable-bracketed-paste on") # Better paste support + # Navigation bindings for better editing + readline.parse_and_bind("Control-a: beginning-of-line") + readline.parse_and_bind("Control-e: end-of-line") + readline.parse_and_bind("Control-u: unix-line-discard") + readline.parse_and_bind("Control-k: kill-line") + readline.parse_and_bind("Control-w: unix-word-rubout") + readline.parse_and_bind("Meta-Backspace: backward-kill-word") + # History and completion + readline.set_history_length(2000) + readline.set_startup_hook(None) + # Enable multiline editing hints + readline.parse_and_bind("set horizontal-scroll-mode off") + readline.parse_and_bind("set mark-symlinked-directories on") + READLINE_AVAILABLE = True +except ImportError: + READLINE_AVAILABLE = False + +from rich.console import Console +from rich.table import Table +from rich.panel import Panel +from rich.prompt import Prompt +from rich import box + +from google.adk.events.event import Event +from google.adk.events.event_actions import EventActions +from google.genai import types as gen_types + +from .agent import FuzzForgeAgent +from .agent_card import get_fuzzforge_agent_card +from .config_manager import ConfigManager +from .config_bridge import ProjectConfigManager +from .remote_agent import RemoteAgentConnection + +console = Console() + +# Global shutdown flag +shutdown_requested = False + +# Dynamic status messages for better UX +THINKING_MESSAGES = [ + "Thinking", "Processing", "Computing", "Analyzing", "Working", + "Pondering", "Deliberating", "Calculating", "Reasoning", "Evaluating" +] + +WORKING_MESSAGES = [ + "Working", "Processing", "Handling", "Executing", "Running", + "Operating", "Performing", 
"Conducting", "Managing", "Coordinating" +] + +SEARCH_MESSAGES = [ + "Searching", "Scanning", "Exploring", "Investigating", "Hunting", + "Seeking", "Probing", "Examining", "Inspecting", "Browsing" +] + +# Cool prompt symbols +PROMPT_STYLES = [ + "โ–ถ", "โฏ", "โžค", "โ†’", "ยป", "โŸฉ", "โ–ท", "โ‡จ", "โŸถ", "โ—†" +] + +def get_dynamic_status(action_type="thinking"): + """Get a random status message based on action type""" + if action_type == "thinking": + return f"{random.choice(THINKING_MESSAGES)}..." + elif action_type == "working": + return f"{random.choice(WORKING_MESSAGES)}..." + elif action_type == "searching": + return f"{random.choice(SEARCH_MESSAGES)}..." + else: + return f"{random.choice(THINKING_MESSAGES)}..." + +def get_prompt_symbol(): + """Get prompt symbol indicating where to write""" + return ">>" + +def signal_handler(signum, frame): + """Handle Ctrl+C gracefully""" + global shutdown_requested + shutdown_requested = True + console.print("\n\n[yellow]Shutting down gracefully...[/yellow]") + sys.exit(0) + +signal.signal(signal.SIGINT, signal_handler) + +@contextmanager +def safe_status(message: str): + """Safe status context manager""" + status = console.status(message, spinner="dots") + try: + status.start() + yield + finally: + status.stop() + + +class FuzzForgeCLI: + """Command-line interface for FuzzForge""" + + def __init__(self): + """Initialize the CLI""" + # Ensure .env is loaded from .fuzzforge directory + fuzzforge_env = Path.cwd() / ".fuzzforge" / ".env" + if fuzzforge_env.exists(): + load_dotenv(fuzzforge_env, override=True) + + # Load configuration for agent registry + self.config_manager = ConfigManager() + + # Check environment configuration + if not os.getenv('LITELLM_MODEL'): + console.print("[red]ERROR: LITELLM_MODEL not set in .env file[/red]") + console.print("Please set LITELLM_MODEL to your desired model") + sys.exit(1) + + # Create the agent (uses env vars directly) + self.agent = FuzzForgeAgent() + + # Create a consistent context ID for this CLI session + self.context_id = f"cli_{datetime.now().strftime('%Y%m%d_%H%M%S')}" + + # Track registered agents for config persistence + self.agents_modified = False + + # Command handlers + self.commands = { + "/help": self.cmd_help, + "/register": self.cmd_register, + "/unregister": self.cmd_unregister, + "/list": self.cmd_list, + "/memory": self.cmd_memory, + "/recall": self.cmd_recall, + "/artifacts": self.cmd_artifacts, + "/tasks": self.cmd_tasks, + "/skills": self.cmd_skills, + "/sessions": self.cmd_sessions, + "/clear": self.cmd_clear, + "/sendfile": self.cmd_sendfile, + "/quit": self.cmd_quit, + "/exit": self.cmd_quit, + } + + self.background_tasks: set[asyncio.Task] = set() + + def print_banner(self): + """Print welcome banner""" + card = self.agent.agent_card + + # Print ASCII banner + console.print("[medium_purple3] โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•—[/medium_purple3]") + console.print("[medium_purple3] โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ•โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ•šโ•โ•โ–ˆโ–ˆโ–ˆโ•”โ•โ•šโ•โ•โ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ•โ–ˆโ–ˆโ•”โ•โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ• โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ• โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•‘[/medium_purple3]") + console.print("[medium_purple3] โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•‘ 
โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•‘[/medium_purple3]") + console.print("[medium_purple3] โ–ˆโ–ˆโ•”โ•โ•โ• โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ•”โ•โ•โ• โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•”โ•โ•โ• โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•‘[/medium_purple3]") + console.print("[medium_purple3] โ–ˆโ–ˆโ•‘ โ•šโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•‘ โ•šโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ•šโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•‘[/medium_purple3]") + console.print("[medium_purple3] โ•šโ•โ• โ•šโ•โ•โ•โ•โ•โ• โ•šโ•โ•โ•โ•โ•โ•โ•โ•šโ•โ•โ•โ•โ•โ•โ•โ•šโ•โ• โ•šโ•โ•โ•โ•โ•โ• โ•šโ•โ• โ•šโ•โ• โ•šโ•โ•โ•โ•โ•โ• โ•šโ•โ•โ•โ•โ•โ•โ• โ•šโ•โ• โ•šโ•โ•โ•šโ•โ•[/medium_purple3]") + console.print(f"\n[dim]{card.description}[/dim]\n") + + provider = ( + os.getenv("LLM_PROVIDER") + or os.getenv("LLM_COGNEE_PROVIDER") + or os.getenv("COGNEE_LLM_PROVIDER") + or "unknown" + ) + + console.print( + "LLM Provider: [medium_purple1]{provider}[/medium_purple1]".format( + provider=provider + ) + ) + console.print( + "LLM Model: [medium_purple1]{model}[/medium_purple1]".format( + model=self.agent.model + ) + ) + if self.agent.executor.agentops_trace: + console.print(f"Tracking: [medium_purple1]AgentOps active[/medium_purple1]") + + # Show skills + console.print("\nSkills:") + for skill in card.skills: + console.print( + f" โ€ข [deep_sky_blue1]{skill.name}[/deep_sky_blue1] โ€“ {skill.description}" + ) + console.print("\nType /help for commands or just chat\n") + + async def cmd_help(self, args: str = "") -> None: + """Show help""" + help_text = """ +[bold]Commands:[/bold] + /register - Register an A2A agent (saves to config) + /unregister - Remove agent from registry and config + /list - List registered agents + +[bold]Memory Systems:[/bold] + /recall - Search past conversations (ADK Memory) + /memory - Show knowledge graph (Cognee) + /memory save - Save to knowledge graph + /memory search - Search knowledge graph + +[bold]Other:[/bold] + /artifacts - List created artifacts + /artifacts - Show artifact content + /tasks [id] - Show task list or details + /skills - Show FuzzForge skills + /sessions - List active sessions + /sendfile [message] - Attach file as artifact and route to agent + /clear - Clear screen + /help - Show this help + /quit - Exit + +[bold]Sample prompts:[/bold] + run fuzzforge workflow security_assessment on /absolute/path --volume-mode ro + list fuzzforge runs limit=5 + get fuzzforge summary + query project knowledge about "unsafe Rust" using GRAPH_COMPLETION + export project file src/lib.rs as artifact + /memory search "recent findings" + +[bold]Input Editing:[/bold] + Arrow keys - Move cursor + Ctrl+A/E - Start/end of line + Up/Down - Command history + """ + console.print(help_text) + + async def cmd_register(self, args: str) -> None: + """Register an agent""" + if not args: + console.print("Usage: /register ") + return + + with safe_status(f"{get_dynamic_status('working')} Registering {args}"): + result = await self.agent.register_agent(args.strip()) + + if result["success"]: + console.print(f"โœ… Registered: [bold]{result['name']}[/bold]") + console.print(f" Capabilities: {result['capabilities']} skills") + + # Get description from the agent's card + agents = self.agent.list_agents() + description = "" + for agent in agents: + if agent['name'] == 
result['name']: + description = agent.get('description', '') + break + + # Add to config for persistence + self.config_manager.add_registered_agent( + name=result['name'], + url=args.strip(), + description=description + ) + console.print(f" [dim]Saved to config for auto-registration[/dim]") + else: + console.print(f"[red]Failed: {result['error']}[/red]") + + async def cmd_unregister(self, args: str) -> None: + """Unregister an agent and remove from config""" + if not args: + console.print("Usage: /unregister ") + return + + # Try to find the agent + agents = self.agent.list_agents() + agent_to_remove = None + + for agent in agents: + if agent['name'].lower() == args.lower() or agent['url'] == args: + agent_to_remove = agent + break + + if not agent_to_remove: + console.print(f"[yellow]Agent '{args}' not found[/yellow]") + return + + # Remove from config + if self.config_manager.remove_registered_agent(name=agent_to_remove['name'], url=agent_to_remove['url']): + console.print(f"โœ… Unregistered: [bold]{agent_to_remove['name']}[/bold]") + console.print(f" [dim]Removed from config (won't auto-register next time)[/dim]") + else: + console.print(f"[yellow]Agent unregistered from session but not found in config[/yellow]") + + async def cmd_list(self, args: str = "") -> None: + """List registered agents""" + agents = self.agent.list_agents() + + if not agents: + console.print("No agents registered. Use /register ") + return + + table = Table(title="Registered Agents", box=box.ROUNDED) + table.add_column("Name", style="medium_purple3") + table.add_column("URL", style="deep_sky_blue3") + table.add_column("Skills", style="plum3") + table.add_column("Description", style="dim") + + for agent in agents: + desc = agent['description'] + if len(desc) > 40: + desc = desc[:37] + "..." + table.add_row( + agent['name'], + agent['url'], + str(agent['skills']), + desc + ) + + console.print(table) + + async def cmd_recall(self, args: str = "") -> None: + """Search conversational memory (past conversations)""" + if not args: + console.print("Usage: /recall ") + return + + await self._sync_conversational_memory() + + # First try MemoryService (for ingested memories) + with safe_status(get_dynamic_status('searching')): + results = await self.agent.memory_manager.search_conversational_memory(args) + + if results and results.memories: + console.print(f"[bold]Found {len(results.memories)} memories:[/bold]\n") + for i, memory in enumerate(results.memories, 1): + # MemoryEntry has 'text' field, not 'content' + text = getattr(memory, 'text', str(memory)) + if len(text) > 200: + text = text[:200] + "..." + console.print(f"{i}. {text}") + else: + # If MemoryService is empty, search SQLite directly + console.print("[yellow]No memories in MemoryService, searching SQLite sessions...[/yellow]") + + # Check if using DatabaseSessionService + if hasattr(self.agent.executor, 'session_service'): + service_type = type(self.agent.executor.session_service).__name__ + if service_type == 'DatabaseSessionService': + # Search SQLite database directly + import sqlite3 + import os + db_path = os.getenv('SESSION_DB_PATH', './fuzzforge_sessions.db') + + if os.path.exists(db_path): + conn = sqlite3.connect(db_path) + cursor = conn.cursor() + + # Search in events table + query = f"%{args}%" + cursor.execute( + "SELECT content FROM events WHERE content LIKE ? 
LIMIT 10", + (query,) + ) + + rows = cursor.fetchall() + conn.close() + + if rows: + console.print(f"[green]Found {len(rows)} matches in SQLite sessions:[/green]\n") + for i, (content,) in enumerate(rows, 1): + # Parse JSON content + import json + try: + data = json.loads(content) + if 'parts' in data and data['parts']: + text = data['parts'][0].get('text', '')[:150] + role = data.get('role', 'unknown') + console.print(f"{i}. [{role}]: {text}...") + except: + console.print(f"{i}. {content[:150]}...") + else: + console.print("[yellow]No matches found in SQLite either[/yellow]") + else: + console.print("[yellow]SQLite database not found[/yellow]") + else: + console.print(f"[dim]Using {service_type} (not searchable)[/dim]") + else: + console.print("[yellow]No session history available[/yellow]") + + async def cmd_memory(self, args: str = "") -> None: + """Inspect conversational memory and knowledge graph state.""" + raw_args = (args or "").strip() + lower_args = raw_args.lower() + + if not raw_args or lower_args in {"status", "info"}: + await self._show_memory_status() + return + + if lower_args == "datasets": + await self._show_dataset_summary() + return + + if lower_args.startswith("search ") or lower_args.startswith("recall "): + query = raw_args.split(" ", 1)[1].strip() if " " in raw_args else "" + if not query: + console.print("Usage: /memory search ") + return + await self.cmd_recall(query) + return + + console.print("Usage: /memory [status|datasets|search ]") + console.print("[dim]/memory search is an alias for /recall [/dim]") + + async def _sync_conversational_memory(self) -> None: + """Ensure the ADK memory service ingests any completed sessions.""" + memory_service = getattr(self.agent.memory_manager, "memory_service", None) + executor_sessions = getattr(self.agent.executor, "sessions", {}) + metadata_map = getattr(self.agent.executor, "session_metadata", {}) + + if not memory_service or not executor_sessions: + return + + for context_id, session in list(executor_sessions.items()): + meta = metadata_map.get(context_id, {}) + if meta.get('memory_synced'): + continue + + add_session = getattr(memory_service, "add_session_to_memory", None) + if not callable(add_session): + return + + try: + await add_session(session) + meta['memory_synced'] = True + metadata_map[context_id] = meta + except Exception as exc: # pragma: no cover - defensive logging + if os.getenv('FUZZFORGE_DEBUG', '0') == '1': + console.print(f"[yellow]Memory sync failed:[/yellow] {exc}") + + async def _show_memory_status(self) -> None: + """Render conversational memory, session store, and knowledge graph status.""" + await self._sync_conversational_memory() + + status = self.agent.memory_manager.get_status() + + conversational = status.get("conversational_memory", {}) + conv_type = conversational.get("type", "unknown") + conv_active = "yes" if conversational.get("active") else "no" + conv_details = conversational.get("details", "") + + session_service = getattr(self.agent.executor, "session_service", None) + session_service_name = type(session_service).__name__ if session_service else "Unavailable" + + session_lines = [ + f"[bold]Service:[/bold] {session_service_name}" + ] + + session_count = None + event_count = None + db_path_display = None + + if session_service_name == "DatabaseSessionService": + import sqlite3 + + db_path = os.getenv('SESSION_DB_PATH', './fuzzforge_sessions.db') + session_path = Path(db_path).expanduser().resolve() + db_path_display = str(session_path) + + if session_path.exists(): + try: + with 
sqlite3.connect(session_path) as conn: + cursor = conn.cursor() + cursor.execute("SELECT COUNT(*) FROM sessions") + session_count = cursor.fetchone()[0] + cursor.execute("SELECT COUNT(*) FROM events") + event_count = cursor.fetchone()[0] + except Exception as exc: + session_lines.append(f"[yellow]Warning:[/yellow] Unable to read session database ({exc})") + else: + session_lines.append("[yellow]SQLite session database not found yet[/yellow]") + + elif session_service_name == "InMemorySessionService": + session_lines.append("[dim]Session data persists for the current process only[/dim]") + + if db_path_display: + session_lines.append(f"[bold]Database:[/bold] {db_path_display}") + if session_count is not None: + session_lines.append(f"[bold]Sessions Recorded:[/bold] {session_count}") + if event_count is not None: + session_lines.append(f"[bold]Events Logged:[/bold] {event_count}") + + conv_lines = [ + f"[bold]Type:[/bold] {conv_type}", + f"[bold]Active:[/bold] {conv_active}" + ] + if conv_details: + conv_lines.append(f"[bold]Details:[/bold] {conv_details}") + + console.print(Panel("\n".join(conv_lines), title="Conversation Memory", border_style="medium_purple3")) + console.print(Panel("\n".join(session_lines), title="Session Store", border_style="deep_sky_blue3")) + + # Knowledge graph section + knowledge = status.get("knowledge_graph", {}) + kg_active = knowledge.get("active", False) + kg_lines = [ + f"[bold]Active:[/bold] {'yes' if kg_active else 'no'}", + f"[bold]Purpose:[/bold] {knowledge.get('purpose', 'N/A')}" + ] + + cognee_data = None + cognee_error = None + try: + project_config = ProjectConfigManager() + cognee_data = project_config.get_cognee_config() + except Exception as exc: # pragma: no cover - defensive + cognee_error = str(exc) + + if cognee_data: + data_dir = cognee_data.get('data_directory') + system_dir = cognee_data.get('system_directory') + if data_dir: + kg_lines.append(f"[bold]Data dir:[/bold] {data_dir}") + if system_dir: + kg_lines.append(f"[bold]System dir:[/bold] {system_dir}") + elif cognee_error: + kg_lines.append(f"[yellow]Config unavailable:[/yellow] {cognee_error}") + + dataset_summary = None + if kg_active: + try: + integration = await self.agent.executor._get_knowledge_integration() + if integration: + dataset_summary = await integration.list_datasets() + except Exception as exc: # pragma: no cover - defensive + kg_lines.append(f"[yellow]Dataset listing failed:[/yellow] {exc}") + + if dataset_summary: + if dataset_summary.get("error"): + kg_lines.append(f"[yellow]Dataset listing failed:[/yellow] {dataset_summary['error']}") + else: + datasets = dataset_summary.get("datasets", []) + total = dataset_summary.get("total_datasets") + if total is not None: + kg_lines.append(f"[bold]Datasets:[/bold] {total}") + if datasets: + preview = ", ".join(sorted(datasets)[:5]) + if len(datasets) > 5: + preview += ", โ€ฆ" + kg_lines.append(f"[bold]Samples:[/bold] {preview}") + else: + kg_lines.append("[dim]Run `fuzzforge ingest` to populate the knowledge graph[/dim]") + + console.print(Panel("\n".join(kg_lines), title="Knowledge Graph", border_style="spring_green4")) + console.print("\n[dim]Subcommands: /memory datasets | /memory search [/dim]") + + async def _show_dataset_summary(self) -> None: + """List datasets available in the Cognee knowledge graph.""" + try: + integration = await self.agent.executor._get_knowledge_integration() + except Exception as exc: + console.print(f"[yellow]Knowledge graph unavailable:[/yellow] {exc}") + return + + if not integration: + 
console.print("[yellow]Knowledge graph is not initialised yet.[/yellow]") + console.print("[dim]Run `fuzzforge ingest --path . --recursive` to create the project dataset.[/dim]") + return + + with safe_status(get_dynamic_status('searching')): + dataset_info = await integration.list_datasets() + + if dataset_info.get("error"): + console.print(f"[red]{dataset_info['error']}[/red]") + return + + datasets = dataset_info.get("datasets", []) + if not datasets: + console.print("[yellow]No datasets found.[/yellow]") + console.print("[dim]Run `fuzzforge ingest` to populate the knowledge graph.[/dim]") + return + + table = Table(title="Cognee Datasets", box=box.ROUNDED) + table.add_column("Dataset", style="medium_purple3") + table.add_column("Notes", style="dim") + + for name in sorted(datasets): + note = "" + if name.endswith("_codebase"): + note = "primary project dataset" + table.add_row(name, note) + + console.print(table) + console.print( + "[dim]Use knowledge graph prompts (e.g. `search project knowledge for \"topic\" using INSIGHTS`) to query these datasets.[/dim]" + ) + + async def cmd_artifacts(self, args: str = "") -> None: + """List or show artifacts""" + if args: + # Show specific artifact + artifacts = await self.agent.executor.get_artifacts(self.context_id) + for artifact in artifacts: + if artifact['id'] == args or args in artifact['id']: + console.print(Panel( + f"[bold]{artifact['title']}[/bold]\n" + f"Type: {artifact['type']} | Created: {artifact['created_at'][:19]}\n\n" + f"[code]{artifact['content']}[/code]", + title=f"Artifact: {artifact['id']}", + border_style="medium_purple3" + )) + return + console.print(f"[yellow]Artifact {args} not found[/yellow]") + return + + # List all artifacts + artifacts = await self.agent.executor.get_artifacts(self.context_id) + + if not artifacts: + console.print("No artifacts created yet") + console.print("[dim]Artifacts are created when generating code, configs, or documents[/dim]") + return + + table = Table(title="Artifacts", box=box.ROUNDED) + table.add_column("ID", style="medium_purple3") + table.add_column("Type", style="deep_sky_blue3") + table.add_column("Title", style="plum3") + table.add_column("Size", style="dim") + table.add_column("Created", style="dim") + + for artifact in artifacts: + size = f"{len(artifact['content'])} chars" + created = artifact['created_at'][:19] # Just date and time + + table.add_row( + artifact['id'], + artifact['type'], + artifact['title'][:40] + "..." 
if len(artifact['title']) > 40 else artifact['title'], + size, + created + ) + + console.print(table) + console.print(f"\n[dim]Use /artifacts to view artifact content[/dim]") + + async def cmd_tasks(self, args: str = "") -> None: + """List tasks or show details for a specific task.""" + store = getattr(self.agent.executor, "task_store", None) + if not store or not hasattr(store, "tasks"): + console.print("Task store not available") + return + + task_id = args.strip() + + async with store.lock: + tasks = dict(store.tasks) + + if not tasks: + console.print("No tasks recorded yet") + return + + if task_id: + task = tasks.get(task_id) + if not task: + console.print(f"Task '{task_id}' not found") + return + + state_str = task.status.state.value if hasattr(task.status.state, "value") else str(task.status.state) + console.print(f"\n[bold]Task {task.id}[/bold]") + console.print(f"Context: {task.context_id}") + console.print(f"State: {state_str}") + console.print(f"Timestamp: {task.status.timestamp}") + if task.metadata: + console.print("Metadata:") + for key, value in task.metadata.items(): + console.print(f" โ€ข {key}: {value}") + if task.history: + console.print("History:") + for entry in task.history[-5:]: + text = getattr(entry, "text", None) + if not text and hasattr(entry, "parts"): + text = " ".join( + getattr(part, "text", "") for part in getattr(entry, "parts", []) + ) + console.print(f" - {text}") + return + + table = Table(title="FuzzForge Tasks", box=box.ROUNDED) + table.add_column("ID", style="medium_purple3") + table.add_column("State", style="white") + table.add_column("Workflow", style="deep_sky_blue3") + table.add_column("Updated", style="green") + + for task in tasks.values(): + state_value = task.status.state.value if hasattr(task.status.state, "value") else str(task.status.state) + workflow = "" + if task.metadata: + workflow = task.metadata.get("workflow") or task.metadata.get("workflow_name") or "" + timestamp = task.status.timestamp if task.status else "" + table.add_row(task.id, state_value, workflow, timestamp) + + console.print(table) + console.print("\n[dim]Use /tasks to view task details[/dim]") + + async def cmd_sessions(self, args: str = "") -> None: + """List active sessions""" + sessions = self.agent.executor.sessions + + if not sessions: + console.print("No active sessions") + return + + table = Table(title="Active Sessions", box=box.ROUNDED) + table.add_column("Context ID", style="medium_purple3") + table.add_column("Session ID", style="deep_sky_blue3") + table.add_column("User ID", style="plum3") + table.add_column("State", style="dim") + + for context_id, session in sessions.items(): + # Get session info + session_id = getattr(session, 'id', 'N/A') + user_id = getattr(session, 'user_id', 'N/A') + state = getattr(session, 'state', {}) + + # Format state info + agents_count = len(state.get('registered_agents', [])) + state_info = f"{agents_count} agents registered" + + table.add_row( + context_id[:20] + "..." if len(context_id) > 20 else context_id, + session_id[:20] + "..." 
if len(str(session_id)) > 20 else str(session_id), + user_id, + state_info + ) + + console.print(table) + console.print(f"\n[dim]Current session: {self.context_id}[/dim]") + + async def cmd_skills(self, args: str = "") -> None: + """Show FuzzForge skills""" + card = self.agent.agent_card + + table = Table(title=f"{card.name} Skills", box=box.ROUNDED) + table.add_column("Skill", style="medium_purple3") + table.add_column("Description", style="white") + table.add_column("Tags", style="deep_sky_blue3") + + for skill in card.skills: + table.add_row( + skill.name, + skill.description, + ", ".join(skill.tags[:3]) + ) + + console.print(table) + + async def cmd_clear(self, args: str = "") -> None: + """Clear screen""" + console.clear() + self.print_banner() + + async def cmd_sendfile(self, args: str) -> None: + """Encode a local file as an artifact and route it to a registered agent.""" + tokens = shlex.split(args) + if len(tokens) < 2: + console.print("Usage: /sendfile [message]") + return + + agent_name = tokens[0] + file_arg = tokens[1] + note = " ".join(tokens[2:]).strip() + + file_path = Path(file_arg).expanduser() + if not file_path.exists(): + console.print(f"[red]File not found:[/red] {file_path}") + return + + session = self.agent.executor.sessions.get(self.context_id) + if not session: + console.print("[red]No active session available. Try sending a prompt first.[/red]") + return + + console.print(f"[dim]Delegating {file_path.name} to {agent_name}...[/dim]") + + async def _delegate() -> None: + try: + response = await self.agent.executor.delegate_file_to_agent( + agent_name, + str(file_path), + note, + session=session, + context_id=self.context_id, + ) + console.print(f"[{agent_name}]: {response}") + except Exception as exc: + console.print(f"[red]Failed to delegate file:[/red] {exc}") + finally: + self.background_tasks.discard(asyncio.current_task()) + + task = asyncio.create_task(_delegate()) + self.background_tasks.add(task) + console.print("[dim]Delegation in progressโ€ฆ you can continue working.[/dim]") + + async def cmd_quit(self, args: str = "") -> None: + """Exit the CLI""" + console.print("\n[green]Shutting down...[/green]") + await self.agent.cleanup() + if self.background_tasks: + for task in list(self.background_tasks): + task.cancel() + await asyncio.gather(*self.background_tasks, return_exceptions=True) + console.print("Goodbye!\n") + sys.exit(0) + + async def process_command(self, text: str) -> bool: + """Process slash commands""" + if not text.startswith('/'): + return False + + parts = text.split(maxsplit=1) + cmd = parts[0].lower() + args = parts[1] if len(parts) > 1 else "" + + if cmd in self.commands: + await self.commands[cmd](args) + return True + + console.print(f"Unknown command: {cmd}") + return True + + async def auto_register_agents(self): + """Auto-register agents from config on startup""" + agents_to_register = self.config_manager.get_registered_agents() + + if agents_to_register: + console.print(f"\n[dim]Auto-registering {len(agents_to_register)} agents from config...[/dim]") + + for agent_config in agents_to_register: + url = agent_config.get('url') + name = agent_config.get('name', 'Unknown') + + if url: + try: + with safe_status(f"Registering {name}..."): + result = await self.agent.register_agent(url) + + if result["success"]: + console.print(f" โœ… {name}: [green]Connected[/green]") + else: + console.print(f" โš ๏ธ {name}: [yellow]Failed - {result.get('error', 'Unknown error')}[/yellow]") + except Exception as e: + console.print(f" โš ๏ธ {name}: 
[yellow]Failed - {e}[/yellow]") + + console.print("") # Empty line for spacing + + async def run(self): + """Main CLI loop""" + self.print_banner() + + # Auto-register agents from config + await self.auto_register_agents() + + while not shutdown_requested: + try: + # Use standard input with non-deletable colored prompt + prompt_symbol = get_prompt_symbol() + try: + # Print colored prompt then use input() for non-deletable behavior + console.print(f"[medium_purple3]{prompt_symbol}[/medium_purple3] ", end="") + user_input = input().strip() + except (EOFError, KeyboardInterrupt): + raise + + if not user_input: + continue + + # Check for commands + if await self.process_command(user_input): + continue + + # Process message + with safe_status(get_dynamic_status('thinking')): + response = await self.agent.process_message(user_input, self.context_id) + + # Display response + console.print(f"\n{response}\n") + + except KeyboardInterrupt: + await self.cmd_quit() + + except EOFError: + await self.cmd_quit() + + except Exception as e: + console.print(f"[red]Error: {e}[/red]") + if os.getenv('FUZZFORGE_DEBUG') == '1': + console.print_exception() + console.print("") + + await self.agent.cleanup() + + +def main(): + """Main entry point""" + try: + cli = FuzzForgeCLI() + asyncio.run(cli.run()) + except KeyboardInterrupt: + console.print("\n[yellow]Interrupted[/yellow]") + sys.exit(0) + except Exception as e: + console.print(f"[red]Fatal error: {e}[/red]") + if os.getenv('FUZZFORGE_DEBUG') == '1': + console.print_exception() + sys.exit(1) + + +if __name__ == "__main__": + main() diff --git a/ai/src/fuzzforge_ai/cognee_integration.py b/ai/src/fuzzforge_ai/cognee_integration.py new file mode 100644 index 0000000..2f134ce --- /dev/null +++ b/ai/src/fuzzforge_ai/cognee_integration.py @@ -0,0 +1,435 @@ +""" +Cognee Integration Module for FuzzForge +Provides standardized access to project-specific knowledge graphs +Can be reused by external agents and other components +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
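+# A minimal usage sketch that drives the integration below end to end.
+# Assumptions: a FuzzForge project has been initialised in the current
+# directory and OPENAI_API_KEY is set; the helper name is hypothetical and
+# nothing else in this module depends on it.
+async def _example_search_project_knowledge() -> None:
+    """Initialise the project knowledge graph and run one INSIGHTS query."""
+    integration = CogneeProjectIntegration()
+    if await integration.initialize():
+        result = await integration.search_knowledge_graph(
+            "unsafe Rust", search_type="INSIGHTS"
+        )
+        print(json.dumps(result, indent=2, default=str))
+# (Run with: asyncio.run(_example_search_project_knowledge()))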
+ + +import os +import asyncio +import json +from typing import Dict, List, Any, Optional, Union +from pathlib import Path + + +class CogneeProjectIntegration: + """ + Standardized Cognee integration that can be reused across agents + Automatically detects project context and provides knowledge graph access + """ + + def __init__(self, project_dir: Optional[str] = None): + """ + Initialize with project directory (defaults to current working directory) + + Args: + project_dir: Path to project directory (optional, defaults to cwd) + """ + self.project_dir = Path(project_dir) if project_dir else Path.cwd() + self.config_file = self.project_dir / ".fuzzforge" / "config.yaml" + self.project_context = None + self._cognee = None + self._initialized = False + + async def initialize(self) -> bool: + """ + Initialize Cognee with project context + + Returns: + bool: True if initialization successful + """ + try: + # Import Cognee + import cognee + self._cognee = cognee + + # Load project context + if not self._load_project_context(): + return False + + # Configure Cognee for this project + await self._setup_cognee_config() + + self._initialized = True + return True + + except ImportError: + print("Cognee not installed. Install with: pip install cognee") + return False + except Exception as e: + print(f"Failed to initialize Cognee: {e}") + return False + + def _load_project_context(self) -> bool: + """Load project context from FuzzForge config""" + try: + if not self.config_file.exists(): + print(f"No FuzzForge config found at {self.config_file}") + return False + + import yaml + with open(self.config_file, 'r') as f: + config = yaml.safe_load(f) + + self.project_context = { + "project_name": config.get("project", {}).get("name", "default"), + "project_id": config.get("project", {}).get("id", "default"), + "tenant_id": config.get("cognee", {}).get("tenant", "default") + } + return True + + except Exception as e: + print(f"Error loading project context: {e}") + return False + + async def _setup_cognee_config(self): + """Configure Cognee for project-specific access""" + # Set API key and model + api_key = os.getenv('OPENAI_API_KEY') + model = os.getenv('LITELLM_MODEL', 'gpt-4o-mini') + + if not api_key: + raise ValueError("OPENAI_API_KEY required for Cognee operations") + + # Configure Cognee + self._cognee.config.set_llm_api_key(api_key) + self._cognee.config.set_llm_model(model) + self._cognee.config.set_llm_provider("openai") + + # Set project-specific directories + project_cognee_dir = self.project_dir / ".fuzzforge" / "cognee" / f"project_{self.project_context['project_id']}" + + self._cognee.config.data_root_directory(str(project_cognee_dir / "data")) + self._cognee.config.system_root_directory(str(project_cognee_dir / "system")) + + # Ensure directories exist + project_cognee_dir.mkdir(parents=True, exist_ok=True) + (project_cognee_dir / "data").mkdir(exist_ok=True) + (project_cognee_dir / "system").mkdir(exist_ok=True) + + async def search_knowledge_graph(self, query: str, search_type: str = "GRAPH_COMPLETION", dataset: str = None) -> Dict[str, Any]: + """ + Search the project's knowledge graph + + Args: + query: Search query + search_type: Type of search ("GRAPH_COMPLETION", "INSIGHTS", "CHUNKS", etc.) 
+ dataset: Specific dataset to search (optional) + + Returns: + Dict containing search results + """ + if not self._initialized: + await self.initialize() + + if not self._initialized: + return {"error": "Cognee not initialized"} + + try: + from cognee.modules.search.types import SearchType + + # Resolve search type dynamically; fallback to GRAPH_COMPLETION + try: + search_type_enum = getattr(SearchType, search_type.upper()) + except AttributeError: + search_type_enum = SearchType.GRAPH_COMPLETION + search_type = "GRAPH_COMPLETION" + + # Prepare search kwargs + search_kwargs = { + "query_type": search_type_enum, + "query_text": query + } + + # Add dataset filter if specified + if dataset: + search_kwargs["datasets"] = [dataset] + + results = await self._cognee.search(**search_kwargs) + + return { + "query": query, + "search_type": search_type, + "dataset": dataset, + "results": results, + "project": self.project_context["project_name"] + } + except Exception as e: + return {"error": f"Search failed: {e}"} + + async def list_knowledge_data(self) -> Dict[str, Any]: + """ + List available data in the knowledge graph + + Returns: + Dict containing available data + """ + if not self._initialized: + await self.initialize() + + if not self._initialized: + return {"error": "Cognee not initialized"} + + try: + data = await self._cognee.list_data() + return { + "project": self.project_context["project_name"], + "available_data": data + } + except Exception as e: + return {"error": f"Failed to list data: {e}"} + + async def ingest_text_to_dataset(self, text: str, dataset: str = None) -> Dict[str, Any]: + """ + Ingest text content into a specific dataset + + Args: + text: Text to ingest + dataset: Dataset name (defaults to project_name_codebase) + + Returns: + Dict containing ingest results + """ + if not self._initialized: + await self.initialize() + + if not self._initialized: + return {"error": "Cognee not initialized"} + + if not dataset: + dataset = f"{self.project_context['project_name']}_codebase" + + try: + # Add text to dataset + await self._cognee.add([text], dataset_name=dataset) + + # Process (cognify) the dataset + await self._cognee.cognify([dataset]) + + return { + "text_length": len(text), + "dataset": dataset, + "project": self.project_context["project_name"], + "status": "success" + } + except Exception as e: + return {"error": f"Ingest failed: {e}"} + + async def ingest_files_to_dataset(self, file_paths: list, dataset: str = None) -> Dict[str, Any]: + """ + Ingest multiple files into a specific dataset + + Args: + file_paths: List of file paths to ingest + dataset: Dataset name (defaults to project_name_codebase) + + Returns: + Dict containing ingest results + """ + if not self._initialized: + await self.initialize() + + if not self._initialized: + return {"error": "Cognee not initialized"} + + if not dataset: + dataset = f"{self.project_context['project_name']}_codebase" + + try: + # Validate and filter readable files + valid_files = [] + for file_path in file_paths: + try: + path = Path(file_path) + if path.exists() and path.is_file(): + # Test if file is readable + with open(path, 'r', encoding='utf-8') as f: + f.read(1) + valid_files.append(str(path)) + except (UnicodeDecodeError, PermissionError, OSError): + continue + + if not valid_files: + return {"error": "No valid files found to ingest"} + + # Add files to dataset + await self._cognee.add(valid_files, dataset_name=dataset) + + # Process (cognify) the dataset + await self._cognee.cognify([dataset]) + + return { + 
"files_processed": len(valid_files), + "total_files_requested": len(file_paths), + "dataset": dataset, + "project": self.project_context["project_name"], + "status": "success" + } + except Exception as e: + return {"error": f"Ingest failed: {e}"} + + async def list_datasets(self) -> Dict[str, Any]: + """ + List all datasets available in the project + + Returns: + Dict containing available datasets + """ + if not self._initialized: + await self.initialize() + + if not self._initialized: + return {"error": "Cognee not initialized"} + + try: + # Get available datasets by searching for data + data = await self._cognee.list_data() + + # Extract unique dataset names from the data + datasets = set() + if isinstance(data, list): + for item in data: + if isinstance(item, dict) and 'dataset_name' in item: + datasets.add(item['dataset_name']) + + return { + "project": self.project_context["project_name"], + "datasets": list(datasets), + "total_datasets": len(datasets) + } + except Exception as e: + return {"error": f"Failed to list datasets: {e}"} + + async def create_dataset(self, dataset: str) -> Dict[str, Any]: + """ + Create a new dataset (dataset is created automatically when data is added) + + Args: + dataset: Dataset name to create + + Returns: + Dict containing creation result + """ + if not self._initialized: + await self.initialize() + + if not self._initialized: + return {"error": "Cognee not initialized"} + + try: + # In Cognee, datasets are created implicitly when data is added + # We'll add empty content to create the dataset + await self._cognee.add([f"Dataset {dataset} initialized for project {self.project_context['project_name']}"], + dataset_name=dataset) + + return { + "dataset": dataset, + "project": self.project_context["project_name"], + "status": "created" + } + except Exception as e: + return {"error": f"Failed to create dataset: {e}"} + + def get_project_context(self) -> Optional[Dict[str, str]]: + """Get current project context""" + return self.project_context + + def is_initialized(self) -> bool: + """Check if Cognee is initialized""" + return self._initialized + + +# Convenience functions for easy integration +async def search_project_codebase(query: str, project_dir: Optional[str] = None, dataset: str = None, search_type: str = "GRAPH_COMPLETION") -> str: + """ + Convenience function to search project codebase + + Args: + query: Search query + project_dir: Project directory (optional, defaults to cwd) + dataset: Specific dataset to search (optional) + search_type: Type of search ("GRAPH_COMPLETION", "INSIGHTS", "CHUNKS") + + Returns: + Formatted search results as string + """ + cognee_integration = CogneeProjectIntegration(project_dir) + result = await cognee_integration.search_knowledge_graph(query, search_type, dataset) + + if "error" in result: + return f"Error searching codebase: {result['error']}" + + project_name = result.get("project", "Unknown") + results = result.get("results", []) + + if not results: + return f"No results found for '{query}' in project {project_name}" + + output = f"Search results for '{query}' in project {project_name}:\n\n" + + # Format results + if isinstance(results, list): + for i, item in enumerate(results, 1): + if isinstance(item, dict): + # Handle structured results + output += f"{i}. 
" + if "search_result" in item: + output += f"Dataset: {item.get('dataset_name', 'Unknown')}\n" + for result_item in item["search_result"]: + if isinstance(result_item, dict): + if "name" in result_item: + output += f" - {result_item['name']}: {result_item.get('description', '')}\n" + elif "text" in result_item: + text = result_item["text"][:200] + "..." if len(result_item["text"]) > 200 else result_item["text"] + output += f" - {text}\n" + else: + output += f" - {str(result_item)[:200]}...\n" + else: + output += f"{str(item)[:200]}...\n" + output += "\n" + else: + output += f"{i}. {str(item)[:200]}...\n\n" + else: + output += f"{str(results)[:500]}..." + + return output + + +async def list_project_knowledge(project_dir: Optional[str] = None) -> str: + """ + Convenience function to list project knowledge + + Args: + project_dir: Project directory (optional, defaults to cwd) + + Returns: + Formatted list of available data + """ + cognee_integration = CogneeProjectIntegration(project_dir) + result = await cognee_integration.list_knowledge_data() + + if "error" in result: + return f"Error listing knowledge: {result['error']}" + + project_name = result.get("project", "Unknown") + data = result.get("available_data", []) + + output = f"Available knowledge in project {project_name}:\n\n" + + if not data: + output += "No data available in knowledge graph" + else: + for i, item in enumerate(data, 1): + output += f"{i}. {item}\n" + + return output diff --git a/ai/src/fuzzforge_ai/cognee_service.py b/ai/src/fuzzforge_ai/cognee_service.py new file mode 100644 index 0000000..dea5d5d --- /dev/null +++ b/ai/src/fuzzforge_ai/cognee_service.py @@ -0,0 +1,416 @@ +""" +Cognee Service for FuzzForge +Provides integrated Cognee functionality for codebase analysis and knowledge graphs +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
+ + +import os +import asyncio +import logging +from pathlib import Path +from typing import Dict, List, Any, Optional +from datetime import datetime + +logger = logging.getLogger(__name__) + + +class CogneeService: + """ + Service for managing Cognee integration with FuzzForge + Handles multi-tenant isolation and project-specific knowledge graphs + """ + + def __init__(self, config): + """Initialize with FuzzForge config""" + self.config = config + self.cognee_config = config.get_cognee_config() + self.project_context = config.get_project_context() + self._cognee = None + self._user = None + self._initialized = False + + async def initialize(self): + """Initialize Cognee with project-specific configuration""" + try: + # Ensure environment variables for Cognee are set before import + self.config.setup_cognee_environment() + logger.debug( + "Cognee environment configured", + extra={ + "data": self.cognee_config.get("data_directory"), + "system": self.cognee_config.get("system_directory"), + }, + ) + + import cognee + self._cognee = cognee + + # Configure LLM with API key BEFORE any other cognee operations + provider = os.getenv("LLM_PROVIDER", "openai") + model = os.getenv("LLM_MODEL") or os.getenv("LITELLM_MODEL", "gpt-4o-mini") + api_key = os.getenv("LLM_API_KEY") or os.getenv("OPENAI_API_KEY") + endpoint = os.getenv("LLM_ENDPOINT") + api_version = os.getenv("LLM_API_VERSION") + max_tokens = os.getenv("LLM_MAX_TOKENS") + + if provider.lower() in {"openai", "azure_openai", "custom"} and not api_key: + raise ValueError( + "OpenAI-compatible API key is required for Cognee LLM operations. " + "Set OPENAI_API_KEY, LLM_API_KEY, or COGNEE_LLM_API_KEY in your .env" + ) + + # Expose environment variables for downstream libraries + os.environ["LLM_PROVIDER"] = provider + os.environ["LITELLM_MODEL"] = model + os.environ["LLM_MODEL"] = model + if api_key: + os.environ["LLM_API_KEY"] = api_key + # Maintain compatibility with components still expecting OPENAI_API_KEY + if provider.lower() in {"openai", "azure_openai", "custom"}: + os.environ.setdefault("OPENAI_API_KEY", api_key) + if endpoint: + os.environ["LLM_ENDPOINT"] = endpoint + if api_version: + os.environ["LLM_API_VERSION"] = api_version + if max_tokens: + os.environ["LLM_MAX_TOKENS"] = str(max_tokens) + + # Configure Cognee's runtime using its configuration helpers when available + if hasattr(cognee.config, "set_llm_provider"): + cognee.config.set_llm_provider(provider) + if hasattr(cognee.config, "set_llm_model"): + cognee.config.set_llm_model(model) + if api_key and hasattr(cognee.config, "set_llm_api_key"): + cognee.config.set_llm_api_key(api_key) + if endpoint and hasattr(cognee.config, "set_llm_endpoint"): + cognee.config.set_llm_endpoint(endpoint) + if api_version and hasattr(cognee.config, "set_llm_api_version"): + cognee.config.set_llm_api_version(api_version) + if max_tokens and hasattr(cognee.config, "set_llm_max_tokens"): + cognee.config.set_llm_max_tokens(int(max_tokens)) + + # Configure graph database + cognee.config.set_graph_db_config({ + "graph_database_provider": self.cognee_config.get("graph_database_provider", "kuzu"), + }) + + # Set data directories + data_dir = self.cognee_config.get("data_directory") + system_dir = self.cognee_config.get("system_directory") + + if data_dir: + logger.debug("Setting cognee data root", extra={"path": data_dir}) + cognee.config.data_root_directory(data_dir) + if system_dir: + logger.debug("Setting cognee system root", extra={"path": system_dir}) + 
cognee.config.system_root_directory(system_dir)
+
+            # Setup multi-tenant user context
+            await self._setup_user_context()
+
+            self._initialized = True
+            logger.info(f"Cognee initialized for project {self.project_context['project_name']} "
+                        f"with Kuzu at {system_dir}")
+
+        except ImportError:
+            logger.error("Cognee not installed. Install with: pip install cognee")
+            raise
+        except Exception as e:
+            logger.error(f"Failed to initialize Cognee: {e}")
+            raise
+
+    async def create_dataset(self):
+        """Prepare the project dataset (Cognee creates datasets lazily when data is added)"""
+        if not self._initialized:
+            await self.initialize()
+
+        try:
+            # Dataset creation is handled automatically by Cognee when adding files;
+            # we only derive the project-scoped name here
+            dataset_name = f"{self.project_context['project_name']}_codebase"
+            logger.info(f"Dataset {dataset_name} ready for project {self.project_context['project_name']}")
+            return dataset_name
+        except Exception as e:
+            logger.error(f"Failed to create dataset: {e}")
+            raise
+
+    async def _setup_user_context(self):
+        """Set up user context for multi-tenant isolation"""
+        try:
+            from cognee.modules.users.methods import create_user, get_user
+
+            # Always try the fallback email first to avoid validation issues
+            fallback_email = f"project_{self.project_context['project_id']}@fuzzforge.example"
+            user_tenant = self.project_context['tenant_id']
+
+            # Try to get an existing fallback user first
+            try:
+                self._user = await get_user(fallback_email)
+                logger.info(f"Using existing user: {fallback_email}")
+                return
+            except Exception:
+                # User doesn't exist yet; fall through and create it
+                pass
+
+            # Create the fallback user
+            try:
+                self._user = await create_user(fallback_email, user_tenant)
+                logger.info(f"Created fallback user: {fallback_email} for tenant: {user_tenant}")
+                return
+            except Exception as fallback_error:
+                logger.warning(f"Fallback user creation failed: {fallback_error}")
+                self._user = None
+                return
+
+        except Exception as e:
+            logger.warning(f"Could not setup multi-tenant user context: {e}")
+            logger.info("Proceeding with default context")
+            self._user = None
+
+    def get_project_dataset_name(self, dataset_suffix: str = "codebase") -> str:
+        """Get project-specific dataset name"""
+        return f"{self.project_context['project_name']}_{dataset_suffix}"
+
+    async def ingest_text(self, content: str, dataset: str = "fuzzforge") -> bool:
+        """Ingest text content into the knowledge graph"""
+        if not self._initialized:
+            await self.initialize()
+
+        try:
+            await self._cognee.add([content], dataset_name=dataset)
+            await self._cognee.cognify([dataset])
+            return True
+        except Exception as e:
+            logger.error(f"Failed to ingest text: {e}")
+            return False
+
+    async def ingest_files(self, file_paths: List[Path], dataset: str = "fuzzforge") -> Dict[str, Any]:
+        """Ingest multiple files into the knowledge graph"""
+        if not self._initialized:
+            await self.initialize()
+
+        results = {
+            "success": 0,
+            "failed": 0,
+            "errors": []
+        }
+
+        try:
+            ingest_paths: List[str] = []
+            for file_path in file_paths:
+                try:
+                    # Opening the file up front filters out unreadable or binary entries
+                    with open(file_path, 'r', encoding='utf-8'):
+                        ingest_paths.append(str(file_path))
+                    results["success"] += 1
+                except (UnicodeDecodeError, PermissionError) as exc:
+                    results["failed"] += 1
+                    results["errors"].append(f"{file_path}: {exc}")
+                    logger.warning("Skipping %s: %s", file_path, exc)
+
+            if ingest_paths:
+                await self._cognee.add(ingest_paths, dataset_name=dataset)
+                await self._cognee.cognify([dataset])
+
+        except Exception as e:
+            logger.error(f"Failed to ingest files: {e}")
+            results["errors"].append(f"Cognify 
error: {str(e)}") + + return results + + async def search_insights(self, query: str, dataset: str = None) -> List[str]: + """Search for insights in the knowledge graph""" + if not self._initialized: + await self.initialize() + + try: + from cognee.modules.search.types import SearchType + + kwargs = { + "query_type": SearchType.INSIGHTS, + "query_text": query + } + + if dataset: + kwargs["datasets"] = [dataset] + + results = await self._cognee.search(**kwargs) + return results if isinstance(results, list) else [] + + except Exception as e: + logger.error(f"Failed to search insights: {e}") + return [] + + async def search_chunks(self, query: str, dataset: str = None) -> List[str]: + """Search for relevant text chunks""" + if not self._initialized: + await self.initialize() + + try: + from cognee.modules.search.types import SearchType + + kwargs = { + "query_type": SearchType.CHUNKS, + "query_text": query + } + + if dataset: + kwargs["datasets"] = [dataset] + + results = await self._cognee.search(**kwargs) + return results if isinstance(results, list) else [] + + except Exception as e: + logger.error(f"Failed to search chunks: {e}") + return [] + + async def search_graph_completion(self, query: str) -> List[str]: + """Search for graph completion (relationships)""" + if not self._initialized: + await self.initialize() + + try: + from cognee.modules.search.types import SearchType + + results = await self._cognee.search( + query_type=SearchType.GRAPH_COMPLETION, + query_text=query + ) + return results if isinstance(results, list) else [] + + except Exception as e: + logger.error(f"Failed to search graph completion: {e}") + return [] + + async def get_status(self) -> Dict[str, Any]: + """Get service status and statistics""" + status = { + "initialized": self._initialized, + "enabled": self.cognee_config.get("enabled", True), + "provider": self.cognee_config.get("graph_database_provider", "kuzu"), + "data_directory": self.cognee_config.get("data_directory"), + "system_directory": self.cognee_config.get("system_directory"), + } + + if self._initialized: + try: + # Check if directories exist and get sizes + data_dir = Path(status["data_directory"]) + system_dir = Path(status["system_directory"]) + + status.update({ + "data_dir_exists": data_dir.exists(), + "system_dir_exists": system_dir.exists(), + "kuzu_db_exists": (system_dir / "kuzu_db").exists(), + "lancedb_exists": (system_dir / "lancedb").exists(), + }) + + except Exception as e: + status["status_error"] = str(e) + + return status + + async def clear_data(self, confirm: bool = False): + """Clear all ingested data (dangerous!)""" + if not confirm: + raise ValueError("Must confirm data clearing with confirm=True") + + if not self._initialized: + await self.initialize() + + try: + await self._cognee.prune.prune_data() + await self._cognee.prune.prune_system(metadata=True) + logger.info("Cognee data cleared") + except Exception as e: + logger.error(f"Failed to clear data: {e}") + raise + + +class FuzzForgeCogneeIntegration: + """ + Main integration class for FuzzForge + Cognee + Provides high-level operations for security analysis + """ + + def __init__(self, config): + self.service = CogneeService(config) + + async def analyze_codebase(self, path: Path, recursive: bool = True) -> Dict[str, Any]: + """ + Analyze a codebase and extract security-relevant insights + """ + # Collect code files + from fuzzforge_ai.ingest_utils import collect_ingest_files + + files = collect_ingest_files(path, recursive, None, []) + + if not files: + return {"error": 
"No files found to analyze"} + + # Ingest files + results = await self.service.ingest_files(files, "security_analysis") + + if results["success"] == 0: + return {"error": "Failed to ingest any files", "details": results} + + # Extract security insights + security_queries = [ + "vulnerabilities security risks", + "authentication authorization", + "input validation sanitization", + "encryption cryptography", + "error handling exceptions", + "logging sensitive data" + ] + + insights = {} + for query in security_queries: + insight_results = await self.service.search_insights(query, "security_analysis") + if insight_results: + insights[query.replace(" ", "_")] = insight_results + + return { + "files_processed": results["success"], + "files_failed": results["failed"], + "errors": results["errors"], + "security_insights": insights + } + + async def query_codebase(self, query: str, search_type: str = "insights") -> List[str]: + """Query the ingested codebase""" + if search_type == "insights": + return await self.service.search_insights(query) + elif search_type == "chunks": + return await self.service.search_chunks(query) + elif search_type == "graph": + return await self.service.search_graph_completion(query) + else: + raise ValueError(f"Unknown search type: {search_type}") + + async def get_project_summary(self) -> Dict[str, Any]: + """Get a summary of the analyzed project""" + # Search for general project insights + summary_queries = [ + "project structure components", + "main functionality features", + "programming languages frameworks", + "dependencies libraries" + ] + + summary = {} + for query in summary_queries: + results = await self.service.search_insights(query) + if results: + summary[query.replace(" ", "_")] = results[:3] # Top 3 results + + return summary diff --git a/ai/src/fuzzforge_ai/config.yaml b/ai/src/fuzzforge_ai/config.yaml new file mode 100644 index 0000000..133c61d --- /dev/null +++ b/ai/src/fuzzforge_ai/config.yaml @@ -0,0 +1,9 @@ +# FuzzForge Registered Agents +# These agents will be automatically registered on startup + +registered_agents: + +# Example entries: +# - name: Calculator +# url: http://localhost:10201 +# description: Mathematical calculations agent diff --git a/ai/src/fuzzforge_ai/config_bridge.py b/ai/src/fuzzforge_ai/config_bridge.py new file mode 100644 index 0000000..668f607 --- /dev/null +++ b/ai/src/fuzzforge_ai/config_bridge.py @@ -0,0 +1,31 @@ +"""Bridge module providing access to the host CLI configuration manager.""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +try: + from fuzzforge_cli.config import ProjectConfigManager as _ProjectConfigManager +except ImportError as exc: # pragma: no cover - used when CLI not available + class _ProjectConfigManager: # type: ignore[no-redef] + """Fallback implementation that raises a helpful error.""" + + def __init__(self, *args, **kwargs): + raise ImportError( + "ProjectConfigManager is unavailable. Install the FuzzForge CLI " + "package or supply a compatible configuration object." 
+ ) from exc + + def __getattr__(name): # pragma: no cover - defensive + raise ImportError("ProjectConfigManager unavailable") from exc + +ProjectConfigManager = _ProjectConfigManager + +__all__ = ["ProjectConfigManager"] diff --git a/ai/src/fuzzforge_ai/config_manager.py b/ai/src/fuzzforge_ai/config_manager.py new file mode 100644 index 0000000..9aa76ca --- /dev/null +++ b/ai/src/fuzzforge_ai/config_manager.py @@ -0,0 +1,134 @@ +""" +Configuration manager for FuzzForge +Handles loading and saving registered agents +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import os +import yaml +from typing import Dict, Any, List + +class ConfigManager: + """Manages FuzzForge agent registry configuration""" + + def __init__(self, config_path: str = None): + """Initialize config manager""" + if config_path: + self.config_path = config_path + else: + # Check for local .fuzzforge/agents.yaml first, then fall back to global + local_config = os.path.join(os.getcwd(), '.fuzzforge', 'agents.yaml') + global_config = os.path.join(os.path.dirname(__file__), 'config.yaml') + + if os.path.exists(local_config): + self.config_path = local_config + if os.getenv("FUZZFORGE_DEBUG", "0") == "1": + print(f"[CONFIG] Using local config: {local_config}") + else: + self.config_path = global_config + if os.getenv("FUZZFORGE_DEBUG", "0") == "1": + print(f"[CONFIG] Using global config: {global_config}") + + self.config = self.load_config() + + def load_config(self) -> Dict[str, Any]: + """Load configuration from YAML file""" + if not os.path.exists(self.config_path): + # Create default config if it doesn't exist + return {'registered_agents': []} + + try: + with open(self.config_path, 'r') as f: + config = yaml.safe_load(f) or {} + # Ensure registered_agents is a list + if 'registered_agents' not in config or config['registered_agents'] is None: + config['registered_agents'] = [] + return config + except Exception as e: + print(f"[WARNING] Failed to load config: {e}") + return {'registered_agents': []} + + def save_config(self): + """Save current configuration to file""" + try: + # Create a clean config with comments + config_content = """# FuzzForge Registered Agents +# These agents will be automatically registered on startup + +""" + # Add the agents list + if self.config.get('registered_agents'): + config_content += yaml.dump({'registered_agents': self.config['registered_agents']}, + default_flow_style=False, sort_keys=False) + else: + config_content += "registered_agents: []\n" + + config_content += """ +# Example entries: +# - name: Calculator +# url: http://localhost:10201 +# description: Mathematical calculations agent +""" + + with open(self.config_path, 'w') as f: + f.write(config_content) + + return True + except Exception as e: + print(f"[ERROR] Failed to save config: {e}") + return False + + def get_registered_agents(self) -> List[Dict[str, Any]]: + """Get list of registered agents from config""" + return self.config.get('registered_agents', []) + + def add_registered_agent(self, name: str, url: str, description: str = "") -> bool: + """Add a new registered agent to config""" + if 
'registered_agents' not in self.config: + self.config['registered_agents'] = [] + + # Check if agent already exists + for agent in self.config['registered_agents']: + if agent.get('url') == url: + # Update existing agent + agent['name'] = name + agent['description'] = description + return self.save_config() + + # Add new agent + self.config['registered_agents'].append({ + 'name': name, + 'url': url, + 'description': description + }) + + return self.save_config() + + def remove_registered_agent(self, name: str = None, url: str = None) -> bool: + """Remove a registered agent from config""" + if 'registered_agents' not in self.config: + return False + + original_count = len(self.config['registered_agents']) + + # Filter out the agent + self.config['registered_agents'] = [ + agent for agent in self.config['registered_agents'] + if not ((name and agent.get('name') == name) or + (url and agent.get('url') == url)) + ] + + if len(self.config['registered_agents']) < original_count: + return self.save_config() + + return False diff --git a/ai/src/fuzzforge_ai/ingest_utils.py b/ai/src/fuzzforge_ai/ingest_utils.py new file mode 100644 index 0000000..ef272d5 --- /dev/null +++ b/ai/src/fuzzforge_ai/ingest_utils.py @@ -0,0 +1,104 @@ +"""Utilities for collecting files to ingest into Cognee.""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +from __future__ import annotations + +import fnmatch +from pathlib import Path +from typing import Iterable, List, Optional + +_DEFAULT_FILE_TYPES = [ + ".py", + ".js", + ".ts", + ".java", + ".cpp", + ".c", + ".h", + ".rs", + ".go", + ".rb", + ".php", + ".cs", + ".swift", + ".kt", + ".scala", + ".clj", + ".hs", + ".md", + ".txt", + ".yaml", + ".yml", + ".json", + ".toml", + ".cfg", + ".ini", +] + +_DEFAULT_EXCLUDE = [ + "*.pyc", + "__pycache__", + ".git", + ".svn", + ".hg", + "node_modules", + ".venv", + "venv", + ".env", + "dist", + "build", + ".pytest_cache", + ".mypy_cache", + ".tox", + "coverage", + "*.log", + "*.tmp", +] + + +def collect_ingest_files( + path: Path, + recursive: bool = True, + file_types: Optional[Iterable[str]] = None, + exclude: Optional[Iterable[str]] = None, +) -> List[Path]: + """Return a list of files eligible for ingestion.""" + path = path.resolve() + files: List[Path] = [] + + extensions = list(file_types) if file_types else list(_DEFAULT_FILE_TYPES) + exclusions = list(exclude) if exclude else [] + exclusions.extend(_DEFAULT_EXCLUDE) + + def should_exclude(file_path: Path) -> bool: + file_str = str(file_path) + for pattern in exclusions: + if fnmatch.fnmatch(file_str, f"*{pattern}*") or fnmatch.fnmatch(file_path.name, pattern): + return True + return False + + if path.is_file(): + if not should_exclude(path) and any(str(path).endswith(ext) for ext in extensions): + files.append(path) + return files + + pattern = "**/*" if recursive else "*" + for file_path in path.glob(pattern): + if file_path.is_file() and not should_exclude(file_path): + if any(str(file_path).endswith(ext) for ext in extensions): + files.append(file_path) + + return files + + +__all__ = ["collect_ingest_files"] diff --git 
a/ai/src/fuzzforge_ai/memory_service.py b/ai/src/fuzzforge_ai/memory_service.py new file mode 100644 index 0000000..8f2446d --- /dev/null +++ b/ai/src/fuzzforge_ai/memory_service.py @@ -0,0 +1,247 @@ +""" +FuzzForge Memory Service +Implements ADK MemoryService pattern for conversational memory +Separate from Cognee which will be used for RAG/codebase analysis +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import os +import json +from typing import Dict, List, Any, Optional +from datetime import datetime +import logging + +# ADK Memory imports +from google.adk.memory import InMemoryMemoryService, BaseMemoryService +from google.adk.memory.base_memory_service import SearchMemoryResponse +from google.adk.memory.memory_entry import MemoryEntry + +# Optional VertexAI Memory Bank +try: + from google.adk.memory import VertexAiMemoryBankService + VERTEX_AVAILABLE = True +except ImportError: + VERTEX_AVAILABLE = False + +logger = logging.getLogger(__name__) + + +class FuzzForgeMemoryService: + """ + Manages conversational memory using ADK patterns + This is separate from Cognee which will handle RAG/codebase + """ + + def __init__(self, memory_type: str = "inmemory", **kwargs): + """ + Initialize memory service + + Args: + memory_type: "inmemory" or "vertexai" + **kwargs: Additional args for specific memory service + For vertexai: project, location, agent_engine_id + """ + self.memory_type = memory_type + self.service = self._create_service(memory_type, **kwargs) + + def _create_service(self, memory_type: str, **kwargs) -> BaseMemoryService: + """Create the appropriate memory service""" + + if memory_type == "inmemory": + # Use ADK's InMemoryMemoryService for local development + logger.info("Using InMemory MemoryService for conversational memory") + return InMemoryMemoryService() + + elif memory_type == "vertexai" and VERTEX_AVAILABLE: + # Use VertexAI Memory Bank for production + project = kwargs.get('project') or os.getenv('GOOGLE_CLOUD_PROJECT') + location = kwargs.get('location') or os.getenv('GOOGLE_CLOUD_LOCATION', 'us-central1') + agent_engine_id = kwargs.get('agent_engine_id') or os.getenv('AGENT_ENGINE_ID') + + if not all([project, location, agent_engine_id]): + logger.warning("VertexAI config missing, falling back to InMemory") + return InMemoryMemoryService() + + logger.info(f"Using VertexAI MemoryBank: {agent_engine_id}") + return VertexAiMemoryBankService( + project=project, + location=location, + agent_engine_id=agent_engine_id + ) + else: + # Default to in-memory + logger.info("Defaulting to InMemory MemoryService") + return InMemoryMemoryService() + + async def add_session_to_memory(self, session: Any) -> None: + """ + Add a completed session to long-term memory + This extracts meaningful information from the conversation + + Args: + session: The session object to process + """ + try: + # Let the underlying service handle the ingestion + # It will extract relevant information based on the implementation + await self.service.add_session_to_memory(session) + + logger.debug(f"Added session {session.id} to {self.memory_type} memory") + + except Exception as 
e: + logger.error(f"Failed to add session to memory: {e}") + + async def search_memory(self, + query: str, + app_name: str = "fuzzforge", + user_id: str = None, + max_results: int = 10) -> SearchMemoryResponse: + """ + Search long-term memory for relevant information + + Args: + query: The search query + app_name: Application name for filtering + user_id: User ID for filtering (optional) + max_results: Maximum number of results + + Returns: + SearchMemoryResponse with relevant memories + """ + try: + # Search the memory service + results = await self.service.search_memory( + app_name=app_name, + user_id=user_id, + query=query + ) + + logger.debug(f"Memory search for '{query}' returned {len(results.memories)} results") + return results + + except Exception as e: + logger.error(f"Memory search failed: {e}") + # Return empty results on error + return SearchMemoryResponse(memories=[]) + + async def ingest_completed_sessions(self, session_service) -> int: + """ + Batch ingest all completed sessions into memory + Useful for initial memory population + + Args: + session_service: The session service containing sessions + + Returns: + Number of sessions ingested + """ + ingested = 0 + + try: + # Get all sessions from the session service + sessions = await session_service.list_sessions(app_name="fuzzforge") + + for session_info in sessions: + # Load full session + session = await session_service.load_session( + app_name="fuzzforge", + user_id=session_info.get('user_id'), + session_id=session_info.get('id') + ) + + if session and len(session.get_events()) > 0: + await self.add_session_to_memory(session) + ingested += 1 + + logger.info(f"Ingested {ingested} sessions into {self.memory_type} memory") + + except Exception as e: + logger.error(f"Failed to batch ingest sessions: {e}") + + return ingested + + def get_status(self) -> Dict[str, Any]: + """Get memory service status""" + return { + "type": self.memory_type, + "active": self.service is not None, + "vertex_available": VERTEX_AVAILABLE, + "details": { + "inmemory": "Non-persistent, keyword search", + "vertexai": "Persistent, semantic search with LLM extraction" + }.get(self.memory_type, "Unknown") + } + + +class HybridMemoryManager: + """ + Manages both ADK MemoryService (conversational) and Cognee (RAG/codebase) + Provides unified interface for both memory systems + """ + + def __init__(self, + memory_service: FuzzForgeMemoryService = None, + cognee_tools = None): + """ + Initialize with both memory systems + + Args: + memory_service: ADK-pattern memory for conversations + cognee_tools: Cognee MCP tools for RAG/codebase + """ + # ADK memory for conversations + self.memory_service = memory_service or FuzzForgeMemoryService() + + # Cognee for knowledge graphs and RAG (future) + self.cognee_tools = cognee_tools + + async def search_conversational_memory(self, query: str) -> SearchMemoryResponse: + """Search past conversations using ADK memory""" + return await self.memory_service.search_memory(query) + + async def search_knowledge_graph(self, query: str, search_type: str = "GRAPH_COMPLETION"): + """Search Cognee knowledge graph (for RAG/codebase in future)""" + if not self.cognee_tools: + return None + + try: + # Use Cognee's graph search + return await self.cognee_tools.search( + query=query, + search_type=search_type + ) + except Exception as e: + logger.debug(f"Cognee search failed: {e}") + return None + + async def store_in_graph(self, content: str): + """Store in Cognee knowledge graph (for codebase analysis later)""" + if not 
self.cognee_tools:
+            return None
+
+        try:
+            # Use cognify to create graph structures
+            return await self.cognee_tools.cognify(content)
+        except Exception as e:
+            logger.debug(f"Cognee store failed: {e}")
+            return None
+
+    def get_status(self) -> Dict[str, Any]:
+        """Get status of both memory systems"""
+        return {
+            "conversational_memory": self.memory_service.get_status(),
+            "knowledge_graph": {
+                "active": self.cognee_tools is not None,
+                "purpose": "RAG/codebase analysis (future)"
+            }
+        }
\ No newline at end of file
diff --git a/ai/src/fuzzforge_ai/remote_agent.py b/ai/src/fuzzforge_ai/remote_agent.py
new file mode 100644
index 0000000..52da844
--- /dev/null
+++ b/ai/src/fuzzforge_ai/remote_agent.py
@@ -0,0 +1,148 @@
+"""
+Remote Agent Connection Handler
+Handles A2A protocol communication with remote agents
+"""
+# Copyright (c) 2025 FuzzingLabs
+#
+# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
+# at the root of this repository for details.
+#
+# After the Change Date (four years from publication), this version of the
+# Licensed Work will be made available under the Apache License, Version 2.0.
+# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
+#
+# Additional attribution and requirements are provided in the NOTICE file.
+
+
+import httpx
+import uuid
+from typing import Dict, Any, Optional, List
+
+
+class RemoteAgentConnection:
+    """Handles A2A protocol communication with remote agents"""
+
+    def __init__(self, url: str):
+        """Initialize connection to a remote agent"""
+        self.url = url.rstrip('/')
+        self.agent_card = None
+        self.client = httpx.AsyncClient(timeout=120.0)
+        self.context_id = None
+
+    async def get_agent_card(self) -> Optional[Dict[str, Any]]:
+        """Get the agent card from the remote agent"""
+        try:
+            # Try the new path first (A2A 0.3.0+)
+            response = await self.client.get(f"{self.url}/.well-known/agent-card.json")
+            response.raise_for_status()
+            self.agent_card = response.json()
+            return self.agent_card
+        except Exception:
+            # Fall back to the legacy path for compatibility
+            try:
+                response = await self.client.get(f"{self.url}/.well-known/agent.json")
+                response.raise_for_status()
+                self.agent_card = response.json()
+                return self.agent_card
+            except Exception as e:
+                print(f"Failed to get agent card from {self.url}: {e}")
+                return None
+
+    async def send_message(self, message: str | Dict[str, Any] | List[Dict[str, Any]]) -> str:
+        """Send a message to the remote agent using the A2A protocol"""
+        try:
+            parts: List[Dict[str, Any]]
+            metadata: Dict[str, Any] | None = None
+            if isinstance(message, dict):
+                metadata = message.get("metadata") if isinstance(message.get("metadata"), dict) else None
+                raw_parts = message.get("parts", [])
+                if not raw_parts:
+                    text_value = message.get("text") or message.get("message")
+                    if isinstance(text_value, str):
+                        raw_parts = [{"type": "text", "text": text_value}]
+                parts = [raw_part for raw_part in raw_parts if isinstance(raw_part, dict)]
+            elif isinstance(message, list):
+                parts = [part for part in message if isinstance(part, dict)]
+                metadata = None
+            else:
+                parts = [{"type": "text", "text": message}]
+                metadata = None
+
+            if not parts:
+                parts = [{"type": "text", "text": ""}]
+
+            # Build a JSON-RPC request per the A2A spec
+            payload = {
+                "jsonrpc": "2.0",
+                "method": "message/send",
+                "params": {
+                    "message": {
+                        "messageId": str(uuid.uuid4()),
+                        "role": "user",
+                        "parts": parts,
+                    }
+                },
+                "id": 1
+            }
+
+            if metadata:
+                payload["params"]["message"]["metadata"] = metadata
+
+            # Include context if we have one
+            if 
self.context_id: + payload["params"]["contextId"] = self.context_id + + # Send to root endpoint per A2A protocol + response = await self.client.post(f"{self.url}/", json=payload) + response.raise_for_status() + result = response.json() + + # Extract response based on A2A JSON-RPC format + if isinstance(result, dict): + # Update context for continuity + if "result" in result and isinstance(result["result"], dict): + if "contextId" in result["result"]: + self.context_id = result["result"]["contextId"] + + # Extract text from artifacts + if "artifacts" in result["result"]: + texts = [] + for artifact in result["result"]["artifacts"]: + if isinstance(artifact, dict) and "parts" in artifact: + for part in artifact["parts"]: + if isinstance(part, dict) and "text" in part: + texts.append(part["text"]) + if texts: + return " ".join(texts) + + # Extract from message format + if "message" in result["result"]: + msg = result["result"]["message"] + if isinstance(msg, dict) and "parts" in msg: + texts = [] + for part in msg["parts"]: + if isinstance(part, dict) and "text" in part: + texts.append(part["text"]) + return " ".join(texts) if texts else str(msg) + return str(msg) + + return str(result["result"]) + + # Handle error response + elif "error" in result: + error = result["error"] + if isinstance(error, dict): + return f"Error: {error.get('message', str(error))}" + return f"Error: {error}" + + # Fallback + return result.get("response", result.get("message", str(result))) + + return str(result) + + except Exception as e: + return f"Error communicating with agent: {e}" + + async def close(self): + """Close the connection properly""" + await self.client.aclose() diff --git a/backend/Dockerfile b/backend/Dockerfile new file mode 100644 index 0000000..e72c50c --- /dev/null +++ b/backend/Dockerfile @@ -0,0 +1,41 @@ +FROM python:3.11-slim + +WORKDIR /app + +# Install system dependencies including Docker client and rsync +RUN apt-get update && apt-get install -y \ + curl \ + ca-certificates \ + gnupg \ + lsb-release \ + rsync \ + && curl -fsSL https://download.docker.com/linux/debian/gpg | gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg \ + && echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/debian $(lsb_release -cs) stable" | tee /etc/apt/sources.list.d/docker.list > /dev/null \ + && apt-get update \ + && apt-get install -y docker-ce-cli \ + && rm -rf /var/lib/apt/lists/* + +# Docker client configuration removed - localhost:5001 doesn't require insecure registry config + +# Install uv for faster package management +RUN pip install uv + +# Copy project files +COPY pyproject.toml ./ +COPY uv.lock ./ + +# Install dependencies +RUN uv sync --no-dev + +# Copy source code +COPY . . 
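+
+# Example local build/run from the backend/ directory (illustrative sketch;
+# the image tag is arbitrary, and the port matches the EXPOSE/CMD below):
+#   docker build -t fuzzforge-backend .
+#   docker run --rm -p 8000:8000 fuzzforge-backend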
+ +# Expose port +EXPOSE 8000 + +# Health check +HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \ + CMD curl -f http://localhost:8000/health || exit 1 + +# Start the application +CMD ["uv", "run", "uvicorn", "src.main:app", "--host", "0.0.0.0", "--port", "8000"] \ No newline at end of file diff --git a/backend/README.md b/backend/README.md new file mode 100644 index 0000000..a5cbcd4 Binary files /dev/null and b/backend/README.md differ diff --git a/backend/mcp-config.json b/backend/mcp-config.json new file mode 100644 index 0000000..1b6e783 --- /dev/null +++ b/backend/mcp-config.json @@ -0,0 +1,122 @@ +{ + "name": "FuzzForge Security Testing Platform", + "description": "MCP server for FuzzForge security testing workflows via Docker Compose", + "version": "0.6.0", + "connection": { + "type": "http", + "host": "localhost", + "port": 8010, + "base_url": "http://localhost:8010", + "mcp_endpoint": "/mcp" + }, + "docker_compose": { + "service": "fuzzforge-backend", + "command": "docker compose up -d", + "health_check": "http://localhost:8000/health" + }, + "capabilities": { + "tools": [ + { + "name": "submit_security_scan_mcp", + "description": "Submit a security scanning workflow for execution", + "parameters": { + "workflow_name": "string", + "target_path": "string", + "volume_mode": "string (ro|rw)", + "parameters": "object" + } + }, + { + "name": "get_comprehensive_scan_summary", + "description": "Get a comprehensive summary of scan results with analysis", + "parameters": { + "run_id": "string" + } + } + ], + "fastapi_routes": [ + { + "method": "GET", + "path": "/", + "description": "Get API status and loaded workflows count" + }, + { + "method": "GET", + "path": "/workflows/", + "description": "List all available security testing workflows" + }, + { + "method": "POST", + "path": "/workflows/{workflow_name}/submit", + "description": "Submit a security scanning workflow for execution" + }, + { + "method": "GET", + "path": "/runs/{run_id}/status", + "description": "Get the current status of a security scan run" + }, + { + "method": "GET", + "path": "/runs/{run_id}/findings", + "description": "Get security findings from a completed scan" + }, + { + "method": "GET", + "path": "/fuzzing/{run_id}/stats", + "description": "Get fuzzing statistics for a run" + } + ] + }, + "examples": { + "start_infrastructure_scan": { + "description": "Run infrastructure security scan on a project", + "steps": [ + "1. Start Docker Compose: docker compose up -d", + "2. Submit scan via MCP tool: submit_security_scan_mcp", + "3. 
Monitor status and get results" + ], + "workflow_name": "infrastructure_scan", + "target_path": "/Users/tduhamel/Documents/FuzzingLabs/fuzzforge_alpha/test_projects/infrastructure_vulnerable", + "parameters": { + "checkov_config": { + "severity": ["HIGH", "MEDIUM", "LOW"] + }, + "hadolint_config": { + "severity": ["error", "warning", "info", "style"] + } + } + }, + "static_analysis_scan": { + "description": "Run static analysis security scan", + "workflow_name": "static_analysis_scan", + "target_path": "/Users/tduhamel/Documents/FuzzingLabs/fuzzforge_alpha/test_projects/static_analysis_vulnerable", + "parameters": { + "bandit_config": { + "severity": ["HIGH", "MEDIUM", "LOW"] + }, + "opengrep_config": { + "severity": ["HIGH", "MEDIUM", "LOW"] + } + } + }, + "secret_detection_scan": { + "description": "Run secret detection scan", + "workflow_name": "secret_detection_scan", + "target_path": "/Users/tduhamel/Documents/FuzzingLabs/fuzzforge_alpha/test_projects/secret_detection_vulnerable", + "parameters": { + "trufflehog_config": { + "verified_only": false + }, + "gitleaks_config": { + "no_git": true + } + } + } + }, + "usage": { + "via_mcp": "Connect MCP client to http://localhost:8010/mcp after starting Docker Compose", + "via_api": "Use FastAPI endpoints directly at http://localhost:8000", + "start_system": "docker compose up -d", + "stop_system": "docker compose down" + } +} diff --git a/backend/pyproject.toml b/backend/pyproject.toml new file mode 100644 index 0000000..1f3e7b5 --- /dev/null +++ b/backend/pyproject.toml @@ -0,0 +1,25 @@ +[project] +name = "backend" +version = "0.6.0" +description = "FuzzForge OSS backend" +authors = [] +readme = "README.md" +requires-python = ">=3.11" +dependencies = [ + "fastapi>=0.116.1", + "prefect>=3.4.18", + "pydantic>=2.0.0", + "pyyaml>=6.0", + "docker>=7.0.0", + "aiofiles>=23.0.0", + "uvicorn>=0.30.0", + "aiohttp>=3.12.15", + "fastmcp", +] + +[project.optional-dependencies] +dev = [ + "pytest>=8.0.0", + "pytest-asyncio>=0.23.0", + "httpx>=0.27.0", +] diff --git a/backend/src/__init__.py b/backend/src/__init__.py new file mode 100644 index 0000000..43bcfe7 --- /dev/null +++ b/backend/src/__init__.py @@ -0,0 +1,11 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + diff --git a/backend/src/api/__init__.py b/backend/src/api/__init__.py new file mode 100644 index 0000000..43bcfe7 --- /dev/null +++ b/backend/src/api/__init__.py @@ -0,0 +1,11 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
+ diff --git a/backend/src/api/fuzzing.py b/backend/src/api/fuzzing.py new file mode 100644 index 0000000..df4ed86 --- /dev/null +++ b/backend/src/api/fuzzing.py @@ -0,0 +1,325 @@ +""" +API endpoints for fuzzing workflow management and real-time monitoring +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +import logging +from typing import List, Dict, Any +from fastapi import APIRouter, HTTPException, Depends, WebSocket, WebSocketDisconnect +from fastapi.responses import StreamingResponse +import asyncio +import json +from datetime import datetime + +from src.models.findings import ( + FuzzingStats, + CrashReport +) +from src.core.workflow_discovery import WorkflowDiscovery + +logger = logging.getLogger(__name__) + +router = APIRouter(prefix="/fuzzing", tags=["fuzzing"]) + +# In-memory storage for real-time stats (in production, use Redis or similar) +fuzzing_stats: Dict[str, FuzzingStats] = {} +crash_reports: Dict[str, List[CrashReport]] = {} +active_connections: Dict[str, List[WebSocket]] = {} + + +def initialize_fuzzing_tracking(run_id: str, workflow_name: str): + """ + Initialize fuzzing tracking for a new run. + + This function should be called when a workflow is submitted to enable + real-time monitoring and stats collection. + + Args: + run_id: The run identifier + workflow_name: Name of the workflow + """ + fuzzing_stats[run_id] = FuzzingStats( + run_id=run_id, + workflow=workflow_name + ) + crash_reports[run_id] = [] + active_connections[run_id] = [] + + +@router.get("/{run_id}/stats", response_model=FuzzingStats) +async def get_fuzzing_stats(run_id: str) -> FuzzingStats: + """ + Get current fuzzing statistics for a run. + + Args: + run_id: The fuzzing run ID + + Returns: + Current fuzzing statistics + + Raises: + HTTPException: 404 if run not found + """ + if run_id not in fuzzing_stats: + raise HTTPException( + status_code=404, + detail=f"Fuzzing run not found: {run_id}" + ) + + return fuzzing_stats[run_id] + + +@router.get("/{run_id}/crashes", response_model=List[CrashReport]) +async def get_crash_reports(run_id: str) -> List[CrashReport]: + """ + Get crash reports for a fuzzing run. + + Args: + run_id: The fuzzing run ID + + Returns: + List of crash reports + + Raises: + HTTPException: 404 if run not found + """ + if run_id not in crash_reports: + raise HTTPException( + status_code=404, + detail=f"Fuzzing run not found: {run_id}" + ) + + return crash_reports[run_id] + + +@router.post("/{run_id}/stats") +async def update_fuzzing_stats(run_id: str, stats: FuzzingStats): + """ + Update fuzzing statistics (called by fuzzing workflows). 
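+    The posted payload replaces the stored in-memory snapshot for the run and is
+    broadcast to any WebSocket clients connected to the /{run_id}/live endpoint.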
+ + Args: + run_id: The fuzzing run ID + stats: Updated statistics + + Raises: + HTTPException: 404 if run not found + """ + if run_id not in fuzzing_stats: + raise HTTPException( + status_code=404, + detail=f"Fuzzing run not found: {run_id}" + ) + + # Update stats + fuzzing_stats[run_id] = stats + + # Debug: log reception for live instrumentation + try: + logger.info( + "Received fuzzing stats update: run_id=%s exec=%s eps=%.2f crashes=%s corpus=%s elapsed=%ss", + run_id, + stats.executions, + stats.executions_per_sec, + stats.crashes, + stats.corpus_size, + stats.elapsed_time, + ) + except Exception: + pass + + # Notify connected WebSocket clients + if run_id in active_connections: + message = { + "type": "stats_update", + "data": stats.model_dump() + } + for websocket in active_connections[run_id][:]: # Copy to avoid modification during iteration + try: + await websocket.send_text(json.dumps(message)) + except Exception: + # Remove disconnected clients + active_connections[run_id].remove(websocket) + + +@router.post("/{run_id}/crash") +async def report_crash(run_id: str, crash: CrashReport): + """ + Report a new crash (called by fuzzing workflows). + + Args: + run_id: The fuzzing run ID + crash: Crash report details + """ + if run_id not in crash_reports: + crash_reports[run_id] = [] + + # Add crash report + crash_reports[run_id].append(crash) + + # Update stats + if run_id in fuzzing_stats: + fuzzing_stats[run_id].crashes += 1 + fuzzing_stats[run_id].last_crash_time = crash.timestamp + + # Notify connected WebSocket clients + if run_id in active_connections: + message = { + "type": "crash_report", + "data": crash.model_dump() + } + for websocket in active_connections[run_id][:]: + try: + await websocket.send_text(json.dumps(message)) + except Exception: + active_connections[run_id].remove(websocket) + + +@router.websocket("/{run_id}/live") +async def websocket_endpoint(websocket: WebSocket, run_id: str): + """ + WebSocket endpoint for real-time fuzzing updates. 
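+    Emits JSON messages of type "stats_update", "crash_report", and "heartbeat";
+    a client text frame of "ping" is answered with "pong".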
+ + Args: + websocket: WebSocket connection + run_id: The fuzzing run ID to monitor + """ + await websocket.accept() + + # Initialize connection tracking + if run_id not in active_connections: + active_connections[run_id] = [] + active_connections[run_id].append(websocket) + + try: + # Send current stats on connection + if run_id in fuzzing_stats: + current = fuzzing_stats[run_id] + if isinstance(current, dict): + payload = current + elif hasattr(current, "model_dump"): + payload = current.model_dump() + elif hasattr(current, "dict"): + payload = current.dict() + else: + payload = getattr(current, "__dict__", {"run_id": run_id}) + message = {"type": "stats_update", "data": payload} + await websocket.send_text(json.dumps(message)) + + # Keep connection alive + while True: + try: + # Wait for ping or handle disconnect + data = await asyncio.wait_for(websocket.receive_text(), timeout=30.0) + # Echo back for ping-pong + if data == "ping": + await websocket.send_text("pong") + except asyncio.TimeoutError: + # Send periodic heartbeat + await websocket.send_text(json.dumps({"type": "heartbeat"})) + + except WebSocketDisconnect: + # Clean up connection + if run_id in active_connections and websocket in active_connections[run_id]: + active_connections[run_id].remove(websocket) + except Exception as e: + logger.error(f"WebSocket error for run {run_id}: {e}") + if run_id in active_connections and websocket in active_connections[run_id]: + active_connections[run_id].remove(websocket) + + +@router.get("/{run_id}/stream") +async def stream_fuzzing_updates(run_id: str): + """ + Server-Sent Events endpoint for real-time fuzzing updates. + + Args: + run_id: The fuzzing run ID to monitor + + Returns: + Streaming response with real-time updates + """ + if run_id not in fuzzing_stats: + raise HTTPException( + status_code=404, + detail=f"Fuzzing run not found: {run_id}" + ) + + async def event_stream(): + """Generate server-sent events for fuzzing updates""" + last_stats_time = datetime.utcnow() + + while True: + try: + # Send current stats + if run_id in fuzzing_stats: + current_stats = fuzzing_stats[run_id] + if isinstance(current_stats, dict): + stats_payload = current_stats + elif hasattr(current_stats, "model_dump"): + stats_payload = current_stats.model_dump() + elif hasattr(current_stats, "dict"): + stats_payload = current_stats.dict() + else: + stats_payload = getattr(current_stats, "__dict__", {"run_id": run_id}) + event_data = f"data: {json.dumps({'type': 'stats', 'data': stats_payload})}\n\n" + yield event_data + + # Send recent crashes + if run_id in crash_reports: + recent_crashes = [ + crash for crash in crash_reports[run_id] + if crash.timestamp > last_stats_time + ] + for crash in recent_crashes: + event_data = f"data: {json.dumps({'type': 'crash', 'data': crash.model_dump()})}\n\n" + yield event_data + + last_stats_time = datetime.utcnow() + await asyncio.sleep(5) # Update every 5 seconds + + except Exception as e: + logger.error(f"Error in event stream for run {run_id}: {e}") + break + + return StreamingResponse( + event_stream(), + media_type="text/event-stream", + headers={ + "Cache-Control": "no-cache", + "Connection": "keep-alive", + } + ) + + +@router.delete("/{run_id}") +async def cleanup_fuzzing_run(run_id: str): + """ + Clean up fuzzing run data. 
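+    Removes the in-memory stats and crash reports for the run and closes any
+    WebSocket connections still attached to it.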
+ + Args: + run_id: The fuzzing run ID to clean up + """ + # Clean up tracking data + fuzzing_stats.pop(run_id, None) + crash_reports.pop(run_id, None) + + # Close any active WebSocket connections + if run_id in active_connections: + for websocket in active_connections[run_id]: + try: + await websocket.close() + except Exception: + pass + del active_connections[run_id] + + return {"message": f"Cleaned up fuzzing run {run_id}"} diff --git a/backend/src/api/runs.py b/backend/src/api/runs.py new file mode 100644 index 0000000..db63683 --- /dev/null +++ b/backend/src/api/runs.py @@ -0,0 +1,184 @@ +""" +API endpoints for workflow run management and findings retrieval +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +import logging +from typing import Dict, Any +from fastapi import APIRouter, HTTPException, Depends + +from src.models.findings import WorkflowFindings, WorkflowStatus + +logger = logging.getLogger(__name__) + +router = APIRouter(prefix="/runs", tags=["runs"]) + + +def get_prefect_manager(): + """Dependency to get the Prefect manager instance""" + from src.main import prefect_mgr + return prefect_mgr + + +@router.get("/{run_id}/status", response_model=WorkflowStatus) +async def get_run_status( + run_id: str, + prefect_mgr=Depends(get_prefect_manager) +) -> WorkflowStatus: + """ + Get the current status of a workflow run. + + Args: + run_id: The flow run ID + + Returns: + Status information including state, timestamps, and completion flags + + Raises: + HTTPException: 404 if run not found + """ + try: + status = await prefect_mgr.get_flow_run_status(run_id) + + # Find workflow name from deployment + workflow_name = "unknown" + workflow_deployment_id = status.get("workflow", "") + for name, deployment_id in prefect_mgr.deployments.items(): + if str(deployment_id) == str(workflow_deployment_id): + workflow_name = name + break + + return WorkflowStatus( + run_id=status["run_id"], + workflow=workflow_name, + status=status["status"], + is_completed=status["is_completed"], + is_failed=status["is_failed"], + is_running=status["is_running"], + created_at=status["created_at"], + updated_at=status["updated_at"] + ) + + except Exception as e: + logger.error(f"Failed to get status for run {run_id}: {e}") + raise HTTPException( + status_code=404, + detail=f"Run not found: {run_id}" + ) + + +@router.get("/{run_id}/findings", response_model=WorkflowFindings) +async def get_run_findings( + run_id: str, + prefect_mgr=Depends(get_prefect_manager) +) -> WorkflowFindings: + """ + Get the findings from a completed workflow run. + + Args: + run_id: The flow run ID + + Returns: + SARIF-formatted findings from the workflow execution + + Raises: + HTTPException: 404 if run not found, 400 if run not completed + """ + try: + # Get run status first + status = await prefect_mgr.get_flow_run_status(run_id) + + if not status["is_completed"]: + if status["is_running"]: + raise HTTPException( + status_code=400, + detail=f"Run {run_id} is still running. 
Current status: {status['status']}" + ) + elif status["is_failed"]: + raise HTTPException( + status_code=400, + detail=f"Run {run_id} failed. Status: {status['status']}" + ) + else: + raise HTTPException( + status_code=400, + detail=f"Run {run_id} not completed. Status: {status['status']}" + ) + + # Get the findings + findings = await prefect_mgr.get_flow_run_findings(run_id) + + # Find workflow name + workflow_name = "unknown" + workflow_deployment_id = status.get("workflow", "") + for name, deployment_id in prefect_mgr.deployments.items(): + if str(deployment_id) == str(workflow_deployment_id): + workflow_name = name + break + + # Get workflow version if available + metadata = { + "completion_time": status["updated_at"], + "workflow_version": "unknown" + } + + if workflow_name in prefect_mgr.workflows: + workflow_info = prefect_mgr.workflows[workflow_name] + metadata["workflow_version"] = workflow_info.metadata.get("version", "unknown") + + return WorkflowFindings( + workflow=workflow_name, + run_id=run_id, + sarif=findings, + metadata=metadata + ) + + except HTTPException: + raise + except Exception as e: + logger.error(f"Failed to get findings for run {run_id}: {e}") + raise HTTPException( + status_code=500, + detail=f"Failed to retrieve findings: {str(e)}" + ) + + +@router.get("/{workflow_name}/findings/{run_id}", response_model=WorkflowFindings) +async def get_workflow_findings( + workflow_name: str, + run_id: str, + prefect_mgr=Depends(get_prefect_manager) +) -> WorkflowFindings: + """ + Get findings for a specific workflow run. + + Alternative endpoint that includes workflow name in the path for clarity. + + Args: + workflow_name: Name of the workflow + run_id: The flow run ID + + Returns: + SARIF-formatted findings from the workflow execution + + Raises: + HTTPException: 404 if workflow or run not found, 400 if run not completed + """ + if workflow_name not in prefect_mgr.workflows: + raise HTTPException( + status_code=404, + detail=f"Workflow not found: {workflow_name}" + ) + + # Delegate to the main findings endpoint + return await get_run_findings(run_id, prefect_mgr) \ No newline at end of file diff --git a/backend/src/api/workflows.py b/backend/src/api/workflows.py new file mode 100644 index 0000000..dcd504a --- /dev/null +++ b/backend/src/api/workflows.py @@ -0,0 +1,386 @@ +""" +API endpoints for workflow management with enhanced error handling +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
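+
+# Example request flow against a local deployment (sketch; the host/port follow
+# the docker-compose defaults documented in backend/mcp-config.json, and the
+# target path is illustrative):
+#
+#   curl http://localhost:8000/workflows/
+#   curl -X POST http://localhost:8000/workflows/static_analysis_scan/submit \
+#        -H "Content-Type: application/json" \
+#        -d '{"target_path": "/path/to/project", "volume_mode": "ro"}'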
+ +import logging +import traceback +from typing import List, Dict, Any, Optional +from fastapi import APIRouter, HTTPException, Depends +from pathlib import Path + +from src.models.findings import ( + WorkflowSubmission, + WorkflowMetadata, + WorkflowListItem, + RunSubmissionResponse +) +from src.core.workflow_discovery import WorkflowDiscovery + +logger = logging.getLogger(__name__) + +router = APIRouter(prefix="/workflows", tags=["workflows"]) + + +def create_structured_error_response( + error_type: str, + message: str, + workflow_name: Optional[str] = None, + run_id: Optional[str] = None, + container_info: Optional[Dict[str, Any]] = None, + deployment_info: Optional[Dict[str, Any]] = None, + suggestions: Optional[List[str]] = None +) -> Dict[str, Any]: + """Create a structured error response with rich context.""" + error_response = { + "error": { + "type": error_type, + "message": message, + "timestamp": __import__("datetime").datetime.utcnow().isoformat() + "Z" + } + } + + if workflow_name: + error_response["error"]["workflow_name"] = workflow_name + + if run_id: + error_response["error"]["run_id"] = run_id + + if container_info: + error_response["error"]["container"] = container_info + + if deployment_info: + error_response["error"]["deployment"] = deployment_info + + if suggestions: + error_response["error"]["suggestions"] = suggestions + + return error_response + + +def get_prefect_manager(): + """Dependency to get the Prefect manager instance""" + from src.main import prefect_mgr + return prefect_mgr + + +@router.get("/", response_model=List[WorkflowListItem]) +async def list_workflows( + prefect_mgr=Depends(get_prefect_manager) +) -> List[WorkflowListItem]: + """ + List all discovered workflows with their metadata. + + Returns a summary of each workflow including name, version, description, + author, and tags. + """ + workflows = [] + for name, info in prefect_mgr.workflows.items(): + workflows.append(WorkflowListItem( + name=name, + version=info.metadata.get("version", "0.6.0"), + description=info.metadata.get("description", ""), + author=info.metadata.get("author"), + tags=info.metadata.get("tags", []) + )) + + return workflows + + +@router.get("/metadata/schema") +async def get_metadata_schema() -> Dict[str, Any]: + """ + Get the JSON schema for workflow metadata files. + + This schema defines the structure and requirements for metadata.yaml files + that must accompany each workflow. + """ + return WorkflowDiscovery.get_metadata_schema() + + +@router.get("/{workflow_name}/metadata", response_model=WorkflowMetadata) +async def get_workflow_metadata( + workflow_name: str, + prefect_mgr=Depends(get_prefect_manager) +) -> WorkflowMetadata: + """ + Get complete metadata for a specific workflow. + + Args: + workflow_name: Name of the workflow + + Returns: + Complete metadata including parameters schema, supported volume modes, + required modules, and more. 
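+
+    Example request (the workflow name here is illustrative):
+        GET /workflows/static_analysis/metadata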
+ + Raises: + HTTPException: 404 if workflow not found + """ + if workflow_name not in prefect_mgr.workflows: + available_workflows = list(prefect_mgr.workflows.keys()) + error_response = create_structured_error_response( + error_type="WorkflowNotFound", + message=f"Workflow '{workflow_name}' not found", + workflow_name=workflow_name, + suggestions=[ + f"Available workflows: {', '.join(available_workflows)}", + "Use GET /workflows/ to see all available workflows", + "Check workflow name spelling and case sensitivity" + ] + ) + raise HTTPException( + status_code=404, + detail=error_response + ) + + info = prefect_mgr.workflows[workflow_name] + metadata = info.metadata + + return WorkflowMetadata( + name=workflow_name, + version=metadata.get("version", "0.6.0"), + description=metadata.get("description", ""), + author=metadata.get("author"), + tags=metadata.get("tags", []), + parameters=metadata.get("parameters", {}), + default_parameters=metadata.get("default_parameters", {}), + required_modules=metadata.get("required_modules", []), + supported_volume_modes=metadata.get("supported_volume_modes", ["ro", "rw"]), + has_custom_docker=info.has_docker + ) + + +@router.post("/{workflow_name}/submit", response_model=RunSubmissionResponse) +async def submit_workflow( + workflow_name: str, + submission: WorkflowSubmission, + prefect_mgr=Depends(get_prefect_manager) +) -> RunSubmissionResponse: + """ + Submit a workflow for execution with volume mounting. + + Args: + workflow_name: Name of the workflow to execute + submission: Submission parameters including target path and volume mode + + Returns: + Run submission response with run_id and initial status + + Raises: + HTTPException: 404 if workflow not found, 400 for invalid parameters + """ + if workflow_name not in prefect_mgr.workflows: + available_workflows = list(prefect_mgr.workflows.keys()) + error_response = create_structured_error_response( + error_type="WorkflowNotFound", + message=f"Workflow '{workflow_name}' not found", + workflow_name=workflow_name, + suggestions=[ + f"Available workflows: {', '.join(available_workflows)}", + "Use GET /workflows/ to see all available workflows", + "Check workflow name spelling and case sensitivity" + ] + ) + raise HTTPException( + status_code=404, + detail=error_response + ) + + try: + # Convert ResourceLimits to dict if provided + resource_limits_dict = None + if submission.resource_limits: + resource_limits_dict = { + "cpu_limit": submission.resource_limits.cpu_limit, + "memory_limit": submission.resource_limits.memory_limit, + "cpu_request": submission.resource_limits.cpu_request, + "memory_request": submission.resource_limits.memory_request + } + + # Submit the workflow with enhanced parameters + flow_run = await prefect_mgr.submit_workflow( + workflow_name=workflow_name, + target_path=submission.target_path, + volume_mode=submission.volume_mode, + parameters=submission.parameters, + resource_limits=resource_limits_dict, + additional_volumes=submission.additional_volumes, + timeout=submission.timeout + ) + + run_id = str(flow_run.id) + + # Initialize fuzzing tracking if this looks like a fuzzing workflow + workflow_info = prefect_mgr.workflows.get(workflow_name, {}) + workflow_tags = workflow_info.metadata.get("tags", []) if hasattr(workflow_info, 'metadata') else [] + if "fuzzing" in workflow_tags or "fuzz" in workflow_name.lower(): + from src.api.fuzzing import initialize_fuzzing_tracking + initialize_fuzzing_tracking(run_id, workflow_name) + + return RunSubmissionResponse( + run_id=run_id, + 
status=flow_run.state.name if flow_run.state else "PENDING", + workflow=workflow_name, + message=f"Workflow '{workflow_name}' submitted successfully" + ) + + except ValueError as e: + # Parameter validation errors + error_response = create_structured_error_response( + error_type="ValidationError", + message=str(e), + workflow_name=workflow_name, + suggestions=[ + "Check parameter types and values", + "Use GET /workflows/{workflow_name}/parameters for schema", + "Ensure all required parameters are provided" + ] + ) + raise HTTPException(status_code=400, detail=error_response) + + except Exception as e: + logger.error(f"Failed to submit workflow '{workflow_name}': {e}") + logger.error(f"Traceback: {traceback.format_exc()}") + + # Try to get more context about the error + container_info = None + deployment_info = None + suggestions = [] + + error_message = str(e) + error_type = "WorkflowSubmissionError" + + # Detect specific error patterns + if "deployment" in error_message.lower(): + error_type = "DeploymentError" + deployment_info = { + "status": "failed", + "error": error_message + } + suggestions.extend([ + "Check if Prefect server is running and accessible", + "Verify Docker is running and has sufficient resources", + "Check container image availability", + "Ensure volume paths exist and are accessible" + ]) + + elif "volume" in error_message.lower() or "mount" in error_message.lower(): + error_type = "VolumeError" + suggestions.extend([ + "Check if the target path exists and is accessible", + "Verify file permissions (Docker needs read access)", + "Ensure the path is not in use by another process", + "Try using an absolute path instead of relative path" + ]) + + elif "memory" in error_message.lower() or "resource" in error_message.lower(): + error_type = "ResourceError" + suggestions.extend([ + "Check system memory and CPU availability", + "Consider reducing resource limits or dataset size", + "Monitor Docker resource usage", + "Increase Docker memory limits if needed" + ]) + + elif "image" in error_message.lower(): + error_type = "ImageError" + suggestions.extend([ + "Check if the workflow image exists", + "Verify Docker registry access", + "Try rebuilding the workflow image", + "Check network connectivity to registries" + ]) + + else: + suggestions.extend([ + "Check FuzzForge backend logs for details", + "Verify all services are running (docker-compose up -d)", + "Try restarting the workflow deployment", + "Contact support if the issue persists" + ]) + + error_response = create_structured_error_response( + error_type=error_type, + message=f"Failed to submit workflow: {error_message}", + workflow_name=workflow_name, + container_info=container_info, + deployment_info=deployment_info, + suggestions=suggestions + ) + + raise HTTPException( + status_code=500, + detail=error_response + ) + + +@router.get("/{workflow_name}/parameters") +async def get_workflow_parameters( + workflow_name: str, + prefect_mgr=Depends(get_prefect_manager) +) -> Dict[str, Any]: + """ + Get the parameters schema for a workflow. 
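+
+    Example response (workflow and parameter names are illustrative):
+
+        {
+            "workflow": "static_analysis",
+            "parameters": {"depth": {"type": "integer", "default": 3}},
+            "default_parameters": {"depth": 3},
+            "required_parameters": []
+        }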
+ + Args: + workflow_name: Name of the workflow + + Returns: + Parameters schema with types, descriptions, and defaults + + Raises: + HTTPException: 404 if workflow not found + """ + if workflow_name not in prefect_mgr.workflows: + available_workflows = list(prefect_mgr.workflows.keys()) + error_response = create_structured_error_response( + error_type="WorkflowNotFound", + message=f"Workflow '{workflow_name}' not found", + workflow_name=workflow_name, + suggestions=[ + f"Available workflows: {', '.join(available_workflows)}", + "Use GET /workflows/ to see all available workflows" + ] + ) + raise HTTPException( + status_code=404, + detail=error_response + ) + + info = prefect_mgr.workflows[workflow_name] + metadata = info.metadata + + # Return parameters with enhanced schema information + parameters_schema = metadata.get("parameters", {}) + + # Extract the actual parameter definitions from JSON schema structure + if "properties" in parameters_schema: + param_definitions = parameters_schema["properties"] + else: + param_definitions = parameters_schema + + # Add default values to the schema + default_params = metadata.get("default_parameters", {}) + for param_name, param_schema in param_definitions.items(): + if isinstance(param_schema, dict) and param_name in default_params: + param_schema["default"] = default_params[param_name] + + return { + "workflow": workflow_name, + "parameters": param_definitions, + "default_parameters": default_params, + "required_parameters": [ + name for name, schema in param_definitions.items() + if isinstance(schema, dict) and schema.get("required", False) + ] + } \ No newline at end of file diff --git a/backend/src/core/__init__.py b/backend/src/core/__init__.py new file mode 100644 index 0000000..43bcfe7 --- /dev/null +++ b/backend/src/core/__init__.py @@ -0,0 +1,11 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + diff --git a/backend/src/core/prefect_manager.py b/backend/src/core/prefect_manager.py new file mode 100644 index 0000000..c26fe8b --- /dev/null +++ b/backend/src/core/prefect_manager.py @@ -0,0 +1,770 @@ +""" +Prefect Manager - Core orchestration for workflow deployment and execution +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
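+
+# Registry endpoints are resolved by get_registry_url() below and can be
+# overridden from docker-compose; the values shown here are illustrative:
+#
+#     FUZZFORGE_REGISTRY_PUSH_URL=localhost:5001   # image builds/pushes
+#     FUZZFORGE_REGISTRY_PULL_URL=registry:5000    # worker image pulls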
+ +import logging +import os +import platform +import re +from pathlib import Path +from typing import Dict, Optional, Any +from prefect import get_client +from prefect.docker import DockerImage +from prefect.client.schemas import FlowRun + +from src.core.workflow_discovery import WorkflowDiscovery, WorkflowInfo + +logger = logging.getLogger(__name__) + + +def get_registry_url(context: str = "default") -> str: + """ + Get the container registry URL to use for a given operation context. + + Goals: + - Work reliably across Linux and macOS Docker Desktop + - Prefer in-network service discovery when running inside containers + - Allow full override via env vars from docker-compose + + Env overrides: + - FUZZFORGE_REGISTRY_PUSH_URL: used for image builds/pushes + - FUZZFORGE_REGISTRY_PULL_URL: used for workers to pull images + """ + # Normalize context + ctx = (context or "default").lower() + + # Always honor explicit overrides first + if ctx in ("push", "build"): + push_url = os.getenv("FUZZFORGE_REGISTRY_PUSH_URL") + if push_url: + logger.debug("Using FUZZFORGE_REGISTRY_PUSH_URL: %s", push_url) + return push_url + # Default to host-published registry for Docker daemon operations + return "localhost:5001" + + if ctx == "pull": + pull_url = os.getenv("FUZZFORGE_REGISTRY_PULL_URL") + if pull_url: + logger.debug("Using FUZZFORGE_REGISTRY_PULL_URL: %s", pull_url) + return pull_url + # Prefect worker pulls via host Docker daemon as well + return "localhost:5001" + + # Default/fallback + return os.getenv("FUZZFORGE_REGISTRY_PULL_URL", os.getenv("FUZZFORGE_REGISTRY_PUSH_URL", "localhost:5001")) + + +def _compose_project_name(default: str = "fuzzforge_alpha") -> str: + """Return the docker-compose project name used for network/volume naming. + + Honors COMPOSE_PROJECT_NAME if present; falls back to a sensible default. + """ + return os.getenv("COMPOSE_PROJECT_NAME", default) + + +class PrefectManager: + """ + Manages Prefect deployments and flow runs for discovered workflows. + + This class handles: + - Workflow discovery and registration + - Docker image building through Prefect + - Deployment creation and management + - Flow run submission with volume mounting + - Findings retrieval from completed runs + """ + + def __init__(self, workflows_dir: Path = None): + """ + Initialize the Prefect manager. + + Args: + workflows_dir: Path to the workflows directory (default: toolbox/workflows) + """ + if workflows_dir is None: + workflows_dir = Path("toolbox/workflows") + + self.discovery = WorkflowDiscovery(workflows_dir) + self.workflows: Dict[str, WorkflowInfo] = {} + self.deployments: Dict[str, str] = {} # workflow_name -> deployment_id + + # Security: Define allowed and forbidden paths for host mounting + self.allowed_base_paths = [ + "/tmp", + "/home", + "/Users", # macOS users + "/opt", + "/var/tmp", + "/workspace", # Common container workspace + "/app" # Container application directory (for test projects) + ] + + self.forbidden_paths = [ + "/etc", + "/root", + "/var/run", + "/sys", + "/proc", + "/dev", + "/boot", + "/var/lib/docker", # Critical Docker data + "/var/log", # System logs + "/usr/bin", # System binaries + "/usr/sbin", + "/sbin", + "/bin" + ] + + @staticmethod + def _parse_memory_to_bytes(memory_str: str) -> int: + """ + Parse memory string (like '512Mi', '1Gi') to bytes. 
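+
+        Example:
+            >>> PrefectManager._parse_memory_to_bytes("512Mi")
+            536870912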
+ + Args: + memory_str: Memory string with unit suffix + + Returns: + Memory in bytes + + Raises: + ValueError: If format is invalid + """ + if not memory_str: + return 0 + + match = re.match(r'^(\d+(?:\.\d+)?)\s*([GMK]i?)$', memory_str.strip()) + if not match: + raise ValueError(f"Invalid memory format: {memory_str}. Expected format like '512Mi', '1Gi'") + + value, unit = match.groups() + value = float(value) + + # Convert to bytes based on unit (binary units: Ki, Mi, Gi) + if unit in ['K', 'Ki']: + multiplier = 1024 + elif unit in ['M', 'Mi']: + multiplier = 1024 * 1024 + elif unit in ['G', 'Gi']: + multiplier = 1024 * 1024 * 1024 + else: + raise ValueError(f"Unsupported memory unit: {unit}") + + return int(value * multiplier) + + @staticmethod + def _parse_cpu_to_millicores(cpu_str: str) -> int: + """ + Parse CPU string (like '500m', '1', '2.5') to millicores. + + Args: + cpu_str: CPU string + + Returns: + CPU in millicores (1 core = 1000 millicores) + + Raises: + ValueError: If format is invalid + """ + if not cpu_str: + return 0 + + cpu_str = cpu_str.strip() + + # Handle millicores format (e.g., '500m') + if cpu_str.endswith('m'): + try: + return int(cpu_str[:-1]) + except ValueError: + raise ValueError(f"Invalid CPU format: {cpu_str}") + + # Handle core format (e.g., '1', '2.5') + try: + cores = float(cpu_str) + return int(cores * 1000) # Convert to millicores + except ValueError: + raise ValueError(f"Invalid CPU format: {cpu_str}") + + def _extract_resource_requirements(self, workflow_info: WorkflowInfo) -> Dict[str, str]: + """ + Extract resource requirements from workflow metadata. + + Args: + workflow_info: Workflow information with metadata + + Returns: + Dictionary with resource requirements in Docker format + """ + metadata = workflow_info.metadata + requirements = metadata.get("requirements", {}) + resources = requirements.get("resources", {}) + + resource_config = {} + + # Extract memory requirement + memory = resources.get("memory") + if memory: + try: + # Validate memory format and store original string for Docker + self._parse_memory_to_bytes(memory) + resource_config["memory"] = memory + except ValueError as e: + logger.warning(f"Invalid memory requirement in {workflow_info.name}: {e}") + + # Extract CPU requirement + cpu = resources.get("cpu") + if cpu: + try: + # Validate CPU format and store original string for Docker + self._parse_cpu_to_millicores(cpu) + resource_config["cpus"] = cpu + except ValueError as e: + logger.warning(f"Invalid CPU requirement in {workflow_info.name}: {e}") + + # Extract timeout + timeout = resources.get("timeout") + if timeout and isinstance(timeout, int): + resource_config["timeout"] = str(timeout) + + return resource_config + + async def initialize(self): + """ + Initialize the manager by discovering and deploying all workflows. + + This method: + 1. Discovers all valid workflows in the workflows directory + 2. Validates their metadata + 3. 
Deploys each workflow to Prefect with Docker images + """ + try: + # Discover workflows + self.workflows = await self.discovery.discover_workflows() + + if not self.workflows: + logger.warning("No workflows discovered") + return + + logger.info(f"Discovered {len(self.workflows)} workflows: {list(self.workflows.keys())}") + + # Deploy each workflow + for name, info in self.workflows.items(): + try: + await self._deploy_workflow(name, info) + except Exception as e: + logger.error(f"Failed to deploy workflow '{name}': {e}") + + except Exception as e: + logger.error(f"Failed to initialize Prefect manager: {e}") + raise + + async def _deploy_workflow(self, name: str, info: WorkflowInfo): + """ + Deploy a single workflow to Prefect with Docker image. + + Args: + name: Workflow name + info: Workflow information including metadata and paths + """ + logger.info(f"Deploying workflow '{name}'...") + + # Get the flow function from registry + flow_func = self.discovery.get_flow_function(name) + if not flow_func: + logger.error( + f"Failed to get flow function for '{name}' from registry. " + f"Ensure the workflow is properly registered in toolbox/workflows/registry.py" + ) + return + + # Use the mandatory Dockerfile with absolute paths for Docker Compose + # Get absolute paths for build context and dockerfile + toolbox_path = info.path.parent.parent.resolve() + dockerfile_abs_path = info.dockerfile.resolve() + + # Calculate relative dockerfile path from toolbox context + try: + dockerfile_rel_path = dockerfile_abs_path.relative_to(toolbox_path) + except ValueError: + # If relative path fails, use the workflow-specific path + dockerfile_rel_path = Path("workflows") / name / "Dockerfile" + + # Determine deployment strategy based on Dockerfile presence + base_image = "prefecthq/prefect:3-python3.11" + has_custom_dockerfile = info.has_docker and info.dockerfile.exists() + + logger.info(f"=== DEPLOYMENT DEBUG for '{name}' ===") + logger.info(f"info.has_docker: {info.has_docker}") + logger.info(f"info.dockerfile: {info.dockerfile}") + logger.info(f"info.dockerfile.exists(): {info.dockerfile.exists()}") + logger.info(f"has_custom_dockerfile: {has_custom_dockerfile}") + logger.info(f"toolbox_path: {toolbox_path}") + logger.info(f"dockerfile_rel_path: {dockerfile_rel_path}") + + if has_custom_dockerfile: + logger.info(f"Workflow '{name}' has custom Dockerfile - building custom image") + # Decide whether to use registry or keep images local to host engine + import os + # Default to using the local registry; set FUZZFORGE_USE_REGISTRY=false to bypass (not recommended) + use_registry = os.getenv("FUZZFORGE_USE_REGISTRY", "true").lower() == "true" + + if use_registry: + registry_url = get_registry_url(context="push") + image_spec = DockerImage( + name=f"{registry_url}/fuzzforge/{name}", + tag="latest", + dockerfile=str(dockerfile_rel_path), + context=str(toolbox_path) + ) + deploy_image = f"{registry_url}/fuzzforge/{name}:latest" + build_custom = True + push_custom = True + logger.info(f"Using registry: {registry_url} for '{name}'") + else: + # Single-host mode: build into host engine cache; no push required + image_spec = DockerImage( + name=f"fuzzforge/{name}", + tag="latest", + dockerfile=str(dockerfile_rel_path), + context=str(toolbox_path) + ) + deploy_image = f"fuzzforge/{name}:latest" + build_custom = True + push_custom = False + logger.info("Using single-host image (no registry push): %s", deploy_image) + else: + logger.info(f"Workflow '{name}' using base image - no custom dependencies needed") + 
deploy_image = base_image + build_custom = False + push_custom = False + + # Pre-validate registry connectivity when pushing + if push_custom: + try: + from .setup import validate_registry_connectivity + await validate_registry_connectivity(registry_url) + logger.info(f"Registry connectivity validated for {registry_url}") + except Exception as e: + logger.error(f"Registry connectivity validation failed for {registry_url}: {e}") + raise RuntimeError(f"Cannot deploy workflow '{name}': Registry {registry_url} is not accessible. {e}") + + # Deploy the workflow + try: + # Ensure any previous deployment is removed so job variables are updated + try: + async with get_client() as client: + existing = await client.read_deployment_by_name( + f"{name}/{name}-deployment" + ) + if existing: + logger.info(f"Removing existing deployment for '{name}' to refresh settings...") + await client.delete_deployment(existing.id) + except Exception: + # If not found or deletion fails, continue with deployment + pass + + # Extract resource requirements from metadata + workflow_resource_requirements = self._extract_resource_requirements(info) + logger.info(f"Workflow '{name}' resource requirements: {workflow_resource_requirements}") + + # Build job variables with resource requirements + job_variables = { + "image": deploy_image, # Use the worker-accessible registry name + "volumes": [], # Populated at run submission with toolbox mount + "env": { + "PYTHONPATH": "/opt/prefect/toolbox:/opt/prefect/toolbox/workflows", + "WORKFLOW_NAME": name + } + } + + # Add resource requirements to job variables if present + if workflow_resource_requirements: + job_variables["resources"] = workflow_resource_requirements + + # Prepare deployment parameters + deploy_params = { + "name": f"{name}-deployment", + "work_pool_name": "docker-pool", + "image": image_spec if has_custom_dockerfile else deploy_image, + "push": push_custom, + "build": build_custom, + "job_variables": job_variables + } + + deployment = await flow_func.deploy(**deploy_params) + + self.deployments[name] = str(deployment.id) if hasattr(deployment, 'id') else name + logger.info(f"Successfully deployed workflow '{name}'") + + except Exception as e: + # Enhanced error reporting with more context + import traceback + logger.error(f"Failed to deploy workflow '{name}': {e}") + logger.error(f"Deployment traceback: {traceback.format_exc()}") + + # Try to capture Docker-specific context + error_context = { + "workflow_name": name, + "has_dockerfile": has_custom_dockerfile, + "image_name": deploy_image if 'deploy_image' in locals() else "unknown", + "registry_url": registry_url if 'registry_url' in locals() else "unknown", + "error_type": type(e).__name__, + "error_message": str(e) + } + + # Check for specific error patterns with detailed categorization + error_msg_lower = str(e).lower() + if "registry" in error_msg_lower and ("no such host" in error_msg_lower or "connection" in error_msg_lower): + error_context["category"] = "registry_connectivity_error" + error_context["solution"] = f"Cannot reach registry at {error_context['registry_url']}. Check Docker network and registry service." + elif "docker" in error_msg_lower: + error_context["category"] = "docker_error" + if "build" in error_msg_lower: + error_context["subcategory"] = "image_build_failed" + error_context["solution"] = "Check Dockerfile syntax and dependencies." 
+ elif "pull" in error_msg_lower: + error_context["subcategory"] = "image_pull_failed" + error_context["solution"] = "Check if image exists in registry and network connectivity." + elif "push" in error_msg_lower: + error_context["subcategory"] = "image_push_failed" + error_context["solution"] = f"Check registry connectivity and push permissions to {error_context['registry_url']}." + elif "registry" in error_msg_lower: + error_context["category"] = "registry_error" + error_context["solution"] = "Check registry configuration and accessibility." + elif "prefect" in error_msg_lower: + error_context["category"] = "prefect_error" + error_context["solution"] = "Check Prefect server connectivity and deployment configuration." + else: + error_context["category"] = "unknown_deployment_error" + error_context["solution"] = "Check logs for more specific error details." + + logger.error(f"Deployment error context: {error_context}") + + # Raise enhanced exception with context + enhanced_error = Exception(f"Deployment failed for workflow '{name}': {str(e)} | Context: {error_context}") + enhanced_error.original_error = e + enhanced_error.context = error_context + raise enhanced_error + + async def submit_workflow( + self, + workflow_name: str, + target_path: str, + volume_mode: str = "ro", + parameters: Dict[str, Any] = None, + resource_limits: Dict[str, str] = None, + additional_volumes: list = None, + timeout: int = None + ) -> FlowRun: + """ + Submit a workflow for execution with volume mounting. + + Args: + workflow_name: Name of the workflow to execute + target_path: Host path to mount as volume + volume_mode: Volume mount mode ("ro" for read-only, "rw" for read-write) + parameters: Workflow-specific parameters + resource_limits: CPU/memory limits for container + additional_volumes: List of additional volume mounts + timeout: Timeout in seconds + + Returns: + FlowRun object with run information + + Raises: + ValueError: If workflow not found or volume mode not supported + """ + if workflow_name not in self.workflows: + raise ValueError(f"Unknown workflow: {workflow_name}") + + # Validate volume mode + workflow_info = self.workflows[workflow_name] + supported_modes = workflow_info.metadata.get("supported_volume_modes", ["ro", "rw"]) + + if volume_mode not in supported_modes: + raise ValueError( + f"Workflow '{workflow_name}' doesn't support volume mode '{volume_mode}'. 
" + f"Supported modes: {supported_modes}" + ) + + # Validate target path with security checks + self._validate_target_path(target_path) + + # Validate additional volumes if provided + if additional_volumes: + for volume in additional_volumes: + self._validate_target_path(volume.host_path) + + async with get_client() as client: + # Get the deployment, auto-redeploy once if missing + try: + deployment = await client.read_deployment_by_name( + f"{workflow_name}/{workflow_name}-deployment" + ) + except Exception as e: + import traceback + logger.error(f"Failed to find deployment for workflow '{workflow_name}': {e}") + logger.error(f"Deployment lookup traceback: {traceback.format_exc()}") + + # Attempt a one-time auto-deploy to recover from startup races + try: + logger.info(f"Auto-deploying missing workflow '{workflow_name}' and retrying...") + await self._deploy_workflow(workflow_name, workflow_info) + deployment = await client.read_deployment_by_name( + f"{workflow_name}/{workflow_name}-deployment" + ) + except Exception as redeploy_exc: + # Enhanced error with context + error_context = { + "workflow_name": workflow_name, + "error_type": type(e).__name__, + "error_message": str(e), + "redeploy_error": str(redeploy_exc), + "available_deployments": list(self.deployments.keys()), + } + enhanced_error = ValueError( + f"Deployment not found and redeploy failed for workflow '{workflow_name}': {e} | Context: {error_context}" + ) + enhanced_error.context = error_context + raise enhanced_error + + # Determine the Docker Compose network name and volume names + # Docker Compose creates networks with pattern: {project_name}_default + import os + compose_project = _compose_project_name('fuzzforge_alpha') + docker_network = f"{compose_project}_default" + + # Build volume mounts + # Add toolbox volume mount for workflow code access + backend_toolbox_path = "/app/toolbox" # Path in backend container + + # Use dynamic volume names based on Docker Compose project name + prefect_storage_volume = f"{compose_project}_prefect_storage" + toolbox_code_volume = f"{compose_project}_toolbox_code" + + volumes = [ + f"{target_path}:/workspace:{volume_mode}", + f"{prefect_storage_volume}:/prefect-storage", # Shared storage for results + f"{toolbox_code_volume}:/opt/prefect/toolbox:ro" # Mount workflow code + ] + + # Add additional volumes if provided + if additional_volumes: + for volume in additional_volumes: + volume_spec = f"{volume.host_path}:{volume.container_path}:{volume.mode}" + volumes.append(volume_spec) + + # Build environment variables + env_vars = { + "PREFECT_API_URL": "http://prefect-server:4200/api", # Use internal network hostname + "PREFECT_LOGGING_LEVEL": "INFO", + "PREFECT_LOCAL_STORAGE_PATH": "/prefect-storage", # Use shared storage + "PREFECT_RESULTS_PERSIST_BY_DEFAULT": "true", # Enable result persistence + "PREFECT_DEFAULT_RESULT_STORAGE_BLOCK": "local-file-system/fuzzforge-results", # Use our storage block + "WORKSPACE_PATH": "/workspace", + "VOLUME_MODE": volume_mode, + "WORKFLOW_NAME": workflow_name + } + + # Add additional volume paths to environment for easy access + if additional_volumes: + for i, volume in enumerate(additional_volumes): + env_vars[f"ADDITIONAL_VOLUME_{i}_PATH"] = volume.container_path + + # Determine which image to use based on workflow configuration + workflow_info = self.workflows[workflow_name] + has_custom_dockerfile = workflow_info.has_docker and workflow_info.dockerfile.exists() + # Use pull context for worker to pull from registry + registry_url = 
get_registry_url(context="pull") + workflow_image = f"{registry_url}/fuzzforge/{workflow_name}:latest" if has_custom_dockerfile else "prefecthq/prefect:3-python3.11" + logger.debug(f"Worker will pull image: {workflow_image} (Registry: {registry_url})") + + # Configure job variables with volume mounting and network access + job_variables = { + # Use custom image if available, otherwise base Prefect image + "image": workflow_image, + "volumes": volumes, + "networks": [docker_network], # Connect to Docker Compose network + "env": { + **env_vars, + "PYTHONPATH": "/opt/prefect/toolbox:/opt/prefect/toolbox/workflows", + "WORKFLOW_NAME": workflow_name + } + } + + # Apply resource requirements from workflow metadata and user overrides + workflow_resource_requirements = self._extract_resource_requirements(workflow_info) + final_resource_config = {} + + # Start with workflow requirements as base + if workflow_resource_requirements: + final_resource_config.update(workflow_resource_requirements) + + # Apply user-provided resource limits (overrides workflow defaults) + if resource_limits: + user_resource_config = {} + if resource_limits.get("cpu_limit"): + user_resource_config["cpus"] = resource_limits["cpu_limit"] + if resource_limits.get("memory_limit"): + user_resource_config["memory"] = resource_limits["memory_limit"] + # Note: cpu_request and memory_request are not directly supported by Docker + # but could be used for Kubernetes in the future + + # User overrides take precedence + final_resource_config.update(user_resource_config) + + # Apply final resource configuration + if final_resource_config: + job_variables["resources"] = final_resource_config + logger.info(f"Applied resource limits: {final_resource_config}") + + # Merge parameters with defaults from metadata + default_params = workflow_info.metadata.get("default_parameters", {}) + final_params = {**default_params, **(parameters or {})} + + # Set flow parameters that match the flow signature + final_params["target_path"] = "/workspace" # Container path where volume is mounted + final_params["volume_mode"] = volume_mode + + # Create and submit the flow run + # Pass job_variables to ensure network, volumes, and environment are configured + logger.info(f"Submitting flow with job_variables: {job_variables}") + logger.info(f"Submitting flow with parameters: {final_params}") + + # Prepare flow run creation parameters + flow_run_params = { + "deployment_id": deployment.id, + "parameters": final_params, + "job_variables": job_variables + } + + # Note: Timeout is handled through workflow-level configuration + # Additional timeout configuration can be added to deployment metadata if needed + + flow_run = await client.create_flow_run_from_deployment(**flow_run_params) + + logger.info( + f"Submitted workflow '{workflow_name}' with run_id: {flow_run.id}, " + f"target: {target_path}, mode: {volume_mode}" + ) + + return flow_run + + async def get_flow_run_findings(self, run_id: str) -> Dict[str, Any]: + """ + Retrieve findings from a completed flow run. + + Args: + run_id: The flow run ID + + Returns: + Dictionary containing SARIF-formatted findings + + Raises: + ValueError: If run not completed or not found + """ + async with get_client() as client: + flow_run = await client.read_flow_run(run_id) + + if not flow_run.state.is_completed(): + raise ValueError( + f"Flow run {run_id} not completed. 
Current status: {flow_run.state.name}" + ) + + # Get the findings from the flow run result + try: + findings = await flow_run.state.result() + return findings + except Exception as e: + logger.error(f"Failed to retrieve findings for run {run_id}: {e}") + raise ValueError(f"Failed to retrieve findings: {e}") + + async def get_flow_run_status(self, run_id: str) -> Dict[str, Any]: + """ + Get the current status of a flow run. + + Args: + run_id: The flow run ID + + Returns: + Dictionary with status information + """ + async with get_client() as client: + flow_run = await client.read_flow_run(run_id) + + return { + "run_id": str(flow_run.id), + "workflow": flow_run.deployment_id, + "status": flow_run.state.name, + "is_completed": flow_run.state.is_completed(), + "is_failed": flow_run.state.is_failed(), + "is_running": flow_run.state.is_running(), + "created_at": flow_run.created, + "updated_at": flow_run.updated + } + + def _validate_target_path(self, target_path: str) -> None: + """ + Validate target path for security before mounting as volume. + + Args: + target_path: Host path to validate + + Raises: + ValueError: If path is not allowed for security reasons + """ + target = Path(target_path) + + # Path must be absolute + if not target.is_absolute(): + raise ValueError(f"Target path must be absolute: {target_path}") + + # Resolve path to handle symlinks and relative components + try: + resolved_path = target.resolve() + except (OSError, RuntimeError) as e: + raise ValueError(f"Cannot resolve target path: {target_path} - {e}") + + resolved_str = str(resolved_path) + + # Check against forbidden paths first (more restrictive) + for forbidden in self.forbidden_paths: + if resolved_str.startswith(forbidden): + raise ValueError( + f"Access denied: Path '{target_path}' resolves to forbidden directory '{forbidden}'. " + f"This path contains sensitive system files and cannot be mounted." + ) + + # Check if path starts with any allowed base path + path_allowed = False + for allowed in self.allowed_base_paths: + if resolved_str.startswith(allowed): + path_allowed = True + break + + if not path_allowed: + allowed_list = ", ".join(self.allowed_base_paths) + raise ValueError( + f"Access denied: Path '{target_path}' is not in allowed directories. " + f"Allowed base paths: {allowed_list}" + ) + + # Additional security checks + if resolved_str == "/": + raise ValueError("Cannot mount root filesystem") + + # Warn if path doesn't exist (but don't block - it might be created later) + if not resolved_path.exists(): + logger.warning(f"Target path does not exist: {target_path}") + + logger.info(f"Path validation passed for: {target_path} -> {resolved_str}") diff --git a/backend/src/core/setup.py b/backend/src/core/setup.py new file mode 100644 index 0000000..24444b1 --- /dev/null +++ b/backend/src/core/setup.py @@ -0,0 +1,402 @@ +""" +Setup utilities for Prefect infrastructure +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
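+
+# setup_docker_pool() below honours two startup env flags (the values shown
+# are illustrative):
+#
+#     FORCE_RECREATE_WORK_POOL=true   # drop and recreate the work pool
+#     DEBUG_WORK_POOL_SETUP=true      # temporary DEBUG-level setup logging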
+ +import logging +from prefect import get_client +from prefect.client.schemas.actions import WorkPoolCreate +from prefect.client.schemas.objects import WorkPool +from .prefect_manager import get_registry_url + +logger = logging.getLogger(__name__) + + +async def setup_docker_pool(): + """ + Create or update the Docker work pool for container execution. + + This work pool is configured to: + - Connect to the local Docker daemon + - Support volume mounting at runtime + - Clean up containers after execution + - Use bridge networking by default + """ + import os + + async with get_client() as client: + pool_name = "docker-pool" + + # Add force recreation flag for debugging fresh install issues + force_recreate = os.getenv('FORCE_RECREATE_WORK_POOL', 'false').lower() == 'true' + debug_setup = os.getenv('DEBUG_WORK_POOL_SETUP', 'false').lower() == 'true' + + if force_recreate: + logger.warning(f"FORCE_RECREATE_WORK_POOL=true - Will recreate work pool regardless of existing configuration") + if debug_setup: + logger.warning(f"DEBUG_WORK_POOL_SETUP=true - Enhanced logging enabled") + # Temporarily set logging level to DEBUG for this function + original_level = logger.level + logger.setLevel(logging.DEBUG) + + try: + # Check if pool already exists and supports custom images + existing_pools = await client.read_work_pools() + existing_pool = None + for pool in existing_pools: + if pool.name == pool_name: + existing_pool = pool + break + + if existing_pool and not force_recreate: + logger.info(f"Found existing work pool '{pool_name}' - validating configuration...") + + # Check if the existing pool has the correct configuration + base_template = existing_pool.base_job_template or {} + logger.debug(f"Base template keys: {list(base_template.keys())}") + + job_config = base_template.get("job_configuration", {}) + logger.debug(f"Job config keys: {list(job_config.keys())}") + + image_config = job_config.get("image", "") + has_image_variable = "{{ image }}" in str(image_config) + logger.debug(f"Image config: '{image_config}' -> has_image_variable: {has_image_variable}") + + # Check if volume defaults include toolbox mount + variables = base_template.get("variables", {}) + properties = variables.get("properties", {}) + volume_config = properties.get("volumes", {}) + volume_defaults = volume_config.get("default", []) + has_toolbox_volume = any("toolbox_code" in str(vol) for vol in volume_defaults) if volume_defaults else False + logger.debug(f"Volume defaults: {volume_defaults}") + logger.debug(f"Has toolbox volume: {has_toolbox_volume}") + + # Check if environment defaults include required settings + env_config = properties.get("env", {}) + env_defaults = env_config.get("default", {}) + has_api_url = "PREFECT_API_URL" in env_defaults + has_storage_path = "PREFECT_LOCAL_STORAGE_PATH" in env_defaults + has_results_persist = "PREFECT_RESULTS_PERSIST_BY_DEFAULT" in env_defaults + has_required_env = has_api_url and has_storage_path and has_results_persist + logger.debug(f"Environment defaults: {env_defaults}") + logger.debug(f"Has API URL: {has_api_url}, Has storage path: {has_storage_path}, Has results persist: {has_results_persist}") + logger.debug(f"Has required env: {has_required_env}") + + # Log the full validation result + logger.info(f"Work pool validation - Image: {has_image_variable}, Toolbox: {has_toolbox_volume}, Environment: {has_required_env}") + + if has_image_variable and has_toolbox_volume and has_required_env: + logger.info(f"Docker work pool '{pool_name}' already exists with correct 
configuration") + return + else: + reasons = [] + if not has_image_variable: + reasons.append("missing image template") + if not has_toolbox_volume: + reasons.append("missing toolbox volume mount") + if not has_required_env: + if not has_api_url: + reasons.append("missing PREFECT_API_URL") + if not has_storage_path: + reasons.append("missing PREFECT_LOCAL_STORAGE_PATH") + if not has_results_persist: + reasons.append("missing PREFECT_RESULTS_PERSIST_BY_DEFAULT") + + logger.warning(f"Docker work pool '{pool_name}' exists but lacks: {', '.join(reasons)}. Recreating...") + # Delete the old pool and recreate it + try: + await client.delete_work_pool(pool_name) + logger.info(f"Deleted old work pool '{pool_name}'") + except Exception as e: + logger.warning(f"Failed to delete old work pool: {e}") + elif force_recreate and existing_pool: + logger.warning(f"Force recreation enabled - deleting existing work pool '{pool_name}'") + try: + await client.delete_work_pool(pool_name) + logger.info(f"Deleted existing work pool for force recreation") + except Exception as e: + logger.warning(f"Failed to delete work pool for force recreation: {e}") + + logger.info(f"Creating Docker work pool '{pool_name}' with custom image support...") + + # Create the work pool with proper Docker configuration + work_pool = WorkPoolCreate( + name=pool_name, + type="docker", + description="Docker work pool for FuzzForge workflows with custom image support", + base_job_template={ + "job_configuration": { + "image": "{{ image }}", # Template variable for custom images + "volumes": "{{ volumes }}", # List of volume mounts + "env": "{{ env }}", # Environment variables + "networks": "{{ networks }}", # Docker networks + "stream_output": True, + "auto_remove": True, + "privileged": False, + "network_mode": None, # Use networks instead + "labels": {}, + "command": None # Let the image's CMD/ENTRYPOINT run + }, + "variables": { + "type": "object", + "properties": { + "image": { + "type": "string", + "title": "Docker Image", + "default": "prefecthq/prefect:3-python3.11", + "description": "Docker image for the flow run" + }, + "volumes": { + "type": "array", + "title": "Volume Mounts", + "default": [ + f"{get_actual_compose_project_name()}_prefect_storage:/prefect-storage", + f"{get_actual_compose_project_name()}_toolbox_code:/opt/prefect/toolbox:ro" + ], + "description": "Volume mounts in format 'host:container:mode'", + "items": { + "type": "string" + } + }, + "networks": { + "type": "array", + "title": "Docker Networks", + "default": [f"{get_actual_compose_project_name()}_default"], + "description": "Docker networks to connect container to", + "items": { + "type": "string" + } + }, + "env": { + "type": "object", + "title": "Environment Variables", + "default": { + "PREFECT_API_URL": "http://prefect-server:4200/api", + "PREFECT_LOCAL_STORAGE_PATH": "/prefect-storage", + "PREFECT_RESULTS_PERSIST_BY_DEFAULT": "true" + }, + "description": "Environment variables for the container", + "additionalProperties": { + "type": "string" + } + } + } + } + } + ) + + await client.create_work_pool(work_pool) + logger.info(f"Created Docker work pool '{pool_name}'") + + except Exception as e: + logger.error(f"Failed to setup Docker work pool: {e}") + raise + finally: + # Restore original logging level if debug mode was enabled + if debug_setup and 'original_level' in locals(): + logger.setLevel(original_level) + + +def get_actual_compose_project_name(): + """ + Return the hardcoded compose project name for FuzzForge. 
+ + Always returns 'fuzzforge_alpha' as per system requirements. + """ + logger.info("Using hardcoded compose project name: fuzzforge_alpha") + return "fuzzforge_alpha" + + +async def setup_result_storage(): + """ + Create or update Prefect result storage block for findings persistence. + + This sets up a LocalFileSystem storage block pointing to the shared + /prefect-storage volume for result persistence. + """ + from prefect.filesystems import LocalFileSystem + + storage_name = "fuzzforge-results" + + try: + # Create the storage block, overwrite if it exists + logger.info(f"Setting up storage block '{storage_name}'...") + storage = LocalFileSystem(basepath="/prefect-storage") + + block_doc_id = await storage.save(name=storage_name, overwrite=True) + logger.info(f"Storage block '{storage_name}' configured successfully") + return str(block_doc_id) + + except Exception as e: + logger.error(f"Failed to setup result storage: {e}") + # Don't raise the exception - continue without storage block + logger.warning("Continuing without result storage block - findings may not persist") + return None + + +async def validate_docker_connection(): + """ + Validate that Docker is accessible and running. + + Note: In containerized deployments with Docker socket proxy, + the backend doesn't need direct Docker access. + + Raises: + RuntimeError: If Docker is not accessible + """ + import os + + # Skip Docker validation if running in container without socket access + if os.path.exists("/.dockerenv") and not os.path.exists("/var/run/docker.sock"): + logger.info("Running in container without Docker socket - skipping Docker validation") + return + + try: + import docker + client = docker.from_env() + client.ping() + logger.info("Docker connection validated") + except Exception as e: + logger.error(f"Docker is not accessible: {e}") + raise RuntimeError( + "Docker is not running or not accessible. " + "Please ensure Docker is installed and running." + ) + + +async def validate_registry_connectivity(registry_url: str = None): + """ + Validate that the Docker registry is accessible. + + Args: + registry_url: URL of the Docker registry to validate (auto-detected if None) + + Raises: + RuntimeError: If registry is not accessible + """ + # Resolve a reachable test URL from within this process + if registry_url is None: + # If not specified, prefer internal service name in containers, host port on host + import os + if os.path.exists('/.dockerenv'): + registry_url = "registry:5000" + else: + registry_url = "localhost:5001" + + # If we're running inside a container and asked to probe localhost:PORT, + # the probe would hit the container, not the host. Use host.docker.internal instead. 
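+    # e.g. a probe target of "localhost:5001" from inside a container is
+    # rewritten to "http://host.docker.internal:5001/v2/" (example values).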
+ import os + try: + host_part, port_part = registry_url.split(":", 1) + except ValueError: + host_part, port_part = registry_url, "80" + + if os.path.exists('/.dockerenv') and host_part in ("localhost", "127.0.0.1"): + test_host = "host.docker.internal" + else: + test_host = host_part + test_url = f"http://{test_host}:{port_part}/v2/" + + import aiohttp + import asyncio + + logger.info(f"Validating registry connectivity to {registry_url}...") + + try: + async with aiohttp.ClientSession(timeout=aiohttp.ClientTimeout(total=10)) as session: + async with session.get(test_url) as response: + if response.status == 200: + logger.info(f"Registry at {registry_url} is accessible (tested via {test_host})") + return + else: + raise RuntimeError(f"Registry returned status {response.status}") + except asyncio.TimeoutError: + raise RuntimeError(f"Registry at {registry_url} is not responding (timeout)") + except aiohttp.ClientError as e: + raise RuntimeError(f"Registry at {registry_url} is not accessible: {e}") + except Exception as e: + raise RuntimeError(f"Failed to validate registry connectivity: {e}") + + +async def validate_docker_network(network_name: str): + """ + Validate that the specified Docker network exists. + + Args: + network_name: Name of the Docker network to validate + + Raises: + RuntimeError: If network doesn't exist + """ + import os + + # Skip network validation if running in container without Docker socket + if os.path.exists("/.dockerenv") and not os.path.exists("/var/run/docker.sock"): + logger.info("Running in container without Docker socket - skipping network validation") + return + + try: + import docker + client = docker.from_env() + + # List all networks + networks = client.networks.list(names=[network_name]) + + if not networks: + # Try to find networks with similar names + all_networks = client.networks.list() + similar_networks = [n.name for n in all_networks if "fuzzforge" in n.name.lower()] + + error_msg = f"Docker network '{network_name}' not found." + if similar_networks: + error_msg += f" Available networks: {similar_networks}" + else: + error_msg += " Please ensure Docker Compose is running." + + raise RuntimeError(error_msg) + + logger.info(f"Docker network '{network_name}' validated") + + except Exception as e: + if isinstance(e, RuntimeError): + raise + logger.error(f"Network validation failed: {e}") + raise RuntimeError(f"Failed to validate Docker network: {e}") + + +async def validate_infrastructure(): + """ + Validate all required infrastructure components. + + This should be called during startup to ensure everything is ready. 
+ """ + logger.info("Validating infrastructure...") + + # Validate Docker connection + await validate_docker_connection() + + # Validate registry connectivity for custom image building + await validate_registry_connectivity() + + # Validate network (check for default network pattern) + import os + compose_project = os.getenv('COMPOSE_PROJECT_NAME', 'fuzzforge_alpha') + docker_network = f"{compose_project}_default" + + try: + await validate_docker_network(docker_network) + except RuntimeError as e: + logger.warning(f"Network validation failed: {e}") + logger.warning("Workflows may not be able to connect to Prefect services") + + logger.info("Infrastructure validation completed") diff --git a/backend/src/core/workflow_discovery.py b/backend/src/core/workflow_discovery.py new file mode 100644 index 0000000..e348524 --- /dev/null +++ b/backend/src/core/workflow_discovery.py @@ -0,0 +1,459 @@ +""" +Workflow Discovery - Registry-based discovery and loading of workflows +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +import logging +import yaml +from pathlib import Path +from typing import Dict, Optional, Any, Callable +from pydantic import BaseModel, Field, ConfigDict + +logger = logging.getLogger(__name__) + + +class WorkflowInfo(BaseModel): + """Information about a discovered workflow""" + name: str = Field(..., description="Workflow name") + path: Path = Field(..., description="Path to workflow directory") + workflow_file: Path = Field(..., description="Path to workflow.py file") + dockerfile: Path = Field(..., description="Path to Dockerfile") + has_docker: bool = Field(..., description="Whether workflow has custom Dockerfile") + metadata: Dict[str, Any] = Field(..., description="Workflow metadata from YAML") + flow_function_name: str = Field(default="main_flow", description="Name of the flow function") + + model_config = ConfigDict(arbitrary_types_allowed=True) + + +class WorkflowDiscovery: + """ + Discovers workflows from the filesystem and validates them against the registry. + + This system: + 1. Scans for workflows with metadata.yaml files + 2. Cross-references them with the manual registry + 3. Provides registry-based flow functions for deployment + + Workflows must have: + - workflow.py: Contains the Prefect flow + - metadata.yaml: Mandatory metadata file + - Entry in toolbox/workflows/registry.py: Manual registration + - Dockerfile (optional): Custom container definition + - requirements.txt (optional): Python dependencies + """ + + def __init__(self, workflows_dir: Path): + """ + Initialize workflow discovery. 
+ + Args: + workflows_dir: Path to the workflows directory + """ + self.workflows_dir = workflows_dir + if not self.workflows_dir.exists(): + self.workflows_dir.mkdir(parents=True, exist_ok=True) + logger.info(f"Created workflows directory: {self.workflows_dir}") + + # Import registry - this validates it on import + try: + from toolbox.workflows.registry import WORKFLOW_REGISTRY, list_registered_workflows + self.registry = WORKFLOW_REGISTRY + logger.info(f"Loaded workflow registry with {len(self.registry)} registered workflows") + except ImportError as e: + logger.error(f"Failed to import workflow registry: {e}") + self.registry = {} + except Exception as e: + logger.error(f"Registry validation failed: {e}") + self.registry = {} + + # Cache for discovered workflows + self._workflow_cache: Optional[Dict[str, WorkflowInfo]] = None + self._cache_timestamp: Optional[float] = None + self._cache_ttl = 60.0 # Cache TTL in seconds + + async def discover_workflows(self) -> Dict[str, WorkflowInfo]: + """ + Discover workflows by cross-referencing filesystem with registry. + Uses caching to avoid frequent filesystem scans. + + Returns: + Dictionary mapping workflow names to their information + """ + # Check cache validity + import time + current_time = time.time() + + if (self._workflow_cache is not None and + self._cache_timestamp is not None and + (current_time - self._cache_timestamp) < self._cache_ttl): + # Return cached results + logger.debug(f"Returning cached workflow discovery ({len(self._workflow_cache)} workflows)") + return self._workflow_cache + workflows = {} + discovered_dirs = set() + registry_names = set(self.registry.keys()) + + if not self.workflows_dir.exists(): + logger.warning(f"Workflows directory does not exist: {self.workflows_dir}") + return workflows + + # Recursively scan all directories and subdirectories + await self._scan_directory_recursive(self.workflows_dir, workflows, discovered_dirs) + + # Check for registry entries without corresponding directories + missing_dirs = registry_names - discovered_dirs + if missing_dirs: + logger.warning( + f"Registry contains workflows without filesystem directories: {missing_dirs}. " + f"These workflows cannot be deployed." + ) + + logger.info( + f"Discovery complete: {len(workflows)} workflows ready for deployment, " + f"{len(missing_dirs)} registry entries missing directories, " + f"{len(discovered_dirs - registry_names)} filesystem workflows not registered" + ) + + # Update cache + self._workflow_cache = workflows + self._cache_timestamp = current_time + + return workflows + + async def _scan_directory_recursive(self, directory: Path, workflows: Dict[str, WorkflowInfo], discovered_dirs: set): + """ + Recursively scan directory for workflows. 
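+
+        Expected layout (directory names are illustrative):
+
+            workflows/
+                static_analysis/      # workflow dir: workflow.py + metadata.yaml
+                fuzzing/              # category dir: recursed into
+                    atheris_fuzz/
+                        workflow.py
+                        metadata.yaml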
+ + Args: + directory: Directory to scan + workflows: Dictionary to populate with discovered workflows + discovered_dirs: Set to track discovered workflow names + """ + for item in directory.iterdir(): + if not item.is_dir(): + continue + + if item.name.startswith('_') or item.name.startswith('.'): + continue # Skip hidden or private directories + + # Check if this directory contains workflow files (workflow.py and metadata.yaml) + workflow_file = item / "workflow.py" + metadata_file = item / "metadata.yaml" + + if workflow_file.exists() and metadata_file.exists(): + # This is a workflow directory + workflow_name = item.name + discovered_dirs.add(workflow_name) + + # Only process workflows that are in the registry + if workflow_name not in self.registry: + logger.warning( + f"Workflow '{workflow_name}' found in filesystem but not in registry. " + f"Add it to toolbox/workflows/registry.py to enable deployment." + ) + continue + + try: + workflow_info = await self._load_workflow(item) + if workflow_info: + workflows[workflow_info.name] = workflow_info + logger.info(f"Discovered and registered workflow: {workflow_info.name}") + except Exception as e: + logger.error(f"Failed to load workflow from {item}: {e}") + else: + # This is a category directory, recurse into it + await self._scan_directory_recursive(item, workflows, discovered_dirs) + + async def _load_workflow(self, workflow_dir: Path) -> Optional[WorkflowInfo]: + """ + Load and validate a single workflow. + + Args: + workflow_dir: Path to the workflow directory + + Returns: + WorkflowInfo if valid, None otherwise + """ + workflow_name = workflow_dir.name + + # Check for mandatory files + workflow_file = workflow_dir / "workflow.py" + metadata_file = workflow_dir / "metadata.yaml" + + if not workflow_file.exists(): + logger.warning(f"Workflow {workflow_name} missing workflow.py") + return None + + if not metadata_file.exists(): + logger.error(f"Workflow {workflow_name} missing mandatory metadata.yaml") + return None + + # Load and validate metadata + try: + metadata = self._load_metadata(metadata_file) + if not self._validate_metadata(metadata, workflow_name): + return None + except Exception as e: + logger.error(f"Failed to load metadata for {workflow_name}: {e}") + return None + + # Check for mandatory Dockerfile + dockerfile = workflow_dir / "Dockerfile" + if not dockerfile.exists(): + logger.error(f"Workflow {workflow_name} missing mandatory Dockerfile") + return None + + has_docker = True # Always True since Dockerfile is mandatory + + # Get flow function name from metadata or use default + flow_function_name = metadata.get("flow_function", "main_flow") + + return WorkflowInfo( + name=workflow_name, + path=workflow_dir, + workflow_file=workflow_file, + dockerfile=dockerfile, + has_docker=has_docker, + metadata=metadata, + flow_function_name=flow_function_name + ) + + def _load_metadata(self, metadata_file: Path) -> Dict[str, Any]: + """ + Load metadata from YAML file. + + Args: + metadata_file: Path to metadata.yaml + + Returns: + Dictionary containing metadata + """ + with open(metadata_file, 'r') as f: + metadata = yaml.safe_load(f) + + if metadata is None: + raise ValueError("Empty metadata file") + + return metadata + + def _validate_metadata(self, metadata: Dict[str, Any], workflow_name: str) -> bool: + """ + Validate that metadata contains all required fields. 
+ + Args: + metadata: Metadata dictionary + workflow_name: Name of the workflow for logging + + Returns: + True if valid, False otherwise + """ + required_fields = ["name", "version", "description", "author", "category", "parameters", "requirements"] + + missing_fields = [] + for field in required_fields: + if field not in metadata: + missing_fields.append(field) + + if missing_fields: + logger.error( + f"Workflow {workflow_name} metadata missing required fields: {missing_fields}" + ) + return False + + # Validate version format (semantic versioning) + version = metadata.get("version", "") + if not self._is_valid_version(version): + logger.error(f"Workflow {workflow_name} has invalid version format: {version}") + return False + + # Validate parameters structure + parameters = metadata.get("parameters", {}) + if not isinstance(parameters, dict): + logger.error(f"Workflow {workflow_name} parameters must be a dictionary") + return False + + return True + + def _is_valid_version(self, version: str) -> bool: + """ + Check if version follows semantic versioning (x.y.z). + + Args: + version: Version string + + Returns: + True if valid semantic version + """ + try: + parts = version.split('.') + if len(parts) != 3: + return False + for part in parts: + int(part) # Check if each part is a number + return True + except (ValueError, AttributeError): + return False + + def invalidate_cache(self) -> None: + """ + Invalidate the workflow discovery cache. + Useful when workflows are added or modified. + """ + self._workflow_cache = None + self._cache_timestamp = None + logger.debug("Workflow discovery cache invalidated") + + def get_flow_function(self, workflow_name: str) -> Optional[Callable]: + """ + Get the flow function from the registry. + + Args: + workflow_name: Name of the workflow + + Returns: + The flow function if found in registry, None otherwise + """ + if workflow_name not in self.registry: + logger.error( + f"Workflow '{workflow_name}' not found in registry. " + f"Available workflows: {list(self.registry.keys())}" + ) + return None + + try: + from toolbox.workflows.registry import get_workflow_flow + flow_func = get_workflow_flow(workflow_name) + logger.debug(f"Retrieved flow function for '{workflow_name}' from registry") + return flow_func + except Exception as e: + logger.error(f"Failed to get flow function for '{workflow_name}': {e}") + return None + + def get_registry_info(self, workflow_name: str) -> Optional[Dict[str, Any]]: + """ + Get registry information for a workflow. + + Args: + workflow_name: Name of the workflow + + Returns: + Registry information if found, None otherwise + """ + if workflow_name not in self.registry: + return None + + try: + from toolbox.workflows.registry import get_workflow_info + return get_workflow_info(workflow_name) + except Exception as e: + logger.error(f"Failed to get registry info for '{workflow_name}': {e}") + return None + + @staticmethod + def get_metadata_schema() -> Dict[str, Any]: + """ + Get the JSON schema for workflow metadata. 
+ + Returns: + JSON schema dictionary + """ + return { + "type": "object", + "required": ["name", "version", "description", "author", "category", "parameters", "requirements"], + "properties": { + "name": { + "type": "string", + "description": "Workflow name" + }, + "version": { + "type": "string", + "pattern": "^\\d+\\.\\d+\\.\\d+$", + "description": "Semantic version (x.y.z)" + }, + "description": { + "type": "string", + "description": "Workflow description" + }, + "author": { + "type": "string", + "description": "Workflow author" + }, + "category": { + "type": "string", + "enum": ["comprehensive", "specialized", "fuzzing", "focused"], + "description": "Workflow category" + }, + "tags": { + "type": "array", + "items": {"type": "string"}, + "description": "Workflow tags for categorization" + }, + "requirements": { + "type": "object", + "required": ["tools", "resources"], + "properties": { + "tools": { + "type": "array", + "items": {"type": "string"}, + "description": "Required security tools" + }, + "resources": { + "type": "object", + "required": ["memory", "cpu", "timeout"], + "properties": { + "memory": { + "type": "string", + "pattern": "^\\d+[GMK]i$", + "description": "Memory limit (e.g., 1Gi, 512Mi)" + }, + "cpu": { + "type": "string", + "pattern": "^\\d+m?$", + "description": "CPU limit (e.g., 1000m, 2)" + }, + "timeout": { + "type": "integer", + "minimum": 60, + "maximum": 7200, + "description": "Workflow timeout in seconds" + } + } + } + } + }, + "parameters": { + "type": "object", + "description": "Workflow parameters schema" + }, + "default_parameters": { + "type": "object", + "description": "Default parameter values" + }, + "required_modules": { + "type": "array", + "items": {"type": "string"}, + "description": "Required module names" + }, + "supported_volume_modes": { + "type": "array", + "items": {"enum": ["ro", "rw"]}, + "default": ["ro", "rw"], + "description": "Supported volume mount modes" + }, + "flow_function": { + "type": "string", + "default": "main_flow", + "description": "Name of the flow function in workflow.py" + } + } + } \ No newline at end of file diff --git a/backend/src/main.py b/backend/src/main.py new file mode 100644 index 0000000..6843a51 --- /dev/null +++ b/backend/src/main.py @@ -0,0 +1,864 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
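+ +""" +FuzzForge backend entrypoint. + +Hosts the FastAPI REST API, mirrors its endpoints as MCP tools, serves the +FastMCP HTTP and SSE transports on a dedicated port, and bootstraps the +Prefect infrastructure in the background with retries. +"""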
+ +import asyncio +import logging +import os +from uuid import UUID +from contextlib import AsyncExitStack, asynccontextmanager, suppress +from typing import Any, Dict, Optional, List + +import uvicorn +from fastapi import FastAPI +from starlette.applications import Starlette +from starlette.routing import Mount + +from fastmcp.server.http import create_sse_app + +from src.core.prefect_manager import PrefectManager +from src.core.setup import setup_docker_pool, setup_result_storage, validate_infrastructure +from src.core.workflow_discovery import WorkflowDiscovery +from src.api import workflows, runs, fuzzing +from src.services.prefect_stats_monitor import prefect_stats_monitor + +from fastmcp import FastMCP +from prefect.client.orchestration import get_client +from prefect.client.schemas.filters import ( + FlowRunFilter, + FlowRunFilterDeploymentId, + FlowRunFilterState, + FlowRunFilterStateType, +) +from prefect.client.schemas.sorting import FlowRunSort +from prefect.states import StateType + +logging.basicConfig(level=logging.INFO) +logger = logging.getLogger(__name__) + +prefect_mgr = PrefectManager() + + +class PrefectBootstrapState: + """Tracks Prefect initialization progress for API and MCP consumers.""" + + def __init__(self) -> None: + self.ready: bool = False + self.status: str = "not_started" + self.last_error: Optional[str] = None + self.task_running: bool = False + + def as_dict(self) -> Dict[str, Any]: + return { + "ready": self.ready, + "status": self.status, + "last_error": self.last_error, + "task_running": self.task_running, + } + + +prefect_bootstrap_state = PrefectBootstrapState() + +# Configure retry strategy for bootstrapping Prefect + infrastructure +STARTUP_RETRY_SECONDS = max(1, int(os.getenv("FUZZFORGE_STARTUP_RETRY_SECONDS", "5"))) +STARTUP_RETRY_MAX_SECONDS = max( + STARTUP_RETRY_SECONDS, + int(os.getenv("FUZZFORGE_STARTUP_RETRY_MAX_SECONDS", "60")), +) + +prefect_bootstrap_task: Optional[asyncio.Task] = None + +# --------------------------------------------------------------------------- +# FastAPI application (REST API remains unchanged) +# --------------------------------------------------------------------------- + +app = FastAPI( + title="FuzzForge API", + description="Security testing workflow orchestration API with fuzzing support", + version="0.6.0", +) + +app.include_router(workflows.router) +app.include_router(runs.router) +app.include_router(fuzzing.router) + + +def get_prefect_status() -> Dict[str, Any]: + """Return a snapshot of Prefect bootstrap state for diagnostics.""" + status = prefect_bootstrap_state.as_dict() + status["workflows_loaded"] = len(prefect_mgr.workflows) + status["deployments_tracked"] = len(prefect_mgr.deployments) + status["bootstrap_task_running"] = ( + prefect_bootstrap_task is not None and not prefect_bootstrap_task.done() + ) + return status + + +def _prefect_not_ready_status() -> Optional[Dict[str, Any]]: + """Return status details if Prefect is not ready yet.""" + status = get_prefect_status() + if status.get("ready"): + return None + return status + + +@app.get("/") +async def root() -> Dict[str, Any]: + status = get_prefect_status() + return { + "name": "FuzzForge API", + "version": "0.6.0", + "status": "ready" if status.get("ready") else "initializing", + "workflows_loaded": status.get("workflows_loaded", 0), + "prefect": status, + } + + +@app.get("/health") +async def health() -> Dict[str, str]: + status = get_prefect_status() + health_status = "healthy" if status.get("ready") else "initializing" + return {"status": 
health_status} + + +# Map FastAPI OpenAPI operationIds to readable MCP tool names +FASTAPI_MCP_NAME_OVERRIDES: Dict[str, str] = { + "list_workflows_workflows__get": "api_list_workflows", + "get_metadata_schema_workflows_metadata_schema_get": "api_get_metadata_schema", + "get_workflow_metadata_workflows__workflow_name__metadata_get": "api_get_workflow_metadata", + "submit_workflow_workflows__workflow_name__submit_post": "api_submit_workflow", + "get_workflow_parameters_workflows__workflow_name__parameters_get": "api_get_workflow_parameters", + "get_run_status_runs__run_id__status_get": "api_get_run_status", + "get_run_findings_runs__run_id__findings_get": "api_get_run_findings", + "get_workflow_findings_runs__workflow_name__findings__run_id__get": "api_get_workflow_findings", + "get_fuzzing_stats_fuzzing__run_id__stats_get": "api_get_fuzzing_stats", + "update_fuzzing_stats_fuzzing__run_id__stats_post": "api_update_fuzzing_stats", + "get_crash_reports_fuzzing__run_id__crashes_get": "api_get_crash_reports", + "report_crash_fuzzing__run_id__crash_post": "api_report_crash", + "stream_fuzzing_updates_fuzzing__run_id__stream_get": "api_stream_fuzzing_updates", + "cleanup_fuzzing_run_fuzzing__run_id__delete": "api_cleanup_fuzzing_run", + "root__get": "api_root", + "health_health_get": "api_health", +} + + +# Create an MCP adapter exposing all FastAPI endpoints via OpenAPI parsing +FASTAPI_MCP_ADAPTER = FastMCP.from_fastapi( + app, + name="FuzzForge FastAPI", + mcp_names=FASTAPI_MCP_NAME_OVERRIDES, +) +_fastapi_mcp_imported = False + + +# --------------------------------------------------------------------------- +# FastMCP server (runs on dedicated port outside FastAPI) +# --------------------------------------------------------------------------- + +mcp = FastMCP(name="FuzzForge MCP") + + +async def _bootstrap_prefect_with_retries() -> None: + """Initialize Prefect infrastructure with exponential backoff retries.""" + + attempt = 0 + + while True: + attempt += 1 + prefect_bootstrap_state.task_running = True + prefect_bootstrap_state.status = "starting" + prefect_bootstrap_state.ready = False + prefect_bootstrap_state.last_error = None + + try: + logger.info("Bootstrapping Prefect infrastructure...") + await validate_infrastructure() + await setup_docker_pool() + await setup_result_storage() + await prefect_mgr.initialize() + await prefect_stats_monitor.start_monitoring() + + prefect_bootstrap_state.ready = True + prefect_bootstrap_state.status = "ready" + prefect_bootstrap_state.task_running = False + logger.info("Prefect infrastructure ready") + return + + except asyncio.CancelledError: + prefect_bootstrap_state.status = "cancelled" + prefect_bootstrap_state.task_running = False + logger.info("Prefect bootstrap task cancelled") + raise + + except Exception as exc: # pragma: no cover - defensive logging on infra startup + logger.exception("Prefect bootstrap failed") + prefect_bootstrap_state.ready = False + prefect_bootstrap_state.status = "error" + prefect_bootstrap_state.last_error = str(exc) + + # Ensure partial initialization does not leave stale state behind + prefect_mgr.workflows.clear() + prefect_mgr.deployments.clear() + await prefect_stats_monitor.stop_monitoring() + + wait_time = min( + STARTUP_RETRY_SECONDS * (2 ** (attempt - 1)), + STARTUP_RETRY_MAX_SECONDS, + ) + logger.info("Retrying Prefect bootstrap in %s second(s)", wait_time) + + try: + await asyncio.sleep(wait_time) + except asyncio.CancelledError: + prefect_bootstrap_state.status = "cancelled" + 
prefect_bootstrap_state.task_running = False + raise + + +def _lookup_workflow(workflow_name: str): + info = prefect_mgr.workflows.get(workflow_name) + if not info: + return None + metadata = info.metadata + defaults = metadata.get("default_parameters", {}) + default_target_path = metadata.get("default_target_path") or defaults.get("target_path") + supported_modes = metadata.get("supported_volume_modes") or ["ro", "rw"] + if not isinstance(supported_modes, list) or not supported_modes: + supported_modes = ["ro", "rw"] + default_volume_mode = ( + metadata.get("default_volume_mode") + or defaults.get("volume_mode") + or supported_modes[0] + ) + return { + "name": workflow_name, + "version": metadata.get("version", "0.6.0"), + "description": metadata.get("description", ""), + "author": metadata.get("author"), + "tags": metadata.get("tags", []), + "parameters": metadata.get("parameters", {}), + "default_parameters": metadata.get("default_parameters", {}), + "required_modules": metadata.get("required_modules", []), + "supported_volume_modes": supported_modes, + "default_target_path": default_target_path, + "default_volume_mode": default_volume_mode, + "has_custom_docker": bool(info.has_docker), + } + + +@mcp.tool +async def list_workflows_mcp() -> Dict[str, Any]: + """List all discovered workflows and their metadata summary.""" + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "workflows": [], + "prefect": not_ready, + "message": "Prefect infrastructure is still initializing", + } + + workflows_summary = [] + for name, info in prefect_mgr.workflows.items(): + metadata = info.metadata + defaults = metadata.get("default_parameters", {}) + workflows_summary.append({ + "name": name, + "version": metadata.get("version", "0.6.0"), + "description": metadata.get("description", ""), + "author": metadata.get("author"), + "tags": metadata.get("tags", []), + "supported_volume_modes": metadata.get("supported_volume_modes", ["ro", "rw"]), + "default_volume_mode": metadata.get("default_volume_mode") + or defaults.get("volume_mode") + or "ro", + "default_target_path": metadata.get("default_target_path") + or defaults.get("target_path"), + "has_custom_docker": bool(info.has_docker), + }) + return {"workflows": workflows_summary, "prefect": get_prefect_status()} + + +@mcp.tool +async def get_workflow_metadata_mcp(workflow_name: str) -> Dict[str, Any]: + """Fetch detailed metadata for a workflow.""" + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "error": "Prefect infrastructure not ready", + "prefect": not_ready, + } + + data = _lookup_workflow(workflow_name) + if not data: + return {"error": f"Workflow not found: {workflow_name}"} + return data + + +@mcp.tool +async def get_workflow_parameters_mcp(workflow_name: str) -> Dict[str, Any]: + """Return the parameter schema and defaults for a workflow.""" + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "error": "Prefect infrastructure not ready", + "prefect": not_ready, + } + + data = _lookup_workflow(workflow_name) + if not data: + return {"error": f"Workflow not found: {workflow_name}"} + return { + "parameters": data.get("parameters", {}), + "defaults": data.get("default_parameters", {}), + } + + +@mcp.tool +async def get_workflow_metadata_schema_mcp() -> Dict[str, Any]: + """Return the JSON schema describing workflow metadata files.""" + return WorkflowDiscovery.get_metadata_schema() + + +@mcp.tool +async def submit_security_scan_mcp( + workflow_name: str, + target_path: str | None = None, + 
volume_mode: str | None = None, + parameters: Dict[str, Any] | None = None, +) -> Dict[str, Any] | Dict[str, str]: + """Submit a Prefect workflow via MCP.""" + try: + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "error": "Prefect infrastructure not ready", + "prefect": not_ready, + } + + workflow_info = prefect_mgr.workflows.get(workflow_name) + if not workflow_info: + return {"error": f"Workflow '{workflow_name}' not found"} + + metadata = workflow_info.metadata or {} + defaults = metadata.get("default_parameters", {}) + + resolved_target_path = target_path or metadata.get("default_target_path") or defaults.get("target_path") + if not resolved_target_path: + return { + "error": ( + "target_path is required and no default_target_path is defined in metadata" + ), + "metadata": { + "workflow": workflow_name, + "default_target_path": metadata.get("default_target_path"), + }, + } + + requested_volume_mode = volume_mode or metadata.get("default_volume_mode") or defaults.get("volume_mode") + if not requested_volume_mode: + requested_volume_mode = "ro" + + normalised_volume_mode = ( + str(requested_volume_mode).strip().lower().replace("-", "_") + ) + if normalised_volume_mode in {"read_only", "readonly", "ro"}: + normalised_volume_mode = "ro" + elif normalised_volume_mode in {"read_write", "readwrite", "rw"}: + normalised_volume_mode = "rw" + else: + supported_modes = metadata.get("supported_volume_modes", ["ro", "rw"]) + if isinstance(supported_modes, list) and normalised_volume_mode in supported_modes: + pass + else: + normalised_volume_mode = "ro" + + parameters = parameters or {} + + cleaned_parameters: Dict[str, Any] = {**defaults, **parameters} + + # Ensure *_config structures default to dicts so Prefect validation passes. + for key, value in list(cleaned_parameters.items()): + if isinstance(key, str) and key.endswith("_config") and value is None: + cleaned_parameters[key] = {} + + # Some workflows expect configuration dictionaries even when omitted. 
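+ # For example (illustrative key name): a workflow whose parameter schema + # declares an optional "fuzz_config" object should receive fuzz_config={} + # rather than None when the caller omits it, so Prefect's parameter + # validation passes.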
+ parameter_definitions = ( + metadata.get("parameters", {}).get("properties", {}) + if isinstance(metadata.get("parameters"), dict) + else {} + ) + for key, definition in parameter_definitions.items(): + if not isinstance(key, str) or not key.endswith("_config"): + continue + if key not in cleaned_parameters: + default_value = definition.get("default") if isinstance(definition, dict) else None + cleaned_parameters[key] = default_value if default_value is not None else {} + elif cleaned_parameters[key] is None: + cleaned_parameters[key] = {} + + flow_run = await prefect_mgr.submit_workflow( + workflow_name=workflow_name, + target_path=resolved_target_path, + volume_mode=normalised_volume_mode, + parameters=cleaned_parameters, + ) + + return { + "run_id": str(flow_run.id), + "status": flow_run.state.name if flow_run.state else "PENDING", + "workflow": workflow_name, + "message": f"Workflow '{workflow_name}' submitted successfully", + "target_path": resolved_target_path, + "volume_mode": normalised_volume_mode, + "parameters": cleaned_parameters, + "mcp_enabled": True, + } + except Exception as exc: # pragma: no cover - defensive logging + logger.exception("MCP submit failed") + return {"error": f"Failed to submit workflow: {exc}"} + + +@mcp.tool +async def get_comprehensive_scan_summary(run_id: str) -> Dict[str, Any] | Dict[str, str]: + """Return a summary for the given flow run via MCP.""" + try: + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "error": "Prefect infrastructure not ready", + "prefect": not_ready, + } + + status = await prefect_mgr.get_flow_run_status(run_id) + findings = await prefect_mgr.get_flow_run_findings(run_id) + + workflow_name = "unknown" + deployment_id = status.get("workflow", "") + for name, deployment in prefect_mgr.deployments.items(): + if str(deployment) == str(deployment_id): + workflow_name = name + break + + total_findings = 0 + severity_summary = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + + if findings and "sarif" in findings: + sarif = findings["sarif"] + if isinstance(sarif, dict): + total_findings = sarif.get("total_findings", 0) + + return { + "run_id": run_id, + "workflow": workflow_name, + "status": status.get("status", "unknown"), + "is_completed": status.get("is_completed", False), + "total_findings": total_findings, + "severity_summary": severity_summary, + "scan_duration": status.get("updated_at", "") + if status.get("is_completed") + else "In progress", + "recommendations": ( + [ + "Review high and critical severity findings first", + "Implement security fixes based on finding recommendations", + "Re-run scan after applying fixes to verify remediation", + ] + if total_findings > 0 + else ["No security issues found"] + ), + "mcp_analysis": True, + } + except Exception as exc: # pragma: no cover + logger.exception("MCP summary failed") + return {"error": f"Failed to summarize run: {exc}"} + + +@mcp.tool +async def get_run_status_mcp(run_id: str) -> Dict[str, Any]: + """Return current status information for a Prefect run.""" + try: + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "error": "Prefect infrastructure not ready", + "prefect": not_ready, + } + + status = await prefect_mgr.get_flow_run_status(run_id) + workflow_name = "unknown" + deployment_id = status.get("workflow", "") + for name, deployment in prefect_mgr.deployments.items(): + if str(deployment) == str(deployment_id): + workflow_name = name + break + + return { + "run_id": status["run_id"], + "workflow": workflow_name, 
+ "status": status["status"], + "is_completed": status["is_completed"], + "is_failed": status["is_failed"], + "is_running": status["is_running"], + "created_at": status["created_at"], + "updated_at": status["updated_at"], + } + except Exception as exc: + logger.exception("MCP run status failed") + return {"error": f"Failed to get run status: {exc}"} + + +@mcp.tool +async def get_run_findings_mcp(run_id: str) -> Dict[str, Any]: + """Return SARIF findings for a completed run.""" + try: + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "error": "Prefect infrastructure not ready", + "prefect": not_ready, + } + + status = await prefect_mgr.get_flow_run_status(run_id) + if not status.get("is_completed"): + return {"error": f"Run {run_id} not completed. Status: {status.get('status')}"} + + findings = await prefect_mgr.get_flow_run_findings(run_id) + + workflow_name = "unknown" + deployment_id = status.get("workflow", "") + for name, deployment in prefect_mgr.deployments.items(): + if str(deployment) == str(deployment_id): + workflow_name = name + break + + metadata = { + "completion_time": status.get("updated_at"), + "workflow_version": "unknown", + } + info = prefect_mgr.workflows.get(workflow_name) + if info: + metadata["workflow_version"] = info.metadata.get("version", "unknown") + + return { + "workflow": workflow_name, + "run_id": run_id, + "sarif": findings, + "metadata": metadata, + } + except Exception as exc: + logger.exception("MCP findings failed") + return {"error": f"Failed to retrieve findings: {exc}"} + + +@mcp.tool +async def list_recent_runs_mcp( + limit: int = 10, + workflow_name: str | None = None, + states: List[str] | None = None, +) -> Dict[str, Any]: + """List recent Prefect runs with optional workflow/state filters.""" + + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "runs": [], + "prefect": not_ready, + "message": "Prefect infrastructure is still initializing", + } + + try: + limit_value = int(limit) + except (TypeError, ValueError): + limit_value = 10 + limit_value = max(1, min(limit_value, 100)) + + deployment_map = { + str(deployment_id): workflow + for workflow, deployment_id in prefect_mgr.deployments.items() + } + + deployment_filter_value = None + if workflow_name: + deployment_id = prefect_mgr.deployments.get(workflow_name) + if not deployment_id: + return { + "runs": [], + "prefect": get_prefect_status(), + "error": f"Workflow '{workflow_name}' has no registered deployment", + } + try: + deployment_filter_value = UUID(str(deployment_id)) + except ValueError: + return { + "runs": [], + "prefect": get_prefect_status(), + "error": ( + f"Deployment id '{deployment_id}' for workflow '{workflow_name}' is invalid" + ), + } + + desired_state_types: List[StateType] = [] + if states: + for raw_state in states: + if not raw_state: + continue + normalised = raw_state.strip().upper() + if normalised == "ALL": + desired_state_types = [] + break + try: + desired_state_types.append(StateType[normalised]) + except KeyError: + continue + if not desired_state_types: + desired_state_types = [ + StateType.RUNNING, + StateType.COMPLETED, + StateType.FAILED, + StateType.CANCELLED, + ] + + flow_filter = FlowRunFilter() + if desired_state_types: + flow_filter.state = FlowRunFilterState( + type=FlowRunFilterStateType(any_=desired_state_types) + ) + if deployment_filter_value: + flow_filter.deployment_id = FlowRunFilterDeploymentId( + any_=[deployment_filter_value] + ) + + async with get_client() as client: + flow_runs = await 
client.read_flow_runs( + limit=limit_value, + flow_run_filter=flow_filter, + sort=FlowRunSort.START_TIME_DESC, + ) + + results: List[Dict[str, Any]] = [] + for flow_run in flow_runs: + deployment_id = getattr(flow_run, "deployment_id", None) + workflow = deployment_map.get(str(deployment_id), "unknown") + state = getattr(flow_run, "state", None) + state_name = getattr(state, "name", None) if state else None + state_type = getattr(state, "type", None) if state else None + + results.append( + { + "run_id": str(flow_run.id), + "workflow": workflow, + "deployment_id": str(deployment_id) if deployment_id else None, + "state": state_name or (state_type.name if state_type else None), + "state_type": state_type.name if state_type else None, + "is_completed": bool(getattr(state, "is_completed", lambda: False)()), + "is_running": bool(getattr(state, "is_running", lambda: False)()), + "is_failed": bool(getattr(state, "is_failed", lambda: False)()), + "created_at": getattr(flow_run, "created", None), + "updated_at": getattr(flow_run, "updated", None), + "expected_start_time": getattr(flow_run, "expected_start_time", None), + "start_time": getattr(flow_run, "start_time", None), + } + ) + + # Normalise datetimes to ISO 8601 strings for serialization + for entry in results: + for key in ("created_at", "updated_at", "expected_start_time", "start_time"): + value = entry.get(key) + if value is None: + continue + try: + entry[key] = value.isoformat() + except AttributeError: + entry[key] = str(value) + + return {"runs": results, "prefect": get_prefect_status()} + + +@mcp.tool +async def get_fuzzing_stats_mcp(run_id: str) -> Dict[str, Any]: + """Return fuzzing statistics for a run if available.""" + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "error": "Prefect infrastructure not ready", + "prefect": not_ready, + } + + stats = fuzzing.fuzzing_stats.get(run_id) + if not stats: + return {"error": f"Fuzzing run not found: {run_id}"} + # Be resilient if a plain dict slipped into the cache + if isinstance(stats, dict): + return stats + if hasattr(stats, "model_dump"): + return stats.model_dump() + if hasattr(stats, "dict"): + return stats.dict() + # Last resort + return getattr(stats, "__dict__", {"run_id": run_id}) + + +@mcp.tool +async def get_fuzzing_crash_reports_mcp(run_id: str) -> Dict[str, Any]: + """Return crash reports collected for a fuzzing run.""" + not_ready = _prefect_not_ready_status() + if not_ready: + return { + "error": "Prefect infrastructure not ready", + "prefect": not_ready, + } + + reports = fuzzing.crash_reports.get(run_id) + if reports is None: + return {"error": f"Fuzzing run not found: {run_id}"} + return {"run_id": run_id, "crashes": [report.model_dump() for report in reports]} + + +@mcp.tool +async def get_backend_status_mcp() -> Dict[str, Any]: + """Expose backend readiness, workflows, and registered MCP tools.""" + + status = get_prefect_status() + response: Dict[str, Any] = {"prefect": status} + + if status.get("ready"): + response["workflows"] = list(prefect_mgr.workflows.keys()) + + try: + tools = await mcp._tool_manager.list_tools() + response["mcp_tools"] = sorted(tool.name for tool in tools) + except Exception as exc: # pragma: no cover - defensive logging + logger.debug("Failed to enumerate MCP tools: %s", exc) + + return response + + +def create_mcp_transport_app() -> Starlette: + """Build a Starlette app serving HTTP + SSE transports on one port.""" + + http_app = mcp.http_app(path="/", transport="streamable-http") + sse_app = create_sse_app( + 
server=mcp, + message_path="/messages", + sse_path="/", + auth=mcp.auth, + ) + + routes = [ + Mount("/mcp", app=http_app), + Mount("/mcp/sse", app=sse_app), + ] + + @asynccontextmanager + async def lifespan(app: Starlette): # pragma: no cover - integration wiring + async with AsyncExitStack() as stack: + await stack.enter_async_context( + http_app.router.lifespan_context(http_app) + ) + await stack.enter_async_context( + sse_app.router.lifespan_context(sse_app) + ) + yield + + combined_app = Starlette(routes=routes, lifespan=lifespan) + combined_app.state.fastmcp_server = mcp + combined_app.state.http_app = http_app + combined_app.state.sse_app = sse_app + return combined_app + + +# --------------------------------------------------------------------------- +# Combined lifespan: Prefect init + dedicated MCP transports +# --------------------------------------------------------------------------- + +@asynccontextmanager +async def combined_lifespan(app: FastAPI): + global prefect_bootstrap_task, _fastapi_mcp_imported + + logger.info("Starting FuzzForge backend...") + + # Ensure FastAPI endpoints are exposed via MCP once + if not _fastapi_mcp_imported: + try: + await mcp.import_server(FASTAPI_MCP_ADAPTER) + _fastapi_mcp_imported = True + logger.info("Mounted FastAPI endpoints as MCP tools") + except Exception as exc: + logger.exception("Failed to import FastAPI endpoints into MCP", exc_info=exc) + + # Kick off Prefect bootstrap in the background if needed + if prefect_bootstrap_task is None or prefect_bootstrap_task.done(): + prefect_bootstrap_task = asyncio.create_task(_bootstrap_prefect_with_retries()) + logger.info("Prefect bootstrap task started") + else: + logger.info("Prefect bootstrap task already running") + + # Start MCP transports on shared port (HTTP + SSE) + mcp_app = create_mcp_transport_app() + mcp_config = uvicorn.Config( + app=mcp_app, + host="0.0.0.0", + port=8010, + log_level="info", + lifespan="on", + ) + mcp_server = uvicorn.Server(mcp_config) + mcp_server.install_signal_handlers = lambda: None # type: ignore[assignment] + mcp_task = asyncio.create_task(mcp_server.serve()) + + async def _wait_for_uvicorn_startup() -> None: + started_attr = getattr(mcp_server, "started", None) + if hasattr(started_attr, "wait"): + await asyncio.wait_for(started_attr.wait(), timeout=10) + return + + # Fallback for uvicorn versions where "started" is a bool + poll_interval = 0.1 + checks = int(10 / poll_interval) + for _ in range(checks): + if getattr(mcp_server, "started", False): + return + await asyncio.sleep(poll_interval) + raise asyncio.TimeoutError + + try: + await _wait_for_uvicorn_startup() + except asyncio.TimeoutError: # pragma: no cover - defensive logging + if mcp_task.done(): + raise RuntimeError("MCP server failed to start") from mcp_task.exception() + logger.warning("Timed out waiting for MCP server startup; continuing anyway") + + logger.info("MCP HTTP available at http://0.0.0.0:8010/mcp") + logger.info("MCP SSE available at http://0.0.0.0:8010/mcp/sse") + + try: + yield + finally: + logger.info("Shutting down MCP transports...") + mcp_server.should_exit = True + mcp_server.force_exit = True + await asyncio.gather(mcp_task, return_exceptions=True) + + if prefect_bootstrap_task and not prefect_bootstrap_task.done(): + prefect_bootstrap_task.cancel() + with suppress(asyncio.CancelledError): + await prefect_bootstrap_task + prefect_bootstrap_state.task_running = False + if not prefect_bootstrap_state.ready: + prefect_bootstrap_state.status = "stopped" + 
prefect_bootstrap_task = None + + logger.info("Shutting down Prefect statistics monitor...") + await prefect_stats_monitor.stop_monitoring() + logger.info("Shutting down FuzzForge backend...") + + +app.router.lifespan_context = combined_lifespan diff --git a/backend/src/models/__init__.py b/backend/src/models/__init__.py new file mode 100644 index 0000000..43bcfe7 --- /dev/null +++ b/backend/src/models/__init__.py @@ -0,0 +1,11 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + diff --git a/backend/src/models/findings.py b/backend/src/models/findings.py new file mode 100644 index 0000000..05385d9 --- /dev/null +++ b/backend/src/models/findings.py @@ -0,0 +1,182 @@ +""" +Models for workflow findings and submissions +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +from pydantic import BaseModel, Field, field_validator +from typing import Dict, Any, Optional, Literal, List +from datetime import datetime +from pathlib import Path + + +class WorkflowFindings(BaseModel): + """Findings from a workflow execution in SARIF format""" + workflow: str = Field(..., description="Workflow name") + run_id: str = Field(..., description="Unique run identifier") + sarif: Dict[str, Any] = Field(..., description="SARIF formatted findings") + metadata: Dict[str, Any] = Field(default_factory=dict, description="Additional metadata") + + +class ResourceLimits(BaseModel): + """Resource limits for workflow execution""" + cpu_limit: Optional[str] = Field(None, description="CPU limit (e.g., '2' for 2 cores, '500m' for 0.5 cores)") + memory_limit: Optional[str] = Field(None, description="Memory limit (e.g., '1Gi', '512Mi')") + cpu_request: Optional[str] = Field(None, description="CPU request (guaranteed)") + memory_request: Optional[str] = Field(None, description="Memory request (guaranteed)") + + +class VolumeMount(BaseModel): + """Volume mount specification""" + host_path: str = Field(..., description="Host path to mount") + container_path: str = Field(..., description="Container path for mount") + mode: Literal["ro", "rw"] = Field(default="ro", description="Mount mode") + + @field_validator("host_path") + @classmethod + def validate_host_path(cls, v): + """Validate that the host path is absolute (existence checked at runtime)""" + path = Path(v) + if not path.is_absolute(): + raise ValueError(f"Host path must be absolute: {v}") + # Note: Path existence is validated at workflow runtime + # We can't validate existence here as this runs inside Docker container + return str(path) + + @field_validator("container_path") + @classmethod + def validate_container_path(cls, v): + """Validate that the container path is
absolute""" + if not v.startswith('/'): + raise ValueError(f"Container path must be absolute: {v}") + return v + + +class WorkflowSubmission(BaseModel): + """Submit a workflow with configurable settings""" + target_path: str = Field(..., description="Absolute path to analyze") + volume_mode: Literal["ro", "rw"] = Field( + default="ro", + description="Volume mount mode: read-only (ro) or read-write (rw)" + ) + parameters: Dict[str, Any] = Field( + default_factory=dict, + description="Workflow-specific parameters" + ) + timeout: Optional[int] = Field( + default=None, # Allow workflow-specific defaults + description="Timeout in seconds (None for workflow default)", + ge=1, + le=604800 # Max 7 days to support fuzzing campaigns + ) + resource_limits: Optional[ResourceLimits] = Field( + None, + description="Resource limits for workflow container" + ) + additional_volumes: List[VolumeMount] = Field( + default_factory=list, + description="Additional volume mounts (e.g., for corpus, output directories)" + ) + + @field_validator("target_path") + @classmethod + def validate_path(cls, v): + """Validate that the target path is absolute (existence checked at runtime)""" + path = Path(v) + if not path.is_absolute(): + raise ValueError(f"Path must be absolute: {v}") + # Note: Path existence is validated at workflow runtime when volumes are mounted + # We can't validate existence here as this runs inside Docker container + return str(path) + + +class WorkflowStatus(BaseModel): + """Status of a workflow run""" + run_id: str = Field(..., description="Unique run identifier") + workflow: str = Field(..., description="Workflow name") + status: str = Field(..., description="Current status") + is_completed: bool = Field(..., description="Whether the run is completed") + is_failed: bool = Field(..., description="Whether the run failed") + is_running: bool = Field(..., description="Whether the run is currently running") + created_at: datetime = Field(..., description="Run creation time") + updated_at: datetime = Field(..., description="Last update time") + + +class WorkflowMetadata(BaseModel): + """Complete metadata for a workflow""" + name: str = Field(..., description="Workflow name") + version: str = Field(..., description="Semantic version") + description: str = Field(..., description="Workflow description") + author: Optional[str] = Field(None, description="Workflow author") + tags: List[str] = Field(default_factory=list, description="Workflow tags") + parameters: Dict[str, Any] = Field(..., description="Parameters schema") + default_parameters: Dict[str, Any] = Field( + default_factory=dict, + description="Default parameter values" + ) + required_modules: List[str] = Field( + default_factory=list, + description="Required module names" + ) + supported_volume_modes: List[Literal["ro", "rw"]] = Field( + default=["ro", "rw"], + description="Supported volume mount modes" + ) + has_custom_docker: bool = Field( + default=False, + description="Whether workflow has custom Dockerfile" + ) + + +class WorkflowListItem(BaseModel): + """Summary information for a workflow in list views""" + name: str = Field(..., description="Workflow name") + version: str = Field(..., description="Semantic version") + description: str = Field(..., description="Workflow description") + author: Optional[str] = Field(None, description="Workflow author") + tags: List[str] = Field(default_factory=list, description="Workflow tags") + + +class RunSubmissionResponse(BaseModel): + """Response after submitting a workflow""" + run_id: str = 
Field(..., description="Unique run identifier") + status: str = Field(..., description="Initial status") + workflow: str = Field(..., description="Workflow name") + message: str = Field(default="Workflow submitted successfully") + + +class FuzzingStats(BaseModel): + """Real-time fuzzing statistics""" + run_id: str = Field(..., description="Unique run identifier") + workflow: str = Field(..., description="Workflow name") + executions: int = Field(default=0, description="Total executions") + executions_per_sec: float = Field(default=0.0, description="Current execution rate") + crashes: int = Field(default=0, description="Total crashes found") + unique_crashes: int = Field(default=0, description="Unique crashes") + coverage: Optional[float] = Field(None, description="Code coverage percentage") + corpus_size: int = Field(default=0, description="Current corpus size") + elapsed_time: int = Field(default=0, description="Elapsed time in seconds") + last_crash_time: Optional[datetime] = Field(None, description="Time of last crash") + + +class CrashReport(BaseModel): + """Individual crash report from fuzzing""" + run_id: str = Field(..., description="Run identifier") + crash_id: str = Field(..., description="Unique crash identifier") + timestamp: datetime = Field(default_factory=datetime.utcnow) + signal: Optional[str] = Field(None, description="Crash signal (SIGSEGV, etc.)") + crash_type: Optional[str] = Field(None, description="Type of crash") + stack_trace: Optional[str] = Field(None, description="Stack trace") + input_file: Optional[str] = Field(None, description="Path to crashing input") + reproducer: Optional[str] = Field(None, description="Minimized reproducer") + severity: str = Field(default="medium", description="Crash severity") + exploitability: Optional[str] = Field(None, description="Exploitability assessment") \ No newline at end of file diff --git a/backend/src/services/prefect_stats_monitor.py b/backend/src/services/prefect_stats_monitor.py new file mode 100644 index 0000000..a46d88a --- /dev/null +++ b/backend/src/services/prefect_stats_monitor.py @@ -0,0 +1,394 @@ +""" +Generic Prefect Statistics Monitor Service + +This service monitors ALL workflows for structured live data logging and +updates the appropriate statistics APIs. Works with any workflow that follows +the standard LIVE_STATS logging pattern. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
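+ +# A workflow that follows the LIVE_STATS pattern emits log lines such as the +# following (illustrative values; the marker and the stats_type key are what +# the monitor keys on): +# +#     logger.info( +#         "LIVE_STATS extra={'stats_type': 'fuzzing_live_update', " +#         "'executions': 1000, 'executions_per_sec': 250.0, 'crashes': 1, " +#         "'unique_crashes': 1, 'corpus_size': 42}" +#     ) +# +# _parse_stats_from_log() below recognises both this message format and +# structured `extra` fields attached to the log record.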
+ + +import asyncio +import json +import logging +from datetime import datetime, timedelta, timezone +from typing import Dict, Any, Optional +from prefect.client.orchestration import get_client +from prefect.client.schemas.objects import FlowRun, TaskRun +from src.models.findings import FuzzingStats +from src.api.fuzzing import fuzzing_stats, initialize_fuzzing_tracking, active_connections + +logger = logging.getLogger(__name__) + + +class PrefectStatsMonitor: + """Monitors Prefect flows and tasks for live statistics from any workflow""" + + def __init__(self): + self.monitoring = False + self.monitor_task = None + self.monitored_runs = set() + self.last_log_ts: Dict[str, datetime] = {} + self._client = None + self._client_refresh_time = None + self._client_refresh_interval = 300 # Refresh connection every 5 minutes + + async def start_monitoring(self): + """Start the Prefect statistics monitoring service""" + if self.monitoring: + logger.warning("Prefect stats monitor already running") + return + + self.monitoring = True + self.monitor_task = asyncio.create_task(self._monitor_flows()) + logger.info("Started Prefect statistics monitor") + + async def stop_monitoring(self): + """Stop the monitoring service""" + self.monitoring = False + if self.monitor_task: + self.monitor_task.cancel() + try: + await self.monitor_task + except asyncio.CancelledError: + pass + logger.info("Stopped Prefect statistics monitor") + + async def _get_or_refresh_client(self): + """Get or refresh Prefect client with connection pooling.""" + now = datetime.now(timezone.utc) + + if (self._client is None or + self._client_refresh_time is None or + (now - self._client_refresh_time).total_seconds() > self._client_refresh_interval): + + if self._client: + try: + await self._client.aclose() + except Exception: + pass + + self._client = get_client() + self._client_refresh_time = now + await self._client.__aenter__() + + return self._client + + async def _monitor_flows(self): + """Main monitoring loop that watches Prefect flows""" + try: + while self.monitoring: + try: + # Use connection pooling for better performance + client = await self._get_or_refresh_client() + + # Get recent flow runs (limit to reduce load) + flow_runs = await client.read_flow_runs( + limit=50, + sort="START_TIME_DESC", + ) + + # Only consider runs from the last 15 minutes + recent_cutoff = datetime.now(timezone.utc) - timedelta(minutes=15) + for flow_run in flow_runs: + created = getattr(flow_run, "created", None) + if created is None: + continue + try: + # Ensure timezone-aware comparison + if created.tzinfo is None: + created = created.replace(tzinfo=timezone.utc) + if created >= recent_cutoff: + await self._monitor_flow_run(client, flow_run) + except Exception: + # If comparison fails, attempt monitoring anyway + await self._monitor_flow_run(client, flow_run) + + await asyncio.sleep(5) # Check every 5 seconds + + except Exception as e: + logger.error(f"Error in Prefect monitoring: {e}") + await asyncio.sleep(10) + + except asyncio.CancelledError: + logger.info("Prefect monitoring cancelled") + except Exception as e: + logger.error(f"Fatal error in Prefect monitoring: {e}") + finally: + # Clean up client on exit + if self._client: + try: + await self._client.__aexit__(None, None, None) + except Exception: + pass + self._client = None + + async def _monitor_flow_run(self, client, flow_run: FlowRun): + """Monitor a specific flow run for statistics""" + run_id = str(flow_run.id) + workflow_name = flow_run.name or "unknown" + + try: + # Initialize 
tracking if not exists - only for workflows that might have live stats + if run_id not in fuzzing_stats: + initialize_fuzzing_tracking(run_id, workflow_name) + self.monitored_runs.add(run_id) + + # Skip corrupted entries (should not happen after startup cleanup, but defensive) + elif not isinstance(fuzzing_stats[run_id], FuzzingStats): + logger.warning(f"Skipping corrupted stats entry for {run_id}, reinitializing") + initialize_fuzzing_tracking(run_id, workflow_name) + self.monitored_runs.add(run_id) + + # Get task runs for this flow + task_runs = await client.read_task_runs( + flow_run_filter={"id": {"any_": [flow_run.id]}}, + limit=25, + ) + + # Check all tasks for live statistics logging + for task_run in task_runs: + await self._extract_stats_from_task(client, run_id, task_run, workflow_name) + + # Also scan flow-level logs as a fallback + await self._extract_stats_from_flow_logs(client, run_id, flow_run, workflow_name) + + except Exception as e: + logger.warning(f"Error monitoring flow run {run_id}: {e}") + + async def _extract_stats_from_task(self, client, run_id: str, task_run: TaskRun, workflow_name: str): + """Extract statistics from any task that logs live stats""" + try: + # Get task run logs + logs = await client.read_logs( + log_filter={ + "task_run_id": {"any_": [task_run.id]} + }, + limit=100, + sort="TIMESTAMP_ASC" + ) + + # Parse logs for LIVE_STATS entries (generic pattern for any workflow) + latest_stats = None + for log in logs: + # Prefer structured extra field if present + extra_data = getattr(log, "extra", None) or getattr(log, "extra_fields", None) or None + if isinstance(extra_data, dict): + stat_type = extra_data.get("stats_type") + if stat_type in ["fuzzing_live_update", "scan_progress", "analysis_update", "live_stats"]: + latest_stats = extra_data + continue + + # Fallback to parsing from message text + if ("FUZZ_STATS" in log.message or "LIVE_STATS" in log.message): + stats = self._parse_stats_from_log(log.message) + if stats: + latest_stats = stats + + # Update statistics if we found any + if latest_stats: + # Calculate elapsed time from task start + elapsed_time = 0 + if task_run.start_time: + # Ensure timezone-aware arithmetic + now = datetime.now(timezone.utc) + try: + elapsed_time = int((now - task_run.start_time).total_seconds()) + except Exception: + # Fallback to naive UTC if types mismatch + elapsed_time = int((datetime.utcnow() - task_run.start_time.replace(tzinfo=None)).total_seconds()) + + updated_stats = FuzzingStats( + run_id=run_id, + workflow=workflow_name, + executions=latest_stats.get("executions", 0), + executions_per_sec=latest_stats.get("executions_per_sec", 0.0), + crashes=latest_stats.get("crashes", 0), + unique_crashes=latest_stats.get("unique_crashes", 0), + corpus_size=latest_stats.get("corpus_size", 0), + elapsed_time=elapsed_time + ) + + # Update the global stats + fuzzing_stats[run_id] = updated_stats + + # Broadcast to any active WebSocket clients for this run + if active_connections.get(run_id): + # Handle both Pydantic objects and plain dicts + if isinstance(updated_stats, dict): + stats_data = updated_stats + elif hasattr(updated_stats, 'model_dump'): + stats_data = updated_stats.model_dump() + elif hasattr(updated_stats, 'dict'): + stats_data = updated_stats.dict() + else: + stats_data = updated_stats.__dict__ + + message = { + "type": "stats_update", + "data": stats_data, + } + disconnected = [] + for ws in active_connections[run_id]: + try: + await ws.send_text(json.dumps(message)) + 
except Exception: + disconnected.append(ws) + # Clean up disconnected sockets + for ws in disconnected: + try: + active_connections[run_id].remove(ws) + except ValueError: + pass + + logger.debug(f"Updated Prefect stats for {run_id}: {updated_stats.executions} execs") + + except Exception as e: + logger.warning(f"Error extracting stats from task {task_run.id}: {e}") + + async def _extract_stats_from_flow_logs(self, client, run_id: str, flow_run: FlowRun, workflow_name: str): + """Extract statistics by scanning flow-level logs for LIVE/FUZZ stats""" + try: + logs = await client.read_logs( + log_filter={ + "flow_run_id": {"any_": [flow_run.id]} + }, + limit=200, + sort="TIMESTAMP_ASC" + ) + + latest_stats = None + last_seen = self.last_log_ts.get(run_id) + max_ts = last_seen + + for log in logs: + # Skip logs we've already processed + ts = getattr(log, "timestamp", None) + if last_seen and ts and ts <= last_seen: + continue + if ts and (max_ts is None or ts > max_ts): + max_ts = ts + + # Prefer structured extra field if available + extra_data = getattr(log, "extra", None) or getattr(log, "extra_fields", None) or None + if isinstance(extra_data, dict): + stat_type = extra_data.get("stats_type") + if stat_type in ["fuzzing_live_update", "scan_progress", "analysis_update", "live_stats"]: + latest_stats = extra_data + continue + + # Fallback to message parse + if ("FUZZ_STATS" in log.message or "LIVE_STATS" in log.message): + stats = self._parse_stats_from_log(log.message) + if stats: + latest_stats = stats + + if max_ts: + self.last_log_ts[run_id] = max_ts + + if latest_stats: + # Use flow_run timestamps for elapsed time if available + elapsed_time = 0 + start_time = getattr(flow_run, "start_time", None) + if start_time: + now = datetime.now(timezone.utc) + try: + if start_time.tzinfo is None: + start_time = start_time.replace(tzinfo=timezone.utc) + elapsed_time = int((now - start_time).total_seconds()) + except Exception: + elapsed_time = int((datetime.utcnow() - start_time.replace(tzinfo=None)).total_seconds()) + + updated_stats = FuzzingStats( + run_id=run_id, + workflow=workflow_name, + executions=latest_stats.get("executions", 0), + executions_per_sec=latest_stats.get("executions_per_sec", 0.0), + crashes=latest_stats.get("crashes", 0), + unique_crashes=latest_stats.get("unique_crashes", 0), + corpus_size=latest_stats.get("corpus_size", 0), + elapsed_time=elapsed_time + ) + + fuzzing_stats[run_id] = updated_stats + + # Broadcast if listeners exist + if active_connections.get(run_id): + # Handle both Pydantic objects and plain dicts + if isinstance(updated_stats, dict): + stats_data = updated_stats + elif hasattr(updated_stats, 'model_dump'): + stats_data = updated_stats.model_dump() + elif hasattr(updated_stats, 'dict'): + stats_data = updated_stats.dict() + else: + stats_data = updated_stats.__dict__ + + message = { + "type": "stats_update", + "data": stats_data, + } + disconnected = [] + for ws in active_connections[run_id]: + try: + await ws.send_text(json.dumps(message)) + except Exception: + disconnected.append(ws) + for ws in disconnected: + try: + active_connections[run_id].remove(ws) + except ValueError: + pass + + except Exception as e: + logger.warning(f"Error extracting stats from flow logs {run_id}: {e}") + + def _parse_stats_from_log(self, log_message: str) -> Optional[Dict[str, Any]]: + """Parse statistics from a log message""" + try: + import re + + # Prefer explicit JSON after marker tokens + m = 
re.search(r'(?:FUZZ_STATS|LIVE_STATS)\s+(\{.*\})', log_message) + if m: + try: + return json.loads(m.group(1)) + except Exception: + pass + + # Fallback: Extract the extra= dict and coerce to JSON + stats_match = re.search(r'extra=({.*?})', log_message) + if not stats_match: + return None + + extra_str = stats_match.group(1) + extra_str = extra_str.replace("'", '"') + extra_str = extra_str.replace('None', 'null') + extra_str = extra_str.replace('True', 'true') + extra_str = extra_str.replace('False', 'false') + + stats_data = json.loads(extra_str) + + # Support multiple stat types for different workflows + stat_type = stats_data.get("stats_type") + if stat_type in ["fuzzing_live_update", "scan_progress", "analysis_update", "live_stats"]: + return stats_data + + except Exception as e: + logger.debug(f"Error parsing log stats: {e}") + + return None + + +# Global instance +prefect_stats_monitor = PrefectStatsMonitor() diff --git a/backend/tests/conftest.py b/backend/tests/conftest.py new file mode 100644 index 0000000..7ab7ec3 --- /dev/null +++ b/backend/tests/conftest.py @@ -0,0 +1,19 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +import sys +from pathlib import Path + +# Ensure project root is on sys.path so `src` is importable +ROOT = Path(__file__).resolve().parents[1] +if str(ROOT) not in sys.path: + sys.path.insert(0, str(ROOT)) + diff --git a/backend/tests/test_prefect_stats_monitor.py b/backend/tests/test_prefect_stats_monitor.py new file mode 100644 index 0000000..16c29df --- /dev/null +++ b/backend/tests/test_prefect_stats_monitor.py @@ -0,0 +1,82 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
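+ +# These tests drive PrefectStatsMonitor's log-parsing path with fakes, so no +# live Prefect server is needed. With the sys.path shim in tests/conftest.py +# they can be run from the backend/ directory, e.g.: +#   pytest tests/test_prefect_stats_monitor.py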
+ +import asyncio +from datetime import datetime, timezone, timedelta + + +from src.services.prefect_stats_monitor import PrefectStatsMonitor +from src.api import fuzzing + + +class FakeLog: + def __init__(self, message: str): + self.message = message + + +class FakeClient: + def __init__(self, logs): + self._logs = logs + + async def read_logs(self, log_filter=None, limit=100, sort="TIMESTAMP_ASC"): + return self._logs + + +class FakeTaskRun: + def __init__(self): + self.id = "task-1" + self.start_time = datetime.now(timezone.utc) - timedelta(seconds=5) + + +def test_parse_stats_from_log_fuzzing(): + mon = PrefectStatsMonitor() + msg = ( + "INFO LIVE_STATS extra={'stats_type': 'fuzzing_live_update', " + "'executions': 42, 'executions_per_sec': 3.14, 'crashes': 1, 'unique_crashes': 1, 'corpus_size': 9}" + ) + stats = mon._parse_stats_from_log(msg) + assert stats is not None + assert stats["stats_type"] == "fuzzing_live_update" + assert stats["executions"] == 42 + + +def test_extract_stats_updates_and_broadcasts(): + mon = PrefectStatsMonitor() + run_id = "run-123" + workflow = "wf" + fuzzing.initialize_fuzzing_tracking(run_id, workflow) + + # Prepare a fake websocket to capture messages + sent = [] + + class FakeWS: + async def send_text(self, text: str): + sent.append(text) + + fuzzing.active_connections[run_id] = [FakeWS()] + + # Craft a log line the parser understands + msg = ( + "INFO LIVE_STATS extra={'stats_type': 'fuzzing_live_update', " + "'executions': 10, 'executions_per_sec': 1.5, 'crashes': 0, 'unique_crashes': 0, 'corpus_size': 2}" + ) + fake_client = FakeClient([FakeLog(msg)]) + task_run = FakeTaskRun() + + asyncio.run(mon._extract_stats_from_task(fake_client, run_id, task_run, workflow)) + + # Verify stats updated + stats = fuzzing.fuzzing_stats[run_id] + assert stats.executions == 10 + assert stats.executions_per_sec == 1.5 + + # Verify a message was sent to WebSocket + assert sent, "Expected a stats_update message to be sent" diff --git a/backend/toolbox/__init__.py b/backend/toolbox/__init__.py new file mode 100644 index 0000000..43bcfe7 --- /dev/null +++ b/backend/toolbox/__init__.py @@ -0,0 +1,11 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + diff --git a/backend/toolbox/modules/__init__.py b/backend/toolbox/modules/__init__.py new file mode 100644 index 0000000..43bcfe7 --- /dev/null +++ b/backend/toolbox/modules/__init__.py @@ -0,0 +1,11 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
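+ +"""Security analysis modules used by FuzzForge workflows."""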
+ diff --git a/backend/toolbox/modules/ai_security/__init__.py b/backend/toolbox/modules/ai_security/__init__.py new file mode 100644 index 0000000..23a29dc --- /dev/null +++ b/backend/toolbox/modules/ai_security/__init__.py @@ -0,0 +1,37 @@ +""" +AI Security Modules + +This package contains modules for AI and machine learning model security testing. + +Available modules: +- Garak: LLM/AI model security testing framework for prompt injection, bias, and jailbreaks +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +from typing import List, Type +from ..base import BaseModule + +# Module registry for automatic discovery +AI_SECURITY_MODULES: List[Type[BaseModule]] = [] + +def register_module(module_class: Type[BaseModule]): + """Register an AI security module""" + AI_SECURITY_MODULES.append(module_class) + return module_class + +def get_available_modules() -> List[Type[BaseModule]]: + """Get all available AI security modules""" + return AI_SECURITY_MODULES.copy() + +# Import modules to trigger registration +from .garak import GarakModule \ No newline at end of file diff --git a/backend/toolbox/modules/ai_security/garak.py b/backend/toolbox/modules/ai_security/garak.py new file mode 100644 index 0000000..335382f --- /dev/null +++ b/backend/toolbox/modules/ai_security/garak.py @@ -0,0 +1,526 @@ +""" +Garak AI Security Module + +This module uses Garak for AI red-teaming and LLM vulnerability assessment, +testing for prompt injection, bias, jailbreaks, and other AI-specific security issues. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +import os +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module + + logger = logging.getLogger(__name__) + + + @register_module + class GarakModule(BaseModule): + """Garak AI red-teaming and LLM vulnerability assessment module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="garak", + version="0.9.0", + description="AI red-teaming framework for testing LLM vulnerabilities including prompt injection, bias, and jailbreaks", + author="FuzzForge Team", + category="ai_security", + tags=["ai", "llm", "prompt-injection", "bias", "jailbreak", "red-team"], + input_schema={ + "type": "object", + "properties": { + "model_type": { + "type": "string", + "enum": ["openai", "huggingface", "anthropic", "local"], + "description": "Type of LLM to test" + }, + "model_name": { + "type": "string", + "description": "Name/path of the model to test" + }, + "api_key": { + "type": "string", + "description": "API key for cloud models (if required)" + }, + "probes": { + "type": "array", + "items": {"type": "string"}, + "default": ["encoding", "promptinject", "malwaregen", "dan"], + "description": "Probe types to run" + }, + "generations": { + "type": "integer", + "default": 10, + "description": "Number of generations per probe" + }, + "detectors": { + "type": "array", + "items": {"type": "string"}, + "description": "Detectors to use for evaluation" + }, + "config_file": { + "type": "string", + "description": "Path to Garak configuration file" + }, + "report_prefix": { + "type": "string", + "default": "garak", + "description": "Prefix for report files" + }, + "parallel_requests": { + "type": "integer", + "default": 1, + "description": "Number of parallel requests" + }, + "temperature": { + "type": "number", + "default": 0.7, + "description": "Model temperature setting" + }, + "max_tokens": { + "type": "integer", + "default": 150, + "description": "Maximum tokens per generation" + }, + "seed": { + "type": "integer", + "description": "Random seed for reproducibility" + }, + "verbose": { + "type": "boolean", + "default": False, + "description": "Enable verbose output" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "probe_name": {"type": "string"}, + "vulnerability_type": {"type": "string"}, + "success_rate": {"type": "number"}, + "prompt": {"type": "string"}, + "response": {"type": "string"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + model_type = config.get("model_type") + if not model_type: + raise ValueError("model_type is required") + + model_name = config.get("model_name") + if not model_name: + raise ValueError("model_name is required") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute Garak AI security testing""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info("Running Garak AI security assessment") + + # Check Garak installation + await self._check_garak_installation() + + # Run Garak testing + findings = await self._run_garak_assessment(config, workspace) + + # Create summary + summary = self._create_summary(findings) + + logger.info(f"Garak found {len(findings)} AI security issues") + + return self.create_result( + findings=findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"Garak module failed: {e}") + return 
self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + async def _check_garak_installation(self): + """Check if Garak is installed""" + try: + process = await asyncio.create_subprocess_exec( + "python", "-c", "import garak; print(garak.__version__)", + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + # Try installing if not available + logger.info("Garak not found, attempting installation...") + install_process = await asyncio.create_subprocess_exec( + "pip", "install", "garak", + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + await install_process.communicate() + + except Exception as e: + logger.warning(f"Garak installation check failed: {e}") + + async def _run_garak_assessment(self, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run Garak AI security assessment""" + findings = [] + + try: + # Build Garak command + cmd = ["python", "-m", "garak"] + + # Add model configuration + cmd.extend(["--model_type", config["model_type"]]) + cmd.extend(["--model_name", config["model_name"]]) + + # Add API key if provided + api_key = config.get("api_key") + if api_key: + # Set environment variable instead of command line for security + os.environ["GARAK_API_KEY"] = api_key + + # Add probes + probes = config.get("probes", ["encoding", "promptinject"]) + for probe in probes: + cmd.extend(["--probes", probe]) + + # Add generations + generations = config.get("generations", 10) + cmd.extend(["--generations", str(generations)]) + + # Add detectors if specified + detectors = config.get("detectors", []) + for detector in detectors: + cmd.extend(["--detectors", detector]) + + # Add parallel requests + parallel = config.get("parallel_requests", 1) + if parallel > 1: + cmd.extend(["--parallel_requests", str(parallel)]) + + # Add model parameters + temperature = config.get("temperature", 0.7) + cmd.extend(["--temperature", str(temperature)]) + + max_tokens = config.get("max_tokens", 150) + cmd.extend(["--max_tokens", str(max_tokens)]) + + # Add seed for reproducibility + seed = config.get("seed") + if seed: + cmd.extend(["--seed", str(seed)]) + + # Add configuration file + config_file = config.get("config_file") + if config_file: + config_path = workspace / config_file + if config_path.exists(): + cmd.extend(["--config", str(config_path)]) + + # Set output directory + output_dir = workspace / "garak_output" + output_dir.mkdir(exist_ok=True) + cmd.extend(["--report_prefix", str(output_dir / config.get("report_prefix", "garak"))]) + + # Add verbose flag + if config.get("verbose", False): + cmd.append("--verbose") + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run Garak + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace + ) + + stdout, stderr = await process.communicate() + + # Parse results + findings = self._parse_garak_results(output_dir, workspace, stdout.decode(), stderr.decode()) + + except Exception as e: + logger.warning(f"Error running Garak assessment: {e}") + + return findings + + def _parse_garak_results(self, output_dir: Path, workspace: Path, stdout: str, stderr: str) -> List[ModuleFinding]: + """Parse Garak output for findings""" + findings = [] + + try: + # Look for JSON report files + report_files = list(output_dir.glob("*.report.jsonl")) + + for report_file in report_files: + findings.extend(self._parse_report_file(report_file, 
workspace)) + + # If no report files, try to parse stdout + if not findings: + findings = self._parse_stdout_output(stdout, stderr, workspace) + + except Exception as e: + logger.warning(f"Error parsing Garak results: {e}") + + return findings + + def _parse_report_file(self, report_file: Path, workspace: Path) -> List[ModuleFinding]: + """Parse Garak JSONL report file""" + findings = [] + + try: + with open(report_file, 'r') as f: + for line in f: + if line.strip(): + data = json.loads(line) + finding = self._create_garak_finding(data, workspace, report_file) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing report file {report_file}: {e}") + + return findings + + def _parse_stdout_output(self, stdout: str, stderr: str, workspace: Path) -> List[ModuleFinding]: + """Parse Garak stdout output""" + findings = [] + + try: + # Look for vulnerability indicators in output + lines = stdout.split('\n') + stderr.split('\n') + + for line in lines: + if any(indicator in line.lower() for indicator in [ + "vulnerability", "injection", "jailbreak", "bias", "harmful" + ]): + # Create a basic finding from the output line + finding = self._create_basic_finding(line, workspace) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing Garak stdout: {e}") + + return findings + + def _create_garak_finding(self, data: Dict[str, Any], workspace: Path, report_file: Path) -> ModuleFinding: + """Create finding from Garak report data""" + try: + # Extract information from Garak data structure + probe_name = data.get("probe", "unknown") + detector_name = data.get("detector", "unknown") + passed = data.get("passed", True) + prompt = data.get("prompt", "")[:500] # Limit length + response = data.get("response", "")[:500] # Limit length + + # Only create findings for failed tests (vulnerabilities found) + if passed: + return None + + # Determine vulnerability type and severity + vulnerability_type = self._get_vulnerability_type(probe_name, detector_name) + severity = self._get_vulnerability_severity(vulnerability_type, probe_name) + + # Create relative path + try: + rel_path = report_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(report_file) + + finding = self.create_finding( + title=f"AI Security Issue: {vulnerability_type}", + description=f"Garak detected a {vulnerability_type} vulnerability using probe '{probe_name}' and detector '{detector_name}'", + severity=severity, + category=self._get_ai_security_category(vulnerability_type), + file_path=file_path, + recommendation=self._get_ai_security_recommendation(vulnerability_type, probe_name), + metadata={ + "probe_name": probe_name, + "detector_name": detector_name, + "vulnerability_type": vulnerability_type, + "prompt_preview": prompt, + "response_preview": response, + "passed": passed, + "fuzzer": "garak" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating Garak finding: {e}") + return None + + def _create_basic_finding(self, line: str, workspace: Path) -> ModuleFinding: + """Create basic finding from output line""" + try: + # Extract basic information from line + vulnerability_type = "ai_security_issue" + if "injection" in line.lower(): + vulnerability_type = "prompt_injection" + elif "jailbreak" in line.lower(): + vulnerability_type = "jailbreak_attempt" + elif "bias" in line.lower(): + vulnerability_type = "bias_detection" + + finding = self.create_finding( + title=f"AI Security Detection: 
{vulnerability_type.replace('_', ' ').title()}", + description=f"Garak detected potential AI security issue: {line.strip()}", + severity="medium", + category=self._get_ai_security_category(vulnerability_type), + file_path=None, + recommendation=self._get_ai_security_recommendation(vulnerability_type, "general"), + metadata={ + "vulnerability_type": vulnerability_type, + "detection_line": line.strip(), + "fuzzer": "garak" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating basic finding: {e}") + return None + + def _get_vulnerability_type(self, probe_name: str, detector_name: str) -> str: + """Determine vulnerability type from probe and detector names""" + probe_lower = probe_name.lower() + detector_lower = detector_name.lower() + + if "inject" in probe_lower or "inject" in detector_lower: + return "prompt_injection" + elif "jailbreak" in probe_lower or "dan" in probe_lower: + return "jailbreak_attempt" + elif "bias" in probe_lower or "bias" in detector_lower: + return "bias_detection" + elif "malware" in probe_lower or "harmful" in detector_lower: + return "harmful_content_generation" + elif "encoding" in probe_lower: + return "encoding_vulnerability" + elif "leak" in probe_lower: + return "data_leakage" + else: + return "ai_security_vulnerability" + + def _get_vulnerability_severity(self, vulnerability_type: str, probe_name: str) -> str: + """Determine severity based on vulnerability type""" + if vulnerability_type in ["prompt_injection", "jailbreak_attempt"]: + return "high" + elif vulnerability_type in ["harmful_content_generation", "data_leakage"]: + return "high" + elif vulnerability_type in ["bias_detection", "encoding_vulnerability"]: + return "medium" + else: + return "medium" + + def _get_ai_security_category(self, vulnerability_type: str) -> str: + """Get category for AI security vulnerability""" + if "injection" in vulnerability_type: + return "prompt_injection" + elif "jailbreak" in vulnerability_type: + return "jailbreak_attack" + elif "bias" in vulnerability_type: + return "algorithmic_bias" + elif "harmful" in vulnerability_type or "malware" in vulnerability_type: + return "harmful_content" + elif "leak" in vulnerability_type: + return "data_leakage" + elif "encoding" in vulnerability_type: + return "input_manipulation" + else: + return "ai_security" + + def _get_ai_security_recommendation(self, vulnerability_type: str, probe_name: str) -> str: + """Get recommendation for AI security vulnerability""" + if "injection" in vulnerability_type: + return "Implement robust input validation, prompt sanitization, and use structured prompts to prevent injection attacks. Consider implementing content filtering and output validation." + elif "jailbreak" in vulnerability_type: + return "Strengthen model alignment and safety measures. Implement content filtering, use constitutional AI techniques, and add safety classifiers for output validation." + elif "bias" in vulnerability_type: + return "Review training data for bias, implement fairness constraints, use debiasing techniques, and conduct regular bias audits across different demographic groups." + elif "harmful" in vulnerability_type: + return "Implement strict content policies, use safety classifiers, add human oversight for sensitive outputs, and refuse to generate harmful content." + elif "leak" in vulnerability_type: + return "Review data handling practices, implement data anonymization, use differential privacy techniques, and audit model responses for sensitive information disclosure." 
+ elif "encoding" in vulnerability_type: + return "Normalize and validate all input encodings, implement proper character filtering, and use encoding-aware input processing." + else: + return f"Address the {vulnerability_type} vulnerability by implementing appropriate AI safety measures, input validation, and output monitoring." + + def _create_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + vulnerability_counts = {} + probe_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by vulnerability type + vuln_type = finding.metadata.get("vulnerability_type", "unknown") + vulnerability_counts[vuln_type] = vulnerability_counts.get(vuln_type, 0) + 1 + + # Count by probe + probe = finding.metadata.get("probe_name", "unknown") + probe_counts[probe] = probe_counts.get(probe, 0) + 1 + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts, + "vulnerability_counts": vulnerability_counts, + "probe_counts": probe_counts, + "ai_security_issues": len(findings), + "high_risk_vulnerabilities": severity_counts.get("high", 0) + severity_counts.get("critical", 0) + } \ No newline at end of file diff --git a/backend/toolbox/modules/analyzer/__init__.py b/backend/toolbox/modules/analyzer/__init__.py new file mode 100644 index 0000000..527dab7 --- /dev/null +++ b/backend/toolbox/modules/analyzer/__init__.py @@ -0,0 +1,14 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +from .security_analyzer import SecurityAnalyzer + +__all__ = ["SecurityAnalyzer"] \ No newline at end of file diff --git a/backend/toolbox/modules/analyzer/security_analyzer.py b/backend/toolbox/modules/analyzer/security_analyzer.py new file mode 100644 index 0000000..8688c18 --- /dev/null +++ b/backend/toolbox/modules/analyzer/security_analyzer.py @@ -0,0 +1,368 @@ +""" +Security Analyzer Module - Analyzes code for security vulnerabilities +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
+ +import logging +import re +from pathlib import Path +from typing import Dict, Any, List, Optional + +try: + from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding +except ImportError: + try: + from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding + except ImportError: + from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding + +logger = logging.getLogger(__name__) + + +class SecurityAnalyzer(BaseModule): + """ + Analyzes source code for common security vulnerabilities. + + This module: + - Detects hardcoded secrets and credentials + - Identifies dangerous function calls + - Finds SQL injection vulnerabilities + - Detects insecure configurations + """ + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="security_analyzer", + version="1.0.0", + description="Analyzes code for security vulnerabilities", + author="FuzzForge Team", + category="analyzer", + tags=["security", "vulnerabilities", "static-analysis"], + input_schema={ + "file_extensions": { + "type": "array", + "items": {"type": "string"}, + "description": "File extensions to analyze", + "default": [".py", ".js", ".java", ".php", ".rb", ".go"] + }, + "check_secrets": { + "type": "boolean", + "description": "Check for hardcoded secrets", + "default": True + }, + "check_sql": { + "type": "boolean", + "description": "Check for SQL injection risks", + "default": True + }, + "check_dangerous_functions": { + "type": "boolean", + "description": "Check for dangerous function calls", + "default": True + } + }, + output_schema={ + "findings": { + "type": "array", + "description": "List of security findings" + } + }, + requires_workspace=True + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate module configuration""" + extensions = config.get("file_extensions", []) + if not isinstance(extensions, list): + raise ValueError("file_extensions must be a list") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """ + Execute the security analysis module. 
+ + Args: + config: Module configuration + workspace: Path to the workspace directory + + Returns: + ModuleResult with security findings + """ + self.start_timer() + self.validate_workspace(workspace) + self.validate_config(config) + + findings = [] + files_analyzed = 0 + + # Get configuration + file_extensions = config.get("file_extensions", [".py", ".js", ".java", ".php", ".rb", ".go"]) + check_secrets = config.get("check_secrets", True) + check_sql = config.get("check_sql", True) + check_dangerous = config.get("check_dangerous_functions", True) + + logger.info(f"Analyzing files with extensions: {file_extensions}") + + try: + # Analyze each file + for ext in file_extensions: + for file_path in workspace.rglob(f"*{ext}"): + if not file_path.is_file(): + continue + + files_analyzed += 1 + relative_path = file_path.relative_to(workspace) + + try: + content = file_path.read_text(encoding='utf-8', errors='ignore') + lines = content.splitlines() + + # Check for secrets + if check_secrets: + secret_findings = self._check_hardcoded_secrets( + content, lines, relative_path + ) + findings.extend(secret_findings) + + # Check for SQL injection + if check_sql and ext in [".py", ".php", ".java", ".js"]: + sql_findings = self._check_sql_injection( + content, lines, relative_path + ) + findings.extend(sql_findings) + + # Check for dangerous functions + if check_dangerous: + dangerous_findings = self._check_dangerous_functions( + content, lines, relative_path, ext + ) + findings.extend(dangerous_findings) + + except Exception as e: + logger.error(f"Error analyzing file {relative_path}: {e}") + + # Create summary + summary = { + "files_analyzed": files_analyzed, + "total_findings": len(findings), + "extensions_scanned": file_extensions + } + + return self.create_result( + findings=findings, + status="success" if files_analyzed > 0 else "partial", + summary=summary, + metadata={ + "workspace": str(workspace), + "config": config + } + ) + + except Exception as e: + logger.error(f"Security analyzer failed: {e}") + return self.create_result( + findings=findings, + status="failed", + error=str(e) + ) + + def _check_hardcoded_secrets( + self, content: str, lines: List[str], file_path: Path + ) -> List[ModuleFinding]: + """ + Check for hardcoded secrets in code. 
+ + Args: + content: File content + lines: File lines + file_path: Relative file path + + Returns: + List of findings + """ + findings = [] + + # Patterns for secrets + secret_patterns = [ + (r'api[_-]?key\s*=\s*["\']([^"\']{20,})["\']', 'API Key'), + (r'api[_-]?secret\s*=\s*["\']([^"\']{20,})["\']', 'API Secret'), + (r'password\s*=\s*["\']([^"\']+)["\']', 'Hardcoded Password'), + (r'token\s*=\s*["\']([^"\']{20,})["\']', 'Authentication Token'), + (r'aws[_-]?access[_-]?key\s*=\s*["\']([^"\']+)["\']', 'AWS Access Key'), + (r'aws[_-]?secret[_-]?key\s*=\s*["\']([^"\']+)["\']', 'AWS Secret Key'), + (r'private[_-]?key\s*=\s*["\']([^"\']+)["\']', 'Private Key'), + (r'["\']([A-Za-z0-9]{32,})["\']', 'Potential Secret Hash'), + (r'Bearer\s+([A-Za-z0-9\-_]+\.[A-Za-z0-9\-_]+\.[A-Za-z0-9\-_]+)', 'JWT Token'), + ] + + for pattern, secret_type in secret_patterns: + for match in re.finditer(pattern, content, re.IGNORECASE): + # Find line number + line_num = content[:match.start()].count('\n') + 1 + line_content = lines[line_num - 1] if line_num <= len(lines) else "" + + # Skip common false positives + if self._is_false_positive_secret(match.group(0)): + continue + + findings.append(self.create_finding( + title=f"Hardcoded {secret_type} detected", + description=f"Found potential hardcoded {secret_type} in {file_path}", + severity="high" if "key" in secret_type.lower() else "medium", + category="hardcoded_secret", + file_path=str(file_path), + line_start=line_num, + code_snippet=line_content.strip()[:100], + recommendation=f"Remove hardcoded {secret_type} and use environment variables or secure vault", + metadata={"secret_type": secret_type} + )) + + return findings + + def _check_sql_injection( + self, content: str, lines: List[str], file_path: Path + ) -> List[ModuleFinding]: + """ + Check for potential SQL injection vulnerabilities. + + Args: + content: File content + lines: File lines + file_path: Relative file path + + Returns: + List of findings + """ + findings = [] + + # SQL injection patterns + sql_patterns = [ + (r'(SELECT|INSERT|UPDATE|DELETE).*\+\s*[\'"]?\s*\+?\s*\w+', 'String concatenation in SQL'), + (r'(SELECT|INSERT|UPDATE|DELETE).*%\s*[\'"]?\s*%?\s*\w+', 'String formatting in SQL'), + (r'f[\'"].*?(SELECT|INSERT|UPDATE|DELETE).*?\{.*?\}', 'F-string in SQL query'), + (r'query\s*=.*?\+', 'Dynamic query building'), + (r'execute\s*\(.*?\+.*?\)', 'Dynamic execute statement'), + ] + + for pattern, vuln_type in sql_patterns: + for match in re.finditer(pattern, content, re.IGNORECASE): + line_num = content[:match.start()].count('\n') + 1 + line_content = lines[line_num - 1] if line_num <= len(lines) else "" + + findings.append(self.create_finding( + title=f"Potential SQL Injection: {vuln_type}", + description=f"Detected potential SQL injection vulnerability via {vuln_type}", + severity="high", + category="sql_injection", + file_path=str(file_path), + line_start=line_num, + code_snippet=line_content.strip()[:100], + recommendation="Use parameterized queries or prepared statements instead", + metadata={"vulnerability_type": vuln_type} + )) + + return findings + + def _check_dangerous_functions( + self, content: str, lines: List[str], file_path: Path, ext: str + ) -> List[ModuleFinding]: + """ + Check for dangerous function calls. 
+ + Args: + content: File content + lines: File lines + file_path: Relative file path + ext: File extension + + Returns: + List of findings + """ + findings = [] + + # Language-specific dangerous functions + dangerous_functions = { + ".py": [ + (r'eval\s*\(', 'eval()', 'Arbitrary code execution'), + (r'exec\s*\(', 'exec()', 'Arbitrary code execution'), + (r'os\.system\s*\(', 'os.system()', 'Command injection risk'), + (r'subprocess\.call\s*\(.*shell=True', 'subprocess with shell=True', 'Command injection risk'), + (r'pickle\.loads?\s*\(', 'pickle.load()', 'Deserialization vulnerability'), + ], + ".js": [ + (r'eval\s*\(', 'eval()', 'Arbitrary code execution'), + (r'new\s+Function\s*\(', 'new Function()', 'Arbitrary code execution'), + (r'innerHTML\s*=', 'innerHTML', 'XSS vulnerability'), + (r'document\.write\s*\(', 'document.write()', 'XSS vulnerability'), + ], + ".php": [ + (r'eval\s*\(', 'eval()', 'Arbitrary code execution'), + (r'exec\s*\(', 'exec()', 'Command execution'), + (r'system\s*\(', 'system()', 'Command execution'), + (r'shell_exec\s*\(', 'shell_exec()', 'Command execution'), + (r'\$_GET\[', 'Direct $_GET usage', 'Input validation missing'), + (r'\$_POST\[', 'Direct $_POST usage', 'Input validation missing'), + ] + } + + if ext in dangerous_functions: + for pattern, func_name, risk_type in dangerous_functions[ext]: + for match in re.finditer(pattern, content): + line_num = content[:match.start()].count('\n') + 1 + line_content = lines[line_num - 1] if line_num <= len(lines) else "" + + findings.append(self.create_finding( + title=f"Dangerous function: {func_name}", + description=f"Use of potentially dangerous function {func_name}: {risk_type}", + severity="medium", + category="dangerous_function", + file_path=str(file_path), + line_start=line_num, + code_snippet=line_content.strip()[:100], + recommendation=f"Consider safer alternatives to {func_name}", + metadata={ + "function": func_name, + "risk": risk_type + } + )) + + return findings + + def _is_false_positive_secret(self, value: str) -> bool: + """ + Check if a potential secret is likely a false positive. + + Args: + value: Potential secret value + + Returns: + True if likely false positive + """ + false_positive_patterns = [ + 'example', + 'test', + 'demo', + 'sample', + 'dummy', + 'placeholder', + 'xxx', + '123', + 'change', + 'your', + 'here' + ] + + value_lower = value.lower() + return any(pattern in value_lower for pattern in false_positive_patterns) \ No newline at end of file diff --git a/backend/toolbox/modules/base.py b/backend/toolbox/modules/base.py new file mode 100644 index 0000000..62a722c --- /dev/null +++ b/backend/toolbox/modules/base.py @@ -0,0 +1,272 @@ +""" +Base module interface for all FuzzForge modules +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
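The analyzer's checks are all variations on the same mechanic: scan the file content with a regex, convert the match offset to a line number, and snapshot the offending line. A standalone sketch using one of the SQL-injection patterns above, against a fabricated vulnerable snippet:

```python
import re

content = '''def get_user(cursor, name):
    query = "SELECT * FROM users WHERE name = '" + name + "'"
    cursor.execute(query)
'''

# One of the analyzer's patterns: string concatenation inside a SQL statement
pattern = r'(SELECT|INSERT|UPDATE|DELETE).*\+\s*[\'"]?\s*\+?\s*\w+'

for match in re.finditer(pattern, content, re.IGNORECASE):
    # Line numbers are derived by counting newlines before the match,
    # exactly as the analyzer does.
    line_num = content[:match.start()].count('\n') + 1
    line = content.splitlines()[line_num - 1]
    print(f"line {line_num}: {line.strip()[:100]}")
```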
+ +from abc import ABC, abstractmethod +from pathlib import Path +from typing import Dict, Any, List, Optional +from pydantic import BaseModel, Field +from datetime import datetime +import logging + +logger = logging.getLogger(__name__) + + +class ModuleMetadata(BaseModel): + """Metadata describing a module's capabilities and requirements""" + name: str = Field(..., description="Module name") + version: str = Field(..., description="Module version") + description: str = Field(..., description="Module description") + author: Optional[str] = Field(None, description="Module author") + category: str = Field(..., description="Module category (scanner, analyzer, reporter, etc.)") + tags: List[str] = Field(default_factory=list, description="Module tags") + input_schema: Dict[str, Any] = Field(default_factory=dict, description="Expected input schema") + output_schema: Dict[str, Any] = Field(default_factory=dict, description="Output schema") + requires_workspace: bool = Field(True, description="Whether module requires workspace access") + + +class ModuleFinding(BaseModel): + """Individual finding from a module""" + id: str = Field(..., description="Unique finding ID") + title: str = Field(..., description="Finding title") + description: str = Field(..., description="Detailed description") + severity: str = Field(..., description="Severity level (info, low, medium, high, critical)") + category: str = Field(..., description="Finding category") + file_path: Optional[str] = Field(None, description="Affected file path relative to workspace") + line_start: Optional[int] = Field(None, description="Starting line number") + line_end: Optional[int] = Field(None, description="Ending line number") + code_snippet: Optional[str] = Field(None, description="Relevant code snippet") + recommendation: Optional[str] = Field(None, description="Remediation recommendation") + metadata: Dict[str, Any] = Field(default_factory=dict, description="Additional metadata") + + +class ModuleResult(BaseModel): + """Standard result format from module execution""" + module: str = Field(..., description="Module name") + version: str = Field(..., description="Module version") + status: str = Field(default="success", description="Execution status (success, partial, failed)") + execution_time: float = Field(..., description="Execution time in seconds") + findings: List[ModuleFinding] = Field(default_factory=list, description="List of findings") + summary: Dict[str, Any] = Field(default_factory=dict, description="Summary statistics") + metadata: Dict[str, Any] = Field(default_factory=dict, description="Additional metadata") + error: Optional[str] = Field(None, description="Error message if failed") + sarif: Optional[Dict[str, Any]] = Field(None, description="SARIF report if generated by reporter module") + + +class BaseModule(ABC): + """ + Base interface for all security testing modules. + + All modules must inherit from this class and implement the required methods. + Modules are designed to be stateless and reusable across different workflows. + """ + + def __init__(self): + """Initialize the module""" + self._metadata = self.get_metadata() + self._start_time = None + logger.info(f"Initialized module: {self._metadata.name} v{self._metadata.version}") + + @abstractmethod + def get_metadata(self) -> ModuleMetadata: + """ + Get module metadata. 
+ + Returns: + ModuleMetadata object describing the module + """ + pass + + @abstractmethod + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """ + Execute the module with given configuration and workspace. + + Args: + config: Module-specific configuration parameters + workspace: Path to the mounted workspace directory + + Returns: + ModuleResult containing findings and metadata + """ + pass + + @abstractmethod + def validate_config(self, config: Dict[str, Any]) -> bool: + """ + Validate the provided configuration against module requirements. + + Args: + config: Configuration to validate + + Returns: + True if configuration is valid, False otherwise + + Raises: + ValueError: If configuration is invalid with details + """ + pass + + def validate_workspace(self, workspace: Path) -> bool: + """ + Validate that the workspace exists and is accessible. + + Args: + workspace: Path to the workspace + + Returns: + True if workspace is valid + + Raises: + ValueError: If workspace is invalid + """ + if not workspace.exists(): + raise ValueError(f"Workspace does not exist: {workspace}") + + if not workspace.is_dir(): + raise ValueError(f"Workspace is not a directory: {workspace}") + + return True + + def create_finding( + self, + title: str, + description: str, + severity: str, + category: str, + **kwargs + ) -> ModuleFinding: + """ + Helper method to create a standardized finding. + + Args: + title: Finding title + description: Detailed description + severity: Severity level + category: Finding category + **kwargs: Additional finding fields + + Returns: + ModuleFinding object + """ + import uuid + finding_id = str(uuid.uuid4()) + + return ModuleFinding( + id=finding_id, + title=title, + description=description, + severity=severity, + category=category, + **kwargs + ) + + def start_timer(self): + """Start the execution timer""" + from time import time + self._start_time = time() + + def get_execution_time(self) -> float: + """Get the execution time in seconds""" + from time import time + if self._start_time is None: + return 0.0 + return time() - self._start_time + + def create_result( + self, + findings: List[ModuleFinding], + status: str = "success", + summary: Dict[str, Any] = None, + metadata: Dict[str, Any] = None, + error: str = None + ) -> ModuleResult: + """ + Helper method to create a module result. + + Args: + findings: List of findings + status: Execution status + summary: Summary statistics + metadata: Additional metadata + error: Error message if failed + + Returns: + ModuleResult object + """ + return ModuleResult( + module=self._metadata.name, + version=self._metadata.version, + status=status, + execution_time=self.get_execution_time(), + findings=findings, + summary=summary or self._generate_summary(findings), + metadata=metadata or {}, + error=error + ) + + def _generate_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """ + Generate summary statistics from findings. 
+ + Args: + findings: List of findings + + Returns: + Summary dictionary + """ + severity_counts = { + "info": 0, + "low": 0, + "medium": 0, + "high": 0, + "critical": 0 + } + + category_counts = {} + + for finding in findings: + # Count by severity + if finding.severity in severity_counts: + severity_counts[finding.severity] += 1 + + # Count by category + if finding.category not in category_counts: + category_counts[finding.category] = 0 + category_counts[finding.category] += 1 + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts, + "highest_severity": self._get_highest_severity(findings) + } + + def _get_highest_severity(self, findings: List[ModuleFinding]) -> str: + """ + Get the highest severity from findings. + + Args: + findings: List of findings + + Returns: + Highest severity level + """ + severity_order = ["critical", "high", "medium", "low", "info"] + + for severity in severity_order: + if any(f.severity == severity for f in findings): + return severity + + return "none" \ No newline at end of file diff --git a/backend/toolbox/modules/cicd_security/__init__.py b/backend/toolbox/modules/cicd_security/__init__.py new file mode 100644 index 0000000..8bb992b --- /dev/null +++ b/backend/toolbox/modules/cicd_security/__init__.py @@ -0,0 +1,37 @@ +""" +CI/CD Security Modules + +This package contains modules for CI/CD pipeline and workflow security testing. + +Available modules: +- Zizmor: GitHub Actions workflow security analyzer +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +from typing import List, Type +from ..base import BaseModule + +# Module registry for automatic discovery +CICD_SECURITY_MODULES: List[Type[BaseModule]] = [] + +def register_module(module_class: Type[BaseModule]): + """Register a CI/CD security module""" + CICD_SECURITY_MODULES.append(module_class) + return module_class + +def get_available_modules() -> List[Type[BaseModule]]: + """Get all available CI/CD security modules""" + return CICD_SECURITY_MODULES.copy() + +# Import modules to trigger registration +from .zizmor import ZizmorModule \ No newline at end of file diff --git a/backend/toolbox/modules/cicd_security/zizmor.py b/backend/toolbox/modules/cicd_security/zizmor.py new file mode 100644 index 0000000..f67496c --- /dev/null +++ b/backend/toolbox/modules/cicd_security/zizmor.py @@ -0,0 +1,595 @@ +""" +Zizmor CI/CD Security Module + +This module uses Zizmor to analyze GitHub Actions workflows for security +vulnerabilities and misconfigurations. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
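The contract defined by `BaseModule` is easiest to grasp from a trivial concrete subclass. A minimal sketch, assuming the same import path as the built-in modules; the `env_file_check` module itself is invented for illustration:

```python
from pathlib import Path
from typing import Any, Dict

from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult


class EnvFileModule(BaseModule):
    """Toy module: flags .env files committed to the workspace."""

    def get_metadata(self) -> ModuleMetadata:
        return ModuleMetadata(
            name="env_file_check",
            version="0.1.0",
            description="Flags .env files committed to the workspace",
            category="analyzer",
            tags=["secrets", "example"],
        )

    def validate_config(self, config: Dict[str, Any]) -> bool:
        return True  # nothing to validate for this toy module

    async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult:
        self.start_timer()
        self.validate_workspace(workspace)
        findings = [
            self.create_finding(
                title=".env file committed",
                description=f"Found {p.relative_to(workspace)} in the workspace",
                severity="medium",
                category="hardcoded_secret",
                file_path=str(p.relative_to(workspace)),
            )
            for p in workspace.rglob(".env")
        ]
        # create_result fills in execution time and a severity/category summary
        return self.create_result(findings=findings)
```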
+ + import asyncio + import json + from pathlib import Path + from typing import Dict, Any, List + import subprocess + import logging + + from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult + from . import register_module + + logger = logging.getLogger(__name__) + + + @register_module + class ZizmorModule(BaseModule): + """Zizmor GitHub Actions security analysis module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="zizmor", + version="0.2.0", + description="GitHub Actions workflow security analyzer for detecting vulnerabilities and misconfigurations", + author="FuzzForge Team", + category="cicd_security", + tags=["github-actions", "cicd", "workflow", "security", "pipeline"], + input_schema={ + "type": "object", + "properties": { + "workflow_dir": { + "type": "string", + "default": ".github/workflows", + "description": "Directory containing GitHub Actions workflows" + }, + "workflow_files": { + "type": "array", + "items": {"type": "string"}, + "description": "Specific workflow files to analyze" + }, + "format": { + "type": "string", + "enum": ["json", "sarif", "pretty"], + "default": "json", + "description": "Output format" + }, + "verbose": { + "type": "boolean", + "default": False, + "description": "Enable verbose output" + }, + "offline": { + "type": "boolean", + "default": False, + "description": "Run in offline mode (no internet lookups)" + }, + "no_online_audits": { + "type": "boolean", + "default": True, + "description": "Disable online audits for faster execution" + }, + "pedantic": { + "type": "boolean", + "default": False, + "description": "Enable pedantic mode (more strict checking)" + }, + "rules": { + "type": "array", + "items": {"type": "string"}, + "description": "Specific rules to run" + }, + "ignore_rules": { + "type": "array", + "items": {"type": "string"}, + "description": "Rules to ignore" + }, + "min_severity": { + "type": "string", + "enum": ["unknown", "informational", "low", "medium", "high"], + "default": "low", + "description": "Minimum severity level to report" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "rule_id": {"type": "string"}, + "rule_name": {"type": "string"}, + "severity": {"type": "string"}, + "workflow_file": {"type": "string"}, + "line_number": {"type": "integer"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + workflow_dir = config.get("workflow_dir", ".github/workflows") + workflow_files = config.get("workflow_files", []) + + if not workflow_dir and not workflow_files: + raise ValueError("Either workflow_dir or workflow_files must be specified") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute Zizmor GitHub Actions security analysis""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info("Running Zizmor GitHub Actions security analysis") + + # Check Zizmor installation + await self._check_zizmor_installation() + + # Find workflow files + workflow_files = self._find_workflow_files(workspace, config) + if not workflow_files: + logger.info("No GitHub Actions workflow files found") + return self.create_result( + findings=[], + status="success", + summary={"total_findings": 0, "workflows_scanned": 0} + ) + + # Run Zizmor analysis + findings = await 
self._run_zizmor_analysis(workflow_files, config, workspace) + + # Create summary + summary = self._create_summary(findings, len(workflow_files)) + + logger.info(f"Zizmor found {len(findings)} CI/CD security issues") + + return self.create_result( + findings=findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"Zizmor module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + async def _check_zizmor_installation(self): + """Check if Zizmor is installed""" + try: + process = await asyncio.create_subprocess_exec( + "zizmor", "--version", + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + raise RuntimeError("Zizmor not found. Install with: cargo install zizmor") + + except FileNotFoundError: + raise RuntimeError("Zizmor not found. Install with: cargo install zizmor") + except Exception as e: + raise RuntimeError(f"Zizmor installation check failed: {e}") + + def _find_workflow_files(self, workspace: Path, config: Dict[str, Any]) -> List[Path]: + """Find GitHub Actions workflow files""" + workflow_files = [] + + # Check for specific files + specific_files = config.get("workflow_files", []) + for file_path in specific_files: + full_path = workspace / file_path + if full_path.exists(): + workflow_files.append(full_path) + + # Check workflow directory + if not workflow_files: + workflow_dir = workspace / config.get("workflow_dir", ".github/workflows") + if workflow_dir.exists(): + # Find YAML files + for pattern in ["*.yml", "*.yaml"]: + workflow_files.extend(workflow_dir.glob(pattern)) + + return list(set(workflow_files)) # Remove duplicates + + async def _run_zizmor_analysis(self, workflow_files: List[Path], config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run Zizmor analysis on workflow files""" + findings = [] + + try: + for workflow_file in workflow_files: + file_findings = await self._analyze_workflow_file(workflow_file, config, workspace) + findings.extend(file_findings) + + except Exception as e: + logger.warning(f"Error running Zizmor analysis: {e}") + + return findings + + async def _analyze_workflow_file(self, workflow_file: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Analyze a single workflow file with Zizmor""" + findings = [] + + try: + # Build Zizmor command + cmd = ["zizmor"] + + # Add format + format_type = config.get("format", "json") + cmd.extend(["--format", format_type]) + + # Add minimum severity + min_severity = config.get("min_severity", "low") + cmd.extend(["--min-severity", min_severity]) + + # Add flags + if config.get("verbose", False): + cmd.append("--verbose") + + if config.get("offline", False): + cmd.append("--offline") + + if config.get("no_online_audits", True): + cmd.append("--no-online-audits") + + if config.get("pedantic", False): + cmd.append("--pedantic") + + # Add specific rules + rules = config.get("rules", []) + for rule in rules: + cmd.extend(["--rules", rule]) + + # Add ignore rules + ignore_rules = config.get("ignore_rules", []) + for rule in ignore_rules: + cmd.extend(["--ignore", rule]) + + # Add workflow file + cmd.append(str(workflow_file)) + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run Zizmor + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace + ) + + stdout, stderr = await process.communicate() + + # 
Parse results (even if return code is non-zero, as it may contain findings) + if stdout.strip(): + findings = self._parse_zizmor_output( + stdout.decode(), workflow_file, workspace, format_type + ) + elif stderr.strip(): + logger.warning(f"Zizmor analysis failed for {workflow_file}: {stderr.decode()}") + + except Exception as e: + logger.warning(f"Error analyzing workflow file {workflow_file}: {e}") + + return findings + + def _parse_zizmor_output(self, output: str, workflow_file: Path, workspace: Path, format_type: str) -> List[ModuleFinding]: + """Parse Zizmor output into findings""" + findings = [] + + try: + if format_type == "json": + findings = self._parse_json_output(output, workflow_file, workspace) + elif format_type == "sarif": + findings = self._parse_sarif_output(output, workflow_file, workspace) + else: + findings = self._parse_text_output(output, workflow_file, workspace) + + except Exception as e: + logger.warning(f"Error parsing Zizmor output: {e}") + + return findings + + def _parse_json_output(self, output: str, workflow_file: Path, workspace: Path) -> List[ModuleFinding]: + """Parse Zizmor JSON output""" + findings = [] + + try: + if not output.strip(): + return findings + + data = json.loads(output) + + # Handle different JSON structures + if isinstance(data, dict): + # Single result + findings.extend(self._process_zizmor_result(data, workflow_file, workspace)) + elif isinstance(data, list): + # Multiple results + for result in data: + findings.extend(self._process_zizmor_result(result, workflow_file, workspace)) + + except json.JSONDecodeError as e: + logger.warning(f"Failed to parse Zizmor JSON output: {e}") + + return findings + + def _parse_sarif_output(self, output: str, workflow_file: Path, workspace: Path) -> List[ModuleFinding]: + """Parse Zizmor SARIF output""" + findings = [] + + try: + data = json.loads(output) + runs = data.get("runs", []) + + for run in runs: + results = run.get("results", []) + for result in results: + finding = self._create_sarif_finding(result, workflow_file, workspace) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing SARIF output: {e}") + + return findings + + def _parse_text_output(self, output: str, workflow_file: Path, workspace: Path) -> List[ModuleFinding]: + """Parse Zizmor text output""" + findings = [] + + try: + lines = output.strip().split('\n') + for line in lines: + if line.strip() and not line.startswith('#'): + # Create basic finding from text line + finding = self._create_text_finding(line, workflow_file, workspace) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing text output: {e}") + + return findings + + def _process_zizmor_result(self, result: Dict[str, Any], workflow_file: Path, workspace: Path) -> List[ModuleFinding]: + """Process a single Zizmor result""" + findings = [] + + try: + # Extract rule information + rule_id = result.get("rule", {}).get("id", "unknown") + rule_name = result.get("rule", {}).get("desc", rule_id) + severity = result.get("severity", "medium") + message = result.get("message", "") + + # Extract location information + locations = result.get("locations", []) + if not locations: + # Create finding without specific location + finding = self._create_zizmor_finding( + rule_id, rule_name, severity, message, workflow_file, workspace + ) + if finding: + findings.append(finding) + else: + # Create finding for each location + for location in locations: + line_number = location.get("line", 0) + column = 
location.get("column", 0) + + finding = self._create_zizmor_finding( + rule_id, rule_name, severity, message, workflow_file, workspace, + line_number, column + ) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error processing Zizmor result: {e}") + + return findings + + def _create_zizmor_finding(self, rule_id: str, rule_name: str, severity: str, message: str, + workflow_file: Path, workspace: Path, line_number: int = None, column: int = None) -> ModuleFinding: + """Create finding from Zizmor analysis""" + try: + # Map Zizmor severity to our standard levels + finding_severity = self._map_severity(severity) + + # Create relative path + try: + rel_path = workflow_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(workflow_file) + + # Get category and recommendation + category = self._get_cicd_category(rule_id, rule_name) + recommendation = self._get_cicd_recommendation(rule_id, rule_name, message) + + finding = self.create_finding( + title=f"CI/CD Security Issue: {rule_name}", + description=message or f"Zizmor detected a security issue: {rule_name}", + severity=finding_severity, + category=category, + file_path=file_path, + line_start=line_number if line_number else None, + recommendation=recommendation, + metadata={ + "rule_id": rule_id, + "rule_name": rule_name, + "zizmor_severity": severity, + "workflow_file": str(workflow_file.name), + "line_number": line_number, + "column": column, + "tool": "zizmor" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating Zizmor finding: {e}") + return None + + def _create_sarif_finding(self, result: Dict[str, Any], workflow_file: Path, workspace: Path) -> ModuleFinding: + """Create finding from SARIF result""" + try: + rule_id = result.get("ruleId", "unknown") + message = result.get("message", {}).get("text", "") + severity = result.get("level", "warning") + + # Extract location + locations = result.get("locations", []) + line_number = None + if locations: + physical_location = locations[0].get("physicalLocation", {}) + region = physical_location.get("region", {}) + line_number = region.get("startLine") + + return self._create_zizmor_finding( + rule_id, rule_id, severity, message, workflow_file, workspace, line_number + ) + + except Exception as e: + logger.warning(f"Error creating SARIF finding: {e}") + return None + + def _create_text_finding(self, line: str, workflow_file: Path, workspace: Path) -> ModuleFinding: + """Create finding from text line""" + try: + try: + rel_path = workflow_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(workflow_file) + + finding = self.create_finding( + title="CI/CD Security Issue", + description=line.strip(), + severity="medium", + category="workflow_security", + file_path=file_path, + recommendation="Review and address the workflow security issue identified by Zizmor.", + metadata={ + "detection_line": line.strip(), + "workflow_file": str(workflow_file.name), + "tool": "zizmor" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating text finding: {e}") + return None + + def _map_severity(self, zizmor_severity: str) -> str: + """Map Zizmor severity to our standard levels""" + severity_map = { + "high": "high", + "medium": "medium", + "low": "low", + "informational": "info", + "unknown": "low", + "error": "high", + "warning": "medium", + "note": "low" + } + return severity_map.get(zizmor_severity.lower(), "medium") + + def 
_get_cicd_category(self, rule_id: str, rule_name: str) -> str: + """Get category for CI/CD security issue""" + rule_lower = f"{rule_id} {rule_name}".lower() + + if any(term in rule_lower for term in ["secret", "token", "credential", "password"]): + return "secret_exposure" + elif any(term in rule_lower for term in ["permission", "access", "privilege"]): + return "permission_escalation" + elif any(term in rule_lower for term in ["injection", "command", "script"]): + return "code_injection" + elif any(term in rule_lower for term in ["artifact", "cache", "upload"]): + return "artifact_security" + elif any(term in rule_lower for term in ["environment", "env", "variable"]): + return "environment_security" + elif any(term in rule_lower for term in ["network", "external", "download"]): + return "network_security" + else: + return "workflow_security" + + def _get_cicd_recommendation(self, rule_id: str, rule_name: str, message: str) -> str: + """Get recommendation for CI/CD security issue""" + rule_lower = f"{rule_id} {rule_name}".lower() + + if "secret" in rule_lower or "token" in rule_lower: + return "Store secrets securely using GitHub Secrets or environment variables. Never hardcode credentials in workflow files." + elif "permission" in rule_lower: + return "Follow the principle of least privilege. Grant only necessary permissions and use specific permission scopes." + elif "injection" in rule_lower: + return "Avoid using user input directly in shell commands. Use proper escaping, validation, or structured approaches." + elif "artifact" in rule_lower: + return "Secure artifact handling by validating checksums, using signed artifacts, and restricting artifact access." + elif "environment" in rule_lower: + return "Protect environment variables and avoid exposing sensitive information in logs or outputs." + elif "network" in rule_lower: + return "Use HTTPS for external connections, validate certificates, and avoid downloading from untrusted sources." 
+ elif message: + return f"Address the identified issue: {message}" + else: + return f"Review and fix the workflow security issue: {rule_name}" + + def _create_summary(self, findings: List[ModuleFinding], workflows_count: int) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + rule_counts = {} + workflow_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by rule + rule_id = finding.metadata.get("rule_id", "unknown") + rule_counts[rule_id] = rule_counts.get(rule_id, 0) + 1 + + # Count by workflow + workflow = finding.metadata.get("workflow_file", "unknown") + workflow_counts[workflow] = workflow_counts.get(workflow, 0) + 1 + + return { + "total_findings": len(findings), + "workflows_scanned": workflows_count, + "severity_counts": severity_counts, + "category_counts": category_counts, + "top_rules": dict(sorted(rule_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "workflows_with_issues": len(workflow_counts), + "workflow_issue_counts": dict(sorted(workflow_counts.items(), key=lambda x: x[1], reverse=True)[:10]) + } \ No newline at end of file diff --git a/backend/toolbox/modules/fuzzing/__init__.py b/backend/toolbox/modules/fuzzing/__init__.py new file mode 100644 index 0000000..f1dc43c --- /dev/null +++ b/backend/toolbox/modules/fuzzing/__init__.py @@ -0,0 +1,49 @@ +""" +Fuzzing Modules + +This package contains modules for various fuzzing techniques and tools. + +Available modules: +- LibFuzzer: LLVM's coverage-guided fuzzing engine +- AFL++: Advanced American Fuzzy Lop with modern features +- AFL-RS: Rust-based AFL implementation +- Atheris: Python fuzzing engine for finding bugs in Python code +- Cargo Fuzz: Rust fuzzing integration with libFuzzer +- Go-Fuzz: Coverage-guided fuzzing for Go packages +- OSS-Fuzz: Google's continuous fuzzing for open source +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
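+
+# Illustrative usage sketch (comments only, nothing here executes): the
+# helpers defined below keep a simple registry that the @register_module
+# decorator fills in as each submodule is imported at the bottom of this
+# file. Downstream code could then discover modules roughly like this;
+# the "toolbox.modules.fuzzing" import path is an assumption based on
+# this repository's layout:
+#
+#     from toolbox.modules import fuzzing
+#
+#     for module_cls in fuzzing.get_available_modules():
+#         meta = module_cls().get_metadata()
+#         print(meta.name, meta.version)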
+ + +from typing import List, Type +from ..base import BaseModule + +# Module registry for automatic discovery +FUZZING_MODULES: List[Type[BaseModule]] = [] + +def register_module(module_class: Type[BaseModule]): + """Register a fuzzing module""" + FUZZING_MODULES.append(module_class) + return module_class + +def get_available_modules() -> List[Type[BaseModule]]: + """Get all available fuzzing modules""" + return FUZZING_MODULES.copy() + +# Import modules to trigger registration +from .libfuzzer import LibFuzzerModule +from .aflplusplus import AFLPlusPlusModule +from .aflrs import AFLRSModule +from .atheris import AtherisModule +from .cargo_fuzz import CargoFuzzModule +from .go_fuzz import GoFuzzModule +from .oss_fuzz import OSSFuzzModule \ No newline at end of file diff --git a/backend/toolbox/modules/fuzzing/aflplusplus.py b/backend/toolbox/modules/fuzzing/aflplusplus.py new file mode 100644 index 0000000..24e83b4 --- /dev/null +++ b/backend/toolbox/modules/fuzzing/aflplusplus.py @@ -0,0 +1,734 @@ +""" +AFL++ Fuzzing Module + +This module uses AFL++ (Advanced American Fuzzy Lop) for coverage-guided +fuzzing with modern features and optimizations. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +import os +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging +import re + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module
+
+logger = logging.getLogger(__name__)
+
+
+@register_module
+class AFLPlusPlusModule(BaseModule):
+    """AFL++ advanced fuzzing module"""
+
+    def get_metadata(self) -> ModuleMetadata:
+        """Get module metadata"""
+        return ModuleMetadata(
+            name="aflplusplus",
+            version="4.09c",
+            description="Advanced American Fuzzy Lop with modern features for coverage-guided fuzzing",
+            author="FuzzForge Team",
+            category="fuzzing",
+            tags=["coverage-guided", "american-fuzzy-lop", "advanced", "mutation", "instrumentation"],
+            input_schema={
+                "type": "object",
+                "properties": {
+                    "target_binary": {
+                        "type": "string",
+                        "description": "Path to the target binary (compiled with afl-gcc/afl-clang)"
+                    },
+                    "input_dir": {
+                        "type": "string",
+                        "description": "Directory containing seed input files"
+                    },
+                    "output_dir": {
+                        "type": "string",
+                        "default": "afl_output",
+                        "description": "Output directory for AFL++ results"
+                    },
+                    "dictionary": {
+                        "type": "string",
+                        "description": "Dictionary file for fuzzing keywords"
+                    },
+                    "timeout": {
+                        "type": "integer",
+                        "default": 1000,
+                        "description": "Timeout for each execution (ms)"
+                    },
+                    "memory_limit": {
+                        "type": "integer",
+                        "default": 50,
+                        "description": "Memory limit for child process (MB)"
+                    },
+                    "skip_deterministic": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Skip deterministic mutations"
+                    },
+                    "no_arith": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Skip arithmetic mutations"
+                    },
+                    "shuffle_queue": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Shuffle queue entries"
+                    },
+                    "max_total_time": {
+                        "type": "integer",
+                        "default": 3600,
+                        "description": "Maximum total fuzzing time (seconds)"
+                    },
+                    "power_schedule": {
+                        "type": "string",
+                        "enum": ["explore", "fast", "coe", "lin", "quad", "exploit", "rare"],
+                        "default": "fast",
+                        "description": "Power schedule algorithm"
+                    },
+                    "mutation_mode": {
+                        "type": "string",
+                        "enum": ["default", "old", "mopt"],
+                        "default": "default",
+                        "description": "Mutation mode to use"
+                    },
+                    "parallel_fuzzing": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Enable parallel fuzzing with multiple instances"
+                    },
+                    "fuzzer_instances": {
+                        "type": "integer",
+                        "default": 1,
+                        "description": "Number of parallel fuzzer instances"
+                    },
+                    "master_instance": {
+                        "type": "string",
+                        "default": "master",
+                        "description": "Name for master fuzzer instance"
+                    },
+                    "slave_prefix": {
+                        "type": "string",
+                        "default": "slave",
+                        "description": "Prefix for slave fuzzer instances"
+                    },
+                    "hang_timeout": {
+                        "type": "integer",
+                        "default": 1000,
+                        "description": "Timeout for detecting hangs (ms)"
+                    },
+                    "crash_mode": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Run in crash exploration mode"
+                    },
+                    "target_args": {
+                        "type": "array",
+                        "items": {"type": "string"},
+                        "description": "Arguments to pass to target binary"
+                    },
+                    "env_vars": {
+                        "type": "object",
+                        "description": "Environment variables to set"
+                    },
+                    "ignore_finds": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Ignore existing findings and start fresh"
+                    },
+                    "force_deterministic": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Force deterministic mutations"
+                    }
+                }
+            },
+            output_schema={
+                "type": "object",
+                "properties": {
+                    "findings": {
+                        "type": "array",
+                        "items": {
+                            "type": "object",
+                            "properties": {
+                                "crash_id": {"type": "string"},
+                                "crash_file": {"type": "string"},
+                                "crash_type": {"type": "string"},
+                                "signal": {"type": "string"}
+                            }
+                        }
+                    }
+
} + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + target_binary = config.get("target_binary") + if not target_binary: + raise ValueError("target_binary is required for AFL++") + + input_dir = config.get("input_dir") + if not input_dir: + raise ValueError("input_dir is required for AFL++") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute AFL++ fuzzing""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info("Running AFL++ fuzzing campaign") + + # Check prerequisites + await self._check_afl_prerequisites(workspace) + + # Setup directories and files + target_binary, input_dir, output_dir = self._setup_afl_directories(config, workspace) + + # Run AFL++ fuzzing + findings = await self._run_afl_fuzzing(target_binary, input_dir, output_dir, config, workspace) + + # Create summary + summary = self._create_summary(findings, output_dir) + + logger.info(f"AFL++ found {len(findings)} crashes") + + return self.create_result( + findings=findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"AFL++ module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + async def _check_afl_prerequisites(self, workspace: Path): + """Check AFL++ prerequisites and system setup""" + try: + # Check if afl-fuzz exists + process = await asyncio.create_subprocess_exec( + "which", "afl-fuzz", + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + raise RuntimeError("afl-fuzz not found. Please install AFL++") + + # Check core dump pattern (important for AFL) + try: + with open("/proc/sys/kernel/core_pattern", "r") as f: + core_pattern = f.read().strip() + if core_pattern != "core": + logger.warning(f"Core dump pattern is '{core_pattern}', AFL++ may not work optimally") + except Exception: + logger.warning("Could not check core dump pattern") + + except Exception as e: + logger.warning(f"AFL++ prerequisite check failed: {e}") + + def _setup_afl_directories(self, config: Dict[str, Any], workspace: Path): + """Setup AFL++ directories and validate files""" + # Check target binary + target_binary = workspace / config["target_binary"] + if not target_binary.exists(): + raise FileNotFoundError(f"Target binary not found: {target_binary}") + + # Check input directory + input_dir = workspace / config["input_dir"] + if not input_dir.exists(): + raise FileNotFoundError(f"Input directory not found: {input_dir}") + + # Check if input directory has files + input_files = list(input_dir.glob("*")) + if not input_files: + raise ValueError(f"Input directory is empty: {input_dir}") + + # Create output directory + output_dir = workspace / config.get("output_dir", "afl_output") + output_dir.mkdir(exist_ok=True) + + return target_binary, input_dir, output_dir + + async def _run_afl_fuzzing(self, target_binary: Path, input_dir: Path, output_dir: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run AFL++ fuzzing""" + findings = [] + + try: + if config.get("parallel_fuzzing", False): + findings = await self._run_parallel_fuzzing( + target_binary, input_dir, output_dir, config, workspace + ) + else: + findings = await self._run_single_fuzzing( + target_binary, input_dir, output_dir, config, workspace + ) + + except Exception as e: + 
logger.warning(f"Error running AFL++ fuzzing: {e}") + + return findings + + async def _run_single_fuzzing(self, target_binary: Path, input_dir: Path, output_dir: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run single-instance AFL++ fuzzing""" + findings = [] + + try: + # Build AFL++ command + cmd = ["afl-fuzz"] + + # Add input and output directories + cmd.extend(["-i", str(input_dir)]) + cmd.extend(["-o", str(output_dir)]) + + # Add dictionary if specified + dictionary = config.get("dictionary") + if dictionary: + dict_path = workspace / dictionary + if dict_path.exists(): + cmd.extend(["-x", str(dict_path)]) + + # Add timeout + timeout = config.get("timeout", 1000) + cmd.extend(["-t", str(timeout)]) + + # Add memory limit + memory_limit = config.get("memory_limit", 50) + cmd.extend(["-m", str(memory_limit)]) + + # Add power schedule + power_schedule = config.get("power_schedule", "fast") + cmd.extend(["-p", power_schedule]) + + # Add mutation options + if config.get("skip_deterministic", False): + cmd.append("-d") + + if config.get("no_arith", False): + cmd.append("-a") + + if config.get("shuffle_queue", False): + cmd.append("-Z") + + # Add hang timeout + hang_timeout = config.get("hang_timeout", 1000) + cmd.extend(["-T", str(hang_timeout)]) + + # Add crash mode + if config.get("crash_mode", False): + cmd.append("-C") + + # Add ignore finds + if config.get("ignore_finds", False): + cmd.append("-f") + + # Add force deterministic + if config.get("force_deterministic", False): + cmd.append("-D") + + # Add target binary and arguments + cmd.append("--") + cmd.append(str(target_binary)) + + target_args = config.get("target_args", []) + cmd.extend(target_args) + + # Set up environment + env = os.environ.copy() + env_vars = config.get("env_vars", {}) + env.update(env_vars) + + # Set AFL environment variables + env["AFL_I_DONT_CARE_ABOUT_MISSING_CRASHES"] = "1" # Avoid interactive prompts + env["AFL_SKIP_CPUFREQ"] = "1" # Skip CPU frequency checks + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run AFL++ with timeout + max_total_time = config.get("max_total_time", 3600) + + try: + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace, + env=env + ) + + # Wait for specified time then terminate + try: + stdout, stderr = await asyncio.wait_for( + process.communicate(), timeout=max_total_time + ) + except asyncio.TimeoutError: + logger.info(f"AFL++ fuzzing timed out after {max_total_time} seconds") + process.terminate() + try: + await asyncio.wait_for(process.wait(), timeout=10) + except asyncio.TimeoutError: + process.kill() + await process.wait() + + # Parse results from output directory + findings = self._parse_afl_results(output_dir, workspace) + + except Exception as e: + logger.warning(f"Error running AFL++ process: {e}") + + except Exception as e: + logger.warning(f"Error in single fuzzing: {e}") + + return findings + + async def _run_parallel_fuzzing(self, target_binary: Path, input_dir: Path, output_dir: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run parallel AFL++ fuzzing""" + findings = [] + + try: + fuzzer_instances = config.get("fuzzer_instances", 2) + master_name = config.get("master_instance", "master") + slave_prefix = config.get("slave_prefix", "slave") + + processes = [] + + # Start master instance + master_cmd = await self._build_afl_command( + target_binary, input_dir, output_dir, config, workspace, + 
instance_name=master_name, is_master=True + ) + + master_process = await asyncio.create_subprocess_exec( + *master_cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace, + env=self._get_afl_env(config) + ) + processes.append(master_process) + + # Start slave instances + for i in range(1, fuzzer_instances): + slave_name = f"{slave_prefix}{i:02d}" + slave_cmd = await self._build_afl_command( + target_binary, input_dir, output_dir, config, workspace, + instance_name=slave_name, is_master=False + ) + + slave_process = await asyncio.create_subprocess_exec( + *slave_cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace, + env=self._get_afl_env(config) + ) + processes.append(slave_process) + + # Wait for specified time then terminate all + max_total_time = config.get("max_total_time", 3600) + + try: + await asyncio.sleep(max_total_time) + finally: + # Terminate all processes + for process in processes: + if process.returncode is None: + process.terminate() + try: + await asyncio.wait_for(process.wait(), timeout=10) + except asyncio.TimeoutError: + process.kill() + await process.wait() + + # Parse results from output directory + findings = self._parse_afl_results(output_dir, workspace) + + except Exception as e: + logger.warning(f"Error in parallel fuzzing: {e}") + + return findings + + async def _build_afl_command(self, target_binary: Path, input_dir: Path, output_dir: Path, config: Dict[str, Any], workspace: Path, instance_name: str, is_master: bool) -> List[str]: + """Build AFL++ command for a fuzzer instance""" + cmd = ["afl-fuzz"] + + # Add input and output directories + cmd.extend(["-i", str(input_dir)]) + cmd.extend(["-o", str(output_dir)]) + + # Add instance name + if is_master: + cmd.extend(["-M", instance_name]) + else: + cmd.extend(["-S", instance_name]) + + # Add other options (same as single fuzzing) + dictionary = config.get("dictionary") + if dictionary: + dict_path = workspace / dictionary + if dict_path.exists(): + cmd.extend(["-x", str(dict_path)]) + + cmd.extend(["-t", str(config.get("timeout", 1000))]) + cmd.extend(["-m", str(config.get("memory_limit", 50))]) + cmd.extend(["-p", config.get("power_schedule", "fast")]) + + if config.get("skip_deterministic", False): + cmd.append("-d") + + if config.get("no_arith", False): + cmd.append("-a") + + # Add target + cmd.append("--") + cmd.append(str(target_binary)) + cmd.extend(config.get("target_args", [])) + + return cmd + + def _get_afl_env(self, config: Dict[str, Any]) -> Dict[str, str]: + """Get environment variables for AFL++""" + env = os.environ.copy() + env.update(config.get("env_vars", {})) + env["AFL_I_DONT_CARE_ABOUT_MISSING_CRASHES"] = "1" + env["AFL_SKIP_CPUFREQ"] = "1" + return env + + def _parse_afl_results(self, output_dir: Path, workspace: Path) -> List[ModuleFinding]: + """Parse AFL++ results from output directory""" + findings = [] + + try: + # Look for crashes directory + crashes_dirs = [] + + # Single instance + crashes_dir = output_dir / "crashes" + if crashes_dir.exists(): + crashes_dirs.append(crashes_dir) + + # Multiple instances + for instance_dir in output_dir.iterdir(): + if instance_dir.is_dir(): + instance_crashes = instance_dir / "crashes" + if instance_crashes.exists(): + crashes_dirs.append(instance_crashes) + + # Process crash files + for crashes_dir in crashes_dirs: + crash_files = [f for f in crashes_dir.iterdir() if f.is_file() and f.name.startswith("id:")] + + for crash_file in crash_files: + finding = 
self._create_afl_crash_finding(crash_file, workspace) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing AFL++ results: {e}") + + return findings + + def _create_afl_crash_finding(self, crash_file: Path, workspace: Path) -> ModuleFinding: + """Create finding from AFL++ crash file""" + try: + # Parse crash filename for information + filename = crash_file.name + crash_info = self._parse_afl_filename(filename) + + # Try to read crash file (limited size) + crash_content = "" + try: + crash_data = crash_file.read_bytes()[:1000] + crash_content = crash_data.hex()[:200] # Hex representation, limited + except Exception: + pass + + # Determine severity based on signal + severity = self._get_crash_severity(crash_info.get("signal", "")) + + # Create relative path + try: + rel_path = crash_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(crash_file) + + finding = self.create_finding( + title=f"AFL++ Crash: {crash_info.get('signal', 'Unknown')}", + description=f"AFL++ discovered a crash with signal {crash_info.get('signal', 'unknown')} in the target program", + severity=severity, + category=self._get_crash_category(crash_info.get("signal", "")), + file_path=file_path, + recommendation=self._get_afl_crash_recommendation(crash_info.get("signal", "")), + metadata={ + "crash_id": crash_info.get("id", ""), + "signal": crash_info.get("signal", ""), + "src": crash_info.get("src", ""), + "crash_file": crash_file.name, + "crash_content_hex": crash_content, + "fuzzer": "afl++" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating AFL++ crash finding: {e}") + return None + + def _parse_afl_filename(self, filename: str) -> Dict[str, str]: + """Parse AFL++ crash filename for information""" + info = {} + + try: + # AFL++ crash filename format: id:XXXXXX,sig:XX,src:XXXXXX,op:XXX,rep:X + parts = filename.split(',') + + for part in parts: + if ':' in part: + key, value = part.split(':', 1) + info[key] = value + + except Exception: + pass + + return info + + def _get_crash_severity(self, signal: str) -> str: + """Determine severity based on crash signal""" + if not signal: + return "medium" + + signal_lower = signal.lower() + + # Critical signals indicating memory corruption + if signal in ["11", "sigsegv", "segv"]: # Segmentation fault + return "critical" + elif signal in ["6", "sigabrt", "abrt"]: # Abort + return "high" + elif signal in ["4", "sigill", "ill"]: # Illegal instruction + return "high" + elif signal in ["8", "sigfpe", "fpe"]: # Floating point exception + return "medium" + elif signal in ["9", "sigkill", "kill"]: # Kill signal + return "medium" + else: + return "medium" + + def _get_crash_category(self, signal: str) -> str: + """Determine category based on crash signal""" + if not signal: + return "program_crash" + + if signal in ["11", "sigsegv", "segv"]: + return "memory_corruption" + elif signal in ["6", "sigabrt", "abrt"]: + return "assertion_failure" + elif signal in ["4", "sigill", "ill"]: + return "illegal_instruction" + elif signal in ["8", "sigfpe", "fpe"]: + return "arithmetic_error" + else: + return "program_crash" + + def _get_afl_crash_recommendation(self, signal: str) -> str: + """Generate recommendation based on crash signal""" + if signal in ["11", "sigsegv", "segv"]: + return "Segmentation fault detected. Investigate memory access patterns, check for buffer overflows, null pointer dereferences, or use-after-free bugs." 
+ elif signal in ["6", "sigabrt", "abrt"]: + return "Program abort detected. Check for assertion failures, memory allocation errors, or explicit abort() calls in the code." + elif signal in ["4", "sigill", "ill"]: + return "Illegal instruction detected. Check for code corruption, invalid function pointers, or architecture-specific instruction issues." + elif signal in ["8", "sigfpe", "fpe"]: + return "Floating point exception detected. Check for division by zero, arithmetic overflow, or invalid floating point operations." + else: + return f"Program crash with signal {signal} detected. Analyze the crash dump and input to identify the root cause." + + def _create_summary(self, findings: List[ModuleFinding], output_dir: Path) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + signal_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by signal + signal = finding.metadata.get("signal", "unknown") + signal_counts[signal] = signal_counts.get(signal, 0) + 1 + + # Try to read AFL++ statistics + stats = self._read_afl_stats(output_dir) + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts, + "signal_counts": signal_counts, + "unique_crashes": len(set(f.metadata.get("crash_id", "") for f in findings)), + "afl_stats": stats + } + + def _read_afl_stats(self, output_dir: Path) -> Dict[str, Any]: + """Read AFL++ fuzzer statistics""" + stats = {} + + try: + # Look for fuzzer_stats file in single or multiple instance setup + stats_files = [] + + # Single instance + single_stats = output_dir / "fuzzer_stats" + if single_stats.exists(): + stats_files.append(single_stats) + + # Multiple instances + for instance_dir in output_dir.iterdir(): + if instance_dir.is_dir(): + instance_stats = instance_dir / "fuzzer_stats" + if instance_stats.exists(): + stats_files.append(instance_stats) + + # Read first stats file found + if stats_files: + with open(stats_files[0], 'r') as f: + for line in f: + if ':' in line: + key, value = line.strip().split(':', 1) + stats[key.strip()] = value.strip() + + except Exception as e: + logger.warning(f"Error reading AFL++ stats: {e}") + + return stats \ No newline at end of file diff --git a/backend/toolbox/modules/fuzzing/aflrs.py b/backend/toolbox/modules/fuzzing/aflrs.py new file mode 100644 index 0000000..3e77238 --- /dev/null +++ b/backend/toolbox/modules/fuzzing/aflrs.py @@ -0,0 +1,678 @@ +""" +AFL-RS Fuzzing Module + +This module uses AFL-RS (AFL in Rust) for high-performance coverage-guided fuzzing +with modern Rust implementations and optimizations. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
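+
+# Illustrative configuration sketch (comments only; all paths and values
+# below are hypothetical). The module validates a dict like this against
+# the input_schema declared in get_metadata() further down:
+#
+#     config = {
+#         "target_binary": "build/target_instrumented",
+#         "input_dir": "corpus/seeds",
+#         "max_total_time": 1800,
+#         "cpu_cores": 2,
+#     }
+#     result = await AFLRSModule().execute(config, Path("/workspace"))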
+
+
+import asyncio
+import json
+import os
+from pathlib import Path
+from typing import Dict, Any, List
+import subprocess
+import logging
+import re
+
+from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult
+from . import register_module
+
+logger = logging.getLogger(__name__)
+
+
+@register_module
+class AFLRSModule(BaseModule):
+    """AFL-RS Rust-based fuzzing module"""
+
+    def get_metadata(self) -> ModuleMetadata:
+        """Get module metadata"""
+        return ModuleMetadata(
+            name="aflrs",
+            version="0.2.0",
+            description="High-performance AFL implementation in Rust with modern fuzzing features",
+            author="FuzzForge Team",
+            category="fuzzing",
+            tags=["coverage-guided", "rust", "afl", "high-performance", "modern"],
+            input_schema={
+                "type": "object",
+                "properties": {
+                    "target_binary": {
+                        "type": "string",
+                        "description": "Path to the target binary (compiled with AFL-RS instrumentation)"
+                    },
+                    "input_dir": {
+                        "type": "string",
+                        "description": "Directory containing seed input files"
+                    },
+                    "output_dir": {
+                        "type": "string",
+                        "default": "aflrs_output",
+                        "description": "Output directory for AFL-RS results"
+                    },
+                    "dictionary": {
+                        "type": "string",
+                        "description": "Dictionary file for token-based mutations"
+                    },
+                    "timeout": {
+                        "type": "integer",
+                        "default": 1000,
+                        "description": "Timeout for each execution (ms)"
+                    },
+                    "memory_limit": {
+                        "type": "integer",
+                        "default": 50,
+                        "description": "Memory limit for target process (MB)"
+                    },
+                    "max_total_time": {
+                        "type": "integer",
+                        "default": 3600,
+                        "description": "Maximum total fuzzing time (seconds)"
+                    },
+                    "cpu_cores": {
+                        "type": "integer",
+                        "default": 1,
+                        "description": "Number of CPU cores to use"
+                    },
+                    "mutation_depth": {
+                        "type": "integer",
+                        "default": 4,
+                        "description": "Maximum depth for cascaded mutations"
+                    },
+                    "skip_deterministic": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Skip deterministic mutations"
+                    },
+                    "power_schedule": {
+                        "type": "string",
+                        "enum": ["explore", "fast", "coe", "lin", "quad", "exploit", "rare", "mmopt", "seek"],
+                        "default": "fast",
+                        "description": "Power scheduling algorithm"
+                    },
+                    "custom_mutators": {
+                        "type": "array",
+                        "items": {"type": "string"},
+                        "description": "Custom mutator libraries to load"
+                    },
+                    "cmplog": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Enable CmpLog for comparison logging"
+                    },
+                    "redqueen": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Enable RedQueen input-to-state correspondence"
+                    },
+                    "unicorn_mode": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Enable Unicorn mode for emulation"
+                    },
+                    "persistent_mode": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Enable persistent mode for faster execution"
+                    },
+                    "target_args": {
+                        "type": "array",
+                        "items": {"type": "string"},
+                        "description": "Arguments to pass to target binary"
+                    },
+                    "env_vars": {
+                        "type": "object",
+                        "description": "Environment variables to set"
+                    },
+                    "ignore_timeouts": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Ignore timeout signals and continue fuzzing"
+                    },
+                    "ignore_crashes": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Ignore crashes and continue fuzzing"
+                    },
+                    "sync_dir": {
+                        "type": "string",
+                        "description": "Directory for syncing with other AFL instances"
+                    },
+                    "sync_id": {
+                        "type": "string",
+                        "description": "Fuzzer ID for syncing"
+                    }
+                }
+            },
+            output_schema={
+                "type": "object",
+                "properties": {
+                    "findings": {
+                        "type": "array",
+                        "items": {
+
"type": "object", + "properties": { + "crash_id": {"type": "string"}, + "crash_file": {"type": "string"}, + "signal": {"type": "string"}, + "execution_time": {"type": "integer"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + target_binary = config.get("target_binary") + if not target_binary: + raise ValueError("target_binary is required for AFL-RS") + + input_dir = config.get("input_dir") + if not input_dir: + raise ValueError("input_dir is required for AFL-RS") + + cpu_cores = config.get("cpu_cores", 1) + if cpu_cores < 1: + raise ValueError("cpu_cores must be at least 1") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute AFL-RS fuzzing""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info("Running AFL-RS fuzzing campaign") + + # Check AFL-RS installation + await self._check_aflrs_installation() + + # Setup directories and files + target_binary, input_dir, output_dir = self._setup_aflrs_directories(config, workspace) + + # Run AFL-RS fuzzing + findings = await self._run_aflrs_fuzzing(target_binary, input_dir, output_dir, config, workspace) + + # Create summary + summary = self._create_summary(findings, output_dir) + + logger.info(f"AFL-RS found {len(findings)} crashes") + + return self.create_result( + findings=findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"AFL-RS module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + async def _check_aflrs_installation(self): + """Check if AFL-RS is installed and available""" + try: + # Check if aflrs is available (assuming aflrs binary) + process = await asyncio.create_subprocess_exec( + "which", "aflrs", + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + # Try alternative AFL-RS command names + alt_commands = ["afl-fuzz-rs", "afl-rs", "cargo-afl"] + found = False + + for cmd in alt_commands: + process = await asyncio.create_subprocess_exec( + "which", cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode == 0: + found = True + break + + if not found: + raise RuntimeError("AFL-RS not found. 
Please install AFL-RS or ensure it's in PATH") + + except Exception as e: + logger.warning(f"AFL-RS installation check failed: {e}") + + def _setup_aflrs_directories(self, config: Dict[str, Any], workspace: Path): + """Setup AFL-RS directories and validate files""" + # Check target binary + target_binary = workspace / config["target_binary"] + if not target_binary.exists(): + raise FileNotFoundError(f"Target binary not found: {target_binary}") + + # Check input directory + input_dir = workspace / config["input_dir"] + if not input_dir.exists(): + raise FileNotFoundError(f"Input directory not found: {input_dir}") + + # Validate input files exist + input_files = list(input_dir.glob("*")) + if not input_files: + raise ValueError(f"Input directory is empty: {input_dir}") + + # Create output directory + output_dir = workspace / config.get("output_dir", "aflrs_output") + output_dir.mkdir(exist_ok=True) + + return target_binary, input_dir, output_dir + + async def _run_aflrs_fuzzing(self, target_binary: Path, input_dir: Path, output_dir: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run AFL-RS fuzzing""" + findings = [] + + try: + # Build AFL-RS command + cmd = await self._build_aflrs_command(target_binary, input_dir, output_dir, config, workspace) + + # Set up environment + env = self._setup_aflrs_environment(config) + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run AFL-RS with timeout + max_total_time = config.get("max_total_time", 3600) + + try: + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace, + env=env + ) + + # Wait for specified time then terminate + try: + stdout, stderr = await asyncio.wait_for( + process.communicate(), timeout=max_total_time + ) + logger.info(f"AFL-RS completed after {max_total_time} seconds") + except asyncio.TimeoutError: + logger.info(f"AFL-RS fuzzing timed out after {max_total_time} seconds, terminating") + process.terminate() + try: + await asyncio.wait_for(process.wait(), timeout=10) + except asyncio.TimeoutError: + process.kill() + await process.wait() + + # Parse results + findings = self._parse_aflrs_results(output_dir, workspace) + + except Exception as e: + logger.warning(f"Error running AFL-RS process: {e}") + + except Exception as e: + logger.warning(f"Error in AFL-RS fuzzing: {e}") + + return findings + + async def _build_aflrs_command(self, target_binary: Path, input_dir: Path, output_dir: Path, config: Dict[str, Any], workspace: Path) -> List[str]: + """Build AFL-RS command""" + # Try to determine the correct AFL-RS command + aflrs_cmd = "aflrs" # Default + + # Try alternative command names + alt_commands = ["aflrs", "afl-fuzz-rs", "afl-rs"] + for cmd in alt_commands: + try: + process = await asyncio.create_subprocess_exec( + "which", cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + if process.returncode == 0: + aflrs_cmd = cmd + break + except Exception: + continue + + cmd = [aflrs_cmd] + + # Add input and output directories + cmd.extend(["-i", str(input_dir)]) + cmd.extend(["-o", str(output_dir)]) + + # Add dictionary if specified + dictionary = config.get("dictionary") + if dictionary: + dict_path = workspace / dictionary + if dict_path.exists(): + cmd.extend(["-x", str(dict_path)]) + + # Add timeout and memory limit + cmd.extend(["-t", str(config.get("timeout", 1000))]) + cmd.extend(["-m", str(config.get("memory_limit", 50))]) + + # Add CPU 
cores + cpu_cores = config.get("cpu_cores", 1) + if cpu_cores > 1: + cmd.extend(["-j", str(cpu_cores)]) + + # Add mutation depth + mutation_depth = config.get("mutation_depth", 4) + cmd.extend(["-d", str(mutation_depth)]) + + # Add power schedule + power_schedule = config.get("power_schedule", "fast") + cmd.extend(["-p", power_schedule]) + + # Add skip deterministic + if config.get("skip_deterministic", False): + cmd.append("-D") + + # Add custom mutators + custom_mutators = config.get("custom_mutators", []) + for mutator in custom_mutators: + cmd.extend(["-c", mutator]) + + # Add advanced features + if config.get("cmplog", True): + cmd.append("-l") + + if config.get("redqueen", True): + cmd.append("-I") + + if config.get("unicorn_mode", False): + cmd.append("-U") + + if config.get("persistent_mode", False): + cmd.append("-P") + + # Add ignore options + if config.get("ignore_timeouts", False): + cmd.append("-T") + + if config.get("ignore_crashes", False): + cmd.append("-C") + + # Add sync options + sync_dir = config.get("sync_dir") + if sync_dir: + cmd.extend(["-F", sync_dir]) + + sync_id = config.get("sync_id") + if sync_id: + cmd.extend(["-S", sync_id]) + + # Add target binary and arguments + cmd.append("--") + cmd.append(str(target_binary)) + + target_args = config.get("target_args", []) + cmd.extend(target_args) + + return cmd + + def _setup_aflrs_environment(self, config: Dict[str, Any]) -> Dict[str, str]: + """Setup environment variables for AFL-RS""" + env = os.environ.copy() + + # Add user-specified environment variables + env_vars = config.get("env_vars", {}) + env.update(env_vars) + + # Set AFL-RS specific environment variables + env["AFL_I_DONT_CARE_ABOUT_MISSING_CRASHES"] = "1" + env["AFL_SKIP_CPUFREQ"] = "1" + + # Enable advanced features if requested + if config.get("cmplog", True): + env["AFL_USE_CMPLOG"] = "1" + + if config.get("redqueen", True): + env["AFL_USE_REDQUEEN"] = "1" + + return env + + def _parse_aflrs_results(self, output_dir: Path, workspace: Path) -> List[ModuleFinding]: + """Parse AFL-RS results from output directory""" + findings = [] + + try: + # Look for crashes directory + crashes_dir = output_dir / "crashes" + if not crashes_dir.exists(): + logger.info("No crashes directory found in AFL-RS output") + return findings + + # Process crash files + crash_files = [f for f in crashes_dir.iterdir() if f.is_file() and not f.name.startswith(".")] + + for crash_file in crash_files: + finding = self._create_aflrs_crash_finding(crash_file, workspace) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing AFL-RS results: {e}") + + return findings + + def _create_aflrs_crash_finding(self, crash_file: Path, workspace: Path) -> ModuleFinding: + """Create finding from AFL-RS crash file""" + try: + # Parse crash filename + filename = crash_file.name + crash_info = self._parse_aflrs_filename(filename) + + # Try to read crash file (limited size) + crash_content = "" + crash_size = 0 + try: + crash_data = crash_file.read_bytes() + crash_size = len(crash_data) + # Store first 500 bytes as hex + crash_content = crash_data[:500].hex() + except Exception: + pass + + # Determine severity based on signal or crash type + signal = crash_info.get("signal", "") + severity = self._get_crash_severity(signal) + + # Create relative path + try: + rel_path = crash_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(crash_file) + + finding = self.create_finding( + title=f"AFL-RS Crash: {signal or 'Unknown 
Signal'}", + description=f"AFL-RS discovered a crash in the target program{' with signal ' + signal if signal else ''}", + severity=severity, + category=self._get_crash_category(signal), + file_path=file_path, + recommendation=self._get_crash_recommendation(signal), + metadata={ + "crash_id": crash_info.get("id", ""), + "signal": signal, + "execution_time": crash_info.get("time", ""), + "crash_file": crash_file.name, + "crash_size": crash_size, + "crash_content_hex": crash_content, + "fuzzer": "aflrs" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating AFL-RS crash finding: {e}") + return None + + def _parse_aflrs_filename(self, filename: str) -> Dict[str, str]: + """Parse AFL-RS crash filename for information""" + info = {} + + try: + # AFL-RS may use similar format to AFL++ + # Example: id_000000_sig_11_src_000000_time_12345_op_havoc_rep_128 + parts = filename.replace("id:", "id_").replace("sig:", "sig_").replace("src:", "src_").replace("time:", "time_").replace("op:", "op_").replace("rep:", "rep_").split("_") + + i = 0 + while i < len(parts) - 1: + if parts[i] in ["id", "sig", "src", "time", "op", "rep"]: + info[parts[i]] = parts[i + 1] + i += 2 + else: + i += 1 + + except Exception: + # Fallback: try to extract signal from filename + signal_match = re.search(r'sig[_:]?(\d+)', filename) + if signal_match: + info["signal"] = signal_match.group(1) + + return info + + def _get_crash_severity(self, signal: str) -> str: + """Determine crash severity based on signal""" + if not signal: + return "medium" + + try: + sig_num = int(signal) + except ValueError: + return "medium" + + # Map common signals to severity + if sig_num == 11: # SIGSEGV + return "critical" + elif sig_num == 6: # SIGABRT + return "high" + elif sig_num == 4: # SIGILL + return "high" + elif sig_num == 8: # SIGFPE + return "medium" + elif sig_num == 9: # SIGKILL + return "medium" + else: + return "medium" + + def _get_crash_category(self, signal: str) -> str: + """Determine crash category based on signal""" + if not signal: + return "program_crash" + + try: + sig_num = int(signal) + except ValueError: + return "program_crash" + + if sig_num == 11: # SIGSEGV + return "memory_corruption" + elif sig_num == 6: # SIGABRT + return "assertion_failure" + elif sig_num == 4: # SIGILL + return "illegal_instruction" + elif sig_num == 8: # SIGFPE + return "arithmetic_error" + else: + return "program_crash" + + def _get_crash_recommendation(self, signal: str) -> str: + """Generate recommendation based on crash signal""" + if not signal: + return "Analyze the crash input to reproduce and debug the issue." + + try: + sig_num = int(signal) + except ValueError: + return "Analyze the crash input to reproduce and debug the issue." + + if sig_num == 11: # SIGSEGV + return "Segmentation fault detected. Check for buffer overflows, null pointer dereferences, use-after-free, or invalid memory access patterns." + elif sig_num == 6: # SIGABRT + return "Program abort detected. Check for assertion failures, memory corruption detected by allocator, or explicit abort calls." + elif sig_num == 4: # SIGILL + return "Illegal instruction detected. Check for code corruption, invalid function pointers, or architecture-specific issues." + elif sig_num == 8: # SIGFPE + return "Floating point exception detected. Check for division by zero, arithmetic overflow, or invalid floating point operations." + else: + return f"Program terminated with signal {signal}. 
Analyze the crash input and use debugging tools to identify the root cause." + + def _create_summary(self, findings: List[ModuleFinding], output_dir: Path) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + signal_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by signal + signal = finding.metadata.get("signal", "unknown") + signal_counts[signal] = signal_counts.get(signal, 0) + 1 + + # Try to read AFL-RS statistics + stats = self._read_aflrs_stats(output_dir) + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts, + "signal_counts": signal_counts, + "unique_crashes": len(set(f.metadata.get("crash_id", "") for f in findings)), + "aflrs_stats": stats + } + + def _read_aflrs_stats(self, output_dir: Path) -> Dict[str, Any]: + """Read AFL-RS fuzzer statistics""" + stats = {} + + try: + # Look for AFL-RS stats file + stats_file = output_dir / "fuzzer_stats" + if stats_file.exists(): + with open(stats_file, 'r') as f: + for line in f: + if ':' in line: + key, value = line.strip().split(':', 1) + stats[key.strip()] = value.strip() + + # Also look for AFL-RS specific files + plot_data = output_dir / "plot_data" + if plot_data.exists(): + stats["plot_data_available"] = True + + except Exception as e: + logger.warning(f"Error reading AFL-RS stats: {e}") + + return stats \ No newline at end of file diff --git a/backend/toolbox/modules/fuzzing/atheris.py b/backend/toolbox/modules/fuzzing/atheris.py new file mode 100644 index 0000000..6c44cdd --- /dev/null +++ b/backend/toolbox/modules/fuzzing/atheris.py @@ -0,0 +1,774 @@ +""" +Atheris Fuzzing Module + +This module uses Atheris for fuzzing Python code to find bugs and security +vulnerabilities in Python applications and libraries. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +import os +import sys +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging +import traceback + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module
+
+logger = logging.getLogger(__name__)
+
+
+@register_module
+class AtherisModule(BaseModule):
+    """Atheris Python fuzzing module"""
+
+    def get_metadata(self) -> ModuleMetadata:
+        """Get module metadata"""
+        return ModuleMetadata(
+            name="atheris",
+            version="2.3.0",
+            description="Coverage-guided Python fuzzing engine for finding bugs in Python code",
+            author="FuzzForge Team",
+            category="fuzzing",
+            tags=["python", "coverage-guided", "native", "sanitizers", "libfuzzer"],
+            input_schema={
+                "type": "object",
+                "properties": {
+                    "target_script": {
+                        "type": "string",
+                        "description": "Path to the Python script containing the fuzz target function"
+                    },
+                    "target_function": {
+                        "type": "string",
+                        "default": "TestOneInput",
+                        "description": "Name of the target function to fuzz"
+                    },
+                    "corpus_dir": {
+                        "type": "string",
+                        "description": "Directory containing initial corpus files"
+                    },
+                    "dict_file": {
+                        "type": "string",
+                        "description": "Dictionary file for fuzzing keywords"
+                    },
+                    "max_total_time": {
+                        "type": "integer",
+                        "default": 600,
+                        "description": "Maximum total time to run fuzzing (seconds)"
+                    },
+                    "max_len": {
+                        "type": "integer",
+                        "default": 4096,
+                        "description": "Maximum length of test input"
+                    },
+                    "timeout": {
+                        "type": "integer",
+                        "default": 25,
+                        "description": "Timeout for individual test cases (seconds)"
+                    },
+                    "runs": {
+                        "type": "integer",
+                        "default": -1,
+                        "description": "Number of individual test runs (-1 for unlimited)"
+                    },
+                    "jobs": {
+                        "type": "integer",
+                        "default": 1,
+                        "description": "Number of fuzzing jobs to run in parallel"
+                    },
+                    "print_final_stats": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Print final statistics"
+                    },
+                    "print_pcs": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Print newly covered PCs"
+                    },
+                    "print_coverage": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Print coverage information"
+                    },
+                    "artifact_prefix": {
+                        "type": "string",
+                        "default": "crash-",
+                        "description": "Prefix for artifact files"
+                    },
+                    "seed": {
+                        "type": "integer",
+                        "description": "Random seed for reproducibility"
+                    },
+                    "python_path": {
+                        "type": "array",
+                        "items": {"type": "string"},
+                        "description": "Additional Python paths to add to sys.path"
+                    },
+                    "enable_sanitizers": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Enable Python-specific sanitizers and checks"
+                    },
+                    "detect_leaks": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Detect memory leaks in native extensions"
+                    },
+                    "detect_stack_use_after_return": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Detect stack use-after-return"
+                    },
+                    "setup_code": {
+                        "type": "string",
+                        "description": "Python code to execute before fuzzing starts"
+                    },
+                    "enable_value_profile": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Enable value profiling for better mutation"
+                    },
+                    "shrink": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Try to shrink the corpus"
+                    },
+                    "only_ascii": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Only generate ASCII inputs"
+                    }
+                }
+            },
+            output_schema={
+                "type": "object",
+                "properties": {
+                    "findings": {
+                        "type": "array",
+                        "items": {
+                            "type": "object",
+                            "properties": {
+                                "exception_type": {"type": "string"},
+                                "exception_message": {"type": "string"},
+                                "stack_trace": {"type": "string"},
+                                "crash_input": {"type": "string"}
+                            }
+                        }
+                    }
+                }
+            }
+        )
+
+    def validate_config(self, config: Dict[str, Any]) -> bool:
+        """Validate 
configuration""" + target_script = config.get("target_script") + if not target_script: + raise ValueError("target_script is required for Atheris") + + max_total_time = config.get("max_total_time", 600) + if max_total_time <= 0: + raise ValueError("max_total_time must be positive") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute Atheris Python fuzzing""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info("Running Atheris Python fuzzing") + + # Check Atheris installation + await self._check_atheris_installation() + + # Validate target script + target_script = workspace / config["target_script"] + if not target_script.exists(): + raise FileNotFoundError(f"Target script not found: {target_script}") + + # Run Atheris fuzzing + findings = await self._run_atheris_fuzzing(target_script, config, workspace) + + # Create summary + summary = self._create_summary(findings) + + logger.info(f"Atheris found {len(findings)} issues") + + return self.create_result( + findings=findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"Atheris module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + async def _check_atheris_installation(self): + """Check if Atheris is installed""" + try: + process = await asyncio.create_subprocess_exec( + sys.executable, "-c", "import atheris; print(atheris.__version__)", + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + raise RuntimeError("Atheris not installed. Install with: pip install atheris") + + version = stdout.decode().strip() + logger.info(f"Using Atheris version: {version}") + + except Exception as e: + raise RuntimeError(f"Atheris installation check failed: {e}") + + async def _run_atheris_fuzzing(self, target_script: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run Atheris fuzzing""" + findings = [] + + try: + # Create output directory for artifacts + output_dir = workspace / "atheris_output" + output_dir.mkdir(exist_ok=True) + + # Create wrapper script for fuzzing + wrapper_script = await self._create_atheris_wrapper(target_script, config, workspace, output_dir) + + # Build Atheris command + cmd = [sys.executable, str(wrapper_script)] + + # Add corpus directory + corpus_dir = config.get("corpus_dir") + if corpus_dir: + corpus_path = workspace / corpus_dir + if corpus_path.exists(): + cmd.append(str(corpus_path)) + + # Set up environment + env = self._setup_atheris_environment(config) + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run Atheris with timeout + max_total_time = config.get("max_total_time", 600) + + try: + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace, + env=env + ) + + # Wait for specified time then terminate + try: + stdout, stderr = await asyncio.wait_for( + process.communicate(), timeout=max_total_time + ) + except asyncio.TimeoutError: + logger.info(f"Atheris fuzzing timed out after {max_total_time} seconds") + process.terminate() + try: + await asyncio.wait_for(process.wait(), timeout=10) + except asyncio.TimeoutError: + process.kill() + await process.wait() + + # Parse results + findings = self._parse_atheris_output( + stdout.decode(), stderr.decode(), output_dir, workspace + ) 
+
+                # Look for crash files
+                crash_findings = self._parse_crash_files(output_dir, workspace)
+                findings.extend(crash_findings)
+
+            except Exception as e:
+                logger.warning(f"Error running Atheris process: {e}")
+
+        except Exception as e:
+            logger.warning(f"Error in Atheris fuzzing: {e}")
+
+        return findings
+
+    async def _create_atheris_wrapper(self, target_script: Path, config: Dict[str, Any], workspace: Path, output_dir: Path) -> Path:
+        """Create wrapper script for Atheris fuzzing.
+
+        Single-brace expressions in the template below are interpolated now,
+        while this f-string is rendered, so config values are baked into the
+        generated script; the wrapper itself never sees the config dict.
+        Double braces produce literal braces evaluated inside the wrapper.
+        """
+        wrapper_path = workspace / "atheris_wrapper.py"
+
+        wrapper_code = f'''#!/usr/bin/env python3
+import sys
+import os
+import atheris
+import traceback
+
+# Add Python paths
+python_paths = {config.get("python_path", [])}
+for path in python_paths:
+    if path not in sys.path:
+        sys.path.insert(0, path)
+
+# Add workspace to Python path
+sys.path.insert(0, r"{workspace}")
+
+# Setup code
+setup_code = """{config.get("setup_code", "")}"""
+if setup_code:
+    exec(setup_code)
+
+# Import target script
+target_module_name = "{target_script.stem}"
+sys.path.insert(0, r"{target_script.parent}")
+
+try:
+    target_module = __import__(target_module_name)
+    target_function = getattr(target_module, "{config.get("target_function", "TestOneInput")}")
+except Exception as e:
+    print(f"Failed to import target: {{e}}")
+    sys.exit(1)
+
+# Wrapper function to catch exceptions
+original_target = target_function
+
+def wrapped_target(data):
+    try:
+        return original_target(data)
+    except Exception as e:
+        # Write crash information
+        crash_info = {{
+            "exception_type": type(e).__name__,
+            "exception_message": str(e),
+            "stack_trace": traceback.format_exc(),
+            "input_data": data[:1000].hex() if isinstance(data, bytes) else str(data)[:1000]
+        }}
+
+        crash_file = r"{output_dir}" + "/crash_" + type(e).__name__ + ".txt"
+        with open(crash_file, "a") as f:
+            f.write(f"Exception: {{type(e).__name__}}\\n")
+            f.write(f"Message: {{str(e)}}\\n")
+            f.write(f"Stack trace:\\n{{traceback.format_exc()}}\\n")
+            f.write(f"Input data (first 1000 chars/bytes): {{crash_info['input_data']}}\\n")
+            f.write("-" * 80 + "\\n")
+
+        # Re-raise to let Atheris handle it
+        raise
+
+if __name__ == "__main__":
+    # Build the libFuzzer options first: atheris.Setup() consumes them from
+    # the argument list it is given, so they must exist before Setup runs.
+    # The option values were baked in when this wrapper was generated.
+    options = []
+
+    options.append("-max_total_time={config.get('max_total_time', 600)}")
+    options.append("-max_len={config.get('max_len', 4096)}")
+    options.append("-timeout={config.get('timeout', 25)}")
+    options.append("-runs={config.get('runs', -1)}")
+
+    if {config.get('jobs', 1)} > 1:
+        options.append("-jobs={config.get('jobs', 1)}")
+
+    if {config.get('print_final_stats', True)}:
+        options.append("-print_final_stats=1")
+    else:
+        options.append("-print_final_stats=0")
+
+    if {config.get('print_pcs', False)}:
+        options.append("-print_pcs=1")
+
+    if {config.get('print_coverage', True)}:
+        options.append("-print_coverage=1")
+
+    artifact_prefix = "{config.get('artifact_prefix', 'crash-')}"
+    options.append("-artifact_prefix=" + r"{output_dir}" + "/" + artifact_prefix)
+
+    seed = {config.get('seed')}
+    if seed is not None:
+        options.append(f"-seed={{seed}}")
+
+    if {config.get('enable_value_profile', False)}:
+        options.append("-use_value_profile=1")
+
+    if {config.get('shrink', True)}:
+        options.append("-shrink=1")
+
+    if {config.get('only_ascii', False)}:
+        options.append("-only_ascii=1")
+
+    dict_file = "{config.get('dict_file', '')}"
+    if dict_file:
+        dict_path = r"{workspace}" + "/" + dict_file
+        if os.path.exists(dict_path):
+            options.append(f"-dict={{dict_path}}")
+
+    # Hand argv plus the collected options to Atheris, then start fuzzing
+    atheris.Setup(sys.argv + options, wrapped_target)
+    atheris.Fuzz()
+'''
+
+        with open(wrapper_path, 'w') as f:
+            f.write(wrapper_code)
+
+        return wrapper_path
+
+    def _setup_atheris_environment(self, config: Dict[str, Any]) -> Dict[str, str]:
+        """Setup environment variables for Atheris"""
+        env = os.environ.copy()
+
+        # Enable sanitizers if requested
+        if config.get("enable_sanitizers", True):
+            env["ASAN_OPTIONS"] = env.get("ASAN_OPTIONS", "") + ":detect_leaks=1:halt_on_error=1"
+
+        if config.get("detect_leaks", True):
+            env["ASAN_OPTIONS"] = env.get("ASAN_OPTIONS", "") + ":detect_leaks=1"
+
+        if config.get("detect_stack_use_after_return", False):
+            env["ASAN_OPTIONS"] = env.get("ASAN_OPTIONS", "") + ":detect_stack_use_after_return=1"
+
+        return env
+
+    def _parse_atheris_output(self, stdout: str, stderr: str, output_dir: Path, workspace: Path) -> List[ModuleFinding]:
+        """Parse Atheris output for crashes and issues"""
+        findings = []
+
+        try:
+            # Combine stdout and stderr
+            full_output = stdout + "\n" + stderr
+
+            # Look for Python exceptions in output
+            exception_patterns = [
+                r"Traceback \(most recent call last\):(.*?)(?=\n\w|\nDONE|\n=|\Z)",
+                r"Exception: (\w+).*?\nMessage: (.*?)\nStack trace:\n(.*?)(?=\n-{20,}|\Z)"
+            ]
+
+            # "re" is not imported at module top, so import it locally once here
+            import re
+
+            for pattern in exception_patterns:
+                matches = re.findall(pattern, full_output, re.DOTALL | re.MULTILINE)
+                for match in matches:
+                    finding = self._create_exception_finding(match, full_output, output_dir)
+                    if finding:
+                        findings.append(finding)
+
+        except Exception as e:
+            logger.warning(f"Error parsing Atheris output: {e}")
+
+        return findings
+
+    def _parse_crash_files(self, output_dir: Path, workspace: Path) -> List[ModuleFinding]:
+        """Parse crash files created by wrapper"""
+        findings = []
+
+        try:
+            # Look for crash files
+            crash_files = list(output_dir.glob("crash_*.txt"))
+
+            for crash_file in crash_files:
+                findings.extend(self._parse_crash_file(crash_file, workspace))
+
+            # Also look for Atheris artifact files
+            artifact_files = list(output_dir.glob("crash-*"))
+            for artifact_file in artifact_files:
+                finding = self._create_artifact_finding(artifact_file, workspace)
+                if finding:
+                    findings.append(finding)
+
+        except Exception as e:
+            logger.warning(f"Error parsing crash files: {e}")
+
+        return findings
+
+    def _parse_crash_file(self, crash_file: Path, workspace: Path) -> List[ModuleFinding]:
+        """Parse individual crash file"""
+        findings = []
+
+        try:
+            content = crash_file.read_text()
+
+            # Split by separator
+            crash_entries = content.split("-" * 80)
+
+            for entry in crash_entries:
+                if not entry.strip():
+                    continue
+
+                finding = self._parse_crash_entry(entry, crash_file, workspace)
+                if finding:
+                    findings.append(finding)
+
+        except Exception as e:
+            logger.warning(f"Error parsing crash file {crash_file}: {e}")
+
+        return findings
+
+    def _parse_crash_entry(self, entry: str, crash_file: Path, workspace: Path) -> ModuleFinding:
+        """Parse individual crash entry"""
+        try:
+            lines = entry.strip().split('\n')
+
+            exception_type = ""
+            exception_message = ""
+            stack_trace = ""
+            input_data = ""
+
+            current_section = None
+            stack_lines = []
+
+            for line in lines:
+                if line.startswith("Exception: "):
+                    exception_type = line.replace("Exception: ", "")
+                elif line.startswith("Message: "):
+                    exception_message = line.replace("Message: ", "")
+                elif line.startswith("Stack trace:"):
+                    current_section = "stack"
+                elif line.startswith("Input data"):
+                    current_section = "input"
+
input_data = line.split(":", 1)[1].strip() if ":" in line else "" + elif current_section == "stack": + stack_lines.append(line) + + stack_trace = '\n'.join(stack_lines) + + if not exception_type: + return None + + # Determine severity based on exception type + severity = self._get_exception_severity(exception_type) + + # Create relative path + try: + rel_path = crash_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(crash_file) + + finding = self.create_finding( + title=f"Atheris Exception: {exception_type}", + description=f"Atheris discovered a Python exception: {exception_type}{': ' + exception_message if exception_message else ''}", + severity=severity, + category=self._get_exception_category(exception_type), + file_path=file_path, + recommendation=self._get_exception_recommendation(exception_type, exception_message), + metadata={ + "exception_type": exception_type, + "exception_message": exception_message, + "stack_trace": stack_trace[:2000] if stack_trace else "", # Limit size + "crash_input_preview": input_data[:500] if input_data else "", + "fuzzer": "atheris" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error parsing crash entry: {e}") + return None + + def _create_exception_finding(self, match, full_output: str, output_dir: Path) -> ModuleFinding: + """Create finding from exception match""" + try: + if isinstance(match, tuple) and len(match) >= 1: + # Handle different match formats + if len(match) == 3: # Exception format + exception_type, exception_message, stack_trace = match + else: + stack_trace = match[0] + exception_type = "Unknown" + exception_message = "" + else: + stack_trace = str(match) + exception_type = "Unknown" + exception_message = "" + + # Try to extract exception type from stack trace + if not exception_type or exception_type == "Unknown": + lines = stack_trace.split('\n') + for line in reversed(lines): + if ':' in line and any(exc in line for exc in ['Error', 'Exception', 'Warning']): + exception_type = line.split(':')[0].strip() + exception_message = line.split(':', 1)[1].strip() if ':' in line else "" + break + + severity = self._get_exception_severity(exception_type) + + finding = self.create_finding( + title=f"Atheris Exception: {exception_type}", + description=f"Atheris discovered a Python exception during fuzzing: {exception_type}", + severity=severity, + category=self._get_exception_category(exception_type), + file_path=None, + recommendation=self._get_exception_recommendation(exception_type, exception_message), + metadata={ + "exception_type": exception_type, + "exception_message": exception_message, + "stack_trace": stack_trace[:2000] if stack_trace else "", + "fuzzer": "atheris" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating exception finding: {e}") + return None + + def _create_artifact_finding(self, artifact_file: Path, workspace: Path) -> ModuleFinding: + """Create finding from Atheris artifact file""" + try: + # Try to read artifact content (limited) + artifact_content = "" + try: + content_bytes = artifact_file.read_bytes()[:1000] + artifact_content = content_bytes.hex() + except Exception: + pass + + # Create relative path + try: + rel_path = artifact_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(artifact_file) + + finding = self.create_finding( + title="Atheris Crash Artifact", + description=f"Atheris generated a crash artifact file: {artifact_file.name}", + severity="medium", + 
category="program_crash", + file_path=file_path, + recommendation="Analyze the crash artifact to reproduce and debug the issue. The artifact contains the input that caused the crash.", + metadata={ + "artifact_type": "crash", + "artifact_file": artifact_file.name, + "artifact_content_hex": artifact_content, + "fuzzer": "atheris" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating artifact finding: {e}") + return None + + def _get_exception_severity(self, exception_type: str) -> str: + """Determine severity based on exception type""" + if not exception_type: + return "medium" + + exception_lower = exception_type.lower() + + # Critical security issues + if any(term in exception_lower for term in ["segmentationfault", "accessviolation", "memoryerror"]): + return "critical" + + # High severity exceptions + elif any(term in exception_lower for term in ["attributeerror", "typeerror", "indexerror", "keyerror", "valueerror"]): + return "high" + + # Medium severity exceptions + elif any(term in exception_lower for term in ["assertionerror", "runtimeerror", "ioerror", "oserror"]): + return "medium" + + # Lower severity exceptions + elif any(term in exception_lower for term in ["warning", "deprecation"]): + return "low" + + else: + return "medium" + + def _get_exception_category(self, exception_type: str) -> str: + """Determine category based on exception type""" + if not exception_type: + return "python_exception" + + exception_lower = exception_type.lower() + + if any(term in exception_lower for term in ["memory", "segmentation", "access"]): + return "memory_corruption" + elif any(term in exception_lower for term in ["attribute", "type"]): + return "type_error" + elif any(term in exception_lower for term in ["index", "key", "value"]): + return "data_error" + elif any(term in exception_lower for term in ["io", "os", "file"]): + return "io_error" + elif any(term in exception_lower for term in ["assertion"]): + return "assertion_failure" + else: + return "python_exception" + + def _get_exception_recommendation(self, exception_type: str, exception_message: str) -> str: + """Generate recommendation based on exception type""" + if not exception_type: + return "Analyze the exception and fix the underlying code issue." + + exception_lower = exception_type.lower() + + if "attributeerror" in exception_lower: + return "Fix AttributeError by ensuring objects have the expected attributes before accessing them. Add proper error handling and validation." + elif "typeerror" in exception_lower: + return "Fix TypeError by ensuring correct data types are used. Add type checking and validation for function parameters." + elif "indexerror" in exception_lower: + return "Fix IndexError by adding bounds checking before accessing list/array elements. Validate indices are within valid range." + elif "keyerror" in exception_lower: + return "Fix KeyError by checking if keys exist in dictionaries before accessing them. Use .get() method or proper key validation." + elif "valueerror" in exception_lower: + return "Fix ValueError by validating input values before processing. Add proper input sanitization and validation." + elif "memoryerror" in exception_lower: + return "Fix MemoryError by optimizing memory usage, processing data in chunks, or increasing available memory." + elif "assertionerror" in exception_lower: + return "Fix AssertionError by reviewing assertion conditions and ensuring they properly validate the expected state." 
+ else: + return f"Fix the {exception_type} exception by analyzing the root cause and implementing appropriate error handling and validation." + + def _create_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + exception_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by exception type + exception_type = finding.metadata.get("exception_type", "unknown") + exception_counts[exception_type] = exception_counts.get(exception_type, 0) + 1 + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts, + "exception_counts": exception_counts, + "unique_exceptions": len(exception_counts), + "python_specific_issues": sum(category_counts.get(cat, 0) for cat in ["type_error", "data_error", "python_exception"]) + } \ No newline at end of file diff --git a/backend/toolbox/modules/fuzzing/cargo_fuzz.py b/backend/toolbox/modules/fuzzing/cargo_fuzz.py new file mode 100644 index 0000000..b6d3d3e --- /dev/null +++ b/backend/toolbox/modules/fuzzing/cargo_fuzz.py @@ -0,0 +1,572 @@ +""" +Cargo Fuzz Module + +This module uses cargo-fuzz for fuzzing Rust code with libFuzzer integration. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +import os +from pathlib import Path +from typing import Dict, Any, List, Tuple +import subprocess +import logging +import httpx +import re +from datetime import datetime, timedelta + +try: + from prefect import get_run_context +except ImportError: + # Fallback for when not running in Prefect context + get_run_context = None + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class CargoFuzzModule(BaseModule): + """Cargo Fuzz Rust fuzzing module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="cargo_fuzz", + version="0.11.2", + description="Rust fuzzing integration with libFuzzer using cargo-fuzz", + author="FuzzForge Team", + category="fuzzing", + tags=["rust", "libfuzzer", "cargo", "coverage-guided", "sanitizers"], + input_schema={ + "type": "object", + "properties": { + "project_dir": { + "type": "string", + "description": "Path to Rust project directory (with Cargo.toml)" + }, + "fuzz_target": { + "type": "string", + "description": "Name of the fuzz target to run" + }, + "max_total_time": { + "type": "integer", + "default": 600, + "description": "Maximum total time to run fuzzing (seconds)" + }, + "jobs": { + "type": "integer", + "default": 1, + "description": "Number of worker processes" + }, + "corpus_dir": { + "type": "string", + "description": "Custom corpus directory" + }, + "artifacts_dir": { + "type": "string", + "description": "Custom artifacts directory" + }, + "sanitizer": { + "type": "string", + "enum": ["address", "memory", "thread", "leak", "none"], + "default": "address", + "description": "Sanitizer to use" + }, + "release": { + "type": "boolean", + "default": False, + "description": "Use release mode" + }, + "debug_assertions": { + "type": "boolean", + "default": True, + "description": "Enable debug assertions" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "crash_type": {"type": "string"}, + "artifact_path": {"type": "string"}, + "stack_trace": {"type": "string"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + project_dir = config.get("project_dir") + if not project_dir: + raise ValueError("project_dir is required") + + fuzz_target = config.get("fuzz_target") + if not fuzz_target: + raise ValueError("fuzz_target is required") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path, stats_callback=None) -> ModuleResult: + """Execute cargo-fuzz fuzzing""" + self.start_timer() + + try: + # Initialize last observed stats for summary propagation + self._last_stats = { + 'executions': 0, + 'executions_per_sec': 0.0, + 'crashes': 0, + 'corpus_size': 0, + 'elapsed_time': 0, + } + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info("Running cargo-fuzz Rust fuzzing") + + # Check installation + await self._check_cargo_fuzz_installation() + + # Setup project + project_dir = workspace / config["project_dir"] + await self._setup_cargo_fuzz_project(project_dir, config) + + # Run fuzzing + findings = await self._run_cargo_fuzz(project_dir, config, workspace, stats_callback) + + # Create summary and enrich with last observed runtime stats + summary = self._create_summary(findings) + try: + summary.update({ + 'executions': self._last_stats.get('executions', 0), + 'executions_per_sec': self._last_stats.get('executions_per_sec', 0.0), + 'corpus_size': self._last_stats.get('corpus_size', 0), + 'crashes': self._last_stats.get('crashes', 0), + 'elapsed_time': self._last_stats.get('elapsed_time', 0), + }) + except Exception: + pass + + logger.info(f"cargo-fuzz found {len(findings)} issues") + + return self.create_result( + findings=findings, + status="success", + 
summary=summary
+            )
+
+        except Exception as e:
+            logger.error(f"cargo-fuzz module failed: {e}")
+            return self.create_result(
+                findings=[],
+                status="failed",
+                error=str(e)
+            )
+
+    async def _check_cargo_fuzz_installation(self):
+        """Check if cargo-fuzz is installed"""
+        try:
+            process = await asyncio.create_subprocess_exec(
+                "cargo", "fuzz", "--version",
+                stdout=asyncio.subprocess.PIPE,
+                stderr=asyncio.subprocess.PIPE
+            )
+            stdout, stderr = await process.communicate()
+
+            if process.returncode != 0:
+                raise RuntimeError("cargo-fuzz not installed. Install with: cargo install cargo-fuzz")
+
+        except Exception as e:
+            raise RuntimeError(f"cargo-fuzz installation check failed: {e}")
+
+    async def _setup_cargo_fuzz_project(self, project_dir: Path, config: Dict[str, Any]):
+        """Setup cargo-fuzz project"""
+        if not project_dir.exists():
+            raise FileNotFoundError(f"Project directory not found: {project_dir}")
+
+        cargo_toml = project_dir / "Cargo.toml"
+        if not cargo_toml.exists():
+            raise FileNotFoundError(f"Cargo.toml not found in {project_dir}")
+
+        # Check if fuzz directory exists, if not initialize
+        fuzz_dir = project_dir / "fuzz"
+        if not fuzz_dir.exists():
+            logger.info("Initializing cargo-fuzz project")
+            process = await asyncio.create_subprocess_exec(
+                "cargo", "fuzz", "init",
+                cwd=project_dir,
+                stdout=asyncio.subprocess.PIPE,
+                stderr=asyncio.subprocess.PIPE
+            )
+            await process.communicate()
+
+    async def _run_cargo_fuzz(self, project_dir: Path, config: Dict[str, Any], workspace: Path, stats_callback=None) -> List[ModuleFinding]:
+        """Run cargo-fuzz with real-time statistics reporting"""
+        findings = []
+
+        # Get run_id from Prefect context for statistics reporting
+        run_id = None
+        if get_run_context:
+            try:
+                context = get_run_context()
+                run_id = str(context.flow_run.id)
+            except Exception:
+                logger.warning("Could not get run_id from Prefect context")
+
+        try:
+            # Build command: cargo-fuzz options go before the target name,
+            # and everything after a single "--" is forwarded to libFuzzer
+            cmd = ["cargo", "fuzz", "run"]
+
+            # Set sanitizer
+            sanitizer = config.get("sanitizer", "address")
+            if sanitizer != "none":
+                cmd.append(f"--sanitizer={sanitizer}")
+
+            if config.get("release", False):
+                cmd.append("--release")
+
+            cmd.append(config["fuzz_target"])
+
+            # libFuzzer options, passed after a single "--" separator
+            libfuzzer_args = []
+
+            max_time = config.get("max_total_time", 600)
+            libfuzzer_args.append(f"-max_total_time={max_time}")
+
+            if config.get("jobs", 1) > 1:
+                libfuzzer_args.append(f"-jobs={config['jobs']}")
+
+            if libfuzzer_args:
+                cmd.append("--")
+                cmd.extend(libfuzzer_args)
+
+            # Set environment
+            env = os.environ.copy()
+            if config.get("debug_assertions", True):
+                env["RUSTFLAGS"] = env.get("RUSTFLAGS", "") + " -C debug-assertions=on"
+
+            logger.debug(f"Running command: {' '.join(cmd)}")
+
+            # Run with streaming output processing for real-time stats
+            try:
+                process = await asyncio.create_subprocess_exec(
+                    *cmd,
+                    stdout=asyncio.subprocess.PIPE,
+                    stderr=asyncio.subprocess.STDOUT,  # Merge stderr into stdout
+                    cwd=project_dir,
+                    env=env
+                )
+
+                # Process output in real-time
+                stdout_data, stderr_data = await self._process_streaming_output(
+                    process, max_time, config, stats_callback
+                )
+
+                # Parse final results
+                findings = self._parse_cargo_fuzz_output(
+                    stdout_data, stderr_data, project_dir, workspace, config
+                )
+
+            except Exception as e:
+                logger.warning(f"Error running cargo-fuzz: {e}")
+
+        except Exception as e:
+            logger.warning(f"Error in cargo-fuzz execution: {e}")
+
+        return findings
+
+    def _parse_cargo_fuzz_output(self, stdout: str, stderr: str, project_dir: Path, workspace: Path, config: Dict[str, Any]) -> List[ModuleFinding]:
+        """Parse cargo-fuzz output"""
+        findings = []
+
+        try:
+            full_output = stdout + "\n" 
+ stderr + + # Look for crash artifacts + artifacts_dir = project_dir / "fuzz" / "artifacts" / config["fuzz_target"] + if artifacts_dir.exists(): + for artifact in artifacts_dir.iterdir(): + if artifact.is_file(): + finding = self._create_artifact_finding(artifact, workspace, full_output) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing cargo-fuzz output: {e}") + + return findings + + def _create_artifact_finding(self, artifact_path: Path, workspace: Path, output: str) -> ModuleFinding: + """Create finding from artifact file""" + try: + # Try to determine crash type from filename or content + crash_type = "crash" + if "leak" in artifact_path.name.lower(): + crash_type = "memory_leak" + elif "timeout" in artifact_path.name.lower(): + crash_type = "timeout" + + # Extract stack trace from output + stack_trace = self._extract_stack_trace_from_output(output, artifact_path.name) + + try: + rel_path = artifact_path.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(artifact_path) + + severity = "high" if "crash" in crash_type else "medium" + + finding = self.create_finding( + title=f"cargo-fuzz {crash_type.title()}", + description=f"cargo-fuzz discovered a {crash_type} in the Rust code", + severity=severity, + category=self._get_crash_category(crash_type), + file_path=file_path, + recommendation=self._get_crash_recommendation(crash_type), + metadata={ + "crash_type": crash_type, + "artifact_path": str(artifact_path), + "stack_trace": stack_trace, + "fuzzer": "cargo_fuzz" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating artifact finding: {e}") + return None + + def _extract_stack_trace_from_output(self, output: str, artifact_name: str) -> str: + """Extract stack trace from output""" + try: + lines = output.split('\n') + stack_lines = [] + in_stack = False + + for line in lines: + if artifact_name in line or "stack backtrace:" in line.lower(): + in_stack = True + continue + + if in_stack: + if line.strip() and ("at " in line or "::" in line or line.strip().startswith("0:")): + stack_lines.append(line.strip()) + elif not line.strip() and stack_lines: + break + + return '\n'.join(stack_lines[:20]) # Limit stack trace size + + except Exception: + return "" + + def _get_crash_category(self, crash_type: str) -> str: + """Get category for crash type""" + if "leak" in crash_type: + return "memory_leak" + elif "timeout" in crash_type: + return "performance_issues" + else: + return "memory_safety" + + def _get_crash_recommendation(self, crash_type: str) -> str: + """Get recommendation for crash type""" + if "leak" in crash_type: + return "Fix memory leak by ensuring proper cleanup of allocated resources. Review memory management patterns." + elif "timeout" in crash_type: + return "Fix timeout by optimizing performance, avoiding infinite loops, and implementing reasonable bounds." + else: + return "Fix the crash by analyzing the stack trace and addressing memory safety issues." 
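+
+    # Illustrative usage sketch (assumed values, not part of the module API):
+    # given a crate with a fuzz target named "parse", a minimal config for
+    # execute() might look like:
+    #
+    #     config = {
+    #         "project_dir": "my_crate",
+    #         "fuzz_target": "parse",
+    #         "max_total_time": 120,
+    #         "sanitizer": "address",
+    #     }
+    #     result = await CargoFuzzModule().execute(config, workspace)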
+ + async def _process_streaming_output(self, process, max_time: int, config: Dict[str, Any], stats_callback=None) -> Tuple[str, str]: + """Process cargo-fuzz output in real-time and report statistics""" + stdout_lines = [] + start_time = datetime.utcnow() + last_update = start_time + stats_data = { + 'executions': 0, + 'executions_per_sec': 0.0, + 'crashes': 0, + 'corpus_size': 0, + 'elapsed_time': 0 + } + + # Get run_id from Prefect context for statistics reporting + run_id = None + if get_run_context: + try: + context = get_run_context() + run_id = str(context.flow_run.id) + except Exception: + logger.debug("Could not get run_id from Prefect context") + + try: + # Emit an initial baseline update so dashboards show activity immediately + try: + await self._send_stats_via_callback(stats_callback, run_id, stats_data) + except Exception: + pass + # Monitor process output in chunks to capture libFuzzer carriage-return updates + buffer = "" + while True: + try: + chunk = await asyncio.wait_for(process.stdout.read(4096), timeout=1.0) + if not chunk: + # Process finished + break + + buffer += chunk.decode('utf-8', errors='ignore') + + # Split on both newline and carriage return + if "\n" in buffer or "\r" in buffer: + parts = re.split(r"[\r\n]", buffer) + buffer = parts[-1] + for part in parts[:-1]: + line = part.strip() + if not line: + continue + stdout_lines.append(line) + self._parse_stats_from_line(line, stats_data) + + except asyncio.TimeoutError: + # No output this second; continue to periodic update check + pass + + # Periodic update (even if there was no output) + current_time = datetime.utcnow() + stats_data['elapsed_time'] = int((current_time - start_time).total_seconds()) + if current_time - last_update >= timedelta(seconds=3): + try: + self._last_stats = dict(stats_data) + except Exception: + pass + await self._send_stats_via_callback(stats_callback, run_id, stats_data) + last_update = current_time + + # Check if max time exceeded + if stats_data['elapsed_time'] >= max_time: + logger.info("Max time reached, terminating cargo-fuzz") + process.terminate() + break + + # Wait for process to complete + await process.wait() + + # Send final stats update + try: + self._last_stats = dict(stats_data) + except Exception: + pass + await self._send_stats_via_callback(stats_callback, run_id, stats_data) + + except Exception as e: + logger.warning(f"Error processing streaming output: {e}") + + stdout_data = '\n'.join(stdout_lines) + return stdout_data, "" + + def _parse_stats_from_line(self, line: str, stats_data: Dict[str, Any]): + """Parse statistics from a cargo-fuzz output line""" + try: + # cargo-fuzz typically shows stats like: + # "#12345: DONE cov: 1234 ft: 5678 corp: 9/10Mb exec/s: 1500 rss: 234Mb" + # "#12345: NEW cov: 1234 ft: 5678 corp: 9/10Mb exec/s: 1500 rss: 234Mb L: 45/67 MS: 3 ..." 
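+            # (These status lines come from the libFuzzer engine that cargo-fuzz
+            # wraps; the exact fields vary between versions, which is why several
+            # alternative patterns are tried below.)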
+ + # Extract execution count (the #number) + exec_match = re.search(r'#(\d+)(?::)?', line) + if exec_match: + stats_data['executions'] = int(exec_match.group(1)) + else: + # libFuzzer stats format alternative + exec_alt = re.search(r'stat::number_of_executed_units:\s*(\d+)', line) + if exec_alt: + stats_data['executions'] = int(exec_alt.group(1)) + else: + exec_alt2 = re.search(r'executed units:?\s*(\d+)', line, re.IGNORECASE) + if exec_alt2: + stats_data['executions'] = int(exec_alt2.group(1)) + + # Extract executions per second + exec_per_sec_match = re.search(r'exec/s:\s*([0-9\.]+)', line) + if exec_per_sec_match: + stats_data['executions_per_sec'] = float(exec_per_sec_match.group(1)) + else: + eps_alt = re.search(r'stat::execs_per_sec:\s*([0-9\.]+)', line) + if eps_alt: + stats_data['executions_per_sec'] = float(eps_alt.group(1)) + + # Extract corpus size (corp: X/YMb) + corp_match = re.search(r'corp(?:us)?:\s*(\d+)', line) + if corp_match: + stats_data['corpus_size'] = int(corp_match.group(1)) + + # Look for crash indicators + if any(keyword in line.lower() for keyword in ['crash', 'assert', 'panic', 'abort']): + stats_data['crashes'] += 1 + + except Exception as e: + logger.debug(f"Error parsing stats from line '{line}': {e}") + + async def _send_stats_via_callback(self, stats_callback, run_id: str, stats_data: Dict[str, Any]): + """Send statistics update via callback function""" + if not stats_callback or not run_id: + return + + try: + # Prepare statistics payload + stats_payload = { + "run_id": run_id, + "workflow": "language_fuzzing", + "executions": stats_data['executions'], + "executions_per_sec": stats_data['executions_per_sec'], + "crashes": stats_data['crashes'], + "unique_crashes": stats_data['crashes'], # Assume all crashes are unique for now + "corpus_size": stats_data['corpus_size'], + "elapsed_time": stats_data['elapsed_time'], + "timestamp": datetime.utcnow().isoformat() + } + + # Call the callback function provided by the Prefect task + await stats_callback(stats_payload) + logger.info( + "LIVE STATS SENT: exec=%s eps=%.2f crashes=%s corpus=%s elapsed=%s", + stats_data['executions'], + stats_data['executions_per_sec'], + stats_data['crashes'], + stats_data['corpus_size'], + stats_data['elapsed_time'], + ) + + except Exception as e: + logger.debug(f"Error sending stats via callback: {e}") + + def _create_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + + for finding in findings: + severity_counts[finding.severity] += 1 + category_counts[finding.category] = category_counts.get(finding.category, 0) + 1 + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts + } diff --git a/backend/toolbox/modules/fuzzing/go_fuzz.py b/backend/toolbox/modules/fuzzing/go_fuzz.py new file mode 100644 index 0000000..89ad165 --- /dev/null +++ b/backend/toolbox/modules/fuzzing/go_fuzz.py @@ -0,0 +1,384 @@ +""" +Go-Fuzz Module + +This module uses go-fuzz for coverage-guided fuzzing of Go packages. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. 
+# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
+#
+# Additional attribution and requirements are provided in the NOTICE file.
+
+
+import asyncio
+import json
+import os
+from pathlib import Path
+from typing import Dict, Any, List
+import subprocess
+import logging
+
+from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult
+from . import register_module
+
+logger = logging.getLogger(__name__)
+
+
+@register_module
+class GoFuzzModule(BaseModule):
+    """Go-Fuzz Go language fuzzing module"""
+
+    def get_metadata(self) -> ModuleMetadata:
+        """Get module metadata"""
+        return ModuleMetadata(
+            name="go_fuzz",
+            version="1.2.0",
+            description="Coverage-guided fuzzing for Go packages using go-fuzz",
+            author="FuzzForge Team",
+            category="fuzzing",
+            tags=["go", "golang", "coverage-guided", "packages"],
+            input_schema={
+                "type": "object",
+                "properties": {
+                    "package_path": {
+                        "type": "string",
+                        "description": "Path to Go package to fuzz"
+                    },
+                    "fuzz_function": {
+                        "type": "string",
+                        "default": "Fuzz",
+                        "description": "Name of the fuzz function"
+                    },
+                    "workdir": {
+                        "type": "string",
+                        "default": "go_fuzz_workdir",
+                        "description": "Working directory for go-fuzz"
+                    },
+                    "procs": {
+                        "type": "integer",
+                        "default": 1,
+                        "description": "Number of parallel processes"
+                    },
+                    "timeout": {
+                        "type": "integer",
+                        "default": 600,
+                        "description": "Total fuzzing timeout (seconds)"
+                    },
+                    "race": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Enable race detector"
+                    },
+                    "minimize": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Minimize crashers"
+                    },
+                    "sonar": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Enable sonar mode"
+                    }
+                }
+            },
+            output_schema={
+                "type": "object",
+                "properties": {
+                    "findings": {
+                        "type": "array",
+                        "items": {
+                            "type": "object",
+                            "properties": {
+                                "crash_type": {"type": "string"},
+                                "crash_file": {"type": "string"},
+                                "stack_trace": {"type": "string"}
+                            }
+                        }
+                    }
+                }
+            }
+        )
+
+    def validate_config(self, config: Dict[str, Any]) -> bool:
+        """Validate configuration"""
+        package_path = config.get("package_path")
+        if not package_path:
+            raise ValueError("package_path is required")
+
+        return True
+
+    async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult:
+        """Execute go-fuzz fuzzing"""
+        self.start_timer()
+
+        try:
+            # Validate inputs
+            self.validate_config(config)
+            self.validate_workspace(workspace)
+
+            logger.info("Running go-fuzz Go fuzzing")
+
+            # Check installation
+            await self._check_go_fuzz_installation()
+
+            # Setup
+            package_path = workspace / config["package_path"]
+            workdir = workspace / config.get("workdir", "go_fuzz_workdir")
+
+            # Build and run
+            findings = await self._run_go_fuzz(package_path, workdir, config, workspace)
+
+            # Create summary
+            summary = self._create_summary(findings)
+
+            logger.info(f"go-fuzz found {len(findings)} issues")
+
+            return self.create_result(
+                findings=findings,
+                status="success",
+                summary=summary
+            )
+
+        except Exception as e:
+            logger.error(f"go-fuzz module failed: {e}")
+            return self.create_result(
+                findings=[],
+                status="failed",
+                error=str(e)
+            )
+
+    async def _check_go_fuzz_installation(self):
+        """Check if go-fuzz is installed"""
+        try:
+            process = await asyncio.create_subprocess_exec(
+                "go-fuzz", "--help",
+                stdout=asyncio.subprocess.PIPE,
+                stderr=asyncio.subprocess.PIPE
+            )
+            await process.communicate()
+
+            if process.returncode != 0:
+                # Try installing go-fuzz
+                process = await 
asyncio.create_subprocess_exec( + "go", "install", "github.com/dvyukov/go-fuzz/go-fuzz@latest", + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + await process.communicate() + + except Exception as e: + raise RuntimeError(f"go-fuzz installation failed: {e}") + + async def _run_go_fuzz(self, package_path: Path, workdir: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run go-fuzz""" + findings = [] + + try: + # Create workdir + workdir.mkdir(exist_ok=True) + + # Build + await self._build_go_fuzz(package_path, config) + + # Run fuzzing + cmd = ["go-fuzz", "-bin", f"{package_path.name}-fuzz.zip", "-workdir", str(workdir)] + + if config.get("procs", 1) > 1: + cmd.extend(["-procs", str(config["procs"])]) + + if config.get("race", False): + cmd.append("-race") + + if config.get("sonar", False): + cmd.append("-sonar") + + timeout = config.get("timeout", 600) + + try: + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=package_path.parent + ) + + try: + stdout, stderr = await asyncio.wait_for( + process.communicate(), timeout=timeout + ) + except asyncio.TimeoutError: + process.terminate() + await process.wait() + + # Parse results + findings = self._parse_go_fuzz_results(workdir, workspace, config) + + except Exception as e: + logger.warning(f"Error running go-fuzz: {e}") + + except Exception as e: + logger.warning(f"Error in go-fuzz execution: {e}") + + return findings + + async def _build_go_fuzz(self, package_path: Path, config: Dict[str, Any]): + """Build go-fuzz binary""" + cmd = ["go-fuzz-build"] + if config.get("race", False): + cmd.append("-race") + + process = await asyncio.create_subprocess_exec( + *cmd, + cwd=package_path, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + raise RuntimeError(f"go-fuzz-build failed: {stderr.decode()}") + + def _parse_go_fuzz_results(self, workdir: Path, workspace: Path, config: Dict[str, Any]) -> List[ModuleFinding]: + """Parse go-fuzz results""" + findings = [] + + try: + # Look for crashers + crashers_dir = workdir / "crashers" + if crashers_dir.exists(): + for crash_file in crashers_dir.iterdir(): + if crash_file.is_file() and not crash_file.name.startswith("."): + finding = self._create_crash_finding(crash_file, workspace) + if finding: + findings.append(finding) + + # Look for suppressions (potential issues) + suppressions_dir = workdir / "suppressions" + if suppressions_dir.exists(): + for supp_file in suppressions_dir.iterdir(): + if supp_file.is_file(): + finding = self._create_suppression_finding(supp_file, workspace) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing go-fuzz results: {e}") + + return findings + + def _create_crash_finding(self, crash_file: Path, workspace: Path) -> ModuleFinding: + """Create finding from crash file""" + try: + # Read crash output + crash_content = "" + if crash_file.name.endswith(".output"): + crash_content = crash_file.read_text() + + # Determine crash type + crash_type = "panic" + if "runtime error" in crash_content: + crash_type = "runtime_error" + elif "race" in crash_content: + crash_type = "race_condition" + + try: + rel_path = crash_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(crash_file) + + finding = self.create_finding( + title=f"go-fuzz {crash_type.title()}", + 
description=f"go-fuzz discovered a {crash_type} in the Go code", + severity=self._get_crash_severity(crash_type), + category=self._get_crash_category(crash_type), + file_path=file_path, + recommendation=self._get_crash_recommendation(crash_type), + metadata={ + "crash_type": crash_type, + "crash_file": str(crash_file), + "stack_trace": crash_content[:1000], + "fuzzer": "go_fuzz" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating crash finding: {e}") + return None + + def _create_suppression_finding(self, supp_file: Path, workspace: Path) -> ModuleFinding: + """Create finding from suppression file""" + try: + try: + rel_path = supp_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(supp_file) + + finding = self.create_finding( + title="go-fuzz Potential Issue", + description="go-fuzz identified a potential issue that was suppressed", + severity="low", + category="potential_issue", + file_path=file_path, + recommendation="Review suppressed issue to determine if it requires attention.", + metadata={ + "suppression_file": str(supp_file), + "fuzzer": "go_fuzz" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating suppression finding: {e}") + return None + + def _get_crash_severity(self, crash_type: str) -> str: + """Get crash severity""" + if crash_type == "race_condition": + return "high" + elif crash_type == "runtime_error": + return "high" + else: + return "medium" + + def _get_crash_category(self, crash_type: str) -> str: + """Get crash category""" + if crash_type == "race_condition": + return "race_condition" + elif crash_type == "runtime_error": + return "runtime_error" + else: + return "program_crash" + + def _get_crash_recommendation(self, crash_type: str) -> str: + """Get crash recommendation""" + if crash_type == "race_condition": + return "Fix race condition by adding proper synchronization (mutexes, channels, etc.)" + elif crash_type == "runtime_error": + return "Fix runtime error by adding bounds checking and proper error handling" + else: + return "Analyze the crash and fix the underlying issue" + + def _create_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + + for finding in findings: + severity_counts[finding.severity] += 1 + category_counts[finding.category] = category_counts.get(finding.category, 0) + 1 + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts + } \ No newline at end of file diff --git a/backend/toolbox/modules/fuzzing/libfuzzer.py b/backend/toolbox/modules/fuzzing/libfuzzer.py new file mode 100644 index 0000000..0addcbb --- /dev/null +++ b/backend/toolbox/modules/fuzzing/libfuzzer.py @@ -0,0 +1,705 @@ +""" +LibFuzzer Fuzzing Module + +This module uses LibFuzzer (LLVM's coverage-guided fuzzing engine) to find +bugs and security vulnerabilities in C/C++ code. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
+
+
+import asyncio
+import json
+import os
+from pathlib import Path
+from typing import Dict, Any, List
+import subprocess
+import logging
+import re
+
+from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult
+from . import register_module
+
+logger = logging.getLogger(__name__)
+
+
+@register_module
+class LibFuzzerModule(BaseModule):
+    """LibFuzzer coverage-guided fuzzing module"""
+
+    def get_metadata(self) -> ModuleMetadata:
+        """Get module metadata"""
+        return ModuleMetadata(
+            name="libfuzzer",
+            version="17.0.0",
+            description="LLVM's coverage-guided fuzzing engine for finding bugs in C/C++ code",
+            author="FuzzForge Team",
+            category="fuzzing",
+            tags=["coverage-guided", "c", "cpp", "llvm", "sanitizers", "memory-safety"],
+            input_schema={
+                "type": "object",
+                "properties": {
+                    "target_binary": {
+                        "type": "string",
+                        "description": "Path to the fuzz target binary (compiled with -fsanitize=fuzzer)"
+                    },
+                    "corpus_dir": {
+                        "type": "string",
+                        "description": "Directory containing initial corpus files"
+                    },
+                    "dict_file": {
+                        "type": "string",
+                        "description": "Dictionary file for fuzzing keywords"
+                    },
+                    "max_total_time": {
+                        "type": "integer",
+                        "default": 600,
+                        "description": "Maximum total time to run fuzzing (seconds)"
+                    },
+                    "max_len": {
+                        "type": "integer",
+                        "default": 4096,
+                        "description": "Maximum length of test input"
+                    },
+                    "timeout": {
+                        "type": "integer",
+                        "default": 25,
+                        "description": "Timeout for individual test cases (seconds)"
+                    },
+                    "runs": {
+                        "type": "integer",
+                        "default": -1,
+                        "description": "Number of individual test runs (-1 for unlimited)"
+                    },
+                    "jobs": {
+                        "type": "integer",
+                        "default": 1,
+                        "description": "Number of fuzzing jobs to run in parallel"
+                    },
+                    "workers": {
+                        "type": "integer",
+                        "default": 1,
+                        "description": "Number of workers for parallel fuzzing"
+                    },
+                    "reload": {
+                        "type": "integer",
+                        "default": 1,
+                        "description": "Reload the main corpus periodically"
+                    },
+                    "print_final_stats": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Print final statistics"
+                    },
+                    "print_pcs": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Print newly covered PCs"
+                    },
+                    "print_funcs": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Print newly covered functions"
+                    },
+                    "print_coverage": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Print coverage information"
+                    },
+                    "shrink": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Try to shrink the corpus"
+                    },
+                    "reduce_inputs": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Try to reduce the size of inputs"
+                    },
+                    "use_value_profile": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Use value profile for fuzzing"
+                    },
+                    "sanitizers": {
+                        "type": "array",
+                        "items": {"type": "string", "enum": ["address", "memory", "undefined", "thread", "leak"]},
+                        "default": ["address"],
+                        "description": "Sanitizers to use during fuzzing"
+                    },
+                    "artifact_prefix": {
+                        "type": "string",
+                        "default": "crash-",
+                        "description": "Prefix for artifact files"
+                    },
+                    "exact_artifact_path": {
+                        "type": "string",
+                        "description": "Exact path for artifact files"
+                    },
+                    "fork": {
+                        "type": "integer",
+                        "default": 0,
+                        "description": "Fork mode (number of simultaneous processes)"
+                    },
+                    "ignore_crashes": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Ignore crashes and continue fuzzing"
+                    },
+                    "ignore_timeouts": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Ignore timeouts and continue fuzzing"
+                    },
+                    "ignore_ooms": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Ignore out-of-memory and continue fuzzing"
+                    }
+                }
+            },
+            output_schema={
+                "type": "object",
+                "properties": {
+                    "findings": {
+                        "type": "array",
+                        "items": {
+                            "type": "object",
+                            "properties": {
+                                "crash_type": {"type": "string"},
+                                "crash_file": {"type": "string"},
+                                "stack_trace": {"type": "string"},
+                                "sanitizer": {"type": "string"}
+                            }
+                        }
+                    }
+                }
+            }
+        )
+
+    def validate_config(self, config: Dict[str, Any]) -> bool:
+        """Validate configuration"""
+        target_binary = config.get("target_binary")
+        if not target_binary:
+            raise ValueError("target_binary is required for LibFuzzer")
+
+        max_total_time = config.get("max_total_time", 600)
+        if max_total_time <= 0:
+            raise ValueError("max_total_time must be positive")
+
+        return True
+
+    async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult:
+        """Execute LibFuzzer fuzzing"""
+        self.start_timer()
+
+        try:
+            # Validate inputs
+            self.validate_config(config)
+            self.validate_workspace(workspace)
+
+            logger.info("Running LibFuzzer fuzzing campaign")
+
+            # Check if target binary exists
+            target_binary = workspace / config["target_binary"]
+            if not target_binary.exists():
+                raise FileNotFoundError(f"Target binary not found: {target_binary}")
+
+            # Run LibFuzzer
+            findings = await self._run_libfuzzer(target_binary, config, workspace)
+
+            # Create summary
+            summary = self._create_summary(findings)
+
+            logger.info(f"LibFuzzer found {len(findings)} issues")
+
+            return self.create_result(
+                findings=findings,
+                status="success",
+                summary=summary
+            )
+
+        except Exception as e:
+            logger.error(f"LibFuzzer module failed: {e}")
+            return self.create_result(
+                findings=[],
+                status="failed",
+                error=str(e)
+            )
+
+    async def _run_libfuzzer(self, target_binary: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]:
+        """Run LibFuzzer fuzzing"""
+        findings = []
+
+        try:
+            # Create output directory for artifacts
+            output_dir = workspace / "libfuzzer_output"
+            output_dir.mkdir(exist_ok=True)
+
+            # Build LibFuzzer command
+            cmd = [str(target_binary)]
+
+            # Add corpus directory
+            corpus_dir = config.get("corpus_dir")
+            if corpus_dir:
+                corpus_path = workspace / corpus_dir
+                if corpus_path.exists():
+                    cmd.append(str(corpus_path))
+                else:
+                    logger.warning(f"Corpus directory not found: {corpus_path}")
+
+            # Add dictionary file
+            dict_file = config.get("dict_file")
+            if dict_file:
+                dict_path = workspace / dict_file
+                if dict_path.exists():
+                    cmd.append(f"-dict={dict_path}")
+
+            # Add fuzzing parameters
+            cmd.append(f"-max_total_time={config.get('max_total_time', 600)}")
+            cmd.append(f"-max_len={config.get('max_len', 4096)}")
+            cmd.append(f"-timeout={config.get('timeout', 25)}")
+            cmd.append(f"-runs={config.get('runs', -1)}")
+
+            if config.get("jobs", 1) > 1:
+                cmd.append(f"-jobs={config['jobs']}")
+
+            if config.get("workers", 1) > 1:
+                cmd.append(f"-workers={config['workers']}")
+
+            cmd.append(f"-reload={config.get('reload', 1)}")
+
+            # Add output options
+            if config.get("print_final_stats", True):
+                cmd.append("-print_final_stats=1")
+
+            if config.get("print_pcs", False):
+                cmd.append("-print_pcs=1")
+
+            if config.get("print_funcs", False):
+                cmd.append("-print_funcs=1")
+
+            if config.get("print_coverage", True):
+                cmd.append("-print_coverage=1")
+
+            # Add corpus management options
+            if config.get("shrink", True):
+                cmd.append("-shrink=1")
+
+            if config.get("reduce_inputs", True):
+                cmd.append("-reduce_inputs=1")
+
+            if config.get("use_value_profile", 
False): + cmd.append("-use_value_profile=1") + + # Add artifact options + artifact_prefix = config.get("artifact_prefix", "crash-") + cmd.append(f"-artifact_prefix={output_dir / artifact_prefix}") + + exact_artifact_path = config.get("exact_artifact_path") + if exact_artifact_path: + cmd.append(f"-exact_artifact_path={output_dir / exact_artifact_path}") + + # Add fork mode + fork = config.get("fork", 0) + if fork > 0: + cmd.append(f"-fork={fork}") + + # Add ignore options + if config.get("ignore_crashes", False): + cmd.append("-ignore_crashes=1") + + if config.get("ignore_timeouts", False): + cmd.append("-ignore_timeouts=1") + + if config.get("ignore_ooms", False): + cmd.append("-ignore_ooms=1") + + # Set up environment for sanitizers + env = os.environ.copy() + sanitizers = config.get("sanitizers", ["address"]) + self._setup_sanitizer_environment(env, sanitizers) + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run LibFuzzer + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace, + env=env + ) + + stdout, stderr = await process.communicate() + + # Parse results + findings = self._parse_libfuzzer_output( + stdout.decode(), stderr.decode(), output_dir, workspace, sanitizers + ) + + # Look for crash files + crash_findings = self._parse_crash_files(output_dir, workspace, sanitizers) + findings.extend(crash_findings) + + except Exception as e: + logger.warning(f"Error running LibFuzzer: {e}") + + return findings + + def _setup_sanitizer_environment(self, env: Dict[str, str], sanitizers: List[str]): + """Set up environment variables for sanitizers""" + if "address" in sanitizers: + env["ASAN_OPTIONS"] = env.get("ASAN_OPTIONS", "") + ":halt_on_error=0:abort_on_error=1" + + if "memory" in sanitizers: + env["MSAN_OPTIONS"] = env.get("MSAN_OPTIONS", "") + ":halt_on_error=0:abort_on_error=1" + + if "undefined" in sanitizers: + env["UBSAN_OPTIONS"] = env.get("UBSAN_OPTIONS", "") + ":halt_on_error=0:abort_on_error=1" + + if "thread" in sanitizers: + env["TSAN_OPTIONS"] = env.get("TSAN_OPTIONS", "") + ":halt_on_error=0:abort_on_error=1" + + if "leak" in sanitizers: + env["LSAN_OPTIONS"] = env.get("LSAN_OPTIONS", "") + ":halt_on_error=0:abort_on_error=1" + + def _parse_libfuzzer_output(self, stdout: str, stderr: str, output_dir: Path, workspace: Path, sanitizers: List[str]) -> List[ModuleFinding]: + """Parse LibFuzzer output for crashes and issues""" + findings = [] + + try: + # Combine stdout and stderr for analysis + full_output = stdout + "\n" + stderr + + # Look for crash indicators + crash_patterns = [ + r"ERROR: AddressSanitizer: (.+)", + r"ERROR: MemorySanitizer: (.+)", + r"ERROR: UndefinedBehaviorSanitizer: (.+)", + r"ERROR: ThreadSanitizer: (.+)", + r"ERROR: LeakSanitizer: (.+)", + r"SUMMARY: (.+Sanitizer): (.+)", + r"==\d+==ERROR: libFuzzer: (.+)" + ] + + for pattern in crash_patterns: + matches = re.finditer(pattern, full_output, re.MULTILINE) + for match in matches: + finding = self._create_crash_finding( + match, full_output, output_dir, sanitizers + ) + if finding: + findings.append(finding) + + # Look for timeout and OOM issues + if "TIMEOUT" in full_output: + finding = self._create_timeout_finding(full_output, output_dir) + if finding: + findings.append(finding) + + if "out-of-memory" in full_output.lower() or "oom" in full_output.lower(): + finding = self._create_oom_finding(full_output, output_dir) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error 
parsing LibFuzzer output: {e}") + + return findings + + def _parse_crash_files(self, output_dir: Path, workspace: Path, sanitizers: List[str]) -> List[ModuleFinding]: + """Parse crash artifact files""" + findings = [] + + try: + # Look for crash files + crash_patterns = ["crash-*", "leak-*", "timeout-*", "oom-*"] + for pattern in crash_patterns: + crash_files = list(output_dir.glob(pattern)) + for crash_file in crash_files: + finding = self._create_artifact_finding(crash_file, workspace, sanitizers) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing crash files: {e}") + + return findings + + def _create_crash_finding(self, match, full_output: str, output_dir: Path, sanitizers: List[str]) -> ModuleFinding: + """Create finding from crash match""" + try: + crash_type = match.group(1) if match.groups() else "Unknown crash" + + # Extract stack trace + stack_trace = self._extract_stack_trace(full_output, match.start()) + + # Determine sanitizer + sanitizer = self._identify_sanitizer(match.group(0), sanitizers) + + # Determine severity based on crash type + severity = self._get_crash_severity(crash_type, sanitizer) + + # Create finding + finding = self.create_finding( + title=f"LibFuzzer Crash: {crash_type}", + description=f"LibFuzzer detected a crash with {sanitizer}: {crash_type}", + severity=severity, + category=self._get_crash_category(crash_type), + file_path=None, # LibFuzzer doesn't always provide specific files + recommendation=self._get_crash_recommendation(crash_type, sanitizer), + metadata={ + "crash_type": crash_type, + "sanitizer": sanitizer, + "stack_trace": stack_trace[:2000] if stack_trace else "", # Limit size + "fuzzer": "libfuzzer" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating crash finding: {e}") + return None + + def _create_timeout_finding(self, output: str, output_dir: Path) -> ModuleFinding: + """Create finding for timeout issues""" + try: + finding = self.create_finding( + title="LibFuzzer Timeout", + description="LibFuzzer detected a timeout during fuzzing, indicating potential infinite loop or performance issue", + severity="medium", + category="performance_issues", + file_path=None, + recommendation="Review the code for potential infinite loops, excessive computation, or blocking operations that could cause timeouts.", + metadata={ + "issue_type": "timeout", + "fuzzer": "libfuzzer" + } + ) + return finding + + except Exception as e: + logger.warning(f"Error creating timeout finding: {e}") + return None + + def _create_oom_finding(self, output: str, output_dir: Path) -> ModuleFinding: + """Create finding for out-of-memory issues""" + try: + finding = self.create_finding( + title="LibFuzzer Out-of-Memory", + description="LibFuzzer detected an out-of-memory condition during fuzzing, indicating potential memory leak or excessive allocation", + severity="medium", + category="memory_management", + file_path=None, + recommendation="Review memory allocation patterns, check for memory leaks, and consider implementing proper bounds checking.", + metadata={ + "issue_type": "out_of_memory", + "fuzzer": "libfuzzer" + } + ) + return finding + + except Exception as e: + logger.warning(f"Error creating OOM finding: {e}") + return None + + def _create_artifact_finding(self, crash_file: Path, workspace: Path, sanitizers: List[str]) -> ModuleFinding: + """Create finding from crash artifact file""" + try: + crash_type = crash_file.name.split('-')[0] # e.g., "crash", "leak", "timeout" + + # Try to 
read crash file content (limited) + crash_content = "" + try: + crash_content = crash_file.read_bytes()[:1000].decode('utf-8', errors='ignore') + except Exception: + pass + + # Determine severity + severity = self._get_artifact_severity(crash_type) + + finding = self.create_finding( + title=f"LibFuzzer Artifact: {crash_type}", + description=f"LibFuzzer generated a {crash_type} artifact file indicating a potential issue", + severity=severity, + category=self._get_crash_category(crash_type), + file_path=str(crash_file.relative_to(workspace)), + recommendation=self._get_artifact_recommendation(crash_type), + metadata={ + "artifact_type": crash_type, + "artifact_file": str(crash_file.name), + "crash_content_preview": crash_content, + "fuzzer": "libfuzzer" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating artifact finding: {e}") + return None + + def _extract_stack_trace(self, output: str, start_pos: int) -> str: + """Extract stack trace from output""" + try: + lines = output[start_pos:].split('\n') + stack_lines = [] + + for line in lines[:50]: # Limit to first 50 lines + if any(indicator in line for indicator in ["#0", "#1", "#2", "at ", "in "]): + stack_lines.append(line.strip()) + elif stack_lines and not line.strip(): + break + + return '\n'.join(stack_lines) + + except Exception: + return "" + + def _identify_sanitizer(self, crash_line: str, sanitizers: List[str]) -> str: + """Identify which sanitizer detected the issue""" + crash_lower = crash_line.lower() + + if "addresssanitizer" in crash_lower: + return "AddressSanitizer" + elif "memorysanitizer" in crash_lower: + return "MemorySanitizer" + elif "undefinedbehaviorsanitizer" in crash_lower: + return "UndefinedBehaviorSanitizer" + elif "threadsanitizer" in crash_lower: + return "ThreadSanitizer" + elif "leaksanitizer" in crash_lower: + return "LeakSanitizer" + elif "libfuzzer" in crash_lower: + return "LibFuzzer" + else: + return "Unknown" + + def _get_crash_severity(self, crash_type: str, sanitizer: str) -> str: + """Determine severity based on crash type and sanitizer""" + crash_lower = crash_type.lower() + + # Critical issues + if any(term in crash_lower for term in ["heap-buffer-overflow", "stack-buffer-overflow", "use-after-free", "double-free"]): + return "critical" + + # High severity issues + elif any(term in crash_lower for term in ["heap-use-after-free", "stack-use-after-return", "global-buffer-overflow"]): + return "high" + + # Medium severity issues + elif any(term in crash_lower for term in ["uninitialized", "leak", "race", "deadlock"]): + return "medium" + + # Default to high for any crash + else: + return "high" + + def _get_crash_category(self, crash_type: str) -> str: + """Determine category based on crash type""" + crash_lower = crash_type.lower() + + if any(term in crash_lower for term in ["buffer-overflow", "heap-buffer", "stack-buffer", "global-buffer"]): + return "buffer_overflow" + elif any(term in crash_lower for term in ["use-after-free", "double-free", "invalid-free"]): + return "memory_corruption" + elif any(term in crash_lower for term in ["uninitialized", "uninit"]): + return "uninitialized_memory" + elif any(term in crash_lower for term in ["leak"]): + return "memory_leak" + elif any(term in crash_lower for term in ["race", "data-race"]): + return "race_condition" + elif any(term in crash_lower for term in ["timeout"]): + return "performance_issues" + elif any(term in crash_lower for term in ["oom", "out-of-memory"]): + return "memory_management" + else: + return 
"memory_safety" + + def _get_artifact_severity(self, artifact_type: str) -> str: + """Determine severity for artifact types""" + if artifact_type == "crash": + return "high" + elif artifact_type == "leak": + return "medium" + elif artifact_type in ["timeout", "oom"]: + return "medium" + else: + return "low" + + def _get_crash_recommendation(self, crash_type: str, sanitizer: str) -> str: + """Generate recommendation based on crash type""" + crash_lower = crash_type.lower() + + if "buffer-overflow" in crash_lower: + return "Fix buffer overflow by implementing proper bounds checking, using safe string functions, and validating array indices." + elif "use-after-free" in crash_lower: + return "Fix use-after-free by setting pointers to NULL after freeing, using smart pointers, or redesigning object lifetime management." + elif "double-free" in crash_lower: + return "Fix double-free by ensuring each allocation has exactly one corresponding free, or use RAII patterns." + elif "uninitialized" in crash_lower: + return "Initialize all variables before use and ensure proper constructor implementation." + elif "leak" in crash_lower: + return "Fix memory leak by ensuring all allocated memory is properly freed, use smart pointers, or implement proper cleanup routines." + elif "race" in crash_lower: + return "Fix data race by using proper synchronization mechanisms like mutexes, atomic operations, or lock-free data structures." + else: + return f"Address the {crash_type} issue detected by {sanitizer}. Review code for memory safety and proper resource management." + + def _get_artifact_recommendation(self, artifact_type: str) -> str: + """Generate recommendation for artifact types""" + if artifact_type == "crash": + return "Analyze the crash artifact file to reproduce the issue and identify the root cause. Fix the underlying bug that caused the crash." + elif artifact_type == "leak": + return "Investigate the memory leak by analyzing allocation patterns and ensuring proper cleanup of resources." + elif artifact_type == "timeout": + return "Optimize code performance to prevent timeouts, check for infinite loops, and implement reasonable time limits." + elif artifact_type == "oom": + return "Reduce memory usage, implement proper memory management, and add bounds checking for allocations." + else: + return f"Analyze the {artifact_type} artifact to understand and fix the underlying issue." 
+ + def _create_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + sanitizer_counts = {} + crash_type_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by sanitizer + sanitizer = finding.metadata.get("sanitizer", "unknown") + sanitizer_counts[sanitizer] = sanitizer_counts.get(sanitizer, 0) + 1 + + # Count by crash type + crash_type = finding.metadata.get("crash_type", finding.metadata.get("issue_type", "unknown")) + crash_type_counts[crash_type] = crash_type_counts.get(crash_type, 0) + 1 + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts, + "sanitizer_counts": sanitizer_counts, + "crash_type_counts": crash_type_counts, + "memory_safety_issues": category_counts.get("memory_safety", 0) + + category_counts.get("buffer_overflow", 0) + + category_counts.get("memory_corruption", 0), + "performance_issues": category_counts.get("performance_issues", 0) + } \ No newline at end of file diff --git a/backend/toolbox/modules/fuzzing/oss_fuzz.py b/backend/toolbox/modules/fuzzing/oss_fuzz.py new file mode 100644 index 0000000..83ff7d8 --- /dev/null +++ b/backend/toolbox/modules/fuzzing/oss_fuzz.py @@ -0,0 +1,547 @@ +""" +OSS-Fuzz Module + +This module integrates with Google's OSS-Fuzz for continuous fuzzing +of open source projects. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +import os +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module
+
+logger = logging.getLogger(__name__)
+
+
+@register_module
+class OSSFuzzModule(BaseModule):
+    """OSS-Fuzz continuous fuzzing module"""
+
+    def get_metadata(self) -> ModuleMetadata:
+        """Get module metadata"""
+        return ModuleMetadata(
+            name="oss_fuzz",
+            version="1.0.0",
+            description="Integration with Google's OSS-Fuzz continuous fuzzing service for open source projects",
+            author="FuzzForge Team",
+            category="fuzzing",
+            tags=["oss-fuzz", "continuous", "google", "open-source", "docker"],
+            input_schema={
+                "type": "object",
+                "properties": {
+                    "project_name": {
+                        "type": "string",
+                        "description": "OSS-Fuzz project name"
+                    },
+                    "source_dir": {
+                        "type": "string",
+                        "description": "Source directory to fuzz"
+                    },
+                    "build_script": {
+                        "type": "string",
+                        "default": "build.sh",
+                        "description": "Build script path"
+                    },
+                    "dockerfile": {
+                        "type": "string",
+                        "default": "Dockerfile",
+                        "description": "Dockerfile path"
+                    },
+                    "project_yaml": {
+                        "type": "string",
+                        "default": "project.yaml",
+                        "description": "Project configuration file"
+                    },
+                    "sanitizer": {
+                        "type": "string",
+                        "enum": ["address", "memory", "undefined", "coverage"],
+                        "default": "address",
+                        "description": "Sanitizer to use"
+                    },
+                    "architecture": {
+                        "type": "string",
+                        "enum": ["x86_64", "i386"],
+                        "default": "x86_64",
+                        "description": "Target architecture"
+                    },
+                    "fuzzing_engine": {
+                        "type": "string",
+                        "enum": ["libfuzzer", "afl", "honggfuzz"],
+                        "default": "libfuzzer",
+                        "description": "Fuzzing engine to use"
+                    },
+                    "timeout": {
+                        "type": "integer",
+                        "default": 3600,
+                        "description": "Fuzzing timeout (seconds)"
+                    },
+                    "check_build": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Check whether the build succeeds before fuzzing"
+                    },
+                    "reproduce_bugs": {
+                        "type": "boolean",
+                        "default": False,
+                        "description": "Try to reproduce existing bugs"
+                    }
+                }
+            },
+            output_schema={
+                "type": "object",
+                "properties": {
+                    "findings": {
+                        "type": "array",
+                        "items": {
+                            "type": "object",
+                            "properties": {
+                                "bug_type": {"type": "string"},
+                                "reproducer": {"type": "string"},
+                                "stack_trace": {"type": "string"},
+                                "sanitizer": {"type": "string"}
+                            }
+                        }
+                    }
+                }
+            }
+        )
+
+    def validate_config(self, config: Dict[str, Any]) -> bool:
+        """Validate configuration"""
+        project_name = config.get("project_name")
+        if not project_name:
+            raise ValueError("project_name is required")
+
+        source_dir = config.get("source_dir")
+        if not source_dir:
+            raise ValueError("source_dir is required")
+
+        return True
+
+    async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult:
+        """Execute OSS-Fuzz integration"""
+        self.start_timer()
+
+        try:
+            # Validate inputs
+            self.validate_config(config)
+            self.validate_workspace(workspace)
+
+            logger.info("Running OSS-Fuzz integration")
+
+            # Check Docker
+            await self._check_docker()
+
+            # Clone/update OSS-Fuzz if needed
+            oss_fuzz_dir = await self._setup_oss_fuzz(workspace)
+
+            # Setup project
+            await self._setup_project(oss_fuzz_dir, config, workspace)
+
+            # Build and run
+            findings = await self._run_oss_fuzz(oss_fuzz_dir, config, workspace)
+
+            # Create summary
+            summary = self._create_summary(findings)
+
+            logger.info(f"OSS-Fuzz found {len(findings)} issues")
+
+            return self.create_result(
+                findings=findings,
+                status="success",
+                summary=summary
+            )
+
+        except Exception as e:
+            logger.error(f"OSS-Fuzz module failed: {e}")
+            return self.create_result(
+                findings=[],
+                status="failed",
+                error=str(e)
+            )
+
+    async def _check_docker(self):
+        """Check if Docker is available"""
+        try:
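+            # Probing `docker --version` is a lightweight availability check:
+            # a missing binary makes create_subprocess_exec raise (caught
+            # below), and a non-zero exit is treated the same way, since
+            # OSS-Fuzz builds and runs every fuzz target inside Docker.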
process = await asyncio.create_subprocess_exec( + "docker", "--version", + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + raise RuntimeError("Docker not available. OSS-Fuzz requires Docker.") + + except Exception as e: + raise RuntimeError(f"Docker check failed: {e}") + + async def _setup_oss_fuzz(self, workspace: Path) -> Path: + """Setup OSS-Fuzz repository""" + oss_fuzz_dir = workspace / "oss-fuzz" + + if not oss_fuzz_dir.exists(): + logger.info("Cloning OSS-Fuzz repository") + process = await asyncio.create_subprocess_exec( + "git", "clone", "https://github.com/google/oss-fuzz.git", + cwd=workspace, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + raise RuntimeError(f"Failed to clone OSS-Fuzz: {stderr.decode()}") + + return oss_fuzz_dir + + async def _setup_project(self, oss_fuzz_dir: Path, config: Dict[str, Any], workspace: Path): + """Setup OSS-Fuzz project""" + project_name = config["project_name"] + project_dir = oss_fuzz_dir / "projects" / project_name + + # Create project directory if it doesn't exist + project_dir.mkdir(parents=True, exist_ok=True) + + # Copy source if provided + source_dir = workspace / config["source_dir"] + if source_dir.exists(): + # Create symlink or copy source + logger.info(f"Setting up source directory: {source_dir}") + + # Setup required files if they don't exist + await self._create_project_files(project_dir, config, workspace) + + async def _create_project_files(self, project_dir: Path, config: Dict[str, Any], workspace: Path): + """Create required OSS-Fuzz project files""" + + # Create Dockerfile if it doesn't exist + dockerfile = project_dir / config.get("dockerfile", "Dockerfile") + if not dockerfile.exists(): + dockerfile_content = f'''FROM gcr.io/oss-fuzz-base/base-builder +COPY . $SRC/{config["project_name"]} +WORKDIR $SRC/{config["project_name"]} +COPY {config.get("build_script", "build.sh")} $SRC/ +''' + dockerfile.write_text(dockerfile_content) + + # Create build.sh if it doesn't exist + build_script = project_dir / config.get("build_script", "build.sh") + if not build_script.exists(): + build_content = f'''#!/bin/bash -eu +# Build script for {config["project_name"]} +# Add your build commands here +echo "Building {config['project_name']}..." 
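+# Example build commands (placeholder, adjust to your project): a typical
+# libFuzzer target under OSS-Fuzz is compiled with the toolchain exported by
+# the gcr.io/oss-fuzz-base/base-builder image:
+#
+#   $CXX $CXXFLAGS -c fuzz_target.cc -o fuzz_target.o
+#   $CXX $CXXFLAGS fuzz_target.o -o $OUT/fuzz_target $LIB_FUZZING_ENGINE
+#
+# $CXX, $CXXFLAGS, $OUT and $LIB_FUZZING_ENGINE are provided by the base
+# image; fuzz_target.cc is a hypothetical source file name.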
+''' + build_script.write_text(build_content) + build_script.chmod(0o755) + + # Create project.yaml if it doesn't exist + project_yaml = project_dir / config.get("project_yaml", "project.yaml") + if not project_yaml.exists(): + yaml_content = f'''homepage: "https://example.com" +language: c++ +primary_contact: "security@example.com" +auto_ccs: + - "fuzzing@example.com" +sanitizers: + - {config.get("sanitizer", "address")} +architectures: + - {config.get("architecture", "x86_64")} +fuzzing_engines: + - {config.get("fuzzing_engine", "libfuzzer")} +''' + project_yaml.write_text(yaml_content) + + async def _run_oss_fuzz(self, oss_fuzz_dir: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run OSS-Fuzz""" + findings = [] + + try: + project_name = config["project_name"] + sanitizer = config.get("sanitizer", "address") + architecture = config.get("architecture", "x86_64") + + # Build project + if config.get("check_build", True): + await self._build_project(oss_fuzz_dir, project_name, sanitizer, architecture) + + # Check build + await self._check_build(oss_fuzz_dir, project_name, sanitizer, architecture) + + # Run fuzzing (limited time for this integration) + timeout = min(config.get("timeout", 300), 300) # Max 5 minutes for demo + findings = await self._run_fuzzing(oss_fuzz_dir, project_name, sanitizer, timeout, workspace) + + # Reproduce bugs if requested + if config.get("reproduce_bugs", False): + repro_findings = await self._reproduce_bugs(oss_fuzz_dir, project_name, workspace) + findings.extend(repro_findings) + + except Exception as e: + logger.warning(f"Error running OSS-Fuzz: {e}") + + return findings + + async def _build_project(self, oss_fuzz_dir: Path, project_name: str, sanitizer: str, architecture: str): + """Build OSS-Fuzz project""" + cmd = [ + "python3", "infra/helper.py", "build_image", project_name + ] + + process = await asyncio.create_subprocess_exec( + *cmd, + cwd=oss_fuzz_dir, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + logger.warning(f"Build image failed: {stderr.decode()}") + + async def _check_build(self, oss_fuzz_dir: Path, project_name: str, sanitizer: str, architecture: str): + """Check OSS-Fuzz build""" + cmd = [ + "python3", "infra/helper.py", "check_build", project_name + ] + + process = await asyncio.create_subprocess_exec( + *cmd, + cwd=oss_fuzz_dir, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + stdout, stderr = await process.communicate() + + if process.returncode != 0: + logger.warning(f"Build check failed: {stderr.decode()}") + + async def _run_fuzzing(self, oss_fuzz_dir: Path, project_name: str, sanitizer: str, timeout: int, workspace: Path) -> List[ModuleFinding]: + """Run OSS-Fuzz fuzzing""" + findings = [] + + try: + # This is a simplified version - real OSS-Fuzz runs for much longer + cmd = [ + "python3", "infra/helper.py", "run_fuzzer", project_name, + "--", f"-max_total_time={timeout}" + ] + + process = await asyncio.create_subprocess_exec( + *cmd, + cwd=oss_fuzz_dir, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE + ) + + try: + stdout, stderr = await asyncio.wait_for( + process.communicate(), timeout=timeout + 60 + ) + except asyncio.TimeoutError: + process.terminate() + await process.wait() + + # Parse output for crashes + full_output = stdout.decode() + stderr.decode() + findings = self._parse_oss_fuzz_output(full_output, workspace, sanitizer) + + except Exception as e: + 
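+            # A failed or timed-out run is non-fatal: whatever findings were
+            # parsed before the error are still returned to the caller.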
logger.warning(f"Error in OSS-Fuzz execution: {e}") + + return findings + + async def _reproduce_bugs(self, oss_fuzz_dir: Path, project_name: str, workspace: Path) -> List[ModuleFinding]: + """Reproduce existing bugs""" + findings = [] + + try: + # Look for existing testcases or artifacts + testcases_dir = oss_fuzz_dir / "projects" / project_name / "testcases" + if testcases_dir.exists(): + for testcase in testcases_dir.iterdir(): + if testcase.is_file(): + finding = self._create_testcase_finding(testcase, workspace) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error reproducing bugs: {e}") + + return findings + + def _parse_oss_fuzz_output(self, output: str, workspace: Path, sanitizer: str) -> List[ModuleFinding]: + """Parse OSS-Fuzz output""" + findings = [] + + try: + # Look for common crash indicators + lines = output.split('\n') + crash_info = None + + for line in lines: + if "ERROR:" in line and any(term in line for term in ["AddressSanitizer", "MemorySanitizer", "UBSan"]): + crash_info = { + "type": self._extract_crash_type(line), + "sanitizer": sanitizer, + "line": line + } + elif crash_info and line.strip().startswith("#"): + # Stack trace line + if "stack_trace" not in crash_info: + crash_info["stack_trace"] = [] + crash_info["stack_trace"].append(line.strip()) + + if crash_info: + finding = self._create_oss_fuzz_finding(crash_info, workspace) + if finding: + findings.append(finding) + + except Exception as e: + logger.warning(f"Error parsing OSS-Fuzz output: {e}") + + return findings + + def _create_oss_fuzz_finding(self, crash_info: Dict[str, Any], workspace: Path) -> ModuleFinding: + """Create finding from OSS-Fuzz crash""" + try: + bug_type = crash_info.get("type", "unknown") + sanitizer = crash_info.get("sanitizer", "unknown") + stack_trace = '\n'.join(crash_info.get("stack_trace", [])[:20]) + + severity = self._get_oss_fuzz_severity(bug_type) + + finding = self.create_finding( + title=f"OSS-Fuzz {bug_type.title()}", + description=f"OSS-Fuzz detected a {bug_type} using {sanitizer} sanitizer", + severity=severity, + category=self._get_oss_fuzz_category(bug_type), + file_path=None, + recommendation=self._get_oss_fuzz_recommendation(bug_type, sanitizer), + metadata={ + "bug_type": bug_type, + "sanitizer": sanitizer, + "stack_trace": stack_trace, + "fuzzer": "oss_fuzz" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating OSS-Fuzz finding: {e}") + return None + + def _create_testcase_finding(self, testcase_file: Path, workspace: Path) -> ModuleFinding: + """Create finding from testcase file""" + try: + try: + rel_path = testcase_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(testcase_file) + + finding = self.create_finding( + title="OSS-Fuzz Testcase", + description=f"OSS-Fuzz testcase found: {testcase_file.name}", + severity="info", + category="testcase", + file_path=file_path, + recommendation="Analyze testcase to understand potential issues", + metadata={ + "testcase_file": str(testcase_file), + "fuzzer": "oss_fuzz" + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating testcase finding: {e}") + return None + + def _extract_crash_type(self, line: str) -> str: + """Extract crash type from error line""" + if "heap-buffer-overflow" in line: + return "heap_buffer_overflow" + elif "stack-buffer-overflow" in line: + return "stack_buffer_overflow" + elif "use-after-free" in line: + return "use_after_free" + elif "double-free" in 
line: + return "double_free" + elif "memory leak" in line: + return "memory_leak" + else: + return "unknown_crash" + + def _get_oss_fuzz_severity(self, bug_type: str) -> str: + """Get severity for OSS-Fuzz bug type""" + if bug_type in ["heap_buffer_overflow", "stack_buffer_overflow", "use_after_free", "double_free"]: + return "critical" + elif bug_type == "memory_leak": + return "medium" + else: + return "high" + + def _get_oss_fuzz_category(self, bug_type: str) -> str: + """Get category for OSS-Fuzz bug type""" + if "overflow" in bug_type: + return "buffer_overflow" + elif "free" in bug_type: + return "memory_corruption" + elif "leak" in bug_type: + return "memory_leak" + else: + return "memory_safety" + + def _get_oss_fuzz_recommendation(self, bug_type: str, sanitizer: str) -> str: + """Get recommendation for OSS-Fuzz finding""" + if "overflow" in bug_type: + return "Fix buffer overflow by implementing proper bounds checking and using safe string functions." + elif "use_after_free" in bug_type: + return "Fix use-after-free by ensuring proper object lifetime management and setting pointers to NULL after freeing." + elif "double_free" in bug_type: + return "Fix double-free by ensuring each allocation has exactly one corresponding free operation." + elif "leak" in bug_type: + return "Fix memory leak by ensuring all allocated memory is properly freed in all code paths." + else: + return f"Address the {bug_type} issue detected by OSS-Fuzz with {sanitizer} sanitizer." + + def _create_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + sanitizer_counts = {} + + for finding in findings: + severity_counts[finding.severity] += 1 + category_counts[finding.category] = category_counts.get(finding.category, 0) + 1 + + sanitizer = finding.metadata.get("sanitizer", "unknown") + sanitizer_counts[sanitizer] = sanitizer_counts.get(sanitizer, 0) + 1 + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts, + "sanitizer_counts": sanitizer_counts + } \ No newline at end of file diff --git a/backend/toolbox/modules/infrastructure/__init__.py b/backend/toolbox/modules/infrastructure/__init__.py new file mode 100644 index 0000000..d27c14d --- /dev/null +++ b/backend/toolbox/modules/infrastructure/__init__.py @@ -0,0 +1,43 @@ +""" +Infrastructure Security Modules + +This package contains modules for Infrastructure as Code (IaC) security testing. + +Available modules: +- Checkov: Terraform/CloudFormation/Kubernetes IaC security +- Hadolint: Dockerfile security linting and best practices +- Kubesec: Kubernetes security risk analysis +- Polaris: Kubernetes configuration validation +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
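+
+# A new scanner registers itself with the @register_module decorator defined
+# below. Sketch (hypothetical module, for illustration only):
+#
+#     from ..base import BaseModule
+#     from . import register_module
+#
+#     @register_module
+#     class MyScannerModule(BaseModule):
+#         ...
+#
+# The imports at the bottom of this file are what actually trigger
+# registration, so new modules must also be added there.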
+ + +from typing import List, Type +from ..base import BaseModule + +# Module registry for automatic discovery +INFRASTRUCTURE_MODULES: List[Type[BaseModule]] = [] + +def register_module(module_class: Type[BaseModule]): + """Register an infrastructure security module""" + INFRASTRUCTURE_MODULES.append(module_class) + return module_class + +def get_available_modules() -> List[Type[BaseModule]]: + """Get all available infrastructure security modules""" + return INFRASTRUCTURE_MODULES.copy() + +# Import modules to trigger registration +from .checkov import CheckovModule +from .hadolint import HadolintModule +from .kubesec import KubesecModule +from .polaris import PolarisModule \ No newline at end of file diff --git a/backend/toolbox/modules/infrastructure/checkov.py b/backend/toolbox/modules/infrastructure/checkov.py new file mode 100644 index 0000000..76fe2a7 --- /dev/null +++ b/backend/toolbox/modules/infrastructure/checkov.py @@ -0,0 +1,411 @@ +""" +Checkov Infrastructure Security Module + +This module uses Checkov to scan Infrastructure as Code (IaC) files for +security misconfigurations and compliance violations. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class CheckovModule(BaseModule): + """Checkov Infrastructure as Code security scanning module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="checkov", + version="3.1.34", + description="Infrastructure as Code security scanning for Terraform, CloudFormation, Kubernetes, and more", + author="FuzzForge Team", + category="infrastructure", + tags=["iac", "terraform", "cloudformation", "kubernetes", "security", "compliance"], + input_schema={ + "type": "object", + "properties": { + "frameworks": { + "type": "array", + "items": {"type": "string"}, + "default": ["terraform", "cloudformation", "kubernetes"], + "description": "IaC frameworks to scan" + }, + "checks": { + "type": "array", + "items": {"type": "string"}, + "description": "Specific checks to run" + }, + "skip_checks": { + "type": "array", + "items": {"type": "string"}, + "description": "Checks to skip" + }, + "severity": { + "type": "array", + "items": {"type": "string", "enum": ["CRITICAL", "HIGH", "MEDIUM", "LOW", "INFO"]}, + "default": ["CRITICAL", "HIGH", "MEDIUM", "LOW", "INFO"], + "description": "Minimum severity levels to report" + }, + "compact": { + "type": "boolean", + "default": False, + "description": "Use compact output format" + }, + "quiet": { + "type": "boolean", + "default": False, + "description": "Suppress verbose output" + }, + "soft_fail": { + "type": "boolean", + "default": True, + "description": "Return exit code 0 even when issues are found" + }, + "include_patterns": { + "type": "array", + "items": {"type": "string"}, + "description": "File patterns to include" + }, + "exclude_patterns": { + "type": "array", + "items": {"type": "string"}, + "description": "File patterns to exclude" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "check_id": {"type": "string"}, + "check_name": {"type": "string"}, + "severity": {"type": "string"}, + "file_path": {"type": "string"}, + "line_range": {"type": "array"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + frameworks = config.get("frameworks", []) + supported_frameworks = [ + "terraform", "cloudformation", "kubernetes", "dockerfile", + "ansible", "helm", "serverless", "bicep", "github_actions" + ] + + for framework in frameworks: + if framework not in supported_frameworks: + raise ValueError(f"Unsupported framework: {framework}. 
Supported: {supported_frameworks}") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute Checkov IaC security scanning""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info(f"Running Checkov IaC scan on {workspace}") + + # Check if there are any IaC files + iac_files = self._find_iac_files(workspace, config.get("frameworks", [])) + if not iac_files: + logger.info("No Infrastructure as Code files found") + return self.create_result( + findings=[], + status="success", + summary={"total_findings": 0, "files_scanned": 0} + ) + + # Build checkov command + cmd = ["checkov", "-d", str(workspace)] + + # Add output format + cmd.extend(["--output", "json"]) + + # Add frameworks + frameworks = config.get("frameworks", ["terraform", "cloudformation", "kubernetes"]) + cmd.extend(["--framework"] + frameworks) + + # Add specific checks + if config.get("checks"): + cmd.extend(["--check", ",".join(config["checks"])]) + + # Add skip checks + if config.get("skip_checks"): + cmd.extend(["--skip-check", ",".join(config["skip_checks"])]) + + # Add compact flag + if config.get("compact", False): + cmd.append("--compact") + + # Add quiet flag + if config.get("quiet", False): + cmd.append("--quiet") + + # Add soft fail + if config.get("soft_fail", True): + cmd.append("--soft-fail") + + # Add include patterns + if config.get("include_patterns"): + for pattern in config["include_patterns"]: + cmd.extend(["--include", pattern]) + + # Add exclude patterns + if config.get("exclude_patterns"): + for pattern in config["exclude_patterns"]: + cmd.extend(["--exclude", pattern]) + + # Disable update checks and telemetry + cmd.extend(["--no-guide", "--skip-download"]) + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run Checkov + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace + ) + + stdout, stderr = await process.communicate() + + # Parse results + findings = [] + if process.returncode == 0 or config.get("soft_fail", True): + findings = self._parse_checkov_output(stdout.decode(), workspace, config) + else: + error_msg = stderr.decode() + logger.error(f"Checkov failed: {error_msg}") + return self.create_result( + findings=[], + status="failed", + error=f"Checkov execution failed: {error_msg}" + ) + + # Create summary + summary = self._create_summary(findings, len(iac_files)) + + logger.info(f"Checkov found {len(findings)} security issues") + + return self.create_result( + findings=findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"Checkov module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + def _find_iac_files(self, workspace: Path, frameworks: List[str]) -> List[Path]: + """Find Infrastructure as Code files in workspace""" + iac_patterns = { + "terraform": ["*.tf", "*.tfvars"], + "cloudformation": ["*.yaml", "*.yml", "*.json", "*template*"], + "kubernetes": ["*.yaml", "*.yml"], + "dockerfile": ["Dockerfile", "*.dockerfile"], + "ansible": ["*.yaml", "*.yml", "playbook*"], + "helm": ["Chart.yaml", "values.yaml", "*.yaml"], + "bicep": ["*.bicep"], + "github_actions": [".github/workflows/*.yaml", ".github/workflows/*.yml"] + } + + found_files = [] + for framework in frameworks: + patterns = iac_patterns.get(framework, []) + for pattern in patterns: + 
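+                # rglob() walks the workspace recursively; overlapping
+                # framework patterns may match the same file, and duplicates
+                # are dropped via set() below.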
found_files.extend(workspace.rglob(pattern)) + + return list(set(found_files)) # Remove duplicates + + def _parse_checkov_output(self, output: str, workspace: Path, config: Dict[str, Any]) -> List[ModuleFinding]: + """Parse Checkov JSON output into findings""" + findings = [] + + if not output.strip(): + return findings + + try: + data = json.loads(output) + + # Get severity filter + allowed_severities = set(s.upper() for s in config.get("severity", ["CRITICAL", "HIGH", "MEDIUM", "LOW", "INFO"])) + + # Process failed checks + failed_checks = data.get("results", {}).get("failed_checks", []) + + for check in failed_checks: + # Extract information + check_id = check.get("check_id", "unknown") + check_name = check.get("check_name", "") + severity = check.get("severity", "MEDIUM").upper() + file_path = check.get("file_path", "") + file_line_range = check.get("file_line_range", []) + resource = check.get("resource", "") + description = check.get("description", "") + guideline = check.get("guideline", "") + + # Apply severity filter + if severity not in allowed_severities: + continue + + # Make file path relative to workspace + if file_path: + try: + rel_path = Path(file_path).relative_to(workspace) + file_path = str(rel_path) + except ValueError: + pass + + # Map severity to our standard levels + finding_severity = self._map_severity(severity) + + # Create finding + finding = self.create_finding( + title=f"IaC Security Issue: {check_name}", + description=description or f"Checkov check {check_id} failed for resource {resource}", + severity=finding_severity, + category=self._get_category(check_id, check_name), + file_path=file_path if file_path else None, + line_start=file_line_range[0] if file_line_range and len(file_line_range) > 0 else None, + line_end=file_line_range[1] if file_line_range and len(file_line_range) > 1 else None, + recommendation=self._get_recommendation(check_id, check_name, guideline), + metadata={ + "check_id": check_id, + "check_name": check_name, + "checkov_severity": severity, + "resource": resource, + "guideline": guideline, + "bc_category": check.get("bc_category", ""), + "benchmarks": check.get("benchmarks", {}), + "fixed_definition": check.get("fixed_definition", "") + } + ) + + findings.append(finding) + + except json.JSONDecodeError as e: + logger.warning(f"Failed to parse Checkov output: {e}") + except Exception as e: + logger.warning(f"Error processing Checkov results: {e}") + + return findings + + def _map_severity(self, checkov_severity: str) -> str: + """Map Checkov severity to our standard severity levels""" + severity_map = { + "CRITICAL": "critical", + "HIGH": "high", + "MEDIUM": "medium", + "LOW": "low", + "INFO": "info" + } + return severity_map.get(checkov_severity.upper(), "medium") + + def _get_category(self, check_id: str, check_name: str) -> str: + """Determine finding category based on check""" + check_lower = f"{check_id} {check_name}".lower() + + if any(term in check_lower for term in ["encryption", "encrypt", "kms", "ssl", "tls"]): + return "encryption" + elif any(term in check_lower for term in ["access", "iam", "rbac", "permission"]): + return "access_control" + elif any(term in check_lower for term in ["network", "security group", "firewall", "vpc"]): + return "network_security" + elif any(term in check_lower for term in ["logging", "monitor", "audit"]): + return "logging_monitoring" + elif any(term in check_lower for term in ["storage", "s3", "bucket", "database"]): + return "data_protection" + elif any(term in check_lower for term in 
["secret", "password", "key", "credential"]): + return "secrets_management" + elif any(term in check_lower for term in ["backup", "snapshot", "versioning"]): + return "backup_recovery" + else: + return "infrastructure_security" + + def _get_recommendation(self, check_id: str, check_name: str, guideline: str) -> str: + """Generate recommendation based on check""" + if guideline: + return f"Follow the guideline: {guideline}" + + # Generic recommendations based on common patterns + check_lower = f"{check_id} {check_name}".lower() + + if "encryption" in check_lower: + return "Enable encryption for sensitive data at rest and in transit using appropriate encryption algorithms." + elif "access" in check_lower or "iam" in check_lower: + return "Review and tighten access controls. Follow the principle of least privilege." + elif "network" in check_lower or "security group" in check_lower: + return "Restrict network access to only necessary ports and IP ranges." + elif "logging" in check_lower: + return "Enable comprehensive logging and monitoring for security events." + elif "backup" in check_lower: + return "Implement proper backup and disaster recovery procedures." + else: + return f"Review and fix the security configuration issue identified by check {check_id}." + + def _create_summary(self, findings: List[ModuleFinding], total_files: int) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + check_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by check + check_id = finding.metadata.get("check_id", "unknown") + check_counts[check_id] = check_counts.get(check_id, 0) + 1 + + return { + "total_findings": len(findings), + "files_scanned": total_files, + "severity_counts": severity_counts, + "category_counts": category_counts, + "top_checks": dict(sorted(check_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "files_with_issues": len(set(f.file_path for f in findings if f.file_path)) + } \ No newline at end of file diff --git a/backend/toolbox/modules/infrastructure/hadolint.py b/backend/toolbox/modules/infrastructure/hadolint.py new file mode 100644 index 0000000..a3d812c --- /dev/null +++ b/backend/toolbox/modules/infrastructure/hadolint.py @@ -0,0 +1,406 @@ +""" +Hadolint Infrastructure Security Module + +This module uses Hadolint to scan Dockerfiles for security best practices +and potential vulnerabilities. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class HadolintModule(BaseModule): + """Hadolint Dockerfile security scanning module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="hadolint", + version="2.12.0", + description="Dockerfile security linting and best practices validation", + author="FuzzForge Team", + category="infrastructure", + tags=["dockerfile", "docker", "security", "best-practices", "linting"], + input_schema={ + "type": "object", + "properties": { + "severity": { + "type": "array", + "items": {"type": "string", "enum": ["error", "warning", "info", "style"]}, + "default": ["error", "warning", "info", "style"], + "description": "Minimum severity levels to report" + }, + "ignored_rules": { + "type": "array", + "items": {"type": "string"}, + "description": "Hadolint rules to ignore" + }, + "trusted_registries": { + "type": "array", + "items": {"type": "string"}, + "description": "List of trusted Docker registries" + }, + "allowed_maintainers": { + "type": "array", + "items": {"type": "string"}, + "description": "List of allowed maintainer emails" + }, + "dockerfile_patterns": { + "type": "array", + "items": {"type": "string"}, + "default": ["**/Dockerfile", "**/*.dockerfile", "**/Containerfile"], + "description": "Patterns to find Dockerfile-like files" + }, + "strict": { + "type": "boolean", + "default": False, + "description": "Enable strict mode (fail on any issue)" + }, + "no_fail": { + "type": "boolean", + "default": True, + "description": "Don't fail on lint errors (useful for reporting)" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "rule": {"type": "string"}, + "severity": {"type": "string"}, + "message": {"type": "string"}, + "file_path": {"type": "string"}, + "line": {"type": "integer"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + severity_levels = config.get("severity", ["error", "warning", "info", "style"]) + valid_severities = ["error", "warning", "info", "style"] + + for severity in severity_levels: + if severity not in valid_severities: + raise ValueError(f"Invalid severity level: {severity}. 
Valid: {valid_severities}") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute Hadolint Dockerfile security scanning""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info(f"Running Hadolint Dockerfile scan on {workspace}") + + # Find all Dockerfiles + dockerfiles = self._find_dockerfiles(workspace, config) + if not dockerfiles: + logger.info("No Dockerfiles found for Hadolint analysis") + return self.create_result( + findings=[], + status="success", + summary={"total_findings": 0, "files_scanned": 0} + ) + + logger.info(f"Found {len(dockerfiles)} Dockerfile(s) to analyze") + + # Process each Dockerfile + all_findings = [] + for dockerfile in dockerfiles: + findings = await self._scan_dockerfile(dockerfile, workspace, config) + all_findings.extend(findings) + + # Create summary + summary = self._create_summary(all_findings, len(dockerfiles)) + + logger.info(f"Hadolint found {len(all_findings)} issues across {len(dockerfiles)} Dockerfiles") + + return self.create_result( + findings=all_findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"Hadolint module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + def _find_dockerfiles(self, workspace: Path, config: Dict[str, Any]) -> List[Path]: + """Find Dockerfile-like files in workspace""" + patterns = config.get("dockerfile_patterns", [ + "**/Dockerfile", "**/*.dockerfile", "**/Containerfile" + ]) + + # Debug logging + logger.info(f"Hadolint searching in workspace: {workspace}") + logger.info(f"Workspace exists: {workspace.exists()}") + if workspace.exists(): + all_files = list(workspace.rglob("*")) + logger.info(f"All files in workspace: {all_files}") + + dockerfiles = [] + for pattern in patterns: + matches = list(workspace.glob(pattern)) + logger.info(f"Pattern '{pattern}' found: {matches}") + dockerfiles.extend(matches) + + logger.info(f"Final dockerfiles list: {dockerfiles}") + return list(set(dockerfiles)) # Remove duplicates + + async def _scan_dockerfile(self, dockerfile: Path, workspace: Path, config: Dict[str, Any]) -> List[ModuleFinding]: + """Scan a single Dockerfile with Hadolint""" + findings = [] + + try: + # Build hadolint command + cmd = ["hadolint", "--format", "json"] + + # Add severity levels + severity_levels = config.get("severity", ["error", "warning", "info", "style"]) + if "error" not in severity_levels: + cmd.append("--no-error") + if "warning" not in severity_levels: + cmd.append("--no-warning") + if "info" not in severity_levels: + cmd.append("--no-info") + if "style" not in severity_levels: + cmd.append("--no-style") + + # Add ignored rules + ignored_rules = config.get("ignored_rules", []) + for rule in ignored_rules: + cmd.extend(["--ignore", rule]) + + # Add trusted registries + trusted_registries = config.get("trusted_registries", []) + for registry in trusted_registries: + cmd.extend(["--trusted-registry", registry]) + + # Add strict mode + if config.get("strict", False): + cmd.append("--strict-labels") + + # Add the dockerfile + cmd.append(str(dockerfile)) + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run hadolint + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace + ) + + stdout, stderr = await process.communicate() + + # Parse results + if process.returncode == 0 or 
config.get("no_fail", True): + findings = self._parse_hadolint_output( + stdout.decode(), dockerfile, workspace + ) + else: + error_msg = stderr.decode() + logger.warning(f"Hadolint failed for {dockerfile}: {error_msg}") + # Continue with other files even if one fails + + except Exception as e: + logger.warning(f"Error scanning {dockerfile}: {e}") + + return findings + + def _parse_hadolint_output(self, output: str, dockerfile: Path, workspace: Path) -> List[ModuleFinding]: + """Parse Hadolint JSON output into findings""" + findings = [] + + if not output.strip(): + return findings + + try: + # Hadolint outputs JSON array + issues = json.loads(output) + + for issue in issues: + # Extract information + rule = issue.get("code", "unknown") + message = issue.get("message", "") + level = issue.get("level", "warning").lower() + line = issue.get("line", 0) + column = issue.get("column", 0) + + # Make file path relative to workspace + try: + rel_path = dockerfile.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(dockerfile) + + # Map Hadolint level to our severity + severity = self._map_severity(level) + + # Get category based on rule + category = self._get_category(rule, message) + + # Create finding + finding = self.create_finding( + title=f"Dockerfile issue: {rule}", + description=message or f"Hadolint rule {rule} violation", + severity=severity, + category=category, + file_path=file_path, + line_start=line if line > 0 else None, + recommendation=self._get_recommendation(rule, message), + metadata={ + "rule": rule, + "hadolint_level": level, + "column": column, + "file": str(dockerfile) + } + ) + + findings.append(finding) + + except json.JSONDecodeError as e: + logger.warning(f"Failed to parse Hadolint output: {e}") + except Exception as e: + logger.warning(f"Error processing Hadolint results: {e}") + + return findings + + def _map_severity(self, hadolint_level: str) -> str: + """Map Hadolint severity to our standard severity levels""" + severity_map = { + "error": "high", + "warning": "medium", + "info": "low", + "style": "info" + } + return severity_map.get(hadolint_level.lower(), "medium") + + def _get_category(self, rule: str, message: str) -> str: + """Determine finding category based on rule and message""" + rule_lower = rule.lower() + message_lower = message.lower() + + # Security-related categories + if any(term in rule_lower for term in ["dl3", "dl4"]): + if "user" in message_lower or "root" in message_lower: + return "privilege_escalation" + elif "secret" in message_lower or "password" in message_lower: + return "secrets_management" + elif "version" in message_lower or "pin" in message_lower: + return "dependency_management" + elif "add" in message_lower or "copy" in message_lower: + return "file_operations" + else: + return "security_best_practices" + elif any(term in rule_lower for term in ["dl1", "dl2"]): + return "syntax_errors" + elif "3001" in rule or "3002" in rule: + return "user_management" + elif "3008" in rule or "3009" in rule: + return "privilege_escalation" + elif "3014" in rule or "3015" in rule: + return "port_management" + elif "3020" in rule or "3021" in rule: + return "copy_operations" + else: + return "dockerfile_best_practices" + + def _get_recommendation(self, rule: str, message: str) -> str: + """Generate recommendation based on Hadolint rule""" + recommendations = { + # Security-focused recommendations + "DL3002": "Create a non-root user and switch to it before running the application.", + "DL3008": "Pin package versions 
to ensure reproducible builds and avoid supply chain attacks.", + "DL3009": "Clean up package manager cache after installation to reduce image size and attack surface.", + "DL3020": "Use COPY instead of ADD for local files to avoid unexpected behavior.", + "DL3025": "Use JSON format for CMD and ENTRYPOINT to avoid shell injection vulnerabilities.", + "DL3059": "Use multi-stage builds to reduce final image size and attack surface.", + "DL4001": "Don't use sudo in Dockerfiles as it's unnecessary and can introduce vulnerabilities.", + "DL4003": "Use a package manager instead of downloading and installing manually.", + "DL4004": "Don't use SSH in Dockerfiles as it's a security risk.", + "DL4005": "Use SHELL instruction to specify shell for RUN commands instead of hardcoding paths.", + } + + if rule in recommendations: + return recommendations[rule] + + # Generic recommendations based on patterns + message_lower = message.lower() + if "user" in message_lower and "root" in message_lower: + return "Avoid running containers as root user. Create and use a non-privileged user." + elif "version" in message_lower or "pin" in message_lower: + return "Pin package versions to specific versions to ensure reproducible builds." + elif "cache" in message_lower or "clean" in message_lower: + return "Clean up package manager caches to reduce image size and potential security issues." + elif "secret" in message_lower or "password" in message_lower: + return "Don't include secrets in Dockerfiles. Use build arguments or runtime secrets instead." + else: + return f"Follow Dockerfile best practices to address rule {rule}." + + def _create_summary(self, findings: List[ModuleFinding], total_files: int) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + rule_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by rule + rule = finding.metadata.get("rule", "unknown") + rule_counts[rule] = rule_counts.get(rule, 0) + 1 + + return { + "total_findings": len(findings), + "files_scanned": total_files, + "severity_counts": severity_counts, + "category_counts": category_counts, + "top_rules": dict(sorted(rule_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "files_with_issues": len(set(f.file_path for f in findings if f.file_path)) + } \ No newline at end of file diff --git a/backend/toolbox/modules/infrastructure/kubesec.py b/backend/toolbox/modules/infrastructure/kubesec.py new file mode 100644 index 0000000..76c679c --- /dev/null +++ b/backend/toolbox/modules/infrastructure/kubesec.py @@ -0,0 +1,447 @@ +""" +Kubesec Infrastructure Security Module + +This module uses Kubesec to scan Kubernetes manifests for security +misconfigurations and best practices violations. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
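+
+# For orientation: _parse_kubesec_output() below reads these fields from each
+# entry of kubesec's JSON output (abridged sketch; the exact layout may vary
+# between kubesec releases):
+#
+#     [{
+#         "object": "Deployment/myapp.default",
+#         "valid": true,
+#         "score": 3,
+#         "message": "...",
+#         "scoring": {...},
+#         "advise": [{"selector": "...", "reason": "...", "href": "..."}]
+#     }]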
+ + +import asyncio +import json +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class KubesecModule(BaseModule): + """Kubesec Kubernetes security scanning module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="kubesec", + version="2.14.0", + description="Kubernetes security scanning for YAML/JSON manifests with security best practices validation", + author="FuzzForge Team", + category="infrastructure", + tags=["kubernetes", "k8s", "security", "best-practices", "manifests"], + input_schema={ + "type": "object", + "properties": { + "scan_mode": { + "type": "string", + "enum": ["scan", "http"], + "default": "scan", + "description": "Kubesec scan mode (local scan or HTTP API)" + }, + "threshold": { + "type": "integer", + "default": 15, + "description": "Minimum security score threshold" + }, + "exit_code": { + "type": "integer", + "default": 0, + "description": "Exit code to return on failure" + }, + "format": { + "type": "string", + "enum": ["json", "template"], + "default": "json", + "description": "Output format" + }, + "kubernetes_patterns": { + "type": "array", + "items": {"type": "string"}, + "default": ["**/*.yaml", "**/*.yml", "**/k8s/*.yaml", "**/kubernetes/*.yaml"], + "description": "Patterns to find Kubernetes manifest files" + }, + "exclude_patterns": { + "type": "array", + "items": {"type": "string"}, + "description": "Patterns to exclude from scanning" + }, + "strict": { + "type": "boolean", + "default": False, + "description": "Enable strict mode (fail on any security issue)" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "score": {"type": "integer"}, + "security_issues": {"type": "array"}, + "file_path": {"type": "string"}, + "manifest_kind": {"type": "string"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + scan_mode = config.get("scan_mode", "scan") + if scan_mode not in ["scan", "http"]: + raise ValueError(f"Invalid scan mode: {scan_mode}. 
Valid: ['scan', 'http']") + + threshold = config.get("threshold", 0) + if not isinstance(threshold, int): + raise ValueError(f"Threshold must be an integer, got: {type(threshold)}") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute Kubesec Kubernetes security scanning""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info(f"Running Kubesec Kubernetes scan on {workspace}") + + # Find all Kubernetes manifests + k8s_files = self._find_kubernetes_files(workspace, config) + if not k8s_files: + logger.info("No Kubernetes manifest files found") + return self.create_result( + findings=[], + status="success", + summary={"total_findings": 0, "files_scanned": 0} + ) + + logger.info(f"Found {len(k8s_files)} Kubernetes manifest file(s) to analyze") + + # Process each manifest file + all_findings = [] + for k8s_file in k8s_files: + findings = await self._scan_manifest(k8s_file, workspace, config) + all_findings.extend(findings) + + # Create summary + summary = self._create_summary(all_findings, len(k8s_files)) + + logger.info(f"Kubesec found {len(all_findings)} security issues across {len(k8s_files)} manifests") + + return self.create_result( + findings=all_findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"Kubesec module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + def _find_kubernetes_files(self, workspace: Path, config: Dict[str, Any]) -> List[Path]: + """Find Kubernetes manifest files in workspace""" + patterns = config.get("kubernetes_patterns", [ + "**/*.yaml", "**/*.yml", "**/k8s/*.yaml", "**/kubernetes/*.yaml" + ]) + exclude_patterns = config.get("exclude_patterns", []) + + k8s_files = [] + for pattern in patterns: + files = workspace.glob(pattern) + for file in files: + # Check if file contains Kubernetes resources + if self._is_kubernetes_manifest(file): + # Check if file should be excluded + should_exclude = False + for exclude_pattern in exclude_patterns: + if file.match(exclude_pattern): + should_exclude = True + break + if not should_exclude: + k8s_files.append(file) + + return list(set(k8s_files)) # Remove duplicates + + def _is_kubernetes_manifest(self, file: Path) -> bool: + """Check if a file is a Kubernetes manifest""" + try: + content = file.read_text(encoding='utf-8') + # Simple heuristic: check for common Kubernetes fields + k8s_indicators = [ + "apiVersion:", "kind:", "metadata:", "spec:", + "Deployment", "Service", "Pod", "ConfigMap", + "Secret", "Ingress", "PersistentVolume" + ] + return any(indicator in content for indicator in k8s_indicators) + except Exception: + return False + + async def _scan_manifest(self, manifest_file: Path, workspace: Path, config: Dict[str, Any]) -> List[ModuleFinding]: + """Scan a single Kubernetes manifest with Kubesec""" + findings = [] + + try: + # Build kubesec command + cmd = ["kubesec", "scan"] + + # Add format + format_type = config.get("format", "json") + if format_type == "json": + cmd.append("-f") + cmd.append("json") + + # Add the manifest file + cmd.append(str(manifest_file)) + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run kubesec + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace + ) + + stdout, stderr = await process.communicate() + + # Parse results + if process.returncode == 0: + 
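+                # A non-zero exit is treated as a scan failure for this
+                # manifest: stderr is logged below and the remaining
+                # manifests are still scanned.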
findings = self._parse_kubesec_output( + stdout.decode(), manifest_file, workspace, config + ) + else: + error_msg = stderr.decode() + logger.warning(f"Kubesec failed for {manifest_file}: {error_msg}") + + except Exception as e: + logger.warning(f"Error scanning {manifest_file}: {e}") + + return findings + + def _parse_kubesec_output(self, output: str, manifest_file: Path, workspace: Path, config: Dict[str, Any]) -> List[ModuleFinding]: + """Parse Kubesec JSON output into findings""" + findings = [] + + if not output.strip(): + return findings + + try: + # Kubesec outputs JSON array + results = json.loads(output) + if not isinstance(results, list): + results = [results] + + threshold = config.get("threshold", 0) + + for result in results: + score = result.get("score", 0) + object_name = result.get("object", "Unknown") + valid = result.get("valid", True) + message = result.get("message", "") + + # Make file path relative to workspace + try: + rel_path = manifest_file.relative_to(workspace) + file_path = str(rel_path) + except ValueError: + file_path = str(manifest_file) + + # Process scoring and advise sections + advise = result.get("advise", []) + scoring = result.get("scoring", {}) + + # Create findings for low scores + if score < threshold or not valid: + severity = "high" if score < 0 else "medium" if score < 5 else "low" + + finding = self.create_finding( + title=f"Kubernetes Security Score Low: {object_name}", + description=message or f"Security score {score} below threshold {threshold}", + severity=severity, + category="kubernetes_security", + file_path=file_path, + recommendation=self._get_score_recommendation(score, advise), + metadata={ + "score": score, + "threshold": threshold, + "object": object_name, + "valid": valid, + "advise_count": len(advise), + "scoring_details": scoring + } + ) + findings.append(finding) + + # Create findings for each advisory + for advisory in advise: + selector = advisory.get("selector", "") + reason = advisory.get("reason", "") + href = advisory.get("href", "") + + # Determine severity based on advisory type + severity = self._get_advisory_severity(reason, selector) + category = self._get_advisory_category(reason, selector) + + finding = self.create_finding( + title=f"Kubernetes Security Advisory: {selector}", + description=reason, + severity=severity, + category=category, + file_path=file_path, + recommendation=self._get_advisory_recommendation(reason, href), + metadata={ + "selector": selector, + "href": href, + "object": object_name, + "advisory_type": "kubesec_advise" + } + ) + findings.append(finding) + + except json.JSONDecodeError as e: + logger.warning(f"Failed to parse Kubesec output: {e}") + except Exception as e: + logger.warning(f"Error processing Kubesec results: {e}") + + return findings + + def _get_advisory_severity(self, reason: str, selector: str) -> str: + """Determine severity based on advisory reason and selector""" + reason_lower = reason.lower() + selector_lower = selector.lower() + + # High severity issues + if any(term in reason_lower for term in [ + "privileged", "root", "hostnetwork", "hostpid", "hostipc", + "allowprivilegeescalation", "runasroot", "security", "capabilities" + ]): + return "high" + + # Medium severity issues + elif any(term in reason_lower for term in [ + "resources", "limits", "requests", "readonly", "securitycontext" + ]): + return "medium" + + # Low severity issues + elif any(term in reason_lower for term in [ + "labels", "annotations", "probe", "liveness", "readiness" + ]): + return "low" + + else: + 
return "medium" + + def _get_advisory_category(self, reason: str, selector: str) -> str: + """Determine category based on advisory""" + reason_lower = reason.lower() + + if any(term in reason_lower for term in ["privilege", "root", "security", "capabilities"]): + return "privilege_escalation" + elif any(term in reason_lower for term in ["network", "host"]): + return "network_security" + elif any(term in reason_lower for term in ["resources", "limits"]): + return "resource_management" + elif any(term in reason_lower for term in ["probe", "health"]): + return "health_monitoring" + else: + return "kubernetes_best_practices" + + def _get_score_recommendation(self, score: int, advise: List[Dict]) -> str: + """Generate recommendation based on score and advisories""" + if score < 0: + return "Critical security issues detected. Address all security advisories immediately." + elif score < 5: + return "Low security score detected. Review and implement security best practices." + elif len(advise) > 0: + return f"Security score is {score}. Review {len(advise)} advisory recommendations for improvement." + else: + return "Review Kubernetes security configuration and apply security hardening measures." + + def _get_advisory_recommendation(self, reason: str, href: str) -> str: + """Generate recommendation for advisory""" + if href: + return f"{reason} For more details, see: {href}" + + reason_lower = reason.lower() + + # Specific recommendations based on common patterns + if "privileged" in reason_lower: + return "Remove privileged: true from security context. Run containers with minimal privileges." + elif "root" in reason_lower or "runasroot" in reason_lower: + return "Configure runAsNonRoot: true and set runAsUser to a non-root user ID." + elif "allowprivilegeescalation" in reason_lower: + return "Set allowPrivilegeEscalation: false to prevent privilege escalation." + elif "resources" in reason_lower: + return "Define resource requests and limits to prevent resource exhaustion." + elif "readonly" in reason_lower: + return "Set readOnlyRootFilesystem: true to prevent filesystem modifications." + elif "capabilities" in reason_lower: + return "Drop unnecessary capabilities and add only required ones." + elif "probe" in reason_lower: + return "Add liveness and readiness probes for better health monitoring." 
+ else: + return f"Address the security concern: {reason}" + + def _create_summary(self, findings: List[ModuleFinding], total_files: int) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + object_counts = {} + scores = [] + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by object + obj = finding.metadata.get("object", "unknown") + object_counts[obj] = object_counts.get(obj, 0) + 1 + + # Collect scores + score = finding.metadata.get("score") + if score is not None: + scores.append(score) + + return { + "total_findings": len(findings), + "files_scanned": total_files, + "severity_counts": severity_counts, + "category_counts": category_counts, + "object_counts": object_counts, + "average_score": sum(scores) / len(scores) if scores else 0, + "min_score": min(scores) if scores else 0, + "max_score": max(scores) if scores else 0, + "files_with_issues": len(set(f.file_path for f in findings if f.file_path)) + } \ No newline at end of file diff --git a/backend/toolbox/modules/infrastructure/polaris.py b/backend/toolbox/modules/infrastructure/polaris.py new file mode 100644 index 0000000..68d5f10 --- /dev/null +++ b/backend/toolbox/modules/infrastructure/polaris.py @@ -0,0 +1,519 @@ +""" +Polaris Infrastructure Security Module + +This module uses Polaris to validate Kubernetes resources against security +and best practice policies. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module
+
+logger = logging.getLogger(__name__)
+
+
+@register_module
+class PolarisModule(BaseModule):
+    """Polaris Kubernetes best practices validation module"""
+
+    def get_metadata(self) -> ModuleMetadata:
+        """Get module metadata"""
+        return ModuleMetadata(
+            name="polaris",
+            version="8.5.0",
+            description="Kubernetes best practices validation and policy enforcement using Polaris",
+            author="FuzzForge Team",
+            category="infrastructure",
+            tags=["kubernetes", "k8s", "policy", "best-practices", "validation"],
+            input_schema={
+                "type": "object",
+                "properties": {
+                    "audit_path": {
+                        "type": "string",
+                        "description": "Path to audit (defaults to workspace)"
+                    },
+                    "config_file": {
+                        "type": "string",
+                        "description": "Path to Polaris config file"
+                    },
+                    "only_show_failed_tests": {
+                        "type": "boolean",
+                        "default": True,
+                        "description": "Show only failed validation tests"
+                    },
+                    "severity_threshold": {
+                        "type": "string",
+                        "enum": ["error", "warning", "info"],
+                        "default": "warning",
+                        "description": "Minimum severity level to report"
+                    },
+                    "format": {
+                        "type": "string",
+                        "enum": ["json", "yaml", "pretty"],
+                        "default": "json",
+                        "description": "Output format"
+                    },
+                    "kubernetes_patterns": {
+                        "type": "array",
+                        "items": {"type": "string"},
+                        "default": ["**/*.yaml", "**/*.yml", "**/k8s/*.yaml", "**/kubernetes/*.yaml"],
+                        "description": "Patterns to find Kubernetes manifest files"
+                    },
+                    "exclude_patterns": {
+                        "type": "array",
+                        "items": {"type": "string"},
+                        "description": "File patterns to exclude"
+                    },
+                    "disable_checks": {
+                        "type": "array",
+                        "items": {"type": "string"},
+                        "description": "List of check names to disable"
+                    },
+                    "enable_checks": {
+                        "type": "array",
+                        "items": {"type": "string"},
+                        "description": "List of check names to enable (if using custom config)"
+                    }
+                }
+            },
+            output_schema={
+                "type": "object",
+                "properties": {
+                    "findings": {
+                        "type": "array",
+                        "items": {
+                            "type": "object",
+                            "properties": {
+                                "check_name": {"type": "string"},
+                                "severity": {"type": "string"},
+                                "category": {"type": "string"},
+                                "file_path": {"type": "string"},
+                                "resource_name": {"type": "string"}
+                            }
+                        }
+                    }
+                }
+            }
+        )
+
+    def validate_config(self, config: Dict[str, Any]) -> bool:
+        """Validate configuration"""
+        severity_threshold = config.get("severity_threshold", "warning")
+        valid_severities = ["error", "warning", "info"]
+        if severity_threshold not in valid_severities:
+            raise ValueError(f"Invalid severity threshold: {severity_threshold}. Valid: {valid_severities}")
+
+        format_type = config.get("format", "json")
+        valid_formats = ["json", "yaml", "pretty"]
+        if format_type not in valid_formats:
+            raise ValueError(f"Invalid format: {format_type}. Valid: {valid_formats}")
+
+        return True
+
+    async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult:
+        """Execute Polaris Kubernetes validation"""
+        self.start_timer()
+
+        try:
+            # Validate inputs
+            self.validate_config(config)
+            self.validate_workspace(workspace)
+
+            logger.info(f"Running Polaris Kubernetes validation on {workspace}")
+
+            # Find all Kubernetes manifests
+            k8s_files = self._find_kubernetes_files(workspace, config)
+            if not k8s_files:
+                logger.info("No Kubernetes manifest files found")
+                return self.create_result(
+                    findings=[],
+                    status="success",
+                    summary={"total_findings": 0, "files_scanned": 0}
+                )
+
+            logger.info(f"Found {len(k8s_files)} Kubernetes manifest file(s) to validate")
+
+            # Run Polaris audit
+            findings = await self._run_polaris_audit(workspace, config, k8s_files)
+
+            # Create summary
+            summary = self._create_summary(findings, len(k8s_files))
+
+            logger.info(f"Polaris found {len(findings)} policy violations across {len(k8s_files)} manifests")
+
+            return self.create_result(
+                findings=findings,
+                status="success",
+                summary=summary
+            )
+
+        except Exception as e:
+            logger.error(f"Polaris module failed: {e}")
+            return self.create_result(
+                findings=[],
+                status="failed",
+                error=str(e)
+            )
+
+    def _find_kubernetes_files(self, workspace: Path, config: Dict[str, Any]) -> List[Path]:
+        """Find Kubernetes manifest files in workspace"""
+        patterns = config.get("kubernetes_patterns", [
+            "**/*.yaml", "**/*.yml", "**/k8s/*.yaml", "**/kubernetes/*.yaml"
+        ])
+        exclude_patterns = config.get("exclude_patterns", [])
+
+        k8s_files = []
+        for pattern in patterns:
+            files = workspace.glob(pattern)
+            for file in files:
+                # Check if file contains Kubernetes resources
+                if self._is_kubernetes_manifest(file):
+                    # Check if file should be excluded
+                    should_exclude = False
+                    for exclude_pattern in exclude_patterns:
+                        if file.match(exclude_pattern):
+                            should_exclude = True
+                            break
+                    if not should_exclude:
+                        k8s_files.append(file)
+
+        return list(set(k8s_files))  # Remove duplicates
+
+    def _is_kubernetes_manifest(self, file: Path) -> bool:
+        """Check if a file is a Kubernetes manifest"""
+        try:
+            content = file.read_text(encoding='utf-8')
+            # Simple heuristic: check for common Kubernetes fields
+            k8s_indicators = [
+                "apiVersion:", "kind:", "metadata:", "spec:",
+                "Deployment", "Service", "Pod", "ConfigMap",
+                "Secret", "Ingress", "PersistentVolume"
+            ]
+            return any(indicator in content for indicator in k8s_indicators)
+        except Exception:
+            return False
+
+    async def _run_polaris_audit(self, workspace: Path, config: Dict[str, Any], k8s_files: List[Path]) -> List[ModuleFinding]:
+        """Run Polaris audit on workspace"""
+        findings = []
+
+        try:
+            # Build polaris command
+            cmd = ["polaris", "audit"]
+
+            # Add audit path
+            audit_path = config.get("audit_path", str(workspace))
+            cmd.extend(["--audit-path", audit_path])
+
+            # Add config file if specified
+            config_file = config.get("config_file")
+            if config_file:
+                cmd.extend(["--config", config_file])
+
+            # Add format
+            format_type = config.get("format", "json")
+            cmd.extend(["--format", format_type])
+
+            # Add only failed tests flag
+            if config.get("only_show_failed_tests", True):
+                cmd.append("--only-show-failed-tests")
+
+            # Add severity threshold
+            severity_threshold = config.get("severity_threshold", "warning")
+            cmd.extend(["--severity", severity_threshold])
+
+            # Add disable checks
+            disable_checks = config.get("disable_checks", [])
+            for check in disable_checks:
+                cmd.extend(["--disable-check", check])
+
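+            # Illustrative only (not part of the module's logic): with the
+            # defaults above and a hypothetical /workspace path, the assembled
+            # command resembles
+            #
+            #   polaris audit --audit-path /workspace --format json \
+            #       --only-show-failed-tests --severity warning
+            #
+            # Flag spellings can differ between Polaris releases; verify
+            # against `polaris audit --help` for the installed version.
+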
logger.debug(f"Running command: {' '.join(cmd)}") + + # Run polaris + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace + ) + + stdout, stderr = await process.communicate() + + # Parse results + if process.returncode == 0 or format_type == "json": + findings = self._parse_polaris_output(stdout.decode(), workspace, config) + else: + error_msg = stderr.decode() + logger.warning(f"Polaris audit failed: {error_msg}") + + except Exception as e: + logger.warning(f"Error running Polaris audit: {e}") + + return findings + + def _parse_polaris_output(self, output: str, workspace: Path, config: Dict[str, Any]) -> List[ModuleFinding]: + """Parse Polaris JSON output into findings""" + findings = [] + + if not output.strip(): + return findings + + try: + data = json.loads(output) + + # Get severity threshold for filtering + severity_threshold = config.get("severity_threshold", "warning") + severity_levels = {"error": 3, "warning": 2, "info": 1} + min_severity_level = severity_levels.get(severity_threshold, 2) + + # Process audit results + audit_results = data.get("AuditResults", []) + + for result in audit_results: + namespace = result.get("Namespace", "default") + results_by_kind = result.get("Results", {}) + + for kind, kind_results in results_by_kind.items(): + for resource_name, resource_data in kind_results.items(): + # Get container results + container_results = resource_data.get("ContainerResults", {}) + pod_result = resource_data.get("PodResult", {}) + + # Process container results + for container_name, container_data in container_results.items(): + self._process_container_results( + findings, container_data, kind, resource_name, + container_name, namespace, workspace, min_severity_level + ) + + # Process pod-level results + if pod_result: + self._process_pod_results( + findings, pod_result, kind, resource_name, + namespace, workspace, min_severity_level + ) + + except json.JSONDecodeError as e: + logger.warning(f"Failed to parse Polaris output: {e}") + except Exception as e: + logger.warning(f"Error processing Polaris results: {e}") + + return findings + + def _process_container_results(self, findings: List[ModuleFinding], container_data: Dict, + kind: str, resource_name: str, container_name: str, + namespace: str, workspace: Path, min_severity_level: int): + """Process container-level validation results""" + results = container_data.get("Results", {}) + + for check_name, check_result in results.items(): + severity = check_result.get("Severity", "warning") + success = check_result.get("Success", True) + message = check_result.get("Message", "") + category_name = check_result.get("Category", "") + + # Skip if check passed or severity too low + if success: + continue + + severity_levels = {"error": 3, "warning": 2, "info": 1} + if severity_levels.get(severity, 1) < min_severity_level: + continue + + # Map severity to our standard levels + finding_severity = self._map_severity(severity) + category = self._get_category(check_name, category_name) + + finding = self.create_finding( + title=f"Polaris Policy Violation: {check_name}", + description=message or f"Container {container_name} in {kind} {resource_name} failed check {check_name}", + severity=finding_severity, + category=category, + file_path=None, # Polaris doesn't provide file paths in audit mode + recommendation=self._get_recommendation(check_name, message), + metadata={ + "check_name": check_name, + "polaris_severity": severity, + "polaris_category": 
category_name, + "resource_kind": kind, + "resource_name": resource_name, + "container_name": container_name, + "namespace": namespace, + "context": "container" + } + ) + findings.append(finding) + + def _process_pod_results(self, findings: List[ModuleFinding], pod_result: Dict, + kind: str, resource_name: str, namespace: str, + workspace: Path, min_severity_level: int): + """Process pod-level validation results""" + results = pod_result.get("Results", {}) + + for check_name, check_result in results.items(): + severity = check_result.get("Severity", "warning") + success = check_result.get("Success", True) + message = check_result.get("Message", "") + category_name = check_result.get("Category", "") + + # Skip if check passed or severity too low + if success: + continue + + severity_levels = {"error": 3, "warning": 2, "info": 1} + if severity_levels.get(severity, 1) < min_severity_level: + continue + + # Map severity to our standard levels + finding_severity = self._map_severity(severity) + category = self._get_category(check_name, category_name) + + finding = self.create_finding( + title=f"Polaris Policy Violation: {check_name}", + description=message or f"{kind} {resource_name} failed check {check_name}", + severity=finding_severity, + category=category, + file_path=None, # Polaris doesn't provide file paths in audit mode + recommendation=self._get_recommendation(check_name, message), + metadata={ + "check_name": check_name, + "polaris_severity": severity, + "polaris_category": category_name, + "resource_kind": kind, + "resource_name": resource_name, + "namespace": namespace, + "context": "pod" + } + ) + findings.append(finding) + + def _map_severity(self, polaris_severity: str) -> str: + """Map Polaris severity to our standard severity levels""" + severity_map = { + "error": "high", + "warning": "medium", + "info": "low" + } + return severity_map.get(polaris_severity.lower(), "medium") + + def _get_category(self, check_name: str, category_name: str) -> str: + """Determine finding category based on check name and category""" + check_lower = check_name.lower() + category_lower = category_name.lower() + + # Use Polaris category if available + if "security" in category_lower: + return "security_configuration" + elif "efficiency" in category_lower: + return "resource_efficiency" + elif "reliability" in category_lower: + return "reliability" + + # Fallback to check name analysis + if any(term in check_lower for term in ["security", "privilege", "root", "capabilities"]): + return "security_configuration" + elif any(term in check_lower for term in ["resources", "limits", "requests"]): + return "resource_management" + elif any(term in check_lower for term in ["probe", "health", "liveness", "readiness"]): + return "health_monitoring" + elif any(term in check_lower for term in ["image", "tag", "pull"]): + return "image_management" + elif any(term in check_lower for term in ["network", "host"]): + return "network_security" + else: + return "kubernetes_best_practices" + + def _get_recommendation(self, check_name: str, message: str) -> str: + """Generate recommendation based on check name and message""" + check_lower = check_name.lower() + + # Security-related recommendations + if "privileged" in check_lower: + return "Remove privileged: true from container security context to reduce security risks." + elif "runasroot" in check_lower: + return "Configure runAsNonRoot: true and specify a non-root user ID." 
+ elif "allowprivilegeescalation" in check_lower: + return "Set allowPrivilegeEscalation: false to prevent privilege escalation attacks." + elif "capabilities" in check_lower: + return "Remove unnecessary capabilities and add only required ones using drop/add lists." + elif "readonly" in check_lower: + return "Set readOnlyRootFilesystem: true to prevent filesystem modifications." + + # Resource management recommendations + elif "memory" in check_lower and "requests" in check_lower: + return "Set memory requests to ensure proper resource allocation and scheduling." + elif "memory" in check_lower and "limits" in check_lower: + return "Set memory limits to prevent containers from using excessive memory." + elif "cpu" in check_lower and "requests" in check_lower: + return "Set CPU requests for proper resource allocation and quality of service." + elif "cpu" in check_lower and "limits" in check_lower: + return "Set CPU limits to prevent CPU starvation of other containers." + + # Health monitoring recommendations + elif "liveness" in check_lower: + return "Add liveness probes to detect and recover from container failures." + elif "readiness" in check_lower: + return "Add readiness probes to ensure containers are ready before receiving traffic." + + # Image management recommendations + elif "tag" in check_lower: + return "Use specific image tags instead of 'latest' for reproducible deployments." + elif "pullpolicy" in check_lower: + return "Set imagePullPolicy appropriately based on your deployment requirements." + + # Generic recommendation + elif message: + return f"Address the policy violation: {message}" + else: + return f"Review and fix the configuration issue identified by check: {check_name}" + + def _create_summary(self, findings: List[ModuleFinding], total_files: int) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + check_counts = {} + resource_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by check + check_name = finding.metadata.get("check_name", "unknown") + check_counts[check_name] = check_counts.get(check_name, 0) + 1 + + # Count by resource + resource_kind = finding.metadata.get("resource_kind", "unknown") + resource_counts[resource_kind] = resource_counts.get(resource_kind, 0) + 1 + + return { + "total_findings": len(findings), + "files_scanned": total_files, + "severity_counts": severity_counts, + "category_counts": category_counts, + "top_checks": dict(sorted(check_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "resource_type_counts": resource_counts, + "unique_resources": len(set(f"{f.metadata.get('resource_kind')}:{f.metadata.get('resource_name')}" for f in findings)), + "namespaces": len(set(f.metadata.get("namespace", "default") for f in findings)) + } \ No newline at end of file diff --git a/backend/toolbox/modules/penetration_testing/__init__.py b/backend/toolbox/modules/penetration_testing/__init__.py new file mode 100644 index 0000000..3e09d19 --- /dev/null +++ b/backend/toolbox/modules/penetration_testing/__init__.py @@ -0,0 +1,43 @@ +""" +Penetration Testing Modules + +This package contains modules for penetration testing and vulnerability assessment. 
+ +Available modules: +- Nuclei: Fast and customizable vulnerability scanner +- Nmap: Network discovery and security auditing +- Masscan: High-speed Internet-wide port scanner +- SQLMap: Automatic SQL injection detection and exploitation +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +from typing import List, Type +from ..base import BaseModule + +# Module registry for automatic discovery +PENETRATION_TESTING_MODULES: List[Type[BaseModule]] = [] + +def register_module(module_class: Type[BaseModule]): + """Register a penetration testing module""" + PENETRATION_TESTING_MODULES.append(module_class) + return module_class + +def get_available_modules() -> List[Type[BaseModule]]: + """Get all available penetration testing modules""" + return PENETRATION_TESTING_MODULES.copy() + +# Import modules to trigger registration +from .nuclei import NucleiModule +from .nmap import NmapModule +from .masscan import MasscanModule +from .sqlmap import SQLMapModule \ No newline at end of file diff --git a/backend/toolbox/modules/penetration_testing/masscan.py b/backend/toolbox/modules/penetration_testing/masscan.py new file mode 100644 index 0000000..3452168 --- /dev/null +++ b/backend/toolbox/modules/penetration_testing/masscan.py @@ -0,0 +1,607 @@ +""" +Masscan Penetration Testing Module + +This module uses Masscan for high-speed Internet-wide port scanning. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class MasscanModule(BaseModule): + """Masscan high-speed port scanner module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="masscan", + version="1.3.2", + description="High-speed Internet-wide port scanner for large-scale network discovery", + author="FuzzForge Team", + category="penetration_testing", + tags=["port-scan", "network", "discovery", "high-speed", "mass-scan"], + input_schema={ + "type": "object", + "properties": { + "targets": { + "type": "array", + "items": {"type": "string"}, + "description": "List of targets (IP addresses, CIDR ranges, domains)" + }, + "target_file": { + "type": "string", + "description": "File containing targets to scan" + }, + "ports": { + "type": "string", + "default": "1-1000", + "description": "Port range or specific ports to scan" + }, + "top_ports": { + "type": "integer", + "description": "Scan top N most common ports" + }, + "rate": { + "type": "integer", + "default": 1000, + "description": "Packet transmission rate (packets/second)" + }, + "max_rate": { + "type": "integer", + "description": "Maximum packet rate limit" + }, + "connection_timeout": { + "type": "integer", + "default": 10, + "description": "Connection timeout in seconds" + }, + "wait_time": { + "type": "integer", + "default": 10, + "description": "Time to wait for responses (seconds)" + }, + "retries": { + "type": "integer", + "default": 0, + "description": "Number of retries for failed connections" + }, + "randomize_hosts": { + "type": "boolean", + "default": True, + "description": "Randomize host order" + }, + "source_ip": { + "type": "string", + "description": "Source IP address to use" + }, + "source_port": { + "type": "string", + "description": "Source port range to use" + }, + "interface": { + "type": "string", + "description": "Network interface to use" + }, + "router_mac": { + "type": "string", + "description": "Router MAC address" + }, + "exclude_targets": { + "type": "array", + "items": {"type": "string"}, + "description": "Targets to exclude from scanning" + }, + "exclude_file": { + "type": "string", + "description": "File containing targets to exclude" + }, + "ping": { + "type": "boolean", + "default": False, + "description": "Include ping scan" + }, + "banners": { + "type": "boolean", + "default": False, + "description": "Grab banners from services" + }, + "http_user_agent": { + "type": "string", + "description": "HTTP User-Agent string for banner grabbing" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "host": {"type": "string"}, + "port": {"type": "integer"}, + "protocol": {"type": "string"}, + "state": {"type": "string"}, + "banner": {"type": "string"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + targets = config.get("targets", []) + target_file = config.get("target_file") + + if not targets and not target_file: + raise ValueError("Either 'targets' or 'target_file' must be specified") + + rate = config.get("rate", 1000) + if rate <= 0 or rate > 10000000: # Masscan limit + raise ValueError("Rate must be between 1 and 10,000,000 packets/second") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute Masscan port scanning""" + self.start_timer() + + try: + # Validate inputs + 
self.validate_config(config)
+            self.validate_workspace(workspace)
+
+            logger.info("Running Masscan high-speed port scan")
+
+            # Prepare target specification
+            target_args = self._prepare_targets(config, workspace)
+            if not target_args:
+                logger.info("No targets specified for scanning")
+                return self.create_result(
+                    findings=[],
+                    status="success",
+                    summary={"total_findings": 0, "targets_scanned": 0}
+                )
+
+            # Run Masscan scan
+            findings = await self._run_masscan_scan(target_args, config, workspace)
+
+            # Create summary
+            target_count = len(config.get("targets", [])) if config.get("targets") else 1
+            summary = self._create_summary(findings, target_count)
+
+            logger.info(f"Masscan found {len(findings)} open ports")
+
+            return self.create_result(
+                findings=findings,
+                status="success",
+                summary=summary
+            )
+
+        except Exception as e:
+            logger.error(f"Masscan module failed: {e}")
+            return self.create_result(
+                findings=[],
+                status="failed",
+                error=str(e)
+            )
+
+    def _prepare_targets(self, config: Dict[str, Any], workspace: Path) -> List[str]:
+        """Prepare target arguments for masscan"""
+        target_args = []
+
+        # Add targets from list; masscan takes target specifications
+        # (IP addresses and CIDR ranges) as positional arguments
+        targets = config.get("targets", [])
+        for target in targets:
+            target_args.append(target)
+
+        # Add targets from file
+        target_file = config.get("target_file")
+        if target_file:
+            target_path = workspace / target_file
+            if target_path.exists():
+                target_args.extend(["-iL", str(target_path)])
+            else:
+                raise FileNotFoundError(f"Target file not found: {target_file}")
+
+        return target_args
+
+    async def _run_masscan_scan(self, target_args: List[str], config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]:
+        """Run Masscan scan"""
+        findings = []
+
+        try:
+            # Build masscan command
+            cmd = ["masscan"]
+
+            # Add target arguments
+            cmd.extend(target_args)
+
+            # Add port specification
+            if config.get("top_ports"):
+                # Emulate a top-ports scan with a static list of common ports
+                top_ports = self._get_top_ports(config["top_ports"])
+                cmd.extend(["-p", top_ports])
+            else:
+                ports = config.get("ports", "1-1000")
+                cmd.extend(["-p", ports])
+
+            # Add rate limiting
+            rate = config.get("rate", 1000)
+            cmd.extend(["--rate", str(rate)])
+
+            # Add max rate if specified
+            max_rate = config.get("max_rate")
+            if max_rate:
+                cmd.extend(["--max-rate", str(max_rate)])
+
+            # Add connection timeout
+            connection_timeout = config.get("connection_timeout", 10)
+            cmd.extend(["--connection-timeout", str(connection_timeout)])
+
+            # Add wait time
+            wait_time = config.get("wait_time", 10)
+            cmd.extend(["--wait", str(wait_time)])
+
+            # Add retries
+            retries = config.get("retries", 0)
+            if retries > 0:
+                cmd.extend(["--retries", str(retries)])
+
+            # Add randomization
+            if config.get("randomize_hosts", True):
+                cmd.append("--randomize-hosts")
+
+            # Add source IP
+            source_ip = config.get("source_ip")
+            if source_ip:
+                cmd.extend(["--source-ip", source_ip])
+
+            # Add source port
+            source_port = config.get("source_port")
+            if source_port:
+                cmd.extend(["--source-port", source_port])
+
+            # Add interface
+            interface = config.get("interface")
+            if interface:
+                cmd.extend(["-e", interface])
+
+            # Add router MAC
+            router_mac = config.get("router_mac")
+            if router_mac:
+                cmd.extend(["--router-mac", router_mac])
+
+            # Add exclude targets
+            exclude_targets = config.get("exclude_targets", [])
+            for exclude in exclude_targets:
+                cmd.extend(["--exclude", exclude])
+
+            # Add exclude file
+            exclude_file = config.get("exclude_file")
+            if exclude_file:
+                exclude_path = workspace / exclude_file
+                if exclude_path.exists():
+                    cmd.extend(["--excludefile", str(exclude_path)])
+
+            # Add ping scan
+            if config.get("ping", False):
+                cmd.append("--ping")
+
+            # Add banner grabbing
+            if config.get("banners", False):
+                cmd.append("--banners")
+
+            # Add HTTP User-Agent
+            user_agent = config.get("http_user_agent")
+            if user_agent:
+                cmd.extend(["--http-user-agent", user_agent])
+
+            # Set output format to JSON
+            output_file = workspace / "masscan_results.json"
+            cmd.extend(["-oJ", str(output_file)])
+
+            logger.debug(f"Running command: {' '.join(cmd)}")
+
+            # Run masscan
+            process = await asyncio.create_subprocess_exec(
+                *cmd,
+                stdout=asyncio.subprocess.PIPE,
+                stderr=asyncio.subprocess.PIPE,
+                cwd=workspace
+            )
+
+            stdout, stderr = await process.communicate()
+
+            # Parse results from JSON file
+            if output_file.exists():
+                findings = self._parse_masscan_json(output_file, workspace)
+            else:
+                # Try to parse stdout if no file was created
+                if stdout:
+                    findings = self._parse_masscan_output(stdout.decode(), workspace)
+                else:
+                    error_msg = stderr.decode()
+                    logger.error(f"Masscan scan failed: {error_msg}")
+
+        except Exception as e:
+            logger.warning(f"Error running Masscan scan: {e}")
+
+        return findings
+
+    def _get_top_ports(self, count: int) -> str:
+        """Get top N common ports for masscan"""
+        # Common ports based on Nmap's top ports list
+        # (duplicates removed; first-occurrence order preserved)
+        top_ports = [
+            80, 23, 443, 21, 22, 25, 53, 110, 111, 995, 993, 143, 587, 465,
+            109, 88, 135, 139, 445, 8080, 8443, 8000, 8888, 8880, 2222, 9999,
+            3389, 5900, 5901, 1433, 3306, 5432, 1521, 50000, 1494, 554, 37,
+            79, 82, 5060, 50030
+        ]
+
+        # Take first N unique ports
+        selected_ports = list(dict.fromkeys(top_ports))[:count]
+        return ",".join(map(str, selected_ports))
+
+    def _parse_masscan_json(self, json_file: Path, workspace: Path) -> List[ModuleFinding]:
+        """Parse Masscan JSON output into findings"""
+        findings = []
+
+        try:
+            with open(json_file, 'r') as f:
+                content = f.read().strip()
+
+            # Masscan outputs JSONL format (one JSON object per line)
+            for line in content.split('\n'):
+                if not line.strip():
+                    continue
+
+                try:
+                    result = json.loads(line)
+                    finding = self._process_masscan_result(result)
+                    if finding:
+                        findings.append(finding)
+                except json.JSONDecodeError:
+                    continue
+
+        except Exception as e:
+            logger.warning(f"Error parsing Masscan JSON: {e}")
+
+        return findings
+
+    def _parse_masscan_output(self, output: str, workspace: Path) -> List[ModuleFinding]:
+        """Parse Masscan text output into findings"""
+        findings = []
+
+        try:
+            for line in output.split('\n'):
+                if not line.strip() or line.startswith('#'):
+                    continue
+
+                # Parse format: "open tcp 80 1.2.3.4"
+                parts = line.split()
+                if len(parts) >= 4 and parts[0] == "open":
+                    protocol = parts[1]
+                    port = int(parts[2])
+                    ip = parts[3]
+
+                    result = {
+                        "ip": ip,
+                        "ports": [{"port": port, "proto": protocol, "status": "open"}]
+                    }
+
+                    finding = self._process_masscan_result(result)
+                    if finding:
+                        findings.append(finding)
+
+        except Exception as e:
+            logger.warning(f"Error parsing Masscan output: {e}")
+
+        return findings
+
+    def _process_masscan_result(self, result: Dict) -> ModuleFinding:
+        """Process a single Masscan result into a finding"""
+        try:
+            ip_address = result.get("ip", "")
+            ports_data = result.get("ports", [])
+
+            if not ip_address or not ports_data:
+                return None
+
+            # Process first port (Masscan typically reports one port per result)
+            port_data = ports_data[0]
+            port_number = port_data.get("port", 0)
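+            # For reference (values hypothetical; exact fields vary by masscan
+            # version), a JSONL record handled here looks roughly like:
+            #   {"ip": "192.0.2.10",
+            #    "ports": [{"port": 80, "proto": "tcp", "status": "open",
+            #               "reason": "syn-ack", "ttl": 64}]}
+            # Only the first "ports" entry (port_data above) is inspected.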
protocol = port_data.get("proto", "tcp") + status = port_data.get("status", "open") + service = port_data.get("service", {}) + banner = service.get("banner", "") if service else "" + + # Only report open ports + if status != "open": + return None + + # Determine severity based on port + severity = self._get_port_severity(port_number) + + # Get category + category = self._get_port_category(port_number) + + # Create description + description = f"Open port {port_number}/{protocol} on {ip_address}" + if banner: + description += f" (Banner: {banner[:100]})" + + # Create finding + finding = self.create_finding( + title=f"Open Port: {port_number}/{protocol}", + description=description, + severity=severity, + category=category, + file_path=None, # Network scan, no file + recommendation=self._get_port_recommendation(port_number, banner), + metadata={ + "host": ip_address, + "port": port_number, + "protocol": protocol, + "status": status, + "banner": banner, + "service_info": service + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error processing Masscan result: {e}") + return None + + def _get_port_severity(self, port: int) -> str: + """Determine severity based on port number""" + # High risk ports (commonly exploited or sensitive services) + high_risk_ports = [21, 23, 135, 139, 445, 1433, 1521, 3389, 5900, 6379, 27017] + + # Medium risk ports (network services that could be risky if misconfigured) + medium_risk_ports = [22, 25, 53, 110, 143, 993, 995, 3306, 5432] + + # Web ports are generally lower risk but still noteworthy + web_ports = [80, 443, 8080, 8443, 8000, 8888] + + if port in high_risk_ports: + return "high" + elif port in medium_risk_ports: + return "medium" + elif port in web_ports: + return "low" + elif port < 1024: # Well-known ports + return "medium" + else: + return "low" + + def _get_port_category(self, port: int) -> str: + """Determine category based on port number""" + if port in [80, 443, 8080, 8443, 8000, 8888]: + return "web_services" + elif port == 22: + return "remote_access" + elif port in [20, 21]: + return "file_transfer" + elif port in [25, 110, 143, 587, 993, 995]: + return "email_services" + elif port in [1433, 3306, 5432, 1521, 27017, 6379]: + return "database_services" + elif port == 3389: + return "remote_desktop" + elif port == 53: + return "dns_services" + elif port in [135, 139, 445]: + return "windows_services" + elif port in [23, 5900]: + return "insecure_protocols" + else: + return "network_services" + + def _get_port_recommendation(self, port: int, banner: str) -> str: + """Generate recommendation based on port and banner""" + # Port-specific recommendations + recommendations = { + 21: "FTP service detected. Consider using SFTP instead for secure file transfer.", + 22: "SSH service detected. Ensure strong authentication and key-based access.", + 23: "Telnet service detected. Replace with SSH for secure remote access.", + 25: "SMTP service detected. Ensure proper authentication and encryption.", + 53: "DNS service detected. Verify it's not an open resolver.", + 80: "HTTP service detected. Consider upgrading to HTTPS.", + 110: "POP3 service detected. Consider using secure alternatives like IMAPS.", + 135: "Windows RPC service exposed. Restrict access if not required.", + 139: "NetBIOS service detected. Ensure proper access controls.", + 143: "IMAP service detected. Consider using encrypted IMAPS.", + 445: "SMB service detected. Ensure latest patches and access controls.", + 443: "HTTPS service detected. 
Verify SSL/TLS configuration.", + 993: "IMAPS service detected. Verify certificate configuration.", + 995: "POP3S service detected. Verify certificate configuration.", + 1433: "SQL Server detected. Ensure strong authentication and network restrictions.", + 1521: "Oracle DB detected. Ensure proper security configuration.", + 3306: "MySQL service detected. Secure with strong passwords and access controls.", + 3389: "RDP service detected. Use strong passwords and consider VPN access.", + 5432: "PostgreSQL detected. Ensure proper authentication and access controls.", + 5900: "VNC service detected. Use strong passwords and encryption.", + 6379: "Redis service detected. Configure authentication and access controls.", + 8080: "HTTP proxy/web service detected. Verify if exposure is intended.", + 8443: "HTTPS service on non-standard port. Verify certificate configuration." + } + + recommendation = recommendations.get(port, f"Port {port} is open. Verify if this service is required and properly secured.") + + # Add banner-specific advice + if banner: + banner_lower = banner.lower() + if "default" in banner_lower or "admin" in banner_lower: + recommendation += " Default credentials may be in use - change immediately." + elif any(version in banner_lower for version in ["1.0", "2.0", "old", "legacy"]): + recommendation += " Service version appears outdated - consider upgrading." + + return recommendation + + def _create_summary(self, findings: List[ModuleFinding], targets_count: int) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + port_counts = {} + host_counts = {} + protocol_counts = {"tcp": 0, "udp": 0} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by port + port = finding.metadata.get("port") + if port: + port_counts[port] = port_counts.get(port, 0) + 1 + + # Count by host + host = finding.metadata.get("host", "unknown") + host_counts[host] = host_counts.get(host, 0) + 1 + + # Count by protocol + protocol = finding.metadata.get("protocol", "tcp") + if protocol in protocol_counts: + protocol_counts[protocol] += 1 + + return { + "total_findings": len(findings), + "targets_scanned": targets_count, + "severity_counts": severity_counts, + "category_counts": category_counts, + "protocol_counts": protocol_counts, + "unique_hosts": len(host_counts), + "top_ports": dict(sorted(port_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "host_counts": dict(sorted(host_counts.items(), key=lambda x: x[1], reverse=True)[:10]) + } \ No newline at end of file diff --git a/backend/toolbox/modules/penetration_testing/nmap.py b/backend/toolbox/modules/penetration_testing/nmap.py new file mode 100644 index 0000000..4cfb363 --- /dev/null +++ b/backend/toolbox/modules/penetration_testing/nmap.py @@ -0,0 +1,710 @@ +""" +Nmap Penetration Testing Module + +This module uses Nmap for network discovery, port scanning, and security auditing. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. 
+# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +import xml.etree.ElementTree as ET +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class NmapModule(BaseModule): + """Nmap network discovery and security auditing module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="nmap", + version="7.94", + description="Network discovery and security auditing using Nmap", + author="FuzzForge Team", + category="penetration_testing", + tags=["network", "port-scan", "discovery", "security-audit", "service-detection"], + input_schema={ + "type": "object", + "properties": { + "targets": { + "type": "array", + "items": {"type": "string"}, + "description": "List of targets (IP addresses, domains, CIDR ranges)" + }, + "target_file": { + "type": "string", + "description": "File containing targets to scan" + }, + "scan_type": { + "type": "string", + "enum": ["syn", "tcp", "udp", "ack", "window", "maimon"], + "default": "syn", + "description": "Type of scan to perform" + }, + "ports": { + "type": "string", + "default": "1-1000", + "description": "Port range or specific ports to scan" + }, + "top_ports": { + "type": "integer", + "description": "Scan top N most common ports" + }, + "service_detection": { + "type": "boolean", + "default": True, + "description": "Enable service version detection" + }, + "os_detection": { + "type": "boolean", + "default": False, + "description": "Enable OS detection (requires root)" + }, + "script_scan": { + "type": "boolean", + "default": True, + "description": "Enable default NSE scripts" + }, + "scripts": { + "type": "array", + "items": {"type": "string"}, + "description": "Specific NSE scripts to run" + }, + "script_categories": { + "type": "array", + "items": {"type": "string"}, + "description": "NSE script categories to run (safe, vuln, etc.)" + }, + "timing_template": { + "type": "string", + "enum": ["paranoid", "sneaky", "polite", "normal", "aggressive", "insane"], + "default": "normal", + "description": "Timing template (0-5)" + }, + "max_retries": { + "type": "integer", + "default": 1, + "description": "Maximum number of retries" + }, + "host_timeout": { + "type": "integer", + "default": 300, + "description": "Host timeout in seconds" + }, + "min_rate": { + "type": "integer", + "description": "Minimum packet rate (packets/second)" + }, + "max_rate": { + "type": "integer", + "description": "Maximum packet rate (packets/second)" + }, + "stealth": { + "type": "boolean", + "default": False, + "description": "Enable stealth scanning options" + }, + "skip_discovery": { + "type": "boolean", + "default": False, + "description": "Skip host discovery (treat all as online)" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "host": {"type": "string"}, + "port": {"type": "integer"}, + "service": {"type": "string"}, + "state": {"type": "string"}, + "version": {"type": "string"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + targets = config.get("targets", []) + target_file = 
config.get("target_file") + + if not targets and not target_file: + raise ValueError("Either 'targets' or 'target_file' must be specified") + + scan_type = config.get("scan_type", "syn") + valid_scan_types = ["syn", "tcp", "udp", "ack", "window", "maimon"] + if scan_type not in valid_scan_types: + raise ValueError(f"Invalid scan type: {scan_type}. Valid: {valid_scan_types}") + + timing = config.get("timing_template", "normal") + valid_timings = ["paranoid", "sneaky", "polite", "normal", "aggressive", "insane"] + if timing not in valid_timings: + raise ValueError(f"Invalid timing template: {timing}. Valid: {valid_timings}") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute Nmap network scanning""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info("Running Nmap network scan") + + # Prepare target file + target_file = await self._prepare_targets(config, workspace) + if not target_file: + logger.info("No targets specified for scanning") + return self.create_result( + findings=[], + status="success", + summary={"total_findings": 0, "hosts_scanned": 0} + ) + + # Run Nmap scan + findings = await self._run_nmap_scan(target_file, config, workspace) + + # Create summary + target_count = len(config.get("targets", [])) if config.get("targets") else 1 + summary = self._create_summary(findings, target_count) + + logger.info(f"Nmap found {len(findings)} results") + + return self.create_result( + findings=findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"Nmap module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + async def _prepare_targets(self, config: Dict[str, Any], workspace: Path) -> Path: + """Prepare target file for scanning""" + targets = config.get("targets", []) + target_file = config.get("target_file") + + if target_file: + # Use existing target file + target_path = workspace / target_file + if target_path.exists(): + return target_path + else: + raise FileNotFoundError(f"Target file not found: {target_file}") + + if targets: + # Create temporary target file + target_path = workspace / "nmap_targets.txt" + with open(target_path, 'w') as f: + for target in targets: + f.write(f"{target}\n") + return target_path + + return None + + async def _run_nmap_scan(self, target_file: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run Nmap scan""" + findings = [] + + try: + # Build nmap command + cmd = ["nmap"] + + # Add scan type + scan_type = config.get("scan_type", "syn") + scan_type_map = { + "syn": "-sS", + "tcp": "-sT", + "udp": "-sU", + "ack": "-sA", + "window": "-sW", + "maimon": "-sM" + } + cmd.append(scan_type_map[scan_type]) + + # Add port specification + if config.get("top_ports"): + cmd.extend(["--top-ports", str(config["top_ports"])]) + else: + ports = config.get("ports", "1-1000") + cmd.extend(["-p", ports]) + + # Add service detection + if config.get("service_detection", True): + cmd.append("-sV") + + # Add OS detection + if config.get("os_detection", False): + cmd.append("-O") + + # Add script scanning + if config.get("script_scan", True): + cmd.append("-sC") + + # Add specific scripts + scripts = config.get("scripts", []) + if scripts: + cmd.extend(["--script", ",".join(scripts)]) + + # Add script categories + script_categories = config.get("script_categories", []) + if script_categories: + cmd.extend(["--script", 
",".join(script_categories)]) + + # Add timing template + timing = config.get("timing_template", "normal") + timing_map = { + "paranoid": "-T0", + "sneaky": "-T1", + "polite": "-T2", + "normal": "-T3", + "aggressive": "-T4", + "insane": "-T5" + } + cmd.append(timing_map[timing]) + + # Add retry options + max_retries = config.get("max_retries", 1) + cmd.extend(["--max-retries", str(max_retries)]) + + # Add timeout + host_timeout = config.get("host_timeout", 300) + cmd.extend(["--host-timeout", f"{host_timeout}s"]) + + # Add rate limiting + if config.get("min_rate"): + cmd.extend(["--min-rate", str(config["min_rate"])]) + + if config.get("max_rate"): + cmd.extend(["--max-rate", str(config["max_rate"])]) + + # Add stealth options + if config.get("stealth", False): + cmd.extend(["-f", "--randomize-hosts"]) + + # Skip host discovery if requested + if config.get("skip_discovery", False): + cmd.append("-Pn") + + # Add output format + output_file = workspace / "nmap_results.xml" + cmd.extend(["-oX", str(output_file)]) + + # Add targets from file + cmd.extend(["-iL", str(target_file)]) + + # Add verbose and reason flags + cmd.extend(["-v", "--reason"]) + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run nmap + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace + ) + + stdout, stderr = await process.communicate() + + # Parse results from XML file + if output_file.exists(): + findings = self._parse_nmap_xml(output_file, workspace) + else: + error_msg = stderr.decode() + logger.error(f"Nmap scan failed: {error_msg}") + + except Exception as e: + logger.warning(f"Error running Nmap scan: {e}") + + return findings + + def _parse_nmap_xml(self, xml_file: Path, workspace: Path) -> List[ModuleFinding]: + """Parse Nmap XML output into findings""" + findings = [] + + try: + tree = ET.parse(xml_file) + root = tree.getroot() + + # Process each host + for host_elem in root.findall(".//host"): + # Get host information + host_status = host_elem.find("status") + if host_status is None or host_status.get("state") != "up": + continue + + # Get IP address + address_elem = host_elem.find("address[@addrtype='ipv4']") + if address_elem is None: + address_elem = host_elem.find("address[@addrtype='ipv6']") + + if address_elem is None: + continue + + ip_address = address_elem.get("addr") + + # Get hostname if available + hostname = "" + hostnames_elem = host_elem.find("hostnames") + if hostnames_elem is not None: + hostname_elem = hostnames_elem.find("hostname") + if hostname_elem is not None: + hostname = hostname_elem.get("name", "") + + # Get OS information + os_info = self._extract_os_info(host_elem) + + # Process ports + ports_elem = host_elem.find("ports") + if ports_elem is not None: + for port_elem in ports_elem.findall("port"): + finding = self._process_port(port_elem, ip_address, hostname, os_info) + if finding: + findings.append(finding) + + # Process host scripts + host_scripts = host_elem.find("hostscript") + if host_scripts is not None: + for script_elem in host_scripts.findall("script"): + finding = self._process_host_script(script_elem, ip_address, hostname) + if finding: + findings.append(finding) + + except ET.ParseError as e: + logger.warning(f"Failed to parse Nmap XML: {e}") + except Exception as e: + logger.warning(f"Error processing Nmap results: {e}") + + return findings + + def _extract_os_info(self, host_elem) -> Dict[str, Any]: + """Extract OS information from host element""" + os_info = {} + + 
os_elem = host_elem.find("os") + if os_elem is not None: + osmatch_elem = os_elem.find("osmatch") + if osmatch_elem is not None: + os_info["name"] = osmatch_elem.get("name", "") + os_info["accuracy"] = osmatch_elem.get("accuracy", "0") + + return os_info + + def _process_port(self, port_elem, ip_address: str, hostname: str, os_info: Dict) -> ModuleFinding: + """Process a port element into a finding""" + try: + port_id = port_elem.get("portid") + protocol = port_elem.get("protocol") + + # Get state + state_elem = port_elem.find("state") + if state_elem is None: + return None + + state = state_elem.get("state") + reason = state_elem.get("reason", "") + + # Only report open ports + if state != "open": + return None + + # Get service information + service_elem = port_elem.find("service") + service_name = "" + service_version = "" + service_product = "" + service_extra = "" + + if service_elem is not None: + service_name = service_elem.get("name", "") + service_version = service_elem.get("version", "") + service_product = service_elem.get("product", "") + service_extra = service_elem.get("extrainfo", "") + + # Determine severity based on service + severity = self._get_port_severity(int(port_id), service_name) + + # Get category + category = self._get_port_category(int(port_id), service_name) + + # Create description + desc_parts = [f"Open port {port_id}/{protocol}"] + if service_name: + desc_parts.append(f"running {service_name}") + if service_product: + desc_parts.append(f"({service_product}") + if service_version: + desc_parts.append(f"version {service_version}") + desc_parts.append(")") + + description = " ".join(desc_parts) + + # Process port scripts + script_results = [] + script_elems = port_elem.findall("script") + for script_elem in script_elems: + script_id = script_elem.get("id", "") + script_output = script_elem.get("output", "") + if script_output: + script_results.append({"id": script_id, "output": script_output}) + + # Create finding + finding = self.create_finding( + title=f"Open Port: {port_id}/{protocol}", + description=description, + severity=severity, + category=category, + file_path=None, # Network scan, no file + recommendation=self._get_port_recommendation(int(port_id), service_name, script_results), + metadata={ + "host": ip_address, + "hostname": hostname, + "port": int(port_id), + "protocol": protocol, + "state": state, + "reason": reason, + "service_name": service_name, + "service_version": service_version, + "service_product": service_product, + "service_extra": service_extra, + "os_info": os_info, + "script_results": script_results + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error processing port: {e}") + return None + + def _process_host_script(self, script_elem, ip_address: str, hostname: str) -> ModuleFinding: + """Process a host script result into a finding""" + try: + script_id = script_elem.get("id", "") + script_output = script_elem.get("output", "") + + if not script_output or not script_id: + return None + + # Determine if this is a security issue + severity = self._get_script_severity(script_id, script_output) + + if severity == "info": + # Skip informational scripts + return None + + category = self._get_script_category(script_id) + + finding = self.create_finding( + title=f"Host Script Result: {script_id}", + description=script_output.strip(), + severity=severity, + category=category, + file_path=None, + recommendation=self._get_script_recommendation(script_id, script_output), + metadata={ + "host": ip_address, + "hostname": 
hostname, + "script_id": script_id, + "script_output": script_output.strip() + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error processing host script: {e}") + return None + + def _get_port_severity(self, port: int, service: str) -> str: + """Determine severity based on port and service""" + # High risk ports + high_risk_ports = [21, 23, 135, 139, 445, 1433, 1521, 3389, 5432, 5900, 6379] + # Medium risk ports + medium_risk_ports = [22, 25, 53, 110, 143, 993, 995] + # Web ports are generally lower risk + web_ports = [80, 443, 8080, 8443, 8000, 8888] + + if port in high_risk_ports: + return "high" + elif port in medium_risk_ports: + return "medium" + elif port in web_ports: + return "low" + elif port < 1024: # Well-known ports + return "medium" + else: + return "low" + + def _get_port_category(self, port: int, service: str) -> str: + """Determine category based on port and service""" + service_lower = service.lower() + + if service_lower in ["http", "https"] or port in [80, 443, 8080, 8443]: + return "web_services" + elif service_lower in ["ssh"] or port == 22: + return "remote_access" + elif service_lower in ["ftp", "ftps"] or port in [20, 21]: + return "file_transfer" + elif service_lower in ["smtp", "pop3", "imap"] or port in [25, 110, 143, 587, 993, 995]: + return "email_services" + elif service_lower in ["mysql", "postgresql", "mssql", "oracle"] or port in [1433, 3306, 5432, 1521]: + return "database_services" + elif service_lower in ["rdp"] or port == 3389: + return "remote_desktop" + elif service_lower in ["dns"] or port == 53: + return "dns_services" + elif port in [135, 139, 445]: + return "windows_services" + else: + return "network_services" + + def _get_script_severity(self, script_id: str, output: str) -> str: + """Determine severity for script results""" + script_lower = script_id.lower() + output_lower = output.lower() + + # High severity indicators + if any(term in script_lower for term in ["vuln", "exploit", "backdoor"]): + return "high" + if any(term in output_lower for term in ["vulnerable", "exploit", "critical"]): + return "high" + + # Medium severity indicators + if any(term in script_lower for term in ["auth", "brute", "enum"]): + return "medium" + if any(term in output_lower for term in ["anonymous", "default", "weak"]): + return "medium" + + # Everything else is informational + return "info" + + def _get_script_category(self, script_id: str) -> str: + """Determine category for script results""" + script_lower = script_id.lower() + + if "vuln" in script_lower: + return "vulnerability_detection" + elif "auth" in script_lower or "brute" in script_lower: + return "authentication_testing" + elif "enum" in script_lower: + return "information_gathering" + elif "ssl" in script_lower or "tls" in script_lower: + return "ssl_tls_testing" + else: + return "service_detection" + + def _get_port_recommendation(self, port: int, service: str, scripts: List[Dict]) -> str: + """Generate recommendation for open port""" + # Check for script-based issues + for script in scripts: + script_id = script.get("id", "") + if "vuln" in script_id.lower(): + return "Vulnerability detected by NSE scripts. Review and patch the service." + + # Port-specific recommendations + if port == 21: + return "FTP service detected. Consider using SFTP instead for secure file transfer." + elif port == 23: + return "Telnet service detected. Use SSH instead for secure remote access." + elif port == 135: + return "Windows RPC service exposed. Restrict access if not required." 
+ elif port in [139, 445]: + return "SMB/NetBIOS services detected. Ensure proper access controls and patch levels." + elif port == 1433: + return "SQL Server detected. Ensure strong authentication and network restrictions." + elif port == 3389: + return "RDP service detected. Use strong passwords and consider VPN access." + elif port in [80, 443]: + return "Web service detected. Ensure regular security updates and proper configuration." + else: + return f"Open port {port} detected. Verify if this service is required and properly secured." + + def _get_script_recommendation(self, script_id: str, output: str) -> str: + """Generate recommendation for script results""" + if "vuln" in script_id.lower(): + return "Vulnerability detected. Apply security patches and updates." + elif "auth" in script_id.lower(): + return "Authentication issue detected. Review and strengthen authentication mechanisms." + elif "ssl" in script_id.lower(): + return "SSL/TLS configuration issue. Update SSL configuration and certificates." + else: + return "Review the script output and address any security concerns identified." + + def _create_summary(self, findings: List[ModuleFinding], hosts_count: int) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + port_counts = {} + service_counts = {} + host_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by port + port = finding.metadata.get("port") + if port: + port_counts[port] = port_counts.get(port, 0) + 1 + + # Count by service + service = finding.metadata.get("service_name", "unknown") + service_counts[service] = service_counts.get(service, 0) + 1 + + # Count by host + host = finding.metadata.get("host", "unknown") + host_counts[host] = host_counts.get(host, 0) + 1 + + return { + "total_findings": len(findings), + "hosts_scanned": hosts_count, + "severity_counts": severity_counts, + "category_counts": category_counts, + "unique_hosts": len(host_counts), + "top_ports": dict(sorted(port_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "top_services": dict(sorted(service_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "host_counts": dict(sorted(host_counts.items(), key=lambda x: x[1], reverse=True)[:5]) + } \ No newline at end of file diff --git a/backend/toolbox/modules/penetration_testing/nuclei.py b/backend/toolbox/modules/penetration_testing/nuclei.py new file mode 100644 index 0000000..8114960 --- /dev/null +++ b/backend/toolbox/modules/penetration_testing/nuclei.py @@ -0,0 +1,501 @@ +""" +Nuclei Penetration Testing Module + +This module uses Nuclei to perform fast and customizable vulnerability scanning +using community-powered templates. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
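+
+# Example invocation (illustrative only; "targets.txt" is a hypothetical
+# input file): the command this module assembles resembles
+#
+#   nuclei -l targets.txt -json -severity critical,high,medium -c 25
+#
+# Note: recent Nuclei 3.x releases renamed the JSONL output flag from -json
+# to -jsonl; check `nuclei -h` for the installed version.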
+ + +import asyncio +import json +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class NucleiModule(BaseModule): + """Nuclei fast vulnerability scanner module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="nuclei", + version="3.1.0", + description="Fast and customizable vulnerability scanner using community-powered templates", + author="FuzzForge Team", + category="penetration_testing", + tags=["vulnerability", "scanner", "web", "network", "templates"], + input_schema={ + "type": "object", + "properties": { + "targets": { + "type": "array", + "items": {"type": "string"}, + "description": "List of targets (URLs, domains, IP addresses)" + }, + "target_file": { + "type": "string", + "description": "File containing targets to scan" + }, + "templates": { + "type": "array", + "items": {"type": "string"}, + "description": "Specific templates to use" + }, + "template_directory": { + "type": "string", + "description": "Directory containing custom templates" + }, + "tags": { + "type": "array", + "items": {"type": "string"}, + "description": "Template tags to include" + }, + "exclude_tags": { + "type": "array", + "items": {"type": "string"}, + "description": "Template tags to exclude" + }, + "severity": { + "type": "array", + "items": {"type": "string", "enum": ["critical", "high", "medium", "low", "info"]}, + "default": ["critical", "high", "medium"], + "description": "Severity levels to include" + }, + "concurrency": { + "type": "integer", + "default": 25, + "description": "Number of concurrent threads" + }, + "rate_limit": { + "type": "integer", + "default": 150, + "description": "Rate limit (requests per second)" + }, + "timeout": { + "type": "integer", + "default": 10, + "description": "Timeout for requests (seconds)" + }, + "retries": { + "type": "integer", + "default": 1, + "description": "Number of retries for failed requests" + }, + "update_templates": { + "type": "boolean", + "default": False, + "description": "Update templates before scanning" + }, + "disable_clustering": { + "type": "boolean", + "default": False, + "description": "Disable template clustering" + }, + "no_interactsh": { + "type": "boolean", + "default": True, + "description": "Disable interactsh server for OAST testing" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "template_id": {"type": "string"}, + "name": {"type": "string"}, + "severity": {"type": "string"}, + "host": {"type": "string"}, + "matched_at": {"type": "string"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + targets = config.get("targets", []) + target_file = config.get("target_file") + + if not targets and not target_file: + raise ValueError("Either 'targets' or 'target_file' must be specified") + + severity_levels = config.get("severity", []) + valid_severities = ["critical", "high", "medium", "low", "info"] + for severity in severity_levels: + if severity not in valid_severities: + raise ValueError(f"Invalid severity: {severity}. 
Valid: {valid_severities}")
+
+        return True
+
+    async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult:
+        """Execute Nuclei vulnerability scanning"""
+        self.start_timer()
+
+        try:
+            # Validate inputs
+            self.validate_config(config)
+            self.validate_workspace(workspace)
+
+            logger.info("Running Nuclei vulnerability scan")
+
+            # Update templates if requested
+            if config.get("update_templates", False):
+                await self._update_templates(workspace)
+
+            # Prepare target file
+            target_file = await self._prepare_targets(config, workspace)
+            if not target_file:
+                logger.info("No targets specified for scanning")
+                return self.create_result(
+                    findings=[],
+                    status="success",
+                    summary={"total_findings": 0, "targets_scanned": 0}
+                )
+
+            # Run Nuclei scan
+            findings = await self._run_nuclei_scan(target_file, config, workspace)
+
+            # Create summary
+            summary = self._create_summary(findings, len(config.get("targets", [])))
+
+            logger.info(f"Nuclei found {len(findings)} vulnerabilities")
+
+            return self.create_result(
+                findings=findings,
+                status="success",
+                summary=summary
+            )
+
+        except Exception as e:
+            logger.error(f"Nuclei module failed: {e}")
+            return self.create_result(
+                findings=[],
+                status="failed",
+                error=str(e)
+            )
+
+    async def _update_templates(self, workspace: Path):
+        """Update Nuclei templates"""
+        try:
+            logger.info("Updating Nuclei templates...")
+            cmd = ["nuclei", "-update-templates"]
+
+            process = await asyncio.create_subprocess_exec(
+                *cmd,
+                stdout=asyncio.subprocess.PIPE,
+                stderr=asyncio.subprocess.PIPE,
+                cwd=workspace
+            )
+
+            stdout, stderr = await process.communicate()
+
+            if process.returncode == 0:
+                logger.info("Templates updated successfully")
+            else:
+                logger.warning(f"Template update failed: {stderr.decode()}")
+
+        except Exception as e:
+            logger.warning(f"Error updating templates: {e}")
+
+    async def _prepare_targets(self, config: Dict[str, Any], workspace: Path) -> Path:
+        """Prepare target file for scanning"""
+        targets = config.get("targets", [])
+        target_file = config.get("target_file")
+
+        if target_file:
+            # Use existing target file
+            target_path = workspace / target_file
+            if target_path.exists():
+                return target_path
+            else:
+                raise FileNotFoundError(f"Target file not found: {target_file}")
+
+        if targets:
+            # Create temporary target file
+            target_path = workspace / "nuclei_targets.txt"
+            with open(target_path, 'w') as f:
+                for target in targets:
+                    f.write(f"{target}\n")
+            return target_path
+
+        return None
+
+    async def _run_nuclei_scan(self, target_file: Path, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]:
+        """Run Nuclei scan"""
+        findings = []
+
+        try:
+            # Build nuclei command
+            cmd = ["nuclei", "-l", str(target_file)]
+
+            # Add output format
+            cmd.extend(["-json"])
+
+            # Add templates
+            templates = config.get("templates", [])
+            if templates:
+                cmd.extend(["-t", ",".join(templates)])
+
+            # Add template directory
+            template_dir = config.get("template_directory")
+            if template_dir:
+                cmd.extend(["-t", template_dir])
+
+            # Add tags
+            tags = config.get("tags", [])
+            if tags:
+                cmd.extend(["-tags", ",".join(tags)])
+
+            # Add exclude tags
+            exclude_tags = config.get("exclude_tags", [])
+            if exclude_tags:
+                cmd.extend(["-exclude-tags", ",".join(exclude_tags)])
+
+            # Add severity
+            severity_levels = config.get("severity", ["critical", "high", "medium"])
+            cmd.extend(["-severity", ",".join(severity_levels)])
+
+            # Add concurrency
+            concurrency = config.get("concurrency", 25)
+            cmd.extend(["-c", str(concurrency)])
+
+            # Add rate limit
+            rate_limit = config.get("rate_limit", 150)
+            cmd.extend(["-rl", str(rate_limit)])
+
+            # Add timeout
+            timeout = config.get("timeout", 10)
+            cmd.extend(["-timeout", str(timeout)])
+
+            # Add retries
+            retries = config.get("retries", 1)
+            cmd.extend(["-retries", str(retries)])
+
+            # Add other flags
+            # Note: nuclei clusters similar templates automatically; no CLI
+            # flag is known here to disable it, so the option is only logged
+            # rather than mapped to an unrelated flag.
+            if config.get("disable_clustering", False):
+                logger.warning("disable_clustering is not mapped to a nuclei CLI flag; option ignored")
+
+            if config.get("no_interactsh", True):
+                cmd.append("-no-interactsh")
+
+            # Silent mode limits stdout to findings, keeping the JSON output parseable
+            cmd.append("-silent")
+
+            logger.debug(f"Running command: {' '.join(cmd)}")
+
+            # Run nuclei
+            process = await asyncio.create_subprocess_exec(
+                *cmd,
+                stdout=asyncio.subprocess.PIPE,
+                stderr=asyncio.subprocess.PIPE,
+                cwd=workspace
+            )
+
+            stdout, stderr = await process.communicate()
+
+            # Parse results
+            if process.returncode == 0 or stdout:
+                findings = self._parse_nuclei_output(stdout.decode(), workspace)
+            else:
+                error_msg = stderr.decode()
+                logger.error(f"Nuclei scan failed: {error_msg}")
+
+        except Exception as e:
+            logger.warning(f"Error running Nuclei scan: {e}")
+
+        return findings
+
+    def _parse_nuclei_output(self, output: str, workspace: Path) -> List[ModuleFinding]:
+        """Parse Nuclei JSON output into findings"""
+        findings = []
+
+        if not output.strip():
+            return findings
+
+        try:
+            # Parse each line as JSON (JSONL format)
+            for line in output.strip().split('\n'):
+                if not line.strip():
+                    continue
+
+                result = json.loads(line)
+
+                # Extract information
+                template_id = result.get("template-id", "")
+                template_name = result.get("info", {}).get("name", "")
+                severity = result.get("info", {}).get("severity", "medium")
+                host = result.get("host", "")
+                matched_at = result.get("matched-at", "")
+                description = result.get("info", {}).get("description", "")
+                reference = result.get("info", {}).get("reference", [])
+                classification = result.get("info", {}).get("classification", {})
+                extracted_results = result.get("extracted-results", [])
+
+                # Map severity to our standard levels
+                finding_severity = self._map_severity(severity)
+
+                # Get category based on template
+                category = self._get_category(template_id, template_name, classification)
+
+                # Create finding
+                finding = self.create_finding(
+                    title=f"Nuclei Detection: {template_name}",
+                    description=description or f"Vulnerability detected using template {template_id}",
+                    severity=finding_severity,
+                    category=category,
+                    file_path=None,  # Nuclei scans network targets
+                    recommendation=self._get_recommendation(template_id, template_name, reference),
+                    metadata={
+                        "template_id": template_id,
+                        "template_name": template_name,
+                        "nuclei_severity": severity,
+                        "host": host,
+                        "matched_at": matched_at,
+                        "classification": classification,
+                        "reference": reference,
+                        "extracted_results": extracted_results
+                    }
+                )
+                findings.append(finding)
+
+        except json.JSONDecodeError as e:
+            logger.warning(f"Failed to parse Nuclei output: {e}")
+        except Exception as e:
+            logger.warning(f"Error processing Nuclei results: {e}")
+
+        return findings
+
+    def _map_severity(self, nuclei_severity: str) -> str:
+        """Map Nuclei severity to our standard severity levels"""
+        severity_map = {
+            "critical": "critical",
+            "high": "high",
+            "medium": "medium",
+            "low": "low",
+            "info": "info"
+        }
+        return severity_map.get(nuclei_severity.lower(), "medium")
+
+    def _get_category(self, template_id: str, template_name: str, classification: Dict) -> str:
+        """Determine finding category based on template and classification"""
+        template_lower = f"{template_id} {template_name}".lower()
+
+ # Use classification if available + cwe_id = classification.get("cwe-id") + if cwe_id: + # Map common CWE IDs to categories + if cwe_id in ["CWE-79", "CWE-80"]: + return "cross_site_scripting" + elif cwe_id in ["CWE-89"]: + return "sql_injection" + elif cwe_id in ["CWE-22", "CWE-23"]: + return "path_traversal" + elif cwe_id in ["CWE-352"]: + return "csrf" + elif cwe_id in ["CWE-601"]: + return "redirect" + + # Analyze template content + if any(term in template_lower for term in ["xss", "cross-site"]): + return "cross_site_scripting" + elif any(term in template_lower for term in ["sql", "injection"]): + return "sql_injection" + elif any(term in template_lower for term in ["lfi", "rfi", "file", "path", "traversal"]): + return "file_inclusion" + elif any(term in template_lower for term in ["rce", "command", "execution"]): + return "remote_code_execution" + elif any(term in template_lower for term in ["auth", "login", "bypass"]): + return "authentication_bypass" + elif any(term in template_lower for term in ["disclosure", "exposure", "leak"]): + return "information_disclosure" + elif any(term in template_lower for term in ["config", "misconfiguration"]): + return "misconfiguration" + elif any(term in template_lower for term in ["cve-"]): + return "known_vulnerability" + else: + return "web_vulnerability" + + def _get_recommendation(self, template_id: str, template_name: str, references: List) -> str: + """Generate recommendation based on template""" + # Use references if available + if references: + ref_text = ", ".join(references[:3]) # Limit to first 3 references + return f"Review the vulnerability and apply appropriate fixes. References: {ref_text}" + + # Generate based on template type + template_lower = f"{template_id} {template_name}".lower() + + if "xss" in template_lower: + return "Implement proper input validation and output encoding to prevent XSS attacks." + elif "sql" in template_lower: + return "Use parameterized queries and input validation to prevent SQL injection." + elif "lfi" in template_lower or "rfi" in template_lower: + return "Validate and sanitize file paths. Avoid dynamic file includes with user input." + elif "rce" in template_lower: + return "Sanitize user input and avoid executing system commands with user-controlled data." + elif "auth" in template_lower: + return "Review authentication mechanisms and implement proper access controls." + elif "exposure" in template_lower or "disclosure" in template_lower: + return "Restrict access to sensitive information and implement proper authorization." + elif "cve-" in template_lower: + return "Update the affected software to the latest version to patch known vulnerabilities." + else: + return f"Review and remediate the security issue identified by template {template_id}." 
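+
+    # Illustrative shape of a single JSONL record consumed by
+    # _parse_nuclei_output above (field names mirror the keys read there;
+    # the values are invented):
+    #
+    #   {"template-id": "example-template", "host": "https://example.com",
+    #    "matched-at": "https://example.com/login",
+    #    "info": {"name": "Example Finding", "severity": "high",
+    #             "description": "...", "reference": [], "classification": {}}}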
+ + def _create_summary(self, findings: List[ModuleFinding], targets_count: int) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + category_counts = {} + template_counts = {} + host_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by template + template_id = finding.metadata.get("template_id", "unknown") + template_counts[template_id] = template_counts.get(template_id, 0) + 1 + + # Count by host + host = finding.metadata.get("host", "unknown") + host_counts[host] = host_counts.get(host, 0) + 1 + + return { + "total_findings": len(findings), + "targets_scanned": targets_count, + "severity_counts": severity_counts, + "category_counts": category_counts, + "top_templates": dict(sorted(template_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "affected_hosts": len(host_counts), + "host_counts": dict(sorted(host_counts.items(), key=lambda x: x[1], reverse=True)[:10]) + } \ No newline at end of file diff --git a/backend/toolbox/modules/penetration_testing/sqlmap.py b/backend/toolbox/modules/penetration_testing/sqlmap.py new file mode 100644 index 0000000..84d888b --- /dev/null +++ b/backend/toolbox/modules/penetration_testing/sqlmap.py @@ -0,0 +1,671 @@ +""" +SQLMap Penetration Testing Module + +This module uses SQLMap for automatic SQL injection detection and exploitation. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . 
import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class SQLMapModule(BaseModule): + """SQLMap automatic SQL injection detection and exploitation module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="sqlmap", + version="1.7.11", + description="Automatic SQL injection detection and exploitation tool", + author="FuzzForge Team", + category="penetration_testing", + tags=["sql-injection", "web", "database", "vulnerability", "exploitation"], + input_schema={ + "type": "object", + "properties": { + "target_url": { + "type": "string", + "description": "Target URL to test for SQL injection" + }, + "target_file": { + "type": "string", + "description": "File containing URLs to test" + }, + "request_file": { + "type": "string", + "description": "Load HTTP request from file (Burp log, etc.)" + }, + "data": { + "type": "string", + "description": "Data string to be sent through POST" + }, + "cookie": { + "type": "string", + "description": "HTTP Cookie header value" + }, + "user_agent": { + "type": "string", + "description": "HTTP User-Agent header value" + }, + "referer": { + "type": "string", + "description": "HTTP Referer header value" + }, + "headers": { + "type": "object", + "description": "Additional HTTP headers" + }, + "method": { + "type": "string", + "enum": ["GET", "POST", "PUT", "DELETE", "PATCH"], + "default": "GET", + "description": "HTTP method to use" + }, + "testable_parameters": { + "type": "array", + "items": {"type": "string"}, + "description": "Comma-separated list of testable parameter(s)" + }, + "skip_parameters": { + "type": "array", + "items": {"type": "string"}, + "description": "Parameters to skip during testing" + }, + "dbms": { + "type": "string", + "enum": ["mysql", "postgresql", "oracle", "mssql", "sqlite", "access", "firebird", "sybase", "db2", "hsqldb", "h2"], + "description": "Force back-end DBMS to provided value" + }, + "level": { + "type": "integer", + "enum": [1, 2, 3, 4, 5], + "default": 1, + "description": "Level of tests to perform (1-5)" + }, + "risk": { + "type": "integer", + "enum": [1, 2, 3], + "default": 1, + "description": "Risk of tests to perform (1-3)" + }, + "technique": { + "type": "array", + "items": {"type": "string", "enum": ["B", "E", "U", "S", "T", "Q"]}, + "description": "SQL injection techniques to use (B=Boolean, E=Error, U=Union, S=Stacked, T=Time, Q=Inline)" + }, + "time_sec": { + "type": "integer", + "default": 5, + "description": "Seconds to delay DBMS response for time-based blind SQL injection" + }, + "union_cols": { + "type": "string", + "description": "Range of columns to test for UNION query SQL injection" + }, + "threads": { + "type": "integer", + "default": 1, + "description": "Maximum number of concurrent HTTP requests" + }, + "timeout": { + "type": "integer", + "default": 30, + "description": "Seconds to wait before timeout connection" + }, + "retries": { + "type": "integer", + "default": 3, + "description": "Retries when connection timeouts" + }, + "randomize": { + "type": "boolean", + "default": True, + "description": "Randomly change value of given parameter(s)" + }, + "safe_url": { + "type": "string", + "description": "URL to visit frequently during testing" + }, + "safe_freq": { + "type": "integer", + "description": "Test requests between visits to safe URL" + }, + "crawl": { + "type": "integer", + "description": "Crawl website starting from target URL (depth)" + }, + "forms": { + "type": "boolean", + "default": False, + 
"description": "Parse and test forms on target URL" + }, + "batch": { + "type": "boolean", + "default": True, + "description": "Never ask for user input, use default behavior" + }, + "cleanup": { + "type": "boolean", + "default": True, + "description": "Clean up files used by SQLMap" + }, + "check_waf": { + "type": "boolean", + "default": False, + "description": "Check for existence of WAF/IPS protection" + }, + "tamper": { + "type": "array", + "items": {"type": "string"}, + "description": "Use tamper scripts to modify requests" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "url": {"type": "string"}, + "parameter": {"type": "string"}, + "technique": {"type": "string"}, + "dbms": {"type": "string"}, + "payload": {"type": "string"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + target_url = config.get("target_url") + target_file = config.get("target_file") + request_file = config.get("request_file") + + if not any([target_url, target_file, request_file]): + raise ValueError("Either 'target_url', 'target_file', or 'request_file' must be specified") + + level = config.get("level", 1) + if level not in [1, 2, 3, 4, 5]: + raise ValueError("Level must be between 1 and 5") + + risk = config.get("risk", 1) + if risk not in [1, 2, 3]: + raise ValueError("Risk must be between 1 and 3") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute SQLMap SQL injection testing""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info("Running SQLMap SQL injection scan") + + # Run SQLMap scan + findings = await self._run_sqlmap_scan(config, workspace) + + # Create summary + summary = self._create_summary(findings) + + logger.info(f"SQLMap found {len(findings)} SQL injection vulnerabilities") + + return self.create_result( + findings=findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"SQLMap module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + async def _run_sqlmap_scan(self, config: Dict[str, Any], workspace: Path) -> List[ModuleFinding]: + """Run SQLMap scan""" + findings = [] + + try: + # Build sqlmap command + cmd = ["sqlmap"] + + # Add target specification + target_url = config.get("target_url") + if target_url: + cmd.extend(["-u", target_url]) + + target_file = config.get("target_file") + if target_file: + target_path = workspace / target_file + if target_path.exists(): + cmd.extend(["-m", str(target_path)]) + else: + raise FileNotFoundError(f"Target file not found: {target_file}") + + request_file = config.get("request_file") + if request_file: + request_path = workspace / request_file + if request_path.exists(): + cmd.extend(["-r", str(request_path)]) + else: + raise FileNotFoundError(f"Request file not found: {request_file}") + + # Add HTTP options + data = config.get("data") + if data: + cmd.extend(["--data", data]) + + cookie = config.get("cookie") + if cookie: + cmd.extend(["--cookie", cookie]) + + user_agent = config.get("user_agent") + if user_agent: + cmd.extend(["--user-agent", user_agent]) + + referer = config.get("referer") + if referer: + cmd.extend(["--referer", referer]) + + headers = config.get("headers", {}) + for key, value in headers.items(): + cmd.extend(["--header", 
f"{key}: {value}"]) + + method = config.get("method") + if method and method != "GET": + cmd.extend(["--method", method]) + + # Add parameter options + testable_params = config.get("testable_parameters", []) + if testable_params: + cmd.extend(["-p", ",".join(testable_params)]) + + skip_params = config.get("skip_parameters", []) + if skip_params: + cmd.extend(["--skip", ",".join(skip_params)]) + + # Add injection options + dbms = config.get("dbms") + if dbms: + cmd.extend(["--dbms", dbms]) + + level = config.get("level", 1) + cmd.extend(["--level", str(level)]) + + risk = config.get("risk", 1) + cmd.extend(["--risk", str(risk)]) + + techniques = config.get("technique", []) + if techniques: + cmd.extend(["--technique", "".join(techniques)]) + + time_sec = config.get("time_sec", 5) + cmd.extend(["--time-sec", str(time_sec)]) + + union_cols = config.get("union_cols") + if union_cols: + cmd.extend(["--union-cols", union_cols]) + + # Add performance options + threads = config.get("threads", 1) + cmd.extend(["--threads", str(threads)]) + + timeout = config.get("timeout", 30) + cmd.extend(["--timeout", str(timeout)]) + + retries = config.get("retries", 3) + cmd.extend(["--retries", str(retries)]) + + # Add request options + if config.get("randomize", True): + cmd.append("--randomize") + + safe_url = config.get("safe_url") + if safe_url: + cmd.extend(["--safe-url", safe_url]) + + safe_freq = config.get("safe_freq") + if safe_freq: + cmd.extend(["--safe-freq", str(safe_freq)]) + + # Add crawling options + crawl_depth = config.get("crawl") + if crawl_depth: + cmd.extend(["--crawl", str(crawl_depth)]) + + if config.get("forms", False): + cmd.append("--forms") + + # Add behavioral options + if config.get("batch", True): + cmd.append("--batch") + + if config.get("cleanup", True): + cmd.append("--cleanup") + + if config.get("check_waf", False): + cmd.append("--check-waf") + + # Add tamper scripts + tamper_scripts = config.get("tamper", []) + if tamper_scripts: + cmd.extend(["--tamper", ",".join(tamper_scripts)]) + + # Set output directory + output_dir = workspace / "sqlmap_output" + output_dir.mkdir(exist_ok=True) + cmd.extend(["--output-dir", str(output_dir)]) + + # Add format for easier parsing + cmd.append("--flush-session") # Start fresh + cmd.append("--fresh-queries") # Ignore previous results + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run sqlmap + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace + ) + + stdout, stderr = await process.communicate() + + # Parse results from output directory + findings = self._parse_sqlmap_output(output_dir, stdout.decode(), workspace) + + # Log results + if findings: + logger.info(f"SQLMap detected {len(findings)} SQL injection vulnerabilities") + else: + logger.info("No SQL injection vulnerabilities found") + # Check for errors + stderr_text = stderr.decode() + if stderr_text: + logger.warning(f"SQLMap warnings/errors: {stderr_text}") + + except Exception as e: + logger.warning(f"Error running SQLMap scan: {e}") + + return findings + + def _parse_sqlmap_output(self, output_dir: Path, stdout: str, workspace: Path) -> List[ModuleFinding]: + """Parse SQLMap output into findings""" + findings = [] + + try: + # Look for session files in output directory + session_files = list(output_dir.glob("**/*.sqlite")) + log_files = list(output_dir.glob("**/*.log")) + + # Parse stdout for injection information + findings.extend(self._parse_stdout_output(stdout)) + + # Parse 
log files for additional details + for log_file in log_files: + findings.extend(self._parse_log_file(log_file)) + + # If we have session files, we can extract more detailed information + # For now, we'll rely on stdout parsing + + except Exception as e: + logger.warning(f"Error parsing SQLMap output: {e}") + + return findings + + def _parse_stdout_output(self, stdout: str) -> List[ModuleFinding]: + """Parse SQLMap stdout for SQL injection findings""" + findings = [] + + try: + lines = stdout.split('\n') + current_url = None + current_parameter = None + current_technique = None + current_dbms = None + injection_found = False + + for line in lines: + line = line.strip() + + # Extract URL being tested + if "testing URL" in line or "testing connection to the target URL" in line: + # Extract URL from line + if "'" in line: + url_start = line.find("'") + 1 + url_end = line.find("'", url_start) + if url_end > url_start: + current_url = line[url_start:url_end] + + # Extract parameter being tested + elif "testing parameter" in line or "testing" in line and "parameter" in line: + if "'" in line: + param_parts = line.split("'") + if len(param_parts) >= 2: + current_parameter = param_parts[1] + + # Detect SQL injection found + elif any(indicator in line.lower() for indicator in [ + "parameter appears to be vulnerable", + "injectable", + "parameter is vulnerable" + ]): + injection_found = True + + # Extract technique information + elif "Type:" in line: + current_technique = line.replace("Type:", "").strip() + + # Extract database information + elif "back-end DBMS:" in line.lower(): + current_dbms = line.split(":")[-1].strip() + + # Extract payload information + elif "Payload:" in line: + payload = line.replace("Payload:", "").strip() + + # Create finding if we have injection + if injection_found and current_url and current_parameter: + finding = self._create_sqlmap_finding( + current_url, current_parameter, current_technique, + current_dbms, payload + ) + if finding: + findings.append(finding) + + # Reset state + injection_found = False + current_technique = None + + except Exception as e: + logger.warning(f"Error parsing SQLMap stdout: {e}") + + return findings + + def _parse_log_file(self, log_file: Path) -> List[ModuleFinding]: + """Parse SQLMap log file for additional findings""" + findings = [] + + try: + with open(log_file, 'r') as f: + content = f.read() + + # Look for injection indicators in log + if "injectable" in content.lower() or "vulnerable" in content.lower(): + # Could parse more detailed information from log + # For now, we'll rely on stdout parsing + pass + + except Exception as e: + logger.warning(f"Error parsing log file {log_file}: {e}") + + return findings + + def _create_sqlmap_finding(self, url: str, parameter: str, technique: str, dbms: str, payload: str) -> ModuleFinding: + """Create a ModuleFinding for SQL injection""" + try: + # Map technique to readable description + technique_map = { + "boolean-based blind": "Boolean-based blind SQL injection", + "time-based blind": "Time-based blind SQL injection", + "error-based": "Error-based SQL injection", + "UNION query": "UNION-based SQL injection", + "stacked queries": "Stacked queries SQL injection", + "inline query": "Inline query SQL injection" + } + + technique_desc = technique_map.get(technique, technique or "SQL injection") + + # Create description + description = f"SQL injection vulnerability detected in parameter '{parameter}' using {technique_desc}" + if dbms: + description += f" against {dbms} database" + + # Determine 
severity based on technique + severity = self._get_injection_severity(technique, dbms) + + # Create finding + finding = self.create_finding( + title=f"SQL Injection: {parameter}", + description=description, + severity=severity, + category="sql_injection", + file_path=None, # Web application testing + recommendation=self._get_sqlinjection_recommendation(technique, dbms), + metadata={ + "url": url, + "parameter": parameter, + "technique": technique, + "dbms": dbms, + "payload": payload[:500] if payload else "", # Limit payload length + "injection_type": technique_desc + } + ) + + return finding + + except Exception as e: + logger.warning(f"Error creating SQLMap finding: {e}") + return None + + def _get_injection_severity(self, technique: str, dbms: str) -> str: + """Determine severity based on injection technique and database""" + if not technique: + return "high" # Any SQL injection is serious + + technique_lower = technique.lower() + + # Critical severity for techniques that allow easy data extraction + if any(term in technique_lower for term in ["union", "error-based"]): + return "critical" + + # High severity for techniques that allow some data extraction + elif any(term in technique_lower for term in ["boolean-based", "time-based"]): + return "high" + + # Stacked queries are very dangerous as they allow multiple statements + elif "stacked" in technique_lower: + return "critical" + + else: + return "high" + + def _get_sqlinjection_recommendation(self, technique: str, dbms: str) -> str: + """Generate recommendation for SQL injection""" + base_recommendation = "Implement parameterized queries/prepared statements and input validation to prevent SQL injection attacks." + + if technique: + technique_lower = technique.lower() + if "union" in technique_lower: + base_recommendation += " The UNION-based injection allows direct data extraction - immediate remediation required." + elif "error-based" in technique_lower: + base_recommendation += " Error-based injection reveals database structure - disable error messages in production." + elif "time-based" in technique_lower: + base_recommendation += " Time-based injection allows blind data extraction - implement query timeout limits." + elif "stacked" in technique_lower: + base_recommendation += " Stacked queries injection allows multiple SQL statements - extremely dangerous, fix immediately." + + if dbms: + dbms_lower = dbms.lower() + if "mysql" in dbms_lower: + base_recommendation += " For MySQL: disable LOAD_FILE and INTO OUTFILE if not needed." + elif "postgresql" in dbms_lower: + base_recommendation += " For PostgreSQL: review user privileges and disable unnecessary functions." + elif "mssql" in dbms_lower: + base_recommendation += " For SQL Server: disable xp_cmdshell and review extended stored procedures." 
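+
+        # Illustrative composed output (a concatenation of the strings above),
+        # e.g. for a UNION-based injection against MySQL:
+        #   "Implement parameterized queries/prepared statements and input
+        #   validation to prevent SQL injection attacks. The UNION-based
+        #   injection allows direct data extraction - immediate remediation
+        #   required. For MySQL: disable LOAD_FILE and INTO OUTFILE if not needed."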
+ + return base_recommendation + + def _create_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0, "info": 0} + technique_counts = {} + dbms_counts = {} + parameter_counts = {} + url_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by technique + technique = finding.metadata.get("technique", "unknown") + technique_counts[technique] = technique_counts.get(technique, 0) + 1 + + # Count by DBMS + dbms = finding.metadata.get("dbms", "unknown") + if dbms != "unknown": + dbms_counts[dbms] = dbms_counts.get(dbms, 0) + 1 + + # Count by parameter + parameter = finding.metadata.get("parameter", "unknown") + parameter_counts[parameter] = parameter_counts.get(parameter, 0) + 1 + + # Count by URL + url = finding.metadata.get("url", "unknown") + url_counts[url] = url_counts.get(url, 0) + 1 + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "technique_counts": technique_counts, + "dbms_counts": dbms_counts, + "vulnerable_parameters": list(parameter_counts.keys()), + "vulnerable_urls": len(url_counts), + "most_common_techniques": dict(sorted(technique_counts.items(), key=lambda x: x[1], reverse=True)[:5]), + "affected_databases": list(dbms_counts.keys()) + } \ No newline at end of file diff --git a/backend/toolbox/modules/reporter/__init__.py b/backend/toolbox/modules/reporter/__init__.py new file mode 100644 index 0000000..7812ff1 --- /dev/null +++ b/backend/toolbox/modules/reporter/__init__.py @@ -0,0 +1,14 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +from .sarif_reporter import SARIFReporter + +__all__ = ["SARIFReporter"] \ No newline at end of file diff --git a/backend/toolbox/modules/reporter/sarif_reporter.py b/backend/toolbox/modules/reporter/sarif_reporter.py new file mode 100644 index 0000000..e504462 --- /dev/null +++ b/backend/toolbox/modules/reporter/sarif_reporter.py @@ -0,0 +1,401 @@ +""" +SARIF Reporter Module - Generates SARIF-formatted security reports +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
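+
+# Usage sketch (hypothetical driver code; in practice a workflow invokes this
+# module, and the no-argument constructor is an assumption):
+#
+#   reporter = SARIFReporter()
+#   result = await reporter.execute({"findings": findings})
+#   sarif_document = result.sarif  # custom field attached by execute()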
+ +import logging +from pathlib import Path +from typing import Dict, Any, List +from datetime import datetime +import json + +try: + from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding +except ImportError: + try: + from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding + except ImportError: + from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding + +logger = logging.getLogger(__name__) + + +class SARIFReporter(BaseModule): + """ + Generates SARIF (Static Analysis Results Interchange Format) reports. + + This module: + - Converts findings to SARIF format + - Aggregates results from multiple modules + - Adds metadata and context + - Provides actionable recommendations + """ + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="sarif_reporter", + version="1.0.0", + description="Generates SARIF-formatted security reports", + author="FuzzForge Team", + category="reporter", + tags=["reporting", "sarif", "output"], + input_schema={ + "findings": { + "type": "array", + "description": "List of findings to report", + "required": True + }, + "tool_name": { + "type": "string", + "description": "Name of the tool", + "default": "FuzzForge Security Assessment" + }, + "tool_version": { + "type": "string", + "description": "Tool version", + "default": "1.0.0" + }, + "include_code_flows": { + "type": "boolean", + "description": "Include code flow information", + "default": False + } + }, + output_schema={ + "sarif": { + "type": "object", + "description": "SARIF 2.1.0 formatted report" + } + }, + requires_workspace=False # Reporter doesn't need direct workspace access + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate module configuration""" + if "findings" not in config and "modules_results" not in config: + raise ValueError("Either 'findings' or 'modules_results' must be provided") + return True + + async def execute(self, config: Dict[str, Any], workspace: Path = None) -> ModuleResult: + """ + Execute the SARIF reporter module. 
+ + Args: + config: Module configuration with findings + workspace: Optional workspace path for context + + Returns: + ModuleResult with SARIF report + """ + self.start_timer() + self.validate_config(config) + + # Get configuration + tool_name = config.get("tool_name", "FuzzForge Security Assessment") + tool_version = config.get("tool_version", "1.0.0") + include_code_flows = config.get("include_code_flows", False) + + # Collect findings from either direct findings or module results + all_findings = [] + + if "findings" in config: + # Direct findings provided + all_findings = config["findings"] + if isinstance(all_findings, list) and all(isinstance(f, dict) for f in all_findings): + # Convert dict findings to ModuleFinding objects + all_findings = [ModuleFinding(**f) if isinstance(f, dict) else f for f in all_findings] + elif "modules_results" in config: + # Aggregate from module results + for module_result in config["modules_results"]: + if isinstance(module_result, dict): + findings = module_result.get("findings", []) + all_findings.extend(findings) + elif hasattr(module_result, "findings"): + all_findings.extend(module_result.findings) + + logger.info(f"Generating SARIF report for {len(all_findings)} findings") + + try: + # Generate SARIF report + sarif_report = self._generate_sarif( + findings=all_findings, + tool_name=tool_name, + tool_version=tool_version, + include_code_flows=include_code_flows, + workspace_path=str(workspace) if workspace else None + ) + + # Create summary + summary = self._generate_report_summary(all_findings) + + return ModuleResult( + module=self.get_metadata().name, + version=self.get_metadata().version, + status="success", + execution_time=self.get_execution_time(), + findings=[], # Reporter doesn't generate new findings + summary=summary, + metadata={ + "tool_name": tool_name, + "tool_version": tool_version, + "report_format": "SARIF 2.1.0", + "total_findings": len(all_findings) + }, + error=None, + sarif=sarif_report # Add SARIF as custom field + ) + + except Exception as e: + logger.error(f"SARIF reporter failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + def _generate_sarif( + self, + findings: List[ModuleFinding], + tool_name: str, + tool_version: str, + include_code_flows: bool, + workspace_path: str = None + ) -> Dict[str, Any]: + """ + Generate SARIF 2.1.0 formatted report. 
+ + Args: + findings: List of findings to report + tool_name: Name of the tool + tool_version: Tool version + include_code_flows: Whether to include code flow information + workspace_path: Optional workspace path + + Returns: + SARIF formatted dictionary + """ + # Create rules from unique finding types + rules = self._create_rules(findings) + + # Create results from findings + results = self._create_results(findings, include_code_flows) + + # Build SARIF structure + sarif = { + "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json", + "version": "2.1.0", + "runs": [ + { + "tool": { + "driver": { + "name": tool_name, + "version": tool_version, + "informationUri": "https://fuzzforge.io", + "rules": rules + } + }, + "results": results, + "invocations": [ + { + "executionSuccessful": True, + "endTimeUtc": datetime.utcnow().isoformat() + "Z" + } + ] + } + ] + } + + # Add workspace information if available + if workspace_path: + sarif["runs"][0]["originalUriBaseIds"] = { + "WORKSPACE": { + "uri": f"file://{workspace_path}/", + "description": "The workspace root directory" + } + } + + return sarif + + def _create_rules(self, findings: List[ModuleFinding]) -> List[Dict[str, Any]]: + """ + Create SARIF rules from findings. + + Args: + findings: List of findings + + Returns: + List of SARIF rule objects + """ + rules_dict = {} + + for finding in findings: + rule_id = f"{finding.category}_{finding.severity}" + + if rule_id not in rules_dict: + rules_dict[rule_id] = { + "id": rule_id, + "name": finding.category.replace("_", " ").title(), + "shortDescription": { + "text": f"{finding.category} vulnerability" + }, + "fullDescription": { + "text": f"Detection rule for {finding.category} vulnerabilities with {finding.severity} severity" + }, + "defaultConfiguration": { + "level": self._severity_to_sarif_level(finding.severity) + }, + "properties": { + "category": finding.category, + "severity": finding.severity, + "tags": ["security", finding.category, finding.severity] + } + } + + return list(rules_dict.values()) + + def _create_results( + self, findings: List[ModuleFinding], include_code_flows: bool + ) -> List[Dict[str, Any]]: + """ + Create SARIF results from findings. 
+ + Args: + findings: List of findings + include_code_flows: Whether to include code flows + + Returns: + List of SARIF result objects + """ + results = [] + + for finding in findings: + result = { + "ruleId": f"{finding.category}_{finding.severity}", + "level": self._severity_to_sarif_level(finding.severity), + "message": { + "text": finding.description + }, + "locations": [] + } + + # Add location information if available + if finding.file_path: + location = { + "physicalLocation": { + "artifactLocation": { + "uri": finding.file_path, + "uriBaseId": "WORKSPACE" + } + } + } + + # Add line information if available + if finding.line_start: + location["physicalLocation"]["region"] = { + "startLine": finding.line_start + } + if finding.line_end: + location["physicalLocation"]["region"]["endLine"] = finding.line_end + + # Add code snippet if available + if finding.code_snippet: + location["physicalLocation"]["region"]["snippet"] = { + "text": finding.code_snippet + } + + result["locations"].append(location) + + # Add fix suggestions if available + if finding.recommendation: + result["fixes"] = [ + { + "description": { + "text": finding.recommendation + } + } + ] + + # Add properties + result["properties"] = { + "findingId": finding.id, + "title": finding.title, + "metadata": finding.metadata + } + + results.append(result) + + return results + + def _severity_to_sarif_level(self, severity: str) -> str: + """ + Convert severity to SARIF level. + + Args: + severity: Finding severity + + Returns: + SARIF level string + """ + mapping = { + "critical": "error", + "high": "error", + "medium": "warning", + "low": "note", + "info": "none" + } + return mapping.get(severity.lower(), "warning") + + def _generate_report_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """ + Generate summary statistics for the report. + + Args: + findings: List of findings + + Returns: + Summary dictionary + """ + severity_counts = { + "critical": 0, + "high": 0, + "medium": 0, + "low": 0, + "info": 0 + } + + category_counts = {} + affected_files = set() + + for finding in findings: + # Count by severity + if finding.severity in severity_counts: + severity_counts[finding.severity] += 1 + + # Count by category + if finding.category not in category_counts: + category_counts[finding.category] = 0 + category_counts[finding.category] += 1 + + # Track affected files + if finding.file_path: + affected_files.add(finding.file_path) + + return { + "total_findings": len(findings), + "severity_distribution": severity_counts, + "category_distribution": category_counts, + "affected_files": len(affected_files), + "report_format": "SARIF 2.1.0", + "generated_at": datetime.utcnow().isoformat() + } \ No newline at end of file diff --git a/backend/toolbox/modules/scanner/__init__.py b/backend/toolbox/modules/scanner/__init__.py new file mode 100644 index 0000000..ae02119 --- /dev/null +++ b/backend/toolbox/modules/scanner/__init__.py @@ -0,0 +1,14 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
+ +from .file_scanner import FileScanner + +__all__ = ["FileScanner"] \ No newline at end of file diff --git a/backend/toolbox/modules/scanner/file_scanner.py b/backend/toolbox/modules/scanner/file_scanner.py new file mode 100644 index 0000000..908ab7e --- /dev/null +++ b/backend/toolbox/modules/scanner/file_scanner.py @@ -0,0 +1,315 @@ +""" +File Scanner Module - Scans and enumerates files in the workspace +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +import logging +import mimetypes +from pathlib import Path +from typing import Dict, Any, List +import hashlib + +try: + from toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding +except ImportError: + try: + from modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding + except ImportError: + from src.toolbox.modules.base import BaseModule, ModuleMetadata, ModuleResult, ModuleFinding + +logger = logging.getLogger(__name__) + + +class FileScanner(BaseModule): + """ + Scans files in the mounted workspace and collects information. + + This module: + - Enumerates files based on patterns + - Detects file types + - Calculates file hashes + - Identifies potentially sensitive files + """ + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="file_scanner", + version="1.0.0", + description="Scans and enumerates files in the workspace", + author="FuzzForge Team", + category="scanner", + tags=["files", "enumeration", "discovery"], + input_schema={ + "patterns": { + "type": "array", + "items": {"type": "string"}, + "description": "File patterns to scan (e.g., ['*.py', '*.js'])", + "default": ["*"] + }, + "max_file_size": { + "type": "integer", + "description": "Maximum file size to scan in bytes", + "default": 10485760 # 10MB + }, + "check_sensitive": { + "type": "boolean", + "description": "Check for sensitive file patterns", + "default": True + }, + "calculate_hashes": { + "type": "boolean", + "description": "Calculate SHA256 hashes for files", + "default": False + } + }, + output_schema={ + "findings": { + "type": "array", + "description": "List of discovered files with metadata" + } + }, + requires_workspace=True + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate module configuration""" + patterns = config.get("patterns", ["*"]) + if not isinstance(patterns, list): + raise ValueError("patterns must be a list") + + max_size = config.get("max_file_size", 10485760) + if not isinstance(max_size, int) or max_size <= 0: + raise ValueError("max_file_size must be a positive integer") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """ + Execute the file scanning module. 
+ + Args: + config: Module configuration + workspace: Path to the workspace directory + + Returns: + ModuleResult with file findings + """ + self.start_timer() + self.validate_workspace(workspace) + self.validate_config(config) + + findings = [] + file_count = 0 + total_size = 0 + file_types = {} + + # Get configuration + patterns = config.get("patterns", ["*"]) + max_file_size = config.get("max_file_size", 10485760) + check_sensitive = config.get("check_sensitive", True) + calculate_hashes = config.get("calculate_hashes", False) + + logger.info(f"Scanning workspace with patterns: {patterns}") + + try: + # Scan for each pattern + for pattern in patterns: + for file_path in workspace.rglob(pattern): + if not file_path.is_file(): + continue + + file_count += 1 + relative_path = file_path.relative_to(workspace) + + # Get file stats + try: + stats = file_path.stat() + file_size = stats.st_size + total_size += file_size + + # Skip large files + if file_size > max_file_size: + logger.warning(f"Skipping large file: {relative_path} ({file_size} bytes)") + continue + + # Detect file type + file_type = self._detect_file_type(file_path) + if file_type not in file_types: + file_types[file_type] = 0 + file_types[file_type] += 1 + + # Check for sensitive files + if check_sensitive and self._is_sensitive_file(file_path): + findings.append(self.create_finding( + title=f"Potentially sensitive file: {relative_path.name}", + description=f"Found potentially sensitive file at {relative_path}", + severity="medium", + category="sensitive_file", + file_path=str(relative_path), + metadata={ + "file_size": file_size, + "file_type": file_type + } + )) + + # Calculate hash if requested + file_hash = None + if calculate_hashes and file_size < 1048576: # Only hash files < 1MB + file_hash = self._calculate_hash(file_path) + + # Create informational finding for each file + findings.append(self.create_finding( + title=f"File discovered: {relative_path.name}", + description=f"File: {relative_path}", + severity="info", + category="file_enumeration", + file_path=str(relative_path), + metadata={ + "file_size": file_size, + "file_type": file_type, + "file_hash": file_hash + } + )) + + except Exception as e: + logger.error(f"Error processing file {relative_path}: {e}") + + # Create summary + summary = { + "total_files": file_count, + "total_size_bytes": total_size, + "file_types": file_types, + "patterns_scanned": patterns + } + + return self.create_result( + findings=findings, + status="success", + summary=summary, + metadata={ + "workspace": str(workspace), + "config": config + } + ) + + except Exception as e: + logger.error(f"File scanner failed: {e}") + return self.create_result( + findings=findings, + status="failed", + error=str(e) + ) + + def _detect_file_type(self, file_path: Path) -> str: + """ + Detect the type of a file. 
+ + Args: + file_path: Path to the file + + Returns: + File type string + """ + # Try to determine from extension + mime_type, _ = mimetypes.guess_type(str(file_path)) + if mime_type: + return mime_type + + # Check by extension + ext = file_path.suffix.lower() + type_map = { + '.py': 'text/x-python', + '.js': 'application/javascript', + '.java': 'text/x-java', + '.cpp': 'text/x-c++', + '.c': 'text/x-c', + '.go': 'text/x-go', + '.rs': 'text/x-rust', + '.rb': 'text/x-ruby', + '.php': 'text/x-php', + '.yaml': 'text/yaml', + '.yml': 'text/yaml', + '.json': 'application/json', + '.xml': 'text/xml', + '.md': 'text/markdown', + '.txt': 'text/plain', + '.sh': 'text/x-shellscript', + '.bat': 'text/x-batch', + '.ps1': 'text/x-powershell' + } + + return type_map.get(ext, 'application/octet-stream') + + def _is_sensitive_file(self, file_path: Path) -> bool: + """ + Check if a file might contain sensitive information. + + Args: + file_path: Path to the file + + Returns: + True if potentially sensitive + """ + sensitive_patterns = [ + '.env', + '.env.local', + '.env.production', + 'credentials', + 'password', + 'secret', + 'private_key', + 'id_rsa', + 'id_dsa', + '.pem', + '.key', + '.pfx', + '.p12', + 'wallet', + '.ssh', + 'token', + 'api_key', + 'config.json', + 'settings.json', + '.git-credentials', + '.npmrc', + '.pypirc', + '.docker/config.json' + ] + + file_name_lower = file_path.name.lower() + for pattern in sensitive_patterns: + if pattern in file_name_lower: + return True + + return False + + def _calculate_hash(self, file_path: Path) -> str: + """ + Calculate SHA256 hash of a file. + + Args: + file_path: Path to the file + + Returns: + Hex string of SHA256 hash + """ + try: + sha256_hash = hashlib.sha256() + with open(file_path, "rb") as f: + for byte_block in iter(lambda: f.read(4096), b""): + sha256_hash.update(byte_block) + return sha256_hash.hexdigest() + except Exception as e: + logger.error(f"Failed to calculate hash for {file_path}: {e}") + return None \ No newline at end of file diff --git a/backend/toolbox/modules/static_analysis/__init__.py b/backend/toolbox/modules/static_analysis/__init__.py new file mode 100644 index 0000000..274a1a2 --- /dev/null +++ b/backend/toolbox/modules/static_analysis/__init__.py @@ -0,0 +1,38 @@ +""" +Static Analysis Security Testing (SAST) Modules + +This package contains modules for static code analysis and security testing. + +Available modules: +- CodeQL: GitHub's semantic code analysis engine +- SonarQube: Code quality and security analysis platform +- Snyk: Vulnerability scanning for dependencies and code +- OpenGrep: Open-source pattern-based static analysis tool +- Bandit: Python-specific security issue identifier +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
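+
+# Registration sketch: concrete modules decorate themselves with the
+# register_module decorator defined below (see bandit.py in this package for
+# a real example). The class name here is illustrative:
+#
+#   @register_module
+#   class MyAnalyzerModule(BaseModule):
+#       ...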
+ + +from typing import List, Type +from ..base import BaseModule + +# Module registry for automatic discovery +STATIC_ANALYSIS_MODULES: List[Type[BaseModule]] = [] + +def register_module(module_class: Type[BaseModule]): + """Register a static analysis module""" + STATIC_ANALYSIS_MODULES.append(module_class) + return module_class + +def get_available_modules() -> List[Type[BaseModule]]: + """Get all available static analysis modules""" + return STATIC_ANALYSIS_MODULES.copy() \ No newline at end of file diff --git a/backend/toolbox/modules/static_analysis/bandit.py b/backend/toolbox/modules/static_analysis/bandit.py new file mode 100644 index 0000000..2ceff4f --- /dev/null +++ b/backend/toolbox/modules/static_analysis/bandit.py @@ -0,0 +1,418 @@ +""" +Bandit Static Analysis Module + +This module uses Bandit to detect security vulnerabilities in Python code. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class BanditModule(BaseModule): + """Bandit Python security analysis module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="bandit", + version="1.7.5", + description="Python-specific security issue identifier using Bandit", + author="FuzzForge Team", + category="static_analysis", + tags=["python", "sast", "security", "vulnerabilities"], + input_schema={ + "type": "object", + "properties": { + "confidence": { + "type": "string", + "enum": ["LOW", "MEDIUM", "HIGH"], + "default": "LOW", + "description": "Minimum confidence level for reported issues" + }, + "severity": { + "type": "string", + "enum": ["LOW", "MEDIUM", "HIGH"], + "default": "LOW", + "description": "Minimum severity level for reported issues" + }, + "tests": { + "type": "array", + "items": {"type": "string"}, + "description": "Specific test IDs to run" + }, + "skips": { + "type": "array", + "items": {"type": "string"}, + "description": "Test IDs to skip" + }, + "exclude_dirs": { + "type": "array", + "items": {"type": "string"}, + "default": ["tests", "test", ".git", "__pycache__"], + "description": "Directories to exclude from analysis" + }, + "include_patterns": { + "type": "array", + "items": {"type": "string"}, + "default": ["*.py"], + "description": "File patterns to include" + }, + "aggregate": { + "type": "string", + "enum": ["file", "vuln"], + "default": "file", + "description": "How to aggregate results" + }, + "context_lines": { + "type": "integer", + "default": 3, + "description": "Number of context lines to show" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "test_id": {"type": "string"}, + "test_name": {"type": "string"}, + "confidence": {"type": "string"}, + "severity": {"type": "string"}, + "file_path": {"type": "string"}, + 
"line_number": {"type": "integer"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + confidence = config.get("confidence", "LOW") + # Handle both string and list formats + if isinstance(confidence, list): + confidence = confidence[0] if confidence else "MEDIUM" + if confidence not in ["LOW", "MEDIUM", "HIGH"]: + raise ValueError("confidence must be LOW, MEDIUM, or HIGH") + + severity = config.get("severity", "LOW") + # Handle both string and list formats + if isinstance(severity, list): + severity = severity[0] if severity else "MEDIUM" + if severity not in ["LOW", "MEDIUM", "HIGH"]: + raise ValueError("severity must be LOW, MEDIUM, or HIGH") + + context_lines = config.get("context_lines", 3) + if not isinstance(context_lines, int) or context_lines < 0 or context_lines > 10: + raise ValueError("context_lines must be between 0 and 10") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute Bandit security analysis""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + self.validate_workspace(workspace) + + logger.info(f"Running Bandit analysis on {workspace}") + + # Check if there are any Python files + python_files = list(workspace.rglob("*.py")) + if not python_files: + logger.info("No Python files found for Bandit analysis") + return self.create_result( + findings=[], + status="success", + summary={"total_findings": 0, "files_scanned": 0} + ) + + # Build bandit command + cmd = ["bandit", "-f", "json"] + + # Add confidence level + confidence = config.get("confidence", "LOW") + # Handle both string and list formats + if isinstance(confidence, list): + confidence = confidence[0] if confidence else "MEDIUM" + cmd.extend(["--confidence-level", self._get_confidence_levels(confidence)]) + + # Add severity level + severity = config.get("severity", "LOW") + # Handle both string and list formats + if isinstance(severity, list): + severity = severity[0] if severity else "MEDIUM" + cmd.extend(["--severity-level", self._get_severity_levels(severity)]) + + # Add tests to run + if config.get("tests"): + cmd.extend(["-t", ",".join(config["tests"])]) + + # Add tests to skip + if config.get("skips"): + cmd.extend(["-s", ",".join(config["skips"])]) + + # Add exclude directories + exclude_dirs = config.get("exclude_dirs", ["tests", "test", ".git", "__pycache__"]) + if exclude_dirs: + cmd.extend(["-x", ",".join(exclude_dirs)]) + + # Add aggregate mode + aggregate = config.get("aggregate", "file") + cmd.extend(["-a", aggregate]) + + # Add context lines + context_lines = config.get("context_lines", 3) + cmd.extend(["-n", str(context_lines)]) + + # Add recursive flag and target + cmd.extend(["-r", str(workspace)]) + + logger.debug(f"Running command: {' '.join(cmd)}") + + # Run Bandit + process = await asyncio.create_subprocess_exec( + *cmd, + stdout=asyncio.subprocess.PIPE, + stderr=asyncio.subprocess.PIPE, + cwd=workspace + ) + + stdout, stderr = await process.communicate() + + # Parse results + findings = [] + if process.returncode in [0, 1]: # 0 = no issues, 1 = issues found + findings = self._parse_bandit_output(stdout.decode(), workspace) + else: + error_msg = stderr.decode() + logger.error(f"Bandit failed: {error_msg}") + return self.create_result( + findings=[], + status="failed", + error=f"Bandit execution failed: {error_msg}" + ) + + # Create summary + summary = self._create_summary(findings, len(python_files)) + + logger.info(f"Bandit found 
{len(findings)} security issues") + + return self.create_result( + findings=findings, + status="success", + summary=summary + ) + + except Exception as e: + logger.error(f"Bandit module failed: {e}") + return self.create_result( + findings=[], + status="failed", + error=str(e) + ) + + def _get_confidence_levels(self, min_confidence: str) -> str: + """Get minimum confidence level for Bandit""" + return min_confidence.lower() + + def _get_severity_levels(self, min_severity: str) -> str: + """Get minimum severity level for Bandit""" + return min_severity.lower() + + def _parse_bandit_output(self, output: str, workspace: Path) -> List[ModuleFinding]: + """Parse Bandit JSON output into findings""" + findings = [] + + if not output.strip(): + return findings + + try: + data = json.loads(output) + results = data.get("results", []) + + for result in results: + # Extract information + test_id = result.get("test_id", "unknown") + test_name = result.get("test_name", "") + issue_confidence = result.get("issue_confidence", "MEDIUM") + issue_severity = result.get("issue_severity", "MEDIUM") + issue_text = result.get("issue_text", "") + + # File location + filename = result.get("filename", "") + line_number = result.get("line_number", 0) + line_range = result.get("line_range", []) + + # Code context + code = result.get("code", "") + + # Make file path relative to workspace + if filename: + try: + rel_path = Path(filename).relative_to(workspace) + filename = str(rel_path) + except ValueError: + pass + + # Map Bandit severity to our levels + finding_severity = self._map_severity(issue_severity) + + # Determine category based on test_id + category = self._get_category(test_id, test_name) + + # Create finding + finding = self.create_finding( + title=f"Python security issue: {test_name}", + description=issue_text or f"Bandit test {test_id} detected a security issue", + severity=finding_severity, + category=category, + file_path=filename if filename else None, + line_start=line_number if line_number > 0 else None, + line_end=line_range[-1] if line_range and len(line_range) > 1 else None, + code_snippet=code.strip() if code else None, + recommendation=self._get_recommendation(test_id, test_name), + metadata={ + "test_id": test_id, + "test_name": test_name, + "bandit_confidence": issue_confidence, + "bandit_severity": issue_severity, + "line_range": line_range, + "more_info": result.get("more_info", "") + } + ) + + findings.append(finding) + + except json.JSONDecodeError as e: + logger.warning(f"Failed to parse Bandit output: {e}") + except Exception as e: + logger.warning(f"Error processing Bandit results: {e}") + + return findings + + def _map_severity(self, bandit_severity: str) -> str: + """Map Bandit severity to our standard severity levels""" + severity_map = { + "HIGH": "high", + "MEDIUM": "medium", + "LOW": "low" + } + return severity_map.get(bandit_severity.upper(), "medium") + + def _get_category(self, test_id: str, test_name: str) -> str: + """Determine finding category based on Bandit test""" + # Map common Bandit test categories + if "sql" in test_id.lower() or "injection" in test_name.lower(): + return "injection" + elif "crypto" in test_id.lower() or "hash" in test_name.lower(): + return "cryptography" + elif "shell" in test_id.lower() or "subprocess" in test_name.lower(): + return "command_injection" + elif "hardcode" in test_id.lower() or "password" in test_name.lower(): + return "hardcoded_secrets" + elif "pickle" in test_id.lower() or "deserial" in test_name.lower(): + return "deserialization" 
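+        # Matching is heuristic: it keys off substrings of the Bandit test
+        # ID/name, the first branch that matches wins, and anything
+        # unmatched falls through to the generic "python_security" bucket.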
+ elif "request" in test_id.lower() or "http" in test_name.lower(): + return "web_security" + elif "random" in test_id.lower(): + return "weak_randomness" + elif "path" in test_id.lower() or "traversal" in test_name.lower(): + return "path_traversal" + else: + return "python_security" + + def _get_recommendation(self, test_id: str, test_name: str) -> str: + """Generate recommendation based on Bandit test""" + recommendations = { + # SQL Injection + "B608": "Use parameterized queries instead of string formatting for SQL queries.", + "B703": "Use parameterized queries with Django ORM or raw SQL.", + + # Cryptography + "B101": "Remove hardcoded passwords and use secure configuration management.", + "B105": "Remove hardcoded passwords and use environment variables or secret management.", + "B106": "Remove hardcoded passwords from function arguments.", + "B107": "Remove hardcoded passwords from default function arguments.", + "B303": "Use cryptographically secure hash functions like SHA-256 or better.", + "B324": "Use strong cryptographic algorithms instead of deprecated ones.", + "B413": "Use secure encryption algorithms and proper key management.", + + # Command Injection + "B602": "Validate and sanitize input before using in subprocess calls.", + "B603": "Avoid using subprocess with shell=True. Use array form instead.", + "B605": "Avoid starting processes with shell=True.", + + # Deserialization + "B301": "Avoid using pickle for untrusted data. Use JSON or safer alternatives.", + "B302": "Avoid using marshal for untrusted data.", + "B506": "Use safe YAML loading methods like yaml.safe_load().", + + # Web Security + "B501": "Validate SSL certificates in requests to prevent MITM attacks.", + "B401": "Import and use telnetlib carefully, prefer SSH for remote connections.", + + # Random + "B311": "Use cryptographically secure random generators like secrets module.", + + # Path Traversal + "B108": "Validate file paths to prevent directory traversal attacks." + } + + return recommendations.get(test_id, + f"Review the {test_name} security issue and apply appropriate security measures.") + + def _create_summary(self, findings: List[ModuleFinding], total_files: int) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"high": 0, "medium": 0, "low": 0} + category_counts = {} + test_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by test + test_id = finding.metadata.get("test_id", "unknown") + test_counts[test_id] = test_counts.get(test_id, 0) + 1 + + return { + "total_findings": len(findings), + "files_scanned": total_files, + "severity_counts": severity_counts, + "category_counts": category_counts, + "top_tests": dict(sorted(test_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "files_with_issues": len(set(f.file_path for f in findings if f.file_path)) + } \ No newline at end of file diff --git a/backend/toolbox/modules/static_analysis/opengrep.py b/backend/toolbox/modules/static_analysis/opengrep.py new file mode 100644 index 0000000..03353b3 --- /dev/null +++ b/backend/toolbox/modules/static_analysis/opengrep.py @@ -0,0 +1,396 @@ +""" +OpenGrep Static Analysis Module + +This module uses OpenGrep (open-source version of Semgrep) for pattern-based +static analysis across multiple programming languages. 
+""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import json +import tempfile +from pathlib import Path +from typing import Dict, Any, List +import subprocess +import logging + +from ..base import BaseModule, ModuleMetadata, ModuleFinding, ModuleResult +from . import register_module + +logger = logging.getLogger(__name__) + + +@register_module +class OpenGrepModule(BaseModule): + """OpenGrep static analysis module""" + + def get_metadata(self) -> ModuleMetadata: + """Get module metadata""" + return ModuleMetadata( + name="opengrep", + version="1.45.0", + description="Open-source pattern-based static analysis tool for security vulnerabilities", + author="FuzzForge Team", + category="static_analysis", + tags=["sast", "pattern-matching", "multi-language", "security"], + input_schema={ + "type": "object", + "properties": { + "config": { + "type": "string", + "enum": ["auto", "p/security-audit", "p/owasp-top-ten", "p/cwe-top-25"], + "default": "auto", + "description": "Rule configuration to use" + }, + "languages": { + "type": "array", + "items": {"type": "string"}, + "description": "Specific languages to analyze" + }, + "include_patterns": { + "type": "array", + "items": {"type": "string"}, + "description": "File patterns to include" + }, + "exclude_patterns": { + "type": "array", + "items": {"type": "string"}, + "description": "File patterns to exclude" + }, + "max_target_bytes": { + "type": "integer", + "default": 1000000, + "description": "Maximum file size to analyze (bytes)" + }, + "timeout": { + "type": "integer", + "default": 300, + "description": "Analysis timeout in seconds" + }, + "severity": { + "type": "array", + "items": {"type": "string", "enum": ["ERROR", "WARNING", "INFO"]}, + "default": ["ERROR", "WARNING", "INFO"], + "description": "Minimum severity levels to report" + }, + "confidence": { + "type": "array", + "items": {"type": "string", "enum": ["HIGH", "MEDIUM", "LOW"]}, + "default": ["HIGH", "MEDIUM", "LOW"], + "description": "Minimum confidence levels to report" + } + } + }, + output_schema={ + "type": "object", + "properties": { + "findings": { + "type": "array", + "items": { + "type": "object", + "properties": { + "rule_id": {"type": "string"}, + "severity": {"type": "string"}, + "confidence": {"type": "string"}, + "file_path": {"type": "string"}, + "line_number": {"type": "integer"} + } + } + } + } + } + ) + + def validate_config(self, config: Dict[str, Any]) -> bool: + """Validate configuration""" + timeout = config.get("timeout", 300) + if not isinstance(timeout, int) or timeout < 30 or timeout > 3600: + raise ValueError("Timeout must be between 30 and 3600 seconds") + + max_bytes = config.get("max_target_bytes", 1000000) + if not isinstance(max_bytes, int) or max_bytes < 1000 or max_bytes > 10000000: + raise ValueError("max_target_bytes must be between 1000 and 10000000") + + return True + + async def execute(self, config: Dict[str, Any], workspace: Path) -> ModuleResult: + """Execute OpenGrep static analysis""" + self.start_timer() + + try: + # Validate inputs + self.validate_config(config) + 
+            self.validate_workspace(workspace)
+
+            logger.info(f"Running OpenGrep analysis on {workspace}")
+
+            # Build the scan command (OpenGrep is driven through its semgrep-compatible CLI)
+            cmd = ["semgrep", "--json"]
+
+            # Add rule configuration ("auto" or a registry pack such as p/security-audit)
+            cmd.extend(["--config", config.get("config", "auto")])
+
+            # Add timeout
+            cmd.extend(["--timeout", str(config.get("timeout", 300))])
+
+            # Add max target bytes
+            cmd.extend(["--max-target-bytes", str(config.get("max_target_bytes", 1000000))])
+
+            # Add languages if specified
+            if config.get("languages"):
+                for lang in config["languages"]:
+                    cmd.extend(["--lang", lang])
+
+            # Add include patterns
+            if config.get("include_patterns"):
+                for pattern in config["include_patterns"]:
+                    cmd.extend(["--include", pattern])
+
+            # Add exclude patterns
+            if config.get("exclude_patterns"):
+                for pattern in config["exclude_patterns"]:
+                    cmd.extend(["--exclude", pattern])
+
+            # Add severity filter (semgrep only accepts one severity level)
+            severity_levels = config.get("severity", ["ERROR", "WARNING", "INFO"])
+            if severity_levels:
+                # Use the highest severity level from the list
+                severity_priority = {"ERROR": 3, "WARNING": 2, "INFO": 1}
+                highest_severity = max(severity_levels, key=lambda x: severity_priority.get(x, 0))
+                cmd.extend(["--severity", highest_severity])
+
+            # Confidence cannot be filtered on the command line; it is applied
+            # per result while parsing in _parse_opengrep_output() below.
+
+            # Skip the version check and also scan files that git ignores
+            cmd.append("--disable-version-check")
+            cmd.append("--no-git-ignore")
+
+            # Add target directory
+            cmd.append(str(workspace))
+
+            logger.debug(f"Running command: {' '.join(cmd)}")
+
+            # Run OpenGrep
+            process = await asyncio.create_subprocess_exec(
+                *cmd,
+                stdout=asyncio.subprocess.PIPE,
+                stderr=asyncio.subprocess.PIPE,
+                cwd=workspace
+            )
+
+            stdout, stderr = await process.communicate()
+
+            # Parse results
+            findings = []
+            if process.returncode in [0, 1]:  # 0 = no findings, 1 = findings found
+                findings = self._parse_opengrep_output(stdout.decode(), workspace, config)
+            else:
+                error_msg = stderr.decode()
+                logger.error(f"OpenGrep failed: {error_msg}")
+                return self.create_result(
+                    findings=[],
+                    status="failed",
+                    error=f"OpenGrep execution failed: {error_msg}"
+                )
+
+            # Create summary
+            summary = self._create_summary(findings)
+
+            logger.info(f"OpenGrep found {len(findings)} potential issues")
+
+            return self.create_result(
+                findings=findings,
+                status="success",
+                summary=summary
+            )
+
+        except Exception as e:
+            logger.error(f"OpenGrep module failed: {e}")
+            return self.create_result(
+                findings=[],
+                status="failed",
+                error=str(e)
+            )
+
+    def _parse_opengrep_output(self, output: str, workspace: Path, config: Dict[str, Any]) -> List[ModuleFinding]:
+        """Parse OpenGrep JSON output into findings"""
+        findings = []
+
+        if not output.strip():
+            return findings
+
+        try:
+            data = json.loads(output)
+            results = data.get("results", [])
+
+            # Get filtering criteria
+            allowed_severities = set(config.get("severity", ["ERROR", "WARNING", "INFO"]))
+            allowed_confidences = set(config.get("confidence", ["HIGH", "MEDIUM", "LOW"]))
+
+            for result in results:
+                # Extract basic info
+                rule_id = result.get("check_id", "unknown")
+                message = result.get("message", "")
+                severity = result.get("extra", {}).get("severity", "INFO").upper()
+
+                # File location info
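+                # The semgrep-style JSON reports "start"/"end" as objects of
+                # the form {"line": ..., "col": ...}; missing keys default to 0.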
path_info = result.get("path", "") + start_line = result.get("start", {}).get("line", 0) + end_line = result.get("end", {}).get("line", 0) + start_col = result.get("start", {}).get("col", 0) + end_col = result.get("end", {}).get("col", 0) + + # Code snippet + lines = result.get("extra", {}).get("lines", "") + + # Metadata + metadata = result.get("extra", {}) + cwe = metadata.get("metadata", {}).get("cwe", []) + owasp = metadata.get("metadata", {}).get("owasp", []) + confidence = metadata.get("metadata", {}).get("confidence", "MEDIUM").upper() + + # Apply severity filter + if severity not in allowed_severities: + continue + + # Apply confidence filter + if confidence not in allowed_confidences: + continue + + # Make file path relative to workspace + if path_info: + try: + rel_path = Path(path_info).relative_to(workspace) + path_info = str(rel_path) + except ValueError: + pass + + # Map severity to our standard levels + finding_severity = self._map_severity(severity) + + # Create finding + finding = self.create_finding( + title=f"Security issue: {rule_id}", + description=message or f"OpenGrep rule {rule_id} triggered", + severity=finding_severity, + category=self._get_category(rule_id, metadata), + file_path=path_info if path_info else None, + line_start=start_line if start_line > 0 else None, + line_end=end_line if end_line > 0 and end_line != start_line else None, + code_snippet=lines.strip() if lines else None, + recommendation=self._get_recommendation(rule_id, metadata), + metadata={ + "rule_id": rule_id, + "opengrep_severity": severity, + "confidence": confidence, + "cwe": cwe, + "owasp": owasp, + "fix": metadata.get("fix", ""), + "impact": metadata.get("impact", ""), + "likelihood": metadata.get("likelihood", ""), + "references": metadata.get("references", []) + } + ) + + findings.append(finding) + + except json.JSONDecodeError as e: + logger.warning(f"Failed to parse OpenGrep output: {e}") + except Exception as e: + logger.warning(f"Error processing OpenGrep results: {e}") + + return findings + + def _map_severity(self, opengrep_severity: str) -> str: + """Map OpenGrep severity to our standard severity levels""" + severity_map = { + "ERROR": "high", + "WARNING": "medium", + "INFO": "low" + } + return severity_map.get(opengrep_severity.upper(), "medium") + + def _get_category(self, rule_id: str, metadata: Dict[str, Any]) -> str: + """Determine finding category based on rule and metadata""" + cwe_list = metadata.get("metadata", {}).get("cwe", []) + owasp_list = metadata.get("metadata", {}).get("owasp", []) + + # Check for common security categories + if any("injection" in rule_id.lower() for x in [rule_id]): + return "injection" + elif any("xss" in rule_id.lower() for x in [rule_id]): + return "xss" + elif any("csrf" in rule_id.lower() for x in [rule_id]): + return "csrf" + elif any("auth" in rule_id.lower() for x in [rule_id]): + return "authentication" + elif any("crypto" in rule_id.lower() for x in [rule_id]): + return "cryptography" + elif cwe_list: + return f"cwe-{cwe_list[0]}" + elif owasp_list: + return f"owasp-{owasp_list[0].replace(' ', '-').lower()}" + else: + return "security" + + def _get_recommendation(self, rule_id: str, metadata: Dict[str, Any]) -> str: + """Generate recommendation based on rule and metadata""" + fix_suggestion = metadata.get("fix", "") + if fix_suggestion: + return fix_suggestion + + # Generic recommendations based on rule type + if "injection" in rule_id.lower(): + return "Use parameterized queries or prepared statements to prevent injection attacks." 
+ elif "xss" in rule_id.lower(): + return "Properly encode/escape user input before displaying it in web pages." + elif "crypto" in rule_id.lower(): + return "Use cryptographically secure algorithms and proper key management." + elif "hardcode" in rule_id.lower(): + return "Remove hardcoded secrets and use secure configuration management." + else: + return "Review this security issue and apply appropriate fixes based on your security requirements." + + def _create_summary(self, findings: List[ModuleFinding]) -> Dict[str, Any]: + """Create analysis summary""" + severity_counts = {"critical": 0, "high": 0, "medium": 0, "low": 0} + category_counts = {} + rule_counts = {} + + for finding in findings: + # Count by severity + severity_counts[finding.severity] += 1 + + # Count by category + category = finding.category + category_counts[category] = category_counts.get(category, 0) + 1 + + # Count by rule + rule_id = finding.metadata.get("rule_id", "unknown") + rule_counts[rule_id] = rule_counts.get(rule_id, 0) + 1 + + return { + "total_findings": len(findings), + "severity_counts": severity_counts, + "category_counts": category_counts, + "top_rules": dict(sorted(rule_counts.items(), key=lambda x: x[1], reverse=True)[:10]), + "files_analyzed": len(set(f.file_path for f in findings if f.file_path)) + } \ No newline at end of file diff --git a/backend/toolbox/workflows/__init__.py b/backend/toolbox/workflows/__init__.py new file mode 100644 index 0000000..43bcfe7 --- /dev/null +++ b/backend/toolbox/workflows/__init__.py @@ -0,0 +1,11 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + diff --git a/backend/toolbox/workflows/comprehensive/__init__.py b/backend/toolbox/workflows/comprehensive/__init__.py new file mode 100644 index 0000000..83b7d4a --- /dev/null +++ b/backend/toolbox/workflows/comprehensive/__init__.py @@ -0,0 +1,12 @@ +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + diff --git a/backend/toolbox/workflows/registry.py b/backend/toolbox/workflows/registry.py new file mode 100644 index 0000000..ad58bc0 --- /dev/null +++ b/backend/toolbox/workflows/registry.py @@ -0,0 +1,187 @@ +""" +Manual Workflow Registry for Prefect Deployment + +This file contains the manual registry of all workflows that can be deployed. +Developers MUST add their workflows here after creating them. + +This approach is required because: +1. Prefect cannot deploy dynamically imported flows +2. Docker deployment needs static flow references +3. Explicit registration provides better control and visibility +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. 
+# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +from typing import Dict, Any, Callable +import logging + +logger = logging.getLogger(__name__) + +# Import only essential workflows +# Import each workflow individually to handle failures gracefully +security_assessment_flow = None +secret_detection_flow = None + +# Try to import each workflow individually +try: + from .security_assessment.workflow import main_flow as security_assessment_flow +except ImportError as e: + logger.warning(f"Failed to import security_assessment workflow: {e}") + +try: + from .comprehensive.secret_detection_scan.workflow import main_flow as secret_detection_flow +except ImportError as e: + logger.warning(f"Failed to import secret_detection_scan workflow: {e}") + + +# Manual registry - developers add workflows here after creation +# Only include workflows that were successfully imported +WORKFLOW_REGISTRY: Dict[str, Dict[str, Any]] = {} + +# Add workflows that were successfully imported +if security_assessment_flow is not None: + WORKFLOW_REGISTRY["security_assessment"] = { + "flow": security_assessment_flow, + "module_path": "toolbox.workflows.security_assessment.workflow", + "function_name": "main_flow", + "description": "Comprehensive security assessment workflow that scans files, analyzes code for vulnerabilities, and generates SARIF reports", + "version": "1.0.0", + "author": "FuzzForge Team", + "tags": ["security", "scanner", "analyzer", "static-analysis", "sarif"] + } + +if secret_detection_flow is not None: + WORKFLOW_REGISTRY["secret_detection_scan"] = { + "flow": secret_detection_flow, + "module_path": "toolbox.workflows.comprehensive.secret_detection_scan.workflow", + "function_name": "main_flow", + "description": "Comprehensive secret detection using TruffleHog and Gitleaks for thorough credential scanning", + "version": "1.0.0", + "author": "FuzzForge Team", + "tags": ["secrets", "credentials", "detection", "trufflehog", "gitleaks", "comprehensive"] + } + +# +# To add a new workflow, follow this pattern: +# +# "my_new_workflow": { +# "flow": my_new_flow_function, # Import the flow function above +# "module_path": "toolbox.workflows.my_new_workflow.workflow", +# "function_name": "my_new_flow_function", +# "description": "Description of what this workflow does", +# "version": "1.0.0", +# "author": "Developer Name", +# "tags": ["tag1", "tag2"] +# } + + +def get_workflow_flow(workflow_name: str) -> Callable: + """ + Get the flow function for a workflow. + + Args: + workflow_name: Name of the workflow + + Returns: + Flow function + + Raises: + KeyError: If workflow not found in registry + """ + if workflow_name not in WORKFLOW_REGISTRY: + available = list(WORKFLOW_REGISTRY.keys()) + raise KeyError( + f"Workflow '{workflow_name}' not found in registry. " + f"Available workflows: {available}. " + f"Please add the workflow to toolbox/workflows/registry.py" + ) + + return WORKFLOW_REGISTRY[workflow_name]["flow"] + + +def get_workflow_info(workflow_name: str) -> Dict[str, Any]: + """ + Get registry information for a workflow. 
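+
+    Example (illustrative; assumes the security_assessment workflow was
+    imported successfully)::
+
+        info = get_workflow_info("security_assessment")
+        print(info["description"], info["version"])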
+ + Args: + workflow_name: Name of the workflow + + Returns: + Registry information dictionary + + Raises: + KeyError: If workflow not found in registry + """ + if workflow_name not in WORKFLOW_REGISTRY: + available = list(WORKFLOW_REGISTRY.keys()) + raise KeyError( + f"Workflow '{workflow_name}' not found in registry. " + f"Available workflows: {available}" + ) + + return WORKFLOW_REGISTRY[workflow_name] + + +def list_registered_workflows() -> Dict[str, Dict[str, Any]]: + """ + Get all registered workflows. + + Returns: + Dictionary of all workflow registry entries + """ + return WORKFLOW_REGISTRY.copy() + + +def validate_registry() -> bool: + """ + Validate the workflow registry for consistency. + + Returns: + True if valid, raises exceptions if not + + Raises: + ValueError: If registry is invalid + """ + if not WORKFLOW_REGISTRY: + raise ValueError("Workflow registry is empty") + + required_fields = ["flow", "module_path", "function_name", "description"] + + for name, entry in WORKFLOW_REGISTRY.items(): + # Check required fields + missing_fields = [field for field in required_fields if field not in entry] + if missing_fields: + raise ValueError( + f"Workflow '{name}' missing required fields: {missing_fields}" + ) + + # Check if flow is callable + if not callable(entry["flow"]): + raise ValueError(f"Workflow '{name}' flow is not callable") + + # Check if flow has the required Prefect attributes + if not hasattr(entry["flow"], "deploy"): + raise ValueError( + f"Workflow '{name}' flow is not a Prefect flow (missing deploy method)" + ) + + logger.info(f"Registry validation passed. {len(WORKFLOW_REGISTRY)} workflows registered.") + return True + + +# Validate registry on import +try: + validate_registry() + logger.info(f"Workflow registry loaded successfully with {len(WORKFLOW_REGISTRY)} workflows") +except Exception as e: + logger.error(f"Workflow registry validation failed: {e}") + raise \ No newline at end of file diff --git a/backend/toolbox/workflows/security_assessment/Dockerfile b/backend/toolbox/workflows/security_assessment/Dockerfile new file mode 100644 index 0000000..2b46c2c --- /dev/null +++ b/backend/toolbox/workflows/security_assessment/Dockerfile @@ -0,0 +1,30 @@ +FROM prefecthq/prefect:3-python3.11 + +WORKDIR /app + +# Create toolbox directory structure to match expected import paths +RUN mkdir -p /app/toolbox/workflows /app/toolbox/modules + +# Copy base module infrastructure +COPY modules/__init__.py /app/toolbox/modules/ +COPY modules/base.py /app/toolbox/modules/ + +# Copy only required modules (manual selection) +COPY modules/scanner /app/toolbox/modules/scanner +COPY modules/analyzer /app/toolbox/modules/analyzer +COPY modules/reporter /app/toolbox/modules/reporter + +# Copy this workflow +COPY workflows/security_assessment /app/toolbox/workflows/security_assessment + +# Install workflow-specific requirements if they exist +RUN if [ -f /app/toolbox/workflows/security_assessment/requirements.txt ]; then pip install --no-cache-dir -r /app/toolbox/workflows/security_assessment/requirements.txt; fi + +# Install common requirements +RUN pip install --no-cache-dir pyyaml + +# Set Python path +ENV PYTHONPATH=/app:$PYTHONPATH + +# Create workspace directory +RUN mkdir -p /workspace diff --git a/backend/toolbox/workflows/security_assessment/__init__.py b/backend/toolbox/workflows/security_assessment/__init__.py new file mode 100644 index 0000000..43bcfe7 --- /dev/null +++ b/backend/toolbox/workflows/security_assessment/__init__.py @@ -0,0 +1,11 @@ +# Copyright (c) 2025 
FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + diff --git a/backend/toolbox/workflows/security_assessment/metadata.yaml b/backend/toolbox/workflows/security_assessment/metadata.yaml new file mode 100644 index 0000000..e3ffbe8 --- /dev/null +++ b/backend/toolbox/workflows/security_assessment/metadata.yaml @@ -0,0 +1,111 @@ +name: security_assessment +version: "2.0.0" +description: "Comprehensive security assessment workflow that scans files, analyzes code for vulnerabilities, and generates SARIF reports" +author: "FuzzForge Team" +category: "comprehensive" +tags: + - "security" + - "scanner" + - "analyzer" + - "static-analysis" + - "sarif" + - "comprehensive" + +supported_volume_modes: + - "ro" + - "rw" + +default_volume_mode: "ro" +default_target_path: "/workspace" + +requirements: + tools: + - "file_scanner" + - "security_analyzer" + - "sarif_reporter" + resources: + memory: "512Mi" + cpu: "500m" + timeout: 1800 + +has_docker: true + +default_parameters: + target_path: "/workspace" + volume_mode: "ro" + scanner_config: {} + analyzer_config: {} + reporter_config: {} + +parameters: + type: object + properties: + target_path: + type: string + default: "/workspace" + description: "Path to analyze" + volume_mode: + type: string + enum: ["ro", "rw"] + default: "ro" + description: "Volume mount mode" + scanner_config: + type: object + description: "File scanner configuration" + properties: + patterns: + type: array + items: + type: string + description: "File patterns to scan" + check_sensitive: + type: boolean + description: "Check for sensitive files" + calculate_hashes: + type: boolean + description: "Calculate file hashes" + max_file_size: + type: integer + description: "Maximum file size to scan (bytes)" + analyzer_config: + type: object + description: "Security analyzer configuration" + properties: + file_extensions: + type: array + items: + type: string + description: "File extensions to analyze" + check_secrets: + type: boolean + description: "Check for hardcoded secrets" + check_sql: + type: boolean + description: "Check for SQL injection risks" + check_dangerous_functions: + type: boolean + description: "Check for dangerous function calls" + reporter_config: + type: object + description: "SARIF reporter configuration" + properties: + include_code_flows: + type: boolean + description: "Include code flow information" + +output_schema: + type: object + properties: + sarif: + type: object + description: "SARIF-formatted security findings" + summary: + type: object + description: "Scan execution summary" + properties: + total_findings: + type: integer + severity_counts: + type: object + tool_counts: + type: object diff --git a/backend/toolbox/workflows/security_assessment/requirements.txt b/backend/toolbox/workflows/security_assessment/requirements.txt new file mode 100644 index 0000000..f481334 --- /dev/null +++ b/backend/toolbox/workflows/security_assessment/requirements.txt @@ -0,0 +1,4 @@ +# Requirements for security assessment workflow +pydantic>=2.0.0 +pyyaml>=6.0 +aiofiles>=23.0.0 \ No newline at end of file diff --git a/backend/toolbox/workflows/security_assessment/workflow.py 
b/backend/toolbox/workflows/security_assessment/workflow.py new file mode 100644 index 0000000..584bf65 --- /dev/null +++ b/backend/toolbox/workflows/security_assessment/workflow.py @@ -0,0 +1,252 @@ +""" +Security Assessment Workflow - Comprehensive security analysis using multiple modules +""" + +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +import sys +import logging +from pathlib import Path +from typing import Dict, Any, Optional +from prefect import flow, task +import json + +# Add modules to path +sys.path.insert(0, '/app') + +# Import modules +from toolbox.modules.scanner import FileScanner +from toolbox.modules.analyzer import SecurityAnalyzer +from toolbox.modules.reporter import SARIFReporter + +# Configure logging +logging.basicConfig(level=logging.INFO) +logger = logging.getLogger(__name__) + + +@task(name="file_scanning") +async def scan_files_task(workspace: Path, config: Dict[str, Any]) -> Dict[str, Any]: + """ + Task to scan files in the workspace. + + Args: + workspace: Path to the workspace + config: Scanner configuration + + Returns: + Scanner results + """ + logger.info(f"Starting file scanning in {workspace}") + scanner = FileScanner() + + result = await scanner.execute(config, workspace) + + logger.info(f"File scanning completed: {result.summary.get('total_files', 0)} files found") + return result.dict() + + +@task(name="security_analysis") +async def analyze_security_task(workspace: Path, config: Dict[str, Any]) -> Dict[str, Any]: + """ + Task to analyze security vulnerabilities. + + Args: + workspace: Path to the workspace + config: Analyzer configuration + + Returns: + Analysis results + """ + logger.info("Starting security analysis") + analyzer = SecurityAnalyzer() + + result = await analyzer.execute(config, workspace) + + logger.info( + f"Security analysis completed: {result.summary.get('total_findings', 0)} findings" + ) + return result.dict() + + +@task(name="report_generation") +async def generate_report_task( + scan_results: Dict[str, Any], + analysis_results: Dict[str, Any], + config: Dict[str, Any], + workspace: Path +) -> Dict[str, Any]: + """ + Task to generate SARIF report from all findings. 
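+
+    Scanner findings with severity "info" are dropped at this stage, so the
+    report carries only security-relevant results alongside the analyzer
+    findings.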
+ + Args: + scan_results: Results from scanner + analysis_results: Results from analyzer + config: Reporter configuration + workspace: Path to the workspace + + Returns: + SARIF report + """ + logger.info("Generating SARIF report") + reporter = SARIFReporter() + + # Combine findings from all modules + all_findings = [] + + # Add scanner findings (only sensitive files, not all files) + scanner_findings = scan_results.get("findings", []) + sensitive_findings = [f for f in scanner_findings if f.get("severity") != "info"] + all_findings.extend(sensitive_findings) + + # Add analyzer findings + analyzer_findings = analysis_results.get("findings", []) + all_findings.extend(analyzer_findings) + + # Prepare reporter config + reporter_config = { + **config, + "findings": all_findings, + "tool_name": "FuzzForge Security Assessment", + "tool_version": "1.0.0" + } + + result = await reporter.execute(reporter_config, workspace) + + # Extract SARIF from result + sarif = result.dict().get("sarif", {}) + + logger.info(f"Report generated with {len(all_findings)} total findings") + return sarif + + +@flow(name="security_assessment", log_prints=True) +async def main_flow( + target_path: str = "/workspace", + volume_mode: str = "ro", + scanner_config: Optional[Dict[str, Any]] = None, + analyzer_config: Optional[Dict[str, Any]] = None, + reporter_config: Optional[Dict[str, Any]] = None +) -> Dict[str, Any]: + """ + Main security assessment workflow. + + This workflow: + 1. Scans files in the workspace + 2. Analyzes code for security vulnerabilities + 3. Generates a SARIF report with all findings + + Args: + target_path: Path to the mounted workspace (default: /workspace) + volume_mode: Volume mount mode (ro/rw) + scanner_config: Configuration for file scanner + analyzer_config: Configuration for security analyzer + reporter_config: Configuration for SARIF reporter + + Returns: + SARIF-formatted findings report + """ + logger.info(f"Starting security assessment workflow") + logger.info(f"Workspace: {target_path}, Mode: {volume_mode}") + + # Set workspace path + workspace = Path(target_path) + + if not workspace.exists(): + logger.error(f"Workspace does not exist: {workspace}") + return { + "error": f"Workspace not found: {workspace}", + "sarif": None + } + + # Default configurations + if not scanner_config: + scanner_config = { + "patterns": ["*"], + "check_sensitive": True, + "calculate_hashes": False, + "max_file_size": 10485760 # 10MB + } + + if not analyzer_config: + analyzer_config = { + "file_extensions": [".py", ".js", ".java", ".php", ".rb", ".go"], + "check_secrets": True, + "check_sql": True, + "check_dangerous_functions": True + } + + if not reporter_config: + reporter_config = { + "include_code_flows": False + } + + try: + # Execute workflow tasks + logger.info("Phase 1: File scanning") + scan_results = await scan_files_task(workspace, scanner_config) + + logger.info("Phase 2: Security analysis") + analysis_results = await analyze_security_task(workspace, analyzer_config) + + logger.info("Phase 3: Report generation") + sarif_report = await generate_report_task( + scan_results, + analysis_results, + reporter_config, + workspace + ) + + # Log summary + if sarif_report and "runs" in sarif_report: + results_count = len(sarif_report["runs"][0].get("results", [])) + logger.info(f"Workflow completed successfully with {results_count} findings") + else: + logger.info("Workflow completed successfully") + + return sarif_report + + except Exception as e: + logger.error(f"Workflow failed: {e}") + # Return error 
in SARIF format + return { + "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json", + "version": "2.1.0", + "runs": [ + { + "tool": { + "driver": { + "name": "FuzzForge Security Assessment", + "version": "1.0.0" + } + }, + "results": [], + "invocations": [ + { + "executionSuccessful": False, + "exitCode": 1, + "exitCodeDescription": str(e) + } + ] + } + ] + } + + +if __name__ == "__main__": + # For local testing + import asyncio + + asyncio.run(main_flow( + target_path="/tmp/test", + scanner_config={"patterns": ["*.py"]}, + analyzer_config={"check_secrets": True} + )) \ No newline at end of file diff --git a/cli/.gitignore b/cli/.gitignore new file mode 100644 index 0000000..f24b22c --- /dev/null +++ b/cli/.gitignore @@ -0,0 +1,64 @@ +# FuzzForge CLI specific .gitignore + +# Python +__pycache__/ +*.py[cod] +*$py.class +*.so +.Python +build/ +develop-eggs/ +dist/ +downloads/ +eggs/ +.eggs/ +lib/ +lib64/ +parts/ +sdist/ +var/ +wheels/ +*.egg-info/ +.installed.cfg +*.egg +MANIFEST + +# Virtual environments +.venv/ +venv/ +ENV/ +env/ + +# UV package manager - keep uv.lock for CLI +# uv.lock # Commented out - we want to keep this for reproducible CLI builds + +# IDE +.vscode/ +.idea/ +*.swp +*.swo + +# OS +.DS_Store +Thumbs.db + +# Testing +.coverage +.pytest_cache/ +.tox/ +htmlcov/ + +# MyPy +.mypy_cache/ + +# Local development +local_config.yaml +.env.local + +# Generated files +*.log +*.tmp + +# CLI specific +# Don't ignore uv.lock in CLI as it's needed for reproducible builds +!uv.lock \ No newline at end of file diff --git a/cli/README.md b/cli/README.md new file mode 100644 index 0000000..510598d --- /dev/null +++ b/cli/README.md @@ -0,0 +1,621 @@ +# FuzzForge CLI + +๐Ÿ›ก๏ธ **FuzzForge CLI** - Command-line interface for FuzzForge security testing platform + +A comprehensive CLI for managing security testing workflows, monitoring runs in real-time, and analyzing findings with beautiful terminal interfaces and persistent project management. + +## โœจ Features + +- ๐Ÿ“ **Project Management** - Initialize and manage FuzzForge projects with local databases +- ๐Ÿ”ง **Workflow Management** - Browse, configure, and run security testing workflows +- ๐Ÿš€ **Workflow Execution** - Execute and manage security testing workflows +- ๐Ÿ” **Findings Analysis** - View, export, and analyze security findings in multiple formats +- ๐Ÿ“Š **Real-time Monitoring** - Live dashboards for fuzzing statistics and crash reports +- โš™๏ธ **Configuration** - Flexible project and global configuration management +- ๐ŸŽจ **Rich UI** - Beautiful tables, progress bars, and interactive prompts +- ๐Ÿ’พ **Persistent Storage** - SQLite database for runs, findings, and crash data +- ๐Ÿ›ก๏ธ **Error Handling** - Comprehensive error handling with user-friendly messages +- ๐Ÿ”„ **Network Resilience** - Automatic retries and graceful degradation + +## ๐Ÿš€ Quick Start + +### Installation + +#### Prerequisites +- Python 3.11 or higher +- [uv](https://docs.astral.sh/uv/) package manager + +#### Install FuzzForge CLI +```bash +# Clone the repository +git clone https://github.com/FuzzingLabs/fuzzforge_alpha.git +cd fuzzforge_alpha/cli + +# Install globally with uv (recommended) +uv tool install . + +# Alternative: Install in development mode +uv sync +uv add --editable ../sdk +uv tool install --editable . 
+ +# Verify installation +fuzzforge --help +``` + +#### Shell Completion (Optional) +```bash +# Install completion for your shell +fuzzforge --install-completion +``` + +### Initialize Your First Project + +```bash +# Create a new project directory +mkdir my-security-project +cd my-security-project + +# Initialize FuzzForge project +ff init + +# Check status +fuzzforge status +``` + +This creates a `.fuzzforge/` directory with: +- SQLite database for persistent storage +- Configuration file (`config.yaml`) +- Project metadata + +### Run Your First Analysis + +```bash +# List available workflows +fuzzforge workflows list + +# Get workflow details +fuzzforge workflows info security_assessment + +# Submit a workflow for analysis +fuzzforge workflow security_assessment /path/to/your/code + +# Monitor progress in real-time +fuzzforge monitor live + +# View findings when complete +fuzzforge finding +``` + +## ๐Ÿ“š Command Reference + +### Project Management + +#### `ff init` +Initialize a new FuzzForge project in the current directory. + +```bash +ff init --name "My Security Project" --api-url "http://localhost:8000" +``` + +**Options:** +- `--name, -n` - Project name (defaults to directory name) +- `--api-url, -u` - FuzzForge API URL (defaults to http://localhost:8000) +- `--force, -f` - Force initialization even if project exists + +#### `fuzzforge status` +Show comprehensive project and API status information. + +```bash +fuzzforge status +``` + +Displays: +- Project information and configuration +- Database statistics (runs, findings, crashes) +- API connectivity and available workflows + +### Workflow Management + +#### `fuzzforge workflows list` +List all available security testing workflows. + +```bash +fuzzforge workflows list +``` + +#### `fuzzforge workflows info ` +Show detailed information about a specific workflow. + +```bash +fuzzforge workflows info security_assessment +``` + +Displays: +- Workflow metadata (version, author, description) +- Parameter schema and requirements +- Supported volume modes and features + +#### `fuzzforge workflows parameters ` +Interactive parameter builder for workflows. + +```bash +# Interactive mode +fuzzforge workflows parameters security_assessment + +# Save parameters to file +fuzzforge workflows parameters security_assessment --output params.json + +# Non-interactive mode (show schema only) +fuzzforge workflows parameters security_assessment --no-interactive +``` + +### Workflow Execution + +#### `fuzzforge workflow ` +Execute a security testing workflow. + +```bash +# Basic execution +fuzzforge workflow security_assessment /path/to/code + +# With parameters +fuzzforge workflow security_assessment /path/to/binary \ + --param timeout=3600 \ + --param iterations=10000 + +# With parameter file +fuzzforge workflow security_assessment /path/to/code \ + --param-file my-params.json + +# Wait for completion +fuzzforge workflow security_assessment /path/to/code --wait +``` + +**Options:** +- `--param, -p` - Parameter in key=value format (can be used multiple times) +- `--param-file, -f` - JSON file containing parameters +- `--volume-mode, -v` - Volume mount mode: `ro` (read-only) or `rw` (read-write) +- `--timeout, -t` - Execution timeout in seconds +- `--interactive/--no-interactive, -i/-n` - Interactive parameter input +- `--wait, -w` - Wait for execution to complete +- `--live, -l` - Show live monitoring during execution + +#### `fuzzforge workflow status [execution-id]` +Check the status of a workflow execution. 
+ +```bash +# Check specific execution +fuzzforge workflow status abc123def456 + +# Check most recent execution +fuzzforge workflow status +``` + +#### `fuzzforge workflow history` +Show workflow execution history from local database. + +```bash +# List all executions +fuzzforge workflow history + +# Filter by workflow +fuzzforge workflow history --workflow security_assessment + +# Filter by status +fuzzforge workflow history --status completed + +# Limit results +fuzzforge workflow history --limit 10 +``` + +#### `fuzzforge workflow retry ` +Retry a workflow with the same or modified parameters. + +```bash +# Retry with same parameters +fuzzforge workflow retry abc123def456 + +# Modify parameters interactively +fuzzforge workflow retry abc123def456 --modify-params +``` + +### Findings Management + +#### `fuzzforge finding [execution-id]` +View security findings for a specific execution. + +```bash +# Display latest findings +fuzzforge finding + +# Display specific execution findings +fuzzforge finding abc123def456 +``` + +#### `fuzzforge findings` +Browse all security findings from local database. + +```bash +# List all findings +fuzzforge findings + +# Show findings history +fuzzforge findings history --limit 20 +``` + +#### `fuzzforge finding export [execution-id]` +Export security findings in various formats. + +```bash +# Export latest findings +fuzzforge finding export --format json + +# Export specific execution findings +fuzzforge finding export abc123def456 --format sarif + +# Export as CSV with output file +fuzzforge finding export abc123def456 --format csv --output report.csv + +# Export as HTML report +fuzzforge finding export --format html --output report.html +``` + +### Real-time Monitoring + +#### `fuzzforge monitor stats ` +Show current fuzzing statistics. + +```bash +# Show stats once +fuzzforge monitor stats abc123def456 --once + +# Live updating stats (default) +fuzzforge monitor stats abc123def456 --refresh 5 +``` + +#### `fuzzforge monitor crashes ` +Display crash reports for a fuzzing run. + +```bash +fuzzforge monitor crashes abc123def456 --limit 50 +``` + +#### `fuzzforge monitor live ` +Real-time monitoring dashboard with live updates. + +```bash +fuzzforge monitor live abc123def456 --refresh 3 +``` + +Features: +- Live updating statistics +- Progress indicators and bars +- Run status monitoring +- Automatic completion detection + +### Configuration Management + +#### `fuzzforge config show` +Display current configuration settings. + +```bash +# Show project configuration +fuzzforge config show + +# Show global configuration +fuzzforge config show --global +``` + +#### `fuzzforge config set ` +Set a configuration value. + +```bash +# Project settings +fuzzforge config set project.api_url "http://api.fuzzforge.com" +fuzzforge config set project.default_timeout 7200 +fuzzforge config set project.default_workflow "security_assessment" + +# Retention settings +fuzzforge config set retention.max_runs 200 +fuzzforge config set retention.keep_findings_days 120 + +# Preferences +fuzzforge config set preferences.auto_save_findings true +fuzzforge config set preferences.show_progress_bars false + +# Global configuration +fuzzforge config set project.api_url "http://global.api.com" --global +``` + +#### `fuzzforge config get ` +Get a specific configuration value. + +```bash +fuzzforge config get project.api_url +fuzzforge config get retention.max_runs --global +``` + +#### `fuzzforge config reset` +Reset configuration to defaults. 
+
+```bash
+# Reset project configuration
+fuzzforge config reset
+
+# Reset global configuration
+fuzzforge config reset --global
+
+# Skip confirmation
+fuzzforge config reset --force
+```
+
+#### `fuzzforge config edit`
+Open configuration file in default editor.
+
+```bash
+# Edit project configuration
+fuzzforge config edit
+
+# Edit global configuration
+fuzzforge config edit --global
+```
+
+## 🏗️ Project Structure
+
+When you initialize a FuzzForge project, the following structure is created:
+
+```
+my-project/
+├── .fuzzforge/
+│   ├── config.yaml          # Project configuration
+│   └── findings.db          # SQLite database
+├── .gitignore               # Updated with FuzzForge entries
+└── README.md                # Project README (if created)
+```
+
+### Database Schema
+
+The SQLite database stores:
+
+- **runs** - Workflow run history and metadata
+- **findings** - Security findings and SARIF data
+- **crashes** - Crash reports and fuzzing data
+
+### Configuration Format
+
+Project configuration (`.fuzzforge/config.yaml`):
+
+```yaml
+project:
+  name: "My Security Project"
+  api_url: "http://localhost:8000"
+  default_timeout: 3600
+  default_workflow: null
+
+retention:
+  max_runs: 100
+  keep_findings_days: 90
+
+preferences:
+  auto_save_findings: true
+  show_progress_bars: true
+  table_style: "rich"
+  color_output: true
+```
+
+## 🔧 Advanced Usage
+
+### Parameter Handling
+
+FuzzForge CLI supports flexible parameter input:
+
+1. **Command line parameters**:
+   ```bash
+   ff workflow workflow-name /path --param key1=value1 --param key2=value2
+   ```
+
+2. **Parameter files**:
+   ```bash
+   echo '{"timeout": 3600, "threads": 4}' > params.json
+   ff workflow workflow-name /path --param-file params.json
+   ```
+
+3. **Interactive prompts**:
+   ```bash
+   ff workflow workflow-name /path --interactive
+   ```
+
+4. **Parameter builder**:
+   ```bash
+   ff workflows parameters workflow-name --output my-params.json
+   ff workflow workflow-name /path --param-file my-params.json
+   ```
+
+### Environment Variables
+
+Override configuration with environment variables:
+
+```bash
+export FUZZFORGE_API_URL="http://production.api.com"
+export FUZZFORGE_TIMEOUT="7200"
+```
+
+### Data Retention
+
+Configure automatic cleanup of old data:
+
+```bash
+# Keep only 50 runs
+fuzzforge config set retention.max_runs 50
+
+# Keep findings for 30 days
+fuzzforge config set retention.keep_findings_days 30
+```
+
+### Export Formats
+
+Findings can be exported in multiple formats:
+
+- **JSON** - Simplified findings structure
+- **CSV** - Tabular data for spreadsheets
+- **HTML** - Interactive web report
+- **SARIF** - Standard security analysis format
+
+## 🛠️ Development
+
+### Setup Development Environment
+
+```bash
+# Clone repository
+git clone https://github.com/FuzzingLabs/fuzzforge_alpha.git
+cd fuzzforge_alpha/cli
+
+# Install in development mode
+uv sync
+uv add --editable ../sdk
+
+# Install CLI in editable mode
+uv tool install --editable .
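+
+# Sanity-check the editable install (the entry point should now be on PATH)
+fuzzforge --version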
+```
+
+### Project Structure
+
+```
+cli/
+├── src/fuzzforge_cli/
+│   ├── __init__.py
+│   ├── main.py              # Main CLI app
+│   ├── config.py            # Configuration management
+│   ├── database.py          # Database operations
+│   ├── exceptions.py        # Error handling
+│   ├── api_validation.py    # API response validation
+│   └── commands/            # Command implementations
+│       ├── init.py          # Project initialization
+│       ├── workflows.py     # Workflow management
+│       ├── runs.py          # Run management
+│       ├── findings.py      # Findings management
+│       ├── monitor.py       # Real-time monitoring
+│       ├── config.py        # Configuration commands
+│       └── status.py        # Status information
+├── pyproject.toml           # Project configuration
+└── README.md                # This file
+```
+
+### Running Tests
+
+```bash
+# Run tests (when available)
+uv run pytest
+
+# Code formatting
+uv run black src/
+uv run isort src/
+
+# Type checking
+uv run mypy src/
+```
+
+## ⚠️ Troubleshooting
+
+### Common Issues
+
+#### "No FuzzForge project found"
+```bash
+# Initialize a project first
+ff init
+```
+
+#### API Connection Failed
+```bash
+# Check API URL configuration
+fuzzforge config get project.api_url
+
+# Test API connectivity
+fuzzforge status
+
+# Update API URL if needed
+fuzzforge config set project.api_url "http://correct-url:8000"
+```
+
+#### Permission Errors
+```bash
+# Ensure proper permissions for project directory
+chmod -R 755 .fuzzforge/
+
+# Check file ownership
+ls -la .fuzzforge/
+```
+
+#### Database Issues
+```bash
+# Check database file exists
+ls -la .fuzzforge/findings.db
+
+# Reinitialize if corrupted (will lose data)
+rm .fuzzforge/findings.db
+ff init --force
+```
+
+### Environment Variables
+
+Set these environment variables for debugging:
+
+```bash
+export FUZZFORGE_DEBUG=1          # Enable debug logging
+export FUZZFORGE_API_URL="..."    # Override API URL
+export FUZZFORGE_TIMEOUT="30"     # Override timeout
+```
+
+### Getting Help
+
+```bash
+# General help
+fuzzforge --help
+
+# Command-specific help
+ff workflows --help
+ff workflow --help
+ff monitor live --help
+
+# Show version
+fuzzforge --version
+```
+
+## 🏆 Example Workflow
+
+Here's a complete example of analyzing a project:
+
+```bash
+# 1. Initialize project
+mkdir my-security-audit
+cd my-security-audit
+ff init --name "Security Audit 2024"
+
+# 2. Check available workflows
+fuzzforge workflows list
+
+# 3. Submit comprehensive security assessment
+ff workflow security_assessment /path/to/source/code --wait
+
+# 4. View findings in table format
+fuzzforge finding
+
+# 5. Export detailed report
+fuzzforge finding export --format html --output security_report.html
+
+# 6. Check project statistics
+fuzzforge status
+```
+
+## 📜 License
+
+This project is licensed under the terms specified in the main FuzzForge repository.
+
+## 🤝 Contributing
+
+Contributions are welcome! Please see the main FuzzForge repository for contribution guidelines.
+
+---
+
+**FuzzForge CLI** - Making security testing workflows accessible and efficient from the command line.
\ No newline at end of file
diff --git a/cli/completion_install.py b/cli/completion_install.py
new file mode 100644
index 0000000..3fc5dc9
--- /dev/null
+++ b/cli/completion_install.py
@@ -0,0 +1,323 @@
+#!/usr/bin/env python3
+# Copyright (c) 2025 FuzzingLabs
+#
+# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
+# at the root of this repository for details.
+# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + +""" +Install shell completion for FuzzForge CLI. + +This script installs completion using Typer's built-in --install-completion command. +""" + +import os +import sys +import subprocess +from pathlib import Path +import typer + + +def run_fuzzforge_completion_install(shell: str) -> bool: + """Install completion using the fuzzforge CLI itself.""" + try: + # Use the CLI's built-in completion installation + result = subprocess.run([ + sys.executable, "-m", "fuzzforge_cli.main", + "--install-completion", shell + ], capture_output=True, text=True, cwd=Path(__file__).parent.parent) + + if result.returncode == 0: + print(f"โœ… {shell.capitalize()} completion installed successfully") + return True + else: + print(f"โŒ Failed to install {shell} completion: {result.stderr}") + return False + + except Exception as e: + print(f"โŒ Error installing {shell} completion: {e}") + return False + + +def create_manual_completion_scripts(): + """Create manual completion scripts as fallback.""" + scripts = { + "bash": ''' +# FuzzForge CLI completion for bash +_fuzzforge_completion() { + local IFS=$'\\t' + local response + + response=$(env COMP_WORDS="${COMP_WORDS[*]}" COMP_CWORD=$COMP_CWORD _FUZZFORGE_COMPLETE=bash_complete $1) + + for completion in $response; do + IFS=',' read type value <<< "$completion" + + if [[ $type == 'dir' ]]; then + COMPREPLY=() + compopt -o dirnames + elif [[ $type == 'file' ]]; then + COMPREPLY=() + compopt -o default + elif [[ $type == 'plain' ]]; then + COMPREPLY+=($value) + fi + done + + return 0 +} + +complete -o nosort -F _fuzzforge_completion fuzzforge + ''', + + "zsh": ''' +#compdef fuzzforge + +_fuzzforge_completion() { + local -a completions + local -a completions_with_descriptions + local -a response + response=(${(f)"$(env COMP_WORDS="${words[*]}" COMP_CWORD=$((CURRENT-1)) _FUZZFORGE_COMPLETE=zsh_complete fuzzforge)"}) + + for type_and_line in $response; do + if [[ "$type_and_line" =~ ^([^,]*),(.*)$ ]]; then + local type="$match[1]" + local line="$match[2]" + + if [[ "$type" == "dir" ]]; then + _path_files -/ + elif [[ "$type" == "file" ]]; then + _path_files -f + elif [[ "$type" == "plain" ]]; then + if [[ "$line" =~ ^([^:]*):(.*)$ ]]; then + completions_with_descriptions+=("$match[1]":"$match[2]") + else + completions+=("$line") + fi + fi + fi + done + + if [ -n "$completions_with_descriptions" ]; then + _describe "" completions_with_descriptions -V unsorted + fi + + if [ -n "$completions" ]; then + compadd -U -V unsorted -a completions + fi +} + +compdef _fuzzforge_completion fuzzforge; + ''', + + "fish": ''' +# FuzzForge CLI completion for fish +function __fuzzforge_completion + set -l response + + for value in (env _FUZZFORGE_COMPLETE=fish_complete COMP_WORDS=(commandline -cp) COMP_CWORD=(commandline -t) fuzzforge) + set response $response $value + end + + for completion in $response + set -l metadata (string split "," $completion) + + if test $metadata[1] = "dir" + __fish_complete_directories $metadata[2] + else if test $metadata[1] = "file" + __fish_complete_path $metadata[2] + else if test $metadata[1] = "plain" + echo $metadata[2] + end + end +end + +complete --no-files --command fuzzforge --arguments "(__fuzzforge_completion)" + ''' + } + + return 
scripts + + +def install_bash_completion(): + """Install bash completion.""" + print("๐Ÿ“ Installing bash completion...") + + # Get the manual completion script + scripts = create_manual_completion_scripts() + completion_script = scripts["bash"] + + # Try different locations for bash completion + completion_dirs = [ + Path.home() / ".bash_completion.d", + Path("/usr/local/etc/bash_completion.d"), + Path("/etc/bash_completion.d") + ] + + for completion_dir in completion_dirs: + try: + completion_dir.mkdir(exist_ok=True) + completion_file = completion_dir / "fuzzforge" + completion_file.write_text(completion_script) + print(f"โœ… Bash completion installed to: {completion_file}") + + # Add source line to .bashrc if not present + bashrc = Path.home() / ".bashrc" + source_line = f"source {completion_file}" + + if bashrc.exists(): + bashrc_content = bashrc.read_text() + if source_line not in bashrc_content: + with bashrc.open("a") as f: + f.write(f"\n# FuzzForge CLI completion\n{source_line}\n") + print("โœ… Added completion source to ~/.bashrc") + + return True + except PermissionError: + continue + except Exception as e: + print(f"โŒ Failed to install bash completion: {e}") + continue + + print("โŒ Could not install bash completion (permission denied)") + return False + + +def install_zsh_completion(): + """Install zsh completion.""" + print("๐Ÿ“ Installing zsh completion...") + + # Get the manual completion script + scripts = create_manual_completion_scripts() + completion_script = scripts["zsh"] + + # Create completion directory + comp_dir = Path.home() / ".zsh" / "completions" + comp_dir.mkdir(parents=True, exist_ok=True) + + try: + completion_file = comp_dir / "_fuzzforge" + completion_file.write_text(completion_script) + print(f"โœ… Zsh completion installed to: {completion_file}") + + # Add fpath to .zshrc if not present + zshrc = Path.home() / ".zshrc" + fpath_line = f'fpath=(~/.zsh/completions $fpath)' + autoload_line = 'autoload -U compinit && compinit' + + if zshrc.exists(): + zshrc_content = zshrc.read_text() + lines_to_add = [] + + if fpath_line not in zshrc_content: + lines_to_add.append(fpath_line) + + if autoload_line not in zshrc_content: + lines_to_add.append(autoload_line) + + if lines_to_add: + with zshrc.open("a") as f: + f.write(f"\n# FuzzForge CLI completion\n") + for line in lines_to_add: + f.write(f"{line}\n") + print("โœ… Added completion setup to ~/.zshrc") + + return True + except Exception as e: + print(f"โŒ Failed to install zsh completion: {e}") + return False + + +def install_fish_completion(): + """Install fish completion.""" + print("๐Ÿ“ Installing fish completion...") + + # Get the manual completion script + scripts = create_manual_completion_scripts() + completion_script = scripts["fish"] + + # Fish completion directory + comp_dir = Path.home() / ".config" / "fish" / "completions" + comp_dir.mkdir(parents=True, exist_ok=True) + + try: + completion_file = comp_dir / "fuzzforge.fish" + completion_file.write_text(completion_script) + print(f"โœ… Fish completion installed to: {completion_file}") + return True + except Exception as e: + print(f"โŒ Failed to install fish completion: {e}") + return False + + +def detect_shell(): + """Detect the current shell.""" + shell_path = os.environ.get('SHELL', '') + if 'bash' in shell_path: + return 'bash' + elif 'zsh' in shell_path: + return 'zsh' + elif 'fish' in shell_path: + return 'fish' + else: + return None + + +def main(): + """Install completion for the current shell or all shells.""" + print("๐Ÿš€ FuzzForge 
CLI Completion Installer")
+    print("=" * 50)
+
+    current_shell = detect_shell()
+    if current_shell:
+        print(f"๐Ÿš Detected shell: {current_shell}")
+
+    # Check for command line arguments
+    if len(sys.argv) > 1 and sys.argv[1] == "--all":
+        install_all = True
+        print("Installing completion for all shells...")
+    else:
+        # Ask user which shells to install (with default to current shell only)
+        if current_shell:
+            install_all = typer.confirm("Install completion for all supported shells (bash, zsh, fish)?", default=False)
+            if not install_all:
+                print(f"Installing completion for {current_shell} only...")
+        else:
+            install_all = typer.confirm("Install completion for all supported shells (bash, zsh, fish)?", default=True)
+
+    success_count = 0
+
+    if install_all or current_shell == 'bash':
+        if install_bash_completion():
+            success_count += 1
+
+    if install_all or current_shell == 'zsh':
+        if install_zsh_completion():
+            success_count += 1
+
+    if install_all or current_shell == 'fish':
+        if install_fish_completion():
+            success_count += 1
+
+    print("\n" + "=" * 50)
+    if success_count > 0:
+        print(f"โœ… Successfully installed completion for {success_count} shell(s)!")
+        print("\n๐Ÿ“‹ To activate completion:")
+        print("  โ€ข Bash: Restart your terminal or run 'source ~/.bashrc'")
+        print("  โ€ข Zsh: Restart your terminal or run 'source ~/.zshrc'")
+        print("  โ€ข Fish: Completion is active immediately")
+        print("\n๐Ÿ’ก Try typing 'fuzzforge <TAB>' to test completion!")
+    else:
+        print("โŒ No completions were installed successfully.")
+        return 1
+
+    return 0
+
+
+if __name__ == "__main__":
+    sys.exit(main())
\ No newline at end of file
diff --git a/cli/main.py b/cli/main.py
new file mode 100644
index 0000000..f51211d
--- /dev/null
+++ b/cli/main.py
@@ -0,0 +1,22 @@
+"""
+FuzzForge CLI - Command-line interface for FuzzForge security testing platform.
+
+This module provides the main entry point for the FuzzForge CLI application.
+"""
+# Copyright (c) 2025 FuzzingLabs
+#
+# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
+# at the root of this repository for details.
+#
+# After the Change Date (four years from publication), this version of the
+# Licensed Work will be made available under the Apache License, Version 2.0.
+# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
+#
+# Additional attribution and requirements are provided in the NOTICE file.
+ + +import typer +from src.fuzzforge_cli.main import app + +if __name__ == "__main__": + app() diff --git a/cli/pyproject.toml b/cli/pyproject.toml new file mode 100644 index 0000000..5204c72 --- /dev/null +++ b/cli/pyproject.toml @@ -0,0 +1,41 @@ +[project] +name = "fuzzforge-cli" +version = "0.6.0" +description = "FuzzForge CLI - Command-line interface for FuzzForge security testing platform" +readme = "README.md" +authors = [ + { name = "Tanguy Duhamel", email = "tduhamel@fuzzinglabs.com" } +] +requires-python = ">=3.11" +dependencies = [ + "typer>=0.12.0", + "rich>=13.0.0", + "pyyaml>=6.0.0", + "pydantic>=2.0.0", + "httpx>=0.27.0", + "websockets>=13.0", + "sseclient-py>=1.8.0", + "fuzzforge-sdk", + "fuzzforge-ai", +] + +[project.optional-dependencies] +dev = [ + "pytest>=8.0.0", + "pytest-asyncio>=0.23.0", + "black>=24.0.0", + "isort>=5.13.0", + "mypy>=1.11.0", +] + +[project.scripts] +fuzzforge = "fuzzforge_cli.main:main" +ff = "fuzzforge_cli.main:main" + +[build-system] +requires = ["uv_build>=0.8.17,<0.9.0"] +build-backend = "uv_build" + +[tool.uv.sources] +fuzzforge-sdk = { path = "../sdk", editable = true } +fuzzforge-ai = { path = "../ai", editable = true } diff --git a/cli/src/fuzzforge_cli/__init__.py b/cli/src/fuzzforge_cli/__init__.py new file mode 100644 index 0000000..9d26c75 --- /dev/null +++ b/cli/src/fuzzforge_cli/__init__.py @@ -0,0 +1,19 @@ +""" +FuzzForge CLI - Command-line interface for FuzzForge security testing platform. + +A comprehensive CLI for managing workflows, runs, findings, and real-time monitoring +with local project management and persistent storage. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +__version__ = "0.6.0" \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/api_validation.py b/cli/src/fuzzforge_cli/api_validation.py new file mode 100644 index 0000000..4174947 --- /dev/null +++ b/cli/src/fuzzforge_cli/api_validation.py @@ -0,0 +1,311 @@ +""" +API response validation and graceful degradation utilities. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
+ + +import logging +from typing import Any, Dict, List, Optional, Union +from pydantic import BaseModel, ValidationError as PydanticValidationError + +from .exceptions import ValidationError, APIConnectionError + +logger = logging.getLogger(__name__) + + +class WorkflowMetadata(BaseModel): + """Expected workflow metadata structure""" + name: str + version: str + author: Optional[str] = None + description: Optional[str] = None + parameters: Dict[str, Any] = {} + supported_volume_modes: List[str] = ["ro", "rw"] + + +class RunStatus(BaseModel): + """Expected run status structure""" + run_id: str + workflow: str + status: str + created_at: str + updated_at: str + + @property + def is_completed(self) -> bool: + """Check if run is in a completed state""" + return self.status.lower() in ["completed", "success", "finished"] + + @property + def is_running(self) -> bool: + """Check if run is currently running""" + return self.status.lower() in ["running", "in_progress", "active"] + + @property + def is_failed(self) -> bool: + """Check if run has failed""" + return self.status.lower() in ["failed", "error", "cancelled"] + + +class FindingsResponse(BaseModel): + """Expected findings response structure""" + run_id: str + sarif: Dict[str, Any] + total_issues: Optional[int] = None + + def model_post_init(self, __context: Any) -> None: + """Validate SARIF structure after initialization""" + if not self.sarif.get("runs"): + logger.warning(f"SARIF data for run {self.run_id} missing 'runs' section") + elif not isinstance(self.sarif["runs"], list): + logger.warning(f"SARIF 'runs' section is not a list for run {self.run_id}") + + +def validate_api_response(response_data: Any, expected_model: type[BaseModel], + operation: str = "API operation") -> BaseModel: + """ + Validate API response against expected Pydantic model. + + Args: + response_data: Raw response data from API + expected_model: Pydantic model class to validate against + operation: Description of the operation for error messages + + Returns: + Validated model instance + + Raises: + ValidationError: If validation fails + """ + try: + return expected_model.model_validate(response_data) + except PydanticValidationError as e: + logger.error(f"API response validation failed for {operation}: {e}") + raise ValidationError( + f"API response for {operation}", + str(response_data)[:200] + "..." if len(str(response_data)) > 200 else str(response_data), + f"valid {expected_model.__name__} format" + ) from e + except Exception as e: + logger.error(f"Unexpected error validating API response for {operation}: {e}") + raise ValidationError( + f"API response for {operation}", + "invalid data", + f"valid {expected_model.__name__} format" + ) from e + + +def validate_sarif_structure(sarif_data: Dict[str, Any]) -> Dict[str, str]: + """ + Validate basic SARIF structure and return validation issues. 
+ + Args: + sarif_data: SARIF data dictionary + + Returns: + Dictionary of validation issues found + """ + issues = {} + + # Check basic SARIF structure + if not isinstance(sarif_data, dict): + issues["structure"] = "SARIF data is not a dictionary" + return issues + + if "runs" not in sarif_data: + issues["runs"] = "Missing 'runs' section in SARIF data" + elif not isinstance(sarif_data["runs"], list): + issues["runs_type"] = "'runs' section is not a list" + elif len(sarif_data["runs"]) == 0: + issues["runs_empty"] = "'runs' section is empty" + else: + # Check first run structure + run = sarif_data["runs"][0] + if not isinstance(run, dict): + issues["run_structure"] = "First run is not a dictionary" + else: + if "results" not in run: + issues["results"] = "Missing 'results' section in run" + elif not isinstance(run["results"], list): + issues["results_type"] = "'results' section is not a list" + + if "tool" not in run: + issues["tool"] = "Missing 'tool' section in run" + elif not isinstance(run["tool"], dict): + issues["tool_type"] = "'tool' section is not a dictionary" + + return issues + + +def safe_extract_sarif_summary(sarif_data: Dict[str, Any]) -> Dict[str, Any]: + """ + Safely extract summary information from SARIF data with fallbacks. + + Args: + sarif_data: SARIF data dictionary + + Returns: + Summary dictionary with safe defaults + """ + summary = { + "total_issues": 0, + "by_severity": {}, + "by_rule": {}, + "tools": [], + "validation_issues": [] + } + + # Validate structure first + validation_issues = validate_sarif_structure(sarif_data) + if validation_issues: + summary["validation_issues"] = list(validation_issues.values()) + logger.warning(f"SARIF validation issues: {validation_issues}") + + try: + runs = sarif_data.get("runs", []) + if not runs: + return summary + + run = runs[0] + results = run.get("results", []) + + summary["total_issues"] = len(results) + + # Count by severity/level + for result in results: + try: + level = result.get("level", "note") + rule_id = result.get("ruleId", "unknown") + + summary["by_severity"][level] = summary["by_severity"].get(level, 0) + 1 + summary["by_rule"][rule_id] = summary["by_rule"].get(rule_id, 0) + 1 + except Exception as e: + logger.warning(f"Failed to process result: {e}") + continue + + # Extract tool information safely + try: + tool = run.get("tool", {}) + driver = tool.get("driver", {}) + if driver.get("name"): + summary["tools"].append({ + "name": driver.get("name", "unknown"), + "version": driver.get("version", "unknown"), + "rules": len(driver.get("rules", [])) + }) + except Exception as e: + logger.warning(f"Failed to extract tool information: {e}") + + except Exception as e: + logger.error(f"Failed to extract SARIF summary: {e}") + summary["validation_issues"].append(f"Summary extraction failed: {e}") + + return summary + + +def validate_workflow_parameters(parameters: Dict[str, Any], + workflow_schema: Dict[str, Any]) -> List[str]: + """ + Validate workflow parameters against schema with detailed error messages. 
+ + Args: + parameters: Parameters to validate + workflow_schema: JSON schema for the workflow + + Returns: + List of validation error messages + """ + errors = [] + + try: + properties = workflow_schema.get("properties", {}) + required = set(workflow_schema.get("required", [])) + + # Check required parameters + missing_required = required - set(parameters.keys()) + if missing_required: + errors.append(f"Missing required parameters: {', '.join(missing_required)}") + + # Validate individual parameters + for param_name, param_value in parameters.items(): + if param_name not in properties: + errors.append(f"Unknown parameter: {param_name}") + continue + + param_schema = properties[param_name] + param_type = param_schema.get("type", "string") + + # Type validation + if param_type == "integer" and not isinstance(param_value, int): + errors.append(f"Parameter '{param_name}' must be an integer") + elif param_type == "number" and not isinstance(param_value, (int, float)): + errors.append(f"Parameter '{param_name}' must be a number") + elif param_type == "boolean" and not isinstance(param_value, bool): + errors.append(f"Parameter '{param_name}' must be a boolean") + elif param_type == "array" and not isinstance(param_value, list): + errors.append(f"Parameter '{param_name}' must be an array") + + # Range validation for numbers + if param_type in ["integer", "number"] and isinstance(param_value, (int, float)): + minimum = param_schema.get("minimum") + maximum = param_schema.get("maximum") + + if minimum is not None and param_value < minimum: + errors.append(f"Parameter '{param_name}' must be >= {minimum}") + if maximum is not None and param_value > maximum: + errors.append(f"Parameter '{param_name}' must be <= {maximum}") + + except Exception as e: + logger.error(f"Parameter validation failed: {e}") + errors.append(f"Parameter validation error: {e}") + + return errors + + +def create_fallback_response(response_type: str, **kwargs) -> Dict[str, Any]: + """ + Create fallback responses when API calls fail. + + Args: + response_type: Type of response to create + **kwargs: Additional data for the fallback + + Returns: + Fallback response dictionary + """ + fallbacks = { + "workflow_list": { + "workflows": [], + "message": "Unable to fetch workflows from API" + }, + "run_status": { + "run_id": kwargs.get("run_id", "unknown"), + "workflow": kwargs.get("workflow", "unknown"), + "status": "unknown", + "created_at": kwargs.get("created_at", "unknown"), + "updated_at": kwargs.get("updated_at", "unknown"), + "message": "Unable to fetch run status from API" + }, + "findings": { + "run_id": kwargs.get("run_id", "unknown"), + "sarif": { + "version": "2.1.0", + "runs": [] + }, + "message": "Unable to fetch findings from API" + } + } + + fallback = fallbacks.get(response_type, {"message": f"No fallback available for {response_type}"}) + logger.info(f"Using fallback response for {response_type}: {fallback.get('message', 'Unknown fallback')}") + + return fallback \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/commands/__init__.py b/cli/src/fuzzforge_cli/commands/__init__.py new file mode 100644 index 0000000..7e53182 --- /dev/null +++ b/cli/src/fuzzforge_cli/commands/__init__.py @@ -0,0 +1,14 @@ +""" +Command modules for FuzzForge CLI. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. 
+# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + diff --git a/cli/src/fuzzforge_cli/commands/ai.py b/cli/src/fuzzforge_cli/commands/ai.py new file mode 100644 index 0000000..c30febd --- /dev/null +++ b/cli/src/fuzzforge_cli/commands/ai.py @@ -0,0 +1,133 @@ +"""AI integration commands for the FuzzForge CLI.""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +from __future__ import annotations + +import asyncio +import os +from datetime import datetime +from typing import Optional + +import typer +from rich.console import Console +from rich.panel import Panel +from rich.table import Table + +from ..config import ProjectConfigManager + +console = Console() +app = typer.Typer(name="ai", help="Interact with the FuzzForge AI system") + + +@app.command("agent") +def ai_agent() -> None: + """Launch the full AI agent CLI with A2A orchestration.""" + console.print("[cyan]๐Ÿค– Opening Project FuzzForge AI Agent session[/cyan]\n") + + try: + from fuzzforge_ai.cli import FuzzForgeCLI + + cli = FuzzForgeCLI() + asyncio.run(cli.run()) + except ImportError as exc: + console.print(f"[red]Failed to import AI CLI:[/red] {exc}") + console.print("[dim]Ensure AI dependencies are installed (pip install -e .)[/dim]") + raise typer.Exit(1) from exc + except Exception as exc: # pragma: no cover - runtime safety + console.print(f"[red]Failed to launch AI agent:[/red] {exc}") + console.print("[dim]Check that .env contains LITELLM_MODEL and API keys[/dim]") + raise typer.Exit(1) from exc + + +# Memory + health commands +@app.command("status") +def ai_status() -> None: + """Show AI system health and configuration.""" + try: + status = asyncio.run(get_ai_status_async()) + except Exception as exc: # pragma: no cover + console.print(f"[red]Failed to get AI status:[/red] {exc}") + raise typer.Exit(1) from exc + + console.print("[bold cyan]๐Ÿค– FuzzForge AI System Status[/bold cyan]\n") + + config_table = Table(title="Configuration", show_header=True, header_style="bold magenta") + config_table.add_column("Setting", style="bold") + config_table.add_column("Value", style="cyan") + config_table.add_column("Status", style="green") + + for key, info in status["config"].items(): + status_icon = "โœ…" if info["configured"] else "โŒ" + display_value = info["value"] if info["value"] else "-" + config_table.add_row(key, display_value, f"{status_icon}") + + console.print(config_table) + console.print() + + components_table = Table(title="AI Components", show_header=True, header_style="bold magenta") + components_table.add_column("Component", style="bold") + components_table.add_column("Status", style="green") + components_table.add_column("Details", style="dim") + + for component, info in status["components"].items(): + status_icon = "๐ŸŸข" if info["available"] else "๐Ÿ”ด" + components_table.add_row(component, status_icon, info["details"]) + + 
console.print(components_table) + + if status["agents"]: + console.print() + console.print(f"[bold green]โœ“[/bold green] {len(status['agents'])} agents registered") + + +@app.command("server") +def ai_server( + port: int = typer.Option(10100, "--port", "-p", help="Server port (default: 10100)"), +) -> None: + """Start AI system as an A2A server.""" + console.print(f"[cyan]๐Ÿš€ Starting FuzzForge AI Server on port {port}[/cyan]") + console.print("[dim]Other agents can register this instance at the A2A endpoint[/dim]\n") + + try: + os.environ["FUZZFORGE_PORT"] = str(port) + from fuzzforge_ai.__main__ import main as start_server + + start_server() + except Exception as exc: # pragma: no cover + console.print(f"[red]Failed to start AI server:[/red] {exc}") + raise typer.Exit(1) from exc + + +# --------------------------------------------------------------------------- +# Helper functions (largely adapted from the OSS implementation) +# --------------------------------------------------------------------------- + + +@app.callback(invoke_without_command=True) +def ai_callback(ctx: typer.Context): + """ + ๐Ÿค– AI integration features + """ + # Check if a subcommand is being invoked + if ctx.invoked_subcommand is not None: + # Let the subcommand handle it + return + + # Show not implemented message for default command + console.print("๐Ÿšง [yellow]AI command is not fully implemented yet.[/yellow]") + console.print("Please use specific subcommands:") + console.print(" โ€ข [cyan]ff ai agent[/cyan] - Launch the full AI agent CLI") + console.print(" โ€ข [cyan]ff ai status[/cyan] - Show AI system health and configuration") + console.print(" โ€ข [cyan]ff ai server[/cyan] - Start AI system as an A2A server") + + diff --git a/cli/src/fuzzforge_cli/commands/config.py b/cli/src/fuzzforge_cli/commands/config.py new file mode 100644 index 0000000..3af160b --- /dev/null +++ b/cli/src/fuzzforge_cli/commands/config.py @@ -0,0 +1,384 @@ +""" +Configuration management commands. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
+ + +import typer +from pathlib import Path +from rich.console import Console +from rich.table import Table +from rich.panel import Panel +from rich.prompt import Prompt, Confirm +from rich import box +from typing import Optional + +from ..config import ( + get_project_config, + ensure_project_config, + get_global_config, + save_global_config, + FuzzForgeConfig +) +from ..exceptions import require_project, ValidationError, handle_error + +console = Console() +app = typer.Typer() + + +@app.command("show") +def show_config( + global_config: bool = typer.Option( + False, "--global", "-g", + help="Show global configuration instead of project config" + ) +): + """ + ๐Ÿ“‹ Display current configuration settings + """ + if global_config: + config = get_global_config() + config_type = "Global" + config_path = Path.home() / ".config" / "fuzzforge" / "config.yaml" + else: + try: + require_project() + config = get_project_config() + if not config: + raise ValidationError("project configuration", "missing", "initialized project") + except Exception as e: + handle_error(e, "loading project configuration") + return # Unreachable, but makes static analysis happy + config_type = "Project" + config_path = Path.cwd() / ".fuzzforge" / "config.yaml" + + console.print(f"\nโš™๏ธ [bold]{config_type} Configuration[/bold]\n") + + # Project settings + project_table = Table(show_header=False, box=box.SIMPLE) + project_table.add_column("Setting", style="bold cyan") + project_table.add_column("Value") + + project_table.add_row("Project Name", config.project.name) + project_table.add_row("API URL", config.project.api_url) + project_table.add_row("Default Timeout", f"{config.project.default_timeout}s") + if config.project.default_workflow: + project_table.add_row("Default Workflow", config.project.default_workflow) + + console.print( + Panel.fit( + project_table, + title="๐Ÿ“ Project Settings", + box=box.ROUNDED + ) + ) + + # Retention settings + retention_table = Table(show_header=False, box=box.SIMPLE) + retention_table.add_column("Setting", style="bold cyan") + retention_table.add_column("Value") + + retention_table.add_row("Max Runs", str(config.retention.max_runs)) + retention_table.add_row("Keep Findings (days)", str(config.retention.keep_findings_days)) + + console.print( + Panel.fit( + retention_table, + title="๐Ÿ—„๏ธ Data Retention", + box=box.ROUNDED + ) + ) + + # Preferences + prefs_table = Table(show_header=False, box=box.SIMPLE) + prefs_table.add_column("Setting", style="bold cyan") + prefs_table.add_column("Value") + + prefs_table.add_row("Auto Save Findings", "โœ… Yes" if config.preferences.auto_save_findings else "โŒ No") + prefs_table.add_row("Show Progress Bars", "โœ… Yes" if config.preferences.show_progress_bars else "โŒ No") + prefs_table.add_row("Table Style", config.preferences.table_style) + prefs_table.add_row("Color Output", "โœ… Yes" if config.preferences.color_output else "โŒ No") + + console.print( + Panel.fit( + prefs_table, + title="๐ŸŽจ Preferences", + box=box.ROUNDED + ) + ) + + console.print(f"\n๐Ÿ“ Config file: [dim]{config_path}[/dim]") + + +@app.command("set") +def set_config( + key: str = typer.Argument(..., help="Configuration key to set (e.g., 'project.name', 'project.api_url')"), + value: str = typer.Argument(..., help="Value to set"), + global_config: bool = typer.Option( + False, "--global", "-g", + help="Set in global configuration instead of project config" + ) +): + """ + โš™๏ธ Set a configuration value + """ + if global_config: + config = get_global_config() + 
config_type = "global" + else: + config = get_project_config() + if not config: + console.print("โŒ No project configuration found. Run 'ff init' first.", style="red") + raise typer.Exit(1) + config_type = "project" + + # Parse the key path + key_parts = key.split('.') + if len(key_parts) != 2: + console.print("โŒ Key must be in format 'section.setting' (e.g., 'project.name')", style="red") + raise typer.Exit(1) + + section, setting = key_parts + + try: + # Update configuration + if section == "project": + if setting == "name": + config.project.name = value + elif setting == "api_url": + config.project.api_url = value + elif setting == "default_timeout": + config.project.default_timeout = int(value) + elif setting == "default_workflow": + config.project.default_workflow = value if value.lower() != "none" else None + else: + console.print(f"โŒ Unknown project setting: {setting}", style="red") + raise typer.Exit(1) + + elif section == "retention": + if setting == "max_runs": + config.retention.max_runs = int(value) + elif setting == "keep_findings_days": + config.retention.keep_findings_days = int(value) + else: + console.print(f"โŒ Unknown retention setting: {setting}", style="red") + raise typer.Exit(1) + + elif section == "preferences": + if setting == "auto_save_findings": + config.preferences.auto_save_findings = value.lower() in ("true", "yes", "1", "on") + elif setting == "show_progress_bars": + config.preferences.show_progress_bars = value.lower() in ("true", "yes", "1", "on") + elif setting == "table_style": + config.preferences.table_style = value + elif setting == "color_output": + config.preferences.color_output = value.lower() in ("true", "yes", "1", "on") + else: + console.print(f"โŒ Unknown preferences setting: {setting}", style="red") + raise typer.Exit(1) + + else: + console.print(f"โŒ Unknown configuration section: {section}", style="red") + console.print("Valid sections: project, retention, preferences", style="dim") + raise typer.Exit(1) + + # Save configuration + if global_config: + save_global_config(config) + else: + config_path = Path.cwd() / ".fuzzforge" / "config.yaml" + config.save_to_file(config_path) + + console.print(f"โœ… Set {config_type} configuration: [bold cyan]{key}[/bold cyan] = [bold]{value}[/bold]", style="green") + + except ValueError as e: + console.print(f"โŒ Invalid value for {key}: {e}", style="red") + raise typer.Exit(1) + except Exception as e: + console.print(f"โŒ Failed to set configuration: {e}", style="red") + raise typer.Exit(1) + + +@app.command("get") +def get_config( + key: str = typer.Argument(..., help="Configuration key to get (e.g., 'project.name')"), + global_config: bool = typer.Option( + False, "--global", "-g", + help="Get from global configuration instead of project config" + ) +): + """ + ๐Ÿ“– Get a specific configuration value + """ + if global_config: + config = get_global_config() + else: + config = get_project_config() + if not config: + console.print("โŒ No project configuration found. 
Run 'ff init' first.", style="red") + raise typer.Exit(1) + + # Parse the key path + key_parts = key.split('.') + if len(key_parts) != 2: + console.print("โŒ Key must be in format 'section.setting' (e.g., 'project.name')", style="red") + raise typer.Exit(1) + + section, setting = key_parts + + try: + # Get configuration value + if section == "project": + if setting == "name": + value = config.project.name + elif setting == "api_url": + value = config.project.api_url + elif setting == "default_timeout": + value = config.project.default_timeout + elif setting == "default_workflow": + value = config.project.default_workflow or "none" + else: + console.print(f"โŒ Unknown project setting: {setting}", style="red") + raise typer.Exit(1) + + elif section == "retention": + if setting == "max_runs": + value = config.retention.max_runs + elif setting == "keep_findings_days": + value = config.retention.keep_findings_days + else: + console.print(f"โŒ Unknown retention setting: {setting}", style="red") + raise typer.Exit(1) + + elif section == "preferences": + if setting == "auto_save_findings": + value = config.preferences.auto_save_findings + elif setting == "show_progress_bars": + value = config.preferences.show_progress_bars + elif setting == "table_style": + value = config.preferences.table_style + elif setting == "color_output": + value = config.preferences.color_output + else: + console.print(f"โŒ Unknown preferences setting: {setting}", style="red") + raise typer.Exit(1) + + else: + console.print(f"โŒ Unknown configuration section: {section}", style="red") + raise typer.Exit(1) + + console.print(f"{key}: [bold cyan]{value}[/bold cyan]") + + except Exception as e: + console.print(f"โŒ Failed to get configuration: {e}", style="red") + raise typer.Exit(1) + + +@app.command("reset") +def reset_config( + global_config: bool = typer.Option( + False, "--global", "-g", + help="Reset global configuration instead of project config" + ), + force: bool = typer.Option( + False, "--force", "-f", + help="Skip confirmation prompt" + ) +): + """ + ๐Ÿ”„ Reset configuration to defaults + """ + config_type = "global" if global_config else "project" + + if not force: + if not Confirm.ask(f"Reset {config_type} configuration to defaults?", default=False, console=console): + console.print("โŒ Reset cancelled", style="yellow") + raise typer.Exit(0) + + try: + # Create new default configuration + new_config = FuzzForgeConfig() + + if global_config: + save_global_config(new_config) + else: + if not Path.cwd().joinpath(".fuzzforge").exists(): + console.print("โŒ No project configuration found. Run 'ff init' first.", style="red") + raise typer.Exit(1) + + config_path = Path.cwd() / ".fuzzforge" / "config.yaml" + new_config.save_to_file(config_path) + + console.print(f"โœ… {config_type.title()} configuration reset to defaults", style="green") + + except Exception as e: + console.print(f"โŒ Failed to reset configuration: {e}", style="red") + raise typer.Exit(1) + + +@app.command("edit") +def edit_config( + global_config: bool = typer.Option( + False, "--global", "-g", + help="Edit global configuration instead of project config" + ) +): + """ + ๐Ÿ“ Open configuration file in default editor + """ + import os + import subprocess + + if global_config: + config_path = Path.home() / ".config" / "fuzzforge" / "config.yaml" + config_type = "global" + else: + config_path = Path.cwd() / ".fuzzforge" / "config.yaml" + config_type = "project" + + if not config_path.exists(): + console.print("โŒ No project configuration found. 
Run 'ff init' first.", style="red") + raise typer.Exit(1) + + # Try to find a suitable editor + editors = ["code", "vim", "nano", "notepad"] + editor = None + + for e in editors: + try: + subprocess.run([e, "--version"], capture_output=True, check=True) + editor = e + break + except (subprocess.CalledProcessError, FileNotFoundError): + continue + + if not editor: + console.print(f"๐Ÿ“ Configuration file: [bold cyan]{config_path}[/bold cyan]") + console.print("โŒ No suitable editor found. Please edit the file manually.", style="red") + raise typer.Exit(1) + + try: + console.print(f"๐Ÿ“ Opening {config_type} configuration in {editor}...") + subprocess.run([editor, str(config_path)], check=True) + console.print(f"โœ… Configuration file edited", style="green") + + except subprocess.CalledProcessError as e: + console.print(f"โŒ Failed to open editor: {e}", style="red") + raise typer.Exit(1) + + +@app.callback() +def config_callback(): + """ + โš™๏ธ Manage configuration settings + """ + pass diff --git a/cli/src/fuzzforge_cli/commands/findings.py b/cli/src/fuzzforge_cli/commands/findings.py new file mode 100644 index 0000000..c4ceff8 --- /dev/null +++ b/cli/src/fuzzforge_cli/commands/findings.py @@ -0,0 +1,940 @@ +""" +Findings and security results management commands. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import json +import csv +from datetime import datetime +from pathlib import Path +from typing import Optional, Dict, Any, List + +import typer +from rich.console import Console +from rich.table import Table, Column +from rich.panel import Panel +from rich.syntax import Syntax +from rich.tree import Tree +from rich.text import Text +from rich import box + +from ..config import get_project_config, FuzzForgeConfig +from ..database import get_project_db, ensure_project_db, FindingRecord +from ..exceptions import ( + handle_error, retry_on_network_error, validate_run_id, + require_project, ValidationError, DatabaseError +) +from fuzzforge_sdk import FuzzForgeClient + +console = Console() +app = typer.Typer() + + +@retry_on_network_error(max_retries=3, delay=1.0) +def get_client() -> FuzzForgeClient: + """Get configured FuzzForge client with retry on network errors""" + config = get_project_config() or FuzzForgeConfig() + return FuzzForgeClient(base_url=config.get_api_url(), timeout=config.get_timeout()) + + +def severity_style(severity: str) -> str: + """Get rich style for severity level""" + return { + "error": "bold red", + "warning": "bold yellow", + "note": "bold blue", + "info": "bold cyan" + }.get(severity.lower(), "white") + + +@app.command("get") +def get_findings( + run_id: str = typer.Argument(..., help="Run ID to get findings for"), + save: bool = typer.Option( + True, "--save/--no-save", + help="Save findings to local database" + ), + format: str = typer.Option( + "table", "--format", "-f", + help="Output format: table, json, sarif" + ) +): + """ + ๐Ÿ” Retrieve and display security findings for a run + """ + try: + require_project() + validate_run_id(run_id) + + if format not in ["table", "json", "sarif"]: + raise ValidationError("format", 
format, "one of: table, json, sarif") + with get_client() as client: + console.print(f"๐Ÿ” Fetching findings for run: {run_id}") + findings = client.get_run_findings(run_id) + + # Save to database if requested + if save: + try: + db = ensure_project_db() + + # Extract summary from SARIF + sarif_data = findings.sarif + runs_data = sarif_data.get("runs", []) + summary = {} + + if runs_data: + results = runs_data[0].get("results", []) + summary = { + "total_issues": len(results), + "by_severity": {}, + "by_rule": {}, + "tools": [] + } + + for result in results: + level = result.get("level", "note") + rule_id = result.get("ruleId", "unknown") + + summary["by_severity"][level] = summary["by_severity"].get(level, 0) + 1 + summary["by_rule"][rule_id] = summary["by_rule"].get(rule_id, 0) + 1 + + # Extract tool info + tool = runs_data[0].get("tool", {}) + driver = tool.get("driver", {}) + if driver.get("name"): + summary["tools"].append({ + "name": driver.get("name"), + "version": driver.get("version"), + "rules": len(driver.get("rules", [])) + }) + + finding_record = FindingRecord( + run_id=run_id, + sarif_data=sarif_data, + summary=summary, + created_at=datetime.now() + ) + db.save_findings(finding_record) + console.print("โœ… Findings saved to local database", style="green") + except Exception as e: + console.print(f"โš ๏ธ Failed to save findings to database: {e}", style="yellow") + + # Display findings + if format == "json": + findings_json = json.dumps(findings.sarif, indent=2) + console.print(Syntax(findings_json, "json", theme="monokai")) + + elif format == "sarif": + sarif_json = json.dumps(findings.sarif, indent=2) + console.print(sarif_json) + + else: # table format + display_findings_table(findings.sarif) + + except Exception as e: + console.print(f"โŒ Failed to get findings: {e}", style="red") + raise typer.Exit(1) + + +def display_findings_table(sarif_data: Dict[str, Any]): + """Display SARIF findings in a rich table format""" + runs = sarif_data.get("runs", []) + if not runs: + console.print("โ„น๏ธ No findings data available", style="dim") + return + + run_data = runs[0] + results = run_data.get("results", []) + tool = run_data.get("tool", {}) + driver = tool.get("driver", {}) + + # Tool information + console.print(f"\n๐Ÿ” [bold]Security Analysis Results[/bold]") + if driver.get("name"): + console.print(f"Tool: {driver.get('name')} v{driver.get('version', 'unknown')}") + + if not results: + console.print("โœ… No security issues found!", style="green") + return + + # Summary statistics + summary_by_level = {} + for result in results: + level = result.get("level", "note") + summary_by_level[level] = summary_by_level.get(level, 0) + 1 + + summary_table = Table(show_header=False, box=box.SIMPLE) + summary_table.add_column("Severity", width=15, justify="left", style="bold") + summary_table.add_column("Count", width=8, justify="right", style="bold") + + for level, count in sorted(summary_by_level.items()): + # Create Rich Text object with color styling + level_text = level.upper() + severity_text = Text(level_text, style=severity_style(level)) + count_text = Text(str(count)) + + summary_table.add_row(severity_text, count_text) + + console.print( + Panel.fit( + summary_table, + title=f"๐Ÿ“Š Summary ({len(results)} total issues)", + box=box.ROUNDED + ) + ) + + # Detailed results - Rich Text-based table with proper emoji alignment + results_table = Table(box=box.ROUNDED) + results_table.add_column("Severity", width=12, justify="left", no_wrap=True) + results_table.add_column("Rule", 
width=25, justify="left", style="bold cyan", no_wrap=True) + results_table.add_column("Message", width=55, justify="left", no_wrap=True) + results_table.add_column("Location", width=20, justify="left", style="dim", no_wrap=True) + + for result in results[:50]: # Limit to first 50 results + level = result.get("level", "note") + rule_id = result.get("ruleId", "unknown") + message = result.get("message", {}).get("text", "No message") + + # Extract location information + locations = result.get("locations", []) + location_str = "" + if locations: + physical_location = locations[0].get("physicalLocation", {}) + artifact_location = physical_location.get("artifactLocation", {}) + region = physical_location.get("region", {}) + + file_path = artifact_location.get("uri", "") + if file_path: + location_str = Path(file_path).name + if region.get("startLine"): + location_str += f":{region['startLine']}" + if region.get("startColumn"): + location_str += f":{region['startColumn']}" + + # Create Rich Text objects with color styling + severity_text = Text(level.upper(), style=severity_style(level)) + severity_text.truncate(12, overflow="ellipsis") + + rule_text = Text(rule_id) + rule_text.truncate(25, overflow="ellipsis") + + message_text = Text(message) + message_text.truncate(55, overflow="ellipsis") + + location_text = Text(location_str) + location_text.truncate(20, overflow="ellipsis") + + results_table.add_row( + severity_text, + rule_text, + message_text, + location_text + ) + + console.print(f"\n๐Ÿ“‹ [bold]Detailed Results[/bold]") + if len(results) > 50: + console.print(f"Showing first 50 of {len(results)} results") + console.print() + console.print(results_table) + + +@app.command("history") +def findings_history( + limit: int = typer.Option(20, "--limit", "-l", help="Maximum number of findings to show") +): + """ + ๐Ÿ“š Show findings history from local database + """ + db = get_project_db() + if not db: + console.print("โŒ No FuzzForge project found. Run 'ff init' first.", style="red") + raise typer.Exit(1) + + try: + findings = db.list_findings(limit=limit) + + if not findings: + console.print("โŒ No findings found in database", style="red") + return + + table = Table(box=box.ROUNDED) + table.add_column("Run ID", style="bold cyan", width=36) # Full UUID width + table.add_column("Date", justify="center") + table.add_column("Total Issues", justify="center", style="bold") + table.add_column("Errors", justify="center", style="red") + table.add_column("Warnings", justify="center", style="yellow") + table.add_column("Notes", justify="center", style="blue") + table.add_column("Tools", style="dim") + + for finding in findings: + summary = finding.summary + total_issues = summary.get("total_issues", 0) + by_severity = summary.get("by_severity", {}) + tools = summary.get("tools", []) + + tool_names = ", ".join([tool.get("name", "Unknown") for tool in tools]) + + table.add_row( + finding.run_id, # Show full Run ID + finding.created_at.strftime("%m-%d %H:%M"), + str(total_issues), + str(by_severity.get("error", 0)), + str(by_severity.get("warning", 0)), + str(by_severity.get("note", 0)), + tool_names[:30] + "..." 
if len(tool_names) > 30 else tool_names
+            )
+
+        console.print(f"\n๐Ÿ“š [bold]Findings History ({len(findings)})[/bold]\n")
+        console.print(table)
+
+        console.print(f"\n๐Ÿ’ก Use [bold cyan]fuzzforge finding <run-id>[/bold cyan] to view detailed findings")
+
+    except Exception as e:
+        console.print(f"โŒ Failed to get findings history: {e}", style="red")
+        raise typer.Exit(1)
+
+
+@app.command("export")
+def export_findings(
+    run_id: str = typer.Argument(..., help="Run ID to export findings for"),
+    format: str = typer.Option(
+        "json", "--format", "-f",
+        help="Export format: json, csv, html, sarif"
+    ),
+    output: Optional[str] = typer.Option(
+        None, "--output", "-o",
+        help="Output file path (defaults to findings-<run-id>.<format>)"
+    )
+):
+    """
+    ๐Ÿ“ค Export security findings in various formats
+    """
+    db = get_project_db()
+    if not db:
+        console.print("โŒ No FuzzForge project found. Run 'ff init' first.", style="red")
+        raise typer.Exit(1)
+
+    try:
+        # Get findings from database first, fallback to API
+        findings_data = db.get_findings(run_id)
+        if not findings_data:
+            console.print(f"๐Ÿ“ก Fetching findings from API for run: {run_id}")
+            with get_client() as client:
+                findings = client.get_run_findings(run_id)
+                sarif_data = findings.sarif
+        else:
+            sarif_data = findings_data.sarif_data
+
+        # Generate output filename
+        if not output:
+            output = f"findings-{run_id[:8]}.{format}"
+
+        output_path = Path(output)
+
+        # Export based on format
+        if format == "sarif":
+            with open(output_path, 'w') as f:
+                json.dump(sarif_data, f, indent=2)
+
+        elif format == "json":
+            # Simplified JSON format
+            simplified_data = extract_simplified_findings(sarif_data)
+            with open(output_path, 'w') as f:
+                json.dump(simplified_data, f, indent=2)
+
+        elif format == "csv":
+            export_to_csv(sarif_data, output_path)
+
+        elif format == "html":
+            export_to_html(sarif_data, output_path, run_id)
+
+        else:
+            console.print(f"โŒ Unsupported format: {format}", style="red")
+            raise typer.Exit(1)
+
+        console.print(f"โœ… Findings exported to: [bold cyan]{output_path}[/bold cyan]")
+
+    except Exception as e:
+        console.print(f"โŒ Failed to export findings: {e}", style="red")
+        raise typer.Exit(1)
+
+
+def extract_simplified_findings(sarif_data: Dict[str, Any]) -> Dict[str, Any]:
+    """Extract simplified findings structure from SARIF"""
+    runs = sarif_data.get("runs", [])
+    if not runs:
+        return {"findings": [], "summary": {}}
+
+    run_data = runs[0]
+    results = run_data.get("results", [])
+    tool = run_data.get("tool", {}).get("driver", {})
+
+    simplified = {
+        "tool": {
+            "name": tool.get("name", "Unknown"),
+            "version": tool.get("version", "Unknown")
+        },
+        "summary": {
+            "total_issues": len(results),
+            "by_severity": {}
+        },
+        "findings": []
+    }
+
+    for result in results:
+        level = result.get("level", "note")
+        simplified["summary"]["by_severity"][level] = simplified["summary"]["by_severity"].get(level, 0) + 1
+
+        # Extract location
+        location_info = {}
+        locations = result.get("locations", [])
+        if locations:
+            physical_location = locations[0].get("physicalLocation", {})
+            artifact_location = physical_location.get("artifactLocation", {})
+            region = physical_location.get("region", {})
+
+            location_info = {
+                "file": artifact_location.get("uri", ""),
+                "line": region.get("startLine"),
+                "column": region.get("startColumn")
+            }
+
+        simplified["findings"].append({
+            "rule_id": result.get("ruleId", "unknown"),
+            "severity": level,
+            "message": result.get("message", {}).get("text", ""),
+            "location": location_info
+        })
+
+    return simplified
+
+
+def export_to_csv(sarif_data: Dict[str, Any], output_path: Path):
+    """Export findings to CSV format"""
+    runs = sarif_data.get("runs", [])
+    if not runs:
+        return
+
+    results = runs[0].get("results", [])
+
+    with open(output_path, 'w', newline='', encoding='utf-8') as csvfile:
+        fieldnames = ['rule_id', 'severity', 'message', 'file', 'line', 'column']
+        writer = csv.DictWriter(csvfile, fieldnames=fieldnames)
+        writer.writeheader()
+
+        for result in results:
+            location_info = {"file": "", "line": "", "column": ""}
+            locations = result.get("locations", [])
+            if locations:
+                physical_location = locations[0].get("physicalLocation", {})
+                artifact_location = physical_location.get("artifactLocation", {})
+                region = physical_location.get("region", {})
+
+                location_info = {
+                    "file": artifact_location.get("uri", ""),
+                    "line": region.get("startLine", ""),
+                    "column": region.get("startColumn", "")
+                }
+
+            writer.writerow({
+                "rule_id": result.get("ruleId", ""),
+                "severity": result.get("level", "note"),
+                "message": result.get("message", {}).get("text", ""),
+                **location_info
+            })
+
+
+def export_to_html(sarif_data: Dict[str, Any], output_path: Path, run_id: str):
+    """Export findings to HTML format"""
+    runs = sarif_data.get("runs", [])
+    if not runs:
+        return
+
+    run_data = runs[0]
+    results = run_data.get("results", [])
+    tool = run_data.get("tool", {}).get("driver", {})
+
+    # Simple HTML template
+    html_content = f"""<!DOCTYPE html>
+<html>
+<head>
+    <title>Security Findings - {run_id}</title>
+</head>
+<body>
+    <h1>Security Findings Report</h1>
+    <p>Run ID: {run_id}</p>
+    <p>Tool: {tool.get('name', 'Unknown')} v{tool.get('version', 'Unknown')}</p>
+    <p>Generated: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}</p>
+
+    <h2>Summary</h2>
+    <p>Total Issues: {len(results)}</p>
+
+    <h2>Detailed Findings</h2>
+    <table>
+        <tr>
+            <th>Rule ID</th>
+            <th>Severity</th>
+            <th>Message</th>
+            <th>Location</th>
+        </tr>
+"""
+
+    for result in results:
+        level = result.get("level", "note")
+        rule_id = result.get("ruleId", "unknown")
+        message = result.get("message", {}).get("text", "")
+
+        # Extract location
+        location_str = ""
+        locations = result.get("locations", [])
+        if locations:
+            physical_location = locations[0].get("physicalLocation", {})
+            artifact_location = physical_location.get("artifactLocation", {})
+            region = physical_location.get("region", {})
+
+            file_path = artifact_location.get("uri", "")
+            if file_path:
+                location_str = file_path
+                if region.get("startLine"):
+                    location_str += f":{region['startLine']}"
+
+        html_content += f"""        <tr>
+            <td>{rule_id}</td>
+            <td>{level}</td>
+            <td>{message}</td>
+            <td>{location_str}</td>
+        </tr>
+"""
+
+    html_content += """    </table>
+</body>
+</html>
+"""
+
+    with open(output_path, 'w', encoding='utf-8') as f:
+        f.write(html_content)
+
+
+@app.command("all")
+def all_findings(
+    workflow: Optional[str] = typer.Option(
+        None, "--workflow", "-w",
+        help="Filter by workflow name"
+    ),
+    severity: Optional[str] = typer.Option(
+        None, "--severity", "-s",
+        help="Filter by severity levels (comma-separated: error,warning,note,info)"
+    ),
+    since: Optional[str] = typer.Option(
+        None, "--since",
+        help="Show findings since date (YYYY-MM-DD)"
+    ),
+    limit: Optional[int] = typer.Option(
+        None, "--limit", "-l",
+        help="Maximum number of findings to show"
+    ),
+    export_format: Optional[str] = typer.Option(
+        None, "--export", "-e",
+        help="Export format: json, csv, html"
+    ),
+    output: Optional[str] = typer.Option(
+        None, "--output", "-o",
+        help="Output file for export"
+    ),
+    stats_only: bool = typer.Option(
+        False, "--stats",
+        help="Show statistics only"
+    ),
+    show_findings: bool = typer.Option(
+        False, "--show-findings", "-f",
+        help="Show actual findings content, not just summary"
+    ),
+    max_findings: int = typer.Option(
+        50, "--max-findings",
+        help="Maximum number of individual findings to display"
+    )
+):
+    """
+    ๐Ÿ“Š Show all findings for the entire project
+    """
+    db = get_project_db()
+    if not db:
+        console.print("โŒ No FuzzForge project found. Run 'ff init' first.", style="red")
+        raise typer.Exit(1)
+
+    try:
+        # Parse filters
+        severity_list = None
+        if severity:
+            severity_list = [s.strip().lower() for s in severity.split(",")]
+
+        since_date = None
+        if since:
+            try:
+                since_date = datetime.strptime(since, "%Y-%m-%d")
+            except ValueError:
+                console.print(f"โŒ Invalid date format: {since}. Use YYYY-MM-DD", style="red")
+                raise typer.Exit(1)
+
+        # Get aggregated stats
+        stats = db.get_aggregated_stats()
+
+        # Show statistics
+        if stats_only or not export_format:
+            # Create summary panel
+            summary_text = f"""[bold]๐Ÿ“Š Project Security Summary[/bold]
+
+[cyan]Total Findings Records:[/cyan] {stats['total_findings_records']}
+[cyan]Total Runs Analyzed:[/cyan] {stats['total_runs']}
+[cyan]Total Security Issues:[/cyan] {stats['total_issues']}
+[cyan]Recent Findings (7 days):[/cyan] {stats['recent_findings']}
+
+[bold]Severity Distribution:[/bold]
+  ๐Ÿ”ด Errors: {stats['severity_distribution'].get('error', 0)}
+  ๐ŸŸก Warnings: {stats['severity_distribution'].get('warning', 0)}
+  ๐Ÿ”ต Notes: {stats['severity_distribution'].get('note', 0)}
+  โ„น๏ธ Info: {stats['severity_distribution'].get('info', 0)}
+
+[bold]By Workflow:[/bold]"""
+
+            for wf_name, count in stats['workflows'].items():
+                summary_text += f"\n  โ€ข {wf_name}: {count} findings"
+
+            console.print(Panel(summary_text, box=box.ROUNDED, title="FuzzForge Project Analysis", border_style="cyan"))
+
+            if stats_only:
+                return
+
+        # Get all findings with filters
+        findings = db.get_all_findings(
+            workflow=workflow,
+            severity=severity_list,
+            since_date=since_date,
+            limit=limit
+        )
+
+        if not findings:
+            console.print("โ„น๏ธ No findings match the specified filters", style="dim")
+            return
+
+        # Export if requested
+        if export_format:
+            if not output:
+                timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
+                output = f"all_findings_{timestamp}.{export_format}"
+
+            export_all_findings(findings, export_format, output)
+            console.print(f"โœ… Exported {len(findings)} findings to: {output}", style="green")
+            return
+
+        # Display findings table
+        table = Table(box=box.ROUNDED, title=f"All Project Findings ({len(findings)} records)")
+        table.add_column("Run ID", style="bold cyan", width=36)  # Full UUID width
width=36) # Full UUID width + table.add_column("Workflow", style="dim", width=20) + table.add_column("Date", justify="center") + table.add_column("Issues", justify="center", style="bold") + table.add_column("Errors", justify="center", style="red") + table.add_column("Warnings", justify="center", style="yellow") + table.add_column("Notes", justify="center", style="blue") + + # Get run info for each finding + runs_info = {} + for finding in findings: + run_id = finding.run_id + if run_id not in runs_info: + run_info = db.get_run(run_id) + runs_info[run_id] = run_info + + for finding in findings: + run_id = finding.run_id + run_info = runs_info.get(run_id) + workflow_name = run_info.workflow if run_info else "unknown" + + summary = finding.summary + total_issues = summary.get("total_issues", 0) + by_severity = summary.get("by_severity", {}) + + # Count issues from SARIF data if summary is incomplete + if total_issues == 0 and "runs" in finding.sarif_data: + for run in finding.sarif_data["runs"]: + total_issues += len(run.get("results", [])) + + table.add_row( + run_id, # Show full Run ID + workflow_name[:17] + "..." if len(workflow_name) > 20 else workflow_name, + finding.created_at.strftime("%Y-%m-%d %H:%M"), + str(total_issues), + str(by_severity.get("error", 0)), + str(by_severity.get("warning", 0)), + str(by_severity.get("note", 0)) + ) + + console.print(table) + + # Show actual findings if requested + if show_findings: + display_detailed_findings(findings, max_findings) + + console.print(f"\n๐Ÿ’ก Use filters to refine results: --workflow, --severity, --since") + console.print(f"๐Ÿ’ก Show findings content: --show-findings") + console.print(f"๐Ÿ’ก Export findings: --export json --output report.json") + console.print(f"๐Ÿ’ก View specific findings: [bold cyan]fuzzforge finding [/bold cyan]") + + except Exception as e: + console.print(f"โŒ Failed to get all findings: {e}", style="red") + raise typer.Exit(1) + + +def display_detailed_findings(findings: List[FindingRecord], max_findings: int): + """Display detailed findings content""" + console.print(f"\n๐Ÿ“‹ [bold]Detailed Findings Content[/bold] (showing up to {max_findings} findings)\n") + + findings_count = 0 + + for finding_record in findings: + if findings_count >= max_findings: + remaining = sum(len(run.get("results", [])) + for f in findings[findings.index(finding_record):] + for run in f.sarif_data.get("runs", [])) + if remaining > 0: + console.print(f"\n... 
and {remaining} more findings (use --max-findings to show more)") + break + + # Get run info for this finding + sarif_data = finding_record.sarif_data + if not sarif_data or "runs" not in sarif_data: + continue + + for run in sarif_data["runs"]: + tool = run.get("tool", {}) + driver = tool.get("driver", {}) + tool_name = driver.get("name", "Unknown Tool") + + results = run.get("results", []) + if not results: + continue + + # Group results by severity + for result in results: + if findings_count >= max_findings: + break + + findings_count += 1 + + # Extract key information + rule_id = result.get("ruleId", "unknown") + level = result.get("level", "note").upper() + message_text = result.get("message", {}).get("text", "No description") + + # Get location information + locations = result.get("locations", []) + location_str = "Unknown location" + if locations: + physical = locations[0].get("physicalLocation", {}) + artifact = physical.get("artifactLocation", {}) + region = physical.get("region", {}) + + file_path = artifact.get("uri", "") + line_number = region.get("startLine", "") + + if file_path: + location_str = f"{file_path}" + if line_number: + location_str += f":{line_number}" + + # Get severity style + severity_style = { + "ERROR": "bold red", + "WARNING": "bold yellow", + "NOTE": "bold blue", + "INFO": "bold cyan" + }.get(level, "white") + + # Create finding panel + finding_content = f"""[bold]Rule:[/bold] {rule_id} +[bold]Location:[/bold] {location_str} +[bold]Tool:[/bold] {tool_name} +[bold]Run:[/bold] {finding_record.run_id[:12]}... + +[bold]Description:[/bold] +{message_text}""" + + # Add code context if available + region = locations[0].get("physicalLocation", {}).get("region", {}) if locations else {} + if region.get("snippet", {}).get("text"): + code_snippet = region["snippet"]["text"].strip() + finding_content += f"\n\n[bold]Code:[/bold]\n[dim]{code_snippet}[/dim]" + + console.print(Panel( + finding_content, + title=f"[{severity_style}]{level}[/{severity_style}] Finding #{findings_count}", + border_style=severity_style.split()[-1] if " " in severity_style else severity_style, + box=box.ROUNDED + )) + + console.print() # Add spacing between findings + + +def export_all_findings(findings: List[FindingRecord], format: str, output_path: str): + """Export all findings to specified format""" + output_file = Path(output_path) + + if format == "json": + # Combine all SARIF data + all_results = [] + for finding in findings: + if "runs" in finding.sarif_data: + for run in finding.sarif_data["runs"]: + for result in run.get("results", []): + result_entry = { + "run_id": finding.run_id, + "created_at": finding.created_at.isoformat(), + **result + } + all_results.append(result_entry) + + with open(output_file, 'w') as f: + json.dump({ + "total_findings": len(findings), + "export_date": datetime.now().isoformat(), + "results": all_results + }, f, indent=2) + + elif format == "csv": + # Export to CSV + with open(output_file, 'w', newline='') as f: + writer = csv.writer(f) + writer.writerow(["Run ID", "Date", "Severity", "Rule ID", "Message", "File", "Line"]) + + for finding in findings: + if "runs" in finding.sarif_data: + for run in finding.sarif_data["runs"]: + for result in run.get("results", []): + locations = result.get("locations", []) + location_info = locations[0] if locations else {} + physical = location_info.get("physicalLocation", {}) + artifact = physical.get("artifactLocation", {}) + region = physical.get("region", {}) + + writer.writerow([ + finding.run_id[:12], + 
finding.created_at.strftime("%Y-%m-%d %H:%M"),
+                            result.get("level", "note"),
+                            result.get("ruleId", ""),
+                            result.get("message", {}).get("text", ""),
+                            artifact.get("uri", ""),
+                            region.get("startLine", "")
+                        ])
+
+    elif format == "html":
+        # Generate HTML report
+        html_content = f"""<!DOCTYPE html>
+<html>
+<head>
+    <title>FuzzForge Security Findings Report</title>
+</head>
+<body>
+    <h1>FuzzForge Security Findings Report</h1>
+    <p>Generated: {datetime.now().strftime("%Y-%m-%d %H:%M:%S")}</p>
+    <p>Total Findings: {len(findings)}</p>
+    <table>
+        <tr>
+            <th>Run ID</th>
+            <th>Date</th>
+            <th>Severity</th>
+            <th>Rule</th>
+            <th>Message</th>
+            <th>Location</th>
+        </tr>
+"""
+
+        for finding in findings:
+            if "runs" in finding.sarif_data:
+                for run in finding.sarif_data["runs"]:
+                    for result in run.get("results", []):
+                        level = result.get("level", "note")
+                        locations = result.get("locations", [])
+                        location_info = locations[0] if locations else {}
+                        physical = location_info.get("physicalLocation", {})
+                        artifact = physical.get("artifactLocation", {})
+                        region = physical.get("region", {})
+
+                        html_content += f"""
+        <tr>
+            <td>{finding.run_id[:12]}</td>
+            <td>{finding.created_at.strftime("%Y-%m-%d %H:%M")}</td>
+            <td>{level.upper()}</td>
+            <td>{result.get("ruleId", "")}</td>
+            <td>{result.get("message", {}).get("text", "")}</td>
+            <td>{artifact.get("uri", "")} : {region.get("startLine", "")}</td>
+        </tr>
+"""
+
+        html_content += """
+    </table>
+</body>
+</html>
+ +""" + + with open(output_file, 'w') as f: + f.write(html_content) + + +@app.callback(invoke_without_command=True) +def findings_callback(ctx: typer.Context): + """ + ๐Ÿ” View and export security findings + """ + # Check if a subcommand is being invoked + if ctx.invoked_subcommand is not None: + # Let the subcommand handle it + return + + # Default to history when no subcommand provided + findings_history(limit=20) \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/commands/ingest.py b/cli/src/fuzzforge_cli/commands/ingest.py new file mode 100644 index 0000000..20e657c --- /dev/null +++ b/cli/src/fuzzforge_cli/commands/ingest.py @@ -0,0 +1,251 @@ +"""Cognee ingestion commands for FuzzForge CLI.""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +from __future__ import annotations + +import asyncio +import os +from pathlib import Path +from typing import List, Optional + +import typer +from rich.console import Console +from rich.prompt import Confirm + +from ..config import ProjectConfigManager +from ..ingest_utils import collect_ingest_files + +console = Console() +app = typer.Typer( + name="ingest", + help="Ingest files or directories into the Cognee knowledge graph for the current project", + invoke_without_command=True, +) + + +@app.callback() +def ingest_callback( + ctx: typer.Context, + path: Optional[Path] = typer.Argument( + None, + exists=True, + file_okay=True, + dir_okay=True, + readable=True, + resolve_path=True, + help="File or directory to ingest (defaults to current directory)", + ), + recursive: bool = typer.Option( + False, + "--recursive", + "-r", + help="Recursively ingest directories", + ), + file_types: Optional[List[str]] = typer.Option( + None, + "--file-types", + "-t", + help="File extensions to include (e.g. --file-types .py --file-types .js)", + ), + exclude: Optional[List[str]] = typer.Option( + None, + "--exclude", + "-e", + help="Glob patterns to exclude", + ), + dataset: Optional[str] = typer.Option( + None, + "--dataset", + "-d", + help="Dataset name to ingest into", + ), + force: bool = typer.Option( + False, + "--force", + "-f", + help="Force re-ingestion and skip confirmation", + ), +): + """Entry point for `fuzzforge ingest` when no subcommand is provided.""" + if ctx.invoked_subcommand: + return + + try: + config = ProjectConfigManager() + except FileNotFoundError as exc: + console.print(f"[red]Error:[/red] {exc}") + raise typer.Exit(1) from exc + + if not config.is_initialized(): + console.print("[red]Error: FuzzForge project not initialized. 
Run 'ff init' first.[/red]") + raise typer.Exit(1) + + config.setup_cognee_environment() + if os.getenv("FUZZFORGE_DEBUG", "0") == "1": + console.print( + "[dim]Cognee directories:\n" + f" DATA: {os.getenv('COGNEE_DATA_ROOT', 'unset')}\n" + f" SYSTEM: {os.getenv('COGNEE_SYSTEM_ROOT', 'unset')}\n" + f" USER: {os.getenv('COGNEE_USER_ID', 'unset')}\n", + ) + project_context = config.get_project_context() + + target_path = path or Path.cwd() + dataset_name = dataset or f"{project_context['project_name']}_codebase" + + try: + import cognee # noqa: F401 # Just to validate installation + except ImportError as exc: + console.print("[red]Cognee is not installed.[/red]") + console.print("Install with: pip install 'cognee[all]' litellm") + raise typer.Exit(1) from exc + + console.print(f"[bold]๐Ÿ” Ingesting {target_path} into Cognee knowledge graph[/bold]") + console.print( + f"Project: [cyan]{project_context['project_name']}[/cyan] " + f"(ID: [dim]{project_context['project_id']}[/dim])" + ) + console.print(f"Dataset: [cyan]{dataset_name}[/cyan]") + console.print(f"Tenant: [dim]{project_context['tenant_id']}[/dim]") + + if not force: + confirm_message = f"Ingest {target_path} into knowledge graph for this project?" + if not Confirm.ask(confirm_message, console=console): + console.print("[yellow]Ingestion cancelled[/yellow]") + raise typer.Exit(0) + + try: + asyncio.run( + _run_ingestion( + config=config, + path=target_path.resolve(), + recursive=recursive, + file_types=file_types, + exclude=exclude, + dataset=dataset_name, + force=force, + ) + ) + except KeyboardInterrupt: + console.print("\n[yellow]Ingestion cancelled by user[/yellow]") + raise typer.Exit(1) + except Exception as exc: # pragma: no cover - rich reporting + console.print(f"[red]Failed to ingest:[/red] {exc}") + raise typer.Exit(1) from exc + + +async def _run_ingestion( + *, + config: ProjectConfigManager, + path: Path, + recursive: bool, + file_types: Optional[List[str]], + exclude: Optional[List[str]], + dataset: str, + force: bool, +) -> None: + """Perform the actual ingestion work.""" + from fuzzforge_ai.cognee_service import CogneeService + + cognee_service = CogneeService(config) + await cognee_service.initialize() + + # Always skip internal bookkeeping directories + exclude_patterns = list(exclude or []) + default_excludes = { + ".fuzzforge/**", + ".git/**", + } + added_defaults = [] + for pattern in default_excludes: + if pattern not in exclude_patterns: + exclude_patterns.append(pattern) + added_defaults.append(pattern) + + if added_defaults and os.getenv("FUZZFORGE_DEBUG", "0") == "1": + console.print( + "[dim]Auto-excluding paths: {patterns}[/dim]".format( + patterns=", ".join(added_defaults) + ) + ) + + try: + files_to_ingest = collect_ingest_files(path, recursive, file_types, exclude_patterns) + except Exception as exc: + console.print(f"[red]Failed to collect files:[/red] {exc}") + return + + if not files_to_ingest: + console.print("[yellow]No files found to ingest[/yellow]") + return + + console.print(f"Found [green]{len(files_to_ingest)}[/green] files to ingest") + + if force: + console.print("Cleaning existing data for this project...") + try: + await cognee_service.clear_data(confirm=True) + except Exception as exc: + console.print(f"[yellow]Warning:[/yellow] Could not clean existing data: {exc}") + + console.print("Adding files to Cognee...") + valid_file_paths = [] + for file_path in files_to_ingest: + try: + with open(file_path, "r", encoding="utf-8") as fh: + fh.read(1) + valid_file_paths.append(file_path) + 
console.print(f" โœ“ {file_path}") + except (UnicodeDecodeError, PermissionError) as exc: + console.print(f"[yellow]Skipping {file_path}: {exc}[/yellow]") + + if not valid_file_paths: + console.print("[yellow]No readable files found to ingest[/yellow]") + return + + results = await cognee_service.ingest_files(valid_file_paths, dataset) + + console.print( + f"[green]โœ… Successfully ingested {results['success']} files into knowledge graph[/green]" + ) + if results["failed"]: + console.print( + f"[yellow]โš ๏ธ Skipped {results['failed']} files due to errors[/yellow]" + ) + + try: + insights = await cognee_service.search_insights( + query=f"What insights can you provide about the {dataset} dataset?", + dataset=dataset, + ) + if insights: + console.print(f"\n[bold]๐Ÿ“Š Generated {len(insights)} insights:[/bold]") + for index, insight in enumerate(insights[:3], 1): + console.print(f" {index}. {insight}") + if len(insights) > 3: + console.print(f" ... and {len(insights) - 3} more") + + chunks = await cognee_service.search_chunks( + query=f"functions classes methods in {dataset}", + dataset=dataset, + ) + if chunks: + console.print( + f"\n[bold]๐Ÿ” Sample searchable content ({len(chunks)} chunks found):[/bold]" + ) + for index, chunk in enumerate(chunks[:2], 1): + preview = chunk[:100] + "..." if len(chunk) > 100 else chunk + console.print(f" {index}. {preview}") + except Exception: + # Best-effort stats โ€” ignore failures here + pass diff --git a/cli/src/fuzzforge_cli/commands/init.py b/cli/src/fuzzforge_cli/commands/init.py new file mode 100644 index 0000000..1847349 --- /dev/null +++ b/cli/src/fuzzforge_cli/commands/init.py @@ -0,0 +1,282 @@ +"""Project initialization commands.""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +from __future__ import annotations + +from pathlib import Path +import os +from textwrap import dedent +from typing import Optional + +import typer +from rich.console import Console +from rich.prompt import Confirm, Prompt + +from ..config import ensure_project_config +from ..database import ensure_project_db + +console = Console() +app = typer.Typer() + + +@app.command() +def project( + name: Optional[str] = typer.Option( + None, "--name", "-n", + help="Project name (defaults to current directory name)" + ), + api_url: Optional[str] = typer.Option( + None, "--api-url", "-u", + help="FuzzForge API URL (defaults to http://localhost:8000)" + ), + force: bool = typer.Option( + False, "--force", "-f", + help="Force initialization even if project already exists" + ) +): + """ + ๐Ÿ“ Initialize a new FuzzForge project in the current directory. 
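+
+    Typically invoked as `ff init` from the root of the codebase you plan
+    to analyze; pass --name and --api-url to skip the interactive prompts,
+    or --force to reinitialize an existing project.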
+ + This creates a .fuzzforge directory with: + โ€ข SQLite database for storing runs, findings, and crashes + โ€ข Configuration file with project settings + โ€ข Default ignore patterns and preferences + """ + current_dir = Path.cwd() + fuzzforge_dir = current_dir / ".fuzzforge" + + # Check if project already exists + if fuzzforge_dir.exists() and not force: + if fuzzforge_dir.is_dir() and any(fuzzforge_dir.iterdir()): + console.print("โŒ FuzzForge project already exists in this directory", style="red") + console.print("Use --force to reinitialize", style="dim") + raise typer.Exit(1) + + # Get project name + if not name: + name = Prompt.ask( + "Project name", + default=current_dir.name, + console=console + ) + + # Get API URL + if not api_url: + api_url = Prompt.ask( + "FuzzForge API URL", + default="http://localhost:8000", + console=console + ) + + # Confirm initialization + console.print(f"\n๐Ÿ“ Initializing FuzzForge project: [bold cyan]{name}[/bold cyan]") + console.print(f"๐Ÿ“ Location: [dim]{current_dir}[/dim]") + console.print(f"๐Ÿ”— API URL: [dim]{api_url}[/dim]") + + if not Confirm.ask("\nProceed with initialization?", default=True, console=console): + console.print("โŒ Initialization cancelled", style="yellow") + raise typer.Exit(0) + + try: + # Create .fuzzforge directory + console.print("\n๐Ÿ”จ Creating project structure...") + fuzzforge_dir.mkdir(exist_ok=True) + + # Initialize configuration + console.print("โš™๏ธ Setting up configuration...") + ensure_project_config( + project_dir=current_dir, + project_name=name, + api_url=api_url, + ) + + # Initialize database + console.print("๐Ÿ—„๏ธ Initializing database...") + ensure_project_db(current_dir) + + _ensure_env_file(fuzzforge_dir, force) + _ensure_agents_registry(fuzzforge_dir, force) + + # Create .gitignore if needed + gitignore_path = current_dir / ".gitignore" + gitignore_entries = [ + "# FuzzForge CLI", + ".fuzzforge/findings.db-*", # SQLite temp files + ".fuzzforge/cache/", + ".fuzzforge/temp/", + ] + + if gitignore_path.exists(): + with open(gitignore_path, 'r') as f: + existing_content = f.read() + + if "# FuzzForge CLI" not in existing_content: + with open(gitignore_path, 'a') as f: + f.write(f"\n{chr(10).join(gitignore_entries)}\n") + console.print("๐Ÿ“ Updated .gitignore with FuzzForge entries") + else: + with open(gitignore_path, 'w') as f: + f.write(f"{chr(10).join(gitignore_entries)}\n") + console.print("๐Ÿ“ Created .gitignore") + + # Create README if it doesn't exist + readme_path = current_dir / "README.md" + if not readme_path.exists(): + readme_content = f"""# {name} + +FuzzForge security testing project. 
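+
+Generated by `ff init`; see the Quick Start below to launch your first workflow.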
+ +## Quick Start + +```bash +# List available workflows +fuzzforge workflows + +# Submit a workflow for analysis +fuzzforge workflow /path/to/target + +# Monitor run progress +fuzzforge monitor live + +# View findings +fuzzforge finding +``` + +## Project Structure + +- `.fuzzforge/` - Project data and configuration +- `.fuzzforge/config.yaml` - Project configuration +- `.fuzzforge/findings.db` - Local database for runs and findings +""" + + with open(readme_path, 'w') as f: + f.write(readme_content) + console.print("๐Ÿ“š Created README.md") + + console.print("\nโœ… FuzzForge project initialized successfully!", style="green") + console.print(f"\n๐ŸŽฏ Next steps:") + console.print(" โ€ข ff workflows - See available workflows") + console.print(" โ€ข ff status - Check API connectivity") + console.print(" โ€ข ff workflow - Start your first analysis") + console.print(" โ€ข edit .fuzzforge/.env with API keys & provider settings") + + except Exception as e: + console.print(f"\nโŒ Initialization failed: {e}", style="red") + raise typer.Exit(1) + + +@app.callback() +def init_callback(): + """ + ๐Ÿ“ Initialize FuzzForge projects and components + """ + + +def _ensure_env_file(fuzzforge_dir: Path, force: bool) -> None: + """Create or update the .fuzzforge/.env file with AI defaults.""" + + env_path = fuzzforge_dir / ".env" + if env_path.exists() and not force: + console.print("๐Ÿงช Using existing .fuzzforge/.env (use --force to regenerate)") + return + + console.print("๐Ÿง  Configuring AI environment...") + console.print(" โ€ข Default LLM provider: openai") + console.print(" โ€ข Default LLM model: gpt-5-mini") + console.print(" โ€ข To customise provider/model later, edit .fuzzforge/.env") + + llm_provider = "openai" + llm_model = "gpt-5-mini" + + api_key = Prompt.ask( + "OpenAI API key (leave blank to fill manually)", + default="", + show_default=False, + console=console, + ) + + enable_cognee = False + cognee_url = "" + + session_db_path = fuzzforge_dir / "fuzzforge_sessions.db" + session_db_rel = session_db_path.relative_to(fuzzforge_dir.parent) + + env_lines = [ + "# FuzzForge AI configuration", + "# Populate the API key(s) that match your LLM provider", + "", + f"LLM_PROVIDER={llm_provider}", + f"LLM_MODEL={llm_model}", + f"LITELLM_MODEL={llm_model}", + f"OPENAI_API_KEY={api_key}", + f"FUZZFORGE_MCP_URL={os.getenv('FUZZFORGE_MCP_URL', 'http://localhost:8010/mcp')}", + "", + "# Cognee configuration mirrors the primary LLM by default", + f"LLM_COGNEE_PROVIDER={llm_provider}", + f"LLM_COGNEE_MODEL={llm_model}", + f"LLM_COGNEE_API_KEY={api_key}", + "LLM_COGNEE_ENDPOINT=", + "COGNEE_MCP_URL=", + "", + "# Session persistence options: inmemory | sqlite", + "SESSION_PERSISTENCE=sqlite", + f"SESSION_DB_PATH={session_db_rel}", + "", + "# Optional integrations", + "AGENTOPS_API_KEY=", + "FUZZFORGE_DEBUG=0", + "", + ] + + env_path.write_text("\n".join(env_lines), encoding="utf-8") + console.print(f"๐Ÿ“ Created {env_path.relative_to(fuzzforge_dir.parent)}") + + template_path = fuzzforge_dir / ".env.template" + if not template_path.exists() or force: + template_lines = [] + for line in env_lines: + if line.startswith("OPENAI_API_KEY="): + template_lines.append("OPENAI_API_KEY=") + elif line.startswith("LLM_COGNEE_API_KEY="): + template_lines.append("LLM_COGNEE_API_KEY=") + else: + template_lines.append(line) + template_path.write_text("\n".join(template_lines), encoding="utf-8") + console.print(f"๐Ÿ“ Created {template_path.relative_to(fuzzforge_dir.parent)}") + + # SQLite session DB will be created 
automatically when first used by the AI agent + + +def _ensure_agents_registry(fuzzforge_dir: Path, force: bool) -> None: + """Create a starter agents.yaml registry if needed.""" + + agents_path = fuzzforge_dir / "agents.yaml" + if agents_path.exists() and not force: + return + + template = dedent( + """\ + # FuzzForge Registered Agents + # Populate this list to auto-register remote agents when the AI CLI starts + registered_agents: [] + + # Example: + # registered_agents: + # - name: Calculator + # url: http://localhost:10201 + # description: Sample math agent + """.strip() + ) + + agents_path.write_text(template + "\n", encoding="utf-8") + console.print(f"๐Ÿ“ Created {agents_path.relative_to(fuzzforge_dir.parent)}") diff --git a/cli/src/fuzzforge_cli/commands/monitor.py b/cli/src/fuzzforge_cli/commands/monitor.py new file mode 100644 index 0000000..4c8e108 --- /dev/null +++ b/cli/src/fuzzforge_cli/commands/monitor.py @@ -0,0 +1,436 @@ +""" +Real-time monitoring and statistics commands. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import asyncio +import time +from datetime import datetime, timedelta +from typing import Optional + +import typer +from rich.console import Console +from rich.table import Table +from rich.panel import Panel +from rich.live import Live +from rich.layout import Layout +from rich.progress import Progress, BarColumn, TextColumn, SpinnerColumn +from rich.align import Align +from rich import box + +from ..config import get_project_config, FuzzForgeConfig +from ..database import get_project_db, ensure_project_db, CrashRecord +from fuzzforge_sdk import FuzzForgeClient + +console = Console() +app = typer.Typer() + + +def get_client() -> FuzzForgeClient: + """Get configured FuzzForge client""" + config = get_project_config() or FuzzForgeConfig() + return FuzzForgeClient(base_url=config.get_api_url(), timeout=config.get_timeout()) + + +def format_duration(seconds: int) -> str: + """Format duration in human readable format""" + if seconds < 60: + return f"{seconds}s" + elif seconds < 3600: + return f"{seconds // 60}m {seconds % 60}s" + else: + hours = seconds // 3600 + minutes = (seconds % 3600) // 60 + return f"{hours}h {minutes}m" + + +def format_number(num: int) -> str: + """Format large numbers with K, M suffixes""" + if num >= 1000000: + return f"{num / 1000000:.1f}M" + elif num >= 1000: + return f"{num / 1000:.1f}K" + else: + return str(num) + + +@app.command("stats") +def fuzzing_stats( + run_id: str = typer.Argument(..., help="Run ID to get statistics for"), + refresh: int = typer.Option( + 5, "--refresh", "-r", + help="Refresh interval in seconds" + ), + once: bool = typer.Option( + False, "--once", + help="Show stats once and exit" + ) +): + """ + ๐Ÿ“Š Show current fuzzing statistics for a run + """ + try: + with get_client() as client: + if once: + # Show stats once + stats = client.get_fuzzing_stats(run_id) + display_stats_table(stats) + else: + # Live updating stats + console.print(f"๐Ÿ“Š [bold]Live Fuzzing Statistics[/bold] (Run: {run_id[:12]}...)") + console.print(f"Refreshing every {refresh}s. 
Press Ctrl+C to stop.\n") + + with Live(auto_refresh=False, console=console) as live: + while True: + try: + stats = client.get_fuzzing_stats(run_id) + table = create_stats_table(stats) + live.update(table, refresh=True) + time.sleep(refresh) + except KeyboardInterrupt: + console.print("\n๐Ÿ“Š Monitoring stopped", style="yellow") + break + + except Exception as e: + console.print(f"โŒ Failed to get fuzzing stats: {e}", style="red") + raise typer.Exit(1) + + +def display_stats_table(stats): + """Display stats in a simple table""" + table = create_stats_table(stats) + console.print(table) + + +def create_stats_table(stats) -> Panel: + """Create a rich table for fuzzing statistics""" + # Create main stats table + stats_table = Table(show_header=False, box=box.SIMPLE) + stats_table.add_column("Metric", style="bold cyan") + stats_table.add_column("Value", justify="right", style="bold white") + + stats_table.add_row("Total Executions", format_number(stats.executions)) + stats_table.add_row("Executions/sec", f"{stats.executions_per_sec:.1f}") + stats_table.add_row("Total Crashes", format_number(stats.crashes)) + stats_table.add_row("Unique Crashes", format_number(stats.unique_crashes)) + + if stats.coverage is not None: + stats_table.add_row("Code Coverage", f"{stats.coverage:.1f}%") + + stats_table.add_row("Corpus Size", format_number(stats.corpus_size)) + stats_table.add_row("Elapsed Time", format_duration(stats.elapsed_time)) + + if stats.last_crash_time: + time_since_crash = datetime.now() - stats.last_crash_time + stats_table.add_row("Last Crash", f"{format_duration(int(time_since_crash.total_seconds()))} ago") + + return Panel.fit( + stats_table, + title=f"๐Ÿ“Š Fuzzing Statistics - {stats.workflow}", + subtitle=f"Run: {stats.run_id[:12]}...", + box=box.ROUNDED + ) + + +@app.command("crashes") +def crash_reports( + run_id: str = typer.Argument(..., help="Run ID to get crash reports for"), + save: bool = typer.Option( + True, "--save/--no-save", + help="Save crashes to local database" + ), + limit: int = typer.Option( + 50, "--limit", "-l", + help="Maximum number of crashes to show" + ) +): + """ + ๐Ÿ› Display crash reports for a fuzzing run + """ + try: + with get_client() as client: + console.print(f"๐Ÿ› Fetching crash reports for run: {run_id}") + crashes = client.get_crash_reports(run_id) + + if not crashes: + console.print("โœ… No crashes found!", style="green") + return + + # Save to database if requested + if save: + db = ensure_project_db() + for crash in crashes: + crash_record = CrashRecord( + run_id=run_id, + crash_id=crash.crash_id, + signal=crash.signal, + stack_trace=crash.stack_trace, + input_file=crash.input_file, + severity=crash.severity, + timestamp=crash.timestamp + ) + db.save_crash(crash_record) + console.print("โœ… Crashes saved to local database") + + # Display crashes + crashes_to_show = crashes[:limit] + + # Summary + severity_counts = {} + signal_counts = {} + for crash in crashes: + severity_counts[crash.severity] = severity_counts.get(crash.severity, 0) + 1 + if crash.signal: + signal_counts[crash.signal] = signal_counts.get(crash.signal, 0) + 1 + + summary_table = Table(show_header=False, box=box.SIMPLE) + summary_table.add_column("Metric", style="bold cyan") + summary_table.add_column("Value", justify="right") + + summary_table.add_row("Total Crashes", str(len(crashes))) + summary_table.add_row("Unique Signals", str(len(signal_counts))) + + for severity, count in sorted(severity_counts.items()): + summary_table.add_row(f"{severity.title()} Severity", 
str(count)) + + console.print( + Panel.fit( + summary_table, + title=f"๐Ÿ› Crash Summary", + box=box.ROUNDED + ) + ) + + # Detailed crash table + if crashes_to_show: + crashes_table = Table(box=box.ROUNDED) + crashes_table.add_column("Crash ID", style="bold cyan") + crashes_table.add_column("Signal", justify="center") + crashes_table.add_column("Severity", justify="center") + crashes_table.add_column("Timestamp", justify="center") + crashes_table.add_column("Input File", style="dim") + + for crash in crashes_to_show: + signal_emoji = { + "SIGSEGV": "๐Ÿ’ฅ", + "SIGABRT": "๐Ÿ›‘", + "SIGFPE": "๐Ÿงฎ", + "SIGILL": "โš ๏ธ" + }.get(crash.signal or "", "๐Ÿ›") + + severity_style = { + "high": "red", + "medium": "yellow", + "low": "green" + }.get(crash.severity.lower(), "white") + + input_display = "" + if crash.input_file: + input_display = crash.input_file.split("/")[-1] # Show just filename + + crashes_table.add_row( + crash.crash_id[:12] + "..." if len(crash.crash_id) > 15 else crash.crash_id, + f"{signal_emoji} {crash.signal or 'Unknown'}", + f"[{severity_style}]{crash.severity}[/{severity_style}]", + crash.timestamp.strftime("%H:%M:%S"), + input_display + ) + + console.print(f"\n๐Ÿ› [bold]Crash Details[/bold]") + if len(crashes) > limit: + console.print(f"Showing first {limit} of {len(crashes)} crashes") + console.print() + console.print(crashes_table) + + console.print(f"\n๐Ÿ’ก Use [bold cyan]fuzzforge finding {run_id}[/bold cyan] for detailed analysis") + + except Exception as e: + console.print(f"โŒ Failed to get crash reports: {e}", style="red") + raise typer.Exit(1) + + +def _live_monitor(run_id: str, refresh: int): + """Helper for live monitoring to allow for cleaner exit handling""" + with get_client() as client: + start_time = time.time() + + def render_layout(run_status, stats): + layout = Layout() + layout.split_column( + Layout(name="header", size=3), + Layout(name="main", ratio=1), + Layout(name="footer", size=3) + ) + layout["main"].split_row( + Layout(name="stats", ratio=1), + Layout(name="progress", ratio=1) + ) + header = Panel( + f"[bold]FuzzForge Live Monitor[/bold]\n" + f"Run: {run_id[:12]}... 
| Status: {run_status.status} | "
+                f"Uptime: {format_duration(int(time.time() - start_time))}",
+                box=box.ROUNDED,
+                style="cyan"
+            )
+            layout["header"].update(header)
+            layout["stats"].update(create_stats_table(stats))
+
+            progress_table = Table(show_header=False, box=box.SIMPLE)
+            progress_table.add_column("Metric", style="bold")
+            progress_table.add_column("Progress")
+            if stats.executions > 0:
+                # Scale so 1000 execs/sec fills the bar
+                exec_rate_percent = min(100, (stats.executions_per_sec / 1000) * 100)
+                progress_table.add_row("Exec Rate", create_progress_bar(exec_rate_percent, "green"))
+                # Crashes per 100k executions, scaled up for visibility
+                crash_rate = (stats.crashes / stats.executions) * 100000
+                crash_rate_percent = min(100, crash_rate * 10)
+                progress_table.add_row("Crash Rate", create_progress_bar(crash_rate_percent, "red"))
+            if stats.coverage is not None:
+                progress_table.add_row("Coverage", create_progress_bar(stats.coverage, "blue"))
+            layout["progress"].update(Panel.fit(progress_table, title="๐Ÿ“Š Progress Indicators", box=box.ROUNDED))
+
+            footer = Panel(
+                f"Last updated: {datetime.now().strftime('%H:%M:%S')} | "
+                f"Refresh interval: {refresh}s | Press Ctrl+C to exit",
+                box=box.ROUNDED,
+                style="dim"
+            )
+            layout["footer"].update(footer)
+            return layout
+
+        # Minimal zeroed stats used when the stats endpoint is unavailable.
+        # Defined before the Live loop so both the initial fetch and the
+        # polling loop below can fall back to it.
+        class FallbackStats:
+            def __init__(self, run_id):
+                self.run_id = run_id
+                self.workflow = "unknown"
+                self.executions = 0
+                self.executions_per_sec = 0.0
+                self.crashes = 0
+                self.unique_crashes = 0
+                self.coverage = None
+                self.corpus_size = 0
+                self.elapsed_time = 0
+                self.last_crash_time = None
+
+        with Live(auto_refresh=False, console=console, screen=True) as live:
+            # Initial fetch
+            try:
+                run_status = client.get_run_status(run_id)
+                stats = client.get_fuzzing_stats(run_id)
+            except Exception:
+                stats = FallbackStats(run_id)
+                run_status = type("RS", (), {"status": "Unknown", "is_completed": False, "is_failed": False})()
+
+            live.update(render_layout(run_status, stats), refresh=True)
+
+            # Simple polling loop with a bounded error budget
+            consecutive_errors = 0
+            max_errors = 5
+
+            while True:
+                try:
+                    # Poll for updates
+                    try:
+                        run_status = client.get_run_status(run_id)
+                        consecutive_errors = 0
+                    except Exception as e:
+                        consecutive_errors += 1
+                        if consecutive_errors >= max_errors:
+                            console.print(f"โŒ Too many errors getting run status: {e}", style="red")
+                            break
+                        time.sleep(refresh)
+                        continue
+
+                    # Try to get fuzzing stats; fall back to zeroed stats if unavailable
+                    try:
+                        stats = client.get_fuzzing_stats(run_id)
+                    except Exception:
+                        stats = FallbackStats(run_id)
+
+                    # Update display
+                    live.update(render_layout(run_status, stats), refresh=True)
+
+                    # Check if completed
+                    if getattr(run_status, 'is_completed', False) or getattr(run_status, 'is_failed', False):
+                        # Show final state for a few seconds
+                        console.print("\n๐Ÿ Run completed. Showing final state for 10 seconds...")
+                        time.sleep(10)
+                        break
+
+                    # Wait before next poll
+                    time.sleep(refresh)
+
+                except KeyboardInterrupt:
+                    raise
+                except Exception as e:
+                    console.print(f"โš ๏ธ Monitoring error: {e}", style="yellow")
+                    time.sleep(refresh)
+
+            # Completed status update
+            final_message = (
+                f"[bold]FuzzForge Live Monitor - COMPLETED[/bold]\n"
+                f"Run: {run_id[:12]}... 
| Status: {run_status.status} | " + f"Total runtime: {format_duration(int(time.time() - start_time))}" + ) + style = "green" if getattr(run_status, 'is_completed', False) else "red" + live.update(Panel(final_message, box=box.ROUNDED, style=style), refresh=True) + + +@app.command("live") +def live_monitor( + run_id: str = typer.Argument(..., help="Run ID to monitor live"), + refresh: int = typer.Option( + 2, "--refresh", "-r", + help="Refresh interval in seconds (fallback when streaming unavailable)" + ) +): + """ + ๐Ÿ“บ Real-time monitoring dashboard with live updates (WebSocket/SSE with REST fallback) + """ + console.print(f"๐Ÿ“บ [bold]Live Monitoring Dashboard[/bold]") + console.print(f"Run: {run_id}") + console.print(f"Press Ctrl+C to stop monitoring\n") + try: + _live_monitor(run_id, refresh) + except KeyboardInterrupt: + console.print("\n๐Ÿ“Š Monitoring stopped by user.", style="yellow") + except Exception as e: + console.print(f"โŒ Failed to start live monitoring: {e}", style="red") + raise typer.Exit(1) + + +def create_progress_bar(percentage: float, color: str = "green") -> str: + """Create a simple text progress bar""" + width = 20 + filled = int((percentage / 100) * width) + bar = "โ–ˆ" * filled + "โ–‘" * (width - filled) + return f"[{color}]{bar}[/{color}] {percentage:.1f}%" + + +@app.callback(invoke_without_command=True) +def monitor_callback(ctx: typer.Context): + """ + ๐Ÿ“Š Real-time monitoring and statistics + """ + # Check if a subcommand is being invoked + if ctx.invoked_subcommand is not None: + # Let the subcommand handle it + return + + # Show not implemented message for default command + from rich.console import Console + console = Console() + console.print("๐Ÿšง [yellow]Monitor command is not fully implemented yet.[/yellow]") + console.print("Please use specific subcommands:") + console.print(" โ€ข [cyan]ff monitor stats [/cyan] - Show execution statistics") + console.print(" โ€ข [cyan]ff monitor crashes [/cyan] - Show crash reports") + console.print(" โ€ข [cyan]ff monitor live [/cyan] - Live monitoring dashboard") diff --git a/cli/src/fuzzforge_cli/commands/status.py b/cli/src/fuzzforge_cli/commands/status.py new file mode 100644 index 0000000..4874179 --- /dev/null +++ b/cli/src/fuzzforge_cli/commands/status.py @@ -0,0 +1,165 @@ +""" +Status command for showing project and API information. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
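+
+# Backs the top-level `ff status` command. A typical session (output
+# sketch, abridged) looks like:
+#
+#   $ ff status
+#   ๐Ÿ“Š FuzzForge Project Status
+#   โœ… Project Information / ๐Ÿ—„๏ธ Database Statistics / ๐Ÿ”— API Connectivity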
+ + +from pathlib import Path +from rich.console import Console +from rich.table import Table +from rich.panel import Panel +from rich import box + +from ..config import get_project_config, FuzzForgeConfig +from ..database import get_project_db +from fuzzforge_sdk import FuzzForgeClient + +console = Console() + + +def show_status(): + """Show comprehensive project and API status""" + current_dir = Path.cwd() + fuzzforge_dir = current_dir / ".fuzzforge" + + # Project status + console.print("\n๐Ÿ“Š [bold]FuzzForge Project Status[/bold]\n") + + if not fuzzforge_dir.exists(): + console.print( + Panel.fit( + "โŒ No FuzzForge project found in current directory\n\n" + "Run [bold cyan]ff init[/bold cyan] to initialize a project", + title="Project Status", + box=box.ROUNDED + ) + ) + return + + # Load project configuration + config = get_project_config() + if not config: + config = FuzzForgeConfig() + + # Project info table + project_table = Table(show_header=False, box=box.SIMPLE) + project_table.add_column("Property", style="bold cyan") + project_table.add_column("Value") + + project_table.add_row("Project Name", config.project.name) + project_table.add_row("Location", str(current_dir)) + project_table.add_row("API URL", config.project.api_url) + project_table.add_row("Default Timeout", f"{config.project.default_timeout}s") + + console.print( + Panel.fit( + project_table, + title="โœ… Project Information", + box=box.ROUNDED + ) + ) + + # Database status + db = get_project_db() + if db: + try: + stats = db.get_stats() + db_table = Table(show_header=False, box=box.SIMPLE) + db_table.add_column("Metric", style="bold cyan") + db_table.add_column("Count", justify="right") + + db_table.add_row("Total Runs", str(stats["total_runs"])) + db_table.add_row("Total Findings", str(stats["total_findings"])) + db_table.add_row("Total Crashes", str(stats["total_crashes"])) + db_table.add_row("Runs (Last 7 days)", str(stats["runs_last_7_days"])) + + if stats["runs_by_status"]: + db_table.add_row("", "") # Spacer + for status, count in stats["runs_by_status"].items(): + status_emoji = { + "completed": "โœ…", + "running": "๐Ÿ”„", + "failed": "โŒ", + "queued": "โณ", + "cancelled": "โน๏ธ" + }.get(status, "๐Ÿ“‹") + db_table.add_row(f"{status_emoji} {status.title()}", str(count)) + + console.print( + Panel.fit( + db_table, + title="๐Ÿ—„๏ธ Database Statistics", + box=box.ROUNDED + ) + ) + except Exception as e: + console.print(f"โš ๏ธ Database error: {e}", style="yellow") + + # API status + console.print("\n๐Ÿ”— [bold]API Connectivity[/bold]") + try: + with FuzzForgeClient(base_url=config.get_api_url(), timeout=10.0) as client: + api_status = client.get_api_status() + workflows = client.list_workflows() + + api_table = Table(show_header=False, box=box.SIMPLE) + api_table.add_column("Property", style="bold cyan") + api_table.add_column("Value") + + api_table.add_row("Status", f"โœ… Connected") + api_table.add_row("Service", f"{api_status.name} v{api_status.version}") + api_table.add_row("Workflows", str(len(workflows))) + + console.print( + Panel.fit( + api_table, + title="โœ… API Status", + box=box.ROUNDED + ) + ) + + # Show available workflows + if workflows: + workflow_table = Table(box=box.SIMPLE_HEAD) + workflow_table.add_column("Name", style="bold") + workflow_table.add_column("Version", justify="center") + workflow_table.add_column("Description") + + for workflow in workflows[:10]: # Limit to first 10 + workflow_table.add_row( + workflow.name, + workflow.version, + workflow.description[:60] + "..." 
if len(workflow.description) > 60 else workflow.description + ) + + if len(workflows) > 10: + workflow_table.add_row("...", "...", f"and {len(workflows) - 10} more workflows") + + console.print( + Panel.fit( + workflow_table, + title=f"๐Ÿ”ง Available Workflows ({len(workflows)})", + box=box.ROUNDED + ) + ) + + except Exception as e: + console.print( + Panel.fit( + f"โŒ Failed to connect to API\n\n" + f"Error: {str(e)}\n\n" + f"API URL: {config.get_api_url()}\n\n" + "Check that the FuzzForge API is running and accessible.", + title="โŒ API Connection Failed", + box=box.ROUNDED + ) + ) \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/commands/workflow_exec.py b/cli/src/fuzzforge_cli/commands/workflow_exec.py new file mode 100644 index 0000000..ad44bb0 --- /dev/null +++ b/cli/src/fuzzforge_cli/commands/workflow_exec.py @@ -0,0 +1,591 @@ +""" +Workflow execution and management commands. +Replaces the old 'runs' terminology with cleaner workflow-centric commands. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import json +import time +from datetime import datetime +from pathlib import Path +from typing import Optional, Dict, Any, List + +import typer +from rich.console import Console +from rich.table import Table +from rich.panel import Panel +from rich.progress import Progress, SpinnerColumn, TextColumn, BarColumn, TaskProgressColumn +from rich.prompt import Prompt, Confirm +from rich.live import Live +from rich import box + +from ..config import get_project_config, FuzzForgeConfig +from ..database import get_project_db, ensure_project_db, RunRecord +from ..exceptions import ( + handle_error, retry_on_network_error, safe_json_load, require_project, + APIConnectionError, ValidationError, DatabaseError, FileOperationError +) +from ..validation import ( + validate_run_id, validate_workflow_name, validate_target_path, + validate_volume_mode, validate_parameters, validate_timeout +) +from ..progress import progress_manager, spinner, step_progress +from ..completion import WorkflowNameComplete, TargetPathComplete, VolumeModetComplete +from ..constants import ( + STATUS_EMOJIS, MAX_RUN_ID_DISPLAY_LENGTH, DEFAULT_VOLUME_MODE, + PROGRESS_STEP_DELAYS, MAX_RETRIES, RETRY_DELAY, POLL_INTERVAL +) +from fuzzforge_sdk import FuzzForgeClient, WorkflowSubmission + +console = Console() +app = typer.Typer() + + +@retry_on_network_error(max_retries=MAX_RETRIES, delay=RETRY_DELAY) +def get_client() -> FuzzForgeClient: + """Get configured FuzzForge client with retry on network errors""" + config = get_project_config() or FuzzForgeConfig() + return FuzzForgeClient(base_url=config.get_api_url(), timeout=config.get_timeout()) + + +def status_emoji(status: str) -> str: + """Get emoji for execution status""" + return STATUS_EMOJIS.get(status.lower(), STATUS_EMOJIS["unknown"]) + + +def parse_inline_parameters(params: List[str]) -> Dict[str, Any]: + """Parse inline key=value parameters using improved validation""" + return validate_parameters(params) + + +def execute_workflow_submission( + client: FuzzForgeClient, + workflow: str, + target_path: str, + parameters: Dict[str, 
Any], + volume_mode: str, + timeout: Optional[int], + interactive: bool +) -> Any: + """Handle the workflow submission process""" + # Get workflow metadata for parameter validation + console.print(f"๐Ÿ”ง Getting workflow information for: {workflow}") + workflow_meta = client.get_workflow_metadata(workflow) + param_response = client.get_workflow_parameters(workflow) + + # Interactive parameter input + if interactive and workflow_meta.parameters.get("properties"): + properties = workflow_meta.parameters.get("properties", {}) + required_params = set(workflow_meta.parameters.get("required", [])) + defaults = param_response.defaults + + missing_required = required_params - set(parameters.keys()) + + if missing_required: + console.print(f"\n๐Ÿ“ [bold]Missing required parameters:[/bold] {', '.join(missing_required)}") + console.print("Please provide values:\n") + + for param_name in missing_required: + param_schema = properties.get(param_name, {}) + description = param_schema.get("description", "") + param_type = param_schema.get("type", "string") + + prompt_text = f"{param_name}" + if description: + prompt_text += f" ({description})" + prompt_text += f" [{param_type}]" + + while True: + user_input = Prompt.ask(prompt_text, console=console) + + try: + if param_type == "integer": + parameters[param_name] = int(user_input) + elif param_type == "number": + parameters[param_name] = float(user_input) + elif param_type == "boolean": + parameters[param_name] = user_input.lower() in ("true", "yes", "1", "on") + elif param_type == "array": + parameters[param_name] = [item.strip() for item in user_input.split(",") if item.strip()] + else: + parameters[param_name] = user_input + break + except ValueError as e: + console.print(f"โŒ Invalid {param_type}: {e}", style="red") + + # Validate volume mode + validate_volume_mode(volume_mode) + if volume_mode not in workflow_meta.supported_volume_modes: + raise ValidationError( + "volume mode", volume_mode, + f"one of: {', '.join(workflow_meta.supported_volume_modes)}" + ) + + # Create submission + submission = WorkflowSubmission( + target_path=target_path, + volume_mode=volume_mode, + parameters=parameters, + timeout=timeout + ) + + # Show submission summary + console.print(f"\n๐ŸŽฏ [bold]Executing workflow:[/bold]") + console.print(f" Workflow: {workflow}") + console.print(f" Target: {target_path}") + console.print(f" Volume Mode: {volume_mode}") + if parameters: + console.print(f" Parameters: {len(parameters)} provided") + if timeout: + console.print(f" Timeout: {timeout}s") + + # Only ask for confirmation in interactive mode + if interactive: + if not Confirm.ask("\nExecute workflow?", default=True, console=console): + console.print("โŒ Execution cancelled", style="yellow") + raise typer.Exit(0) + else: + console.print("\n๐Ÿš€ Executing workflow...") + + # Submit the workflow with enhanced progress + console.print(f"\n๐Ÿš€ Executing workflow: [bold yellow]{workflow}[/bold yellow]") + + steps = [ + "Validating workflow configuration", + "Connecting to FuzzForge API", + "Uploading parameters and settings", + "Creating workflow deployment", + "Initializing execution environment" + ] + + with step_progress(steps, f"Executing {workflow}") as progress: + progress.next_step() # Validating + time.sleep(PROGRESS_STEP_DELAYS["validating"]) + + progress.next_step() # Connecting + time.sleep(PROGRESS_STEP_DELAYS["connecting"]) + + progress.next_step() # Uploading + response = client.submit_workflow(workflow, submission) + time.sleep(PROGRESS_STEP_DELAYS["uploading"]) + + 
progress.next_step() # Creating deployment + time.sleep(PROGRESS_STEP_DELAYS["creating"]) + + progress.next_step() # Initializing + time.sleep(PROGRESS_STEP_DELAYS["initializing"]) + + progress.complete(f"Workflow started successfully!") + + return response + + +# Main workflow execution command (replaces 'runs submit') +@app.command(name="exec", hidden=True) # Hidden because it will be called from main workflow command +def execute_workflow( + workflow: str = typer.Argument(..., help="Workflow name to execute"), + target_path: str = typer.Argument(..., help="Path to analyze"), + params: List[str] = typer.Argument(default=None, help="Parameters as key=value pairs"), + param_file: Optional[str] = typer.Option( + None, "--param-file", "-f", + help="JSON file containing workflow parameters" + ), + volume_mode: str = typer.Option( + DEFAULT_VOLUME_MODE, "--volume-mode", "-v", + help="Volume mount mode: ro (read-only) or rw (read-write)" + ), + timeout: Optional[int] = typer.Option( + None, "--timeout", "-t", + help="Execution timeout in seconds" + ), + interactive: bool = typer.Option( + True, "--interactive/--no-interactive", "-i/-n", + help="Interactive parameter input for missing required parameters" + ), + wait: bool = typer.Option( + False, "--wait", "-w", + help="Wait for execution to complete" + ), + live: bool = typer.Option( + False, "--live", "-l", + help="Start live monitoring after execution (useful for fuzzing workflows)" + ) +): + """ + ๐Ÿš€ Execute a workflow on a target + + Use --live for fuzzing workflows to see real-time progress. + Use --wait to wait for completion without live dashboard. + """ + try: + # Validate inputs + validate_workflow_name(workflow) + target_path_obj = validate_target_path(target_path, must_exist=True) + target_path = str(target_path_obj.absolute()) + validate_timeout(timeout) + + # Ensure we're in a project directory + require_project() + except Exception as e: + handle_error(e, "validating inputs") + + # Parse parameters + parameters = {} + + # Load from param file + if param_file: + try: + file_params = safe_json_load(param_file) + if isinstance(file_params, dict): + parameters.update(file_params) + else: + raise ValidationError("parameter file", param_file, "a JSON object") + except Exception as e: + handle_error(e, "loading parameter file") + + # Parse inline parameters + if params: + try: + inline_params = parse_inline_parameters(params) + parameters.update(inline_params) + except Exception as e: + handle_error(e, "parsing parameters") + + try: + with get_client() as client: + response = execute_workflow_submission( + client, workflow, target_path, parameters, + volume_mode, timeout, interactive + ) + + console.print(f"โœ… Workflow execution started!", style="green") + console.print(f" Execution ID: [bold cyan]{response.run_id}[/bold cyan]") + console.print(f" Status: {status_emoji(response.status)} {response.status}") + + # Save to database + try: + db = ensure_project_db() + run_record = RunRecord( + run_id=response.run_id, + workflow=workflow, + status=response.status, + target_path=target_path, + parameters=parameters, + created_at=datetime.now() + ) + db.save_run(run_record) + except Exception as e: + # Don't fail the whole operation if database save fails + console.print(f"โš ๏ธ Failed to save execution to database: {e}", style="yellow") + + console.print(f"\n๐Ÿ’ก Monitor progress: [bold cyan]fuzzforge monitor {response.run_id}[/bold cyan]") + console.print(f"๐Ÿ’ก Check status: [bold cyan]fuzzforge workflow status {response.run_id}[/bold 
cyan]") + + # Suggest --live for fuzzing workflows + if not live and not wait and "fuzzing" in workflow.lower(): + console.print(f"๐Ÿ’ก Next time try: [bold cyan]fuzzforge workflow {workflow} {target_path} --live[/bold cyan] for real-time fuzzing dashboard", style="dim") + + # Start live monitoring if requested + if live: + # Check if this is a fuzzing workflow to show appropriate messaging + is_fuzzing = "fuzzing" in workflow.lower() + if is_fuzzing: + console.print(f"\n๐Ÿ“บ Starting live fuzzing dashboard...") + console.print("๐Ÿ’ก You'll see real-time crash discovery, execution stats, and coverage data.") + else: + console.print(f"\n๐Ÿ“บ Starting live monitoring dashboard...") + + console.print("Press Ctrl+C to stop monitoring (execution continues in background).\n") + + try: + from ..commands.monitor import live_monitor + # Import monitor command and run it + live_monitor(response.run_id, refresh=3) + except KeyboardInterrupt: + console.print(f"\nโน๏ธ Live monitoring stopped (execution continues in background)", style="yellow") + except Exception as e: + console.print(f"โš ๏ธ Failed to start live monitoring: {e}", style="yellow") + console.print(f"๐Ÿ’ก You can still monitor manually: [bold cyan]fuzzforge monitor {response.run_id}[/bold cyan]") + + # Wait for completion if requested + elif wait: + console.print(f"\nโณ Waiting for execution to complete...") + try: + final_status = client.wait_for_completion(response.run_id, poll_interval=POLL_INTERVAL) + + # Update database + try: + db.update_run_status( + response.run_id, + final_status.status, + completed_at=datetime.now() if final_status.is_completed else None + ) + except Exception as e: + console.print(f"โš ๏ธ Failed to update database: {e}", style="yellow") + + console.print(f"๐Ÿ Execution completed with status: {status_emoji(final_status.status)} {final_status.status}") + + if final_status.is_completed: + console.print(f"๐Ÿ’ก View findings: [bold cyan]fuzzforge findings {response.run_id}[/bold cyan]") + + except KeyboardInterrupt: + console.print(f"\nโน๏ธ Monitoring cancelled (execution continues in background)", style="yellow") + except Exception as e: + handle_error(e, "waiting for completion") + + except Exception as e: + handle_error(e, "executing workflow") + + +@app.command("status") +def workflow_status( + execution_id: Optional[str] = typer.Argument(None, help="Execution ID to check (defaults to most recent)") +): + """ + ๐Ÿ“Š Check the status of a workflow execution + """ + try: + require_project() + + if execution_id: + validate_run_id(execution_id) + + db = get_project_db() + if not db: + raise DatabaseError("get project database", Exception("No database found")) + + # Get execution ID + if not execution_id: + recent_runs = db.list_runs(limit=1) + if not recent_runs: + console.print("โš ๏ธ No executions found in project database", style="yellow") + raise typer.Exit(0) + execution_id = recent_runs[0].run_id + console.print(f"๐Ÿ” Using most recent execution: {execution_id}") + else: + validate_run_id(execution_id) + + # Get status from API + with get_client() as client: + status = client.get_run_status(execution_id) + + # Update local database + try: + db.update_run_status( + execution_id, + status.status, + completed_at=status.updated_at if status.is_completed else None + ) + except Exception as e: + console.print(f"โš ๏ธ Failed to update database: {e}", style="yellow") + + # Display status + console.print(f"\n๐Ÿ“Š [bold]Execution Status: {execution_id}[/bold]\n") + + status_table = Table(show_header=False, 
box=box.SIMPLE) + status_table.add_column("Property", style="bold cyan") + status_table.add_column("Value") + + status_table.add_row("Execution ID", execution_id) + status_table.add_row("Workflow", status.workflow) + status_table.add_row("Status", f"{status_emoji(status.status)} {status.status}") + status_table.add_row("Created", status.created_at.strftime("%Y-%m-%d %H:%M:%S")) + status_table.add_row("Updated", status.updated_at.strftime("%Y-%m-%d %H:%M:%S")) + + if status.is_completed: + duration = status.updated_at - status.created_at + status_table.add_row("Duration", str(duration).split('.')[0]) # Remove microseconds + + console.print( + Panel.fit( + status_table, + title=f"๐Ÿ“Š Status Information", + box=box.ROUNDED + ) + ) + + # Show next steps + if status.is_running: + console.print(f"\n๐Ÿ’ก Monitor live: [bold cyan]fuzzforge monitor {execution_id}[/bold cyan]") + elif status.is_completed: + console.print(f"๐Ÿ’ก View findings: [bold cyan]fuzzforge finding {execution_id}[/bold cyan]") + elif status.is_failed: + console.print(f"๐Ÿ’ก Check logs: [bold cyan]fuzzforge workflow logs {execution_id}[/bold cyan]") + + except Exception as e: + handle_error(e, "getting execution status") + + +@app.command("history") +def workflow_history( + workflow: Optional[str] = typer.Option(None, "--workflow", "-w", help="Filter by workflow name"), + status: Optional[str] = typer.Option(None, "--status", "-s", help="Filter by status"), + limit: int = typer.Option(20, "--limit", "-l", help="Maximum number of executions to show") +): + """ + ๐Ÿ“‹ Show workflow execution history + """ + try: + require_project() + + if limit <= 0: + raise ValidationError("limit", limit, "a positive integer") + + db = get_project_db() + if not db: + raise DatabaseError("get project database", Exception("No database found")) + runs = db.list_runs(workflow=workflow, status=status, limit=limit) + + if not runs: + console.print("โš ๏ธ No executions found matching criteria", style="yellow") + return + + table = Table(box=box.ROUNDED) + table.add_column("Execution ID", style="bold cyan") + table.add_column("Workflow", style="bold") + table.add_column("Status", justify="center") + table.add_column("Target", style="dim") + table.add_column("Created", justify="center") + table.add_column("Parameters", justify="center", style="dim") + + for run in runs: + param_count = len(run.parameters) if run.parameters else 0 + param_str = f"{param_count} params" if param_count > 0 else "-" + + table.add_row( + run.run_id[:12] + "..." 
+                run.run_id[:12] + "..." if len(run.run_id) > MAX_RUN_ID_DISPLAY_LENGTH else run.run_id,
+                run.workflow,
+                f"{status_emoji(run.status)} {run.status}",
+                Path(run.target_path).name,
+                run.created_at.strftime("%m-%d %H:%M"),
+                param_str
+            )
+
+        console.print(f"\n๐Ÿ“‹ [bold]Workflow Execution History ({len(runs)})[/bold]")
+        if workflow:
+            console.print(f"   Filtered by workflow: {workflow}")
+        if status:
+            console.print(f"   Filtered by status: {status}")
+        console.print()
+        console.print(table)
+
+        console.print("\n๐Ÿ’ก Use [bold cyan]fuzzforge workflow status <execution-id>[/bold cyan] for detailed status")
+
+    except Exception as e:
+        handle_error(e, "listing execution history")
+
+
+@app.command("retry")
+def retry_workflow(
+    execution_id: Optional[str] = typer.Argument(None, help="Execution ID to retry (defaults to most recent)"),
+    modify_params: bool = typer.Option(
+        False, "--modify-params", "-m",
+        help="Interactively modify parameters before retrying"
+    )
+):
+    """
+    ๐Ÿ”„ Retry a workflow execution with the same or modified parameters
+    """
+    try:
+        require_project()
+
+        db = get_project_db()
+        if not db:
+            raise DatabaseError("get project database", Exception("No database found"))
+
+        # Get execution ID if not provided
+        if not execution_id:
+            recent_runs = db.list_runs(limit=1)
+            if not recent_runs:
+                console.print("โš ๏ธ No executions found to retry", style="yellow")
+                raise typer.Exit(0)
+            execution_id = recent_runs[0].run_id
+            console.print(f"๐Ÿ”„ Retrying most recent execution: {execution_id}")
+        else:
+            validate_run_id(execution_id)
+
+        # Get original execution
+        original_run = db.get_run(execution_id)
+        if not original_run:
+            raise ValidationError("execution_id", execution_id, "an existing execution ID in the database")
+
+        console.print(f"๐Ÿ”„ [bold]Retrying workflow:[/bold] {original_run.workflow}")
+        console.print(f"   Original Execution ID: {execution_id}")
+        console.print(f"   Target: {original_run.target_path}")
+
+        parameters = original_run.parameters.copy()
+
+        # Modify parameters if requested
+        if modify_params and parameters:
+            console.print("\n๐Ÿ“ [bold]Current parameters:[/bold]")
+            for key, value in parameters.items():
+                new_value = Prompt.ask(
+                    f"{key}",
+                    default=str(value),
+                    console=console
+                )
+                if new_value != str(value):
+                    # Try to maintain type
+                    try:
+                        if isinstance(value, bool):
+                            parameters[key] = new_value.lower() in ("true", "yes", "1", "on")
+                        elif isinstance(value, int):
+                            parameters[key] = int(new_value)
+                        elif isinstance(value, float):
+                            parameters[key] = float(new_value)
+                        elif isinstance(value, list):
+                            parameters[key] = [item.strip() for item in new_value.split(",") if item.strip()]
+                        else:
+                            parameters[key] = new_value
+                    except ValueError:
+                        parameters[key] = new_value
+
+        # Submit new execution
+        with get_client() as client:
+            submission = WorkflowSubmission(
+                target_path=original_run.target_path,
+                parameters=parameters
+            )
+
+            response = client.submit_workflow(original_run.workflow, submission)
+
+        console.print("\nโœ… Retry submitted successfully!", style="green")
+        console.print(f"   New Execution ID: [bold cyan]{response.run_id}[/bold cyan]")
+        console.print(f"   Status: {status_emoji(response.status)} {response.status}")
+
+        # Save to database
+        try:
+            run_record = RunRecord(
+                run_id=response.run_id,
+                workflow=original_run.workflow,
+                status=response.status,
+                target_path=original_run.target_path,
+                parameters=parameters,
+                created_at=datetime.now(),
+                metadata={"retry_of": execution_id}
+            )
+            db.save_run(run_record)
+        except Exception as e:
+            console.print(f"โš ๏ธ Failed to save execution to database: {e}", style="yellow")
+
+        console.print(f"\n๐Ÿ’ก Monitor progress: [bold cyan]fuzzforge monitor {response.run_id}[/bold cyan]")
+
+    except Exception as e:
+        handle_error(e, "retrying workflow")
+
+
+@app.callback()
+def workflow_exec_callback():
+    """
+    ๐Ÿš€ Workflow execution management
+    """
\ No newline at end of file
diff --git a/cli/src/fuzzforge_cli/commands/workflows.py b/cli/src/fuzzforge_cli/commands/workflows.py
new file mode 100644
index 0000000..cbdd96f
--- /dev/null
+++ b/cli/src/fuzzforge_cli/commands/workflows.py
@@ -0,0 +1,305 @@
+"""
+Workflow management commands.
+"""
+# Copyright (c) 2025 FuzzingLabs
+#
+# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
+# at the root of this repository for details.
+#
+# After the Change Date (four years from publication), this version of the
+# Licensed Work will be made available under the Apache License, Version 2.0.
+# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
+#
+# Additional attribution and requirements are provided in the NOTICE file.
+
+
+import json
+import typer
+from rich.console import Console
+from rich.table import Table
+from rich.panel import Panel
+from rich.prompt import Prompt
+from rich.syntax import Syntax
+from rich import box
+from typing import Optional
+
+from ..config import get_project_config, FuzzForgeConfig
+from ..fuzzy import enhanced_workflow_not_found_handler
+from fuzzforge_sdk import FuzzForgeClient
+
+console = Console()
+app = typer.Typer()
+
+
+def get_client() -> FuzzForgeClient:
+    """Get configured FuzzForge client"""
+    config = get_project_config() or FuzzForgeConfig()
+    return FuzzForgeClient(base_url=config.get_api_url(), timeout=config.get_timeout())
+
+
+@app.command("list")
+def list_workflows():
+    """
+    ๐Ÿ“‹ List all available security testing workflows
+    """
+    try:
+        with get_client() as client:
+            workflows = client.list_workflows()
+
+        if not workflows:
+            console.print("โŒ No workflows available", style="red")
+            return
+
+        table = Table(box=box.ROUNDED)
+        table.add_column("Name", style="bold cyan")
+        table.add_column("Version", justify="center")
+        table.add_column("Description")
+        table.add_column("Tags", style="dim")
+
+        for workflow in workflows:
+            tags_str = ", ".join(workflow.tags) if workflow.tags else ""
+            table.add_row(
+                workflow.name,
+                workflow.version,
+                workflow.description,
+                tags_str
+            )
+
+        console.print(f"\n๐Ÿ”ง [bold]Available Workflows ({len(workflows)})[/bold]\n")
+        console.print(table)
+
+        console.print("\n๐Ÿ’ก Use [bold cyan]fuzzforge workflows info <name>[/bold cyan] for detailed information")
+
+    except Exception as e:
+        console.print(f"โŒ Failed to fetch workflows: {e}", style="red")
+        raise typer.Exit(1)
+
+
+@app.command("info")
+def workflow_info(
+    name: str = typer.Argument(..., help="Workflow name to get information about")
+):
+    """
+    ๐Ÿ“‹ Show detailed information about a specific workflow
+    """
+    try:
+        with get_client() as client:
+            workflow = client.get_workflow_metadata(name)
+
+        console.print(f"\n๐Ÿ”ง [bold]Workflow: {workflow.name}[/bold]\n")
+
+        # Basic information
+        info_table = Table(show_header=False, box=box.SIMPLE)
+        info_table.add_column("Property", style="bold cyan")
+        info_table.add_column("Value")
+
+        info_table.add_row("Name", workflow.name)
+        info_table.add_row("Version", workflow.version)
+        info_table.add_row("Description", workflow.description)
+        if workflow.author:
+            info_table.add_row("Author", workflow.author)
+        if workflow.tags:
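+            # Tags are free-form labels from the workflow metadata (for example
+            # "fuzzing" or "static-analysis"); complete_workflow_tags() in
+            # completion.py surfaces the same values for tab completion.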
+ info_table.add_row("Tags", ", ".join(workflow.tags)) + info_table.add_row("Volume Modes", ", ".join(workflow.supported_volume_modes)) + info_table.add_row("Custom Docker", "โœ… Yes" if workflow.has_custom_docker else "โŒ No") + + console.print( + Panel.fit( + info_table, + title="โ„น๏ธ Basic Information", + box=box.ROUNDED + ) + ) + + # Parameters + if workflow.parameters: + console.print("\n๐Ÿ“ [bold]Parameters Schema[/bold]") + + param_table = Table(box=box.ROUNDED) + param_table.add_column("Parameter", style="bold") + param_table.add_column("Type", style="cyan") + param_table.add_column("Required", justify="center") + param_table.add_column("Default") + param_table.add_column("Description", style="dim") + + # Extract parameter information from JSON schema + properties = workflow.parameters.get("properties", {}) + required_params = set(workflow.parameters.get("required", [])) + defaults = workflow.default_parameters + + for param_name, param_schema in properties.items(): + param_type = param_schema.get("type", "unknown") + is_required = "โœ…" if param_name in required_params else "โŒ" + default_val = str(defaults.get(param_name, "")) if param_name in defaults else "" + description = param_schema.get("description", "") + + # Handle array types + if param_type == "array": + items_type = param_schema.get("items", {}).get("type", "unknown") + param_type = f"array[{items_type}]" + + param_table.add_row( + param_name, + param_type, + is_required, + default_val[:30] + "..." if len(default_val) > 30 else default_val, + description[:50] + "..." if len(description) > 50 else description + ) + + console.print(param_table) + + # Required modules + if workflow.required_modules: + console.print(f"\n๐Ÿ”ง [bold]Required Modules:[/bold] {', '.join(workflow.required_modules)}") + + console.print(f"\n๐Ÿ’ก Use [bold cyan]fuzzforge workflows parameters {name}[/bold cyan] for interactive parameter builder") + + except Exception as e: + error_message = str(e) + if "not found" in error_message.lower() or "404" in error_message: + # Try fuzzy matching for workflow name + enhanced_workflow_not_found_handler(name) + else: + console.print(f"โŒ Failed to get workflow info: {e}", style="red") + raise typer.Exit(1) + + +@app.command("parameters") +def workflow_parameters( + name: str = typer.Argument(..., help="Workflow name"), + output_file: Optional[str] = typer.Option( + None, "--output", "-o", + help="Save parameters to JSON file" + ), + interactive: bool = typer.Option( + True, "--interactive/--no-interactive", "-i/-n", + help="Interactive parameter builder" + ) +): + """ + ๐Ÿ“ Interactive parameter builder for workflows + """ + try: + with get_client() as client: + workflow = client.get_workflow_metadata(name) + param_response = client.get_workflow_parameters(name) + + console.print(f"\n๐Ÿ“ [bold]Parameter Builder: {name}[/bold]\n") + + if not workflow.parameters.get("properties"): + console.print("โ„น๏ธ This workflow has no configurable parameters") + return + + parameters = {} + properties = workflow.parameters.get("properties", {}) + required_params = set(workflow.parameters.get("required", [])) + defaults = param_response.defaults + + if interactive: + console.print("๐Ÿ”ง Enter parameter values (press Enter for default):\n") + + for param_name, param_schema in properties.items(): + param_type = param_schema.get("type", "string") + description = param_schema.get("description", "") + is_required = param_name in required_params + default_value = defaults.get(param_name) + + # Build prompt + 
prompt_text = f"{param_name}" + if description: + prompt_text += f" ({description})" + if param_type: + prompt_text += f" [{param_type}]" + if is_required: + prompt_text += " [bold red]*required*[/bold red]" + + # Get user input + while True: + if default_value is not None: + user_input = Prompt.ask( + prompt_text, + default=str(default_value), + console=console + ) + else: + user_input = Prompt.ask( + prompt_text, + console=console + ) + + # Validate and convert input + if user_input.strip() == "" and not is_required: + break + + if user_input.strip() == "" and is_required: + console.print("โŒ This parameter is required", style="red") + continue + + try: + # Type conversion + if param_type == "integer": + parameters[param_name] = int(user_input) + elif param_type == "number": + parameters[param_name] = float(user_input) + elif param_type == "boolean": + parameters[param_name] = user_input.lower() in ("true", "yes", "1", "on") + elif param_type == "array": + # Simple comma-separated array + parameters[param_name] = [item.strip() for item in user_input.split(",") if item.strip()] + else: + parameters[param_name] = user_input + + break + + except ValueError as e: + console.print(f"โŒ Invalid {param_type}: {e}", style="red") + + # Show summary + console.print("\n๐Ÿ“‹ [bold]Parameter Summary:[/bold]") + summary_table = Table(show_header=False, box=box.SIMPLE) + summary_table.add_column("Parameter", style="cyan") + summary_table.add_column("Value", style="white") + + for key, value in parameters.items(): + summary_table.add_row(key, str(value)) + + console.print(summary_table) + + else: + # Non-interactive mode - show schema + console.print("๐Ÿ“‹ Parameter Schema:") + schema_json = json.dumps(workflow.parameters, indent=2) + console.print(Syntax(schema_json, "json", theme="monokai")) + + if defaults: + console.print("\n๐Ÿ“‹ Default Values:") + defaults_json = json.dumps(defaults, indent=2) + console.print(Syntax(defaults_json, "json", theme="monokai")) + + # Save to file if requested + if output_file: + if parameters or not interactive: + data_to_save = parameters if interactive else {"schema": workflow.parameters, "defaults": defaults} + with open(output_file, 'w') as f: + json.dump(data_to_save, f, indent=2) + console.print(f"\n๐Ÿ’พ Parameters saved to: {output_file}") + else: + console.print("\nโŒ No parameters to save", style="red") + + except Exception as e: + console.print(f"โŒ Failed to build parameters: {e}", style="red") + raise typer.Exit(1) + + +@app.callback(invoke_without_command=True) +def workflows_callback(ctx: typer.Context): + """ + ๐Ÿ”ง Manage security testing workflows + """ + # Check if a subcommand is being invoked + if ctx.invoked_subcommand is not None: + # Let the subcommand handle it + return + + # Default to list when no subcommand provided + list_workflows() \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/completion.py b/cli/src/fuzzforge_cli/completion.py new file mode 100644 index 0000000..58aad6b --- /dev/null +++ b/cli/src/fuzzforge_cli/completion.py @@ -0,0 +1,190 @@ +""" +Shell auto-completion support for FuzzForge CLI. + +Provides intelligent tab completion for commands, workflows, run IDs, and parameters. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. 
+# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
+#
+# Additional attribution and requirements are provided in the NOTICE file.
+
+
+import typer
+from typing import List
+from pathlib import Path
+
+from .config import get_project_config, FuzzForgeConfig
+from .database import get_project_db
+from fuzzforge_sdk import FuzzForgeClient
+
+
+def complete_workflow_names(incomplete: str) -> List[str]:
+    """Auto-complete workflow names from the API."""
+    try:
+        config = get_project_config() or FuzzForgeConfig()
+        with FuzzForgeClient(base_url=config.get_api_url(), timeout=5.0) as client:
+            workflows = client.list_workflows()
+            workflow_names = [w.name for w in workflows]
+            return [name for name in workflow_names if name.startswith(incomplete)]
+    except Exception:
+        # Fallback to common workflow names if API is unavailable
+        common_workflows = [
+            "security_assessment",
+            "language_fuzzing",
+            "infrastructure_scan",
+            "static_analysis_scan",
+            "penetration_testing_scan",
+            "secret_detection_scan"
+        ]
+        return [name for name in common_workflows if name.startswith(incomplete)]
+
+
+def complete_run_ids(incomplete: str) -> List[str]:
+    """Auto-complete run IDs from local database."""
+    try:
+        db = get_project_db()
+        if db:
+            runs = db.list_runs(limit=50)  # most recent runs for completion
+            run_ids = [run.run_id for run in runs]
+            return [run_id for run_id in run_ids if run_id.startswith(incomplete)]
+    except Exception:
+        pass
+    return []
+
+
+def complete_target_paths(incomplete: str) -> List[str]:
+    """Auto-complete file/directory paths."""
+    try:
+        # Convert incomplete path to Path object
+        path = Path(incomplete) if incomplete else Path.cwd()
+
+        if path.is_dir():
+            # Complete directory contents
+            try:
+                entries = []
+                for entry in path.iterdir():
+                    entry_str = str(entry)
+                    if entry.is_dir():
+                        entry_str += "/"
+                    entries.append(entry_str)
+                return entries
+            except PermissionError:
+                return []
+        else:
+            # Complete parent directory contents that match the incomplete name
+            parent = path.parent
+            name = path.name
+            try:
+                entries = []
+                for entry in parent.iterdir():
+                    if entry.name.startswith(name):
+                        entry_str = str(entry)
+                        if entry.is_dir():
+                            entry_str += "/"
+                        entries.append(entry_str)
+                return entries
+            except (PermissionError, FileNotFoundError):
+                return []
+    except Exception:
+        return []
+
+
+def complete_volume_modes(incomplete: str) -> List[str]:
+    """Auto-complete volume mount modes."""
+    modes = ["ro", "rw"]
+    return [mode for mode in modes if mode.startswith(incomplete)]
+
+
+def complete_export_formats(incomplete: str) -> List[str]:
+    """Auto-complete export formats."""
+    formats = ["json", "csv", "html", "sarif"]
+    return [fmt for fmt in formats if fmt.startswith(incomplete)]
+
+
+def complete_severity_levels(incomplete: str) -> List[str]:
+    """Auto-complete severity levels."""
+    severities = ["critical", "high", "medium", "low", "info"]
+    return [sev for sev in severities if sev.startswith(incomplete)]
+
+
+def complete_workflow_tags(incomplete: str) -> List[str]:
+    """Auto-complete workflow tags."""
+    try:
+        config = get_project_config() or FuzzForgeConfig()
+        with FuzzForgeClient(base_url=config.get_api_url(), timeout=5.0) as client:
+            workflows = client.list_workflows()
+            all_tags = set()
+            for w in workflows:
+                if w.tags:
+                    all_tags.update(w.tags)
+            return [tag for tag in sorted(all_tags) if tag.startswith(incomplete)]
+    except Exception:
+        # Fallback tags
+        common_tags = [
+            "security", "fuzzing", "static-analysis", "infrastructure",
+            "secrets", "containers", "vulnerabilities", "pentest"
+        ]
+        return [tag for tag in common_tags if tag.startswith(incomplete)]
+
+
+def complete_config_keys(incomplete: str) -> List[str]:
+    """Auto-complete configuration keys."""
+    config_keys = [
+        "api_url",
+        "api_timeout",
+        "default_workflow",
+        "default_volume_mode",
+        "project_name",
+        "data_retention_days",
+        "auto_save_findings",
+        "notification_webhook"
+    ]
+    return [key for key in config_keys if key.startswith(incomplete)]
+
+
+# Completion callbacks for Typer
+WorkflowNameComplete = typer.Option(
+    autocompletion=complete_workflow_names,
+    help="Workflow name (tab completion available)"
+)
+
+RunIdComplete = typer.Option(
+    autocompletion=complete_run_ids,
+    help="Run ID (tab completion available)"
+)
+
+TargetPathComplete = typer.Argument(
+    autocompletion=complete_target_paths,
+    help="Target path (tab completion available)"
+)
+
+VolumeModeComplete = typer.Option(
+    autocompletion=complete_volume_modes,
+    help="Volume mode: ro or rw (tab completion available)"
+)
+
+ExportFormatComplete = typer.Option(
+    autocompletion=complete_export_formats,
+    help="Export format (tab completion available)"
+)
+
+SeverityComplete = typer.Option(
+    autocompletion=complete_severity_levels,
+    help="Severity level (tab completion available)"
+)
+
+WorkflowTagComplete = typer.Option(
+    autocompletion=complete_workflow_tags,
+    help="Workflow tag (tab completion available)"
+)
+
+ConfigKeyComplete = typer.Option(
+    autocompletion=complete_config_keys,
+    help="Configuration key (tab completion available)"
+)
\ No newline at end of file
diff --git a/cli/src/fuzzforge_cli/config.py b/cli/src/fuzzforge_cli/config.py
new file mode 100644
index 0000000..ba67c9e
--- /dev/null
+++ b/cli/src/fuzzforge_cli/config.py
@@ -0,0 +1,420 @@
+"""
+Configuration management for FuzzForge CLI.
+
+Extends project configuration with Cognee integration metadata
+and provides helpers for AI modules.
+"""
+# Copyright (c) 2025 FuzzingLabs
+#
+# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file
+# at the root of this repository for details.
+#
+# After the Change Date (four years from publication), this version of the
+# Licensed Work will be made available under the Apache License, Version 2.0.
+# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0
+#
+# Additional attribution and requirements are provided in the NOTICE file.
+ + +from __future__ import annotations + +import hashlib +import os +from pathlib import Path +from typing import Any, Dict, Optional + +try: # Optional dependency; fall back if not installed + from dotenv import load_dotenv +except ImportError: # pragma: no cover - optional dependency + load_dotenv = None + +import yaml +from pydantic import BaseModel, Field + + +def _generate_project_id(project_dir: Path, project_name: str) -> str: + """Generate a deterministic project identifier based on path and name.""" + resolved_path = str(project_dir.resolve()) + hash_input = f"{resolved_path}:{project_name}".encode() + return hashlib.sha256(hash_input).hexdigest()[:16] + + +class ProjectConfig(BaseModel): + """Project configuration model.""" + + name: str = "fuzzforge-project" + api_url: str = "http://localhost:8000" + default_timeout: int = 3600 + default_workflow: Optional[str] = None + id: Optional[str] = None + tenant_id: Optional[str] = None + + +class RetentionConfig(BaseModel): + """Data retention configuration.""" + + max_runs: int = 100 + keep_findings_days: int = 90 + + +class PreferencesConfig(BaseModel): + """User preferences.""" + + auto_save_findings: bool = True + show_progress_bars: bool = True + table_style: str = "rich" + color_output: bool = True + + +class CogneeConfig(BaseModel): + """Cognee integration metadata.""" + + enabled: bool = True + graph_database_provider: str = "kuzu" + data_directory: Optional[str] = None + system_directory: Optional[str] = None + backend_access_control: bool = True + project_id: Optional[str] = None + tenant_id: Optional[str] = None + + +class FuzzForgeConfig(BaseModel): + """Complete FuzzForge CLI configuration.""" + + project: ProjectConfig = Field(default_factory=ProjectConfig) + retention: RetentionConfig = Field(default_factory=RetentionConfig) + preferences: PreferencesConfig = Field(default_factory=PreferencesConfig) + cognee: CogneeConfig = Field(default_factory=CogneeConfig) + + @classmethod + def from_file(cls, config_path: Path) -> "FuzzForgeConfig": + """Load configuration from YAML file.""" + if not config_path.exists(): + return cls() + + try: + with open(config_path, "r", encoding="utf-8") as fh: + data = yaml.safe_load(fh) or {} + return cls(**data) + except Exception as exc: # pragma: no cover - defensive fallback + print(f"Warning: Failed to load config from {config_path}: {exc}") + return cls() + + def save_to_file(self, config_path: Path) -> None: + """Save configuration to YAML file.""" + config_path.parent.mkdir(parents=True, exist_ok=True) + with open(config_path, "w", encoding="utf-8") as fh: + yaml.dump( + self.model_dump(), + fh, + default_flow_style=False, + sort_keys=False, + ) + + # ------------------------------------------------------------------ + # Convenience helpers used by CLI and AI modules + # ------------------------------------------------------------------ + def ensure_project_metadata(self, project_dir: Path) -> bool: + """Ensure project id/tenant metadata is populated.""" + changed = False + project = self.project + if not project.id: + project.id = _generate_project_id(project_dir, project.name) + changed = True + if not project.tenant_id: + project.tenant_id = f"fuzzforge_project_{project.id}" + changed = True + return changed + + def ensure_cognee_defaults(self, project_dir: Path) -> bool: + """Ensure Cognee configuration and directories exist.""" + self.ensure_project_metadata(project_dir) + changed = False + + cognee = self.cognee + if not cognee.project_id: + cognee.project_id = self.project.id + 
changed = True + if not cognee.tenant_id: + cognee.tenant_id = self.project.tenant_id + changed = True + + base_dir = project_dir / ".fuzzforge" / "cognee" / f"project_{self.project.id}" + data_dir = base_dir / "data" + system_dir = base_dir / "system" + + for path in ( + base_dir, + data_dir, + system_dir, + system_dir / "kuzu_db", + system_dir / "lancedb", + ): + if not path.exists(): + path.mkdir(parents=True, exist_ok=True) + + if cognee.data_directory != str(data_dir): + cognee.data_directory = str(data_dir) + changed = True + if cognee.system_directory != str(system_dir): + cognee.system_directory = str(system_dir) + changed = True + + return changed + + def get_api_url(self) -> str: + """Get API URL with environment variable override.""" + return os.getenv("FUZZFORGE_API_URL", self.project.api_url) + + def get_timeout(self) -> int: + """Get timeout with environment variable override.""" + env_timeout = os.getenv("FUZZFORGE_TIMEOUT") + if env_timeout and env_timeout.isdigit(): + return int(env_timeout) + return self.project.default_timeout + + def get_project_context(self, project_dir: Path) -> Dict[str, str]: + """Return project metadata for AI integrations.""" + self.ensure_cognee_defaults(project_dir) + return { + "project_id": self.project.id or "unknown_project", + "project_name": self.project.name, + "tenant_id": self.project.tenant_id or "fuzzforge_tenant", + "data_directory": self.cognee.data_directory, + "system_directory": self.cognee.system_directory, + } + + def get_cognee_config(self, project_dir: Path) -> Dict[str, Any]: + """Expose Cognee configuration as a plain dictionary.""" + self.ensure_cognee_defaults(project_dir) + return self.cognee.model_dump() + + +# ---------------------------------------------------------------------- +# Project-level helpers used across the CLI +# ---------------------------------------------------------------------- + +def _get_project_paths(project_dir: Path) -> Dict[str, Path]: + config_dir = project_dir / ".fuzzforge" + return { + "config_dir": config_dir, + "config_path": config_dir / "config.yaml", + } + + +def get_project_config(project_dir: Optional[Path] = None) -> Optional[FuzzForgeConfig]: + """Get configuration for the current project.""" + project_dir = Path(project_dir or Path.cwd()) + paths = _get_project_paths(project_dir) + config_path = paths["config_path"] + + if not config_path.exists(): + return None + + config = FuzzForgeConfig.from_file(config_path) + if config.ensure_cognee_defaults(project_dir): + config.save_to_file(config_path) + return config + + +def ensure_project_config( + project_dir: Optional[Path] = None, + project_name: Optional[str] = None, + api_url: Optional[str] = None, +) -> FuzzForgeConfig: + """Ensure project configuration exists, creating defaults if needed.""" + project_dir = Path(project_dir or Path.cwd()) + paths = _get_project_paths(project_dir) + config_dir = paths["config_dir"] + config_path = paths["config_path"] + + config_dir.mkdir(parents=True, exist_ok=True) + + if config_path.exists(): + config = FuzzForgeConfig.from_file(config_path) + else: + config = FuzzForgeConfig() + + if project_name: + config.project.name = project_name + if api_url: + config.project.api_url = api_url + + if config.ensure_cognee_defaults(project_dir): + config.save_to_file(config_path) + else: + # Still ensure latest values persisted (e.g., updated name/url) + config.save_to_file(config_path) + + return config + + +def get_global_config() -> FuzzForgeConfig: + """Get global user configuration.""" + home = 
Path.home() + global_config_dir = home / ".config" / "fuzzforge" + global_config_path = global_config_dir / "config.yaml" + + if global_config_path.exists(): + return FuzzForgeConfig.from_file(global_config_path) + + return FuzzForgeConfig() + + +def save_global_config(config: FuzzForgeConfig) -> None: + """Save global user configuration.""" + home = Path.home() + global_config_dir = home / ".config" / "fuzzforge" + global_config_path = global_config_dir / "config.yaml" + config.save_to_file(global_config_path) + + +# ---------------------------------------------------------------------- +# Compatibility layer for AI modules +# ---------------------------------------------------------------------- + +class ProjectConfigManager: + """Lightweight wrapper mimicking the legacy Config class used by the AI module.""" + + def __init__(self, project_dir: Optional[Path] = None): + self.project_dir = Path(project_dir or Path.cwd()) + paths = _get_project_paths(self.project_dir) + self.config_path = paths["config_dir"] + self.file_path = paths["config_path"] + self._config = get_project_config(self.project_dir) + if self._config is None: + raise FileNotFoundError( + f"FuzzForge project not initialized in {self.project_dir}. Run 'ff init'." + ) + + # Legacy API ------------------------------------------------------ + def is_initialized(self) -> bool: + return self.file_path.exists() + + def get_project_context(self) -> Dict[str, str]: + return self._config.get_project_context(self.project_dir) + + def get_cognee_config(self) -> Dict[str, Any]: + return self._config.get_cognee_config(self.project_dir) + + def setup_cognee_environment(self) -> None: + cognee = self.get_cognee_config() + if not cognee.get("enabled", True): + return + + # Load project-specific environment overrides from .fuzzforge/.env if available + env_file = self.project_dir / ".fuzzforge" / ".env" + if env_file.exists(): + if load_dotenv: + load_dotenv(env_file, override=False) + else: + try: + for line in env_file.read_text(encoding="utf-8").splitlines(): + stripped = line.strip() + if not stripped or stripped.startswith("#"): + continue + if "=" not in stripped: + continue + key, value = stripped.split("=", 1) + os.environ.setdefault(key.strip(), value.strip()) + except Exception: # pragma: no cover - best effort fallback + pass + + backend_access = "true" if cognee.get("backend_access_control", True) else "false" + os.environ["ENABLE_BACKEND_ACCESS_CONTROL"] = backend_access + os.environ["GRAPH_DATABASE_PROVIDER"] = cognee.get("graph_database_provider", "kuzu") + + data_dir = cognee.get("data_directory") + system_dir = cognee.get("system_directory") + tenant_id = cognee.get("tenant_id", "fuzzforge_tenant") + + if data_dir: + os.environ["COGNEE_DATA_ROOT"] = data_dir + if system_dir: + os.environ["COGNEE_SYSTEM_ROOT"] = system_dir + os.environ["COGNEE_USER_ID"] = tenant_id + os.environ["COGNEE_TENANT_ID"] = tenant_id + + # Configure LLM provider defaults for Cognee. Values prefixed with COGNEE_ + # take precedence so users can segregate credentials. 
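+        # Illustrative precedence (hypothetical values): with both
+        # LLM_COGNEE_MODEL=gpt-4o and LLM_MODEL=gpt-4o-mini set, _env() below
+        # returns "gpt-4o", because the Cognee-prefixed names are checked first.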
+ def _env(*names: str, default: str | None = None) -> str | None: + for name in names: + value = os.getenv(name) + if value: + return value + return default + + provider = _env( + "LLM_COGNEE_PROVIDER", + "COGNEE_LLM_PROVIDER", + "LLM_PROVIDER", + default="openai", + ) + model = _env( + "LLM_COGNEE_MODEL", + "COGNEE_LLM_MODEL", + "LLM_MODEL", + "LITELLM_MODEL", + default="gpt-4o-mini", + ) + api_key = _env( + "LLM_COGNEE_API_KEY", + "COGNEE_LLM_API_KEY", + "LLM_API_KEY", + "OPENAI_API_KEY", + ) + endpoint = _env("LLM_COGNEE_ENDPOINT", "COGNEE_LLM_ENDPOINT", "LLM_ENDPOINT") + api_version = _env( + "LLM_COGNEE_API_VERSION", + "COGNEE_LLM_API_VERSION", + "LLM_API_VERSION", + ) + max_tokens = _env( + "LLM_COGNEE_MAX_TOKENS", + "COGNEE_LLM_MAX_TOKENS", + "LLM_MAX_TOKENS", + ) + + if provider: + os.environ["LLM_PROVIDER"] = provider + if model: + os.environ["LLM_MODEL"] = model + # Maintain backwards compatibility with components expecting LITELLM_MODEL + os.environ.setdefault("LITELLM_MODEL", model) + if api_key: + os.environ["LLM_API_KEY"] = api_key + # Provide OPENAI_API_KEY fallback when using OpenAI-compatible providers + if provider and provider.lower() in {"openai", "azure_openai", "custom"}: + os.environ.setdefault("OPENAI_API_KEY", api_key) + if endpoint: + os.environ["LLM_ENDPOINT"] = endpoint + if api_version: + os.environ["LLM_API_VERSION"] = api_version + if max_tokens: + os.environ["LLM_MAX_TOKENS"] = str(max_tokens) + + # Provide a default MCP endpoint for local FuzzForge backend access when unset + if not os.getenv("FUZZFORGE_MCP_URL"): + os.environ["FUZZFORGE_MCP_URL"] = os.getenv( + "FUZZFORGE_DEFAULT_MCP_URL", + "http://localhost:8010/mcp", + ) + + def refresh(self) -> None: + """Reload configuration from disk.""" + self._config = get_project_config(self.project_dir) + if self._config is None: + raise FileNotFoundError( + f"FuzzForge project not initialized in {self.project_dir}. Run 'ff init'." + ) + + # Convenience accessors ------------------------------------------ + @property + def fuzzforge_dir(self) -> Path: + return self.config_path + + def get_api_url(self) -> str: + return self._config.get_api_url() + + def get_timeout(self) -> int: + return self._config.get_timeout() diff --git a/cli/src/fuzzforge_cli/constants.py b/cli/src/fuzzforge_cli/constants.py new file mode 100644 index 0000000..231f5b7 --- /dev/null +++ b/cli/src/fuzzforge_cli/constants.py @@ -0,0 +1,73 @@ +""" +Constants for FuzzForge CLI. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
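+
+# Illustrative usage from other CLI modules (names as defined below):
+#
+#   from .constants import POLL_INTERVAL, STATUS_EMOJIS
+#   emoji = STATUS_EMOJIS.get(status, STATUS_EMOJIS["unknown"])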
+ + +# Database constants +DEFAULT_DB_TIMEOUT = 30.0 +DEFAULT_CLEANUP_DAYS = 90 +STATS_SAMPLE_SIZE = 100 + +# Network constants +DEFAULT_API_TIMEOUT = 30.0 +MAX_RETRIES = 3 +RETRY_DELAY = 1.0 +POLL_INTERVAL = 5.0 + +# Display constants +MAX_RUN_ID_DISPLAY_LENGTH = 15 +MAX_DESCRIPTION_LENGTH = 50 +MAX_DEFAULT_VALUE_LENGTH = 30 + +# Progress constants +PROGRESS_STEP_DELAYS = { + "validating": 0.3, + "connecting": 0.2, + "uploading": 0.2, + "creating": 0.3, + "initializing": 0.2 +} + +# Status emojis +STATUS_EMOJIS = { + "completed": "โœ…", + "running": "๐Ÿ”„", + "failed": "โŒ", + "queued": "โณ", + "cancelled": "โน๏ธ", + "pending": "๐Ÿ“‹", + "unknown": "โ“" +} + +# Severity styles for Rich +SEVERITY_STYLES = { + "error": "bold red", + "warning": "bold yellow", + "note": "bold blue", + "info": "bold cyan" +} + +# Default volume modes +DEFAULT_VOLUME_MODE = "ro" +SUPPORTED_VOLUME_MODES = ["ro", "rw"] + +# Default export formats +DEFAULT_EXPORT_FORMAT = "sarif" +SUPPORTED_EXPORT_FORMATS = ["sarif", "json", "csv"] + +# Default configuration +DEFAULT_CONFIG = { + "api_url": "http://localhost:8000", + "timeout": DEFAULT_API_TIMEOUT, + "max_retries": MAX_RETRIES, +} \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/database.py b/cli/src/fuzzforge_cli/database.py new file mode 100644 index 0000000..2f488fe --- /dev/null +++ b/cli/src/fuzzforge_cli/database.py @@ -0,0 +1,661 @@ +""" +Database module for FuzzForge CLI. + +Handles SQLite database operations for local project management, +including runs, findings, and crash storage. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
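+
+# The SQLite file lives at .fuzzforge/findings.db inside an initialized
+# project. Minimal usage sketch (helpers defined at the end of this module):
+#
+#   db = ensure_project_db()
+#   recent = db.list_runs(status="completed", limit=10)
+#   stats = db.get_stats()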
+ + +import sqlite3 +import json +import logging +from datetime import datetime +from pathlib import Path +from typing import Dict, Any, List, Optional, Union +from contextlib import contextmanager + +from pydantic import BaseModel +from .constants import DEFAULT_DB_TIMEOUT, DEFAULT_CLEANUP_DAYS, STATS_SAMPLE_SIZE + +logger = logging.getLogger(__name__) + + +class RunRecord(BaseModel): + """Database record for workflow runs""" + run_id: str + workflow: str + status: str + target_path: str + parameters: Dict[str, Any] = {} + created_at: datetime + completed_at: Optional[datetime] = None + metadata: Dict[str, Any] = {} + + +class FindingRecord(BaseModel): + """Database record for findings""" + id: Optional[int] = None + run_id: str + sarif_data: Dict[str, Any] + summary: Dict[str, Any] = {} + created_at: datetime + + +class CrashRecord(BaseModel): + """Database record for crash reports""" + id: Optional[int] = None + run_id: str + crash_id: str + signal: Optional[str] = None + stack_trace: Optional[str] = None + input_file: Optional[str] = None + severity: str = "medium" + timestamp: datetime + + +class FuzzForgeDatabase: + """SQLite database manager for FuzzForge CLI projects""" + + SCHEMA = """ + CREATE TABLE IF NOT EXISTS runs ( + run_id TEXT PRIMARY KEY, + workflow TEXT NOT NULL, + status TEXT NOT NULL, + target_path TEXT NOT NULL, + parameters TEXT DEFAULT '{}', + created_at TIMESTAMP NOT NULL, + completed_at TIMESTAMP, + metadata TEXT DEFAULT '{}' + ); + + CREATE TABLE IF NOT EXISTS findings ( + id INTEGER PRIMARY KEY AUTOINCREMENT, + run_id TEXT NOT NULL, + sarif_data TEXT NOT NULL, + summary TEXT DEFAULT '{}', + created_at TIMESTAMP NOT NULL, + FOREIGN KEY (run_id) REFERENCES runs (run_id) + ); + + CREATE TABLE IF NOT EXISTS crashes ( + id INTEGER PRIMARY KEY AUTOINCREMENT, + run_id TEXT NOT NULL, + crash_id TEXT NOT NULL, + signal TEXT, + stack_trace TEXT, + input_file TEXT, + severity TEXT DEFAULT 'medium', + timestamp TIMESTAMP NOT NULL, + FOREIGN KEY (run_id) REFERENCES runs (run_id) + ); + + CREATE INDEX IF NOT EXISTS idx_runs_status ON runs (status); + CREATE INDEX IF NOT EXISTS idx_runs_workflow ON runs (workflow); + CREATE INDEX IF NOT EXISTS idx_runs_created_at ON runs (created_at); + CREATE INDEX IF NOT EXISTS idx_findings_run_id ON findings (run_id); + CREATE INDEX IF NOT EXISTS idx_crashes_run_id ON crashes (run_id); + """ + + def __init__(self, db_path: Union[str, Path]): + self.db_path = Path(db_path) + self.db_path.parent.mkdir(parents=True, exist_ok=True) + self._initialize_db() + + def _initialize_db(self): + """Initialize database with schema, handling corruption""" + try: + with self.connection() as conn: + # Test database integrity first + conn.execute("PRAGMA integrity_check").fetchone() + conn.executescript(self.SCHEMA) + except sqlite3.DatabaseError as e: + logger.warning(f"Database corruption detected: {e}") + # Backup corrupted database + backup_path = self.db_path.with_suffix('.db.corrupted') + if self.db_path.exists(): + self.db_path.rename(backup_path) + logger.info(f"Corrupted database backed up to: {backup_path}") + + # Create fresh database + with self.connection() as conn: + conn.executescript(self.SCHEMA) + logger.info("Created fresh database after corruption") + + @contextmanager + def connection(self): + """Context manager for database connections with proper resource management""" + conn = None + try: + conn = sqlite3.connect( + self.db_path, + detect_types=sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES, + timeout=DEFAULT_DB_TIMEOUT + ) + 
conn.row_factory = sqlite3.Row
+            # Enable WAL mode for better concurrency
+            conn.execute("PRAGMA journal_mode=WAL")
+            # Enable query optimization
+            conn.execute("PRAGMA optimize")
+            yield conn
+            conn.commit()
+        except sqlite3.OperationalError as e:
+            if conn:
+                try:
+                    conn.rollback()
+                except Exception:
+                    pass  # Connection might be broken
+            if "database is locked" in str(e).lower():
+                raise sqlite3.OperationalError(
+                    "Database is locked. Another FuzzForge process may be running."
+                ) from e
+            elif "database disk image is malformed" in str(e).lower():
+                raise sqlite3.DatabaseError(
+                    "Database is corrupted. Use 'ff init --force' to reset."
+                ) from e
+            raise
+        except Exception:
+            if conn:
+                try:
+                    conn.rollback()
+                except Exception:
+                    pass  # Connection might be broken
+            raise
+        finally:
+            if conn:
+                try:
+                    conn.close()
+                except Exception:
+                    pass  # Ensure cleanup even if close fails
+
+    # Run management methods
+
+    def save_run(self, run: RunRecord) -> None:
+        """Save or update a run record with validation"""
+        try:
+            # Validate JSON serialization before database write
+            parameters_json = json.dumps(run.parameters)
+            metadata_json = json.dumps(run.metadata)
+
+            with self.connection() as conn:
+                conn.execute("""
+                    INSERT OR REPLACE INTO runs
+                    (run_id, workflow, status, target_path, parameters, created_at, completed_at, metadata)
+                    VALUES (?, ?, ?, ?, ?, ?, ?, ?)
+                """, (
+                    run.run_id,
+                    run.workflow,
+                    run.status,
+                    run.target_path,
+                    parameters_json,
+                    run.created_at,
+                    run.completed_at,
+                    metadata_json
+                ))
+        except (TypeError, ValueError) as e:
+            raise ValueError(f"Failed to serialize run data: {e}") from e
+
+    def get_run(self, run_id: str) -> Optional[RunRecord]:
+        """Get a run record by ID with error handling"""
+        with self.connection() as conn:
+            row = conn.execute(
+                "SELECT * FROM runs WHERE run_id = ?",
+                (run_id,)
+            ).fetchone()
+
+            if row:
+                try:
+                    return RunRecord(
+                        run_id=row["run_id"],
+                        workflow=row["workflow"],
+                        status=row["status"],
+                        target_path=row["target_path"],
+                        parameters=json.loads(row["parameters"] or "{}"),
+                        created_at=row["created_at"],
+                        completed_at=row["completed_at"],
+                        metadata=json.loads(row["metadata"] or "{}")
+                    )
+                except (json.JSONDecodeError, TypeError) as e:
+                    logger.warning(f"Failed to deserialize run {run_id}: {e}")
+                    # Return with empty dicts for corrupted JSON
+                    return RunRecord(
+                        run_id=row["run_id"],
+                        workflow=row["workflow"],
+                        status=row["status"],
+                        target_path=row["target_path"],
+                        parameters={},
+                        created_at=row["created_at"],
+                        completed_at=row["completed_at"],
+                        metadata={}
+                    )
+        return None
+
+    def list_runs(
+        self,
+        workflow: Optional[str] = None,
+        status: Optional[str] = None,
+        limit: int = 50
+    ) -> List[RunRecord]:
+        """List runs with optional filters"""
+        query = "SELECT * FROM runs WHERE 1=1"
+        params = []
+
+        if workflow:
+            query += " AND workflow = ?"
+            params.append(workflow)
+
+        if status:
+            query += " AND status = ?"
+            params.append(status)
+
+        query += " ORDER BY created_at DESC LIMIT ?"
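+        # Illustrative final SQL when both filters are set (values are bound
+        # as parameters, never interpolated into the string):
+        #   SELECT * FROM runs WHERE 1=1 AND workflow = ? AND status = ?
+        #   ORDER BY created_at DESC LIMIT ?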
+ params.append(limit) + + with self.connection() as conn: + rows = conn.execute(query, params).fetchall() + runs = [] + for row in rows: + try: + runs.append(RunRecord( + run_id=row["run_id"], + workflow=row["workflow"], + status=row["status"], + target_path=row["target_path"], + parameters=json.loads(row["parameters"] or "{}"), + created_at=row["created_at"], + completed_at=row["completed_at"], + metadata=json.loads(row["metadata"] or "{}") + )) + except (json.JSONDecodeError, TypeError) as e: + logger.warning(f"Skipping corrupted run {row['run_id']}: {e}") + # Skip corrupted records instead of failing + continue + return runs + + def update_run_status(self, run_id: str, status: str, completed_at: Optional[datetime] = None): + """Update run status""" + with self.connection() as conn: + conn.execute( + "UPDATE runs SET status = ?, completed_at = ? WHERE run_id = ?", + (status, completed_at, run_id) + ) + + # Findings management methods + + def save_findings(self, finding: FindingRecord) -> int: + """Save findings and return the ID""" + with self.connection() as conn: + cursor = conn.execute(""" + INSERT INTO findings (run_id, sarif_data, summary, created_at) + VALUES (?, ?, ?, ?) + """, ( + finding.run_id, + json.dumps(finding.sarif_data), + json.dumps(finding.summary), + finding.created_at + )) + return cursor.lastrowid + + def get_findings(self, run_id: str) -> Optional[FindingRecord]: + """Get findings for a run""" + with self.connection() as conn: + row = conn.execute( + "SELECT * FROM findings WHERE run_id = ? ORDER BY created_at DESC LIMIT 1", + (run_id,) + ).fetchone() + + if row: + return FindingRecord( + id=row["id"], + run_id=row["run_id"], + sarif_data=json.loads(row["sarif_data"]), + summary=json.loads(row["summary"]), + created_at=row["created_at"] + ) + return None + + def list_findings(self, limit: int = 50) -> List[FindingRecord]: + """List recent findings""" + with self.connection() as conn: + rows = conn.execute(""" + SELECT * FROM findings + ORDER BY created_at DESC + LIMIT ? + """, (limit,)).fetchall() + + return [ + FindingRecord( + id=row["id"], + run_id=row["run_id"], + sarif_data=json.loads(row["sarif_data"]), + summary=json.loads(row["summary"]), + created_at=row["created_at"] + ) + for row in rows + ] + + def get_all_findings(self, + workflow: Optional[str] = None, + severity: Optional[List[str]] = None, + since_date: Optional[datetime] = None, + limit: Optional[int] = None) -> List[FindingRecord]: + """Get all findings with optional filters""" + with self.connection() as conn: + query = """ + SELECT f.*, r.workflow + FROM findings f + JOIN runs r ON f.run_id = r.run_id + WHERE 1=1 + """ + params = [] + + if workflow: + query += " AND r.workflow = ?" + params.append(workflow) + + if since_date: + query += " AND f.created_at >= ?" + params.append(since_date) + + query += " ORDER BY f.created_at DESC" + + if limit: + query += " LIMIT ?" 
+ params.append(limit) + + rows = conn.execute(query, params).fetchall() + + findings = [] + for row in rows: + try: + finding = FindingRecord( + id=row["id"], + run_id=row["run_id"], + sarif_data=json.loads(row["sarif_data"]), + summary=json.loads(row["summary"]), + created_at=row["created_at"] + ) + + # Filter by severity if specified + if severity: + finding_severities = set() + if "runs" in finding.sarif_data: + for run in finding.sarif_data["runs"]: + for result in run.get("results", []): + finding_severities.add(result.get("level", "note").lower()) + + if not any(sev.lower() in finding_severities for sev in severity): + continue + + findings.append(finding) + except (json.JSONDecodeError, KeyError) as e: + logger.warning(f"Skipping malformed finding {row['id']}: {e}") + continue + + return findings + + def get_findings_by_workflow(self, workflow: str) -> List[FindingRecord]: + """Get all findings for a specific workflow""" + return self.get_all_findings(workflow=workflow) + + def get_aggregated_stats(self) -> Dict[str, Any]: + """Get aggregated statistics for all findings using SQL aggregation""" + with self.connection() as conn: + # Total findings and runs + total_findings = conn.execute("SELECT COUNT(*) FROM findings").fetchone()[0] + total_runs = conn.execute("SELECT COUNT(DISTINCT run_id) FROM findings").fetchone()[0] + + # Findings by workflow + workflow_stats = conn.execute(""" + SELECT r.workflow, COUNT(f.id) as count + FROM findings f + JOIN runs r ON f.run_id = r.run_id + GROUP BY r.workflow + ORDER BY count DESC + """).fetchall() + + # Recent activity + recent_findings = conn.execute(""" + SELECT COUNT(*) FROM findings + WHERE created_at > datetime('now', '-7 days') + """).fetchone()[0] + + # Use SQL JSON functions to aggregate severity stats efficiently + # This avoids loading all findings into memory + severity_stats = conn.execute(""" + SELECT + SUM(json_array_length(json_extract(sarif_data, '$.runs[0].results'))) as total_issues, + COUNT(*) as finding_count + FROM findings + WHERE json_extract(sarif_data, '$.runs[0].results') IS NOT NULL + """).fetchone() + + total_issues = severity_stats["total_issues"] or 0 + + # Get severity distribution using SQL + # Note: This is a simplified version - for full accuracy we'd need JSON parsing + # But it's much more efficient than loading all data into Python + severity_counts = {"error": 0, "warning": 0, "note": 0, "info": 0} + + # Sample the first N findings for severity distribution + # This gives a good approximation without loading everything + sample_findings = conn.execute(""" + SELECT sarif_data + FROM findings + LIMIT ? 
+ """, (STATS_SAMPLE_SIZE,)).fetchall() + + for row in sample_findings: + try: + data = json.loads(row["sarif_data"]) + if "runs" in data: + for run in data["runs"]: + for result in run.get("results", []): + level = result.get("level", "note").lower() + severity_counts[level] = severity_counts.get(level, 0) + 1 + except (json.JSONDecodeError, KeyError): + continue + + # Extrapolate severity counts if we have more than sample size + if total_findings > STATS_SAMPLE_SIZE: + multiplier = total_findings / STATS_SAMPLE_SIZE + for key in severity_counts: + severity_counts[key] = int(severity_counts[key] * multiplier) + + return { + "total_findings_records": total_findings, + "total_runs": total_runs, + "total_issues": total_issues, + "severity_distribution": severity_counts, + "workflows": {row["workflow"]: row["count"] for row in workflow_stats}, + "recent_findings": recent_findings, + "last_updated": datetime.now() + } + + # Crash management methods + + def save_crash(self, crash: CrashRecord) -> int: + """Save crash report and return the ID""" + with self.connection() as conn: + cursor = conn.execute(""" + INSERT INTO crashes + (run_id, crash_id, signal, stack_trace, input_file, severity, timestamp) + VALUES (?, ?, ?, ?, ?, ?, ?) + """, ( + crash.run_id, + crash.crash_id, + crash.signal, + crash.stack_trace, + crash.input_file, + crash.severity, + crash.timestamp + )) + return cursor.lastrowid + + def get_crashes(self, run_id: str) -> List[CrashRecord]: + """Get all crashes for a run""" + with self.connection() as conn: + rows = conn.execute( + "SELECT * FROM crashes WHERE run_id = ? ORDER BY timestamp DESC", + (run_id,) + ).fetchall() + + return [ + CrashRecord( + id=row["id"], + run_id=row["run_id"], + crash_id=row["crash_id"], + signal=row["signal"], + stack_trace=row["stack_trace"], + input_file=row["input_file"], + severity=row["severity"], + timestamp=row["timestamp"] + ) + for row in rows + ] + + # Utility methods + + def cleanup_old_runs(self, keep_days: int = DEFAULT_CLEANUP_DAYS) -> int: + """Remove old runs and associated data""" + cutoff_date = datetime.now().replace( + hour=0, minute=0, second=0, microsecond=0 + ) - datetime.timedelta(days=keep_days) + + with self.connection() as conn: + # Get run IDs to delete + old_runs = conn.execute( + "SELECT run_id FROM runs WHERE created_at < ?", + (cutoff_date,) + ).fetchall() + + if not old_runs: + return 0 + + run_ids = [row["run_id"] for row in old_runs] + placeholders = ",".join("?" 
* len(run_ids)) + + # Delete associated findings and crashes + conn.execute(f"DELETE FROM findings WHERE run_id IN ({placeholders})", run_ids) + conn.execute(f"DELETE FROM crashes WHERE run_id IN ({placeholders})", run_ids) + + # Delete runs + conn.execute(f"DELETE FROM runs WHERE run_id IN ({placeholders})", run_ids) + + return len(run_ids) + + def get_stats(self) -> Dict[str, Any]: + """Get database statistics""" + with self.connection() as conn: + stats = {} + + # Run counts by status + run_stats = conn.execute(""" + SELECT status, COUNT(*) as count + FROM runs + GROUP BY status + """).fetchall() + stats["runs_by_status"] = {row["status"]: row["count"] for row in run_stats} + + # Total counts + stats["total_runs"] = conn.execute("SELECT COUNT(*) FROM runs").fetchone()[0] + stats["total_findings"] = conn.execute("SELECT COUNT(*) FROM findings").fetchone()[0] + stats["total_crashes"] = conn.execute("SELECT COUNT(*) FROM crashes").fetchone()[0] + + # Recent activity + stats["runs_last_7_days"] = conn.execute(""" + SELECT COUNT(*) FROM runs + WHERE created_at > datetime('now', '-7 days') + """).fetchone()[0] + + return stats + + def health_check(self) -> Dict[str, Any]: + """Perform database health check""" + health = { + "healthy": True, + "issues": [], + "recommendations": [] + } + + try: + with self.connection() as conn: + # Check database integrity + integrity_result = conn.execute("PRAGMA integrity_check").fetchone() + if integrity_result[0] != "ok": + health["healthy"] = False + health["issues"].append(f"Database integrity check failed: {integrity_result[0]}") + + # Check for orphaned records + orphaned_findings = conn.execute(""" + SELECT COUNT(*) FROM findings + WHERE run_id NOT IN (SELECT run_id FROM runs) + """).fetchone()[0] + + if orphaned_findings > 0: + health["issues"].append(f"Found {orphaned_findings} orphaned findings") + health["recommendations"].append("Run database cleanup to remove orphaned records") + + orphaned_crashes = conn.execute(""" + SELECT COUNT(*) FROM crashes + WHERE run_id NOT IN (SELECT run_id FROM runs) + """).fetchone()[0] + + if orphaned_crashes > 0: + health["issues"].append(f"Found {orphaned_crashes} orphaned crashes") + + # Check database size + db_size = self.db_path.stat().st_size if self.db_path.exists() else 0 + if db_size > 100 * 1024 * 1024: # 100MB + health["recommendations"].append("Database is large (>100MB). 
Consider cleanup.") + + except Exception as e: + health["healthy"] = False + health["issues"].append(f"Health check failed: {e}") + + return health + + +def get_project_db(project_dir: Optional[Path] = None) -> Optional[FuzzForgeDatabase]: + """Get the database for the current project with error handling""" + if project_dir is None: + project_dir = Path.cwd() + + fuzzforge_dir = project_dir / ".fuzzforge" + if not fuzzforge_dir.exists(): + return None + + db_path = fuzzforge_dir / "findings.db" + try: + return FuzzForgeDatabase(db_path) + except Exception as e: + logger.error(f"Failed to open project database: {e}") + raise sqlite3.DatabaseError(f"Failed to open project database: {e}") from e + + +def ensure_project_db(project_dir: Optional[Path] = None) -> FuzzForgeDatabase: + """Ensure project database exists, create if needed with error handling""" + if project_dir is None: + project_dir = Path.cwd() + + fuzzforge_dir = project_dir / ".fuzzforge" + try: + fuzzforge_dir.mkdir(exist_ok=True) + except PermissionError as e: + raise PermissionError(f"Cannot create .fuzzforge directory: {e}") from e + + db_path = fuzzforge_dir / "findings.db" + try: + return FuzzForgeDatabase(db_path) + except Exception as e: + logger.error(f"Failed to create/open project database: {e}") + raise sqlite3.DatabaseError(f"Failed to create project database: {e}") from e \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/exceptions.py b/cli/src/fuzzforge_cli/exceptions.py new file mode 100644 index 0000000..b59e30d --- /dev/null +++ b/cli/src/fuzzforge_cli/exceptions.py @@ -0,0 +1,487 @@ +""" +Enhanced exception handling and error utilities for FuzzForge CLI with rich context display. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
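+
+# CLI errors pair a message with an actionable hint. Illustrative use of the
+# classes defined below:
+#
+#   raise ValidationError("limit", -1, "a positive integer")
+#   # -> message "Invalid limit: -1", hint "Expected a positive integer"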
+ + +import time +import functools +from typing import Any, Callable, Optional, Type, Union, List +from pathlib import Path + +import typer +import httpx +from rich.console import Console +from rich.panel import Panel +from rich.text import Text +from rich.table import Table +from rich.columns import Columns +from rich.syntax import Syntax +from rich.markdown import Markdown + +# Import SDK exceptions for rich handling +from fuzzforge_sdk.exceptions import ( + FuzzForgeError as SDKFuzzForgeError, + FuzzForgeHTTPError, + DeploymentError, + WorkflowExecutionError, + ContainerError, + VolumeError, + ValidationError as SDKValidationError, + ConnectionError as SDKConnectionError +) + +console = Console() + + +class FuzzForgeError(Exception): + """Base exception for FuzzForge CLI errors (legacy CLI-specific errors)""" + + def __init__(self, message: str, hint: Optional[str] = None, exit_code: int = 1): + self.message = message + self.hint = hint + self.exit_code = exit_code + super().__init__(message) + + +class ProjectNotFoundError(FuzzForgeError): + """Raised when no FuzzForge project is found in current directory""" + + def __init__(self): + super().__init__( + "No FuzzForge project found in current directory", + "Run 'ff init' to initialize a new project" + ) + + +class APIConnectionError(FuzzForgeError): + """Legacy API connection error for backward compatibility""" + + def __init__(self, url: str, original_error: Exception): + self.url = url + self.original_error = original_error + + if isinstance(original_error, httpx.ConnectTimeout): + message = f"Connection timeout to FuzzForge API at {url}" + hint = "Check if the API server is running and the URL is correct" + elif isinstance(original_error, httpx.ConnectError): + message = f"Failed to connect to FuzzForge API at {url}" + hint = "Verify the API URL is correct and the server is accessible" + elif isinstance(original_error, httpx.TimeoutException): + message = f"Request timeout to FuzzForge API at {url}" + hint = "The API server may be overloaded. Try again later" + else: + message = f"API connection error: {str(original_error)}" + hint = "Check your network connection and API configuration" + + super().__init__(message, hint) + + +class DatabaseError(FuzzForgeError): + """Raised when database operations fail""" + + def __init__(self, operation: str, original_error: Exception): + self.operation = operation + self.original_error = original_error + + message = f"Database error during {operation}: {str(original_error)}" + hint = "The database may be corrupted. 
Try 'ff init --force' to reset" + + super().__init__(message, hint) + + +class ValidationError(FuzzForgeError): + """Legacy validation error for CLI-specific validation""" + + def __init__(self, field: str, value: Any, expected: str): + self.field = field + self.value = value + self.expected = expected + + message = f"Invalid {field}: {value}" + hint = f"Expected {expected}" + + super().__init__(message, hint) + + +class FileOperationError(FuzzForgeError): + """Raised when file operations fail""" + + def __init__(self, operation: str, path: Union[str, Path], original_error: Exception): + self.operation = operation + self.path = Path(path) + self.original_error = original_error + + if isinstance(original_error, FileNotFoundError): + message = f"File not found: {path}" + hint = "Check the path exists and you have permission to access it" + elif isinstance(original_error, PermissionError): + message = f"Permission denied: {path}" + hint = "Check file permissions or run with appropriate privileges" + else: + message = f"File operation failed ({operation}): {str(original_error)}" + hint = "Check the file path and permissions" + + super().__init__(message, hint) + + +def display_container_logs(diagnostics, title: str = "Container Logs"): + """Display container logs in a rich format.""" + if not diagnostics or not diagnostics.logs: + return + + # Show last 20 lines of logs + recent_logs = diagnostics.logs[-20:] if len(diagnostics.logs) > 20 else diagnostics.logs + + log_content = [] + for log_entry in recent_logs: + timestamp = log_entry.timestamp.strftime("%H:%M:%S") + level_color = { + 'ERROR': 'red', + 'WARNING': 'yellow', + 'INFO': 'blue', + 'DEBUG': 'dim white' + }.get(log_entry.level, 'white') + + log_line = f"[dim]{timestamp}[/dim] [{level_color}]{log_entry.level}[/{level_color}] {log_entry.message}" + log_content.append(log_line) + + if log_content: + logs_panel = Panel( + "\n".join(log_content), + title=title, + title_align="left", + border_style="dim", + expand=False + ) + console.print(logs_panel) + + +def display_container_diagnostics(diagnostics): + """Display comprehensive container diagnostics.""" + if not diagnostics: + return + + # Container Status Table + status_table = Table(title="Container Status", show_header=False, box=None) + status_table.add_column("Property", style="bold") + status_table.add_column("Value") + + status_color = { + 'running': 'green', + 'exited': 'red', + 'failed': 'red', + 'created': 'yellow', + 'unknown': 'dim' + }.get(diagnostics.status.lower(), 'white') + + status_table.add_row("Status", f"[{status_color}]{diagnostics.status}[/{status_color}]") + + if diagnostics.exit_code is not None: + exit_color = 'green' if diagnostics.exit_code == 0 else 'red' + status_table.add_row("Exit Code", f"[{exit_color}]{diagnostics.exit_code}[/{exit_color}]") + + if diagnostics.error: + status_table.add_row("Error", f"[red]{diagnostics.error}[/red]") + + # Resource Usage + if diagnostics.resource_usage: + memory_limit = diagnostics.resource_usage.get('memory_limit', 0) + if memory_limit > 0: + memory_mb = memory_limit // (1024 * 1024) + status_table.add_row("Memory Limit", f"{memory_mb} MB") + + console.print(status_table) + + # Volume Mounts + if diagnostics.volume_mounts: + console.print("\n[bold]Volume Mounts:[/bold]") + for mount in diagnostics.volume_mounts: + mount_info = f" {mount['source']} โ†’ {mount['destination']} ([dim]{mount['mode']}[/dim])" + console.print(mount_info) + + +def display_error_patterns(error_patterns): + """Display detected error patterns.""" + 
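+    # error_patterns maps snake_case pattern names to the log lines that matched
+    # them (as collected on the SDK error context); only the first three matches
+    # per pattern are printed below to keep the output readable.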
if not error_patterns: + return + + console.print("\n[bold red]๐Ÿ” Detected Issues:[/bold red]") + + for error_type, messages in error_patterns.items(): + # Format error type name + formatted_type = error_type.replace('_', ' ').title() + console.print(f"\n[bold yellow]โ€ข {formatted_type}:[/bold yellow]") + + for message in messages[:3]: # Show first 3 messages + console.print(f" [dim]โ–ธ[/dim] {message}") + + if len(messages) > 3: + console.print(f" [dim]โ–ธ ... and {len(messages) - 3} more similar messages[/dim]") + + +def display_suggestions(suggestions: List[str]): + """Display actionable suggestions.""" + if not suggestions: + return + + console.print("\n[bold green]๐Ÿ’ก Suggested Fixes:[/bold green]") + + for i, suggestion in enumerate(suggestions[:6], 1): # Show max 6 suggestions + console.print(f" [bold green]{i}.[/bold green] {suggestion}") + + +def handle_error(error: Exception, context: str = "") -> None: + """ + Display comprehensive error messages with rich context and exit appropriately. + + Args: + error: The exception that occurred + context: Additional context about where the error occurred + """ + # Handle SDK errors with rich context + if isinstance(error, SDKFuzzForgeError): + console.print() # Add some spacing + + # Main error message + error_title = f"โŒ {error.__class__.__name__}" + if context: + error_title += f" during {context}" + + console.print(Panel( + error.get_summary(), + title=error_title, + title_align="left", + border_style="red", + expand=False + )) + + # Show detailed context if available + if hasattr(error, 'context') and error.context: + ctx = error.context + + # Container diagnostics + if ctx.container_diagnostics: + console.print("\n[bold]Container Diagnostics:[/bold]") + display_container_diagnostics(ctx.container_diagnostics) + display_container_logs(ctx.container_diagnostics) + + # Error patterns + if ctx.error_patterns: + display_error_patterns(ctx.error_patterns) + + # API context + if ctx.url: + console.print(f"\n[dim]Request URL: {ctx.url}[/dim]") + + if ctx.response_data and isinstance(ctx.response_data, dict) and 'raw' not in ctx.response_data: + console.print(f"[dim]API Response: {ctx.response_data}[/dim]") + + # Suggestions + if ctx.suggested_fixes: + display_suggestions(ctx.suggested_fixes) + + console.print() # Add spacing before exit + raise typer.Exit(1) + + # Handle legacy CLI errors + elif isinstance(error, FuzzForgeError): + error_text = Text() + error_text.append("โŒ ", style="red") + error_text.append(error.message, style="red") + + if context: + error_text.append(f" ({context})", style="dim red") + + console.print(error_text) + + if error.hint: + hint_text = Text() + hint_text.append("๐Ÿ’ก ", style="yellow") + hint_text.append(error.hint, style="yellow") + console.print(hint_text) + + raise typer.Exit(error.exit_code) + + elif isinstance(error, KeyboardInterrupt): + console.print("\nโน๏ธ Operation cancelled by user", style="yellow") + raise typer.Exit(130) # Standard exit code for SIGINT + + else: + # Unexpected errors - show minimal info to user, log details + console.print() + + error_panel = Panel( + f"An unexpected error occurred: {str(error)}", + title="โŒ Unexpected Error", + title_align="left", + border_style="red", + expand=False + ) + + if context: + error_panel.title += f" during {context}" + + console.print(error_panel) + + # Show error details for debugging + console.print(f"\n[dim yellow]Error type: {type(error).__name__}[/dim yellow]") + console.print(f"[dim yellow]Please report this issue if it persists[/dim 
yellow]") + console.print() + + raise typer.Exit(1) + + +def retry_on_network_error(max_retries: int = 3, delay: float = 1.0, backoff_multiplier: float = 2.0): + """ + Decorator to retry network operations with exponential backoff. + + Args: + max_retries: Maximum number of retry attempts + delay: Initial delay between retries in seconds + backoff_multiplier: Multiplier for exponential backoff + """ + def decorator(func: Callable) -> Callable: + @functools.wraps(func) + def wrapper(*args, **kwargs): + last_exception = None + current_delay = delay + + for attempt in range(max_retries + 1): + try: + return func(*args, **kwargs) + except (httpx.ConnectError, httpx.TimeoutException, httpx.NetworkError) as e: + last_exception = e + + if attempt < max_retries: + console.print( + f"๐Ÿ”„ Network error, retrying in {current_delay:.1f}s... " + f"(attempt {attempt + 1}/{max_retries})", + style="yellow" + ) + time.sleep(current_delay) + current_delay *= backoff_multiplier + else: + # Convert to our custom error type + api_url = getattr(args[0], 'base_url', 'unknown') if args else 'unknown' + raise APIConnectionError(str(api_url), e) + + # Should never reach here, but just in case + if last_exception: + raise last_exception + + return wrapper + return decorator + + +def validate_path(path: Union[str, Path], must_exist: bool = True, must_be_file: bool = False, + must_be_dir: bool = False) -> Path: + """ + Validate file/directory paths with user-friendly error messages. + + Args: + path: Path to validate + must_exist: Whether the path must exist + must_be_file: Whether the path must be a file + must_be_dir: Whether the path must be a directory + + Returns: + Validated Path object + + Raises: + ValidationError: If validation fails + """ + path_obj = Path(path) + + if must_exist and not path_obj.exists(): + raise ValidationError("path", str(path), "an existing path") + + if must_be_file and path_obj.exists() and not path_obj.is_file(): + raise ValidationError("path", str(path), "a file") + + if must_be_dir and path_obj.exists() and not path_obj.is_dir(): + raise ValidationError("path", str(path), "a directory") + + return path_obj + + +def validate_run_id(run_id: str) -> str: + """ + Validate run ID format. + + Args: + run_id: Run ID to validate + + Returns: + Validated run ID + + Raises: + ValidationError: If run ID format is invalid + """ + if not run_id or len(run_id) < 8: + raise ValidationError("run_id", run_id, "at least 8 characters") + + if not run_id.replace('-', '').isalnum(): + raise ValidationError("run_id", run_id, "alphanumeric characters and hyphens only") + + return run_id + + +def safe_json_load(file_path: Union[str, Path]) -> dict: + """ + Safely load JSON file with proper error handling. + + Args: + file_path: Path to JSON file + + Returns: + Parsed JSON data + + Raises: + FileOperationError: If file operation fails + ValidationError: If JSON is invalid + """ + path_obj = Path(file_path) + + try: + with open(path_obj, 'r', encoding='utf-8') as f: + import json + return json.load(f) + except FileNotFoundError as e: + raise FileOperationError("read", path_obj, e) + except PermissionError as e: + raise FileOperationError("read", path_obj, e) + except json.JSONDecodeError as e: + raise ValidationError("JSON file", str(path_obj), f"valid JSON format (error: {e})") + except Exception as e: + raise FileOperationError("read", path_obj, e) + + +def require_project() -> Path: + """ + Ensure we're in a FuzzForge project directory. 
+ + Returns: + Path to project root + + Raises: + ProjectNotFoundError: If not in a project directory + """ + current = Path.cwd() + + # Look for .fuzzforge directory in current or parent directories + for path in [current] + list(current.parents): + fuzzforge_dir = path / ".fuzzforge" + if fuzzforge_dir.is_dir(): + return path + + raise ProjectNotFoundError() \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/fuzzy.py b/cli/src/fuzzforge_cli/fuzzy.py new file mode 100644 index 0000000..731e9df --- /dev/null +++ b/cli/src/fuzzforge_cli/fuzzy.py @@ -0,0 +1,309 @@ +""" +Fuzzy matching and smart suggestions for FuzzForge CLI. + +Provides "Did you mean...?" functionality and intelligent command/parameter suggestions. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import difflib +from typing import List, Optional, Dict, Any, Tuple +from rich.console import Console +from rich.panel import Panel +from rich.text import Text + +console = Console() + + +class FuzzyMatcher: + """Fuzzy matching engine for CLI commands and parameters.""" + + def __init__(self): + # Known commands and subcommands + self.commands = { + "init": ["project"], + "workflows": ["list", "info"], + "runs": ["submit", "status", "list", "rerun"], + "findings": ["get", "list", "export", "all"], + "monitor": ["stats", "crashes", "live"], + "config": ["set", "get", "list", "init"], + "ai": ["ask", "summarize", "explain"], + "ingest": ["project", "findings"] + } + + # Common workflow names + self.workflow_names = [ + "security_assessment", + "language_fuzzing", + "infrastructure_scan", + "static_analysis_scan", + "penetration_testing_scan", + "secret_detection_scan" + ] + + # Common parameter names + self.parameter_names = [ + "target_path", + "volume_mode", + "timeout", + "workflow", + "param", + "param-file", + "interactive", + "wait", + "live", + "format", + "output", + "severity", + "since", + "limit", + "stats", + "export" + ] + + # Common values + self.common_values = { + "volume_mode": ["ro", "rw"], + "format": ["json", "csv", "html", "sarif"], + "severity": ["critical", "high", "medium", "low", "info"] + } + + def find_closest_command(self, user_input: str, command_group: Optional[str] = None) -> Optional[Tuple[str, float]]: + """Find the closest matching command.""" + if command_group and command_group in self.commands: + # Search within a specific command group + candidates = self.commands[command_group] + else: + # Search all main commands + candidates = list(self.commands.keys()) + + matches = difflib.get_close_matches( + user_input, candidates, n=1, cutoff=0.6 + ) + + if matches: + match = matches[0] + # Calculate similarity ratio + ratio = difflib.SequenceMatcher(None, user_input, match).ratio() + return match, ratio + + return None + + def find_closest_workflow(self, user_input: str) -> Optional[Tuple[str, float]]: + """Find the closest matching workflow name.""" + matches = difflib.get_close_matches( + user_input, self.workflow_names, n=1, cutoff=0.6 + ) + + if matches: + match = matches[0] + ratio = difflib.SequenceMatcher(None, user_input, match).ratio() + return match, 
ratio + + return None + + def find_closest_parameter(self, user_input: str) -> Optional[Tuple[str, float]]: + """Find the closest matching parameter name.""" + # Remove leading dashes + clean_input = user_input.lstrip('-') + + matches = difflib.get_close_matches( + clean_input, self.parameter_names, n=1, cutoff=0.6 + ) + + if matches: + match = matches[0] + ratio = difflib.SequenceMatcher(None, clean_input, match).ratio() + return match, ratio + + return None + + def suggest_parameter_values(self, parameter: str, user_input: str) -> List[str]: + """Suggest parameter values based on known options.""" + if parameter in self.common_values: + values = self.common_values[parameter] + if user_input: + # Filter values that start with user input + return [v for v in values if v.startswith(user_input.lower())] + else: + return values + + return [] + + def get_command_suggestions(self, user_command: List[str]) -> Optional[Dict[str, Any]]: + """Get suggestions for a user command that may have typos.""" + if not user_command: + return None + + suggestions = {"type": None, "original": user_command, "suggestions": []} + + # Check main command + main_cmd = user_command[0] + if main_cmd not in self.commands: + closest = self.find_closest_command(main_cmd) + if closest: + match, confidence = closest + suggestions["type"] = "main_command" + suggestions["suggestions"].append({ + "text": match, + "confidence": confidence, + "type": "command" + }) + + # Check subcommand if present + elif len(user_command) > 1: + sub_cmd = user_command[1] + if main_cmd in self.commands and sub_cmd not in self.commands[main_cmd]: + closest = self.find_closest_command(sub_cmd, main_cmd) + if closest: + match, confidence = closest + suggestions["type"] = "subcommand" + suggestions["suggestions"].append({ + "text": f"{main_cmd} {match}", + "confidence": confidence, + "type": "subcommand" + }) + + return suggestions if suggestions["suggestions"] else None + + def suggest_workflow_fix(self, user_workflow: str) -> Optional[str]: + """Suggest a workflow name correction.""" + closest = self.find_closest_workflow(user_workflow) + if closest: + match, confidence = closest + if confidence > 0.6: # Only suggest if reasonably confident + return match + return None + + +def display_command_suggestion(suggestions: Dict[str, Any]): + """Display command suggestions to the user.""" + if not suggestions or not suggestions["suggestions"]: + return + + original = " ".join(suggestions["original"]) + suggestion_type = suggestions["type"] + + # Create suggestion text + text = Text() + text.append("โ“ Command not found: ", style="red") + text.append(f"'{original}'", style="bold red") + text.append("\n\n") + + text.append("๐Ÿ’ก Did you mean:\n", style="yellow") + + for i, suggestion in enumerate(suggestions["suggestions"], 1): + confidence_percent = int(suggestion["confidence"] * 100) + text.append(f" {i}. 
", style="bold cyan") + text.append(f"{suggestion['text']}", style="bold white") + text.append(f" ({confidence_percent}% match)", style="dim") + text.append("\n") + + # Add helpful context + if suggestion_type == "main_command": + text.append("\n๐Ÿ’ก Use 'fuzzforge --help' to see all available commands", style="dim") + elif suggestion_type == "subcommand": + main_cmd = suggestions["original"][0] + text.append(f"\n๐Ÿ’ก Use 'fuzzforge {main_cmd} --help' to see available subcommands", style="dim") + + console.print(Panel( + text, + title="๐Ÿค” Command Suggestion", + border_style="yellow", + expand=False + )) + + +def display_workflow_suggestion(original: str, suggestion: str): + """Display workflow name suggestion.""" + text = Text() + text.append("โ“ Workflow not found: ", style="red") + text.append(f"'{original}'", style="bold red") + text.append("\n\n") + + text.append("๐Ÿ’ก Did you mean: ", style="yellow") + text.append(f"'{suggestion}'", style="bold green") + text.append("?\n\n") + + text.append("๐Ÿ’ก Use 'fuzzforge workflows' to see all available workflows", style="dim") + + console.print(Panel( + text, + title="๐Ÿ”ง Workflow Suggestion", + border_style="yellow", + expand=False + )) + + +def display_parameter_suggestion(original: str, suggestion: str): + """Display parameter name suggestion.""" + text = Text() + text.append("โ“ Unknown parameter: ", style="red") + text.append(f"'{original}'", style="bold red") + text.append("\n\n") + + text.append("๐Ÿ’ก Did you mean: ", style="yellow") + text.append(f"'--{suggestion}'", style="bold green") + text.append("?\n\n") + + text.append("๐Ÿ’ก Use '--help' to see all available parameters", style="dim") + + console.print(Panel( + text, + title="โš™๏ธ Parameter Suggestion", + border_style="yellow", + expand=False + )) + + +def enhanced_command_not_found_handler(command_parts: List[str]): + """Handle command not found with fuzzy matching suggestions.""" + matcher = FuzzyMatcher() + suggestions = matcher.get_command_suggestions(command_parts) + + if suggestions: + display_command_suggestion(suggestions) + else: + # Fallback to generic help + console.print("โŒ [red]Command not found[/red]") + console.print("๐Ÿ’ก Use 'fuzzforge --help' to see available commands") + + +def enhanced_workflow_not_found_handler(workflow_name: str): + """Handle workflow not found with suggestions.""" + matcher = FuzzyMatcher() + suggestion = matcher.suggest_workflow_fix(workflow_name) + + if suggestion: + display_workflow_suggestion(workflow_name, suggestion) + else: + console.print(f"โŒ [red]Workflow '{workflow_name}' not found[/red]") + console.print("๐Ÿ’ก Use 'fuzzforge workflows' to see available workflows") + + +def enhanced_parameter_not_found_handler(parameter_name: str): + """Handle unknown parameter with suggestions.""" + matcher = FuzzyMatcher() + closest = matcher.find_closest_parameter(parameter_name) + + if closest: + match, confidence = closest + if confidence > 0.6: + display_parameter_suggestion(parameter_name, match) + return + + console.print(f"โŒ [red]Unknown parameter: '{parameter_name}'[/red]") + console.print("๐Ÿ’ก Use '--help' to see available parameters") + + +# Global fuzzy matcher instance +fuzzy_matcher = FuzzyMatcher() \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/ingest_utils.py b/cli/src/fuzzforge_cli/ingest_utils.py new file mode 100644 index 0000000..8b90a4c --- /dev/null +++ b/cli/src/fuzzforge_cli/ingest_utils.py @@ -0,0 +1,105 @@ +"""Utilities for collecting files to ingest into Cognee.""" +# Copyright (c) 2025 
FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +from __future__ import annotations + +import fnmatch +from pathlib import Path +from typing import Iterable, List, Optional + +# Default extensions and exclusions mirrored from the OSS implementation +_DEFAULT_FILE_TYPES = [ + ".py", + ".js", + ".ts", + ".java", + ".cpp", + ".c", + ".h", + ".rs", + ".go", + ".rb", + ".php", + ".cs", + ".swift", + ".kt", + ".scala", + ".clj", + ".hs", + ".md", + ".txt", + ".yaml", + ".yml", + ".json", + ".toml", + ".cfg", + ".ini", +] + +_DEFAULT_EXCLUDE = [ + "*.pyc", + "__pycache__", + ".git", + ".svn", + ".hg", + "node_modules", + ".venv", + "venv", + ".env", + "dist", + "build", + ".pytest_cache", + ".mypy_cache", + ".tox", + "coverage", + "*.log", + "*.tmp", +] + + +def collect_ingest_files( + path: Path, + recursive: bool = True, + file_types: Optional[Iterable[str]] = None, + exclude: Optional[Iterable[str]] = None, +) -> List[Path]: + """Return a list of files eligible for ingestion.""" + path = path.resolve() + files: List[Path] = [] + + extensions = list(file_types) if file_types else list(_DEFAULT_FILE_TYPES) + exclusions = list(exclude) if exclude else [] + exclusions.extend(_DEFAULT_EXCLUDE) + + def should_exclude(file_path: Path) -> bool: + file_str = str(file_path) + for pattern in exclusions: + if fnmatch.fnmatch(file_str, f"*{pattern}*") or fnmatch.fnmatch(file_path.name, pattern): + return True + return False + + if path.is_file(): + if not should_exclude(path) and any(str(path).endswith(ext) for ext in extensions): + files.append(path) + return files + + pattern = "**/*" if recursive else "*" + for file_path in path.glob(pattern): + if file_path.is_file() and not should_exclude(file_path): + if any(str(file_path).endswith(ext) for ext in extensions): + files.append(file_path) + + return files + + +__all__ = ["collect_ingest_files"] diff --git a/cli/src/fuzzforge_cli/main.py b/cli/src/fuzzforge_cli/main.py new file mode 100644 index 0000000..9e820ed --- /dev/null +++ b/cli/src/fuzzforge_cli/main.py @@ -0,0 +1,486 @@ +""" +Main CLI application with improved command structure. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. 
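+
+# Routing sketch (see main() at the bottom of this module): "ff workflow
+# <name> <target>" and "ff finding <id>" are intercepted before Typer parses
+# argv, so the explicit subcommands stay optional:
+#
+#     ff workflow security_assessment ./target   # same as "ff workflow run ..."
+#     ff finding <run-id>                        # direct finding display
+#
+# Anything else falls through to Typer; unknown top-level commands then get
+# fuzzy "did you mean" suggestions via enhanced_command_not_found_handler().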
+ + +import typer +from rich.console import Console +from rich.traceback import install +from typing import Optional, List +import sys + +from .commands import ( + init, + workflows, + workflow_exec, + findings, + monitor, + config as config_cmd, + ai, + ingest, +) +from .fuzzy import enhanced_command_not_found_handler + +# Install rich traceback handler +install(show_locals=True) + +# Create console for rich output +console = Console() + +# Create the main Typer app +app = typer.Typer( + name="fuzzforge", + help=( + "\b\n" + "[cyan]โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—\n" + "โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ•โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ•šโ•โ•โ–ˆโ–ˆโ–ˆโ•”โ•โ•šโ•โ•โ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ•โ–ˆโ–ˆโ•”โ•โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ• โ–ˆโ–ˆโ•”โ•โ•โ•โ•โ•\n" + "โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•— \n" + "โ–ˆโ–ˆโ•”โ•โ•โ• โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ–ˆโ•”โ• โ–ˆโ–ˆโ•”โ•โ•โ• โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•”โ•โ•โ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ–ˆโ–ˆโ•”โ•โ•โ• \n" + "โ–ˆโ–ˆโ•‘ โ•šโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—โ–ˆโ–ˆโ•‘ โ•šโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ•‘ โ–ˆโ–ˆโ•‘โ•šโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•”โ•โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ•—\n" + "โ•šโ•โ• โ•šโ•โ•โ•โ•โ•โ• โ•šโ•โ•โ•โ•โ•โ•โ•โ•šโ•โ•โ•โ•โ•โ•โ•โ•šโ•โ• โ•šโ•โ•โ•โ•โ•โ• โ•šโ•โ• โ•šโ•โ• โ•šโ•โ•โ•โ•โ•โ• โ•šโ•โ•โ•โ•โ•โ•โ•[/cyan]\n\n" + "๐Ÿ›ก๏ธ Security testing workflow orchestration platform" + ), + rich_markup_mode="rich", + no_args_is_help=True, + context_settings={ + # Prevent help text from wrapping so ASCII art stays aligned + "max_content_width": 200, + # Keep common help flags + "help_option_names": ["--help", "-h"], + }, +) + +# Create workflow singular command group +workflow_app = typer.Typer( + name="workflow", + help="๐Ÿš€ Execute and manage individual workflows", + no_args_is_help=False, # Allow direct execution +) + +# Create finding singular command group +finding_app = typer.Typer( + name="finding", + help="๐Ÿ” View and analyze individual findings", + no_args_is_help=False, +) + + +# === Top-level commands === + +@app.command() +def init( + name: Optional[str] = typer.Option( + None, "--name", "-n", + help="Project name (defaults to current directory name)" + ), + api_url: Optional[str] = typer.Option( + None, "--api-url", "-u", + help="FuzzForge API URL (defaults to http://localhost:8000)" + ), + force: bool = typer.Option( + False, "--force", "-f", + help="Force initialization even if project already exists" + ) +): + """ + ๐Ÿ“ Initialize a new FuzzForge project + """ + from .commands.init import project + project(name=name, api_url=api_url, force=force) + + +@app.command() +def status(): + """ + ๐Ÿ“Š Show project and latest execution status + """ + from .commands.status import show_status + show_status() + + +@app.command() +def config( + key: Optional[str] = typer.Argument(None, help="Configuration key"), + value: Optional[str] = typer.Argument(None, help="Configuration value to set") +): + """ + โš™๏ธ Manage configuration (show all, get, or set values) + """ + from .commands import config as config_cmd + + if key is None: + # No arguments: show all config + config_cmd.show_config(global_config=False) + 
    elif value is None:
+        # Key only: get specific value
+        config_cmd.get_config(key=key, global_config=False)
+    else:
+        # Key and value: set value
+        config_cmd.set_config(key=key, value=value, global_config=False)
+
+
+@app.command()
+def clean(
+    days: int = typer.Option(
+        90, "--days", "-d",
+        help="Remove data older than this many days"
+    ),
+    dry_run: bool = typer.Option(
+        False, "--dry-run",
+        help="Show what would be deleted without actually deleting"
+    )
+):
+    """
+    🧹 Clean old execution data and findings
+    """
+    from .database import get_project_db
+    from .exceptions import require_project
+
+    try:
+        require_project()
+        db = get_project_db()
+        if not db:
+            console.print("❌ No project database found", style="red")
+            raise typer.Exit(1)
+
+        if dry_run:
+            console.print(f"🔍 [bold]Dry run:[/bold] Would clean data older than {days} days")
+            # cleanup_old_runs() deletes immediately, so skip it entirely in
+            # dry-run mode rather than deleting and then claiming "would delete".
+            return
+
+        deleted = db.cleanup_old_runs(keep_days=days)
+        console.print(f"✅ Cleaned {deleted} old executions", style="green")
+    except Exception as e:
+        console.print(f"❌ Failed to clean data: {e}", style="red")
+        raise typer.Exit(1)
+
+
+# === Workflow commands (singular) ===
+
+# Add workflow subcommands first (before callback)
+workflow_app.command("status")(workflow_exec.workflow_status)
+workflow_app.command("history")(workflow_exec.workflow_history)
+workflow_app.command("retry")(workflow_exec.retry_workflow)
+workflow_app.command("info")(workflows.workflow_info)
+workflow_app.command("params")(workflows.workflow_parameters)
+
+@workflow_app.command("run")
+def run_workflow(
+    workflow: str = typer.Argument(help="Workflow name"),
+    target: str = typer.Argument(help="Target path"),
+):
+    """
+    🚀 Execute a security testing workflow
+    """
+    from .commands.workflow_exec import execute_workflow
+
+    execute_workflow(
+        workflow=workflow,
+        target_path=target,
+        params=[],
+        param_file=None,
+        volume_mode='ro',
+        timeout=None,
+        interactive=True,
+        wait=False,
+        live=False
+    )
+
+@workflow_app.callback()
+def workflow_main():
+    """
+    Execute workflows and manage workflow executions
+
+    Examples:
+        fuzzforge workflow security_assessment ./target   # Execute workflow
+        fuzzforge workflow status                         # Check latest status
+        fuzzforge workflow history                        # Show execution history
+    """
+    pass
+
+
+# === Finding commands (singular) ===
+
+@finding_app.command("export")
+def export_finding(
+    execution_id: Optional[str] = typer.Argument(None, help="Execution ID (defaults to latest)"),
+    format: str = typer.Option(
+        "sarif", "--format", "-f",
+        help="Export format: sarif, json, csv"
+    ),
+    output: Optional[str] = typer.Option(
+        None, "--output", "-o",
+        help="Output file (defaults to stdout)"
+    )
+):
+    """
+    📤 Export findings to file
+    """
+    from .commands.findings import export_findings
+    from .database import get_project_db
+    from .exceptions import require_project
+
+    try:
+        require_project()
+
+        # If no ID provided, get the latest
+        if not execution_id:
+            db = get_project_db()
+            if db:
+                recent_runs = db.list_runs(limit=1)
+                if recent_runs:
+                    execution_id = recent_runs[0].run_id
+                    console.print(f"🔍 Using most recent execution: {execution_id}")
+                else:
+                    console.print("⚠️ No findings found in project database", style="yellow")
+                    return
+            else:
+                console.print("❌ No project database found", style="red")
+                return
+
+        export_findings(run_id=execution_id, format=format, output=output)
+    except Exception as e:
+        console.print(f"❌ Failed to export 
findings: {e}", style="red") + + +@finding_app.command("analyze") +def analyze_finding( + finding_id: Optional[str] = typer.Argument(None, help="Finding ID to analyze") +): + """ + ๐Ÿค– AI analysis of a finding + """ + from .commands.ai import analyze_finding as ai_analyze + ai_analyze(finding_id) + +@finding_app.callback(invoke_without_command=True) +def finding_main( + ctx: typer.Context, +): + """ + View and analyze individual findings + + Examples: + fuzzforge finding # Show latest finding + fuzzforge finding # Show specific finding + fuzzforge finding export # Export latest findings + """ + # Check if a subcommand is being invoked + if ctx.invoked_subcommand is not None: + # Let the subcommand handle it + return + + # Get remaining arguments for direct viewing + args = ctx.args if hasattr(ctx, 'args') else [] + finding_id = args[0] if args else None + + # Direct viewing: fuzzforge finding [id] + from .commands.findings import get_findings + from .database import get_project_db + from .exceptions import require_project + + try: + require_project() + + # If no ID provided, get the latest + if not finding_id: + db = get_project_db() + if db: + recent_runs = db.list_runs(limit=1) + if recent_runs: + finding_id = recent_runs[0].run_id + console.print(f"๐Ÿ” Using most recent execution: {finding_id}") + else: + console.print("โš ๏ธ No findings found in project database", style="yellow") + return + else: + console.print("โŒ No project database found", style="red") + return + + get_findings(run_id=finding_id, save=True, format="table") + except Exception as e: + console.print(f"โŒ Failed to get findings: {e}", style="red") + + +# === Add command groups === + +# Plural commands (for browsing/listing) +app.add_typer(workflows.app, name="workflows", help="๐Ÿ“‹ Browse available workflows") +app.add_typer(findings.app, name="findings", help="๐Ÿ“‹ Browse all findings") + +# Singular commands (for actions) +app.add_typer(workflow_app, name="workflow", help="๐Ÿš€ Execute and manage workflows") +app.add_typer(finding_app, name="finding", help="๐Ÿ” View and analyze findings") + +# Other command groups +app.add_typer(monitor.app, name="monitor", help="๐Ÿ“Š Real-time monitoring") +app.add_typer(ai.app, name="ai", help="๐Ÿค– AI integration features") +app.add_typer(ingest.app, name="ingest", help="๐Ÿง  Ingest knowledge into AI") + +# Help and utility commands +@app.command() +def examples(): + """ + ๐Ÿ“š Show usage examples + """ + examples_text = """ +[bold cyan]FuzzForge CLI Examples[/bold cyan] + +[bold]Getting Started:[/bold] + ff init # Initialize a project + ff workflows # List available workflows + ff workflow info afl-fuzzing # Get workflow details + +[bold]Execute Workflows:[/bold] + ff workflow afl-fuzzing ./target # Run fuzzing on target + ff workflow afl-fuzzing . 
--live # Run with live monitoring + ff workflow scan-c ./src timeout=300 threads=4 # With parameters + +[bold]Monitor Execution:[/bold] + ff status # Check latest execution + ff workflow status # Same as above + ff monitor # Live monitoring dashboard + ff workflow history # Show past executions + +[bold]Review Findings:[/bold] + ff findings # List all findings + ff finding # Show latest finding + ff finding export --format sarif # Export findings + +[bold]AI Features:[/bold] + ff ai chat # Interactive AI chat + ff ai suggest ./src # Get workflow suggestions + ff finding analyze # AI analysis of latest finding +""" + console.print(examples_text) + + +@app.command() +def version(): + """ + ๐Ÿ“ฆ Show version information + """ + from . import __version__ + console.print(f"FuzzForge CLI v{__version__}") + console.print(f"Short command: ff") + + +@app.callback() +def main_callback( + ctx: typer.Context, + version: Optional[bool] = typer.Option( + None, "--version", "-v", + help="Show version information" + ), +): + """ + ๐Ÿ›ก๏ธ FuzzForge CLI - Security testing workflow orchestration platform + + Quick start: + โ€ข ff init - Initialize a new project + โ€ข ff workflows - See available workflows + โ€ข ff workflow - Execute a workflow + โ€ข ff examples - Show usage examples + """ + if version: + from . import __version__ + console.print(f"FuzzForge CLI v{__version__}") + raise typer.Exit() + + +def main(): + """Main entry point with smart command routing and error handling""" + # Smart command routing BEFORE Typer processes arguments + if len(sys.argv) > 1: + args = sys.argv[1:] + + # Handle workflow command with pattern recognition + if len(args) >= 3 and args[0] == 'workflow': + workflow_subcommands = ['run', 'status', 'history', 'retry', 'info', 'params'] + # Skip custom dispatching if help flags are present + if not any(arg in ['--help', '-h', '--version', '-v'] for arg in args): + if args[1] not in workflow_subcommands: + # Direct workflow execution: ff workflow + from .commands.workflow_exec import execute_workflow + + workflow_name = args[1] + target_path = args[2] + remaining_params = args[3:] if len(args) > 3 else [] + + console.print(f"๐Ÿš€ Executing workflow: {workflow_name} on {target_path}") + + try: + execute_workflow( + workflow=workflow_name, + target_path=target_path, + params=remaining_params, + param_file=None, + volume_mode='ro', + timeout=None, + interactive=True, + wait=False, + live=False + ) + return + except Exception as e: + console.print(f"โŒ Failed to execute workflow: {e}", style="red") + sys.exit(1) + + # Handle finding command with pattern recognition + if len(args) >= 2 and args[0] == 'finding': + finding_subcommands = ['export', 'analyze'] + # Skip custom dispatching if help flags are present + if not any(arg in ['--help', '-h', '--version', '-v'] for arg in args): + if args[1] not in finding_subcommands: + # Direct finding display: ff finding + from .commands.findings import get_findings + + finding_id = args[1] + console.print(f"๐Ÿ” Displaying finding: {finding_id}") + + try: + get_findings(run_id=finding_id, save=True, format="table") + return + except Exception as e: + console.print(f"โŒ Failed to get finding: {e}", style="red") + sys.exit(1) + + # Default Typer app handling + try: + app() + except SystemExit as e: + # Enhanced error handling for command not found + if hasattr(e, 'code') and e.code != 0 and len(sys.argv) > 1: + command_parts = sys.argv[1:] + clean_parts = [part for part in command_parts if not part.startswith('-')] + + if clean_parts: + 
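+                # Suggest alternatives only for unknown top-level commands;
+                # failures from known commands re-raise Typer's own exit below.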
main_cmd = clean_parts[0] + valid_commands = [ + 'init', 'status', 'config', 'clean', + 'workflows', 'workflow', + 'findings', 'finding', + 'monitor', 'ai', 'ingest', + 'examples', 'version' + ] + + if main_cmd not in valid_commands: + enhanced_command_not_found_handler(clean_parts) + sys.exit(1) + raise + + +if __name__ == "__main__": + main() diff --git a/cli/src/fuzzforge_cli/progress.py b/cli/src/fuzzforge_cli/progress.py new file mode 100644 index 0000000..e73b19f --- /dev/null +++ b/cli/src/fuzzforge_cli/progress.py @@ -0,0 +1,371 @@ +""" +Enhanced progress indicators and loading animations for FuzzForge CLI. + +Provides rich progress bars, spinners, and status displays for all long-running operations. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import time +import asyncio +from contextlib import contextmanager +from typing import Optional, Callable, Any, Dict, List +from datetime import datetime, timedelta + +from rich.console import Console +from rich.progress import ( + Progress, SpinnerColumn, TextColumn, BarColumn, TaskProgressColumn, + TimeElapsedColumn, TimeRemainingColumn, MofNCompleteColumn +) +from rich.panel import Panel +from rich.live import Live +from rich.table import Table +from rich.text import Text +from rich import box + +console = Console() + + +class ProgressManager: + """Enhanced progress manager with multiple progress types.""" + + def __init__(self): + self.progress = None + self.live = None + + def create_progress(self, show_speed: bool = False, show_eta: bool = False) -> Progress: + """Create a rich progress instance with customizable columns.""" + columns = [ + SpinnerColumn(), + TextColumn("[bold blue]{task.description}"), + BarColumn(bar_width=40), + TaskProgressColumn(), + ] + + if show_speed: + columns.append(TextColumn("[cyan]{task.fields[speed]}/s")) + + columns.extend([ + TimeElapsedColumn(), + ]) + + if show_eta: + columns.append(TimeRemainingColumn()) + + return Progress(*columns, console=console) + + @contextmanager + def workflow_submission(self, workflow_name: str, target_path: str): + """Progress context for workflow submission.""" + with self.create_progress() as progress: + task = progress.add_task( + f"๐Ÿš€ Submitting workflow: [yellow]{workflow_name}[/yellow]", + total=4 + ) + + # Step 1: Validation + progress.update(task, description="๐Ÿ” Validating parameters...", advance=1) + yield progress, task + + # Step 2: API Connection + progress.update(task, description="๐ŸŒ Connecting to API...", advance=1) + time.sleep(0.5) # Brief pause for visual feedback + + # Step 3: Submission + progress.update(task, description="๐Ÿ“ค Submitting workflow...", advance=1) + time.sleep(0.3) + + # Step 4: Complete + progress.update(task, description="โœ… Workflow submitted successfully!", advance=1) + + @contextmanager + def data_export(self, format_type: str, record_count: int): + """Progress context for data export operations.""" + with self.create_progress(show_eta=True) as progress: + task = progress.add_task( + f"๐Ÿ“Š Exporting {record_count} records as [yellow]{format_type.upper()}[/yellow]", + total=record_count + ) + 
            yield progress, task
+
+    @contextmanager
+    def file_operations(self, operation: str, file_count: int):
+        """Progress context for file operations."""
+        with self.create_progress(show_eta=True) as progress:
+            task = progress.add_task(
+                f"📁 {operation} {file_count} files...",
+                total=file_count
+            )
+            yield progress, task
+
+    @contextmanager
+    def api_requests(self, operation: str, request_count: Optional[int] = None):
+        """Progress context for API requests."""
+        if request_count:
+            with self.create_progress() as progress:
+                task = progress.add_task(
+                    f"🌐 {operation}...",
+                    total=request_count
+                )
+                yield progress, task
+        else:
+            # Indeterminate progress for unknown request count
+            with self.create_progress() as progress:
+                task = progress.add_task(
+                    f"🌐 {operation}...",
+                    total=None
+                )
+                yield progress, task
+
+    def create_live_stats_display(self) -> Dict[str, Any]:
+        """Create a live statistics display layout."""
+        return {
+            "layout": None,
+            "stats_table": None,
+            "progress_bars": None
+        }
+
+
+@contextmanager
+def spinner(text: str, success_text: Optional[str] = None):
+    """Simple spinner context manager for quick operations."""
+    with Progress(
+        SpinnerColumn(),
+        TextColumn("[bold blue]{task.description}"),
+        console=console
+    ) as progress:
+        task = progress.add_task(text, total=None)
+        try:
+            yield progress
+            if success_text:
+                progress.update(task, description=f"✅ {success_text}")
+                time.sleep(0.5)  # Brief pause to show success
+        except Exception as e:
+            progress.update(task, description=f"❌ Failed: {str(e)}")
+            time.sleep(0.5)
+            raise
+
+
+@contextmanager
+def step_progress(steps: List[str], title: str = "Processing"):
+    """Multi-step progress with predefined steps."""
+    with Progress(
+        SpinnerColumn(),
+        TextColumn("[bold blue]{task.description}"),
+        BarColumn(bar_width=30),
+        MofNCompleteColumn(),
+        console=console
+    ) as progress:
+        task = progress.add_task(f"🔄 {title}", total=len(steps))
+
+        class StepProgressController:
+            def __init__(self, progress_instance, task_id):
+                self.progress = progress_instance
+                self.task = task_id
+                self.current_step = 0
+
+            def next_step(self):
+                if self.current_step < len(steps):
+                    step_text = steps[self.current_step]
+                    self.progress.update(
+                        self.task,
+                        description=f"🔄 {step_text}",
+                        advance=1
+                    )
+                    self.current_step += 1
+
+            def complete(self, success_text: str = "Completed"):
+                self.progress.update(
+                    self.task,
+                    description=f"✅ {success_text}",
+                    completed=len(steps)
+                )
+
+        yield StepProgressController(progress, task)
+
+
+def create_workflow_monitoring_display(run_id: str, workflow_name: str) -> Panel:
+    """Create a monitoring display for workflow execution."""
+    table = Table(show_header=False, box=box.ROUNDED)
+    table.add_column("Metric", style="bold cyan")
+    table.add_column("Value", justify="right")
+
+    table.add_row("Run ID", f"[dim]{run_id[:12]}...[/dim]")
+    table.add_row("Workflow", f"[yellow]{workflow_name}[/yellow]")
+    table.add_row("Status", "[orange1]Running[/orange1]")
+    table.add_row("Started", datetime.now().strftime("%H:%M:%S"))
+
+    return Panel.fit(
+        table,
+        title="🔄 Workflow Monitoring",
+        border_style="blue"
+    )
+
+
+def create_fuzzing_progress_display(stats: Dict[str, Any]) -> Panel:
+    """Create a rich display for fuzzing progress."""
+    # Main stats table
+    stats_table = Table(show_header=False, box=box.SIMPLE)
+    stats_table.add_column("Metric", style="bold")
+    stats_table.add_column("Value", justify="right", style="bold white")
+
+    stats_table.add_row("Executions", 
f"{stats.get('executions', 0):,}") + stats_table.add_row("Exec/sec", f"{stats.get('executions_per_sec', 0):.1f}") + stats_table.add_row("Crashes", f"[red]{stats.get('crashes', 0):,}[/red]") + stats_table.add_row("Coverage", f"{stats.get('coverage', 0):.1f}%") + + # Progress bars + progress_table = Table(show_header=False, box=box.SIMPLE) + progress_table.add_column("Metric", style="bold") + progress_table.add_column("Progress", min_width=25) + + # Execution rate progress (as percentage of target rate) + exec_rate = stats.get('executions_per_sec', 0) + target_rate = 1000 # Target 1000 exec/sec + exec_progress = min(100, (exec_rate / target_rate) * 100) + progress_table.add_row( + "Exec Rate", + create_progress_bar(exec_progress, color="green") + ) + + # Coverage progress + coverage = stats.get('coverage', 0) + progress_table.add_row( + "Coverage", + create_progress_bar(coverage, color="blue") + ) + + # Combine tables + combined = Table(show_header=False, box=None) + combined.add_column("Stats", ratio=1) + combined.add_column("Progress", ratio=1) + combined.add_row(stats_table, progress_table) + + return Panel( + combined, + title="๐ŸŽฏ Fuzzing Progress", + border_style="green" + ) + + +def create_progress_bar(percentage: float, color: str = "green", width: int = 20) -> Text: + """Create a visual progress bar using Rich Text.""" + filled = int((percentage / 100) * width) + bar = "โ–ˆ" * filled + "โ–‘" * (width - filled) + text = Text(bar, style=color) + text.append(f" {percentage:.1f}%", style="dim") + return text + + +def create_loading_animation(text: str) -> Live: + """Create a loading animation with rotating spinner.""" + frames = ["โ ‹", "โ ™", "โ น", "โ ธ", "โ ผ", "โ ด", "โ ฆ", "โ ง", "โ ‡", "โ "] + frame_index = 0 + + def get_spinner_frame(): + nonlocal frame_index + frame = frames[frame_index] + frame_index = (frame_index + 1) % len(frames) + return frame + + panel = Panel( + f"{get_spinner_frame()} [bold blue]{text}[/bold blue]", + box=box.ROUNDED, + border_style="cyan" + ) + + return Live(panel, auto_refresh=True, refresh_per_second=10) + + +class WorkflowProgressTracker: + """Advanced progress tracker for workflow execution.""" + + def __init__(self, workflow_name: str, run_id: str): + self.workflow_name = workflow_name + self.run_id = run_id + self.start_time = datetime.now() + self.phases = [] + self.current_phase = None + + def add_phase(self, name: str, description: str, estimated_duration: Optional[int] = None): + """Add a phase to the workflow progress.""" + self.phases.append({ + "name": name, + "description": description, + "estimated_duration": estimated_duration, + "start_time": None, + "end_time": None, + "status": "pending" + }) + + def start_phase(self, phase_name: str): + """Start a specific phase.""" + for phase in self.phases: + if phase["name"] == phase_name: + phase["start_time"] = datetime.now() + phase["status"] = "running" + self.current_phase = phase_name + break + + def complete_phase(self, phase_name: str, success: bool = True): + """Complete a specific phase.""" + for phase in self.phases: + if phase["name"] == phase_name: + phase["end_time"] = datetime.now() + phase["status"] = "completed" if success else "failed" + self.current_phase = None + break + + def get_progress_display(self) -> Panel: + """Get the current progress display.""" + # Create progress table + table = Table(show_header=True, box=box.ROUNDED) + table.add_column("Phase", style="bold") + table.add_column("Status", justify="center") + table.add_column("Duration") + + for phase in 
self.phases: + status_emoji = { + "pending": "โณ", + "running": "๐Ÿ”„", + "completed": "โœ…", + "failed": "โŒ" + } + + status_text = f"{status_emoji.get(phase['status'], 'โ“')} {phase['status'].title()}" + + # Calculate duration + if phase["start_time"]: + end_time = phase["end_time"] or datetime.now() + duration = end_time - phase["start_time"] + duration_text = f"{duration.seconds}s" + else: + duration_text = "-" + + table.add_row( + phase["description"], + status_text, + duration_text + ) + + total_duration = datetime.now() - self.start_time + title = f"๐Ÿ”„ {self.workflow_name} Progress (Run: {self.run_id[:8]}..., {total_duration.seconds}s)" + + return Panel( + table, + title=title, + border_style="blue" + ) + + +# Global progress manager instance +progress_manager = ProgressManager() \ No newline at end of file diff --git a/cli/src/fuzzforge_cli/validation.py b/cli/src/fuzzforge_cli/validation.py new file mode 100644 index 0000000..3246fa7 --- /dev/null +++ b/cli/src/fuzzforge_cli/validation.py @@ -0,0 +1,180 @@ +""" +Input validation utilities for FuzzForge CLI. +""" +# Copyright (c) 2025 FuzzingLabs +# +# Licensed under the Business Source License 1.1 (BSL). See the LICENSE file +# at the root of this repository for details. +# +# After the Change Date (four years from publication), this version of the +# Licensed Work will be made available under the Apache License, Version 2.0. +# See the LICENSE-APACHE file or http://www.apache.org/licenses/LICENSE-2.0 +# +# Additional attribution and requirements are provided in the NOTICE file. + + +import re +from pathlib import Path +from typing import Any, Dict, List, Optional, Union + +from .constants import SUPPORTED_VOLUME_MODES, SUPPORTED_EXPORT_FORMATS +from .exceptions import ValidationError + + +def validate_run_id(run_id: str) -> None: + """Validate a run/execution ID format""" + if not run_id or not isinstance(run_id, str): + raise ValidationError("run_id", run_id, "a non-empty string") + + # Check for reasonable length (UUIDs are typically 36 chars) + if len(run_id) < 8 or len(run_id) > 128: + raise ValidationError("run_id", run_id, "between 8 and 128 characters") + + # Check for valid characters (alphanumeric, hyphens, underscores) + if not re.match(r'^[a-zA-Z0-9_-]+$', run_id): + raise ValidationError("run_id", run_id, "alphanumeric characters, hyphens, and underscores only") + + +def validate_workflow_name(workflow: str) -> None: + """Validate workflow name format""" + if not workflow or not isinstance(workflow, str): + raise ValidationError("workflow_name", workflow, "a non-empty string") + + # Check for reasonable length + if len(workflow) < 2 or len(workflow) > 64: + raise ValidationError("workflow_name", workflow, "between 2 and 64 characters") + + # Check for valid characters (alphanumeric, hyphens, underscores) + if not re.match(r'^[a-zA-Z0-9_-]+$', workflow): + raise ValidationError("workflow_name", workflow, "alphanumeric characters, hyphens, and underscores only") + + +def validate_target_path(target_path: str, must_exist: bool = True) -> Path: + """Validate and normalize a target path""" + if not target_path or not isinstance(target_path, str): + raise ValidationError("target_path", target_path, "a non-empty string") + + try: + path = Path(target_path).resolve() + except Exception as e: + raise ValidationError("target_path", target_path, f"a valid path: {e}") + + if must_exist and not path.exists(): + raise ValidationError("target_path", target_path, "an existing path") + + return path + + +def 
validate_volume_mode(volume_mode: str) -> None: + """Validate volume mode""" + if volume_mode not in SUPPORTED_VOLUME_MODES: + raise ValidationError( + "volume_mode", volume_mode, + f"one of: {', '.join(SUPPORTED_VOLUME_MODES)}" + ) + + +def validate_export_format(export_format: str) -> None: + """Validate export format""" + if export_format not in SUPPORTED_EXPORT_FORMATS: + raise ValidationError( + "export_format", export_format, + f"one of: {', '.join(SUPPORTED_EXPORT_FORMATS)}" + ) + + +def validate_parameter_value(key: str, value: str, param_type: str) -> Any: + """Validate and convert a parameter value based on its type""" + if param_type == "integer": + try: + return int(value) + except ValueError: + raise ValidationError(f"parameter '{key}'", value, "an integer") + + elif param_type == "number": + try: + return float(value) + except ValueError: + raise ValidationError(f"parameter '{key}'", value, "a number") + + elif param_type == "boolean": + lower_value = value.lower() + if lower_value in ("true", "yes", "1", "on"): + return True + elif lower_value in ("false", "no", "0", "off"): + return False + else: + raise ValidationError(f"parameter '{key}'", value, "a boolean (true/false, yes/no, 1/0, on/off)") + + elif param_type == "array": + # Split by comma and strip whitespace + items = [item.strip() for item in value.split(",") if item.strip()] + if not items: + raise ValidationError(f"parameter '{key}'", value, "a non-empty comma-separated list") + return items + + else: + # String type - basic validation + if not value: + raise ValidationError(f"parameter '{key}'", value, "a non-empty string") + return value + + +def validate_parameters(params: List[str]) -> Dict[str, Any]: + """Validate and parse parameter list""" + parameters = {} + + for param_str in params: + if "=" not in param_str: + raise ValidationError("parameter format", param_str, "key=value format") + + key, value = param_str.split("=", 1) + key = key.strip() + value = value.strip() + + if not key: + raise ValidationError("parameter key", param_str, "a non-empty key") + + if not value: + raise ValidationError(f"parameter '{key}'", param_str, "a non-empty value") + + # Auto-detect type and convert + try: + if value.lower() in ("true", "false"): + parameters[key] = value.lower() == "true" + elif value.isdigit(): + parameters[key] = int(value) + elif re.match(r'^\d+\.\d+$', value): + parameters[key] = float(value) + else: + parameters[key] = value + except ValueError: + parameters[key] = value + + return parameters + + +def validate_config_key(key: str) -> None: + """Validate configuration key format""" + if not key or not isinstance(key, str): + raise ValidationError("config_key", key, "a non-empty string") + + # Check for valid key format (e.g., "api.url", "timeout") + if not re.match(r'^[a-zA-Z0-9._-]+$', key): + raise ValidationError("config_key", key, "alphanumeric characters, dots, hyphens, and underscores only") + + +def validate_positive_integer(value: int, name: str) -> None: + """Validate that a value is a positive integer""" + if not isinstance(value, int) or value <= 0: + raise ValidationError(name, value, "a positive integer") + + +def validate_timeout(timeout: Optional[int]) -> None: + """Validate timeout value""" + if timeout is not None: + if not isinstance(timeout, int) or timeout <= 0: + raise ValidationError("timeout", timeout, "a positive integer (seconds)") + + if timeout > 86400: # 24 hours + raise ValidationError("timeout", timeout, "less than 24 hours (86400 seconds)") \ No newline at end of file 
diff --git a/cli/uv.lock b/cli/uv.lock new file mode 100644 index 0000000..3d89b0e --- /dev/null +++ b/cli/uv.lock @@ -0,0 +1,5256 @@ +version = 1 +revision = 3 +requires-python = ">=3.11" +resolution-markers = [ + "python_full_version >= '3.14' and platform_python_implementation != 'PyPy' and sys_platform != 'emscripten'", + "python_full_version >= '3.14' and platform_python_implementation == 'PyPy' and sys_platform != 'emscripten'", + "python_full_version >= '3.14' and sys_platform == 'emscripten'", + "python_full_version == '3.13.*' and platform_python_implementation != 'PyPy' and sys_platform != 'emscripten'", + "python_full_version == '3.12.*' and platform_python_implementation != 'PyPy' and sys_platform != 'emscripten'", + "python_full_version < '3.12' and platform_python_implementation != 'PyPy' and sys_platform != 'emscripten'", + "python_full_version == '3.13.*' and platform_python_implementation == 'PyPy' and sys_platform != 'emscripten'", + "python_full_version == '3.12.*' and platform_python_implementation == 'PyPy' and sys_platform != 'emscripten'", + "python_full_version < '3.12' and platform_python_implementation == 'PyPy' and sys_platform != 'emscripten'", + "python_full_version == '3.13.*' and sys_platform == 'emscripten'", + "python_full_version == '3.12.*' and sys_platform == 'emscripten'", + "python_full_version < '3.12' and sys_platform == 'emscripten'", +] + +[[package]] +name = "a2a-sdk" +version = "0.3.7" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core" }, + { name = "httpx" }, + { name = "httpx-sse" }, + { name = "protobuf" }, + { name = "pydantic" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/8d/ad/b6ecb58f44459a24f1c260e91304e1ddbb7a8e213f1f82cc4c074f66e9bb/a2a_sdk-0.3.7.tar.gz", hash = "sha256:795aa2bd2cfb3c9e8654a1352bf5f75d6cf1205b262b1bf8f4003b5308267ea2", size = 223426, upload-time = "2025-09-23T16:27:29.585Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e6/27/9cf8c6de4ae71e9c98ec96b3304449d5d0cd36ec3b95e66b6e7f58a9e571/a2a_sdk-0.3.7-py3-none-any.whl", hash = "sha256:0813b8fd7add427b2b56895cf28cae705303cf6d671b305c0aac69987816e03e", size = 137957, upload-time = "2025-09-23T16:27:27.546Z" }, +] + +[[package]] +name = "absolufy-imports" +version = "0.3.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/74/0f/9da9dc9a12ebf4622ec96d9338d221e0172699e7574929f65ec8fdb30f9c/absolufy_imports-0.3.1.tar.gz", hash = "sha256:c90638a6c0b66826d1fb4880ddc20ef7701af34192c94faf40b95d32b59f9793", size = 4724, upload-time = "2022-01-20T14:48:53.434Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a3/a4/b65c9fbc2c0c09c0ea3008f62d2010fd261e62a4881502f03a6301079182/absolufy_imports-0.3.1-py2.py3-none-any.whl", hash = "sha256:49bf7c753a9282006d553ba99217f48f947e3eef09e18a700f8a82f75dc7fc5c", size = 5937, upload-time = "2022-01-20T14:48:51.718Z" }, +] + +[[package]] +name = "agentops" +version = "0.4.21" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiohttp" }, + { name = "httpx" }, + { name = "opentelemetry-api" }, + { name = "opentelemetry-exporter-otlp-proto-http" }, + { name = "opentelemetry-instrumentation" }, + { name = "opentelemetry-sdk" }, + { name = "opentelemetry-semantic-conventions" }, + { name = "ordered-set" }, + { name = "packaging" }, + { name = "psutil" }, + { name = "pyyaml" }, + { name = "requests" }, + { name = "termcolor" }, + { name = "wrapt" }, +] +sdist = { url 
= "https://files.pythonhosted.org/packages/0a/c4/023fe976169c57b1edd71f4c08d6dedaf66814f5b25ecf59b3a8540311ab/agentops-0.4.21.tar.gz", hash = "sha256:47759c6dfd6ea58bad2f7764257e4778cb2e34ae180cef642f60f56adced6510", size = 430861, upload-time = "2025-08-29T06:36:55.323Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d9/63/3e48da56d5121ddcefef8645ad5a3446b0974154111a14bf75ea2b5b3cc3/agentops-0.4.21-py3-none-any.whl", hash = "sha256:93b098ea77bc5f64dcae5031a8292531cb446d9d66e6c7ef2f21a66d4e4fb2f0", size = 309579, upload-time = "2025-08-29T06:36:53.855Z" }, +] + +[[package]] +name = "aiobotocore" +version = "2.24.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiohttp" }, + { name = "aioitertools" }, + { name = "botocore" }, + { name = "jmespath" }, + { name = "multidict" }, + { name = "python-dateutil" }, + { name = "wrapt" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/05/93/9f5243c2fd2fc22cff92f8d8a7e98d3080171be60778d49aeabb555a463d/aiobotocore-2.24.2.tar.gz", hash = "sha256:dfb21bdb2610e8de4d22f401e91a24d50f1330a302d03c62c485757becd439a9", size = 119837, upload-time = "2025-09-05T12:13:46.963Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/87/03/2330062ac4ea9fa6447e02b0625f24efd6f05b6c44d61d86610b3555ee66/aiobotocore-2.24.2-py3-none-any.whl", hash = "sha256:808c63b2bd344b91e2f2acb874831118a9f53342d248acd16a68455a226e283a", size = 85441, upload-time = "2025-09-05T12:13:45.378Z" }, +] + +[package.optional-dependencies] +boto3 = [ + { name = "boto3" }, +] + +[[package]] +name = "aiofiles" +version = "23.2.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/af/41/cfed10bc64d774f497a86e5ede9248e1d062db675504b41c320954d99641/aiofiles-23.2.1.tar.gz", hash = "sha256:84ec2218d8419404abcb9f0c02df3f34c6e0a68ed41072acfb1cef5cbc29051a", size = 32072, upload-time = "2023-08-09T15:23:11.564Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c5/19/5af6804c4cc0fed83f47bff6e413a98a36618e7d40185cd36e69737f3b0e/aiofiles-23.2.1-py3-none-any.whl", hash = "sha256:19297512c647d4b27a2cf7c34caa7e405c0d60b5560618a29a9fe027b18b0107", size = 15727, upload-time = "2023-08-09T15:23:09.774Z" }, +] + +[[package]] +name = "aiohappyeyeballs" +version = "2.6.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/26/30/f84a107a9c4331c14b2b586036f40965c128aa4fee4dda5d3d51cb14ad54/aiohappyeyeballs-2.6.1.tar.gz", hash = "sha256:c3f9d0113123803ccadfdf3f0faa505bc78e6a72d1cc4806cbd719826e943558", size = 22760, upload-time = "2025-03-12T01:42:48.764Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0f/15/5bf3b99495fb160b63f95972b81750f18f7f4e02ad051373b669d17d44f2/aiohappyeyeballs-2.6.1-py3-none-any.whl", hash = "sha256:f349ba8f4b75cb25c99c5c2d84e997e485204d2902a9597802b0371f09331fb8", size = 15265, upload-time = "2025-03-12T01:42:47.083Z" }, +] + +[[package]] +name = "aiohttp" +version = "3.12.15" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiohappyeyeballs" }, + { name = "aiosignal" }, + { name = "attrs" }, + { name = "frozenlist" }, + { name = "multidict" }, + { name = "propcache" }, + { name = "yarl" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9b/e7/d92a237d8802ca88483906c388f7c201bbe96cd80a165ffd0ac2f6a8d59f/aiohttp-3.12.15.tar.gz", hash = "sha256:4fc61385e9c98d72fcdf47e6dd81833f47b2f77c114c29cd64a361be57a763a2", size = 7823716, 
upload-time = "2025-07-29T05:52:32.215Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/20/19/9e86722ec8e835959bd97ce8c1efa78cf361fa4531fca372551abcc9cdd6/aiohttp-3.12.15-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d3ce17ce0220383a0f9ea07175eeaa6aa13ae5a41f30bc61d84df17f0e9b1117", size = 711246, upload-time = "2025-07-29T05:50:15.937Z" }, + { url = "https://files.pythonhosted.org/packages/71/f9/0a31fcb1a7d4629ac9d8f01f1cb9242e2f9943f47f5d03215af91c3c1a26/aiohttp-3.12.15-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:010cc9bbd06db80fe234d9003f67e97a10fe003bfbedb40da7d71c1008eda0fe", size = 483515, upload-time = "2025-07-29T05:50:17.442Z" }, + { url = "https://files.pythonhosted.org/packages/62/6c/94846f576f1d11df0c2e41d3001000527c0fdf63fce7e69b3927a731325d/aiohttp-3.12.15-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3f9d7c55b41ed687b9d7165b17672340187f87a773c98236c987f08c858145a9", size = 471776, upload-time = "2025-07-29T05:50:19.568Z" }, + { url = "https://files.pythonhosted.org/packages/f8/6c/f766d0aaafcee0447fad0328da780d344489c042e25cd58fde566bf40aed/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bc4fbc61bb3548d3b482f9ac7ddd0f18c67e4225aaa4e8552b9f1ac7e6bda9e5", size = 1741977, upload-time = "2025-07-29T05:50:21.665Z" }, + { url = "https://files.pythonhosted.org/packages/17/e5/fb779a05ba6ff44d7bc1e9d24c644e876bfff5abe5454f7b854cace1b9cc/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7fbc8a7c410bb3ad5d595bb7118147dfbb6449d862cc1125cf8867cb337e8728", size = 1690645, upload-time = "2025-07-29T05:50:23.333Z" }, + { url = "https://files.pythonhosted.org/packages/37/4e/a22e799c2035f5d6a4ad2cf8e7c1d1bd0923192871dd6e367dafb158b14c/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:74dad41b3458dbb0511e760fb355bb0b6689e0630de8a22b1b62a98777136e16", size = 1789437, upload-time = "2025-07-29T05:50:25.007Z" }, + { url = "https://files.pythonhosted.org/packages/28/e5/55a33b991f6433569babb56018b2fb8fb9146424f8b3a0c8ecca80556762/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b6f0af863cf17e6222b1735a756d664159e58855da99cfe965134a3ff63b0b0", size = 1828482, upload-time = "2025-07-29T05:50:26.693Z" }, + { url = "https://files.pythonhosted.org/packages/c6/82/1ddf0ea4f2f3afe79dffed5e8a246737cff6cbe781887a6a170299e33204/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b5b7fe4972d48a4da367043b8e023fb70a04d1490aa7d68800e465d1b97e493b", size = 1730944, upload-time = "2025-07-29T05:50:28.382Z" }, + { url = "https://files.pythonhosted.org/packages/1b/96/784c785674117b4cb3877522a177ba1b5e4db9ce0fd519430b5de76eec90/aiohttp-3.12.15-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6443cca89553b7a5485331bc9bedb2342b08d073fa10b8c7d1c60579c4a7b9bd", size = 1668020, upload-time = "2025-07-29T05:50:30.032Z" }, + { url = "https://files.pythonhosted.org/packages/12/8a/8b75f203ea7e5c21c0920d84dd24a5c0e971fe1e9b9ebbf29ae7e8e39790/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:6c5f40ec615e5264f44b4282ee27628cea221fcad52f27405b80abb346d9f3f8", size = 1716292, upload-time = "2025-07-29T05:50:31.983Z" }, + { url = 
"https://files.pythonhosted.org/packages/47/0b/a1451543475bb6b86a5cfc27861e52b14085ae232896a2654ff1231c0992/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:2abbb216a1d3a2fe86dbd2edce20cdc5e9ad0be6378455b05ec7f77361b3ab50", size = 1711451, upload-time = "2025-07-29T05:50:33.989Z" }, + { url = "https://files.pythonhosted.org/packages/55/fd/793a23a197cc2f0d29188805cfc93aa613407f07e5f9da5cd1366afd9d7c/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:db71ce547012a5420a39c1b744d485cfb823564d01d5d20805977f5ea1345676", size = 1691634, upload-time = "2025-07-29T05:50:35.846Z" }, + { url = "https://files.pythonhosted.org/packages/ca/bf/23a335a6670b5f5dfc6d268328e55a22651b440fca341a64fccf1eada0c6/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:ced339d7c9b5030abad5854aa5413a77565e5b6e6248ff927d3e174baf3badf7", size = 1785238, upload-time = "2025-07-29T05:50:37.597Z" }, + { url = "https://files.pythonhosted.org/packages/57/4f/ed60a591839a9d85d40694aba5cef86dde9ee51ce6cca0bb30d6eb1581e7/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:7c7dd29c7b5bda137464dc9bfc738d7ceea46ff70309859ffde8c022e9b08ba7", size = 1805701, upload-time = "2025-07-29T05:50:39.591Z" }, + { url = "https://files.pythonhosted.org/packages/85/e0/444747a9455c5de188c0f4a0173ee701e2e325d4b2550e9af84abb20cdba/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:421da6fd326460517873274875c6c5a18ff225b40da2616083c5a34a7570b685", size = 1718758, upload-time = "2025-07-29T05:50:41.292Z" }, + { url = "https://files.pythonhosted.org/packages/36/ab/1006278d1ffd13a698e5dd4bfa01e5878f6bddefc296c8b62649753ff249/aiohttp-3.12.15-cp311-cp311-win32.whl", hash = "sha256:4420cf9d179ec8dfe4be10e7d0fe47d6d606485512ea2265b0d8c5113372771b", size = 428868, upload-time = "2025-07-29T05:50:43.063Z" }, + { url = "https://files.pythonhosted.org/packages/10/97/ad2b18700708452400278039272032170246a1bf8ec5d832772372c71f1a/aiohttp-3.12.15-cp311-cp311-win_amd64.whl", hash = "sha256:edd533a07da85baa4b423ee8839e3e91681c7bfa19b04260a469ee94b778bf6d", size = 453273, upload-time = "2025-07-29T05:50:44.613Z" }, + { url = "https://files.pythonhosted.org/packages/63/97/77cb2450d9b35f517d6cf506256bf4f5bda3f93a66b4ad64ba7fc917899c/aiohttp-3.12.15-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:802d3868f5776e28f7bf69d349c26fc0efadb81676d0afa88ed00d98a26340b7", size = 702333, upload-time = "2025-07-29T05:50:46.507Z" }, + { url = "https://files.pythonhosted.org/packages/83/6d/0544e6b08b748682c30b9f65640d006e51f90763b41d7c546693bc22900d/aiohttp-3.12.15-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f2800614cd560287be05e33a679638e586a2d7401f4ddf99e304d98878c29444", size = 476948, upload-time = "2025-07-29T05:50:48.067Z" }, + { url = "https://files.pythonhosted.org/packages/3a/1d/c8c40e611e5094330284b1aea8a4b02ca0858f8458614fa35754cab42b9c/aiohttp-3.12.15-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8466151554b593909d30a0a125d638b4e5f3836e5aecde85b66b80ded1cb5b0d", size = 469787, upload-time = "2025-07-29T05:50:49.669Z" }, + { url = "https://files.pythonhosted.org/packages/38/7d/b76438e70319796bfff717f325d97ce2e9310f752a267bfdf5192ac6082b/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e5a495cb1be69dae4b08f35a6c4579c539e9b5706f606632102c0f855bcba7c", size = 1716590, upload-time = "2025-07-29T05:50:51.368Z" }, + { url = 
"https://files.pythonhosted.org/packages/79/b1/60370d70cdf8b269ee1444b390cbd72ce514f0d1cd1a715821c784d272c9/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6404dfc8cdde35c69aaa489bb3542fb86ef215fc70277c892be8af540e5e21c0", size = 1699241, upload-time = "2025-07-29T05:50:53.628Z" }, + { url = "https://files.pythonhosted.org/packages/a3/2b/4968a7b8792437ebc12186db31523f541943e99bda8f30335c482bea6879/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3ead1c00f8521a5c9070fcb88f02967b1d8a0544e6d85c253f6968b785e1a2ab", size = 1754335, upload-time = "2025-07-29T05:50:55.394Z" }, + { url = "https://files.pythonhosted.org/packages/fb/c1/49524ed553f9a0bec1a11fac09e790f49ff669bcd14164f9fab608831c4d/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6990ef617f14450bc6b34941dba4f12d5613cbf4e33805932f853fbd1cf18bfb", size = 1800491, upload-time = "2025-07-29T05:50:57.202Z" }, + { url = "https://files.pythonhosted.org/packages/de/5e/3bf5acea47a96a28c121b167f5ef659cf71208b19e52a88cdfa5c37f1fcc/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd736ed420f4db2b8148b52b46b88ed038d0354255f9a73196b7bbce3ea97545", size = 1719929, upload-time = "2025-07-29T05:50:59.192Z" }, + { url = "https://files.pythonhosted.org/packages/39/94/8ae30b806835bcd1cba799ba35347dee6961a11bd507db634516210e91d8/aiohttp-3.12.15-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c5092ce14361a73086b90c6efb3948ffa5be2f5b6fbcf52e8d8c8b8848bb97c", size = 1635733, upload-time = "2025-07-29T05:51:01.394Z" }, + { url = "https://files.pythonhosted.org/packages/7a/46/06cdef71dd03acd9da7f51ab3a9107318aee12ad38d273f654e4f981583a/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:aaa2234bb60c4dbf82893e934d8ee8dea30446f0647e024074237a56a08c01bd", size = 1696790, upload-time = "2025-07-29T05:51:03.657Z" }, + { url = "https://files.pythonhosted.org/packages/02/90/6b4cfaaf92ed98d0ec4d173e78b99b4b1a7551250be8937d9d67ecb356b4/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:6d86a2fbdd14192e2f234a92d3b494dd4457e683ba07e5905a0b3ee25389ac9f", size = 1718245, upload-time = "2025-07-29T05:51:05.911Z" }, + { url = "https://files.pythonhosted.org/packages/2e/e6/2593751670fa06f080a846f37f112cbe6f873ba510d070136a6ed46117c6/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a041e7e2612041a6ddf1c6a33b883be6a421247c7afd47e885969ee4cc58bd8d", size = 1658899, upload-time = "2025-07-29T05:51:07.753Z" }, + { url = "https://files.pythonhosted.org/packages/8f/28/c15bacbdb8b8eb5bf39b10680d129ea7410b859e379b03190f02fa104ffd/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5015082477abeafad7203757ae44299a610e89ee82a1503e3d4184e6bafdd519", size = 1738459, upload-time = "2025-07-29T05:51:09.56Z" }, + { url = "https://files.pythonhosted.org/packages/00/de/c269cbc4faa01fb10f143b1670633a8ddd5b2e1ffd0548f7aa49cb5c70e2/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:56822ff5ddfd1b745534e658faba944012346184fbfe732e0d6134b744516eea", size = 1766434, upload-time = "2025-07-29T05:51:11.423Z" }, + { url = "https://files.pythonhosted.org/packages/52/b0/4ff3abd81aa7d929b27d2e1403722a65fc87b763e3a97b3a2a494bfc63bc/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_x86_64.whl", hash = 
"sha256:b2acbbfff69019d9014508c4ba0401822e8bae5a5fdc3b6814285b71231b60f3", size = 1726045, upload-time = "2025-07-29T05:51:13.689Z" }, + { url = "https://files.pythonhosted.org/packages/71/16/949225a6a2dd6efcbd855fbd90cf476052e648fb011aa538e3b15b89a57a/aiohttp-3.12.15-cp312-cp312-win32.whl", hash = "sha256:d849b0901b50f2185874b9a232f38e26b9b3d4810095a7572eacea939132d4e1", size = 423591, upload-time = "2025-07-29T05:51:15.452Z" }, + { url = "https://files.pythonhosted.org/packages/2b/d8/fa65d2a349fe938b76d309db1a56a75c4fb8cc7b17a398b698488a939903/aiohttp-3.12.15-cp312-cp312-win_amd64.whl", hash = "sha256:b390ef5f62bb508a9d67cb3bba9b8356e23b3996da7062f1a57ce1a79d2b3d34", size = 450266, upload-time = "2025-07-29T05:51:17.239Z" }, + { url = "https://files.pythonhosted.org/packages/f2/33/918091abcf102e39d15aba2476ad9e7bd35ddb190dcdd43a854000d3da0d/aiohttp-3.12.15-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:9f922ffd05034d439dde1c77a20461cf4a1b0831e6caa26151fe7aa8aaebc315", size = 696741, upload-time = "2025-07-29T05:51:19.021Z" }, + { url = "https://files.pythonhosted.org/packages/b5/2a/7495a81e39a998e400f3ecdd44a62107254803d1681d9189be5c2e4530cd/aiohttp-3.12.15-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:2ee8a8ac39ce45f3e55663891d4b1d15598c157b4d494a4613e704c8b43112cd", size = 474407, upload-time = "2025-07-29T05:51:21.165Z" }, + { url = "https://files.pythonhosted.org/packages/49/fc/a9576ab4be2dcbd0f73ee8675d16c707cfc12d5ee80ccf4015ba543480c9/aiohttp-3.12.15-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3eae49032c29d356b94eee45a3f39fdf4b0814b397638c2f718e96cfadf4c4e4", size = 466703, upload-time = "2025-07-29T05:51:22.948Z" }, + { url = "https://files.pythonhosted.org/packages/09/2f/d4bcc8448cf536b2b54eed48f19682031ad182faa3a3fee54ebe5b156387/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b97752ff12cc12f46a9b20327104448042fce5c33a624f88c18f66f9368091c7", size = 1705532, upload-time = "2025-07-29T05:51:25.211Z" }, + { url = "https://files.pythonhosted.org/packages/f1/f3/59406396083f8b489261e3c011aa8aee9df360a96ac8fa5c2e7e1b8f0466/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:894261472691d6fe76ebb7fcf2e5870a2ac284c7406ddc95823c8598a1390f0d", size = 1686794, upload-time = "2025-07-29T05:51:27.145Z" }, + { url = "https://files.pythonhosted.org/packages/dc/71/164d194993a8d114ee5656c3b7ae9c12ceee7040d076bf7b32fb98a8c5c6/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5fa5d9eb82ce98959fc1031c28198b431b4d9396894f385cb63f1e2f3f20ca6b", size = 1738865, upload-time = "2025-07-29T05:51:29.366Z" }, + { url = "https://files.pythonhosted.org/packages/1c/00/d198461b699188a93ead39cb458554d9f0f69879b95078dce416d3209b54/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f0fa751efb11a541f57db59c1dd821bec09031e01452b2b6217319b3a1f34f3d", size = 1788238, upload-time = "2025-07-29T05:51:31.285Z" }, + { url = "https://files.pythonhosted.org/packages/85/b8/9e7175e1fa0ac8e56baa83bf3c214823ce250d0028955dfb23f43d5e61fd/aiohttp-3.12.15-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5346b93e62ab51ee2a9d68e8f73c7cf96ffb73568a23e683f931e52450e4148d", size = 1710566, upload-time = "2025-07-29T05:51:33.219Z" }, + { url = 
"https://files.pythonhosted.org/packages/59/e4/16a8eac9df39b48ae102ec030fa9f726d3570732e46ba0c592aeeb507b93/aiohttp-3.12.15-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:049ec0360f939cd164ecbfd2873eaa432613d5e77d6b04535e3d1fbae5a9e645", size = 1624270, upload-time = "2025-07-29T05:51:35.195Z" }, + { url = "https://files.pythonhosted.org/packages/1f/f8/cd84dee7b6ace0740908fd0af170f9fab50c2a41ccbc3806aabcb1050141/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b52dcf013b57464b6d1e51b627adfd69a8053e84b7103a7cd49c030f9ca44461", size = 1677294, upload-time = "2025-07-29T05:51:37.215Z" }, + { url = "https://files.pythonhosted.org/packages/ce/42/d0f1f85e50d401eccd12bf85c46ba84f947a84839c8a1c2c5f6e8ab1eb50/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:9b2af240143dd2765e0fb661fd0361a1b469cab235039ea57663cda087250ea9", size = 1708958, upload-time = "2025-07-29T05:51:39.328Z" }, + { url = "https://files.pythonhosted.org/packages/d5/6b/f6fa6c5790fb602538483aa5a1b86fcbad66244997e5230d88f9412ef24c/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ac77f709a2cde2cc71257ab2d8c74dd157c67a0558a0d2799d5d571b4c63d44d", size = 1651553, upload-time = "2025-07-29T05:51:41.356Z" }, + { url = "https://files.pythonhosted.org/packages/04/36/a6d36ad545fa12e61d11d1932eef273928b0495e6a576eb2af04297fdd3c/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:47f6b962246f0a774fbd3b6b7be25d59b06fdb2f164cf2513097998fc6a29693", size = 1727688, upload-time = "2025-07-29T05:51:43.452Z" }, + { url = "https://files.pythonhosted.org/packages/aa/c8/f195e5e06608a97a4e52c5d41c7927301bf757a8e8bb5bbf8cef6c314961/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:760fb7db442f284996e39cf9915a94492e1896baac44f06ae551974907922b64", size = 1761157, upload-time = "2025-07-29T05:51:45.643Z" }, + { url = "https://files.pythonhosted.org/packages/05/6a/ea199e61b67f25ba688d3ce93f63b49b0a4e3b3d380f03971b4646412fc6/aiohttp-3.12.15-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ad702e57dc385cae679c39d318def49aef754455f237499d5b99bea4ef582e51", size = 1710050, upload-time = "2025-07-29T05:51:48.203Z" }, + { url = "https://files.pythonhosted.org/packages/b4/2e/ffeb7f6256b33635c29dbed29a22a723ff2dd7401fff42ea60cf2060abfb/aiohttp-3.12.15-cp313-cp313-win32.whl", hash = "sha256:f813c3e9032331024de2eb2e32a88d86afb69291fbc37a3a3ae81cc9917fb3d0", size = 422647, upload-time = "2025-07-29T05:51:50.718Z" }, + { url = "https://files.pythonhosted.org/packages/1b/8e/78ee35774201f38d5e1ba079c9958f7629b1fd079459aea9467441dbfbf5/aiohttp-3.12.15-cp313-cp313-win_amd64.whl", hash = "sha256:1a649001580bdb37c6fdb1bebbd7e3bc688e8ec2b5c6f52edbb664662b17dc84", size = 449067, upload-time = "2025-07-29T05:51:52.549Z" }, +] + +[[package]] +name = "aioitertools" +version = "0.12.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/06/de/38491a84ab323b47c7f86e94d2830e748780525f7a10c8600b67ead7e9ea/aioitertools-0.12.0.tar.gz", hash = "sha256:c2a9055b4fbb7705f561b9d86053e8af5d10cc845d22c32008c43490b2d8dd6b", size = 19369, upload-time = "2024-09-02T03:33:40.349Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/85/13/58b70a580de00893223d61de8fea167877a3aed97d4a5e1405c9159ef925/aioitertools-0.12.0-py3-none-any.whl", hash = "sha256:fc1f5fac3d737354de8831cbba3eb04f79dd649d8f3afb4c5b114925e662a796", size = 24345, upload-time = 
"2024-09-02T03:34:59.454Z" }, +] + +[[package]] +name = "aiosignal" +version = "1.4.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "frozenlist" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/61/62/06741b579156360248d1ec624842ad0edf697050bbaf7c3e46394e106ad1/aiosignal-1.4.0.tar.gz", hash = "sha256:f47eecd9468083c2029cc99945502cb7708b082c232f9aca65da147157b251c7", size = 25007, upload-time = "2025-07-03T22:54:43.528Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fb/76/641ae371508676492379f16e2fa48f4e2c11741bd63c48be4b12a6b09cba/aiosignal-1.4.0-py3-none-any.whl", hash = "sha256:053243f8b92b990551949e63930a839ff0cf0b0ebbe0597b0f3fb19e1a0fe82e", size = 7490, upload-time = "2025-07-03T22:54:42.156Z" }, +] + +[[package]] +name = "aiosqlite" +version = "0.21.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/13/7d/8bca2bf9a247c2c5dfeec1d7a5f40db6518f88d314b8bca9da29670d2671/aiosqlite-0.21.0.tar.gz", hash = "sha256:131bb8056daa3bc875608c631c678cda73922a2d4ba8aec373b19f18c17e7aa3", size = 13454, upload-time = "2025-02-03T07:30:16.235Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f5/10/6c25ed6de94c49f88a91fa5018cb4c0f3625f31d5be9f771ebe5cc7cd506/aiosqlite-0.21.0-py3-none-any.whl", hash = "sha256:2549cf4057f95f53dcba16f2b64e8e2791d7e1adedb13197dd8ed77bb226d7d0", size = 15792, upload-time = "2025-02-03T07:30:13.6Z" }, +] + +[[package]] +name = "alembic" +version = "1.16.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mako" }, + { name = "sqlalchemy" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9a/ca/4dc52902cf3491892d464f5265a81e9dff094692c8a049a3ed6a05fe7ee8/alembic-1.16.5.tar.gz", hash = "sha256:a88bb7f6e513bd4301ecf4c7f2206fe93f9913f9b48dac3b78babde2d6fe765e", size = 1969868, upload-time = "2025-08-27T18:02:05.668Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/39/4a/4c61d4c84cfd9befb6fa08a702535b27b21fff08c946bc2f6139decbf7f7/alembic-1.16.5-py3-none-any.whl", hash = "sha256:e845dfe090c5ffa7b92593ae6687c5cb1a101e91fa53868497dbd79847f9dbe3", size = 247355, upload-time = "2025-08-27T18:02:07.37Z" }, +] + +[[package]] +name = "annotated-types" +version = "0.7.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" }, +] + +[[package]] +name = "anyio" +version = "4.10.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "idna" }, + { name = "sniffio" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f1/b4/636b3b65173d3ce9a38ef5f0522789614e590dab6a8d505340a4efe4c567/anyio-4.10.0.tar.gz", hash = 
"sha256:3f3fae35c96039744587aa5b8371e7e8e603c0702999535961dd336026973ba6", size = 213252, upload-time = "2025-08-04T08:54:26.451Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6f/12/e5e0282d673bb9746bacfb6e2dba8719989d3660cdb2ea79aee9a9651afb/anyio-4.10.0-py3-none-any.whl", hash = "sha256:60e474ac86736bbfd6f210f7a61218939c318f43f9972497381f1c5e930ed3d1", size = 107213, upload-time = "2025-08-04T08:54:24.882Z" }, +] + +[[package]] +name = "argon2-cffi" +version = "23.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "argon2-cffi-bindings" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/31/fa/57ec2c6d16ecd2ba0cf15f3c7d1c3c2e7b5fcb83555ff56d7ab10888ec8f/argon2_cffi-23.1.0.tar.gz", hash = "sha256:879c3e79a2729ce768ebb7d36d4609e3a78a4ca2ec3a9f12286ca057e3d0db08", size = 42798, upload-time = "2023-08-15T14:13:12.711Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a4/6a/e8a041599e78b6b3752da48000b14c8d1e8a04ded09c88c714ba047f34f5/argon2_cffi-23.1.0-py3-none-any.whl", hash = "sha256:c670642b78ba29641818ab2e68bd4e6a78ba53b7eff7b4c3815ae16abf91c7ea", size = 15124, upload-time = "2023-08-15T14:13:10.752Z" }, +] + +[[package]] +name = "argon2-cffi-bindings" +version = "25.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5c/2d/db8af0df73c1cf454f71b2bbe5e356b8c1f8041c979f505b3d3186e520a9/argon2_cffi_bindings-25.1.0.tar.gz", hash = "sha256:b957f3e6ea4d55d820e40ff76f450952807013d361a65d7f28acc0acbf29229d", size = 1783441, upload-time = "2025-07-30T10:02:05.147Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/60/97/3c0a35f46e52108d4707c44b95cfe2afcafc50800b5450c197454569b776/argon2_cffi_bindings-25.1.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:3d3f05610594151994ca9ccb3c771115bdb4daef161976a266f0dd8aa9996b8f", size = 54393, upload-time = "2025-07-30T10:01:40.97Z" }, + { url = "https://files.pythonhosted.org/packages/9d/f4/98bbd6ee89febd4f212696f13c03ca302b8552e7dbf9c8efa11ea4a388c3/argon2_cffi_bindings-25.1.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:8b8efee945193e667a396cbc7b4fb7d357297d6234d30a489905d96caabde56b", size = 29328, upload-time = "2025-07-30T10:01:41.916Z" }, + { url = "https://files.pythonhosted.org/packages/43/24/90a01c0ef12ac91a6be05969f29944643bc1e5e461155ae6559befa8f00b/argon2_cffi_bindings-25.1.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:3c6702abc36bf3ccba3f802b799505def420a1b7039862014a65db3205967f5a", size = 31269, upload-time = "2025-07-30T10:01:42.716Z" }, + { url = "https://files.pythonhosted.org/packages/d4/d3/942aa10782b2697eee7af5e12eeff5ebb325ccfb86dd8abda54174e377e4/argon2_cffi_bindings-25.1.0-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a1c70058c6ab1e352304ac7e3b52554daadacd8d453c1752e547c76e9c99ac44", size = 86558, upload-time = "2025-07-30T10:01:43.943Z" }, + { url = "https://files.pythonhosted.org/packages/0d/82/b484f702fec5536e71836fc2dbc8c5267b3f6e78d2d539b4eaa6f0db8bf8/argon2_cffi_bindings-25.1.0-cp314-cp314t-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e2fd3bfbff3c5d74fef31a722f729bf93500910db650c925c2d6ef879a7e51cb", size = 92364, upload-time = "2025-07-30T10:01:44.887Z" }, + { url = "https://files.pythonhosted.org/packages/c9/c1/a606ff83b3f1735f3759ad0f2cd9e038a0ad11a3de3b6c673aa41c24bb7b/argon2_cffi_bindings-25.1.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = 
"sha256:c4f9665de60b1b0e99bcd6be4f17d90339698ce954cfd8d9cf4f91c995165a92", size = 85637, upload-time = "2025-07-30T10:01:46.225Z" }, + { url = "https://files.pythonhosted.org/packages/44/b4/678503f12aceb0262f84fa201f6027ed77d71c5019ae03b399b97caa2f19/argon2_cffi_bindings-25.1.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ba92837e4a9aa6a508c8d2d7883ed5a8f6c308c89a4790e1e447a220deb79a85", size = 91934, upload-time = "2025-07-30T10:01:47.203Z" }, + { url = "https://files.pythonhosted.org/packages/f0/c7/f36bd08ef9bd9f0a9cff9428406651f5937ce27b6c5b07b92d41f91ae541/argon2_cffi_bindings-25.1.0-cp314-cp314t-win32.whl", hash = "sha256:84a461d4d84ae1295871329b346a97f68eade8c53b6ed9a7ca2d7467f3c8ff6f", size = 28158, upload-time = "2025-07-30T10:01:48.341Z" }, + { url = "https://files.pythonhosted.org/packages/b3/80/0106a7448abb24a2c467bf7d527fe5413b7fdfa4ad6d6a96a43a62ef3988/argon2_cffi_bindings-25.1.0-cp314-cp314t-win_amd64.whl", hash = "sha256:b55aec3565b65f56455eebc9b9f34130440404f27fe21c3b375bf1ea4d8fbae6", size = 32597, upload-time = "2025-07-30T10:01:49.112Z" }, + { url = "https://files.pythonhosted.org/packages/05/b8/d663c9caea07e9180b2cb662772865230715cbd573ba3b5e81793d580316/argon2_cffi_bindings-25.1.0-cp314-cp314t-win_arm64.whl", hash = "sha256:87c33a52407e4c41f3b70a9c2d3f6056d88b10dad7695be708c5021673f55623", size = 28231, upload-time = "2025-07-30T10:01:49.92Z" }, + { url = "https://files.pythonhosted.org/packages/1d/57/96b8b9f93166147826da5f90376e784a10582dd39a393c99bb62cfcf52f0/argon2_cffi_bindings-25.1.0-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:aecba1723ae35330a008418a91ea6cfcedf6d31e5fbaa056a166462ff066d500", size = 54121, upload-time = "2025-07-30T10:01:50.815Z" }, + { url = "https://files.pythonhosted.org/packages/0a/08/a9bebdb2e0e602dde230bdde8021b29f71f7841bd54801bcfd514acb5dcf/argon2_cffi_bindings-25.1.0-cp39-abi3-macosx_10_9_x86_64.whl", hash = "sha256:2630b6240b495dfab90aebe159ff784d08ea999aa4b0d17efa734055a07d2f44", size = 29177, upload-time = "2025-07-30T10:01:51.681Z" }, + { url = "https://files.pythonhosted.org/packages/b6/02/d297943bcacf05e4f2a94ab6f462831dc20158614e5d067c35d4e63b9acb/argon2_cffi_bindings-25.1.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:7aef0c91e2c0fbca6fc68e7555aa60ef7008a739cbe045541e438373bc54d2b0", size = 31090, upload-time = "2025-07-30T10:01:53.184Z" }, + { url = "https://files.pythonhosted.org/packages/c1/93/44365f3d75053e53893ec6d733e4a5e3147502663554b4d864587c7828a7/argon2_cffi_bindings-25.1.0-cp39-abi3-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1e021e87faa76ae0d413b619fe2b65ab9a037f24c60a1e6cc43457ae20de6dc6", size = 81246, upload-time = "2025-07-30T10:01:54.145Z" }, + { url = "https://files.pythonhosted.org/packages/09/52/94108adfdd6e2ddf58be64f959a0b9c7d4ef2fa71086c38356d22dc501ea/argon2_cffi_bindings-25.1.0-cp39-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d3e924cfc503018a714f94a49a149fdc0b644eaead5d1f089330399134fa028a", size = 87126, upload-time = "2025-07-30T10:01:55.074Z" }, + { url = "https://files.pythonhosted.org/packages/72/70/7a2993a12b0ffa2a9271259b79cc616e2389ed1a4d93842fac5a1f923ffd/argon2_cffi_bindings-25.1.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:c87b72589133f0346a1cb8d5ecca4b933e3c9b64656c9d175270a000e73b288d", size = 80343, upload-time = "2025-07-30T10:01:56.007Z" }, + { url = 
"https://files.pythonhosted.org/packages/78/9a/4e5157d893ffc712b74dbd868c7f62365618266982b64accab26bab01edc/argon2_cffi_bindings-25.1.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:1db89609c06afa1a214a69a462ea741cf735b29a57530478c06eb81dd403de99", size = 86777, upload-time = "2025-07-30T10:01:56.943Z" }, + { url = "https://files.pythonhosted.org/packages/74/cd/15777dfde1c29d96de7f18edf4cc94c385646852e7c7b0320aa91ccca583/argon2_cffi_bindings-25.1.0-cp39-abi3-win32.whl", hash = "sha256:473bcb5f82924b1becbb637b63303ec8d10e84c8d241119419897a26116515d2", size = 27180, upload-time = "2025-07-30T10:01:57.759Z" }, + { url = "https://files.pythonhosted.org/packages/e2/c6/a759ece8f1829d1f162261226fbfd2c6832b3ff7657384045286d2afa384/argon2_cffi_bindings-25.1.0-cp39-abi3-win_amd64.whl", hash = "sha256:a98cd7d17e9f7ce244c0803cad3c23a7d379c301ba618a5fa76a67d116618b98", size = 31715, upload-time = "2025-07-30T10:01:58.56Z" }, + { url = "https://files.pythonhosted.org/packages/42/b9/f8d6fa329ab25128b7e98fd83a3cb34d9db5b059a9847eddb840a0af45dd/argon2_cffi_bindings-25.1.0-cp39-abi3-win_arm64.whl", hash = "sha256:b0fdbcf513833809c882823f98dc2f931cf659d9a1429616ac3adebb49f5db94", size = 27149, upload-time = "2025-07-30T10:01:59.329Z" }, +] + +[[package]] +name = "attrs" +version = "25.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/5a/b0/1367933a8532ee6ff8d63537de4f1177af4bff9f3e829baf7331f595bb24/attrs-25.3.0.tar.gz", hash = "sha256:75d7cefc7fb576747b2c81b4442d4d4a1ce0900973527c011d1030fd3bf4af1b", size = 812032, upload-time = "2025-03-13T11:10:22.779Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/77/06/bb80f5f86020c4551da315d78b3ab75e8228f89f0162f2c3a819e407941a/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3", size = 63815, upload-time = "2025-03-13T11:10:21.14Z" }, +] + +[[package]] +name = "authlib" +version = "1.6.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cryptography" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ce/bb/73a1f1c64ee527877f64122422dafe5b87a846ccf4ac933fe21bcbb8fee8/authlib-1.6.4.tar.gz", hash = "sha256:104b0442a43061dc8bc23b133d1d06a2b0a9c2e3e33f34c4338929e816287649", size = 164046, upload-time = "2025-09-17T09:59:23.897Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0e/aa/91355b5f539caf1b94f0e66ff1e4ee39373b757fce08204981f7829ede51/authlib-1.6.4-py2.py3-none-any.whl", hash = "sha256:39313d2a2caac3ecf6d8f95fbebdfd30ae6ea6ae6a6db794d976405fdd9aa796", size = 243076, upload-time = "2025-09-17T09:59:22.259Z" }, +] + +[[package]] +name = "backoff" +version = "2.2.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/47/d7/5bbeb12c44d7c4f2fb5b56abce497eb5ed9f34d85701de869acedd602619/backoff-2.2.1.tar.gz", hash = "sha256:03f829f5bb1923180821643f8753b0502c3b682293992485b0eef2807afa5cba", size = 17001, upload-time = "2022-10-05T19:19:32.061Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/df/73/b6e24bd22e6720ca8ee9a85a0c4a2971af8497d8f3193fa05390cbd46e09/backoff-2.2.1-py3-none-any.whl", hash = "sha256:63579f9a0628e06278f7e47b7d7d5b6ce20dc65c5e96a6f3ca99a6adca0396e8", size = 15148, upload-time = "2022-10-05T19:19:30.546Z" }, +] + +[[package]] +name = "baml-py" +version = "0.201.0" +source = { registry = "https://pypi.org/simple" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/54/54/2b0edb3d22e95ce56f36610391c11108a4ef26ba2837736a32001687ae34/baml_py-0.201.0-cp38-abi3-macosx_10_12_x86_64.whl", hash = "sha256:83228d2af2b0e845bbbb4e14f7cbd3376cec385aee01210ac522ab6076e07bec", size = 17387971, upload-time = "2025-07-03T19:29:05.844Z" }, + { url = "https://files.pythonhosted.org/packages/c9/08/1d48c28c63eadea2c04360cbb7f64968599e99cd6b8fc0ec0bd4424d3cf1/baml_py-0.201.0-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:2a9d016139e3ae5b5ce98c7b05b5fbd53d5d38f04dc810ec4d70fb17dd6c10e4", size = 16191010, upload-time = "2025-07-03T19:29:09.323Z" }, + { url = "https://files.pythonhosted.org/packages/73/1a/20b2d46501e3dd0648af339825106a6ac5eeb5d22d7e6a10cf16b9aa1cb8/baml_py-0.201.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b5058505b1a3c5f04fc1679aec4d730fa9bef2cbd96209b3ed50152f60b96baf", size = 19950249, upload-time = "2025-07-03T19:29:11.974Z" }, + { url = "https://files.pythonhosted.org/packages/38/24/bc871059e905159ae1913c2e3032dd6ef2f5c3d0983999d2c2f1eebb65a4/baml_py-0.201.0-cp38-abi3-manylinux_2_24_aarch64.whl", hash = "sha256:36289d548581ba4accd5eaaab3246872542dd32dc6717e537654fa0cad884071", size = 19231310, upload-time = "2025-07-03T19:29:14.857Z" }, + { url = "https://files.pythonhosted.org/packages/0e/11/4268a0b82b02c7202fe5aa0d7175712158d998c491cac723b2bac3d5d495/baml_py-0.201.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:5ab70e7bd6481d71edca8a33313347b29faccec78b9960138aa437522813ac9a", size = 19490012, upload-time = "2025-07-03T19:29:18.512Z" }, + { url = "https://files.pythonhosted.org/packages/31/21/c9f9aea1adba2a5978ffab11ba0948a9f3f81ec6ed3056067713260e93a1/baml_py-0.201.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:7efc5c693a7142c230a4f3d6700415127fee0b9f5fdbb36db63e04e27ac4c0f1", size = 20090620, upload-time = "2025-07-03T19:29:21.072Z" }, + { url = "https://files.pythonhosted.org/packages/99/cf/92123d8d753f1d1473e080c4c182139bfe3b9a6418e891cf1d96b6c33848/baml_py-0.201.0-cp38-abi3-win_amd64.whl", hash = "sha256:56499857b7a27ae61a661c8ce0dddd0fb567a45c0b826157e44048a14cf586f9", size = 17253005, upload-time = "2025-07-03T19:29:23.722Z" }, + { url = "https://files.pythonhosted.org/packages/59/88/5056aa1bc9480f758cd6e210d63bd1f9ad90b44c87f4121285906526495e/baml_py-0.201.0-cp38-abi3-win_arm64.whl", hash = "sha256:1e52dc1151db84a302b746590fe2bc484bdd794f83fa5da7216d9394c559f33a", size = 15612701, upload-time = "2025-07-03T19:29:26.712Z" }, +] + +[[package]] +name = "bcrypt" +version = "4.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/bb/5d/6d7433e0f3cd46ce0b43cd65e1db465ea024dbb8216fb2404e919c2ad77b/bcrypt-4.3.0.tar.gz", hash = "sha256:3a3fd2204178b6d2adcf09cb4f6426ffef54762577a7c9b54c159008cb288c18", size = 25697, upload-time = "2025-02-28T01:24:09.174Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bf/2c/3d44e853d1fe969d229bd58d39ae6902b3d924af0e2b5a60d17d4b809ded/bcrypt-4.3.0-cp313-cp313t-macosx_10_12_universal2.whl", hash = "sha256:f01e060f14b6b57bbb72fc5b4a83ac21c443c9a2ee708e04a10e9192f90a6281", size = 483719, upload-time = "2025-02-28T01:22:34.539Z" }, + { url = "https://files.pythonhosted.org/packages/a1/e2/58ff6e2a22eca2e2cff5370ae56dba29d70b1ea6fc08ee9115c3ae367795/bcrypt-4.3.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c5eeac541cefd0bb887a371ef73c62c3cd78535e4887b310626036a7c0a817bb", size = 272001, upload-time = "2025-02-28T01:22:38.078Z" }, 
+ { url = "https://files.pythonhosted.org/packages/37/1f/c55ed8dbe994b1d088309e366749633c9eb90d139af3c0a50c102ba68a1a/bcrypt-4.3.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:59e1aa0e2cd871b08ca146ed08445038f42ff75968c7ae50d2fdd7860ade2180", size = 277451, upload-time = "2025-02-28T01:22:40.787Z" }, + { url = "https://files.pythonhosted.org/packages/d7/1c/794feb2ecf22fe73dcfb697ea7057f632061faceb7dcf0f155f3443b4d79/bcrypt-4.3.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:0042b2e342e9ae3d2ed22727c1262f76cc4f345683b5c1715f0250cf4277294f", size = 272792, upload-time = "2025-02-28T01:22:43.144Z" }, + { url = "https://files.pythonhosted.org/packages/13/b7/0b289506a3f3598c2ae2bdfa0ea66969812ed200264e3f61df77753eee6d/bcrypt-4.3.0-cp313-cp313t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74a8d21a09f5e025a9a23e7c0fd2c7fe8e7503e4d356c0a2c1486ba010619f09", size = 289752, upload-time = "2025-02-28T01:22:45.56Z" }, + { url = "https://files.pythonhosted.org/packages/dc/24/d0fb023788afe9e83cc118895a9f6c57e1044e7e1672f045e46733421fe6/bcrypt-4.3.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:0142b2cb84a009f8452c8c5a33ace5e3dfec4159e7735f5afe9a4d50a8ea722d", size = 277762, upload-time = "2025-02-28T01:22:47.023Z" }, + { url = "https://files.pythonhosted.org/packages/e4/38/cde58089492e55ac4ef6c49fea7027600c84fd23f7520c62118c03b4625e/bcrypt-4.3.0-cp313-cp313t-manylinux_2_34_aarch64.whl", hash = "sha256:12fa6ce40cde3f0b899729dbd7d5e8811cb892d31b6f7d0334a1f37748b789fd", size = 272384, upload-time = "2025-02-28T01:22:49.221Z" }, + { url = "https://files.pythonhosted.org/packages/de/6a/d5026520843490cfc8135d03012a413e4532a400e471e6188b01b2de853f/bcrypt-4.3.0-cp313-cp313t-manylinux_2_34_x86_64.whl", hash = "sha256:5bd3cca1f2aa5dbcf39e2aa13dd094ea181f48959e1071265de49cc2b82525af", size = 277329, upload-time = "2025-02-28T01:22:51.603Z" }, + { url = "https://files.pythonhosted.org/packages/b3/a3/4fc5255e60486466c389e28c12579d2829b28a527360e9430b4041df4cf9/bcrypt-4.3.0-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:335a420cfd63fc5bc27308e929bee231c15c85cc4c496610ffb17923abf7f231", size = 305241, upload-time = "2025-02-28T01:22:53.283Z" }, + { url = "https://files.pythonhosted.org/packages/c7/15/2b37bc07d6ce27cc94e5b10fd5058900eb8fb11642300e932c8c82e25c4a/bcrypt-4.3.0-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:0e30e5e67aed0187a1764911af023043b4542e70a7461ad20e837e94d23e1d6c", size = 309617, upload-time = "2025-02-28T01:22:55.461Z" }, + { url = "https://files.pythonhosted.org/packages/5f/1f/99f65edb09e6c935232ba0430c8c13bb98cb3194b6d636e61d93fe60ac59/bcrypt-4.3.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:3b8d62290ebefd49ee0b3ce7500f5dbdcf13b81402c05f6dafab9a1e1b27212f", size = 335751, upload-time = "2025-02-28T01:22:57.81Z" }, + { url = "https://files.pythonhosted.org/packages/00/1b/b324030c706711c99769988fcb694b3cb23f247ad39a7823a78e361bdbb8/bcrypt-4.3.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:2ef6630e0ec01376f59a006dc72918b1bf436c3b571b80fa1968d775fa02fe7d", size = 355965, upload-time = "2025-02-28T01:22:59.181Z" }, + { url = "https://files.pythonhosted.org/packages/aa/dd/20372a0579dd915dfc3b1cd4943b3bca431866fcb1dfdfd7518c3caddea6/bcrypt-4.3.0-cp313-cp313t-win32.whl", hash = "sha256:7a4be4cbf241afee43f1c3969b9103a41b40bcb3a3f467ab19f891d9bc4642e4", size = 155316, upload-time = "2025-02-28T01:23:00.763Z" }, + { url = 
"https://files.pythonhosted.org/packages/6d/52/45d969fcff6b5577c2bf17098dc36269b4c02197d551371c023130c0f890/bcrypt-4.3.0-cp313-cp313t-win_amd64.whl", hash = "sha256:5c1949bf259a388863ced887c7861da1df681cb2388645766c89fdfd9004c669", size = 147752, upload-time = "2025-02-28T01:23:02.908Z" }, + { url = "https://files.pythonhosted.org/packages/11/22/5ada0b9af72b60cbc4c9a399fdde4af0feaa609d27eb0adc61607997a3fa/bcrypt-4.3.0-cp38-abi3-macosx_10_12_universal2.whl", hash = "sha256:f81b0ed2639568bf14749112298f9e4e2b28853dab50a8b357e31798686a036d", size = 498019, upload-time = "2025-02-28T01:23:05.838Z" }, + { url = "https://files.pythonhosted.org/packages/b8/8c/252a1edc598dc1ce57905be173328eda073083826955ee3c97c7ff5ba584/bcrypt-4.3.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:864f8f19adbe13b7de11ba15d85d4a428c7e2f344bac110f667676a0ff84924b", size = 279174, upload-time = "2025-02-28T01:23:07.274Z" }, + { url = "https://files.pythonhosted.org/packages/29/5b/4547d5c49b85f0337c13929f2ccbe08b7283069eea3550a457914fc078aa/bcrypt-4.3.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3e36506d001e93bffe59754397572f21bb5dc7c83f54454c990c74a468cd589e", size = 283870, upload-time = "2025-02-28T01:23:09.151Z" }, + { url = "https://files.pythonhosted.org/packages/be/21/7dbaf3fa1745cb63f776bb046e481fbababd7d344c5324eab47f5ca92dd2/bcrypt-4.3.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:842d08d75d9fe9fb94b18b071090220697f9f184d4547179b60734846461ed59", size = 279601, upload-time = "2025-02-28T01:23:11.461Z" }, + { url = "https://files.pythonhosted.org/packages/6d/64/e042fc8262e971347d9230d9abbe70d68b0a549acd8611c83cebd3eaec67/bcrypt-4.3.0-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7c03296b85cb87db865d91da79bf63d5609284fc0cab9472fdd8367bbd830753", size = 297660, upload-time = "2025-02-28T01:23:12.989Z" }, + { url = "https://files.pythonhosted.org/packages/50/b8/6294eb84a3fef3b67c69b4470fcdd5326676806bf2519cda79331ab3c3a9/bcrypt-4.3.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:62f26585e8b219cdc909b6a0069efc5e4267e25d4a3770a364ac58024f62a761", size = 284083, upload-time = "2025-02-28T01:23:14.5Z" }, + { url = "https://files.pythonhosted.org/packages/62/e6/baff635a4f2c42e8788fe1b1633911c38551ecca9a749d1052d296329da6/bcrypt-4.3.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:beeefe437218a65322fbd0069eb437e7c98137e08f22c4660ac2dc795c31f8bb", size = 279237, upload-time = "2025-02-28T01:23:16.686Z" }, + { url = "https://files.pythonhosted.org/packages/39/48/46f623f1b0c7dc2e5de0b8af5e6f5ac4cc26408ac33f3d424e5ad8da4a90/bcrypt-4.3.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:97eea7408db3a5bcce4a55d13245ab3fa566e23b4c67cd227062bb49e26c585d", size = 283737, upload-time = "2025-02-28T01:23:18.897Z" }, + { url = "https://files.pythonhosted.org/packages/49/8b/70671c3ce9c0fca4a6cc3cc6ccbaa7e948875a2e62cbd146e04a4011899c/bcrypt-4.3.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:191354ebfe305e84f344c5964c7cd5f924a3bfc5d405c75ad07f232b6dffb49f", size = 312741, upload-time = "2025-02-28T01:23:21.041Z" }, + { url = "https://files.pythonhosted.org/packages/27/fb/910d3a1caa2d249b6040a5caf9f9866c52114d51523ac2fb47578a27faee/bcrypt-4.3.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:41261d64150858eeb5ff43c753c4b216991e0ae16614a308a15d909503617732", size = 316472, upload-time = "2025-02-28T01:23:23.183Z" }, + { url = 
"https://files.pythonhosted.org/packages/dc/cf/7cf3a05b66ce466cfb575dbbda39718d45a609daa78500f57fa9f36fa3c0/bcrypt-4.3.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:33752b1ba962ee793fa2b6321404bf20011fe45b9afd2a842139de3011898fef", size = 343606, upload-time = "2025-02-28T01:23:25.361Z" }, + { url = "https://files.pythonhosted.org/packages/e3/b8/e970ecc6d7e355c0d892b7f733480f4aa8509f99b33e71550242cf0b7e63/bcrypt-4.3.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:50e6e80a4bfd23a25f5c05b90167c19030cf9f87930f7cb2eacb99f45d1c3304", size = 362867, upload-time = "2025-02-28T01:23:26.875Z" }, + { url = "https://files.pythonhosted.org/packages/a9/97/8d3118efd8354c555a3422d544163f40d9f236be5b96c714086463f11699/bcrypt-4.3.0-cp38-abi3-win32.whl", hash = "sha256:67a561c4d9fb9465ec866177e7aebcad08fe23aaf6fbd692a6fab69088abfc51", size = 160589, upload-time = "2025-02-28T01:23:28.381Z" }, + { url = "https://files.pythonhosted.org/packages/29/07/416f0b99f7f3997c69815365babbc2e8754181a4b1899d921b3c7d5b6f12/bcrypt-4.3.0-cp38-abi3-win_amd64.whl", hash = "sha256:584027857bc2843772114717a7490a37f68da563b3620f78a849bcb54dc11e62", size = 152794, upload-time = "2025-02-28T01:23:30.187Z" }, + { url = "https://files.pythonhosted.org/packages/6e/c1/3fa0e9e4e0bfd3fd77eb8b52ec198fd6e1fd7e9402052e43f23483f956dd/bcrypt-4.3.0-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:0d3efb1157edebfd9128e4e46e2ac1a64e0c1fe46fb023158a407c7892b0f8c3", size = 498969, upload-time = "2025-02-28T01:23:31.945Z" }, + { url = "https://files.pythonhosted.org/packages/ce/d4/755ce19b6743394787fbd7dff6bf271b27ee9b5912a97242e3caf125885b/bcrypt-4.3.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:08bacc884fd302b611226c01014eca277d48f0a05187666bca23aac0dad6fe24", size = 279158, upload-time = "2025-02-28T01:23:34.161Z" }, + { url = "https://files.pythonhosted.org/packages/9b/5d/805ef1a749c965c46b28285dfb5cd272a7ed9fa971f970435a5133250182/bcrypt-4.3.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f6746e6fec103fcd509b96bacdfdaa2fbde9a553245dbada284435173a6f1aef", size = 284285, upload-time = "2025-02-28T01:23:35.765Z" }, + { url = "https://files.pythonhosted.org/packages/ab/2b/698580547a4a4988e415721b71eb45e80c879f0fb04a62da131f45987b96/bcrypt-4.3.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:afe327968aaf13fc143a56a3360cb27d4ad0345e34da12c7290f1b00b8fe9a8b", size = 279583, upload-time = "2025-02-28T01:23:38.021Z" }, + { url = "https://files.pythonhosted.org/packages/f2/87/62e1e426418204db520f955ffd06f1efd389feca893dad7095bf35612eec/bcrypt-4.3.0-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d9af79d322e735b1fc33404b5765108ae0ff232d4b54666d46730f8ac1a43676", size = 297896, upload-time = "2025-02-28T01:23:39.575Z" }, + { url = "https://files.pythonhosted.org/packages/cb/c6/8fedca4c2ada1b6e889c52d2943b2f968d3427e5d65f595620ec4c06fa2f/bcrypt-4.3.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f1e3ffa1365e8702dc48c8b360fef8d7afeca482809c5e45e653af82ccd088c1", size = 284492, upload-time = "2025-02-28T01:23:40.901Z" }, + { url = "https://files.pythonhosted.org/packages/4d/4d/c43332dcaaddb7710a8ff5269fcccba97ed3c85987ddaa808db084267b9a/bcrypt-4.3.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:3004df1b323d10021fda07a813fd33e0fd57bef0e9a480bb143877f6cba996fe", size = 279213, upload-time = "2025-02-28T01:23:42.653Z" }, + { url = 
"https://files.pythonhosted.org/packages/dc/7f/1e36379e169a7df3a14a1c160a49b7b918600a6008de43ff20d479e6f4b5/bcrypt-4.3.0-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:531457e5c839d8caea9b589a1bcfe3756b0547d7814e9ce3d437f17da75c32b0", size = 284162, upload-time = "2025-02-28T01:23:43.964Z" }, + { url = "https://files.pythonhosted.org/packages/1c/0a/644b2731194b0d7646f3210dc4d80c7fee3ecb3a1f791a6e0ae6bb8684e3/bcrypt-4.3.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:17a854d9a7a476a89dcef6c8bd119ad23e0f82557afbd2c442777a16408e614f", size = 312856, upload-time = "2025-02-28T01:23:46.011Z" }, + { url = "https://files.pythonhosted.org/packages/dc/62/2a871837c0bb6ab0c9a88bf54de0fc021a6a08832d4ea313ed92a669d437/bcrypt-4.3.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:6fb1fd3ab08c0cbc6826a2e0447610c6f09e983a281b919ed721ad32236b8b23", size = 316726, upload-time = "2025-02-28T01:23:47.575Z" }, + { url = "https://files.pythonhosted.org/packages/0c/a1/9898ea3faac0b156d457fd73a3cb9c2855c6fd063e44b8522925cdd8ce46/bcrypt-4.3.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:e965a9c1e9a393b8005031ff52583cedc15b7884fce7deb8b0346388837d6cfe", size = 343664, upload-time = "2025-02-28T01:23:49.059Z" }, + { url = "https://files.pythonhosted.org/packages/40/f2/71b4ed65ce38982ecdda0ff20c3ad1b15e71949c78b2c053df53629ce940/bcrypt-4.3.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:79e70b8342a33b52b55d93b3a59223a844962bef479f6a0ea318ebbcadf71505", size = 363128, upload-time = "2025-02-28T01:23:50.399Z" }, + { url = "https://files.pythonhosted.org/packages/11/99/12f6a58eca6dea4be992d6c681b7ec9410a1d9f5cf368c61437e31daa879/bcrypt-4.3.0-cp39-abi3-win32.whl", hash = "sha256:b4d4e57f0a63fd0b358eb765063ff661328f69a04494427265950c71b992a39a", size = 160598, upload-time = "2025-02-28T01:23:51.775Z" }, + { url = "https://files.pythonhosted.org/packages/a9/cf/45fb5261ece3e6b9817d3d82b2f343a505fd58674a92577923bc500bd1aa/bcrypt-4.3.0-cp39-abi3-win_amd64.whl", hash = "sha256:e53e074b120f2877a35cc6c736b8eb161377caae8925c17688bd46ba56daaa5b", size = 152799, upload-time = "2025-02-28T01:23:53.139Z" }, + { url = "https://files.pythonhosted.org/packages/4c/b1/1289e21d710496b88340369137cc4c5f6ee036401190ea116a7b4ae6d32a/bcrypt-4.3.0-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a839320bf27d474e52ef8cb16449bb2ce0ba03ca9f44daba6d93fa1d8828e48a", size = 275103, upload-time = "2025-02-28T01:24:00.764Z" }, + { url = "https://files.pythonhosted.org/packages/94/41/19be9fe17e4ffc5d10b7b67f10e459fc4eee6ffe9056a88de511920cfd8d/bcrypt-4.3.0-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:bdc6a24e754a555d7316fa4774e64c6c3997d27ed2d1964d55920c7c227bc4ce", size = 280513, upload-time = "2025-02-28T01:24:02.243Z" }, + { url = "https://files.pythonhosted.org/packages/aa/73/05687a9ef89edebdd8ad7474c16d8af685eb4591c3c38300bb6aad4f0076/bcrypt-4.3.0-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:55a935b8e9a1d2def0626c4269db3fcd26728cbff1e84f0341465c31c4ee56d8", size = 274685, upload-time = "2025-02-28T01:24:04.512Z" }, + { url = "https://files.pythonhosted.org/packages/63/13/47bba97924ebe86a62ef83dc75b7c8a881d53c535f83e2c54c4bd701e05c/bcrypt-4.3.0-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:57967b7a28d855313a963aaea51bf6df89f833db4320da458e5b3c5ab6d4c938", size = 280110, upload-time = "2025-02-28T01:24:05.896Z" }, +] + +[[package]] +name = "black" +version = "25.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + 
{ name = "mypy-extensions" }, + { name = "packaging" }, + { name = "pathspec" }, + { name = "platformdirs" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/94/49/26a7b0f3f35da4b5a65f081943b7bcd22d7002f5f0fb8098ec1ff21cb6ef/black-25.1.0.tar.gz", hash = "sha256:33496d5cd1222ad73391352b4ae8da15253c5de89b93a80b3e2c8d9a19ec2666", size = 649449, upload-time = "2025-01-29T04:15:40.373Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/4f/87f596aca05c3ce5b94b8663dbfe242a12843caaa82dd3f85f1ffdc3f177/black-25.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a39337598244de4bae26475f77dda852ea00a93bd4c728e09eacd827ec929df0", size = 1614372, upload-time = "2025-01-29T05:37:11.71Z" }, + { url = "https://files.pythonhosted.org/packages/e7/d0/2c34c36190b741c59c901e56ab7f6e54dad8df05a6272a9747ecef7c6036/black-25.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96c1c7cd856bba8e20094e36e0f948718dc688dba4a9d78c3adde52b9e6c2299", size = 1442865, upload-time = "2025-01-29T05:37:14.309Z" }, + { url = "https://files.pythonhosted.org/packages/21/d4/7518c72262468430ead45cf22bd86c883a6448b9eb43672765d69a8f1248/black-25.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce2e264d59c91e52d8000d507eb20a9aca4a778731a08cfff7e5ac4a4bb7096", size = 1749699, upload-time = "2025-01-29T04:18:17.688Z" }, + { url = "https://files.pythonhosted.org/packages/58/db/4f5beb989b547f79096e035c4981ceb36ac2b552d0ac5f2620e941501c99/black-25.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:172b1dbff09f86ce6f4eb8edf9dede08b1fce58ba194c87d7a4f1a5aa2f5b3c2", size = 1428028, upload-time = "2025-01-29T04:18:51.711Z" }, + { url = "https://files.pythonhosted.org/packages/83/71/3fe4741df7adf015ad8dfa082dd36c94ca86bb21f25608eb247b4afb15b2/black-25.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4b60580e829091e6f9238c848ea6750efed72140b91b048770b64e74fe04908b", size = 1650988, upload-time = "2025-01-29T05:37:16.707Z" }, + { url = "https://files.pythonhosted.org/packages/13/f3/89aac8a83d73937ccd39bbe8fc6ac8860c11cfa0af5b1c96d081facac844/black-25.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e2978f6df243b155ef5fa7e558a43037c3079093ed5d10fd84c43900f2d8ecc", size = 1453985, upload-time = "2025-01-29T05:37:18.273Z" }, + { url = "https://files.pythonhosted.org/packages/6f/22/b99efca33f1f3a1d2552c714b1e1b5ae92efac6c43e790ad539a163d1754/black-25.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b48735872ec535027d979e8dcb20bf4f70b5ac75a8ea99f127c106a7d7aba9f", size = 1783816, upload-time = "2025-01-29T04:18:33.823Z" }, + { url = "https://files.pythonhosted.org/packages/18/7e/a27c3ad3822b6f2e0e00d63d58ff6299a99a5b3aee69fa77cd4b0076b261/black-25.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:ea0213189960bda9cf99be5b8c8ce66bb054af5e9e861249cd23471bd7b0b3ba", size = 1440860, upload-time = "2025-01-29T04:19:12.944Z" }, + { url = "https://files.pythonhosted.org/packages/98/87/0edf98916640efa5d0696e1abb0a8357b52e69e82322628f25bf14d263d1/black-25.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8f0b18a02996a836cc9c9c78e5babec10930862827b1b724ddfe98ccf2f2fe4f", size = 1650673, upload-time = "2025-01-29T05:37:20.574Z" }, + { url = "https://files.pythonhosted.org/packages/52/e5/f7bf17207cf87fa6e9b676576749c6b6ed0d70f179a3d812c997870291c3/black-25.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:afebb7098bfbc70037a053b91ae8437c3857482d3a690fefc03e9ff7aa9a5fd3", size = 1453190, upload-time = 
"2025-01-29T05:37:22.106Z" }, + { url = "https://files.pythonhosted.org/packages/e3/ee/adda3d46d4a9120772fae6de454c8495603c37c4c3b9c60f25b1ab6401fe/black-25.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:030b9759066a4ee5e5aca28c3c77f9c64789cdd4de8ac1df642c40b708be6171", size = 1782926, upload-time = "2025-01-29T04:18:58.564Z" }, + { url = "https://files.pythonhosted.org/packages/cc/64/94eb5f45dcb997d2082f097a3944cfc7fe87e071907f677e80788a2d7b7a/black-25.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:a22f402b410566e2d1c950708c77ebf5ebd5d0d88a6a2e87c86d9fb48afa0d18", size = 1442613, upload-time = "2025-01-29T04:19:27.63Z" }, + { url = "https://files.pythonhosted.org/packages/09/71/54e999902aed72baf26bca0d50781b01838251a462612966e9fc4891eadd/black-25.1.0-py3-none-any.whl", hash = "sha256:95e8176dae143ba9097f351d174fdaf0ccd29efb414b362ae3fd72bf0f710717", size = 207646, upload-time = "2025-01-29T04:15:38.082Z" }, +] + +[[package]] +name = "boto3" +version = "1.40.18" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "botocore" }, + { name = "jmespath" }, + { name = "s3transfer" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/36/35/a30dc21ca6582358e0ce963f38e85d42ea619f12e7be4101a834c21d749d/boto3-1.40.18.tar.gz", hash = "sha256:64301d39adecc154e3e595eaf0d4f28998ef0a5551f1d033aeac51a9e1a688e5", size = 111994, upload-time = "2025-08-26T19:21:38.61Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ad/b5/3fc1802eb24aef135c3ba69fff2a9bfcc6a7a8258fb396706b1a6a44de36/boto3-1.40.18-py3-none-any.whl", hash = "sha256:daa776ba1251a7458c9d6c7627873d0c2460c8e8272d35759065580e9193700a", size = 140076, upload-time = "2025-08-26T19:21:36.484Z" }, +] + +[[package]] +name = "botocore" +version = "1.40.18" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "jmespath" }, + { name = "python-dateutil" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6a/91/2e745382793fa7d30810a7d5ca3e05f6817b6db07601ca5aaab12720caf9/botocore-1.40.18.tar.gz", hash = "sha256:afd69bdadd8c55cc89d69de0799829e555193a352d87867f746e19020271cc0f", size = 14375007, upload-time = "2025-08-26T19:21:24.996Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1a/f5/bd57bf21fdcc4e500cc406ed2c296e626ddd160f0fee2a4932256e5d62d8/botocore-1.40.18-py3-none-any.whl", hash = "sha256:57025c46ca00cf8cec25de07a759521bfbfb3036a0f69b272654a354615dc45f", size = 14039935, upload-time = "2025-08-26T19:21:19.085Z" }, +] + +[[package]] +name = "cachetools" +version = "5.5.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6c/81/3747dad6b14fa2cf53fcf10548cf5aea6913e96fab41a3c198676f8948a5/cachetools-5.5.2.tar.gz", hash = "sha256:1a661caa9175d26759571b2e19580f9d6393969e5dfca11fdb1f947a23e640d4", size = 28380, upload-time = "2025-02-20T21:01:19.524Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/72/76/20fa66124dbe6be5cafeb312ece67de6b61dd91a0247d1ea13db4ebb33c2/cachetools-5.5.2-py3-none-any.whl", hash = "sha256:d26a22bcc62eb95c3beabd9f1ee5e820d3d2704fe2967cbe350e20c8ffcd3f0a", size = 10080, upload-time = "2025-02-20T21:01:16.647Z" }, +] + +[[package]] +name = "certifi" +version = "2025.8.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/dc/67/960ebe6bf230a96cda2e0abcf73af550ec4f090005363542f0765df162e0/certifi-2025.8.3.tar.gz", hash 
= "sha256:e564105f78ded564e3ae7c923924435e1daa7463faeab5bb932bc53ffae63407", size = 162386, upload-time = "2025-08-03T03:07:47.08Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/48/1549795ba7742c948d2ad169c1c8cdbae65bc450d6cd753d124b17c8cd32/certifi-2025.8.3-py3-none-any.whl", hash = "sha256:f6c12493cfb1b06ba2ff328595af9350c65d6644968e5d3a2ffd78699af217a5", size = 161216, upload-time = "2025-08-03T03:07:45.777Z" }, +] + +[[package]] +name = "cffi" +version = "2.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pycparser", marker = "implementation_name != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/12/4a/3dfd5f7850cbf0d06dc84ba9aa00db766b52ca38d8b86e3a38314d52498c/cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe", size = 184344, upload-time = "2025-09-08T23:22:26.456Z" }, + { url = "https://files.pythonhosted.org/packages/4f/8b/f0e4c441227ba756aafbe78f117485b25bb26b1c059d01f137fa6d14896b/cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c", size = 180560, upload-time = "2025-09-08T23:22:28.197Z" }, + { url = "https://files.pythonhosted.org/packages/b1/b7/1200d354378ef52ec227395d95c2576330fd22a869f7a70e88e1447eb234/cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92", size = 209613, upload-time = "2025-09-08T23:22:29.475Z" }, + { url = "https://files.pythonhosted.org/packages/b8/56/6033f5e86e8cc9bb629f0077ba71679508bdf54a9a5e112a3c0b91870332/cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93", size = 216476, upload-time = "2025-09-08T23:22:31.063Z" }, + { url = "https://files.pythonhosted.org/packages/dc/7f/55fecd70f7ece178db2f26128ec41430d8720f2d12ca97bf8f0a628207d5/cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5", size = 203374, upload-time = "2025-09-08T23:22:32.507Z" }, + { url = "https://files.pythonhosted.org/packages/84/ef/a7b77c8bdc0f77adc3b46888f1ad54be8f3b7821697a7b89126e829e676a/cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664", size = 202597, upload-time = "2025-09-08T23:22:34.132Z" }, + { url = "https://files.pythonhosted.org/packages/d7/91/500d892b2bf36529a75b77958edfcd5ad8e2ce4064ce2ecfeab2125d72d1/cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26", size = 215574, upload-time = "2025-09-08T23:22:35.443Z" }, + { url = "https://files.pythonhosted.org/packages/44/64/58f6255b62b101093d5df22dcb752596066c7e89dd725e0afaed242a61be/cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9", size = 218971, upload-time = "2025-09-08T23:22:36.805Z" }, + { url = 
"https://files.pythonhosted.org/packages/ab/49/fa72cebe2fd8a55fbe14956f9970fe8eb1ac59e5df042f603ef7c8ba0adc/cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414", size = 211972, upload-time = "2025-09-08T23:22:38.436Z" }, + { url = "https://files.pythonhosted.org/packages/0b/28/dd0967a76aab36731b6ebfe64dec4e981aff7e0608f60c2d46b46982607d/cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743", size = 217078, upload-time = "2025-09-08T23:22:39.776Z" }, + { url = "https://files.pythonhosted.org/packages/2b/c0/015b25184413d7ab0a410775fdb4a50fca20f5589b5dab1dbbfa3baad8ce/cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5", size = 172076, upload-time = "2025-09-08T23:22:40.95Z" }, + { url = "https://files.pythonhosted.org/packages/ae/8f/dc5531155e7070361eb1b7e4c1a9d896d0cb21c49f807a6c03fd63fc877e/cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5", size = 182820, upload-time = "2025-09-08T23:22:42.463Z" }, + { url = "https://files.pythonhosted.org/packages/95/5c/1b493356429f9aecfd56bc171285a4c4ac8697f76e9bbbbb105e537853a1/cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d", size = 177635, upload-time = "2025-09-08T23:22:43.623Z" }, + { url = "https://files.pythonhosted.org/packages/ea/47/4f61023ea636104d4f16ab488e268b93008c3d0bb76893b1b31db1f96802/cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d", size = 185271, upload-time = "2025-09-08T23:22:44.795Z" }, + { url = "https://files.pythonhosted.org/packages/df/a2/781b623f57358e360d62cdd7a8c681f074a71d445418a776eef0aadb4ab4/cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c", size = 181048, upload-time = "2025-09-08T23:22:45.938Z" }, + { url = "https://files.pythonhosted.org/packages/ff/df/a4f0fbd47331ceeba3d37c2e51e9dfc9722498becbeec2bd8bc856c9538a/cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe", size = 212529, upload-time = "2025-09-08T23:22:47.349Z" }, + { url = "https://files.pythonhosted.org/packages/d5/72/12b5f8d3865bf0f87cf1404d8c374e7487dcf097a1c91c436e72e6badd83/cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062", size = 220097, upload-time = "2025-09-08T23:22:48.677Z" }, + { url = "https://files.pythonhosted.org/packages/c2/95/7a135d52a50dfa7c882ab0ac17e8dc11cec9d55d2c18dda414c051c5e69e/cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e", size = 207983, upload-time = "2025-09-08T23:22:50.06Z" }, + { url = "https://files.pythonhosted.org/packages/3a/c8/15cb9ada8895957ea171c62dc78ff3e99159ee7adb13c0123c001a2546c1/cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037", size = 206519, upload-time = "2025-09-08T23:22:51.364Z" }, + { url = 
"https://files.pythonhosted.org/packages/78/2d/7fa73dfa841b5ac06c7b8855cfc18622132e365f5b81d02230333ff26e9e/cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba", size = 219572, upload-time = "2025-09-08T23:22:52.902Z" }, + { url = "https://files.pythonhosted.org/packages/07/e0/267e57e387b4ca276b90f0434ff88b2c2241ad72b16d31836adddfd6031b/cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94", size = 222963, upload-time = "2025-09-08T23:22:54.518Z" }, + { url = "https://files.pythonhosted.org/packages/b6/75/1f2747525e06f53efbd878f4d03bac5b859cbc11c633d0fb81432d98a795/cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187", size = 221361, upload-time = "2025-09-08T23:22:55.867Z" }, + { url = "https://files.pythonhosted.org/packages/7b/2b/2b6435f76bfeb6bbf055596976da087377ede68df465419d192acf00c437/cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18", size = 172932, upload-time = "2025-09-08T23:22:57.188Z" }, + { url = "https://files.pythonhosted.org/packages/f8/ed/13bd4418627013bec4ed6e54283b1959cf6db888048c7cf4b4c3b5b36002/cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5", size = 183557, upload-time = "2025-09-08T23:22:58.351Z" }, + { url = "https://files.pythonhosted.org/packages/95/31/9f7f93ad2f8eff1dbc1c3656d7ca5bfd8fb52c9d786b4dcf19b2d02217fa/cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6", size = 177762, upload-time = "2025-09-08T23:22:59.668Z" }, + { url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" }, + { url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" }, + { url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload-time = "2025-09-08T23:23:03.472Z" }, + { url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" }, + { url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" }, + { url = 
"https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" }, + { url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" }, + { url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" }, + { url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" }, + { url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" }, + { url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" }, + { url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" }, + { url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" }, + { url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" }, + { url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" }, + { url = "https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" }, + { url = 
"https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" }, + { url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" }, + { url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" }, + { url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" }, + { url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload-time = "2025-09-08T23:23:44.61Z" }, + { url = "https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload-time = "2025-09-08T23:23:45.848Z" }, + { url = "https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload-time = "2025-09-08T23:23:47.105Z" }, + { url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" }, + { url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" }, + { url = "https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" }, + { url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" }, + { url = 
"https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" }, + { url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" }, + { url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" }, + { url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" }, + { url = "https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload-time = "2025-09-08T23:23:40.423Z" }, + { url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload-time = "2025-09-08T23:23:41.742Z" }, + { url = "https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload-time = "2025-09-08T23:23:43.004Z" }, +] + +[[package]] +name = "cfgv" +version = "3.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/11/74/539e56497d9bd1d484fd863dd69cbbfa653cd2aa27abfe35653494d85e94/cfgv-3.4.0.tar.gz", hash = "sha256:e52591d4c5f5dead8e0f673fb16db7949d2cfb3f7da4582893288f0ded8fe560", size = 7114, upload-time = "2023-08-12T20:38:17.776Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c5/55/51844dd50c4fc7a33b653bfaba4c2456f06955289ca770a5dbd5fd267374/cfgv-3.4.0-py2.py3-none-any.whl", hash = "sha256:b7265b1f29fd3316bfcd2b330d63d024f2bfd8bcb8b0272f8e19a504856c48f9", size = 7249, upload-time = "2023-08-12T20:38:16.269Z" }, +] + +[[package]] +name = "charset-normalizer" +version = "3.4.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/83/2d/5fd176ceb9b2fc619e63405525573493ca23441330fcdaee6bef9460e924/charset_normalizer-3.4.3.tar.gz", hash = "sha256:6fce4b8500244f6fcb71465d4a4930d132ba9ab8e71a7859e6a5d59851068d14", size = 122371, upload-time = "2025-08-09T07:57:28.46Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7f/b5/991245018615474a60965a7c9cd2b4efbaabd16d582a5547c47ee1c7730b/charset_normalizer-3.4.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:b256ee2e749283ef3ddcff51a675ff43798d92d746d1a6e4631bf8c707d22d0b", size = 
204483, upload-time = "2025-08-09T07:55:53.12Z" }, + { url = "https://files.pythonhosted.org/packages/c7/2a/ae245c41c06299ec18262825c1569c5d3298fc920e4ddf56ab011b417efd/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:13faeacfe61784e2559e690fc53fa4c5ae97c6fcedb8eb6fb8d0a15b475d2c64", size = 145520, upload-time = "2025-08-09T07:55:54.712Z" }, + { url = "https://files.pythonhosted.org/packages/3a/a4/b3b6c76e7a635748c4421d2b92c7b8f90a432f98bda5082049af37ffc8e3/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:00237675befef519d9af72169d8604a067d92755e84fe76492fef5441db05b91", size = 158876, upload-time = "2025-08-09T07:55:56.024Z" }, + { url = "https://files.pythonhosted.org/packages/e2/e6/63bb0e10f90a8243c5def74b5b105b3bbbfb3e7bb753915fe333fb0c11ea/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:585f3b2a80fbd26b048a0be90c5aae8f06605d3c92615911c3a2b03a8a3b796f", size = 156083, upload-time = "2025-08-09T07:55:57.582Z" }, + { url = "https://files.pythonhosted.org/packages/87/df/b7737ff046c974b183ea9aa111b74185ac8c3a326c6262d413bd5a1b8c69/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0e78314bdc32fa80696f72fa16dc61168fda4d6a0c014e0380f9d02f0e5d8a07", size = 150295, upload-time = "2025-08-09T07:55:59.147Z" }, + { url = "https://files.pythonhosted.org/packages/61/f1/190d9977e0084d3f1dc169acd060d479bbbc71b90bf3e7bf7b9927dec3eb/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:96b2b3d1a83ad55310de8c7b4a2d04d9277d5591f40761274856635acc5fcb30", size = 148379, upload-time = "2025-08-09T07:56:00.364Z" }, + { url = "https://files.pythonhosted.org/packages/4c/92/27dbe365d34c68cfe0ca76f1edd70e8705d82b378cb54ebbaeabc2e3029d/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:939578d9d8fd4299220161fdd76e86c6a251987476f5243e8864a7844476ba14", size = 160018, upload-time = "2025-08-09T07:56:01.678Z" }, + { url = "https://files.pythonhosted.org/packages/99/04/baae2a1ea1893a01635d475b9261c889a18fd48393634b6270827869fa34/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:fd10de089bcdcd1be95a2f73dbe6254798ec1bda9f450d5828c96f93e2536b9c", size = 157430, upload-time = "2025-08-09T07:56:02.87Z" }, + { url = "https://files.pythonhosted.org/packages/2f/36/77da9c6a328c54d17b960c89eccacfab8271fdaaa228305330915b88afa9/charset_normalizer-3.4.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:1e8ac75d72fa3775e0b7cb7e4629cec13b7514d928d15ef8ea06bca03ef01cae", size = 151600, upload-time = "2025-08-09T07:56:04.089Z" }, + { url = "https://files.pythonhosted.org/packages/64/d4/9eb4ff2c167edbbf08cdd28e19078bf195762e9bd63371689cab5ecd3d0d/charset_normalizer-3.4.3-cp311-cp311-win32.whl", hash = "sha256:6cf8fd4c04756b6b60146d98cd8a77d0cdae0e1ca20329da2ac85eed779b6849", size = 99616, upload-time = "2025-08-09T07:56:05.658Z" }, + { url = "https://files.pythonhosted.org/packages/f4/9c/996a4a028222e7761a96634d1820de8a744ff4327a00ada9c8942033089b/charset_normalizer-3.4.3-cp311-cp311-win_amd64.whl", hash = "sha256:31a9a6f775f9bcd865d88ee350f0ffb0e25936a7f930ca98995c05abf1faf21c", size = 107108, upload-time = "2025-08-09T07:56:07.176Z" }, + { url = 
"https://files.pythonhosted.org/packages/e9/5e/14c94999e418d9b87682734589404a25854d5f5d0408df68bc15b6ff54bb/charset_normalizer-3.4.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e28e334d3ff134e88989d90ba04b47d84382a828c061d0d1027b1b12a62b39b1", size = 205655, upload-time = "2025-08-09T07:56:08.475Z" }, + { url = "https://files.pythonhosted.org/packages/7d/a8/c6ec5d389672521f644505a257f50544c074cf5fc292d5390331cd6fc9c3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0cacf8f7297b0c4fcb74227692ca46b4a5852f8f4f24b3c766dd94a1075c4884", size = 146223, upload-time = "2025-08-09T07:56:09.708Z" }, + { url = "https://files.pythonhosted.org/packages/fc/eb/a2ffb08547f4e1e5415fb69eb7db25932c52a52bed371429648db4d84fb1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c6fd51128a41297f5409deab284fecbe5305ebd7e5a1f959bee1c054622b7018", size = 159366, upload-time = "2025-08-09T07:56:11.326Z" }, + { url = "https://files.pythonhosted.org/packages/82/10/0fd19f20c624b278dddaf83b8464dcddc2456cb4b02bb902a6da126b87a1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3cfb2aad70f2c6debfbcb717f23b7eb55febc0bb23dcffc0f076009da10c6392", size = 157104, upload-time = "2025-08-09T07:56:13.014Z" }, + { url = "https://files.pythonhosted.org/packages/16/ab/0233c3231af734f5dfcf0844aa9582d5a1466c985bbed6cedab85af9bfe3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1606f4a55c0fd363d754049cdf400175ee96c992b1f8018b993941f221221c5f", size = 151830, upload-time = "2025-08-09T07:56:14.428Z" }, + { url = "https://files.pythonhosted.org/packages/ae/02/e29e22b4e02839a0e4a06557b1999d0a47db3567e82989b5bb21f3fbbd9f/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:027b776c26d38b7f15b26a5da1044f376455fb3766df8fc38563b4efbc515154", size = 148854, upload-time = "2025-08-09T07:56:16.051Z" }, + { url = "https://files.pythonhosted.org/packages/05/6b/e2539a0a4be302b481e8cafb5af8792da8093b486885a1ae4d15d452bcec/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:42e5088973e56e31e4fa58eb6bd709e42fc03799c11c42929592889a2e54c491", size = 160670, upload-time = "2025-08-09T07:56:17.314Z" }, + { url = "https://files.pythonhosted.org/packages/31/e7/883ee5676a2ef217a40ce0bffcc3d0dfbf9e64cbcfbdf822c52981c3304b/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:cc34f233c9e71701040d772aa7490318673aa7164a0efe3172b2981218c26d93", size = 158501, upload-time = "2025-08-09T07:56:18.641Z" }, + { url = "https://files.pythonhosted.org/packages/c1/35/6525b21aa0db614cf8b5792d232021dca3df7f90a1944db934efa5d20bb1/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:320e8e66157cc4e247d9ddca8e21f427efc7a04bbd0ac8a9faf56583fa543f9f", size = 153173, upload-time = "2025-08-09T07:56:20.289Z" }, + { url = "https://files.pythonhosted.org/packages/50/ee/f4704bad8201de513fdc8aac1cabc87e38c5818c93857140e06e772b5892/charset_normalizer-3.4.3-cp312-cp312-win32.whl", hash = "sha256:fb6fecfd65564f208cbf0fba07f107fb661bcd1a7c389edbced3f7a493f70e37", size = 99822, upload-time = "2025-08-09T07:56:21.551Z" }, + { url = 
"https://files.pythonhosted.org/packages/39/f5/3b3836ca6064d0992c58c7561c6b6eee1b3892e9665d650c803bd5614522/charset_normalizer-3.4.3-cp312-cp312-win_amd64.whl", hash = "sha256:86df271bf921c2ee3818f0522e9a5b8092ca2ad8b065ece5d7d9d0e9f4849bcc", size = 107543, upload-time = "2025-08-09T07:56:23.115Z" }, + { url = "https://files.pythonhosted.org/packages/65/ca/2135ac97709b400c7654b4b764daf5c5567c2da45a30cdd20f9eefe2d658/charset_normalizer-3.4.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:14c2a87c65b351109f6abfc424cab3927b3bdece6f706e4d12faaf3d52ee5efe", size = 205326, upload-time = "2025-08-09T07:56:24.721Z" }, + { url = "https://files.pythonhosted.org/packages/71/11/98a04c3c97dd34e49c7d247083af03645ca3730809a5509443f3c37f7c99/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41d1fc408ff5fdfb910200ec0e74abc40387bccb3252f3f27c0676731df2b2c8", size = 146008, upload-time = "2025-08-09T07:56:26.004Z" }, + { url = "https://files.pythonhosted.org/packages/60/f5/4659a4cb3c4ec146bec80c32d8bb16033752574c20b1252ee842a95d1a1e/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:1bb60174149316da1c35fa5233681f7c0f9f514509b8e399ab70fea5f17e45c9", size = 159196, upload-time = "2025-08-09T07:56:27.25Z" }, + { url = "https://files.pythonhosted.org/packages/86/9e/f552f7a00611f168b9a5865a1414179b2c6de8235a4fa40189f6f79a1753/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:30d006f98569de3459c2fc1f2acde170b7b2bd265dc1943e87e1a4efe1b67c31", size = 156819, upload-time = "2025-08-09T07:56:28.515Z" }, + { url = "https://files.pythonhosted.org/packages/7e/95/42aa2156235cbc8fa61208aded06ef46111c4d3f0de233107b3f38631803/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:416175faf02e4b0810f1f38bcb54682878a4af94059a1cd63b8747244420801f", size = 151350, upload-time = "2025-08-09T07:56:29.716Z" }, + { url = "https://files.pythonhosted.org/packages/c2/a9/3865b02c56f300a6f94fc631ef54f0a8a29da74fb45a773dfd3dcd380af7/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6aab0f181c486f973bc7262a97f5aca3ee7e1437011ef0c2ec04b5a11d16c927", size = 148644, upload-time = "2025-08-09T07:56:30.984Z" }, + { url = "https://files.pythonhosted.org/packages/77/d9/cbcf1a2a5c7d7856f11e7ac2d782aec12bdfea60d104e60e0aa1c97849dc/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:fdabf8315679312cfa71302f9bd509ded4f2f263fb5b765cf1433b39106c3cc9", size = 160468, upload-time = "2025-08-09T07:56:32.252Z" }, + { url = "https://files.pythonhosted.org/packages/f6/42/6f45efee8697b89fda4d50580f292b8f7f9306cb2971d4b53f8914e4d890/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:bd28b817ea8c70215401f657edef3a8aa83c29d447fb0b622c35403780ba11d5", size = 158187, upload-time = "2025-08-09T07:56:33.481Z" }, + { url = "https://files.pythonhosted.org/packages/70/99/f1c3bdcfaa9c45b3ce96f70b14f070411366fa19549c1d4832c935d8e2c3/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:18343b2d246dc6761a249ba1fb13f9ee9a2bcd95decc767319506056ea4ad4dc", size = 152699, upload-time = "2025-08-09T07:56:34.739Z" }, + { url = 
"https://files.pythonhosted.org/packages/a3/ad/b0081f2f99a4b194bcbb1934ef3b12aa4d9702ced80a37026b7607c72e58/charset_normalizer-3.4.3-cp313-cp313-win32.whl", hash = "sha256:6fb70de56f1859a3f71261cbe41005f56a7842cc348d3aeb26237560bfa5e0ce", size = 99580, upload-time = "2025-08-09T07:56:35.981Z" }, + { url = "https://files.pythonhosted.org/packages/9a/8f/ae790790c7b64f925e5c953b924aaa42a243fb778fed9e41f147b2a5715a/charset_normalizer-3.4.3-cp313-cp313-win_amd64.whl", hash = "sha256:cf1ebb7d78e1ad8ec2a8c4732c7be2e736f6e5123a4146c5b89c9d1f585f8cef", size = 107366, upload-time = "2025-08-09T07:56:37.339Z" }, + { url = "https://files.pythonhosted.org/packages/8e/91/b5a06ad970ddc7a0e513112d40113e834638f4ca1120eb727a249fb2715e/charset_normalizer-3.4.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3cd35b7e8aedeb9e34c41385fda4f73ba609e561faedfae0a9e75e44ac558a15", size = 204342, upload-time = "2025-08-09T07:56:38.687Z" }, + { url = "https://files.pythonhosted.org/packages/ce/ec/1edc30a377f0a02689342f214455c3f6c2fbedd896a1d2f856c002fc3062/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b89bc04de1d83006373429975f8ef9e7932534b8cc9ca582e4db7d20d91816db", size = 145995, upload-time = "2025-08-09T07:56:40.048Z" }, + { url = "https://files.pythonhosted.org/packages/17/e5/5e67ab85e6d22b04641acb5399c8684f4d37caf7558a53859f0283a650e9/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2001a39612b241dae17b4687898843f254f8748b796a2e16f1051a17078d991d", size = 158640, upload-time = "2025-08-09T07:56:41.311Z" }, + { url = "https://files.pythonhosted.org/packages/f1/e5/38421987f6c697ee3722981289d554957c4be652f963d71c5e46a262e135/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8dcfc373f888e4fb39a7bc57e93e3b845e7f462dacc008d9749568b1c4ece096", size = 156636, upload-time = "2025-08-09T07:56:43.195Z" }, + { url = "https://files.pythonhosted.org/packages/a0/e4/5a075de8daa3ec0745a9a3b54467e0c2967daaaf2cec04c845f73493e9a1/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:18b97b8404387b96cdbd30ad660f6407799126d26a39ca65729162fd810a99aa", size = 150939, upload-time = "2025-08-09T07:56:44.819Z" }, + { url = "https://files.pythonhosted.org/packages/02/f7/3611b32318b30974131db62b4043f335861d4d9b49adc6d57c1149cc49d4/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ccf600859c183d70eb47e05a44cd80a4ce77394d1ac0f79dbd2dd90a69a3a049", size = 148580, upload-time = "2025-08-09T07:56:46.684Z" }, + { url = "https://files.pythonhosted.org/packages/7e/61/19b36f4bd67f2793ab6a99b979b4e4f3d8fc754cbdffb805335df4337126/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:53cd68b185d98dde4ad8990e56a58dea83a4162161b1ea9272e5c9182ce415e0", size = 159870, upload-time = "2025-08-09T07:56:47.941Z" }, + { url = "https://files.pythonhosted.org/packages/06/57/84722eefdd338c04cf3030ada66889298eaedf3e7a30a624201e0cbe424a/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:30a96e1e1f865f78b030d65241c1ee850cdf422d869e9028e2fc1d5e4db73b92", size = 157797, upload-time = "2025-08-09T07:56:49.756Z" }, + { url = 
"https://files.pythonhosted.org/packages/72/2a/aff5dd112b2f14bcc3462c312dce5445806bfc8ab3a7328555da95330e4b/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d716a916938e03231e86e43782ca7878fb602a125a91e7acb8b5112e2e96ac16", size = 152224, upload-time = "2025-08-09T07:56:51.369Z" }, + { url = "https://files.pythonhosted.org/packages/b7/8c/9839225320046ed279c6e839d51f028342eb77c91c89b8ef2549f951f3ec/charset_normalizer-3.4.3-cp314-cp314-win32.whl", hash = "sha256:c6dbd0ccdda3a2ba7c2ecd9d77b37f3b5831687d8dc1b6ca5f56a4880cc7b7ce", size = 100086, upload-time = "2025-08-09T07:56:52.722Z" }, + { url = "https://files.pythonhosted.org/packages/ee/7a/36fbcf646e41f710ce0a563c1c9a343c6edf9be80786edeb15b6f62e17db/charset_normalizer-3.4.3-cp314-cp314-win_amd64.whl", hash = "sha256:73dc19b562516fc9bcf6e5d6e596df0b4eb98d87e4f79f3ae71840e6ed21361c", size = 107400, upload-time = "2025-08-09T07:56:55.172Z" }, + { url = "https://files.pythonhosted.org/packages/8a/1f/f041989e93b001bc4e44bb1669ccdcf54d3f00e628229a85b08d330615c5/charset_normalizer-3.4.3-py3-none-any.whl", hash = "sha256:ce571ab16d890d23b5c278547ba694193a45011ff86a9162a71307ed9f86759a", size = 53175, upload-time = "2025-08-09T07:57:26.864Z" }, +] + +[[package]] +name = "click" +version = "8.2.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" }, +] + +[[package]] +name = "cloudpickle" +version = "3.1.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/52/39/069100b84d7418bc358d81669d5748efb14b9cceacd2f9c75f550424132f/cloudpickle-3.1.1.tar.gz", hash = "sha256:b216fa8ae4019d5482a8ac3c95d8f6346115d8835911fd4aefd1a445e4242c64", size = 22113, upload-time = "2025-01-14T17:02:05.085Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/e8/64c37fadfc2816a7701fa8a6ed8d87327c7d54eacfbfb6edab14a2f2be75/cloudpickle-3.1.1-py3-none-any.whl", hash = "sha256:c8c5a44295039331ee9dad40ba100a9c7297b6f988e50e87ccdf3765a668350e", size = 20992, upload-time = "2025-01-14T17:02:02.417Z" }, +] + +[[package]] +name = "cognee" +version = "0.3.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiofiles" }, + { name = "aiohttp" }, + { name = "aiosqlite" }, + { name = "alembic" }, + { name = "baml-py" }, + { name = "dlt", extra = ["sqlalchemy"] }, + { name = "fastapi" }, + { name = "fastapi-users", extra = ["sqlalchemy"] }, + { name = "filetype" }, + { name = "instructor" }, + { name = "jinja2" }, + { name = "kuzu" }, + { name = "lancedb" }, + { name = "langfuse" }, + { name = "limits" }, + { name = "litellm" }, + { name = "matplotlib" }, + { name = "networkx" }, + { name = "nltk" }, + { name = "numpy" }, + { name = "onnxruntime" }, + { name = "openai" }, + { name = "pandas" }, + { name = "pre-commit" }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "pylance" 
}, + { name = "pympler" }, + { name = "pypdf" }, + { name = "python-dotenv" }, + { name = "python-magic-bin", marker = "sys_platform == 'win32'" }, + { name = "python-multipart" }, + { name = "rdflib" }, + { name = "s3fs", extra = ["boto3"] }, + { name = "scikit-learn" }, + { name = "sentry-sdk", extra = ["fastapi"] }, + { name = "sqlalchemy" }, + { name = "structlog" }, + { name = "tiktoken" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/60/b8/d3dff77c5cccf052deb401afcf441507445516eec79423606fb572d4817b/cognee-0.3.3.tar.gz", hash = "sha256:64b301625ab02d9a026fa64798fc075bf7cf6517b14514b4cee5b843454c6520", size = 14395369, upload-time = "2025-09-12T18:23:37.723Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/69/75/1df5946648dc4650771b3912d368e57192910ebdad9217dcd94d46bb4257/cognee-0.3.3-py3-none-any.whl", hash = "sha256:a595754f822b092573ebf655390ce8eb4c22d0500451b8c804dab14a9b61b774", size = 1515870, upload-time = "2025-09-12T18:23:19.031Z" }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, +] + +[[package]] +name = "coloredlogs" +version = "15.0.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "humanfriendly" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/cc/c7/eed8f27100517e8c0e6b923d5f0845d0cb99763da6fdee00478f91db7325/coloredlogs-15.0.1.tar.gz", hash = "sha256:7c991aa71a4577af2f82600d8f8f3a89f936baeaf9b50a9c197da014e5bf16b0", size = 278520, upload-time = "2021-06-11T10:22:45.202Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a7/06/3d6badcf13db419e25b07041d9c7b4a2c331d3f4e7134445ec5df57714cd/coloredlogs-15.0.1-py2.py3-none-any.whl", hash = "sha256:612ee75c546f53e92e70049c9dbfcc18c935a2b9a53b66085ce9ef6a6e5c0934", size = 46018, upload-time = "2021-06-11T10:22:42.561Z" }, +] + +[[package]] +name = "contourpy" +version = "1.3.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "numpy" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/58/01/1253e6698a07380cd31a736d248a3f2a50a7c88779a1813da27503cadc2a/contourpy-1.3.3.tar.gz", hash = "sha256:083e12155b210502d0bca491432bb04d56dc3432f95a979b429f2848c3dbe880", size = 13466174, upload-time = "2025-07-26T12:03:12.549Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/91/2e/c4390a31919d8a78b90e8ecf87cd4b4c4f05a5b48d05ec17db8e5404c6f4/contourpy-1.3.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:709a48ef9a690e1343202916450bc48b9e51c049b089c7f79a267b46cffcdaa1", size = 288773, upload-time = "2025-07-26T12:01:02.277Z" }, + { url = "https://files.pythonhosted.org/packages/0d/44/c4b0b6095fef4dc9c420e041799591e3b63e9619e3044f7f4f6c21c0ab24/contourpy-1.3.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:23416f38bfd74d5d28ab8429cc4d63fa67d5068bd711a85edb1c3fb0c3e2f381", size = 270149, upload-time = 
"2025-07-26T12:01:04.072Z" }, + { url = "https://files.pythonhosted.org/packages/30/2e/dd4ced42fefac8470661d7cb7e264808425e6c5d56d175291e93890cce09/contourpy-1.3.3-cp311-cp311-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:929ddf8c4c7f348e4c0a5a3a714b5c8542ffaa8c22954862a46ca1813b667ee7", size = 329222, upload-time = "2025-07-26T12:01:05.688Z" }, + { url = "https://files.pythonhosted.org/packages/f2/74/cc6ec2548e3d276c71389ea4802a774b7aa3558223b7bade3f25787fafc2/contourpy-1.3.3-cp311-cp311-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:9e999574eddae35f1312c2b4b717b7885d4edd6cb46700e04f7f02db454e67c1", size = 377234, upload-time = "2025-07-26T12:01:07.054Z" }, + { url = "https://files.pythonhosted.org/packages/03/b3/64ef723029f917410f75c09da54254c5f9ea90ef89b143ccadb09df14c15/contourpy-1.3.3-cp311-cp311-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0bf67e0e3f482cb69779dd3061b534eb35ac9b17f163d851e2a547d56dba0a3a", size = 380555, upload-time = "2025-07-26T12:01:08.801Z" }, + { url = "https://files.pythonhosted.org/packages/5f/4b/6157f24ca425b89fe2eb7e7be642375711ab671135be21e6faa100f7448c/contourpy-1.3.3-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:51e79c1f7470158e838808d4a996fa9bac72c498e93d8ebe5119bc1e6becb0db", size = 355238, upload-time = "2025-07-26T12:01:10.319Z" }, + { url = "https://files.pythonhosted.org/packages/98/56/f914f0dd678480708a04cfd2206e7c382533249bc5001eb9f58aa693e200/contourpy-1.3.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:598c3aaece21c503615fd59c92a3598b428b2f01bfb4b8ca9c4edeecc2438620", size = 1326218, upload-time = "2025-07-26T12:01:12.659Z" }, + { url = "https://files.pythonhosted.org/packages/fb/d7/4a972334a0c971acd5172389671113ae82aa7527073980c38d5868ff1161/contourpy-1.3.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:322ab1c99b008dad206d406bb61d014cf0174df491ae9d9d0fac6a6fda4f977f", size = 1392867, upload-time = "2025-07-26T12:01:15.533Z" }, + { url = "https://files.pythonhosted.org/packages/75/3e/f2cc6cd56dc8cff46b1a56232eabc6feea52720083ea71ab15523daab796/contourpy-1.3.3-cp311-cp311-win32.whl", hash = "sha256:fd907ae12cd483cd83e414b12941c632a969171bf90fc937d0c9f268a31cafff", size = 183677, upload-time = "2025-07-26T12:01:17.088Z" }, + { url = "https://files.pythonhosted.org/packages/98/4b/9bd370b004b5c9d8045c6c33cf65bae018b27aca550a3f657cdc99acdbd8/contourpy-1.3.3-cp311-cp311-win_amd64.whl", hash = "sha256:3519428f6be58431c56581f1694ba8e50626f2dd550af225f82fb5f5814d2a42", size = 225234, upload-time = "2025-07-26T12:01:18.256Z" }, + { url = "https://files.pythonhosted.org/packages/d9/b6/71771e02c2e004450c12b1120a5f488cad2e4d5b590b1af8bad060360fe4/contourpy-1.3.3-cp311-cp311-win_arm64.whl", hash = "sha256:15ff10bfada4bf92ec8b31c62bf7c1834c244019b4a33095a68000d7075df470", size = 193123, upload-time = "2025-07-26T12:01:19.848Z" }, + { url = "https://files.pythonhosted.org/packages/be/45/adfee365d9ea3d853550b2e735f9d66366701c65db7855cd07621732ccfc/contourpy-1.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b08a32ea2f8e42cf1d4be3169a98dd4be32bafe4f22b6c4cb4ba810fa9e5d2cb", size = 293419, upload-time = "2025-07-26T12:01:21.16Z" }, + { url = "https://files.pythonhosted.org/packages/53/3e/405b59cfa13021a56bba395a6b3aca8cec012b45bf177b0eaf7a202cde2c/contourpy-1.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:556dba8fb6f5d8742f2923fe9457dbdd51e1049c4a43fd3986a0b14a1d815fc6", size = 273979, upload-time = "2025-07-26T12:01:22.448Z" }, + { url = 
"https://files.pythonhosted.org/packages/d4/1c/a12359b9b2ca3a845e8f7f9ac08bdf776114eb931392fcad91743e2ea17b/contourpy-1.3.3-cp312-cp312-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92d9abc807cf7d0e047b95ca5d957cf4792fcd04e920ca70d48add15c1a90ea7", size = 332653, upload-time = "2025-07-26T12:01:24.155Z" }, + { url = "https://files.pythonhosted.org/packages/63/12/897aeebfb475b7748ea67b61e045accdfcf0d971f8a588b67108ed7f5512/contourpy-1.3.3-cp312-cp312-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b2e8faa0ed68cb29af51edd8e24798bb661eac3bd9f65420c1887b6ca89987c8", size = 379536, upload-time = "2025-07-26T12:01:25.91Z" }, + { url = "https://files.pythonhosted.org/packages/43/8a/a8c584b82deb248930ce069e71576fc09bd7174bbd35183b7943fb1064fd/contourpy-1.3.3-cp312-cp312-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:626d60935cf668e70a5ce6ff184fd713e9683fb458898e4249b63be9e28286ea", size = 384397, upload-time = "2025-07-26T12:01:27.152Z" }, + { url = "https://files.pythonhosted.org/packages/cc/8f/ec6289987824b29529d0dfda0d74a07cec60e54b9c92f3c9da4c0ac732de/contourpy-1.3.3-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4d00e655fcef08aba35ec9610536bfe90267d7ab5ba944f7032549c55a146da1", size = 362601, upload-time = "2025-07-26T12:01:28.808Z" }, + { url = "https://files.pythonhosted.org/packages/05/0a/a3fe3be3ee2dceb3e615ebb4df97ae6f3828aa915d3e10549ce016302bd1/contourpy-1.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:451e71b5a7d597379ef572de31eeb909a87246974d960049a9848c3bc6c41bf7", size = 1331288, upload-time = "2025-07-26T12:01:31.198Z" }, + { url = "https://files.pythonhosted.org/packages/33/1d/acad9bd4e97f13f3e2b18a3977fe1b4a37ecf3d38d815333980c6c72e963/contourpy-1.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:459c1f020cd59fcfe6650180678a9993932d80d44ccde1fa1868977438f0b411", size = 1403386, upload-time = "2025-07-26T12:01:33.947Z" }, + { url = "https://files.pythonhosted.org/packages/cf/8f/5847f44a7fddf859704217a99a23a4f6417b10e5ab1256a179264561540e/contourpy-1.3.3-cp312-cp312-win32.whl", hash = "sha256:023b44101dfe49d7d53932be418477dba359649246075c996866106da069af69", size = 185018, upload-time = "2025-07-26T12:01:35.64Z" }, + { url = "https://files.pythonhosted.org/packages/19/e8/6026ed58a64563186a9ee3f29f41261fd1828f527dd93d33b60feca63352/contourpy-1.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:8153b8bfc11e1e4d75bcb0bff1db232f9e10b274e0929de9d608027e0d34ff8b", size = 226567, upload-time = "2025-07-26T12:01:36.804Z" }, + { url = "https://files.pythonhosted.org/packages/d1/e2/f05240d2c39a1ed228d8328a78b6f44cd695f7ef47beb3e684cf93604f86/contourpy-1.3.3-cp312-cp312-win_arm64.whl", hash = "sha256:07ce5ed73ecdc4a03ffe3e1b3e3c1166db35ae7584be76f65dbbe28a7791b0cc", size = 193655, upload-time = "2025-07-26T12:01:37.999Z" }, + { url = "https://files.pythonhosted.org/packages/68/35/0167aad910bbdb9599272bd96d01a9ec6852f36b9455cf2ca67bd4cc2d23/contourpy-1.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:177fb367556747a686509d6fef71d221a4b198a3905fe824430e5ea0fda54eb5", size = 293257, upload-time = "2025-07-26T12:01:39.367Z" }, + { url = "https://files.pythonhosted.org/packages/96/e4/7adcd9c8362745b2210728f209bfbcf7d91ba868a2c5f40d8b58f54c509b/contourpy-1.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d002b6f00d73d69333dac9d0b8d5e84d9724ff9ef044fd63c5986e62b7c9e1b1", size = 274034, upload-time = "2025-07-26T12:01:40.645Z" }, + { url = 
"https://files.pythonhosted.org/packages/73/23/90e31ceeed1de63058a02cb04b12f2de4b40e3bef5e082a7c18d9c8ae281/contourpy-1.3.3-cp313-cp313-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:348ac1f5d4f1d66d3322420f01d42e43122f43616e0f194fc1c9f5d830c5b286", size = 334672, upload-time = "2025-07-26T12:01:41.942Z" }, + { url = "https://files.pythonhosted.org/packages/ed/93/b43d8acbe67392e659e1d984700e79eb67e2acb2bd7f62012b583a7f1b55/contourpy-1.3.3-cp313-cp313-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:655456777ff65c2c548b7c454af9c6f33f16c8884f11083244b5819cc214f1b5", size = 381234, upload-time = "2025-07-26T12:01:43.499Z" }, + { url = "https://files.pythonhosted.org/packages/46/3b/bec82a3ea06f66711520f75a40c8fc0b113b2a75edb36aa633eb11c4f50f/contourpy-1.3.3-cp313-cp313-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:644a6853d15b2512d67881586bd03f462c7ab755db95f16f14d7e238f2852c67", size = 385169, upload-time = "2025-07-26T12:01:45.219Z" }, + { url = "https://files.pythonhosted.org/packages/4b/32/e0f13a1c5b0f8572d0ec6ae2f6c677b7991fafd95da523159c19eff0696a/contourpy-1.3.3-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4debd64f124ca62069f313a9cb86656ff087786016d76927ae2cf37846b006c9", size = 362859, upload-time = "2025-07-26T12:01:46.519Z" }, + { url = "https://files.pythonhosted.org/packages/33/71/e2a7945b7de4e58af42d708a219f3b2f4cff7386e6b6ab0a0fa0033c49a9/contourpy-1.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a15459b0f4615b00bbd1e91f1b9e19b7e63aea7483d03d804186f278c0af2659", size = 1332062, upload-time = "2025-07-26T12:01:48.964Z" }, + { url = "https://files.pythonhosted.org/packages/12/fc/4e87ac754220ccc0e807284f88e943d6d43b43843614f0a8afa469801db0/contourpy-1.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ca0fdcd73925568ca027e0b17ab07aad764be4706d0a925b89227e447d9737b7", size = 1403932, upload-time = "2025-07-26T12:01:51.979Z" }, + { url = "https://files.pythonhosted.org/packages/a6/2e/adc197a37443f934594112222ac1aa7dc9a98faf9c3842884df9a9d8751d/contourpy-1.3.3-cp313-cp313-win32.whl", hash = "sha256:b20c7c9a3bf701366556e1b1984ed2d0cedf999903c51311417cf5f591d8c78d", size = 185024, upload-time = "2025-07-26T12:01:53.245Z" }, + { url = "https://files.pythonhosted.org/packages/18/0b/0098c214843213759692cc638fce7de5c289200a830e5035d1791d7a2338/contourpy-1.3.3-cp313-cp313-win_amd64.whl", hash = "sha256:1cadd8b8969f060ba45ed7c1b714fe69185812ab43bd6b86a9123fe8f99c3263", size = 226578, upload-time = "2025-07-26T12:01:54.422Z" }, + { url = "https://files.pythonhosted.org/packages/8a/9a/2f6024a0c5995243cd63afdeb3651c984f0d2bc727fd98066d40e141ad73/contourpy-1.3.3-cp313-cp313-win_arm64.whl", hash = "sha256:fd914713266421b7536de2bfa8181aa8c699432b6763a0ea64195ebe28bff6a9", size = 193524, upload-time = "2025-07-26T12:01:55.73Z" }, + { url = "https://files.pythonhosted.org/packages/c0/b3/f8a1a86bd3298513f500e5b1f5fd92b69896449f6cab6a146a5d52715479/contourpy-1.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:88df9880d507169449d434c293467418b9f6cbe82edd19284aa0409e7fdb933d", size = 306730, upload-time = "2025-07-26T12:01:57.051Z" }, + { url = "https://files.pythonhosted.org/packages/3f/11/4780db94ae62fc0c2053909b65dc3246bd7cecfc4f8a20d957ad43aa4ad8/contourpy-1.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:d06bb1f751ba5d417047db62bca3c8fde202b8c11fb50742ab3ab962c81e8216", size = 287897, upload-time = "2025-07-26T12:01:58.663Z" }, + { url = 
"https://files.pythonhosted.org/packages/ae/15/e59f5f3ffdd6f3d4daa3e47114c53daabcb18574a26c21f03dc9e4e42ff0/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e4e6b05a45525357e382909a4c1600444e2a45b4795163d3b22669285591c1ae", size = 326751, upload-time = "2025-07-26T12:02:00.343Z" }, + { url = "https://files.pythonhosted.org/packages/0f/81/03b45cfad088e4770b1dcf72ea78d3802d04200009fb364d18a493857210/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ab3074b48c4e2cf1a960e6bbeb7f04566bf36b1861d5c9d4d8ac04b82e38ba20", size = 375486, upload-time = "2025-07-26T12:02:02.128Z" }, + { url = "https://files.pythonhosted.org/packages/0c/ba/49923366492ffbdd4486e970d421b289a670ae8cf539c1ea9a09822b371a/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6c3d53c796f8647d6deb1abe867daeb66dcc8a97e8455efa729516b997b8ed99", size = 388106, upload-time = "2025-07-26T12:02:03.615Z" }, + { url = "https://files.pythonhosted.org/packages/9f/52/5b00ea89525f8f143651f9f03a0df371d3cbd2fccd21ca9b768c7a6500c2/contourpy-1.3.3-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:50ed930df7289ff2a8d7afeb9603f8289e5704755c7e5c3bbd929c90c817164b", size = 352548, upload-time = "2025-07-26T12:02:05.165Z" }, + { url = "https://files.pythonhosted.org/packages/32/1d/a209ec1a3a3452d490f6b14dd92e72280c99ae3d1e73da74f8277d4ee08f/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4feffb6537d64b84877da813a5c30f1422ea5739566abf0bd18065ac040e120a", size = 1322297, upload-time = "2025-07-26T12:02:07.379Z" }, + { url = "https://files.pythonhosted.org/packages/bc/9e/46f0e8ebdd884ca0e8877e46a3f4e633f6c9c8c4f3f6e72be3fe075994aa/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:2b7e9480ffe2b0cd2e787e4df64270e3a0440d9db8dc823312e2c940c167df7e", size = 1391023, upload-time = "2025-07-26T12:02:10.171Z" }, + { url = "https://files.pythonhosted.org/packages/b9/70/f308384a3ae9cd2209e0849f33c913f658d3326900d0ff5d378d6a1422d2/contourpy-1.3.3-cp313-cp313t-win32.whl", hash = "sha256:283edd842a01e3dcd435b1c5116798d661378d83d36d337b8dde1d16a5fc9ba3", size = 196157, upload-time = "2025-07-26T12:02:11.488Z" }, + { url = "https://files.pythonhosted.org/packages/b2/dd/880f890a6663b84d9e34a6f88cded89d78f0091e0045a284427cb6b18521/contourpy-1.3.3-cp313-cp313t-win_amd64.whl", hash = "sha256:87acf5963fc2b34825e5b6b048f40e3635dd547f590b04d2ab317c2619ef7ae8", size = 240570, upload-time = "2025-07-26T12:02:12.754Z" }, + { url = "https://files.pythonhosted.org/packages/80/99/2adc7d8ffead633234817ef8e9a87115c8a11927a94478f6bb3d3f4d4f7d/contourpy-1.3.3-cp313-cp313t-win_arm64.whl", hash = "sha256:3c30273eb2a55024ff31ba7d052dde990d7d8e5450f4bbb6e913558b3d6c2301", size = 199713, upload-time = "2025-07-26T12:02:14.4Z" }, + { url = "https://files.pythonhosted.org/packages/72/8b/4546f3ab60f78c514ffb7d01a0bd743f90de36f0019d1be84d0a708a580a/contourpy-1.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fde6c716d51c04b1c25d0b90364d0be954624a0ee9d60e23e850e8d48353d07a", size = 292189, upload-time = "2025-07-26T12:02:16.095Z" }, + { url = "https://files.pythonhosted.org/packages/fd/e1/3542a9cb596cadd76fcef413f19c79216e002623158befe6daa03dbfa88c/contourpy-1.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:cbedb772ed74ff5be440fa8eee9bd49f64f6e3fc09436d9c7d8f1c287b121d77", size = 273251, upload-time = "2025-07-26T12:02:17.524Z" }, + { url = 
"https://files.pythonhosted.org/packages/b1/71/f93e1e9471d189f79d0ce2497007731c1e6bf9ef6d1d61b911430c3db4e5/contourpy-1.3.3-cp314-cp314-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:22e9b1bd7a9b1d652cd77388465dc358dafcd2e217d35552424aa4f996f524f5", size = 335810, upload-time = "2025-07-26T12:02:18.9Z" }, + { url = "https://files.pythonhosted.org/packages/91/f9/e35f4c1c93f9275d4e38681a80506b5510e9327350c51f8d4a5a724d178c/contourpy-1.3.3-cp314-cp314-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a22738912262aa3e254e4f3cb079a95a67132fc5a063890e224393596902f5a4", size = 382871, upload-time = "2025-07-26T12:02:20.418Z" }, + { url = "https://files.pythonhosted.org/packages/b5/71/47b512f936f66a0a900d81c396a7e60d73419868fba959c61efed7a8ab46/contourpy-1.3.3-cp314-cp314-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:afe5a512f31ee6bd7d0dda52ec9864c984ca3d66664444f2d72e0dc4eb832e36", size = 386264, upload-time = "2025-07-26T12:02:21.916Z" }, + { url = "https://files.pythonhosted.org/packages/04/5f/9ff93450ba96b09c7c2b3f81c94de31c89f92292f1380261bd7195bea4ea/contourpy-1.3.3-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f64836de09927cba6f79dcd00fdd7d5329f3fccc633468507079c829ca4db4e3", size = 363819, upload-time = "2025-07-26T12:02:23.759Z" }, + { url = "https://files.pythonhosted.org/packages/3e/a6/0b185d4cc480ee494945cde102cb0149ae830b5fa17bf855b95f2e70ad13/contourpy-1.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:1fd43c3be4c8e5fd6e4f2baeae35ae18176cf2e5cced681cca908addf1cdd53b", size = 1333650, upload-time = "2025-07-26T12:02:26.181Z" }, + { url = "https://files.pythonhosted.org/packages/43/d7/afdc95580ca56f30fbcd3060250f66cedbde69b4547028863abd8aa3b47e/contourpy-1.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:6afc576f7b33cf00996e5c1102dc2a8f7cc89e39c0b55df93a0b78c1bd992b36", size = 1404833, upload-time = "2025-07-26T12:02:28.782Z" }, + { url = "https://files.pythonhosted.org/packages/e2/e2/366af18a6d386f41132a48f033cbd2102e9b0cf6345d35ff0826cd984566/contourpy-1.3.3-cp314-cp314-win32.whl", hash = "sha256:66c8a43a4f7b8df8b71ee1840e4211a3c8d93b214b213f590e18a1beca458f7d", size = 189692, upload-time = "2025-07-26T12:02:30.128Z" }, + { url = "https://files.pythonhosted.org/packages/7d/c2/57f54b03d0f22d4044b8afb9ca0e184f8b1afd57b4f735c2fa70883dc601/contourpy-1.3.3-cp314-cp314-win_amd64.whl", hash = "sha256:cf9022ef053f2694e31d630feaacb21ea24224be1c3ad0520b13d844274614fd", size = 232424, upload-time = "2025-07-26T12:02:31.395Z" }, + { url = "https://files.pythonhosted.org/packages/18/79/a9416650df9b525737ab521aa181ccc42d56016d2123ddcb7b58e926a42c/contourpy-1.3.3-cp314-cp314-win_arm64.whl", hash = "sha256:95b181891b4c71de4bb404c6621e7e2390745f887f2a026b2d99e92c17892339", size = 198300, upload-time = "2025-07-26T12:02:32.956Z" }, + { url = "https://files.pythonhosted.org/packages/1f/42/38c159a7d0f2b7b9c04c64ab317042bb6952b713ba875c1681529a2932fe/contourpy-1.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:33c82d0138c0a062380332c861387650c82e4cf1747aaa6938b9b6516762e772", size = 306769, upload-time = "2025-07-26T12:02:34.2Z" }, + { url = "https://files.pythonhosted.org/packages/c3/6c/26a8205f24bca10974e77460de68d3d7c63e282e23782f1239f226fcae6f/contourpy-1.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:ea37e7b45949df430fe649e5de8351c423430046a2af20b1c1961cae3afcda77", size = 287892, upload-time = "2025-07-26T12:02:35.807Z" }, + { url = 
"https://files.pythonhosted.org/packages/66/06/8a475c8ab718ebfd7925661747dbb3c3ee9c82ac834ccb3570be49d129f4/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d304906ecc71672e9c89e87c4675dc5c2645e1f4269a5063b99b0bb29f232d13", size = 326748, upload-time = "2025-07-26T12:02:37.193Z" }, + { url = "https://files.pythonhosted.org/packages/b4/a3/c5ca9f010a44c223f098fccd8b158bb1cb287378a31ac141f04730dc49be/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ca658cd1a680a5c9ea96dc61cdbae1e85c8f25849843aa799dfd3cb370ad4fbe", size = 375554, upload-time = "2025-07-26T12:02:38.894Z" }, + { url = "https://files.pythonhosted.org/packages/80/5b/68bd33ae63fac658a4145088c1e894405e07584a316738710b636c6d0333/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ab2fd90904c503739a75b7c8c5c01160130ba67944a7b77bbf36ef8054576e7f", size = 388118, upload-time = "2025-07-26T12:02:40.642Z" }, + { url = "https://files.pythonhosted.org/packages/40/52/4c285a6435940ae25d7410a6c36bda5145839bc3f0beb20c707cda18b9d2/contourpy-1.3.3-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b7301b89040075c30e5768810bc96a8e8d78085b47d8be6e4c3f5a0b4ed478a0", size = 352555, upload-time = "2025-07-26T12:02:42.25Z" }, + { url = "https://files.pythonhosted.org/packages/24/ee/3e81e1dd174f5c7fefe50e85d0892de05ca4e26ef1c9a59c2a57e43b865a/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:2a2a8b627d5cc6b7c41a4beff6c5ad5eb848c88255fda4a8745f7e901b32d8e4", size = 1322295, upload-time = "2025-07-26T12:02:44.668Z" }, + { url = "https://files.pythonhosted.org/packages/3c/b2/6d913d4d04e14379de429057cd169e5e00f6c2af3bb13e1710bcbdb5da12/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fd6ec6be509c787f1caf6b247f0b1ca598bef13f4ddeaa126b7658215529ba0f", size = 1391027, upload-time = "2025-07-26T12:02:47.09Z" }, + { url = "https://files.pythonhosted.org/packages/93/8a/68a4ec5c55a2971213d29a9374913f7e9f18581945a7a31d1a39b5d2dfe5/contourpy-1.3.3-cp314-cp314t-win32.whl", hash = "sha256:e74a9a0f5e3fff48fb5a7f2fd2b9b70a3fe014a67522f79b7cca4c0c7e43c9ae", size = 202428, upload-time = "2025-07-26T12:02:48.691Z" }, + { url = "https://files.pythonhosted.org/packages/fa/96/fd9f641ffedc4fa3ace923af73b9d07e869496c9cc7a459103e6e978992f/contourpy-1.3.3-cp314-cp314t-win_amd64.whl", hash = "sha256:13b68d6a62db8eafaebb8039218921399baf6e47bf85006fd8529f2a08ef33fc", size = 250331, upload-time = "2025-07-26T12:02:50.137Z" }, + { url = "https://files.pythonhosted.org/packages/ae/8c/469afb6465b853afff216f9528ffda78a915ff880ed58813ba4faf4ba0b6/contourpy-1.3.3-cp314-cp314t-win_arm64.whl", hash = "sha256:b7448cb5a725bb1e35ce88771b86fba35ef418952474492cf7c764059933ff8b", size = 203831, upload-time = "2025-07-26T12:02:51.449Z" }, + { url = "https://files.pythonhosted.org/packages/a5/29/8dcfe16f0107943fa92388c23f6e05cff0ba58058c4c95b00280d4c75a14/contourpy-1.3.3-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:cd5dfcaeb10f7b7f9dc8941717c6c2ade08f587be2226222c12b25f0483ed497", size = 278809, upload-time = "2025-07-26T12:02:52.74Z" }, + { url = "https://files.pythonhosted.org/packages/85/a9/8b37ef4f7dafeb335daee3c8254645ef5725be4d9c6aa70b50ec46ef2f7e/contourpy-1.3.3-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:0c1fc238306b35f246d61a1d416a627348b5cf0648648a031e14bb8705fcdfe8", size = 261593, upload-time = "2025-07-26T12:02:54.037Z" }, + { url = 
"https://files.pythonhosted.org/packages/0a/59/ebfb8c677c75605cc27f7122c90313fd2f375ff3c8d19a1694bda74aaa63/contourpy-1.3.3-pp311-pypy311_pp73-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:70f9aad7de812d6541d29d2bbf8feb22ff7e1c299523db288004e3157ff4674e", size = 302202, upload-time = "2025-07-26T12:02:55.947Z" }, + { url = "https://files.pythonhosted.org/packages/3c/37/21972a15834d90bfbfb009b9d004779bd5a07a0ec0234e5ba8f64d5736f4/contourpy-1.3.3-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5ed3657edf08512fc3fe81b510e35c2012fbd3081d2e26160f27ca28affec989", size = 329207, upload-time = "2025-07-26T12:02:57.468Z" }, + { url = "https://files.pythonhosted.org/packages/0c/58/bd257695f39d05594ca4ad60df5bcb7e32247f9951fd09a9b8edb82d1daa/contourpy-1.3.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:3d1a3799d62d45c18bafd41c5fa05120b96a28079f2393af559b843d1a966a77", size = 225315, upload-time = "2025-07-26T12:02:58.801Z" }, +] + +[[package]] +name = "cryptography" +version = "46.0.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi", marker = "platform_python_implementation != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a9/62/e3664e6ffd7743e1694b244dde70b43a394f6f7fbcacf7014a8ff5197c73/cryptography-46.0.1.tar.gz", hash = "sha256:ed570874e88f213437f5cf758f9ef26cbfc3f336d889b1e592ee11283bb8d1c7", size = 749198, upload-time = "2025-09-17T00:10:35.797Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4c/8c/44ee01267ec01e26e43ebfdae3f120ec2312aa72fa4c0507ebe41a26739f/cryptography-46.0.1-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:1cd6d50c1a8b79af1a6f703709d8973845f677c8e97b1268f5ff323d38ce8475", size = 7285044, upload-time = "2025-09-17T00:08:36.807Z" }, + { url = "https://files.pythonhosted.org/packages/22/59/9ae689a25047e0601adfcb159ec4f83c0b4149fdb5c3030cc94cd218141d/cryptography-46.0.1-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0ff483716be32690c14636e54a1f6e2e1b7bf8e22ca50b989f88fa1b2d287080", size = 4308182, upload-time = "2025-09-17T00:08:39.388Z" }, + { url = "https://files.pythonhosted.org/packages/c4/ee/ca6cc9df7118f2fcd142c76b1da0f14340d77518c05b1ebfbbabca6b9e7d/cryptography-46.0.1-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9873bf7c1f2a6330bdfe8621e7ce64b725784f9f0c3a6a55c3047af5849f920e", size = 4572393, upload-time = "2025-09-17T00:08:41.663Z" }, + { url = "https://files.pythonhosted.org/packages/7f/a3/0f5296f63815d8e985922b05c31f77ce44787b3127a67c0b7f70f115c45f/cryptography-46.0.1-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:0dfb7c88d4462a0cfdd0d87a3c245a7bc3feb59de101f6ff88194f740f72eda6", size = 4308400, upload-time = "2025-09-17T00:08:43.559Z" }, + { url = "https://files.pythonhosted.org/packages/5d/8c/74fcda3e4e01be1d32775d5b4dd841acaac3c1b8fa4d0774c7ac8d52463d/cryptography-46.0.1-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e22801b61613ebdebf7deb18b507919e107547a1d39a3b57f5f855032dd7cfb8", size = 4015786, upload-time = "2025-09-17T00:08:45.758Z" }, + { url = "https://files.pythonhosted.org/packages/dc/b8/85d23287baeef273b0834481a3dd55bbed3a53587e3b8d9f0898235b8f91/cryptography-46.0.1-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:757af4f6341ce7a1e47c326ca2a81f41d236070217e5fbbad61bbfe299d55d28", size = 4982606, upload-time = "2025-09-17T00:08:47.602Z" }, + { url = 
"https://files.pythonhosted.org/packages/e5/d3/de61ad5b52433b389afca0bc70f02a7a1f074651221f599ce368da0fe437/cryptography-46.0.1-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f7a24ea78de345cfa7f6a8d3bde8b242c7fac27f2bd78fa23474ca38dfaeeab9", size = 4604234, upload-time = "2025-09-17T00:08:49.879Z" }, + { url = "https://files.pythonhosted.org/packages/dc/1f/dbd4d6570d84748439237a7478d124ee0134bf166ad129267b7ed8ea6d22/cryptography-46.0.1-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:9e8776dac9e660c22241b6587fae51a67b4b0147daa4d176b172c3ff768ad736", size = 4307669, upload-time = "2025-09-17T00:08:52.321Z" }, + { url = "https://files.pythonhosted.org/packages/ec/fd/ca0a14ce7f0bfe92fa727aacaf2217eb25eb7e4ed513b14d8e03b26e63ed/cryptography-46.0.1-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:9f40642a140c0c8649987027867242b801486865277cbabc8c6059ddef16dc8b", size = 4947579, upload-time = "2025-09-17T00:08:54.697Z" }, + { url = "https://files.pythonhosted.org/packages/89/6b/09c30543bb93401f6f88fce556b3bdbb21e55ae14912c04b7bf355f5f96c/cryptography-46.0.1-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:449ef2b321bec7d97ef2c944173275ebdab78f3abdd005400cc409e27cd159ab", size = 4603669, upload-time = "2025-09-17T00:08:57.16Z" }, + { url = "https://files.pythonhosted.org/packages/23/9a/38cb01cb09ce0adceda9fc627c9cf98eb890fc8d50cacbe79b011df20f8a/cryptography-46.0.1-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:2dd339ba3345b908fa3141ddba4025568fa6fd398eabce3ef72a29ac2d73ad75", size = 4435828, upload-time = "2025-09-17T00:08:59.606Z" }, + { url = "https://files.pythonhosted.org/packages/0f/53/435b5c36a78d06ae0bef96d666209b0ecd8f8181bfe4dda46536705df59e/cryptography-46.0.1-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:7411c910fb2a412053cf33cfad0153ee20d27e256c6c3f14d7d7d1d9fec59fd5", size = 4709553, upload-time = "2025-09-17T00:09:01.832Z" }, + { url = "https://files.pythonhosted.org/packages/f5/c4/0da6e55595d9b9cd3b6eb5dc22f3a07ded7f116a3ea72629cab595abb804/cryptography-46.0.1-cp311-abi3-win32.whl", hash = "sha256:cbb8e769d4cac884bb28e3ff620ef1001b75588a5c83c9c9f1fdc9afbe7f29b0", size = 3058327, upload-time = "2025-09-17T00:09:03.726Z" }, + { url = "https://files.pythonhosted.org/packages/95/0f/cd29a35e0d6e78a0ee61793564c8cff0929c38391cb0de27627bdc7525aa/cryptography-46.0.1-cp311-abi3-win_amd64.whl", hash = "sha256:92e8cfe8bd7dd86eac0a677499894862cd5cc2fd74de917daa881d00871ac8e7", size = 3523893, upload-time = "2025-09-17T00:09:06.272Z" }, + { url = "https://files.pythonhosted.org/packages/f2/dd/eea390f3e78432bc3d2f53952375f8b37cb4d37783e626faa6a51e751719/cryptography-46.0.1-cp311-abi3-win_arm64.whl", hash = "sha256:db5597a4c7353b2e5fb05a8e6cb74b56a4658a2b7bf3cb6b1821ae7e7fd6eaa0", size = 2932145, upload-time = "2025-09-17T00:09:08.568Z" }, + { url = "https://files.pythonhosted.org/packages/0a/fb/c73588561afcd5e24b089952bd210b14676c0c5bf1213376350ae111945c/cryptography-46.0.1-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:4c49eda9a23019e11d32a0eb51a27b3e7ddedde91e099c0ac6373e3aacc0d2ee", size = 7193928, upload-time = "2025-09-17T00:09:10.595Z" }, + { url = "https://files.pythonhosted.org/packages/26/34/0ff0bb2d2c79f25a2a63109f3b76b9108a906dd2a2eb5c1d460b9938adbb/cryptography-46.0.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:9babb7818fdd71394e576cf26c5452df77a355eac1a27ddfa24096665a27f8fd", size = 4293515, upload-time = "2025-09-17T00:09:12.861Z" }, + { url = 
"https://files.pythonhosted.org/packages/df/b7/d4f848aee24ecd1be01db6c42c4a270069a4f02a105d9c57e143daf6cf0f/cryptography-46.0.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9f2c4cc63be3ef43c0221861177cee5d14b505cd4d4599a89e2cd273c4d3542a", size = 4545619, upload-time = "2025-09-17T00:09:15.397Z" }, + { url = "https://files.pythonhosted.org/packages/44/a5/42fedefc754fd1901e2d95a69815ea4ec8a9eed31f4c4361fcab80288661/cryptography-46.0.1-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:41c281a74df173876da1dc9a9b6953d387f06e3d3ed9284e3baae3ab3f40883a", size = 4299160, upload-time = "2025-09-17T00:09:17.155Z" }, + { url = "https://files.pythonhosted.org/packages/86/a1/cd21174f56e769c831fbbd6399a1b7519b0ff6280acec1b826d7b072640c/cryptography-46.0.1-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0a17377fa52563d730248ba1f68185461fff36e8bc75d8787a7dd2e20a802b7a", size = 3994491, upload-time = "2025-09-17T00:09:18.971Z" }, + { url = "https://files.pythonhosted.org/packages/8d/2f/a8cbfa1c029987ddc746fd966711d4fa71efc891d37fbe9f030fe5ab4eec/cryptography-46.0.1-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:0d1922d9280e08cde90b518a10cd66831f632960a8d08cb3418922d83fce6f12", size = 4960157, upload-time = "2025-09-17T00:09:20.923Z" }, + { url = "https://files.pythonhosted.org/packages/67/ae/63a84e6789e0d5a2502edf06b552bcb0fa9ff16147265d5c44a211942abe/cryptography-46.0.1-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:af84e8e99f1a82cea149e253014ea9dc89f75b82c87bb6c7242203186f465129", size = 4577263, upload-time = "2025-09-17T00:09:23.356Z" }, + { url = "https://files.pythonhosted.org/packages/ef/8f/1b9fa8e92bd9cbcb3b7e1e593a5232f2c1e6f9bd72b919c1a6b37d315f92/cryptography-46.0.1-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:ef648d2c690703501714588b2ba640facd50fd16548133b11b2859e8655a69da", size = 4298703, upload-time = "2025-09-17T00:09:25.566Z" }, + { url = "https://files.pythonhosted.org/packages/c3/af/bb95db070e73fea3fae31d8a69ac1463d89d1c084220f549b00dd01094a8/cryptography-46.0.1-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:e94eb5fa32a8a9f9bf991f424f002913e3dd7c699ef552db9b14ba6a76a6313b", size = 4926363, upload-time = "2025-09-17T00:09:27.451Z" }, + { url = "https://files.pythonhosted.org/packages/f5/3b/d8fb17ffeb3a83157a1cc0aa5c60691d062aceecba09c2e5e77ebfc1870c/cryptography-46.0.1-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:534b96c0831855e29fc3b069b085fd185aa5353033631a585d5cd4dd5d40d657", size = 4576958, upload-time = "2025-09-17T00:09:29.924Z" }, + { url = "https://files.pythonhosted.org/packages/d9/46/86bc3a05c10c8aa88c8ae7e953a8b4e407c57823ed201dbcba55c4d655f4/cryptography-46.0.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:f9b55038b5c6c47559aa33626d8ecd092f354e23de3c6975e4bb205df128a2a0", size = 4422507, upload-time = "2025-09-17T00:09:32.222Z" }, + { url = "https://files.pythonhosted.org/packages/a8/4e/387e5a21dfd2b4198e74968a541cfd6128f66f8ec94ed971776e15091ac3/cryptography-46.0.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ec13b7105117dbc9afd023300fb9954d72ca855c274fe563e72428ece10191c0", size = 4683964, upload-time = "2025-09-17T00:09:34.118Z" }, + { url = "https://files.pythonhosted.org/packages/25/a3/f9f5907b166adb8f26762071474b38bbfcf89858a5282f032899075a38a1/cryptography-46.0.1-cp314-cp314t-win32.whl", hash = "sha256:504e464944f2c003a0785b81668fe23c06f3b037e9cb9f68a7c672246319f277", size = 3029705, upload-time = "2025-09-17T00:09:36.381Z" }, + { 
url = "https://files.pythonhosted.org/packages/12/66/4d3a4f1850db2e71c2b1628d14b70b5e4c1684a1bd462f7fffb93c041c38/cryptography-46.0.1-cp314-cp314t-win_amd64.whl", hash = "sha256:c52fded6383f7e20eaf70a60aeddd796b3677c3ad2922c801be330db62778e05", size = 3502175, upload-time = "2025-09-17T00:09:38.261Z" }, + { url = "https://files.pythonhosted.org/packages/52/c7/9f10ad91435ef7d0d99a0b93c4360bea3df18050ff5b9038c489c31ac2f5/cryptography-46.0.1-cp314-cp314t-win_arm64.whl", hash = "sha256:9495d78f52c804b5ec8878b5b8c7873aa8e63db9cd9ee387ff2db3fffe4df784", size = 2912354, upload-time = "2025-09-17T00:09:40.078Z" }, + { url = "https://files.pythonhosted.org/packages/98/e5/fbd632385542a3311915976f88e0dfcf09e62a3fc0aff86fb6762162a24d/cryptography-46.0.1-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:d84c40bdb8674c29fa192373498b6cb1e84f882889d21a471b45d1f868d8d44b", size = 7255677, upload-time = "2025-09-17T00:09:42.407Z" }, + { url = "https://files.pythonhosted.org/packages/56/3e/13ce6eab9ad6eba1b15a7bd476f005a4c1b3f299f4c2f32b22408b0edccf/cryptography-46.0.1-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:9ed64e5083fa806709e74fc5ea067dfef9090e5b7a2320a49be3c9df3583a2d8", size = 4301110, upload-time = "2025-09-17T00:09:45.614Z" }, + { url = "https://files.pythonhosted.org/packages/a2/67/65dc233c1ddd688073cf7b136b06ff4b84bf517ba5529607c9d79720fc67/cryptography-46.0.1-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:341fb7a26bc9d6093c1b124b9f13acc283d2d51da440b98b55ab3f79f2522ead", size = 4562369, upload-time = "2025-09-17T00:09:47.601Z" }, + { url = "https://files.pythonhosted.org/packages/17/db/d64ae4c6f4e98c3dac5bf35dd4d103f4c7c345703e43560113e5e8e31b2b/cryptography-46.0.1-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:6ef1488967e729948d424d09c94753d0167ce59afba8d0f6c07a22b629c557b2", size = 4302126, upload-time = "2025-09-17T00:09:49.335Z" }, + { url = "https://files.pythonhosted.org/packages/3d/19/5f1eea17d4805ebdc2e685b7b02800c4f63f3dd46cfa8d4c18373fea46c8/cryptography-46.0.1-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7823bc7cdf0b747ecfb096d004cc41573c2f5c7e3a29861603a2871b43d3ef32", size = 4009431, upload-time = "2025-09-17T00:09:51.239Z" }, + { url = "https://files.pythonhosted.org/packages/81/b5/229ba6088fe7abccbfe4c5edb96c7a5ad547fac5fdd0d40aa6ea540b2985/cryptography-46.0.1-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:f736ab8036796f5a119ff8211deda416f8c15ce03776db704a7a4e17381cb2ef", size = 4980739, upload-time = "2025-09-17T00:09:54.181Z" }, + { url = "https://files.pythonhosted.org/packages/3a/9c/50aa38907b201e74bc43c572f9603fa82b58e831bd13c245613a23cff736/cryptography-46.0.1-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:e46710a240a41d594953012213ea8ca398cd2448fbc5d0f1be8160b5511104a0", size = 4592289, upload-time = "2025-09-17T00:09:56.731Z" }, + { url = "https://files.pythonhosted.org/packages/5a/33/229858f8a5bb22f82468bb285e9f4c44a31978d5f5830bb4ea1cf8a4e454/cryptography-46.0.1-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:84ef1f145de5aee82ea2447224dc23f065ff4cc5791bb3b506615957a6ba8128", size = 4301815, upload-time = "2025-09-17T00:09:58.548Z" }, + { url = "https://files.pythonhosted.org/packages/52/cb/b76b2c87fbd6ed4a231884bea3ce073406ba8e2dae9defad910d33cbf408/cryptography-46.0.1-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:9394c7d5a7565ac5f7d9ba38b2617448eba384d7b107b262d63890079fad77ca", size = 4943251, upload-time = "2025-09-17T00:10:00.475Z" }, + { url 
= "https://files.pythonhosted.org/packages/94/0f/f66125ecf88e4cb5b8017ff43f3a87ede2d064cb54a1c5893f9da9d65093/cryptography-46.0.1-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:ed957044e368ed295257ae3d212b95456bd9756df490e1ac4538857f67531fcc", size = 4591247, upload-time = "2025-09-17T00:10:02.874Z" }, + { url = "https://files.pythonhosted.org/packages/f6/22/9f3134ae436b63b463cfdf0ff506a0570da6873adb4bf8c19b8a5b4bac64/cryptography-46.0.1-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:f7de12fa0eee6234de9a9ce0ffcfa6ce97361db7a50b09b65c63ac58e5f22fc7", size = 4428534, upload-time = "2025-09-17T00:10:04.994Z" }, + { url = "https://files.pythonhosted.org/packages/89/39/e6042bcb2638650b0005c752c38ea830cbfbcbb1830e4d64d530000aa8dc/cryptography-46.0.1-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:7fab1187b6c6b2f11a326f33b036f7168f5b996aedd0c059f9738915e4e8f53a", size = 4699541, upload-time = "2025-09-17T00:10:06.925Z" }, + { url = "https://files.pythonhosted.org/packages/68/46/753d457492d15458c7b5a653fc9a84a1c9c7a83af6ebdc94c3fc373ca6e8/cryptography-46.0.1-cp38-abi3-win32.whl", hash = "sha256:45f790934ac1018adeba46a0f7289b2b8fe76ba774a88c7f1922213a56c98bc1", size = 3043779, upload-time = "2025-09-17T00:10:08.951Z" }, + { url = "https://files.pythonhosted.org/packages/2f/50/b6f3b540c2f6ee712feeb5fa780bb11fad76634e71334718568e7695cb55/cryptography-46.0.1-cp38-abi3-win_amd64.whl", hash = "sha256:7176a5ab56fac98d706921f6416a05e5aff7df0e4b91516f450f8627cda22af3", size = 3517226, upload-time = "2025-09-17T00:10:10.769Z" }, + { url = "https://files.pythonhosted.org/packages/ff/e8/77d17d00981cdd27cc493e81e1749a0b8bbfb843780dbd841e30d7f50743/cryptography-46.0.1-cp38-abi3-win_arm64.whl", hash = "sha256:efc9e51c3e595267ff84adf56e9b357db89ab2279d7e375ffcaf8f678606f3d9", size = 2923149, upload-time = "2025-09-17T00:10:13.236Z" }, + { url = "https://files.pythonhosted.org/packages/27/27/077e09fd92075dd1338ea0ffaf5cfee641535545925768350ad90d8c36ca/cryptography-46.0.1-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:b9c79af2c3058430d911ff1a5b2b96bbfe8da47d5ed961639ce4681886614e70", size = 3722319, upload-time = "2025-09-17T00:10:20.273Z" }, + { url = "https://files.pythonhosted.org/packages/db/32/6fc7250280920418651640d76cee34d91c1e0601d73acd44364570cf041f/cryptography-46.0.1-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:0ca4be2af48c24df689a150d9cd37404f689e2968e247b6b8ff09bff5bcd786f", size = 4249030, upload-time = "2025-09-17T00:10:22.396Z" }, + { url = "https://files.pythonhosted.org/packages/32/33/8d5398b2da15a15110b2478480ab512609f95b45ead3a105c9a9c76f9980/cryptography-46.0.1-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:13e67c4d3fb8b6bc4ef778a7ccdd8df4cd15b4bcc18f4239c8440891a11245cc", size = 4528009, upload-time = "2025-09-17T00:10:24.418Z" }, + { url = "https://files.pythonhosted.org/packages/fd/1c/4012edad2a8977ab386c36b6e21f5065974d37afa3eade83a9968cba4855/cryptography-46.0.1-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:15b5fd9358803b0d1cc42505a18d8bca81dabb35b5cfbfea1505092e13a9d96d", size = 4248902, upload-time = "2025-09-17T00:10:26.255Z" }, + { url = "https://files.pythonhosted.org/packages/58/a3/257cd5ae677302de8fa066fca9de37128f6729d1e63c04dd6a15555dd450/cryptography-46.0.1-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:e34da95e29daf8a71cb2841fd55df0511539a6cdf33e6f77c1e95e44006b9b46", size = 4527150, upload-time = "2025-09-17T00:10:28.28Z" }, + { url = 
"https://files.pythonhosted.org/packages/6a/cd/fe6b65e1117ec7631f6be8951d3db076bac3e1b096e3e12710ed071ffc3c/cryptography-46.0.1-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:34f04b7311174469ab3ac2647469743720f8b6c8b046f238e5cb27905695eb2a", size = 3448210, upload-time = "2025-09-17T00:10:30.145Z" }, +] + +[[package]] +name = "cycler" +version = "0.12.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a9/95/a3dbbb5028f35eafb79008e7522a75244477d2838f38cbb722248dabc2a8/cycler-0.12.1.tar.gz", hash = "sha256:88bb128f02ba341da8ef447245a9e138fae777f6a23943da4540077d3601eb1c", size = 7615, upload-time = "2023-10-07T05:32:18.335Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e7/05/c19819d5e3d95294a6f5947fb9b9629efb316b96de511b418c53d245aae6/cycler-0.12.1-py3-none-any.whl", hash = "sha256:85cef7cff222d8644161529808465972e51340599459b8ac3ccbac5a854e0d30", size = 8321, upload-time = "2023-10-07T05:32:16.783Z" }, +] + +[[package]] +name = "cyclopts" +version = "3.24.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs" }, + { name = "docstring-parser", marker = "python_full_version < '4'" }, + { name = "rich" }, + { name = "rich-rst" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/30/ca/7782da3b03242d5f0a16c20371dff99d4bd1fedafe26bc48ff82e42be8c9/cyclopts-3.24.0.tar.gz", hash = "sha256:de6964a041dfb3c57bf043b41e68c43548227a17de1bad246e3a0bfc5c4b7417", size = 76131, upload-time = "2025-09-08T15:40:57.75Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f0/8b/2c95f0645c6f40211896375e6fa51f504b8ccb29c21f6ae661fe87ab044e/cyclopts-3.24.0-py3-none-any.whl", hash = "sha256:809d04cde9108617106091140c3964ee6fceb33cecdd537f7ffa360bde13ed71", size = 86154, upload-time = "2025-09-08T15:40:56.41Z" }, +] + +[[package]] +name = "deprecated" +version = "1.2.18" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "wrapt" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/98/97/06afe62762c9a8a86af0cfb7bfdab22a43ad17138b07af5b1a58442690a2/deprecated-1.2.18.tar.gz", hash = "sha256:422b6f6d859da6f2ef57857761bfb392480502a64c3028ca9bbe86085d72115d", size = 2928744, upload-time = "2025-01-27T10:46:25.7Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6e/c6/ac0b6c1e2d138f1002bcf799d330bd6d85084fece321e662a14223794041/Deprecated-1.2.18-py2.py3-none-any.whl", hash = "sha256:bd5011788200372a32418f888e326a09ff80d0214bd961147cfed01b5c018eec", size = 9998, upload-time = "2025-01-27T10:46:09.186Z" }, +] + +[[package]] +name = "deprecation" +version = "2.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "packaging" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5a/d3/8ae2869247df154b64c1884d7346d412fed0c49df84db635aab2d1c40e62/deprecation-2.1.0.tar.gz", hash = "sha256:72b3bde64e5d778694b0cf68178aed03d15e15477116add3fb773e581f9518ff", size = 173788, upload-time = "2020-04-20T14:23:38.738Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/02/c3/253a89ee03fc9b9682f1541728eb66db7db22148cd94f89ab22528cd1e1b/deprecation-2.1.0-py2.py3-none-any.whl", hash = "sha256:a10811591210e1fb0e768a8c25517cabeabcba6f0bf96564f8ff45189f90b14a", size = 11178, upload-time = "2020-04-20T14:23:36.581Z" }, +] + +[[package]] +name = "diskcache" +version = "5.6.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/3f/21/1c1ffc1a039ddcc459db43cc108658f32c57d271d7289a2794e401d0fdb6/diskcache-5.6.3.tar.gz", hash = "sha256:2c3a3fa2743d8535d832ec61c2054a1641f41775aa7c556758a109941e33e4fc", size = 67916, upload-time = "2023-08-31T06:12:00.316Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl", hash = "sha256:5e31b2d5fbad117cc363ebaf6b689474db18a1f6438bc82358b024abd4c2ca19", size = 45550, upload-time = "2023-08-31T06:11:58.822Z" }, +] + +[[package]] +name = "distlib" +version = "0.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/96/8e/709914eb2b5749865801041647dc7f4e6d00b549cfe88b65ca192995f07c/distlib-0.4.0.tar.gz", hash = "sha256:feec40075be03a04501a973d81f633735b4b69f98b05450592310c0f401a4e0d", size = 614605, upload-time = "2025-07-17T16:52:00.465Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/33/6b/e0547afaf41bf2c42e52430072fa5658766e3d65bd4b03a563d1b6336f57/distlib-0.4.0-py2.py3-none-any.whl", hash = "sha256:9659f7d87e46584a30b5780e43ac7a2143098441670ff0a49d5f9034c54a6c16", size = 469047, upload-time = "2025-07-17T16:51:58.613Z" }, +] + +[[package]] +name = "distro" +version = "1.9.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fc/f8/98eea607f65de6527f8a2e8885fc8015d3e6f5775df186e443e0964a11c3/distro-1.9.0.tar.gz", hash = "sha256:2fa77c6fd8940f116ee1d6b94a2f90b13b5ea8d019b98bc8bafdcabcdd9bdbed", size = 60722, upload-time = "2023-12-24T09:54:32.31Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/12/b3/231ffd4ab1fc9d679809f356cebee130ac7daa00d6d6f3206dd4fd137e9e/distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2", size = 20277, upload-time = "2023-12-24T09:54:30.421Z" }, +] + +[[package]] +name = "dlt" +version = "1.16.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "fsspec" }, + { name = "gitpython" }, + { name = "giturlparse" }, + { name = "hexbytes" }, + { name = "humanize" }, + { name = "jsonpath-ng" }, + { name = "orjson", marker = "python_full_version >= '3.14' or sys_platform != 'emscripten'" }, + { name = "packaging" }, + { name = "pathvalidate" }, + { name = "pendulum" }, + { name = "pluggy" }, + { name = "pytz" }, + { name = "pywin32", marker = "sys_platform == 'win32'" }, + { name = "pyyaml" }, + { name = "requests" }, + { name = "requirements-parser" }, + { name = "rich-argparse" }, + { name = "semver" }, + { name = "setuptools" }, + { name = "simplejson" }, + { name = "sqlglot" }, + { name = "tenacity" }, + { name = "tomlkit" }, + { name = "typing-extensions" }, + { name = "tzdata" }, + { name = "win-precise-time", marker = "python_full_version < '3.13' and os_name == 'nt'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/47/45/8f32b8cc4c709c79edc54763ab0e5f62df55a17bfaf8c31e2d2538422e34/dlt-1.16.0.tar.gz", hash = "sha256:113d17a3f27aa4f41c3438b0b032a68d30db195d8415a471ba43a9502e971a21", size = 809187, upload-time = "2025-09-10T06:53:06.365Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c6/1c/0a96ced9fb52e859b44624cc86ace5f59324ca899ac7e5a5cfeb1f1c797c/dlt-1.16.0-py3-none-any.whl", hash = "sha256:882ef281bbdc32eaba3b5ced984a8ed7014d8978fd7ab4a58b198023c8938c9f", size = 1029963, upload-time = "2025-09-10T06:53:04.014Z" }, 
+] + +[package.optional-dependencies] +sqlalchemy = [ + { name = "alembic" }, + { name = "sqlalchemy" }, +] + +[[package]] +name = "dnspython" +version = "2.8.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/8c/8b/57666417c0f90f08bcafa776861060426765fdb422eb10212086fb811d26/dnspython-2.8.0.tar.gz", hash = "sha256:181d3c6996452cb1189c4046c61599b84a5a86e099562ffde77d26984ff26d0f", size = 368251, upload-time = "2025-09-07T18:58:00.022Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ba/5a/18ad964b0086c6e62e2e7500f7edc89e3faa45033c71c1893d34eed2b2de/dnspython-2.8.0-py3-none-any.whl", hash = "sha256:01d9bbc4a2d76bf0db7c1f729812ded6d912bd318d3b1cf81d30c0f845dbf3af", size = 331094, upload-time = "2025-09-07T18:57:58.071Z" }, +] + +[[package]] +name = "docstring-parser" +version = "0.17.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b2/9d/c3b43da9515bd270df0f80548d9944e389870713cc1fe2b8fb35fe2bcefd/docstring_parser-0.17.0.tar.gz", hash = "sha256:583de4a309722b3315439bb31d64ba3eebada841f2e2cee23b99df001434c912", size = 27442, upload-time = "2025-07-21T07:35:01.868Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/55/e2/2537ebcff11c1ee1ff17d8d0b6f4db75873e3b0fb32c2d4a2ee31ecb310a/docstring_parser-0.17.0-py3-none-any.whl", hash = "sha256:cf2569abd23dce8099b300f9b4fa8191e9582dda731fd533daf54c4551658708", size = 36896, upload-time = "2025-07-21T07:35:00.684Z" }, +] + +[[package]] +name = "docutils" +version = "0.22.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/4a/c0/89fe6215b443b919cb98a5002e107cb5026854ed1ccb6b5833e0768419d1/docutils-0.22.2.tar.gz", hash = "sha256:9fdb771707c8784c8f2728b67cb2c691305933d68137ef95a75db5f4dfbc213d", size = 2289092, upload-time = "2025-09-20T17:55:47.994Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/66/dd/f95350e853a4468ec37478414fc04ae2d61dad7a947b3015c3dcc51a09b9/docutils-0.22.2-py3-none-any.whl", hash = "sha256:b0e98d679283fc3bb0ead8a5da7f501baa632654e7056e9c5846842213d674d8", size = 632667, upload-time = "2025-09-20T17:55:43.052Z" }, +] + +[[package]] +name = "email-validator" +version = "2.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "dnspython" }, + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/48/ce/13508a1ec3f8bb981ae4ca79ea40384becc868bfae97fd1c942bb3a001b1/email_validator-2.2.0.tar.gz", hash = "sha256:cb690f344c617a714f22e66ae771445a1ceb46821152df8e165c5f9a364582b7", size = 48967, upload-time = "2024-06-20T11:30:30.034Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d7/ee/bf0adb559ad3c786f12bcbc9296b3f5675f529199bef03e2df281fa1fadb/email_validator-2.2.0-py3-none-any.whl", hash = "sha256:561977c2d73ce3611850a06fa56b414621e0c8faa9d66f2611407d87465da631", size = 33521, upload-time = "2024-06-20T11:30:28.248Z" }, +] + +[[package]] +name = "exceptiongroup" +version = "1.3.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/0b/9f/a65090624ecf468cdca03533906e7c69ed7588582240cfe7cc9e770b50eb/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88", size = 29749, upload-time = "2025-05-10T17:42:51.123Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/36/f4/c6e662dade71f56cd2f3735141b265c3c79293c109549c1e6933b0651ffc/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10", size = 16674, upload-time = "2025-05-10T17:42:49.33Z" }, +] + +[[package]] +name = "fastapi" +version = "0.116.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "starlette" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/01/64/1296f46d6b9e3b23fb22e5d01af3f104ef411425531376212f1eefa2794d/fastapi-0.116.2.tar.gz", hash = "sha256:231a6af2fe21cfa2c32730170ad8514985fc250bec16c9b242d3b94c835ef529", size = 298595, upload-time = "2025-09-16T18:29:23.058Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/32/e4/c543271a8018874b7f682bf6156863c416e1334b8ed3e51a69495c5d4360/fastapi-0.116.2-py3-none-any.whl", hash = "sha256:c3a7a8fb830b05f7e087d920e0d786ca1fc9892eb4e9a84b227be4c1bc7569db", size = 95670, upload-time = "2025-09-16T18:29:21.329Z" }, +] + +[[package]] +name = "fastapi-users" +version = "14.0.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "email-validator" }, + { name = "fastapi" }, + { name = "makefun" }, + { name = "pwdlib", extra = ["argon2", "bcrypt"] }, + { name = "pyjwt", extra = ["crypto"] }, + { name = "python-multipart" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/e4/26/7fe4e6a4f60d9cde2b95f58ba45ff03219b62bd03bea75d914b723ecfa2a/fastapi_users-14.0.1.tar.gz", hash = "sha256:8c032b3a75c6fb2b1f5eab8ffce5321176e9916efe1fe93e7c15ee55f0b02236", size = 120315, upload-time = "2025-01-04T13:20:05.95Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2c/52/2821d3e95a92567d38f98a33d1ef89302aa3448866bf45ff19a48a5f28f8/fastapi_users-14.0.1-py3-none-any.whl", hash = "sha256:074df59676dccf79412d2880bdcb661ab1fabc2ecec1f043b4e6a23be97ed9e1", size = 38717, upload-time = "2025-01-04T13:20:04.441Z" }, +] + +[package.optional-dependencies] +sqlalchemy = [ + { name = "fastapi-users-db-sqlalchemy" }, +] + +[[package]] +name = "fastapi-users-db-sqlalchemy" +version = "7.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "fastapi-users" }, + { name = "sqlalchemy", extra = ["asyncio"] }, +] +sdist = { url = "https://files.pythonhosted.org/packages/87/12/bc9e6146ae31564741cefc87ee6e37fa5b566933f0afe8aa030779d60e60/fastapi_users_db_sqlalchemy-7.0.0.tar.gz", hash = "sha256:6823eeedf8a92f819276a2b2210ef1dcfd71fe8b6e37f7b4da8d1c60e3dfd595", size = 10877, upload-time = "2025-01-04T13:09:05.086Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a6/08/9968963c1fb8c34627b7f1fbcdfe9438540f87dc7c9bfb59bb4fd19a4ecf/fastapi_users_db_sqlalchemy-7.0.0-py3-none-any.whl", hash = "sha256:5fceac018e7cfa69efc70834dd3035b3de7988eb4274154a0dbe8b14f5aa001e", size = 6891, upload-time = "2025-01-04T13:09:02.869Z" }, +] + +[[package]] +name = "fastmcp" +version = "2.12.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "authlib" }, + { name = "cyclopts" }, + { name = "exceptiongroup" }, + { name = "httpx" }, + { name = "mcp" }, + { name = "openapi-core" }, + { name = "openapi-pydantic" }, + { name = "pydantic", extra = ["email"] }, + { name = "pyperclip" }, + { name = "python-dotenv" }, + { name = "rich" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/99/5e/035fdfa23646de8811776cd62d93440e334e8a4557b35c63c1bff125c08c/fastmcp-2.12.3.tar.gz", hash = "sha256:541dd569d5b6c083140b04d997ba3dc47f7c10695cee700d0a733ce63b20bb65", size = 5246812, upload-time = "2025-09-12T12:28:07.136Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/96/79/0fd386e61819e205563d4eb15da76564b80dc2edd3c64b46f2706235daec/fastmcp-2.12.3-py3-none-any.whl", hash = "sha256:aee50872923a9cba731861fc0120e7dbe4642a2685ba251b2b202b82fb6c25a9", size = 314031, upload-time = "2025-09-12T12:28:05.024Z" }, +] + +[[package]] +name = "fastuuid" +version = "0.12.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/19/17/13146a1e916bd2971d0a58db5e0a4ad23efdd49f78f33ac871c161f8007b/fastuuid-0.12.0.tar.gz", hash = "sha256:d0bd4e5b35aad2826403f4411937c89e7c88857b1513fe10f696544c03e9bd8e", size = 19180, upload-time = "2025-01-27T18:04:14.387Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d4/99/555eab31381c7912103d4c8654082611e5e82a7bb88ad5ab067e36b622d7/fastuuid-0.12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2bced35269315d16fe0c41003f8c9d63f2ee16a59295d90922cad5e6a67d0418", size = 247249, upload-time = "2025-01-27T18:03:23.092Z" }, + { url = "https://files.pythonhosted.org/packages/6d/3b/d62ce7f2af3d50a8e787603d44809770f43a3f2ff708bf10c252bf479109/fastuuid-0.12.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:82106e4b0a24f4f2f73c88f89dadbc1533bb808900740ca5db9bbb17d3b0c824", size = 258369, upload-time = "2025-01-27T18:04:08.903Z" }, + { url = "https://files.pythonhosted.org/packages/86/23/33ec5355036745cf83ea9ca7576d2e0750ff8d268c03b4af40ed26f1a303/fastuuid-0.12.0-cp311-cp311-manylinux_2_34_x86_64.whl", hash = "sha256:4db1bc7b8caa1d7412e1bea29b016d23a8d219131cff825b933eb3428f044dca", size = 278316, upload-time = "2025-01-27T18:04:12.74Z" }, + { url = "https://files.pythonhosted.org/packages/40/91/32ce82a14650148b6979ccd1a0089fd63d92505a90fb7156d2acc3245cbd/fastuuid-0.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:07afc8e674e67ac3d35a608c68f6809da5fab470fb4ef4469094fdb32ba36c51", size = 156643, upload-time = "2025-01-27T18:05:59.266Z" }, + { url = "https://files.pythonhosted.org/packages/f6/28/442e79d6219b90208cb243ac01db05d89cc4fdf8ecd563fb89476baf7122/fastuuid-0.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:328694a573fe9dce556b0b70c9d03776786801e028d82f0b6d9db1cb0521b4d1", size = 247372, upload-time = "2025-01-27T18:03:40.967Z" }, + { url = "https://files.pythonhosted.org/packages/40/eb/e0fd56890970ca7a9ec0d116844580988b692b1a749ac38e0c39e1dbdf23/fastuuid-0.12.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02acaea2c955bb2035a7d8e7b3fba8bd623b03746ae278e5fa932ef54c702f9f", size = 258200, upload-time = "2025-01-27T18:04:12.138Z" }, + { url = "https://files.pythonhosted.org/packages/f5/3c/4b30e376e65597a51a3dc929461a0dec77c8aec5d41d930f482b8f43e781/fastuuid-0.12.0-cp312-cp312-manylinux_2_34_x86_64.whl", hash = "sha256:ed9f449cba8cf16cced252521aee06e633d50ec48c807683f21cc1d89e193eb0", size = 278446, upload-time = "2025-01-27T18:04:15.877Z" }, + { url = "https://files.pythonhosted.org/packages/fe/96/cc5975fd23d2197b3e29f650a7a9beddce8993eaf934fa4ac595b77bb71f/fastuuid-0.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:0df2ea4c9db96fd8f4fa38d0e88e309b3e56f8fd03675a2f6958a5b082a0c1e4", size = 157185, upload-time = "2025-01-27T18:06:19.21Z" }, + { url = 
"https://files.pythonhosted.org/packages/a9/e8/d2bb4f19e5ee15f6f8e3192a54a897678314151aa17d0fb766d2c2cbc03d/fastuuid-0.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7fe2407316a04ee8f06d3dbc7eae396d0a86591d92bafe2ca32fce23b1145786", size = 247512, upload-time = "2025-01-27T18:04:08.115Z" }, + { url = "https://files.pythonhosted.org/packages/bc/53/25e811d92fd60f5c65e098c3b68bd8f1a35e4abb6b77a153025115b680de/fastuuid-0.12.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b9b31dd488d0778c36f8279b306dc92a42f16904cba54acca71e107d65b60b0c", size = 258257, upload-time = "2025-01-27T18:03:56.408Z" }, + { url = "https://files.pythonhosted.org/packages/10/23/73618e7793ea0b619caae2accd9e93e60da38dd78dd425002d319152ef2f/fastuuid-0.12.0-cp313-cp313-manylinux_2_34_x86_64.whl", hash = "sha256:b19361ee649365eefc717ec08005972d3d1eb9ee39908022d98e3bfa9da59e37", size = 278559, upload-time = "2025-01-27T18:03:58.661Z" }, + { url = "https://files.pythonhosted.org/packages/e4/41/6317ecfc4757d5f2a604e5d3993f353ba7aee85fa75ad8b86fce6fc2fa40/fastuuid-0.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:8fc66b11423e6f3e1937385f655bedd67aebe56a3dcec0cb835351cfe7d358c9", size = 157276, upload-time = "2025-01-27T18:06:39.245Z" }, +] + +[[package]] +name = "filelock" +version = "3.19.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/40/bb/0ab3e58d22305b6f5440629d20683af28959bf793d98d11950e305c1c326/filelock-3.19.1.tar.gz", hash = "sha256:66eda1888b0171c998b35be2bcc0f6d75c388a7ce20c3f3f37aa8e96c2dddf58", size = 17687, upload-time = "2025-08-14T16:56:03.016Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/42/14/42b2651a2f46b022ccd948bca9f2d5af0fd8929c4eec235b8d6d844fbe67/filelock-3.19.1-py3-none-any.whl", hash = "sha256:d38e30481def20772f5baf097c122c3babc4fcdb7e14e57049eb9d88c6dc017d", size = 15988, upload-time = "2025-08-14T16:56:01.633Z" }, +] + +[[package]] +name = "filetype" +version = "1.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/bb/29/745f7d30d47fe0f251d3ad3dc2978a23141917661998763bebb6da007eb1/filetype-1.2.0.tar.gz", hash = "sha256:66b56cd6474bf41d8c54660347d37afcc3f7d1970648de365c102ef77548aadb", size = 998020, upload-time = "2022-11-02T17:34:04.141Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/18/79/1b8fa1bb3568781e84c9200f951c735f3f157429f44be0495da55894d620/filetype-1.2.0-py2.py3-none-any.whl", hash = "sha256:7ce71b6880181241cf7ac8697a2f1eb6a8bd9b429f7ad6d27b8db9ba5f1c2d25", size = 19970, upload-time = "2022-11-02T17:34:01.425Z" }, +] + +[[package]] +name = "flatbuffers" +version = "25.2.10" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e4/30/eb5dce7994fc71a2f685d98ec33cc660c0a5887db5610137e60d8cbc4489/flatbuffers-25.2.10.tar.gz", hash = "sha256:97e451377a41262f8d9bd4295cc836133415cc03d8cb966410a4af92eb00d26e", size = 22170, upload-time = "2025-02-11T04:26:46.257Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b8/25/155f9f080d5e4bc0082edfda032ea2bc2b8fab3f4d25d46c1e9dd22a1a89/flatbuffers-25.2.10-py2.py3-none-any.whl", hash = "sha256:ebba5f4d5ea615af3f7fd70fc310636fbb2bbd1f566ac0a23d98dd412de50051", size = 30953, upload-time = "2025-02-11T04:26:44.484Z" }, +] + +[[package]] +name = "fonttools" +version = "4.60.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/27/d9/4eabd956fe123651a1f0efe29d9758b3837b5ae9a98934bdb571117033bb/fonttools-4.60.0.tar.gz", hash = "sha256:8f5927f049091a0ca74d35cce7f78e8f7775c83a6901a8fbe899babcc297146a", size = 3553671, upload-time = "2025-09-17T11:34:01.504Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/da/3d/c57731fbbf204ef1045caca28d5176430161ead73cd9feac3e9d9ef77ee6/fonttools-4.60.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:a9106c202d68ff5f9b4a0094c4d7ad2eaa7e9280f06427b09643215e706eb016", size = 2830883, upload-time = "2025-09-17T11:32:10.552Z" }, + { url = "https://files.pythonhosted.org/packages/cc/2d/b7a6ebaed464ce441c755252cc222af11edc651d17c8f26482f429cc2c0e/fonttools-4.60.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9da3a4a3f2485b156bb429b4f8faa972480fc01f553f7c8c80d05d48f17eec89", size = 2356005, upload-time = "2025-09-17T11:32:13.248Z" }, + { url = "https://files.pythonhosted.org/packages/ee/c2/ea834e921324e2051403e125c1fe0bfbdde4951a7c1784e4ae6bdbd286cc/fonttools-4.60.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1f84de764c6057b2ffd4feb50ddef481d92e348f0c70f2c849b723118d352bf3", size = 5041201, upload-time = "2025-09-17T11:32:15.373Z" }, + { url = "https://files.pythonhosted.org/packages/93/3c/1c64a338e9aa410d2d0728827d5bb1301463078cb225b94589f27558b427/fonttools-4.60.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:800b3fa0d5c12ddff02179d45b035a23989a6c597a71c8035c010fff3b2ef1bb", size = 4977696, upload-time = "2025-09-17T11:32:17.674Z" }, + { url = "https://files.pythonhosted.org/packages/07/cc/c8c411a0d9732bb886b870e052f20658fec9cf91118314f253950d2c1d65/fonttools-4.60.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd68f60b030277f292a582d31c374edfadc60bb33d51ec7b6cd4304531819ba", size = 5020386, upload-time = "2025-09-17T11:32:20.089Z" }, + { url = "https://files.pythonhosted.org/packages/13/01/1d3bc07cf92e7f4fc27f06d4494bf6078dc595b2e01b959157a4fd23df12/fonttools-4.60.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:53328e3ca9e5c8660ef6de07c35f8f312c189b757535e12141be7a8ec942de6e", size = 5131575, upload-time = "2025-09-17T11:32:22.582Z" }, + { url = "https://files.pythonhosted.org/packages/5a/16/08db3917ee19e89d2eb0ee637d37cd4136c849dc421ff63f406b9165c1a1/fonttools-4.60.0-cp311-cp311-win32.whl", hash = "sha256:d493c175ddd0b88a5376e61163e3e6fde3be8b8987db9b092e0a84650709c9e7", size = 2229297, upload-time = "2025-09-17T11:32:24.834Z" }, + { url = "https://files.pythonhosted.org/packages/d2/0b/76764da82c0dfcea144861f568d9e83f4b921e84f2be617b451257bb25a7/fonttools-4.60.0-cp311-cp311-win_amd64.whl", hash = "sha256:cc2770c9dc49c2d0366e9683f4d03beb46c98042d7ccc8ddbadf3459ecb051a7", size = 2277193, upload-time = "2025-09-17T11:32:27.094Z" }, + { url = "https://files.pythonhosted.org/packages/2a/9b/706ebf84b55ab03439c1f3a94d6915123c0d96099f4238b254fdacffe03a/fonttools-4.60.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:8c68928a438d60dfde90e2f09aa7f848ed201176ca6652341744ceec4215859f", size = 2831953, upload-time = "2025-09-17T11:32:29.39Z" }, + { url = "https://files.pythonhosted.org/packages/76/40/782f485be450846e4f3aecff1f10e42af414fc6e19d235c70020f64278e1/fonttools-4.60.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b7133821249097cffabf0624eafd37f5a3358d5ce814febe9db688e3673e724e", size = 2351716, upload-time = "2025-09-17T11:32:31.46Z" }, + { url = 
"https://files.pythonhosted.org/packages/39/77/ad8d2a6ecc19716eb488c8cf118de10f7802e14bdf61d136d7b52358d6b1/fonttools-4.60.0-cp312-cp312-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:d3638905d3d77ac8791127ce181f7cb434f37e4204d8b2e31b8f1e154320b41f", size = 4922729, upload-time = "2025-09-17T11:32:33.659Z" }, + { url = "https://files.pythonhosted.org/packages/6b/48/aa543037c6e7788e1bc36b3f858ac70a59d32d0f45915263d0b330a35140/fonttools-4.60.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7968a26ef010ae89aabbb2f8e9dec1e2709a2541bb8620790451ee8aeb4f6fbf", size = 4967188, upload-time = "2025-09-17T11:32:35.74Z" }, + { url = "https://files.pythonhosted.org/packages/ac/58/e407d2028adc6387947eff8f2940b31f4ed40b9a83c2c7bbc8b9255126e2/fonttools-4.60.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1ef01ca7847c356b0fe026b7b92304bc31dc60a4218689ee0acc66652c1a36b2", size = 4910043, upload-time = "2025-09-17T11:32:38.054Z" }, + { url = "https://files.pythonhosted.org/packages/16/ef/e78519b3c296ef757a21b792fc6a785aa2ef9a2efb098083d8ed5f6ee2ba/fonttools-4.60.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:f3482d7ed7867edfcf785f77c1dffc876c4b2ddac19539c075712ff2a0703cf5", size = 5061980, upload-time = "2025-09-17T11:32:40.457Z" }, + { url = "https://files.pythonhosted.org/packages/00/4c/ad72444d1e3ef704ee90af8d5abf198016a39908d322bf41235562fb01a0/fonttools-4.60.0-cp312-cp312-win32.whl", hash = "sha256:8c937c4fe8addff575a984c9519433391180bf52cf35895524a07b520f376067", size = 2217750, upload-time = "2025-09-17T11:32:42.586Z" }, + { url = "https://files.pythonhosted.org/packages/46/55/3e8ac21963e130242f5a9ea2ebc57f5726d704bf4dcca89088b5b637b2d3/fonttools-4.60.0-cp312-cp312-win_amd64.whl", hash = "sha256:99b06d5d6f29f32e312adaed0367112f5ff2d300ea24363d377ec917daf9e8c5", size = 2266025, upload-time = "2025-09-17T11:32:44.8Z" }, + { url = "https://files.pythonhosted.org/packages/b4/6b/d090cd54abe88192fe3010f573508b2592cf1d1f98b14bcb799a8ad20525/fonttools-4.60.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:97100ba820936cdb5148b634e0884f0088699c7e2f1302ae7bba3747c7a19fb3", size = 2824791, upload-time = "2025-09-17T11:32:47.002Z" }, + { url = "https://files.pythonhosted.org/packages/97/8c/7ccb5a27aac9a535623fe04935fb9f469a4f8a1253991af9fbac2fe88c17/fonttools-4.60.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:03fccf84f377f83e99a5328a9ebe6b41e16fcf64a1450c352b6aa7e0deedbc01", size = 2347081, upload-time = "2025-09-17T11:32:49.204Z" }, + { url = "https://files.pythonhosted.org/packages/f8/1a/c14f0bb20b4cb7849dc0519f0ab0da74318d52236dc23168530569958599/fonttools-4.60.0-cp313-cp313-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:a3ef06671f862cd7da78ab105fbf8dce9da3634a8f91b3a64ed5c29c0ac6a9a8", size = 4902095, upload-time = "2025-09-17T11:32:51.848Z" }, + { url = "https://files.pythonhosted.org/packages/c9/a0/c7c91f07c40de5399cbaec7d25e04c9afac6c8f80036a98c125efdb5fe1a/fonttools-4.60.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3f2195faf96594c238462c420c7eff97d1aa51de595434f806ec3952df428616", size = 4959137, upload-time = "2025-09-17T11:32:54.185Z" }, + { url = "https://files.pythonhosted.org/packages/38/d2/169e49498df9f2c721763aa39b0bf3d08cb762864ebc8a8ddb99f5ba7ec8/fonttools-4.60.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = 
"sha256:3887008865fa4f56cff58a1878f1300ba81a4e34f76daf9b47234698493072ee", size = 4900467, upload-time = "2025-09-17T11:32:56.664Z" }, + { url = "https://files.pythonhosted.org/packages/cc/9c/bfb56b89c3eab8bcb739c7fd1e8a43285c8dd833e1e1d18d4f54f2f641af/fonttools-4.60.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:5567bd130378f21231d3856d8f0571dcdfcd77e47832978c26dabe572d456daa", size = 5043508, upload-time = "2025-09-17T11:32:58.944Z" }, + { url = "https://files.pythonhosted.org/packages/77/30/2b511c7eb99faee1fd9a0b42e984fb91275da3d681da650af4edf409d0fd/fonttools-4.60.0-cp313-cp313-win32.whl", hash = "sha256:699d0b521ec0b188ac11f2c14ccf6a926367795818ddf2bd00a273e9a052dd20", size = 2216037, upload-time = "2025-09-17T11:33:01.192Z" }, + { url = "https://files.pythonhosted.org/packages/3d/73/a2cc5ee4faeb0302cc81942c27f3b516801bf489fdc422a1b20090fff695/fonttools-4.60.0-cp313-cp313-win_amd64.whl", hash = "sha256:24296163268e7c800009711ce5c0e9997be8882c0bd546696c82ef45966163a6", size = 2265190, upload-time = "2025-09-17T11:33:03.935Z" }, + { url = "https://files.pythonhosted.org/packages/86/dd/a126706e45e0ce097cef6de4108b5597795acaa945fdbdd922dbc090d335/fonttools-4.60.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:b6fe3efdc956bdad95145cea906ad9ff345c17b706356dfc1098ce3230591343", size = 2821835, upload-time = "2025-09-17T11:33:06.094Z" }, + { url = "https://files.pythonhosted.org/packages/ac/90/5c17f311bbd983fd614b82a7a06da967b5d3c87e3e61cf34de6029a92ff4/fonttools-4.60.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:764b2aaab839762a3aa3207e5b3f0e0dfa41799e0b091edec5fcbccc584fdab5", size = 2344536, upload-time = "2025-09-17T11:33:08.574Z" }, + { url = "https://files.pythonhosted.org/packages/60/67/48c1a6229b2a5668c4111fbd1694ca417adedc1254c5cd2f9a11834c429d/fonttools-4.60.0-cp314-cp314-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b81c7c47d9e78106a4d70f1dbeb49150513171715e45e0d2661809f2b0e3f710", size = 4842494, upload-time = "2025-09-17T11:33:11.338Z" }, + { url = "https://files.pythonhosted.org/packages/13/3e/83b0b37d02b7e321cbe2b8fcec0aa18571f0a47d3dc222196404371d83b6/fonttools-4.60.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:799ff60ee66b300ebe1fe6632b1cc55a66400fe815cef7b034d076bce6b1d8fc", size = 4943203, upload-time = "2025-09-17T11:33:13.285Z" }, + { url = "https://files.pythonhosted.org/packages/c9/07/11163e49497c53392eaca210a474104e4987c17ca7731f8754ba0d416a67/fonttools-4.60.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f9878abe155ddd1b433bab95d027a686898a6afba961f3c5ca14b27488f2d772", size = 4889233, upload-time = "2025-09-17T11:33:15.175Z" }, + { url = "https://files.pythonhosted.org/packages/60/90/e85005d955cb26e7de015d5678778b8cc3293c0f3d717865675bd641fbfc/fonttools-4.60.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:ded432b7133ea4602fdb4731a4a7443a8e9548edad28987b99590cf6da626254", size = 4998335, upload-time = "2025-09-17T11:33:17.217Z" }, + { url = "https://files.pythonhosted.org/packages/2a/82/0374ad53729de6e3788ecdb8a3731ce6592c5ffa9bff823cef2ffe0164af/fonttools-4.60.0-cp314-cp314-win32.whl", hash = "sha256:5d97cf3a9245316d5978628c05642b939809c4f55ca632ca40744cb9de6e8d4a", size = 2219840, upload-time = "2025-09-17T11:33:19.494Z" }, + { url = "https://files.pythonhosted.org/packages/11/c3/804cd47453dcafb7976f9825b43cc0e61a2fe30eddb971b681cd72c4ca65/fonttools-4.60.0-cp314-cp314-win_amd64.whl", hash = 
"sha256:61b9ef46dd5e9dcb6f437eb0cc5ed83d5049e1bf9348e31974ffee1235db0f8f", size = 2269891, upload-time = "2025-09-17T11:33:21.743Z" }, + { url = "https://files.pythonhosted.org/packages/75/bf/1bd760aca04098e7028b4e0e5f73b41ff74b322275698071454652476a44/fonttools-4.60.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:bba7e3470cf353e1484a36dfb4108f431c2859e3f6097fe10118eeae92166773", size = 2893361, upload-time = "2025-09-17T11:33:23.68Z" }, + { url = "https://files.pythonhosted.org/packages/25/35/7a2c09aa990ed77f34924def383f44fc576a5596cc3df8438071e1baa1ac/fonttools-4.60.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:c5ac6439a38c27b3287063176b3303b34982024b01e2e95bba8ac1e45f6d41c1", size = 2374086, upload-time = "2025-09-17T11:33:25.988Z" }, + { url = "https://files.pythonhosted.org/packages/77/a9/f85ed2493e82837ff73421f3f7a1c3ae8f0b14051307418c916d9563da1f/fonttools-4.60.0-cp314-cp314t-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4acd21e9f125a1257da59edf7a6e9bd4abd76282770715c613f1fe482409e9f9", size = 4848766, upload-time = "2025-09-17T11:33:28.018Z" }, + { url = "https://files.pythonhosted.org/packages/d1/91/29830eda31ae9231a06d5246e5d0c686422d03456ed666e13576c24c3f97/fonttools-4.60.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b4a6fc53039ea047e35dc62b958af9cd397eedbc3fa42406d2910ae091b9ae37", size = 5084613, upload-time = "2025-09-17T11:33:30.562Z" }, + { url = "https://files.pythonhosted.org/packages/48/01/615905e7db2568fe1843145077e680443494b7caab2089527b7e112c7606/fonttools-4.60.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:ef34f44eadf133e94e82c775a33ee3091dd37ee0161c5f5ea224b46e3ce0fb8e", size = 4956620, upload-time = "2025-09-17T11:33:32.497Z" }, + { url = "https://files.pythonhosted.org/packages/97/8e/64e65255871ec2f13b6c00b5b12d08b928b504867cfb7e7ed73e5e941832/fonttools-4.60.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:d112cae3e7ad1bb5d7f7a60365fcf6c181374648e064a8c07617b240e7c828ee", size = 4973202, upload-time = "2025-09-17T11:33:34.561Z" }, + { url = "https://files.pythonhosted.org/packages/e0/6d/04d16243eb441e8de61074c7809e92d2e35df4cd11af5632e486bc630dab/fonttools-4.60.0-cp314-cp314t-win32.whl", hash = "sha256:0f7b2c251dc338973e892a1e153016114e7a75f6aac7a49b84d5d1a4c0608d08", size = 2281217, upload-time = "2025-09-17T11:33:36.965Z" }, + { url = "https://files.pythonhosted.org/packages/ab/5f/09bd2f9f28ef0d6f3620fa19699d11c4bc83ff8a2786d8ccdd97c209b19a/fonttools-4.60.0-cp314-cp314t-win_amd64.whl", hash = "sha256:c8a72771106bc7434098db35abecd84d608857f6e116d3ef00366b213c502ce9", size = 2344738, upload-time = "2025-09-17T11:33:39.372Z" }, + { url = "https://files.pythonhosted.org/packages/f9/a4/247d3e54eb5ed59e94e09866cfc4f9567e274fbf310ba390711851f63b3b/fonttools-4.60.0-py3-none-any.whl", hash = "sha256:496d26e4d14dcccdd6ada2e937e4d174d3138e3d73f5c9b6ec6eb2fd1dab4f66", size = 1142186, upload-time = "2025-09-17T11:33:59.287Z" }, +] + +[[package]] +name = "frozenlist" +version = "1.7.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/79/b1/b64018016eeb087db503b038296fd782586432b9c077fc5c7839e9cb6ef6/frozenlist-1.7.0.tar.gz", hash = "sha256:2e310d81923c2437ea8670467121cc3e9b0f76d3043cc1d2331d56c7fb7a3a8f", size = 45078, upload-time = "2025-06-09T23:02:35.538Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/34/7e/803dde33760128acd393a27eb002f2020ddb8d99d30a44bfbaab31c5f08a/frozenlist-1.7.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:aa51e147a66b2d74de1e6e2cf5921890de6b0f4820b257465101d7f37b49fb5a", size = 82251, upload-time = "2025-06-09T23:00:16.279Z" }, + { url = "https://files.pythonhosted.org/packages/75/a9/9c2c5760b6ba45eae11334db454c189d43d34a4c0b489feb2175e5e64277/frozenlist-1.7.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9b35db7ce1cd71d36ba24f80f0c9e7cff73a28d7a74e91fe83e23d27c7828750", size = 48183, upload-time = "2025-06-09T23:00:17.698Z" }, + { url = "https://files.pythonhosted.org/packages/47/be/4038e2d869f8a2da165f35a6befb9158c259819be22eeaf9c9a8f6a87771/frozenlist-1.7.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:34a69a85e34ff37791e94542065c8416c1afbf820b68f720452f636d5fb990cd", size = 47107, upload-time = "2025-06-09T23:00:18.952Z" }, + { url = "https://files.pythonhosted.org/packages/79/26/85314b8a83187c76a37183ceed886381a5f992975786f883472fcb6dc5f2/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a646531fa8d82c87fe4bb2e596f23173caec9185bfbca5d583b4ccfb95183e2", size = 237333, upload-time = "2025-06-09T23:00:20.275Z" }, + { url = "https://files.pythonhosted.org/packages/1f/fd/e5b64f7d2c92a41639ffb2ad44a6a82f347787abc0c7df5f49057cf11770/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:79b2ffbba483f4ed36a0f236ccb85fbb16e670c9238313709638167670ba235f", size = 231724, upload-time = "2025-06-09T23:00:21.705Z" }, + { url = "https://files.pythonhosted.org/packages/20/fb/03395c0a43a5976af4bf7534759d214405fbbb4c114683f434dfdd3128ef/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a26f205c9ca5829cbf82bb2a84b5c36f7184c4316617d7ef1b271a56720d6b30", size = 245842, upload-time = "2025-06-09T23:00:23.148Z" }, + { url = "https://files.pythonhosted.org/packages/d0/15/c01c8e1dffdac5d9803507d824f27aed2ba76b6ed0026fab4d9866e82f1f/frozenlist-1.7.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bcacfad3185a623fa11ea0e0634aac7b691aa925d50a440f39b458e41c561d98", size = 239767, upload-time = "2025-06-09T23:00:25.103Z" }, + { url = "https://files.pythonhosted.org/packages/14/99/3f4c6fe882c1f5514b6848aa0a69b20cb5e5d8e8f51a339d48c0e9305ed0/frozenlist-1.7.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:72c1b0fe8fe451b34f12dce46445ddf14bd2a5bcad7e324987194dc8e3a74c86", size = 224130, upload-time = "2025-06-09T23:00:27.061Z" }, + { url = "https://files.pythonhosted.org/packages/4d/83/220a374bd7b2aeba9d0725130665afe11de347d95c3620b9b82cc2fcab97/frozenlist-1.7.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61d1a5baeaac6c0798ff6edfaeaa00e0e412d49946c53fae8d4b8e8b3566c4ae", size = 235301, upload-time = "2025-06-09T23:00:29.02Z" }, + { url = "https://files.pythonhosted.org/packages/03/3c/3e3390d75334a063181625343e8daab61b77e1b8214802cc4e8a1bb678fc/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7edf5c043c062462f09b6820de9854bf28cc6cc5b6714b383149745e287181a8", size = 234606, upload-time = "2025-06-09T23:00:30.514Z" }, + { url = "https://files.pythonhosted.org/packages/23/1e/58232c19608b7a549d72d9903005e2d82488f12554a32de2d5fb59b9b1ba/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = 
"sha256:d50ac7627b3a1bd2dcef6f9da89a772694ec04d9a61b66cf87f7d9446b4a0c31", size = 248372, upload-time = "2025-06-09T23:00:31.966Z" }, + { url = "https://files.pythonhosted.org/packages/c0/a4/e4a567e01702a88a74ce8a324691e62a629bf47d4f8607f24bf1c7216e7f/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:ce48b2fece5aeb45265bb7a58259f45027db0abff478e3077e12b05b17fb9da7", size = 229860, upload-time = "2025-06-09T23:00:33.375Z" }, + { url = "https://files.pythonhosted.org/packages/73/a6/63b3374f7d22268b41a9db73d68a8233afa30ed164c46107b33c4d18ecdd/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:fe2365ae915a1fafd982c146754e1de6ab3478def8a59c86e1f7242d794f97d5", size = 245893, upload-time = "2025-06-09T23:00:35.002Z" }, + { url = "https://files.pythonhosted.org/packages/6d/eb/d18b3f6e64799a79673c4ba0b45e4cfbe49c240edfd03a68be20002eaeaa/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:45a6f2fdbd10e074e8814eb98b05292f27bad7d1883afbe009d96abdcf3bc898", size = 246323, upload-time = "2025-06-09T23:00:36.468Z" }, + { url = "https://files.pythonhosted.org/packages/5a/f5/720f3812e3d06cd89a1d5db9ff6450088b8f5c449dae8ffb2971a44da506/frozenlist-1.7.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:21884e23cffabb157a9dd7e353779077bf5b8f9a58e9b262c6caad2ef5f80a56", size = 233149, upload-time = "2025-06-09T23:00:37.963Z" }, + { url = "https://files.pythonhosted.org/packages/69/68/03efbf545e217d5db8446acfd4c447c15b7c8cf4dbd4a58403111df9322d/frozenlist-1.7.0-cp311-cp311-win32.whl", hash = "sha256:284d233a8953d7b24f9159b8a3496fc1ddc00f4db99c324bd5fb5f22d8698ea7", size = 39565, upload-time = "2025-06-09T23:00:39.753Z" }, + { url = "https://files.pythonhosted.org/packages/58/17/fe61124c5c333ae87f09bb67186d65038834a47d974fc10a5fadb4cc5ae1/frozenlist-1.7.0-cp311-cp311-win_amd64.whl", hash = "sha256:387cbfdcde2f2353f19c2f66bbb52406d06ed77519ac7ee21be0232147c2592d", size = 44019, upload-time = "2025-06-09T23:00:40.988Z" }, + { url = "https://files.pythonhosted.org/packages/ef/a2/c8131383f1e66adad5f6ecfcce383d584ca94055a34d683bbb24ac5f2f1c/frozenlist-1.7.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3dbf9952c4bb0e90e98aec1bd992b3318685005702656bc6f67c1a32b76787f2", size = 81424, upload-time = "2025-06-09T23:00:42.24Z" }, + { url = "https://files.pythonhosted.org/packages/4c/9d/02754159955088cb52567337d1113f945b9e444c4960771ea90eb73de8db/frozenlist-1.7.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:1f5906d3359300b8a9bb194239491122e6cf1444c2efb88865426f170c262cdb", size = 47952, upload-time = "2025-06-09T23:00:43.481Z" }, + { url = "https://files.pythonhosted.org/packages/01/7a/0046ef1bd6699b40acd2067ed6d6670b4db2f425c56980fa21c982c2a9db/frozenlist-1.7.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3dabd5a8f84573c8d10d8859a50ea2dec01eea372031929871368c09fa103478", size = 46688, upload-time = "2025-06-09T23:00:44.793Z" }, + { url = "https://files.pythonhosted.org/packages/d6/a2/a910bafe29c86997363fb4c02069df4ff0b5bc39d33c5198b4e9dd42d8f8/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa57daa5917f1738064f302bf2626281a1cb01920c32f711fbc7bc36111058a8", size = 243084, upload-time = "2025-06-09T23:00:46.125Z" }, + { url = "https://files.pythonhosted.org/packages/64/3e/5036af9d5031374c64c387469bfcc3af537fc0f5b1187d83a1cf6fab1639/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = 
"sha256:c193dda2b6d49f4c4398962810fa7d7c78f032bf45572b3e04dd5249dff27e08", size = 233524, upload-time = "2025-06-09T23:00:47.73Z" }, + { url = "https://files.pythonhosted.org/packages/06/39/6a17b7c107a2887e781a48ecf20ad20f1c39d94b2a548c83615b5b879f28/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bfe2b675cf0aaa6d61bf8fbffd3c274b3c9b7b1623beb3809df8a81399a4a9c4", size = 248493, upload-time = "2025-06-09T23:00:49.742Z" }, + { url = "https://files.pythonhosted.org/packages/be/00/711d1337c7327d88c44d91dd0f556a1c47fb99afc060ae0ef66b4d24793d/frozenlist-1.7.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8fc5d5cda37f62b262405cf9652cf0856839c4be8ee41be0afe8858f17f4c94b", size = 244116, upload-time = "2025-06-09T23:00:51.352Z" }, + { url = "https://files.pythonhosted.org/packages/24/fe/74e6ec0639c115df13d5850e75722750adabdc7de24e37e05a40527ca539/frozenlist-1.7.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b0d5ce521d1dd7d620198829b87ea002956e4319002ef0bc8d3e6d045cb4646e", size = 224557, upload-time = "2025-06-09T23:00:52.855Z" }, + { url = "https://files.pythonhosted.org/packages/8d/db/48421f62a6f77c553575201e89048e97198046b793f4a089c79a6e3268bd/frozenlist-1.7.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:488d0a7d6a0008ca0db273c542098a0fa9e7dfaa7e57f70acef43f32b3f69dca", size = 241820, upload-time = "2025-06-09T23:00:54.43Z" }, + { url = "https://files.pythonhosted.org/packages/1d/fa/cb4a76bea23047c8462976ea7b7a2bf53997a0ca171302deae9d6dd12096/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:15a7eaba63983d22c54d255b854e8108e7e5f3e89f647fc854bd77a237e767df", size = 236542, upload-time = "2025-06-09T23:00:56.409Z" }, + { url = "https://files.pythonhosted.org/packages/5d/32/476a4b5cfaa0ec94d3f808f193301debff2ea42288a099afe60757ef6282/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:1eaa7e9c6d15df825bf255649e05bd8a74b04a4d2baa1ae46d9c2d00b2ca2cb5", size = 249350, upload-time = "2025-06-09T23:00:58.468Z" }, + { url = "https://files.pythonhosted.org/packages/8d/ba/9a28042f84a6bf8ea5dbc81cfff8eaef18d78b2a1ad9d51c7bc5b029ad16/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:e4389e06714cfa9d47ab87f784a7c5be91d3934cd6e9a7b85beef808297cc025", size = 225093, upload-time = "2025-06-09T23:01:00.015Z" }, + { url = "https://files.pythonhosted.org/packages/bc/29/3a32959e68f9cf000b04e79ba574527c17e8842e38c91d68214a37455786/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:73bd45e1488c40b63fe5a7df892baf9e2a4d4bb6409a2b3b78ac1c6236178e01", size = 245482, upload-time = "2025-06-09T23:01:01.474Z" }, + { url = "https://files.pythonhosted.org/packages/80/e8/edf2f9e00da553f07f5fa165325cfc302dead715cab6ac8336a5f3d0adc2/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:99886d98e1643269760e5fe0df31e5ae7050788dd288947f7f007209b8c33f08", size = 249590, upload-time = "2025-06-09T23:01:02.961Z" }, + { url = "https://files.pythonhosted.org/packages/1c/80/9a0eb48b944050f94cc51ee1c413eb14a39543cc4f760ed12657a5a3c45a/frozenlist-1.7.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:290a172aae5a4c278c6da8a96222e6337744cd9c77313efe33d5670b9f65fc43", size = 237785, upload-time = "2025-06-09T23:01:05.095Z" }, + { url = 
"https://files.pythonhosted.org/packages/f3/74/87601e0fb0369b7a2baf404ea921769c53b7ae00dee7dcfe5162c8c6dbf0/frozenlist-1.7.0-cp312-cp312-win32.whl", hash = "sha256:426c7bc70e07cfebc178bc4c2bf2d861d720c4fff172181eeb4a4c41d4ca2ad3", size = 39487, upload-time = "2025-06-09T23:01:06.54Z" }, + { url = "https://files.pythonhosted.org/packages/0b/15/c026e9a9fc17585a9d461f65d8593d281fedf55fbf7eb53f16c6df2392f9/frozenlist-1.7.0-cp312-cp312-win_amd64.whl", hash = "sha256:563b72efe5da92e02eb68c59cb37205457c977aa7a449ed1b37e6939e5c47c6a", size = 43874, upload-time = "2025-06-09T23:01:07.752Z" }, + { url = "https://files.pythonhosted.org/packages/24/90/6b2cebdabdbd50367273c20ff6b57a3dfa89bd0762de02c3a1eb42cb6462/frozenlist-1.7.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee80eeda5e2a4e660651370ebffd1286542b67e268aa1ac8d6dbe973120ef7ee", size = 79791, upload-time = "2025-06-09T23:01:09.368Z" }, + { url = "https://files.pythonhosted.org/packages/83/2e/5b70b6a3325363293fe5fc3ae74cdcbc3e996c2a11dde2fd9f1fb0776d19/frozenlist-1.7.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d1a81c85417b914139e3a9b995d4a1c84559afc839a93cf2cb7f15e6e5f6ed2d", size = 47165, upload-time = "2025-06-09T23:01:10.653Z" }, + { url = "https://files.pythonhosted.org/packages/f4/25/a0895c99270ca6966110f4ad98e87e5662eab416a17e7fd53c364bf8b954/frozenlist-1.7.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cbb65198a9132ebc334f237d7b0df163e4de83fb4f2bdfe46c1e654bdb0c5d43", size = 45881, upload-time = "2025-06-09T23:01:12.296Z" }, + { url = "https://files.pythonhosted.org/packages/19/7c/71bb0bbe0832793c601fff68cd0cf6143753d0c667f9aec93d3c323f4b55/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dab46c723eeb2c255a64f9dc05b8dd601fde66d6b19cdb82b2e09cc6ff8d8b5d", size = 232409, upload-time = "2025-06-09T23:01:13.641Z" }, + { url = "https://files.pythonhosted.org/packages/c0/45/ed2798718910fe6eb3ba574082aaceff4528e6323f9a8570be0f7028d8e9/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6aeac207a759d0dedd2e40745575ae32ab30926ff4fa49b1635def65806fddee", size = 225132, upload-time = "2025-06-09T23:01:15.264Z" }, + { url = "https://files.pythonhosted.org/packages/ba/e2/8417ae0f8eacb1d071d4950f32f229aa6bf68ab69aab797b72a07ea68d4f/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bd8c4e58ad14b4fa7802b8be49d47993182fdd4023393899632c88fd8cd994eb", size = 237638, upload-time = "2025-06-09T23:01:16.752Z" }, + { url = "https://files.pythonhosted.org/packages/f8/b7/2ace5450ce85f2af05a871b8c8719b341294775a0a6c5585d5e6170f2ce7/frozenlist-1.7.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:04fb24d104f425da3540ed83cbfc31388a586a7696142004c577fa61c6298c3f", size = 233539, upload-time = "2025-06-09T23:01:18.202Z" }, + { url = "https://files.pythonhosted.org/packages/46/b9/6989292c5539553dba63f3c83dc4598186ab2888f67c0dc1d917e6887db6/frozenlist-1.7.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6a5c505156368e4ea6b53b5ac23c92d7edc864537ff911d2fb24c140bb175e60", size = 215646, upload-time = "2025-06-09T23:01:19.649Z" }, + { url = "https://files.pythonhosted.org/packages/72/31/bc8c5c99c7818293458fe745dab4fd5730ff49697ccc82b554eb69f16a24/frozenlist-1.7.0-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:8bd7eb96a675f18aa5c553eb7ddc24a43c8c18f22e1f9925528128c052cdbe00", size = 232233, upload-time = "2025-06-09T23:01:21.175Z" }, + { url = "https://files.pythonhosted.org/packages/59/52/460db4d7ba0811b9ccb85af996019f5d70831f2f5f255f7cc61f86199795/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:05579bf020096fe05a764f1f84cd104a12f78eaab68842d036772dc6d4870b4b", size = 227996, upload-time = "2025-06-09T23:01:23.098Z" }, + { url = "https://files.pythonhosted.org/packages/ba/c9/f4b39e904c03927b7ecf891804fd3b4df3db29b9e487c6418e37988d6e9d/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:376b6222d114e97eeec13d46c486facd41d4f43bab626b7c3f6a8b4e81a5192c", size = 242280, upload-time = "2025-06-09T23:01:24.808Z" }, + { url = "https://files.pythonhosted.org/packages/b8/33/3f8d6ced42f162d743e3517781566b8481322be321b486d9d262adf70bfb/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:0aa7e176ebe115379b5b1c95b4096fb1c17cce0847402e227e712c27bdb5a949", size = 217717, upload-time = "2025-06-09T23:01:26.28Z" }, + { url = "https://files.pythonhosted.org/packages/3e/e8/ad683e75da6ccef50d0ab0c2b2324b32f84fc88ceee778ed79b8e2d2fe2e/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3fbba20e662b9c2130dc771e332a99eff5da078b2b2648153a40669a6d0e36ca", size = 236644, upload-time = "2025-06-09T23:01:27.887Z" }, + { url = "https://files.pythonhosted.org/packages/b2/14/8d19ccdd3799310722195a72ac94ddc677541fb4bef4091d8e7775752360/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:f3f4410a0a601d349dd406b5713fec59b4cee7e71678d5b17edda7f4655a940b", size = 238879, upload-time = "2025-06-09T23:01:29.524Z" }, + { url = "https://files.pythonhosted.org/packages/ce/13/c12bf657494c2fd1079a48b2db49fa4196325909249a52d8f09bc9123fd7/frozenlist-1.7.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e2cdfaaec6a2f9327bf43c933c0319a7c429058e8537c508964a133dffee412e", size = 232502, upload-time = "2025-06-09T23:01:31.287Z" }, + { url = "https://files.pythonhosted.org/packages/d7/8b/e7f9dfde869825489382bc0d512c15e96d3964180c9499efcec72e85db7e/frozenlist-1.7.0-cp313-cp313-win32.whl", hash = "sha256:5fc4df05a6591c7768459caba1b342d9ec23fa16195e744939ba5914596ae3e1", size = 39169, upload-time = "2025-06-09T23:01:35.503Z" }, + { url = "https://files.pythonhosted.org/packages/35/89/a487a98d94205d85745080a37860ff5744b9820a2c9acbcdd9440bfddf98/frozenlist-1.7.0-cp313-cp313-win_amd64.whl", hash = "sha256:52109052b9791a3e6b5d1b65f4b909703984b770694d3eb64fad124c835d7cba", size = 43219, upload-time = "2025-06-09T23:01:36.784Z" }, + { url = "https://files.pythonhosted.org/packages/56/d5/5c4cf2319a49eddd9dd7145e66c4866bdc6f3dbc67ca3d59685149c11e0d/frozenlist-1.7.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:a6f86e4193bb0e235ef6ce3dde5cbabed887e0b11f516ce8a0f4d3b33078ec2d", size = 84345, upload-time = "2025-06-09T23:01:38.295Z" }, + { url = "https://files.pythonhosted.org/packages/a4/7d/ec2c1e1dc16b85bc9d526009961953df9cec8481b6886debb36ec9107799/frozenlist-1.7.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:82d664628865abeb32d90ae497fb93df398a69bb3434463d172b80fc25b0dd7d", size = 48880, upload-time = "2025-06-09T23:01:39.887Z" }, + { url = "https://files.pythonhosted.org/packages/69/86/f9596807b03de126e11e7d42ac91e3d0b19a6599c714a1989a4e85eeefc4/frozenlist-1.7.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:912a7e8375a1c9a68325a902f3953191b7b292aa3c3fb0d71a216221deca460b", size = 48498, upload-time = 
"2025-06-09T23:01:41.318Z" }, + { url = "https://files.pythonhosted.org/packages/5e/cb/df6de220f5036001005f2d726b789b2c0b65f2363b104bbc16f5be8084f8/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9537c2777167488d539bc5de2ad262efc44388230e5118868e172dd4a552b146", size = 292296, upload-time = "2025-06-09T23:01:42.685Z" }, + { url = "https://files.pythonhosted.org/packages/83/1f/de84c642f17c8f851a2905cee2dae401e5e0daca9b5ef121e120e19aa825/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:f34560fb1b4c3e30ba35fa9a13894ba39e5acfc5f60f57d8accde65f46cc5e74", size = 273103, upload-time = "2025-06-09T23:01:44.166Z" }, + { url = "https://files.pythonhosted.org/packages/88/3c/c840bfa474ba3fa13c772b93070893c6e9d5c0350885760376cbe3b6c1b3/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:acd03d224b0175f5a850edc104ac19040d35419eddad04e7cf2d5986d98427f1", size = 292869, upload-time = "2025-06-09T23:01:45.681Z" }, + { url = "https://files.pythonhosted.org/packages/a6/1c/3efa6e7d5a39a1d5ef0abeb51c48fb657765794a46cf124e5aca2c7a592c/frozenlist-1.7.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f2038310bc582f3d6a09b3816ab01737d60bf7b1ec70f5356b09e84fb7408ab1", size = 291467, upload-time = "2025-06-09T23:01:47.234Z" }, + { url = "https://files.pythonhosted.org/packages/4f/00/d5c5e09d4922c395e2f2f6b79b9a20dab4b67daaf78ab92e7729341f61f6/frozenlist-1.7.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b8c05e4c8e5f36e5e088caa1bf78a687528f83c043706640a92cb76cd6999384", size = 266028, upload-time = "2025-06-09T23:01:48.819Z" }, + { url = "https://files.pythonhosted.org/packages/4e/27/72765be905619dfde25a7f33813ac0341eb6b076abede17a2e3fbfade0cb/frozenlist-1.7.0-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:765bb588c86e47d0b68f23c1bee323d4b703218037765dcf3f25c838c6fecceb", size = 284294, upload-time = "2025-06-09T23:01:50.394Z" }, + { url = "https://files.pythonhosted.org/packages/88/67/c94103a23001b17808eb7dd1200c156bb69fb68e63fcf0693dde4cd6228c/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:32dc2e08c67d86d0969714dd484fd60ff08ff81d1a1e40a77dd34a387e6ebc0c", size = 281898, upload-time = "2025-06-09T23:01:52.234Z" }, + { url = "https://files.pythonhosted.org/packages/42/34/a3e2c00c00f9e2a9db5653bca3fec306349e71aff14ae45ecc6d0951dd24/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:c0303e597eb5a5321b4de9c68e9845ac8f290d2ab3f3e2c864437d3c5a30cd65", size = 290465, upload-time = "2025-06-09T23:01:53.788Z" }, + { url = "https://files.pythonhosted.org/packages/bb/73/f89b7fbce8b0b0c095d82b008afd0590f71ccb3dee6eee41791cf8cd25fd/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:a47f2abb4e29b3a8d0b530f7c3598badc6b134562b1a5caee867f7c62fee51e3", size = 266385, upload-time = "2025-06-09T23:01:55.769Z" }, + { url = "https://files.pythonhosted.org/packages/cd/45/e365fdb554159462ca12df54bc59bfa7a9a273ecc21e99e72e597564d1ae/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:3d688126c242a6fabbd92e02633414d40f50bb6002fa4cf995a1d18051525657", size = 288771, upload-time = "2025-06-09T23:01:57.4Z" }, + { url = 
"https://files.pythonhosted.org/packages/00/11/47b6117002a0e904f004d70ec5194fe9144f117c33c851e3d51c765962d0/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:4e7e9652b3d367c7bd449a727dc79d5043f48b88d0cbfd4f9f1060cf2b414104", size = 288206, upload-time = "2025-06-09T23:01:58.936Z" }, + { url = "https://files.pythonhosted.org/packages/40/37/5f9f3c3fd7f7746082ec67bcdc204db72dad081f4f83a503d33220a92973/frozenlist-1.7.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:1a85e345b4c43db8b842cab1feb41be5cc0b10a1830e6295b69d7310f99becaf", size = 282620, upload-time = "2025-06-09T23:02:00.493Z" }, + { url = "https://files.pythonhosted.org/packages/0b/31/8fbc5af2d183bff20f21aa743b4088eac4445d2bb1cdece449ae80e4e2d1/frozenlist-1.7.0-cp313-cp313t-win32.whl", hash = "sha256:3a14027124ddb70dfcee5148979998066897e79f89f64b13328595c4bdf77c81", size = 43059, upload-time = "2025-06-09T23:02:02.072Z" }, + { url = "https://files.pythonhosted.org/packages/bb/ed/41956f52105b8dbc26e457c5705340c67c8cc2b79f394b79bffc09d0e938/frozenlist-1.7.0-cp313-cp313t-win_amd64.whl", hash = "sha256:3bf8010d71d4507775f658e9823210b7427be36625b387221642725b515dcf3e", size = 47516, upload-time = "2025-06-09T23:02:03.779Z" }, + { url = "https://files.pythonhosted.org/packages/ee/45/b82e3c16be2182bff01179db177fe144d58b5dc787a7d4492c6ed8b9317f/frozenlist-1.7.0-py3-none-any.whl", hash = "sha256:9a5af342e34f7e97caf8c995864c7a396418ae2859cc6fdf1b1073020d516a7e", size = 13106, upload-time = "2025-06-09T23:02:34.204Z" }, +] + +[[package]] +name = "fsspec" +version = "2025.3.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/45/d8/8425e6ba5fcec61a1d16e41b1b71d2bf9344f1fe48012c2b48b9620feae5/fsspec-2025.3.2.tar.gz", hash = "sha256:e52c77ef398680bbd6a98c0e628fbc469491282981209907bbc8aea76a04fdc6", size = 299281, upload-time = "2025-03-31T15:27:08.524Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/44/4b/e0cfc1a6f17e990f3e64b7d941ddc4acdc7b19d6edd51abf495f32b1a9e4/fsspec-2025.3.2-py3-none-any.whl", hash = "sha256:2daf8dc3d1dfa65b6aa37748d112773a7a08416f6c70d96b264c96476ecaf711", size = 194435, upload-time = "2025-03-31T15:27:07.028Z" }, +] + +[[package]] +name = "fuzzforge-ai" +version = "0.1.0" +source = { editable = "../ai" } +dependencies = [ + { name = "a2a-sdk" }, + { name = "agentops" }, + { name = "cognee" }, + { name = "fastmcp" }, + { name = "google-adk" }, + { name = "httpx" }, + { name = "litellm" }, + { name = "mcp" }, + { name = "python-dotenv" }, + { name = "rich" }, + { name = "typing-extensions" }, + { name = "uvicorn" }, +] + +[package.metadata] +requires-dist = [ + { name = "a2a-sdk" }, + { name = "agentops" }, + { name = "black", marker = "extra == 'dev'" }, + { name = "cognee", specifier = ">=0.3.0" }, + { name = "fastmcp" }, + { name = "google-adk" }, + { name = "httpx" }, + { name = "litellm" }, + { name = "mcp" }, + { name = "pytest", marker = "extra == 'dev'" }, + { name = "pytest-asyncio", marker = "extra == 'dev'" }, + { name = "python-dotenv" }, + { name = "rich" }, + { name = "ruff", marker = "extra == 'dev'" }, + { name = "typing-extensions" }, + { name = "uvicorn" }, +] +provides-extras = ["dev"] + +[package.metadata.requires-dev] +dev = [ + { name = "pytest" }, + { name = "pytest-asyncio" }, +] + +[[package]] +name = "fuzzforge-cli" +version = "0.1.0" +source = { editable = "." 
} +dependencies = [ + { name = "fuzzforge-ai" }, + { name = "fuzzforge-sdk" }, + { name = "httpx" }, + { name = "pydantic" }, + { name = "pyyaml" }, + { name = "rich" }, + { name = "sseclient-py" }, + { name = "typer" }, + { name = "websockets" }, +] + +[package.optional-dependencies] +dev = [ + { name = "black" }, + { name = "isort" }, + { name = "mypy" }, + { name = "pytest" }, + { name = "pytest-asyncio" }, +] + +[package.metadata] +requires-dist = [ + { name = "black", marker = "extra == 'dev'", specifier = ">=24.0.0" }, + { name = "fuzzforge-ai", editable = "../ai" }, + { name = "fuzzforge-sdk", editable = "../sdk" }, + { name = "httpx", specifier = ">=0.27.0" }, + { name = "isort", marker = "extra == 'dev'", specifier = ">=5.13.0" }, + { name = "mypy", marker = "extra == 'dev'", specifier = ">=1.11.0" }, + { name = "pydantic", specifier = ">=2.0.0" }, + { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.0.0" }, + { name = "pytest-asyncio", marker = "extra == 'dev'", specifier = ">=0.23.0" }, + { name = "pyyaml", specifier = ">=6.0.0" }, + { name = "rich", specifier = ">=13.0.0" }, + { name = "sseclient-py", specifier = ">=1.8.0" }, + { name = "typer", specifier = ">=0.12.0" }, + { name = "websockets", specifier = ">=13.0" }, +] +provides-extras = ["dev"] + +[[package]] +name = "fuzzforge-sdk" +version = "0.1.0" +source = { editable = "../sdk" } +dependencies = [ + { name = "httpx" }, + { name = "pydantic" }, + { name = "sseclient-py" }, + { name = "websockets" }, +] + +[package.metadata] +requires-dist = [ + { name = "black", marker = "extra == 'dev'", specifier = ">=24.0.0" }, + { name = "httpx", specifier = ">=0.27.0" }, + { name = "isort", marker = "extra == 'dev'", specifier = ">=5.13.0" }, + { name = "mypy", marker = "extra == 'dev'", specifier = ">=1.11.0" }, + { name = "pydantic", specifier = ">=2.0.0" }, + { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.0.0" }, + { name = "pytest-asyncio", marker = "extra == 'dev'", specifier = ">=0.23.0" }, + { name = "pytest-mock", marker = "extra == 'dev'", specifier = ">=3.14.0" }, + { name = "sseclient-py", specifier = ">=1.8.0" }, + { name = "websockets", specifier = ">=13.0" }, +] +provides-extras = ["dev"] + +[[package]] +name = "gitdb" +version = "4.0.12" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "smmap" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/72/94/63b0fc47eb32792c7ba1fe1b694daec9a63620db1e313033d18140c2320a/gitdb-4.0.12.tar.gz", hash = "sha256:5ef71f855d191a3326fcfbc0d5da835f26b13fbcba60c32c21091c349ffdb571", size = 394684, upload-time = "2025-01-02T07:20:46.413Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a0/61/5c78b91c3143ed5c14207f463aecfc8f9dbb5092fb2869baf37c273b2705/gitdb-4.0.12-py3-none-any.whl", hash = "sha256:67073e15955400952c6565cc3e707c554a4eea2e428946f7a4c162fab9bd9bcf", size = 62794, upload-time = "2025-01-02T07:20:43.624Z" }, +] + +[[package]] +name = "gitpython" +version = "3.1.45" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "gitdb" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9a/c8/dd58967d119baab745caec2f9d853297cec1989ec1d63f677d3880632b88/gitpython-3.1.45.tar.gz", hash = "sha256:85b0ee964ceddf211c41b9f27a49086010a190fd8132a24e21f362a4b36a791c", size = 215076, upload-time = "2025-07-24T03:45:54.871Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/01/61/d4b89fec821f72385526e1b9d9a3a0385dda4a72b206d28049e2c7cd39b8/gitpython-3.1.45-py3-none-any.whl", hash = "sha256:8908cb2e02fb3b93b7eb0f2827125cb699869470432cc885f019b8fd0fccff77", size = 208168, upload-time = "2025-07-24T03:45:52.517Z" }, +] + +[[package]] +name = "giturlparse" +version = "0.12.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/37/5f/543dc54c82842376139748226e5aa61eb95093992f63dd495af9c6b4f076/giturlparse-0.12.0.tar.gz", hash = "sha256:c0fff7c21acc435491b1779566e038757a205c1ffdcb47e4f81ea52ad8c3859a", size = 14907, upload-time = "2023-09-24T07:22:36.795Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/dd/94/c6ff3388b8e3225a014e55aed957188639aa0966443e0408d38f0c9614a7/giturlparse-0.12.0-py2.py3-none-any.whl", hash = "sha256:412b74f2855f1da2fefa89fd8dde62df48476077a72fc19b62039554d27360eb", size = 15752, upload-time = "2023-09-24T07:22:35.465Z" }, +] + +[[package]] +name = "google-adk" +version = "1.14.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "absolufy-imports" }, + { name = "anyio" }, + { name = "authlib" }, + { name = "click" }, + { name = "fastapi" }, + { name = "google-api-python-client" }, + { name = "google-cloud-aiplatform", extra = ["agent-engines"] }, + { name = "google-cloud-bigtable" }, + { name = "google-cloud-secret-manager" }, + { name = "google-cloud-spanner" }, + { name = "google-cloud-speech" }, + { name = "google-cloud-storage" }, + { name = "google-genai" }, + { name = "graphviz" }, + { name = "mcp" }, + { name = "opentelemetry-api" }, + { name = "opentelemetry-exporter-gcp-trace" }, + { name = "opentelemetry-sdk" }, + { name = "pydantic" }, + { name = "python-dateutil" }, + { name = "python-dotenv" }, + { name = "pyyaml" }, + { name = "requests" }, + { name = "sqlalchemy" }, + { name = "sqlalchemy-spanner" }, + { name = "starlette" }, + { name = "tenacity" }, + { name = "typing-extensions" }, + { name = "tzlocal" }, + { name = "uvicorn" }, + { name = "watchdog" }, + { name = "websockets" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/35/fe/0efba60d22bfcd7ab18f48d23771f0701664fd93be247eddc42592b9b68f/google_adk-1.14.1.tar.gz", hash = "sha256:06caab4599286123eceb9348e4accb6c3c1476b8d9b2b13f078a975c8ace966f", size = 1681879, upload-time = "2025-09-15T00:06:48.823Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/31/74/0b68fab470f13e80fd135bcf890c13bb1154804c1eaaff60dd1f5995027c/google_adk-1.14.1-py3-none-any.whl", hash = "sha256:acb31ed41d3b05b0d3a65cce76f6ef1289385f49a72164a07dae56190b648d50", size = 1922802, upload-time = "2025-09-15T00:06:47.011Z" }, +] + +[[package]] +name = "google-api-core" +version = "2.25.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-auth" }, + { name = "googleapis-common-protos" }, + { name = "proto-plus" }, + { name = "protobuf" }, + { name = "requests" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/dc/21/e9d043e88222317afdbdb567165fdbc3b0aad90064c7e0c9eb0ad9955ad8/google_api_core-2.25.1.tar.gz", hash = "sha256:d2aaa0b13c78c61cb3f4282c464c046e45fbd75755683c9c525e6e8f7ed0a5e8", size = 165443, upload-time = "2025-06-12T20:52:20.439Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/14/4b/ead00905132820b623732b175d66354e9d3e69fcf2a5dcdab780664e7896/google_api_core-2.25.1-py3-none-any.whl", hash = 
"sha256:8a2a56c1fef82987a524371f99f3bd0143702fecc670c72e600c1cda6bf8dbb7", size = 160807, upload-time = "2025-06-12T20:52:19.334Z" }, +] + +[package.optional-dependencies] +grpc = [ + { name = "grpcio" }, + { name = "grpcio-status" }, +] + +[[package]] +name = "google-api-python-client" +version = "2.182.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core" }, + { name = "google-auth" }, + { name = "google-auth-httplib2" }, + { name = "httplib2" }, + { name = "uritemplate" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6f/cb/b85b1d7d7fd520739fb70c4878f1f414043c3c34434bc90ba9d4f93366ed/google_api_python_client-2.182.0.tar.gz", hash = "sha256:cb2aa127e33c3a31e89a06f39cf9de982db90a98dee020911b21013afafad35f", size = 13599318, upload-time = "2025-09-16T21:10:57.97Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c1/29/76dabe97ebb710ca9a308f0415b2206e37d149983ec2becbf66525c52322/google_api_python_client-2.182.0-py3-none-any.whl", hash = "sha256:a9b071036d41a17991d8fbf27bedb61f2888a39ae5696cb5a326bf999b2d5209", size = 14168745, upload-time = "2025-09-16T21:10:54.657Z" }, +] + +[[package]] +name = "google-auth" +version = "2.40.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cachetools" }, + { name = "pyasn1-modules" }, + { name = "rsa" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9e/9b/e92ef23b84fa10a64ce4831390b7a4c2e53c0132568d99d4ae61d04c8855/google_auth-2.40.3.tar.gz", hash = "sha256:500c3a29adedeb36ea9cf24b8d10858e152f2412e3ca37829b3fa18e33d63b77", size = 281029, upload-time = "2025-06-04T18:04:57.577Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/17/63/b19553b658a1692443c62bd07e5868adaa0ad746a0751ba62c59568cd45b/google_auth-2.40.3-py2.py3-none-any.whl", hash = "sha256:1370d4593e86213563547f97a92752fc658456fe4514c809544f330fed45a7ca", size = 216137, upload-time = "2025-06-04T18:04:55.573Z" }, +] + +[[package]] +name = "google-auth-httplib2" +version = "0.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-auth" }, + { name = "httplib2" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/56/be/217a598a818567b28e859ff087f347475c807a5649296fb5a817c58dacef/google-auth-httplib2-0.2.0.tar.gz", hash = "sha256:38aa7badf48f974f1eb9861794e9c0cb2a0511a4ec0679b1f886d108f5640e05", size = 10842, upload-time = "2023-12-12T17:40:30.722Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/be/8a/fe34d2f3f9470a27b01c9e76226965863f153d5fbe276f83608562e49c04/google_auth_httplib2-0.2.0-py2.py3-none-any.whl", hash = "sha256:b65a0a2123300dd71281a7bf6e64d65a0759287df52729bdd1ae2e47dc311a3d", size = 9253, upload-time = "2023-12-12T17:40:13.055Z" }, +] + +[[package]] +name = "google-cloud-aiplatform" +version = "1.114.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "docstring-parser" }, + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-auth" }, + { name = "google-cloud-bigquery" }, + { name = "google-cloud-resource-manager" }, + { name = "google-cloud-storage" }, + { name = "google-genai" }, + { name = "packaging" }, + { name = "proto-plus" }, + { name = "protobuf" }, + { name = "pydantic" }, + { name = "shapely" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/3d/0e/8097231fba8e688993b0b6d371ee298ac3955cdca77fc0731799de1253ca/google_cloud_aiplatform-1.114.0.tar.gz", hash = 
"sha256:44e5e3da9b23c9316a4d9e7cd6a04258ebf84f3aadf95a725d5d1de179e2c2ce", size = 9650673, upload-time = "2025-09-16T19:47:55.12Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7a/0a/526e70e5cd8e0e96207e201721457dac020d9b8d1bd2ce7326e550b8150d/google_cloud_aiplatform-1.114.0-py2.py3-none-any.whl", hash = "sha256:87386d9364bd0bed4dd33873845afbbe251d1ed83ee25d676c3c0cea630af682", size = 8032171, upload-time = "2025-09-16T19:47:52.725Z" }, +] + +[package.optional-dependencies] +agent-engines = [ + { name = "cloudpickle" }, + { name = "google-cloud-logging" }, + { name = "google-cloud-trace" }, + { name = "opentelemetry-exporter-gcp-trace" }, + { name = "opentelemetry-sdk" }, + { name = "packaging" }, + { name = "pydantic" }, + { name = "typing-extensions" }, +] + +[[package]] +name = "google-cloud-appengine-logging" +version = "1.6.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-auth" }, + { name = "proto-plus" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/e7/ea/85da73d4f162b29d24ad591c4ce02688b44094ee5f3d6c0cc533c2b23b23/google_cloud_appengine_logging-1.6.2.tar.gz", hash = "sha256:4890928464c98da9eecc7bf4e0542eba2551512c0265462c10f3a3d2a6424b90", size = 16587, upload-time = "2025-06-11T22:38:53.525Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e4/9e/dc1fd7f838dcaf608c465171b1a25d8ce63f9987e2d5c73bda98792097a9/google_cloud_appengine_logging-1.6.2-py3-none-any.whl", hash = "sha256:2b28ed715e92b67e334c6fcfe1deb523f001919560257b25fc8fcda95fd63938", size = 16889, upload-time = "2025-06-11T22:38:52.26Z" }, +] + +[[package]] +name = "google-cloud-audit-log" +version = "0.3.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "googleapis-common-protos" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/85/af/53b4ef636e492d136b3c217e52a07bee569430dda07b8e515d5f2b701b1e/google_cloud_audit_log-0.3.2.tar.gz", hash = "sha256:2598f1533a7d7cdd6c7bf448c12e5519c1d53162d78784e10bcdd1df67791bc3", size = 33377, upload-time = "2025-03-17T11:27:59.808Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b1/74/38a70339e706b174b3c1117ad931aaa0ff0565b599869317a220d1967e1b/google_cloud_audit_log-0.3.2-py3-none-any.whl", hash = "sha256:daaedfb947a0d77f524e1bd2b560242ab4836fe1afd6b06b92f152b9658554ed", size = 32472, upload-time = "2025-03-17T11:27:58.51Z" }, +] + +[[package]] +name = "google-cloud-bigquery" +version = "3.38.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-auth" }, + { name = "google-cloud-core" }, + { name = "google-resumable-media" }, + { name = "packaging" }, + { name = "python-dateutil" }, + { name = "requests" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/07/b2/a17e40afcf9487e3d17db5e36728ffe75c8d5671c46f419d7b6528a5728a/google_cloud_bigquery-3.38.0.tar.gz", hash = "sha256:8afcb7116f5eac849097a344eb8bfda78b7cfaae128e60e019193dd483873520", size = 503666, upload-time = "2025-09-17T20:33:33.47Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/39/3c/c8cada9ec282b29232ed9aed5a0b5cca6cf5367cb2ffa8ad0d2583d743f1/google_cloud_bigquery-3.38.0-py3-none-any.whl", hash = "sha256:e06e93ff7b245b239945ef59cb59616057598d369edac457ebf292bd61984da6", size = 259257, upload-time = "2025-09-17T20:33:31.404Z" }, +] + +[[package]] +name = 
"google-cloud-bigtable" +version = "2.32.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-auth" }, + { name = "google-cloud-core" }, + { name = "google-crc32c" }, + { name = "grpc-google-iam-v1" }, + { name = "proto-plus" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/88/18/52eaef1e08b1570a56a74bb909345bfae082b6915e482df10de1fb0b341d/google_cloud_bigtable-2.32.0.tar.gz", hash = "sha256:1dcf8a9fae5801164dc184558cd8e9e930485424655faae254e2c7350fa66946", size = 746803, upload-time = "2025-08-06T17:28:54.589Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/20/89/2e3607c3c6f85954c3351078f3b891e5a2ec6dec9b964e260731818dcaec/google_cloud_bigtable-2.32.0-py3-none-any.whl", hash = "sha256:39881c36a4009703fa046337cf3259da4dd2cbcabe7b95ee5b0b0a8f19c3234e", size = 520438, upload-time = "2025-08-06T17:28:53.27Z" }, +] + +[[package]] +name = "google-cloud-core" +version = "2.4.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core" }, + { name = "google-auth" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d6/b8/2b53838d2acd6ec6168fd284a990c76695e84c65deee79c9f3a4276f6b4f/google_cloud_core-2.4.3.tar.gz", hash = "sha256:1fab62d7102844b278fe6dead3af32408b1df3eb06f5c7e8634cbd40edc4da53", size = 35861, upload-time = "2025-03-10T21:05:38.948Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/40/86/bda7241a8da2d28a754aad2ba0f6776e35b67e37c36ae0c45d49370f1014/google_cloud_core-2.4.3-py2.py3-none-any.whl", hash = "sha256:5130f9f4c14b4fafdff75c79448f9495cfade0d8775facf1b09c3bf67e027f6e", size = 29348, upload-time = "2025-03-10T21:05:37.785Z" }, +] + +[[package]] +name = "google-cloud-logging" +version = "3.12.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-auth" }, + { name = "google-cloud-appengine-logging" }, + { name = "google-cloud-audit-log" }, + { name = "google-cloud-core" }, + { name = "grpc-google-iam-v1" }, + { name = "opentelemetry-api" }, + { name = "proto-plus" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/14/9c/d42ecc94f795a6545930e5f846a7ae59ff685ded8bc086648dd2bee31a1a/google_cloud_logging-3.12.1.tar.gz", hash = "sha256:36efc823985055b203904e83e1c8f9f999b3c64270bcda39d57386ca4effd678", size = 289569, upload-time = "2025-04-22T20:50:24.71Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b1/41/f8a3197d39b773a91f335dee36c92ef26a8ec96efe78d64baad89d367df4/google_cloud_logging-3.12.1-py2.py3-none-any.whl", hash = "sha256:6817878af76ec4e7568976772839ab2c43ddfd18fbbf2ce32b13ef549cd5a862", size = 229466, upload-time = "2025-04-22T20:50:23.294Z" }, +] + +[[package]] +name = "google-cloud-resource-manager" +version = "1.14.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-auth" }, + { name = "grpc-google-iam-v1" }, + { name = "proto-plus" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6e/ca/a4648f5038cb94af4b3942815942a03aa9398f9fb0bef55b3f1585b9940d/google_cloud_resource_manager-1.14.2.tar.gz", hash = "sha256:962e2d904c550d7bac48372607904ff7bb3277e3bb4a36d80cc9a37e28e6eb74", size = 446370, upload-time = "2025-03-17T11:35:56.343Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/b1/ea/a92631c358da377af34d3a9682c97af83185c2d66363d5939ab4a1169a7f/google_cloud_resource_manager-1.14.2-py3-none-any.whl", hash = "sha256:d0fa954dedd1d2b8e13feae9099c01b8aac515b648e612834f9942d2795a9900", size = 394344, upload-time = "2025-03-17T11:35:54.722Z" }, +] + +[[package]] +name = "google-cloud-secret-manager" +version = "2.24.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-auth" }, + { name = "grpc-google-iam-v1" }, + { name = "proto-plus" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/58/7a/2fa6735ec693d822fe08a76709c4d95d9b5b4c02e83e720497355039d2ee/google_cloud_secret_manager-2.24.0.tar.gz", hash = "sha256:ce573d40ffc2fb7d01719243a94ee17aa243ea642a6ae6c337501e58fbf642b5", size = 269516, upload-time = "2025-06-05T22:22:22.965Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/be/af/db1217cae1809e69a4527ee6293b82a9af2a1fb2313ad110c775e8f3c820/google_cloud_secret_manager-2.24.0-py3-none-any.whl", hash = "sha256:9bea1254827ecc14874bc86c63b899489f8f50bfe1442bfb2517530b30b3a89b", size = 218050, upload-time = "2025-06-10T02:02:19.88Z" }, +] + +[[package]] +name = "google-cloud-spanner" +version = "3.57.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-cloud-core" }, + { name = "grpc-google-iam-v1" }, + { name = "grpc-interceptor" }, + { name = "proto-plus" }, + { name = "protobuf" }, + { name = "sqlparse" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5e/e8/e008f9ffa2dcf596718d2533d96924735110378853c55f730d2527a19e04/google_cloud_spanner-3.57.0.tar.gz", hash = "sha256:73f52f58617449fcff7073274a7f7a798f4f7b2788eda26de3b7f98ad857ab99", size = 701574, upload-time = "2025-08-14T15:24:59.18Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3a/9f/66fe9118bc0e593b65ade612775e397f596b0bcd75daa3ea63dbe1020f95/google_cloud_spanner-3.57.0-py3-none-any.whl", hash = "sha256:5b10b40bc646091f1b4cbb2e7e2e82ec66bcce52c7105f86b65070d34d6df86f", size = 501380, upload-time = "2025-08-14T15:24:57.683Z" }, +] + +[[package]] +name = "google-cloud-speech" +version = "2.33.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-auth" }, + { name = "proto-plus" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9a/74/9c5a556f8af19cab461058aa15e1409e7afa453ca2383473a24a12801ef7/google_cloud_speech-2.33.0.tar.gz", hash = "sha256:fd08511b5124fdaa768d71a4054e84a5d8eb02531cb6f84f311c0387ea1314ed", size = 389072, upload-time = "2025-06-11T23:56:37.231Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/12/1d/880342b2541b4bad888ad8ab2ac77d4b5dad25b32a2a1c5f21140c14c8e3/google_cloud_speech-2.33.0-py3-none-any.whl", hash = "sha256:4ba16c8517c24a6abcde877289b0f40b719090504bf06b1adea248198ccd50a5", size = 335681, upload-time = "2025-06-11T23:56:36.026Z" }, +] + +[[package]] +name = "google-cloud-storage" +version = "2.19.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core" }, + { name = "google-auth" }, + { name = "google-cloud-core" }, + { name = "google-crc32c" }, + { name = "google-resumable-media" }, + { name = "requests" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/36/76/4d965702e96bb67976e755bed9828fa50306dca003dbee08b67f41dd265e/google_cloud_storage-2.19.0.tar.gz", hash = "sha256:cd05e9e7191ba6cb68934d8eb76054d9be4562aa89dbc4236feee4d7d51342b2", size = 5535488, upload-time = "2024-12-05T01:35:06.49Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d5/94/6db383d8ee1adf45dc6c73477152b82731fa4c4a46d9c1932cc8757e0fd4/google_cloud_storage-2.19.0-py2.py3-none-any.whl", hash = "sha256:aeb971b5c29cf8ab98445082cbfe7b161a1f48ed275822f59ed3f1524ea54fba", size = 131787, upload-time = "2024-12-05T01:35:04.736Z" }, +] + +[[package]] +name = "google-cloud-trace" +version = "1.16.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core", extra = ["grpc"] }, + { name = "google-auth" }, + { name = "proto-plus" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c5/ea/0e42e2196fb2bc8c7b25f081a0b46b5053d160b34d5322e7eac2d5f7a742/google_cloud_trace-1.16.2.tar.gz", hash = "sha256:89bef223a512465951eb49335be6d60bee0396d576602dbf56368439d303cab4", size = 97826, upload-time = "2025-06-12T00:53:02.12Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/08/96/7a8d271e91effa9ccc2fd7cfd5cf287a2d7900080a475477c2ac0c7a331d/google_cloud_trace-1.16.2-py3-none-any.whl", hash = "sha256:40fb74607752e4ee0f3d7e5fc6b8f6eb1803982254a1507ba918172484131456", size = 103755, upload-time = "2025-06-12T00:53:00.672Z" }, +] + +[[package]] +name = "google-crc32c" +version = "1.7.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/19/ae/87802e6d9f9d69adfaedfcfd599266bf386a54d0be058b532d04c794f76d/google_crc32c-1.7.1.tar.gz", hash = "sha256:2bff2305f98846f3e825dbeec9ee406f89da7962accdb29356e4eadc251bd472", size = 14495, upload-time = "2025-03-26T14:29:13.32Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f7/94/220139ea87822b6fdfdab4fb9ba81b3fff7ea2c82e2af34adc726085bffc/google_crc32c-1.7.1-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:6fbab4b935989e2c3610371963ba1b86afb09537fd0c633049be82afe153ac06", size = 30468, upload-time = "2025-03-26T14:32:52.215Z" }, + { url = "https://files.pythonhosted.org/packages/94/97/789b23bdeeb9d15dc2904660463ad539d0318286d7633fe2760c10ed0c1c/google_crc32c-1.7.1-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:ed66cbe1ed9cbaaad9392b5259b3eba4a9e565420d734e6238813c428c3336c9", size = 30313, upload-time = "2025-03-26T14:57:38.758Z" }, + { url = "https://files.pythonhosted.org/packages/81/b8/976a2b843610c211e7ccb3e248996a61e87dbb2c09b1499847e295080aec/google_crc32c-1.7.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ee6547b657621b6cbed3562ea7826c3e11cab01cd33b74e1f677690652883e77", size = 33048, upload-time = "2025-03-26T14:41:30.679Z" }, + { url = "https://files.pythonhosted.org/packages/c9/16/a3842c2cf591093b111d4a5e2bfb478ac6692d02f1b386d2a33283a19dc9/google_crc32c-1.7.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d68e17bad8f7dd9a49181a1f5a8f4b251c6dbc8cc96fb79f1d321dfd57d66f53", size = 32669, upload-time = "2025-03-26T14:41:31.432Z" }, + { url = "https://files.pythonhosted.org/packages/04/17/ed9aba495916fcf5fe4ecb2267ceb851fc5f273c4e4625ae453350cfd564/google_crc32c-1.7.1-cp311-cp311-win_amd64.whl", hash = "sha256:6335de12921f06e1f774d0dd1fbea6bf610abe0887a1638f64d694013138be5d", size = 33476, upload-time = "2025-03-26T14:29:10.211Z" }, + { url = 
"https://files.pythonhosted.org/packages/dd/b7/787e2453cf8639c94b3d06c9d61f512234a82e1d12d13d18584bd3049904/google_crc32c-1.7.1-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:2d73a68a653c57281401871dd4aeebbb6af3191dcac751a76ce430df4d403194", size = 30470, upload-time = "2025-03-26T14:34:31.655Z" }, + { url = "https://files.pythonhosted.org/packages/ed/b4/6042c2b0cbac3ec3a69bb4c49b28d2f517b7a0f4a0232603c42c58e22b44/google_crc32c-1.7.1-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:22beacf83baaf59f9d3ab2bbb4db0fb018da8e5aebdce07ef9f09fce8220285e", size = 30315, upload-time = "2025-03-26T15:01:54.634Z" }, + { url = "https://files.pythonhosted.org/packages/29/ad/01e7a61a5d059bc57b702d9ff6a18b2585ad97f720bd0a0dbe215df1ab0e/google_crc32c-1.7.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19eafa0e4af11b0a4eb3974483d55d2d77ad1911e6cf6f832e1574f6781fd337", size = 33180, upload-time = "2025-03-26T14:41:32.168Z" }, + { url = "https://files.pythonhosted.org/packages/3b/a5/7279055cf004561894ed3a7bfdf5bf90a53f28fadd01af7cd166e88ddf16/google_crc32c-1.7.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b6d86616faaea68101195c6bdc40c494e4d76f41e07a37ffdef270879c15fb65", size = 32794, upload-time = "2025-03-26T14:41:33.264Z" }, + { url = "https://files.pythonhosted.org/packages/0f/d6/77060dbd140c624e42ae3ece3df53b9d811000729a5c821b9fd671ceaac6/google_crc32c-1.7.1-cp312-cp312-win_amd64.whl", hash = "sha256:b7491bdc0c7564fcf48c0179d2048ab2f7c7ba36b84ccd3a3e1c3f7a72d3bba6", size = 33477, upload-time = "2025-03-26T14:29:10.94Z" }, + { url = "https://files.pythonhosted.org/packages/8b/72/b8d785e9184ba6297a8620c8a37cf6e39b81a8ca01bb0796d7cbb28b3386/google_crc32c-1.7.1-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:df8b38bdaf1629d62d51be8bdd04888f37c451564c2042d36e5812da9eff3c35", size = 30467, upload-time = "2025-03-26T14:36:06.909Z" }, + { url = "https://files.pythonhosted.org/packages/34/25/5f18076968212067c4e8ea95bf3b69669f9fc698476e5f5eb97d5b37999f/google_crc32c-1.7.1-cp313-cp313-macosx_12_0_x86_64.whl", hash = "sha256:e42e20a83a29aa2709a0cf271c7f8aefaa23b7ab52e53b322585297bb94d4638", size = 30309, upload-time = "2025-03-26T15:06:15.318Z" }, + { url = "https://files.pythonhosted.org/packages/92/83/9228fe65bf70e93e419f38bdf6c5ca5083fc6d32886ee79b450ceefd1dbd/google_crc32c-1.7.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:905a385140bf492ac300026717af339790921f411c0dfd9aa5a9e69a08ed32eb", size = 33133, upload-time = "2025-03-26T14:41:34.388Z" }, + { url = "https://files.pythonhosted.org/packages/c3/ca/1ea2fd13ff9f8955b85e7956872fdb7050c4ace8a2306a6d177edb9cf7fe/google_crc32c-1.7.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b211ddaf20f7ebeec5c333448582c224a7c90a9d98826fbab82c0ddc11348e6", size = 32773, upload-time = "2025-03-26T14:41:35.19Z" }, + { url = "https://files.pythonhosted.org/packages/89/32/a22a281806e3ef21b72db16f948cad22ec68e4bdd384139291e00ff82fe2/google_crc32c-1.7.1-cp313-cp313-win_amd64.whl", hash = "sha256:0f99eaa09a9a7e642a61e06742856eec8b19fc0037832e03f941fe7cf0c8e4db", size = 33475, upload-time = "2025-03-26T14:29:11.771Z" }, + { url = "https://files.pythonhosted.org/packages/b8/c5/002975aff514e57fc084ba155697a049b3f9b52225ec3bc0f542871dd524/google_crc32c-1.7.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:32d1da0d74ec5634a05f53ef7df18fc646666a25efaaca9fc7dcfd4caf1d98c3", size = 33243, upload-time = 
"2025-03-26T14:41:35.975Z" }, + { url = "https://files.pythonhosted.org/packages/61/cb/c585282a03a0cea70fcaa1bf55d5d702d0f2351094d663ec3be1c6c67c52/google_crc32c-1.7.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e10554d4abc5238823112c2ad7e4560f96c7bf3820b202660373d769d9e6e4c9", size = 32870, upload-time = "2025-03-26T14:41:37.08Z" }, + { url = "https://files.pythonhosted.org/packages/16/1b/1693372bf423ada422f80fd88260dbfd140754adb15cbc4d7e9a68b1cb8e/google_crc32c-1.7.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85fef7fae11494e747c9fd1359a527e5970fc9603c90764843caabd3a16a0a48", size = 28241, upload-time = "2025-03-26T14:41:45.898Z" }, + { url = "https://files.pythonhosted.org/packages/fd/3c/2a19a60a473de48717b4efb19398c3f914795b64a96cf3fbe82588044f78/google_crc32c-1.7.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6efb97eb4369d52593ad6f75e7e10d053cf00c48983f7a973105bc70b0ac4d82", size = 28048, upload-time = "2025-03-26T14:41:46.696Z" }, +] + +[[package]] +name = "google-genai" +version = "1.38.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "google-auth" }, + { name = "httpx" }, + { name = "pydantic" }, + { name = "requests" }, + { name = "tenacity" }, + { name = "typing-extensions" }, + { name = "websockets" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b4/11/108ddd3aca8af6a9e2369e59b9646a3a4c64aefb39d154f6467ab8d79f34/google_genai-1.38.0.tar.gz", hash = "sha256:363272fc4f677d0be6a1aed7ebabe8adf45e1626a7011a7886a587e9464ca9ec", size = 244903, upload-time = "2025-09-16T23:25:42.577Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/53/6c/1de711bab3c118284904c3bedf870519e8c63a7a8e0905ac3833f1db9cbc/google_genai-1.38.0-py3-none-any.whl", hash = "sha256:95407425132d42b3fa11bc92b3f5cf61a0fbd8d9add1f0e89aac52c46fbba090", size = 245558, upload-time = "2025-09-16T23:25:41.141Z" }, +] + +[[package]] +name = "google-resumable-media" +version = "2.7.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-crc32c" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/58/5a/0efdc02665dca14e0837b62c8a1a93132c264bd02054a15abb2218afe0ae/google_resumable_media-2.7.2.tar.gz", hash = "sha256:5280aed4629f2b60b847b0d42f9857fd4935c11af266744df33d8074cae92fe0", size = 2163099, upload-time = "2024-08-07T22:20:38.555Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/82/35/b8d3baf8c46695858cb9d8835a53baa1eeb9906ddaf2f728a5f5b640fd1e/google_resumable_media-2.7.2-py2.py3-none-any.whl", hash = "sha256:3ce7551e9fe6d99e9a126101d2536612bb73486721951e9562fee0f90c6ababa", size = 81251, upload-time = "2024-08-07T22:20:36.409Z" }, +] + +[[package]] +name = "googleapis-common-protos" +version = "1.70.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/39/24/33db22342cf4a2ea27c9955e6713140fedd51e8b141b5ce5260897020f1a/googleapis_common_protos-1.70.0.tar.gz", hash = "sha256:0e1b44e0ea153e6594f9f394fef15193a68aaaea2d843f83e2742717ca753257", size = 145903, upload-time = "2025-04-14T10:17:02.924Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/86/f1/62a193f0227cf15a920390abe675f386dec35f7ae3ffe6da582d3ade42c7/googleapis_common_protos-1.70.0-py3-none-any.whl", hash = "sha256:b8bfcca8c25a2bb253e0e0b0adaf8c00773e5e6af6fd92397576680b807e0fd8", size = 
294530, upload-time = "2025-04-14T10:17:01.271Z" }, +] + +[package.optional-dependencies] +grpc = [ + { name = "grpcio" }, +] + +[[package]] +name = "graphviz" +version = "0.21" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f8/b3/3ac91e9be6b761a4b30d66ff165e54439dcd48b83f4e20d644867215f6ca/graphviz-0.21.tar.gz", hash = "sha256:20743e7183be82aaaa8ad6c93f8893c923bd6658a04c32ee115edb3c8a835f78", size = 200434, upload-time = "2025-06-15T09:35:05.824Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/91/4c/e0ce1ef95d4000ebc1c11801f9b944fa5910ecc15b5e351865763d8657f8/graphviz-0.21-py3-none-any.whl", hash = "sha256:54f33de9f4f911d7e84e4191749cac8cc5653f815b06738c54db9a15ab8b1e42", size = 47300, upload-time = "2025-06-15T09:35:04.433Z" }, +] + +[[package]] +name = "greenlet" +version = "3.2.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/03/b8/704d753a5a45507a7aab61f18db9509302ed3d0a27ac7e0359ec2905b1a6/greenlet-3.2.4.tar.gz", hash = "sha256:0dca0d95ff849f9a364385f36ab49f50065d76964944638be9691e1832e9f86d", size = 188260, upload-time = "2025-08-07T13:24:33.51Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a4/de/f28ced0a67749cac23fecb02b694f6473f47686dff6afaa211d186e2ef9c/greenlet-3.2.4-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:96378df1de302bc38e99c3a9aa311967b7dc80ced1dcc6f171e99842987882a2", size = 272305, upload-time = "2025-08-07T13:15:41.288Z" }, + { url = "https://files.pythonhosted.org/packages/09/16/2c3792cba130000bf2a31c5272999113f4764fd9d874fb257ff588ac779a/greenlet-3.2.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1ee8fae0519a337f2329cb78bd7a8e128ec0f881073d43f023c7b8d4831d5246", size = 632472, upload-time = "2025-08-07T13:42:55.044Z" }, + { url = "https://files.pythonhosted.org/packages/ae/8f/95d48d7e3d433e6dae5b1682e4292242a53f22df82e6d3dda81b1701a960/greenlet-3.2.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:94abf90142c2a18151632371140b3dba4dee031633fe614cb592dbb6c9e17bc3", size = 644646, upload-time = "2025-08-07T13:45:26.523Z" }, + { url = "https://files.pythonhosted.org/packages/d5/5e/405965351aef8c76b8ef7ad370e5da58d57ef6068df197548b015464001a/greenlet-3.2.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:4d1378601b85e2e5171b99be8d2dc85f594c79967599328f95c1dc1a40f1c633", size = 640519, upload-time = "2025-08-07T13:53:13.928Z" }, + { url = "https://files.pythonhosted.org/packages/25/5d/382753b52006ce0218297ec1b628e048c4e64b155379331f25a7316eb749/greenlet-3.2.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:0db5594dce18db94f7d1650d7489909b57afde4c580806b8d9203b6e79cdc079", size = 639707, upload-time = "2025-08-07T13:18:27.146Z" }, + { url = "https://files.pythonhosted.org/packages/1f/8e/abdd3f14d735b2929290a018ecf133c901be4874b858dd1c604b9319f064/greenlet-3.2.4-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2523e5246274f54fdadbce8494458a2ebdcdbc7b802318466ac5606d3cded1f8", size = 587684, upload-time = "2025-08-07T13:18:25.164Z" }, + { url = "https://files.pythonhosted.org/packages/5d/65/deb2a69c3e5996439b0176f6651e0052542bb6c8f8ec2e3fba97c9768805/greenlet-3.2.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1987de92fec508535687fb807a5cea1560f6196285a4cde35c100b8cd632cc52", size = 1116647, upload-time = "2025-08-07T13:42:38.655Z" }, + { url = 
"https://files.pythonhosted.org/packages/3f/cc/b07000438a29ac5cfb2194bfc128151d52f333cee74dd7dfe3fb733fc16c/greenlet-3.2.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:55e9c5affaa6775e2c6b67659f3a71684de4c549b3dd9afca3bc773533d284fa", size = 1142073, upload-time = "2025-08-07T13:18:21.737Z" }, + { url = "https://files.pythonhosted.org/packages/d8/0f/30aef242fcab550b0b3520b8e3561156857c94288f0332a79928c31a52cf/greenlet-3.2.4-cp311-cp311-win_amd64.whl", hash = "sha256:9c40adce87eaa9ddb593ccb0fa6a07caf34015a29bf8d344811665b573138db9", size = 299100, upload-time = "2025-08-07T13:44:12.287Z" }, + { url = "https://files.pythonhosted.org/packages/44/69/9b804adb5fd0671f367781560eb5eb586c4d495277c93bde4307b9e28068/greenlet-3.2.4-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:3b67ca49f54cede0186854a008109d6ee71f66bd57bb36abd6d0a0267b540cdd", size = 274079, upload-time = "2025-08-07T13:15:45.033Z" }, + { url = "https://files.pythonhosted.org/packages/46/e9/d2a80c99f19a153eff70bc451ab78615583b8dac0754cfb942223d2c1a0d/greenlet-3.2.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddf9164e7a5b08e9d22511526865780a576f19ddd00d62f8a665949327fde8bb", size = 640997, upload-time = "2025-08-07T13:42:56.234Z" }, + { url = "https://files.pythonhosted.org/packages/3b/16/035dcfcc48715ccd345f3a93183267167cdd162ad123cd93067d86f27ce4/greenlet-3.2.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f28588772bb5fb869a8eb331374ec06f24a83a9c25bfa1f38b6993afe9c1e968", size = 655185, upload-time = "2025-08-07T13:45:27.624Z" }, + { url = "https://files.pythonhosted.org/packages/31/da/0386695eef69ffae1ad726881571dfe28b41970173947e7c558d9998de0f/greenlet-3.2.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:5c9320971821a7cb77cfab8d956fa8e39cd07ca44b6070db358ceb7f8797c8c9", size = 649926, upload-time = "2025-08-07T13:53:15.251Z" }, + { url = "https://files.pythonhosted.org/packages/68/88/69bf19fd4dc19981928ceacbc5fd4bb6bc2215d53199e367832e98d1d8fe/greenlet-3.2.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c60a6d84229b271d44b70fb6e5fa23781abb5d742af7b808ae3f6efd7c9c60f6", size = 651839, upload-time = "2025-08-07T13:18:30.281Z" }, + { url = "https://files.pythonhosted.org/packages/19/0d/6660d55f7373b2ff8152401a83e02084956da23ae58cddbfb0b330978fe9/greenlet-3.2.4-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b3812d8d0c9579967815af437d96623f45c0f2ae5f04e366de62a12d83a8fb0", size = 607586, upload-time = "2025-08-07T13:18:28.544Z" }, + { url = "https://files.pythonhosted.org/packages/8e/1a/c953fdedd22d81ee4629afbb38d2f9d71e37d23caace44775a3a969147d4/greenlet-3.2.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:abbf57b5a870d30c4675928c37278493044d7c14378350b3aa5d484fa65575f0", size = 1123281, upload-time = "2025-08-07T13:42:39.858Z" }, + { url = "https://files.pythonhosted.org/packages/3f/c7/12381b18e21aef2c6bd3a636da1088b888b97b7a0362fac2e4de92405f97/greenlet-3.2.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:20fb936b4652b6e307b8f347665e2c615540d4b42b3b4c8a321d8286da7e520f", size = 1151142, upload-time = "2025-08-07T13:18:22.981Z" }, + { url = "https://files.pythonhosted.org/packages/e9/08/b0814846b79399e585f974bbeebf5580fbe59e258ea7be64d9dfb253c84f/greenlet-3.2.4-cp312-cp312-win_amd64.whl", hash = "sha256:a7d4e128405eea3814a12cc2605e0e6aedb4035bf32697f72deca74de4105e02", size = 299899, upload-time = "2025-08-07T13:38:53.448Z" }, + { url = 
"https://files.pythonhosted.org/packages/49/e8/58c7f85958bda41dafea50497cbd59738c5c43dbbea5ee83d651234398f4/greenlet-3.2.4-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:1a921e542453fe531144e91e1feedf12e07351b1cf6c9e8a3325ea600a715a31", size = 272814, upload-time = "2025-08-07T13:15:50.011Z" }, + { url = "https://files.pythonhosted.org/packages/62/dd/b9f59862e9e257a16e4e610480cfffd29e3fae018a68c2332090b53aac3d/greenlet-3.2.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cd3c8e693bff0fff6ba55f140bf390fa92c994083f838fece0f63be121334945", size = 641073, upload-time = "2025-08-07T13:42:57.23Z" }, + { url = "https://files.pythonhosted.org/packages/f7/0b/bc13f787394920b23073ca3b6c4a7a21396301ed75a655bcb47196b50e6e/greenlet-3.2.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:710638eb93b1fa52823aa91bf75326f9ecdfd5e0466f00789246a5280f4ba0fc", size = 655191, upload-time = "2025-08-07T13:45:29.752Z" }, + { url = "https://files.pythonhosted.org/packages/f2/d6/6adde57d1345a8d0f14d31e4ab9c23cfe8e2cd39c3baf7674b4b0338d266/greenlet-3.2.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:c5111ccdc9c88f423426df3fd1811bfc40ed66264d35aa373420a34377efc98a", size = 649516, upload-time = "2025-08-07T13:53:16.314Z" }, + { url = "https://files.pythonhosted.org/packages/7f/3b/3a3328a788d4a473889a2d403199932be55b1b0060f4ddd96ee7cdfcad10/greenlet-3.2.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d76383238584e9711e20ebe14db6c88ddcedc1829a9ad31a584389463b5aa504", size = 652169, upload-time = "2025-08-07T13:18:32.861Z" }, + { url = "https://files.pythonhosted.org/packages/ee/43/3cecdc0349359e1a527cbf2e3e28e5f8f06d3343aaf82ca13437a9aa290f/greenlet-3.2.4-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:23768528f2911bcd7e475210822ffb5254ed10d71f4028387e5a99b4c6699671", size = 610497, upload-time = "2025-08-07T13:18:31.636Z" }, + { url = "https://files.pythonhosted.org/packages/b8/19/06b6cf5d604e2c382a6f31cafafd6f33d5dea706f4db7bdab184bad2b21d/greenlet-3.2.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:00fadb3fedccc447f517ee0d3fd8fe49eae949e1cd0f6a611818f4f6fb7dc83b", size = 1121662, upload-time = "2025-08-07T13:42:41.117Z" }, + { url = "https://files.pythonhosted.org/packages/a2/15/0d5e4e1a66fab130d98168fe984c509249c833c1a3c16806b90f253ce7b9/greenlet-3.2.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d25c5091190f2dc0eaa3f950252122edbbadbb682aa7b1ef2f8af0f8c0afefae", size = 1149210, upload-time = "2025-08-07T13:18:24.072Z" }, + { url = "https://files.pythonhosted.org/packages/0b/55/2321e43595e6801e105fcfdee02b34c0f996eb71e6ddffca6b10b7e1d771/greenlet-3.2.4-cp313-cp313-win_amd64.whl", hash = "sha256:554b03b6e73aaabec3745364d6239e9e012d64c68ccd0b8430c64ccc14939a8b", size = 299685, upload-time = "2025-08-07T13:24:38.824Z" }, + { url = "https://files.pythonhosted.org/packages/22/5c/85273fd7cc388285632b0498dbbab97596e04b154933dfe0f3e68156c68c/greenlet-3.2.4-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:49a30d5fda2507ae77be16479bdb62a660fa51b1eb4928b524975b3bde77b3c0", size = 273586, upload-time = "2025-08-07T13:16:08.004Z" }, + { url = "https://files.pythonhosted.org/packages/d1/75/10aeeaa3da9332c2e761e4c50d4c3556c21113ee3f0afa2cf5769946f7a3/greenlet-3.2.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:299fd615cd8fc86267b47597123e3f43ad79c9d8a22bebdce535e53550763e2f", size = 686346, upload-time = 
"2025-08-07T13:42:59.944Z" }, + { url = "https://files.pythonhosted.org/packages/c0/aa/687d6b12ffb505a4447567d1f3abea23bd20e73a5bed63871178e0831b7a/greenlet-3.2.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:c17b6b34111ea72fc5a4e4beec9711d2226285f0386ea83477cbb97c30a3f3a5", size = 699218, upload-time = "2025-08-07T13:45:30.969Z" }, + { url = "https://files.pythonhosted.org/packages/dc/8b/29aae55436521f1d6f8ff4e12fb676f3400de7fcf27fccd1d4d17fd8fecd/greenlet-3.2.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b4a1870c51720687af7fa3e7cda6d08d801dae660f75a76f3845b642b4da6ee1", size = 694659, upload-time = "2025-08-07T13:53:17.759Z" }, + { url = "https://files.pythonhosted.org/packages/92/2e/ea25914b1ebfde93b6fc4ff46d6864564fba59024e928bdc7de475affc25/greenlet-3.2.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:061dc4cf2c34852b052a8620d40f36324554bc192be474b9e9770e8c042fd735", size = 695355, upload-time = "2025-08-07T13:18:34.517Z" }, + { url = "https://files.pythonhosted.org/packages/72/60/fc56c62046ec17f6b0d3060564562c64c862948c9d4bc8aa807cf5bd74f4/greenlet-3.2.4-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44358b9bf66c8576a9f57a590d5f5d6e72fa4228b763d0e43fee6d3b06d3a337", size = 657512, upload-time = "2025-08-07T13:18:33.969Z" }, + { url = "https://files.pythonhosted.org/packages/e3/a5/6ddab2b4c112be95601c13428db1d8b6608a8b6039816f2ba09c346c08fc/greenlet-3.2.4-cp314-cp314-win_amd64.whl", hash = "sha256:e37ab26028f12dbb0ff65f29a8d3d44a765c61e729647bf2ddfbbed621726f01", size = 303425, upload-time = "2025-08-07T13:32:27.59Z" }, +] + +[[package]] +name = "grpc-google-iam-v1" +version = "0.14.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "googleapis-common-protos", extra = ["grpc"] }, + { name = "grpcio" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b9/4e/8d0ca3b035e41fe0b3f31ebbb638356af720335e5a11154c330169b40777/grpc_google_iam_v1-0.14.2.tar.gz", hash = "sha256:b3e1fc387a1a329e41672197d0ace9de22c78dd7d215048c4c78712073f7bd20", size = 16259, upload-time = "2025-03-17T11:40:23.586Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/66/6f/dd9b178aee7835b96c2e63715aba6516a9d50f6bebbd1cc1d32c82a2a6c3/grpc_google_iam_v1-0.14.2-py3-none-any.whl", hash = "sha256:a3171468459770907926d56a440b2bb643eec1d7ba215f48f3ecece42b4d8351", size = 19242, upload-time = "2025-03-17T11:40:22.648Z" }, +] + +[[package]] +name = "grpc-interceptor" +version = "0.15.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "grpcio" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9f/28/57449d5567adf4c1d3e216aaca545913fbc21a915f2da6790d6734aac76e/grpc-interceptor-0.15.4.tar.gz", hash = "sha256:1f45c0bcb58b6f332f37c637632247c9b02bc6af0fdceb7ba7ce8d2ebbfb0926", size = 19322, upload-time = "2023-11-16T02:05:42.459Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/15/ac/8d53f230a7443401ce81791ec50a3b0e54924bf615ad287654fa4a2f5cdc/grpc_interceptor-0.15.4-py3-none-any.whl", hash = "sha256:0035f33228693ed3767ee49d937bac424318db173fef4d2d0170b3215f254d9d", size = 20848, upload-time = "2023-11-16T02:05:40.913Z" }, +] + +[[package]] +name = "grpcio" +version = "1.75.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/91/88/fe2844eefd3d2188bc0d7a2768c6375b46dfd96469ea52d8aeee8587d7e0/grpcio-1.75.0.tar.gz", hash = "sha256:b989e8b09489478c2d19fecc744a298930f40d8b27c3638afbfe84d22f36ce4e", size = 12722485, upload-time = "2025-09-16T09:20:21.731Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/95/b7/a6f42596fc367656970f5811e5d2d9912ca937aa90621d5468a11680ef47/grpcio-1.75.0-cp311-cp311-linux_armv7l.whl", hash = "sha256:7f89d6d0cd43170a80ebb4605cad54c7d462d21dc054f47688912e8bf08164af", size = 5699769, upload-time = "2025-09-16T09:18:32.536Z" }, + { url = "https://files.pythonhosted.org/packages/c2/42/284c463a311cd2c5f804fd4fdbd418805460bd5d702359148dd062c1685d/grpcio-1.75.0-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:cb6c5b075c2d092f81138646a755f0dad94e4622300ebef089f94e6308155d82", size = 11480362, upload-time = "2025-09-16T09:18:35.562Z" }, + { url = "https://files.pythonhosted.org/packages/0b/10/60d54d5a03062c3ae91bddb6e3acefe71264307a419885f453526d9203ff/grpcio-1.75.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:494dcbade5606128cb9f530ce00331a90ecf5e7c5b243d373aebdb18e503c346", size = 6284753, upload-time = "2025-09-16T09:18:38.055Z" }, + { url = "https://files.pythonhosted.org/packages/cf/af/381a4bfb04de5e2527819452583e694df075c7a931e9bf1b2a603b593ab2/grpcio-1.75.0-cp311-cp311-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:050760fd29c8508844a720f06c5827bb00de8f5e02f58587eb21a4444ad706e5", size = 6944103, upload-time = "2025-09-16T09:18:40.844Z" }, + { url = "https://files.pythonhosted.org/packages/16/18/c80dd7e1828bd6700ce242c1616871927eef933ed0c2cee5c636a880e47b/grpcio-1.75.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:266fa6209b68a537b2728bb2552f970e7e78c77fe43c6e9cbbe1f476e9e5c35f", size = 6464036, upload-time = "2025-09-16T09:18:43.351Z" }, + { url = "https://files.pythonhosted.org/packages/79/3f/78520c7ed9ccea16d402530bc87958bbeb48c42a2ec8032738a7864d38f8/grpcio-1.75.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:06d22e1d8645e37bc110f4c589cb22c283fd3de76523065f821d6e81de33f5d4", size = 7097455, upload-time = "2025-09-16T09:18:45.465Z" }, + { url = "https://files.pythonhosted.org/packages/ad/69/3cebe4901a865eb07aefc3ee03a02a632e152e9198dadf482a7faf926f31/grpcio-1.75.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9880c323595d851292785966cadb6c708100b34b163cab114e3933f5773cba2d", size = 8037203, upload-time = "2025-09-16T09:18:47.878Z" }, + { url = "https://files.pythonhosted.org/packages/04/ed/1e483d1eba5032642c10caf28acf07ca8de0508244648947764956db346a/grpcio-1.75.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:55a2d5ae79cd0f68783fb6ec95509be23746e3c239290b2ee69c69a38daa961a", size = 7492085, upload-time = "2025-09-16T09:18:50.907Z" }, + { url = "https://files.pythonhosted.org/packages/ee/65/6ef676aa7dbd9578dfca990bb44d41a49a1e36344ca7d79de6b59733ba96/grpcio-1.75.0-cp311-cp311-win32.whl", hash = "sha256:352dbdf25495eef584c8de809db280582093bc3961d95a9d78f0dfb7274023a2", size = 3944697, upload-time = "2025-09-16T09:18:53.427Z" }, + { url = "https://files.pythonhosted.org/packages/0d/83/b753373098b81ec5cb01f71c21dfd7aafb5eb48a1566d503e9fd3c1254fe/grpcio-1.75.0-cp311-cp311-win_amd64.whl", hash = "sha256:678b649171f229fb16bda1a2473e820330aa3002500c4f9fd3a74b786578e90f", size = 4642235, upload-time = "2025-09-16T09:18:56.095Z" }, + { url = 
"https://files.pythonhosted.org/packages/0d/93/a1b29c2452d15cecc4a39700fbf54721a3341f2ddbd1bd883f8ec0004e6e/grpcio-1.75.0-cp312-cp312-linux_armv7l.whl", hash = "sha256:fa35ccd9501ffdd82b861809cbfc4b5b13f4b4c5dc3434d2d9170b9ed38a9054", size = 5661861, upload-time = "2025-09-16T09:18:58.748Z" }, + { url = "https://files.pythonhosted.org/packages/b8/ce/7280df197e602d14594e61d1e60e89dfa734bb59a884ba86cdd39686aadb/grpcio-1.75.0-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:0fcb77f2d718c1e58cc04ef6d3b51e0fa3b26cf926446e86c7eba105727b6cd4", size = 11459982, upload-time = "2025-09-16T09:19:01.211Z" }, + { url = "https://files.pythonhosted.org/packages/7c/9b/37e61349771f89b543a0a0bbc960741115ea8656a2414bfb24c4de6f3dd7/grpcio-1.75.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:36764a4ad9dc1eb891042fab51e8cdf7cc014ad82cee807c10796fb708455041", size = 6239680, upload-time = "2025-09-16T09:19:04.443Z" }, + { url = "https://files.pythonhosted.org/packages/a6/66/f645d9d5b22ca307f76e71abc83ab0e574b5dfef3ebde4ec8b865dd7e93e/grpcio-1.75.0-cp312-cp312-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:725e67c010f63ef17fc052b261004942763c0b18dcd84841e6578ddacf1f9d10", size = 6908511, upload-time = "2025-09-16T09:19:07.884Z" }, + { url = "https://files.pythonhosted.org/packages/e6/9a/34b11cd62d03c01b99068e257595804c695c3c119596c7077f4923295e19/grpcio-1.75.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:91fbfc43f605c5ee015c9056d580a70dd35df78a7bad97e05426795ceacdb59f", size = 6429105, upload-time = "2025-09-16T09:19:10.085Z" }, + { url = "https://files.pythonhosted.org/packages/1a/46/76eaceaad1f42c1e7e6a5b49a61aac40fc5c9bee4b14a1630f056ac3a57e/grpcio-1.75.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7a9337ac4ce61c388e02019d27fa837496c4b7837cbbcec71b05934337e51531", size = 7060578, upload-time = "2025-09-16T09:19:12.283Z" }, + { url = "https://files.pythonhosted.org/packages/3d/82/181a0e3f1397b6d43239e95becbeb448563f236c0db11ce990f073b08d01/grpcio-1.75.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:ee16e232e3d0974750ab5f4da0ab92b59d6473872690b5e40dcec9a22927f22e", size = 8003283, upload-time = "2025-09-16T09:19:15.601Z" }, + { url = "https://files.pythonhosted.org/packages/de/09/a335bca211f37a3239be4b485e3c12bf3da68d18b1f723affdff2b9e9680/grpcio-1.75.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:55dfb9122973cc69520b23d39867726722cafb32e541435707dc10249a1bdbc6", size = 7460319, upload-time = "2025-09-16T09:19:18.409Z" }, + { url = "https://files.pythonhosted.org/packages/aa/59/6330105cdd6bc4405e74c96838cd7e148c3653ae3996e540be6118220c79/grpcio-1.75.0-cp312-cp312-win32.whl", hash = "sha256:fb64dd62face3d687a7b56cd881e2ea39417af80f75e8b36f0f81dfd93071651", size = 3934011, upload-time = "2025-09-16T09:19:21.013Z" }, + { url = "https://files.pythonhosted.org/packages/ff/14/e1309a570b7ebdd1c8ca24c4df6b8d6690009fa8e0d997cb2c026ce850c9/grpcio-1.75.0-cp312-cp312-win_amd64.whl", hash = "sha256:6b365f37a9c9543a9e91c6b4103d68d38d5bcb9965b11d5092b3c157bd6a5ee7", size = 4637934, upload-time = "2025-09-16T09:19:23.19Z" }, + { url = "https://files.pythonhosted.org/packages/00/64/dbce0ffb6edaca2b292d90999dd32a3bd6bc24b5b77618ca28440525634d/grpcio-1.75.0-cp313-cp313-linux_armv7l.whl", hash = "sha256:1bb78d052948d8272c820bb928753f16a614bb2c42fbf56ad56636991b427518", size = 5666860, upload-time = "2025-09-16T09:19:25.417Z" }, + { url = 
"https://files.pythonhosted.org/packages/f3/e6/da02c8fa882ad3a7f868d380bb3da2c24d35dd983dd12afdc6975907a352/grpcio-1.75.0-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:9dc4a02796394dd04de0b9673cb79a78901b90bb16bf99ed8cb528c61ed9372e", size = 11455148, upload-time = "2025-09-16T09:19:28.615Z" }, + { url = "https://files.pythonhosted.org/packages/ba/a0/84f87f6c2cf2a533cfce43b2b620eb53a51428ec0c8fe63e5dd21d167a70/grpcio-1.75.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:437eeb16091d31498585d73b133b825dc80a8db43311e332c08facf820d36894", size = 6243865, upload-time = "2025-09-16T09:19:31.342Z" }, + { url = "https://files.pythonhosted.org/packages/be/12/53da07aa701a4839dd70d16e61ce21ecfcc9e929058acb2f56e9b2dd8165/grpcio-1.75.0-cp313-cp313-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:c2c39984e846bd5da45c5f7bcea8fafbe47c98e1ff2b6f40e57921b0c23a52d0", size = 6915102, upload-time = "2025-09-16T09:19:33.658Z" }, + { url = "https://files.pythonhosted.org/packages/5b/c0/7eaceafd31f52ec4bf128bbcf36993b4bc71f64480f3687992ddd1a6e315/grpcio-1.75.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:38d665f44b980acdbb2f0e1abf67605ba1899f4d2443908df9ec8a6f26d2ed88", size = 6432042, upload-time = "2025-09-16T09:19:36.583Z" }, + { url = "https://files.pythonhosted.org/packages/6b/12/a2ce89a9f4fc52a16ed92951f1b05f53c17c4028b3db6a4db7f08332bee8/grpcio-1.75.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:2e8e752ab5cc0a9c5b949808c000ca7586223be4f877b729f034b912364c3964", size = 7062984, upload-time = "2025-09-16T09:19:39.163Z" }, + { url = "https://files.pythonhosted.org/packages/55/a6/2642a9b491e24482d5685c0f45c658c495a5499b43394846677abed2c966/grpcio-1.75.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:3a6788b30aa8e6f207c417874effe3f79c2aa154e91e78e477c4825e8b431ce0", size = 8001212, upload-time = "2025-09-16T09:19:41.726Z" }, + { url = "https://files.pythonhosted.org/packages/19/20/530d4428750e9ed6ad4254f652b869a20a40a276c1f6817b8c12d561f5ef/grpcio-1.75.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ffc33e67cab6141c54e75d85acd5dec616c5095a957ff997b4330a6395aa9b51", size = 7457207, upload-time = "2025-09-16T09:19:44.368Z" }, + { url = "https://files.pythonhosted.org/packages/e2/6f/843670007e0790af332a21468d10059ea9fdf97557485ae633b88bd70efc/grpcio-1.75.0-cp313-cp313-win32.whl", hash = "sha256:c8cfc780b7a15e06253aae5f228e1e84c0d3c4daa90faf5bc26b751174da4bf9", size = 3934235, upload-time = "2025-09-16T09:19:46.815Z" }, + { url = "https://files.pythonhosted.org/packages/4b/92/c846b01b38fdf9e2646a682b12e30a70dc7c87dfe68bd5e009ee1501c14b/grpcio-1.75.0-cp313-cp313-win_amd64.whl", hash = "sha256:0c91d5b16eff3cbbe76b7a1eaaf3d91e7a954501e9d4f915554f87c470475c3d", size = 4637558, upload-time = "2025-09-16T09:19:49.698Z" }, +] + +[[package]] +name = "grpcio-status" +version = "1.75.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "googleapis-common-protos" }, + { name = "grpcio" }, + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ca/8a/2e45ec0512d4ce9afa136c6e4186d063721b5b4c192eec7536ce6b7ba615/grpcio_status-1.75.0.tar.gz", hash = "sha256:69d5b91be1b8b926f086c1c483519a968c14640773a0ccdd6c04282515dbedf7", size = 13646, upload-time = "2025-09-16T09:24:51.069Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2b/24/d536f0a0fda3a3eeb334893e5fb9d567c2777de6a5384413f71b35cfd0e5/grpcio_status-1.75.0-py3-none-any.whl", hash = 
"sha256:de62557ef97b7e19c3ce6da19793a12c5f6c1fbbb918d233d9671aba9d9e1d78", size = 14424, upload-time = "2025-09-16T09:23:33.843Z" }, +] + +[[package]] +name = "h11" +version = "0.16.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" }, +] + +[[package]] +name = "hexbytes" +version = "1.3.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/7f/87/adf4635b4b8c050283d74e6db9a81496063229c9263e6acc1903ab79fbec/hexbytes-1.3.1.tar.gz", hash = "sha256:a657eebebdfe27254336f98d8af6e2236f3f83aed164b87466b6cf6c5f5a4765", size = 8633, upload-time = "2025-05-14T16:45:17.5Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/8d/e0/3b31492b1c89da3c5a846680517871455b30c54738486fc57ac79a5761bd/hexbytes-1.3.1-py3-none-any.whl", hash = "sha256:da01ff24a1a9a2b1881c4b85f0e9f9b0f51b526b379ffa23832ae7899d29c2c7", size = 5074, upload-time = "2025-05-14T16:45:16.179Z" }, +] + +[[package]] +name = "hf-xet" +version = "1.1.10" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/74/31/feeddfce1748c4a233ec1aa5b7396161c07ae1aa9b7bdbc9a72c3c7dd768/hf_xet-1.1.10.tar.gz", hash = "sha256:408aef343800a2102374a883f283ff29068055c111f003ff840733d3b715bb97", size = 487910, upload-time = "2025-09-12T20:10:27.12Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f7/a2/343e6d05de96908366bdc0081f2d8607d61200be2ac802769c4284cc65bd/hf_xet-1.1.10-cp37-abi3-macosx_10_12_x86_64.whl", hash = "sha256:686083aca1a6669bc85c21c0563551cbcdaa5cf7876a91f3d074a030b577231d", size = 2761466, upload-time = "2025-09-12T20:10:22.836Z" }, + { url = "https://files.pythonhosted.org/packages/31/f9/6215f948ac8f17566ee27af6430ea72045e0418ce757260248b483f4183b/hf_xet-1.1.10-cp37-abi3-macosx_11_0_arm64.whl", hash = "sha256:71081925383b66b24eedff3013f8e6bbd41215c3338be4b94ba75fd75b21513b", size = 2623807, upload-time = "2025-09-12T20:10:21.118Z" }, + { url = "https://files.pythonhosted.org/packages/15/07/86397573efefff941e100367bbda0b21496ffcdb34db7ab51912994c32a2/hf_xet-1.1.10-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b6bceb6361c80c1cc42b5a7b4e3efd90e64630bcf11224dcac50ef30a47e435", size = 3186960, upload-time = "2025-09-12T20:10:19.336Z" }, + { url = "https://files.pythonhosted.org/packages/01/a7/0b2e242b918cc30e1f91980f3c4b026ff2eedaf1e2ad96933bca164b2869/hf_xet-1.1.10-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:eae7c1fc8a664e54753ffc235e11427ca61f4b0477d757cc4eb9ae374b69f09c", size = 3087167, upload-time = "2025-09-12T20:10:17.255Z" }, + { url = "https://files.pythonhosted.org/packages/4a/25/3e32ab61cc7145b11eee9d745988e2f0f4fafda81b25980eebf97d8cff15/hf_xet-1.1.10-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:0a0005fd08f002180f7a12d4e13b22be277725bc23ed0529f8add5c7a6309c06", size = 3248612, upload-time = "2025-09-12T20:10:24.093Z" }, + { url = 
"https://files.pythonhosted.org/packages/2c/3d/ab7109e607ed321afaa690f557a9ada6d6d164ec852fd6bf9979665dc3d6/hf_xet-1.1.10-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:f900481cf6e362a6c549c61ff77468bd59d6dd082f3170a36acfef2eb6a6793f", size = 3353360, upload-time = "2025-09-12T20:10:25.563Z" }, + { url = "https://files.pythonhosted.org/packages/ee/0e/471f0a21db36e71a2f1752767ad77e92d8cde24e974e03d662931b1305ec/hf_xet-1.1.10-cp37-abi3-win_amd64.whl", hash = "sha256:5f54b19cc347c13235ae7ee98b330c26dd65ef1df47e5316ffb1e87713ca7045", size = 2804691, upload-time = "2025-09-12T20:10:28.433Z" }, +] + +[[package]] +name = "httpcore" +version = "1.0.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "h11" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" }, +] + +[[package]] +name = "httplib2" +version = "0.31.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyparsing" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/52/77/6653db69c1f7ecfe5e3f9726fdadc981794656fcd7d98c4209fecfea9993/httplib2-0.31.0.tar.gz", hash = "sha256:ac7ab497c50975147d4f7b1ade44becc7df2f8954d42b38b3d69c515f531135c", size = 250759, upload-time = "2025-09-11T12:16:03.403Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/8c/a2/0d269db0f6163be503775dc8b6a6fa15820cc9fdc866f6ba608d86b721f2/httplib2-0.31.0-py3-none-any.whl", hash = "sha256:b9cd78abea9b4e43a7714c6e0f8b6b8561a6fc1e95d5dbd367f5bf0ef35f5d24", size = 91148, upload-time = "2025-09-11T12:16:01.803Z" }, +] + +[[package]] +name = "httpx" +version = "0.28.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "certifi" }, + { name = "httpcore" }, + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" }, +] + +[[package]] +name = "httpx-sse" +version = "0.4.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6e/fa/66bd985dd0b7c109a3bcb89272ee0bfb7e2b4d06309ad7b38ff866734b2a/httpx_sse-0.4.1.tar.gz", hash = "sha256:8f44d34414bc7b21bf3602713005c5df4917884f76072479b21f68befa4ea26e", size = 12998, upload-time = "2025-06-24T13:21:05.71Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/25/0a/6269e3473b09aed2dab8aa1a600c70f31f00ae1349bee30658f7e358a159/httpx_sse-0.4.1-py3-none-any.whl", hash = 
"sha256:cba42174344c3a5b06f255ce65b350880f962d99ead85e776f23c6618a377a37", size = 8054, upload-time = "2025-06-24T13:21:04.772Z" }, +] + +[[package]] +name = "huggingface-hub" +version = "0.35.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "filelock" }, + { name = "fsspec" }, + { name = "hf-xet", marker = "platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'arm64' or platform_machine == 'x86_64'" }, + { name = "packaging" }, + { name = "pyyaml" }, + { name = "requests" }, + { name = "tqdm" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/37/79/d71d40efa058e8c4a075158f8855bc2998037b5ff1c84f249f34435c1df7/huggingface_hub-0.35.0.tar.gz", hash = "sha256:ccadd2a78eef75effff184ad89401413629fabc52cefd76f6bbacb9b1c0676ac", size = 461486, upload-time = "2025-09-16T13:49:33.282Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fe/85/a18508becfa01f1e4351b5e18651b06d210dbd96debccd48a452acccb901/huggingface_hub-0.35.0-py3-none-any.whl", hash = "sha256:f2e2f693bca9a26530b1c0b9bcd4c1495644dad698e6a0060f90e22e772c31e9", size = 563436, upload-time = "2025-09-16T13:49:30.627Z" }, +] + +[[package]] +name = "humanfriendly" +version = "10.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyreadline3", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/cc/3f/2c29224acb2e2df4d2046e4c73ee2662023c58ff5b113c4c1adac0886c43/humanfriendly-10.0.tar.gz", hash = "sha256:6b0b831ce8f15f7300721aa49829fc4e83921a9a301cc7f606be6686a2288ddc", size = 360702, upload-time = "2021-09-17T21:40:43.31Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f0/0f/310fb31e39e2d734ccaa2c0fb981ee41f7bd5056ce9bc29b2248bd569169/humanfriendly-10.0-py2.py3-none-any.whl", hash = "sha256:1697e1a8a8f550fd43c2865cd84542fc175a61dcb779b6fee18cf6b6ccba1477", size = 86794, upload-time = "2021-09-17T21:40:39.897Z" }, +] + +[[package]] +name = "humanize" +version = "4.13.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/98/1d/3062fcc89ee05a715c0b9bfe6490c00c576314f27ffee3a704122c6fd259/humanize-4.13.0.tar.gz", hash = "sha256:78f79e68f76f0b04d711c4e55d32bebef5be387148862cb1ef83d2b58e7935a0", size = 81884, upload-time = "2025-08-25T09:39:20.04Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/c7/316e7ca04d26695ef0635dc81683d628350810eb8e9b2299fc08ba49f366/humanize-4.13.0-py3-none-any.whl", hash = "sha256:b810820b31891813b1673e8fec7f1ed3312061eab2f26e3fa192c393d11ed25f", size = 128869, upload-time = "2025-08-25T09:39:18.54Z" }, +] + +[[package]] +name = "identify" +version = "2.6.14" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/52/c4/62963f25a678f6a050fb0505a65e9e726996171e6dbe1547f79619eefb15/identify-2.6.14.tar.gz", hash = "sha256:663494103b4f717cb26921c52f8751363dc89db64364cd836a9bf1535f53cd6a", size = 99283, upload-time = "2025-09-06T19:30:52.938Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/ae/2ad30f4652712c82f1c23423d79136fbce338932ad166d70c1efb86a5998/identify-2.6.14-py2.py3-none-any.whl", hash = "sha256:11a073da82212c6646b1f39bb20d4483bfb9543bd5566fec60053c4bb309bf2e", size = 99172, upload-time = "2025-09-06T19:30:51.759Z" }, +] + +[[package]] +name = "idna" +version = "3.10" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" }, +] + +[[package]] +name = "importlib-metadata" +version = "8.7.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "zipp" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/76/66/650a33bd90f786193e4de4b3ad86ea60b53c89b669a5c7be931fac31cdb0/importlib_metadata-8.7.0.tar.gz", hash = "sha256:d13b81ad223b890aa16c5471f2ac3056cf76c5f10f82d6f9292f0b415f389000", size = 56641, upload-time = "2025-04-27T15:29:01.736Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/20/b0/36bd937216ec521246249be3bf9855081de4c5e06a0c9b4219dbeda50373/importlib_metadata-8.7.0-py3-none-any.whl", hash = "sha256:e5dd1551894c77868a30651cef00984d50e1002d06942a7101d34870c5f02afd", size = 27656, upload-time = "2025-04-27T15:29:00.214Z" }, +] + +[[package]] +name = "iniconfig" +version = "2.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793, upload-time = "2025-03-19T20:09:59.721Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" }, +] + +[[package]] +name = "instructor" +version = "1.11.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiohttp" }, + { name = "diskcache" }, + { name = "docstring-parser" }, + { name = "jinja2" }, + { name = "jiter" }, + { name = "openai" }, + { name = "pydantic" }, + { name = "pydantic-core" }, + { name = "requests" }, + { name = "rich" }, + { name = "tenacity" }, + { name = "typer" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6a/af/428b5d7a6a6eca5738c51706795a395099c141779cd1bbb9a6e2b0d3a94d/instructor-1.11.3.tar.gz", hash = "sha256:6f58fea6fadfa228c411ecdedad4662230c456718f4a770a97a806dcb36b3287", size = 69879936, upload-time = "2025-09-09T15:44:31.548Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4c/5f/54783e5b1a497de204a0a59b5e22549f67f5f1aceaa08e00db21b1107ce4/instructor-1.11.3-py3-none-any.whl", hash = "sha256:9ecd7a3780a045506165debad2ddcc4a30e1057f06997973185f356b0a42c6e3", size = 155501, upload-time = "2025-09-09T15:44:26.139Z" }, +] + +[[package]] +name = "isodate" +version = "0.7.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/54/4d/e940025e2ce31a8ce1202635910747e5a87cc3a6a6bb2d00973375014749/isodate-0.7.2.tar.gz", hash = "sha256:4cd1aa0f43ca76f4a6c6c0292a85f40b35ec2e43e315b59f06e6d32171a953e6", size = 29705, upload-time = "2024-10-08T23:04:11.5Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/15/aa/0aca39a37d3c7eb941ba736ede56d689e7be91cab5d9ca846bde3999eba6/isodate-0.7.2-py3-none-any.whl", hash = "sha256:28009937d8031054830160fce6d409ed342816b543597cece116d966c6d99e15", size = 22320, upload-time = "2024-10-08T23:04:09.501Z" }, +] + +[[package]] +name = "isort" +version = "6.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b8/21/1e2a441f74a653a144224d7d21afe8f4169e6c7c20bb13aec3a2dc3815e0/isort-6.0.1.tar.gz", hash = "sha256:1cb5df28dfbc742e490c5e41bad6da41b805b0a8be7bc93cd0fb2a8a890ac450", size = 821955, upload-time = "2025-02-26T21:13:16.955Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c1/11/114d0a5f4dabbdcedc1125dee0888514c3c3b16d3e9facad87ed96fad97c/isort-6.0.1-py3-none-any.whl", hash = "sha256:2dc5d7f65c9678d94c88dfc29161a320eec67328bc97aad576874cb4be1e9615", size = 94186, upload-time = "2025-02-26T21:13:14.911Z" }, +] + +[[package]] +name = "jinja2" +version = "3.1.6" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115, upload-time = "2025-03-05T20:05:02.478Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" }, +] + +[[package]] +name = "jiter" +version = "0.10.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ee/9d/ae7ddb4b8ab3fb1b51faf4deb36cb48a4fbbd7cb36bad6a5fca4741306f7/jiter-0.10.0.tar.gz", hash = "sha256:07a7142c38aacc85194391108dc91b5b57093c978a9932bd86a36862759d9500", size = 162759, upload-time = "2025-05-18T19:04:59.73Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1b/dd/6cefc6bd68b1c3c979cecfa7029ab582b57690a31cd2f346c4d0ce7951b6/jiter-0.10.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:3bebe0c558e19902c96e99217e0b8e8b17d570906e72ed8a87170bc290b1e978", size = 317473, upload-time = "2025-05-18T19:03:25.942Z" }, + { url = "https://files.pythonhosted.org/packages/be/cf/fc33f5159ce132be1d8dd57251a1ec7a631c7df4bd11e1cd198308c6ae32/jiter-0.10.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:558cc7e44fd8e507a236bee6a02fa17199ba752874400a0ca6cd6e2196cdb7dc", size = 321971, upload-time = "2025-05-18T19:03:27.255Z" }, + { url = "https://files.pythonhosted.org/packages/68/a4/da3f150cf1d51f6c472616fb7650429c7ce053e0c962b41b68557fdf6379/jiter-0.10.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4d613e4b379a07d7c8453c5712ce7014e86c6ac93d990a0b8e7377e18505e98d", size = 345574, upload-time = "2025-05-18T19:03:28.63Z" }, + { url = "https://files.pythonhosted.org/packages/84/34/6e8d412e60ff06b186040e77da5f83bc158e9735759fcae65b37d681f28b/jiter-0.10.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f62cf8ba0618eda841b9bf61797f21c5ebd15a7a1e19daab76e4e4b498d515b2", size = 371028, upload-time = "2025-05-18T19:03:30.292Z" }, + { url = 
"https://files.pythonhosted.org/packages/fb/d9/9ee86173aae4576c35a2f50ae930d2ccb4c4c236f6cb9353267aa1d626b7/jiter-0.10.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:919d139cdfa8ae8945112398511cb7fca58a77382617d279556b344867a37e61", size = 491083, upload-time = "2025-05-18T19:03:31.654Z" }, + { url = "https://files.pythonhosted.org/packages/d9/2c/f955de55e74771493ac9e188b0f731524c6a995dffdcb8c255b89c6fb74b/jiter-0.10.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:13ddbc6ae311175a3b03bd8994881bc4635c923754932918e18da841632349db", size = 388821, upload-time = "2025-05-18T19:03:33.184Z" }, + { url = "https://files.pythonhosted.org/packages/81/5a/0e73541b6edd3f4aada586c24e50626c7815c561a7ba337d6a7eb0a915b4/jiter-0.10.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c440ea003ad10927a30521a9062ce10b5479592e8a70da27f21eeb457b4a9c5", size = 352174, upload-time = "2025-05-18T19:03:34.965Z" }, + { url = "https://files.pythonhosted.org/packages/1c/c0/61eeec33b8c75b31cae42be14d44f9e6fe3ac15a4e58010256ac3abf3638/jiter-0.10.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:dc347c87944983481e138dea467c0551080c86b9d21de6ea9306efb12ca8f606", size = 391869, upload-time = "2025-05-18T19:03:36.436Z" }, + { url = "https://files.pythonhosted.org/packages/41/22/5beb5ee4ad4ef7d86f5ea5b4509f680a20706c4a7659e74344777efb7739/jiter-0.10.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:13252b58c1f4d8c5b63ab103c03d909e8e1e7842d302473f482915d95fefd605", size = 523741, upload-time = "2025-05-18T19:03:38.168Z" }, + { url = "https://files.pythonhosted.org/packages/ea/10/768e8818538e5817c637b0df52e54366ec4cebc3346108a4457ea7a98f32/jiter-0.10.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7d1bbf3c465de4a24ab12fb7766a0003f6f9bce48b8b6a886158c4d569452dc5", size = 514527, upload-time = "2025-05-18T19:03:39.577Z" }, + { url = "https://files.pythonhosted.org/packages/73/6d/29b7c2dc76ce93cbedabfd842fc9096d01a0550c52692dfc33d3cc889815/jiter-0.10.0-cp311-cp311-win32.whl", hash = "sha256:db16e4848b7e826edca4ccdd5b145939758dadf0dc06e7007ad0e9cfb5928ae7", size = 210765, upload-time = "2025-05-18T19:03:41.271Z" }, + { url = "https://files.pythonhosted.org/packages/c2/c9/d394706deb4c660137caf13e33d05a031d734eb99c051142e039d8ceb794/jiter-0.10.0-cp311-cp311-win_amd64.whl", hash = "sha256:9c9c1d5f10e18909e993f9641f12fe1c77b3e9b533ee94ffa970acc14ded3812", size = 209234, upload-time = "2025-05-18T19:03:42.918Z" }, + { url = "https://files.pythonhosted.org/packages/6d/b5/348b3313c58f5fbfb2194eb4d07e46a35748ba6e5b3b3046143f3040bafa/jiter-0.10.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:1e274728e4a5345a6dde2d343c8da018b9d4bd4350f5a472fa91f66fda44911b", size = 312262, upload-time = "2025-05-18T19:03:44.637Z" }, + { url = "https://files.pythonhosted.org/packages/9c/4a/6a2397096162b21645162825f058d1709a02965606e537e3304b02742e9b/jiter-0.10.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7202ae396446c988cb2a5feb33a543ab2165b786ac97f53b59aafb803fef0744", size = 320124, upload-time = "2025-05-18T19:03:46.341Z" }, + { url = "https://files.pythonhosted.org/packages/2a/85/1ce02cade7516b726dd88f59a4ee46914bf79d1676d1228ef2002ed2f1c9/jiter-0.10.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23ba7722d6748b6920ed02a8f1726fb4b33e0fd2f3f621816a8b486c66410ab2", size = 345330, upload-time = "2025-05-18T19:03:47.596Z" }, + { url = 
"https://files.pythonhosted.org/packages/75/d0/bb6b4f209a77190ce10ea8d7e50bf3725fc16d3372d0a9f11985a2b23eff/jiter-0.10.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:371eab43c0a288537d30e1f0b193bc4eca90439fc08a022dd83e5e07500ed026", size = 369670, upload-time = "2025-05-18T19:03:49.334Z" }, + { url = "https://files.pythonhosted.org/packages/a0/f5/a61787da9b8847a601e6827fbc42ecb12be2c925ced3252c8ffcb56afcaf/jiter-0.10.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6c675736059020365cebc845a820214765162728b51ab1e03a1b7b3abb70f74c", size = 489057, upload-time = "2025-05-18T19:03:50.66Z" }, + { url = "https://files.pythonhosted.org/packages/12/e4/6f906272810a7b21406c760a53aadbe52e99ee070fc5c0cb191e316de30b/jiter-0.10.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0c5867d40ab716e4684858e4887489685968a47e3ba222e44cde6e4a2154f959", size = 389372, upload-time = "2025-05-18T19:03:51.98Z" }, + { url = "https://files.pythonhosted.org/packages/e2/ba/77013b0b8ba904bf3762f11e0129b8928bff7f978a81838dfcc958ad5728/jiter-0.10.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:395bb9a26111b60141757d874d27fdea01b17e8fac958b91c20128ba8f4acc8a", size = 352038, upload-time = "2025-05-18T19:03:53.703Z" }, + { url = "https://files.pythonhosted.org/packages/67/27/c62568e3ccb03368dbcc44a1ef3a423cb86778a4389e995125d3d1aaa0a4/jiter-0.10.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6842184aed5cdb07e0c7e20e5bdcfafe33515ee1741a6835353bb45fe5d1bd95", size = 391538, upload-time = "2025-05-18T19:03:55.046Z" }, + { url = "https://files.pythonhosted.org/packages/c0/72/0d6b7e31fc17a8fdce76164884edef0698ba556b8eb0af9546ae1a06b91d/jiter-0.10.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:62755d1bcea9876770d4df713d82606c8c1a3dca88ff39046b85a048566d56ea", size = 523557, upload-time = "2025-05-18T19:03:56.386Z" }, + { url = "https://files.pythonhosted.org/packages/2f/09/bc1661fbbcbeb6244bd2904ff3a06f340aa77a2b94e5a7373fd165960ea3/jiter-0.10.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:533efbce2cacec78d5ba73a41756beff8431dfa1694b6346ce7af3a12c42202b", size = 514202, upload-time = "2025-05-18T19:03:57.675Z" }, + { url = "https://files.pythonhosted.org/packages/1b/84/5a5d5400e9d4d54b8004c9673bbe4403928a00d28529ff35b19e9d176b19/jiter-0.10.0-cp312-cp312-win32.whl", hash = "sha256:8be921f0cadd245e981b964dfbcd6fd4bc4e254cdc069490416dd7a2632ecc01", size = 211781, upload-time = "2025-05-18T19:03:59.025Z" }, + { url = "https://files.pythonhosted.org/packages/9b/52/7ec47455e26f2d6e5f2ea4951a0652c06e5b995c291f723973ae9e724a65/jiter-0.10.0-cp312-cp312-win_amd64.whl", hash = "sha256:a7c7d785ae9dda68c2678532a5a1581347e9c15362ae9f6e68f3fdbfb64f2e49", size = 206176, upload-time = "2025-05-18T19:04:00.305Z" }, + { url = "https://files.pythonhosted.org/packages/2e/b0/279597e7a270e8d22623fea6c5d4eeac328e7d95c236ed51a2b884c54f70/jiter-0.10.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:e0588107ec8e11b6f5ef0e0d656fb2803ac6cf94a96b2b9fc675c0e3ab5e8644", size = 311617, upload-time = "2025-05-18T19:04:02.078Z" }, + { url = "https://files.pythonhosted.org/packages/91/e3/0916334936f356d605f54cc164af4060e3e7094364add445a3bc79335d46/jiter-0.10.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cafc4628b616dc32530c20ee53d71589816cf385dd9449633e910d596b1f5c8a", size = 318947, upload-time = "2025-05-18T19:04:03.347Z" }, + { url = 
"https://files.pythonhosted.org/packages/6a/8e/fd94e8c02d0e94539b7d669a7ebbd2776e51f329bb2c84d4385e8063a2ad/jiter-0.10.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:520ef6d981172693786a49ff5b09eda72a42e539f14788124a07530f785c3ad6", size = 344618, upload-time = "2025-05-18T19:04:04.709Z" }, + { url = "https://files.pythonhosted.org/packages/6f/b0/f9f0a2ec42c6e9c2e61c327824687f1e2415b767e1089c1d9135f43816bd/jiter-0.10.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:554dedfd05937f8fc45d17ebdf298fe7e0c77458232bcb73d9fbbf4c6455f5b3", size = 368829, upload-time = "2025-05-18T19:04:06.912Z" }, + { url = "https://files.pythonhosted.org/packages/e8/57/5bbcd5331910595ad53b9fd0c610392ac68692176f05ae48d6ce5c852967/jiter-0.10.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5bc299da7789deacf95f64052d97f75c16d4fc8c4c214a22bf8d859a4288a1c2", size = 491034, upload-time = "2025-05-18T19:04:08.222Z" }, + { url = "https://files.pythonhosted.org/packages/9b/be/c393df00e6e6e9e623a73551774449f2f23b6ec6a502a3297aeeece2c65a/jiter-0.10.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5161e201172de298a8a1baad95eb85db4fb90e902353b1f6a41d64ea64644e25", size = 388529, upload-time = "2025-05-18T19:04:09.566Z" }, + { url = "https://files.pythonhosted.org/packages/42/3e/df2235c54d365434c7f150b986a6e35f41ebdc2f95acea3036d99613025d/jiter-0.10.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e2227db6ba93cb3e2bf67c87e594adde0609f146344e8207e8730364db27041", size = 350671, upload-time = "2025-05-18T19:04:10.98Z" }, + { url = "https://files.pythonhosted.org/packages/c6/77/71b0b24cbcc28f55ab4dbfe029f9a5b73aeadaba677843fc6dc9ed2b1d0a/jiter-0.10.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:15acb267ea5e2c64515574b06a8bf393fbfee6a50eb1673614aa45f4613c0cca", size = 390864, upload-time = "2025-05-18T19:04:12.722Z" }, + { url = "https://files.pythonhosted.org/packages/6a/d3/ef774b6969b9b6178e1d1e7a89a3bd37d241f3d3ec5f8deb37bbd203714a/jiter-0.10.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:901b92f2e2947dc6dfcb52fd624453862e16665ea909a08398dde19c0731b7f4", size = 522989, upload-time = "2025-05-18T19:04:14.261Z" }, + { url = "https://files.pythonhosted.org/packages/0c/41/9becdb1d8dd5d854142f45a9d71949ed7e87a8e312b0bede2de849388cb9/jiter-0.10.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d0cb9a125d5a3ec971a094a845eadde2db0de85b33c9f13eb94a0c63d463879e", size = 513495, upload-time = "2025-05-18T19:04:15.603Z" }, + { url = "https://files.pythonhosted.org/packages/9c/36/3468e5a18238bdedae7c4d19461265b5e9b8e288d3f86cd89d00cbb48686/jiter-0.10.0-cp313-cp313-win32.whl", hash = "sha256:48a403277ad1ee208fb930bdf91745e4d2d6e47253eedc96e2559d1e6527006d", size = 211289, upload-time = "2025-05-18T19:04:17.541Z" }, + { url = "https://files.pythonhosted.org/packages/7e/07/1c96b623128bcb913706e294adb5f768fb7baf8db5e1338ce7b4ee8c78ef/jiter-0.10.0-cp313-cp313-win_amd64.whl", hash = "sha256:75f9eb72ecb640619c29bf714e78c9c46c9c4eaafd644bf78577ede459f330d4", size = 205074, upload-time = "2025-05-18T19:04:19.21Z" }, + { url = "https://files.pythonhosted.org/packages/54/46/caa2c1342655f57d8f0f2519774c6d67132205909c65e9aa8255e1d7b4f4/jiter-0.10.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:28ed2a4c05a1f32ef0e1d24c2611330219fed727dae01789f4a335617634b1ca", size = 318225, upload-time = "2025-05-18T19:04:20.583Z" }, + { url = 
"https://files.pythonhosted.org/packages/43/84/c7d44c75767e18946219ba2d703a5a32ab37b0bc21886a97bc6062e4da42/jiter-0.10.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:14a4c418b1ec86a195f1ca69da8b23e8926c752b685af665ce30777233dfe070", size = 350235, upload-time = "2025-05-18T19:04:22.363Z" }, + { url = "https://files.pythonhosted.org/packages/01/16/f5a0135ccd968b480daad0e6ab34b0c7c5ba3bc447e5088152696140dcb3/jiter-0.10.0-cp313-cp313t-win_amd64.whl", hash = "sha256:d7bfed2fe1fe0e4dda6ef682cee888ba444b21e7a6553e03252e4feb6cf0adca", size = 207278, upload-time = "2025-05-18T19:04:23.627Z" }, + { url = "https://files.pythonhosted.org/packages/1c/9b/1d646da42c3de6c2188fdaa15bce8ecb22b635904fc68be025e21249ba44/jiter-0.10.0-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:5e9251a5e83fab8d87799d3e1a46cb4b7f2919b895c6f4483629ed2446f66522", size = 310866, upload-time = "2025-05-18T19:04:24.891Z" }, + { url = "https://files.pythonhosted.org/packages/ad/0e/26538b158e8a7c7987e94e7aeb2999e2e82b1f9d2e1f6e9874ddf71ebda0/jiter-0.10.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:023aa0204126fe5b87ccbcd75c8a0d0261b9abdbbf46d55e7ae9f8e22424eeb8", size = 318772, upload-time = "2025-05-18T19:04:26.161Z" }, + { url = "https://files.pythonhosted.org/packages/7b/fb/d302893151caa1c2636d6574d213e4b34e31fd077af6050a9c5cbb42f6fb/jiter-0.10.0-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3c189c4f1779c05f75fc17c0c1267594ed918996a231593a21a5ca5438445216", size = 344534, upload-time = "2025-05-18T19:04:27.495Z" }, + { url = "https://files.pythonhosted.org/packages/01/d8/5780b64a149d74e347c5128d82176eb1e3241b1391ac07935693466d6219/jiter-0.10.0-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:15720084d90d1098ca0229352607cd68256c76991f6b374af96f36920eae13c4", size = 369087, upload-time = "2025-05-18T19:04:28.896Z" }, + { url = "https://files.pythonhosted.org/packages/e8/5b/f235a1437445160e777544f3ade57544daf96ba7e96c1a5b24a6f7ac7004/jiter-0.10.0-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e4f2fb68e5f1cfee30e2b2a09549a00683e0fde4c6a2ab88c94072fc33cb7426", size = 490694, upload-time = "2025-05-18T19:04:30.183Z" }, + { url = "https://files.pythonhosted.org/packages/85/a9/9c3d4617caa2ff89cf61b41e83820c27ebb3f7b5fae8a72901e8cd6ff9be/jiter-0.10.0-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ce541693355fc6da424c08b7edf39a2895f58d6ea17d92cc2b168d20907dee12", size = 388992, upload-time = "2025-05-18T19:04:32.028Z" }, + { url = "https://files.pythonhosted.org/packages/68/b1/344fd14049ba5c94526540af7eb661871f9c54d5f5601ff41a959b9a0bbd/jiter-0.10.0-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31c50c40272e189d50006ad5c73883caabb73d4e9748a688b216e85a9a9ca3b9", size = 351723, upload-time = "2025-05-18T19:04:33.467Z" }, + { url = "https://files.pythonhosted.org/packages/41/89/4c0e345041186f82a31aee7b9d4219a910df672b9fef26f129f0cda07a29/jiter-0.10.0-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fa3402a2ff9815960e0372a47b75c76979d74402448509ccd49a275fa983ef8a", size = 392215, upload-time = "2025-05-18T19:04:34.827Z" }, + { url = "https://files.pythonhosted.org/packages/55/58/ee607863e18d3f895feb802154a2177d7e823a7103f000df182e0f718b38/jiter-0.10.0-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:1956f934dca32d7bb647ea21d06d93ca40868b505c228556d3373cbd255ce853", size = 522762, upload-time = "2025-05-18T19:04:36.19Z" }, + { url 
= "https://files.pythonhosted.org/packages/15/d0/9123fb41825490d16929e73c212de9a42913d68324a8ce3c8476cae7ac9d/jiter-0.10.0-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:fcedb049bdfc555e261d6f65a6abe1d5ad68825b7202ccb9692636c70fcced86", size = 513427, upload-time = "2025-05-18T19:04:37.544Z" }, + { url = "https://files.pythonhosted.org/packages/d8/b3/2bd02071c5a2430d0b70403a34411fc519c2f227da7b03da9ba6a956f931/jiter-0.10.0-cp314-cp314-win32.whl", hash = "sha256:ac509f7eccca54b2a29daeb516fb95b6f0bd0d0d8084efaf8ed5dfc7b9f0b357", size = 210127, upload-time = "2025-05-18T19:04:38.837Z" }, + { url = "https://files.pythonhosted.org/packages/03/0c/5fe86614ea050c3ecd728ab4035534387cd41e7c1855ef6c031f1ca93e3f/jiter-0.10.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:5ed975b83a2b8639356151cef5c0d597c68376fc4922b45d0eb384ac058cfa00", size = 318527, upload-time = "2025-05-18T19:04:40.612Z" }, + { url = "https://files.pythonhosted.org/packages/b3/4a/4175a563579e884192ba6e81725fc0448b042024419be8d83aa8a80a3f44/jiter-0.10.0-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3aa96f2abba33dc77f79b4cf791840230375f9534e5fac927ccceb58c5e604a5", size = 354213, upload-time = "2025-05-18T19:04:41.894Z" }, +] + +[[package]] +name = "jmespath" +version = "1.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/00/2a/e867e8531cf3e36b41201936b7fa7ba7b5702dbef42922193f05c8976cd6/jmespath-1.0.1.tar.gz", hash = "sha256:90261b206d6defd58fdd5e85f478bf633a2901798906be2ad389150c5c60edbe", size = 25843, upload-time = "2022-06-17T18:00:12.224Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/31/b4/b9b800c45527aadd64d5b442f9b932b00648617eb5d63d2c7a6587b7cafc/jmespath-1.0.1-py3-none-any.whl", hash = "sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980", size = 20256, upload-time = "2022-06-17T18:00:10.251Z" }, +] + +[[package]] +name = "joblib" +version = "1.5.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e8/5d/447af5ea094b9e4c4054f82e223ada074c552335b9b4b2d14bd9b35a67c4/joblib-1.5.2.tar.gz", hash = "sha256:3faa5c39054b2f03ca547da9b2f52fde67c06240c31853f306aea97f13647b55", size = 331077, upload-time = "2025-08-27T12:15:46.575Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/e8/685f47e0d754320684db4425a0967f7d3fa70126bffd76110b7009a0090f/joblib-1.5.2-py3-none-any.whl", hash = "sha256:4e1f0bdbb987e6d843c70cf43714cb276623def372df3c22fe5266b2670bc241", size = 308396, upload-time = "2025-08-27T12:15:45.188Z" }, +] + +[[package]] +name = "jsonpath-ng" +version = "1.7.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "ply" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6d/86/08646239a313f895186ff0a4573452038eed8c86f54380b3ebac34d32fb2/jsonpath-ng-1.7.0.tar.gz", hash = "sha256:f6f5f7fd4e5ff79c785f1573b394043b39849fb2bb47bcead935d12b00beab3c", size = 37838, upload-time = "2024-10-11T15:41:42.404Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/35/5a/73ecb3d82f8615f32ccdadeb9356726d6cae3a4bbc840b437ceb95708063/jsonpath_ng-1.7.0-py3-none-any.whl", hash = "sha256:f3d7f9e848cba1b6da28c55b1c26ff915dc9e0b1ba7e752a53d6da8d5cbd00b6", size = 30105, upload-time = "2024-11-20T17:58:30.418Z" }, +] + +[[package]] +name = "jsonschema" +version = "4.25.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs" }, + { name = 
"jsonschema-specifications" }, + { name = "referencing" }, + { name = "rpds-py" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/74/69/f7185de793a29082a9f3c7728268ffb31cb5095131a9c139a74078e27336/jsonschema-4.25.1.tar.gz", hash = "sha256:e4a9655ce0da0c0b67a085847e00a3a51449e1157f4f75e9fb5aa545e122eb85", size = 357342, upload-time = "2025-08-18T17:03:50.038Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bf/9c/8c95d856233c1f82500c2450b8c68576b4cf1c871db3afac5c34ff84e6fd/jsonschema-4.25.1-py3-none-any.whl", hash = "sha256:3fba0169e345c7175110351d456342c364814cfcf3b964ba4587f22915230a63", size = 90040, upload-time = "2025-08-18T17:03:48.373Z" }, +] + +[[package]] +name = "jsonschema-path" +version = "0.3.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pathable" }, + { name = "pyyaml" }, + { name = "referencing" }, + { name = "requests" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6e/45/41ebc679c2a4fced6a722f624c18d658dee42612b83ea24c1caf7c0eb3a8/jsonschema_path-0.3.4.tar.gz", hash = "sha256:8365356039f16cc65fddffafda5f58766e34bebab7d6d105616ab52bc4297001", size = 11159, upload-time = "2025-01-24T14:33:16.547Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cb/58/3485da8cb93d2f393bce453adeef16896751f14ba3e2024bc21dc9597646/jsonschema_path-0.3.4-py3-none-any.whl", hash = "sha256:f502191fdc2b22050f9a81c9237be9d27145b9001c55842bece5e94e382e52f8", size = 14810, upload-time = "2025-01-24T14:33:14.652Z" }, +] + +[[package]] +name = "jsonschema-specifications" +version = "2025.9.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "referencing" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/19/74/a633ee74eb36c44aa6d1095e7cc5569bebf04342ee146178e2d36600708b/jsonschema_specifications-2025.9.1.tar.gz", hash = "sha256:b540987f239e745613c7a9176f3edb72b832a4ac465cf02712288397832b5e8d", size = 32855, upload-time = "2025-09-08T01:34:59.186Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/41/45/1a4ed80516f02155c51f51e8cedb3c1902296743db0bbc66608a0db2814f/jsonschema_specifications-2025.9.1-py3-none-any.whl", hash = "sha256:98802fee3a11ee76ecaca44429fda8a41bff98b00a0f2838151b113f210cc6fe", size = 18437, upload-time = "2025-09-08T01:34:57.871Z" }, +] + +[[package]] +name = "kiwisolver" +version = "1.4.9" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/5c/3c/85844f1b0feb11ee581ac23fe5fce65cd049a200c1446708cc1b7f922875/kiwisolver-1.4.9.tar.gz", hash = "sha256:c3b22c26c6fd6811b0ae8363b95ca8ce4ea3c202d3d0975b2914310ceb1bcc4d", size = 97564, upload-time = "2025-08-10T21:27:49.279Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6f/ab/c80b0d5a9d8a1a65f4f815f2afff9798b12c3b9f31f1d304dd233dd920e2/kiwisolver-1.4.9-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:eb14a5da6dc7642b0f3a18f13654847cd8b7a2550e2645a5bda677862b03ba16", size = 124167, upload-time = "2025-08-10T21:25:53.403Z" }, + { url = "https://files.pythonhosted.org/packages/a0/c0/27fe1a68a39cf62472a300e2879ffc13c0538546c359b86f149cc19f6ac3/kiwisolver-1.4.9-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:39a219e1c81ae3b103643d2aedb90f1ef22650deb266ff12a19e7773f3e5f089", size = 66579, upload-time = "2025-08-10T21:25:54.79Z" }, + { url = "https://files.pythonhosted.org/packages/31/a2/a12a503ac1fd4943c50f9822678e8015a790a13b5490354c68afb8489814/kiwisolver-1.4.9-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:2405a7d98604b87f3fc28b1716783534b1b4b8510d8142adca34ee0bc3c87543", size = 65309, upload-time = "2025-08-10T21:25:55.76Z" }, + { url = "https://files.pythonhosted.org/packages/66/e1/e533435c0be77c3f64040d68d7a657771194a63c279f55573188161e81ca/kiwisolver-1.4.9-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:dc1ae486f9abcef254b5618dfb4113dd49f94c68e3e027d03cf0143f3f772b61", size = 1435596, upload-time = "2025-08-10T21:25:56.861Z" }, + { url = "https://files.pythonhosted.org/packages/67/1e/51b73c7347f9aabdc7215aa79e8b15299097dc2f8e67dee2b095faca9cb0/kiwisolver-1.4.9-cp311-cp311-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8a1f570ce4d62d718dce3f179ee78dac3b545ac16c0c04bb363b7607a949c0d1", size = 1246548, upload-time = "2025-08-10T21:25:58.246Z" }, + { url = "https://files.pythonhosted.org/packages/21/aa/72a1c5d1e430294f2d32adb9542719cfb441b5da368d09d268c7757af46c/kiwisolver-1.4.9-cp311-cp311-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:cb27e7b78d716c591e88e0a09a2139c6577865d7f2e152488c2cc6257f460872", size = 1263618, upload-time = "2025-08-10T21:25:59.857Z" }, + { url = "https://files.pythonhosted.org/packages/a3/af/db1509a9e79dbf4c260ce0cfa3903ea8945f6240e9e59d1e4deb731b1a40/kiwisolver-1.4.9-cp311-cp311-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:15163165efc2f627eb9687ea5f3a28137217d217ac4024893d753f46bce9de26", size = 1317437, upload-time = "2025-08-10T21:26:01.105Z" }, + { url = "https://files.pythonhosted.org/packages/e0/f2/3ea5ee5d52abacdd12013a94130436e19969fa183faa1e7c7fbc89e9a42f/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:bdee92c56a71d2b24c33a7d4c2856bd6419d017e08caa7802d2963870e315028", size = 2195742, upload-time = "2025-08-10T21:26:02.675Z" }, + { url = "https://files.pythonhosted.org/packages/6f/9b/1efdd3013c2d9a2566aa6a337e9923a00590c516add9a1e89a768a3eb2fc/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:412f287c55a6f54b0650bd9b6dce5aceddb95864a1a90c87af16979d37c89771", size = 2290810, upload-time = "2025-08-10T21:26:04.009Z" }, + { url = "https://files.pythonhosted.org/packages/fb/e5/cfdc36109ae4e67361f9bc5b41323648cb24a01b9ade18784657e022e65f/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2c93f00dcba2eea70af2be5f11a830a742fe6b579a1d4e00f47760ef13be247a", size = 2461579, upload-time = "2025-08-10T21:26:05.317Z" }, + { url = "https://files.pythonhosted.org/packages/62/86/b589e5e86c7610842213994cdea5add00960076bef4ae290c5fa68589cac/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f117e1a089d9411663a3207ba874f31be9ac8eaa5b533787024dc07aeb74f464", size = 2268071, upload-time = "2025-08-10T21:26:06.686Z" }, + { url = "https://files.pythonhosted.org/packages/3b/c6/f8df8509fd1eee6c622febe54384a96cfaf4d43bf2ccec7a0cc17e4715c9/kiwisolver-1.4.9-cp311-cp311-win_amd64.whl", hash = "sha256:be6a04e6c79819c9a8c2373317d19a96048e5a3f90bec587787e86a1153883c2", size = 73840, upload-time = "2025-08-10T21:26:07.94Z" }, + { url = "https://files.pythonhosted.org/packages/e2/2d/16e0581daafd147bc11ac53f032a2b45eabac897f42a338d0a13c1e5c436/kiwisolver-1.4.9-cp311-cp311-win_arm64.whl", hash = "sha256:0ae37737256ba2de764ddc12aed4956460277f00c4996d51a197e72f62f5eec7", size = 65159, upload-time = "2025-08-10T21:26:09.048Z" }, + { url = "https://files.pythonhosted.org/packages/86/c9/13573a747838aeb1c76e3267620daa054f4152444d1f3d1a2324b78255b5/kiwisolver-1.4.9-cp312-cp312-macosx_10_13_universal2.whl", hash = 
"sha256:ac5a486ac389dddcc5bef4f365b6ae3ffff2c433324fb38dd35e3fab7c957999", size = 123686, upload-time = "2025-08-10T21:26:10.034Z" }, + { url = "https://files.pythonhosted.org/packages/51/ea/2ecf727927f103ffd1739271ca19c424d0e65ea473fbaeea1c014aea93f6/kiwisolver-1.4.9-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f2ba92255faa7309d06fe44c3a4a97efe1c8d640c2a79a5ef728b685762a6fd2", size = 66460, upload-time = "2025-08-10T21:26:11.083Z" }, + { url = "https://files.pythonhosted.org/packages/5b/5a/51f5464373ce2aeb5194508298a508b6f21d3867f499556263c64c621914/kiwisolver-1.4.9-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4a2899935e724dd1074cb568ce7ac0dce28b2cd6ab539c8e001a8578eb106d14", size = 64952, upload-time = "2025-08-10T21:26:12.058Z" }, + { url = "https://files.pythonhosted.org/packages/70/90/6d240beb0f24b74371762873e9b7f499f1e02166a2d9c5801f4dbf8fa12e/kiwisolver-1.4.9-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f6008a4919fdbc0b0097089f67a1eb55d950ed7e90ce2cc3e640abadd2757a04", size = 1474756, upload-time = "2025-08-10T21:26:13.096Z" }, + { url = "https://files.pythonhosted.org/packages/12/42/f36816eaf465220f683fb711efdd1bbf7a7005a2473d0e4ed421389bd26c/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:67bb8b474b4181770f926f7b7d2f8c0248cbcb78b660fdd41a47054b28d2a752", size = 1276404, upload-time = "2025-08-10T21:26:14.457Z" }, + { url = "https://files.pythonhosted.org/packages/2e/64/bc2de94800adc830c476dce44e9b40fd0809cddeef1fde9fcf0f73da301f/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2327a4a30d3ee07d2fbe2e7933e8a37c591663b96ce42a00bc67461a87d7df77", size = 1294410, upload-time = "2025-08-10T21:26:15.73Z" }, + { url = "https://files.pythonhosted.org/packages/5f/42/2dc82330a70aa8e55b6d395b11018045e58d0bb00834502bf11509f79091/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:7a08b491ec91b1d5053ac177afe5290adacf1f0f6307d771ccac5de30592d198", size = 1343631, upload-time = "2025-08-10T21:26:17.045Z" }, + { url = "https://files.pythonhosted.org/packages/22/fd/f4c67a6ed1aab149ec5a8a401c323cee7a1cbe364381bb6c9c0d564e0e20/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d8fc5c867c22b828001b6a38d2eaeb88160bf5783c6cb4a5e440efc981ce286d", size = 2224963, upload-time = "2025-08-10T21:26:18.737Z" }, + { url = "https://files.pythonhosted.org/packages/45/aa/76720bd4cb3713314677d9ec94dcc21ced3f1baf4830adde5bb9b2430a5f/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:3b3115b2581ea35bb6d1f24a4c90af37e5d9b49dcff267eeed14c3893c5b86ab", size = 2321295, upload-time = "2025-08-10T21:26:20.11Z" }, + { url = "https://files.pythonhosted.org/packages/80/19/d3ec0d9ab711242f56ae0dc2fc5d70e298bb4a1f9dfab44c027668c673a1/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:858e4c22fb075920b96a291928cb7dea5644e94c0ee4fcd5af7e865655e4ccf2", size = 2487987, upload-time = "2025-08-10T21:26:21.49Z" }, + { url = "https://files.pythonhosted.org/packages/39/e9/61e4813b2c97e86b6fdbd4dd824bf72d28bcd8d4849b8084a357bc0dd64d/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ed0fecd28cc62c54b262e3736f8bb2512d8dcfdc2bcf08be5f47f96bf405b145", size = 2291817, upload-time = "2025-08-10T21:26:22.812Z" }, + { url = "https://files.pythonhosted.org/packages/a0/41/85d82b0291db7504da3c2defe35c9a8a5c9803a730f297bd823d11d5fb77/kiwisolver-1.4.9-cp312-cp312-win_amd64.whl", hash = 
"sha256:f68208a520c3d86ea51acf688a3e3002615a7f0238002cccc17affecc86a8a54", size = 73895, upload-time = "2025-08-10T21:26:24.37Z" }, + { url = "https://files.pythonhosted.org/packages/e2/92/5f3068cf15ee5cb624a0c7596e67e2a0bb2adee33f71c379054a491d07da/kiwisolver-1.4.9-cp312-cp312-win_arm64.whl", hash = "sha256:2c1a4f57df73965f3f14df20b80ee29e6a7930a57d2d9e8491a25f676e197c60", size = 64992, upload-time = "2025-08-10T21:26:25.732Z" }, + { url = "https://files.pythonhosted.org/packages/31/c1/c2686cda909742ab66c7388e9a1a8521a59eb89f8bcfbee28fc980d07e24/kiwisolver-1.4.9-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a5d0432ccf1c7ab14f9949eec60c5d1f924f17c037e9f8b33352fa05799359b8", size = 123681, upload-time = "2025-08-10T21:26:26.725Z" }, + { url = "https://files.pythonhosted.org/packages/ca/f0/f44f50c9f5b1a1860261092e3bc91ecdc9acda848a8b8c6abfda4a24dd5c/kiwisolver-1.4.9-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efb3a45b35622bb6c16dbfab491a8f5a391fe0e9d45ef32f4df85658232ca0e2", size = 66464, upload-time = "2025-08-10T21:26:27.733Z" }, + { url = "https://files.pythonhosted.org/packages/2d/7a/9d90a151f558e29c3936b8a47ac770235f436f2120aca41a6d5f3d62ae8d/kiwisolver-1.4.9-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1a12cf6398e8a0a001a059747a1cbf24705e18fe413bc22de7b3d15c67cffe3f", size = 64961, upload-time = "2025-08-10T21:26:28.729Z" }, + { url = "https://files.pythonhosted.org/packages/e9/e9/f218a2cb3a9ffbe324ca29a9e399fa2d2866d7f348ec3a88df87fc248fc5/kiwisolver-1.4.9-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b67e6efbf68e077dd71d1a6b37e43e1a99d0bff1a3d51867d45ee8908b931098", size = 1474607, upload-time = "2025-08-10T21:26:29.798Z" }, + { url = "https://files.pythonhosted.org/packages/d9/28/aac26d4c882f14de59041636292bc838db8961373825df23b8eeb807e198/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5656aa670507437af0207645273ccdfee4f14bacd7f7c67a4306d0dcaeaf6eed", size = 1276546, upload-time = "2025-08-10T21:26:31.401Z" }, + { url = "https://files.pythonhosted.org/packages/8b/ad/8bfc1c93d4cc565e5069162f610ba2f48ff39b7de4b5b8d93f69f30c4bed/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:bfc08add558155345129c7803b3671cf195e6a56e7a12f3dde7c57d9b417f525", size = 1294482, upload-time = "2025-08-10T21:26:32.721Z" }, + { url = "https://files.pythonhosted.org/packages/da/f1/6aca55ff798901d8ce403206d00e033191f63d82dd708a186e0ed2067e9c/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:40092754720b174e6ccf9e845d0d8c7d8e12c3d71e7fc35f55f3813e96376f78", size = 1343720, upload-time = "2025-08-10T21:26:34.032Z" }, + { url = "https://files.pythonhosted.org/packages/d1/91/eed031876c595c81d90d0f6fc681ece250e14bf6998c3d7c419466b523b7/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:497d05f29a1300d14e02e6441cf0f5ee81c1ff5a304b0d9fb77423974684e08b", size = 2224907, upload-time = "2025-08-10T21:26:35.824Z" }, + { url = "https://files.pythonhosted.org/packages/e9/ec/4d1925f2e49617b9cca9c34bfa11adefad49d00db038e692a559454dfb2e/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:bdd1a81a1860476eb41ac4bc1e07b3f07259e6d55bbf739b79c8aaedcf512799", size = 2321334, upload-time = "2025-08-10T21:26:37.534Z" }, + { url = "https://files.pythonhosted.org/packages/43/cb/450cd4499356f68802750c6ddc18647b8ea01ffa28f50d20598e0befe6e9/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_s390x.whl", hash = 
"sha256:e6b93f13371d341afee3be9f7c5964e3fe61d5fa30f6a30eb49856935dfe4fc3", size = 2488313, upload-time = "2025-08-10T21:26:39.191Z" }, + { url = "https://files.pythonhosted.org/packages/71/67/fc76242bd99f885651128a5d4fa6083e5524694b7c88b489b1b55fdc491d/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:d75aa530ccfaa593da12834b86a0724f58bff12706659baa9227c2ccaa06264c", size = 2291970, upload-time = "2025-08-10T21:26:40.828Z" }, + { url = "https://files.pythonhosted.org/packages/75/bd/f1a5d894000941739f2ae1b65a32892349423ad49c2e6d0771d0bad3fae4/kiwisolver-1.4.9-cp313-cp313-win_amd64.whl", hash = "sha256:dd0a578400839256df88c16abddf9ba14813ec5f21362e1fe65022e00c883d4d", size = 73894, upload-time = "2025-08-10T21:26:42.33Z" }, + { url = "https://files.pythonhosted.org/packages/95/38/dce480814d25b99a391abbddadc78f7c117c6da34be68ca8b02d5848b424/kiwisolver-1.4.9-cp313-cp313-win_arm64.whl", hash = "sha256:d4188e73af84ca82468f09cadc5ac4db578109e52acb4518d8154698d3a87ca2", size = 64995, upload-time = "2025-08-10T21:26:43.889Z" }, + { url = "https://files.pythonhosted.org/packages/e2/37/7d218ce5d92dadc5ebdd9070d903e0c7cf7edfe03f179433ac4d13ce659c/kiwisolver-1.4.9-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:5a0f2724dfd4e3b3ac5a82436a8e6fd16baa7d507117e4279b660fe8ca38a3a1", size = 126510, upload-time = "2025-08-10T21:26:44.915Z" }, + { url = "https://files.pythonhosted.org/packages/23/b0/e85a2b48233daef4b648fb657ebbb6f8367696a2d9548a00b4ee0eb67803/kiwisolver-1.4.9-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1b11d6a633e4ed84fc0ddafd4ebfd8ea49b3f25082c04ad12b8315c11d504dc1", size = 67903, upload-time = "2025-08-10T21:26:45.934Z" }, + { url = "https://files.pythonhosted.org/packages/44/98/f2425bc0113ad7de24da6bb4dae1343476e95e1d738be7c04d31a5d037fd/kiwisolver-1.4.9-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61874cdb0a36016354853593cffc38e56fc9ca5aa97d2c05d3dcf6922cd55a11", size = 66402, upload-time = "2025-08-10T21:26:47.101Z" }, + { url = "https://files.pythonhosted.org/packages/98/d8/594657886df9f34c4177cc353cc28ca7e6e5eb562d37ccc233bff43bbe2a/kiwisolver-1.4.9-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:60c439763a969a6af93b4881db0eed8fadf93ee98e18cbc35bc8da868d0c4f0c", size = 1582135, upload-time = "2025-08-10T21:26:48.665Z" }, + { url = "https://files.pythonhosted.org/packages/5c/c6/38a115b7170f8b306fc929e166340c24958347308ea3012c2b44e7e295db/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92a2f997387a1b79a75e7803aa7ded2cfbe2823852ccf1ba3bcf613b62ae3197", size = 1389409, upload-time = "2025-08-10T21:26:50.335Z" }, + { url = "https://files.pythonhosted.org/packages/bf/3b/e04883dace81f24a568bcee6eb3001da4ba05114afa622ec9b6fafdc1f5e/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a31d512c812daea6d8b3be3b2bfcbeb091dbb09177706569bcfc6240dcf8b41c", size = 1401763, upload-time = "2025-08-10T21:26:51.867Z" }, + { url = "https://files.pythonhosted.org/packages/9f/80/20ace48e33408947af49d7d15c341eaee69e4e0304aab4b7660e234d6288/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:52a15b0f35dad39862d376df10c5230155243a2c1a436e39eb55623ccbd68185", size = 1453643, upload-time = "2025-08-10T21:26:53.592Z" }, + { url = "https://files.pythonhosted.org/packages/64/31/6ce4380a4cd1f515bdda976a1e90e547ccd47b67a1546d63884463c92ca9/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = 
"sha256:a30fd6fdef1430fd9e1ba7b3398b5ee4e2887783917a687d86ba69985fb08748", size = 2330818, upload-time = "2025-08-10T21:26:55.051Z" }, + { url = "https://files.pythonhosted.org/packages/fa/e9/3f3fcba3bcc7432c795b82646306e822f3fd74df0ee81f0fa067a1f95668/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:cc9617b46837c6468197b5945e196ee9ca43057bb7d9d1ae688101e4e1dddf64", size = 2419963, upload-time = "2025-08-10T21:26:56.421Z" }, + { url = "https://files.pythonhosted.org/packages/99/43/7320c50e4133575c66e9f7dadead35ab22d7c012a3b09bb35647792b2a6d/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:0ab74e19f6a2b027ea4f845a78827969af45ce790e6cb3e1ebab71bdf9f215ff", size = 2594639, upload-time = "2025-08-10T21:26:57.882Z" }, + { url = "https://files.pythonhosted.org/packages/65/d6/17ae4a270d4a987ef8a385b906d2bdfc9fce502d6dc0d3aea865b47f548c/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dba5ee5d3981160c28d5490f0d1b7ed730c22470ff7f6cc26cfcfaacb9896a07", size = 2391741, upload-time = "2025-08-10T21:26:59.237Z" }, + { url = "https://files.pythonhosted.org/packages/2a/8f/8f6f491d595a9e5912971f3f863d81baddccc8a4d0c3749d6a0dd9ffc9df/kiwisolver-1.4.9-cp313-cp313t-win_arm64.whl", hash = "sha256:0749fd8f4218ad2e851e11cc4dc05c7cbc0cbc4267bdfdb31782e65aace4ee9c", size = 68646, upload-time = "2025-08-10T21:27:00.52Z" }, + { url = "https://files.pythonhosted.org/packages/6b/32/6cc0fbc9c54d06c2969faa9c1d29f5751a2e51809dd55c69055e62d9b426/kiwisolver-1.4.9-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:9928fe1eb816d11ae170885a74d074f57af3a0d65777ca47e9aeb854a1fba386", size = 123806, upload-time = "2025-08-10T21:27:01.537Z" }, + { url = "https://files.pythonhosted.org/packages/b2/dd/2bfb1d4a4823d92e8cbb420fe024b8d2167f72079b3bb941207c42570bdf/kiwisolver-1.4.9-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d0005b053977e7b43388ddec89fa567f43d4f6d5c2c0affe57de5ebf290dc552", size = 66605, upload-time = "2025-08-10T21:27:03.335Z" }, + { url = "https://files.pythonhosted.org/packages/f7/69/00aafdb4e4509c2ca6064646cba9cd4b37933898f426756adb2cb92ebbed/kiwisolver-1.4.9-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:2635d352d67458b66fd0667c14cb1d4145e9560d503219034a18a87e971ce4f3", size = 64925, upload-time = "2025-08-10T21:27:04.339Z" }, + { url = "https://files.pythonhosted.org/packages/43/dc/51acc6791aa14e5cb6d8a2e28cefb0dc2886d8862795449d021334c0df20/kiwisolver-1.4.9-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:767c23ad1c58c9e827b649a9ab7809fd5fd9db266a9cf02b0e926ddc2c680d58", size = 1472414, upload-time = "2025-08-10T21:27:05.437Z" }, + { url = "https://files.pythonhosted.org/packages/3d/bb/93fa64a81db304ac8a246f834d5094fae4b13baf53c839d6bb6e81177129/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:72d0eb9fba308b8311685c2268cf7d0a0639a6cd027d8128659f72bdd8a024b4", size = 1281272, upload-time = "2025-08-10T21:27:07.063Z" }, + { url = "https://files.pythonhosted.org/packages/70/e6/6df102916960fb8d05069d4bd92d6d9a8202d5a3e2444494e7cd50f65b7a/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f68e4f3eeca8fb22cc3d731f9715a13b652795ef657a13df1ad0c7dc0e9731df", size = 1298578, upload-time = "2025-08-10T21:27:08.452Z" }, + { url = "https://files.pythonhosted.org/packages/7c/47/e142aaa612f5343736b087864dbaebc53ea8831453fb47e7521fa8658f30/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:d84cd4061ae292d8ac367b2c3fa3aad11cb8625a95d135fe93f286f914f3f5a6", size = 1345607, upload-time = "2025-08-10T21:27:10.125Z" }, + { url = "https://files.pythonhosted.org/packages/54/89/d641a746194a0f4d1a3670fb900d0dbaa786fb98341056814bc3f058fa52/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:a60ea74330b91bd22a29638940d115df9dc00af5035a9a2a6ad9399ffb4ceca5", size = 2230150, upload-time = "2025-08-10T21:27:11.484Z" }, + { url = "https://files.pythonhosted.org/packages/aa/6b/5ee1207198febdf16ac11f78c5ae40861b809cbe0e6d2a8d5b0b3044b199/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:ce6a3a4e106cf35c2d9c4fa17c05ce0b180db622736845d4315519397a77beaf", size = 2325979, upload-time = "2025-08-10T21:27:12.917Z" }, + { url = "https://files.pythonhosted.org/packages/fc/ff/b269eefd90f4ae14dcc74973d5a0f6d28d3b9bb1afd8c0340513afe6b39a/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:77937e5e2a38a7b48eef0585114fe7930346993a88060d0bf886086d2aa49ef5", size = 2491456, upload-time = "2025-08-10T21:27:14.353Z" }, + { url = "https://files.pythonhosted.org/packages/fc/d4/10303190bd4d30de547534601e259a4fbf014eed94aae3e5521129215086/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:24c175051354f4a28c5d6a31c93906dc653e2bf234e8a4bbfb964892078898ce", size = 2294621, upload-time = "2025-08-10T21:27:15.808Z" }, + { url = "https://files.pythonhosted.org/packages/28/e0/a9a90416fce5c0be25742729c2ea52105d62eda6c4be4d803c2a7be1fa50/kiwisolver-1.4.9-cp314-cp314-win_amd64.whl", hash = "sha256:0763515d4df10edf6d06a3c19734e2566368980d21ebec439f33f9eb936c07b7", size = 75417, upload-time = "2025-08-10T21:27:17.436Z" }, + { url = "https://files.pythonhosted.org/packages/1f/10/6949958215b7a9a264299a7db195564e87900f709db9245e4ebdd3c70779/kiwisolver-1.4.9-cp314-cp314-win_arm64.whl", hash = "sha256:0e4e2bf29574a6a7b7f6cb5fa69293b9f96c928949ac4a53ba3f525dffb87f9c", size = 66582, upload-time = "2025-08-10T21:27:18.436Z" }, + { url = "https://files.pythonhosted.org/packages/ec/79/60e53067903d3bc5469b369fe0dfc6b3482e2133e85dae9daa9527535991/kiwisolver-1.4.9-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:d976bbb382b202f71c67f77b0ac11244021cfa3f7dfd9e562eefcea2df711548", size = 126514, upload-time = "2025-08-10T21:27:19.465Z" }, + { url = "https://files.pythonhosted.org/packages/25/d1/4843d3e8d46b072c12a38c97c57fab4608d36e13fe47d47ee96b4d61ba6f/kiwisolver-1.4.9-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2489e4e5d7ef9a1c300a5e0196e43d9c739f066ef23270607d45aba368b91f2d", size = 67905, upload-time = "2025-08-10T21:27:20.51Z" }, + { url = "https://files.pythonhosted.org/packages/8c/ae/29ffcbd239aea8b93108de1278271ae764dfc0d803a5693914975f200596/kiwisolver-1.4.9-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:e2ea9f7ab7fbf18fffb1b5434ce7c69a07582f7acc7717720f1d69f3e806f90c", size = 66399, upload-time = "2025-08-10T21:27:21.496Z" }, + { url = "https://files.pythonhosted.org/packages/a1/ae/d7ba902aa604152c2ceba5d352d7b62106bedbccc8e95c3934d94472bfa3/kiwisolver-1.4.9-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b34e51affded8faee0dfdb705416153819d8ea9250bbbf7ea1b249bdeb5f1122", size = 1582197, upload-time = "2025-08-10T21:27:22.604Z" }, + { url = "https://files.pythonhosted.org/packages/f2/41/27c70d427eddb8bc7e4f16420a20fefc6f480312122a59a959fdfe0445ad/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:d8aacd3d4b33b772542b2e01beb50187536967b514b00003bdda7589722d2a64", size = 1390125, upload-time = "2025-08-10T21:27:24.036Z" }, + { url = "https://files.pythonhosted.org/packages/41/42/b3799a12bafc76d962ad69083f8b43b12bf4fe78b097b12e105d75c9b8f1/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7cf974dd4e35fa315563ac99d6287a1024e4dc2077b8a7d7cd3d2fb65d283134", size = 1402612, upload-time = "2025-08-10T21:27:25.773Z" }, + { url = "https://files.pythonhosted.org/packages/d2/b5/a210ea073ea1cfaca1bb5c55a62307d8252f531beb364e18aa1e0888b5a0/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:85bd218b5ecfbee8c8a82e121802dcb519a86044c9c3b2e4aef02fa05c6da370", size = 1453990, upload-time = "2025-08-10T21:27:27.089Z" }, + { url = "https://files.pythonhosted.org/packages/5f/ce/a829eb8c033e977d7ea03ed32fb3c1781b4fa0433fbadfff29e39c676f32/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:0856e241c2d3df4efef7c04a1e46b1936b6120c9bcf36dd216e3acd84bc4fb21", size = 2331601, upload-time = "2025-08-10T21:27:29.343Z" }, + { url = "https://files.pythonhosted.org/packages/e0/4b/b5e97eb142eb9cd0072dacfcdcd31b1c66dc7352b0f7c7255d339c0edf00/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:9af39d6551f97d31a4deebeac6f45b156f9755ddc59c07b402c148f5dbb6482a", size = 2422041, upload-time = "2025-08-10T21:27:30.754Z" }, + { url = "https://files.pythonhosted.org/packages/40/be/8eb4cd53e1b85ba4edc3a9321666f12b83113a178845593307a3e7891f44/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:bb4ae2b57fc1d8cbd1cf7b1d9913803681ffa903e7488012be5b76dedf49297f", size = 2594897, upload-time = "2025-08-10T21:27:32.803Z" }, + { url = "https://files.pythonhosted.org/packages/99/dd/841e9a66c4715477ea0abc78da039832fbb09dac5c35c58dc4c41a407b8a/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:aedff62918805fb62d43a4aa2ecd4482c380dc76cd31bd7c8878588a61bd0369", size = 2391835, upload-time = "2025-08-10T21:27:34.23Z" }, + { url = "https://files.pythonhosted.org/packages/0c/28/4b2e5c47a0da96896fdfdb006340ade064afa1e63675d01ea5ac222b6d52/kiwisolver-1.4.9-cp314-cp314t-win_amd64.whl", hash = "sha256:1fa333e8b2ce4d9660f2cda9c0e1b6bafcfb2457a9d259faa82289e73ec24891", size = 79988, upload-time = "2025-08-10T21:27:35.587Z" }, + { url = "https://files.pythonhosted.org/packages/80/be/3578e8afd18c88cdf9cb4cffde75a96d2be38c5a903f1ed0ceec061bd09e/kiwisolver-1.4.9-cp314-cp314t-win_arm64.whl", hash = "sha256:4a48a2ce79d65d363597ef7b567ce3d14d68783d2b2263d98db3d9477805ba32", size = 70260, upload-time = "2025-08-10T21:27:36.606Z" }, + { url = "https://files.pythonhosted.org/packages/a3/0f/36d89194b5a32c054ce93e586d4049b6c2c22887b0eb229c61c68afd3078/kiwisolver-1.4.9-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:720e05574713db64c356e86732c0f3c5252818d05f9df320f0ad8380641acea5", size = 60104, upload-time = "2025-08-10T21:27:43.287Z" }, + { url = "https://files.pythonhosted.org/packages/52/ba/4ed75f59e4658fd21fe7dde1fee0ac397c678ec3befba3fe6482d987af87/kiwisolver-1.4.9-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:17680d737d5335b552994a2008fab4c851bcd7de33094a82067ef3a576ff02fa", size = 58592, upload-time = "2025-08-10T21:27:44.314Z" }, + { url = "https://files.pythonhosted.org/packages/33/01/a8ea7c5ea32a9b45ceeaee051a04c8ed4320f5add3c51bfa20879b765b70/kiwisolver-1.4.9-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = 
"sha256:85b5352f94e490c028926ea567fc569c52ec79ce131dadb968d3853e809518c2", size = 80281, upload-time = "2025-08-10T21:27:45.369Z" }, + { url = "https://files.pythonhosted.org/packages/da/e3/dbd2ecdce306f1d07a1aaf324817ee993aab7aee9db47ceac757deabafbe/kiwisolver-1.4.9-pp311-pypy311_pp73-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:464415881e4801295659462c49461a24fb107c140de781d55518c4b80cb6790f", size = 78009, upload-time = "2025-08-10T21:27:46.376Z" }, + { url = "https://files.pythonhosted.org/packages/da/e9/0d4add7873a73e462aeb45c036a2dead2562b825aa46ba326727b3f31016/kiwisolver-1.4.9-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:fb940820c63a9590d31d88b815e7a3aa5915cad3ce735ab45f0c730b39547de1", size = 73929, upload-time = "2025-08-10T21:27:48.236Z" }, +] + +[[package]] +name = "kuzu" +version = "0.11.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d8/7c/d2c9355054a67a79ec0cc516b3fad68d970245a1a6f5173eaa2bf94d1782/kuzu-0.11.0.tar.gz", hash = "sha256:34b9fe2d9f94421585f921cb0513bd584842a5705ae757c09fd075e23acb42d7", size = 4897335, upload-time = "2025-07-13T18:37:37.009Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c7/32/f60c8cd9f3ceb4ff75fb4a2e9c9ea02ad40ae50323e14f71fd8440c4eb70/kuzu-0.11.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8f20ea8c608bb40d6e15d32538a903ace177464c90aa88ee542e99814bf78881", size = 3694199, upload-time = "2025-07-13T18:36:46.867Z" }, + { url = "https://files.pythonhosted.org/packages/a7/33/544e65c08ce49f41e2ee35cd8576df602c87cc58b033cd10f9d7847cc98f/kuzu-0.11.0-cp311-cp311-macosx_11_0_x86_64.whl", hash = "sha256:df94c3beaf57d2c3ac84ce4087fc210c09e9ff5b5c9863a496b274bbc82f0a3f", size = 4092338, upload-time = "2025-07-13T18:36:48.185Z" }, + { url = "https://files.pythonhosted.org/packages/46/da/bd305260c82fe40d1d1e1710cd20a538160c0dd858559568cebe5e3ad5b7/kuzu-0.11.0-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0457283aaf75bcd7912dcdf0292adaabdd615db654b09435387637a70cbae28d", size = 6201525, upload-time = "2025-07-13T18:36:49.601Z" }, + { url = "https://files.pythonhosted.org/packages/40/98/dfc00fca1c126a2eb678cb75cca4d966b902450f5215cab1ca221bb0dbc9/kuzu-0.11.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3a474c74aa7953cca399862dce2098fc5bbc94f4d83b04d891688fe6fb2e14c4", size = 6980556, upload-time = "2025-07-13T18:36:51.402Z" }, + { url = "https://files.pythonhosted.org/packages/4d/58/fe2f00687531c02b6b4a636a4ff2603d161d504ace4ca2d01878db87793a/kuzu-0.11.0-cp311-cp311-win_amd64.whl", hash = "sha256:139c0b52cc2037ee03243335f37734fc30fe20b8d94b7dea66a1ee8ad44e5b16", size = 4289032, upload-time = "2025-07-13T18:36:53.395Z" }, + { url = "https://files.pythonhosted.org/packages/08/ee/c172bd487e6b11734db2febc03f0b5517225bafbfe144a080f265569b010/kuzu-0.11.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f200955e3af6a64ecb3f8db24e88d2620e4f04cfe958f580d614d6fca4b7b73d", size = 3693481, upload-time = "2025-07-13T18:36:54.828Z" }, + { url = "https://files.pythonhosted.org/packages/56/c0/1a4f466366454e0657e3f6de8b9fd649a2a12e7c72d86f1d341dc264e927/kuzu-0.11.0-cp312-cp312-macosx_11_0_x86_64.whl", hash = "sha256:6de9af1886401cdec89e41bbe67fdd37b562bdc39ad81b4cc62c4c7e5703e23e", size = 4094896, upload-time = "2025-07-13T18:36:56.204Z" }, + { url = 
"https://files.pythonhosted.org/packages/b4/02/387ad1d493944f5ab7b2dc521ee5adf45b2e6d1b549e6ed1876192c847bd/kuzu-0.11.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3843f4107c287c9759d34e0082feea84d8f48366033ea191a58518baf3a9e2d8", size = 6201276, upload-time = "2025-07-13T18:36:57.617Z" }, + { url = "https://files.pythonhosted.org/packages/2f/e5/678dab0df8cd47b61d4d82f9ba4fd46e92f98689ec4031c19911880dbce8/kuzu-0.11.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3c0bdb0cbc7be83eb3e5e6c999e8c6add4cb88d26468c67205b3062fe01af859", size = 6979740, upload-time = "2025-07-13T18:36:59.293Z" }, + { url = "https://files.pythonhosted.org/packages/bb/ff/8368ed24f2cd90769b604b6c86ee9f01adcc024adc4a6f0ef4564a484672/kuzu-0.11.0-cp312-cp312-win_amd64.whl", hash = "sha256:a74660da390adb1996b5c8305e442304bfdb84b40424f2b045a7d4977ae22f34", size = 4289700, upload-time = "2025-07-13T18:37:01.023Z" }, + { url = "https://files.pythonhosted.org/packages/e7/22/b1577470c1e142272cc3646cd68ec13dc06b68bfe26869c1339e3ba8a1b0/kuzu-0.11.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d3b928a6646aad0a4284a07918140761f70626e936976c7bc9a1504395029353", size = 3693508, upload-time = "2025-07-13T18:37:02.4Z" }, + { url = "https://files.pythonhosted.org/packages/af/7c/c97de999c782860bff2a223d07afaa71c9ae4e0a214a1d7c3db866cf9157/kuzu-0.11.0-cp313-cp313-macosx_11_0_x86_64.whl", hash = "sha256:5a995172d99e961fe2ff073722a447d335dca608d566fc924520f1bfea4f97cf", size = 4095016, upload-time = "2025-07-13T18:37:03.742Z" }, + { url = "https://files.pythonhosted.org/packages/2a/df/c9d63b4a3835b944d042add771bdfbaca5bd61a1490b78492e4e299c948f/kuzu-0.11.0-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:836af97ba5159a59e55cb336869f45987d74d9875bd97caae31af5244f8b99e8", size = 6201752, upload-time = "2025-07-13T18:37:05.756Z" }, + { url = "https://files.pythonhosted.org/packages/e6/8d/55226444b7607d81299e3ff1d47ae4ad76149c0fd266ae7fe04eab52060e/kuzu-0.11.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7ee8559686eac9f874d125708f9a83f1dca09bb165e5b838c6c0ad521cce68ee", size = 6979587, upload-time = "2025-07-13T18:37:07.468Z" }, + { url = "https://files.pythonhosted.org/packages/a7/19/1e19851f7229953cd696df9983b953dcc2c0cc1f0ae81e02be9eddd2b379/kuzu-0.11.0-cp313-cp313-win_amd64.whl", hash = "sha256:7ae94e8add6b5cc25f3cf2a38a07f3c4a4acb9b636078be8a53ac3e8f736d6ba", size = 4289847, upload-time = "2025-07-13T18:37:09.08Z" }, + { url = "https://files.pythonhosted.org/packages/9f/2a/f4579d9b7a8dd205bfc1af89596ed3cbcfea3c0bdf14206083fea509c545/kuzu-0.11.0-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3667b430de2efbc96e45878e460851d1aa8aa94be96fa5d4d82186f19a95889a", size = 6204963, upload-time = "2025-07-13T18:37:10.637Z" }, + { url = "https://files.pythonhosted.org/packages/ff/bd/a827d5eff7a7abd577841bbe71f8df485501ca8f0250ddbe29c7edf67e6e/kuzu-0.11.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4162d80861e606f4d82d6e559fc11c0d7efa7725a6dc811c61bcd266a2963705", size = 6982953, upload-time = "2025-07-13T18:37:12.429Z" }, + { url = "https://files.pythonhosted.org/packages/03/19/6d41056e2d429ddb19396d992dee5f7804cdb3bee160d53c3cbf97c0f251/kuzu-0.11.0-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7da89fb506be064ebb7d3954f9ffb6e9c0f9ef9c10f37be59a347a0bc48efd28", size = 6202100, upload-time = "2025-07-13T18:37:14.156Z" }, + 
{ url = "https://files.pythonhosted.org/packages/ea/a7/13585d872b65263da8e83c77100914fbaafe91fea11160151a61cf111e03/kuzu-0.11.0-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b17cc92a925073a3bbd65e05af59a9c0c931e1573755d7ad340705059d849af7", size = 6205072, upload-time = "2025-07-13T18:37:15.907Z" }, +] + +[[package]] +name = "lance-namespace" +version = "0.0.6" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "lance-namespace-urllib3-client" }, + { name = "pyarrow" }, + { name = "pylance" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/55/07/5e809f1053a53bdbe0a8f461a710bbf7e1b3119e1432a60b46b648d51ba3/lance_namespace-0.0.6.tar.gz", hash = "sha256:3eeeba5f6bb8d01504cda33d86e6c22bd9cefb1f6f3aac1f963d46a9ff09b9a0", size = 11973, upload-time = "2025-08-20T19:28:03.213Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/25/c1/35bb590f9a9421f02b5d4440c975b6852becaad8292b5007994a8d3fe0cd/lance_namespace-0.0.6-py3-none-any.whl", hash = "sha256:fd102aec0ca3672b15cae65f4b9bf15086f7a73cedb7f5c12c47b5b48f9090b4", size = 9050, upload-time = "2025-08-20T19:28:02.535Z" }, +] + +[[package]] +name = "lance-namespace-urllib3-client" +version = "0.0.14" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "python-dateutil" }, + { name = "typing-extensions" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/43/09/727f5749da387a16ffd342339d859073e950ae451f66554bfba8e8adac71/lance_namespace_urllib3_client-0.0.14.tar.gz", hash = "sha256:911c6a3b5c2c98f4239b6d96609cf840e740c3af5482f5fb22096afb9db1dc1c", size = 134488, upload-time = "2025-09-02T03:48:43.108Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ca/90/ceb58b9a9f3aca0af1c294d71115ee9d44d6d82e0c9dc57d6743574d6358/lance_namespace_urllib3_client-0.0.14-py3-none-any.whl", hash = "sha256:40277cfcf7c9084419c2784e7924b3e316f6fe5b8057f4dc62a49f3b40c2d80c", size = 229639, upload-time = "2025-09-02T03:48:41.975Z" }, +] + +[[package]] +name = "lancedb" +version = "0.25.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "deprecation" }, + { name = "lance-namespace" }, + { name = "numpy" }, + { name = "overrides" }, + { name = "packaging" }, + { name = "pyarrow" }, + { name = "pydantic" }, + { name = "tqdm" }, +] +wheels = [ + { url = "https://files.pythonhosted.org/packages/a2/e7/10953deea89b06ae5bc568169d5ae888ff6df314decb92b9b3e453f53f0b/lancedb-0.25.0-cp39-abi3-macosx_10_15_x86_64.whl", hash = "sha256:ae2e80b7b3be3fa4d92fc8d500f47549dd1f8d28ca5092f1c898b92d0cfd4393", size = 34171227, upload-time = "2025-09-04T11:05:31.327Z" }, + { url = "https://files.pythonhosted.org/packages/55/7f/2874a3709f1b8c487e707e171c9004a9240af3af0fd7a247b9187bb6e0f7/lancedb-0.25.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:a9d67ea9edffa596c6f190151fdd535da8e355a4fd1979c1dc19d540a5665916", size = 31552856, upload-time = "2025-09-04T09:46:50.788Z" }, + { url = "https://files.pythonhosted.org/packages/e3/e9/faab70ad918576ed3bb7cb936474137ac265ac3026d3e16e30cd4d3daac2/lancedb-0.25.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8fe20079ed86b1ab75c65dcfc920a9646c835e9c40ef825cadd148c11b0001e", size = 32487962, upload-time = "2025-09-04T08:51:35.358Z" }, + { url = 
"https://files.pythonhosted.org/packages/ce/40/5471bc8115f287040b5afdf9d7a20c4685ec16cddb4a7da79e7c1f63914e/lancedb-0.25.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b37bc402d85c83e454d9f2e79480b31acc5904bb159a4fc715032c7560494157", size = 35726794, upload-time = "2025-09-04T08:57:30.554Z" }, + { url = "https://files.pythonhosted.org/packages/47/5e/aa3d9d2c7a834a9aa539b2b1c731ab860f7e32e2c87b9086ad233ecb13cd/lancedb-0.25.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f9bbc20bd1e64be359ca11c90428c00b0062d26b0291bddf32ab5471a3525c76", size = 32492508, upload-time = "2025-09-04T08:53:54.661Z" }, + { url = "https://files.pythonhosted.org/packages/fa/37/75f4e3ed7fa00a2cd5d321e8bf13441cdb61a83fbbcd0fa0f1a7241affe1/lancedb-0.25.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1306be9c08e208a5bcb5188275f47f962c2eda96369fad5949a3ddaf592afc6d", size = 35776383, upload-time = "2025-09-04T08:57:18.737Z" }, + { url = "https://files.pythonhosted.org/packages/b5/af/eb217ea1daab5c28ce4c764d2f672f4e3a5bcd3d4faf7921a8ee28c6cb5b/lancedb-0.25.0-cp39-abi3-win_amd64.whl", hash = "sha256:f66283e5d63c99c2bfbd4eaa134d9a5c5b0145eb26a972648214f8ba87777e24", size = 37826272, upload-time = "2025-09-04T09:15:23.729Z" }, +] + +[[package]] +name = "langfuse" +version = "2.60.10" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "backoff" }, + { name = "httpx" }, + { name = "idna" }, + { name = "packaging" }, + { name = "pydantic" }, + { name = "requests" }, + { name = "wrapt" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/eb/45/77fdf53c9e9f49bb78f72eba3f992f2f3d8343e05976aabfe1fca276a640/langfuse-2.60.10.tar.gz", hash = "sha256:a26d0d927a28ee01b2d12bb5b862590b643cc4e60a28de6e2b0c2cfff5dbfc6a", size = 152648, upload-time = "2025-09-16T15:08:12.426Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/76/69/08584fbd69e14398d3932a77d0c8d7e20389da3e6470210d6719afba2801/langfuse-2.60.10-py3-none-any.whl", hash = "sha256:815c6369194aa5b2a24f88eb9952f7c3fc863272c41e90642a71f3bc76f4a11f", size = 275568, upload-time = "2025-09-16T15:08:10.166Z" }, +] + +[[package]] +name = "lazy-object-proxy" +version = "1.12.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/08/a2/69df9c6ba6d316cfd81fe2381e464db3e6de5db45f8c43c6a23504abf8cb/lazy_object_proxy-1.12.0.tar.gz", hash = "sha256:1f5a462d92fd0cfb82f1fab28b51bfb209fabbe6aabf7f0d51472c0c124c0c61", size = 43681, upload-time = "2025-08-22T13:50:06.783Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/01/b3/4684b1e128a87821e485f5a901b179790e6b5bc02f89b7ee19c23be36ef3/lazy_object_proxy-1.12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1cf69cd1a6c7fe2dbcc3edaa017cf010f4192e53796538cc7d5e1fedbfa4bcff", size = 26656, upload-time = "2025-08-22T13:42:30.605Z" }, + { url = "https://files.pythonhosted.org/packages/3a/03/1bdc21d9a6df9ff72d70b2ff17d8609321bea4b0d3cffd2cea92fb2ef738/lazy_object_proxy-1.12.0-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:efff4375a8c52f55a145dc8487a2108c2140f0bec4151ab4e1843e52eb9987ad", size = 68832, upload-time = "2025-08-22T13:42:31.675Z" }, + { url = "https://files.pythonhosted.org/packages/3d/4b/5788e5e8bd01d19af71e50077ab020bc5cce67e935066cd65e1215a09ff9/lazy_object_proxy-1.12.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:1192e8c2f1031a6ff453ee40213afa01ba765b3dc861302cd91dbdb2e2660b00", size = 69148, upload-time = "2025-08-22T13:42:32.876Z" }, + { url = "https://files.pythonhosted.org/packages/79/0e/090bf070f7a0de44c61659cb7f74c2fe02309a77ca8c4b43adfe0b695f66/lazy_object_proxy-1.12.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:3605b632e82a1cbc32a1e5034278a64db555b3496e0795723ee697006b980508", size = 67800, upload-time = "2025-08-22T13:42:34.054Z" }, + { url = "https://files.pythonhosted.org/packages/cf/d2/b320325adbb2d119156f7c506a5fbfa37fcab15c26d13cf789a90a6de04e/lazy_object_proxy-1.12.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a61095f5d9d1a743e1e20ec6d6db6c2ca511961777257ebd9b288951b23b44fa", size = 68085, upload-time = "2025-08-22T13:42:35.197Z" }, + { url = "https://files.pythonhosted.org/packages/6a/48/4b718c937004bf71cd82af3713874656bcb8d0cc78600bf33bb9619adc6c/lazy_object_proxy-1.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:997b1d6e10ecc6fb6fe0f2c959791ae59599f41da61d652f6c903d1ee58b7370", size = 26535, upload-time = "2025-08-22T13:42:36.521Z" }, + { url = "https://files.pythonhosted.org/packages/0d/1b/b5f5bd6bda26f1e15cd3232b223892e4498e34ec70a7f4f11c401ac969f1/lazy_object_proxy-1.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8ee0d6027b760a11cc18281e702c0309dd92da458a74b4c15025d7fc490deede", size = 26746, upload-time = "2025-08-22T13:42:37.572Z" }, + { url = "https://files.pythonhosted.org/packages/55/64/314889b618075c2bfc19293ffa9153ce880ac6153aacfd0a52fcabf21a66/lazy_object_proxy-1.12.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4ab2c584e3cc8be0dfca422e05ad30a9abe3555ce63e9ab7a559f62f8dbc6ff9", size = 71457, upload-time = "2025-08-22T13:42:38.743Z" }, + { url = "https://files.pythonhosted.org/packages/11/53/857fc2827fc1e13fbdfc0ba2629a7d2579645a06192d5461809540b78913/lazy_object_proxy-1.12.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:14e348185adbd03ec17d051e169ec45686dcd840a3779c9d4c10aabe2ca6e1c0", size = 71036, upload-time = "2025-08-22T13:42:40.184Z" }, + { url = "https://files.pythonhosted.org/packages/2b/24/e581ffed864cd33c1b445b5763d617448ebb880f48675fc9de0471a95cbc/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c4fcbe74fb85df8ba7825fa05eddca764138da752904b378f0ae5ab33a36c308", size = 69329, upload-time = "2025-08-22T13:42:41.311Z" }, + { url = "https://files.pythonhosted.org/packages/78/be/15f8f5a0b0b2e668e756a152257d26370132c97f2f1943329b08f057eff0/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:563d2ec8e4d4b68ee7848c5ab4d6057a6d703cb7963b342968bb8758dda33a23", size = 70690, upload-time = "2025-08-22T13:42:42.51Z" }, + { url = "https://files.pythonhosted.org/packages/5d/aa/f02be9bbfb270e13ee608c2b28b8771f20a5f64356c6d9317b20043c6129/lazy_object_proxy-1.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:53c7fd99eb156bbb82cbc5d5188891d8fdd805ba6c1e3b92b90092da2a837073", size = 26563, upload-time = "2025-08-22T13:42:43.685Z" }, + { url = "https://files.pythonhosted.org/packages/f4/26/b74c791008841f8ad896c7f293415136c66cc27e7c7577de4ee68040c110/lazy_object_proxy-1.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:86fd61cb2ba249b9f436d789d1356deae69ad3231dc3c0f17293ac535162672e", size = 26745, upload-time = "2025-08-22T13:42:44.982Z" }, + { url = 
"https://files.pythonhosted.org/packages/9b/52/641870d309e5d1fb1ea7d462a818ca727e43bfa431d8c34b173eb090348c/lazy_object_proxy-1.12.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:81d1852fb30fab81696f93db1b1e55a5d1ff7940838191062f5f56987d5fcc3e", size = 71537, upload-time = "2025-08-22T13:42:46.141Z" }, + { url = "https://files.pythonhosted.org/packages/47/b6/919118e99d51c5e76e8bf5a27df406884921c0acf2c7b8a3b38d847ab3e9/lazy_object_proxy-1.12.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:be9045646d83f6c2664c1330904b245ae2371b5c57a3195e4028aedc9f999655", size = 71141, upload-time = "2025-08-22T13:42:47.375Z" }, + { url = "https://files.pythonhosted.org/packages/e5/47/1d20e626567b41de085cf4d4fb3661a56c159feaa73c825917b3b4d4f806/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:67f07ab742f1adfb3966c40f630baaa7902be4222a17941f3d85fd1dae5565ff", size = 69449, upload-time = "2025-08-22T13:42:48.49Z" }, + { url = "https://files.pythonhosted.org/packages/58/8d/25c20ff1a1a8426d9af2d0b6f29f6388005fc8cd10d6ee71f48bff86fdd0/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:75ba769017b944fcacbf6a80c18b2761a1795b03f8899acdad1f1c39db4409be", size = 70744, upload-time = "2025-08-22T13:42:49.608Z" }, + { url = "https://files.pythonhosted.org/packages/c0/67/8ec9abe15c4f8a4bcc6e65160a2c667240d025cbb6591b879bea55625263/lazy_object_proxy-1.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:7b22c2bbfb155706b928ac4d74c1a63ac8552a55ba7fff4445155523ea4067e1", size = 26568, upload-time = "2025-08-22T13:42:57.719Z" }, + { url = "https://files.pythonhosted.org/packages/23/12/cd2235463f3469fd6c62d41d92b7f120e8134f76e52421413a0ad16d493e/lazy_object_proxy-1.12.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4a79b909aa16bde8ae606f06e6bbc9d3219d2e57fb3e0076e17879072b742c65", size = 27391, upload-time = "2025-08-22T13:42:50.62Z" }, + { url = "https://files.pythonhosted.org/packages/60/9e/f1c53e39bbebad2e8609c67d0830cc275f694d0ea23d78e8f6db526c12d3/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:338ab2f132276203e404951205fe80c3fd59429b3a724e7b662b2eb539bb1be9", size = 80552, upload-time = "2025-08-22T13:42:51.731Z" }, + { url = "https://files.pythonhosted.org/packages/4c/b6/6c513693448dcb317d9d8c91d91f47addc09553613379e504435b4cc8b3e/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8c40b3c9faee2e32bfce0df4ae63f4e73529766893258eca78548bac801c8f66", size = 82857, upload-time = "2025-08-22T13:42:53.225Z" }, + { url = "https://files.pythonhosted.org/packages/12/1c/d9c4aaa4c75da11eb7c22c43d7c90a53b4fca0e27784a5ab207768debea7/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:717484c309df78cedf48396e420fa57fc8a2b1f06ea889df7248fdd156e58847", size = 80833, upload-time = "2025-08-22T13:42:54.391Z" }, + { url = "https://files.pythonhosted.org/packages/0b/ae/29117275aac7d7d78ae4f5a4787f36ff33262499d486ac0bf3e0b97889f6/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a6b7ea5ea1ffe15059eb44bcbcb258f97bcb40e139b88152c40d07b1a1dfc9ac", size = 79516, upload-time = "2025-08-22T13:42:55.812Z" }, + { url = "https://files.pythonhosted.org/packages/19/40/b4e48b2c38c69392ae702ae7afa7b6551e0ca5d38263198b7c79de8b3bdf/lazy_object_proxy-1.12.0-cp313-cp313t-win_amd64.whl", hash = 
"sha256:08c465fb5cd23527512f9bd7b4c7ba6cec33e28aad36fbbe46bf7b858f9f3f7f", size = 27656, upload-time = "2025-08-22T13:42:56.793Z" }, + { url = "https://files.pythonhosted.org/packages/ef/3a/277857b51ae419a1574557c0b12e0d06bf327b758ba94cafc664cb1e2f66/lazy_object_proxy-1.12.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c9defba70ab943f1df98a656247966d7729da2fe9c2d5d85346464bf320820a3", size = 26582, upload-time = "2025-08-22T13:49:49.366Z" }, + { url = "https://files.pythonhosted.org/packages/1a/b6/c5e0fa43535bb9c87880e0ba037cdb1c50e01850b0831e80eb4f4762f270/lazy_object_proxy-1.12.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6763941dbf97eea6b90f5b06eb4da9418cc088fce0e3883f5816090f9afcde4a", size = 71059, upload-time = "2025-08-22T13:49:50.488Z" }, + { url = "https://files.pythonhosted.org/packages/06/8a/7dcad19c685963c652624702f1a968ff10220b16bfcc442257038216bf55/lazy_object_proxy-1.12.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fdc70d81235fc586b9e3d1aeef7d1553259b62ecaae9db2167a5d2550dcc391a", size = 71034, upload-time = "2025-08-22T13:49:54.224Z" }, + { url = "https://files.pythonhosted.org/packages/12/ac/34cbfb433a10e28c7fd830f91c5a348462ba748413cbb950c7f259e67aa7/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:0a83c6f7a6b2bfc11ef3ed67f8cbe99f8ff500b05655d8e7df9aab993a6abc95", size = 69529, upload-time = "2025-08-22T13:49:55.29Z" }, + { url = "https://files.pythonhosted.org/packages/6f/6a/11ad7e349307c3ca4c0175db7a77d60ce42a41c60bcb11800aabd6a8acb8/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:256262384ebd2a77b023ad02fbcc9326282bcfd16484d5531154b02bc304f4c5", size = 70391, upload-time = "2025-08-22T13:49:56.35Z" }, + { url = "https://files.pythonhosted.org/packages/59/97/9b410ed8fbc6e79c1ee8b13f8777a80137d4bc189caf2c6202358e66192c/lazy_object_proxy-1.12.0-cp314-cp314-win_amd64.whl", hash = "sha256:7601ec171c7e8584f8ff3f4e440aa2eebf93e854f04639263875b8c2971f819f", size = 26988, upload-time = "2025-08-22T13:49:57.302Z" }, + { url = "https://files.pythonhosted.org/packages/41/a0/b91504515c1f9a299fc157967ffbd2f0321bce0516a3d5b89f6f4cad0355/lazy_object_proxy-1.12.0-pp39.pp310.pp311.graalpy311-none-any.whl", hash = "sha256:c3b2e0af1f7f77c4263759c4824316ce458fabe0fceadcd24ef8ca08b2d1e402", size = 15072, upload-time = "2025-08-22T13:50:05.498Z" }, +] + +[[package]] +name = "limits" +version = "4.8.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "deprecated" }, + { name = "packaging" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/71/c6/18c4676257f78add093babffbe4d101ff943e9b86e4f708ca5b8fad03a9e/limits-4.8.0.tar.gz", hash = "sha256:74a9691f8a2c82c37480ee9305de3490f6cab3df5b8c61dbde670550f2b34510", size = 95679, upload-time = "2025-04-23T21:00:28.166Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6d/c9/556846b9d112a3387397850d5560f5ec63464508c6aa068257f0516159d0/limits-4.8.0-py3-none-any.whl", hash = "sha256:de43d24969a0050b859dd29bbd61bd807a5de3ed9255f666aec1ea3dd3fc407e", size = 62028, upload-time = "2025-04-23T21:00:26.017Z" }, +] + +[[package]] +name = "litellm" +version = "1.77.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiohttp" }, + { name = "click" }, + { name = "fastuuid" }, + { name = "httpx" }, + { name = "importlib-metadata" }, + { name = "jinja2" }, + { name = 
"jsonschema" }, + { name = "openai" }, + { name = "pydantic" }, + { name = "python-dotenv" }, + { name = "tiktoken" }, + { name = "tokenizers" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/8c/65/71fe4851709fa4a612e41b80001a9ad803fea979d21b90970093fd65eded/litellm-1.77.1.tar.gz", hash = "sha256:76bab5203115efb9588244e5bafbfc07a800a239be75d8dc6b1b9d17394c6418", size = 10275745, upload-time = "2025-09-13T21:05:21.377Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bb/dc/ff4f119cd4d783742c9648a03e0ba5c2b52fc385b2ae9f0d32acf3a78241/litellm-1.77.1-py3-none-any.whl", hash = "sha256:407761dc3c35fbcd41462d3fe65dd3ed70aac705f37cde318006c18940f695a0", size = 9067070, upload-time = "2025-09-13T21:05:18.078Z" }, +] + +[[package]] +name = "makefun" +version = "1.16.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/7b/cf/6780ab8bc3b84a1cce3e4400aed3d64b6db7d5e227a2f75b6ded5674701a/makefun-1.16.0.tar.gz", hash = "sha256:e14601831570bff1f6d7e68828bcd30d2f5856f24bad5de0ccb22921ceebc947", size = 73565, upload-time = "2025-05-09T15:00:42.313Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/c0/4bc973defd1270b89ccaae04cef0d5fa3ea85b59b108ad2c08aeea9afb76/makefun-1.16.0-py2.py3-none-any.whl", hash = "sha256:43baa4c3e7ae2b17de9ceac20b669e9a67ceeadff31581007cca20a07bbe42c4", size = 22923, upload-time = "2025-05-09T15:00:41.042Z" }, +] + +[[package]] +name = "mako" +version = "1.3.10" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9e/38/bd5b78a920a64d708fe6bc8e0a2c075e1389d53bef8413725c63ba041535/mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28", size = 392474, upload-time = "2025-04-10T12:44:31.16Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/87/fb/99f81ac72ae23375f22b7afdb7642aba97c00a713c217124420147681a2f/mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59", size = 78509, upload-time = "2025-04-10T12:50:53.297Z" }, +] + +[[package]] +name = "markdown-it-py" +version = "4.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mdurl" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5b/f5/4ec618ed16cc4f8fb3b701563655a69816155e79e24a17b651541804721d/markdown_it_py-4.0.0.tar.gz", hash = "sha256:cb0a2b4aa34f932c007117b194e945bd74e0ec24133ceb5bac59009cda1cb9f3", size = 73070, upload-time = "2025-08-11T12:57:52.854Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/94/54/e7d793b573f298e1c9013b8c4dade17d481164aa517d1d7148619c2cedbf/markdown_it_py-4.0.0-py3-none-any.whl", hash = "sha256:87327c59b172c5011896038353a81343b6754500a08cd7a4973bb48c6d578147", size = 87321, upload-time = "2025-08-11T12:57:51.923Z" }, +] + +[[package]] +name = "markupsafe" +version = "3.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537, upload-time = "2024-10-18T15:21:54.129Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = 
"sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353, upload-time = "2024-10-18T15:21:02.187Z" }, + { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392, upload-time = "2024-10-18T15:21:02.941Z" }, + { url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984, upload-time = "2024-10-18T15:21:03.953Z" }, + { url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120, upload-time = "2024-10-18T15:21:06.495Z" }, + { url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032, upload-time = "2024-10-18T15:21:07.295Z" }, + { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057, upload-time = "2024-10-18T15:21:08.073Z" }, + { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359, upload-time = "2024-10-18T15:21:09.318Z" }, + { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306, upload-time = "2024-10-18T15:21:10.185Z" }, + { url = "https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094, upload-time = "2024-10-18T15:21:11.005Z" }, + { url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521, upload-time = "2024-10-18T15:21:12.911Z" }, + { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274, upload-time = "2024-10-18T15:21:13.777Z" }, + { url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348, upload-time = "2024-10-18T15:21:14.822Z" }, + { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149, upload-time = "2024-10-18T15:21:15.642Z" }, + { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118, upload-time = "2024-10-18T15:21:17.133Z" }, + { url = "https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993, upload-time = "2024-10-18T15:21:18.064Z" }, + { url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178, upload-time = "2024-10-18T15:21:18.859Z" }, + { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319, upload-time = "2024-10-18T15:21:19.671Z" }, + { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352, upload-time = "2024-10-18T15:21:20.971Z" }, + { url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097, upload-time = "2024-10-18T15:21:22.646Z" }, + { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601, upload-time = "2024-10-18T15:21:23.499Z" }, + { url = "https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274, upload-time = "2024-10-18T15:21:24.577Z" }, + { url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352, upload-time = "2024-10-18T15:21:25.382Z" }, + { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122, upload-time = "2024-10-18T15:21:26.199Z" }, + { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085, upload-time = "2024-10-18T15:21:27.029Z" }, + { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978, upload-time = "2024-10-18T15:21:27.846Z" }, + { url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208, upload-time = "2024-10-18T15:21:28.744Z" }, + { url = "https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357, upload-time = "2024-10-18T15:21:29.545Z" }, + { url = "https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344, upload-time = "2024-10-18T15:21:30.366Z" }, + { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101, upload-time = "2024-10-18T15:21:31.207Z" }, + { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603, upload-time = "2024-10-18T15:21:32.032Z" }, + { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510, upload-time = "2024-10-18T15:21:33.625Z" }, + { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486, upload-time = "2024-10-18T15:21:34.611Z" }, + { url = "https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480, upload-time = "2024-10-18T15:21:35.398Z" }, + { url = "https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914, upload-time = "2024-10-18T15:21:36.231Z" }, + { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796, upload-time = "2024-10-18T15:21:37.073Z" }, + { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473, upload-time = "2024-10-18T15:21:37.932Z" }, + { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114, upload-time = "2024-10-18T15:21:39.799Z" }, + { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098, upload-time = "2024-10-18T15:21:40.813Z" }, + { url = "https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208, upload-time = "2024-10-18T15:21:41.814Z" }, + { url = "https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739, upload-time = "2024-10-18T15:21:42.784Z" }, +] + +[[package]] +name = "matplotlib" +version = "3.10.6" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "contourpy" }, + { name = "cycler" }, + { name = "fonttools" }, + { name = "kiwisolver" }, + { name = "numpy" }, + { name = "packaging" }, + { name = "pillow" }, + { name = "pyparsing" }, + { name = "python-dateutil" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a0/59/c3e6453a9676ffba145309a73c462bb407f4400de7de3f2b41af70720a3c/matplotlib-3.10.6.tar.gz", hash = "sha256:ec01b645840dd1996df21ee37f208cd8ba57644779fa20464010638013d3203c", size = 34804264, upload-time = "2025-08-30T00:14:25.137Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/80/d6/5d3665aa44c49005aaacaa68ddea6fcb27345961cd538a98bb0177934ede/matplotlib-3.10.6-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:905b60d1cb0ee604ce65b297b61cf8be9f4e6cfecf95a3fe1c388b5266bc8f4f", size = 8257527, upload-time = "2025-08-30T00:12:45.31Z" }, + { url = "https://files.pythonhosted.org/packages/8c/af/30ddefe19ca67eebd70047dabf50f899eaff6f3c5e6a1a7edaecaf63f794/matplotlib-3.10.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7bac38d816637343e53d7185d0c66677ff30ffb131044a81898b5792c956ba76", size = 8119583, upload-time = "2025-08-30T00:12:47.236Z" }, + { url = "https://files.pythonhosted.org/packages/d3/29/4a8650a3dcae97fa4f375d46efcb25920d67b512186f8a6788b896062a81/matplotlib-3.10.6-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = 
"sha256:942a8de2b5bfff1de31d95722f702e2966b8a7e31f4e68f7cd963c7cd8861cf6", size = 8692682, upload-time = "2025-08-30T00:12:48.781Z" }, + { url = "https://files.pythonhosted.org/packages/aa/d3/b793b9cb061cfd5d42ff0f69d1822f8d5dbc94e004618e48a97a8373179a/matplotlib-3.10.6-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a3276c85370bc0dfca051ec65c5817d1e0f8f5ce1b7787528ec8ed2d524bbc2f", size = 9521065, upload-time = "2025-08-30T00:12:50.602Z" }, + { url = "https://files.pythonhosted.org/packages/f7/c5/53de5629f223c1c66668d46ac2621961970d21916a4bc3862b174eb2a88f/matplotlib-3.10.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9df5851b219225731f564e4b9e7f2ac1e13c9e6481f941b5631a0f8e2d9387ce", size = 9576888, upload-time = "2025-08-30T00:12:52.92Z" }, + { url = "https://files.pythonhosted.org/packages/fc/8e/0a18d6d7d2d0a2e66585032a760d13662e5250c784d53ad50434e9560991/matplotlib-3.10.6-cp311-cp311-win_amd64.whl", hash = "sha256:abb5d9478625dd9c9eb51a06d39aae71eda749ae9b3138afb23eb38824026c7e", size = 8115158, upload-time = "2025-08-30T00:12:54.863Z" }, + { url = "https://files.pythonhosted.org/packages/07/b3/1a5107bb66c261e23b9338070702597a2d374e5aa7004b7adfc754fbed02/matplotlib-3.10.6-cp311-cp311-win_arm64.whl", hash = "sha256:886f989ccfae63659183173bb3fced7fd65e9eb793c3cc21c273add368536951", size = 7992444, upload-time = "2025-08-30T00:12:57.067Z" }, + { url = "https://files.pythonhosted.org/packages/ea/1a/7042f7430055d567cc3257ac409fcf608599ab27459457f13772c2d9778b/matplotlib-3.10.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:31ca662df6a80bd426f871105fdd69db7543e28e73a9f2afe80de7e531eb2347", size = 8272404, upload-time = "2025-08-30T00:12:59.112Z" }, + { url = "https://files.pythonhosted.org/packages/a9/5d/1d5f33f5b43f4f9e69e6a5fe1fb9090936ae7bc8e2ff6158e7a76542633b/matplotlib-3.10.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1678bb61d897bb4ac4757b5ecfb02bfb3fddf7f808000fb81e09c510712fda75", size = 8128262, upload-time = "2025-08-30T00:13:01.141Z" }, + { url = "https://files.pythonhosted.org/packages/67/c3/135fdbbbf84e0979712df58e5e22b4f257b3f5e52a3c4aacf1b8abec0d09/matplotlib-3.10.6-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:56cd2d20842f58c03d2d6e6c1f1cf5548ad6f66b91e1e48f814e4fb5abd1cb95", size = 8697008, upload-time = "2025-08-30T00:13:03.24Z" }, + { url = "https://files.pythonhosted.org/packages/9c/be/c443ea428fb2488a3ea7608714b1bd85a82738c45da21b447dc49e2f8e5d/matplotlib-3.10.6-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:662df55604a2f9a45435566d6e2660e41efe83cd94f4288dfbf1e6d1eae4b0bb", size = 9530166, upload-time = "2025-08-30T00:13:05.951Z" }, + { url = "https://files.pythonhosted.org/packages/a9/35/48441422b044d74034aea2a3e0d1a49023f12150ebc58f16600132b9bbaf/matplotlib-3.10.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:08f141d55148cd1fc870c3387d70ca4df16dee10e909b3b038782bd4bda6ea07", size = 9593105, upload-time = "2025-08-30T00:13:08.356Z" }, + { url = "https://files.pythonhosted.org/packages/45/c3/994ef20eb4154ab84cc08d033834555319e4af970165e6c8894050af0b3c/matplotlib-3.10.6-cp312-cp312-win_amd64.whl", hash = "sha256:590f5925c2d650b5c9d813c5b3b5fc53f2929c3f8ef463e4ecfa7e052044fb2b", size = 8122784, upload-time = "2025-08-30T00:13:10.367Z" }, + { url = "https://files.pythonhosted.org/packages/57/b8/5c85d9ae0e40f04e71bedb053aada5d6bab1f9b5399a0937afb5d6b02d98/matplotlib-3.10.6-cp312-cp312-win_arm64.whl", hash = 
"sha256:f44c8d264a71609c79a78d50349e724f5d5fc3684ead7c2a473665ee63d868aa", size = 7992823, upload-time = "2025-08-30T00:13:12.24Z" }, + { url = "https://files.pythonhosted.org/packages/a0/db/18380e788bb837e724358287b08e223b32bc8dccb3b0c12fa8ca20bc7f3b/matplotlib-3.10.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:819e409653c1106c8deaf62e6de6b8611449c2cd9939acb0d7d4e57a3d95cc7a", size = 8273231, upload-time = "2025-08-30T00:13:13.881Z" }, + { url = "https://files.pythonhosted.org/packages/d3/0f/38dd49445b297e0d4f12a322c30779df0d43cb5873c7847df8a82e82ec67/matplotlib-3.10.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:59c8ac8382fefb9cb71308dde16a7c487432f5255d8f1fd32473523abecfecdf", size = 8128730, upload-time = "2025-08-30T00:13:15.556Z" }, + { url = "https://files.pythonhosted.org/packages/e5/b8/9eea6630198cb303d131d95d285a024b3b8645b1763a2916fddb44ca8760/matplotlib-3.10.6-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:84e82d9e0fd70c70bc55739defbd8055c54300750cbacf4740c9673a24d6933a", size = 8698539, upload-time = "2025-08-30T00:13:17.297Z" }, + { url = "https://files.pythonhosted.org/packages/71/34/44c7b1f075e1ea398f88aeabcc2907c01b9cc99e2afd560c1d49845a1227/matplotlib-3.10.6-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:25f7a3eb42d6c1c56e89eacd495661fc815ffc08d9da750bca766771c0fd9110", size = 9529702, upload-time = "2025-08-30T00:13:19.248Z" }, + { url = "https://files.pythonhosted.org/packages/b5/7f/e5c2dc9950c7facaf8b461858d1b92c09dd0cf174fe14e21953b3dda06f7/matplotlib-3.10.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f9c862d91ec0b7842920a4cfdaaec29662195301914ea54c33e01f1a28d014b2", size = 9593742, upload-time = "2025-08-30T00:13:21.181Z" }, + { url = "https://files.pythonhosted.org/packages/ff/1d/70c28528794f6410ee2856cd729fa1f1756498b8d3126443b0a94e1a8695/matplotlib-3.10.6-cp313-cp313-win_amd64.whl", hash = "sha256:1b53bd6337eba483e2e7d29c5ab10eee644bc3a2491ec67cc55f7b44583ffb18", size = 8122753, upload-time = "2025-08-30T00:13:23.44Z" }, + { url = "https://files.pythonhosted.org/packages/e8/74/0e1670501fc7d02d981564caf7c4df42974464625935424ca9654040077c/matplotlib-3.10.6-cp313-cp313-win_arm64.whl", hash = "sha256:cbd5eb50b7058b2892ce45c2f4e92557f395c9991f5c886d1bb74a1582e70fd6", size = 7992973, upload-time = "2025-08-30T00:13:26.632Z" }, + { url = "https://files.pythonhosted.org/packages/b1/4e/60780e631d73b6b02bd7239f89c451a72970e5e7ec34f621eda55cd9a445/matplotlib-3.10.6-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:acc86dd6e0e695c095001a7fccff158c49e45e0758fdf5dcdbb0103318b59c9f", size = 8316869, upload-time = "2025-08-30T00:13:28.262Z" }, + { url = "https://files.pythonhosted.org/packages/f8/15/baa662374a579413210fc2115d40c503b7360a08e9cc254aa0d97d34b0c1/matplotlib-3.10.6-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e228cd2ffb8f88b7d0b29e37f68ca9aaf83e33821f24a5ccc4f082dd8396bc27", size = 8178240, upload-time = "2025-08-30T00:13:30.007Z" }, + { url = "https://files.pythonhosted.org/packages/c6/3f/3c38e78d2aafdb8829fcd0857d25aaf9e7dd2dfcf7ec742765b585774931/matplotlib-3.10.6-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:658bc91894adeab669cf4bb4a186d049948262987e80f0857216387d7435d833", size = 8711719, upload-time = "2025-08-30T00:13:31.72Z" }, + { url = "https://files.pythonhosted.org/packages/96/4b/2ec2bbf8cefaa53207cc56118d1fa8a0f9b80642713ea9390235d331ede4/matplotlib-3.10.6-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:8913b7474f6dd83ac444c9459c91f7f0f2859e839f41d642691b104e0af056aa", size = 9541422, upload-time = "2025-08-30T00:13:33.611Z" }, + { url = "https://files.pythonhosted.org/packages/83/7d/40255e89b3ef11c7871020563b2dd85f6cb1b4eff17c0f62b6eb14c8fa80/matplotlib-3.10.6-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:091cea22e059b89f6d7d1a18e2c33a7376c26eee60e401d92a4d6726c4e12706", size = 9594068, upload-time = "2025-08-30T00:13:35.833Z" }, + { url = "https://files.pythonhosted.org/packages/f0/a9/0213748d69dc842537a113493e1c27daf9f96bd7cc316f933dc8ec4de985/matplotlib-3.10.6-cp313-cp313t-win_amd64.whl", hash = "sha256:491e25e02a23d7207629d942c666924a6b61e007a48177fdd231a0097b7f507e", size = 8200100, upload-time = "2025-08-30T00:13:37.668Z" }, + { url = "https://files.pythonhosted.org/packages/be/15/79f9988066ce40b8a6f1759a934ea0cde8dc4adc2262255ee1bc98de6ad0/matplotlib-3.10.6-cp313-cp313t-win_arm64.whl", hash = "sha256:3d80d60d4e54cda462e2cd9a086d85cd9f20943ead92f575ce86885a43a565d5", size = 8042142, upload-time = "2025-08-30T00:13:39.426Z" }, + { url = "https://files.pythonhosted.org/packages/7c/58/e7b6d292beae6fb4283ca6fb7fa47d7c944a68062d6238c07b497dd35493/matplotlib-3.10.6-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:70aaf890ce1d0efd482df969b28a5b30ea0b891224bb315810a3940f67182899", size = 8273802, upload-time = "2025-08-30T00:13:41.006Z" }, + { url = "https://files.pythonhosted.org/packages/9f/f6/7882d05aba16a8cdd594fb9a03a9d3cca751dbb6816adf7b102945522ee9/matplotlib-3.10.6-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1565aae810ab79cb72e402b22facfa6501365e73ebab70a0fdfb98488d2c3c0c", size = 8131365, upload-time = "2025-08-30T00:13:42.664Z" }, + { url = "https://files.pythonhosted.org/packages/94/bf/ff32f6ed76e78514e98775a53715eca4804b12bdcf35902cdd1cf759d324/matplotlib-3.10.6-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f3b23315a01981689aa4e1a179dbf6ef9fbd17143c3eea77548c2ecfb0499438", size = 9533961, upload-time = "2025-08-30T00:13:44.372Z" }, + { url = "https://files.pythonhosted.org/packages/fe/c3/6bf88c2fc2da7708a2ff8d2eeb5d68943130f50e636d5d3dcf9d4252e971/matplotlib-3.10.6-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:30fdd37edf41a4e6785f9b37969de57aea770696cb637d9946eb37470c94a453", size = 9804262, upload-time = "2025-08-30T00:13:46.614Z" }, + { url = "https://files.pythonhosted.org/packages/0f/7a/e05e6d9446d2d577b459427ad060cd2de5742d0e435db3191fea4fcc7e8b/matplotlib-3.10.6-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:bc31e693da1c08012c764b053e702c1855378e04102238e6a5ee6a7117c53a47", size = 9595508, upload-time = "2025-08-30T00:13:48.731Z" }, + { url = "https://files.pythonhosted.org/packages/39/fb/af09c463ced80b801629fd73b96f726c9f6124c3603aa2e480a061d6705b/matplotlib-3.10.6-cp314-cp314-win_amd64.whl", hash = "sha256:05be9bdaa8b242bc6ff96330d18c52f1fc59c6fb3a4dd411d953d67e7e1baf98", size = 8252742, upload-time = "2025-08-30T00:13:50.539Z" }, + { url = "https://files.pythonhosted.org/packages/b1/f9/b682f6db9396d9ab8f050c0a3bfbb5f14fb0f6518f08507c04cc02f8f229/matplotlib-3.10.6-cp314-cp314-win_arm64.whl", hash = "sha256:f56a0d1ab05d34c628592435781d185cd99630bdfd76822cd686fb5a0aecd43a", size = 8124237, upload-time = "2025-08-30T00:13:54.3Z" }, + { url = "https://files.pythonhosted.org/packages/b5/d2/b69b4a0923a3c05ab90527c60fdec899ee21ca23ede7f0fb818e6620d6f2/matplotlib-3.10.6-cp314-cp314t-macosx_10_13_x86_64.whl", hash = 
"sha256:94f0b4cacb23763b64b5dace50d5b7bfe98710fed5f0cef5c08135a03399d98b", size = 8316956, upload-time = "2025-08-30T00:13:55.932Z" }, + { url = "https://files.pythonhosted.org/packages/28/e9/dc427b6f16457ffaeecb2fc4abf91e5adb8827861b869c7a7a6d1836fa73/matplotlib-3.10.6-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:cc332891306b9fb39462673d8225d1b824c89783fee82840a709f96714f17a5c", size = 8178260, upload-time = "2025-08-30T00:14:00.942Z" }, + { url = "https://files.pythonhosted.org/packages/c4/89/1fbd5ad611802c34d1c7ad04607e64a1350b7fb9c567c4ec2c19e066ed35/matplotlib-3.10.6-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee1d607b3fb1590deb04b69f02ea1d53ed0b0bf75b2b1a5745f269afcbd3cdd3", size = 9541422, upload-time = "2025-08-30T00:14:02.664Z" }, + { url = "https://files.pythonhosted.org/packages/b0/3b/65fec8716025b22c1d72d5a82ea079934c76a547696eaa55be6866bc89b1/matplotlib-3.10.6-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:376a624a218116461696b27b2bbf7a8945053e6d799f6502fc03226d077807bf", size = 9803678, upload-time = "2025-08-30T00:14:04.741Z" }, + { url = "https://files.pythonhosted.org/packages/c7/b0/40fb2b3a1ab9381bb39a952e8390357c8be3bdadcf6d5055d9c31e1b35ae/matplotlib-3.10.6-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:83847b47f6524c34b4f2d3ce726bb0541c48c8e7692729865c3df75bfa0f495a", size = 9594077, upload-time = "2025-08-30T00:14:07.012Z" }, + { url = "https://files.pythonhosted.org/packages/76/34/c4b71b69edf5b06e635eee1ed10bfc73cf8df058b66e63e30e6a55e231d5/matplotlib-3.10.6-cp314-cp314t-win_amd64.whl", hash = "sha256:c7e0518e0d223683532a07f4b512e2e0729b62674f1b3a1a69869f98e6b1c7e3", size = 8342822, upload-time = "2025-08-30T00:14:09.041Z" }, + { url = "https://files.pythonhosted.org/packages/e8/62/aeabeef1a842b6226a30d49dd13e8a7a1e81e9ec98212c0b5169f0a12d83/matplotlib-3.10.6-cp314-cp314t-win_arm64.whl", hash = "sha256:4dd83e029f5b4801eeb87c64efd80e732452781c16a9cf7415b7b63ec8f374d7", size = 8172588, upload-time = "2025-08-30T00:14:11.166Z" }, + { url = "https://files.pythonhosted.org/packages/12/bb/02c35a51484aae5f49bd29f091286e7af5f3f677a9736c58a92b3c78baeb/matplotlib-3.10.6-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:f2d684c3204fa62421bbf770ddfebc6b50130f9cad65531eeba19236d73bb488", size = 8252296, upload-time = "2025-08-30T00:14:19.49Z" }, + { url = "https://files.pythonhosted.org/packages/7d/85/41701e3092005aee9a2445f5ee3904d9dbd4a7df7a45905ffef29b7ef098/matplotlib-3.10.6-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:6f4a69196e663a41d12a728fab8751177215357906436804217d6d9cf0d4d6cf", size = 8116749, upload-time = "2025-08-30T00:14:21.344Z" }, + { url = "https://files.pythonhosted.org/packages/16/53/8d8fa0ea32a8c8239e04d022f6c059ee5e1b77517769feccd50f1df43d6d/matplotlib-3.10.6-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4d6ca6ef03dfd269f4ead566ec6f3fb9becf8dab146fb999022ed85ee9f6b3eb", size = 8693933, upload-time = "2025-08-30T00:14:22.942Z" }, +] + +[[package]] +name = "mcp" +version = "1.14.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "httpx" }, + { name = "httpx-sse" }, + { name = "jsonschema" }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "python-multipart" }, + { name = "pywin32", marker = "sys_platform == 'win32'" }, + { name = "sse-starlette" }, + { name = "starlette" }, + { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, +] 
+sdist = { url = "https://files.pythonhosted.org/packages/95/fd/d6e941a52446198b73e5e4a953441f667f1469aeb06fb382d9f6729d6168/mcp-1.14.0.tar.gz", hash = "sha256:2e7d98b195e08b2abc1dc6191f6f3dc0059604ac13ee6a40f88676274787fac4", size = 454855, upload-time = "2025-09-11T17:40:48.667Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/7b/84b0dd4c2c5a499d2c5d63fb7a1224c25fc4c8b6c24623fa7a566471480d/mcp-1.14.0-py3-none-any.whl", hash = "sha256:b2d27feba27b4c53d41b58aa7f4d090ae0cb740cbc4e339af10f8cbe54c4e19d", size = 163805, upload-time = "2025-09-11T17:40:46.891Z" }, +] + +[[package]] +name = "mdurl" +version = "0.1.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" }, +] + +[[package]] +name = "more-itertools" +version = "10.8.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ea/5d/38b681d3fce7a266dd9ab73c66959406d565b3e85f21d5e66e1181d93721/more_itertools-10.8.0.tar.gz", hash = "sha256:f638ddf8a1a0d134181275fb5d58b086ead7c6a72429ad725c67503f13ba30bd", size = 137431, upload-time = "2025-09-02T15:23:11.018Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a4/8e/469e5a4a2f5855992e425f3cb33804cc07bf18d48f2db061aec61ce50270/more_itertools-10.8.0-py3-none-any.whl", hash = "sha256:52d4362373dcf7c52546bc4af9a86ee7c4579df9a8dc268be0a2f949d376cc9b", size = 69667, upload-time = "2025-09-02T15:23:09.635Z" }, +] + +[[package]] +name = "mpmath" +version = "1.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e0/47/dd32fa426cc72114383ac549964eecb20ecfd886d1e5ccf5340b55b02f57/mpmath-1.3.0.tar.gz", hash = "sha256:7a28eb2a9774d00c7bc92411c19a89209d5da7c4c9a9e227be8330a23a25b91f", size = 508106, upload-time = "2023-03-07T16:47:11.061Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/43/e3/7d92a15f894aa0c9c4b49b8ee9ac9850d6e63b03c9c32c0367a13ae62209/mpmath-1.3.0-py3-none-any.whl", hash = "sha256:a0b2b9fe80bbcd81a6647ff13108738cfb482d481d826cc0e02f5b35e5c88d2c", size = 536198, upload-time = "2023-03-07T16:47:09.197Z" }, +] + +[[package]] +name = "multidict" +version = "6.6.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/69/7f/0652e6ed47ab288e3756ea9c0df8b14950781184d4bd7883f4d87dd41245/multidict-6.6.4.tar.gz", hash = "sha256:d2d4e4787672911b48350df02ed3fa3fffdc2f2e8ca06dd6afdf34189b76a9dd", size = 101843, upload-time = "2025-08-11T12:08:48.217Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6b/7f/90a7f01e2d005d6653c689039977f6856718c75c5579445effb7e60923d1/multidict-6.6.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:c7a0e9b561e6460484318a7612e725df1145d46b0ef57c6b9866441bf6e27e0c", size = 76472, upload-time = "2025-08-11T12:06:29.006Z" }, + { url = 
"https://files.pythonhosted.org/packages/54/a3/bed07bc9e2bb302ce752f1dabc69e884cd6a676da44fb0e501b246031fdd/multidict-6.6.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6bf2f10f70acc7a2446965ffbc726e5fc0b272c97a90b485857e5c70022213eb", size = 44634, upload-time = "2025-08-11T12:06:30.374Z" }, + { url = "https://files.pythonhosted.org/packages/a7/4b/ceeb4f8f33cf81277da464307afeaf164fb0297947642585884f5cad4f28/multidict-6.6.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:66247d72ed62d5dd29752ffc1d3b88f135c6a8de8b5f63b7c14e973ef5bda19e", size = 44282, upload-time = "2025-08-11T12:06:31.958Z" }, + { url = "https://files.pythonhosted.org/packages/03/35/436a5da8702b06866189b69f655ffdb8f70796252a8772a77815f1812679/multidict-6.6.4-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:105245cc6b76f51e408451a844a54e6823bbd5a490ebfe5bdfc79798511ceded", size = 229696, upload-time = "2025-08-11T12:06:33.087Z" }, + { url = "https://files.pythonhosted.org/packages/b6/0e/915160be8fecf1fca35f790c08fb74ca684d752fcba62c11daaf3d92c216/multidict-6.6.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:cbbc54e58b34c3bae389ef00046be0961f30fef7cb0dd9c7756aee376a4f7683", size = 246665, upload-time = "2025-08-11T12:06:34.448Z" }, + { url = "https://files.pythonhosted.org/packages/08/ee/2f464330acd83f77dcc346f0b1a0eaae10230291450887f96b204b8ac4d3/multidict-6.6.4-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:56c6b3652f945c9bc3ac6c8178cd93132b8d82dd581fcbc3a00676c51302bc1a", size = 225485, upload-time = "2025-08-11T12:06:35.672Z" }, + { url = "https://files.pythonhosted.org/packages/71/cc/9a117f828b4d7fbaec6adeed2204f211e9caf0a012692a1ee32169f846ae/multidict-6.6.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b95494daf857602eccf4c18ca33337dd2be705bccdb6dddbfc9d513e6addb9d9", size = 257318, upload-time = "2025-08-11T12:06:36.98Z" }, + { url = "https://files.pythonhosted.org/packages/25/77/62752d3dbd70e27fdd68e86626c1ae6bccfebe2bb1f84ae226363e112f5a/multidict-6.6.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e5b1413361cef15340ab9dc61523e653d25723e82d488ef7d60a12878227ed50", size = 254689, upload-time = "2025-08-11T12:06:38.233Z" }, + { url = "https://files.pythonhosted.org/packages/00/6e/fac58b1072a6fc59af5e7acb245e8754d3e1f97f4f808a6559951f72a0d4/multidict-6.6.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e167bf899c3d724f9662ef00b4f7fef87a19c22b2fead198a6f68b263618df52", size = 246709, upload-time = "2025-08-11T12:06:39.517Z" }, + { url = "https://files.pythonhosted.org/packages/01/ef/4698d6842ef5e797c6db7744b0081e36fb5de3d00002cc4c58071097fac3/multidict-6.6.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:aaea28ba20a9026dfa77f4b80369e51cb767c61e33a2d4043399c67bd95fb7c6", size = 243185, upload-time = "2025-08-11T12:06:40.796Z" }, + { url = "https://files.pythonhosted.org/packages/aa/c9/d82e95ae1d6e4ef396934e9b0e942dfc428775f9554acf04393cce66b157/multidict-6.6.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:8c91cdb30809a96d9ecf442ec9bc45e8cfaa0f7f8bdf534e082c2443a196727e", size = 237838, upload-time = "2025-08-11T12:06:42.595Z" }, + { url = 
"https://files.pythonhosted.org/packages/57/cf/f94af5c36baaa75d44fab9f02e2a6bcfa0cd90acb44d4976a80960759dbc/multidict-6.6.4-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:1a0ccbfe93ca114c5d65a2471d52d8829e56d467c97b0e341cf5ee45410033b3", size = 246368, upload-time = "2025-08-11T12:06:44.304Z" }, + { url = "https://files.pythonhosted.org/packages/4a/fe/29f23460c3d995f6a4b678cb2e9730e7277231b981f0b234702f0177818a/multidict-6.6.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:55624b3f321d84c403cb7d8e6e982f41ae233d85f85db54ba6286f7295dc8a9c", size = 253339, upload-time = "2025-08-11T12:06:45.597Z" }, + { url = "https://files.pythonhosted.org/packages/29/b6/fd59449204426187b82bf8a75f629310f68c6adc9559dc922d5abe34797b/multidict-6.6.4-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:4a1fb393a2c9d202cb766c76208bd7945bc194eba8ac920ce98c6e458f0b524b", size = 246933, upload-time = "2025-08-11T12:06:46.841Z" }, + { url = "https://files.pythonhosted.org/packages/19/52/d5d6b344f176a5ac3606f7a61fb44dc746e04550e1a13834dff722b8d7d6/multidict-6.6.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:43868297a5759a845fa3a483fb4392973a95fb1de891605a3728130c52b8f40f", size = 242225, upload-time = "2025-08-11T12:06:48.588Z" }, + { url = "https://files.pythonhosted.org/packages/ec/d3/5b2281ed89ff4d5318d82478a2a2450fcdfc3300da48ff15c1778280ad26/multidict-6.6.4-cp311-cp311-win32.whl", hash = "sha256:ed3b94c5e362a8a84d69642dbeac615452e8af9b8eb825b7bc9f31a53a1051e2", size = 41306, upload-time = "2025-08-11T12:06:49.95Z" }, + { url = "https://files.pythonhosted.org/packages/74/7d/36b045c23a1ab98507aefd44fd8b264ee1dd5e5010543c6fccf82141ccef/multidict-6.6.4-cp311-cp311-win_amd64.whl", hash = "sha256:d8c112f7a90d8ca5d20213aa41eac690bb50a76da153e3afb3886418e61cb22e", size = 46029, upload-time = "2025-08-11T12:06:51.082Z" }, + { url = "https://files.pythonhosted.org/packages/0f/5e/553d67d24432c5cd52b49047f2d248821843743ee6d29a704594f656d182/multidict-6.6.4-cp311-cp311-win_arm64.whl", hash = "sha256:3bb0eae408fa1996d87247ca0d6a57b7fc1dcf83e8a5c47ab82c558c250d4adf", size = 43017, upload-time = "2025-08-11T12:06:52.243Z" }, + { url = "https://files.pythonhosted.org/packages/05/f6/512ffd8fd8b37fb2680e5ac35d788f1d71bbaf37789d21a820bdc441e565/multidict-6.6.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0ffb87be160942d56d7b87b0fdf098e81ed565add09eaa1294268c7f3caac4c8", size = 76516, upload-time = "2025-08-11T12:06:53.393Z" }, + { url = "https://files.pythonhosted.org/packages/99/58/45c3e75deb8855c36bd66cc1658007589662ba584dbf423d01df478dd1c5/multidict-6.6.4-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d191de6cbab2aff5de6c5723101705fd044b3e4c7cfd587a1929b5028b9714b3", size = 45394, upload-time = "2025-08-11T12:06:54.555Z" }, + { url = "https://files.pythonhosted.org/packages/fd/ca/e8c4472a93a26e4507c0b8e1f0762c0d8a32de1328ef72fd704ef9cc5447/multidict-6.6.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:38a0956dd92d918ad5feff3db8fcb4a5eb7dba114da917e1a88475619781b57b", size = 43591, upload-time = "2025-08-11T12:06:55.672Z" }, + { url = "https://files.pythonhosted.org/packages/05/51/edf414f4df058574a7265034d04c935aa84a89e79ce90fcf4df211f47b16/multidict-6.6.4-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:6865f6d3b7900ae020b495d599fcf3765653bc927951c1abb959017f81ae8287", size = 237215, upload-time = "2025-08-11T12:06:57.213Z" }, + { url = 
"https://files.pythonhosted.org/packages/c8/45/8b3d6dbad8cf3252553cc41abea09ad527b33ce47a5e199072620b296902/multidict-6.6.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0a2088c126b6f72db6c9212ad827d0ba088c01d951cee25e758c450da732c138", size = 258299, upload-time = "2025-08-11T12:06:58.946Z" }, + { url = "https://files.pythonhosted.org/packages/3c/e8/8ca2e9a9f5a435fc6db40438a55730a4bf4956b554e487fa1b9ae920f825/multidict-6.6.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0f37bed7319b848097085d7d48116f545985db988e2256b2e6f00563a3416ee6", size = 242357, upload-time = "2025-08-11T12:07:00.301Z" }, + { url = "https://files.pythonhosted.org/packages/0f/84/80c77c99df05a75c28490b2af8f7cba2a12621186e0a8b0865d8e745c104/multidict-6.6.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:01368e3c94032ba6ca0b78e7ccb099643466cf24f8dc8eefcfdc0571d56e58f9", size = 268369, upload-time = "2025-08-11T12:07:01.638Z" }, + { url = "https://files.pythonhosted.org/packages/0d/e9/920bfa46c27b05fb3e1ad85121fd49f441492dca2449c5bcfe42e4565d8a/multidict-6.6.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8fe323540c255db0bffee79ad7f048c909f2ab0edb87a597e1c17da6a54e493c", size = 269341, upload-time = "2025-08-11T12:07:02.943Z" }, + { url = "https://files.pythonhosted.org/packages/af/65/753a2d8b05daf496f4a9c367fe844e90a1b2cac78e2be2c844200d10cc4c/multidict-6.6.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b8eb3025f17b0a4c3cd08cda49acf312a19ad6e8a4edd9dbd591e6506d999402", size = 256100, upload-time = "2025-08-11T12:07:04.564Z" }, + { url = "https://files.pythonhosted.org/packages/09/54/655be13ae324212bf0bc15d665a4e34844f34c206f78801be42f7a0a8aaa/multidict-6.6.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bbc14f0365534d35a06970d6a83478b249752e922d662dc24d489af1aa0d1be7", size = 253584, upload-time = "2025-08-11T12:07:05.914Z" }, + { url = "https://files.pythonhosted.org/packages/5c/74/ab2039ecc05264b5cec73eb018ce417af3ebb384ae9c0e9ed42cb33f8151/multidict-6.6.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:75aa52fba2d96bf972e85451b99d8e19cc37ce26fd016f6d4aa60da9ab2b005f", size = 251018, upload-time = "2025-08-11T12:07:08.301Z" }, + { url = "https://files.pythonhosted.org/packages/af/0a/ccbb244ac848e56c6427f2392741c06302bbfba49c0042f1eb3c5b606497/multidict-6.6.4-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:4fefd4a815e362d4f011919d97d7b4a1e566f1dde83dc4ad8cfb5b41de1df68d", size = 251477, upload-time = "2025-08-11T12:07:10.248Z" }, + { url = "https://files.pythonhosted.org/packages/0e/b0/0ed49bba775b135937f52fe13922bc64a7eaf0a3ead84a36e8e4e446e096/multidict-6.6.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:db9801fe021f59a5b375ab778973127ca0ac52429a26e2fd86aa9508f4d26eb7", size = 263575, upload-time = "2025-08-11T12:07:11.928Z" }, + { url = "https://files.pythonhosted.org/packages/3e/d9/7fb85a85e14de2e44dfb6a24f03c41e2af8697a6df83daddb0e9b7569f73/multidict-6.6.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:a650629970fa21ac1fb06ba25dabfc5b8a2054fcbf6ae97c758aa956b8dba802", size = 259649, upload-time = "2025-08-11T12:07:13.244Z" }, + { url = "https://files.pythonhosted.org/packages/03/9e/b3a459bcf9b6e74fa461a5222a10ff9b544cb1cd52fd482fb1b75ecda2a2/multidict-6.6.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = 
"sha256:452ff5da78d4720d7516a3a2abd804957532dd69296cb77319c193e3ffb87e24", size = 251505, upload-time = "2025-08-11T12:07:14.57Z" }, + { url = "https://files.pythonhosted.org/packages/86/a2/8022f78f041dfe6d71e364001a5cf987c30edfc83c8a5fb7a3f0974cff39/multidict-6.6.4-cp312-cp312-win32.whl", hash = "sha256:8c2fcb12136530ed19572bbba61b407f655e3953ba669b96a35036a11a485793", size = 41888, upload-time = "2025-08-11T12:07:15.904Z" }, + { url = "https://files.pythonhosted.org/packages/c7/eb/d88b1780d43a56db2cba24289fa744a9d216c1a8546a0dc3956563fd53ea/multidict-6.6.4-cp312-cp312-win_amd64.whl", hash = "sha256:047d9425860a8c9544fed1b9584f0c8bcd31bcde9568b047c5e567a1025ecd6e", size = 46072, upload-time = "2025-08-11T12:07:17.045Z" }, + { url = "https://files.pythonhosted.org/packages/9f/16/b929320bf5750e2d9d4931835a4c638a19d2494a5b519caaaa7492ebe105/multidict-6.6.4-cp312-cp312-win_arm64.whl", hash = "sha256:14754eb72feaa1e8ae528468f24250dd997b8e2188c3d2f593f9eba259e4b364", size = 43222, upload-time = "2025-08-11T12:07:18.328Z" }, + { url = "https://files.pythonhosted.org/packages/3a/5d/e1db626f64f60008320aab00fbe4f23fc3300d75892a3381275b3d284580/multidict-6.6.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f46a6e8597f9bd71b31cc708195d42b634c8527fecbcf93febf1052cacc1f16e", size = 75848, upload-time = "2025-08-11T12:07:19.912Z" }, + { url = "https://files.pythonhosted.org/packages/4c/aa/8b6f548d839b6c13887253af4e29c939af22a18591bfb5d0ee6f1931dae8/multidict-6.6.4-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:22e38b2bc176c5eb9c0a0e379f9d188ae4cd8b28c0f53b52bce7ab0a9e534657", size = 45060, upload-time = "2025-08-11T12:07:21.163Z" }, + { url = "https://files.pythonhosted.org/packages/eb/c6/f5e97e5d99a729bc2aa58eb3ebfa9f1e56a9b517cc38c60537c81834a73f/multidict-6.6.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5df8afd26f162da59e218ac0eefaa01b01b2e6cd606cffa46608f699539246da", size = 43269, upload-time = "2025-08-11T12:07:22.392Z" }, + { url = "https://files.pythonhosted.org/packages/dc/31/d54eb0c62516776f36fe67f84a732f97e0b0e12f98d5685bebcc6d396910/multidict-6.6.4-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:49517449b58d043023720aa58e62b2f74ce9b28f740a0b5d33971149553d72aa", size = 237158, upload-time = "2025-08-11T12:07:23.636Z" }, + { url = "https://files.pythonhosted.org/packages/c4/1c/8a10c1c25b23156e63b12165a929d8eb49a6ed769fdbefb06e6f07c1e50d/multidict-6.6.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ae9408439537c5afdca05edd128a63f56a62680f4b3c234301055d7a2000220f", size = 257076, upload-time = "2025-08-11T12:07:25.049Z" }, + { url = "https://files.pythonhosted.org/packages/ad/86/90e20b5771d6805a119e483fd3d1e8393e745a11511aebca41f0da38c3e2/multidict-6.6.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:87a32d20759dc52a9e850fe1061b6e41ab28e2998d44168a8a341b99ded1dba0", size = 240694, upload-time = "2025-08-11T12:07:26.458Z" }, + { url = "https://files.pythonhosted.org/packages/e7/49/484d3e6b535bc0555b52a0a26ba86e4d8d03fd5587d4936dc59ba7583221/multidict-6.6.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:52e3c8d43cdfff587ceedce9deb25e6ae77daba560b626e97a56ddcad3756879", size = 266350, upload-time = "2025-08-11T12:07:27.94Z" }, + { url = 
"https://files.pythonhosted.org/packages/bf/b4/aa4c5c379b11895083d50021e229e90c408d7d875471cb3abf721e4670d6/multidict-6.6.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ad8850921d3a8d8ff6fbef790e773cecfc260bbfa0566998980d3fa8f520bc4a", size = 267250, upload-time = "2025-08-11T12:07:29.303Z" }, + { url = "https://files.pythonhosted.org/packages/80/e5/5e22c5bf96a64bdd43518b1834c6d95a4922cc2066b7d8e467dae9b6cee6/multidict-6.6.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:497a2954adc25c08daff36f795077f63ad33e13f19bfff7736e72c785391534f", size = 254900, upload-time = "2025-08-11T12:07:30.764Z" }, + { url = "https://files.pythonhosted.org/packages/17/38/58b27fed927c07035abc02befacab42491e7388ca105e087e6e0215ead64/multidict-6.6.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:024ce601f92d780ca1617ad4be5ac15b501cc2414970ffa2bb2bbc2bd5a68fa5", size = 252355, upload-time = "2025-08-11T12:07:32.205Z" }, + { url = "https://files.pythonhosted.org/packages/d0/a1/dad75d23a90c29c02b5d6f3d7c10ab36c3197613be5d07ec49c7791e186c/multidict-6.6.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:a693fc5ed9bdd1c9e898013e0da4dcc640de7963a371c0bd458e50e046bf6438", size = 250061, upload-time = "2025-08-11T12:07:33.623Z" }, + { url = "https://files.pythonhosted.org/packages/b8/1a/ac2216b61c7f116edab6dc3378cca6c70dc019c9a457ff0d754067c58b20/multidict-6.6.4-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:190766dac95aab54cae5b152a56520fd99298f32a1266d66d27fdd1b5ac00f4e", size = 249675, upload-time = "2025-08-11T12:07:34.958Z" }, + { url = "https://files.pythonhosted.org/packages/d4/79/1916af833b800d13883e452e8e0977c065c4ee3ab7a26941fbfdebc11895/multidict-6.6.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:34d8f2a5ffdceab9dcd97c7a016deb2308531d5f0fced2bb0c9e1df45b3363d7", size = 261247, upload-time = "2025-08-11T12:07:36.588Z" }, + { url = "https://files.pythonhosted.org/packages/c5/65/d1f84fe08ac44a5fc7391cbc20a7cedc433ea616b266284413fd86062f8c/multidict-6.6.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:59e8d40ab1f5a8597abcef00d04845155a5693b5da00d2c93dbe88f2050f2812", size = 257960, upload-time = "2025-08-11T12:07:39.735Z" }, + { url = "https://files.pythonhosted.org/packages/13/b5/29ec78057d377b195ac2c5248c773703a6b602e132a763e20ec0457e7440/multidict-6.6.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:467fe64138cfac771f0e949b938c2e1ada2b5af22f39692aa9258715e9ea613a", size = 250078, upload-time = "2025-08-11T12:07:41.525Z" }, + { url = "https://files.pythonhosted.org/packages/c4/0e/7e79d38f70a872cae32e29b0d77024bef7834b0afb406ddae6558d9e2414/multidict-6.6.4-cp313-cp313-win32.whl", hash = "sha256:14616a30fe6d0a48d0a48d1a633ab3b8bec4cf293aac65f32ed116f620adfd69", size = 41708, upload-time = "2025-08-11T12:07:43.405Z" }, + { url = "https://files.pythonhosted.org/packages/9d/34/746696dffff742e97cd6a23da953e55d0ea51fa601fa2ff387b3edcfaa2c/multidict-6.6.4-cp313-cp313-win_amd64.whl", hash = "sha256:40cd05eaeb39e2bc8939451f033e57feaa2ac99e07dbca8afe2be450a4a3b6cf", size = 45912, upload-time = "2025-08-11T12:07:45.082Z" }, + { url = "https://files.pythonhosted.org/packages/c7/87/3bac136181e271e29170d8d71929cdeddeb77f3e8b6a0c08da3a8e9da114/multidict-6.6.4-cp313-cp313-win_arm64.whl", hash = "sha256:f6eb37d511bfae9e13e82cb4d1af36b91150466f24d9b2b8a9785816deb16605", size = 43076, upload-time = "2025-08-11T12:07:46.746Z" }, + { url = 
"https://files.pythonhosted.org/packages/64/94/0a8e63e36c049b571c9ae41ee301ada29c3fee9643d9c2548d7d558a1d99/multidict-6.6.4-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:6c84378acd4f37d1b507dfa0d459b449e2321b3ba5f2338f9b085cf7a7ba95eb", size = 82812, upload-time = "2025-08-11T12:07:48.402Z" }, + { url = "https://files.pythonhosted.org/packages/25/1a/be8e369dfcd260d2070a67e65dd3990dd635cbd735b98da31e00ea84cd4e/multidict-6.6.4-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0e0558693063c75f3d952abf645c78f3c5dfdd825a41d8c4d8156fc0b0da6e7e", size = 48313, upload-time = "2025-08-11T12:07:49.679Z" }, + { url = "https://files.pythonhosted.org/packages/26/5a/dd4ade298674b2f9a7b06a32c94ffbc0497354df8285f27317c66433ce3b/multidict-6.6.4-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3f8e2384cb83ebd23fd07e9eada8ba64afc4c759cd94817433ab8c81ee4b403f", size = 46777, upload-time = "2025-08-11T12:07:51.318Z" }, + { url = "https://files.pythonhosted.org/packages/89/db/98aa28bc7e071bfba611ac2ae803c24e96dd3a452b4118c587d3d872c64c/multidict-6.6.4-cp313-cp313t-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:f996b87b420995a9174b2a7c1a8daf7db4750be6848b03eb5e639674f7963773", size = 229321, upload-time = "2025-08-11T12:07:52.965Z" }, + { url = "https://files.pythonhosted.org/packages/c7/bc/01ddda2a73dd9d167bd85d0e8ef4293836a8f82b786c63fb1a429bc3e678/multidict-6.6.4-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:cc356250cffd6e78416cf5b40dc6a74f1edf3be8e834cf8862d9ed5265cf9b0e", size = 249954, upload-time = "2025-08-11T12:07:54.423Z" }, + { url = "https://files.pythonhosted.org/packages/06/78/6b7c0f020f9aa0acf66d0ab4eb9f08375bac9a50ff5e3edb1c4ccd59eafc/multidict-6.6.4-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:dadf95aa862714ea468a49ad1e09fe00fcc9ec67d122f6596a8d40caf6cec7d0", size = 228612, upload-time = "2025-08-11T12:07:55.914Z" }, + { url = "https://files.pythonhosted.org/packages/00/44/3faa416f89b2d5d76e9d447296a81521e1c832ad6e40b92f990697b43192/multidict-6.6.4-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7dd57515bebffd8ebd714d101d4c434063322e4fe24042e90ced41f18b6d3395", size = 257528, upload-time = "2025-08-11T12:07:57.371Z" }, + { url = "https://files.pythonhosted.org/packages/05/5f/77c03b89af0fcb16f018f668207768191fb9dcfb5e3361a5e706a11db2c9/multidict-6.6.4-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:967af5f238ebc2eb1da4e77af5492219fbd9b4b812347da39a7b5f5c72c0fa45", size = 256329, upload-time = "2025-08-11T12:07:58.844Z" }, + { url = "https://files.pythonhosted.org/packages/cf/e9/ed750a2a9afb4f8dc6f13dc5b67b514832101b95714f1211cd42e0aafc26/multidict-6.6.4-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2a4c6875c37aae9794308ec43e3530e4aa0d36579ce38d89979bbf89582002bb", size = 247928, upload-time = "2025-08-11T12:08:01.037Z" }, + { url = "https://files.pythonhosted.org/packages/1f/b5/e0571bc13cda277db7e6e8a532791d4403dacc9850006cb66d2556e649c0/multidict-6.6.4-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:7f683a551e92bdb7fac545b9c6f9fa2aebdeefa61d607510b3533286fcab67f5", size = 245228, upload-time = "2025-08-11T12:08:02.96Z" }, + { url = 
"https://files.pythonhosted.org/packages/f3/a3/69a84b0eccb9824491f06368f5b86e72e4af54c3067c37c39099b6687109/multidict-6.6.4-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:3ba5aaf600edaf2a868a391779f7a85d93bed147854925f34edd24cc70a3e141", size = 235869, upload-time = "2025-08-11T12:08:04.746Z" }, + { url = "https://files.pythonhosted.org/packages/a9/9d/28802e8f9121a6a0804fa009debf4e753d0a59969ea9f70be5f5fdfcb18f/multidict-6.6.4-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:580b643b7fd2c295d83cad90d78419081f53fd532d1f1eb67ceb7060f61cff0d", size = 243446, upload-time = "2025-08-11T12:08:06.332Z" }, + { url = "https://files.pythonhosted.org/packages/38/ea/6c98add069b4878c1d66428a5f5149ddb6d32b1f9836a826ac764b9940be/multidict-6.6.4-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:37b7187197da6af3ee0b044dbc9625afd0c885f2800815b228a0e70f9a7f473d", size = 252299, upload-time = "2025-08-11T12:08:07.931Z" }, + { url = "https://files.pythonhosted.org/packages/3a/09/8fe02d204473e14c0af3affd50af9078839dfca1742f025cca765435d6b4/multidict-6.6.4-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e1b93790ed0bc26feb72e2f08299691ceb6da5e9e14a0d13cc74f1869af327a0", size = 246926, upload-time = "2025-08-11T12:08:09.467Z" }, + { url = "https://files.pythonhosted.org/packages/37/3d/7b1e10d774a6df5175ecd3c92bff069e77bed9ec2a927fdd4ff5fe182f67/multidict-6.6.4-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a506a77ddee1efcca81ecbeae27ade3e09cdf21a8ae854d766c2bb4f14053f92", size = 243383, upload-time = "2025-08-11T12:08:10.981Z" }, + { url = "https://files.pythonhosted.org/packages/50/b0/a6fae46071b645ae98786ab738447de1ef53742eaad949f27e960864bb49/multidict-6.6.4-cp313-cp313t-win32.whl", hash = "sha256:f93b2b2279883d1d0a9e1bd01f312d6fc315c5e4c1f09e112e4736e2f650bc4e", size = 47775, upload-time = "2025-08-11T12:08:12.439Z" }, + { url = "https://files.pythonhosted.org/packages/b2/0a/2436550b1520091af0600dff547913cb2d66fbac27a8c33bc1b1bccd8d98/multidict-6.6.4-cp313-cp313t-win_amd64.whl", hash = "sha256:6d46a180acdf6e87cc41dc15d8f5c2986e1e8739dc25dbb7dac826731ef381a4", size = 53100, upload-time = "2025-08-11T12:08:13.823Z" }, + { url = "https://files.pythonhosted.org/packages/97/ea/43ac51faff934086db9c072a94d327d71b7d8b40cd5dcb47311330929ef0/multidict-6.6.4-cp313-cp313t-win_arm64.whl", hash = "sha256:756989334015e3335d087a27331659820d53ba432befdef6a718398b0a8493ad", size = 45501, upload-time = "2025-08-11T12:08:15.173Z" }, + { url = "https://files.pythonhosted.org/packages/fd/69/b547032297c7e63ba2af494edba695d781af8a0c6e89e4d06cf848b21d80/multidict-6.6.4-py3-none-any.whl", hash = "sha256:27d8f8e125c07cb954e54d75d04905a9bba8a439c1d84aca94949d4d03d8601c", size = 12313, upload-time = "2025-08-11T12:08:46.891Z" }, +] + +[[package]] +name = "mypy" +version = "1.18.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mypy-extensions" }, + { name = "pathspec" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/14/a3/931e09fc02d7ba96da65266884da4e4a8806adcdb8a57faaacc6edf1d538/mypy-1.18.1.tar.gz", hash = "sha256:9e988c64ad3ac5987f43f5154f884747faf62141b7f842e87465b45299eea5a9", size = 3448447, upload-time = "2025-09-11T23:00:47.067Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/32/28/47709d5d9e7068b26c0d5189c8137c8783e81065ad1102b505214a08b548/mypy-1.18.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6c903857b3e28fc5489e54042684a9509039ea0aedb2a619469438b544ae1961", size = 12734635, upload-time 
= "2025-09-11T23:00:24.983Z" }, + { url = "https://files.pythonhosted.org/packages/7c/12/ee5c243e52497d0e59316854041cf3b3130131b92266d0764aca4dec3c00/mypy-1.18.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2a0c8392c19934c2b6c65566d3a6abdc6b51d5da7f5d04e43f0eb627d6eeee65", size = 11817287, upload-time = "2025-09-11T22:59:07.38Z" }, + { url = "https://files.pythonhosted.org/packages/48/bd/2aeb950151005fe708ab59725afed7c4aeeb96daf844f86a05d4b8ac34f8/mypy-1.18.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f85eb7efa2ec73ef63fc23b8af89c2fe5bf2a4ad985ed2d3ff28c1bb3c317c92", size = 12430464, upload-time = "2025-09-11T22:58:48.084Z" }, + { url = "https://files.pythonhosted.org/packages/71/e8/7a20407aafb488acb5734ad7fb5e8c2ef78d292ca2674335350fa8ebef67/mypy-1.18.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:82ace21edf7ba8af31c3308a61dc72df30500f4dbb26f99ac36b4b80809d7e94", size = 13164555, upload-time = "2025-09-11T23:00:13.803Z" }, + { url = "https://files.pythonhosted.org/packages/e8/c9/5f39065252e033b60f397096f538fb57c1d9fd70a7a490f314df20dd9d64/mypy-1.18.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a2dfd53dfe632f1ef5d161150a4b1f2d0786746ae02950eb3ac108964ee2975a", size = 13359222, upload-time = "2025-09-11T23:00:33.469Z" }, + { url = "https://files.pythonhosted.org/packages/85/b6/d54111ef3c1e55992cd2ec9b8b6ce9c72a407423e93132cae209f7e7ba60/mypy-1.18.1-cp311-cp311-win_amd64.whl", hash = "sha256:320f0ad4205eefcb0e1a72428dde0ad10be73da9f92e793c36228e8ebf7298c0", size = 9760441, upload-time = "2025-09-11T23:00:44.826Z" }, + { url = "https://files.pythonhosted.org/packages/e7/14/1c3f54d606cb88a55d1567153ef3a8bc7b74702f2ff5eb64d0994f9e49cb/mypy-1.18.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:502cde8896be8e638588b90fdcb4c5d5b8c1b004dfc63fd5604a973547367bb9", size = 12911082, upload-time = "2025-09-11T23:00:41.465Z" }, + { url = "https://files.pythonhosted.org/packages/90/83/235606c8b6d50a8eba99773add907ce1d41c068edb523f81eb0d01603a83/mypy-1.18.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7509549b5e41be279afc1228242d0e397f1af2919a8f2877ad542b199dc4083e", size = 11919107, upload-time = "2025-09-11T22:58:40.903Z" }, + { url = "https://files.pythonhosted.org/packages/ca/25/4e2ce00f8d15b99d0c68a2536ad63e9eac033f723439ef80290ec32c1ff5/mypy-1.18.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5956ecaabb3a245e3f34100172abca1507be687377fe20e24d6a7557e07080e2", size = 12472551, upload-time = "2025-09-11T22:58:37.272Z" }, + { url = "https://files.pythonhosted.org/packages/32/bb/92642a9350fc339dd9dcefcf6862d171b52294af107d521dce075f32f298/mypy-1.18.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8750ceb014a96c9890421c83f0db53b0f3b8633e2864c6f9bc0a8e93951ed18d", size = 13340554, upload-time = "2025-09-11T22:59:38.756Z" }, + { url = "https://files.pythonhosted.org/packages/cd/ee/38d01db91c198fb6350025d28f9719ecf3c8f2c55a0094bfbf3ef478cc9a/mypy-1.18.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fb89ea08ff41adf59476b235293679a6eb53a7b9400f6256272fb6029bec3ce5", size = 13530933, upload-time = "2025-09-11T22:59:20.228Z" }, + { url = "https://files.pythonhosted.org/packages/da/8d/6d991ae631f80d58edbf9d7066e3f2a96e479dca955d9a968cd6e90850a3/mypy-1.18.1-cp312-cp312-win_amd64.whl", hash = "sha256:2657654d82fcd2a87e02a33e0d23001789a554059bbf34702d623dafe353eabf", 
size = 9828426, upload-time = "2025-09-11T23:00:21.007Z" }, + { url = "https://files.pythonhosted.org/packages/e4/ec/ef4a7260e1460a3071628a9277a7579e7da1b071bc134ebe909323f2fbc7/mypy-1.18.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d70d2b5baf9b9a20bc9c730015615ae3243ef47fb4a58ad7b31c3e0a59b5ef1f", size = 12918671, upload-time = "2025-09-11T22:58:29.814Z" }, + { url = "https://files.pythonhosted.org/packages/a1/82/0ea6c3953f16223f0b8eda40c1aeac6bd266d15f4902556ae6e91f6fca4c/mypy-1.18.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b8367e33506300f07a43012fc546402f283c3f8bcff1dc338636affb710154ce", size = 11913023, upload-time = "2025-09-11T23:00:29.049Z" }, + { url = "https://files.pythonhosted.org/packages/ae/ef/5e2057e692c2690fc27b3ed0a4dbde4388330c32e2576a23f0302bc8358d/mypy-1.18.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:913f668ec50c3337b89df22f973c1c8f0b29ee9e290a8b7fe01cc1ef7446d42e", size = 12473355, upload-time = "2025-09-11T23:00:04.544Z" }, + { url = "https://files.pythonhosted.org/packages/98/43/b7e429fc4be10e390a167b0cd1810d41cb4e4add4ae50bab96faff695a3b/mypy-1.18.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1a0e70b87eb27b33209fa4792b051c6947976f6ab829daa83819df5f58330c71", size = 13346944, upload-time = "2025-09-11T22:58:23.024Z" }, + { url = "https://files.pythonhosted.org/packages/89/4e/899dba0bfe36bbd5b7c52e597de4cf47b5053d337b6d201a30e3798e77a6/mypy-1.18.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:c378d946e8a60be6b6ede48c878d145546fb42aad61df998c056ec151bf6c746", size = 13512574, upload-time = "2025-09-11T22:59:52.152Z" }, + { url = "https://files.pythonhosted.org/packages/f5/f8/7661021a5b0e501b76440454d786b0f01bb05d5c4b125fcbda02023d0250/mypy-1.18.1-cp313-cp313-win_amd64.whl", hash = "sha256:2cd2c1e0f3a7465f22731987fff6fc427e3dcbb4ca5f7db5bbeaff2ff9a31f6d", size = 9837684, upload-time = "2025-09-11T22:58:44.454Z" }, + { url = "https://files.pythonhosted.org/packages/bf/87/7b173981466219eccc64c107cf8e5ab9eb39cc304b4c07df8e7881533e4f/mypy-1.18.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:ba24603c58e34dd5b096dfad792d87b304fc6470cbb1c22fd64e7ebd17edcc61", size = 12900265, upload-time = "2025-09-11T22:59:03.4Z" }, + { url = "https://files.pythonhosted.org/packages/ae/cc/b10e65bae75b18a5ac8f81b1e8e5867677e418f0dd2c83b8e2de9ba96ebd/mypy-1.18.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ed36662fb92ae4cb3cacc682ec6656208f323bbc23d4b08d091eecfc0863d4b5", size = 11942890, upload-time = "2025-09-11T23:00:00.607Z" }, + { url = "https://files.pythonhosted.org/packages/39/d4/aeefa07c44d09f4c2102e525e2031bc066d12e5351f66b8a83719671004d/mypy-1.18.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:040ecc95e026f71a9ad7956fea2724466602b561e6a25c2e5584160d3833aaa8", size = 12472291, upload-time = "2025-09-11T22:59:43.425Z" }, + { url = "https://files.pythonhosted.org/packages/c6/07/711e78668ff8e365f8c19735594ea95938bff3639a4c46a905e3ed8ff2d6/mypy-1.18.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:937e3ed86cb731276706e46e03512547e43c391a13f363e08d0fee49a7c38a0d", size = 13318610, upload-time = "2025-09-11T23:00:17.604Z" }, + { url = "https://files.pythonhosted.org/packages/ca/85/df3b2d39339c31d360ce299b418c55e8194ef3205284739b64962f6074e7/mypy-1.18.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = 
"sha256:1f95cc4f01c0f1701ca3b0355792bccec13ecb2ec1c469e5b85a6ef398398b1d", size = 13513697, upload-time = "2025-09-11T22:58:59.534Z" }, + { url = "https://files.pythonhosted.org/packages/b1/df/462866163c99ea73bb28f0eb4d415c087e30de5d36ee0f5429d42e28689b/mypy-1.18.1-cp314-cp314-win_amd64.whl", hash = "sha256:e4f16c0019d48941220ac60b893615be2f63afedaba6a0801bdcd041b96991ce", size = 9985739, upload-time = "2025-09-11T22:58:51.644Z" }, + { url = "https://files.pythonhosted.org/packages/e0/1d/4b97d3089b48ef3d904c9ca69fab044475bd03245d878f5f0b3ea1daf7ce/mypy-1.18.1-py3-none-any.whl", hash = "sha256:b76a4de66a0ac01da1be14ecc8ae88ddea33b8380284a9e3eae39d57ebcbe26e", size = 2352212, upload-time = "2025-09-11T22:59:26.576Z" }, +] + +[[package]] +name = "mypy-extensions" +version = "1.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a2/6e/371856a3fb9d31ca8dac321cda606860fa4548858c0cc45d9d1d4ca2628b/mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558", size = 6343, upload-time = "2025-04-22T14:54:24.164Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" }, +] + +[[package]] +name = "networkx" +version = "3.5" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6c/4f/ccdb8ad3a38e583f214547fd2f7ff1fc160c43a75af88e6aec213404b96a/networkx-3.5.tar.gz", hash = "sha256:d4c6f9cf81f52d69230866796b82afbccdec3db7ae4fbd1b65ea750feed50037", size = 2471065, upload-time = "2025-05-29T11:35:07.804Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/eb/8d/776adee7bbf76365fdd7f2552710282c79a4ead5d2a46408c9043a2b70ba/networkx-3.5-py3-none-any.whl", hash = "sha256:0030d386a9a06dee3565298b4a734b68589749a544acbb6c412dc9e2489ec6ec", size = 2034406, upload-time = "2025-05-29T11:35:04.961Z" }, +] + +[[package]] +name = "nltk" +version = "3.9.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "joblib" }, + { name = "regex" }, + { name = "tqdm" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/3c/87/db8be88ad32c2d042420b6fd9ffd4a149f9a0d7f0e86b3f543be2eeeedd2/nltk-3.9.1.tar.gz", hash = "sha256:87d127bd3de4bd89a4f81265e5fa59cb1b199b27440175370f7417d2bc7ae868", size = 2904691, upload-time = "2024-08-18T19:48:37.769Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4d/66/7d9e26593edda06e8cb531874633f7c2372279c3b0f46235539fe546df8b/nltk-3.9.1-py3-none-any.whl", hash = "sha256:4fa26829c5b00715afe3061398a8989dc643b92ce7dd93fb4585a70930d168a1", size = 1505442, upload-time = "2024-08-18T19:48:21.909Z" }, +] + +[[package]] +name = "nodeenv" +version = "1.9.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437, upload-time = "2024-06-04T18:44:11.171Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = 
"sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314, upload-time = "2024-06-04T18:44:08.352Z" }, +] + +[[package]] +name = "numpy" +version = "2.3.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d0/19/95b3d357407220ed24c139018d2518fab0a61a948e68286a25f1a4d049ff/numpy-2.3.3.tar.gz", hash = "sha256:ddc7c39727ba62b80dfdbedf400d1c10ddfa8eefbd7ec8dcb118be8b56d31029", size = 20576648, upload-time = "2025-09-09T16:54:12.543Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7a/45/e80d203ef6b267aa29b22714fb558930b27960a0c5ce3c19c999232bb3eb/numpy-2.3.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0ffc4f5caba7dfcbe944ed674b7eef683c7e94874046454bb79ed7ee0236f59d", size = 21259253, upload-time = "2025-09-09T15:56:02.094Z" }, + { url = "https://files.pythonhosted.org/packages/52/18/cf2c648fccf339e59302e00e5f2bc87725a3ce1992f30f3f78c9044d7c43/numpy-2.3.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e7e946c7170858a0295f79a60214424caac2ffdb0063d4d79cb681f9aa0aa569", size = 14450980, upload-time = "2025-09-09T15:56:05.926Z" }, + { url = "https://files.pythonhosted.org/packages/93/fb/9af1082bec870188c42a1c239839915b74a5099c392389ff04215dcee812/numpy-2.3.3-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:cd4260f64bc794c3390a63bf0728220dd1a68170c169088a1e0dfa2fde1be12f", size = 5379709, upload-time = "2025-09-09T15:56:07.95Z" }, + { url = "https://files.pythonhosted.org/packages/75/0f/bfd7abca52bcbf9a4a65abc83fe18ef01ccdeb37bfb28bbd6ad613447c79/numpy-2.3.3-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:f0ddb4b96a87b6728df9362135e764eac3cfa674499943ebc44ce96c478ab125", size = 6913923, upload-time = "2025-09-09T15:56:09.443Z" }, + { url = "https://files.pythonhosted.org/packages/79/55/d69adad255e87ab7afda1caf93ca997859092afeb697703e2f010f7c2e55/numpy-2.3.3-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:afd07d377f478344ec6ca2b8d4ca08ae8bd44706763d1efb56397de606393f48", size = 14589591, upload-time = "2025-09-09T15:56:11.234Z" }, + { url = "https://files.pythonhosted.org/packages/10/a2/010b0e27ddeacab7839957d7a8f00e91206e0c2c47abbb5f35a2630e5387/numpy-2.3.3-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bc92a5dedcc53857249ca51ef29f5e5f2f8c513e22cfb90faeb20343b8c6f7a6", size = 16938714, upload-time = "2025-09-09T15:56:14.637Z" }, + { url = "https://files.pythonhosted.org/packages/1c/6b/12ce8ede632c7126eb2762b9e15e18e204b81725b81f35176eac14dc5b82/numpy-2.3.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7af05ed4dc19f308e1d9fc759f36f21921eb7bbfc82843eeec6b2a2863a0aefa", size = 16370592, upload-time = "2025-09-09T15:56:17.285Z" }, + { url = "https://files.pythonhosted.org/packages/b4/35/aba8568b2593067bb6a8fe4c52babb23b4c3b9c80e1b49dff03a09925e4a/numpy-2.3.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:433bf137e338677cebdd5beac0199ac84712ad9d630b74eceeb759eaa45ddf30", size = 18884474, upload-time = "2025-09-09T15:56:20.943Z" }, + { url = "https://files.pythonhosted.org/packages/45/fa/7f43ba10c77575e8be7b0138d107e4f44ca4a1ef322cd16980ea3e8b8222/numpy-2.3.3-cp311-cp311-win32.whl", hash = "sha256:eb63d443d7b4ffd1e873f8155260d7f58e7e4b095961b01c91062935c2491e57", size = 6599794, upload-time = "2025-09-09T15:56:23.258Z" }, + { url = "https://files.pythonhosted.org/packages/0a/a2/a4f78cb2241fe5664a22a10332f2be886dcdea8784c9f6a01c272da9b426/numpy-2.3.3-cp311-cp311-win_amd64.whl", hash = 
"sha256:ec9d249840f6a565f58d8f913bccac2444235025bbb13e9a4681783572ee3caa", size = 13088104, upload-time = "2025-09-09T15:56:25.476Z" }, + { url = "https://files.pythonhosted.org/packages/79/64/e424e975adbd38282ebcd4891661965b78783de893b381cbc4832fb9beb2/numpy-2.3.3-cp311-cp311-win_arm64.whl", hash = "sha256:74c2a948d02f88c11a3c075d9733f1ae67d97c6bdb97f2bb542f980458b257e7", size = 10460772, upload-time = "2025-09-09T15:56:27.679Z" }, + { url = "https://files.pythonhosted.org/packages/51/5d/bb7fc075b762c96329147799e1bcc9176ab07ca6375ea976c475482ad5b3/numpy-2.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:cfdd09f9c84a1a934cde1eec2267f0a43a7cd44b2cca4ff95b7c0d14d144b0bf", size = 20957014, upload-time = "2025-09-09T15:56:29.966Z" }, + { url = "https://files.pythonhosted.org/packages/6b/0e/c6211bb92af26517acd52125a237a92afe9c3124c6a68d3b9f81b62a0568/numpy-2.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:cb32e3cf0f762aee47ad1ddc6672988f7f27045b0783c887190545baba73aa25", size = 14185220, upload-time = "2025-09-09T15:56:32.175Z" }, + { url = "https://files.pythonhosted.org/packages/22/f2/07bb754eb2ede9073f4054f7c0286b0d9d2e23982e090a80d478b26d35ca/numpy-2.3.3-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:396b254daeb0a57b1fe0ecb5e3cff6fa79a380fa97c8f7781a6d08cd429418fe", size = 5113918, upload-time = "2025-09-09T15:56:34.175Z" }, + { url = "https://files.pythonhosted.org/packages/81/0a/afa51697e9fb74642f231ea36aca80fa17c8fb89f7a82abd5174023c3960/numpy-2.3.3-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:067e3d7159a5d8f8a0b46ee11148fc35ca9b21f61e3c49fbd0a027450e65a33b", size = 6647922, upload-time = "2025-09-09T15:56:36.149Z" }, + { url = "https://files.pythonhosted.org/packages/5d/f5/122d9cdb3f51c520d150fef6e87df9279e33d19a9611a87c0d2cf78a89f4/numpy-2.3.3-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1c02d0629d25d426585fb2e45a66154081b9fa677bc92a881ff1d216bc9919a8", size = 14281991, upload-time = "2025-09-09T15:56:40.548Z" }, + { url = "https://files.pythonhosted.org/packages/51/64/7de3c91e821a2debf77c92962ea3fe6ac2bc45d0778c1cbe15d4fce2fd94/numpy-2.3.3-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d9192da52b9745f7f0766531dcfa978b7763916f158bb63bdb8a1eca0068ab20", size = 16641643, upload-time = "2025-09-09T15:56:43.343Z" }, + { url = "https://files.pythonhosted.org/packages/30/e4/961a5fa681502cd0d68907818b69f67542695b74e3ceaa513918103b7e80/numpy-2.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:cd7de500a5b66319db419dc3c345244404a164beae0d0937283b907d8152e6ea", size = 16056787, upload-time = "2025-09-09T15:56:46.141Z" }, + { url = "https://files.pythonhosted.org/packages/99/26/92c912b966e47fbbdf2ad556cb17e3a3088e2e1292b9833be1dfa5361a1a/numpy-2.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:93d4962d8f82af58f0b2eb85daaf1b3ca23fe0a85d0be8f1f2b7bb46034e56d7", size = 18579598, upload-time = "2025-09-09T15:56:49.844Z" }, + { url = "https://files.pythonhosted.org/packages/17/b6/fc8f82cb3520768718834f310c37d96380d9dc61bfdaf05fe5c0b7653e01/numpy-2.3.3-cp312-cp312-win32.whl", hash = "sha256:5534ed6b92f9b7dca6c0a19d6df12d41c68b991cef051d108f6dbff3babc4ebf", size = 6320800, upload-time = "2025-09-09T15:56:52.499Z" }, + { url = "https://files.pythonhosted.org/packages/32/ee/de999f2625b80d043d6d2d628c07d0d5555a677a3cf78fdf868d409b8766/numpy-2.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:497d7cad08e7092dba36e3d296fe4c97708c93daf26643a1ae4b03f6294d30eb", size = 12786615, upload-time = 
"2025-09-09T15:56:54.422Z" }, + { url = "https://files.pythonhosted.org/packages/49/6e/b479032f8a43559c383acb20816644f5f91c88f633d9271ee84f3b3a996c/numpy-2.3.3-cp312-cp312-win_arm64.whl", hash = "sha256:ca0309a18d4dfea6fc6262a66d06c26cfe4640c3926ceec90e57791a82b6eee5", size = 10195936, upload-time = "2025-09-09T15:56:56.541Z" }, + { url = "https://files.pythonhosted.org/packages/7d/b9/984c2b1ee61a8b803bf63582b4ac4242cf76e2dbd663efeafcb620cc0ccb/numpy-2.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f5415fb78995644253370985342cd03572ef8620b934da27d77377a2285955bf", size = 20949588, upload-time = "2025-09-09T15:56:59.087Z" }, + { url = "https://files.pythonhosted.org/packages/a6/e4/07970e3bed0b1384d22af1e9912527ecbeb47d3b26e9b6a3bced068b3bea/numpy-2.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d00de139a3324e26ed5b95870ce63be7ec7352171bc69a4cf1f157a48e3eb6b7", size = 14177802, upload-time = "2025-09-09T15:57:01.73Z" }, + { url = "https://files.pythonhosted.org/packages/35/c7/477a83887f9de61f1203bad89cf208b7c19cc9fef0cebef65d5a1a0619f2/numpy-2.3.3-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:9dc13c6a5829610cc07422bc74d3ac083bd8323f14e2827d992f9e52e22cd6a6", size = 5106537, upload-time = "2025-09-09T15:57:03.765Z" }, + { url = "https://files.pythonhosted.org/packages/52/47/93b953bd5866a6f6986344d045a207d3f1cfbad99db29f534ea9cee5108c/numpy-2.3.3-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:d79715d95f1894771eb4e60fb23f065663b2298f7d22945d66877aadf33d00c7", size = 6640743, upload-time = "2025-09-09T15:57:07.921Z" }, + { url = "https://files.pythonhosted.org/packages/23/83/377f84aaeb800b64c0ef4de58b08769e782edcefa4fea712910b6f0afd3c/numpy-2.3.3-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:952cfd0748514ea7c3afc729a0fc639e61655ce4c55ab9acfab14bda4f402b4c", size = 14278881, upload-time = "2025-09-09T15:57:11.349Z" }, + { url = "https://files.pythonhosted.org/packages/9a/a5/bf3db6e66c4b160d6ea10b534c381a1955dfab34cb1017ea93aa33c70ed3/numpy-2.3.3-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5b83648633d46f77039c29078751f80da65aa64d5622a3cd62aaef9d835b6c93", size = 16636301, upload-time = "2025-09-09T15:57:14.245Z" }, + { url = "https://files.pythonhosted.org/packages/a2/59/1287924242eb4fa3f9b3a2c30400f2e17eb2707020d1c5e3086fe7330717/numpy-2.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b001bae8cea1c7dfdb2ae2b017ed0a6f2102d7a70059df1e338e307a4c78a8ae", size = 16053645, upload-time = "2025-09-09T15:57:16.534Z" }, + { url = "https://files.pythonhosted.org/packages/e6/93/b3d47ed882027c35e94ac2320c37e452a549f582a5e801f2d34b56973c97/numpy-2.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8e9aced64054739037d42fb84c54dd38b81ee238816c948c8f3ed134665dcd86", size = 18578179, upload-time = "2025-09-09T15:57:18.883Z" }, + { url = "https://files.pythonhosted.org/packages/20/d9/487a2bccbf7cc9d4bfc5f0f197761a5ef27ba870f1e3bbb9afc4bbe3fcc2/numpy-2.3.3-cp313-cp313-win32.whl", hash = "sha256:9591e1221db3f37751e6442850429b3aabf7026d3b05542d102944ca7f00c8a8", size = 6312250, upload-time = "2025-09-09T15:57:21.296Z" }, + { url = "https://files.pythonhosted.org/packages/1b/b5/263ebbbbcede85028f30047eab3d58028d7ebe389d6493fc95ae66c636ab/numpy-2.3.3-cp313-cp313-win_amd64.whl", hash = "sha256:f0dadeb302887f07431910f67a14d57209ed91130be0adea2f9793f1a4f817cf", size = 12783269, upload-time = "2025-09-09T15:57:23.034Z" }, + { url = 
"https://files.pythonhosted.org/packages/fa/75/67b8ca554bbeaaeb3fac2e8bce46967a5a06544c9108ec0cf5cece559b6c/numpy-2.3.3-cp313-cp313-win_arm64.whl", hash = "sha256:3c7cf302ac6e0b76a64c4aecf1a09e51abd9b01fc7feee80f6c43e3ab1b1dbc5", size = 10195314, upload-time = "2025-09-09T15:57:25.045Z" }, + { url = "https://files.pythonhosted.org/packages/11/d0/0d1ddec56b162042ddfafeeb293bac672de9b0cfd688383590090963720a/numpy-2.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:eda59e44957d272846bb407aad19f89dc6f58fecf3504bd144f4c5cf81a7eacc", size = 21048025, upload-time = "2025-09-09T15:57:27.257Z" }, + { url = "https://files.pythonhosted.org/packages/36/9e/1996ca6b6d00415b6acbdd3c42f7f03ea256e2c3f158f80bd7436a8a19f3/numpy-2.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:823d04112bc85ef5c4fda73ba24e6096c8f869931405a80aa8b0e604510a26bc", size = 14301053, upload-time = "2025-09-09T15:57:30.077Z" }, + { url = "https://files.pythonhosted.org/packages/05/24/43da09aa764c68694b76e84b3d3f0c44cb7c18cdc1ba80e48b0ac1d2cd39/numpy-2.3.3-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:40051003e03db4041aa325da2a0971ba41cf65714e65d296397cc0e32de6018b", size = 5229444, upload-time = "2025-09-09T15:57:32.733Z" }, + { url = "https://files.pythonhosted.org/packages/bc/14/50ffb0f22f7218ef8af28dd089f79f68289a7a05a208db9a2c5dcbe123c1/numpy-2.3.3-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:6ee9086235dd6ab7ae75aba5662f582a81ced49f0f1c6de4260a78d8f2d91a19", size = 6738039, upload-time = "2025-09-09T15:57:34.328Z" }, + { url = "https://files.pythonhosted.org/packages/55/52/af46ac0795e09657d45a7f4db961917314377edecf66db0e39fa7ab5c3d3/numpy-2.3.3-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:94fcaa68757c3e2e668ddadeaa86ab05499a70725811e582b6a9858dd472fb30", size = 14352314, upload-time = "2025-09-09T15:57:36.255Z" }, + { url = "https://files.pythonhosted.org/packages/a7/b1/dc226b4c90eb9f07a3fff95c2f0db3268e2e54e5cce97c4ac91518aee71b/numpy-2.3.3-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:da1a74b90e7483d6ce5244053399a614b1d6b7bc30a60d2f570e5071f8959d3e", size = 16701722, upload-time = "2025-09-09T15:57:38.622Z" }, + { url = "https://files.pythonhosted.org/packages/9d/9d/9d8d358f2eb5eced14dba99f110d83b5cd9a4460895230f3b396ad19a323/numpy-2.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:2990adf06d1ecee3b3dcbb4977dfab6e9f09807598d647f04d385d29e7a3c3d3", size = 16132755, upload-time = "2025-09-09T15:57:41.16Z" }, + { url = "https://files.pythonhosted.org/packages/b6/27/b3922660c45513f9377b3fb42240bec63f203c71416093476ec9aa0719dc/numpy-2.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:ed635ff692483b8e3f0fcaa8e7eb8a75ee71aa6d975388224f70821421800cea", size = 18651560, upload-time = "2025-09-09T15:57:43.459Z" }, + { url = "https://files.pythonhosted.org/packages/5b/8e/3ab61a730bdbbc201bb245a71102aa609f0008b9ed15255500a99cd7f780/numpy-2.3.3-cp313-cp313t-win32.whl", hash = "sha256:a333b4ed33d8dc2b373cc955ca57babc00cd6f9009991d9edc5ddbc1bac36bcd", size = 6442776, upload-time = "2025-09-09T15:57:45.793Z" }, + { url = "https://files.pythonhosted.org/packages/1c/3a/e22b766b11f6030dc2decdeff5c2fb1610768055603f9f3be88b6d192fb2/numpy-2.3.3-cp313-cp313t-win_amd64.whl", hash = "sha256:4384a169c4d8f97195980815d6fcad04933a7e1ab3b530921c3fef7a1c63426d", size = 12927281, upload-time = "2025-09-09T15:57:47.492Z" }, + { url = 
"https://files.pythonhosted.org/packages/7b/42/c2e2bc48c5e9b2a83423f99733950fbefd86f165b468a3d85d52b30bf782/numpy-2.3.3-cp313-cp313t-win_arm64.whl", hash = "sha256:75370986cc0bc66f4ce5110ad35aae6d182cc4ce6433c40ad151f53690130bf1", size = 10265275, upload-time = "2025-09-09T15:57:49.647Z" }, + { url = "https://files.pythonhosted.org/packages/6b/01/342ad585ad82419b99bcf7cebe99e61da6bedb89e213c5fd71acc467faee/numpy-2.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:cd052f1fa6a78dee696b58a914b7229ecfa41f0a6d96dc663c1220a55e137593", size = 20951527, upload-time = "2025-09-09T15:57:52.006Z" }, + { url = "https://files.pythonhosted.org/packages/ef/d8/204e0d73fc1b7a9ee80ab1fe1983dd33a4d64a4e30a05364b0208e9a241a/numpy-2.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:414a97499480067d305fcac9716c29cf4d0d76db6ebf0bf3cbce666677f12652", size = 14186159, upload-time = "2025-09-09T15:57:54.407Z" }, + { url = "https://files.pythonhosted.org/packages/22/af/f11c916d08f3a18fb8ba81ab72b5b74a6e42ead4c2846d270eb19845bf74/numpy-2.3.3-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:50a5fe69f135f88a2be9b6ca0481a68a136f6febe1916e4920e12f1a34e708a7", size = 5114624, upload-time = "2025-09-09T15:57:56.5Z" }, + { url = "https://files.pythonhosted.org/packages/fb/11/0ed919c8381ac9d2ffacd63fd1f0c34d27e99cab650f0eb6f110e6ae4858/numpy-2.3.3-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:b912f2ed2b67a129e6a601e9d93d4fa37bef67e54cac442a2f588a54afe5c67a", size = 6642627, upload-time = "2025-09-09T15:57:58.206Z" }, + { url = "https://files.pythonhosted.org/packages/ee/83/deb5f77cb0f7ba6cb52b91ed388b47f8f3c2e9930d4665c600408d9b90b9/numpy-2.3.3-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9e318ee0596d76d4cb3d78535dc005fa60e5ea348cd131a51e99d0bdbe0b54fe", size = 14296926, upload-time = "2025-09-09T15:58:00.035Z" }, + { url = "https://files.pythonhosted.org/packages/77/cc/70e59dcb84f2b005d4f306310ff0a892518cc0c8000a33d0e6faf7ca8d80/numpy-2.3.3-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ce020080e4a52426202bdb6f7691c65bb55e49f261f31a8f506c9f6bc7450421", size = 16638958, upload-time = "2025-09-09T15:58:02.738Z" }, + { url = "https://files.pythonhosted.org/packages/b6/5a/b2ab6c18b4257e099587d5b7f903317bd7115333ad8d4ec4874278eafa61/numpy-2.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e6687dc183aa55dae4a705b35f9c0f8cb178bcaa2f029b241ac5356221d5c021", size = 16071920, upload-time = "2025-09-09T15:58:05.029Z" }, + { url = "https://files.pythonhosted.org/packages/b8/f1/8b3fdc44324a259298520dd82147ff648979bed085feeacc1250ef1656c0/numpy-2.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d8f3b1080782469fdc1718c4ed1d22549b5fb12af0d57d35e992158a772a37cf", size = 18577076, upload-time = "2025-09-09T15:58:07.745Z" }, + { url = "https://files.pythonhosted.org/packages/f0/a1/b87a284fb15a42e9274e7fcea0dad259d12ddbf07c1595b26883151ca3b4/numpy-2.3.3-cp314-cp314-win32.whl", hash = "sha256:cb248499b0bc3be66ebd6578b83e5acacf1d6cb2a77f2248ce0e40fbec5a76d0", size = 6366952, upload-time = "2025-09-09T15:58:10.096Z" }, + { url = "https://files.pythonhosted.org/packages/70/5f/1816f4d08f3b8f66576d8433a66f8fa35a5acfb3bbd0bf6c31183b003f3d/numpy-2.3.3-cp314-cp314-win_amd64.whl", hash = "sha256:691808c2b26b0f002a032c73255d0bd89751425f379f7bcd22d140db593a96e8", size = 12919322, upload-time = "2025-09-09T15:58:12.138Z" }, + { url = 
"https://files.pythonhosted.org/packages/8c/de/072420342e46a8ea41c324a555fa90fcc11637583fb8df722936aed1736d/numpy-2.3.3-cp314-cp314-win_arm64.whl", hash = "sha256:9ad12e976ca7b10f1774b03615a2a4bab8addce37ecc77394d8e986927dc0dfe", size = 10478630, upload-time = "2025-09-09T15:58:14.64Z" }, + { url = "https://files.pythonhosted.org/packages/d5/df/ee2f1c0a9de7347f14da5dd3cd3c3b034d1b8607ccb6883d7dd5c035d631/numpy-2.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9cc48e09feb11e1db00b320e9d30a4151f7369afb96bd0e48d942d09da3a0d00", size = 21047987, upload-time = "2025-09-09T15:58:16.889Z" }, + { url = "https://files.pythonhosted.org/packages/d6/92/9453bdc5a4e9e69cf4358463f25e8260e2ffc126d52e10038b9077815989/numpy-2.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:901bf6123879b7f251d3631967fd574690734236075082078e0571977c6a8e6a", size = 14301076, upload-time = "2025-09-09T15:58:20.343Z" }, + { url = "https://files.pythonhosted.org/packages/13/77/1447b9eb500f028bb44253105bd67534af60499588a5149a94f18f2ca917/numpy-2.3.3-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:7f025652034199c301049296b59fa7d52c7e625017cae4c75d8662e377bf487d", size = 5229491, upload-time = "2025-09-09T15:58:22.481Z" }, + { url = "https://files.pythonhosted.org/packages/3d/f9/d72221b6ca205f9736cb4b2ce3b002f6e45cd67cd6a6d1c8af11a2f0b649/numpy-2.3.3-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:533ca5f6d325c80b6007d4d7fb1984c303553534191024ec6a524a4c92a5935a", size = 6737913, upload-time = "2025-09-09T15:58:24.569Z" }, + { url = "https://files.pythonhosted.org/packages/3c/5f/d12834711962ad9c46af72f79bb31e73e416ee49d17f4c797f72c96b6ca5/numpy-2.3.3-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0edd58682a399824633b66885d699d7de982800053acf20be1eaa46d92009c54", size = 14352811, upload-time = "2025-09-09T15:58:26.416Z" }, + { url = "https://files.pythonhosted.org/packages/a1/0d/fdbec6629d97fd1bebed56cd742884e4eead593611bbe1abc3eb40d304b2/numpy-2.3.3-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:367ad5d8fbec5d9296d18478804a530f1191e24ab4d75ab408346ae88045d25e", size = 16702689, upload-time = "2025-09-09T15:58:28.831Z" }, + { url = "https://files.pythonhosted.org/packages/9b/09/0a35196dc5575adde1eb97ddfbc3e1687a814f905377621d18ca9bc2b7dd/numpy-2.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8f6ac61a217437946a1fa48d24c47c91a0c4f725237871117dea264982128097", size = 16133855, upload-time = "2025-09-09T15:58:31.349Z" }, + { url = "https://files.pythonhosted.org/packages/7a/ca/c9de3ea397d576f1b6753eaa906d4cdef1bf97589a6d9825a349b4729cc2/numpy-2.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:179a42101b845a816d464b6fe9a845dfaf308fdfc7925387195570789bb2c970", size = 18652520, upload-time = "2025-09-09T15:58:33.762Z" }, + { url = "https://files.pythonhosted.org/packages/fd/c2/e5ed830e08cd0196351db55db82f65bc0ab05da6ef2b72a836dcf1936d2f/numpy-2.3.3-cp314-cp314t-win32.whl", hash = "sha256:1250c5d3d2562ec4174bce2e3a1523041595f9b651065e4a4473f5f48a6bc8a5", size = 6515371, upload-time = "2025-09-09T15:58:36.04Z" }, + { url = "https://files.pythonhosted.org/packages/47/c7/b0f6b5b67f6788a0725f744496badbb604d226bf233ba716683ebb47b570/numpy-2.3.3-cp314-cp314t-win_amd64.whl", hash = "sha256:b37a0b2e5935409daebe82c1e42274d30d9dd355852529eab91dab8dcca7419f", size = 13112576, upload-time = "2025-09-09T15:58:37.927Z" }, + { url = 
"https://files.pythonhosted.org/packages/06/b9/33bba5ff6fb679aa0b1f8a07e853f002a6b04b9394db3069a1270a7784ca/numpy-2.3.3-cp314-cp314t-win_arm64.whl", hash = "sha256:78c9f6560dc7e6b3990e32df7ea1a50bbd0e2a111e05209963f5ddcab7073b0b", size = 10545953, upload-time = "2025-09-09T15:58:40.576Z" }, + { url = "https://files.pythonhosted.org/packages/b8/f2/7e0a37cfced2644c9563c529f29fa28acbd0960dde32ece683aafa6f4949/numpy-2.3.3-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:1e02c7159791cd481e1e6d5ddd766b62a4d5acf8df4d4d1afe35ee9c5c33a41e", size = 21131019, upload-time = "2025-09-09T15:58:42.838Z" }, + { url = "https://files.pythonhosted.org/packages/1a/7e/3291f505297ed63831135a6cc0f474da0c868a1f31b0dd9a9f03a7a0d2ed/numpy-2.3.3-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:dca2d0fc80b3893ae72197b39f69d55a3cd8b17ea1b50aa4c62de82419936150", size = 14376288, upload-time = "2025-09-09T15:58:45.425Z" }, + { url = "https://files.pythonhosted.org/packages/bf/4b/ae02e985bdeee73d7b5abdefeb98aef1207e96d4c0621ee0cf228ddfac3c/numpy-2.3.3-pp311-pypy311_pp73-macosx_14_0_arm64.whl", hash = "sha256:99683cbe0658f8271b333a1b1b4bb3173750ad59c0c61f5bbdc5b318918fffe3", size = 5305425, upload-time = "2025-09-09T15:58:48.6Z" }, + { url = "https://files.pythonhosted.org/packages/8b/eb/9df215d6d7250db32007941500dc51c48190be25f2401d5b2b564e467247/numpy-2.3.3-pp311-pypy311_pp73-macosx_14_0_x86_64.whl", hash = "sha256:d9d537a39cc9de668e5cd0e25affb17aec17b577c6b3ae8a3d866b479fbe88d0", size = 6819053, upload-time = "2025-09-09T15:58:50.401Z" }, + { url = "https://files.pythonhosted.org/packages/57/62/208293d7d6b2a8998a4a1f23ac758648c3c32182d4ce4346062018362e29/numpy-2.3.3-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8596ba2f8af5f93b01d97563832686d20206d303024777f6dfc2e7c7c3f1850e", size = 14420354, upload-time = "2025-09-09T15:58:52.704Z" }, + { url = "https://files.pythonhosted.org/packages/ed/0c/8e86e0ff7072e14a71b4c6af63175e40d1e7e933ce9b9e9f765a95b4e0c3/numpy-2.3.3-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e1ec5615b05369925bd1125f27df33f3b6c8bc10d788d5999ecd8769a1fa04db", size = 16760413, upload-time = "2025-09-09T15:58:55.027Z" }, + { url = "https://files.pythonhosted.org/packages/af/11/0cc63f9f321ccf63886ac203336777140011fb669e739da36d8db3c53b98/numpy-2.3.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:2e267c7da5bf7309670523896df97f93f6e469fb931161f483cd6882b3b1a5dc", size = 12971844, upload-time = "2025-09-09T15:58:57.359Z" }, +] + +[[package]] +name = "onnxruntime" +version = "1.22.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "coloredlogs" }, + { name = "flatbuffers" }, + { name = "numpy" }, + { name = "packaging" }, + { name = "protobuf" }, + { name = "sympy" }, +] +wheels = [ + { url = "https://files.pythonhosted.org/packages/82/ff/4a1a6747e039ef29a8d4ee4510060e9a805982b6da906a3da2306b7a3be6/onnxruntime-1.22.1-cp311-cp311-macosx_13_0_universal2.whl", hash = "sha256:f4581bccb786da68725d8eac7c63a8f31a89116b8761ff8b4989dc58b61d49a0", size = 34324148, upload-time = "2025-07-10T19:15:26.584Z" }, + { url = "https://files.pythonhosted.org/packages/0b/05/9f1929723f1cca8c9fb1b2b97ac54ce61362c7201434d38053ea36ee4225/onnxruntime-1.22.1-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7ae7526cf10f93454beb0f751e78e5cb7619e3b92f9fc3bd51aa6f3b7a8977e5", size = 14473779, upload-time = "2025-07-10T19:15:30.183Z" }, + { url = 
"https://files.pythonhosted.org/packages/59/f3/c93eb4167d4f36ea947930f82850231f7ce0900cb00e1a53dc4995b60479/onnxruntime-1.22.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f6effa1299ac549a05c784d50292e3378dbbf010346ded67400193b09ddc2f04", size = 16460799, upload-time = "2025-07-10T19:15:33.005Z" }, + { url = "https://files.pythonhosted.org/packages/a8/01/e536397b03e4462d3260aee5387e6f606c8fa9d2b20b1728f988c3c72891/onnxruntime-1.22.1-cp311-cp311-win_amd64.whl", hash = "sha256:f28a42bb322b4ca6d255531bb334a2b3e21f172e37c1741bd5e66bc4b7b61f03", size = 12689881, upload-time = "2025-07-10T19:15:35.501Z" }, + { url = "https://files.pythonhosted.org/packages/48/70/ca2a4d38a5deccd98caa145581becb20c53684f451e89eb3a39915620066/onnxruntime-1.22.1-cp312-cp312-macosx_13_0_universal2.whl", hash = "sha256:a938d11c0dc811badf78e435daa3899d9af38abee950d87f3ab7430eb5b3cf5a", size = 34342883, upload-time = "2025-07-10T19:15:38.223Z" }, + { url = "https://files.pythonhosted.org/packages/29/e5/00b099b4d4f6223b610421080d0eed9327ef9986785c9141819bbba0d396/onnxruntime-1.22.1-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:984cea2a02fcc5dfea44ade9aca9fe0f7a8a2cd6f77c258fc4388238618f3928", size = 14473861, upload-time = "2025-07-10T19:15:42.911Z" }, + { url = "https://files.pythonhosted.org/packages/0a/50/519828a5292a6ccd8d5cd6d2f72c6b36ea528a2ef68eca69647732539ffa/onnxruntime-1.22.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2d39a530aff1ec8d02e365f35e503193991417788641b184f5b1e8c9a6d5ce8d", size = 16475713, upload-time = "2025-07-10T19:15:45.452Z" }, + { url = "https://files.pythonhosted.org/packages/5d/54/7139d463bb0a312890c9a5db87d7815d4a8cce9e6f5f28d04f0b55fcb160/onnxruntime-1.22.1-cp312-cp312-win_amd64.whl", hash = "sha256:6a64291d57ea966a245f749eb970f4fa05a64d26672e05a83fdb5db6b7d62f87", size = 12690910, upload-time = "2025-07-10T19:15:47.478Z" }, + { url = "https://files.pythonhosted.org/packages/e0/39/77cefa829740bd830915095d8408dce6d731b244e24b1f64fe3df9f18e86/onnxruntime-1.22.1-cp313-cp313-macosx_13_0_universal2.whl", hash = "sha256:d29c7d87b6cbed8fecfd09dca471832384d12a69e1ab873e5effbb94adc3e966", size = 34342026, upload-time = "2025-07-10T19:15:50.266Z" }, + { url = "https://files.pythonhosted.org/packages/d2/a6/444291524cb52875b5de980a6e918072514df63a57a7120bf9dfae3aeed1/onnxruntime-1.22.1-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:460487d83b7056ba98f1f7bac80287224c31d8149b15712b0d6f5078fcc33d0f", size = 14474014, upload-time = "2025-07-10T19:15:53.991Z" }, + { url = "https://files.pythonhosted.org/packages/87/9d/45a995437879c18beff26eacc2322f4227224d04c6ac3254dce2e8950190/onnxruntime-1.22.1-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b0c37070268ba4e02a1a9d28560cd00cd1e94f0d4f275cbef283854f861a65fa", size = 16475427, upload-time = "2025-07-10T19:15:56.067Z" }, + { url = "https://files.pythonhosted.org/packages/4c/06/9c765e66ad32a7e709ce4cb6b95d7eaa9cb4d92a6e11ea97c20ffecaf765/onnxruntime-1.22.1-cp313-cp313-win_amd64.whl", hash = "sha256:70980d729145a36a05f74b573435531f55ef9503bcda81fc6c3d6b9306199982", size = 12690841, upload-time = "2025-07-10T19:15:58.337Z" }, + { url = "https://files.pythonhosted.org/packages/52/8c/02af24ee1c8dce4e6c14a1642a7a56cebe323d2fa01d9a360a638f7e4b75/onnxruntime-1.22.1-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:33a7980bbc4b7f446bac26c3785652fe8730ed02617d765399e89ac7d44e0f7d", 
size = 14479333, upload-time = "2025-07-10T19:16:00.544Z" }, + { url = "https://files.pythonhosted.org/packages/5d/15/d75fd66aba116ce3732bb1050401394c5ec52074c4f7ee18db8838dd4667/onnxruntime-1.22.1-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6e7e823624b015ea879d976cbef8bfaed2f7e2cc233d7506860a76dd37f8f381", size = 16477261, upload-time = "2025-07-10T19:16:03.226Z" }, +] + +[[package]] +name = "openai" +version = "1.108.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "distro" }, + { name = "httpx" }, + { name = "jiter" }, + { name = "pydantic" }, + { name = "sniffio" }, + { name = "tqdm" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/07/3c/3ea4c40c62d5f4b11690de13de35554d0d49b5e5780669fad5e83562d635/openai-1.108.0.tar.gz", hash = "sha256:e859c64e4202d7f5956f19280eee92bb281f211c41cdd5be9e63bf51a024ff72", size = 564659, upload-time = "2025-09-17T22:03:23.075Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/af/dc/0a007b7c5a079e13d66eecc5d521bbc67b53c135e2a3131160ef76b5db1f/openai-1.108.0-py3-none-any.whl", hash = "sha256:31f2e58230e2703f13ddbb50c285f39dacf7fca64ab19882fd8a7a0b2bccd781", size = 948114, upload-time = "2025-09-17T22:03:20.972Z" }, +] + +[[package]] +name = "openapi-core" +version = "0.19.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "isodate" }, + { name = "jsonschema" }, + { name = "jsonschema-path" }, + { name = "more-itertools" }, + { name = "openapi-schema-validator" }, + { name = "openapi-spec-validator" }, + { name = "parse" }, + { name = "typing-extensions" }, + { name = "werkzeug" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b1/35/1acaa5f2fcc6e54eded34a2ec74b479439c4e469fc4e8d0e803fda0234db/openapi_core-0.19.5.tar.gz", hash = "sha256:421e753da56c391704454e66afe4803a290108590ac8fa6f4a4487f4ec11f2d3", size = 103264, upload-time = "2025-03-20T20:17:28.193Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/27/6f/83ead0e2e30a90445ee4fc0135f43741aebc30cca5b43f20968b603e30b6/openapi_core-0.19.5-py3-none-any.whl", hash = "sha256:ef7210e83a59394f46ce282639d8d26ad6fc8094aa904c9c16eb1bac8908911f", size = 106595, upload-time = "2025-03-20T20:17:26.77Z" }, +] + +[[package]] +name = "openapi-pydantic" +version = "0.5.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pydantic" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/02/2e/58d83848dd1a79cb92ed8e63f6ba901ca282c5f09d04af9423ec26c56fd7/openapi_pydantic-0.5.1.tar.gz", hash = "sha256:ff6835af6bde7a459fb93eb93bb92b8749b754fc6e51b2f1590a19dc3005ee0d", size = 60892, upload-time = "2025-01-08T19:29:27.083Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/12/cf/03675d8bd8ecbf4445504d8071adab19f5f993676795708e36402ab38263/openapi_pydantic-0.5.1-py3-none-any.whl", hash = "sha256:a3a09ef4586f5bd760a8df7f43028b60cafb6d9f61de2acba9574766255ab146", size = 96381, upload-time = "2025-01-08T19:29:25.275Z" }, +] + +[[package]] +name = "openapi-schema-validator" +version = "0.6.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "jsonschema" }, + { name = "jsonschema-specifications" }, + { name = "rfc3339-validator" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/8b/f3/5507ad3325169347cd8ced61c232ff3df70e2b250c49f0fe140edb4973c6/openapi_schema_validator-0.6.3.tar.gz", hash = 
"sha256:f37bace4fc2a5d96692f4f8b31dc0f8d7400fd04f3a937798eaf880d425de6ee", size = 11550, upload-time = "2025-01-10T18:08:22.268Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/21/c6/ad0fba32775ae749016829dace42ed80f4407b171da41313d1a3a5f102e4/openapi_schema_validator-0.6.3-py3-none-any.whl", hash = "sha256:f3b9870f4e556b5a62a1c39da72a6b4b16f3ad9c73dc80084b1b11e74ba148a3", size = 8755, upload-time = "2025-01-10T18:08:19.758Z" }, +] + +[[package]] +name = "openapi-spec-validator" +version = "0.7.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "jsonschema" }, + { name = "jsonschema-path" }, + { name = "lazy-object-proxy" }, + { name = "openapi-schema-validator" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/82/af/fe2d7618d6eae6fb3a82766a44ed87cd8d6d82b4564ed1c7cfb0f6378e91/openapi_spec_validator-0.7.2.tar.gz", hash = "sha256:cc029309b5c5dbc7859df0372d55e9d1ff43e96d678b9ba087f7c56fc586f734", size = 36855, upload-time = "2025-06-07T14:48:56.299Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/27/dd/b3fd642260cb17532f66cc1e8250f3507d1e580483e209dc1e9d13bd980d/openapi_spec_validator-0.7.2-py3-none-any.whl", hash = "sha256:4bbdc0894ec85f1d1bea1d6d9c8b2c3c8d7ccaa13577ef40da9c006c9fd0eb60", size = 39713, upload-time = "2025-06-07T14:48:54.077Z" }, +] + +[[package]] +name = "opentelemetry-api" +version = "1.37.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "importlib-metadata" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/63/04/05040d7ce33a907a2a02257e601992f0cdf11c73b33f13c4492bf6c3d6d5/opentelemetry_api-1.37.0.tar.gz", hash = "sha256:540735b120355bd5112738ea53621f8d5edb35ebcd6fe21ada3ab1c61d1cd9a7", size = 64923, upload-time = "2025-09-11T10:29:01.662Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/91/48/28ed9e55dcf2f453128df738210a980e09f4e468a456fa3c763dbc8be70a/opentelemetry_api-1.37.0-py3-none-any.whl", hash = "sha256:accf2024d3e89faec14302213bc39550ec0f4095d1cf5ca688e1bfb1c8612f47", size = 65732, upload-time = "2025-09-11T10:28:41.826Z" }, +] + +[[package]] +name = "opentelemetry-exporter-gcp-trace" +version = "1.9.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-cloud-trace" }, + { name = "opentelemetry-api" }, + { name = "opentelemetry-resourcedetector-gcp" }, + { name = "opentelemetry-sdk" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c3/15/7556d54b01fb894497f69a98d57faa9caa45ffa59896e0bba6847a7f0d15/opentelemetry_exporter_gcp_trace-1.9.0.tar.gz", hash = "sha256:c3fc090342f6ee32a0cc41a5716a6bb716b4422d19facefcb22dc4c6b683ece8", size = 18568, upload-time = "2025-02-04T19:45:08.185Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c0/cd/6d7fbad05771eb3c2bace20f6360ce5dac5ca751c6f2122853e43830c32e/opentelemetry_exporter_gcp_trace-1.9.0-py3-none-any.whl", hash = "sha256:0a8396e8b39f636eeddc3f0ae08ddb40c40f288bc8c5544727c3581545e77254", size = 13973, upload-time = "2025-02-04T19:44:59.148Z" }, +] + +[[package]] +name = "opentelemetry-exporter-otlp-proto-common" +version = "1.37.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "opentelemetry-proto" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/dc/6c/10018cbcc1e6fff23aac67d7fd977c3d692dbe5f9ef9bb4db5c1268726cc/opentelemetry_exporter_otlp_proto_common-1.37.0.tar.gz", hash = 
"sha256:c87a1bdd9f41fdc408d9cc9367bb53f8d2602829659f2b90be9f9d79d0bfe62c", size = 20430, upload-time = "2025-09-11T10:29:03.605Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/08/13/b4ef09837409a777f3c0af2a5b4ba9b7af34872bc43609dda0c209e4060d/opentelemetry_exporter_otlp_proto_common-1.37.0-py3-none-any.whl", hash = "sha256:53038428449c559b0c564b8d718df3314da387109c4d36bd1b94c9a641b0292e", size = 18359, upload-time = "2025-09-11T10:28:44.939Z" }, +] + +[[package]] +name = "opentelemetry-exporter-otlp-proto-http" +version = "1.37.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "googleapis-common-protos" }, + { name = "opentelemetry-api" }, + { name = "opentelemetry-exporter-otlp-proto-common" }, + { name = "opentelemetry-proto" }, + { name = "opentelemetry-sdk" }, + { name = "requests" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5d/e3/6e320aeb24f951449e73867e53c55542bebbaf24faeee7623ef677d66736/opentelemetry_exporter_otlp_proto_http-1.37.0.tar.gz", hash = "sha256:e52e8600f1720d6de298419a802108a8f5afa63c96809ff83becb03f874e44ac", size = 17281, upload-time = "2025-09-11T10:29:04.844Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e9/e9/70d74a664d83976556cec395d6bfedd9b85ec1498b778367d5f93e373397/opentelemetry_exporter_otlp_proto_http-1.37.0-py3-none-any.whl", hash = "sha256:54c42b39945a6cc9d9a2a33decb876eabb9547e0dcb49df090122773447f1aef", size = 19576, upload-time = "2025-09-11T10:28:46.726Z" }, +] + +[[package]] +name = "opentelemetry-instrumentation" +version = "0.58b0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "opentelemetry-api" }, + { name = "opentelemetry-semantic-conventions" }, + { name = "packaging" }, + { name = "wrapt" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f6/36/7c307d9be8ce4ee7beb86d7f1d31027f2a6a89228240405a858d6e4d64f9/opentelemetry_instrumentation-0.58b0.tar.gz", hash = "sha256:df640f3ac715a3e05af145c18f527f4422c6ab6c467e40bd24d2ad75a00cb705", size = 31549, upload-time = "2025-09-11T11:42:14.084Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d4/db/5ff1cd6c5ca1d12ecf1b73be16fbb2a8af2114ee46d4b0e6d4b23f4f4db7/opentelemetry_instrumentation-0.58b0-py3-none-any.whl", hash = "sha256:50f97ac03100676c9f7fc28197f8240c7290ca1baa12da8bfbb9a1de4f34cc45", size = 33019, upload-time = "2025-09-11T11:41:00.624Z" }, +] + +[[package]] +name = "opentelemetry-proto" +version = "1.37.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/dd/ea/a75f36b463a36f3c5a10c0b5292c58b31dbdde74f6f905d3d0ab2313987b/opentelemetry_proto-1.37.0.tar.gz", hash = "sha256:30f5c494faf66f77faeaefa35ed4443c5edb3b0aa46dad073ed7210e1a789538", size = 46151, upload-time = "2025-09-11T10:29:11.04Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c4/25/f89ea66c59bd7687e218361826c969443c4fa15dfe89733f3bf1e2a9e971/opentelemetry_proto-1.37.0-py3-none-any.whl", hash = "sha256:8ed8c066ae8828bbf0c39229979bdf583a126981142378a9cbe9d6fd5701c6e2", size = 72534, upload-time = "2025-09-11T10:28:56.831Z" }, +] + +[[package]] +name = "opentelemetry-resourcedetector-gcp" +version = "1.9.0a0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "opentelemetry-api" }, + { name = "opentelemetry-sdk" }, + { name = "requests" }, + { name = "typing-extensions" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/e1/86/f0693998817779802525a5bcc885a3cdb68d05b636bc6faae5c9ade4bee4/opentelemetry_resourcedetector_gcp-1.9.0a0.tar.gz", hash = "sha256:6860a6649d1e3b9b7b7f09f3918cc16b72aa0c0c590d2a72ea6e42b67c9a42e7", size = 20730, upload-time = "2025-02-04T19:45:10.693Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/12/04/7e33228c88422a5518e1774a836c9ec68f10f51bde0f1d5dd5f3054e612a/opentelemetry_resourcedetector_gcp-1.9.0a0-py3-none-any.whl", hash = "sha256:4e5a0822b0f0d7647b7ceb282d7aa921dd7f45466540bd0a24f954f90db8fde8", size = 20378, upload-time = "2025-02-04T19:45:03.898Z" }, +] + +[[package]] +name = "opentelemetry-sdk" +version = "1.37.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "opentelemetry-api" }, + { name = "opentelemetry-semantic-conventions" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f4/62/2e0ca80d7fe94f0b193135375da92c640d15fe81f636658d2acf373086bc/opentelemetry_sdk-1.37.0.tar.gz", hash = "sha256:cc8e089c10953ded765b5ab5669b198bbe0af1b3f89f1007d19acd32dc46dda5", size = 170404, upload-time = "2025-09-11T10:29:11.779Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9f/62/9f4ad6a54126fb00f7ed4bb5034964c6e4f00fcd5a905e115bd22707e20d/opentelemetry_sdk-1.37.0-py3-none-any.whl", hash = "sha256:8f3c3c22063e52475c5dbced7209495c2c16723d016d39287dfc215d1771257c", size = 131941, upload-time = "2025-09-11T10:28:57.83Z" }, +] + +[[package]] +name = "opentelemetry-semantic-conventions" +version = "0.58b0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "opentelemetry-api" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/aa/1b/90701d91e6300d9f2fb352153fb1721ed99ed1f6ea14fa992c756016e63a/opentelemetry_semantic_conventions-0.58b0.tar.gz", hash = "sha256:6bd46f51264279c433755767bb44ad00f1c9e2367e1b42af563372c5a6fa0c25", size = 129867, upload-time = "2025-09-11T10:29:12.597Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/07/90/68152b7465f50285d3ce2481b3aec2f82822e3f52e5152eeeaf516bab841/opentelemetry_semantic_conventions-0.58b0-py3-none-any.whl", hash = "sha256:5564905ab1458b96684db1340232729fce3b5375a06e140e8904c78e4f815b28", size = 207954, upload-time = "2025-09-11T10:28:59.218Z" }, +] + +[[package]] +name = "ordered-set" +version = "4.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/4c/ca/bfac8bc689799bcca4157e0e0ced07e70ce125193fc2e166d2e685b7e2fe/ordered-set-4.1.0.tar.gz", hash = "sha256:694a8e44c87657c59292ede72891eb91d34131f6531463aab3009191c77364a8", size = 12826, upload-time = "2022-01-26T14:38:56.6Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/33/55/af02708f230eb77084a299d7b08175cff006dea4f2721074b92cdb0296c0/ordered_set-4.1.0-py3-none-any.whl", hash = "sha256:046e1132c71fcf3330438a539928932caf51ddbc582496833e23de611de14562", size = 7634, upload-time = "2022-01-26T14:38:48.677Z" }, +] + +[[package]] +name = "orjson" +version = "3.11.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/be/4d/8df5f83256a809c22c4d6792ce8d43bb503be0fb7a8e4da9025754b09658/orjson-3.11.3.tar.gz", hash = "sha256:1c0603b1d2ffcd43a411d64797a19556ef76958aef1c182f22dc30860152a98a", size = 5482394, upload-time = "2025-08-26T17:46:43.171Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/cd/8b/360674cd817faef32e49276187922a946468579fcaf37afdfb6c07046e92/orjson-3.11.3-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:9d2ae0cc6aeb669633e0124531f342a17d8e97ea999e42f12a5ad4adaa304c5f", size = 238238, upload-time = "2025-08-26T17:44:54.214Z" }, + { url = "https://files.pythonhosted.org/packages/05/3d/5fa9ea4b34c1a13be7d9046ba98d06e6feb1d8853718992954ab59d16625/orjson-3.11.3-cp311-cp311-macosx_15_0_arm64.whl", hash = "sha256:ba21dbb2493e9c653eaffdc38819b004b7b1b246fb77bfc93dc016fe664eac91", size = 127713, upload-time = "2025-08-26T17:44:55.596Z" }, + { url = "https://files.pythonhosted.org/packages/e5/5f/e18367823925e00b1feec867ff5f040055892fc474bf5f7875649ecfa586/orjson-3.11.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:00f1a271e56d511d1569937c0447d7dce5a99a33ea0dec76673706360a051904", size = 123241, upload-time = "2025-08-26T17:44:57.185Z" }, + { url = "https://files.pythonhosted.org/packages/0f/bd/3c66b91c4564759cf9f473251ac1650e446c7ba92a7c0f9f56ed54f9f0e6/orjson-3.11.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b67e71e47caa6680d1b6f075a396d04fa6ca8ca09aafb428731da9b3ea32a5a6", size = 127895, upload-time = "2025-08-26T17:44:58.349Z" }, + { url = "https://files.pythonhosted.org/packages/82/b5/dc8dcd609db4766e2967a85f63296c59d4722b39503e5b0bf7fd340d387f/orjson-3.11.3-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d7d012ebddffcce8c85734a6d9e5f08180cd3857c5f5a3ac70185b43775d043d", size = 130303, upload-time = "2025-08-26T17:44:59.491Z" }, + { url = "https://files.pythonhosted.org/packages/48/c2/d58ec5fd1270b2aa44c862171891adc2e1241bd7dab26c8f46eb97c6c6f1/orjson-3.11.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dd759f75d6b8d1b62012b7f5ef9461d03c804f94d539a5515b454ba3a6588038", size = 132366, upload-time = "2025-08-26T17:45:00.654Z" }, + { url = "https://files.pythonhosted.org/packages/73/87/0ef7e22eb8dd1ef940bfe3b9e441db519e692d62ed1aae365406a16d23d0/orjson-3.11.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6890ace0809627b0dff19cfad92d69d0fa3f089d3e359a2a532507bb6ba34efb", size = 135180, upload-time = "2025-08-26T17:45:02.424Z" }, + { url = "https://files.pythonhosted.org/packages/bb/6a/e5bf7b70883f374710ad74faf99bacfc4b5b5a7797c1d5e130350e0e28a3/orjson-3.11.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f9d4a5e041ae435b815e568537755773d05dac031fee6a57b4ba70897a44d9d2", size = 132741, upload-time = "2025-08-26T17:45:03.663Z" }, + { url = "https://files.pythonhosted.org/packages/bd/0c/4577fd860b6386ffaa56440e792af01c7882b56d2766f55384b5b0e9d39b/orjson-3.11.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:2d68bf97a771836687107abfca089743885fb664b90138d8761cce61d5625d55", size = 131104, upload-time = "2025-08-26T17:45:04.939Z" }, + { url = "https://files.pythonhosted.org/packages/66/4b/83e92b2d67e86d1c33f2ea9411742a714a26de63641b082bdbf3d8e481af/orjson-3.11.3-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:bfc27516ec46f4520b18ef645864cee168d2a027dbf32c5537cb1f3e3c22dac1", size = 403887, upload-time = "2025-08-26T17:45:06.228Z" }, + { url = "https://files.pythonhosted.org/packages/6d/e5/9eea6a14e9b5ceb4a271a1fd2e1dec5f2f686755c0fab6673dc6ff3433f4/orjson-3.11.3-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:f66b001332a017d7945e177e282a40b6997056394e3ed7ddb41fb1813b83e824", size = 145855, 
upload-time = "2025-08-26T17:45:08.338Z" }, + { url = "https://files.pythonhosted.org/packages/45/78/8d4f5ad0c80ba9bf8ac4d0fc71f93a7d0dc0844989e645e2074af376c307/orjson-3.11.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:212e67806525d2561efbfe9e799633b17eb668b8964abed6b5319b2f1cfbae1f", size = 135361, upload-time = "2025-08-26T17:45:09.625Z" }, + { url = "https://files.pythonhosted.org/packages/0b/5f/16386970370178d7a9b438517ea3d704efcf163d286422bae3b37b88dbb5/orjson-3.11.3-cp311-cp311-win32.whl", hash = "sha256:6e8e0c3b85575a32f2ffa59de455f85ce002b8bdc0662d6b9c2ed6d80ab5d204", size = 136190, upload-time = "2025-08-26T17:45:10.962Z" }, + { url = "https://files.pythonhosted.org/packages/09/60/db16c6f7a41dd8ac9fb651f66701ff2aeb499ad9ebc15853a26c7c152448/orjson-3.11.3-cp311-cp311-win_amd64.whl", hash = "sha256:6be2f1b5d3dc99a5ce5ce162fc741c22ba9f3443d3dd586e6a1211b7bc87bc7b", size = 131389, upload-time = "2025-08-26T17:45:12.285Z" }, + { url = "https://files.pythonhosted.org/packages/3e/2a/bb811ad336667041dea9b8565c7c9faf2f59b47eb5ab680315eea612ef2e/orjson-3.11.3-cp311-cp311-win_arm64.whl", hash = "sha256:fafb1a99d740523d964b15c8db4eabbfc86ff29f84898262bf6e3e4c9e97e43e", size = 126120, upload-time = "2025-08-26T17:45:13.515Z" }, + { url = "https://files.pythonhosted.org/packages/3d/b0/a7edab2a00cdcb2688e1c943401cb3236323e7bfd2839815c6131a3742f4/orjson-3.11.3-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:8c752089db84333e36d754c4baf19c0e1437012242048439c7e80eb0e6426e3b", size = 238259, upload-time = "2025-08-26T17:45:15.093Z" }, + { url = "https://files.pythonhosted.org/packages/e1/c6/ff4865a9cc398a07a83342713b5932e4dc3cb4bf4bc04e8f83dedfc0d736/orjson-3.11.3-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:9b8761b6cf04a856eb544acdd82fc594b978f12ac3602d6374a7edb9d86fd2c2", size = 127633, upload-time = "2025-08-26T17:45:16.417Z" }, + { url = "https://files.pythonhosted.org/packages/6e/e6/e00bea2d9472f44fe8794f523e548ce0ad51eb9693cf538a753a27b8bda4/orjson-3.11.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b13974dc8ac6ba22feaa867fc19135a3e01a134b4f7c9c28162fed4d615008a", size = 123061, upload-time = "2025-08-26T17:45:17.673Z" }, + { url = "https://files.pythonhosted.org/packages/54/31/9fbb78b8e1eb3ac605467cb846e1c08d0588506028b37f4ee21f978a51d4/orjson-3.11.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f83abab5bacb76d9c821fd5c07728ff224ed0e52d7a71b7b3de822f3df04e15c", size = 127956, upload-time = "2025-08-26T17:45:19.172Z" }, + { url = "https://files.pythonhosted.org/packages/36/88/b0604c22af1eed9f98d709a96302006915cfd724a7ebd27d6dd11c22d80b/orjson-3.11.3-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e6fbaf48a744b94091a56c62897b27c31ee2da93d826aa5b207131a1e13d4064", size = 130790, upload-time = "2025-08-26T17:45:20.586Z" }, + { url = "https://files.pythonhosted.org/packages/0e/9d/1c1238ae9fffbfed51ba1e507731b3faaf6b846126a47e9649222b0fd06f/orjson-3.11.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bc779b4f4bba2847d0d2940081a7b6f7b5877e05408ffbb74fa1faf4a136c424", size = 132385, upload-time = "2025-08-26T17:45:22.036Z" }, + { url = "https://files.pythonhosted.org/packages/a3/b5/c06f1b090a1c875f337e21dd71943bc9d84087f7cdf8c6e9086902c34e42/orjson-3.11.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd4b909ce4c50faa2192da6bb684d9848d4510b736b0611b6ab4020ea6fd2d23", size = 135305, upload-time = 
"2025-08-26T17:45:23.4Z" }, + { url = "https://files.pythonhosted.org/packages/a0/26/5f028c7d81ad2ebbf84414ba6d6c9cac03f22f5cd0d01eb40fb2d6a06b07/orjson-3.11.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:524b765ad888dc5518bbce12c77c2e83dee1ed6b0992c1790cc5fb49bb4b6667", size = 132875, upload-time = "2025-08-26T17:45:25.182Z" }, + { url = "https://files.pythonhosted.org/packages/fe/d4/b8df70d9cfb56e385bf39b4e915298f9ae6c61454c8154a0f5fd7efcd42e/orjson-3.11.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:84fd82870b97ae3cdcea9d8746e592b6d40e1e4d4527835fc520c588d2ded04f", size = 130940, upload-time = "2025-08-26T17:45:27.209Z" }, + { url = "https://files.pythonhosted.org/packages/da/5e/afe6a052ebc1a4741c792dd96e9f65bf3939d2094e8b356503b68d48f9f5/orjson-3.11.3-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:fbecb9709111be913ae6879b07bafd4b0785b44c1eb5cac8ac76da048b3885a1", size = 403852, upload-time = "2025-08-26T17:45:28.478Z" }, + { url = "https://files.pythonhosted.org/packages/f8/90/7bbabafeb2ce65915e9247f14a56b29c9334003536009ef5b122783fe67e/orjson-3.11.3-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:9dba358d55aee552bd868de348f4736ca5a4086d9a62e2bfbbeeb5629fe8b0cc", size = 146293, upload-time = "2025-08-26T17:45:29.86Z" }, + { url = "https://files.pythonhosted.org/packages/27/b3/2d703946447da8b093350570644a663df69448c9d9330e5f1d9cce997f20/orjson-3.11.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:eabcf2e84f1d7105f84580e03012270c7e97ecb1fb1618bda395061b2a84a049", size = 135470, upload-time = "2025-08-26T17:45:31.243Z" }, + { url = "https://files.pythonhosted.org/packages/38/70/b14dcfae7aff0e379b0119c8a812f8396678919c431efccc8e8a0263e4d9/orjson-3.11.3-cp312-cp312-win32.whl", hash = "sha256:3782d2c60b8116772aea8d9b7905221437fdf53e7277282e8d8b07c220f96cca", size = 136248, upload-time = "2025-08-26T17:45:32.567Z" }, + { url = "https://files.pythonhosted.org/packages/35/b8/9e3127d65de7fff243f7f3e53f59a531bf6bb295ebe5db024c2503cc0726/orjson-3.11.3-cp312-cp312-win_amd64.whl", hash = "sha256:79b44319268af2eaa3e315b92298de9a0067ade6e6003ddaef72f8e0bedb94f1", size = 131437, upload-time = "2025-08-26T17:45:34.949Z" }, + { url = "https://files.pythonhosted.org/packages/51/92/a946e737d4d8a7fd84a606aba96220043dcc7d6988b9e7551f7f6d5ba5ad/orjson-3.11.3-cp312-cp312-win_arm64.whl", hash = "sha256:0e92a4e83341ef79d835ca21b8bd13e27c859e4e9e4d7b63defc6e58462a3710", size = 125978, upload-time = "2025-08-26T17:45:36.422Z" }, + { url = "https://files.pythonhosted.org/packages/fc/79/8932b27293ad35919571f77cb3693b5906cf14f206ef17546052a241fdf6/orjson-3.11.3-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:af40c6612fd2a4b00de648aa26d18186cd1322330bd3a3cc52f87c699e995810", size = 238127, upload-time = "2025-08-26T17:45:38.146Z" }, + { url = "https://files.pythonhosted.org/packages/1c/82/cb93cd8cf132cd7643b30b6c5a56a26c4e780c7a145db6f83de977b540ce/orjson-3.11.3-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:9f1587f26c235894c09e8b5b7636a38091a9e6e7fe4531937534749c04face43", size = 127494, upload-time = "2025-08-26T17:45:39.57Z" }, + { url = "https://files.pythonhosted.org/packages/a4/b8/2d9eb181a9b6bb71463a78882bcac1027fd29cf62c38a40cc02fc11d3495/orjson-3.11.3-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:61dcdad16da5bb486d7227a37a2e789c429397793a6955227cedbd7252eb5a27", size = 123017, upload-time = "2025-08-26T17:45:40.876Z" }, + { url = 
"https://files.pythonhosted.org/packages/b4/14/a0e971e72d03b509190232356d54c0f34507a05050bd026b8db2bf2c192c/orjson-3.11.3-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:11c6d71478e2cbea0a709e8a06365fa63da81da6498a53e4c4f065881d21ae8f", size = 127898, upload-time = "2025-08-26T17:45:42.188Z" }, + { url = "https://files.pythonhosted.org/packages/8e/af/dc74536722b03d65e17042cc30ae586161093e5b1f29bccda24765a6ae47/orjson-3.11.3-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ff94112e0098470b665cb0ed06efb187154b63649403b8d5e9aedeb482b4548c", size = 130742, upload-time = "2025-08-26T17:45:43.511Z" }, + { url = "https://files.pythonhosted.org/packages/62/e6/7a3b63b6677bce089fe939353cda24a7679825c43a24e49f757805fc0d8a/orjson-3.11.3-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae8b756575aaa2a855a75192f356bbda11a89169830e1439cfb1a3e1a6dde7be", size = 132377, upload-time = "2025-08-26T17:45:45.525Z" }, + { url = "https://files.pythonhosted.org/packages/fc/cd/ce2ab93e2e7eaf518f0fd15e3068b8c43216c8a44ed82ac2b79ce5cef72d/orjson-3.11.3-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c9416cc19a349c167ef76135b2fe40d03cea93680428efee8771f3e9fb66079d", size = 135313, upload-time = "2025-08-26T17:45:46.821Z" }, + { url = "https://files.pythonhosted.org/packages/d0/b4/f98355eff0bd1a38454209bbc73372ce351ba29933cb3e2eba16c04b9448/orjson-3.11.3-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b822caf5b9752bc6f246eb08124c3d12bf2175b66ab74bac2ef3bbf9221ce1b2", size = 132908, upload-time = "2025-08-26T17:45:48.126Z" }, + { url = "https://files.pythonhosted.org/packages/eb/92/8f5182d7bc2a1bed46ed960b61a39af8389f0ad476120cd99e67182bfb6d/orjson-3.11.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:414f71e3bdd5573893bf5ecdf35c32b213ed20aa15536fe2f588f946c318824f", size = 130905, upload-time = "2025-08-26T17:45:49.414Z" }, + { url = "https://files.pythonhosted.org/packages/1a/60/c41ca753ce9ffe3d0f67b9b4c093bdd6e5fdb1bc53064f992f66bb99954d/orjson-3.11.3-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:828e3149ad8815dc14468f36ab2a4b819237c155ee1370341b91ea4c8672d2ee", size = 403812, upload-time = "2025-08-26T17:45:51.085Z" }, + { url = "https://files.pythonhosted.org/packages/dd/13/e4a4f16d71ce1868860db59092e78782c67082a8f1dc06a3788aef2b41bc/orjson-3.11.3-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ac9e05f25627ffc714c21f8dfe3a579445a5c392a9c8ae7ba1d0e9fb5333f56e", size = 146277, upload-time = "2025-08-26T17:45:52.851Z" }, + { url = "https://files.pythonhosted.org/packages/8d/8b/bafb7f0afef9344754a3a0597a12442f1b85a048b82108ef2c956f53babd/orjson-3.11.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e44fbe4000bd321d9f3b648ae46e0196d21577cf66ae684a96ff90b1f7c93633", size = 135418, upload-time = "2025-08-26T17:45:54.806Z" }, + { url = "https://files.pythonhosted.org/packages/60/d4/bae8e4f26afb2c23bea69d2f6d566132584d1c3a5fe89ee8c17b718cab67/orjson-3.11.3-cp313-cp313-win32.whl", hash = "sha256:2039b7847ba3eec1f5886e75e6763a16e18c68a63efc4b029ddf994821e2e66b", size = 136216, upload-time = "2025-08-26T17:45:57.182Z" }, + { url = "https://files.pythonhosted.org/packages/88/76/224985d9f127e121c8cad882cea55f0ebe39f97925de040b75ccd4b33999/orjson-3.11.3-cp313-cp313-win_amd64.whl", hash = "sha256:29be5ac4164aa8bdcba5fa0700a3c9c316b411d8ed9d39ef8a882541bd452fae", size = 131362, upload-time = "2025-08-26T17:45:58.56Z" }, + { url = 
"https://files.pythonhosted.org/packages/e2/cf/0dce7a0be94bd36d1346be5067ed65ded6adb795fdbe3abd234c8d576d01/orjson-3.11.3-cp313-cp313-win_arm64.whl", hash = "sha256:18bd1435cb1f2857ceb59cfb7de6f92593ef7b831ccd1b9bfb28ca530e539dce", size = 125989, upload-time = "2025-08-26T17:45:59.95Z" }, + { url = "https://files.pythonhosted.org/packages/ef/77/d3b1fef1fc6aaeed4cbf3be2b480114035f4df8fa1a99d2dac1d40d6e924/orjson-3.11.3-cp314-cp314-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:cf4b81227ec86935568c7edd78352a92e97af8da7bd70bdfdaa0d2e0011a1ab4", size = 238115, upload-time = "2025-08-26T17:46:01.669Z" }, + { url = "https://files.pythonhosted.org/packages/e4/6d/468d21d49bb12f900052edcfbf52c292022d0a323d7828dc6376e6319703/orjson-3.11.3-cp314-cp314-macosx_15_0_arm64.whl", hash = "sha256:bc8bc85b81b6ac9fc4dae393a8c159b817f4c2c9dee5d12b773bddb3b95fc07e", size = 127493, upload-time = "2025-08-26T17:46:03.466Z" }, + { url = "https://files.pythonhosted.org/packages/67/46/1e2588700d354aacdf9e12cc2d98131fb8ac6f31ca65997bef3863edb8ff/orjson-3.11.3-cp314-cp314-manylinux_2_34_aarch64.whl", hash = "sha256:88dcfc514cfd1b0de038443c7b3e6a9797ffb1b3674ef1fd14f701a13397f82d", size = 122998, upload-time = "2025-08-26T17:46:04.803Z" }, + { url = "https://files.pythonhosted.org/packages/3b/94/11137c9b6adb3779f1b34fd98be51608a14b430dbc02c6d41134fbba484c/orjson-3.11.3-cp314-cp314-manylinux_2_34_x86_64.whl", hash = "sha256:d61cd543d69715d5fc0a690c7c6f8dcc307bc23abef9738957981885f5f38229", size = 132915, upload-time = "2025-08-26T17:46:06.237Z" }, + { url = "https://files.pythonhosted.org/packages/10/61/dccedcf9e9bcaac09fdabe9eaee0311ca92115699500efbd31950d878833/orjson-3.11.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:2b7b153ed90ababadbef5c3eb39549f9476890d339cf47af563aea7e07db2451", size = 130907, upload-time = "2025-08-26T17:46:07.581Z" }, + { url = "https://files.pythonhosted.org/packages/0e/fd/0e935539aa7b08b3ca0f817d73034f7eb506792aae5ecc3b7c6e679cdf5f/orjson-3.11.3-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:7909ae2460f5f494fecbcd10613beafe40381fd0316e35d6acb5f3a05bfda167", size = 403852, upload-time = "2025-08-26T17:46:08.982Z" }, + { url = "https://files.pythonhosted.org/packages/4a/2b/50ae1a5505cd1043379132fdb2adb8a05f37b3e1ebffe94a5073321966fd/orjson-3.11.3-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:2030c01cbf77bc67bee7eef1e7e31ecf28649353987775e3583062c752da0077", size = 146309, upload-time = "2025-08-26T17:46:10.576Z" }, + { url = "https://files.pythonhosted.org/packages/cd/1d/a473c158e380ef6f32753b5f39a69028b25ec5be331c2049a2201bde2e19/orjson-3.11.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:a0169ebd1cbd94b26c7a7ad282cf5c2744fce054133f959e02eb5265deae1872", size = 135424, upload-time = "2025-08-26T17:46:12.386Z" }, + { url = "https://files.pythonhosted.org/packages/da/09/17d9d2b60592890ff7382e591aa1d9afb202a266b180c3d4049b1ec70e4a/orjson-3.11.3-cp314-cp314-win32.whl", hash = "sha256:0c6d7328c200c349e3a4c6d8c83e0a5ad029bdc2d417f234152bf34842d0fc8d", size = 136266, upload-time = "2025-08-26T17:46:13.853Z" }, + { url = "https://files.pythonhosted.org/packages/15/58/358f6846410a6b4958b74734727e582ed971e13d335d6c7ce3e47730493e/orjson-3.11.3-cp314-cp314-win_amd64.whl", hash = "sha256:317bbe2c069bbc757b1a2e4105b64aacd3bc78279b66a6b9e51e846e4809f804", size = 131351, upload-time = "2025-08-26T17:46:15.27Z" }, + { url = 
"https://files.pythonhosted.org/packages/28/01/d6b274a0635be0468d4dbd9cafe80c47105937a0d42434e805e67cd2ed8b/orjson-3.11.3-cp314-cp314-win_arm64.whl", hash = "sha256:e8f6a7a27d7b7bec81bd5924163e9af03d49bbb63013f107b48eb5d16db711bc", size = 125985, upload-time = "2025-08-26T17:46:16.67Z" }, +] + +[[package]] +name = "overrides" +version = "7.7.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/36/86/b585f53236dec60aba864e050778b25045f857e17f6e5ea0ae95fe80edd2/overrides-7.7.0.tar.gz", hash = "sha256:55158fa3d93b98cc75299b1e67078ad9003ca27945c76162c1c0766d6f91820a", size = 22812, upload-time = "2024-01-27T21:01:33.423Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2c/ab/fc8290c6a4c722e5514d80f62b2dc4c4df1a68a41d1364e625c35990fcf3/overrides-7.7.0-py3-none-any.whl", hash = "sha256:c7ed9d062f78b8e4c1a7b70bd8796b35ead4d9f510227ef9c5dc7626c60d7e49", size = 17832, upload-time = "2024-01-27T21:01:31.393Z" }, +] + +[[package]] +name = "packaging" +version = "24.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950, upload-time = "2024-11-08T09:47:47.202Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451, upload-time = "2024-11-08T09:47:44.722Z" }, +] + +[[package]] +name = "pandas" +version = "2.3.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "numpy" }, + { name = "python-dateutil" }, + { name = "pytz" }, + { name = "tzdata" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/79/8e/0e90233ac205ad182bd6b422532695d2b9414944a280488105d598c70023/pandas-2.3.2.tar.gz", hash = "sha256:ab7b58f8f82706890924ccdfb5f48002b83d2b5a3845976a9fb705d36c34dcdb", size = 4488684, upload-time = "2025-08-21T10:28:29.257Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7a/59/f3e010879f118c2d400902d2d871c2226cef29b08c09fb8dc41111730400/pandas-2.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1333e9c299adcbb68ee89a9bb568fc3f20f9cbb419f1dd5225071e6cddb2a743", size = 11563308, upload-time = "2025-08-21T10:26:56.656Z" }, + { url = "https://files.pythonhosted.org/packages/38/18/48f10f1cc5c397af59571d638d211f494dba481f449c19adbd282aa8f4ca/pandas-2.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:76972bcbd7de8e91ad5f0ca884a9f2c477a2125354af624e022c49e5bd0dfff4", size = 10820319, upload-time = "2025-08-21T10:26:59.162Z" }, + { url = "https://files.pythonhosted.org/packages/95/3b/1e9b69632898b048e223834cd9702052bcf06b15e1ae716eda3196fb972e/pandas-2.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b98bdd7c456a05eef7cd21fd6b29e3ca243591fe531c62be94a2cc987efb5ac2", size = 11790097, upload-time = "2025-08-21T10:27:02.204Z" }, + { url = "https://files.pythonhosted.org/packages/8b/ef/0e2ffb30b1f7fbc9a588bd01e3c14a0d96854d09a887e15e30cc19961227/pandas-2.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1d81573b3f7db40d020983f78721e9bfc425f411e616ef019a10ebf597aedb2e", size = 12397958, upload-time = "2025-08-21T10:27:05.409Z" }, + { url = 
"https://files.pythonhosted.org/packages/23/82/e6b85f0d92e9afb0e7f705a51d1399b79c7380c19687bfbf3d2837743249/pandas-2.3.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e190b738675a73b581736cc8ec71ae113d6c3768d0bd18bffa5b9a0927b0b6ea", size = 13225600, upload-time = "2025-08-21T10:27:07.791Z" }, + { url = "https://files.pythonhosted.org/packages/e8/f1/f682015893d9ed51611948bd83683670842286a8edd4f68c2c1c3b231eef/pandas-2.3.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:c253828cb08f47488d60f43c5fc95114c771bbfff085da54bfc79cb4f9e3a372", size = 13879433, upload-time = "2025-08-21T10:27:10.347Z" }, + { url = "https://files.pythonhosted.org/packages/a7/e7/ae86261695b6c8a36d6a4c8d5f9b9ede8248510d689a2f379a18354b37d7/pandas-2.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:9467697b8083f9667b212633ad6aa4ab32436dcbaf4cd57325debb0ddef2012f", size = 11336557, upload-time = "2025-08-21T10:27:12.983Z" }, + { url = "https://files.pythonhosted.org/packages/ec/db/614c20fb7a85a14828edd23f1c02db58a30abf3ce76f38806155d160313c/pandas-2.3.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:3fbb977f802156e7a3f829e9d1d5398f6192375a3e2d1a9ee0803e35fe70a2b9", size = 11587652, upload-time = "2025-08-21T10:27:15.888Z" }, + { url = "https://files.pythonhosted.org/packages/99/b0/756e52f6582cade5e746f19bad0517ff27ba9c73404607c0306585c201b3/pandas-2.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1b9b52693123dd234b7c985c68b709b0b009f4521000d0525f2b95c22f15944b", size = 10717686, upload-time = "2025-08-21T10:27:18.486Z" }, + { url = "https://files.pythonhosted.org/packages/37/4c/dd5ccc1e357abfeee8353123282de17997f90ff67855f86154e5a13b81e5/pandas-2.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0bd281310d4f412733f319a5bc552f86d62cddc5f51d2e392c8787335c994175", size = 11278722, upload-time = "2025-08-21T10:27:21.149Z" }, + { url = "https://files.pythonhosted.org/packages/d3/a4/f7edcfa47e0a88cda0be8b068a5bae710bf264f867edfdf7b71584ace362/pandas-2.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:96d31a6b4354e3b9b8a2c848af75d31da390657e3ac6f30c05c82068b9ed79b9", size = 11987803, upload-time = "2025-08-21T10:27:23.767Z" }, + { url = "https://files.pythonhosted.org/packages/f6/61/1bce4129f93ab66f1c68b7ed1c12bac6a70b1b56c5dab359c6bbcd480b52/pandas-2.3.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:df4df0b9d02bb873a106971bb85d448378ef14b86ba96f035f50bbd3688456b4", size = 12766345, upload-time = "2025-08-21T10:27:26.6Z" }, + { url = "https://files.pythonhosted.org/packages/8e/46/80d53de70fee835531da3a1dae827a1e76e77a43ad22a8cd0f8142b61587/pandas-2.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:213a5adf93d020b74327cb2c1b842884dbdd37f895f42dcc2f09d451d949f811", size = 13439314, upload-time = "2025-08-21T10:27:29.213Z" }, + { url = "https://files.pythonhosted.org/packages/28/30/8114832daff7489f179971dbc1d854109b7f4365a546e3ea75b6516cea95/pandas-2.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:8c13b81a9347eb8c7548f53fd9a4f08d4dfe996836543f805c987bafa03317ae", size = 10983326, upload-time = "2025-08-21T10:27:31.901Z" }, + { url = "https://files.pythonhosted.org/packages/27/64/a2f7bf678af502e16b472527735d168b22b7824e45a4d7e96a4fbb634b59/pandas-2.3.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:0c6ecbac99a354a051ef21c5307601093cb9e0f4b1855984a084bfec9302699e", size = 11531061, upload-time = "2025-08-21T10:27:34.647Z" }, + { url = 
"https://files.pythonhosted.org/packages/54/4c/c3d21b2b7769ef2f4c2b9299fcadd601efa6729f1357a8dbce8dd949ed70/pandas-2.3.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:c6f048aa0fd080d6a06cc7e7537c09b53be6642d330ac6f54a600c3ace857ee9", size = 10668666, upload-time = "2025-08-21T10:27:37.203Z" }, + { url = "https://files.pythonhosted.org/packages/50/e2/f775ba76ecfb3424d7f5862620841cf0edb592e9abd2d2a5387d305fe7a8/pandas-2.3.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0064187b80a5be6f2f9c9d6bdde29372468751dfa89f4211a3c5871854cfbf7a", size = 11332835, upload-time = "2025-08-21T10:27:40.188Z" }, + { url = "https://files.pythonhosted.org/packages/8f/52/0634adaace9be2d8cac9ef78f05c47f3a675882e068438b9d7ec7ef0c13f/pandas-2.3.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4ac8c320bded4718b298281339c1a50fb00a6ba78cb2a63521c39bec95b0209b", size = 12057211, upload-time = "2025-08-21T10:27:43.117Z" }, + { url = "https://files.pythonhosted.org/packages/0b/9d/2df913f14b2deb9c748975fdb2491da1a78773debb25abbc7cbc67c6b549/pandas-2.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:114c2fe4f4328cf98ce5716d1532f3ab79c5919f95a9cfee81d9140064a2e4d6", size = 12749277, upload-time = "2025-08-21T10:27:45.474Z" }, + { url = "https://files.pythonhosted.org/packages/87/af/da1a2417026bd14d98c236dba88e39837182459d29dcfcea510b2ac9e8a1/pandas-2.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:48fa91c4dfb3b2b9bfdb5c24cd3567575f4e13f9636810462ffed8925352be5a", size = 13415256, upload-time = "2025-08-21T10:27:49.885Z" }, + { url = "https://files.pythonhosted.org/packages/22/3c/f2af1ce8840ef648584a6156489636b5692c162771918aa95707c165ad2b/pandas-2.3.2-cp313-cp313-win_amd64.whl", hash = "sha256:12d039facec710f7ba305786837d0225a3444af7bbd9c15c32ca2d40d157ed8b", size = 10982579, upload-time = "2025-08-21T10:28:08.435Z" }, + { url = "https://files.pythonhosted.org/packages/f3/98/8df69c4097a6719e357dc249bf437b8efbde808038268e584421696cbddf/pandas-2.3.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:c624b615ce97864eb588779ed4046186f967374185c047070545253a52ab2d57", size = 12028163, upload-time = "2025-08-21T10:27:52.232Z" }, + { url = "https://files.pythonhosted.org/packages/0e/23/f95cbcbea319f349e10ff90db488b905c6883f03cbabd34f6b03cbc3c044/pandas-2.3.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:0cee69d583b9b128823d9514171cabb6861e09409af805b54459bd0c821a35c2", size = 11391860, upload-time = "2025-08-21T10:27:54.673Z" }, + { url = "https://files.pythonhosted.org/packages/ad/1b/6a984e98c4abee22058aa75bfb8eb90dce58cf8d7296f8bc56c14bc330b0/pandas-2.3.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2319656ed81124982900b4c37f0e0c58c015af9a7bbc62342ba5ad07ace82ba9", size = 11309830, upload-time = "2025-08-21T10:27:56.957Z" }, + { url = "https://files.pythonhosted.org/packages/15/d5/f0486090eb18dd8710bf60afeaf638ba6817047c0c8ae5c6a25598665609/pandas-2.3.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b37205ad6f00d52f16b6d09f406434ba928c1a1966e2771006a9033c736d30d2", size = 11883216, upload-time = "2025-08-21T10:27:59.302Z" }, + { url = "https://files.pythonhosted.org/packages/10/86/692050c119696da19e20245bbd650d8dfca6ceb577da027c3a73c62a047e/pandas-2.3.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:837248b4fc3a9b83b9c6214699a13f069dc13510a6a6d7f9ba33145d2841a012", size = 12699743, upload-time = "2025-08-21T10:28:02.447Z" }, + { url = 
"https://files.pythonhosted.org/packages/cd/d7/612123674d7b17cf345aad0a10289b2a384bff404e0463a83c4a3a59d205/pandas-2.3.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:d2c3554bd31b731cd6490d94a28f3abb8dd770634a9e06eb6d2911b9827db370", size = 13186141, upload-time = "2025-08-21T10:28:05.377Z" }, +] + +[[package]] +name = "parse" +version = "1.20.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/4f/78/d9b09ba24bb36ef8b83b71be547e118d46214735b6dfb39e4bfde0e9b9dd/parse-1.20.2.tar.gz", hash = "sha256:b41d604d16503c79d81af5165155c0b20f6c8d6c559efa66b4b695c3e5a0a0ce", size = 29391, upload-time = "2024-06-11T04:41:57.34Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d0/31/ba45bf0b2aa7898d81cbbfac0e88c267befb59ad91a19e36e1bc5578ddb1/parse-1.20.2-py2.py3-none-any.whl", hash = "sha256:967095588cb802add9177d0c0b6133b5ba33b1ea9007ca800e526f42a85af558", size = 20126, upload-time = "2024-06-11T04:41:55.057Z" }, +] + +[[package]] +name = "pathable" +version = "0.4.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/67/93/8f2c2075b180c12c1e9f6a09d1a985bc2036906b13dff1d8917e395f2048/pathable-0.4.4.tar.gz", hash = "sha256:6905a3cd17804edfac7875b5f6c9142a218c7caef78693c2dbbbfbac186d88b2", size = 8124, upload-time = "2025-01-10T18:43:13.247Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7d/eb/b6260b31b1a96386c0a880edebe26f89669098acea8e0318bff6adb378fd/pathable-0.4.4-py3-none-any.whl", hash = "sha256:5ae9e94793b6ef5a4cbe0a7ce9dbbefc1eec38df253763fd0aeeacf2762dbbc2", size = 9592, upload-time = "2025-01-10T18:43:11.88Z" }, +] + +[[package]] +name = "pathspec" +version = "0.12.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043, upload-time = "2023-12-10T22:30:45Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191, upload-time = "2023-12-10T22:30:43.14Z" }, +] + +[[package]] +name = "pathvalidate" +version = "3.3.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fa/2a/52a8da6fe965dea6192eb716b357558e103aea0a1e9a8352ad575a8406ca/pathvalidate-3.3.1.tar.gz", hash = "sha256:b18c07212bfead624345bb8e1d6141cdcf15a39736994ea0b94035ad2b1ba177", size = 63262, upload-time = "2025-06-15T09:07:20.736Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9a/70/875f4a23bfc4731703a5835487d0d2fb999031bd415e7d17c0ae615c18b7/pathvalidate-3.3.1-py3-none-any.whl", hash = "sha256:5263baab691f8e1af96092fa5137ee17df5bdfbd6cff1fcac4d6ef4bc2e1735f", size = 24305, upload-time = "2025-06-15T09:07:19.117Z" }, +] + +[[package]] +name = "pendulum" +version = "3.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "python-dateutil" }, + { name = "tzdata" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/23/7c/009c12b86c7cc6c403aec80f8a4308598dfc5995e5c523a5491faaa3952e/pendulum-3.1.0.tar.gz", hash = "sha256:66f96303560f41d097bee7d2dc98ffca716fbb3a832c4b3062034c2d45865015", size = 85930, upload-time = 
"2025-04-19T14:30:01.675Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5e/6e/d28d3c22e6708b819a94c05bd05a3dfaed5c685379e8b6dc4b34b473b942/pendulum-3.1.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:61a03d14f8c64d13b2f7d5859e4b4053c4a7d3b02339f6c71f3e4606bfd67423", size = 338596, upload-time = "2025-04-19T14:01:11.306Z" }, + { url = "https://files.pythonhosted.org/packages/e1/e6/43324d58021d463c2eeb6146b169d2c935f2f840f9e45ac2d500453d954c/pendulum-3.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e674ed2d158afa5c361e60f1f67872dc55b492a10cacdaa7fcd7b7da5f158f24", size = 325854, upload-time = "2025-04-19T14:01:13.156Z" }, + { url = "https://files.pythonhosted.org/packages/b0/a7/d2ae79b960bfdea94dab67e2f118697b08bc9e98eb6bd8d32c4d99240da3/pendulum-3.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7c75377eb16e58bbe7e03ea89eeea49be6fc5de0934a4aef0e263f8b4fa71bc2", size = 344334, upload-time = "2025-04-19T14:01:15.151Z" }, + { url = "https://files.pythonhosted.org/packages/96/94/941f071212e23c29aae7def891fb636930c648386e059ce09ea0dcd43933/pendulum-3.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:656b8b0ce070f0f2e5e2668247d3c783c55336534aa1f13bd0969535878955e1", size = 382259, upload-time = "2025-04-19T14:01:16.924Z" }, + { url = "https://files.pythonhosted.org/packages/51/ad/a78a701656aec00d16fee636704445c23ca11617a0bfe7c3848d1caa5157/pendulum-3.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:48962903e6c1afe1f13548cb6252666056086c107d59e3d64795c58c9298bc2e", size = 436361, upload-time = "2025-04-19T14:01:18.796Z" }, + { url = "https://files.pythonhosted.org/packages/da/93/83f59ccbf4435c29dca8c63a6560fcbe4783079a468a5f91d9f886fd21f0/pendulum-3.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d364ec3f8e65010fefd4b0aaf7be5eb97e5df761b107a06f5e743b7c3f52c311", size = 353653, upload-time = "2025-04-19T14:01:20.159Z" }, + { url = "https://files.pythonhosted.org/packages/6f/0f/42d6644ec6339b41066f594e52d286162aecd2e9735aaf994d7e00c9e09d/pendulum-3.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:dd52caffc2afb86612ec43bbeb226f204ea12ebff9f3d12f900a7d3097210fcc", size = 524567, upload-time = "2025-04-19T14:01:21.457Z" }, + { url = "https://files.pythonhosted.org/packages/de/45/d84d909202755ab9d3379e5481fdf70f53344ebefbd68d6f5803ddde98a6/pendulum-3.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d439fccaa35c91f686bd59d30604dab01e8b5c1d0dd66e81648c432fd3f8a539", size = 525571, upload-time = "2025-04-19T14:01:23.329Z" }, + { url = "https://files.pythonhosted.org/packages/0d/e0/4de160773ce3c2f7843c310db19dd919a0cd02cc1c0384866f63b18a6251/pendulum-3.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:43288773a86d9c5c0ddb645f88f615ff6bd12fd1410b34323662beccb18f3b49", size = 260259, upload-time = "2025-04-19T14:01:24.689Z" }, + { url = "https://files.pythonhosted.org/packages/c1/7f/ffa278f78112c6c6e5130a702042f52aab5c649ae2edf814df07810bbba5/pendulum-3.1.0-cp311-cp311-win_arm64.whl", hash = "sha256:569ea5072ae0f11d625e03b36d865f8037b76e838a3b621f6967314193896a11", size = 253899, upload-time = "2025-04-19T14:01:26.442Z" }, + { url = "https://files.pythonhosted.org/packages/7a/d7/b1bfe15a742f2c2713acb1fdc7dc3594ff46ef9418ac6a96fcb12a6ba60b/pendulum-3.1.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:4dfd53e7583ccae138be86d6c0a0b324c7547df2afcec1876943c4d481cf9608", size = 336209, upload-time = "2025-04-19T14:01:27.815Z" }, + { 
url = "https://files.pythonhosted.org/packages/eb/87/0392da0c603c828b926d9f7097fbdddaafc01388cb8a00888635d04758c3/pendulum-3.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6a6e06a28f3a7d696546347805536f6f38be458cb79de4f80754430696bea9e6", size = 323130, upload-time = "2025-04-19T14:01:29.336Z" }, + { url = "https://files.pythonhosted.org/packages/c0/61/95f1eec25796be6dddf71440ee16ec1fd0c573fc61a73bd1ef6daacd529a/pendulum-3.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7e68d6a51880708084afd8958af42dc8c5e819a70a6c6ae903b1c4bfc61e0f25", size = 341509, upload-time = "2025-04-19T14:01:31.1Z" }, + { url = "https://files.pythonhosted.org/packages/b5/7b/eb0f5e6aa87d5e1b467a1611009dbdc92f0f72425ebf07669bfadd8885a6/pendulum-3.1.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9e3f1e5da39a7ea7119efda1dd96b529748c1566f8a983412d0908455d606942", size = 378674, upload-time = "2025-04-19T14:01:32.974Z" }, + { url = "https://files.pythonhosted.org/packages/29/68/5a4c1b5de3e54e16cab21d2ec88f9cd3f18599e96cc90a441c0b0ab6b03f/pendulum-3.1.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e9af1e5eeddb4ebbe1b1c9afb9fd8077d73416ade42dd61264b3f3b87742e0bb", size = 436133, upload-time = "2025-04-19T14:01:34.349Z" }, + { url = "https://files.pythonhosted.org/packages/87/5d/f7a1d693e5c0f789185117d5c1d5bee104f5b0d9fbf061d715fb61c840a8/pendulum-3.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20f74aa8029a42e327bfc150472e0e4d2358fa5d795f70460160ba81b94b6945", size = 351232, upload-time = "2025-04-19T14:01:35.669Z" }, + { url = "https://files.pythonhosted.org/packages/30/77/c97617eb31f1d0554edb073201a294019b9e0a9bd2f73c68e6d8d048cd6b/pendulum-3.1.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:cf6229e5ee70c2660148523f46c472e677654d0097bec010d6730f08312a4931", size = 521562, upload-time = "2025-04-19T14:01:37.05Z" }, + { url = "https://files.pythonhosted.org/packages/76/22/0d0ef3393303877e757b848ecef8a9a8c7627e17e7590af82d14633b2cd1/pendulum-3.1.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:350cabb23bf1aec7c7694b915d3030bff53a2ad4aeabc8c8c0d807c8194113d6", size = 523221, upload-time = "2025-04-19T14:01:38.444Z" }, + { url = "https://files.pythonhosted.org/packages/99/f3/aefb579aa3cebd6f2866b205fc7a60d33e9a696e9e629024752107dc3cf5/pendulum-3.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:42959341e843077c41d47420f28c3631de054abd64da83f9b956519b5c7a06a7", size = 260502, upload-time = "2025-04-19T14:01:39.814Z" }, + { url = "https://files.pythonhosted.org/packages/02/74/4332b5d6e34c63d4df8e8eab2249e74c05513b1477757463f7fdca99e9be/pendulum-3.1.0-cp312-cp312-win_arm64.whl", hash = "sha256:006758e2125da2e624493324dfd5d7d1b02b0c44bc39358e18bf0f66d0767f5f", size = 253089, upload-time = "2025-04-19T14:01:41.171Z" }, + { url = "https://files.pythonhosted.org/packages/8e/1f/af928ba4aa403dac9569f787adcf024005e7654433d71f7a84e608716837/pendulum-3.1.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:28658b0baf4b30eb31d096a375983cfed033e60c0a7bbe94fa23f06cd779b50b", size = 336209, upload-time = "2025-04-19T14:01:42.775Z" }, + { url = "https://files.pythonhosted.org/packages/b6/16/b010643007ba964c397da7fa622924423883c1bbff1a53f9d1022cd7f024/pendulum-3.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b114dcb99ce511cb8f5495c7b6f0056b2c3dba444ef1ea6e48030d7371bd531a", size = 323132, upload-time = "2025-04-19T14:01:44.577Z" }, + { url = 
"https://files.pythonhosted.org/packages/64/19/c3c47aeecb5d9bceb0e89faafd800d39809b696c5b7bba8ec8370ad5052c/pendulum-3.1.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2404a6a54c80252ea393291f0b7f35525a61abae3d795407f34e118a8f133a18", size = 341509, upload-time = "2025-04-19T14:01:46.084Z" }, + { url = "https://files.pythonhosted.org/packages/38/cf/c06921ff6b860ff7e62e70b8e5d4dc70e36f5abb66d168bd64d51760bc4e/pendulum-3.1.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d06999790d9ee9962a1627e469f98568bf7ad1085553fa3c30ed08b3944a14d7", size = 378674, upload-time = "2025-04-19T14:01:47.727Z" }, + { url = "https://files.pythonhosted.org/packages/62/0b/a43953b9eba11e82612b033ac5133f716f1b76b6108a65da6f408b3cc016/pendulum-3.1.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:94751c52f6b7c306734d1044c2c6067a474237e1e5afa2f665d1fbcbbbcf24b3", size = 436133, upload-time = "2025-04-19T14:01:49.126Z" }, + { url = "https://files.pythonhosted.org/packages/eb/a0/ec3d70b3b96e23ae1d039f132af35e17704c22a8250d1887aaefea4d78a6/pendulum-3.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5553ac27be05e997ec26d7f004cf72788f4ce11fe60bb80dda604a64055b29d0", size = 351232, upload-time = "2025-04-19T14:01:50.575Z" }, + { url = "https://files.pythonhosted.org/packages/f4/97/aba23f1716b82f6951ba2b1c9178a2d107d1e66c102762a9bf19988547ea/pendulum-3.1.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:f8dee234ca6142bf0514368d01a72945a44685aaa2fc4c14c98d09da9437b620", size = 521563, upload-time = "2025-04-19T14:01:51.9Z" }, + { url = "https://files.pythonhosted.org/packages/01/33/2c0d5216cc53d16db0c4b3d510f141ee0a540937f8675948541190fbd48b/pendulum-3.1.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:7378084fe54faab4ee481897a00b710876f2e901ded6221671e827a253e643f2", size = 523221, upload-time = "2025-04-19T14:01:53.275Z" }, + { url = "https://files.pythonhosted.org/packages/51/89/8de955c339c31aeae77fd86d3225509b998c81875e9dba28cb88b8cbf4b3/pendulum-3.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:8539db7ae2c8da430ac2515079e288948c8ebf7eb1edd3e8281b5cdf433040d6", size = 260501, upload-time = "2025-04-19T14:01:54.749Z" }, + { url = "https://files.pythonhosted.org/packages/15/c3/226a3837363e94f8722461848feec18bfdd7d5172564d53aa3c3397ff01e/pendulum-3.1.0-cp313-cp313-win_arm64.whl", hash = "sha256:1ce26a608e1f7387cd393fba2a129507c4900958d4f47b90757ec17656856571", size = 253087, upload-time = "2025-04-19T14:01:55.998Z" }, + { url = "https://files.pythonhosted.org/packages/6e/23/e98758924d1b3aac11a626268eabf7f3cf177e7837c28d47bf84c64532d0/pendulum-3.1.0-py3-none-any.whl", hash = "sha256:f9178c2a8e291758ade1e8dd6371b1d26d08371b4c7730a6e9a3ef8b16ebae0f", size = 111799, upload-time = "2025-04-19T14:02:34.739Z" }, +] + +[[package]] +name = "pillow" +version = "11.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f3/0d/d0d6dea55cd152ce3d6767bb38a8fc10e33796ba4ba210cbab9354b6d238/pillow-11.3.0.tar.gz", hash = "sha256:3828ee7586cd0b2091b6209e5ad53e20d0649bbe87164a459d0676e035e8f523", size = 47113069, upload-time = "2025-07-01T09:16:30.666Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/db/26/77f8ed17ca4ffd60e1dcd220a6ec6d71210ba398cfa33a13a1cd614c5613/pillow-11.3.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:1cd110edf822773368b396281a2293aeb91c90a2db00d78ea43e7e861631b722", size = 5316531, upload-time = 
"2025-07-01T09:13:59.203Z" }, + { url = "https://files.pythonhosted.org/packages/cb/39/ee475903197ce709322a17a866892efb560f57900d9af2e55f86db51b0a5/pillow-11.3.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9c412fddd1b77a75aa904615ebaa6001f169b26fd467b4be93aded278266b288", size = 4686560, upload-time = "2025-07-01T09:14:01.101Z" }, + { url = "https://files.pythonhosted.org/packages/d5/90/442068a160fd179938ba55ec8c97050a612426fae5ec0a764e345839f76d/pillow-11.3.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7d1aa4de119a0ecac0a34a9c8bde33f34022e2e8f99104e47a3ca392fd60e37d", size = 5870978, upload-time = "2025-07-03T13:09:55.638Z" }, + { url = "https://files.pythonhosted.org/packages/13/92/dcdd147ab02daf405387f0218dcf792dc6dd5b14d2573d40b4caeef01059/pillow-11.3.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:91da1d88226663594e3f6b4b8c3c8d85bd504117d043740a8e0ec449087cc494", size = 7641168, upload-time = "2025-07-03T13:10:00.37Z" }, + { url = "https://files.pythonhosted.org/packages/6e/db/839d6ba7fd38b51af641aa904e2960e7a5644d60ec754c046b7d2aee00e5/pillow-11.3.0-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:643f189248837533073c405ec2f0bb250ba54598cf80e8c1e043381a60632f58", size = 5973053, upload-time = "2025-07-01T09:14:04.491Z" }, + { url = "https://files.pythonhosted.org/packages/f2/2f/d7675ecae6c43e9f12aa8d58b6012683b20b6edfbdac7abcb4e6af7a3784/pillow-11.3.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:106064daa23a745510dabce1d84f29137a37224831d88eb4ce94bb187b1d7e5f", size = 6640273, upload-time = "2025-07-01T09:14:06.235Z" }, + { url = "https://files.pythonhosted.org/packages/45/ad/931694675ede172e15b2ff03c8144a0ddaea1d87adb72bb07655eaffb654/pillow-11.3.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:cd8ff254faf15591e724dc7c4ddb6bf4793efcbe13802a4ae3e863cd300b493e", size = 6082043, upload-time = "2025-07-01T09:14:07.978Z" }, + { url = "https://files.pythonhosted.org/packages/3a/04/ba8f2b11fc80d2dd462d7abec16351b45ec99cbbaea4387648a44190351a/pillow-11.3.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:932c754c2d51ad2b2271fd01c3d121daaa35e27efae2a616f77bf164bc0b3e94", size = 6715516, upload-time = "2025-07-01T09:14:10.233Z" }, + { url = "https://files.pythonhosted.org/packages/48/59/8cd06d7f3944cc7d892e8533c56b0acb68399f640786313275faec1e3b6f/pillow-11.3.0-cp311-cp311-win32.whl", hash = "sha256:b4b8f3efc8d530a1544e5962bd6b403d5f7fe8b9e08227c6b255f98ad82b4ba0", size = 6274768, upload-time = "2025-07-01T09:14:11.921Z" }, + { url = "https://files.pythonhosted.org/packages/f1/cc/29c0f5d64ab8eae20f3232da8f8571660aa0ab4b8f1331da5c2f5f9a938e/pillow-11.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:1a992e86b0dd7aeb1f053cd506508c0999d710a8f07b4c791c63843fc6a807ac", size = 6986055, upload-time = "2025-07-01T09:14:13.623Z" }, + { url = "https://files.pythonhosted.org/packages/c6/df/90bd886fabd544c25addd63e5ca6932c86f2b701d5da6c7839387a076b4a/pillow-11.3.0-cp311-cp311-win_arm64.whl", hash = "sha256:30807c931ff7c095620fe04448e2c2fc673fcbb1ffe2a7da3fb39613489b1ddd", size = 2423079, upload-time = "2025-07-01T09:14:15.268Z" }, + { url = "https://files.pythonhosted.org/packages/40/fe/1bc9b3ee13f68487a99ac9529968035cca2f0a51ec36892060edcc51d06a/pillow-11.3.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdae223722da47b024b867c1ea0be64e0df702c5e0a60e27daad39bf960dd1e4", size = 5278800, upload-time = "2025-07-01T09:14:17.648Z" }, + { url = 
"https://files.pythonhosted.org/packages/2c/32/7e2ac19b5713657384cec55f89065fb306b06af008cfd87e572035b27119/pillow-11.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:921bd305b10e82b4d1f5e802b6850677f965d8394203d182f078873851dada69", size = 4686296, upload-time = "2025-07-01T09:14:19.828Z" }, + { url = "https://files.pythonhosted.org/packages/8e/1e/b9e12bbe6e4c2220effebc09ea0923a07a6da1e1f1bfbc8d7d29a01ce32b/pillow-11.3.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:eb76541cba2f958032d79d143b98a3a6b3ea87f0959bbe256c0b5e416599fd5d", size = 5871726, upload-time = "2025-07-03T13:10:04.448Z" }, + { url = "https://files.pythonhosted.org/packages/8d/33/e9200d2bd7ba00dc3ddb78df1198a6e80d7669cce6c2bdbeb2530a74ec58/pillow-11.3.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:67172f2944ebba3d4a7b54f2e95c786a3a50c21b88456329314caaa28cda70f6", size = 7644652, upload-time = "2025-07-03T13:10:10.391Z" }, + { url = "https://files.pythonhosted.org/packages/41/f1/6f2427a26fc683e00d985bc391bdd76d8dd4e92fac33d841127eb8fb2313/pillow-11.3.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97f07ed9f56a3b9b5f49d3661dc9607484e85c67e27f3e8be2c7d28ca032fec7", size = 5977787, upload-time = "2025-07-01T09:14:21.63Z" }, + { url = "https://files.pythonhosted.org/packages/e4/c9/06dd4a38974e24f932ff5f98ea3c546ce3f8c995d3f0985f8e5ba48bba19/pillow-11.3.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:676b2815362456b5b3216b4fd5bd89d362100dc6f4945154ff172e206a22c024", size = 6645236, upload-time = "2025-07-01T09:14:23.321Z" }, + { url = "https://files.pythonhosted.org/packages/40/e7/848f69fb79843b3d91241bad658e9c14f39a32f71a301bcd1d139416d1be/pillow-11.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3e184b2f26ff146363dd07bde8b711833d7b0202e27d13540bfe2e35a323a809", size = 6086950, upload-time = "2025-07-01T09:14:25.237Z" }, + { url = "https://files.pythonhosted.org/packages/0b/1a/7cff92e695a2a29ac1958c2a0fe4c0b2393b60aac13b04a4fe2735cad52d/pillow-11.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6be31e3fc9a621e071bc17bb7de63b85cbe0bfae91bb0363c893cbe67247780d", size = 6723358, upload-time = "2025-07-01T09:14:27.053Z" }, + { url = "https://files.pythonhosted.org/packages/26/7d/73699ad77895f69edff76b0f332acc3d497f22f5d75e5360f78cbcaff248/pillow-11.3.0-cp312-cp312-win32.whl", hash = "sha256:7b161756381f0918e05e7cb8a371fff367e807770f8fe92ecb20d905d0e1c149", size = 6275079, upload-time = "2025-07-01T09:14:30.104Z" }, + { url = "https://files.pythonhosted.org/packages/8c/ce/e7dfc873bdd9828f3b6e5c2bbb74e47a98ec23cc5c74fc4e54462f0d9204/pillow-11.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:a6444696fce635783440b7f7a9fc24b3ad10a9ea3f0ab66c5905be1c19ccf17d", size = 6986324, upload-time = "2025-07-01T09:14:31.899Z" }, + { url = "https://files.pythonhosted.org/packages/16/8f/b13447d1bf0b1f7467ce7d86f6e6edf66c0ad7cf44cf5c87a37f9bed9936/pillow-11.3.0-cp312-cp312-win_arm64.whl", hash = "sha256:2aceea54f957dd4448264f9bf40875da0415c83eb85f55069d89c0ed436e3542", size = 2423067, upload-time = "2025-07-01T09:14:33.709Z" }, + { url = "https://files.pythonhosted.org/packages/1e/93/0952f2ed8db3a5a4c7a11f91965d6184ebc8cd7cbb7941a260d5f018cd2d/pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:1c627742b539bba4309df89171356fcb3cc5a9178355b2727d1b74a6cf155fbd", size = 2128328, upload-time = "2025-07-01T09:14:35.276Z" }, + { url = 
"https://files.pythonhosted.org/packages/4b/e8/100c3d114b1a0bf4042f27e0f87d2f25e857e838034e98ca98fe7b8c0a9c/pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:30b7c02f3899d10f13d7a48163c8969e4e653f8b43416d23d13d1bbfdc93b9f8", size = 2170652, upload-time = "2025-07-01T09:14:37.203Z" }, + { url = "https://files.pythonhosted.org/packages/aa/86/3f758a28a6e381758545f7cdb4942e1cb79abd271bea932998fc0db93cb6/pillow-11.3.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:7859a4cc7c9295f5838015d8cc0a9c215b77e43d07a25e460f35cf516df8626f", size = 2227443, upload-time = "2025-07-01T09:14:39.344Z" }, + { url = "https://files.pythonhosted.org/packages/01/f4/91d5b3ffa718df2f53b0dc109877993e511f4fd055d7e9508682e8aba092/pillow-11.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ec1ee50470b0d050984394423d96325b744d55c701a439d2bd66089bff963d3c", size = 5278474, upload-time = "2025-07-01T09:14:41.843Z" }, + { url = "https://files.pythonhosted.org/packages/f9/0e/37d7d3eca6c879fbd9dba21268427dffda1ab00d4eb05b32923d4fbe3b12/pillow-11.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7db51d222548ccfd274e4572fdbf3e810a5e66b00608862f947b163e613b67dd", size = 4686038, upload-time = "2025-07-01T09:14:44.008Z" }, + { url = "https://files.pythonhosted.org/packages/ff/b0/3426e5c7f6565e752d81221af9d3676fdbb4f352317ceafd42899aaf5d8a/pillow-11.3.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2d6fcc902a24ac74495df63faad1884282239265c6839a0a6416d33faedfae7e", size = 5864407, upload-time = "2025-07-03T13:10:15.628Z" }, + { url = "https://files.pythonhosted.org/packages/fc/c1/c6c423134229f2a221ee53f838d4be9d82bab86f7e2f8e75e47b6bf6cd77/pillow-11.3.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f0f5d8f4a08090c6d6d578351a2b91acf519a54986c055af27e7a93feae6d3f1", size = 7639094, upload-time = "2025-07-03T13:10:21.857Z" }, + { url = "https://files.pythonhosted.org/packages/ba/c9/09e6746630fe6372c67c648ff9deae52a2bc20897d51fa293571977ceb5d/pillow-11.3.0-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c37d8ba9411d6003bba9e518db0db0c58a680ab9fe5179f040b0463644bc9805", size = 5973503, upload-time = "2025-07-01T09:14:45.698Z" }, + { url = "https://files.pythonhosted.org/packages/d5/1c/a2a29649c0b1983d3ef57ee87a66487fdeb45132df66ab30dd37f7dbe162/pillow-11.3.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:13f87d581e71d9189ab21fe0efb5a23e9f28552d5be6979e84001d3b8505abe8", size = 6642574, upload-time = "2025-07-01T09:14:47.415Z" }, + { url = "https://files.pythonhosted.org/packages/36/de/d5cc31cc4b055b6c6fd990e3e7f0f8aaf36229a2698501bcb0cdf67c7146/pillow-11.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:023f6d2d11784a465f09fd09a34b150ea4672e85fb3d05931d89f373ab14abb2", size = 6084060, upload-time = "2025-07-01T09:14:49.636Z" }, + { url = "https://files.pythonhosted.org/packages/d5/ea/502d938cbaeec836ac28a9b730193716f0114c41325db428e6b280513f09/pillow-11.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:45dfc51ac5975b938e9809451c51734124e73b04d0f0ac621649821a63852e7b", size = 6721407, upload-time = "2025-07-01T09:14:51.962Z" }, + { url = "https://files.pythonhosted.org/packages/45/9c/9c5e2a73f125f6cbc59cc7087c8f2d649a7ae453f83bd0362ff7c9e2aee2/pillow-11.3.0-cp313-cp313-win32.whl", hash = "sha256:a4d336baed65d50d37b88ca5b60c0fa9d81e3a87d4a7930d3880d1624d5b31f3", size = 6273841, upload-time = "2025-07-01T09:14:54.142Z" }, + { url = 
"https://files.pythonhosted.org/packages/23/85/397c73524e0cd212067e0c969aa245b01d50183439550d24d9f55781b776/pillow-11.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:0bce5c4fd0921f99d2e858dc4d4d64193407e1b99478bc5cacecba2311abde51", size = 6978450, upload-time = "2025-07-01T09:14:56.436Z" }, + { url = "https://files.pythonhosted.org/packages/17/d2/622f4547f69cd173955194b78e4d19ca4935a1b0f03a302d655c9f6aae65/pillow-11.3.0-cp313-cp313-win_arm64.whl", hash = "sha256:1904e1264881f682f02b7f8167935cce37bc97db457f8e7849dc3a6a52b99580", size = 2423055, upload-time = "2025-07-01T09:14:58.072Z" }, + { url = "https://files.pythonhosted.org/packages/dd/80/a8a2ac21dda2e82480852978416cfacd439a4b490a501a288ecf4fe2532d/pillow-11.3.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4c834a3921375c48ee6b9624061076bc0a32a60b5532b322cc0ea64e639dd50e", size = 5281110, upload-time = "2025-07-01T09:14:59.79Z" }, + { url = "https://files.pythonhosted.org/packages/44/d6/b79754ca790f315918732e18f82a8146d33bcd7f4494380457ea89eb883d/pillow-11.3.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:5e05688ccef30ea69b9317a9ead994b93975104a677a36a8ed8106be9260aa6d", size = 4689547, upload-time = "2025-07-01T09:15:01.648Z" }, + { url = "https://files.pythonhosted.org/packages/49/20/716b8717d331150cb00f7fdd78169c01e8e0c219732a78b0e59b6bdb2fd6/pillow-11.3.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1019b04af07fc0163e2810167918cb5add8d74674b6267616021ab558dc98ced", size = 5901554, upload-time = "2025-07-03T13:10:27.018Z" }, + { url = "https://files.pythonhosted.org/packages/74/cf/a9f3a2514a65bb071075063a96f0a5cf949c2f2fce683c15ccc83b1c1cab/pillow-11.3.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f944255db153ebb2b19c51fe85dd99ef0ce494123f21b9db4877ffdfc5590c7c", size = 7669132, upload-time = "2025-07-03T13:10:33.01Z" }, + { url = "https://files.pythonhosted.org/packages/98/3c/da78805cbdbee9cb43efe8261dd7cc0b4b93f2ac79b676c03159e9db2187/pillow-11.3.0-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1f85acb69adf2aaee8b7da124efebbdb959a104db34d3a2cb0f3793dbae422a8", size = 6005001, upload-time = "2025-07-01T09:15:03.365Z" }, + { url = "https://files.pythonhosted.org/packages/6c/fa/ce044b91faecf30e635321351bba32bab5a7e034c60187fe9698191aef4f/pillow-11.3.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:05f6ecbeff5005399bb48d198f098a9b4b6bdf27b8487c7f38ca16eeb070cd59", size = 6668814, upload-time = "2025-07-01T09:15:05.655Z" }, + { url = "https://files.pythonhosted.org/packages/7b/51/90f9291406d09bf93686434f9183aba27b831c10c87746ff49f127ee80cb/pillow-11.3.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a7bc6e6fd0395bc052f16b1a8670859964dbd7003bd0af2ff08342eb6e442cfe", size = 6113124, upload-time = "2025-07-01T09:15:07.358Z" }, + { url = "https://files.pythonhosted.org/packages/cd/5a/6fec59b1dfb619234f7636d4157d11fb4e196caeee220232a8d2ec48488d/pillow-11.3.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:83e1b0161c9d148125083a35c1c5a89db5b7054834fd4387499e06552035236c", size = 6747186, upload-time = "2025-07-01T09:15:09.317Z" }, + { url = "https://files.pythonhosted.org/packages/49/6b/00187a044f98255225f172de653941e61da37104a9ea60e4f6887717e2b5/pillow-11.3.0-cp313-cp313t-win32.whl", hash = "sha256:2a3117c06b8fb646639dce83694f2f9eac405472713fcb1ae887469c0d4f6788", size = 6277546, upload-time = "2025-07-01T09:15:11.311Z" }, + { url = 
"https://files.pythonhosted.org/packages/e8/5c/6caaba7e261c0d75bab23be79f1d06b5ad2a2ae49f028ccec801b0e853d6/pillow-11.3.0-cp313-cp313t-win_amd64.whl", hash = "sha256:857844335c95bea93fb39e0fa2726b4d9d758850b34075a7e3ff4f4fa3aa3b31", size = 6985102, upload-time = "2025-07-01T09:15:13.164Z" }, + { url = "https://files.pythonhosted.org/packages/f3/7e/b623008460c09a0cb38263c93b828c666493caee2eb34ff67f778b87e58c/pillow-11.3.0-cp313-cp313t-win_arm64.whl", hash = "sha256:8797edc41f3e8536ae4b10897ee2f637235c94f27404cac7297f7b607dd0716e", size = 2424803, upload-time = "2025-07-01T09:15:15.695Z" }, + { url = "https://files.pythonhosted.org/packages/73/f4/04905af42837292ed86cb1b1dabe03dce1edc008ef14c473c5c7e1443c5d/pillow-11.3.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d9da3df5f9ea2a89b81bb6087177fb1f4d1c7146d583a3fe5c672c0d94e55e12", size = 5278520, upload-time = "2025-07-01T09:15:17.429Z" }, + { url = "https://files.pythonhosted.org/packages/41/b0/33d79e377a336247df6348a54e6d2a2b85d644ca202555e3faa0cf811ecc/pillow-11.3.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0b275ff9b04df7b640c59ec5a3cb113eefd3795a8df80bac69646ef699c6981a", size = 4686116, upload-time = "2025-07-01T09:15:19.423Z" }, + { url = "https://files.pythonhosted.org/packages/49/2d/ed8bc0ab219ae8768f529597d9509d184fe8a6c4741a6864fea334d25f3f/pillow-11.3.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0743841cabd3dba6a83f38a92672cccbd69af56e3e91777b0ee7f4dba4385632", size = 5864597, upload-time = "2025-07-03T13:10:38.404Z" }, + { url = "https://files.pythonhosted.org/packages/b5/3d/b932bb4225c80b58dfadaca9d42d08d0b7064d2d1791b6a237f87f661834/pillow-11.3.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2465a69cf967b8b49ee1b96d76718cd98c4e925414ead59fdf75cf0fd07df673", size = 7638246, upload-time = "2025-07-03T13:10:44.987Z" }, + { url = "https://files.pythonhosted.org/packages/09/b5/0487044b7c096f1b48f0d7ad416472c02e0e4bf6919541b111efd3cae690/pillow-11.3.0-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41742638139424703b4d01665b807c6468e23e699e8e90cffefe291c5832b027", size = 5973336, upload-time = "2025-07-01T09:15:21.237Z" }, + { url = "https://files.pythonhosted.org/packages/a8/2d/524f9318f6cbfcc79fbc004801ea6b607ec3f843977652fdee4857a7568b/pillow-11.3.0-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:93efb0b4de7e340d99057415c749175e24c8864302369e05914682ba642e5d77", size = 6642699, upload-time = "2025-07-01T09:15:23.186Z" }, + { url = "https://files.pythonhosted.org/packages/6f/d2/a9a4f280c6aefedce1e8f615baaa5474e0701d86dd6f1dede66726462bbd/pillow-11.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7966e38dcd0fa11ca390aed7c6f20454443581d758242023cf36fcb319b1a874", size = 6083789, upload-time = "2025-07-01T09:15:25.1Z" }, + { url = "https://files.pythonhosted.org/packages/fe/54/86b0cd9dbb683a9d5e960b66c7379e821a19be4ac5810e2e5a715c09a0c0/pillow-11.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:98a9afa7b9007c67ed84c57c9e0ad86a6000da96eaa638e4f8abe5b65ff83f0a", size = 6720386, upload-time = "2025-07-01T09:15:27.378Z" }, + { url = "https://files.pythonhosted.org/packages/e7/95/88efcaf384c3588e24259c4203b909cbe3e3c2d887af9e938c2022c9dd48/pillow-11.3.0-cp314-cp314-win32.whl", hash = "sha256:02a723e6bf909e7cea0dac1b0e0310be9d7650cd66222a5f1c571455c0a45214", size = 6370911, upload-time = "2025-07-01T09:15:29.294Z" }, + { url = 
"https://files.pythonhosted.org/packages/2e/cc/934e5820850ec5eb107e7b1a72dd278140731c669f396110ebc326f2a503/pillow-11.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:a418486160228f64dd9e9efcd132679b7a02a5f22c982c78b6fc7dab3fefb635", size = 7117383, upload-time = "2025-07-01T09:15:31.128Z" }, + { url = "https://files.pythonhosted.org/packages/d6/e9/9c0a616a71da2a5d163aa37405e8aced9a906d574b4a214bede134e731bc/pillow-11.3.0-cp314-cp314-win_arm64.whl", hash = "sha256:155658efb5e044669c08896c0c44231c5e9abcaadbc5cd3648df2f7c0b96b9a6", size = 2511385, upload-time = "2025-07-01T09:15:33.328Z" }, + { url = "https://files.pythonhosted.org/packages/1a/33/c88376898aff369658b225262cd4f2659b13e8178e7534df9e6e1fa289f6/pillow-11.3.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:59a03cdf019efbfeeed910bf79c7c93255c3d54bc45898ac2a4140071b02b4ae", size = 5281129, upload-time = "2025-07-01T09:15:35.194Z" }, + { url = "https://files.pythonhosted.org/packages/1f/70/d376247fb36f1844b42910911c83a02d5544ebd2a8bad9efcc0f707ea774/pillow-11.3.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:f8a5827f84d973d8636e9dc5764af4f0cf2318d26744b3d902931701b0d46653", size = 4689580, upload-time = "2025-07-01T09:15:37.114Z" }, + { url = "https://files.pythonhosted.org/packages/eb/1c/537e930496149fbac69efd2fc4329035bbe2e5475b4165439e3be9cb183b/pillow-11.3.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ee92f2fd10f4adc4b43d07ec5e779932b4eb3dbfbc34790ada5a6669bc095aa6", size = 5902860, upload-time = "2025-07-03T13:10:50.248Z" }, + { url = "https://files.pythonhosted.org/packages/bd/57/80f53264954dcefeebcf9dae6e3eb1daea1b488f0be8b8fef12f79a3eb10/pillow-11.3.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c96d333dcf42d01f47b37e0979b6bd73ec91eae18614864622d9b87bbd5bbf36", size = 7670694, upload-time = "2025-07-03T13:10:56.432Z" }, + { url = "https://files.pythonhosted.org/packages/70/ff/4727d3b71a8578b4587d9c276e90efad2d6fe0335fd76742a6da08132e8c/pillow-11.3.0-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4c96f993ab8c98460cd0c001447bff6194403e8b1d7e149ade5f00594918128b", size = 6005888, upload-time = "2025-07-01T09:15:39.436Z" }, + { url = "https://files.pythonhosted.org/packages/05/ae/716592277934f85d3be51d7256f3636672d7b1abfafdc42cf3f8cbd4b4c8/pillow-11.3.0-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:41342b64afeba938edb034d122b2dda5db2139b9a4af999729ba8818e0056477", size = 6670330, upload-time = "2025-07-01T09:15:41.269Z" }, + { url = "https://files.pythonhosted.org/packages/e7/bb/7fe6cddcc8827b01b1a9766f5fdeb7418680744f9082035bdbabecf1d57f/pillow-11.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:068d9c39a2d1b358eb9f245ce7ab1b5c3246c7c8c7d9ba58cfa5b43146c06e50", size = 6114089, upload-time = "2025-07-01T09:15:43.13Z" }, + { url = "https://files.pythonhosted.org/packages/8b/f5/06bfaa444c8e80f1a8e4bff98da9c83b37b5be3b1deaa43d27a0db37ef84/pillow-11.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:a1bc6ba083b145187f648b667e05a2534ecc4b9f2784c2cbe3089e44868f2b9b", size = 6748206, upload-time = "2025-07-01T09:15:44.937Z" }, + { url = "https://files.pythonhosted.org/packages/f0/77/bc6f92a3e8e6e46c0ca78abfffec0037845800ea38c73483760362804c41/pillow-11.3.0-cp314-cp314t-win32.whl", hash = "sha256:118ca10c0d60b06d006be10a501fd6bbdfef559251ed31b794668ed569c87e12", size = 6377370, upload-time = "2025-07-01T09:15:46.673Z" }, + { url = 
"https://files.pythonhosted.org/packages/4a/82/3a721f7d69dca802befb8af08b7c79ebcab461007ce1c18bd91a5d5896f9/pillow-11.3.0-cp314-cp314t-win_amd64.whl", hash = "sha256:8924748b688aa210d79883357d102cd64690e56b923a186f35a82cbc10f997db", size = 7121500, upload-time = "2025-07-01T09:15:48.512Z" }, + { url = "https://files.pythonhosted.org/packages/89/c7/5572fa4a3f45740eaab6ae86fcdf7195b55beac1371ac8c619d880cfe948/pillow-11.3.0-cp314-cp314t-win_arm64.whl", hash = "sha256:79ea0d14d3ebad43ec77ad5272e6ff9bba5b679ef73375ea760261207fa8e0aa", size = 2512835, upload-time = "2025-07-01T09:15:50.399Z" }, + { url = "https://files.pythonhosted.org/packages/9e/e3/6fa84033758276fb31da12e5fb66ad747ae83b93c67af17f8c6ff4cc8f34/pillow-11.3.0-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:7c8ec7a017ad1bd562f93dbd8505763e688d388cde6e4a010ae1486916e713e6", size = 5270566, upload-time = "2025-07-01T09:16:19.801Z" }, + { url = "https://files.pythonhosted.org/packages/5b/ee/e8d2e1ab4892970b561e1ba96cbd59c0d28cf66737fc44abb2aec3795a4e/pillow-11.3.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:9ab6ae226de48019caa8074894544af5b53a117ccb9d3b3dcb2871464c829438", size = 4654618, upload-time = "2025-07-01T09:16:21.818Z" }, + { url = "https://files.pythonhosted.org/packages/f2/6d/17f80f4e1f0761f02160fc433abd4109fa1548dcfdca46cfdadaf9efa565/pillow-11.3.0-pp311-pypy311_pp73-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:fe27fb049cdcca11f11a7bfda64043c37b30e6b91f10cb5bab275806c32f6ab3", size = 4874248, upload-time = "2025-07-03T13:11:20.738Z" }, + { url = "https://files.pythonhosted.org/packages/de/5f/c22340acd61cef960130585bbe2120e2fd8434c214802f07e8c03596b17e/pillow-11.3.0-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:465b9e8844e3c3519a983d58b80be3f668e2a7a5db97f2784e7079fbc9f9822c", size = 6583963, upload-time = "2025-07-03T13:11:26.283Z" }, + { url = "https://files.pythonhosted.org/packages/31/5e/03966aedfbfcbb4d5f8aa042452d3361f325b963ebbadddac05b122e47dd/pillow-11.3.0-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5418b53c0d59b3824d05e029669efa023bbef0f3e92e75ec8428f3799487f361", size = 4957170, upload-time = "2025-07-01T09:16:23.762Z" }, + { url = "https://files.pythonhosted.org/packages/cc/2d/e082982aacc927fc2cab48e1e731bdb1643a1406acace8bed0900a61464e/pillow-11.3.0-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:504b6f59505f08ae014f724b6207ff6222662aab5cc9542577fb084ed0676ac7", size = 5581505, upload-time = "2025-07-01T09:16:25.593Z" }, + { url = "https://files.pythonhosted.org/packages/34/e7/ae39f538fd6844e982063c3a5e4598b8ced43b9633baa3a85ef33af8c05c/pillow-11.3.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:c84d689db21a1c397d001aa08241044aa2069e7587b398c8cc63020390b1c1b8", size = 6984598, upload-time = "2025-07-01T09:16:27.732Z" }, +] + +[[package]] +name = "platformdirs" +version = "4.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/23/e8/21db9c9987b0e728855bd57bff6984f67952bea55d6f75e055c46b5383e8/platformdirs-4.4.0.tar.gz", hash = "sha256:ca753cf4d81dc309bc67b0ea38fd15dc97bc30ce419a7f58d13eb3bf14c4febf", size = 21634, upload-time = "2025-08-26T14:32:04.268Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/40/4b/2028861e724d3bd36227adfa20d3fd24c3fc6d52032f4a93c133be5d17ce/platformdirs-4.4.0-py3-none-any.whl", hash = 
"sha256:abd01743f24e5287cd7a5db3752faf1a2d65353f38ec26d98e25a6db65958c85", size = 18654, upload-time = "2025-08-26T14:32:02.735Z" }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" }, +] + +[[package]] +name = "ply" +version = "3.11" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e5/69/882ee5c9d017149285cab114ebeab373308ef0f874fcdac9beb90e0ac4da/ply-3.11.tar.gz", hash = "sha256:00c7c1aaa88358b9c765b6d3000c6eec0ba42abca5351b095321aef446081da3", size = 159130, upload-time = "2018-02-15T19:01:31.097Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a3/58/35da89ee790598a0700ea49b2a66594140f44dec458c07e8e3d4979137fc/ply-3.11-py2.py3-none-any.whl", hash = "sha256:096f9b8350b65ebd2fd1346b12452efe5b9607f7482813ffca50c22722a807ce", size = 49567, upload-time = "2018-02-15T19:01:27.172Z" }, +] + +[[package]] +name = "pre-commit" +version = "4.3.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cfgv" }, + { name = "identify" }, + { name = "nodeenv" }, + { name = "pyyaml" }, + { name = "virtualenv" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ff/29/7cf5bbc236333876e4b41f56e06857a87937ce4bf91e117a6991a2dbb02a/pre_commit-4.3.0.tar.gz", hash = "sha256:499fe450cc9d42e9d58e606262795ecb64dd05438943c62b66f6a8673da30b16", size = 193792, upload-time = "2025-08-09T18:56:14.651Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5b/a5/987a405322d78a73b66e39e4a90e4ef156fd7141bf71df987e50717c321b/pre_commit-4.3.0-py2.py3-none-any.whl", hash = "sha256:2b0747ad7e6e967169136edffee14c16e148a778a54e4f967921aa1ebf2308d8", size = 220965, upload-time = "2025-08-09T18:56:13.192Z" }, +] + +[[package]] +name = "propcache" +version = "0.3.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a6/16/43264e4a779dd8588c21a70f0709665ee8f611211bdd2c87d952cfa7c776/propcache-0.3.2.tar.gz", hash = "sha256:20d7d62e4e7ef05f221e0db2856b979540686342e7dd9973b815599c7057e168", size = 44139, upload-time = "2025-06-09T22:56:06.081Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/80/8d/e8b436717ab9c2cfc23b116d2c297305aa4cd8339172a456d61ebf5669b8/propcache-0.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:0b8d2f607bd8f80ddc04088bc2a037fdd17884a6fcadc47a96e334d72f3717be", size = 74207, upload-time = "2025-06-09T22:54:05.399Z" }, + { url = "https://files.pythonhosted.org/packages/d6/29/1e34000e9766d112171764b9fa3226fa0153ab565d0c242c70e9945318a7/propcache-0.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:06766d8f34733416e2e34f46fea488ad5d60726bb9481d3cddf89a6fa2d9603f", size = 43648, upload-time = "2025-06-09T22:54:08.023Z" }, + { url = 
"https://files.pythonhosted.org/packages/46/92/1ad5af0df781e76988897da39b5f086c2bf0f028b7f9bd1f409bb05b6874/propcache-0.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a2dc1f4a1df4fecf4e6f68013575ff4af84ef6f478fe5344317a65d38a8e6dc9", size = 43496, upload-time = "2025-06-09T22:54:09.228Z" }, + { url = "https://files.pythonhosted.org/packages/b3/ce/e96392460f9fb68461fabab3e095cb00c8ddf901205be4eae5ce246e5b7e/propcache-0.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:be29c4f4810c5789cf10ddf6af80b041c724e629fa51e308a7a0fb19ed1ef7bf", size = 217288, upload-time = "2025-06-09T22:54:10.466Z" }, + { url = "https://files.pythonhosted.org/packages/c5/2a/866726ea345299f7ceefc861a5e782b045545ae6940851930a6adaf1fca6/propcache-0.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:59d61f6970ecbd8ff2e9360304d5c8876a6abd4530cb752c06586849ac8a9dc9", size = 227456, upload-time = "2025-06-09T22:54:11.828Z" }, + { url = "https://files.pythonhosted.org/packages/de/03/07d992ccb6d930398689187e1b3c718339a1c06b8b145a8d9650e4726166/propcache-0.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:62180e0b8dbb6b004baec00a7983e4cc52f5ada9cd11f48c3528d8cfa7b96a66", size = 225429, upload-time = "2025-06-09T22:54:13.823Z" }, + { url = "https://files.pythonhosted.org/packages/5d/e6/116ba39448753b1330f48ab8ba927dcd6cf0baea8a0ccbc512dfb49ba670/propcache-0.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c144ca294a204c470f18cf4c9d78887810d04a3e2fbb30eea903575a779159df", size = 213472, upload-time = "2025-06-09T22:54:15.232Z" }, + { url = "https://files.pythonhosted.org/packages/a6/85/f01f5d97e54e428885a5497ccf7f54404cbb4f906688a1690cd51bf597dc/propcache-0.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c5c2a784234c28854878d68978265617aa6dc0780e53d44b4d67f3651a17a9a2", size = 204480, upload-time = "2025-06-09T22:54:17.104Z" }, + { url = "https://files.pythonhosted.org/packages/e3/79/7bf5ab9033b8b8194cc3f7cf1aaa0e9c3256320726f64a3e1f113a812dce/propcache-0.3.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:5745bc7acdafa978ca1642891b82c19238eadc78ba2aaa293c6863b304e552d7", size = 214530, upload-time = "2025-06-09T22:54:18.512Z" }, + { url = "https://files.pythonhosted.org/packages/31/0b/bd3e0c00509b609317df4a18e6b05a450ef2d9a963e1d8bc9c9415d86f30/propcache-0.3.2-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:c0075bf773d66fa8c9d41f66cc132ecc75e5bb9dd7cce3cfd14adc5ca184cb95", size = 205230, upload-time = "2025-06-09T22:54:19.947Z" }, + { url = "https://files.pythonhosted.org/packages/7a/23/fae0ff9b54b0de4e819bbe559508da132d5683c32d84d0dc2ccce3563ed4/propcache-0.3.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5f57aa0847730daceff0497f417c9de353c575d8da3579162cc74ac294c5369e", size = 206754, upload-time = "2025-06-09T22:54:21.716Z" }, + { url = "https://files.pythonhosted.org/packages/b7/7f/ad6a3c22630aaa5f618b4dc3c3598974a72abb4c18e45a50b3cdd091eb2f/propcache-0.3.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:eef914c014bf72d18efb55619447e0aecd5fb7c2e3fa7441e2e5d6099bddff7e", size = 218430, upload-time = "2025-06-09T22:54:23.17Z" }, + { url = "https://files.pythonhosted.org/packages/5b/2c/ba4f1c0e8a4b4c75910742f0d333759d441f65a1c7f34683b4a74c0ee015/propcache-0.3.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2a4092e8549031e82facf3decdbc0883755d5bbcc62d3aea9d9e185549936dcf", size = 223884, upload-time = 
"2025-06-09T22:54:25.539Z" }, + { url = "https://files.pythonhosted.org/packages/88/e4/ebe30fc399e98572019eee82ad0caf512401661985cbd3da5e3140ffa1b0/propcache-0.3.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:85871b050f174bc0bfb437efbdb68aaf860611953ed12418e4361bc9c392749e", size = 211480, upload-time = "2025-06-09T22:54:26.892Z" }, + { url = "https://files.pythonhosted.org/packages/96/0a/7d5260b914e01d1d0906f7f38af101f8d8ed0dc47426219eeaf05e8ea7c2/propcache-0.3.2-cp311-cp311-win32.whl", hash = "sha256:36c8d9b673ec57900c3554264e630d45980fd302458e4ac801802a7fd2ef7897", size = 37757, upload-time = "2025-06-09T22:54:28.241Z" }, + { url = "https://files.pythonhosted.org/packages/e1/2d/89fe4489a884bc0da0c3278c552bd4ffe06a1ace559db5ef02ef24ab446b/propcache-0.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:e53af8cb6a781b02d2ea079b5b853ba9430fcbe18a8e3ce647d5982a3ff69f39", size = 41500, upload-time = "2025-06-09T22:54:29.4Z" }, + { url = "https://files.pythonhosted.org/packages/a8/42/9ca01b0a6f48e81615dca4765a8f1dd2c057e0540f6116a27dc5ee01dfb6/propcache-0.3.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:8de106b6c84506b31c27168582cd3cb3000a6412c16df14a8628e5871ff83c10", size = 73674, upload-time = "2025-06-09T22:54:30.551Z" }, + { url = "https://files.pythonhosted.org/packages/af/6e/21293133beb550f9c901bbece755d582bfaf2176bee4774000bd4dd41884/propcache-0.3.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:28710b0d3975117239c76600ea351934ac7b5ff56e60953474342608dbbb6154", size = 43570, upload-time = "2025-06-09T22:54:32.296Z" }, + { url = "https://files.pythonhosted.org/packages/0c/c8/0393a0a3a2b8760eb3bde3c147f62b20044f0ddac81e9d6ed7318ec0d852/propcache-0.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce26862344bdf836650ed2487c3d724b00fbfec4233a1013f597b78c1cb73615", size = 43094, upload-time = "2025-06-09T22:54:33.929Z" }, + { url = "https://files.pythonhosted.org/packages/37/2c/489afe311a690399d04a3e03b069225670c1d489eb7b044a566511c1c498/propcache-0.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bca54bd347a253af2cf4544bbec232ab982f4868de0dd684246b67a51bc6b1db", size = 226958, upload-time = "2025-06-09T22:54:35.186Z" }, + { url = "https://files.pythonhosted.org/packages/9d/ca/63b520d2f3d418c968bf596839ae26cf7f87bead026b6192d4da6a08c467/propcache-0.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:55780d5e9a2ddc59711d727226bb1ba83a22dd32f64ee15594b9392b1f544eb1", size = 234894, upload-time = "2025-06-09T22:54:36.708Z" }, + { url = "https://files.pythonhosted.org/packages/11/60/1d0ed6fff455a028d678df30cc28dcee7af77fa2b0e6962ce1df95c9a2a9/propcache-0.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:035e631be25d6975ed87ab23153db6a73426a48db688070d925aa27e996fe93c", size = 233672, upload-time = "2025-06-09T22:54:38.062Z" }, + { url = "https://files.pythonhosted.org/packages/37/7c/54fd5301ef38505ab235d98827207176a5c9b2aa61939b10a460ca53e123/propcache-0.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ee6f22b6eaa39297c751d0e80c0d3a454f112f5c6481214fcf4c092074cecd67", size = 224395, upload-time = "2025-06-09T22:54:39.634Z" }, + { url = "https://files.pythonhosted.org/packages/ee/1a/89a40e0846f5de05fdc6779883bf46ba980e6df4d2ff8fb02643de126592/propcache-0.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7ca3aee1aa955438c4dba34fc20a9f390e4c79967257d830f137bd5a8a32ed3b", size = 212510, 
upload-time = "2025-06-09T22:54:41.565Z" }, + { url = "https://files.pythonhosted.org/packages/5e/33/ca98368586c9566a6b8d5ef66e30484f8da84c0aac3f2d9aec6d31a11bd5/propcache-0.3.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7a4f30862869fa2b68380d677cc1c5fcf1e0f2b9ea0cf665812895c75d0ca3b8", size = 222949, upload-time = "2025-06-09T22:54:43.038Z" }, + { url = "https://files.pythonhosted.org/packages/ba/11/ace870d0aafe443b33b2f0b7efdb872b7c3abd505bfb4890716ad7865e9d/propcache-0.3.2-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:b77ec3c257d7816d9f3700013639db7491a434644c906a2578a11daf13176251", size = 217258, upload-time = "2025-06-09T22:54:44.376Z" }, + { url = "https://files.pythonhosted.org/packages/5b/d2/86fd6f7adffcfc74b42c10a6b7db721d1d9ca1055c45d39a1a8f2a740a21/propcache-0.3.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:cab90ac9d3f14b2d5050928483d3d3b8fb6b4018893fc75710e6aa361ecb2474", size = 213036, upload-time = "2025-06-09T22:54:46.243Z" }, + { url = "https://files.pythonhosted.org/packages/07/94/2d7d1e328f45ff34a0a284cf5a2847013701e24c2a53117e7c280a4316b3/propcache-0.3.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:0b504d29f3c47cf6b9e936c1852246c83d450e8e063d50562115a6be6d3a2535", size = 227684, upload-time = "2025-06-09T22:54:47.63Z" }, + { url = "https://files.pythonhosted.org/packages/b7/05/37ae63a0087677e90b1d14710e532ff104d44bc1efa3b3970fff99b891dc/propcache-0.3.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:ce2ac2675a6aa41ddb2a0c9cbff53780a617ac3d43e620f8fd77ba1c84dcfc06", size = 234562, upload-time = "2025-06-09T22:54:48.982Z" }, + { url = "https://files.pythonhosted.org/packages/a4/7c/3f539fcae630408d0bd8bf3208b9a647ccad10976eda62402a80adf8fc34/propcache-0.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:62b4239611205294cc433845b914131b2a1f03500ff3c1ed093ed216b82621e1", size = 222142, upload-time = "2025-06-09T22:54:50.424Z" }, + { url = "https://files.pythonhosted.org/packages/7c/d2/34b9eac8c35f79f8a962546b3e97e9d4b990c420ee66ac8255d5d9611648/propcache-0.3.2-cp312-cp312-win32.whl", hash = "sha256:df4a81b9b53449ebc90cc4deefb052c1dd934ba85012aa912c7ea7b7e38b60c1", size = 37711, upload-time = "2025-06-09T22:54:52.072Z" }, + { url = "https://files.pythonhosted.org/packages/19/61/d582be5d226cf79071681d1b46b848d6cb03d7b70af7063e33a2787eaa03/propcache-0.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:7046e79b989d7fe457bb755844019e10f693752d169076138abf17f31380800c", size = 41479, upload-time = "2025-06-09T22:54:53.234Z" }, + { url = "https://files.pythonhosted.org/packages/dc/d1/8c747fafa558c603c4ca19d8e20b288aa0c7cda74e9402f50f31eb65267e/propcache-0.3.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ca592ed634a73ca002967458187109265e980422116c0a107cf93d81f95af945", size = 71286, upload-time = "2025-06-09T22:54:54.369Z" }, + { url = "https://files.pythonhosted.org/packages/61/99/d606cb7986b60d89c36de8a85d58764323b3a5ff07770a99d8e993b3fa73/propcache-0.3.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9ecb0aad4020e275652ba3975740f241bd12a61f1a784df044cf7477a02bc252", size = 42425, upload-time = "2025-06-09T22:54:55.642Z" }, + { url = "https://files.pythonhosted.org/packages/8c/96/ef98f91bbb42b79e9bb82bdd348b255eb9d65f14dbbe3b1594644c4073f7/propcache-0.3.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7f08f1cc28bd2eade7a8a3d2954ccc673bb02062e3e7da09bc75d843386b342f", size = 41846, upload-time = "2025-06-09T22:54:57.246Z" }, + { url = 
"https://files.pythonhosted.org/packages/5b/ad/3f0f9a705fb630d175146cd7b1d2bf5555c9beaed54e94132b21aac098a6/propcache-0.3.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d1a342c834734edb4be5ecb1e9fb48cb64b1e2320fccbd8c54bf8da8f2a84c33", size = 208871, upload-time = "2025-06-09T22:54:58.975Z" }, + { url = "https://files.pythonhosted.org/packages/3a/38/2085cda93d2c8b6ec3e92af2c89489a36a5886b712a34ab25de9fbca7992/propcache-0.3.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8a544caaae1ac73f1fecfae70ded3e93728831affebd017d53449e3ac052ac1e", size = 215720, upload-time = "2025-06-09T22:55:00.471Z" }, + { url = "https://files.pythonhosted.org/packages/61/c1/d72ea2dc83ac7f2c8e182786ab0fc2c7bd123a1ff9b7975bee671866fe5f/propcache-0.3.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:310d11aa44635298397db47a3ebce7db99a4cc4b9bbdfcf6c98a60c8d5261cf1", size = 215203, upload-time = "2025-06-09T22:55:01.834Z" }, + { url = "https://files.pythonhosted.org/packages/af/81/b324c44ae60c56ef12007105f1460d5c304b0626ab0cc6b07c8f2a9aa0b8/propcache-0.3.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c1396592321ac83157ac03a2023aa6cc4a3cc3cfdecb71090054c09e5a7cce3", size = 206365, upload-time = "2025-06-09T22:55:03.199Z" }, + { url = "https://files.pythonhosted.org/packages/09/73/88549128bb89e66d2aff242488f62869014ae092db63ccea53c1cc75a81d/propcache-0.3.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8cabf5b5902272565e78197edb682017d21cf3b550ba0460ee473753f28d23c1", size = 196016, upload-time = "2025-06-09T22:55:04.518Z" }, + { url = "https://files.pythonhosted.org/packages/b9/3f/3bdd14e737d145114a5eb83cb172903afba7242f67c5877f9909a20d948d/propcache-0.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0a2f2235ac46a7aa25bdeb03a9e7060f6ecbd213b1f9101c43b3090ffb971ef6", size = 205596, upload-time = "2025-06-09T22:55:05.942Z" }, + { url = "https://files.pythonhosted.org/packages/0f/ca/2f4aa819c357d3107c3763d7ef42c03980f9ed5c48c82e01e25945d437c1/propcache-0.3.2-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:92b69e12e34869a6970fd2f3da91669899994b47c98f5d430b781c26f1d9f387", size = 200977, upload-time = "2025-06-09T22:55:07.792Z" }, + { url = "https://files.pythonhosted.org/packages/cd/4a/e65276c7477533c59085251ae88505caf6831c0e85ff8b2e31ebcbb949b1/propcache-0.3.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:54e02207c79968ebbdffc169591009f4474dde3b4679e16634d34c9363ff56b4", size = 197220, upload-time = "2025-06-09T22:55:09.173Z" }, + { url = "https://files.pythonhosted.org/packages/7c/54/fc7152e517cf5578278b242396ce4d4b36795423988ef39bb8cd5bf274c8/propcache-0.3.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4adfb44cb588001f68c5466579d3f1157ca07f7504fc91ec87862e2b8e556b88", size = 210642, upload-time = "2025-06-09T22:55:10.62Z" }, + { url = "https://files.pythonhosted.org/packages/b9/80/abeb4a896d2767bf5f1ea7b92eb7be6a5330645bd7fb844049c0e4045d9d/propcache-0.3.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:fd3e6019dc1261cd0291ee8919dd91fbab7b169bb76aeef6c716833a3f65d206", size = 212789, upload-time = "2025-06-09T22:55:12.029Z" }, + { url = "https://files.pythonhosted.org/packages/b3/db/ea12a49aa7b2b6d68a5da8293dcf50068d48d088100ac016ad92a6a780e6/propcache-0.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4c181cad81158d71c41a2bce88edce078458e2dd5ffee7eddd6b05da85079f43", size = 205880, upload-time = 
"2025-06-09T22:55:13.45Z" }, + { url = "https://files.pythonhosted.org/packages/d1/e5/9076a0bbbfb65d1198007059c65639dfd56266cf8e477a9707e4b1999ff4/propcache-0.3.2-cp313-cp313-win32.whl", hash = "sha256:8a08154613f2249519e549de2330cf8e2071c2887309a7b07fb56098f5170a02", size = 37220, upload-time = "2025-06-09T22:55:15.284Z" }, + { url = "https://files.pythonhosted.org/packages/d3/f5/b369e026b09a26cd77aa88d8fffd69141d2ae00a2abaaf5380d2603f4b7f/propcache-0.3.2-cp313-cp313-win_amd64.whl", hash = "sha256:e41671f1594fc4ab0a6dec1351864713cb3a279910ae8b58f884a88a0a632c05", size = 40678, upload-time = "2025-06-09T22:55:16.445Z" }, + { url = "https://files.pythonhosted.org/packages/a4/3a/6ece377b55544941a08d03581c7bc400a3c8cd3c2865900a68d5de79e21f/propcache-0.3.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:9a3cf035bbaf035f109987d9d55dc90e4b0e36e04bbbb95af3055ef17194057b", size = 76560, upload-time = "2025-06-09T22:55:17.598Z" }, + { url = "https://files.pythonhosted.org/packages/0c/da/64a2bb16418740fa634b0e9c3d29edff1db07f56d3546ca2d86ddf0305e1/propcache-0.3.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:156c03d07dc1323d8dacaa221fbe028c5c70d16709cdd63502778e6c3ccca1b0", size = 44676, upload-time = "2025-06-09T22:55:18.922Z" }, + { url = "https://files.pythonhosted.org/packages/36/7b/f025e06ea51cb72c52fb87e9b395cced02786610b60a3ed51da8af017170/propcache-0.3.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:74413c0ba02ba86f55cf60d18daab219f7e531620c15f1e23d95563f505efe7e", size = 44701, upload-time = "2025-06-09T22:55:20.106Z" }, + { url = "https://files.pythonhosted.org/packages/a4/00/faa1b1b7c3b74fc277f8642f32a4c72ba1d7b2de36d7cdfb676db7f4303e/propcache-0.3.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f066b437bb3fa39c58ff97ab2ca351db465157d68ed0440abecb21715eb24b28", size = 276934, upload-time = "2025-06-09T22:55:21.5Z" }, + { url = "https://files.pythonhosted.org/packages/74/ab/935beb6f1756e0476a4d5938ff44bf0d13a055fed880caf93859b4f1baf4/propcache-0.3.2-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f1304b085c83067914721e7e9d9917d41ad87696bf70f0bc7dee450e9c71ad0a", size = 278316, upload-time = "2025-06-09T22:55:22.918Z" }, + { url = "https://files.pythonhosted.org/packages/f8/9d/994a5c1ce4389610838d1caec74bdf0e98b306c70314d46dbe4fcf21a3e2/propcache-0.3.2-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab50cef01b372763a13333b4e54021bdcb291fc9a8e2ccb9c2df98be51bcde6c", size = 282619, upload-time = "2025-06-09T22:55:24.651Z" }, + { url = "https://files.pythonhosted.org/packages/2b/00/a10afce3d1ed0287cef2e09506d3be9822513f2c1e96457ee369adb9a6cd/propcache-0.3.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fad3b2a085ec259ad2c2842666b2a0a49dea8463579c606426128925af1ed725", size = 265896, upload-time = "2025-06-09T22:55:26.049Z" }, + { url = "https://files.pythonhosted.org/packages/2e/a8/2aa6716ffa566ca57c749edb909ad27884680887d68517e4be41b02299f3/propcache-0.3.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:261fa020c1c14deafd54c76b014956e2f86991af198c51139faf41c4d5e83892", size = 252111, upload-time = "2025-06-09T22:55:27.381Z" }, + { url = "https://files.pythonhosted.org/packages/36/4f/345ca9183b85ac29c8694b0941f7484bf419c7f0fea2d1e386b4f7893eed/propcache-0.3.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:46d7f8aa79c927e5f987ee3a80205c987717d3659f035c85cf0c3680526bdb44", size = 
268334, upload-time = "2025-06-09T22:55:28.747Z" }, + { url = "https://files.pythonhosted.org/packages/3e/ca/fcd54f78b59e3f97b3b9715501e3147f5340167733d27db423aa321e7148/propcache-0.3.2-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:6d8f3f0eebf73e3c0ff0e7853f68be638b4043c65a70517bb575eff54edd8dbe", size = 255026, upload-time = "2025-06-09T22:55:30.184Z" }, + { url = "https://files.pythonhosted.org/packages/8b/95/8e6a6bbbd78ac89c30c225210a5c687790e532ba4088afb8c0445b77ef37/propcache-0.3.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:03c89c1b14a5452cf15403e291c0ccd7751d5b9736ecb2c5bab977ad6c5bcd81", size = 250724, upload-time = "2025-06-09T22:55:31.646Z" }, + { url = "https://files.pythonhosted.org/packages/ee/b0/0dd03616142baba28e8b2d14ce5df6631b4673850a3d4f9c0f9dd714a404/propcache-0.3.2-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:0cc17efde71e12bbaad086d679ce575268d70bc123a5a71ea7ad76f70ba30bba", size = 268868, upload-time = "2025-06-09T22:55:33.209Z" }, + { url = "https://files.pythonhosted.org/packages/c5/98/2c12407a7e4fbacd94ddd32f3b1e3d5231e77c30ef7162b12a60e2dd5ce3/propcache-0.3.2-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:acdf05d00696bc0447e278bb53cb04ca72354e562cf88ea6f9107df8e7fd9770", size = 271322, upload-time = "2025-06-09T22:55:35.065Z" }, + { url = "https://files.pythonhosted.org/packages/35/91/9cb56efbb428b006bb85db28591e40b7736847b8331d43fe335acf95f6c8/propcache-0.3.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4445542398bd0b5d32df908031cb1b30d43ac848e20470a878b770ec2dcc6330", size = 265778, upload-time = "2025-06-09T22:55:36.45Z" }, + { url = "https://files.pythonhosted.org/packages/9a/4c/b0fe775a2bdd01e176b14b574be679d84fc83958335790f7c9a686c1f468/propcache-0.3.2-cp313-cp313t-win32.whl", hash = "sha256:f86e5d7cd03afb3a1db8e9f9f6eff15794e79e791350ac48a8c924e6f439f394", size = 41175, upload-time = "2025-06-09T22:55:38.436Z" }, + { url = "https://files.pythonhosted.org/packages/a4/ff/47f08595e3d9b5e149c150f88d9714574f1a7cbd89fe2817158a952674bf/propcache-0.3.2-cp313-cp313t-win_amd64.whl", hash = "sha256:9704bedf6e7cbe3c65eca4379a9b53ee6a83749f047808cbb5044d40d7d72198", size = 44857, upload-time = "2025-06-09T22:55:39.687Z" }, + { url = "https://files.pythonhosted.org/packages/cc/35/cc0aaecf278bb4575b8555f2b137de5ab821595ddae9da9d3cd1da4072c7/propcache-0.3.2-py3-none-any.whl", hash = "sha256:98f1ec44fb675f5052cccc8e609c46ed23a35a1cfd18545ad4e29002d858a43f", size = 12663, upload-time = "2025-06-09T22:56:04.484Z" }, +] + +[[package]] +name = "proto-plus" +version = "1.26.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f4/ac/87285f15f7cce6d4a008f33f1757fb5a13611ea8914eb58c3d0d26243468/proto_plus-1.26.1.tar.gz", hash = "sha256:21a515a4c4c0088a773899e23c7bbade3d18f9c66c73edd4c7ee3816bc96a012", size = 56142, upload-time = "2025-03-10T15:54:38.843Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4e/6d/280c4c2ce28b1593a19ad5239c8b826871fc6ec275c21afc8e1820108039/proto_plus-1.26.1-py3-none-any.whl", hash = "sha256:13285478c2dcf2abb829db158e1047e2f1e8d63a077d94263c2b88b043c75a66", size = 50163, upload-time = "2025-03-10T15:54:37.335Z" }, +] + +[[package]] +name = "protobuf" +version = "6.32.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fa/a4/cc17347aa2897568beece2e674674359f911d6fe21b0b8d6268cd42727ac/protobuf-6.32.1.tar.gz", hash = 
"sha256:ee2469e4a021474ab9baafea6cd070e5bf27c7d29433504ddea1a4ee5850f68d", size = 440635, upload-time = "2025-09-11T21:38:42.935Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c0/98/645183ea03ab3995d29086b8bf4f7562ebd3d10c9a4b14ee3f20d47cfe50/protobuf-6.32.1-cp310-abi3-win32.whl", hash = "sha256:a8a32a84bc9f2aad712041b8b366190f71dde248926da517bde9e832e4412085", size = 424411, upload-time = "2025-09-11T21:38:27.427Z" }, + { url = "https://files.pythonhosted.org/packages/8c/f3/6f58f841f6ebafe076cebeae33fc336e900619d34b1c93e4b5c97a81fdfa/protobuf-6.32.1-cp310-abi3-win_amd64.whl", hash = "sha256:b00a7d8c25fa471f16bc8153d0e53d6c9e827f0953f3c09aaa4331c718cae5e1", size = 435738, upload-time = "2025-09-11T21:38:30.959Z" }, + { url = "https://files.pythonhosted.org/packages/10/56/a8a3f4e7190837139e68c7002ec749190a163af3e330f65d90309145a210/protobuf-6.32.1-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:d8c7e6eb619ffdf105ee4ab76af5a68b60a9d0f66da3ea12d1640e6d8dab7281", size = 426454, upload-time = "2025-09-11T21:38:34.076Z" }, + { url = "https://files.pythonhosted.org/packages/3f/be/8dd0a927c559b37d7a6c8ab79034fd167dcc1f851595f2e641ad62be8643/protobuf-6.32.1-cp39-abi3-manylinux2014_aarch64.whl", hash = "sha256:2f5b80a49e1eb7b86d85fcd23fe92df154b9730a725c3b38c4e43b9d77018bf4", size = 322874, upload-time = "2025-09-11T21:38:35.509Z" }, + { url = "https://files.pythonhosted.org/packages/5c/f6/88d77011b605ef979aace37b7703e4eefad066f7e84d935e5a696515c2dd/protobuf-6.32.1-cp39-abi3-manylinux2014_x86_64.whl", hash = "sha256:b1864818300c297265c83a4982fd3169f97122c299f56a56e2445c3698d34710", size = 322013, upload-time = "2025-09-11T21:38:37.017Z" }, + { url = "https://files.pythonhosted.org/packages/97/b7/15cc7d93443d6c6a84626ae3258a91f4c6ac8c0edd5df35ea7658f71b79c/protobuf-6.32.1-py3-none-any.whl", hash = "sha256:2601b779fc7d32a866c6b4404f9d42a3f67c5b9f3f15b4db3cccabe06b95c346", size = 169289, upload-time = "2025-09-11T21:38:41.234Z" }, +] + +[[package]] +name = "psutil" +version = "7.0.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/2a/80/336820c1ad9286a4ded7e845b2eccfcb27851ab8ac6abece774a6ff4d3de/psutil-7.0.0.tar.gz", hash = "sha256:7be9c3eba38beccb6495ea33afd982a44074b78f28c434a1f51cc07fd315c456", size = 497003, upload-time = "2025-02-13T21:54:07.946Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ed/e6/2d26234410f8b8abdbf891c9da62bee396583f713fb9f3325a4760875d22/psutil-7.0.0-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:101d71dc322e3cffd7cea0650b09b3d08b8e7c4109dd6809fe452dfd00e58b25", size = 238051, upload-time = "2025-02-13T21:54:12.36Z" }, + { url = "https://files.pythonhosted.org/packages/04/8b/30f930733afe425e3cbfc0e1468a30a18942350c1a8816acfade80c005c4/psutil-7.0.0-cp36-abi3-macosx_11_0_arm64.whl", hash = "sha256:39db632f6bb862eeccf56660871433e111b6ea58f2caea825571951d4b6aa3da", size = 239535, upload-time = "2025-02-13T21:54:16.07Z" }, + { url = "https://files.pythonhosted.org/packages/2a/ed/d362e84620dd22876b55389248e522338ed1bf134a5edd3b8231d7207f6d/psutil-7.0.0-cp36-abi3-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1fcee592b4c6f146991ca55919ea3d1f8926497a713ed7faaf8225e174581e91", size = 275004, upload-time = "2025-02-13T21:54:18.662Z" }, + { url = 
"https://files.pythonhosted.org/packages/bf/b9/b0eb3f3cbcb734d930fdf839431606844a825b23eaf9a6ab371edac8162c/psutil-7.0.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4b1388a4f6875d7e2aff5c4ca1cc16c545ed41dd8bb596cefea80111db353a34", size = 277986, upload-time = "2025-02-13T21:54:21.811Z" }, + { url = "https://files.pythonhosted.org/packages/eb/a2/709e0fe2f093556c17fbafda93ac032257242cabcc7ff3369e2cb76a97aa/psutil-7.0.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a5f098451abc2828f7dc6b58d44b532b22f2088f4999a937557b603ce72b1993", size = 279544, upload-time = "2025-02-13T21:54:24.68Z" }, + { url = "https://files.pythonhosted.org/packages/50/e6/eecf58810b9d12e6427369784efe814a1eec0f492084ce8eb8f4d89d6d61/psutil-7.0.0-cp37-abi3-win32.whl", hash = "sha256:ba3fcef7523064a6c9da440fc4d6bd07da93ac726b5733c29027d7dc95b39d99", size = 241053, upload-time = "2025-02-13T21:54:34.31Z" }, + { url = "https://files.pythonhosted.org/packages/50/1b/6921afe68c74868b4c9fa424dad3be35b095e16687989ebbb50ce4fceb7c/psutil-7.0.0-cp37-abi3-win_amd64.whl", hash = "sha256:4cf3d4eb1aa9b348dec30105c55cd9b7d4629285735a102beb4441e38db90553", size = 244885, upload-time = "2025-02-13T21:54:37.486Z" }, +] + +[[package]] +name = "pwdlib" +version = "0.2.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/82/a0/9daed437a6226f632a25d98d65d60ba02bdafa920c90dcb6454c611ead6c/pwdlib-0.2.1.tar.gz", hash = "sha256:9a1d8a8fa09a2f7ebf208265e55d7d008103cbdc82b9e4902ffdd1ade91add5e", size = 11699, upload-time = "2024-08-19T06:48:59.58Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/01/f3/0dae5078a486f0fdf4d4a1121e103bc42694a9da9bea7b0f2c63f29cfbd3/pwdlib-0.2.1-py3-none-any.whl", hash = "sha256:1823dc6f22eae472b540e889ecf57fd424051d6a4023ec0bcf7f0de2d9d7ef8c", size = 8082, upload-time = "2024-08-19T06:49:00.997Z" }, +] + +[package.optional-dependencies] +argon2 = [ + { name = "argon2-cffi" }, +] +bcrypt = [ + { name = "bcrypt" }, +] + +[[package]] +name = "pyarrow" +version = "21.0.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ef/c2/ea068b8f00905c06329a3dfcd40d0fcc2b7d0f2e355bdb25b65e0a0e4cd4/pyarrow-21.0.0.tar.gz", hash = "sha256:5051f2dccf0e283ff56335760cbc8622cf52264d67e359d5569541ac11b6d5bc", size = 1133487, upload-time = "2025-07-18T00:57:31.761Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/94/dc/80564a3071a57c20b7c32575e4a0120e8a330ef487c319b122942d665960/pyarrow-21.0.0-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:c077f48aab61738c237802836fc3844f85409a46015635198761b0d6a688f87b", size = 31243234, upload-time = "2025-07-18T00:55:03.812Z" }, + { url = "https://files.pythonhosted.org/packages/ea/cc/3b51cb2db26fe535d14f74cab4c79b191ed9a8cd4cbba45e2379b5ca2746/pyarrow-21.0.0-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:689f448066781856237eca8d1975b98cace19b8dd2ab6145bf49475478bcaa10", size = 32714370, upload-time = "2025-07-18T00:55:07.495Z" }, + { url = "https://files.pythonhosted.org/packages/24/11/a4431f36d5ad7d83b87146f515c063e4d07ef0b7240876ddb885e6b44f2e/pyarrow-21.0.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:479ee41399fcddc46159a551705b89c05f11e8b8cb8e968f7fec64f62d91985e", size = 41135424, upload-time = "2025-07-18T00:55:11.461Z" }, + { url = 
"https://files.pythonhosted.org/packages/74/dc/035d54638fc5d2971cbf1e987ccd45f1091c83bcf747281cf6cc25e72c88/pyarrow-21.0.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:40ebfcb54a4f11bcde86bc586cbd0272bac0d516cfa539c799c2453768477569", size = 42823810, upload-time = "2025-07-18T00:55:16.301Z" }, + { url = "https://files.pythonhosted.org/packages/2e/3b/89fced102448a9e3e0d4dded1f37fa3ce4700f02cdb8665457fcc8015f5b/pyarrow-21.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8d58d8497814274d3d20214fbb24abcad2f7e351474357d552a8d53bce70c70e", size = 43391538, upload-time = "2025-07-18T00:55:23.82Z" }, + { url = "https://files.pythonhosted.org/packages/fb/bb/ea7f1bd08978d39debd3b23611c293f64a642557e8141c80635d501e6d53/pyarrow-21.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:585e7224f21124dd57836b1530ac8f2df2afc43c861d7bf3d58a4870c42ae36c", size = 45120056, upload-time = "2025-07-18T00:55:28.231Z" }, + { url = "https://files.pythonhosted.org/packages/6e/0b/77ea0600009842b30ceebc3337639a7380cd946061b620ac1a2f3cb541e2/pyarrow-21.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:555ca6935b2cbca2c0e932bedd853e9bc523098c39636de9ad4693b5b1df86d6", size = 26220568, upload-time = "2025-07-18T00:55:32.122Z" }, + { url = "https://files.pythonhosted.org/packages/ca/d4/d4f817b21aacc30195cf6a46ba041dd1be827efa4a623cc8bf39a1c2a0c0/pyarrow-21.0.0-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:3a302f0e0963db37e0a24a70c56cf91a4faa0bca51c23812279ca2e23481fccd", size = 31160305, upload-time = "2025-07-18T00:55:35.373Z" }, + { url = "https://files.pythonhosted.org/packages/a2/9c/dcd38ce6e4b4d9a19e1d36914cb8e2b1da4e6003dd075474c4cfcdfe0601/pyarrow-21.0.0-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:b6b27cf01e243871390474a211a7922bfbe3bda21e39bc9160daf0da3fe48876", size = 32684264, upload-time = "2025-07-18T00:55:39.303Z" }, + { url = "https://files.pythonhosted.org/packages/4f/74/2a2d9f8d7a59b639523454bec12dba35ae3d0a07d8ab529dc0809f74b23c/pyarrow-21.0.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:e72a8ec6b868e258a2cd2672d91f2860ad532d590ce94cdf7d5e7ec674ccf03d", size = 41108099, upload-time = "2025-07-18T00:55:42.889Z" }, + { url = "https://files.pythonhosted.org/packages/ad/90/2660332eeb31303c13b653ea566a9918484b6e4d6b9d2d46879a33ab0622/pyarrow-21.0.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:b7ae0bbdc8c6674259b25bef5d2a1d6af5d39d7200c819cf99e07f7dfef1c51e", size = 42829529, upload-time = "2025-07-18T00:55:47.069Z" }, + { url = "https://files.pythonhosted.org/packages/33/27/1a93a25c92717f6aa0fca06eb4700860577d016cd3ae51aad0e0488ac899/pyarrow-21.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:58c30a1729f82d201627c173d91bd431db88ea74dcaa3885855bc6203e433b82", size = 43367883, upload-time = "2025-07-18T00:55:53.069Z" }, + { url = "https://files.pythonhosted.org/packages/05/d9/4d09d919f35d599bc05c6950095e358c3e15148ead26292dfca1fb659b0c/pyarrow-21.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:072116f65604b822a7f22945a7a6e581cfa28e3454fdcc6939d4ff6090126623", size = 45133802, upload-time = "2025-07-18T00:55:57.714Z" }, + { url = "https://files.pythonhosted.org/packages/71/30/f3795b6e192c3ab881325ffe172e526499eb3780e306a15103a2764916a2/pyarrow-21.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:cf56ec8b0a5c8c9d7021d6fd754e688104f9ebebf1bf4449613c9531f5346a18", size = 26203175, upload-time = "2025-07-18T00:56:01.364Z" }, + { url = 
"https://files.pythonhosted.org/packages/16/ca/c7eaa8e62db8fb37ce942b1ea0c6d7abfe3786ca193957afa25e71b81b66/pyarrow-21.0.0-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:e99310a4ebd4479bcd1964dff9e14af33746300cb014aa4a3781738ac63baf4a", size = 31154306, upload-time = "2025-07-18T00:56:04.42Z" }, + { url = "https://files.pythonhosted.org/packages/ce/e8/e87d9e3b2489302b3a1aea709aaca4b781c5252fcb812a17ab6275a9a484/pyarrow-21.0.0-cp313-cp313-macosx_12_0_x86_64.whl", hash = "sha256:d2fe8e7f3ce329a71b7ddd7498b3cfac0eeb200c2789bd840234f0dc271a8efe", size = 32680622, upload-time = "2025-07-18T00:56:07.505Z" }, + { url = "https://files.pythonhosted.org/packages/84/52/79095d73a742aa0aba370c7942b1b655f598069489ab387fe47261a849e1/pyarrow-21.0.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:f522e5709379d72fb3da7785aa489ff0bb87448a9dc5a75f45763a795a089ebd", size = 41104094, upload-time = "2025-07-18T00:56:10.994Z" }, + { url = "https://files.pythonhosted.org/packages/89/4b/7782438b551dbb0468892a276b8c789b8bbdb25ea5c5eb27faadd753e037/pyarrow-21.0.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:69cbbdf0631396e9925e048cfa5bce4e8c3d3b41562bbd70c685a8eb53a91e61", size = 42825576, upload-time = "2025-07-18T00:56:15.569Z" }, + { url = "https://files.pythonhosted.org/packages/b3/62/0f29de6e0a1e33518dec92c65be0351d32d7ca351e51ec5f4f837a9aab91/pyarrow-21.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:731c7022587006b755d0bdb27626a1a3bb004bb56b11fb30d98b6c1b4718579d", size = 43368342, upload-time = "2025-07-18T00:56:19.531Z" }, + { url = "https://files.pythonhosted.org/packages/90/c7/0fa1f3f29cf75f339768cc698c8ad4ddd2481c1742e9741459911c9ac477/pyarrow-21.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:dc56bc708f2d8ac71bd1dcb927e458c93cec10b98eb4120206a4091db7b67b99", size = 45131218, upload-time = "2025-07-18T00:56:23.347Z" }, + { url = "https://files.pythonhosted.org/packages/01/63/581f2076465e67b23bc5a37d4a2abff8362d389d29d8105832e82c9c811c/pyarrow-21.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:186aa00bca62139f75b7de8420f745f2af12941595bbbfa7ed3870ff63e25636", size = 26087551, upload-time = "2025-07-18T00:56:26.758Z" }, + { url = "https://files.pythonhosted.org/packages/c9/ab/357d0d9648bb8241ee7348e564f2479d206ebe6e1c47ac5027c2e31ecd39/pyarrow-21.0.0-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:a7a102574faa3f421141a64c10216e078df467ab9576684d5cd696952546e2da", size = 31290064, upload-time = "2025-07-18T00:56:30.214Z" }, + { url = "https://files.pythonhosted.org/packages/3f/8a/5685d62a990e4cac2043fc76b4661bf38d06efed55cf45a334b455bd2759/pyarrow-21.0.0-cp313-cp313t-macosx_12_0_x86_64.whl", hash = "sha256:1e005378c4a2c6db3ada3ad4c217b381f6c886f0a80d6a316fe586b90f77efd7", size = 32727837, upload-time = "2025-07-18T00:56:33.935Z" }, + { url = "https://files.pythonhosted.org/packages/fc/de/c0828ee09525c2bafefd3e736a248ebe764d07d0fd762d4f0929dbc516c9/pyarrow-21.0.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:65f8e85f79031449ec8706b74504a316805217b35b6099155dd7e227eef0d4b6", size = 41014158, upload-time = "2025-07-18T00:56:37.528Z" }, + { url = "https://files.pythonhosted.org/packages/6e/26/a2865c420c50b7a3748320b614f3484bfcde8347b2639b2b903b21ce6a72/pyarrow-21.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:3a81486adc665c7eb1a2bde0224cfca6ceaba344a82a971ef059678417880eb8", size = 42667885, upload-time = "2025-07-18T00:56:41.483Z" }, + { url = 
"https://files.pythonhosted.org/packages/0a/f9/4ee798dc902533159250fb4321267730bc0a107d8c6889e07c3add4fe3a5/pyarrow-21.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:fc0d2f88b81dcf3ccf9a6ae17f89183762c8a94a5bdcfa09e05cfe413acf0503", size = 43276625, upload-time = "2025-07-18T00:56:48.002Z" }, + { url = "https://files.pythonhosted.org/packages/5a/da/e02544d6997037a4b0d22d8e5f66bc9315c3671371a8b18c79ade1cefe14/pyarrow-21.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6299449adf89df38537837487a4f8d3bd91ec94354fdd2a7d30bc11c48ef6e79", size = 44951890, upload-time = "2025-07-18T00:56:52.568Z" }, + { url = "https://files.pythonhosted.org/packages/e5/4e/519c1bc1876625fe6b71e9a28287c43ec2f20f73c658b9ae1d485c0c206e/pyarrow-21.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:222c39e2c70113543982c6b34f3077962b44fca38c0bd9e68bb6781534425c10", size = 26371006, upload-time = "2025-07-18T00:56:56.379Z" }, +] + +[[package]] +name = "pyasn1" +version = "0.6.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ba/e9/01f1a64245b89f039897cb0130016d79f77d52669aae6ee7b159a6c4c018/pyasn1-0.6.1.tar.gz", hash = "sha256:6f580d2bdd84365380830acf45550f2511469f673cb4a5ae3857a3170128b034", size = 145322, upload-time = "2024-09-10T22:41:42.55Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c8/f1/d6a797abb14f6283c0ddff96bbdd46937f64122b8c925cab503dd37f8214/pyasn1-0.6.1-py3-none-any.whl", hash = "sha256:0d632f46f2ba09143da3a8afe9e33fb6f92fa2320ab7e886e2d0f7672af84629", size = 83135, upload-time = "2024-09-11T16:00:36.122Z" }, +] + +[[package]] +name = "pyasn1-modules" +version = "0.4.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyasn1" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/e9/e6/78ebbb10a8c8e4b61a59249394a4a594c1a7af95593dc933a349c8d00964/pyasn1_modules-0.4.2.tar.gz", hash = "sha256:677091de870a80aae844b1ca6134f54652fa2c8c5a52aa396440ac3106e941e6", size = 307892, upload-time = "2025-03-28T02:41:22.17Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/47/8d/d529b5d697919ba8c11ad626e835d4039be708a35b0d22de83a269a6682c/pyasn1_modules-0.4.2-py3-none-any.whl", hash = "sha256:29253a9207ce32b64c3ac6600edc75368f98473906e8fd1043bd6b5b1de2c14a", size = 181259, upload-time = "2025-03-28T02:41:19.028Z" }, +] + +[[package]] +name = "pycparser" +version = "2.23" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fe/cf/d2d3b9f5699fb1e4615c8e32ff220203e43b248e1dfcc6736ad9057731ca/pycparser-2.23.tar.gz", hash = "sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2", size = 173734, upload-time = "2025-09-09T13:23:47.91Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a0/e3/59cd50310fc9b59512193629e1984c1f95e5c8ae6e5d8c69532ccc65a7fe/pycparser-2.23-py3-none-any.whl", hash = "sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934", size = 118140, upload-time = "2025-09-09T13:23:46.651Z" }, +] + +[[package]] +name = "pydantic" +version = "2.11.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "annotated-types" }, + { name = "pydantic-core" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ff/5d/09a551ba512d7ca404d785072700d3f6727a02f6f3c24ecfd081c7cf0aa8/pydantic-2.11.9.tar.gz", hash = 
"sha256:6b8ffda597a14812a7975c90b82a8a2e777d9257aba3453f973acd3c032a18e2", size = 788495, upload-time = "2025-09-13T11:26:39.325Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3e/d3/108f2006987c58e76691d5ae5d200dd3e0f532cb4e5fa3560751c3a1feba/pydantic-2.11.9-py3-none-any.whl", hash = "sha256:c42dd626f5cfc1c6950ce6205ea58c93efa406da65f479dcb4029d5934857da2", size = 444855, upload-time = "2025-09-13T11:26:36.909Z" }, +] + +[package.optional-dependencies] +email = [ + { name = "email-validator" }, +] + +[[package]] +name = "pydantic-core" +version = "2.33.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ad/88/5f2260bdfae97aabf98f1778d43f69574390ad787afb646292a638c923d4/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc", size = 435195, upload-time = "2025-04-23T18:33:52.104Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3f/8d/71db63483d518cbbf290261a1fc2839d17ff89fce7089e08cad07ccfce67/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7", size = 2028584, upload-time = "2025-04-23T18:31:03.106Z" }, + { url = "https://files.pythonhosted.org/packages/24/2f/3cfa7244ae292dd850989f328722d2aef313f74ffc471184dc509e1e4e5a/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246", size = 1855071, upload-time = "2025-04-23T18:31:04.621Z" }, + { url = "https://files.pythonhosted.org/packages/b3/d3/4ae42d33f5e3f50dd467761304be2fa0a9417fbf09735bc2cce003480f2a/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f", size = 1897823, upload-time = "2025-04-23T18:31:06.377Z" }, + { url = "https://files.pythonhosted.org/packages/f4/f3/aa5976e8352b7695ff808599794b1fba2a9ae2ee954a3426855935799488/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc", size = 1983792, upload-time = "2025-04-23T18:31:07.93Z" }, + { url = "https://files.pythonhosted.org/packages/d5/7a/cda9b5a23c552037717f2b2a5257e9b2bfe45e687386df9591eff7b46d28/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de", size = 2136338, upload-time = "2025-04-23T18:31:09.283Z" }, + { url = "https://files.pythonhosted.org/packages/2b/9f/b8f9ec8dd1417eb9da784e91e1667d58a2a4a7b7b34cf4af765ef663a7e5/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a", size = 2730998, upload-time = "2025-04-23T18:31:11.7Z" }, + { url = "https://files.pythonhosted.org/packages/47/bc/cd720e078576bdb8255d5032c5d63ee5c0bf4b7173dd955185a1d658c456/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef", size = 2003200, upload-time = "2025-04-23T18:31:13.536Z" }, + { url = "https://files.pythonhosted.org/packages/ca/22/3602b895ee2cd29d11a2b349372446ae9727c32e78a94b3d588a40fdf187/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e", size = 2113890, upload-time = "2025-04-23T18:31:15.011Z" }, + { url = "https://files.pythonhosted.org/packages/ff/e6/e3c5908c03cf00d629eb38393a98fccc38ee0ce8ecce32f69fc7d7b558a7/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d", size = 2073359, upload-time = "2025-04-23T18:31:16.393Z" }, + { url = "https://files.pythonhosted.org/packages/12/e7/6a36a07c59ebefc8777d1ffdaf5ae71b06b21952582e4b07eba88a421c79/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30", size = 2245883, upload-time = "2025-04-23T18:31:17.892Z" }, + { url = "https://files.pythonhosted.org/packages/16/3f/59b3187aaa6cc0c1e6616e8045b284de2b6a87b027cce2ffcea073adf1d2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf", size = 2241074, upload-time = "2025-04-23T18:31:19.205Z" }, + { url = "https://files.pythonhosted.org/packages/e0/ed/55532bb88f674d5d8f67ab121a2a13c385df382de2a1677f30ad385f7438/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51", size = 1910538, upload-time = "2025-04-23T18:31:20.541Z" }, + { url = "https://files.pythonhosted.org/packages/fe/1b/25b7cccd4519c0b23c2dd636ad39d381abf113085ce4f7bec2b0dc755eb1/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab", size = 1952909, upload-time = "2025-04-23T18:31:22.371Z" }, + { url = "https://files.pythonhosted.org/packages/49/a9/d809358e49126438055884c4366a1f6227f0f84f635a9014e2deb9b9de54/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65", size = 1897786, upload-time = "2025-04-23T18:31:24.161Z" }, + { url = "https://files.pythonhosted.org/packages/18/8a/2b41c97f554ec8c71f2a8a5f85cb56a8b0956addfe8b0efb5b3d77e8bdc3/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc", size = 2009000, upload-time = "2025-04-23T18:31:25.863Z" }, + { url = "https://files.pythonhosted.org/packages/a1/02/6224312aacb3c8ecbaa959897af57181fb6cf3a3d7917fd44d0f2917e6f2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7", size = 1847996, upload-time = "2025-04-23T18:31:27.341Z" }, + { url = "https://files.pythonhosted.org/packages/d6/46/6dcdf084a523dbe0a0be59d054734b86a981726f221f4562aed313dbcb49/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025", size = 1880957, upload-time = "2025-04-23T18:31:28.956Z" }, + { url = "https://files.pythonhosted.org/packages/ec/6b/1ec2c03837ac00886ba8160ce041ce4e325b41d06a034adbef11339ae422/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011", size = 1964199, upload-time = "2025-04-23T18:31:31.025Z" }, + { url = "https://files.pythonhosted.org/packages/2d/1d/6bf34d6adb9debd9136bd197ca72642203ce9aaaa85cfcbfcf20f9696e83/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f", size = 2120296, upload-time = "2025-04-23T18:31:32.514Z" }, + { url = "https://files.pythonhosted.org/packages/e0/94/2bd0aaf5a591e974b32a9f7123f16637776c304471a0ab33cf263cf5591a/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88", size = 2676109, upload-time = "2025-04-23T18:31:33.958Z" }, + { url = "https://files.pythonhosted.org/packages/f9/41/4b043778cf9c4285d59742281a769eac371b9e47e35f98ad321349cc5d61/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1", size = 2002028, upload-time = "2025-04-23T18:31:39.095Z" }, + { url = "https://files.pythonhosted.org/packages/cb/d5/7bb781bf2748ce3d03af04d5c969fa1308880e1dca35a9bd94e1a96a922e/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b", size = 2100044, upload-time = "2025-04-23T18:31:41.034Z" }, + { url = "https://files.pythonhosted.org/packages/fe/36/def5e53e1eb0ad896785702a5bbfd25eed546cdcf4087ad285021a90ed53/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1", size = 2058881, upload-time = "2025-04-23T18:31:42.757Z" }, + { url = "https://files.pythonhosted.org/packages/01/6c/57f8d70b2ee57fc3dc8b9610315949837fa8c11d86927b9bb044f8705419/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6", size = 2227034, upload-time = "2025-04-23T18:31:44.304Z" }, + { url = "https://files.pythonhosted.org/packages/27/b9/9c17f0396a82b3d5cbea4c24d742083422639e7bb1d5bf600e12cb176a13/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea", size = 2234187, upload-time = "2025-04-23T18:31:45.891Z" }, + { url = "https://files.pythonhosted.org/packages/b0/6a/adf5734ffd52bf86d865093ad70b2ce543415e0e356f6cacabbc0d9ad910/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290", size = 1892628, upload-time = "2025-04-23T18:31:47.819Z" }, + { url = "https://files.pythonhosted.org/packages/43/e4/5479fecb3606c1368d496a825d8411e126133c41224c1e7238be58b87d7e/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2", size = 1955866, upload-time = "2025-04-23T18:31:49.635Z" }, + { url = "https://files.pythonhosted.org/packages/0d/24/8b11e8b3e2be9dd82df4b11408a67c61bb4dc4f8e11b5b0fc888b38118b5/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab", size = 1888894, upload-time = "2025-04-23T18:31:51.609Z" }, + { url = "https://files.pythonhosted.org/packages/46/8c/99040727b41f56616573a28771b1bfa08a3d3fe74d3d513f01251f79f172/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f", size = 2015688, upload-time = "2025-04-23T18:31:53.175Z" }, + { url = "https://files.pythonhosted.org/packages/3a/cc/5999d1eb705a6cefc31f0b4a90e9f7fc400539b1a1030529700cc1b51838/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = 
"sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6", size = 1844808, upload-time = "2025-04-23T18:31:54.79Z" }, + { url = "https://files.pythonhosted.org/packages/6f/5e/a0a7b8885c98889a18b6e376f344da1ef323d270b44edf8174d6bce4d622/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef", size = 1885580, upload-time = "2025-04-23T18:31:57.393Z" }, + { url = "https://files.pythonhosted.org/packages/3b/2a/953581f343c7d11a304581156618c3f592435523dd9d79865903272c256a/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a", size = 1973859, upload-time = "2025-04-23T18:31:59.065Z" }, + { url = "https://files.pythonhosted.org/packages/e6/55/f1a813904771c03a3f97f676c62cca0c0a4138654107c1b61f19c644868b/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916", size = 2120810, upload-time = "2025-04-23T18:32:00.78Z" }, + { url = "https://files.pythonhosted.org/packages/aa/c3/053389835a996e18853ba107a63caae0b9deb4a276c6b472931ea9ae6e48/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a", size = 2676498, upload-time = "2025-04-23T18:32:02.418Z" }, + { url = "https://files.pythonhosted.org/packages/eb/3c/f4abd740877a35abade05e437245b192f9d0ffb48bbbbd708df33d3cda37/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d", size = 2000611, upload-time = "2025-04-23T18:32:04.152Z" }, + { url = "https://files.pythonhosted.org/packages/59/a7/63ef2fed1837d1121a894d0ce88439fe3e3b3e48c7543b2a4479eb99c2bd/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56", size = 2107924, upload-time = "2025-04-23T18:32:06.129Z" }, + { url = "https://files.pythonhosted.org/packages/04/8f/2551964ef045669801675f1cfc3b0d74147f4901c3ffa42be2ddb1f0efc4/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5", size = 2063196, upload-time = "2025-04-23T18:32:08.178Z" }, + { url = "https://files.pythonhosted.org/packages/26/bd/d9602777e77fc6dbb0c7db9ad356e9a985825547dce5ad1d30ee04903918/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e", size = 2236389, upload-time = "2025-04-23T18:32:10.242Z" }, + { url = "https://files.pythonhosted.org/packages/42/db/0e950daa7e2230423ab342ae918a794964b053bec24ba8af013fc7c94846/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162", size = 2239223, upload-time = "2025-04-23T18:32:12.382Z" }, + { url = "https://files.pythonhosted.org/packages/58/4d/4f937099c545a8a17eb52cb67fe0447fd9a373b348ccfa9a87f141eeb00f/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849", size = 1900473, upload-time = "2025-04-23T18:32:14.034Z" }, + { url = 
"https://files.pythonhosted.org/packages/a0/75/4a0a9bac998d78d889def5e4ef2b065acba8cae8c93696906c3a91f310ca/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9", size = 1955269, upload-time = "2025-04-23T18:32:15.783Z" }, + { url = "https://files.pythonhosted.org/packages/f9/86/1beda0576969592f1497b4ce8e7bc8cbdf614c352426271b1b10d5f0aa64/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9", size = 1893921, upload-time = "2025-04-23T18:32:18.473Z" }, + { url = "https://files.pythonhosted.org/packages/a4/7d/e09391c2eebeab681df2b74bfe6c43422fffede8dc74187b2b0bf6fd7571/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac", size = 1806162, upload-time = "2025-04-23T18:32:20.188Z" }, + { url = "https://files.pythonhosted.org/packages/f1/3d/847b6b1fed9f8ed3bb95a9ad04fbd0b212e832d4f0f50ff4d9ee5a9f15cf/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5", size = 1981560, upload-time = "2025-04-23T18:32:22.354Z" }, + { url = "https://files.pythonhosted.org/packages/6f/9a/e73262f6c6656262b5fdd723ad90f518f579b7bc8622e43a942eec53c938/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9", size = 1935777, upload-time = "2025-04-23T18:32:25.088Z" }, + { url = "https://files.pythonhosted.org/packages/7b/27/d4ae6487d73948d6f20dddcd94be4ea43e74349b56eba82e9bdee2d7494c/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8", size = 2025200, upload-time = "2025-04-23T18:33:14.199Z" }, + { url = "https://files.pythonhosted.org/packages/f1/b8/b3cb95375f05d33801024079b9392a5ab45267a63400bf1866e7ce0f0de4/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593", size = 1859123, upload-time = "2025-04-23T18:33:16.555Z" }, + { url = "https://files.pythonhosted.org/packages/05/bc/0d0b5adeda59a261cd30a1235a445bf55c7e46ae44aea28f7bd6ed46e091/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612", size = 1892852, upload-time = "2025-04-23T18:33:18.513Z" }, + { url = "https://files.pythonhosted.org/packages/3e/11/d37bdebbda2e449cb3f519f6ce950927b56d62f0b84fd9cb9e372a26a3d5/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7", size = 2067484, upload-time = "2025-04-23T18:33:20.475Z" }, + { url = "https://files.pythonhosted.org/packages/8c/55/1f95f0a05ce72ecb02a8a8a1c3be0579bbc29b1d5ab68f1378b7bebc5057/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e", size = 2108896, upload-time = "2025-04-23T18:33:22.501Z" }, + { url = "https://files.pythonhosted.org/packages/53/89/2b2de6c81fa131f423246a9109d7b2a375e83968ad0800d6e57d0574629b/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8", size = 2069475, 
upload-time = "2025-04-23T18:33:24.528Z" }, + { url = "https://files.pythonhosted.org/packages/b8/e9/1f7efbe20d0b2b10f6718944b5d8ece9152390904f29a78e68d4e7961159/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf", size = 2239013, upload-time = "2025-04-23T18:33:26.621Z" }, + { url = "https://files.pythonhosted.org/packages/3c/b2/5309c905a93811524a49b4e031e9851a6b00ff0fb668794472ea7746b448/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb", size = 2238715, upload-time = "2025-04-23T18:33:28.656Z" }, + { url = "https://files.pythonhosted.org/packages/32/56/8a7ca5d2cd2cda1d245d34b1c9a942920a718082ae8e54e5f3e5a58b7add/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1", size = 2066757, upload-time = "2025-04-23T18:33:30.645Z" }, +] + +[[package]] +name = "pydantic-settings" +version = "2.10.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "python-dotenv" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/68/85/1ea668bbab3c50071ca613c6ab30047fb36ab0da1b92fa8f17bbc38fd36c/pydantic_settings-2.10.1.tar.gz", hash = "sha256:06f0062169818d0f5524420a360d632d5857b83cffd4d42fe29597807a1614ee", size = 172583, upload-time = "2025-06-24T13:26:46.841Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/58/f0/427018098906416f580e3cf1366d3b1abfb408a0652e9f31600c24a1903c/pydantic_settings-2.10.1-py3-none-any.whl", hash = "sha256:a60952460b99cf661dc25c29c0ef171721f98bfcb52ef8d9ea4c943d7c8cc796", size = 45235, upload-time = "2025-06-24T13:26:45.485Z" }, +] + +[[package]] +name = "pygments" +version = "2.19.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" }, +] + +[[package]] +name = "pyjwt" +version = "2.10.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e7/46/bd74733ff231675599650d3e47f361794b22ef3e3770998dda30d3b63726/pyjwt-2.10.1.tar.gz", hash = "sha256:3cc5772eb20009233caf06e9d8a0577824723b44e6648ee0a2aedb6cf9381953", size = 87785, upload-time = "2024-11-28T03:43:29.933Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/61/ad/689f02752eeec26aed679477e80e632ef1b682313be70793d798c1d5fc8f/PyJWT-2.10.1-py3-none-any.whl", hash = "sha256:dcdd193e30abefd5debf142f9adfcdd2b58004e644f25406ffaebd50bd98dacb", size = 22997, upload-time = "2024-11-28T03:43:27.893Z" }, +] + +[package.optional-dependencies] +crypto = [ + { name = "cryptography" }, +] + +[[package]] +name = "pylance" +version = "0.36.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "numpy" }, + { name = "pyarrow" }, +] +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/09/13/f7f029d12a3dfdc9f3059d77b3999d40f9cc064ba85fef885a08bf65dcb2/pylance-0.36.0-cp39-abi3-macosx_10_15_x86_64.whl", hash = "sha256:160ed088dc5fb63a71c8c96640d43ea58464f64bca8aa23b0337b1a96fd47b79", size = 43403867, upload-time = "2025-09-12T20:29:25.507Z" }, + { url = "https://files.pythonhosted.org/packages/95/95/defad18786260653b33d5ef8223736c0e481861c8d33311756bd471468ad/pylance-0.36.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:ce43ad002b4e67ffb1a33925d05d472bbde77c57a5e84aca1728faa9ace0c086", size = 39777498, upload-time = "2025-09-12T20:27:02.906Z" }, + { url = "https://files.pythonhosted.org/packages/19/33/7080ed4e45648d8c803a49cd5a206eb95176ef9dc06bff26748ec2109c65/pylance-0.36.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ad7b168b0d4b7864be6040bebaf6d9a3959e76a190ff401a84b165b75eade96", size = 41819489, upload-time = "2025-09-12T20:17:06.37Z" }, + { url = "https://files.pythonhosted.org/packages/29/9a/0c572994d96e03e70481dafb2b062033a9ce24beb5ac6045f00f013ca57c/pylance-0.36.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:353deeb7b19be505db490258b5f2fc897efd4a45255fa0d51455662e01ad59ab", size = 45366480, upload-time = "2025-09-12T20:19:53.924Z" }, + { url = "https://files.pythonhosted.org/packages/fe/82/a74f0436b6a983c2798d1f44699352cd98c42bc335781ece98a878cf63fb/pylance-0.36.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:9cd963fc22257591d1daf281fa2369e05299d78950cb11980aa099d7cbacdf00", size = 41833322, upload-time = "2025-09-12T20:17:40.784Z" }, + { url = "https://files.pythonhosted.org/packages/a8/f2/d28fa3487992c3bd46af6838da13cf9a00be24fcf4cf928f77feec52d8d6/pylance-0.36.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:40117569a87379e08ed12eccac658999158f81df946f2ed02693b77776b57597", size = 45347065, upload-time = "2025-09-12T20:19:26.435Z" }, + { url = "https://files.pythonhosted.org/packages/ff/ab/e7fc302950f1c6815a6e832d052d0860130374bfe4bd482b075299dc8384/pylance-0.36.0-cp39-abi3-win_amd64.whl", hash = "sha256:a2930738192e5075220bc38c8a58ff4e48a71d53b3ca2a577ffce0318609cac0", size = 46348996, upload-time = "2025-09-12T20:36:04.663Z" }, +] + +[[package]] +name = "pympler" +version = "1.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pywin32", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/dd/37/c384631908029676d8e7213dd956bb686af303a80db7afbc9be36bc49495/pympler-1.1.tar.gz", hash = "sha256:1eaa867cb8992c218430f1708fdaccda53df064144d1c5656b1e6f1ee6000424", size = 179954, upload-time = "2024-06-28T19:56:06.563Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/79/4f/a6a2e2b202d7fd97eadfe90979845b8706676b41cbd3b42ba75adf329d1f/Pympler-1.1-py3-none-any.whl", hash = "sha256:5b223d6027d0619584116a0cbc28e8d2e378f7a79c1e5e024f9ff3b673c58506", size = 165766, upload-time = "2024-06-28T19:56:05.087Z" }, +] + +[[package]] +name = "pyparsing" +version = "3.2.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/98/c9/b4594e6a81371dfa9eb7a2c110ad682acf985d96115ae8b25a1d63b4bf3b/pyparsing-3.2.4.tar.gz", hash = "sha256:fff89494f45559d0f2ce46613b419f632bbb6afbdaed49696d322bcf98a58e99", size = 1098809, upload-time = "2025-09-13T05:47:19.732Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/53/b8/fbab973592e23ae313042d450fc26fa24282ebffba21ba373786e1ce63b4/pyparsing-3.2.4-py3-none-any.whl", 
hash = "sha256:91d0fcde680d42cd031daf3a6ba20da3107e08a75de50da58360e7d94ab24d36", size = 113869, upload-time = "2025-09-13T05:47:17.863Z" }, +] + +[[package]] +name = "pypdf" +version = "6.0.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/20/ac/a300a03c3b34967c050677ccb16e7a4b65607ee5df9d51e8b6d713de4098/pypdf-6.0.0.tar.gz", hash = "sha256:282a99d2cc94a84a3a3159f0d9358c0af53f85b4d28d76ea38b96e9e5ac2a08d", size = 5033827, upload-time = "2025-08-11T14:22:02.352Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2c/83/2cacc506eb322bb31b747bc06ccb82cc9aa03e19ee9c1245e538e49d52be/pypdf-6.0.0-py3-none-any.whl", hash = "sha256:56ea60100ce9f11fc3eec4f359da15e9aec3821b036c1f06d2b660d35683abb8", size = 310465, upload-time = "2025-08-11T14:22:00.481Z" }, +] + +[[package]] +name = "pyperclip" +version = "1.10.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/15/99/25f4898cf420efb6f45f519de018f4faea5391114a8618b16736ef3029f1/pyperclip-1.10.0.tar.gz", hash = "sha256:180c8346b1186921c75dfd14d9048a6b5d46bfc499778811952c6dd6eb1ca6be", size = 12193, upload-time = "2025-09-18T00:54:00.384Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/bc/22540e73c5f5ae18f02924cd3954a6c9a4aa6b713c841a94c98335d333a1/pyperclip-1.10.0-py3-none-any.whl", hash = "sha256:596fbe55dc59263bff26e61d2afbe10223e2fccb5210c9c96a28d6887cfcc7ec", size = 11062, upload-time = "2025-09-18T00:53:59.252Z" }, +] + +[[package]] +name = "pyreadline3" +version = "3.5.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/0f/49/4cea918a08f02817aabae639e3d0ac046fef9f9180518a3ad394e22da148/pyreadline3-3.5.4.tar.gz", hash = "sha256:8d57d53039a1c75adba8e50dd3d992b28143480816187ea5efbd5c78e6c885b7", size = 99839, upload-time = "2024-09-19T02:40:10.062Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5a/dc/491b7661614ab97483abf2056be1deee4dc2490ecbf7bff9ab5cdbac86e1/pyreadline3-3.5.4-py3-none-any.whl", hash = "sha256:eaf8e6cc3c49bcccf145fc6067ba8643d1df34d604a1ec0eccbf7a18e6d3fae6", size = 83178, upload-time = "2024-09-19T02:40:08.598Z" }, +] + +[[package]] +name = "pytest" +version = "8.4.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "pygments" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a3/5c/00a0e072241553e1a7496d638deababa67c5058571567b92a7eaa258397c/pytest-8.4.2.tar.gz", hash = "sha256:86c0d0b93306b961d58d62a4db4879f27fe25513d4b969df351abdddb3c30e01", size = 1519618, upload-time = "2025-09-04T14:34:22.711Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a8/a4/20da314d277121d6534b3a980b29035dcd51e6744bd79075a6ce8fa4eb8d/pytest-8.4.2-py3-none-any.whl", hash = "sha256:872f880de3fc3a5bdc88a11b39c9710c3497a547cfa9320bc3c5e62fbf272e79", size = 365750, upload-time = "2025-09-04T14:34:20.226Z" }, +] + +[[package]] +name = "pytest-asyncio" +version = "1.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pytest" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/42/86/9e3c5f48f7b7b638b216e4b9e645f54d199d7abbbab7a64a13b4e12ba10f/pytest_asyncio-1.2.0.tar.gz", hash = 
"sha256:c609a64a2a8768462d0c99811ddb8bd2583c33fd33cf7f21af1c142e824ffb57", size = 50119, upload-time = "2025-09-12T07:33:53.816Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/93/2fa34714b7a4ae72f2f8dad66ba17dd9a2c793220719e736dda28b7aec27/pytest_asyncio-1.2.0-py3-none-any.whl", hash = "sha256:8e17ae5e46d8e7efe51ab6494dd2010f4ca8dae51652aa3c8d55acf50bfb2e99", size = 15095, upload-time = "2025-09-12T07:33:52.639Z" }, +] + +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "six" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" }, +] + +[[package]] +name = "python-dotenv" +version = "1.1.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f6/b0/4bc07ccd3572a2f9df7e6782f52b0c6c90dcbb803ac4a167702d7d0dfe1e/python_dotenv-1.1.1.tar.gz", hash = "sha256:a8a6399716257f45be6a007360200409fce5cda2661e3dec71d23dc15f6189ab", size = 41978, upload-time = "2025-06-24T04:21:07.341Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5f/ed/539768cf28c661b5b068d66d96a2f155c4971a5d55684a514c1a0e0dec2f/python_dotenv-1.1.1-py3-none-any.whl", hash = "sha256:31f23644fe2602f88ff55e1f5c79ba497e01224ee7737937930c448e4d0e24dc", size = 20556, upload-time = "2025-06-24T04:21:06.073Z" }, +] + +[[package]] +name = "python-magic-bin" +version = "0.4.14" +source = { registry = "https://pypi.org/simple" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5a/5d/10b9ac745d9fd2f7151a2ab901e6bb6983dbd70e87c71111f54859d1ca2e/python_magic_bin-0.4.14-py2.py3-none-win32.whl", hash = "sha256:34a788c03adde7608028203e2dbb208f1f62225ad91518787ae26d603ae68892", size = 397784, upload-time = "2017-10-02T16:30:15.806Z" }, + { url = "https://files.pythonhosted.org/packages/07/c2/094e3d62b906d952537196603a23aec4bcd7c6126bf80eb14e6f9f4be3a2/python_magic_bin-0.4.14-py2.py3-none-win_amd64.whl", hash = "sha256:90be6206ad31071a36065a2fc169c5afb5e0355cbe6030e87641c6c62edc2b69", size = 409299, upload-time = "2017-10-02T16:30:18.545Z" }, +] + +[[package]] +name = "python-multipart" +version = "0.0.20" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f3/87/f44d7c9f274c7ee665a29b885ec97089ec5dc034c7f3fafa03da9e39a09e/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13", size = 37158, upload-time = "2024-12-16T19:45:46.972Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546, upload-time = "2024-12-16T19:45:44.423Z" }, +] + +[[package]] +name = "pytz" +version = "2025.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/f8/bf/abbd3cdfb8fbc7fb3d4d38d320f2441b1e7cbe29be4f23797b4a2b5d8aac/pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3", size = 320884, upload-time = "2025-03-25T02:25:00.538Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/81/c4/34e93fe5f5429d7570ec1fa436f1986fb1f00c3e0f43a589fe2bbcd22c3f/pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00", size = 509225, upload-time = "2025-03-25T02:24:58.468Z" }, +] + +[[package]] +name = "pywin32" +version = "311" +source = { registry = "https://pypi.org/simple" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7c/af/449a6a91e5d6db51420875c54f6aff7c97a86a3b13a0b4f1a5c13b988de3/pywin32-311-cp311-cp311-win32.whl", hash = "sha256:184eb5e436dea364dcd3d2316d577d625c0351bf237c4e9a5fabbcfa5a58b151", size = 8697031, upload-time = "2025-07-14T20:13:13.266Z" }, + { url = "https://files.pythonhosted.org/packages/51/8f/9bb81dd5bb77d22243d33c8397f09377056d5c687aa6d4042bea7fbf8364/pywin32-311-cp311-cp311-win_amd64.whl", hash = "sha256:3ce80b34b22b17ccbd937a6e78e7225d80c52f5ab9940fe0506a1a16f3dab503", size = 9508308, upload-time = "2025-07-14T20:13:15.147Z" }, + { url = "https://files.pythonhosted.org/packages/44/7b/9c2ab54f74a138c491aba1b1cd0795ba61f144c711daea84a88b63dc0f6c/pywin32-311-cp311-cp311-win_arm64.whl", hash = "sha256:a733f1388e1a842abb67ffa8e7aad0e70ac519e09b0f6a784e65a136ec7cefd2", size = 8703930, upload-time = "2025-07-14T20:13:16.945Z" }, + { url = "https://files.pythonhosted.org/packages/e7/ab/01ea1943d4eba0f850c3c61e78e8dd59757ff815ff3ccd0a84de5f541f42/pywin32-311-cp312-cp312-win32.whl", hash = "sha256:750ec6e621af2b948540032557b10a2d43b0cee2ae9758c54154d711cc852d31", size = 8706543, upload-time = "2025-07-14T20:13:20.765Z" }, + { url = "https://files.pythonhosted.org/packages/d1/a8/a0e8d07d4d051ec7502cd58b291ec98dcc0c3fff027caad0470b72cfcc2f/pywin32-311-cp312-cp312-win_amd64.whl", hash = "sha256:b8c095edad5c211ff31c05223658e71bf7116daa0ecf3ad85f3201ea3190d067", size = 9495040, upload-time = "2025-07-14T20:13:22.543Z" }, + { url = "https://files.pythonhosted.org/packages/ba/3a/2ae996277b4b50f17d61f0603efd8253cb2d79cc7ae159468007b586396d/pywin32-311-cp312-cp312-win_arm64.whl", hash = "sha256:e286f46a9a39c4a18b319c28f59b61de793654af2f395c102b4f819e584b5852", size = 8710102, upload-time = "2025-07-14T20:13:24.682Z" }, + { url = "https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = "sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload-time = "2025-07-14T20:13:26.471Z" }, + { url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload-time = "2025-07-14T20:13:28.243Z" }, + { url = "https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload-time = "2025-07-14T20:13:30.348Z" }, + { url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = 
"sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload-time = "2025-07-14T20:13:32.449Z" }, + { url = "https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload-time = "2025-07-14T20:13:34.312Z" }, + { url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload-time = "2025-07-14T20:13:36.379Z" }, +] + +[[package]] +name = "pyyaml" +version = "6.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631, upload-time = "2024-08-06T20:33:50.674Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612, upload-time = "2024-08-06T20:32:03.408Z" }, + { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040, upload-time = "2024-08-06T20:32:04.926Z" }, + { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829, upload-time = "2024-08-06T20:32:06.459Z" }, + { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167, upload-time = "2024-08-06T20:32:08.338Z" }, + { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952, upload-time = "2024-08-06T20:32:14.124Z" }, + { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301, upload-time = "2024-08-06T20:32:16.17Z" }, + { url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638, upload-time = "2024-08-06T20:32:18.555Z" }, + { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = 
"sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850, upload-time = "2024-08-06T20:32:19.889Z" }, + { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980, upload-time = "2024-08-06T20:32:21.273Z" }, + { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873, upload-time = "2024-08-06T20:32:25.131Z" }, + { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302, upload-time = "2024-08-06T20:32:26.511Z" }, + { url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154, upload-time = "2024-08-06T20:32:28.363Z" }, + { url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223, upload-time = "2024-08-06T20:32:30.058Z" }, + { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542, upload-time = "2024-08-06T20:32:31.881Z" }, + { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164, upload-time = "2024-08-06T20:32:37.083Z" }, + { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611, upload-time = "2024-08-06T20:32:38.898Z" }, + { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591, upload-time = "2024-08-06T20:32:40.241Z" }, + { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338, upload-time = "2024-08-06T20:32:41.93Z" }, + { url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309, upload-time = 
"2024-08-06T20:32:43.4Z" }, + { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679, upload-time = "2024-08-06T20:32:44.801Z" }, + { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428, upload-time = "2024-08-06T20:32:46.432Z" }, + { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361, upload-time = "2024-08-06T20:32:51.188Z" }, + { url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523, upload-time = "2024-08-06T20:32:53.019Z" }, + { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660, upload-time = "2024-08-06T20:32:54.708Z" }, + { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597, upload-time = "2024-08-06T20:32:56.985Z" }, + { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527, upload-time = "2024-08-06T20:33:03.001Z" }, + { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload-time = "2024-08-06T20:33:04.33Z" }, +] + +[[package]] +name = "rdflib" +version = "7.1.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyparsing" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/e8/7e/cb2d74466bd8495051ebe2d241b1cb1d4acf9740d481126aef19ef2697f5/rdflib-7.1.4.tar.gz", hash = "sha256:fed46e24f26a788e2ab8e445f7077f00edcf95abb73bcef4b86cefa8b62dd174", size = 4692745, upload-time = "2025-03-29T02:23:02.386Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f4/31/e9b6f04288dcd3fa60cb3179260d6dad81b92aef3063d679ac7d80a827ea/rdflib-7.1.4-py3-none-any.whl", hash = "sha256:72f4adb1990fa5241abd22ddaf36d7cafa5d91d9ff2ba13f3086d339b213d997", size = 565051, upload-time = "2025-03-29T02:22:44.987Z" }, +] + +[[package]] +name = "referencing" +version = "0.36.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs" }, + { name = "rpds-py" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] 
+sdist = { url = "https://files.pythonhosted.org/packages/2f/db/98b5c277be99dd18bfd91dd04e1b759cad18d1a338188c936e92f921c7e2/referencing-0.36.2.tar.gz", hash = "sha256:df2e89862cd09deabbdba16944cc3f10feb6b3e6f18e902f7cc25609a34775aa", size = 74744, upload-time = "2025-01-25T08:48:16.138Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c1/b1/3baf80dc6d2b7bc27a95a67752d0208e410351e3feb4eb78de5f77454d8d/referencing-0.36.2-py3-none-any.whl", hash = "sha256:e8699adbbf8b5c7de96d8ffa0eb5c158b3beafce084968e2ea8bb08c6794dcd0", size = 26775, upload-time = "2025-01-25T08:48:14.241Z" }, +] + +[[package]] +name = "regex" +version = "2025.9.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b2/5a/4c63457fbcaf19d138d72b2e9b39405954f98c0349b31c601bfcb151582c/regex-2025.9.1.tar.gz", hash = "sha256:88ac07b38d20b54d79e704e38aa3bd2c0f8027432164226bdee201a1c0c9c9ff", size = 400852, upload-time = "2025-09-01T22:10:10.479Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/06/4d/f741543c0c59f96c6625bc6c11fea1da2e378b7d293ffff6f318edc0ce14/regex-2025.9.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:e5bcf112b09bfd3646e4db6bf2e598534a17d502b0c01ea6550ba4eca780c5e6", size = 484811, upload-time = "2025-09-01T22:08:12.834Z" }, + { url = "https://files.pythonhosted.org/packages/c2/bd/27e73e92635b6fbd51afc26a414a3133243c662949cd1cda677fe7bb09bd/regex-2025.9.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:67a0295a3c31d675a9ee0238d20238ff10a9a2fdb7a1323c798fc7029578b15c", size = 288977, upload-time = "2025-09-01T22:08:14.499Z" }, + { url = "https://files.pythonhosted.org/packages/eb/7d/7dc0c6efc8bc93cd6e9b947581f5fde8a5dbaa0af7c4ec818c5729fdc807/regex-2025.9.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ea8267fbadc7d4bd7c1301a50e85c2ff0de293ff9452a1a9f8d82c6cafe38179", size = 286606, upload-time = "2025-09-01T22:08:15.881Z" }, + { url = "https://files.pythonhosted.org/packages/d1/01/9b5c6dd394f97c8f2c12f6e8f96879c9ac27292a718903faf2e27a0c09f6/regex-2025.9.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6aeff21de7214d15e928fb5ce757f9495214367ba62875100d4c18d293750cc1", size = 792436, upload-time = "2025-09-01T22:08:17.38Z" }, + { url = "https://files.pythonhosted.org/packages/fc/24/b7430cfc6ee34bbb3db6ff933beb5e7692e5cc81e8f6f4da63d353566fb0/regex-2025.9.1-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:d89f1bbbbbc0885e1c230f7770d5e98f4f00b0ee85688c871d10df8b184a6323", size = 858705, upload-time = "2025-09-01T22:08:19.037Z" }, + { url = "https://files.pythonhosted.org/packages/d6/98/155f914b4ea6ae012663188545c4f5216c11926d09b817127639d618b003/regex-2025.9.1-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ca3affe8ddea498ba9d294ab05f5f2d3b5ad5d515bc0d4a9016dd592a03afe52", size = 905881, upload-time = "2025-09-01T22:08:20.377Z" }, + { url = "https://files.pythonhosted.org/packages/8a/a7/a470e7bc8259c40429afb6d6a517b40c03f2f3e455c44a01abc483a1c512/regex-2025.9.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:91892a7a9f0a980e4c2c85dd19bc14de2b219a3a8867c4b5664b9f972dcc0c78", size = 798968, upload-time = "2025-09-01T22:08:22.081Z" }, + { url = "https://files.pythonhosted.org/packages/1d/fa/33f6fec4d41449fea5f62fdf5e46d668a1c046730a7f4ed9f478331a8e3a/regex-2025.9.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = 
"sha256:e1cb40406f4ae862710615f9f636c1e030fd6e6abe0e0f65f6a695a2721440c6", size = 781884, upload-time = "2025-09-01T22:08:23.832Z" }, + { url = "https://files.pythonhosted.org/packages/42/de/2b45f36ab20da14eedddf5009d370625bc5942d9953fa7e5037a32d66843/regex-2025.9.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:94f6cff6f7e2149c7e6499a6ecd4695379eeda8ccbccb9726e8149f2fe382e92", size = 852935, upload-time = "2025-09-01T22:08:25.536Z" }, + { url = "https://files.pythonhosted.org/packages/1e/f9/878f4fc92c87e125e27aed0f8ee0d1eced9b541f404b048f66f79914475a/regex-2025.9.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:6c0226fb322b82709e78c49cc33484206647f8a39954d7e9de1567f5399becd0", size = 844340, upload-time = "2025-09-01T22:08:27.141Z" }, + { url = "https://files.pythonhosted.org/packages/90/c2/5b6f2bce6ece5f8427c718c085eca0de4bbb4db59f54db77aa6557aef3e9/regex-2025.9.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a12f59c7c380b4fcf7516e9cbb126f95b7a9518902bcf4a852423ff1dcd03e6a", size = 787238, upload-time = "2025-09-01T22:08:28.75Z" }, + { url = "https://files.pythonhosted.org/packages/47/66/1ef1081c831c5b611f6f55f6302166cfa1bc9574017410ba5595353f846a/regex-2025.9.1-cp311-cp311-win32.whl", hash = "sha256:49865e78d147a7a4f143064488da5d549be6bfc3f2579e5044cac61f5c92edd4", size = 264118, upload-time = "2025-09-01T22:08:30.388Z" }, + { url = "https://files.pythonhosted.org/packages/ad/e0/8adc550d7169df1d6b9be8ff6019cda5291054a0107760c2f30788b6195f/regex-2025.9.1-cp311-cp311-win_amd64.whl", hash = "sha256:d34b901f6f2f02ef60f4ad3855d3a02378c65b094efc4b80388a3aeb700a5de7", size = 276151, upload-time = "2025-09-01T22:08:32.073Z" }, + { url = "https://files.pythonhosted.org/packages/cb/bd/46fef29341396d955066e55384fb93b0be7d64693842bf4a9a398db6e555/regex-2025.9.1-cp311-cp311-win_arm64.whl", hash = "sha256:47d7c2dab7e0b95b95fd580087b6ae196039d62306a592fa4e162e49004b6299", size = 268460, upload-time = "2025-09-01T22:08:33.281Z" }, + { url = "https://files.pythonhosted.org/packages/39/ef/a0372febc5a1d44c1be75f35d7e5aff40c659ecde864d7fa10e138f75e74/regex-2025.9.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:84a25164bd8dcfa9f11c53f561ae9766e506e580b70279d05a7946510bdd6f6a", size = 486317, upload-time = "2025-09-01T22:08:34.529Z" }, + { url = "https://files.pythonhosted.org/packages/b5/25/d64543fb7eb41a1024786d518cc57faf1ce64aa6e9ddba097675a0c2f1d2/regex-2025.9.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:645e88a73861c64c1af558dd12294fb4e67b5c1eae0096a60d7d8a2143a611c7", size = 289698, upload-time = "2025-09-01T22:08:36.162Z" }, + { url = "https://files.pythonhosted.org/packages/d8/dc/fbf31fc60be317bd9f6f87daa40a8a9669b3b392aa8fe4313df0a39d0722/regex-2025.9.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:10a450cba5cd5409526ee1d4449f42aad38dd83ac6948cbd6d7f71ca7018f7db", size = 287242, upload-time = "2025-09-01T22:08:37.794Z" }, + { url = "https://files.pythonhosted.org/packages/0f/74/f933a607a538f785da5021acf5323961b4620972e2c2f1f39b6af4b71db7/regex-2025.9.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e9dc5991592933a4192c166eeb67b29d9234f9c86344481173d1bc52f73a7104", size = 797441, upload-time = "2025-09-01T22:08:39.108Z" }, + { url = "https://files.pythonhosted.org/packages/89/d0/71fc49b4f20e31e97f199348b8c4d6e613e7b6a54a90eb1b090c2b8496d7/regex-2025.9.1-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = 
"sha256:a32291add816961aab472f4fad344c92871a2ee33c6c219b6598e98c1f0108f2", size = 862654, upload-time = "2025-09-01T22:08:40.586Z" }, + { url = "https://files.pythonhosted.org/packages/59/05/984edce1411a5685ba9abbe10d42cdd9450aab4a022271f9585539788150/regex-2025.9.1-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:588c161a68a383478e27442a678e3b197b13c5ba51dbba40c1ccb8c4c7bee9e9", size = 910862, upload-time = "2025-09-01T22:08:42.416Z" }, + { url = "https://files.pythonhosted.org/packages/b2/02/5c891bb5fe0691cc1bad336e3a94b9097fbcf9707ec8ddc1dce9f0397289/regex-2025.9.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:47829ffaf652f30d579534da9085fe30c171fa2a6744a93d52ef7195dc38218b", size = 801991, upload-time = "2025-09-01T22:08:44.072Z" }, + { url = "https://files.pythonhosted.org/packages/f1/ae/fd10d6ad179910f7a1b3e0a7fde1ef8bb65e738e8ac4fd6ecff3f52252e4/regex-2025.9.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1e978e5a35b293ea43f140c92a3269b6ab13fe0a2bf8a881f7ac740f5a6ade85", size = 786651, upload-time = "2025-09-01T22:08:46.079Z" }, + { url = "https://files.pythonhosted.org/packages/30/cf/9d686b07bbc5bf94c879cc168db92542d6bc9fb67088d03479fef09ba9d3/regex-2025.9.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4cf09903e72411f4bf3ac1eddd624ecfd423f14b2e4bf1c8b547b72f248b7bf7", size = 856556, upload-time = "2025-09-01T22:08:48.376Z" }, + { url = "https://files.pythonhosted.org/packages/91/9d/302f8a29bb8a49528abbab2d357a793e2a59b645c54deae0050f8474785b/regex-2025.9.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:d016b0f77be63e49613c9e26aaf4a242f196cd3d7a4f15898f5f0ab55c9b24d2", size = 849001, upload-time = "2025-09-01T22:08:50.067Z" }, + { url = "https://files.pythonhosted.org/packages/93/fa/b4c6dbdedc85ef4caec54c817cd5f4418dbfa2453214119f2538082bf666/regex-2025.9.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:656563e620de6908cd1c9d4f7b9e0777e3341ca7db9d4383bcaa44709c90281e", size = 788138, upload-time = "2025-09-01T22:08:51.933Z" }, + { url = "https://files.pythonhosted.org/packages/4a/1b/91ee17a3cbf87f81e8c110399279d0e57f33405468f6e70809100f2ff7d8/regex-2025.9.1-cp312-cp312-win32.whl", hash = "sha256:df33f4ef07b68f7ab637b1dbd70accbf42ef0021c201660656601e8a9835de45", size = 264524, upload-time = "2025-09-01T22:08:53.75Z" }, + { url = "https://files.pythonhosted.org/packages/92/28/6ba31cce05b0f1ec6b787921903f83bd0acf8efde55219435572af83c350/regex-2025.9.1-cp312-cp312-win_amd64.whl", hash = "sha256:5aba22dfbc60cda7c0853516104724dc904caa2db55f2c3e6e984eb858d3edf3", size = 275489, upload-time = "2025-09-01T22:08:55.037Z" }, + { url = "https://files.pythonhosted.org/packages/bd/ed/ea49f324db00196e9ef7fe00dd13c6164d5173dd0f1bbe495e61bb1fb09d/regex-2025.9.1-cp312-cp312-win_arm64.whl", hash = "sha256:ec1efb4c25e1849c2685fa95da44bfde1b28c62d356f9c8d861d4dad89ed56e9", size = 268589, upload-time = "2025-09-01T22:08:56.369Z" }, + { url = "https://files.pythonhosted.org/packages/98/25/b2959ce90c6138c5142fe5264ee1f9b71a0c502ca4c7959302a749407c79/regex-2025.9.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:bc6834727d1b98d710a63e6c823edf6ffbf5792eba35d3fa119531349d4142ef", size = 485932, upload-time = "2025-09-01T22:08:57.913Z" }, + { url = "https://files.pythonhosted.org/packages/49/2e/6507a2a85f3f2be6643438b7bd976e67ad73223692d6988eb1ff444106d3/regex-2025.9.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = 
"sha256:c3dc05b6d579875719bccc5f3037b4dc80433d64e94681a0061845bd8863c025", size = 289568, upload-time = "2025-09-01T22:08:59.258Z" }, + { url = "https://files.pythonhosted.org/packages/c7/d8/de4a4b57215d99868f1640e062a7907e185ec7476b4b689e2345487c1ff4/regex-2025.9.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:22213527df4c985ec4a729b055a8306272d41d2f45908d7bacb79be0fa7a75ad", size = 286984, upload-time = "2025-09-01T22:09:00.835Z" }, + { url = "https://files.pythonhosted.org/packages/03/15/e8cb403403a57ed316e80661db0e54d7aa2efcd85cb6156f33cc18746922/regex-2025.9.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8e3f6e3c5a5a1adc3f7ea1b5aec89abfc2f4fbfba55dafb4343cd1d084f715b2", size = 797514, upload-time = "2025-09-01T22:09:02.538Z" }, + { url = "https://files.pythonhosted.org/packages/e4/26/2446f2b9585fed61faaa7e2bbce3aca7dd8df6554c32addee4c4caecf24a/regex-2025.9.1-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:bcb89c02a0d6c2bec9b0bb2d8c78782699afe8434493bfa6b4021cc51503f249", size = 862586, upload-time = "2025-09-01T22:09:04.322Z" }, + { url = "https://files.pythonhosted.org/packages/fd/b8/82ffbe9c0992c31bbe6ae1c4b4e21269a5df2559102b90543c9b56724c3c/regex-2025.9.1-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b0e2f95413eb0c651cd1516a670036315b91b71767af83bc8525350d4375ccba", size = 910815, upload-time = "2025-09-01T22:09:05.978Z" }, + { url = "https://files.pythonhosted.org/packages/2f/d8/7303ea38911759c1ee30cc5bc623ee85d3196b733c51fd6703c34290a8d9/regex-2025.9.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:09a41dc039e1c97d3c2ed3e26523f748e58c4de3ea7a31f95e1cf9ff973fff5a", size = 802042, upload-time = "2025-09-01T22:09:07.865Z" }, + { url = "https://files.pythonhosted.org/packages/fc/0e/6ad51a55ed4b5af512bb3299a05d33309bda1c1d1e1808fa869a0bed31bc/regex-2025.9.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4f0b4258b161094f66857a26ee938d3fe7b8a5063861e44571215c44fbf0e5df", size = 786764, upload-time = "2025-09-01T22:09:09.362Z" }, + { url = "https://files.pythonhosted.org/packages/8d/d5/394e3ffae6baa5a9217bbd14d96e0e5da47bb069d0dbb8278e2681a2b938/regex-2025.9.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:bf70e18ac390e6977ea7e56f921768002cb0fa359c4199606c7219854ae332e0", size = 856557, upload-time = "2025-09-01T22:09:11.129Z" }, + { url = "https://files.pythonhosted.org/packages/cd/80/b288d3910c41194ad081b9fb4b371b76b0bbfdce93e7709fc98df27b37dc/regex-2025.9.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:b84036511e1d2bb0a4ff1aec26951caa2dea8772b223c9e8a19ed8885b32dbac", size = 849108, upload-time = "2025-09-01T22:09:12.877Z" }, + { url = "https://files.pythonhosted.org/packages/d1/cd/5ec76bf626d0d5abdc277b7a1734696f5f3d14fbb4a3e2540665bc305d85/regex-2025.9.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:c2e05dcdfe224047f2a59e70408274c325d019aad96227ab959403ba7d58d2d7", size = 788201, upload-time = "2025-09-01T22:09:14.561Z" }, + { url = "https://files.pythonhosted.org/packages/b5/36/674672f3fdead107565a2499f3007788b878188acec6d42bc141c5366c2c/regex-2025.9.1-cp313-cp313-win32.whl", hash = "sha256:3b9a62107a7441b81ca98261808fed30ae36ba06c8b7ee435308806bd53c1ed8", size = 264508, upload-time = "2025-09-01T22:09:16.193Z" }, + { url = 
"https://files.pythonhosted.org/packages/83/ad/931134539515eb64ce36c24457a98b83c1b2e2d45adf3254b94df3735a76/regex-2025.9.1-cp313-cp313-win_amd64.whl", hash = "sha256:b38afecc10c177eb34cfae68d669d5161880849ba70c05cbfbe409f08cc939d7", size = 275469, upload-time = "2025-09-01T22:09:17.462Z" }, + { url = "https://files.pythonhosted.org/packages/24/8c/96d34e61c0e4e9248836bf86d69cb224fd222f270fa9045b24e218b65604/regex-2025.9.1-cp313-cp313-win_arm64.whl", hash = "sha256:ec329890ad5e7ed9fc292858554d28d58d56bf62cf964faf0aa57964b21155a0", size = 268586, upload-time = "2025-09-01T22:09:18.948Z" }, + { url = "https://files.pythonhosted.org/packages/21/b1/453cbea5323b049181ec6344a803777914074b9726c9c5dc76749966d12d/regex-2025.9.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:72fb7a016467d364546f22b5ae86c45680a4e0de6b2a6f67441d22172ff641f1", size = 486111, upload-time = "2025-09-01T22:09:20.734Z" }, + { url = "https://files.pythonhosted.org/packages/f6/0e/92577f197bd2f7652c5e2857f399936c1876978474ecc5b068c6d8a79c86/regex-2025.9.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:c9527fa74eba53f98ad86be2ba003b3ebe97e94b6eb2b916b31b5f055622ef03", size = 289520, upload-time = "2025-09-01T22:09:22.249Z" }, + { url = "https://files.pythonhosted.org/packages/af/c6/b472398116cca7ea5a6c4d5ccd0fc543f7fd2492cb0c48d2852a11972f73/regex-2025.9.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c905d925d194c83a63f92422af7544ec188301451b292c8b487f0543726107ca", size = 287215, upload-time = "2025-09-01T22:09:23.657Z" }, + { url = "https://files.pythonhosted.org/packages/cf/11/f12ecb0cf9ca792a32bb92f758589a84149017467a544f2f6bfb45c0356d/regex-2025.9.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:74df7c74a63adcad314426b1f4ea6054a5ab25d05b0244f0c07ff9ce640fa597", size = 797855, upload-time = "2025-09-01T22:09:25.197Z" }, + { url = "https://files.pythonhosted.org/packages/46/88/bbb848f719a540fb5997e71310f16f0b33a92c5d4b4d72d4311487fff2a3/regex-2025.9.1-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4f6e935e98ea48c7a2e8be44494de337b57a204470e7f9c9c42f912c414cd6f5", size = 863363, upload-time = "2025-09-01T22:09:26.705Z" }, + { url = "https://files.pythonhosted.org/packages/54/a9/2321eb3e2838f575a78d48e03c1e83ea61bd08b74b7ebbdeca8abc50fc25/regex-2025.9.1-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:4a62d033cd9ebefc7c5e466731a508dfabee827d80b13f455de68a50d3c2543d", size = 910202, upload-time = "2025-09-01T22:09:28.906Z" }, + { url = "https://files.pythonhosted.org/packages/33/07/d1d70835d7d11b7e126181f316f7213c4572ecf5c5c97bdbb969fb1f38a2/regex-2025.9.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ef971ebf2b93bdc88d8337238be4dfb851cc97ed6808eb04870ef67589415171", size = 801808, upload-time = "2025-09-01T22:09:30.733Z" }, + { url = "https://files.pythonhosted.org/packages/13/d1/29e4d1bed514ef2bf3a4ead3cb8bb88ca8af94130239a4e68aa765c35b1c/regex-2025.9.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:d936a1db208bdca0eca1f2bb2c1ba1d8370b226785c1e6db76e32a228ffd0ad5", size = 786824, upload-time = "2025-09-01T22:09:32.61Z" }, + { url = "https://files.pythonhosted.org/packages/33/27/20d8ccb1bee460faaa851e6e7cc4cfe852a42b70caa1dca22721ba19f02f/regex-2025.9.1-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:7e786d9e4469698fc63815b8de08a89165a0aa851720eb99f5e0ea9d51dd2b6a", size = 857406, 
upload-time = "2025-09-01T22:09:34.117Z" }, + { url = "https://files.pythonhosted.org/packages/74/fe/60c6132262dc36430d51e0c46c49927d113d3a38c1aba6a26c7744c84cf3/regex-2025.9.1-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:6b81d7dbc5466ad2c57ce3a0ddb717858fe1a29535c8866f8514d785fdb9fc5b", size = 848593, upload-time = "2025-09-01T22:09:35.598Z" }, + { url = "https://files.pythonhosted.org/packages/cc/ae/2d4ff915622fabbef1af28387bf71e7f2f4944a348b8460d061e85e29bf0/regex-2025.9.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:cd4890e184a6feb0ef195338a6ce68906a8903a0f2eb7e0ab727dbc0a3156273", size = 787951, upload-time = "2025-09-01T22:09:37.139Z" }, + { url = "https://files.pythonhosted.org/packages/85/37/dc127703a9e715a284cc2f7dbdd8a9776fd813c85c126eddbcbdd1ca5fec/regex-2025.9.1-cp314-cp314-win32.whl", hash = "sha256:34679a86230e46164c9e0396b56cab13c0505972343880b9e705083cc5b8ec86", size = 269833, upload-time = "2025-09-01T22:09:39.245Z" }, + { url = "https://files.pythonhosted.org/packages/83/bf/4bed4d3d0570e16771defd5f8f15f7ea2311edcbe91077436d6908956c4a/regex-2025.9.1-cp314-cp314-win_amd64.whl", hash = "sha256:a1196e530a6bfa5f4bde029ac5b0295a6ecfaaffbfffede4bbaf4061d9455b70", size = 278742, upload-time = "2025-09-01T22:09:40.651Z" }, + { url = "https://files.pythonhosted.org/packages/cf/3e/7d7ac6fd085023312421e0d69dfabdfb28e116e513fadbe9afe710c01893/regex-2025.9.1-cp314-cp314-win_arm64.whl", hash = "sha256:f46d525934871ea772930e997d577d48c6983e50f206ff7b66d4ac5f8941e993", size = 271860, upload-time = "2025-09-01T22:09:42.413Z" }, +] + +[[package]] +name = "requests" +version = "2.32.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "charset-normalizer" }, + { name = "idna" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" }, +] + +[[package]] +name = "requirements-parser" +version = "0.13.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "packaging" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/95/96/fb6dbfebb524d5601d359a47c78fe7ba1eef90fc4096404aa60c9a906fbb/requirements_parser-0.13.0.tar.gz", hash = "sha256:0843119ca2cb2331de4eb31b10d70462e39ace698fd660a915c247d2301a4418", size = 22630, upload-time = "2025-05-21T13:42:05.464Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bd/60/50fbb6ffb35f733654466f1a90d162bcbea358adc3b0871339254fbc37b2/requirements_parser-0.13.0-py3-none-any.whl", hash = "sha256:2b3173faecf19ec5501971b7222d38f04cb45bb9d87d0ad629ca71e2e62ded14", size = 14782, upload-time = "2025-05-21T13:42:04.007Z" }, +] + +[[package]] +name = "rfc3339-validator" +version = "0.1.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "six" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/28/ea/a9387748e2d111c3c2b275ba970b735e04e15cdb1eb30693b6b5708c4dbd/rfc3339_validator-0.1.4.tar.gz", hash = 
"sha256:138a2abdf93304ad60530167e51d2dfb9549521a836871b88d7f4695d0022f6b", size = 5513, upload-time = "2021-05-12T16:37:54.178Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl", hash = "sha256:24f6ec1eda14ef823da9e36ec7113124b39c04d50a4d3d3a3c2859577e7791fa", size = 3490, upload-time = "2021-05-12T16:37:52.536Z" }, +] + +[[package]] +name = "rich" +version = "14.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markdown-it-py" }, + { name = "pygments" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/fe/75/af448d8e52bf1d8fa6a9d089ca6c07ff4453d86c65c145d0a300bb073b9b/rich-14.1.0.tar.gz", hash = "sha256:e497a48b844b0320d45007cdebfeaeed8db2a4f4bcf49f15e455cfc4af11eaa8", size = 224441, upload-time = "2025-07-25T07:32:58.125Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e3/30/3c4d035596d3cf444529e0b2953ad0466f6049528a879d27534700580395/rich-14.1.0-py3-none-any.whl", hash = "sha256:536f5f1785986d6dbdea3c75205c473f970777b4a0d6c6dd1b696aa05a3fa04f", size = 243368, upload-time = "2025-07-25T07:32:56.73Z" }, +] + +[[package]] +name = "rich-argparse" +version = "1.7.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "rich" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/71/a6/34460d81e5534f6d2fc8e8d91ff99a5835fdca53578eac89e4f37b3a7c6d/rich_argparse-1.7.1.tar.gz", hash = "sha256:d7a493cde94043e41ea68fb43a74405fa178de981bf7b800f7a3bd02ac5c27be", size = 38094, upload-time = "2025-05-25T20:20:35.335Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/31/f6/5fc0574af5379606ffd57a4b68ed88f9b415eb222047fe023aefcc00a648/rich_argparse-1.7.1-py3-none-any.whl", hash = "sha256:a8650b42e4a4ff72127837632fba6b7da40784842f08d7395eb67a9cbd7b4bf9", size = 25357, upload-time = "2025-05-25T20:20:33.793Z" }, +] + +[[package]] +name = "rich-rst" +version = "1.3.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "docutils" }, + { name = "rich" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b0/69/5514c3a87b5f10f09a34bb011bc0927bc12c596c8dae5915604e71abc386/rich_rst-1.3.1.tar.gz", hash = "sha256:fad46e3ba42785ea8c1785e2ceaa56e0ffa32dbe5410dec432f37e4107c4f383", size = 13839, upload-time = "2024-04-30T04:40:38.125Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fd/bc/cc4e3dbc5e7992398dcb7a8eda0cbcf4fb792a0cdb93f857b478bf3cf884/rich_rst-1.3.1-py3-none-any.whl", hash = "sha256:498a74e3896507ab04492d326e794c3ef76e7cda078703aa592d1853d91098c1", size = 11621, upload-time = "2024-04-30T04:40:32.619Z" }, +] + +[[package]] +name = "rpds-py" +version = "0.27.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e9/dd/2c0cbe774744272b0ae725f44032c77bdcab6e8bcf544bffa3b6e70c8dba/rpds_py-0.27.1.tar.gz", hash = "sha256:26a1c73171d10b7acccbded82bf6a586ab8203601e565badc74bbbf8bc5a10f8", size = 27479, upload-time = "2025-08-27T12:16:36.024Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b5/c1/7907329fbef97cbd49db6f7303893bd1dd5a4a3eae415839ffdfb0762cae/rpds_py-0.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:be898f271f851f68b318872ce6ebebbc62f303b654e43bf72683dbdc25b7c881", size = 371063, upload-time = "2025-08-27T12:12:47.856Z" }, + { url = 
"https://files.pythonhosted.org/packages/11/94/2aab4bc86228bcf7c48760990273653a4900de89c7537ffe1b0d6097ed39/rpds_py-0.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:62ac3d4e3e07b58ee0ddecd71d6ce3b1637de2d373501412df395a0ec5f9beb5", size = 353210, upload-time = "2025-08-27T12:12:49.187Z" }, + { url = "https://files.pythonhosted.org/packages/3a/57/f5eb3ecf434342f4f1a46009530e93fd201a0b5b83379034ebdb1d7c1a58/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4708c5c0ceb2d034f9991623631d3d23cb16e65c83736ea020cdbe28d57c0a0e", size = 381636, upload-time = "2025-08-27T12:12:50.492Z" }, + { url = "https://files.pythonhosted.org/packages/ae/f4/ef95c5945e2ceb5119571b184dd5a1cc4b8541bbdf67461998cfeac9cb1e/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:abfa1171a9952d2e0002aba2ad3780820b00cc3d9c98c6630f2e93271501f66c", size = 394341, upload-time = "2025-08-27T12:12:52.024Z" }, + { url = "https://files.pythonhosted.org/packages/5a/7e/4bd610754bf492d398b61725eb9598ddd5eb86b07d7d9483dbcd810e20bc/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4b507d19f817ebaca79574b16eb2ae412e5c0835542c93fe9983f1e432aca195", size = 523428, upload-time = "2025-08-27T12:12:53.779Z" }, + { url = "https://files.pythonhosted.org/packages/9f/e5/059b9f65a8c9149361a8b75094864ab83b94718344db511fd6117936ed2a/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:168b025f8fd8d8d10957405f3fdcef3dc20f5982d398f90851f4abc58c566c52", size = 402923, upload-time = "2025-08-27T12:12:55.15Z" }, + { url = "https://files.pythonhosted.org/packages/f5/48/64cabb7daced2968dd08e8a1b7988bf358d7bd5bcd5dc89a652f4668543c/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cb56c6210ef77caa58e16e8c17d35c63fe3f5b60fd9ba9d424470c3400bcf9ed", size = 384094, upload-time = "2025-08-27T12:12:57.194Z" }, + { url = "https://files.pythonhosted.org/packages/ae/e1/dc9094d6ff566bff87add8a510c89b9e158ad2ecd97ee26e677da29a9e1b/rpds_py-0.27.1-cp311-cp311-manylinux_2_31_riscv64.whl", hash = "sha256:d252f2d8ca0195faa707f8eb9368955760880b2b42a8ee16d382bf5dd807f89a", size = 401093, upload-time = "2025-08-27T12:12:58.985Z" }, + { url = "https://files.pythonhosted.org/packages/37/8e/ac8577e3ecdd5593e283d46907d7011618994e1d7ab992711ae0f78b9937/rpds_py-0.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6e5e54da1e74b91dbc7996b56640f79b195d5925c2b78efaa8c5d53e1d88edde", size = 417969, upload-time = "2025-08-27T12:13:00.367Z" }, + { url = "https://files.pythonhosted.org/packages/66/6d/87507430a8f74a93556fe55c6485ba9c259949a853ce407b1e23fea5ba31/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ffce0481cc6e95e5b3f0a47ee17ffbd234399e6d532f394c8dce320c3b089c21", size = 558302, upload-time = "2025-08-27T12:13:01.737Z" }, + { url = "https://files.pythonhosted.org/packages/3a/bb/1db4781ce1dda3eecc735e3152659a27b90a02ca62bfeea17aee45cc0fbc/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:a205fdfe55c90c2cd8e540ca9ceba65cbe6629b443bc05db1f590a3db8189ff9", size = 589259, upload-time = "2025-08-27T12:13:03.127Z" }, + { url = "https://files.pythonhosted.org/packages/7b/0e/ae1c8943d11a814d01b482e1f8da903f88047a962dff9bbdadf3bd6e6fd1/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:689fb5200a749db0415b092972e8eba85847c23885c8543a8b0f5c009b1a5948", size = 554983, upload-time = "2025-08-27T12:13:04.516Z" 
}, + { url = "https://files.pythonhosted.org/packages/b2/d5/0b2a55415931db4f112bdab072443ff76131b5ac4f4dc98d10d2d357eb03/rpds_py-0.27.1-cp311-cp311-win32.whl", hash = "sha256:3182af66048c00a075010bc7f4860f33913528a4b6fc09094a6e7598e462fe39", size = 217154, upload-time = "2025-08-27T12:13:06.278Z" }, + { url = "https://files.pythonhosted.org/packages/24/75/3b7ffe0d50dc86a6a964af0d1cc3a4a2cdf437cb7b099a4747bbb96d1819/rpds_py-0.27.1-cp311-cp311-win_amd64.whl", hash = "sha256:b4938466c6b257b2f5c4ff98acd8128ec36b5059e5c8f8372d79316b1c36bb15", size = 228627, upload-time = "2025-08-27T12:13:07.625Z" }, + { url = "https://files.pythonhosted.org/packages/8d/3f/4fd04c32abc02c710f09a72a30c9a55ea3cc154ef8099078fd50a0596f8e/rpds_py-0.27.1-cp311-cp311-win_arm64.whl", hash = "sha256:2f57af9b4d0793e53266ee4325535a31ba48e2f875da81a9177c9926dfa60746", size = 220998, upload-time = "2025-08-27T12:13:08.972Z" }, + { url = "https://files.pythonhosted.org/packages/bd/fe/38de28dee5df58b8198c743fe2bea0c785c6d40941b9950bac4cdb71a014/rpds_py-0.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:ae2775c1973e3c30316892737b91f9283f9908e3cc7625b9331271eaaed7dc90", size = 361887, upload-time = "2025-08-27T12:13:10.233Z" }, + { url = "https://files.pythonhosted.org/packages/7c/9a/4b6c7eedc7dd90986bf0fab6ea2a091ec11c01b15f8ba0a14d3f80450468/rpds_py-0.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2643400120f55c8a96f7c9d858f7be0c88d383cd4653ae2cf0d0c88f668073e5", size = 345795, upload-time = "2025-08-27T12:13:11.65Z" }, + { url = "https://files.pythonhosted.org/packages/6f/0e/e650e1b81922847a09cca820237b0edee69416a01268b7754d506ade11ad/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16323f674c089b0360674a4abd28d5042947d54ba620f72514d69be4ff64845e", size = 385121, upload-time = "2025-08-27T12:13:13.008Z" }, + { url = "https://files.pythonhosted.org/packages/1b/ea/b306067a712988e2bff00dcc7c8f31d26c29b6d5931b461aa4b60a013e33/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9a1f4814b65eacac94a00fc9a526e3fdafd78e439469644032032d0d63de4881", size = 398976, upload-time = "2025-08-27T12:13:14.368Z" }, + { url = "https://files.pythonhosted.org/packages/2c/0a/26dc43c8840cb8fe239fe12dbc8d8de40f2365e838f3d395835dde72f0e5/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ba32c16b064267b22f1850a34051121d423b6f7338a12b9459550eb2096e7ec", size = 525953, upload-time = "2025-08-27T12:13:15.774Z" }, + { url = "https://files.pythonhosted.org/packages/22/14/c85e8127b573aaf3a0cbd7fbb8c9c99e735a4a02180c84da2a463b766e9e/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e5c20f33fd10485b80f65e800bbe5f6785af510b9f4056c5a3c612ebc83ba6cb", size = 407915, upload-time = "2025-08-27T12:13:17.379Z" }, + { url = "https://files.pythonhosted.org/packages/ed/7b/8f4fee9ba1fb5ec856eb22d725a4efa3deb47f769597c809e03578b0f9d9/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:466bfe65bd932da36ff279ddd92de56b042f2266d752719beb97b08526268ec5", size = 386883, upload-time = "2025-08-27T12:13:18.704Z" }, + { url = "https://files.pythonhosted.org/packages/86/47/28fa6d60f8b74fcdceba81b272f8d9836ac0340570f68f5df6b41838547b/rpds_py-0.27.1-cp312-cp312-manylinux_2_31_riscv64.whl", hash = "sha256:41e532bbdcb57c92ba3be62c42e9f096431b4cf478da9bc3bc6ce5c38ab7ba7a", size = 405699, upload-time = "2025-08-27T12:13:20.089Z" }, + { url = 
"https://files.pythonhosted.org/packages/d0/fd/c5987b5e054548df56953a21fe2ebed51fc1ec7c8f24fd41c067b68c4a0a/rpds_py-0.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f149826d742b406579466283769a8ea448eed82a789af0ed17b0cd5770433444", size = 423713, upload-time = "2025-08-27T12:13:21.436Z" }, + { url = "https://files.pythonhosted.org/packages/ac/ba/3c4978b54a73ed19a7d74531be37a8bcc542d917c770e14d372b8daea186/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:80c60cfb5310677bd67cb1e85a1e8eb52e12529545441b43e6f14d90b878775a", size = 562324, upload-time = "2025-08-27T12:13:22.789Z" }, + { url = "https://files.pythonhosted.org/packages/b5/6c/6943a91768fec16db09a42b08644b960cff540c66aab89b74be6d4a144ba/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:7ee6521b9baf06085f62ba9c7a3e5becffbc32480d2f1b351559c001c38ce4c1", size = 593646, upload-time = "2025-08-27T12:13:24.122Z" }, + { url = "https://files.pythonhosted.org/packages/11/73/9d7a8f4be5f4396f011a6bb7a19fe26303a0dac9064462f5651ced2f572f/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a512c8263249a9d68cac08b05dd59d2b3f2061d99b322813cbcc14c3c7421998", size = 558137, upload-time = "2025-08-27T12:13:25.557Z" }, + { url = "https://files.pythonhosted.org/packages/6e/96/6772cbfa0e2485bcceef8071de7821f81aeac8bb45fbfd5542a3e8108165/rpds_py-0.27.1-cp312-cp312-win32.whl", hash = "sha256:819064fa048ba01b6dadc5116f3ac48610435ac9a0058bbde98e569f9e785c39", size = 221343, upload-time = "2025-08-27T12:13:26.967Z" }, + { url = "https://files.pythonhosted.org/packages/67/b6/c82f0faa9af1c6a64669f73a17ee0eeef25aff30bb9a1c318509efe45d84/rpds_py-0.27.1-cp312-cp312-win_amd64.whl", hash = "sha256:d9199717881f13c32c4046a15f024971a3b78ad4ea029e8da6b86e5aa9cf4594", size = 232497, upload-time = "2025-08-27T12:13:28.326Z" }, + { url = "https://files.pythonhosted.org/packages/e1/96/2817b44bd2ed11aebacc9251da03689d56109b9aba5e311297b6902136e2/rpds_py-0.27.1-cp312-cp312-win_arm64.whl", hash = "sha256:33aa65b97826a0e885ef6e278fbd934e98cdcfed80b63946025f01e2f5b29502", size = 222790, upload-time = "2025-08-27T12:13:29.71Z" }, + { url = "https://files.pythonhosted.org/packages/cc/77/610aeee8d41e39080c7e14afa5387138e3c9fa9756ab893d09d99e7d8e98/rpds_py-0.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:e4b9fcfbc021633863a37e92571d6f91851fa656f0180246e84cbd8b3f6b329b", size = 361741, upload-time = "2025-08-27T12:13:31.039Z" }, + { url = "https://files.pythonhosted.org/packages/3a/fc/c43765f201c6a1c60be2043cbdb664013def52460a4c7adace89d6682bf4/rpds_py-0.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1441811a96eadca93c517d08df75de45e5ffe68aa3089924f963c782c4b898cf", size = 345574, upload-time = "2025-08-27T12:13:32.902Z" }, + { url = "https://files.pythonhosted.org/packages/20/42/ee2b2ca114294cd9847d0ef9c26d2b0851b2e7e00bf14cc4c0b581df0fc3/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:55266dafa22e672f5a4f65019015f90336ed31c6383bd53f5e7826d21a0e0b83", size = 385051, upload-time = "2025-08-27T12:13:34.228Z" }, + { url = "https://files.pythonhosted.org/packages/fd/e8/1e430fe311e4799e02e2d1af7c765f024e95e17d651612425b226705f910/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d78827d7ac08627ea2c8e02c9e5b41180ea5ea1f747e9db0915e3adf36b62dcf", size = 398395, upload-time = "2025-08-27T12:13:36.132Z" }, + { url = 
"https://files.pythonhosted.org/packages/82/95/9dc227d441ff2670651c27a739acb2535ccaf8b351a88d78c088965e5996/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae92443798a40a92dc5f0b01d8a7c93adde0c4dc965310a29ae7c64d72b9fad2", size = 524334, upload-time = "2025-08-27T12:13:37.562Z" }, + { url = "https://files.pythonhosted.org/packages/87/01/a670c232f401d9ad461d9a332aa4080cd3cb1d1df18213dbd0d2a6a7ab51/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c46c9dd2403b66a2a3b9720ec4b74d4ab49d4fabf9f03dfdce2d42af913fe8d0", size = 407691, upload-time = "2025-08-27T12:13:38.94Z" }, + { url = "https://files.pythonhosted.org/packages/03/36/0a14aebbaa26fe7fab4780c76f2239e76cc95a0090bdb25e31d95c492fcd/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2efe4eb1d01b7f5f1939f4ef30ecea6c6b3521eec451fb93191bf84b2a522418", size = 386868, upload-time = "2025-08-27T12:13:40.192Z" }, + { url = "https://files.pythonhosted.org/packages/3b/03/8c897fb8b5347ff6c1cc31239b9611c5bf79d78c984430887a353e1409a1/rpds_py-0.27.1-cp313-cp313-manylinux_2_31_riscv64.whl", hash = "sha256:15d3b4d83582d10c601f481eca29c3f138d44c92187d197aff663a269197c02d", size = 405469, upload-time = "2025-08-27T12:13:41.496Z" }, + { url = "https://files.pythonhosted.org/packages/da/07/88c60edc2df74850d496d78a1fdcdc7b54360a7f610a4d50008309d41b94/rpds_py-0.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4ed2e16abbc982a169d30d1a420274a709949e2cbdef119fe2ec9d870b42f274", size = 422125, upload-time = "2025-08-27T12:13:42.802Z" }, + { url = "https://files.pythonhosted.org/packages/6b/86/5f4c707603e41b05f191a749984f390dabcbc467cf833769b47bf14ba04f/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a75f305c9b013289121ec0f1181931975df78738cdf650093e6b86d74aa7d8dd", size = 562341, upload-time = "2025-08-27T12:13:44.472Z" }, + { url = "https://files.pythonhosted.org/packages/b2/92/3c0cb2492094e3cd9baf9e49bbb7befeceb584ea0c1a8b5939dca4da12e5/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:67ce7620704745881a3d4b0ada80ab4d99df390838839921f99e63c474f82cf2", size = 592511, upload-time = "2025-08-27T12:13:45.898Z" }, + { url = "https://files.pythonhosted.org/packages/10/bb/82e64fbb0047c46a168faa28d0d45a7851cd0582f850b966811d30f67ad8/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9d992ac10eb86d9b6f369647b6a3f412fc0075cfd5d799530e84d335e440a002", size = 557736, upload-time = "2025-08-27T12:13:47.408Z" }, + { url = "https://files.pythonhosted.org/packages/00/95/3c863973d409210da7fb41958172c6b7dbe7fc34e04d3cc1f10bb85e979f/rpds_py-0.27.1-cp313-cp313-win32.whl", hash = "sha256:4f75e4bd8ab8db624e02c8e2fc4063021b58becdbe6df793a8111d9343aec1e3", size = 221462, upload-time = "2025-08-27T12:13:48.742Z" }, + { url = "https://files.pythonhosted.org/packages/ce/2c/5867b14a81dc217b56d95a9f2a40fdbc56a1ab0181b80132beeecbd4b2d6/rpds_py-0.27.1-cp313-cp313-win_amd64.whl", hash = "sha256:f9025faafc62ed0b75a53e541895ca272815bec18abe2249ff6501c8f2e12b83", size = 232034, upload-time = "2025-08-27T12:13:50.11Z" }, + { url = "https://files.pythonhosted.org/packages/c7/78/3958f3f018c01923823f1e47f1cc338e398814b92d83cd278364446fac66/rpds_py-0.27.1-cp313-cp313-win_arm64.whl", hash = "sha256:ed10dc32829e7d222b7d3b93136d25a406ba9788f6a7ebf6809092da1f4d279d", size = 222392, upload-time = "2025-08-27T12:13:52.587Z" }, + { url = 
"https://files.pythonhosted.org/packages/01/76/1cdf1f91aed5c3a7bf2eba1f1c4e4d6f57832d73003919a20118870ea659/rpds_py-0.27.1-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:92022bbbad0d4426e616815b16bc4127f83c9a74940e1ccf3cfe0b387aba0228", size = 358355, upload-time = "2025-08-27T12:13:54.012Z" }, + { url = "https://files.pythonhosted.org/packages/c3/6f/bf142541229374287604caf3bb2a4ae17f0a580798fd72d3b009b532db4e/rpds_py-0.27.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:47162fdab9407ec3f160805ac3e154df042e577dd53341745fc7fb3f625e6d92", size = 342138, upload-time = "2025-08-27T12:13:55.791Z" }, + { url = "https://files.pythonhosted.org/packages/1a/77/355b1c041d6be40886c44ff5e798b4e2769e497b790f0f7fd1e78d17e9a8/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb89bec23fddc489e5d78b550a7b773557c9ab58b7946154a10a6f7a214a48b2", size = 380247, upload-time = "2025-08-27T12:13:57.683Z" }, + { url = "https://files.pythonhosted.org/packages/d6/a4/d9cef5c3946ea271ce2243c51481971cd6e34f21925af2783dd17b26e815/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e48af21883ded2b3e9eb48cb7880ad8598b31ab752ff3be6457001d78f416723", size = 390699, upload-time = "2025-08-27T12:13:59.137Z" }, + { url = "https://files.pythonhosted.org/packages/3a/06/005106a7b8c6c1a7e91b73169e49870f4af5256119d34a361ae5240a0c1d/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6f5b7bd8e219ed50299e58551a410b64daafb5017d54bbe822e003856f06a802", size = 521852, upload-time = "2025-08-27T12:14:00.583Z" }, + { url = "https://files.pythonhosted.org/packages/e5/3e/50fb1dac0948e17a02eb05c24510a8fe12d5ce8561c6b7b7d1339ab7ab9c/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:08f1e20bccf73b08d12d804d6e1c22ca5530e71659e6673bce31a6bb71c1e73f", size = 402582, upload-time = "2025-08-27T12:14:02.034Z" }, + { url = "https://files.pythonhosted.org/packages/cb/b0/f4e224090dc5b0ec15f31a02d746ab24101dd430847c4d99123798661bfc/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dc5dceeaefcc96dc192e3a80bbe1d6c410c469e97bdd47494a7d930987f18b2", size = 384126, upload-time = "2025-08-27T12:14:03.437Z" }, + { url = "https://files.pythonhosted.org/packages/54/77/ac339d5f82b6afff1df8f0fe0d2145cc827992cb5f8eeb90fc9f31ef7a63/rpds_py-0.27.1-cp313-cp313t-manylinux_2_31_riscv64.whl", hash = "sha256:d76f9cc8665acdc0c9177043746775aa7babbf479b5520b78ae4002d889f5c21", size = 399486, upload-time = "2025-08-27T12:14:05.443Z" }, + { url = "https://files.pythonhosted.org/packages/d6/29/3e1c255eee6ac358c056a57d6d6869baa00a62fa32eea5ee0632039c50a3/rpds_py-0.27.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:134fae0e36022edad8290a6661edf40c023562964efea0cc0ec7f5d392d2aaef", size = 414832, upload-time = "2025-08-27T12:14:06.902Z" }, + { url = "https://files.pythonhosted.org/packages/3f/db/6d498b844342deb3fa1d030598db93937a9964fcf5cb4da4feb5f17be34b/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:eb11a4f1b2b63337cfd3b4d110af778a59aae51c81d195768e353d8b52f88081", size = 557249, upload-time = "2025-08-27T12:14:08.37Z" }, + { url = "https://files.pythonhosted.org/packages/60/f3/690dd38e2310b6f68858a331399b4d6dbb9132c3e8ef8b4333b96caf403d/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:13e608ac9f50a0ed4faec0e90ece76ae33b34c0e8656e3dceb9a7db994c692cd", size = 587356, upload-time = 
"2025-08-27T12:14:10.034Z" }, + { url = "https://files.pythonhosted.org/packages/86/e3/84507781cccd0145f35b1dc32c72675200c5ce8d5b30f813e49424ef68fc/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dd2135527aa40f061350c3f8f89da2644de26cd73e4de458e79606384f4f68e7", size = 555300, upload-time = "2025-08-27T12:14:11.783Z" }, + { url = "https://files.pythonhosted.org/packages/e5/ee/375469849e6b429b3516206b4580a79e9ef3eb12920ddbd4492b56eaacbe/rpds_py-0.27.1-cp313-cp313t-win32.whl", hash = "sha256:3020724ade63fe320a972e2ffd93b5623227e684315adce194941167fee02688", size = 216714, upload-time = "2025-08-27T12:14:13.629Z" }, + { url = "https://files.pythonhosted.org/packages/21/87/3fc94e47c9bd0742660e84706c311a860dcae4374cf4a03c477e23ce605a/rpds_py-0.27.1-cp313-cp313t-win_amd64.whl", hash = "sha256:8ee50c3e41739886606388ba3ab3ee2aae9f35fb23f833091833255a31740797", size = 228943, upload-time = "2025-08-27T12:14:14.937Z" }, + { url = "https://files.pythonhosted.org/packages/70/36/b6e6066520a07cf029d385de869729a895917b411e777ab1cde878100a1d/rpds_py-0.27.1-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:acb9aafccaae278f449d9c713b64a9e68662e7799dbd5859e2c6b3c67b56d334", size = 362472, upload-time = "2025-08-27T12:14:16.333Z" }, + { url = "https://files.pythonhosted.org/packages/af/07/b4646032e0dcec0df9c73a3bd52f63bc6c5f9cda992f06bd0e73fe3fbebd/rpds_py-0.27.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:b7fb801aa7f845ddf601c49630deeeccde7ce10065561d92729bfe81bd21fb33", size = 345676, upload-time = "2025-08-27T12:14:17.764Z" }, + { url = "https://files.pythonhosted.org/packages/b0/16/2f1003ee5d0af4bcb13c0cf894957984c32a6751ed7206db2aee7379a55e/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fe0dd05afb46597b9a2e11c351e5e4283c741237e7f617ffb3252780cca9336a", size = 385313, upload-time = "2025-08-27T12:14:19.829Z" }, + { url = "https://files.pythonhosted.org/packages/05/cd/7eb6dd7b232e7f2654d03fa07f1414d7dfc980e82ba71e40a7c46fd95484/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b6dfb0e058adb12d8b1d1b25f686e94ffa65d9995a5157afe99743bf7369d62b", size = 399080, upload-time = "2025-08-27T12:14:21.531Z" }, + { url = "https://files.pythonhosted.org/packages/20/51/5829afd5000ec1cb60f304711f02572d619040aa3ec033d8226817d1e571/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ed090ccd235f6fa8bb5861684567f0a83e04f52dfc2e5c05f2e4b1309fcf85e7", size = 523868, upload-time = "2025-08-27T12:14:23.485Z" }, + { url = "https://files.pythonhosted.org/packages/05/2c/30eebca20d5db95720ab4d2faec1b5e4c1025c473f703738c371241476a2/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bf876e79763eecf3e7356f157540d6a093cef395b65514f17a356f62af6cc136", size = 408750, upload-time = "2025-08-27T12:14:24.924Z" }, + { url = "https://files.pythonhosted.org/packages/90/1a/cdb5083f043597c4d4276eae4e4c70c55ab5accec078da8611f24575a367/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:12ed005216a51b1d6e2b02a7bd31885fe317e45897de81d86dcce7d74618ffff", size = 387688, upload-time = "2025-08-27T12:14:27.537Z" }, + { url = "https://files.pythonhosted.org/packages/7c/92/cf786a15320e173f945d205ab31585cc43969743bb1a48b6888f7a2b0a2d/rpds_py-0.27.1-cp314-cp314-manylinux_2_31_riscv64.whl", hash = "sha256:ee4308f409a40e50593c7e3bb8cbe0b4d4c66d1674a316324f0c2f5383b486f9", size = 407225, upload-time = 
"2025-08-27T12:14:28.981Z" }, + { url = "https://files.pythonhosted.org/packages/33/5c/85ee16df5b65063ef26017bef33096557a4c83fbe56218ac7cd8c235f16d/rpds_py-0.27.1-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0b08d152555acf1f455154d498ca855618c1378ec810646fcd7c76416ac6dc60", size = 423361, upload-time = "2025-08-27T12:14:30.469Z" }, + { url = "https://files.pythonhosted.org/packages/4b/8e/1c2741307fcabd1a334ecf008e92c4f47bb6f848712cf15c923becfe82bb/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:dce51c828941973a5684d458214d3a36fcd28da3e1875d659388f4f9f12cc33e", size = 562493, upload-time = "2025-08-27T12:14:31.987Z" }, + { url = "https://files.pythonhosted.org/packages/04/03/5159321baae9b2222442a70c1f988cbbd66b9be0675dd3936461269be360/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:c1476d6f29eb81aa4151c9a31219b03f1f798dc43d8af1250a870735516a1212", size = 592623, upload-time = "2025-08-27T12:14:33.543Z" }, + { url = "https://files.pythonhosted.org/packages/ff/39/c09fd1ad28b85bc1d4554a8710233c9f4cefd03d7717a1b8fbfd171d1167/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:3ce0cac322b0d69b63c9cdb895ee1b65805ec9ffad37639f291dd79467bee675", size = 558800, upload-time = "2025-08-27T12:14:35.436Z" }, + { url = "https://files.pythonhosted.org/packages/c5/d6/99228e6bbcf4baa764b18258f519a9035131d91b538d4e0e294313462a98/rpds_py-0.27.1-cp314-cp314-win32.whl", hash = "sha256:dfbfac137d2a3d0725758cd141f878bf4329ba25e34979797c89474a89a8a3a3", size = 221943, upload-time = "2025-08-27T12:14:36.898Z" }, + { url = "https://files.pythonhosted.org/packages/be/07/c802bc6b8e95be83b79bdf23d1aa61d68324cb1006e245d6c58e959e314d/rpds_py-0.27.1-cp314-cp314-win_amd64.whl", hash = "sha256:a6e57b0abfe7cc513450fcf529eb486b6e4d3f8aee83e92eb5f1ef848218d456", size = 233739, upload-time = "2025-08-27T12:14:38.386Z" }, + { url = "https://files.pythonhosted.org/packages/c8/89/3e1b1c16d4c2d547c5717377a8df99aee8099ff050f87c45cb4d5fa70891/rpds_py-0.27.1-cp314-cp314-win_arm64.whl", hash = "sha256:faf8d146f3d476abfee026c4ae3bdd9ca14236ae4e4c310cbd1cf75ba33d24a3", size = 223120, upload-time = "2025-08-27T12:14:39.82Z" }, + { url = "https://files.pythonhosted.org/packages/62/7e/dc7931dc2fa4a6e46b2a4fa744a9fe5c548efd70e0ba74f40b39fa4a8c10/rpds_py-0.27.1-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:ba81d2b56b6d4911ce735aad0a1d4495e808b8ee4dc58715998741a26874e7c2", size = 358944, upload-time = "2025-08-27T12:14:41.199Z" }, + { url = "https://files.pythonhosted.org/packages/e6/22/4af76ac4e9f336bfb1a5f240d18a33c6b2fcaadb7472ac7680576512b49a/rpds_py-0.27.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:84f7d509870098de0e864cad0102711c1e24e9b1a50ee713b65928adb22269e4", size = 342283, upload-time = "2025-08-27T12:14:42.699Z" }, + { url = "https://files.pythonhosted.org/packages/1c/15/2a7c619b3c2272ea9feb9ade67a45c40b3eeb500d503ad4c28c395dc51b4/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a9e960fc78fecd1100539f14132425e1d5fe44ecb9239f8f27f079962021523e", size = 380320, upload-time = "2025-08-27T12:14:44.157Z" }, + { url = "https://files.pythonhosted.org/packages/a2/7d/4c6d243ba4a3057e994bb5bedd01b5c963c12fe38dde707a52acdb3849e7/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:62f85b665cedab1a503747617393573995dac4600ff51869d69ad2f39eb5e817", size = 391760, upload-time = "2025-08-27T12:14:45.845Z" }, + { url = 
"https://files.pythonhosted.org/packages/b4/71/b19401a909b83bcd67f90221330bc1ef11bc486fe4e04c24388d28a618ae/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fed467af29776f6556250c9ed85ea5a4dd121ab56a5f8b206e3e7a4c551e48ec", size = 522476, upload-time = "2025-08-27T12:14:47.364Z" }, + { url = "https://files.pythonhosted.org/packages/e4/44/1a3b9715c0455d2e2f0f6df5ee6d6f5afdc423d0773a8a682ed2b43c566c/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f2729615f9d430af0ae6b36cf042cb55c0936408d543fb691e1a9e36648fd35a", size = 403418, upload-time = "2025-08-27T12:14:49.991Z" }, + { url = "https://files.pythonhosted.org/packages/1c/4b/fb6c4f14984eb56673bc868a66536f53417ddb13ed44b391998100a06a96/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1b207d881a9aef7ba753d69c123a35d96ca7cb808056998f6b9e8747321f03b8", size = 384771, upload-time = "2025-08-27T12:14:52.159Z" }, + { url = "https://files.pythonhosted.org/packages/c0/56/d5265d2d28b7420d7b4d4d85cad8ef891760f5135102e60d5c970b976e41/rpds_py-0.27.1-cp314-cp314t-manylinux_2_31_riscv64.whl", hash = "sha256:639fd5efec029f99b79ae47e5d7e00ad8a773da899b6309f6786ecaf22948c48", size = 400022, upload-time = "2025-08-27T12:14:53.859Z" }, + { url = "https://files.pythonhosted.org/packages/8f/e9/9f5fc70164a569bdd6ed9046486c3568d6926e3a49bdefeeccfb18655875/rpds_py-0.27.1-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fecc80cb2a90e28af8a9b366edacf33d7a91cbfe4c2c4544ea1246e949cfebeb", size = 416787, upload-time = "2025-08-27T12:14:55.673Z" }, + { url = "https://files.pythonhosted.org/packages/d4/64/56dd03430ba491db943a81dcdef115a985aac5f44f565cd39a00c766d45c/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:42a89282d711711d0a62d6f57d81aa43a1368686c45bc1c46b7f079d55692734", size = 557538, upload-time = "2025-08-27T12:14:57.245Z" }, + { url = "https://files.pythonhosted.org/packages/3f/36/92cc885a3129993b1d963a2a42ecf64e6a8e129d2c7cc980dbeba84e55fb/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:cf9931f14223de59551ab9d38ed18d92f14f055a5f78c1d8ad6493f735021bbb", size = 588512, upload-time = "2025-08-27T12:14:58.728Z" }, + { url = "https://files.pythonhosted.org/packages/dd/10/6b283707780a81919f71625351182b4f98932ac89a09023cb61865136244/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:f39f58a27cc6e59f432b568ed8429c7e1641324fbe38131de852cd77b2d534b0", size = 555813, upload-time = "2025-08-27T12:15:00.334Z" }, + { url = "https://files.pythonhosted.org/packages/04/2e/30b5ea18c01379da6272a92825dd7e53dc9d15c88a19e97932d35d430ef7/rpds_py-0.27.1-cp314-cp314t-win32.whl", hash = "sha256:d5fa0ee122dc09e23607a28e6d7b150da16c662e66409bbe85230e4c85bb528a", size = 217385, upload-time = "2025-08-27T12:15:01.937Z" }, + { url = "https://files.pythonhosted.org/packages/32/7d/97119da51cb1dd3f2f3c0805f155a3aa4a95fa44fe7d78ae15e69edf4f34/rpds_py-0.27.1-cp314-cp314t-win_amd64.whl", hash = "sha256:6567d2bb951e21232c2f660c24cf3470bb96de56cdcb3f071a83feeaff8a2772", size = 230097, upload-time = "2025-08-27T12:15:03.961Z" }, + { url = "https://files.pythonhosted.org/packages/0c/ed/e1fba02de17f4f76318b834425257c8ea297e415e12c68b4361f63e8ae92/rpds_py-0.27.1-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:cdfe4bb2f9fe7458b7453ad3c33e726d6d1c7c0a72960bcc23800d77384e42df", size = 371402, upload-time = "2025-08-27T12:15:51.561Z" }, + { url = 
"https://files.pythonhosted.org/packages/af/7c/e16b959b316048b55585a697e94add55a4ae0d984434d279ea83442e460d/rpds_py-0.27.1-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:8fabb8fd848a5f75a2324e4a84501ee3a5e3c78d8603f83475441866e60b94a3", size = 354084, upload-time = "2025-08-27T12:15:53.219Z" }, + { url = "https://files.pythonhosted.org/packages/de/c1/ade645f55de76799fdd08682d51ae6724cb46f318573f18be49b1e040428/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eda8719d598f2f7f3e0f885cba8646644b55a187762bec091fa14a2b819746a9", size = 383090, upload-time = "2025-08-27T12:15:55.158Z" }, + { url = "https://files.pythonhosted.org/packages/1f/27/89070ca9b856e52960da1472efcb6c20ba27cfe902f4f23ed095b9cfc61d/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3c64d07e95606ec402a0a1c511fe003873fa6af630bda59bac77fac8b4318ebc", size = 394519, upload-time = "2025-08-27T12:15:57.238Z" }, + { url = "https://files.pythonhosted.org/packages/b3/28/be120586874ef906aa5aeeae95ae8df4184bc757e5b6bd1c729ccff45ed5/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:93a2ed40de81bcff59aabebb626562d48332f3d028ca2036f1d23cbb52750be4", size = 523817, upload-time = "2025-08-27T12:15:59.237Z" }, + { url = "https://files.pythonhosted.org/packages/a8/ef/70cc197bc11cfcde02a86f36ac1eed15c56667c2ebddbdb76a47e90306da/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:387ce8c44ae94e0ec50532d9cb0edce17311024c9794eb196b90e1058aadeb66", size = 403240, upload-time = "2025-08-27T12:16:00.923Z" }, + { url = "https://files.pythonhosted.org/packages/cf/35/46936cca449f7f518f2f4996e0e8344db4b57e2081e752441154089d2a5f/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aaf94f812c95b5e60ebaf8bfb1898a7d7cb9c1af5744d4a67fa47796e0465d4e", size = 385194, upload-time = "2025-08-27T12:16:02.802Z" }, + { url = "https://files.pythonhosted.org/packages/e1/62/29c0d3e5125c3270b51415af7cbff1ec587379c84f55a5761cc9efa8cd06/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_31_riscv64.whl", hash = "sha256:4848ca84d6ded9b58e474dfdbad4b8bfb450344c0551ddc8d958bf4b36aa837c", size = 402086, upload-time = "2025-08-27T12:16:04.806Z" }, + { url = "https://files.pythonhosted.org/packages/8f/66/03e1087679227785474466fdd04157fb793b3b76e3fcf01cbf4c693c1949/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2bde09cbcf2248b73c7c323be49b280180ff39fadcfe04e7b6f54a678d02a7cf", size = 419272, upload-time = "2025-08-27T12:16:06.471Z" }, + { url = "https://files.pythonhosted.org/packages/6a/24/e3e72d265121e00b063aef3e3501e5b2473cf1b23511d56e529531acf01e/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:94c44ee01fd21c9058f124d2d4f0c9dc7634bec93cd4b38eefc385dabe71acbf", size = 560003, upload-time = "2025-08-27T12:16:08.06Z" }, + { url = "https://files.pythonhosted.org/packages/26/ca/f5a344c534214cc2d41118c0699fffbdc2c1bc7046f2a2b9609765ab9c92/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_i686.whl", hash = "sha256:df8b74962e35c9249425d90144e721eed198e6555a0e22a563d29fe4486b51f6", size = 590482, upload-time = "2025-08-27T12:16:10.137Z" }, + { url = "https://files.pythonhosted.org/packages/ce/08/4349bdd5c64d9d193c360aa9db89adeee6f6682ab8825dca0a3f535f434f/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_x86_64.whl", hash = 
"sha256:dc23e6820e3b40847e2f4a7726462ba0cf53089512abe9ee16318c366494c17a", size = 556523, upload-time = "2025-08-27T12:16:12.188Z" }, +] + +[[package]] +name = "rsa" +version = "4.9.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyasn1" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/da/8a/22b7beea3ee0d44b1916c0c1cb0ee3af23b700b6da9f04991899d0c555d4/rsa-4.9.1.tar.gz", hash = "sha256:e7bdbfdb5497da4c07dfd35530e1a902659db6ff241e39d9953cad06ebd0ae75", size = 29034, upload-time = "2025-04-16T09:51:18.218Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/64/8d/0133e4eb4beed9e425d9a98ed6e081a55d195481b7632472be1af08d2f6b/rsa-4.9.1-py3-none-any.whl", hash = "sha256:68635866661c6836b8d39430f97a996acbd61bfa49406748ea243539fe239762", size = 34696, upload-time = "2025-04-16T09:51:17.142Z" }, +] + +[[package]] +name = "s3fs" +version = "2025.3.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "aiobotocore" }, + { name = "aiohttp" }, + { name = "fsspec" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/72/df/559dc6d796c38f1b8a09a5f6dcf62a467a84f3c87a837ee07c59f60a26ad/s3fs-2025.3.2.tar.gz", hash = "sha256:6798f896ec76dd3bfd8beb89f0bb7c5263cb2760e038bae0978505cd172a307c", size = 77280, upload-time = "2025-03-31T15:35:18.881Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/66/e1/4db0388df5655de92ce5f9b60d2bef220a58dde130e0453e5433c579986e/s3fs-2025.3.2-py3-none-any.whl", hash = "sha256:81eae3f37b4b04bcc08845d7bcc607c6ca45878813ef7e6a28d77b2688417130", size = 30485, upload-time = "2025-03-31T15:35:17.384Z" }, +] + +[package.optional-dependencies] +boto3 = [ + { name = "aiobotocore", extra = ["boto3"] }, +] + +[[package]] +name = "s3transfer" +version = "0.13.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "botocore" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/6d/05/d52bf1e65044b4e5e27d4e63e8d1579dbdec54fce685908ae09bc3720030/s3transfer-0.13.1.tar.gz", hash = "sha256:c3fdba22ba1bd367922f27ec8032d6a1cf5f10c934fb5d68cf60fd5a23d936cf", size = 150589, upload-time = "2025-07-18T19:22:42.31Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6d/4f/d073e09df851cfa251ef7840007d04db3293a0482ce607d2b993926089be/s3transfer-0.13.1-py3-none-any.whl", hash = "sha256:a981aa7429be23fe6dfc13e80e4020057cbab622b08c0315288758d67cabc724", size = 85308, upload-time = "2025-07-18T19:22:40.947Z" }, +] + +[[package]] +name = "scikit-learn" +version = "1.7.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "joblib" }, + { name = "numpy" }, + { name = "scipy" }, + { name = "threadpoolctl" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/98/c2/a7855e41c9d285dfe86dc50b250978105dce513d6e459ea66a6aeb0e1e0c/scikit_learn-1.7.2.tar.gz", hash = "sha256:20e9e49ecd130598f1ca38a1d85090e1a600147b9c02fa6f15d69cb53d968fda", size = 7193136, upload-time = "2025-09-09T08:21:29.075Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/43/83/564e141eef908a5863a54da8ca342a137f45a0bfb71d1d79704c9894c9d1/scikit_learn-1.7.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c7509693451651cd7361d30ce4e86a1347493554f172b1c72a39300fa2aea79e", size = 9331967, upload-time = "2025-09-09T08:20:32.421Z" }, + { url = "https://files.pythonhosted.org/packages/18/d6/ba863a4171ac9d7314c4d3fc251f015704a2caeee41ced89f321c049ed83/scikit_learn-1.7.2-cp311-cp311-macosx_12_0_arm64.whl", hash = 
"sha256:0486c8f827c2e7b64837c731c8feff72c0bd2b998067a8a9cbc10643c31f0fe1", size = 8648645, upload-time = "2025-09-09T08:20:34.436Z" }, + { url = "https://files.pythonhosted.org/packages/ef/0e/97dbca66347b8cf0ea8b529e6bb9367e337ba2e8be0ef5c1a545232abfde/scikit_learn-1.7.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:89877e19a80c7b11a2891a27c21c4894fb18e2c2e077815bcade10d34287b20d", size = 9715424, upload-time = "2025-09-09T08:20:36.776Z" }, + { url = "https://files.pythonhosted.org/packages/f7/32/1f3b22e3207e1d2c883a7e09abb956362e7d1bd2f14458c7de258a26ac15/scikit_learn-1.7.2-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8da8bf89d4d79aaec192d2bda62f9b56ae4e5b4ef93b6a56b5de4977e375c1f1", size = 9509234, upload-time = "2025-09-09T08:20:38.957Z" }, + { url = "https://files.pythonhosted.org/packages/9f/71/34ddbd21f1da67c7a768146968b4d0220ee6831e4bcbad3e03dd3eae88b6/scikit_learn-1.7.2-cp311-cp311-win_amd64.whl", hash = "sha256:9b7ed8d58725030568523e937c43e56bc01cadb478fc43c042a9aca1dacb3ba1", size = 8894244, upload-time = "2025-09-09T08:20:41.166Z" }, + { url = "https://files.pythonhosted.org/packages/a7/aa/3996e2196075689afb9fce0410ebdb4a09099d7964d061d7213700204409/scikit_learn-1.7.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:8d91a97fa2b706943822398ab943cde71858a50245e31bc71dba62aab1d60a96", size = 9259818, upload-time = "2025-09-09T08:20:43.19Z" }, + { url = "https://files.pythonhosted.org/packages/43/5d/779320063e88af9c4a7c2cf463ff11c21ac9c8bd730c4a294b0000b666c9/scikit_learn-1.7.2-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:acbc0f5fd2edd3432a22c69bed78e837c70cf896cd7993d71d51ba6708507476", size = 8636997, upload-time = "2025-09-09T08:20:45.468Z" }, + { url = "https://files.pythonhosted.org/packages/5c/d0/0c577d9325b05594fdd33aa970bf53fb673f051a45496842caee13cfd7fe/scikit_learn-1.7.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e5bf3d930aee75a65478df91ac1225ff89cd28e9ac7bd1196853a9229b6adb0b", size = 9478381, upload-time = "2025-09-09T08:20:47.982Z" }, + { url = "https://files.pythonhosted.org/packages/82/70/8bf44b933837ba8494ca0fc9a9ab60f1c13b062ad0197f60a56e2fc4c43e/scikit_learn-1.7.2-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b4d6e9deed1a47aca9fe2f267ab8e8fe82ee20b4526b2c0cd9e135cea10feb44", size = 9300296, upload-time = "2025-09-09T08:20:50.366Z" }, + { url = "https://files.pythonhosted.org/packages/c6/99/ed35197a158f1fdc2fe7c3680e9c70d0128f662e1fee4ed495f4b5e13db0/scikit_learn-1.7.2-cp312-cp312-win_amd64.whl", hash = "sha256:6088aa475f0785e01bcf8529f55280a3d7d298679f50c0bb70a2364a82d0b290", size = 8731256, upload-time = "2025-09-09T08:20:52.627Z" }, + { url = "https://files.pythonhosted.org/packages/ae/93/a3038cb0293037fd335f77f31fe053b89c72f17b1c8908c576c29d953e84/scikit_learn-1.7.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:0b7dacaa05e5d76759fb071558a8b5130f4845166d88654a0f9bdf3eb57851b7", size = 9212382, upload-time = "2025-09-09T08:20:54.731Z" }, + { url = "https://files.pythonhosted.org/packages/40/dd/9a88879b0c1104259136146e4742026b52df8540c39fec21a6383f8292c7/scikit_learn-1.7.2-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:abebbd61ad9e1deed54cca45caea8ad5f79e1b93173dece40bb8e0c658dbe6fe", size = 8592042, upload-time = "2025-09-09T08:20:57.313Z" }, + { url = 
"https://files.pythonhosted.org/packages/46/af/c5e286471b7d10871b811b72ae794ac5fe2989c0a2df07f0ec723030f5f5/scikit_learn-1.7.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:502c18e39849c0ea1a5d681af1dbcf15f6cce601aebb657aabbfe84133c1907f", size = 9434180, upload-time = "2025-09-09T08:20:59.671Z" }, + { url = "https://files.pythonhosted.org/packages/f1/fd/df59faa53312d585023b2da27e866524ffb8faf87a68516c23896c718320/scikit_learn-1.7.2-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7a4c328a71785382fe3fe676a9ecf2c86189249beff90bf85e22bdb7efaf9ae0", size = 9283660, upload-time = "2025-09-09T08:21:01.71Z" }, + { url = "https://files.pythonhosted.org/packages/a7/c7/03000262759d7b6f38c836ff9d512f438a70d8a8ddae68ee80de72dcfb63/scikit_learn-1.7.2-cp313-cp313-win_amd64.whl", hash = "sha256:63a9afd6f7b229aad94618c01c252ce9e6fa97918c5ca19c9a17a087d819440c", size = 8702057, upload-time = "2025-09-09T08:21:04.234Z" }, + { url = "https://files.pythonhosted.org/packages/55/87/ef5eb1f267084532c8e4aef98a28b6ffe7425acbfd64b5e2f2e066bc29b3/scikit_learn-1.7.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:9acb6c5e867447b4e1390930e3944a005e2cb115922e693c08a323421a6966e8", size = 9558731, upload-time = "2025-09-09T08:21:06.381Z" }, + { url = "https://files.pythonhosted.org/packages/93/f8/6c1e3fc14b10118068d7938878a9f3f4e6d7b74a8ddb1e5bed65159ccda8/scikit_learn-1.7.2-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:2a41e2a0ef45063e654152ec9d8bcfc39f7afce35b08902bfe290c2498a67a6a", size = 9038852, upload-time = "2025-09-09T08:21:08.628Z" }, + { url = "https://files.pythonhosted.org/packages/83/87/066cafc896ee540c34becf95d30375fe5cbe93c3b75a0ee9aa852cd60021/scikit_learn-1.7.2-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:98335fb98509b73385b3ab2bd0639b1f610541d3988ee675c670371d6a87aa7c", size = 9527094, upload-time = "2025-09-09T08:21:11.486Z" }, + { url = "https://files.pythonhosted.org/packages/9c/2b/4903e1ccafa1f6453b1ab78413938c8800633988c838aa0be386cbb33072/scikit_learn-1.7.2-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:191e5550980d45449126e23ed1d5e9e24b2c68329ee1f691a3987476e115e09c", size = 9367436, upload-time = "2025-09-09T08:21:13.602Z" }, + { url = "https://files.pythonhosted.org/packages/b5/aa/8444be3cfb10451617ff9d177b3c190288f4563e6c50ff02728be67ad094/scikit_learn-1.7.2-cp313-cp313t-win_amd64.whl", hash = "sha256:57dc4deb1d3762c75d685507fbd0bc17160144b2f2ba4ccea5dc285ab0d0e973", size = 9275749, upload-time = "2025-09-09T08:21:15.96Z" }, + { url = "https://files.pythonhosted.org/packages/d9/82/dee5acf66837852e8e68df6d8d3a6cb22d3df997b733b032f513d95205b7/scikit_learn-1.7.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fa8f63940e29c82d1e67a45d5297bdebbcb585f5a5a50c4914cc2e852ab77f33", size = 9208906, upload-time = "2025-09-09T08:21:18.557Z" }, + { url = "https://files.pythonhosted.org/packages/3c/30/9029e54e17b87cb7d50d51a5926429c683d5b4c1732f0507a6c3bed9bf65/scikit_learn-1.7.2-cp314-cp314-macosx_12_0_arm64.whl", hash = "sha256:f95dc55b7902b91331fa4e5845dd5bde0580c9cd9612b1b2791b7e80c3d32615", size = 8627836, upload-time = "2025-09-09T08:21:20.695Z" }, + { url = "https://files.pythonhosted.org/packages/60/18/4a52c635c71b536879f4b971c2cedf32c35ee78f48367885ed8025d1f7ee/scikit_learn-1.7.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9656e4a53e54578ad10a434dc1f993330568cfee176dff07112b8785fb413106", size = 9426236, upload-time = 
"2025-09-09T08:21:22.645Z" }, + { url = "https://files.pythonhosted.org/packages/99/7e/290362f6ab582128c53445458a5befd471ed1ea37953d5bcf80604619250/scikit_learn-1.7.2-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:96dc05a854add0e50d3f47a1ef21a10a595016da5b007c7d9cd9d0bffd1fcc61", size = 9312593, upload-time = "2025-09-09T08:21:24.65Z" }, + { url = "https://files.pythonhosted.org/packages/8e/87/24f541b6d62b1794939ae6422f8023703bbf6900378b2b34e0b4384dfefd/scikit_learn-1.7.2-cp314-cp314-win_amd64.whl", hash = "sha256:bb24510ed3f9f61476181e4db51ce801e2ba37541def12dc9333b946fc7a9cf8", size = 8820007, upload-time = "2025-09-09T08:21:26.713Z" }, +] + +[[package]] +name = "scipy" +version = "1.16.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "numpy" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/4c/3b/546a6f0bfe791bbb7f8d591613454d15097e53f906308ec6f7c1ce588e8e/scipy-1.16.2.tar.gz", hash = "sha256:af029b153d243a80afb6eabe40b0a07f8e35c9adc269c019f364ad747f826a6b", size = 30580599, upload-time = "2025-09-11T17:48:08.271Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0b/ef/37ed4b213d64b48422df92560af7300e10fe30b5d665dd79932baebee0c6/scipy-1.16.2-cp311-cp311-macosx_10_14_x86_64.whl", hash = "sha256:6ab88ea43a57da1af33292ebd04b417e8e2eaf9d5aa05700be8d6e1b6501cd92", size = 36619956, upload-time = "2025-09-11T17:39:20.5Z" }, + { url = "https://files.pythonhosted.org/packages/85/ab/5c2eba89b9416961a982346a4d6a647d78c91ec96ab94ed522b3b6baf444/scipy-1.16.2-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:c95e96c7305c96ede73a7389f46ccd6c659c4da5ef1b2789466baeaed3622b6e", size = 28931117, upload-time = "2025-09-11T17:39:29.06Z" }, + { url = "https://files.pythonhosted.org/packages/80/d1/eed51ab64d227fe60229a2d57fb60ca5898cfa50ba27d4f573e9e5f0b430/scipy-1.16.2-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:87eb178db04ece7c698220d523c170125dbffebb7af0345e66c3554f6f60c173", size = 20921997, upload-time = "2025-09-11T17:39:34.892Z" }, + { url = "https://files.pythonhosted.org/packages/be/7c/33ea3e23bbadde96726edba6bf9111fb1969d14d9d477ffa202c67bec9da/scipy-1.16.2-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:4e409eac067dcee96a57fbcf424c13f428037827ec7ee3cb671ff525ca4fc34d", size = 23523374, upload-time = "2025-09-11T17:39:40.846Z" }, + { url = "https://files.pythonhosted.org/packages/96/0b/7399dc96e1e3f9a05e258c98d716196a34f528eef2ec55aad651ed136d03/scipy-1.16.2-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:e574be127bb760f0dad24ff6e217c80213d153058372362ccb9555a10fc5e8d2", size = 33583702, upload-time = "2025-09-11T17:39:49.011Z" }, + { url = "https://files.pythonhosted.org/packages/1a/bc/a5c75095089b96ea72c1bd37a4497c24b581ec73db4ef58ebee142ad2d14/scipy-1.16.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f5db5ba6188d698ba7abab982ad6973265b74bb40a1efe1821b58c87f73892b9", size = 35883427, upload-time = "2025-09-11T17:39:57.406Z" }, + { url = "https://files.pythonhosted.org/packages/ab/66/e25705ca3d2b87b97fe0a278a24b7f477b4023a926847935a1a71488a6a6/scipy-1.16.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ec6e74c4e884104ae006d34110677bfe0098203a3fec2f3faf349f4cb05165e3", size = 36212940, upload-time = "2025-09-11T17:40:06.013Z" }, + { url = "https://files.pythonhosted.org/packages/d6/fd/0bb911585e12f3abdd603d721d83fc1c7492835e1401a0e6d498d7822b4b/scipy-1.16.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = 
"sha256:912f46667d2d3834bc3d57361f854226475f695eb08c08a904aadb1c936b6a88", size = 38865092, upload-time = "2025-09-11T17:40:15.143Z" }, + { url = "https://files.pythonhosted.org/packages/d6/73/c449a7d56ba6e6f874183759f8483cde21f900a8be117d67ffbb670c2958/scipy-1.16.2-cp311-cp311-win_amd64.whl", hash = "sha256:91e9e8a37befa5a69e9cacbe0bcb79ae5afb4a0b130fd6db6ee6cc0d491695fa", size = 38687626, upload-time = "2025-09-11T17:40:24.041Z" }, + { url = "https://files.pythonhosted.org/packages/68/72/02f37316adf95307f5d9e579023c6899f89ff3a051fa079dbd6faafc48e5/scipy-1.16.2-cp311-cp311-win_arm64.whl", hash = "sha256:f3bf75a6dcecab62afde4d1f973f1692be013110cad5338007927db8da73249c", size = 25503506, upload-time = "2025-09-11T17:40:30.703Z" }, + { url = "https://files.pythonhosted.org/packages/b7/8d/6396e00db1282279a4ddd507c5f5e11f606812b608ee58517ce8abbf883f/scipy-1.16.2-cp312-cp312-macosx_10_14_x86_64.whl", hash = "sha256:89d6c100fa5c48472047632e06f0876b3c4931aac1f4291afc81a3644316bb0d", size = 36646259, upload-time = "2025-09-11T17:40:39.329Z" }, + { url = "https://files.pythonhosted.org/packages/3b/93/ea9edd7e193fceb8eef149804491890bde73fb169c896b61aa3e2d1e4e77/scipy-1.16.2-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:ca748936cd579d3f01928b30a17dc474550b01272d8046e3e1ee593f23620371", size = 28888976, upload-time = "2025-09-11T17:40:46.82Z" }, + { url = "https://files.pythonhosted.org/packages/91/4d/281fddc3d80fd738ba86fd3aed9202331180b01e2c78eaae0642f22f7e83/scipy-1.16.2-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:fac4f8ce2ddb40e2e3d0f7ec36d2a1e7f92559a2471e59aec37bd8d9de01fec0", size = 20879905, upload-time = "2025-09-11T17:40:52.545Z" }, + { url = "https://files.pythonhosted.org/packages/69/40/b33b74c84606fd301b2915f0062e45733c6ff5708d121dd0deaa8871e2d0/scipy-1.16.2-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:033570f1dcefd79547a88e18bccacff025c8c647a330381064f561d43b821232", size = 23553066, upload-time = "2025-09-11T17:40:59.014Z" }, + { url = "https://files.pythonhosted.org/packages/55/a7/22c739e2f21a42cc8f16bc76b47cff4ed54fbe0962832c589591c2abec34/scipy-1.16.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ea3421209bf00c8a5ef2227de496601087d8f638a2363ee09af059bd70976dc1", size = 33336407, upload-time = "2025-09-11T17:41:06.796Z" }, + { url = "https://files.pythonhosted.org/packages/53/11/a0160990b82999b45874dc60c0c183d3a3a969a563fffc476d5a9995c407/scipy-1.16.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f66bd07ba6f84cd4a380b41d1bf3c59ea488b590a2ff96744845163309ee8e2f", size = 35673281, upload-time = "2025-09-11T17:41:15.055Z" }, + { url = "https://files.pythonhosted.org/packages/96/53/7ef48a4cfcf243c3d0f1643f5887c81f29fdf76911c4e49331828e19fc0a/scipy-1.16.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:5e9feab931bd2aea4a23388c962df6468af3d808ddf2d40f94a81c5dc38f32ef", size = 36004222, upload-time = "2025-09-11T17:41:23.868Z" }, + { url = "https://files.pythonhosted.org/packages/49/7f/71a69e0afd460049d41c65c630c919c537815277dfea214031005f474d78/scipy-1.16.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:03dfc75e52f72cf23ec2ced468645321407faad8f0fe7b1f5b49264adbc29cb1", size = 38664586, upload-time = "2025-09-11T17:41:31.021Z" }, + { url = "https://files.pythonhosted.org/packages/34/95/20e02ca66fb495a95fba0642fd48e0c390d0ece9b9b14c6e931a60a12dea/scipy-1.16.2-cp312-cp312-win_amd64.whl", hash = "sha256:0ce54e07bbb394b417457409a64fd015be623f36e330ac49306433ffe04bc97e", size = 38550641, upload-time = 
"2025-09-11T17:41:36.61Z" }, + { url = "https://files.pythonhosted.org/packages/92/ad/13646b9beb0a95528ca46d52b7babafbe115017814a611f2065ee4e61d20/scipy-1.16.2-cp312-cp312-win_arm64.whl", hash = "sha256:2a8ffaa4ac0df81a0b94577b18ee079f13fecdb924df3328fc44a7dc5ac46851", size = 25456070, upload-time = "2025-09-11T17:41:41.3Z" }, + { url = "https://files.pythonhosted.org/packages/c1/27/c5b52f1ee81727a9fc457f5ac1e9bf3d6eab311805ea615c83c27ba06400/scipy-1.16.2-cp313-cp313-macosx_10_14_x86_64.whl", hash = "sha256:84f7bf944b43e20b8a894f5fe593976926744f6c185bacfcbdfbb62736b5cc70", size = 36604856, upload-time = "2025-09-11T17:41:47.695Z" }, + { url = "https://files.pythonhosted.org/packages/32/a9/15c20d08e950b540184caa8ced675ba1128accb0e09c653780ba023a4110/scipy-1.16.2-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:5c39026d12edc826a1ef2ad35ad1e6d7f087f934bb868fc43fa3049c8b8508f9", size = 28864626, upload-time = "2025-09-11T17:41:52.642Z" }, + { url = "https://files.pythonhosted.org/packages/4c/fc/ea36098df653cca26062a627c1a94b0de659e97127c8491e18713ca0e3b9/scipy-1.16.2-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:e52729ffd45b68777c5319560014d6fd251294200625d9d70fd8626516fc49f5", size = 20855689, upload-time = "2025-09-11T17:41:57.886Z" }, + { url = "https://files.pythonhosted.org/packages/dc/6f/d0b53be55727f3e6d7c72687ec18ea6d0047cf95f1f77488b99a2bafaee1/scipy-1.16.2-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:024dd4a118cccec09ca3209b7e8e614931a6ffb804b2a601839499cb88bdf925", size = 23512151, upload-time = "2025-09-11T17:42:02.303Z" }, + { url = "https://files.pythonhosted.org/packages/11/85/bf7dab56e5c4b1d3d8eef92ca8ede788418ad38a7dc3ff50262f00808760/scipy-1.16.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7a5dc7ee9c33019973a470556081b0fd3c9f4c44019191039f9769183141a4d9", size = 33329824, upload-time = "2025-09-11T17:42:07.549Z" }, + { url = "https://files.pythonhosted.org/packages/da/6a/1a927b14ddc7714111ea51f4e568203b2bb6ed59bdd036d62127c1a360c8/scipy-1.16.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c2275ff105e508942f99d4e3bc56b6ef5e4b3c0af970386ca56b777608ce95b7", size = 35681881, upload-time = "2025-09-11T17:42:13.255Z" }, + { url = "https://files.pythonhosted.org/packages/c1/5f/331148ea5780b4fcc7007a4a6a6ee0a0c1507a796365cc642d4d226e1c3a/scipy-1.16.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:af80196eaa84f033e48444d2e0786ec47d328ba00c71e4299b602235ffef9acb", size = 36006219, upload-time = "2025-09-11T17:42:18.765Z" }, + { url = "https://files.pythonhosted.org/packages/46/3a/e991aa9d2aec723b4a8dcfbfc8365edec5d5e5f9f133888067f1cbb7dfc1/scipy-1.16.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9fb1eb735fe3d6ed1f89918224e3385fbf6f9e23757cacc35f9c78d3b712dd6e", size = 38682147, upload-time = "2025-09-11T17:42:25.177Z" }, + { url = "https://files.pythonhosted.org/packages/a1/57/0f38e396ad19e41b4c5db66130167eef8ee620a49bc7d0512e3bb67e0cab/scipy-1.16.2-cp313-cp313-win_amd64.whl", hash = "sha256:fda714cf45ba43c9d3bae8f2585c777f64e3f89a2e073b668b32ede412d8f52c", size = 38520766, upload-time = "2025-09-11T17:43:25.342Z" }, + { url = "https://files.pythonhosted.org/packages/1b/a5/85d3e867b6822d331e26c862a91375bb7746a0b458db5effa093d34cdb89/scipy-1.16.2-cp313-cp313-win_arm64.whl", hash = "sha256:2f5350da923ccfd0b00e07c3e5cfb316c1c0d6c1d864c07a72d092e9f20db104", size = 25451169, upload-time = "2025-09-11T17:43:30.198Z" }, + { url = 
"https://files.pythonhosted.org/packages/09/d9/60679189bcebda55992d1a45498de6d080dcaf21ce0c8f24f888117e0c2d/scipy-1.16.2-cp313-cp313t-macosx_10_14_x86_64.whl", hash = "sha256:53d8d2ee29b925344c13bda64ab51785f016b1b9617849dac10897f0701b20c1", size = 37012682, upload-time = "2025-09-11T17:42:30.677Z" }, + { url = "https://files.pythonhosted.org/packages/83/be/a99d13ee4d3b7887a96f8c71361b9659ba4ef34da0338f14891e102a127f/scipy-1.16.2-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:9e05e33657efb4c6a9d23bd8300101536abd99c85cca82da0bffff8d8764d08a", size = 29389926, upload-time = "2025-09-11T17:42:35.845Z" }, + { url = "https://files.pythonhosted.org/packages/bf/0a/130164a4881cec6ca8c00faf3b57926f28ed429cd6001a673f83c7c2a579/scipy-1.16.2-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:7fe65b36036357003b3ef9d37547abeefaa353b237e989c21027b8ed62b12d4f", size = 21381152, upload-time = "2025-09-11T17:42:40.07Z" }, + { url = "https://files.pythonhosted.org/packages/47/a6/503ffb0310ae77fba874e10cddfc4a1280bdcca1d13c3751b8c3c2996cf8/scipy-1.16.2-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:6406d2ac6d40b861cccf57f49592f9779071655e9f75cd4f977fa0bdd09cb2e4", size = 23914410, upload-time = "2025-09-11T17:42:44.313Z" }, + { url = "https://files.pythonhosted.org/packages/fa/c7/1147774bcea50d00c02600aadaa919facbd8537997a62496270133536ed6/scipy-1.16.2-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ff4dc42bd321991fbf611c23fc35912d690f731c9914bf3af8f417e64aca0f21", size = 33481880, upload-time = "2025-09-11T17:42:49.325Z" }, + { url = "https://files.pythonhosted.org/packages/6a/74/99d5415e4c3e46b2586f30cdbecb95e101c7192628a484a40dd0d163811a/scipy-1.16.2-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:654324826654d4d9133e10675325708fb954bc84dae6e9ad0a52e75c6b1a01d7", size = 35791425, upload-time = "2025-09-11T17:42:54.711Z" }, + { url = "https://files.pythonhosted.org/packages/1b/ee/a6559de7c1cc710e938c0355d9d4fbcd732dac4d0d131959d1f3b63eb29c/scipy-1.16.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:63870a84cd15c44e65220eaed2dac0e8f8b26bbb991456a033c1d9abfe8a94f8", size = 36178622, upload-time = "2025-09-11T17:43:00.375Z" }, + { url = "https://files.pythonhosted.org/packages/4e/7b/f127a5795d5ba8ece4e0dce7d4a9fb7cb9e4f4757137757d7a69ab7d4f1a/scipy-1.16.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:fa01f0f6a3050fa6a9771a95d5faccc8e2f5a92b4a2e5440a0fa7264a2398472", size = 38783985, upload-time = "2025-09-11T17:43:06.661Z" }, + { url = "https://files.pythonhosted.org/packages/3e/9f/bc81c1d1e033951eb5912cd3750cc005943afa3e65a725d2443a3b3c4347/scipy-1.16.2-cp313-cp313t-win_amd64.whl", hash = "sha256:116296e89fba96f76353a8579820c2512f6e55835d3fad7780fece04367de351", size = 38631367, upload-time = "2025-09-11T17:43:14.44Z" }, + { url = "https://files.pythonhosted.org/packages/d6/5e/2cc7555fd81d01814271412a1d59a289d25f8b63208a0a16c21069d55d3e/scipy-1.16.2-cp313-cp313t-win_arm64.whl", hash = "sha256:98e22834650be81d42982360382b43b17f7ba95e0e6993e2a4f5b9ad9283a94d", size = 25787992, upload-time = "2025-09-11T17:43:19.745Z" }, + { url = "https://files.pythonhosted.org/packages/8b/ac/ad8951250516db71619f0bd3b2eb2448db04b720a003dd98619b78b692c0/scipy-1.16.2-cp314-cp314-macosx_10_14_x86_64.whl", hash = "sha256:567e77755019bb7461513c87f02bb73fb65b11f049aaaa8ca17cfaa5a5c45d77", size = 36595109, upload-time = "2025-09-11T17:43:35.713Z" }, + { url = 
"https://files.pythonhosted.org/packages/ff/f6/5779049ed119c5b503b0f3dc6d6f3f68eefc3a9190d4ad4c276f854f051b/scipy-1.16.2-cp314-cp314-macosx_12_0_arm64.whl", hash = "sha256:17d9bb346194e8967296621208fcdfd39b55498ef7d2f376884d5ac47cec1a70", size = 28859110, upload-time = "2025-09-11T17:43:40.814Z" }, + { url = "https://files.pythonhosted.org/packages/82/09/9986e410ae38bf0a0c737ff8189ac81a93b8e42349aac009891c054403d7/scipy-1.16.2-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:0a17541827a9b78b777d33b623a6dcfe2ef4a25806204d08ead0768f4e529a88", size = 20850110, upload-time = "2025-09-11T17:43:44.981Z" }, + { url = "https://files.pythonhosted.org/packages/0d/ad/485cdef2d9215e2a7df6d61b81d2ac073dfacf6ae24b9ae87274c4e936ae/scipy-1.16.2-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:d7d4c6ba016ffc0f9568d012f5f1eb77ddd99412aea121e6fa8b4c3b7cbad91f", size = 23497014, upload-time = "2025-09-11T17:43:49.074Z" }, + { url = "https://files.pythonhosted.org/packages/a7/74/f6a852e5d581122b8f0f831f1d1e32fb8987776ed3658e95c377d308ed86/scipy-1.16.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:9702c4c023227785c779cba2e1d6f7635dbb5b2e0936cdd3a4ecb98d78fd41eb", size = 33401155, upload-time = "2025-09-11T17:43:54.661Z" }, + { url = "https://files.pythonhosted.org/packages/d9/f5/61d243bbc7c6e5e4e13dde9887e84a5cbe9e0f75fd09843044af1590844e/scipy-1.16.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d1cdf0ac28948d225decdefcc45ad7dd91716c29ab56ef32f8e0d50657dffcc7", size = 35691174, upload-time = "2025-09-11T17:44:00.101Z" }, + { url = "https://files.pythonhosted.org/packages/03/99/59933956331f8cc57e406cdb7a483906c74706b156998f322913e789c7e1/scipy-1.16.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:70327d6aa572a17c2941cdfb20673f82e536e91850a2e4cb0c5b858b690e1548", size = 36070752, upload-time = "2025-09-11T17:44:05.619Z" }, + { url = "https://files.pythonhosted.org/packages/c6/7d/00f825cfb47ee19ef74ecf01244b43e95eae74e7e0ff796026ea7cd98456/scipy-1.16.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5221c0b2a4b58aa7c4ed0387d360fd90ee9086d383bb34d9f2789fafddc8a936", size = 38701010, upload-time = "2025-09-11T17:44:11.322Z" }, + { url = "https://files.pythonhosted.org/packages/e4/9f/b62587029980378304ba5a8563d376c96f40b1e133daacee76efdcae32de/scipy-1.16.2-cp314-cp314-win_amd64.whl", hash = "sha256:f5a85d7b2b708025af08f060a496dd261055b617d776fc05a1a1cc69e09fe9ff", size = 39360061, upload-time = "2025-09-11T17:45:09.814Z" }, + { url = "https://files.pythonhosted.org/packages/82/04/7a2f1609921352c7fbee0815811b5050582f67f19983096c4769867ca45f/scipy-1.16.2-cp314-cp314-win_arm64.whl", hash = "sha256:2cc73a33305b4b24556957d5857d6253ce1e2dcd67fa0ff46d87d1670b3e1e1d", size = 26126914, upload-time = "2025-09-11T17:45:14.73Z" }, + { url = "https://files.pythonhosted.org/packages/51/b9/60929ce350c16b221928725d2d1d7f86cf96b8bc07415547057d1196dc92/scipy-1.16.2-cp314-cp314t-macosx_10_14_x86_64.whl", hash = "sha256:9ea2a3fed83065d77367775d689401a703d0f697420719ee10c0780bcab594d8", size = 37013193, upload-time = "2025-09-11T17:44:16.757Z" }, + { url = "https://files.pythonhosted.org/packages/2a/41/ed80e67782d4bc5fc85a966bc356c601afddd175856ba7c7bb6d9490607e/scipy-1.16.2-cp314-cp314t-macosx_12_0_arm64.whl", hash = "sha256:7280d926f11ca945c3ef92ba960fa924e1465f8d07ce3a9923080363390624c4", size = 29390172, upload-time = "2025-09-11T17:44:21.783Z" }, + { url = 
"https://files.pythonhosted.org/packages/c4/a3/2f673ace4090452696ccded5f5f8efffb353b8f3628f823a110e0170b605/scipy-1.16.2-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:8afae1756f6a1fe04636407ef7dbece33d826a5d462b74f3d0eb82deabefd831", size = 21381326, upload-time = "2025-09-11T17:44:25.982Z" }, + { url = "https://files.pythonhosted.org/packages/42/bf/59df61c5d51395066c35836b78136accf506197617c8662e60ea209881e1/scipy-1.16.2-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:5c66511f29aa8d233388e7416a3f20d5cae7a2744d5cee2ecd38c081f4e861b3", size = 23915036, upload-time = "2025-09-11T17:44:30.527Z" }, + { url = "https://files.pythonhosted.org/packages/91/c3/edc7b300dc16847ad3672f1a6f3f7c5d13522b21b84b81c265f4f2760d4a/scipy-1.16.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:efe6305aeaa0e96b0ccca5ff647a43737d9a092064a3894e46c414db84bc54ac", size = 33484341, upload-time = "2025-09-11T17:44:35.981Z" }, + { url = "https://files.pythonhosted.org/packages/26/c7/24d1524e72f06ff141e8d04b833c20db3021020563272ccb1b83860082a9/scipy-1.16.2-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7f3a337d9ae06a1e8d655ee9d8ecb835ea5ddcdcbd8d23012afa055ab014f374", size = 35790840, upload-time = "2025-09-11T17:44:41.76Z" }, + { url = "https://files.pythonhosted.org/packages/aa/b7/5aaad984eeedd56858dc33d75efa59e8ce798d918e1033ef62d2708f2c3d/scipy-1.16.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:bab3605795d269067d8ce78a910220262711b753de8913d3deeaedb5dded3bb6", size = 36174716, upload-time = "2025-09-11T17:44:47.316Z" }, + { url = "https://files.pythonhosted.org/packages/fd/c2/e276a237acb09824822b0ada11b028ed4067fdc367a946730979feacb870/scipy-1.16.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:b0348d8ddb55be2a844c518cd8cc8deeeb8aeba707cf834db5758fc89b476a2c", size = 38790088, upload-time = "2025-09-11T17:44:53.011Z" }, + { url = "https://files.pythonhosted.org/packages/c6/b4/5c18a766e8353015439f3780f5fc473f36f9762edc1a2e45da3ff5a31b21/scipy-1.16.2-cp314-cp314t-win_amd64.whl", hash = "sha256:26284797e38b8a75e14ea6631d29bda11e76ceaa6ddb6fdebbfe4c4d90faf2f9", size = 39457455, upload-time = "2025-09-11T17:44:58.899Z" }, + { url = "https://files.pythonhosted.org/packages/97/30/2f9a5243008f76dfc5dee9a53dfb939d9b31e16ce4bd4f2e628bfc5d89d2/scipy-1.16.2-cp314-cp314t-win_arm64.whl", hash = "sha256:d2a4472c231328d4de38d5f1f68fdd6d28a615138f842580a8a321b5845cf779", size = 26448374, upload-time = "2025-09-11T17:45:03.45Z" }, +] + +[[package]] +name = "semver" +version = "3.0.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/72/d1/d3159231aec234a59dd7d601e9dd9fe96f3afff15efd33c1070019b26132/semver-3.0.4.tar.gz", hash = "sha256:afc7d8c584a5ed0a11033af086e8af226a9c0b206f313e0301f8dd7b6b589602", size = 269730, upload-time = "2025-01-24T13:19:27.617Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a6/24/4d91e05817e92e3a61c8a21e08fd0f390f5301f1c448b137c57c4bc6e543/semver-3.0.4-py3-none-any.whl", hash = "sha256:9c824d87ba7f7ab4a1890799cec8596f15c1241cb473404ea1cb0c55e4b04746", size = 17912, upload-time = "2025-01-24T13:19:24.949Z" }, +] + +[[package]] +name = "sentry-sdk" +version = "2.38.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b2/22/60fd703b34d94d216b2387e048ac82de3e86b63bc28869fb076f8bb0204a/sentry_sdk-2.38.0.tar.gz", hash = 
"sha256:792d2af45e167e2f8a3347143f525b9b6bac6f058fb2014720b40b84ccbeb985", size = 348116, upload-time = "2025-09-15T15:00:37.846Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7a/84/bde4c4bbb269b71bc09316af8eb00da91f67814d40337cc12ef9c8742541/sentry_sdk-2.38.0-py2.py3-none-any.whl", hash = "sha256:2324aea8573a3fa1576df7fb4d65c4eb8d9929c8fa5939647397a07179eef8d0", size = 370346, upload-time = "2025-09-15T15:00:35.821Z" }, +] + +[package.optional-dependencies] +fastapi = [ + { name = "fastapi" }, +] + +[[package]] +name = "setuptools" +version = "80.9.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/18/5d/3bf57dcd21979b887f014ea83c24ae194cfcd12b9e0fda66b957c69d1fca/setuptools-80.9.0.tar.gz", hash = "sha256:f36b47402ecde768dbfafc46e8e4207b4360c654f1f3bb84475f0a28628fb19c", size = 1319958, upload-time = "2025-05-27T00:56:51.443Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a3/dc/17031897dae0efacfea57dfd3a82fdd2a2aeb58e0ff71b77b87e44edc772/setuptools-80.9.0-py3-none-any.whl", hash = "sha256:062d34222ad13e0cc312a4c02d73f059e86a4acbfbdea8f8f76b28c99f306922", size = 1201486, upload-time = "2025-05-27T00:56:49.664Z" }, +] + +[[package]] +name = "shapely" +version = "2.1.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "numpy" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ca/3c/2da625233f4e605155926566c0e7ea8dda361877f48e8b1655e53456f252/shapely-2.1.1.tar.gz", hash = "sha256:500621967f2ffe9642454808009044c21e5b35db89ce69f8a2042c2ffd0e2772", size = 315422, upload-time = "2025-05-19T11:04:41.265Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/19/97/2df985b1e03f90c503796ad5ecd3d9ed305123b64d4ccb54616b30295b29/shapely-2.1.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:587a1aa72bc858fab9b8c20427b5f6027b7cbc92743b8e2c73b9de55aa71c7a7", size = 1819368, upload-time = "2025-05-19T11:03:55.937Z" }, + { url = "https://files.pythonhosted.org/packages/56/17/504518860370f0a28908b18864f43d72f03581e2b6680540ca668f07aa42/shapely-2.1.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9fa5c53b0791a4b998f9ad84aad456c988600757a96b0a05e14bba10cebaaaea", size = 1625362, upload-time = "2025-05-19T11:03:57.06Z" }, + { url = "https://files.pythonhosted.org/packages/36/a1/9677337d729b79fce1ef3296aac6b8ef4743419086f669e8a8070eff8f40/shapely-2.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aabecd038841ab5310d23495253f01c2a82a3aedae5ab9ca489be214aa458aa7", size = 2999005, upload-time = "2025-05-19T11:03:58.692Z" }, + { url = "https://files.pythonhosted.org/packages/a2/17/e09357274699c6e012bbb5a8ea14765a4d5860bb658df1931c9f90d53bd3/shapely-2.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:586f6aee1edec04e16227517a866df3e9a2e43c1f635efc32978bb3dc9c63753", size = 3108489, upload-time = "2025-05-19T11:04:00.059Z" }, + { url = "https://files.pythonhosted.org/packages/17/5d/93a6c37c4b4e9955ad40834f42b17260ca74ecf36df2e81bb14d12221b90/shapely-2.1.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b9878b9e37ad26c72aada8de0c9cfe418d9e2ff36992a1693b7f65a075b28647", size = 3945727, upload-time = "2025-05-19T11:04:01.786Z" }, + { url = "https://files.pythonhosted.org/packages/a3/1a/ad696648f16fd82dd6bfcca0b3b8fbafa7aacc13431c7fc4c9b49e481681/shapely-2.1.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d9a531c48f289ba355e37b134e98e28c557ff13965d4653a5228d0f42a09aed0", size = 
4109311, upload-time = "2025-05-19T11:04:03.134Z" }, + { url = "https://files.pythonhosted.org/packages/d4/38/150dd245beab179ec0d4472bf6799bf18f21b1efbef59ac87de3377dbf1c/shapely-2.1.1-cp311-cp311-win32.whl", hash = "sha256:4866de2673a971820c75c0167b1f1cd8fb76f2d641101c23d3ca021ad0449bab", size = 1522982, upload-time = "2025-05-19T11:04:05.217Z" }, + { url = "https://files.pythonhosted.org/packages/93/5b/842022c00fbb051083c1c85430f3bb55565b7fd2d775f4f398c0ba8052ce/shapely-2.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:20a9d79958b3d6c70d8a886b250047ea32ff40489d7abb47d01498c704557a93", size = 1703872, upload-time = "2025-05-19T11:04:06.791Z" }, + { url = "https://files.pythonhosted.org/packages/fb/64/9544dc07dfe80a2d489060791300827c941c451e2910f7364b19607ea352/shapely-2.1.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:2827365b58bf98efb60affc94a8e01c56dd1995a80aabe4b701465d86dcbba43", size = 1833021, upload-time = "2025-05-19T11:04:08.022Z" }, + { url = "https://files.pythonhosted.org/packages/07/aa/fb5f545e72e89b6a0f04a0effda144f5be956c9c312c7d4e00dfddbddbcf/shapely-2.1.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a9c551f7fa7f1e917af2347fe983f21f212863f1d04f08eece01e9c275903fad", size = 1643018, upload-time = "2025-05-19T11:04:09.343Z" }, + { url = "https://files.pythonhosted.org/packages/03/46/61e03edba81de729f09d880ce7ae5c1af873a0814206bbfb4402ab5c3388/shapely-2.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:78dec4d4fbe7b1db8dc36de3031767e7ece5911fb7782bc9e95c5cdec58fb1e9", size = 2986417, upload-time = "2025-05-19T11:04:10.56Z" }, + { url = "https://files.pythonhosted.org/packages/1f/1e/83ec268ab8254a446b4178b45616ab5822d7b9d2b7eb6e27cf0b82f45601/shapely-2.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:872d3c0a7b8b37da0e23d80496ec5973c4692920b90de9f502b5beb994bbaaef", size = 3098224, upload-time = "2025-05-19T11:04:11.903Z" }, + { url = "https://files.pythonhosted.org/packages/f1/44/0c21e7717c243e067c9ef8fa9126de24239f8345a5bba9280f7bb9935959/shapely-2.1.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2e2b9125ebfbc28ecf5353511de62f75a8515ae9470521c9a693e4bb9fbe0cf1", size = 3925982, upload-time = "2025-05-19T11:04:13.224Z" }, + { url = "https://files.pythonhosted.org/packages/15/50/d3b4e15fefc103a0eb13d83bad5f65cd6e07a5d8b2ae920e767932a247d1/shapely-2.1.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:4b96cea171b3d7f6786976a0520f178c42792897653ecca0c5422fb1e6946e6d", size = 4089122, upload-time = "2025-05-19T11:04:14.477Z" }, + { url = "https://files.pythonhosted.org/packages/bd/05/9a68f27fc6110baeedeeebc14fd86e73fa38738c5b741302408fb6355577/shapely-2.1.1-cp312-cp312-win32.whl", hash = "sha256:39dca52201e02996df02e447f729da97cfb6ff41a03cb50f5547f19d02905af8", size = 1522437, upload-time = "2025-05-19T11:04:16.203Z" }, + { url = "https://files.pythonhosted.org/packages/bc/e9/a4560e12b9338842a1f82c9016d2543eaa084fce30a1ca11991143086b57/shapely-2.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:13d643256f81d55a50013eff6321142781cf777eb6a9e207c2c9e6315ba6044a", size = 1703479, upload-time = "2025-05-19T11:04:18.497Z" }, + { url = "https://files.pythonhosted.org/packages/71/8e/2bc836437f4b84d62efc1faddce0d4e023a5d990bbddd3c78b2004ebc246/shapely-2.1.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:3004a644d9e89e26c20286d5fdc10f41b1744c48ce910bd1867fdff963fe6c48", size = 1832107, upload-time = "2025-05-19T11:04:19.736Z" }, + { url = 
"https://files.pythonhosted.org/packages/12/a2/12c7cae5b62d5d851c2db836eadd0986f63918a91976495861f7c492f4a9/shapely-2.1.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1415146fa12d80a47d13cfad5310b3c8b9c2aa8c14a0c845c9d3d75e77cb54f6", size = 1642355, upload-time = "2025-05-19T11:04:21.035Z" }, + { url = "https://files.pythonhosted.org/packages/5b/7e/6d28b43d53fea56de69c744e34c2b999ed4042f7a811dc1bceb876071c95/shapely-2.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21fcab88b7520820ec16d09d6bea68652ca13993c84dffc6129dc3607c95594c", size = 2968871, upload-time = "2025-05-19T11:04:22.167Z" }, + { url = "https://files.pythonhosted.org/packages/dd/87/1017c31e52370b2b79e4d29e07cbb590ab9e5e58cf7e2bdfe363765d6251/shapely-2.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e5ce6a5cc52c974b291237a96c08c5592e50f066871704fb5b12be2639d9026a", size = 3080830, upload-time = "2025-05-19T11:04:23.997Z" }, + { url = "https://files.pythonhosted.org/packages/1d/fe/f4a03d81abd96a6ce31c49cd8aaba970eaaa98e191bd1e4d43041e57ae5a/shapely-2.1.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:04e4c12a45a1d70aeb266618d8cf81a2de9c4df511b63e105b90bfdfb52146de", size = 3908961, upload-time = "2025-05-19T11:04:25.702Z" }, + { url = "https://files.pythonhosted.org/packages/ef/59/7605289a95a6844056a2017ab36d9b0cb9d6a3c3b5317c1f968c193031c9/shapely-2.1.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6ca74d851ca5264aae16c2b47e96735579686cb69fa93c4078070a0ec845b8d8", size = 4079623, upload-time = "2025-05-19T11:04:27.171Z" }, + { url = "https://files.pythonhosted.org/packages/bc/4d/9fea036eff2ef4059d30247128b2d67aaa5f0b25e9fc27e1d15cc1b84704/shapely-2.1.1-cp313-cp313-win32.whl", hash = "sha256:fd9130501bf42ffb7e0695b9ea17a27ae8ce68d50b56b6941c7f9b3d3453bc52", size = 1521916, upload-time = "2025-05-19T11:04:28.405Z" }, + { url = "https://files.pythonhosted.org/packages/12/d9/6d13b8957a17c95794f0c4dfb65ecd0957e6c7131a56ce18d135c1107a52/shapely-2.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:ab8d878687b438a2f4c138ed1a80941c6ab0029e0f4c785ecfe114413b498a97", size = 1702746, upload-time = "2025-05-19T11:04:29.643Z" }, + { url = "https://files.pythonhosted.org/packages/60/36/b1452e3e7f35f5f6454d96f3be6e2bb87082720ff6c9437ecc215fa79be0/shapely-2.1.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0c062384316a47f776305ed2fa22182717508ffdeb4a56d0ff4087a77b2a0f6d", size = 1833482, upload-time = "2025-05-19T11:04:30.852Z" }, + { url = "https://files.pythonhosted.org/packages/ce/ca/8e6f59be0718893eb3e478141285796a923636dc8f086f83e5b0ec0036d0/shapely-2.1.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4ecf6c196b896e8f1360cc219ed4eee1c1e5f5883e505d449f263bd053fb8c05", size = 1642256, upload-time = "2025-05-19T11:04:32.068Z" }, + { url = "https://files.pythonhosted.org/packages/ab/78/0053aea449bb1d4503999525fec6232f049abcdc8df60d290416110de943/shapely-2.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb00070b4c4860f6743c600285109c273cca5241e970ad56bb87bef0be1ea3a0", size = 3016614, upload-time = "2025-05-19T11:04:33.7Z" }, + { url = "https://files.pythonhosted.org/packages/ee/53/36f1b1de1dfafd1b457dcbafa785b298ce1b8a3e7026b79619e708a245d5/shapely-2.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d14a9afa5fa980fbe7bf63706fdfb8ff588f638f145a1d9dbc18374b5b7de913", size = 3093542, upload-time = "2025-05-19T11:04:34.952Z" }, + { url = 
"https://files.pythonhosted.org/packages/b9/bf/0619f37ceec6b924d84427c88835b61f27f43560239936ff88915c37da19/shapely-2.1.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:b640e390dabde790e3fb947198b466e63223e0a9ccd787da5f07bcb14756c28d", size = 3945961, upload-time = "2025-05-19T11:04:36.32Z" }, + { url = "https://files.pythonhosted.org/packages/93/c9/20ca4afeb572763b07a7997f00854cb9499df6af85929e93012b189d8917/shapely-2.1.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:69e08bf9697c1b73ec6aa70437db922bafcea7baca131c90c26d59491a9760f9", size = 4089514, upload-time = "2025-05-19T11:04:37.683Z" }, + { url = "https://files.pythonhosted.org/packages/33/6a/27036a5a560b80012a544366bceafd491e8abb94a8db14047b5346b5a749/shapely-2.1.1-cp313-cp313t-win32.whl", hash = "sha256:ef2d09d5a964cc90c2c18b03566cf918a61c248596998a0301d5b632beadb9db", size = 1540607, upload-time = "2025-05-19T11:04:38.925Z" }, + { url = "https://files.pythonhosted.org/packages/ea/f1/5e9b3ba5c7aa7ebfaf269657e728067d16a7c99401c7973ddf5f0cf121bd/shapely-2.1.1-cp313-cp313t-win_amd64.whl", hash = "sha256:8cb8f17c377260452e9d7720eeaf59082c5f8ea48cf104524d953e5d36d4bdb7", size = 1723061, upload-time = "2025-05-19T11:04:40.082Z" }, +] + +[[package]] +name = "shellingham" +version = "1.5.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, upload-time = "2023-10-24T04:13:40.426Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" }, +] + +[[package]] +name = "simplejson" +version = "3.20.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/af/92/51b417685abd96b31308b61b9acce7ec50d8e1de8fbc39a7fd4962c60689/simplejson-3.20.1.tar.gz", hash = "sha256:e64139b4ec4f1f24c142ff7dcafe55a22b811a74d86d66560c8815687143037d", size = 85591, upload-time = "2025-02-15T05:18:53.15Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/76/59/74bc90d1c051bc2432c96b34bd4e8036875ab58b4fcbe4d6a5a76985f853/simplejson-3.20.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:325b8c107253d3217e89d7b50c71015b5b31e2433e6c5bf38967b2f80630a8ca", size = 92132, upload-time = "2025-02-15T05:16:15.743Z" }, + { url = "https://files.pythonhosted.org/packages/71/c7/1970916e0c51794fff89f76da2f632aaf0b259b87753c88a8c409623d3e1/simplejson-3.20.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:88a7baa8211089b9e58d78fbc1b0b322103f3f3d459ff16f03a36cece0d0fcf0", size = 74956, upload-time = "2025-02-15T05:16:17.062Z" }, + { url = "https://files.pythonhosted.org/packages/c8/0d/98cc5909180463f1d75fac7180de62d4cdb4e82c4fef276b9e591979372c/simplejson-3.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:299b1007b8101d50d95bc0db1bf5c38dc372e85b504cf77f596462083ee77e3f", size = 74772, upload-time = "2025-02-15T05:16:19.204Z" }, + { url = "https://files.pythonhosted.org/packages/e1/94/a30a5211a90d67725a3e8fcc1c788189f2ae2ed2b96b63ed15d0b7f5d6bb/simplejson-3.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:03ec618ed65caab48e81e3ed29586236a8e57daef792f1f3bb59504a7e98cd10", size = 143575, upload-time = "2025-02-15T05:16:21.337Z" }, + { url = "https://files.pythonhosted.org/packages/ee/08/cdb6821f1058eb5db46d252de69ff7e6c53f05f1bae6368fe20d5b51d37e/simplejson-3.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cd2cdead1d3197f0ff43373cf4730213420523ba48697743e135e26f3d179f38", size = 153241, upload-time = "2025-02-15T05:16:22.859Z" }, + { url = "https://files.pythonhosted.org/packages/4c/2d/ca3caeea0bdc5efc5503d5f57a2dfb56804898fb196dfada121323ee0ccb/simplejson-3.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3466d2839fdc83e1af42e07b90bc8ff361c4e8796cd66722a40ba14e458faddd", size = 141500, upload-time = "2025-02-15T05:16:25.068Z" }, + { url = "https://files.pythonhosted.org/packages/e1/33/d3e0779d5c58245e7370c98eb969275af6b7a4a5aec3b97cbf85f09ad328/simplejson-3.20.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d492ed8e92f3a9f9be829205f44b1d0a89af6582f0cf43e0d129fa477b93fe0c", size = 144757, upload-time = "2025-02-15T05:16:28.301Z" }, + { url = "https://files.pythonhosted.org/packages/54/53/2d93128bb55861b2fa36c5944f38da51a0bc6d83e513afc6f7838440dd15/simplejson-3.20.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:f924b485537b640dc69434565463fd6fc0c68c65a8c6e01a823dd26c9983cf79", size = 144409, upload-time = "2025-02-15T05:16:29.687Z" }, + { url = "https://files.pythonhosted.org/packages/99/4c/dac310a98f897ad3435b4bdc836d92e78f09e38c5dbf28211ed21dc59fa2/simplejson-3.20.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9e8eacf6a3491bf76ea91a8d46726368a6be0eb94993f60b8583550baae9439e", size = 146082, upload-time = "2025-02-15T05:16:31.064Z" }, + { url = "https://files.pythonhosted.org/packages/ee/22/d7ba958cfed39827335b82656b1c46f89678faecda9a7677b47e87b48ee6/simplejson-3.20.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:d34d04bf90b4cea7c22d8b19091633908f14a096caa301b24c2f3d85b5068fb8", size = 154339, upload-time = "2025-02-15T05:16:32.719Z" }, + { url = "https://files.pythonhosted.org/packages/b8/c8/b072b741129406a7086a0799c6f5d13096231bf35fdd87a0cffa789687fc/simplejson-3.20.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:69dd28d4ce38390ea4aaf212902712c0fd1093dc4c1ff67e09687c3c3e15a749", size = 147915, upload-time = "2025-02-15T05:16:34.291Z" }, + { url = "https://files.pythonhosted.org/packages/6c/46/8347e61e9cf3db5342a42f7fd30a81b4f5cf85977f916852d7674a540907/simplejson-3.20.1-cp311-cp311-win32.whl", hash = "sha256:dfe7a9da5fd2a3499436cd350f31539e0a6ded5da6b5b3d422df016444d65e43", size = 73972, upload-time = "2025-02-15T05:16:35.712Z" }, + { url = "https://files.pythonhosted.org/packages/01/85/b52f24859237b4e9d523d5655796d911ba3d46e242eb1959c45b6af5aedd/simplejson-3.20.1-cp311-cp311-win_amd64.whl", hash = "sha256:896a6c04d7861d507d800da7642479c3547060bf97419d9ef73d98ced8258766", size = 75595, upload-time = "2025-02-15T05:16:36.957Z" }, + { url = "https://files.pythonhosted.org/packages/8d/eb/34c16a1ac9ba265d024dc977ad84e1659d931c0a700967c3e59a98ed7514/simplejson-3.20.1-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f31c4a3a7ab18467ee73a27f3e59158255d1520f3aad74315edde7a940f1be23", size = 93100, upload-time = "2025-02-15T05:16:38.801Z" }, + { url = 
"https://files.pythonhosted.org/packages/41/fc/2c2c007d135894971e6814e7c0806936e5bade28f8db4dd7e2a58b50debd/simplejson-3.20.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:884e6183d16b725e113b83a6fc0230152ab6627d4d36cb05c89c2c5bccfa7bc6", size = 75464, upload-time = "2025-02-15T05:16:40.905Z" }, + { url = "https://files.pythonhosted.org/packages/0f/05/2b5ecb33b776c34bb5cace5de5d7669f9b60e3ca13c113037b2ca86edfbd/simplejson-3.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:03d7a426e416fe0d3337115f04164cd9427eb4256e843a6b8751cacf70abc832", size = 75112, upload-time = "2025-02-15T05:16:42.246Z" }, + { url = "https://files.pythonhosted.org/packages/fe/36/1f3609a2792f06cd4b71030485f78e91eb09cfd57bebf3116bf2980a8bac/simplejson-3.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:000602141d0bddfcff60ea6a6e97d5e10c9db6b17fd2d6c66199fa481b6214bb", size = 150182, upload-time = "2025-02-15T05:16:43.557Z" }, + { url = "https://files.pythonhosted.org/packages/2f/b0/053fbda38b8b602a77a4f7829def1b4f316cd8deb5440a6d3ee90790d2a4/simplejson-3.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:af8377a8af78226e82e3a4349efdde59ffa421ae88be67e18cef915e4023a595", size = 158363, upload-time = "2025-02-15T05:16:45.748Z" }, + { url = "https://files.pythonhosted.org/packages/d1/4b/2eb84ae867539a80822e92f9be4a7200dffba609275faf99b24141839110/simplejson-3.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:15c7de4c88ab2fbcb8781a3b982ef883696736134e20b1210bca43fb42ff1acf", size = 148415, upload-time = "2025-02-15T05:16:47.861Z" }, + { url = "https://files.pythonhosted.org/packages/e0/bd/400b0bd372a5666addf2540c7358bfc3841b9ce5cdbc5cc4ad2f61627ad8/simplejson-3.20.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:455a882ff3f97d810709f7b620007d4e0aca8da71d06fc5c18ba11daf1c4df49", size = 152213, upload-time = "2025-02-15T05:16:49.25Z" }, + { url = "https://files.pythonhosted.org/packages/50/12/143f447bf6a827ee9472693768dc1a5eb96154f8feb140a88ce6973a3cfa/simplejson-3.20.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:fc0f523ce923e7f38eb67804bc80e0a028c76d7868500aa3f59225574b5d0453", size = 150048, upload-time = "2025-02-15T05:16:51.5Z" }, + { url = "https://files.pythonhosted.org/packages/5e/ea/dd9b3e8e8ed710a66f24a22c16a907c9b539b6f5f45fd8586bd5c231444e/simplejson-3.20.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:76461ec929282dde4a08061071a47281ad939d0202dc4e63cdd135844e162fbc", size = 151668, upload-time = "2025-02-15T05:16:53Z" }, + { url = "https://files.pythonhosted.org/packages/99/af/ee52a8045426a0c5b89d755a5a70cc821815ef3c333b56fbcad33c4435c0/simplejson-3.20.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:ab19c2da8c043607bde4d4ef3a6b633e668a7d2e3d56f40a476a74c5ea71949f", size = 158840, upload-time = "2025-02-15T05:16:54.851Z" }, + { url = "https://files.pythonhosted.org/packages/68/db/ab32869acea6b5de7d75fa0dac07a112ded795d41eaa7e66c7813b17be95/simplejson-3.20.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b2578bedaedf6294415197b267d4ef678fea336dd78ee2a6d2f4b028e9d07be3", size = 154212, upload-time = "2025-02-15T05:16:56.318Z" }, + { url = "https://files.pythonhosted.org/packages/fa/7a/e3132d454977d75a3bf9a6d541d730f76462ebf42a96fea2621498166f41/simplejson-3.20.1-cp312-cp312-win32.whl", hash = "sha256:339f407373325a36b7fd744b688ba5bae0666b5d340ec6d98aebc3014bf3d8ea", size = 74101, 
upload-time = "2025-02-15T05:16:57.746Z" }, + { url = "https://files.pythonhosted.org/packages/bc/5d/4e243e937fa3560107c69f6f7c2eed8589163f5ed14324e864871daa2dd9/simplejson-3.20.1-cp312-cp312-win_amd64.whl", hash = "sha256:627d4486a1ea7edf1f66bb044ace1ce6b4c1698acd1b05353c97ba4864ea2e17", size = 75736, upload-time = "2025-02-15T05:16:59.017Z" }, + { url = "https://files.pythonhosted.org/packages/c4/03/0f453a27877cb5a5fff16a975925f4119102cc8552f52536b9a98ef0431e/simplejson-3.20.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:71e849e7ceb2178344998cbe5ade101f1b329460243c79c27fbfc51c0447a7c3", size = 93109, upload-time = "2025-02-15T05:17:00.377Z" }, + { url = "https://files.pythonhosted.org/packages/74/1f/a729f4026850cabeaff23e134646c3f455e86925d2533463420635ae54de/simplejson-3.20.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b63fdbab29dc3868d6f009a59797cefaba315fd43cd32ddd998ee1da28e50e29", size = 75475, upload-time = "2025-02-15T05:17:02.544Z" }, + { url = "https://files.pythonhosted.org/packages/e2/14/50a2713fee8ff1f8d655b1a14f4a0f1c0c7246768a1b3b3d12964a4ed5aa/simplejson-3.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1190f9a3ce644fd50ec277ac4a98c0517f532cfebdcc4bd975c0979a9f05e1fb", size = 75112, upload-time = "2025-02-15T05:17:03.875Z" }, + { url = "https://files.pythonhosted.org/packages/45/86/ea9835abb646755140e2d482edc9bc1e91997ed19a59fd77ae4c6a0facea/simplejson-3.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c1336ba7bcb722ad487cd265701ff0583c0bb6de638364ca947bb84ecc0015d1", size = 150245, upload-time = "2025-02-15T05:17:06.899Z" }, + { url = "https://files.pythonhosted.org/packages/12/b4/53084809faede45da829fe571c65fbda8479d2a5b9c633f46b74124d56f5/simplejson-3.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e975aac6a5acd8b510eba58d5591e10a03e3d16c1cf8a8624ca177491f7230f0", size = 158465, upload-time = "2025-02-15T05:17:08.707Z" }, + { url = "https://files.pythonhosted.org/packages/a9/7d/d56579468d1660b3841e1f21c14490d103e33cf911886b22652d6e9683ec/simplejson-3.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6a6dd11ee282937ad749da6f3b8d87952ad585b26e5edfa10da3ae2536c73078", size = 148514, upload-time = "2025-02-15T05:17:11.323Z" }, + { url = "https://files.pythonhosted.org/packages/19/e3/874b1cca3d3897b486d3afdccc475eb3a09815bf1015b01cf7fcb52a55f0/simplejson-3.20.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ab980fcc446ab87ea0879edad41a5c28f2d86020014eb035cf5161e8de4474c6", size = 152262, upload-time = "2025-02-15T05:17:13.543Z" }, + { url = "https://files.pythonhosted.org/packages/32/84/f0fdb3625292d945c2bd13a814584603aebdb38cfbe5fe9be6b46fe598c4/simplejson-3.20.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f5aee2a4cb6b146bd17333ac623610f069f34e8f31d2f4f0c1a2186e50c594f0", size = 150164, upload-time = "2025-02-15T05:17:15.021Z" }, + { url = "https://files.pythonhosted.org/packages/95/51/6d625247224f01eaaeabace9aec75ac5603a42f8ebcce02c486fbda8b428/simplejson-3.20.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:652d8eecbb9a3b6461b21ec7cf11fd0acbab144e45e600c817ecf18e4580b99e", size = 151795, upload-time = "2025-02-15T05:17:16.542Z" }, + { url = "https://files.pythonhosted.org/packages/7f/d9/bb921df6b35be8412f519e58e86d1060fddf3ad401b783e4862e0a74c4c1/simplejson-3.20.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = 
"sha256:8c09948f1a486a89251ee3a67c9f8c969b379f6ffff1a6064b41fea3bce0a112", size = 159027, upload-time = "2025-02-15T05:17:18.083Z" }, + { url = "https://files.pythonhosted.org/packages/03/c5/5950605e4ad023a6621cf4c931b29fd3d2a9c1f36be937230bfc83d7271d/simplejson-3.20.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:cbbd7b215ad4fc6f058b5dd4c26ee5c59f72e031dfda3ac183d7968a99e4ca3a", size = 154380, upload-time = "2025-02-15T05:17:20.334Z" }, + { url = "https://files.pythonhosted.org/packages/66/ad/b74149557c5ec1e4e4d55758bda426f5d2ec0123cd01a53ae63b8de51fa3/simplejson-3.20.1-cp313-cp313-win32.whl", hash = "sha256:ae81e482476eaa088ef9d0120ae5345de924f23962c0c1e20abbdff597631f87", size = 74102, upload-time = "2025-02-15T05:17:22.475Z" }, + { url = "https://files.pythonhosted.org/packages/db/a9/25282fdd24493e1022f30b7f5cdf804255c007218b2bfaa655bd7ad34b2d/simplejson-3.20.1-cp313-cp313-win_amd64.whl", hash = "sha256:1b9fd15853b90aec3b1739f4471efbf1ac05066a2c7041bf8db821bb73cd2ddc", size = 75736, upload-time = "2025-02-15T05:17:24.122Z" }, + { url = "https://files.pythonhosted.org/packages/4b/30/00f02a0a921556dd5a6db1ef2926a1bc7a8bbbfb1c49cfed68a275b8ab2b/simplejson-3.20.1-py3-none-any.whl", hash = "sha256:8a6c1bbac39fa4a79f83cbf1df6ccd8ff7069582a9fd8db1e52cea073bc2c697", size = 57121, upload-time = "2025-02-15T05:18:51.243Z" }, +] + +[[package]] +name = "six" +version = "1.17.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" }, +] + +[[package]] +name = "smmap" +version = "5.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/44/cd/a040c4b3119bbe532e5b0732286f805445375489fceaec1f48306068ee3b/smmap-5.0.2.tar.gz", hash = "sha256:26ea65a03958fa0c8a1c7e8c7a58fdc77221b8910f6be2131affade476898ad5", size = 22329, upload-time = "2025-01-02T07:14:40.909Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/be/d09147ad1ec7934636ad912901c5fd7667e1c858e19d355237db0d0cd5e4/smmap-5.0.2-py3-none-any.whl", hash = "sha256:b30115f0def7d7531d22a0fb6502488d879e75b260a9db4d0819cfb25403af5e", size = 24303, upload-time = "2025-01-02T07:14:38.724Z" }, +] + +[[package]] +name = "sniffio" +version = "1.3.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" }, +] + +[[package]] +name = "sqlalchemy" +version = "2.0.43" +source = { registry = "https://pypi.org/simple" } +dependencies = [ 
+ { name = "greenlet", marker = "(python_full_version < '3.14' and platform_machine == 'AMD64') or (python_full_version < '3.14' and platform_machine == 'WIN32') or (python_full_version < '3.14' and platform_machine == 'aarch64') or (python_full_version < '3.14' and platform_machine == 'amd64') or (python_full_version < '3.14' and platform_machine == 'ppc64le') or (python_full_version < '3.14' and platform_machine == 'win32') or (python_full_version < '3.14' and platform_machine == 'x86_64')" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d7/bc/d59b5d97d27229b0e009bd9098cd81af71c2fa5549c580a0a67b9bed0496/sqlalchemy-2.0.43.tar.gz", hash = "sha256:788bfcef6787a7764169cfe9859fe425bf44559619e1d9f56f5bddf2ebf6f417", size = 9762949, upload-time = "2025-08-11T14:24:58.438Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9d/77/fa7189fe44114658002566c6fe443d3ed0ec1fa782feb72af6ef7fbe98e7/sqlalchemy-2.0.43-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:52d9b73b8fb3e9da34c2b31e6d99d60f5f99fd8c1225c9dad24aeb74a91e1d29", size = 2136472, upload-time = "2025-08-11T15:52:21.789Z" }, + { url = "https://files.pythonhosted.org/packages/99/ea/92ac27f2fbc2e6c1766bb807084ca455265707e041ba027c09c17d697867/sqlalchemy-2.0.43-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f42f23e152e4545157fa367b2435a1ace7571cab016ca26038867eb7df2c3631", size = 2126535, upload-time = "2025-08-11T15:52:23.109Z" }, + { url = "https://files.pythonhosted.org/packages/94/12/536ede80163e295dc57fff69724caf68f91bb40578b6ac6583a293534849/sqlalchemy-2.0.43-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4fb1a8c5438e0c5ea51afe9c6564f951525795cf432bed0c028c1cb081276685", size = 3297521, upload-time = "2025-08-11T15:50:33.536Z" }, + { url = "https://files.pythonhosted.org/packages/03/b5/cacf432e6f1fc9d156eca0560ac61d4355d2181e751ba8c0cd9cb232c8c1/sqlalchemy-2.0.43-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db691fa174e8f7036afefe3061bc40ac2b770718be2862bfb03aabae09051aca", size = 3297343, upload-time = "2025-08-11T15:57:51.186Z" }, + { url = "https://files.pythonhosted.org/packages/ca/ba/d4c9b526f18457667de4c024ffbc3a0920c34237b9e9dd298e44c7c00ee5/sqlalchemy-2.0.43-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:fe2b3b4927d0bc03d02ad883f402d5de201dbc8894ac87d2e981e7d87430e60d", size = 3232113, upload-time = "2025-08-11T15:50:34.949Z" }, + { url = "https://files.pythonhosted.org/packages/aa/79/c0121b12b1b114e2c8a10ea297a8a6d5367bc59081b2be896815154b1163/sqlalchemy-2.0.43-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4d3d9b904ad4a6b175a2de0738248822f5ac410f52c2fd389ada0b5262d6a1e3", size = 3258240, upload-time = "2025-08-11T15:57:52.983Z" }, + { url = "https://files.pythonhosted.org/packages/79/99/a2f9be96fb382f3ba027ad42f00dbe30fdb6ba28cda5f11412eee346bec5/sqlalchemy-2.0.43-cp311-cp311-win32.whl", hash = "sha256:5cda6b51faff2639296e276591808c1726c4a77929cfaa0f514f30a5f6156921", size = 2101248, upload-time = "2025-08-11T15:55:01.855Z" }, + { url = "https://files.pythonhosted.org/packages/ee/13/744a32ebe3b4a7a9c7ea4e57babae7aa22070d47acf330d8e5a1359607f1/sqlalchemy-2.0.43-cp311-cp311-win_amd64.whl", hash = "sha256:c5d1730b25d9a07727d20ad74bc1039bbbb0a6ca24e6769861c1aa5bf2c4c4a8", size = 2126109, upload-time = "2025-08-11T15:55:04.092Z" }, + { url = 
"https://files.pythonhosted.org/packages/61/db/20c78f1081446095450bdc6ee6cc10045fce67a8e003a5876b6eaafc5cc4/sqlalchemy-2.0.43-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:20d81fc2736509d7a2bd33292e489b056cbae543661bb7de7ce9f1c0cd6e7f24", size = 2134891, upload-time = "2025-08-11T15:51:13.019Z" }, + { url = "https://files.pythonhosted.org/packages/45/0a/3d89034ae62b200b4396f0f95319f7d86e9945ee64d2343dcad857150fa2/sqlalchemy-2.0.43-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:25b9fc27650ff5a2c9d490c13c14906b918b0de1f8fcbb4c992712d8caf40e83", size = 2123061, upload-time = "2025-08-11T15:51:14.319Z" }, + { url = "https://files.pythonhosted.org/packages/cb/10/2711f7ff1805919221ad5bee205971254845c069ee2e7036847103ca1e4c/sqlalchemy-2.0.43-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6772e3ca8a43a65a37c88e2f3e2adfd511b0b1da37ef11ed78dea16aeae85bd9", size = 3320384, upload-time = "2025-08-11T15:52:35.088Z" }, + { url = "https://files.pythonhosted.org/packages/6e/0e/3d155e264d2ed2778484006ef04647bc63f55b3e2d12e6a4f787747b5900/sqlalchemy-2.0.43-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a113da919c25f7f641ffbd07fbc9077abd4b3b75097c888ab818f962707eb48", size = 3329648, upload-time = "2025-08-11T15:56:34.153Z" }, + { url = "https://files.pythonhosted.org/packages/5b/81/635100fb19725c931622c673900da5efb1595c96ff5b441e07e3dd61f2be/sqlalchemy-2.0.43-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:4286a1139f14b7d70141c67a8ae1582fc2b69105f1b09d9573494eb4bb4b2687", size = 3258030, upload-time = "2025-08-11T15:52:36.933Z" }, + { url = "https://files.pythonhosted.org/packages/0c/ed/a99302716d62b4965fded12520c1cbb189f99b17a6d8cf77611d21442e47/sqlalchemy-2.0.43-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:529064085be2f4d8a6e5fab12d36ad44f1909a18848fcfbdb59cc6d4bbe48efe", size = 3294469, upload-time = "2025-08-11T15:56:35.553Z" }, + { url = "https://files.pythonhosted.org/packages/5d/a2/3a11b06715149bf3310b55a98b5c1e84a42cfb949a7b800bc75cb4e33abc/sqlalchemy-2.0.43-cp312-cp312-win32.whl", hash = "sha256:b535d35dea8bbb8195e7e2b40059e2253acb2b7579b73c1b432a35363694641d", size = 2098906, upload-time = "2025-08-11T15:55:00.645Z" }, + { url = "https://files.pythonhosted.org/packages/bc/09/405c915a974814b90aa591280623adc6ad6b322f61fd5cff80aeaef216c9/sqlalchemy-2.0.43-cp312-cp312-win_amd64.whl", hash = "sha256:1c6d85327ca688dbae7e2b06d7d84cfe4f3fffa5b5f9e21bb6ce9d0e1a0e0e0a", size = 2126260, upload-time = "2025-08-11T15:55:02.965Z" }, + { url = "https://files.pythonhosted.org/packages/41/1c/a7260bd47a6fae7e03768bf66451437b36451143f36b285522b865987ced/sqlalchemy-2.0.43-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e7c08f57f75a2bb62d7ee80a89686a5e5669f199235c6d1dac75cd59374091c3", size = 2130598, upload-time = "2025-08-11T15:51:15.903Z" }, + { url = "https://files.pythonhosted.org/packages/8e/84/8a337454e82388283830b3586ad7847aa9c76fdd4f1df09cdd1f94591873/sqlalchemy-2.0.43-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:14111d22c29efad445cd5021a70a8b42f7d9152d8ba7f73304c4d82460946aaa", size = 2118415, upload-time = "2025-08-11T15:51:17.256Z" }, + { url = "https://files.pythonhosted.org/packages/cf/ff/22ab2328148492c4d71899d62a0e65370ea66c877aea017a244a35733685/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21b27b56eb2f82653168cefe6cb8e970cdaf4f3a6cb2c5e3c3c1cf3158968ff9", size = 3248707, upload-time = "2025-08-11T15:52:38.444Z" }, + { url = 
"https://files.pythonhosted.org/packages/dc/29/11ae2c2b981de60187f7cbc84277d9d21f101093d1b2e945c63774477aba/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c5a9da957c56e43d72126a3f5845603da00e0293720b03bde0aacffcf2dc04f", size = 3253602, upload-time = "2025-08-11T15:56:37.348Z" }, + { url = "https://files.pythonhosted.org/packages/b8/61/987b6c23b12c56d2be451bc70900f67dd7d989d52b1ee64f239cf19aec69/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5d79f9fdc9584ec83d1b3c75e9f4595c49017f5594fee1a2217117647225d738", size = 3183248, upload-time = "2025-08-11T15:52:39.865Z" }, + { url = "https://files.pythonhosted.org/packages/86/85/29d216002d4593c2ce1c0ec2cec46dda77bfbcd221e24caa6e85eff53d89/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9df7126fd9db49e3a5a3999442cc67e9ee8971f3cb9644250107d7296cb2a164", size = 3219363, upload-time = "2025-08-11T15:56:39.11Z" }, + { url = "https://files.pythonhosted.org/packages/b6/e4/bd78b01919c524f190b4905d47e7630bf4130b9f48fd971ae1c6225b6f6a/sqlalchemy-2.0.43-cp313-cp313-win32.whl", hash = "sha256:7f1ac7828857fcedb0361b48b9ac4821469f7694089d15550bbcf9ab22564a1d", size = 2096718, upload-time = "2025-08-11T15:55:05.349Z" }, + { url = "https://files.pythonhosted.org/packages/ac/a5/ca2f07a2a201f9497de1928f787926613db6307992fe5cda97624eb07c2f/sqlalchemy-2.0.43-cp313-cp313-win_amd64.whl", hash = "sha256:971ba928fcde01869361f504fcff3b7143b47d30de188b11c6357c0505824197", size = 2123200, upload-time = "2025-08-11T15:55:07.932Z" }, + { url = "https://files.pythonhosted.org/packages/b8/d9/13bdde6521f322861fab67473cec4b1cc8999f3871953531cf61945fad92/sqlalchemy-2.0.43-py3-none-any.whl", hash = "sha256:1681c21dd2ccee222c2fe0bef671d1aef7c504087c9c4e800371cfcc8ac966fc", size = 1924759, upload-time = "2025-08-11T15:39:53.024Z" }, +] + +[package.optional-dependencies] +asyncio = [ + { name = "greenlet" }, +] + +[[package]] +name = "sqlalchemy-spanner" +version = "1.16.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "alembic" }, + { name = "google-cloud-spanner" }, + { name = "sqlalchemy" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/bf/6c/d9a2e05d839ec4d00d11887f18e66de331f696b162159dc2655e3910bb55/sqlalchemy_spanner-1.16.0.tar.gz", hash = "sha256:5143d5d092f2f1fef66b332163291dc7913a58292580733a601ff5fae160515a", size = 82748, upload-time = "2025-09-02T08:26:00.645Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/94/74/a9c88abddfeca46c253000e87aad923014c1907953e06b39a0cbec229a86/sqlalchemy_spanner-1.16.0-py3-none-any.whl", hash = "sha256:e53cadb2b973e88936c0a9874e133ee9a0829ea3261f328b4ca40bdedf2016c1", size = 32069, upload-time = "2025-09-02T08:25:59.264Z" }, +] + +[[package]] +name = "sqlglot" +version = "27.16.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/2d/a3/b29fd2d07ee1b0267b3bbe7d38610d05844daa089bba8657c5321a24fd79/sqlglot-27.16.1.tar.gz", hash = "sha256:b89d2b4dba879e40aff6a1c805d68e8c33d53a821c67242f373d361a727181e8", size = 5471043, upload-time = "2025-09-18T13:01:49.85Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4c/c2/161ed0fc376c55a4e48074f14dbf3c3dae44abcd9021725584443049d60a/sqlglot-27.16.1-py3-none-any.whl", hash = "sha256:9a080a4ce3bebe5a38b1f84c38c2fb5207828ab8ca09871102ad5ad231f58571", size = 517887, upload-time = "2025-09-18T13:01:47.413Z" }, +] + +[[package]] +name = "sqlparse" +version = "0.5.3" +source = 
{ registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e5/40/edede8dd6977b0d3da179a342c198ed100dd2aba4be081861ee5911e4da4/sqlparse-0.5.3.tar.gz", hash = "sha256:09f67787f56a0b16ecdbde1bfc7f5d9c3371ca683cfeaa8e6ff60b4807ec9272", size = 84999, upload-time = "2024-12-10T12:05:30.728Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a9/5c/bfd6bd0bf979426d405cc6e71eceb8701b148b16c21d2dc3c261efc61c7b/sqlparse-0.5.3-py3-none-any.whl", hash = "sha256:cf2196ed3418f3ba5de6af7e82c694a9fbdbfecccdfc72e281548517081f16ca", size = 44415, upload-time = "2024-12-10T12:05:27.824Z" }, +] + +[[package]] +name = "sse-starlette" +version = "3.0.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/42/6f/22ed6e33f8a9e76ca0a412405f31abb844b779d52c5f96660766edcd737c/sse_starlette-3.0.2.tar.gz", hash = "sha256:ccd60b5765ebb3584d0de2d7a6e4f745672581de4f5005ab31c3a25d10b52b3a", size = 20985, upload-time = "2025-07-27T09:07:44.565Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ef/10/c78f463b4ef22eef8491f218f692be838282cd65480f6e423d7730dfd1fb/sse_starlette-3.0.2-py3-none-any.whl", hash = "sha256:16b7cbfddbcd4eaca11f7b586f3b8a080f1afe952c15813455b162edea619e5a", size = 11297, upload-time = "2025-07-27T09:07:43.268Z" }, +] + +[[package]] +name = "sseclient-py" +version = "1.8.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e8/ed/3df5ab8bb0c12f86c28d0cadb11ed1de44a92ed35ce7ff4fd5518a809325/sseclient-py-1.8.0.tar.gz", hash = "sha256:c547c5c1a7633230a38dc599a21a2dc638f9b5c297286b48b46b935c71fac3e8", size = 7791, upload-time = "2023-09-01T19:39:20.45Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/49/58/97655efdfeb5b4eeab85b1fc5d3fa1023661246c2ab2a26ea8e47402d4f2/sseclient_py-1.8.0-py2.py3-none-any.whl", hash = "sha256:4ecca6dc0b9f963f8384e9d7fd529bf93dd7d708144c4fb5da0e0a1a926fee83", size = 8828, upload-time = "2023-09-01T19:39:17.627Z" }, +] + +[[package]] +name = "starlette" +version = "0.48.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a7/a5/d6f429d43394057b67a6b5bbe6eae2f77a6bf7459d961fdb224bf206eee6/starlette-0.48.0.tar.gz", hash = "sha256:7e8cee469a8ab2352911528110ce9088fdc6a37d9876926e73da7ce4aa4c7a46", size = 2652949, upload-time = "2025-09-13T08:41:05.699Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/be/72/2db2f49247d0a18b4f1bb9a5a39a0162869acf235f3a96418363947b3d46/starlette-0.48.0-py3-none-any.whl", hash = "sha256:0764ca97b097582558ecb498132ed0c7d942f233f365b86ba37770e026510659", size = 73736, upload-time = "2025-09-13T08:41:03.869Z" }, +] + +[[package]] +name = "structlog" +version = "25.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/79/b9/6e672db4fec07349e7a8a8172c1a6ae235c58679ca29c3f86a61b5e59ff3/structlog-25.4.0.tar.gz", hash = "sha256:186cd1b0a8ae762e29417095664adf1d6a31702160a46dacb7796ea82f7409e4", size = 1369138, upload-time = "2025-06-02T08:21:12.971Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a0/4a/97ee6973e3a73c74c8120d59829c3861ea52210667ec3e7a16045c62b64d/structlog-25.4.0-py3-none-any.whl", hash = 
"sha256:fe809ff5c27e557d14e613f45ca441aabda051d119ee5a0102aaba6ce40eed2c", size = 68720, upload-time = "2025-06-02T08:21:11.43Z" }, +] + +[[package]] +name = "sympy" +version = "1.14.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mpmath" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/83/d3/803453b36afefb7c2bb238361cd4ae6125a569b4db67cd9e79846ba2d68c/sympy-1.14.0.tar.gz", hash = "sha256:d3d3fe8df1e5a0b42f0e7bdf50541697dbe7d23746e894990c030e2b05e72517", size = 7793921, upload-time = "2025-04-27T18:05:01.611Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a2/09/77d55d46fd61b4a135c444fc97158ef34a095e5681d0a6c10b75bf356191/sympy-1.14.0-py3-none-any.whl", hash = "sha256:e091cc3e99d2141a0ba2847328f5479b05d94a6635cb96148ccb3f34671bd8f5", size = 6299353, upload-time = "2025-04-27T18:04:59.103Z" }, +] + +[[package]] +name = "tenacity" +version = "8.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a3/4d/6a19536c50b849338fcbe9290d562b52cbdcf30d8963d3588a68a4107df1/tenacity-8.5.0.tar.gz", hash = "sha256:8bc6c0c8a09b31e6cad13c47afbed1a567518250a9a171418582ed8d9c20ca78", size = 47309, upload-time = "2024-07-05T07:25:31.836Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/3f/8ba87d9e287b9d385a02a7114ddcef61b26f86411e121c9003eb509a1773/tenacity-8.5.0-py3-none-any.whl", hash = "sha256:b594c2a5945830c267ce6b79a166228323ed52718f30302c1359836112346687", size = 28165, upload-time = "2024-07-05T07:25:29.591Z" }, +] + +[[package]] +name = "termcolor" +version = "2.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/10/56/d7d66a84f96d804155f6ff2873d065368b25a07222a6fd51c4f24ef6d764/termcolor-2.4.0.tar.gz", hash = "sha256:aab9e56047c8ac41ed798fa36d892a37aca6b3e9159f3e0c24bc64a9b3ac7b7a", size = 12664, upload-time = "2023-12-01T11:04:51.66Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d9/5f/8c716e47b3a50cbd7c146f45881e11d9414def768b7cd9c5e6650ec2a80a/termcolor-2.4.0-py3-none-any.whl", hash = "sha256:9297c0df9c99445c2412e832e882a7884038a25617c60cea2ad69488d4040d63", size = 7719, upload-time = "2023-12-01T11:04:50.019Z" }, +] + +[[package]] +name = "threadpoolctl" +version = "3.6.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b7/4d/08c89e34946fce2aec4fbb45c9016efd5f4d7f24af8e5d93296e935631d8/threadpoolctl-3.6.0.tar.gz", hash = "sha256:8ab8b4aa3491d812b623328249fab5302a68d2d71745c8a4c719a2fcaba9f44e", size = 21274, upload-time = "2025-03-13T13:49:23.031Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/32/d5/f9a850d79b0851d1d4ef6456097579a9005b31fea68726a4ae5f2d82ddd9/threadpoolctl-3.6.0-py3-none-any.whl", hash = "sha256:43a0b8fd5a2928500110039e43a5eed8480b918967083ea48dc3ab9f13c4a7fb", size = 18638, upload-time = "2025-03-13T13:49:21.846Z" }, +] + +[[package]] +name = "tiktoken" +version = "0.11.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "regex" }, + { name = "requests" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a7/86/ad0155a37c4f310935d5ac0b1ccf9bdb635dcb906e0a9a26b616dd55825a/tiktoken-0.11.0.tar.gz", hash = "sha256:3c518641aee1c52247c2b97e74d8d07d780092af79d5911a6ab5e79359d9b06a", size = 37648, upload-time = "2025-08-08T23:58:08.495Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/8a/91/912b459799a025d2842566fe1e902f7f50d54a1ce8a0f236ab36b5bd5846/tiktoken-0.11.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4ae374c46afadad0f501046db3da1b36cd4dfbfa52af23c998773682446097cf", size = 1059743, upload-time = "2025-08-08T23:57:37.516Z" }, + { url = "https://files.pythonhosted.org/packages/8c/e9/6faa6870489ce64f5f75dcf91512bf35af5864583aee8fcb0dcb593121f5/tiktoken-0.11.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:25a512ff25dc6c85b58f5dd4f3d8c674dc05f96b02d66cdacf628d26a4e4866b", size = 999334, upload-time = "2025-08-08T23:57:38.595Z" }, + { url = "https://files.pythonhosted.org/packages/a1/3e/a05d1547cf7db9dc75d1461cfa7b556a3b48e0516ec29dfc81d984a145f6/tiktoken-0.11.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2130127471e293d385179c1f3f9cd445070c0772be73cdafb7cec9a3684c0458", size = 1129402, upload-time = "2025-08-08T23:57:39.627Z" }, + { url = "https://files.pythonhosted.org/packages/34/9a/db7a86b829e05a01fd4daa492086f708e0a8b53952e1dbc9d380d2b03677/tiktoken-0.11.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21e43022bf2c33f733ea9b54f6a3f6b4354b909f5a73388fb1b9347ca54a069c", size = 1184046, upload-time = "2025-08-08T23:57:40.689Z" }, + { url = "https://files.pythonhosted.org/packages/9d/bb/52edc8e078cf062ed749248f1454e9e5cfd09979baadb830b3940e522015/tiktoken-0.11.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:adb4e308eb64380dc70fa30493e21c93475eaa11669dea313b6bbf8210bfd013", size = 1244691, upload-time = "2025-08-08T23:57:42.251Z" }, + { url = "https://files.pythonhosted.org/packages/60/d9/884b6cd7ae2570ecdcaffa02b528522b18fef1cbbfdbcaa73799807d0d3b/tiktoken-0.11.0-cp311-cp311-win_amd64.whl", hash = "sha256:ece6b76bfeeb61a125c44bbefdfccc279b5288e6007fbedc0d32bfec602df2f2", size = 884392, upload-time = "2025-08-08T23:57:43.628Z" }, + { url = "https://files.pythonhosted.org/packages/e7/9e/eceddeffc169fc75fe0fd4f38471309f11cb1906f9b8aa39be4f5817df65/tiktoken-0.11.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fd9e6b23e860973cf9526544e220b223c60badf5b62e80a33509d6d40e6c8f5d", size = 1055199, upload-time = "2025-08-08T23:57:45.076Z" }, + { url = "https://files.pythonhosted.org/packages/4f/cf/5f02bfefffdc6b54e5094d2897bc80efd43050e5b09b576fd85936ee54bf/tiktoken-0.11.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6a76d53cee2da71ee2731c9caa747398762bda19d7f92665e882fef229cb0b5b", size = 996655, upload-time = "2025-08-08T23:57:46.304Z" }, + { url = "https://files.pythonhosted.org/packages/65/8e/c769b45ef379bc360c9978c4f6914c79fd432400a6733a8afc7ed7b0726a/tiktoken-0.11.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ef72aab3ea240646e642413cb363b73869fed4e604dcfd69eec63dc54d603e8", size = 1128867, upload-time = "2025-08-08T23:57:47.438Z" }, + { url = "https://files.pythonhosted.org/packages/d5/2d/4d77f6feb9292bfdd23d5813e442b3bba883f42d0ac78ef5fdc56873f756/tiktoken-0.11.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f929255c705efec7a28bf515e29dc74220b2f07544a8c81b8d69e8efc4578bd", size = 1183308, upload-time = "2025-08-08T23:57:48.566Z" }, + { url = "https://files.pythonhosted.org/packages/7a/65/7ff0a65d3bb0fc5a1fb6cc71b03e0f6e71a68c5eea230d1ff1ba3fd6df49/tiktoken-0.11.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:61f1d15822e4404953d499fd1dcc62817a12ae9fb1e4898033ec8fe3915fdf8e", size = 1244301, upload-time = "2025-08-08T23:57:49.642Z" }, + { url = 
"https://files.pythonhosted.org/packages/f5/6e/5b71578799b72e5bdcef206a214c3ce860d999d579a3b56e74a6c8989ee2/tiktoken-0.11.0-cp312-cp312-win_amd64.whl", hash = "sha256:45927a71ab6643dfd3ef57d515a5db3d199137adf551f66453be098502838b0f", size = 884282, upload-time = "2025-08-08T23:57:50.759Z" }, + { url = "https://files.pythonhosted.org/packages/cc/cd/a9034bcee638716d9310443818d73c6387a6a96db93cbcb0819b77f5b206/tiktoken-0.11.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a5f3f25ffb152ee7fec78e90a5e5ea5b03b4ea240beed03305615847f7a6ace2", size = 1055339, upload-time = "2025-08-08T23:57:51.802Z" }, + { url = "https://files.pythonhosted.org/packages/f1/91/9922b345f611b4e92581f234e64e9661e1c524875c8eadd513c4b2088472/tiktoken-0.11.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7dc6e9ad16a2a75b4c4be7208055a1f707c9510541d94d9cc31f7fbdc8db41d8", size = 997080, upload-time = "2025-08-08T23:57:53.442Z" }, + { url = "https://files.pythonhosted.org/packages/d0/9d/49cd047c71336bc4b4af460ac213ec1c457da67712bde59b892e84f1859f/tiktoken-0.11.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5a0517634d67a8a48fd4a4ad73930c3022629a85a217d256a6e9b8b47439d1e4", size = 1128501, upload-time = "2025-08-08T23:57:54.808Z" }, + { url = "https://files.pythonhosted.org/packages/52/d5/a0dcdb40dd2ea357e83cb36258967f0ae96f5dd40c722d6e382ceee6bba9/tiktoken-0.11.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7fb4effe60574675118b73c6fbfd3b5868e5d7a1f570d6cc0d18724b09ecf318", size = 1182743, upload-time = "2025-08-08T23:57:56.307Z" }, + { url = "https://files.pythonhosted.org/packages/3b/17/a0fc51aefb66b7b5261ca1314afa83df0106b033f783f9a7bcbe8e741494/tiktoken-0.11.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:94f984c9831fd32688aef4348803b0905d4ae9c432303087bae370dc1381a2b8", size = 1244057, upload-time = "2025-08-08T23:57:57.628Z" }, + { url = "https://files.pythonhosted.org/packages/50/79/bcf350609f3a10f09fe4fc207f132085e497fdd3612f3925ab24d86a0ca0/tiktoken-0.11.0-cp313-cp313-win_amd64.whl", hash = "sha256:2177ffda31dec4023356a441793fed82f7af5291120751dee4d696414f54db0c", size = 883901, upload-time = "2025-08-08T23:57:59.359Z" }, +] + +[[package]] +name = "tokenizers" +version = "0.22.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "huggingface-hub" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5e/b4/c1ce3699e81977da2ace8b16d2badfd42b060e7d33d75c4ccdbf9dc920fa/tokenizers-0.22.0.tar.gz", hash = "sha256:2e33b98525be8453f355927f3cab312c36cd3e44f4d7e9e97da2fa94d0a49dcb", size = 362771, upload-time = "2025-08-29T10:25:33.914Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6d/b1/18c13648edabbe66baa85fe266a478a7931ddc0cd1ba618802eb7b8d9865/tokenizers-0.22.0-cp39-abi3-macosx_10_12_x86_64.whl", hash = "sha256:eaa9620122a3fb99b943f864af95ed14c8dfc0f47afa3b404ac8c16b3f2bb484", size = 3081954, upload-time = "2025-08-29T10:25:24.993Z" }, + { url = "https://files.pythonhosted.org/packages/c2/02/c3c454b641bd7c4f79e4464accfae9e7dfc913a777d2e561e168ae060362/tokenizers-0.22.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:71784b9ab5bf0ff3075bceeb198149d2c5e068549c0d18fe32d06ba0deb63f79", size = 2945644, upload-time = "2025-08-29T10:25:23.405Z" }, + { url = "https://files.pythonhosted.org/packages/55/02/d10185ba2fd8c2d111e124c9d92de398aee0264b35ce433f79fb8472f5d0/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:ec5b71f668a8076802b0241a42387d48289f25435b86b769ae1837cad4172a17", size = 3254764, upload-time = "2025-08-29T10:25:12.445Z" }, + { url = "https://files.pythonhosted.org/packages/13/89/17514bd7ef4bf5bfff58e2b131cec0f8d5cea2b1c8ffe1050a2c8de88dbb/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ea8562fa7498850d02a16178105b58803ea825b50dc9094d60549a7ed63654bb", size = 3161654, upload-time = "2025-08-29T10:25:15.493Z" }, + { url = "https://files.pythonhosted.org/packages/5a/d8/bac9f3a7ef6dcceec206e3857c3b61bb16c6b702ed7ae49585f5bd85c0ef/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4136e1558a9ef2e2f1de1555dcd573e1cbc4a320c1a06c4107a3d46dc8ac6e4b", size = 3511484, upload-time = "2025-08-29T10:25:20.477Z" }, + { url = "https://files.pythonhosted.org/packages/aa/27/9c9800eb6763683010a4851db4d1802d8cab9cec114c17056eccb4d4a6e0/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cdf5954de3962a5fd9781dc12048d24a1a6f1f5df038c6e95db328cd22964206", size = 3712829, upload-time = "2025-08-29T10:25:17.154Z" }, + { url = "https://files.pythonhosted.org/packages/10/e3/b1726dbc1f03f757260fa21752e1921445b5bc350389a8314dd3338836db/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8337ca75d0731fc4860e6204cc24bb36a67d9736142aa06ed320943b50b1e7ed", size = 3408934, upload-time = "2025-08-29T10:25:18.76Z" }, + { url = "https://files.pythonhosted.org/packages/d4/61/aeab3402c26874b74bb67a7f2c4b569dde29b51032c5384db592e7b216f4/tokenizers-0.22.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a89264e26f63c449d8cded9061adea7b5de53ba2346fc7e87311f7e4117c1cc8", size = 3345585, upload-time = "2025-08-29T10:25:22.08Z" }, + { url = "https://files.pythonhosted.org/packages/bc/d3/498b4a8a8764cce0900af1add0f176ff24f475d4413d55b760b8cdf00893/tokenizers-0.22.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:790bad50a1b59d4c21592f9c3cf5e5cf9c3c7ce7e1a23a739f13e01fb1be377a", size = 9322986, upload-time = "2025-08-29T10:25:26.607Z" }, + { url = "https://files.pythonhosted.org/packages/a2/62/92378eb1c2c565837ca3cb5f9569860d132ab9d195d7950c1ea2681dffd0/tokenizers-0.22.0-cp39-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:76cf6757c73a10ef10bf06fa937c0ec7393d90432f543f49adc8cab3fb6f26cb", size = 9276630, upload-time = "2025-08-29T10:25:28.349Z" }, + { url = "https://files.pythonhosted.org/packages/eb/f0/342d80457aa1cda7654327460f69db0d69405af1e4c453f4dc6ca7c4a76e/tokenizers-0.22.0-cp39-abi3-musllinux_1_2_i686.whl", hash = "sha256:1626cb186e143720c62c6c6b5371e62bbc10af60481388c0da89bc903f37ea0c", size = 9547175, upload-time = "2025-08-29T10:25:29.989Z" }, + { url = "https://files.pythonhosted.org/packages/14/84/8aa9b4adfc4fbd09381e20a5bc6aa27040c9c09caa89988c01544e008d18/tokenizers-0.22.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:da589a61cbfea18ae267723d6b029b84598dc8ca78db9951d8f5beff72d8507c", size = 9692735, upload-time = "2025-08-29T10:25:32.089Z" }, + { url = "https://files.pythonhosted.org/packages/bf/24/83ee2b1dc76bfe05c3142e7d0ccdfe69f0ad2f1ebf6c726cea7f0874c0d0/tokenizers-0.22.0-cp39-abi3-win32.whl", hash = "sha256:dbf9d6851bddae3e046fedfb166f47743c1c7bd11c640f0691dd35ef0bcad3be", size = 2471915, upload-time = "2025-08-29T10:25:36.411Z" }, + { url = "https://files.pythonhosted.org/packages/d1/9b/0e0bf82214ee20231845b127aa4a8015936ad5a46779f30865d10e404167/tokenizers-0.22.0-cp39-abi3-win_amd64.whl", hash = 
"sha256:c78174859eeaee96021f248a56c801e36bfb6bd5b067f2e95aa82445ca324f00", size = 2680494, upload-time = "2025-08-29T10:25:35.14Z" }, +] + +[[package]] +name = "tomlkit" +version = "0.13.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/cc/18/0bbf3884e9eaa38819ebe46a7bd25dcd56b67434402b66a58c4b8e552575/tomlkit-0.13.3.tar.gz", hash = "sha256:430cf247ee57df2b94ee3fbe588e71d362a941ebb545dec29b53961d61add2a1", size = 185207, upload-time = "2025-06-05T07:13:44.947Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bd/75/8539d011f6be8e29f339c42e633aae3cb73bffa95dd0f9adec09b9c58e85/tomlkit-0.13.3-py3-none-any.whl", hash = "sha256:c89c649d79ee40629a9fda55f8ace8c6a1b42deb912b2a8fd8d942ddadb606b0", size = 38901, upload-time = "2025-06-05T07:13:43.546Z" }, +] + +[[package]] +name = "tqdm" +version = "4.67.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a8/4b/29b4ef32e036bb34e4ab51796dd745cdba7ed47ad142a9f4a1eb8e0c744d/tqdm-4.67.1.tar.gz", hash = "sha256:f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2", size = 169737, upload-time = "2024-11-24T20:12:22.481Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2", size = 78540, upload-time = "2024-11-24T20:12:19.698Z" }, +] + +[[package]] +name = "typer" +version = "0.17.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "rich" }, + { name = "shellingham" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/92/e8/2a73ccf9874ec4c7638f172efc8972ceab13a0e3480b389d6ed822f7a822/typer-0.17.4.tar.gz", hash = "sha256:b77dc07d849312fd2bb5e7f20a7af8985c7ec360c45b051ed5412f64d8dc1580", size = 103734, upload-time = "2025-09-05T18:14:40.746Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/93/72/6b3e70d32e89a5cbb6a4513726c1ae8762165b027af569289e19ec08edd8/typer-0.17.4-py3-none-any.whl", hash = "sha256:015534a6edaa450e7007eba705d5c18c3349dcea50a6ad79a5ed530967575824", size = 46643, upload-time = "2025-09-05T18:14:39.166Z" }, +] + +[[package]] +name = "typing-extensions" +version = "4.15.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" }, +] + +[[package]] +name = "typing-inspection" +version = "0.4.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f8/b1/0c11f5058406b3af7609f121aaa6b609744687f1d158b3c3a5bf4cc94238/typing_inspection-0.4.1.tar.gz", hash = 
"sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28", size = 75726, upload-time = "2025-05-21T18:55:23.885Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/17/69/cd203477f944c353c31bade965f880aa1061fd6bf05ded0726ca845b6ff7/typing_inspection-0.4.1-py3-none-any.whl", hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51", size = 14552, upload-time = "2025-05-21T18:55:22.152Z" }, +] + +[[package]] +name = "tzdata" +version = "2025.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/95/32/1a225d6164441be760d75c2c42e2780dc0873fe382da3e98a2e1e48361e5/tzdata-2025.2.tar.gz", hash = "sha256:b60a638fcc0daffadf82fe0f57e53d06bdec2f36c4df66280ae79bce6bd6f2b9", size = 196380, upload-time = "2025-03-23T13:54:43.652Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5c/23/c7abc0ca0a1526a0774eca151daeb8de62ec457e77262b66b359c3c7679e/tzdata-2025.2-py2.py3-none-any.whl", hash = "sha256:1a403fada01ff9221ca8044d701868fa132215d84beb92242d9acd2147f667a8", size = 347839, upload-time = "2025-03-23T13:54:41.845Z" }, +] + +[[package]] +name = "tzlocal" +version = "5.3.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "tzdata", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/8b/2e/c14812d3d4d9cd1773c6be938f89e5735a1f11a9f184ac3639b93cef35d5/tzlocal-5.3.1.tar.gz", hash = "sha256:cceffc7edecefea1f595541dbd6e990cb1ea3d19bf01b2809f362a03dd7921fd", size = 30761, upload-time = "2025-03-05T21:17:41.549Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c2/14/e2a54fabd4f08cd7af1c07030603c3356b74da07f7cc056e600436edfa17/tzlocal-5.3.1-py3-none-any.whl", hash = "sha256:eb1a66c3ef5847adf7a834f1be0800581b683b5608e74f86ecbcef8ab91bb85d", size = 18026, upload-time = "2025-03-05T21:17:39.857Z" }, +] + +[[package]] +name = "uritemplate" +version = "4.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/98/60/f174043244c5306c9988380d2cb10009f91563fc4b31293d27e17201af56/uritemplate-4.2.0.tar.gz", hash = "sha256:480c2ed180878955863323eea31b0ede668795de182617fef9c6ca09e6ec9d0e", size = 33267, upload-time = "2025-06-02T15:12:06.318Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a9/99/3ae339466c9183ea5b8ae87b34c0b897eda475d2aec2307cae60e5cd4f29/uritemplate-4.2.0-py3-none-any.whl", hash = "sha256:962201ba1c4edcab02e60f9a0d3821e82dfc5d2d6662a21abd533879bdb8a686", size = 11488, upload-time = "2025-06-02T15:12:03.405Z" }, +] + +[[package]] +name = "urllib3" +version = "2.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" }, +] + +[[package]] +name = "uvicorn" +version = "0.35.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "h11" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/5e/42/e0e305207bb88c6b8d3061399c6a961ffe5fbb7e2aa63c9234df7259e9cd/uvicorn-0.35.0.tar.gz", hash = "sha256:bc662f087f7cf2ce11a1d7fd70b90c9f98ef2e2831556dd078d131b96cc94a01", size = 78473, upload-time = "2025-06-28T16:15:46.058Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/e2/dc81b1bd1dcfe91735810265e9d26bc8ec5da45b4c0f6237e286819194c3/uvicorn-0.35.0-py3-none-any.whl", hash = "sha256:197535216b25ff9b785e29a0b79199f55222193d47f820816e7da751e9bc8d4a", size = 66406, upload-time = "2025-06-28T16:15:44.816Z" }, +] + +[[package]] +name = "virtualenv" +version = "20.34.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "distlib" }, + { name = "filelock" }, + { name = "platformdirs" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/1c/14/37fcdba2808a6c615681cd216fecae00413c9dab44fb2e57805ecf3eaee3/virtualenv-20.34.0.tar.gz", hash = "sha256:44815b2c9dee7ed86e387b842a84f20b93f7f417f95886ca1996a72a4138eb1a", size = 6003808, upload-time = "2025-08-13T14:24:07.464Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/76/06/04c8e804f813cf972e3262f3f8584c232de64f0cde9f703b46cf53a45090/virtualenv-20.34.0-py3-none-any.whl", hash = "sha256:341f5afa7eee943e4984a9207c025feedd768baff6753cd660c857ceb3e36026", size = 5983279, upload-time = "2025-08-13T14:24:05.111Z" }, +] + +[[package]] +name = "watchdog" +version = "6.0.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/db/7d/7f3d619e951c88ed75c6037b246ddcf2d322812ee8ea189be89511721d54/watchdog-6.0.0.tar.gz", hash = "sha256:9ddf7c82fda3ae8e24decda1338ede66e1c99883db93711d8fb941eaa2d8c282", size = 131220, upload-time = "2024-11-01T14:07:13.037Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e0/24/d9be5cd6642a6aa68352ded4b4b10fb0d7889cb7f45814fb92cecd35f101/watchdog-6.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6eb11feb5a0d452ee41f824e271ca311a09e250441c262ca2fd7ebcf2461a06c", size = 96393, upload-time = "2024-11-01T14:06:31.756Z" }, + { url = "https://files.pythonhosted.org/packages/63/7a/6013b0d8dbc56adca7fdd4f0beed381c59f6752341b12fa0886fa7afc78b/watchdog-6.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ef810fbf7b781a5a593894e4f439773830bdecb885e6880d957d5b9382a960d2", size = 88392, upload-time = "2024-11-01T14:06:32.99Z" }, + { url = "https://files.pythonhosted.org/packages/d1/40/b75381494851556de56281e053700e46bff5b37bf4c7267e858640af5a7f/watchdog-6.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:afd0fe1b2270917c5e23c2a65ce50c2a4abb63daafb0d419fde368e272a76b7c", size = 89019, upload-time = "2024-11-01T14:06:34.963Z" }, + { url = "https://files.pythonhosted.org/packages/39/ea/3930d07dafc9e286ed356a679aa02d777c06e9bfd1164fa7c19c288a5483/watchdog-6.0.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdd4e6f14b8b18c334febb9c4425a878a2ac20efd1e0b231978e7b150f92a948", size = 96471, upload-time = "2024-11-01T14:06:37.745Z" }, + { url = "https://files.pythonhosted.org/packages/12/87/48361531f70b1f87928b045df868a9fd4e253d9ae087fa4cf3f7113be363/watchdog-6.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c7c15dda13c4eb00d6fb6fc508b3c0ed88b9d5d374056b239c4ad1611125c860", size = 88449, upload-time = "2024-11-01T14:06:39.748Z" }, + { url = "https://files.pythonhosted.org/packages/5b/7e/8f322f5e600812e6f9a31b75d242631068ca8f4ef0582dd3ae6e72daecc8/watchdog-6.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:6f10cb2d5902447c7d0da897e2c6768bca89174d0c6e1e30abec5421af97a5b0", size = 89054, upload-time = "2024-11-01T14:06:41.009Z" }, + { url = "https://files.pythonhosted.org/packages/68/98/b0345cabdce2041a01293ba483333582891a3bd5769b08eceb0d406056ef/watchdog-6.0.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:490ab2ef84f11129844c23fb14ecf30ef3d8a6abafd3754a6f75ca1e6654136c", size = 96480, upload-time = "2024-11-01T14:06:42.952Z" }, + { url = "https://files.pythonhosted.org/packages/85/83/cdf13902c626b28eedef7ec4f10745c52aad8a8fe7eb04ed7b1f111ca20e/watchdog-6.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:76aae96b00ae814b181bb25b1b98076d5fc84e8a53cd8885a318b42b6d3a5134", size = 88451, upload-time = "2024-11-01T14:06:45.084Z" }, + { url = "https://files.pythonhosted.org/packages/fe/c4/225c87bae08c8b9ec99030cd48ae9c4eca050a59bf5c2255853e18c87b50/watchdog-6.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a175f755fc2279e0b7312c0035d52e27211a5bc39719dd529625b1930917345b", size = 89057, upload-time = "2024-11-01T14:06:47.324Z" }, + { url = "https://files.pythonhosted.org/packages/a9/c7/ca4bf3e518cb57a686b2feb4f55a1892fd9a3dd13f470fca14e00f80ea36/watchdog-6.0.0-py3-none-manylinux2014_aarch64.whl", hash = "sha256:7607498efa04a3542ae3e05e64da8202e58159aa1fa4acddf7678d34a35d4f13", size = 79079, upload-time = "2024-11-01T14:06:59.472Z" }, + { url = "https://files.pythonhosted.org/packages/5c/51/d46dc9332f9a647593c947b4b88e2381c8dfc0942d15b8edc0310fa4abb1/watchdog-6.0.0-py3-none-manylinux2014_armv7l.whl", hash = "sha256:9041567ee8953024c83343288ccc458fd0a2d811d6a0fd68c4c22609e3490379", size = 79078, upload-time = "2024-11-01T14:07:01.431Z" }, + { url = "https://files.pythonhosted.org/packages/d4/57/04edbf5e169cd318d5f07b4766fee38e825d64b6913ca157ca32d1a42267/watchdog-6.0.0-py3-none-manylinux2014_i686.whl", hash = "sha256:82dc3e3143c7e38ec49d61af98d6558288c415eac98486a5c581726e0737c00e", size = 79076, upload-time = "2024-11-01T14:07:02.568Z" }, + { url = "https://files.pythonhosted.org/packages/ab/cc/da8422b300e13cb187d2203f20b9253e91058aaf7db65b74142013478e66/watchdog-6.0.0-py3-none-manylinux2014_ppc64.whl", hash = "sha256:212ac9b8bf1161dc91bd09c048048a95ca3a4c4f5e5d4a7d1b1a7d5752a7f96f", size = 79077, upload-time = "2024-11-01T14:07:03.893Z" }, + { url = "https://files.pythonhosted.org/packages/2c/3b/b8964e04ae1a025c44ba8e4291f86e97fac443bca31de8bd98d3263d2fcf/watchdog-6.0.0-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:e3df4cbb9a450c6d49318f6d14f4bbc80d763fa587ba46ec86f99f9e6876bb26", size = 79078, upload-time = "2024-11-01T14:07:05.189Z" }, + { url = "https://files.pythonhosted.org/packages/62/ae/a696eb424bedff7407801c257d4b1afda455fe40821a2be430e173660e81/watchdog-6.0.0-py3-none-manylinux2014_s390x.whl", hash = "sha256:2cce7cfc2008eb51feb6aab51251fd79b85d9894e98ba847408f662b3395ca3c", size = 79077, upload-time = "2024-11-01T14:07:06.376Z" }, + { url = "https://files.pythonhosted.org/packages/b5/e8/dbf020b4d98251a9860752a094d09a65e1b436ad181faf929983f697048f/watchdog-6.0.0-py3-none-manylinux2014_x86_64.whl", hash = "sha256:20ffe5b202af80ab4266dcd3e91aae72bf2da48c0d33bdb15c66658e685e94e2", size = 79078, upload-time = "2024-11-01T14:07:07.547Z" }, + { url = "https://files.pythonhosted.org/packages/07/f6/d0e5b343768e8bcb4cda79f0f2f55051bf26177ecd5651f84c07567461cf/watchdog-6.0.0-py3-none-win32.whl", hash = "sha256:07df1fdd701c5d4c8e55ef6cf55b8f0120fe1aef7ef39a1c6fc6bc2e606d517a", size = 79065, upload-time = "2024-11-01T14:07:09.525Z" }, + { url = 
"https://files.pythonhosted.org/packages/db/d9/c495884c6e548fce18a8f40568ff120bc3a4b7b99813081c8ac0c936fa64/watchdog-6.0.0-py3-none-win_amd64.whl", hash = "sha256:cbafb470cf848d93b5d013e2ecb245d4aa1c8fd0504e863ccefa32445359d680", size = 79070, upload-time = "2024-11-01T14:07:10.686Z" }, + { url = "https://files.pythonhosted.org/packages/33/e8/e40370e6d74ddba47f002a32919d91310d6074130fe4e17dabcafc15cbf1/watchdog-6.0.0-py3-none-win_ia64.whl", hash = "sha256:a1914259fa9e1454315171103c6a30961236f508b9b623eae470268bbcc6a22f", size = 79067, upload-time = "2024-11-01T14:07:11.845Z" }, +] + +[[package]] +name = "websockets" +version = "15.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016, upload-time = "2025-03-05T20:03:41.606Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9f/32/18fcd5919c293a398db67443acd33fde142f283853076049824fc58e6f75/websockets-15.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:823c248b690b2fd9303ba00c4f66cd5e2d8c3ba4aa968b2779be9532a4dad431", size = 175423, upload-time = "2025-03-05T20:01:56.276Z" }, + { url = "https://files.pythonhosted.org/packages/76/70/ba1ad96b07869275ef42e2ce21f07a5b0148936688c2baf7e4a1f60d5058/websockets-15.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678999709e68425ae2593acf2e3ebcbcf2e69885a5ee78f9eb80e6e371f1bf57", size = 173082, upload-time = "2025-03-05T20:01:57.563Z" }, + { url = "https://files.pythonhosted.org/packages/86/f2/10b55821dd40eb696ce4704a87d57774696f9451108cff0d2824c97e0f97/websockets-15.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d50fd1ee42388dcfb2b3676132c78116490976f1300da28eb629272d5d93e905", size = 173330, upload-time = "2025-03-05T20:01:59.063Z" }, + { url = "https://files.pythonhosted.org/packages/a5/90/1c37ae8b8a113d3daf1065222b6af61cc44102da95388ac0018fcb7d93d9/websockets-15.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d99e5546bf73dbad5bf3547174cd6cb8ba7273062a23808ffea025ecb1cf8562", size = 182878, upload-time = "2025-03-05T20:02:00.305Z" }, + { url = "https://files.pythonhosted.org/packages/8e/8d/96e8e288b2a41dffafb78e8904ea7367ee4f891dafc2ab8d87e2124cb3d3/websockets-15.0.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:66dd88c918e3287efc22409d426c8f729688d89a0c587c88971a0faa2c2f3792", size = 181883, upload-time = "2025-03-05T20:02:03.148Z" }, + { url = "https://files.pythonhosted.org/packages/93/1f/5d6dbf551766308f6f50f8baf8e9860be6182911e8106da7a7f73785f4c4/websockets-15.0.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dd8327c795b3e3f219760fa603dcae1dcc148172290a8ab15158cf85a953413", size = 182252, upload-time = "2025-03-05T20:02:05.29Z" }, + { url = "https://files.pythonhosted.org/packages/d4/78/2d4fed9123e6620cbf1706c0de8a1632e1a28e7774d94346d7de1bba2ca3/websockets-15.0.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fdc51055e6ff4adeb88d58a11042ec9a5eae317a0a53d12c062c8a8865909e8", size = 182521, upload-time = "2025-03-05T20:02:07.458Z" }, + { url = "https://files.pythonhosted.org/packages/e7/3b/66d4c1b444dd1a9823c4a81f50231b921bab54eee2f69e70319b4e21f1ca/websockets-15.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = 
"sha256:693f0192126df6c2327cce3baa7c06f2a117575e32ab2308f7f8216c29d9e2e3", size = 181958, upload-time = "2025-03-05T20:02:09.842Z" }, + { url = "https://files.pythonhosted.org/packages/08/ff/e9eed2ee5fed6f76fdd6032ca5cd38c57ca9661430bb3d5fb2872dc8703c/websockets-15.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54479983bd5fb469c38f2f5c7e3a24f9a4e70594cd68cd1fa6b9340dadaff7cf", size = 181918, upload-time = "2025-03-05T20:02:11.968Z" }, + { url = "https://files.pythonhosted.org/packages/d8/75/994634a49b7e12532be6a42103597b71098fd25900f7437d6055ed39930a/websockets-15.0.1-cp311-cp311-win32.whl", hash = "sha256:16b6c1b3e57799b9d38427dda63edcbe4926352c47cf88588c0be4ace18dac85", size = 176388, upload-time = "2025-03-05T20:02:13.32Z" }, + { url = "https://files.pythonhosted.org/packages/98/93/e36c73f78400a65f5e236cd376713c34182e6663f6889cd45a4a04d8f203/websockets-15.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:27ccee0071a0e75d22cb35849b1db43f2ecd3e161041ac1ee9d2352ddf72f065", size = 176828, upload-time = "2025-03-05T20:02:14.585Z" }, + { url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437, upload-time = "2025-03-05T20:02:16.706Z" }, + { url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096, upload-time = "2025-03-05T20:02:18.832Z" }, + { url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332, upload-time = "2025-03-05T20:02:20.187Z" }, + { url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152, upload-time = "2025-03-05T20:02:22.286Z" }, + { url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096, upload-time = "2025-03-05T20:02:24.368Z" }, + { url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523, upload-time = "2025-03-05T20:02:25.669Z" }, + { url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790, upload-time = "2025-03-05T20:02:26.99Z" }, + { url = 
"https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165, upload-time = "2025-03-05T20:02:30.291Z" }, + { url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160, upload-time = "2025-03-05T20:02:31.634Z" }, + { url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395, upload-time = "2025-03-05T20:02:33.017Z" }, + { url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841, upload-time = "2025-03-05T20:02:34.498Z" }, + { url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440, upload-time = "2025-03-05T20:02:36.695Z" }, + { url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098, upload-time = "2025-03-05T20:02:37.985Z" }, + { url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329, upload-time = "2025-03-05T20:02:39.298Z" }, + { url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111, upload-time = "2025-03-05T20:02:40.595Z" }, + { url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054, upload-time = "2025-03-05T20:02:41.926Z" }, + { url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496, upload-time = "2025-03-05T20:02:43.304Z" }, + { url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829, upload-time = 
"2025-03-05T20:02:48.812Z" }, + { url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217, upload-time = "2025-03-05T20:02:50.14Z" }, + { url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195, upload-time = "2025-03-05T20:02:51.561Z" }, + { url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393, upload-time = "2025-03-05T20:02:53.814Z" }, + { url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837, upload-time = "2025-03-05T20:02:55.237Z" }, + { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743, upload-time = "2025-03-05T20:03:39.41Z" }, +] + +[[package]] +name = "werkzeug" +version = "3.1.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/32/af/d4502dc713b4ccea7175d764718d5183caf8d0867a4f0190d5d4a45cea49/werkzeug-3.1.1.tar.gz", hash = "sha256:8cd39dfbdfc1e051965f156163e2974e52c210f130810e9ad36858f0fd3edad4", size = 806453, upload-time = "2024-11-01T16:40:45.462Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ee/ea/c67e1dee1ba208ed22c06d1d547ae5e293374bfc43e0eb0ef5e262b68561/werkzeug-3.1.1-py3-none-any.whl", hash = "sha256:a71124d1ef06008baafa3d266c02f56e1836a5984afd6dd6c9230669d60d9fb5", size = 224371, upload-time = "2024-11-01T16:40:43.994Z" }, +] + +[[package]] +name = "win-precise-time" +version = "1.4.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/9e/b0/21547e16a47206ccdd15769bf65e143ade1ffae67f0881c855f76e44e9fa/win-precise-time-1.4.2.tar.gz", hash = "sha256:89274785cbc5f2997e01675206da3203835a442c60fd97798415c6b3c179c0b9", size = 7982, upload-time = "2023-10-08T17:08:18.618Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bb/d6/a48717649fea2d7a6679db86dae9ae4b12078c7a48aa89a8f14a360f29d0/win_precise_time-1.4.2-cp311-cp311-win32.whl", hash = "sha256:59272655ad6f36910d0b585969402386fa627fca3be24acc9a21be1d550e5db8", size = 14703, upload-time = "2023-10-08T17:08:06.945Z" }, + { url = "https://files.pythonhosted.org/packages/f9/9c/46d69220d468c82ca2044284c5a8089705c5eb66be416abcbba156365a14/win_precise_time-1.4.2-cp311-cp311-win_amd64.whl", hash = "sha256:0897bb055f19f3b4336e2ba6bee0115ac20fd7ec615a6d736632e2df77f8851a", size = 14912, upload-time = "2023-10-08T17:08:07.896Z" }, + { url = "https://files.pythonhosted.org/packages/2e/96/55a14b5c0e90439951f4a72672223bba81a5f882033c5850f8a6c7f4308b/win_precise_time-1.4.2-cp312-cp312-win32.whl", hash = 
"sha256:0210dcea88a520c91de1708ae4c881e3c0ddc956daa08b9eabf2b7c35f3109f5", size = 14694, upload-time = "2023-10-08T17:08:09.275Z" }, + { url = "https://files.pythonhosted.org/packages/17/19/7ea9a22a69fc23d5ca02e8edf65e4a335a210497794af1af0ef8fda91fa0/win_precise_time-1.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:85670f77cc8accd8f1e6d05073999f77561c23012a9ee988cbd44bb7ce655062", size = 14913, upload-time = "2023-10-08T17:08:10.677Z" }, +] + +[[package]] +name = "wrapt" +version = "1.17.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/95/8f/aeb76c5b46e273670962298c23e7ddde79916cb74db802131d49a85e4b7d/wrapt-1.17.3.tar.gz", hash = "sha256:f66eb08feaa410fe4eebd17f2a2c8e2e46d3476e9f8c783daa8e09e0faa666d0", size = 55547, upload-time = "2025-08-12T05:53:21.714Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/52/db/00e2a219213856074a213503fdac0511203dceefff26e1daa15250cc01a0/wrapt-1.17.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:273a736c4645e63ac582c60a56b0acb529ef07f78e08dc6bfadf6a46b19c0da7", size = 53482, upload-time = "2025-08-12T05:51:45.79Z" }, + { url = "https://files.pythonhosted.org/packages/5e/30/ca3c4a5eba478408572096fe9ce36e6e915994dd26a4e9e98b4f729c06d9/wrapt-1.17.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5531d911795e3f935a9c23eb1c8c03c211661a5060aab167065896bbf62a5f85", size = 38674, upload-time = "2025-08-12T05:51:34.629Z" }, + { url = "https://files.pythonhosted.org/packages/31/25/3e8cc2c46b5329c5957cec959cb76a10718e1a513309c31399a4dad07eb3/wrapt-1.17.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0610b46293c59a3adbae3dee552b648b984176f8562ee0dba099a56cfbe4df1f", size = 38959, upload-time = "2025-08-12T05:51:56.074Z" }, + { url = "https://files.pythonhosted.org/packages/5d/8f/a32a99fc03e4b37e31b57cb9cefc65050ea08147a8ce12f288616b05ef54/wrapt-1.17.3-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b32888aad8b6e68f83a8fdccbf3165f5469702a7544472bdf41f582970ed3311", size = 82376, upload-time = "2025-08-12T05:52:32.134Z" }, + { url = "https://files.pythonhosted.org/packages/31/57/4930cb8d9d70d59c27ee1332a318c20291749b4fba31f113c2f8ac49a72e/wrapt-1.17.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8cccf4f81371f257440c88faed6b74f1053eef90807b77e31ca057b2db74edb1", size = 83604, upload-time = "2025-08-12T05:52:11.663Z" }, + { url = "https://files.pythonhosted.org/packages/a8/f3/1afd48de81d63dd66e01b263a6fbb86e1b5053b419b9b33d13e1f6d0f7d0/wrapt-1.17.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8a210b158a34164de8bb68b0e7780041a903d7b00c87e906fb69928bf7890d5", size = 82782, upload-time = "2025-08-12T05:52:12.626Z" }, + { url = "https://files.pythonhosted.org/packages/1e/d7/4ad5327612173b144998232f98a85bb24b60c352afb73bc48e3e0d2bdc4e/wrapt-1.17.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:79573c24a46ce11aab457b472efd8d125e5a51da2d1d24387666cd85f54c05b2", size = 82076, upload-time = "2025-08-12T05:52:33.168Z" }, + { url = "https://files.pythonhosted.org/packages/bb/59/e0adfc831674a65694f18ea6dc821f9fcb9ec82c2ce7e3d73a88ba2e8718/wrapt-1.17.3-cp311-cp311-win32.whl", hash = "sha256:c31eebe420a9a5d2887b13000b043ff6ca27c452a9a22fa71f35f118e8d4bf89", size = 36457, upload-time = "2025-08-12T05:53:03.936Z" }, + { url = "https://files.pythonhosted.org/packages/83/88/16b7231ba49861b6f75fc309b11012ede4d6b0a9c90969d9e0db8d991aeb/wrapt-1.17.3-cp311-cp311-win_amd64.whl", 
hash = "sha256:0b1831115c97f0663cb77aa27d381237e73ad4f721391a9bfb2fe8bc25fa6e77", size = 38745, upload-time = "2025-08-12T05:53:02.885Z" }, + { url = "https://files.pythonhosted.org/packages/9a/1e/c4d4f3398ec073012c51d1c8d87f715f56765444e1a4b11e5180577b7e6e/wrapt-1.17.3-cp311-cp311-win_arm64.whl", hash = "sha256:5a7b3c1ee8265eb4c8f1b7d29943f195c00673f5ab60c192eba2d4a7eae5f46a", size = 36806, upload-time = "2025-08-12T05:52:53.368Z" }, + { url = "https://files.pythonhosted.org/packages/9f/41/cad1aba93e752f1f9268c77270da3c469883d56e2798e7df6240dcb2287b/wrapt-1.17.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:ab232e7fdb44cdfbf55fc3afa31bcdb0d8980b9b95c38b6405df2acb672af0e0", size = 53998, upload-time = "2025-08-12T05:51:47.138Z" }, + { url = "https://files.pythonhosted.org/packages/60/f8/096a7cc13097a1869fe44efe68dace40d2a16ecb853141394047f0780b96/wrapt-1.17.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:9baa544e6acc91130e926e8c802a17f3b16fbea0fd441b5a60f5cf2cc5c3deba", size = 39020, upload-time = "2025-08-12T05:51:35.906Z" }, + { url = "https://files.pythonhosted.org/packages/33/df/bdf864b8997aab4febb96a9ae5c124f700a5abd9b5e13d2a3214ec4be705/wrapt-1.17.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6b538e31eca1a7ea4605e44f81a48aa24c4632a277431a6ed3f328835901f4fd", size = 39098, upload-time = "2025-08-12T05:51:57.474Z" }, + { url = "https://files.pythonhosted.org/packages/9f/81/5d931d78d0eb732b95dc3ddaeeb71c8bb572fb01356e9133916cd729ecdd/wrapt-1.17.3-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:042ec3bb8f319c147b1301f2393bc19dba6e176b7da446853406d041c36c7828", size = 88036, upload-time = "2025-08-12T05:52:34.784Z" }, + { url = "https://files.pythonhosted.org/packages/ca/38/2e1785df03b3d72d34fc6252d91d9d12dc27a5c89caef3335a1bbb8908ca/wrapt-1.17.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3af60380ba0b7b5aeb329bc4e402acd25bd877e98b3727b0135cb5c2efdaefe9", size = 88156, upload-time = "2025-08-12T05:52:13.599Z" }, + { url = "https://files.pythonhosted.org/packages/b3/8b/48cdb60fe0603e34e05cffda0b2a4adab81fd43718e11111a4b0100fd7c1/wrapt-1.17.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:0b02e424deef65c9f7326d8c19220a2c9040c51dc165cddb732f16198c168396", size = 87102, upload-time = "2025-08-12T05:52:14.56Z" }, + { url = "https://files.pythonhosted.org/packages/3c/51/d81abca783b58f40a154f1b2c56db1d2d9e0d04fa2d4224e357529f57a57/wrapt-1.17.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:74afa28374a3c3a11b3b5e5fca0ae03bef8450d6aa3ab3a1e2c30e3a75d023dc", size = 87732, upload-time = "2025-08-12T05:52:36.165Z" }, + { url = "https://files.pythonhosted.org/packages/9e/b1/43b286ca1392a006d5336412d41663eeef1ad57485f3e52c767376ba7e5a/wrapt-1.17.3-cp312-cp312-win32.whl", hash = "sha256:4da9f45279fff3543c371d5ababc57a0384f70be244de7759c85a7f989cb4ebe", size = 36705, upload-time = "2025-08-12T05:53:07.123Z" }, + { url = "https://files.pythonhosted.org/packages/28/de/49493f962bd3c586ab4b88066e967aa2e0703d6ef2c43aa28cb83bf7b507/wrapt-1.17.3-cp312-cp312-win_amd64.whl", hash = "sha256:e71d5c6ebac14875668a1e90baf2ea0ef5b7ac7918355850c0908ae82bcb297c", size = 38877, upload-time = "2025-08-12T05:53:05.436Z" }, + { url = "https://files.pythonhosted.org/packages/f1/48/0f7102fe9cb1e8a5a77f80d4f0956d62d97034bbe88d33e94699f99d181d/wrapt-1.17.3-cp312-cp312-win_arm64.whl", hash = "sha256:604d076c55e2fdd4c1c03d06dc1a31b95130010517b5019db15365ec4a405fc6", size = 36885, 
upload-time = "2025-08-12T05:52:54.367Z" }, + { url = "https://files.pythonhosted.org/packages/fc/f6/759ece88472157acb55fc195e5b116e06730f1b651b5b314c66291729193/wrapt-1.17.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a47681378a0439215912ef542c45a783484d4dd82bac412b71e59cf9c0e1cea0", size = 54003, upload-time = "2025-08-12T05:51:48.627Z" }, + { url = "https://files.pythonhosted.org/packages/4f/a9/49940b9dc6d47027dc850c116d79b4155f15c08547d04db0f07121499347/wrapt-1.17.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:54a30837587c6ee3cd1a4d1c2ec5d24e77984d44e2f34547e2323ddb4e22eb77", size = 39025, upload-time = "2025-08-12T05:51:37.156Z" }, + { url = "https://files.pythonhosted.org/packages/45/35/6a08de0f2c96dcdd7fe464d7420ddb9a7655a6561150e5fc4da9356aeaab/wrapt-1.17.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:16ecf15d6af39246fe33e507105d67e4b81d8f8d2c6598ff7e3ca1b8a37213f7", size = 39108, upload-time = "2025-08-12T05:51:58.425Z" }, + { url = "https://files.pythonhosted.org/packages/0c/37/6faf15cfa41bf1f3dba80cd3f5ccc6622dfccb660ab26ed79f0178c7497f/wrapt-1.17.3-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6fd1ad24dc235e4ab88cda009e19bf347aabb975e44fd5c2fb22a3f6e4141277", size = 88072, upload-time = "2025-08-12T05:52:37.53Z" }, + { url = "https://files.pythonhosted.org/packages/78/f2/efe19ada4a38e4e15b6dff39c3e3f3f73f5decf901f66e6f72fe79623a06/wrapt-1.17.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0ed61b7c2d49cee3c027372df5809a59d60cf1b6c2f81ee980a091f3afed6a2d", size = 88214, upload-time = "2025-08-12T05:52:15.886Z" }, + { url = "https://files.pythonhosted.org/packages/40/90/ca86701e9de1622b16e09689fc24b76f69b06bb0150990f6f4e8b0eeb576/wrapt-1.17.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:423ed5420ad5f5529db9ce89eac09c8a2f97da18eb1c870237e84c5a5c2d60aa", size = 87105, upload-time = "2025-08-12T05:52:17.914Z" }, + { url = "https://files.pythonhosted.org/packages/fd/e0/d10bd257c9a3e15cbf5523025252cc14d77468e8ed644aafb2d6f54cb95d/wrapt-1.17.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e01375f275f010fcbf7f643b4279896d04e571889b8a5b3f848423d91bf07050", size = 87766, upload-time = "2025-08-12T05:52:39.243Z" }, + { url = "https://files.pythonhosted.org/packages/e8/cf/7d848740203c7b4b27eb55dbfede11aca974a51c3d894f6cc4b865f42f58/wrapt-1.17.3-cp313-cp313-win32.whl", hash = "sha256:53e5e39ff71b3fc484df8a522c933ea2b7cdd0d5d15ae82e5b23fde87d44cbd8", size = 36711, upload-time = "2025-08-12T05:53:10.074Z" }, + { url = "https://files.pythonhosted.org/packages/57/54/35a84d0a4d23ea675994104e667ceff49227ce473ba6a59ba2c84f250b74/wrapt-1.17.3-cp313-cp313-win_amd64.whl", hash = "sha256:1f0b2f40cf341ee8cc1a97d51ff50dddb9fcc73241b9143ec74b30fc4f44f6cb", size = 38885, upload-time = "2025-08-12T05:53:08.695Z" }, + { url = "https://files.pythonhosted.org/packages/01/77/66e54407c59d7b02a3c4e0af3783168fff8e5d61def52cda8728439d86bc/wrapt-1.17.3-cp313-cp313-win_arm64.whl", hash = "sha256:7425ac3c54430f5fc5e7b6f41d41e704db073309acfc09305816bc6a0b26bb16", size = 36896, upload-time = "2025-08-12T05:52:55.34Z" }, + { url = "https://files.pythonhosted.org/packages/02/a2/cd864b2a14f20d14f4c496fab97802001560f9f41554eef6df201cd7f76c/wrapt-1.17.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:cf30f6e3c077c8e6a9a7809c94551203c8843e74ba0c960f4a98cd80d4665d39", size = 54132, upload-time = "2025-08-12T05:51:49.864Z" }, + { url = 
"https://files.pythonhosted.org/packages/d5/46/d011725b0c89e853dc44cceb738a307cde5d240d023d6d40a82d1b4e1182/wrapt-1.17.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:e228514a06843cae89621384cfe3a80418f3c04aadf8a3b14e46a7be704e4235", size = 39091, upload-time = "2025-08-12T05:51:38.935Z" }, + { url = "https://files.pythonhosted.org/packages/2e/9e/3ad852d77c35aae7ddebdbc3b6d35ec8013af7d7dddad0ad911f3d891dae/wrapt-1.17.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:5ea5eb3c0c071862997d6f3e02af1d055f381b1d25b286b9d6644b79db77657c", size = 39172, upload-time = "2025-08-12T05:51:59.365Z" }, + { url = "https://files.pythonhosted.org/packages/c3/f7/c983d2762bcce2326c317c26a6a1e7016f7eb039c27cdf5c4e30f4160f31/wrapt-1.17.3-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:281262213373b6d5e4bb4353bc36d1ba4084e6d6b5d242863721ef2bf2c2930b", size = 87163, upload-time = "2025-08-12T05:52:40.965Z" }, + { url = "https://files.pythonhosted.org/packages/e4/0f/f673f75d489c7f22d17fe0193e84b41540d962f75fce579cf6873167c29b/wrapt-1.17.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:dc4a8d2b25efb6681ecacad42fca8859f88092d8732b170de6a5dddd80a1c8fa", size = 87963, upload-time = "2025-08-12T05:52:20.326Z" }, + { url = "https://files.pythonhosted.org/packages/df/61/515ad6caca68995da2fac7a6af97faab8f78ebe3bf4f761e1b77efbc47b5/wrapt-1.17.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:373342dd05b1d07d752cecbec0c41817231f29f3a89aa8b8843f7b95992ed0c7", size = 86945, upload-time = "2025-08-12T05:52:21.581Z" }, + { url = "https://files.pythonhosted.org/packages/d3/bd/4e70162ce398462a467bc09e768bee112f1412e563620adc353de9055d33/wrapt-1.17.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d40770d7c0fd5cbed9d84b2c3f2e156431a12c9a37dc6284060fb4bec0b7ffd4", size = 86857, upload-time = "2025-08-12T05:52:43.043Z" }, + { url = "https://files.pythonhosted.org/packages/2b/b8/da8560695e9284810b8d3df8a19396a6e40e7518059584a1a394a2b35e0a/wrapt-1.17.3-cp314-cp314-win32.whl", hash = "sha256:fbd3c8319de8e1dc79d346929cd71d523622da527cca14e0c1d257e31c2b8b10", size = 37178, upload-time = "2025-08-12T05:53:12.605Z" }, + { url = "https://files.pythonhosted.org/packages/db/c8/b71eeb192c440d67a5a0449aaee2310a1a1e8eca41676046f99ed2487e9f/wrapt-1.17.3-cp314-cp314-win_amd64.whl", hash = "sha256:e1a4120ae5705f673727d3253de3ed0e016f7cd78dc463db1b31e2463e1f3cf6", size = 39310, upload-time = "2025-08-12T05:53:11.106Z" }, + { url = "https://files.pythonhosted.org/packages/45/20/2cda20fd4865fa40f86f6c46ed37a2a8356a7a2fde0773269311f2af56c7/wrapt-1.17.3-cp314-cp314-win_arm64.whl", hash = "sha256:507553480670cab08a800b9463bdb881b2edeed77dc677b0a5915e6106e91a58", size = 37266, upload-time = "2025-08-12T05:52:56.531Z" }, + { url = "https://files.pythonhosted.org/packages/77/ed/dd5cf21aec36c80443c6f900449260b80e2a65cf963668eaef3b9accce36/wrapt-1.17.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:ed7c635ae45cfbc1a7371f708727bf74690daedc49b4dba310590ca0bd28aa8a", size = 56544, upload-time = "2025-08-12T05:51:51.109Z" }, + { url = "https://files.pythonhosted.org/packages/8d/96/450c651cc753877ad100c7949ab4d2e2ecc4d97157e00fa8f45df682456a/wrapt-1.17.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:249f88ed15503f6492a71f01442abddd73856a0032ae860de6d75ca62eed8067", size = 40283, upload-time = "2025-08-12T05:51:39.912Z" }, + { url = 
"https://files.pythonhosted.org/packages/d1/86/2fcad95994d9b572db57632acb6f900695a648c3e063f2cd344b3f5c5a37/wrapt-1.17.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:5a03a38adec8066d5a37bea22f2ba6bbf39fcdefbe2d91419ab864c3fb515454", size = 40366, upload-time = "2025-08-12T05:52:00.693Z" }, + { url = "https://files.pythonhosted.org/packages/64/0e/f4472f2fdde2d4617975144311f8800ef73677a159be7fe61fa50997d6c0/wrapt-1.17.3-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:5d4478d72eb61c36e5b446e375bbc49ed002430d17cdec3cecb36993398e1a9e", size = 108571, upload-time = "2025-08-12T05:52:44.521Z" }, + { url = "https://files.pythonhosted.org/packages/cc/01/9b85a99996b0a97c8a17484684f206cbb6ba73c1ce6890ac668bcf3838fb/wrapt-1.17.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:223db574bb38637e8230eb14b185565023ab624474df94d2af18f1cdb625216f", size = 113094, upload-time = "2025-08-12T05:52:22.618Z" }, + { url = "https://files.pythonhosted.org/packages/25/02/78926c1efddcc7b3aa0bc3d6b33a822f7d898059f7cd9ace8c8318e559ef/wrapt-1.17.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e405adefb53a435f01efa7ccdec012c016b5a1d3f35459990afc39b6be4d5056", size = 110659, upload-time = "2025-08-12T05:52:24.057Z" }, + { url = "https://files.pythonhosted.org/packages/dc/ee/c414501ad518ac3e6fe184753632fe5e5ecacdcf0effc23f31c1e4f7bfcf/wrapt-1.17.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:88547535b787a6c9ce4086917b6e1d291aa8ed914fdd3a838b3539dc95c12804", size = 106946, upload-time = "2025-08-12T05:52:45.976Z" }, + { url = "https://files.pythonhosted.org/packages/be/44/a1bd64b723d13bb151d6cc91b986146a1952385e0392a78567e12149c7b4/wrapt-1.17.3-cp314-cp314t-win32.whl", hash = "sha256:41b1d2bc74c2cac6f9074df52b2efbef2b30bdfe5f40cb78f8ca22963bc62977", size = 38717, upload-time = "2025-08-12T05:53:15.214Z" }, + { url = "https://files.pythonhosted.org/packages/79/d9/7cfd5a312760ac4dd8bf0184a6ee9e43c33e47f3dadc303032ce012b8fa3/wrapt-1.17.3-cp314-cp314t-win_amd64.whl", hash = "sha256:73d496de46cd2cdbdbcce4ae4bcdb4afb6a11234a1df9c085249d55166b95116", size = 41334, upload-time = "2025-08-12T05:53:14.178Z" }, + { url = "https://files.pythonhosted.org/packages/46/78/10ad9781128ed2f99dbc474f43283b13fea8ba58723e98844367531c18e9/wrapt-1.17.3-cp314-cp314t-win_arm64.whl", hash = "sha256:f38e60678850c42461d4202739f9bf1e3a737c7ad283638251e79cc49effb6b6", size = 38471, upload-time = "2025-08-12T05:52:57.784Z" }, + { url = "https://files.pythonhosted.org/packages/1f/f6/a933bd70f98e9cf3e08167fc5cd7aaaca49147e48411c0bd5ae701bb2194/wrapt-1.17.3-py3-none-any.whl", hash = "sha256:7171ae35d2c33d326ac19dd8facb1e82e5fd04ef8c6c0e394d7af55a55051c22", size = 23591, upload-time = "2025-08-12T05:53:20.674Z" }, +] + +[[package]] +name = "yarl" +version = "1.20.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "idna" }, + { name = "multidict" }, + { name = "propcache" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/3c/fb/efaa23fa4e45537b827620f04cf8f3cd658b76642205162e072703a5b963/yarl-1.20.1.tar.gz", hash = "sha256:d017a4997ee50c91fd5466cef416231bb82177b93b029906cefc542ce14c35ac", size = 186428, upload-time = "2025-06-10T00:46:09.923Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b1/18/893b50efc2350e47a874c5c2d67e55a0ea5df91186b2a6f5ac52eff887cd/yarl-1.20.1-cp311-cp311-macosx_10_9_universal2.whl", hash = 
"sha256:47ee6188fea634bdfaeb2cc420f5b3b17332e6225ce88149a17c413c77ff269e", size = 133833, upload-time = "2025-06-10T00:43:07.393Z" }, + { url = "https://files.pythonhosted.org/packages/89/ed/b8773448030e6fc47fa797f099ab9eab151a43a25717f9ac043844ad5ea3/yarl-1.20.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d0f6500f69e8402d513e5eedb77a4e1818691e8f45e6b687147963514d84b44b", size = 91070, upload-time = "2025-06-10T00:43:09.538Z" }, + { url = "https://files.pythonhosted.org/packages/e3/e3/409bd17b1e42619bf69f60e4f031ce1ccb29bd7380117a55529e76933464/yarl-1.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7a8900a42fcdaad568de58887c7b2f602962356908eedb7628eaf6021a6e435b", size = 89818, upload-time = "2025-06-10T00:43:11.575Z" }, + { url = "https://files.pythonhosted.org/packages/f8/77/64d8431a4d77c856eb2d82aa3de2ad6741365245a29b3a9543cd598ed8c5/yarl-1.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bad6d131fda8ef508b36be3ece16d0902e80b88ea7200f030a0f6c11d9e508d4", size = 347003, upload-time = "2025-06-10T00:43:14.088Z" }, + { url = "https://files.pythonhosted.org/packages/8d/d2/0c7e4def093dcef0bd9fa22d4d24b023788b0a33b8d0088b51aa51e21e99/yarl-1.20.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:df018d92fe22aaebb679a7f89fe0c0f368ec497e3dda6cb81a567610f04501f1", size = 336537, upload-time = "2025-06-10T00:43:16.431Z" }, + { url = "https://files.pythonhosted.org/packages/f0/f3/fc514f4b2cf02cb59d10cbfe228691d25929ce8f72a38db07d3febc3f706/yarl-1.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8f969afbb0a9b63c18d0feecf0db09d164b7a44a053e78a7d05f5df163e43833", size = 362358, upload-time = "2025-06-10T00:43:18.704Z" }, + { url = "https://files.pythonhosted.org/packages/ea/6d/a313ac8d8391381ff9006ac05f1d4331cee3b1efaa833a53d12253733255/yarl-1.20.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:812303eb4aa98e302886ccda58d6b099e3576b1b9276161469c25803a8db277d", size = 357362, upload-time = "2025-06-10T00:43:20.888Z" }, + { url = "https://files.pythonhosted.org/packages/00/70/8f78a95d6935a70263d46caa3dd18e1f223cf2f2ff2037baa01a22bc5b22/yarl-1.20.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98c4a7d166635147924aa0bf9bfe8d8abad6fffa6102de9c99ea04a1376f91e8", size = 348979, upload-time = "2025-06-10T00:43:23.169Z" }, + { url = "https://files.pythonhosted.org/packages/cb/05/42773027968968f4f15143553970ee36ead27038d627f457cc44bbbeecf3/yarl-1.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:12e768f966538e81e6e7550f9086a6236b16e26cd964cf4df35349970f3551cf", size = 337274, upload-time = "2025-06-10T00:43:27.111Z" }, + { url = "https://files.pythonhosted.org/packages/05/be/665634aa196954156741ea591d2f946f1b78ceee8bb8f28488bf28c0dd62/yarl-1.20.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:fe41919b9d899661c5c28a8b4b0acf704510b88f27f0934ac7a7bebdd8938d5e", size = 363294, upload-time = "2025-06-10T00:43:28.96Z" }, + { url = "https://files.pythonhosted.org/packages/eb/90/73448401d36fa4e210ece5579895731f190d5119c4b66b43b52182e88cd5/yarl-1.20.1-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:8601bc010d1d7780592f3fc1bdc6c72e2b6466ea34569778422943e1a1f3c389", size = 358169, upload-time = "2025-06-10T00:43:30.701Z" }, + { url = 
"https://files.pythonhosted.org/packages/c3/b0/fce922d46dc1eb43c811f1889f7daa6001b27a4005587e94878570300881/yarl-1.20.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:daadbdc1f2a9033a2399c42646fbd46da7992e868a5fe9513860122d7fe7a73f", size = 362776, upload-time = "2025-06-10T00:43:32.51Z" }, + { url = "https://files.pythonhosted.org/packages/f1/0d/b172628fce039dae8977fd22caeff3eeebffd52e86060413f5673767c427/yarl-1.20.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:03aa1e041727cb438ca762628109ef1333498b122e4c76dd858d186a37cec845", size = 381341, upload-time = "2025-06-10T00:43:34.543Z" }, + { url = "https://files.pythonhosted.org/packages/6b/9b/5b886d7671f4580209e855974fe1cecec409aa4a89ea58b8f0560dc529b1/yarl-1.20.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:642980ef5e0fa1de5fa96d905c7e00cb2c47cb468bfcac5a18c58e27dbf8d8d1", size = 379988, upload-time = "2025-06-10T00:43:36.489Z" }, + { url = "https://files.pythonhosted.org/packages/73/be/75ef5fd0fcd8f083a5d13f78fd3f009528132a1f2a1d7c925c39fa20aa79/yarl-1.20.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:86971e2795584fe8c002356d3b97ef6c61862720eeff03db2a7c86b678d85b3e", size = 371113, upload-time = "2025-06-10T00:43:38.592Z" }, + { url = "https://files.pythonhosted.org/packages/50/4f/62faab3b479dfdcb741fe9e3f0323e2a7d5cd1ab2edc73221d57ad4834b2/yarl-1.20.1-cp311-cp311-win32.whl", hash = "sha256:597f40615b8d25812f14562699e287f0dcc035d25eb74da72cae043bb884d773", size = 81485, upload-time = "2025-06-10T00:43:41.038Z" }, + { url = "https://files.pythonhosted.org/packages/f0/09/d9c7942f8f05c32ec72cd5c8e041c8b29b5807328b68b4801ff2511d4d5e/yarl-1.20.1-cp311-cp311-win_amd64.whl", hash = "sha256:26ef53a9e726e61e9cd1cda6b478f17e350fb5800b4bd1cd9fe81c4d91cfeb2e", size = 86686, upload-time = "2025-06-10T00:43:42.692Z" }, + { url = "https://files.pythonhosted.org/packages/5f/9a/cb7fad7d73c69f296eda6815e4a2c7ed53fc70c2f136479a91c8e5fbdb6d/yarl-1.20.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdcc4cd244e58593a4379fe60fdee5ac0331f8eb70320a24d591a3be197b94a9", size = 133667, upload-time = "2025-06-10T00:43:44.369Z" }, + { url = "https://files.pythonhosted.org/packages/67/38/688577a1cb1e656e3971fb66a3492501c5a5df56d99722e57c98249e5b8a/yarl-1.20.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b29a2c385a5f5b9c7d9347e5812b6f7ab267193c62d282a540b4fc528c8a9d2a", size = 91025, upload-time = "2025-06-10T00:43:46.295Z" }, + { url = "https://files.pythonhosted.org/packages/50/ec/72991ae51febeb11a42813fc259f0d4c8e0507f2b74b5514618d8b640365/yarl-1.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1112ae8154186dfe2de4732197f59c05a83dc814849a5ced892b708033f40dc2", size = 89709, upload-time = "2025-06-10T00:43:48.22Z" }, + { url = "https://files.pythonhosted.org/packages/99/da/4d798025490e89426e9f976702e5f9482005c548c579bdae792a4c37769e/yarl-1.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:90bbd29c4fe234233f7fa2b9b121fb63c321830e5d05b45153a2ca68f7d310ee", size = 352287, upload-time = "2025-06-10T00:43:49.924Z" }, + { url = "https://files.pythonhosted.org/packages/1a/26/54a15c6a567aac1c61b18aa0f4b8aa2e285a52d547d1be8bf48abe2b3991/yarl-1.20.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:680e19c7ce3710ac4cd964e90dad99bf9b5029372ba0c7cbfcd55e54d90ea819", size = 345429, upload-time = "2025-06-10T00:43:51.7Z" }, + { url = 
"https://files.pythonhosted.org/packages/d6/95/9dcf2386cb875b234353b93ec43e40219e14900e046bf6ac118f94b1e353/yarl-1.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4a979218c1fdb4246a05efc2cc23859d47c89af463a90b99b7c56094daf25a16", size = 365429, upload-time = "2025-06-10T00:43:53.494Z" }, + { url = "https://files.pythonhosted.org/packages/91/b2/33a8750f6a4bc224242a635f5f2cff6d6ad5ba651f6edcccf721992c21a0/yarl-1.20.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255b468adf57b4a7b65d8aad5b5138dce6a0752c139965711bdcb81bc370e1b6", size = 363862, upload-time = "2025-06-10T00:43:55.766Z" }, + { url = "https://files.pythonhosted.org/packages/98/28/3ab7acc5b51f4434b181b0cee8f1f4b77a65919700a355fb3617f9488874/yarl-1.20.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a97d67108e79cfe22e2b430d80d7571ae57d19f17cda8bb967057ca8a7bf5bfd", size = 355616, upload-time = "2025-06-10T00:43:58.056Z" }, + { url = "https://files.pythonhosted.org/packages/36/a3/f666894aa947a371724ec7cd2e5daa78ee8a777b21509b4252dd7bd15e29/yarl-1.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8570d998db4ddbfb9a590b185a0a33dbf8aafb831d07a5257b4ec9948df9cb0a", size = 339954, upload-time = "2025-06-10T00:43:59.773Z" }, + { url = "https://files.pythonhosted.org/packages/f1/81/5f466427e09773c04219d3450d7a1256138a010b6c9f0af2d48565e9ad13/yarl-1.20.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:97c75596019baae7c71ccf1d8cc4738bc08134060d0adfcbe5642f778d1dca38", size = 365575, upload-time = "2025-06-10T00:44:02.051Z" }, + { url = "https://files.pythonhosted.org/packages/2e/e3/e4b0ad8403e97e6c9972dd587388940a032f030ebec196ab81a3b8e94d31/yarl-1.20.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:1c48912653e63aef91ff988c5432832692ac5a1d8f0fb8a33091520b5bbe19ef", size = 365061, upload-time = "2025-06-10T00:44:04.196Z" }, + { url = "https://files.pythonhosted.org/packages/ac/99/b8a142e79eb86c926f9f06452eb13ecb1bb5713bd01dc0038faf5452e544/yarl-1.20.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:4c3ae28f3ae1563c50f3d37f064ddb1511ecc1d5584e88c6b7c63cf7702a6d5f", size = 364142, upload-time = "2025-06-10T00:44:06.527Z" }, + { url = "https://files.pythonhosted.org/packages/34/f2/08ed34a4a506d82a1a3e5bab99ccd930a040f9b6449e9fd050320e45845c/yarl-1.20.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c5e9642f27036283550f5f57dc6156c51084b458570b9d0d96100c8bebb186a8", size = 381894, upload-time = "2025-06-10T00:44:08.379Z" }, + { url = "https://files.pythonhosted.org/packages/92/f8/9a3fbf0968eac704f681726eff595dce9b49c8a25cd92bf83df209668285/yarl-1.20.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:2c26b0c49220d5799f7b22c6838409ee9bc58ee5c95361a4d7831f03cc225b5a", size = 383378, upload-time = "2025-06-10T00:44:10.51Z" }, + { url = "https://files.pythonhosted.org/packages/af/85/9363f77bdfa1e4d690957cd39d192c4cacd1c58965df0470a4905253b54f/yarl-1.20.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:564ab3d517e3d01c408c67f2e5247aad4019dcf1969982aba3974b4093279004", size = 374069, upload-time = "2025-06-10T00:44:12.834Z" }, + { url = "https://files.pythonhosted.org/packages/35/99/9918c8739ba271dcd935400cff8b32e3cd319eaf02fcd023d5dcd487a7c8/yarl-1.20.1-cp312-cp312-win32.whl", hash = "sha256:daea0d313868da1cf2fac6b2d3a25c6e3a9e879483244be38c8e6a41f1d876a5", size = 81249, upload-time = "2025-06-10T00:44:14.731Z" }, + { url = 
"https://files.pythonhosted.org/packages/eb/83/5d9092950565481b413b31a23e75dd3418ff0a277d6e0abf3729d4d1ce25/yarl-1.20.1-cp312-cp312-win_amd64.whl", hash = "sha256:48ea7d7f9be0487339828a4de0360d7ce0efc06524a48e1810f945c45b813698", size = 86710, upload-time = "2025-06-10T00:44:16.716Z" }, + { url = "https://files.pythonhosted.org/packages/8a/e1/2411b6d7f769a07687acee88a062af5833cf1966b7266f3d8dfb3d3dc7d3/yarl-1.20.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:0b5ff0fbb7c9f1b1b5ab53330acbfc5247893069e7716840c8e7d5bb7355038a", size = 131811, upload-time = "2025-06-10T00:44:18.933Z" }, + { url = "https://files.pythonhosted.org/packages/b2/27/584394e1cb76fb771371770eccad35de400e7b434ce3142c2dd27392c968/yarl-1.20.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:14f326acd845c2b2e2eb38fb1346c94f7f3b01a4f5c788f8144f9b630bfff9a3", size = 90078, upload-time = "2025-06-10T00:44:20.635Z" }, + { url = "https://files.pythonhosted.org/packages/bf/9a/3246ae92d4049099f52d9b0fe3486e3b500e29b7ea872d0f152966fc209d/yarl-1.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f60e4ad5db23f0b96e49c018596707c3ae89f5d0bd97f0ad3684bcbad899f1e7", size = 88748, upload-time = "2025-06-10T00:44:22.34Z" }, + { url = "https://files.pythonhosted.org/packages/a3/25/35afe384e31115a1a801fbcf84012d7a066d89035befae7c5d4284df1e03/yarl-1.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:49bdd1b8e00ce57e68ba51916e4bb04461746e794e7c4d4bbc42ba2f18297691", size = 349595, upload-time = "2025-06-10T00:44:24.314Z" }, + { url = "https://files.pythonhosted.org/packages/28/2d/8aca6cb2cabc8f12efcb82749b9cefecbccfc7b0384e56cd71058ccee433/yarl-1.20.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:66252d780b45189975abfed839616e8fd2dbacbdc262105ad7742c6ae58f3e31", size = 342616, upload-time = "2025-06-10T00:44:26.167Z" }, + { url = "https://files.pythonhosted.org/packages/0b/e9/1312633d16b31acf0098d30440ca855e3492d66623dafb8e25b03d00c3da/yarl-1.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:59174e7332f5d153d8f7452a102b103e2e74035ad085f404df2e40e663a22b28", size = 361324, upload-time = "2025-06-10T00:44:27.915Z" }, + { url = "https://files.pythonhosted.org/packages/bc/a0/688cc99463f12f7669eec7c8acc71ef56a1521b99eab7cd3abb75af887b0/yarl-1.20.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e3968ec7d92a0c0f9ac34d5ecfd03869ec0cab0697c91a45db3fbbd95fe1b653", size = 359676, upload-time = "2025-06-10T00:44:30.041Z" }, + { url = "https://files.pythonhosted.org/packages/af/44/46407d7f7a56e9a85a4c207724c9f2c545c060380718eea9088f222ba697/yarl-1.20.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d1a4fbb50e14396ba3d375f68bfe02215d8e7bc3ec49da8341fe3157f59d2ff5", size = 352614, upload-time = "2025-06-10T00:44:32.171Z" }, + { url = "https://files.pythonhosted.org/packages/b1/91/31163295e82b8d5485d31d9cf7754d973d41915cadce070491778d9c9825/yarl-1.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11a62c839c3a8eac2410e951301309426f368388ff2f33799052787035793b02", size = 336766, upload-time = "2025-06-10T00:44:34.494Z" }, + { url = "https://files.pythonhosted.org/packages/b4/8e/c41a5bc482121f51c083c4c2bcd16b9e01e1cf8729e380273a952513a21f/yarl-1.20.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:041eaa14f73ff5a8986b4388ac6bb43a77f2ea09bf1913df7a35d4646db69e53", size = 364615, upload-time = 
"2025-06-10T00:44:36.856Z" }, + { url = "https://files.pythonhosted.org/packages/e3/5b/61a3b054238d33d70ea06ebba7e58597891b71c699e247df35cc984ab393/yarl-1.20.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:377fae2fef158e8fd9d60b4c8751387b8d1fb121d3d0b8e9b0be07d1b41e83dc", size = 360982, upload-time = "2025-06-10T00:44:39.141Z" }, + { url = "https://files.pythonhosted.org/packages/df/a3/6a72fb83f8d478cb201d14927bc8040af901811a88e0ff2da7842dd0ed19/yarl-1.20.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:1c92f4390e407513f619d49319023664643d3339bd5e5a56a3bebe01bc67ec04", size = 369792, upload-time = "2025-06-10T00:44:40.934Z" }, + { url = "https://files.pythonhosted.org/packages/7c/af/4cc3c36dfc7c077f8dedb561eb21f69e1e9f2456b91b593882b0b18c19dc/yarl-1.20.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:d25ddcf954df1754ab0f86bb696af765c5bfaba39b74095f27eececa049ef9a4", size = 382049, upload-time = "2025-06-10T00:44:42.854Z" }, + { url = "https://files.pythonhosted.org/packages/19/3a/e54e2c4752160115183a66dc9ee75a153f81f3ab2ba4bf79c3c53b33de34/yarl-1.20.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:909313577e9619dcff8c31a0ea2aa0a2a828341d92673015456b3ae492e7317b", size = 384774, upload-time = "2025-06-10T00:44:45.275Z" }, + { url = "https://files.pythonhosted.org/packages/9c/20/200ae86dabfca89060ec6447649f219b4cbd94531e425e50d57e5f5ac330/yarl-1.20.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:793fd0580cb9664548c6b83c63b43c477212c0260891ddf86809e1c06c8b08f1", size = 374252, upload-time = "2025-06-10T00:44:47.31Z" }, + { url = "https://files.pythonhosted.org/packages/83/75/11ee332f2f516b3d094e89448da73d557687f7d137d5a0f48c40ff211487/yarl-1.20.1-cp313-cp313-win32.whl", hash = "sha256:468f6e40285de5a5b3c44981ca3a319a4b208ccc07d526b20b12aeedcfa654b7", size = 81198, upload-time = "2025-06-10T00:44:49.164Z" }, + { url = "https://files.pythonhosted.org/packages/ba/ba/39b1ecbf51620b40ab402b0fc817f0ff750f6d92712b44689c2c215be89d/yarl-1.20.1-cp313-cp313-win_amd64.whl", hash = "sha256:495b4ef2fea40596bfc0affe3837411d6aa3371abcf31aac0ccc4bdd64d4ef5c", size = 86346, upload-time = "2025-06-10T00:44:51.182Z" }, + { url = "https://files.pythonhosted.org/packages/43/c7/669c52519dca4c95153c8ad96dd123c79f354a376346b198f438e56ffeb4/yarl-1.20.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:f60233b98423aab21d249a30eb27c389c14929f47be8430efa7dbd91493a729d", size = 138826, upload-time = "2025-06-10T00:44:52.883Z" }, + { url = "https://files.pythonhosted.org/packages/6a/42/fc0053719b44f6ad04a75d7f05e0e9674d45ef62f2d9ad2c1163e5c05827/yarl-1.20.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:6f3eff4cc3f03d650d8755c6eefc844edde99d641d0dcf4da3ab27141a5f8ddf", size = 93217, upload-time = "2025-06-10T00:44:54.658Z" }, + { url = "https://files.pythonhosted.org/packages/4f/7f/fa59c4c27e2a076bba0d959386e26eba77eb52ea4a0aac48e3515c186b4c/yarl-1.20.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:69ff8439d8ba832d6bed88af2c2b3445977eba9a4588b787b32945871c2444e3", size = 92700, upload-time = "2025-06-10T00:44:56.784Z" }, + { url = "https://files.pythonhosted.org/packages/2f/d4/062b2f48e7c93481e88eff97a6312dca15ea200e959f23e96d8ab898c5b8/yarl-1.20.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3cf34efa60eb81dd2645a2e13e00bb98b76c35ab5061a3989c7a70f78c85006d", size = 347644, upload-time = "2025-06-10T00:44:59.071Z" }, + { url = 
"https://files.pythonhosted.org/packages/89/47/78b7f40d13c8f62b499cc702fdf69e090455518ae544c00a3bf4afc9fc77/yarl-1.20.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:8e0fe9364ad0fddab2688ce72cb7a8e61ea42eff3c7caeeb83874a5d479c896c", size = 323452, upload-time = "2025-06-10T00:45:01.605Z" }, + { url = "https://files.pythonhosted.org/packages/eb/2b/490d3b2dc66f52987d4ee0d3090a147ea67732ce6b4d61e362c1846d0d32/yarl-1.20.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8f64fbf81878ba914562c672024089e3401974a39767747691c65080a67b18c1", size = 346378, upload-time = "2025-06-10T00:45:03.946Z" }, + { url = "https://files.pythonhosted.org/packages/66/ad/775da9c8a94ce925d1537f939a4f17d782efef1f973039d821cbe4bcc211/yarl-1.20.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f6342d643bf9a1de97e512e45e4b9560a043347e779a173250824f8b254bd5ce", size = 353261, upload-time = "2025-06-10T00:45:05.992Z" }, + { url = "https://files.pythonhosted.org/packages/4b/23/0ed0922b47a4f5c6eb9065d5ff1e459747226ddce5c6a4c111e728c9f701/yarl-1.20.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:56dac5f452ed25eef0f6e3c6a066c6ab68971d96a9fb441791cad0efba6140d3", size = 335987, upload-time = "2025-06-10T00:45:08.227Z" }, + { url = "https://files.pythonhosted.org/packages/3e/49/bc728a7fe7d0e9336e2b78f0958a2d6b288ba89f25a1762407a222bf53c3/yarl-1.20.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7d7f497126d65e2cad8dc5f97d34c27b19199b6414a40cb36b52f41b79014be", size = 329361, upload-time = "2025-06-10T00:45:10.11Z" }, + { url = "https://files.pythonhosted.org/packages/93/8f/b811b9d1f617c83c907e7082a76e2b92b655400e61730cd61a1f67178393/yarl-1.20.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:67e708dfb8e78d8a19169818eeb5c7a80717562de9051bf2413aca8e3696bf16", size = 346460, upload-time = "2025-06-10T00:45:12.055Z" }, + { url = "https://files.pythonhosted.org/packages/70/fd/af94f04f275f95da2c3b8b5e1d49e3e79f1ed8b6ceb0f1664cbd902773ff/yarl-1.20.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:595c07bc79af2494365cc96ddeb772f76272364ef7c80fb892ef9d0649586513", size = 334486, upload-time = "2025-06-10T00:45:13.995Z" }, + { url = "https://files.pythonhosted.org/packages/84/65/04c62e82704e7dd0a9b3f61dbaa8447f8507655fd16c51da0637b39b2910/yarl-1.20.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:7bdd2f80f4a7df852ab9ab49484a4dee8030023aa536df41f2d922fd57bf023f", size = 342219, upload-time = "2025-06-10T00:45:16.479Z" }, + { url = "https://files.pythonhosted.org/packages/91/95/459ca62eb958381b342d94ab9a4b6aec1ddec1f7057c487e926f03c06d30/yarl-1.20.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:c03bfebc4ae8d862f853a9757199677ab74ec25424d0ebd68a0027e9c639a390", size = 350693, upload-time = "2025-06-10T00:45:18.399Z" }, + { url = "https://files.pythonhosted.org/packages/a6/00/d393e82dd955ad20617abc546a8f1aee40534d599ff555ea053d0ec9bf03/yarl-1.20.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:344d1103e9c1523f32a5ed704d576172d2cabed3122ea90b1d4e11fe17c66458", size = 355803, upload-time = "2025-06-10T00:45:20.677Z" }, + { url = "https://files.pythonhosted.org/packages/9e/ed/c5fb04869b99b717985e244fd93029c7a8e8febdfcffa06093e32d7d44e7/yarl-1.20.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:88cab98aa4e13e1ade8c141daeedd300a4603b7132819c484841bb7af3edce9e", size = 341709, upload-time = 
"2025-06-10T00:45:23.221Z" }, + { url = "https://files.pythonhosted.org/packages/24/fd/725b8e73ac2a50e78a4534ac43c6addf5c1c2d65380dd48a9169cc6739a9/yarl-1.20.1-cp313-cp313t-win32.whl", hash = "sha256:b121ff6a7cbd4abc28985b6028235491941b9fe8fe226e6fdc539c977ea1739d", size = 86591, upload-time = "2025-06-10T00:45:25.793Z" }, + { url = "https://files.pythonhosted.org/packages/94/c3/b2e9f38bc3e11191981d57ea08cab2166e74ea770024a646617c9cddd9f6/yarl-1.20.1-cp313-cp313t-win_amd64.whl", hash = "sha256:541d050a355bbbc27e55d906bc91cb6fe42f96c01413dd0f4ed5a5240513874f", size = 93003, upload-time = "2025-06-10T00:45:27.752Z" }, + { url = "https://files.pythonhosted.org/packages/b4/2d/2345fce04cfd4bee161bf1e7d9cdc702e3e16109021035dbb24db654a622/yarl-1.20.1-py3-none-any.whl", hash = "sha256:83b8eb083fe4683c6115795d9fc1cfaf2cbbefb19b3a1cb68f6527460f483a77", size = 46542, upload-time = "2025-06-10T00:46:07.521Z" }, +] + +[[package]] +name = "zipp" +version = "3.23.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e3/02/0f2892c661036d50ede074e376733dca2ae7c6eb617489437771209d4180/zipp-3.23.0.tar.gz", hash = "sha256:a07157588a12518c9d4034df3fbbee09c814741a33ff63c05fa29d26a2404166", size = 25547, upload-time = "2025-06-08T17:06:39.4Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2e/54/647ade08bf0db230bfea292f893923872fd20be6ac6f53b2b936ba839d75/zipp-3.23.0-py3-none-any.whl", hash = "sha256:071652d6115ed432f5ce1d34c336c0adfd6a884660d1e9712a256d3d3bd4b14e", size = 10276, upload-time = "2025-06-08T17:06:38.034Z" }, +] diff --git a/docker-compose.yaml b/docker-compose.yaml new file mode 100644 index 0000000..c4c3525 --- /dev/null +++ b/docker-compose.yaml @@ -0,0 +1,227 @@ +services: + registry: + image: registry:2 + restart: unless-stopped + ports: + - "5001:5000" + volumes: + - registry_data:/var/lib/registry + healthcheck: + test: ["CMD-SHELL", "wget -q --spider http://localhost:5000/v2/ || exit 1"] + interval: 10s + timeout: 5s + retries: 3 + + postgres: + image: postgres:14 + environment: + POSTGRES_USER: prefect + POSTGRES_PASSWORD: prefect + POSTGRES_DB: prefect + volumes: + - postgres_data:/var/lib/postgresql/data + healthcheck: + test: ["CMD-SHELL", "pg_isready -U prefect"] + interval: 5s + timeout: 5s + retries: 5 + + redis: + image: redis:7 + volumes: + - redis_data:/data + healthcheck: + test: ["CMD-SHELL", "redis-cli ping"] + interval: 5s + timeout: 5s + retries: 5 + + prefect-server: + image: prefecthq/prefect:3-latest + depends_on: + postgres: + condition: service_healthy + redis: + condition: service_healthy + environment: + PREFECT_API_DATABASE_CONNECTION_URL: postgresql+asyncpg://prefect:prefect@postgres:5432/prefect + PREFECT_SERVER_API_HOST: 0.0.0.0 + PREFECT_API_URL: http://prefect-server:4200/api + PREFECT_MESSAGING_BROKER: prefect_redis.messaging + PREFECT_MESSAGING_CACHE: prefect_redis.messaging + PREFECT_REDIS_MESSAGING_HOST: redis + PREFECT_REDIS_MESSAGING_PORT: 6379 + PREFECT_REDIS_MESSAGING_DB: 0 + PREFECT_LOCAL_STORAGE_PATH: /prefect-storage + PREFECT_RESULTS_PERSIST_BY_DEFAULT: "true" + command: > + sh -c " + mkdir -p /prefect-storage && + chmod 755 /prefect-storage && + prefect server start --no-services + " + ports: + - "4200:4200" + volumes: + - prefect_storage:/prefect-storage + + prefect-services: + image: prefecthq/prefect:3-latest + depends_on: + postgres: + condition: service_healthy + redis: + condition: service_healthy + environment: + PREFECT_API_DATABASE_CONNECTION_URL: 
postgresql+asyncpg://prefect:prefect@postgres:5432/prefect + PREFECT_MESSAGING_BROKER: prefect_redis.messaging + PREFECT_MESSAGING_CACHE: prefect_redis.messaging + PREFECT_REDIS_MESSAGING_HOST: redis + PREFECT_REDIS_MESSAGING_PORT: 6379 + PREFECT_REDIS_MESSAGING_DB: 0 + PREFECT_LOCAL_STORAGE_PATH: /prefect-storage + PREFECT_RESULTS_PERSIST_BY_DEFAULT: "true" + command: > + sh -c " + mkdir -p /prefect-storage && + chmod 755 /prefect-storage && + prefect server services start + " + volumes: + - prefect_storage:/prefect-storage + + docker-proxy: + image: tecnativa/docker-socket-proxy + environment: + # Enable permissions needed for Prefect worker container creation and management + CONTAINERS: 1 + IMAGES: 1 + BUILD: 1 + VOLUMES: 1 + NETWORKS: 1 + SERVICES: 1 # Required for some container operations + TASKS: 1 # Required for container management + NODES: 1 # Required for container scheduling + GET: 1 + POST: 1 + PUT: 1 + DELETE: 1 + HEAD: 1 + INFO: 1 + VERSION: 1 + PING: 1 + EVENTS: 1 + DISTRIBUTION: 1 + AUTH: 1 + # Still block the most dangerous operations + SYSTEM: 0 + SWARM: 0 + EXEC: 0 # Keep container exec blocked for security + volumes: + - /var/run/docker.sock:/var/run/docker.sock:ro + ports: + - "2375" + networks: + - default + + prefect-worker: + image: prefecthq/prefect:3-latest + depends_on: + prefect-server: + condition: service_started + docker-proxy: + condition: service_started + registry: + condition: service_healthy + environment: + PREFECT_API_URL: http://prefect-server:4200/api + PREFECT_LOCAL_STORAGE_PATH: /prefect-storage + PREFECT_RESULTS_PERSIST_BY_DEFAULT: "true" + DOCKER_HOST: tcp://docker-proxy:2375 + DOCKER_BUILDKIT: 1 # Enable BuildKit for better performance + DOCKER_CONFIG: /tmp/docker + # Registry URLs (set REGISTRY_HOST in your environment or .env) + # - macOS/Windows Docker Desktop: REGISTRY_HOST=host.docker.internal + # - Linux: REGISTRY_HOST=localhost (default) + FUZZFORGE_REGISTRY_PUSH_URL: "${REGISTRY_HOST:-localhost}:5001" + FUZZFORGE_REGISTRY_PULL_URL: "${REGISTRY_HOST:-localhost}:5001" + command: > + sh -c " + mkdir -p /tmp/docker && + mkdir -p /prefect-storage && + chmod 755 /prefect-storage && + echo '{\"insecure-registries\": [\"registry:5000\", \"localhost:5001\", \"host.docker.internal:5001\"]}' > /tmp/docker/config.json && + pip install 'prefect[docker]' && + prefect worker start --pool docker-pool + " + volumes: + - prefect_storage:/prefect-storage # Access to shared storage for results + - toolbox_code:/opt/prefect/toolbox:ro # Access to toolbox code for building + networks: + - default + extra_hosts: + - "host.docker.internal:host-gateway" + + fuzzforge-backend: + build: + context: ./backend + dockerfile: Dockerfile + depends_on: + prefect-server: + condition: service_started + docker-proxy: + condition: service_started + registry: + condition: service_healthy + environment: + PREFECT_API_URL: http://prefect-server:4200/api + PREFECT_LOCAL_STORAGE_PATH: /prefect-storage + PREFECT_RESULTS_PERSIST_BY_DEFAULT: "true" + DOCKER_HOST: tcp://docker-proxy:2375 + DOCKER_BUILDKIT: 1 + DOCKER_CONFIG: /tmp/docker + DOCKER_TLS_VERIFY: "" + DOCKER_REGISTRY_INSECURE: "registry:5000,localhost:5001,host.docker.internal:5001" + # Registry URLs (set REGISTRY_HOST in your environment or .env) + # - macOS/Windows Docker Desktop: REGISTRY_HOST=host.docker.internal + # - Linux: REGISTRY_HOST=localhost (default) + FUZZFORGE_REGISTRY_PUSH_URL: "${REGISTRY_HOST:-localhost}:5001" + FUZZFORGE_REGISTRY_PULL_URL: "${REGISTRY_HOST:-localhost}:5001" + ports: + - "8000:8000" + - 
"8010:8010" + volumes: + - prefect_storage:/prefect-storage + - ./backend/toolbox:/app/toolbox:ro # Direct host mount (read-only) for live updates + - toolbox_code:/opt/prefect/toolbox # Share toolbox code with workers + - ./test_projects:/app/test_projects:ro # Test projects for workflow testing + networks: + - default + extra_hosts: + - "host.docker.internal:host-gateway" + # Sync toolbox code to shared volume and start server with live reload + command: > + sh -c " + mkdir -p /opt/prefect/toolbox && + mkdir -p /prefect-storage && + mkdir -p /tmp/docker && + chmod 755 /prefect-storage && + echo '{\"insecure-registries\": [\"registry:5000\", \"localhost:5001\", \"host.docker.internal:5001\"]}' > /tmp/docker/config.json && + cp -r /app/toolbox/* /opt/prefect/toolbox/ 2>/dev/null || true && + (while true; do + rsync -av --delete /app/toolbox/ /opt/prefect/toolbox/ > /dev/null 2>&1 || true + sleep 10 + done) & + uv run uvicorn src.main:app --host 0.0.0.0 --port 8000 --reload + " + +volumes: + postgres_data: + redis_data: + prefect_storage: + toolbox_code: + registry_data: + +networks: + default: + name: fuzzforge_alpha_default diff --git a/docs/.gitignore b/docs/.gitignore new file mode 100644 index 0000000..b2d6de3 --- /dev/null +++ b/docs/.gitignore @@ -0,0 +1,20 @@ +# Dependencies +/node_modules + +# Production +/build + +# Generated files +.docusaurus +.cache-loader + +# Misc +.DS_Store +.env.local +.env.development.local +.env.test.local +.env.production.local + +npm-debug.log* +yarn-debug.log* +yarn-error.log* diff --git a/docs/README.md b/docs/README.md new file mode 100644 index 0000000..5f6c5d6 --- /dev/null +++ b/docs/README.md @@ -0,0 +1,25 @@ +# FuzzForge Documentation + +This website is built using [Docusaurus](https://docusaurus.io/), a modern static website generator. + +## Installation + +```bash +yarn +``` + +## Local Development + +```bash +yarn start +``` + +This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server. + +## Build + +```bash +yarn build +``` + +This command generates static content into the `build` directory and can be served using any static contents hosting service. diff --git a/docs/docs/ai/a2a-services.md b/docs/docs/ai/a2a-services.md new file mode 100644 index 0000000..aac4c94 --- /dev/null +++ b/docs/docs/ai/a2a-services.md @@ -0,0 +1,150 @@ +# A2A Services + +The FuzzForge AI module can expose itself as an Agent-to-Agent (A2A) server so downstream systems can register the agent, inspect its card, and call tools over HTTP. + +## Starting the Server + +```bash +fuzzforge ai server +``` + +Run the command from a project directory that already contains `.fuzzforge/`. The server reads the project configuration and reuses the same environment variables as the CLI shell. 
+ +**Default directories** +- Logs: `.fuzzforge/logs/cognee.log` +- Cognee datasets: `.fuzzforge/cognee/project_/{data,system}` +- Artifact cache: `.fuzzforge/artifacts` + +## HTTP Endpoints + +| Method | Path | Purpose | +| --- | --- | --- | +| `GET` | `/artifacts/{id}` | Download artifacts created by the agent, workflows, or remote collaborators | +| `POST` | `/graph/query` | Query the Cognee project graph using `query`, optional `dataset`, and optional `search_type` | +| `POST` | `/project/files` | Mirror a file from the project workspace as a downloadable artifact | + +### `POST /graph/query` + +Request body: +- `query` *(str, required)* – Natural language question for the graph +- `search_type` *(str, optional)* – e.g. `GRAPH_COMPLETION`, `INSIGHTS`, `CHUNKS` +- `dataset` *(str, optional)* – Defaults to `_codebase` + +Example: + +```bash +curl -s http://localhost:10100/graph/query \ + -H 'Content-Type: application/json' \ + -d '{"query":"unsafe Rust", "search_type":"GRAPH_COMPLETION"}' | jq +``` + +### `POST /project/files` + +Registers a source file and returns an artifact descriptor. + +```bash +curl -s http://localhost:10100/project/files \ + -H 'Content-Type: application/json' \ + -d '{"path":"src/lib.rs"}' | jq +``` + +Response excerpt: + +```json +{ + "id": "project_file_4325a8a6", + "file_uri": "http://127.0.0.1:10100/artifacts/project_file_4325a8a6", + "name": "src/lib.rs", + "mime_type": "text/x-c", + "size": 160 +} +``` + +## Typical Collaboration Flow + +1. Ingest project knowledge with `fuzzforge rag ingest --path . --recursive`. +2. Start the A2A server: `fuzzforge ai server`. +3. Downstream agents: + - Call `POST /graph/query` to explore project knowledge. + - Call `POST /project/files` to fetch raw files from the repository. + - Download finished scan summaries with `GET /artifacts/{id}`. +4. The AI module pushes Prefect workflow results into artifacts automatically, so remote agents can poll without re-running scans. + +## Registration Flow + +```mermaid +sequenceDiagram + participant Client as Remote Agent + participant HTTP as A2A HTTP Server + participant Exec as FuzzForgeExecutor + participant Registry as Agent Registry + + Client->>HTTP: GET /.well-known/agent-card.json + HTTP-->>Client: Agent card (skills, protocol version) + Client->>HTTP: POST / (register) + HTTP->>Exec: Register request + Exec->>Registry: Persist remote agent metadata + Exec-->>HTTP: Confirmation + assigned agent ID + HTTP-->>Client: Success response + Client->>Exec: Subsequent messages routed via HTTP endpoints + Exec->>Registry: Update capability cache per message +``` + +### How registration works + +1. **Discovery** – A remote agent fetches `/.well-known/agent-card.json` to confirm skills, protocol version, and message schemas (see the sketch after this list). +2. **Handshake** – The remote agent issues `POST /` to start the A2A session. The payload includes its agent card and callback URL. +3. **Persistence** – `FuzzForgeExecutor` stores the remote agent in the registry (`agents.yaml` when run via the CLI). Auto-registration on future boots replays these entries. +4. **Capability cache** – Each inbound message updates the capability cache so the router can dispatch `ROUTE_TO AgentName:` commands without another round-trip. +5. **Teardown** – Removing an agent via `/unregister` purges it from the registry; restart the server to drop any lingering connections.
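+ +The discovery step can be exercised by hand before wiring up a client. A minimal sketch (the JSON field names `protocolVersion` and `skills` come from the public A2A agent-card schema and are assumptions here, not something this page specifies): + +```bash +# Step 1 of the handshake: pull the card and inspect what the server advertises +curl -s http://localhost:10100/.well-known/agent-card.json \ + | jq '{protocolVersion, skills: [.skills[]?.name]}' +```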
When a project is initialised, registrations are written to `.fuzzforge/agents.yaml` (see `ai/src/fuzzforge_ai/config_manager.py`). +- If the project file is absent, the executor falls back to the packaged default at `ai/src/fuzzforge_ai/config.yaml`. +- Each entry records `name`, `url`, and description. On startup `_auto_register_agents()` replays that list so both the CLI and the A2A server automatically reconnect to known peers. +- Editing `.fuzzforge/agents.yaml` manually is supported; the CLI `/register` and `/unregister` commands update it for you. + +## Agent Card + +The server exposes its agent card at `/.well-known/agent-card.json`. Clients can read that metadata to confirm skills, supported message schemas, and protocol version (`0.3.0`). + +## Artifacts in A2A mode + +- **Creation** โ€“ Conversations generate artifacts automatically when the executor produces code, reports, or workflow summaries. The `/artifacts` CLI command lists them; over HTTP they are addressed by `GET /artifacts/{id}`. +- **Distribution** โ€“ Use `/sendfile [note]` in the CLI or call `POST /project/files` programmatically to turn a project file into an artifact that downstream agents can fetch. +- **Download** โ€“ Remote agents receive the artifact descriptor (including `file_uri`) in A2A responses or via polling. Retrieve the content with `GET /artifacts/{id}`; the cache lives under `.fuzzforge/artifacts/`. +- **Lifecycle** โ€“ Artifacts persist for the life of the project workspace. Clean the directory if you need to reclaim space; the executor recreates entries on demand. + +## Running the server vs. CLI-only mode + +- Launch the server with `fuzzforge ai server`. It loads `.fuzzforge/.env`, sets up Cognee directories via `ProjectConfigManager`, and exposes HTTP endpoints on `127.0.0.1:${FUZZFORGE_PORT:-10100}`. +- Without the server, the `fuzzforge ai agent` CLI still supports A2A-style routing for locally registered peers, but external agents cannot connect because the HTTP surface is absent. +- When the server is running, both the CLI and remote agents share the same executor, task store, and artifact cache. Stopping the server returns the module to CLI-only operation without altering persisted registrations. + +## Communication Patterns + +```mermaid +sequenceDiagram + participant Remote as Remote Agent + participant HTTP as A2A Server + participant Exec as Executor + participant Workflow as Prefect Backend + + Remote->>HTTP: POST / (message with tool request) + HTTP->>Exec: Forward message + Exec->>Workflow: (optional) submit_security_scan_mcp + Workflow-->>Exec: Status / findings + Exec->>HTTP: Response + artifact metadata + HTTP-->>Remote: A2A response with artifacts/tasks + Remote->>HTTP: GET /artifacts/{id} + HTTP-->>Remote: Artifact bytes +``` + +This pattern repeats for subsequent tool invocations. Remote agents can also call the helper endpoints (`/graph/query`, `/project/files`) directly while the conversation is active. + +## Related Files + +- Runtime entry point: `ai/src/fuzzforge_ai/__main__.py` +- HTTP implementation: `ai/src/fuzzforge_ai/a2a_server.py` +- Agent metadata: `ai/src/fuzzforge_ai/agent_card.py` diff --git a/docs/docs/ai/architecture.md b/docs/docs/ai/architecture.md new file mode 100644 index 0000000..60f334b --- /dev/null +++ b/docs/docs/ai/architecture.md @@ -0,0 +1,146 @@ +# AI Architecture + +FuzzForge AI is the orchestration layer that lets large language models drive the broader security platform. 
Built on the Google ADK runtime, the module coordinates local tools, remote Agent-to-Agent (A2A) peers, and Prefect-backed workflows while persisting long-running context for every project. + +## System Diagram + +```mermaid +graph TB + subgraph Surfaces + CLI[CLI Shell] + HTTP[A2A HTTP Server] + end + + CLI --> AgentCore + HTTP --> AgentCore + + subgraph AgentCore [Agent Core] + AgentCoreNode[FuzzForgeAgent] + AgentCoreNode --> Executor[Executor] + AgentCoreNode --> Memory[Memory Services] + AgentCoreNode --> Registry[Agent Registry] + end + + Executor --> MCP[MCP Workflow Bridge] + Executor --> Router[Capability Router] + Executor --> Files[Artifact Manager] + Executor --> Prompts[Prompt Templates] + + Router --> RemoteAgents[Registered A2A Agents] + MCP --> Prefect[FuzzForge Backend] + Memory --> SessionDB[Session Store] + Memory --> Semantic[Semantic Recall] + Memory --> Graphs[Cognee Graph] + Files --> Artifacts[Artifact Cache] + + +``` + +## Detailed Data Flow + +```mermaid +sequenceDiagram + participant User as User / Remote Agent + participant CLI as CLI / HTTP Surface + participant Exec as FuzzForgeExecutor + participant ADK as ADK Runner + participant Prefect as Prefect Backend + participant Cognee as Cognee + participant Artifact as Artifact Cache + + User->>CLI: Prompt or slash command + CLI->>Exec: Normalised request + context ID + Exec->>ADK: Tool invocation (LiteLLM) + ADK-->>Exec: Structured response / tool result + Exec->>Prefect: (optional) submit workflow via MCP + Prefect-->>Exec: Run status updates + Exec->>Cognee: (optional) knowledge query / ingestion + Cognee-->>Exec: Graph results + Exec->>Artifact: Persist generated files + Exec-->>CLI: Final response + artifact links + task events + CLI-->>User: Rendered answer +``` + +## Entry Points + +- **CLI shell** (`ai/src/fuzzforge_ai/cli.py`) provides the interactive `fuzzforge ai agent` loop. It streams user messages through the executor, wires slash commands for listing agents, sending files, and launching workflows, and keeps session IDs in sync with ADKโ€™s session service. +- **A2A HTTP server** (`ai/src/fuzzforge_ai/a2a_server.py`) wraps the same agent in Starlette. It exposes RPC-compatible endpoints plus helper routes (`/artifacts/{id}`, `/graph/query`, `/project/files`) and reuses the executorโ€™s task store so downstream agents can poll status updates. + +## Core Components + +- **FuzzForgeAgent** (`ai/src/fuzzforge_ai/agent.py`) assembles the runtime: it loads environment variables, constructs the executor, and builds an ADK `Agent` backed by `LiteLlm`. The singleton accessor `get_fuzzforge_agent()` keeps CLI and server instances aligned and shares the generated agent card. +- **FuzzForgeExecutor** (`ai/src/fuzzforge_ai/agent_executor.py`) is the brain. It registers tools, manages session storage (SQLite or in-memory via `DatabaseSessionService` / `InMemorySessionService`), and coordinates artifact storage. The executor also tracks long-running Prefect workflows inside `pending_runs`, produces `TaskStatusUpdateEvent` objects, and funnels every response through ADKโ€™s `Runner` so traces include tool metadata. +- **Remote agent registry** (`ai/src/fuzzforge_ai/remote_agent.py`) holds metadata for downstream agents and handles capability discovery over HTTP. Auto-registration is configured by `ConfigManager` so known agents attach on startup. 
+
+- **Memory services**:
+  - `FuzzForgeMemoryService` and `HybridMemoryManager` (`ai/src/fuzzforge_ai/memory_service.py`) provide conversation recall and bridge to Cognee datasets when configured.
+  - Cognee bootstrap (`ai/src/fuzzforge_ai/cognee_service.py`) ensures ingestion and knowledge queries stay scoped to the current project.
+
+## Workflow Automation
+
+The executor wraps Prefect MCP actions exposed by the backend:
+
+| Tool | Source | Purpose |
+| --- | --- | --- |
+| `list_workflows_mcp` | `ai/src/fuzzforge_ai/agent_executor.py` | Enumerate available scans |
+| `submit_security_scan_mcp` | `agent_executor.py` | Launch a scan and persist run metadata |
+| `get_run_status_mcp` | `agent_executor.py` | Poll Prefect for status and push task events |
+| `get_comprehensive_scan_summary` | `agent_executor.py` | Collect findings and bundle artifacts |
+| `get_backend_status_mcp` | `agent_executor.py` | Block submissions until Prefect reports `ready` |
+
+The CLI surface mirrors these helpers as natural-language prompts (`You> run fuzzforge workflow …`). ADK's `Runner` handles retries and ensures each tool call yields structured `Event` objects for downstream instrumentation.
+
+## Knowledge & Ingestion
+
+- The `fuzzforge ingest` and `fuzzforge rag ingest` commands call into `ai/src/fuzzforge_ai/ingest_utils.py`, which filters file types, ignores caches, and populates Cognee datasets under `.fuzzforge/cognee/project_<id>/`.
+- Runtime queries hit `query_project_knowledge_api` on the executor, which defers to `cognee_service` for dataset lookup and semantic search. When Cognee credentials are absent the tools return a friendly "not configured" response.
+
+## Artifact Pipeline
+
+Artifacts generated during conversations or workflow runs are written to `.fuzzforge/artifacts/`:
+
+1. The executor creates a unique directory per artifact ID and writes the payload (text, JSON, or binary).
+2. Metadata is stored in-memory and, when running under the A2A server, surfaced via `GET /artifacts/{id}`.
+3. File uploads from `/project/files` reuse the same pipeline so remote agents see a consistent interface.
+
+## Task & Event Wiring
+
+- In CLI mode, `FuzzForgeExecutor` bootstraps shared `InMemoryTaskStore` and `InMemoryQueueManager` instances (see `agent_executor.py`). They allow the agent to emit `TaskStatusUpdateEvent` objects even when the standalone server is not running.
+- The A2A HTTP wrapper reuses those handles, so any active workflow is visible to both the local shell and remote peers.
+
+Use the complementary docs for step-by-step instructions:
+
+- [Ingestion & Knowledge Graphs](ingestion.md)
+- [LLM & Environment Configuration](configuration.md)
+- [Prompt Patterns & Examples](prompts.md)
+- [A2A Services](a2a-services.md)
+
+## Memory & Persistence
+
+```mermaid
+graph LR
+    subgraph ADK Memory Layer
+        SessionDB[(DatabaseSessionService)]
+        Semantic[Semantic Recall Index]
+    end
+
+    subgraph Project Knowledge
+        CogneeDataset[(Cognee Dataset)]
+        HybridManager[HybridMemoryManager]
+    end
+
+    Prompts[Prompts & Tool Outputs] --> SessionDB
+    SessionDB --> Semantic
+    Ingestion[Ingestion Pipeline] --> CogneeDataset
+    CogneeDataset --> HybridManager
+    HybridManager --> Semantic
+    HybridManager --> Exec[Executor]
+    Exec --> Responses[Responses with Context]
+```
+
+- **Session persistence** is controlled by `SESSION_PERSISTENCE`. When set to `sqlite`, ADK's `DatabaseSessionService` writes transcripts to the path configured by `SESSION_DB_PATH` (defaults to `./fuzzforge_sessions.db`). With `inmemory`, the context is scoped to the current process.
+- **Semantic recall** stores vector embeddings so `/recall` queries can surface earlier prompts, even after restarts when using SQLite.
+- **Hybrid memory manager** (`HybridMemoryManager`) stitches Cognee results into the ADK session. When a knowledge query hits Cognee, the relevant nodes are appended back into the session context so follow-up prompts can reference them naturally.
+- **Cognee datasets** are unique per project. Ingestion runs populate `_codebase` while custom calls to `ingest_to_dataset` let you maintain dedicated buckets (e.g., `insights`). Data is persisted inside `.fuzzforge/cognee/project_<id>/` and shared across CLI and A2A modes.
+- **Task metadata** (workflow runs, artifact descriptors) lives in the executor's in-memory caches but is also mirrored through A2A task events so remote agents can resubscribe if the CLI restarts.
+- **Operational check**: Run `/recall <topic>` or `You> search project knowledge for "topic" using INSIGHTS` after ingestion to confirm both ADK session recall and Cognee graph access are active.
+- **CLI quick check**: `/memory status` summarises the current memory type, session persistence, and Cognee dataset directories from inside the agent shell.
diff --git a/docs/docs/ai/configuration.md b/docs/docs/ai/configuration.md
new file mode 100644
index 0000000..cb42783
--- /dev/null
+++ b/docs/docs/ai/configuration.md
@@ -0,0 +1,122 @@
+# LLM & Environment Configuration
+
+FuzzForge AI relies on LiteLLM adapters embedded in the Google ADK runtime, so you can swap between providers without touching code. Configuration is driven by environment variables inside `.fuzzforge/.env`.
+
+## Minimal Setup
+
+```env
+LLM_PROVIDER=openai
+LITELLM_MODEL=gpt-5-mini
+OPENAI_API_KEY=sk-your-key
+```
+
+Set these values before launching `fuzzforge ai agent` or `python -m fuzzforge_ai`.
+
+## .env Template
+
+`fuzzforge init` creates `.fuzzforge/.env.template` alongside the real secrets file. Keep the template under version control so teammates can copy it to `.fuzzforge/.env` and fill in provider credentials locally. The template includes commented examples for Cognee, AgentOps, and alternative LLM providers; extend it with any project-specific overrides you expect collaborators to set.
+
+## Provider Examples
+
+**OpenAI-compatible (Azure, etc.)**
+```env
+LLM_PROVIDER=azure_openai
+LITELLM_MODEL=gpt-4o-mini
+LLM_API_KEY=sk-your-azure-key
+LLM_ENDPOINT=https://your-resource.openai.azure.com
+```
+
+**Anthropic**
+```env
+LLM_PROVIDER=anthropic
+LITELLM_MODEL=claude-3-haiku-20240307
+ANTHROPIC_API_KEY=sk-your-key
+```
+
+**Ollama (local models)**
+```env
+LLM_PROVIDER=ollama_chat
+LITELLM_MODEL=codellama:latest
+OLLAMA_API_BASE=http://localhost:11434
+```
+Run `ollama pull codellama:latest` ahead of time so the adapter can stream tokens immediately. Any Ollama-hosted model works; set `LITELLM_MODEL` to match the image tag.
+
+**Vertex AI**
+```env
+LLM_PROVIDER=vertex_ai
+LITELLM_MODEL=gemini-1.5-pro
+GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
+```
+
+## Additional LiteLLM Providers
+
+LiteLLM exposes dozens of adapters. Popular additions include:
+
+- `LLM_PROVIDER=anthropic_messages` for Claude 3.5.
+- `LLM_PROVIDER=azure_openai` for Azure-hosted GPT variants.
+- `LLM_PROVIDER=groq` for Groq LPU-backed models (`GROQ_API_KEY` required).
+- `LLM_PROVIDER=ollama_chat` for any local Ollama model.
+- `LLM_PROVIDER=vertex_ai` for Gemini.
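+
+For instance, a Groq setup could look like the following (a minimal sketch; the model name is illustrative, so check the catalog linked below for current identifiers):
+
+```env
+LLM_PROVIDER=groq
+LITELLM_MODEL=llama-3.1-8b-instant
+GROQ_API_KEY=gsk-your-key
+```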
+ +Refer to the [LiteLLM provider catalog](https://docs.litellm.ai/docs/providers) when mapping environment variables; each adapter lists the exact keys the ADK runtime expects. + +## Session Persistence + +``` +SESSION_PERSISTENCE=sqlite # sqlite | inmemory +MEMORY_SERVICE=inmemory # ADK memory backend +``` + +Set `SESSION_PERSISTENCE=sqlite` to preserve conversational history across restarts. For ephemeral sessions, switch to `inmemory`. + +## Knowledge Graph Settings + +To enable Cognee-backed graphs: + +```env +LLM_COGNEE_PROVIDER=openai +LLM_COGNEE_MODEL=gpt-5-mini +LLM_COGNEE_API_KEY=sk-your-key +``` + +If the Cognee variables are omitted, graph-specific tools remain available but return a friendly "not configured" response. + +## MCP / Backend Integration + +```env +FUZZFORGE_MCP_URL=http://localhost:8010/mcp +``` + +The agent uses this endpoint to list, launch, and monitor Prefect workflows. + +## Tracing & Observability + +The executor ships with optional AgentOps tracing. Provide an API key to record conversations, tool calls, and workflow updates: + +```env +AGENTOPS_API_KEY=sk-your-agentops-key +AGENTOPS_ENVIRONMENT=local # Optional tag for dashboards +``` + +Set `FUZZFORGE_DEBUG=1` to surface verbose executor logging and enable additional stdout in the CLI. For HTTP deployments, combine that with: + +```env +LOG_LEVEL=DEBUG +``` + +The ADK runtime also honours `GOOGLE_ADK_TRACE_DIR=/path/to/logs` if you want JSONL traces without an external service. + +## Debugging Flags + +```env +FUZZFORGE_DEBUG=1 # Enables verbose logging +LOG_LEVEL=DEBUG # Applies to the A2A server and CLI +``` + +These flags surface additional insight when diagnosing routing or ingestion issues. Combine them with AgentOps tracing to get full timelines of tool usage. + +## Related Code + +- Env bootstrap: `ai/src/fuzzforge_ai/config_manager.py` +- LiteLLM glue: `ai/src/fuzzforge_ai/agent.py` +- Cognee integration: `ai/src/fuzzforge_ai/cognee_service.py` diff --git a/docs/docs/ai/ingestion.md b/docs/docs/ai/ingestion.md new file mode 100644 index 0000000..0af3c9e --- /dev/null +++ b/docs/docs/ai/ingestion.md @@ -0,0 +1,88 @@ +# Ingestion & Knowledge Graphs + +The AI module keeps long-running context by mirroring your repository into a Cognee-powered knowledge graph and persisting conversations in local storage. + +## CLI Commands + +```bash +# Scan the current project (skips .git/, .fuzzforge/, virtualenvs, caches) +fuzzforge ingest --path . --recursive + +# Alias - identical behaviour +fuzzforge rag ingest --path . --recursive +``` + +The command gathers files using the filters defined in `ai/src/fuzzforge_ai/ingest_utils.py`. By default it includes common source, configuration, and documentation file types while skipping temporary and dependency directories. + +### Customising the File Set + +Use CLI flags to override the defaults: + +```bash +fuzzforge ingest --path backend --file-types .py --file-types .yaml --exclude node_modules --exclude dist +``` + +## Command Options + +`fuzzforge ingest` exposes several flags (see `cli/src/fuzzforge_cli/commands/ingest.py`): + +- `--recursive / -r` โ€“ Traverse sub-directories. +- `--file-types / -t` โ€“ Repeatable flag to whitelist extensions (`-t .py -t .rs`). +- `--exclude / -e` โ€“ Repeatable glob patterns to skip (`-e tests/**`). +- `--dataset / -d` โ€“ Write into a named dataset instead of `_codebase`. +- `--force / -f` โ€“ Clear previous Cognee data before ingesting (prompts for confirmation unless flag supplied). 
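+
+For example, a run that combines several of these flags might look like this (a sketch; the dataset name and exclude pattern are illustrative):
+
+```bash
+fuzzforge ingest --path backend --recursive \
+  --file-types .py --file-types .rs \
+  --exclude "tests/**" \
+  --dataset insights \
+  --force
+```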
+
+All runs automatically skip `.fuzzforge/**` and `.git/**` to avoid recursive ingestion of cache folders.
+
+## Dataset Layout
+
+- Primary dataset: `_codebase`
+- Additional datasets: create ad-hoc buckets such as `insights` via the `ingest_to_dataset` tool
+- Storage location: `.fuzzforge/cognee/project_<id>/`
+
+### Persistence Details
+
+- Every dataset lives under `.fuzzforge/cognee/project_<id>/{data,system}`. These directories are safe to commit to long-lived storage (they only contain embeddings and metadata).
+- Cognee assigns deterministic IDs per project; if you move the repository, copy the entire `.fuzzforge/cognee/` tree to retain graph history.
+- `HybridMemoryManager` ensures answers from Cognee are written back into the ADK session store so future prompts can refer to the same nodes without repeating the query.
+- All Cognee processing runs locally against the files you ingest. No external service calls are made unless you configure a remote Cognee endpoint.
+
+## Prompt Examples
+
+```
+You> refresh the project knowledge graph for ./backend
+Assistant> Kicks off `fuzzforge ingest` with recursive scan
+
+You> search project knowledge for "prefect workflow" using INSIGHTS
+Assistant> Routes to Cognee `search_project_knowledge`
+
+You> ingest_to_dataset("Design doc for new scanner", "insights")
+Assistant> Adds the provided text block to the `insights` dataset
+```
+
+## Environment Template
+
+The CLI writes a template at `.fuzzforge/.env.template` when you initialise a project. Keep it in source control so collaborators can copy it to `.env` and fill in secrets.
+
+```env
+# Core LLM settings
+LLM_PROVIDER=openai
+LITELLM_MODEL=gpt-5-mini
+OPENAI_API_KEY=sk-your-key
+
+# FuzzForge backend (Prefect-powered)
+FUZZFORGE_MCP_URL=http://localhost:8010/mcp
+
+# Optional: knowledge graph provider
+LLM_COGNEE_PROVIDER=openai
+LLM_COGNEE_MODEL=gpt-5-mini
+LLM_COGNEE_API_KEY=sk-your-key
+```
+
+Add comments or project-specific overrides as needed; the agent reads these variables on startup.
+
+## Tips
+
+- Re-run ingestion after significant code changes to keep the knowledge graph fresh.
+- Large binary assets are skipped automatically; store summaries or documentation if you need them searchable.
+- Set `FUZZFORGE_DEBUG=1` to surface verbose ingest logs during troubleshooting.
diff --git a/docs/docs/ai/intro.md b/docs/docs/ai/intro.md
new file mode 100644
index 0000000..073c4b1
--- /dev/null
+++ b/docs/docs/ai/intro.md
@@ -0,0 +1,114 @@
+---
+sidebar_position: 1
+---
+
+# FuzzForge AI Module
+
+FuzzForge AI is the multi-agent layer that lets you operate the FuzzForge security platform through natural language. It orchestrates local tooling, registered Agent-to-Agent (A2A) peers, and the Prefect-powered backend while keeping long-running context in memory and project knowledge graphs.
+
+## Quick Start
+
+1. **Initialise a project**
+   ```bash
+   cd /path/to/project
+   fuzzforge init
+   ```
+2. **Review environment settings** – copy `.fuzzforge/.env.template` to `.fuzzforge/.env`, then edit the values to match your provider. The template ships with commented defaults for OpenAI-style usage and placeholders for Cognee keys.
+   ```env
+   LLM_PROVIDER=openai
+   LITELLM_MODEL=gpt-5-mini
+   OPENAI_API_KEY=sk-your-key
+   FUZZFORGE_MCP_URL=http://localhost:8010/mcp
+   SESSION_PERSISTENCE=sqlite
+   ```
+   Optional flags you may want to enable early:
+   ```env
+   MEMORY_SERVICE=inmemory
+   AGENTOPS_API_KEY=sk-your-agentops-key   # Enable hosted tracing
+   LOG_LEVEL=INFO                          # CLI / server log level
+   ```
+3. **Populate the knowledge graph**
+   ```bash
+   fuzzforge ingest --path . --recursive
+   # alias: fuzzforge rag ingest --path . --recursive
+   ```
+4. **Launch the agent shell**
+   ```bash
+   fuzzforge ai agent
+   ```
+   Keep the backend running (Prefect API at `FUZZFORGE_MCP_URL`) so workflow commands succeed.
+
+## Everyday Workflow
+
+- Run `fuzzforge ai agent` and start with `list available fuzzforge workflows` or `/memory status` to confirm everything is wired.
+- Use natural prompts for automation (`run fuzzforge workflow …`, `search project knowledge for …`) and fall back to slash commands for precision (`/recall`, `/sendfile`).
+- Keep `/memory datasets` handy to see which Cognee datasets are available after each ingest.
+- Start the HTTP surface with `python -m fuzzforge_ai` when external agents need access to artifacts or graph queries. The CLI stays usable at the same time.
+- Refresh the knowledge graph regularly: `fuzzforge ingest --path . --recursive --force` keeps responses aligned with recent code changes.
+
+## What the Agent Can Do
+
+- **Route requests** – automatically selects the right local tool or remote agent using the A2A capability registry.
+- **Run security workflows** – list, submit, and monitor FuzzForge workflows via MCP wrappers.
+- **Manage artifacts** – create downloadable files for reports, code edits, and shared attachments.
+- **Maintain context** – stores session history, semantic recall, and Cognee project graphs.
+- **Serve over HTTP** – expose the same agent as an A2A server using `python -m fuzzforge_ai`.
+
+## Essential Commands
+
+Inside `fuzzforge ai agent` you can mix slash commands and free-form prompts:
+
+```text
+/list                            # Show registered A2A agents
+/register http://<host>:10201    # Add a remote agent
+/artifacts                       # List generated files
+/sendfile SecurityAgent src/report.md "Please review"
+You> route_to SecurityAnalyzer: scan ./backend for secrets
+You> run fuzzforge workflow static_analysis_scan on ./test_projects/demo
+You> search project knowledge for "prefect status" using INSIGHTS
+```
+
+Artifacts created during the conversation are served from `.fuzzforge/artifacts/` and exposed through the A2A HTTP API.
+
+## Memory & Knowledge
+
+The module layers three storage systems:
+
+- **Session persistence** (SQLite or in-memory) for chat transcripts.
+- **Semantic recall** via the ADK memory service for fuzzy search.
+- **Cognee graphs** for project-wide knowledge built from ingestion runs.
+
+Re-run ingestion after major code changes to keep graph answers relevant. If Cognee variables are not set, graph-specific tools automatically respond with a polite "not configured" message.
+
+## Sample Prompts
+
+Use these to validate the setup once the agent shell is running:
+
+- `list available fuzzforge workflows`
+- `run fuzzforge workflow static_analysis_scan on ./backend with target_branch=main`
+- `show findings for that run once it finishes`
+- `refresh the project knowledge graph for ./backend`
+- `search project knowledge for "prefect readiness" using INSIGHTS`
+- `/recall terraform secrets`
+- `/memory status`
+- `ROUTE_TO SecurityAnalyzer: audit infrastructure_vulnerable`
+
+## Need More Detail?
+
+Dive into the dedicated guides in this category:
+
+- [Architecture](./architecture.md) – High-level architecture with diagrams and component breakdowns.
+- [Ingestion](./ingestion.md) – Command options, Cognee persistence, and prompt examples.
+
+- [Configuration](./configuration.md) – LLM provider matrix, local model setup, and tracing options.
+- [Prompts](./prompts.md) – Slash commands, workflow prompts, and routing tips.
+- [A2A Services](./a2a-services.md) – HTTP endpoints, agent card, and collaboration flow.
+- [Memory Persistence](./architecture.md#memory--persistence) – Deep dive on memory storage, datasets, and how `/memory status` inspects them.
+
+## Development Notes
+
+- Entry point for the CLI: `ai/src/fuzzforge_ai/cli.py`
+- A2A HTTP server: `ai/src/fuzzforge_ai/a2a_server.py`
+- Tool routing & workflow glue: `ai/src/fuzzforge_ai/agent_executor.py`
+- Ingestion helpers: `ai/src/fuzzforge_ai/ingest_utils.py`
+
+Install the module in editable mode (`pip install -e ai`) while iterating so CLI changes are picked up immediately.
diff --git a/docs/docs/ai/prompts.md b/docs/docs/ai/prompts.md
new file mode 100644
index 0000000..8649b7f
--- /dev/null
+++ b/docs/docs/ai/prompts.md
@@ -0,0 +1,60 @@
+# Prompt Patterns & Examples
+
+Use the `fuzzforge ai agent` shell to mix structured slash commands with natural requests. The Google ADK runtime keeps conversation context, so follow-ups automatically reuse earlier answers, retrieved files, and workflow IDs.
+
+## Slash Commands
+
+| Command | Purpose | Example |
+| --- | --- | --- |
+| `/list` | Show registered A2A agents | `/list` |
+| `/register <url>` | Register a remote agent card | `/register http://localhost:10201` |
+| `/artifacts` | List generated artifacts with download links | `/artifacts` |
+| `/sendfile <agent> <file> [note]` | Ship a file as an artifact to a remote peer | `/sendfile SecurityAnalyzer reports/latest.md "Please review"` |
+| `/memory status` | Summarise conversational memory, session store, and Cognee directories | `/memory status` |
+| `/memory datasets` | List available Cognee datasets | `/memory datasets` |
+| `/recall <query>` | Search prior conversation context using semantic vectors | `/recall dependency updates` |
+
+## Workflow Automation
+
+```
+You> list available fuzzforge workflows
+Assistant> [returns workflow names, descriptions, and required parameters]
+
+You> run fuzzforge workflow static_analysis_scan on ./backend with target_branch=main
+Assistant> Submits the run, emits TaskStatusUpdateEvent entries, and links the SARIF artifact when complete.
+
+You> show findings for that run once it finishes
+Assistant> Streams the `get_comprehensive_scan_summary` output and attaches the artifact URI.
+```
+
+## Knowledge Graph & Memory Prompts
+
+```
+You> refresh the project knowledge graph for ./backend
+Assistant> Launches `fuzzforge ingest --path ./backend --recursive` and reports file counts.
+
+You> search project knowledge for "prefect readiness" using INSIGHTS
+Assistant> Routes to Cognee via `query_project_knowledge_api` and returns the top matches.
+
+You> recall "api key rotation"
+Assistant> Uses the ADK semantic memory service to surface earlier chat snippets.
+```
+
+## Routing to Specialist Agents
+
+```
+You> ROUTE_TO SecurityAnalyzer: audit this Terraform module for secrets
+Assistant> Delegates the request to `SecurityAnalyzer` using the A2A capability map.
+
+You> sendfile DocumentationAgent docs/runbook.md "Incorporate latest workflow"
+Assistant> Uploads the file as an artifact and notifies the remote agent.
+```
+
+## Prompt Tips
+
+- Use explicit verbs (`list`, `run`, `search`) to trigger the Prefect workflow helpers.
+- Include parameter names inline (`with target_branch=main`) so the executor maps values to MCP tool inputs without additional clarification.
+- When referencing prior runs, reuse the assistant's run IDs or ask for "the last run"; the session store tracks them per context ID.
+- If Cognee is not configured, graph queries return a friendly notice; set `LLM_COGNEE_*` variables to enable full answers.
+- Combine slash commands and natural prompts in the same session; the ADK session service keeps them in a single context thread.
+- `/memory search <query>` is a shortcut for `/recall <query>` if you want status plus recall in one place.
diff --git a/docs/docs/concept/_category_.json b/docs/docs/concept/_category_.json
new file mode 100644
index 0000000..102bb11
--- /dev/null
+++ b/docs/docs/concept/_category_.json
@@ -0,0 +1,8 @@
+{
+  "label": "Concept",
+  "position": 2,
+  "link": {
+    "type": "generated-index",
+    "description": "Concept pages that are understanding-oriented."
+  }
+}
diff --git a/docs/docs/concept/architecture.md b/docs/docs/concept/architecture.md
new file mode 100644
index 0000000..7117477
--- /dev/null
+++ b/docs/docs/concept/architecture.md
@@ -0,0 +1,214 @@
+# Architecture
+
+FuzzForge is a distributed, containerized platform for security analysis workflows. Its architecture is designed for scalability, isolation, and reliability, drawing on modern patterns like microservices and orchestration. This page explains the core architectural concepts behind FuzzForge: what the main components are, how they interact, and why the system is structured this way.
+
+:::warning
+
+FuzzForge's architecture is evolving. While the long-term goal is a hexagonal architecture, the current implementation is still in transition. Expect changes as the platform matures.
+
+:::
+
+---
+
+## Why This Architecture?
+
+FuzzForge's architecture is shaped by several key goals:
+
+- **Scalability:** Handle many workflows in parallel, scaling up or down as needed.
+- **Isolation:** Run each workflow in its own secure environment, minimizing risk.
+- **Reliability:** Ensure that failures in one part of the system don't bring down the whole platform.
+- **Extensibility:** Make it easy to add new workflows, tools, or integrations.
+
+## High-Level System Overview
+
+At a glance, FuzzForge is organized into several layers, each with a clear responsibility:
+
+- **Client Layer:** Where users and external systems interact (CLI, API clients, MCP server).
+- **API Layer:** The FastAPI backend, which exposes REST endpoints and manages requests.
+- **Orchestration Layer:** Prefect server and workers, which schedule and execute workflows.
+- **Execution Layer:** Docker Engine and containers, where workflows actually run.
+- **Storage Layer:** PostgreSQL database, Docker volumes, and a result cache for persistence.
+ +Hereโ€™s a simplified view of how these layers fit together: + +```mermaid +graph TB + subgraph "Client Layer" + CLI[CLI Client] + API_Client[API Client] + MCP[MCP Server] + end + + subgraph "API Layer" + FastAPI[FastAPI Backend] + Router[Route Handlers] + Middleware[Middleware Stack] + end + + subgraph "Orchestration Layer" + Prefect[Prefect Server] + Workers[Prefect Workers] + Scheduler[Workflow Scheduler] + end + + subgraph "Execution Layer" + Docker[Docker Engine] + Containers[Workflow Containers] + Registry[Docker Registry] + end + + subgraph "Storage Layer" + PostgreSQL[PostgreSQL Database] + Volumes[Docker Volumes] + Cache[Result Cache] + end + + CLI --> FastAPI + API_Client --> FastAPI + MCP --> FastAPI + + FastAPI --> Router + Router --> Middleware + Middleware --> Prefect + + Prefect --> Workers + Workers --> Scheduler + Scheduler --> Docker + + Docker --> Containers + Docker --> Registry + Containers --> Volumes + + FastAPI --> PostgreSQL + Workers --> PostgreSQL + Containers --> Cache +``` + +## What Are the Main Components? + +### API Layer + +- **FastAPI Backend:** The main entry point for users and clients. Handles authentication, request validation, and exposes endpoints for workflow management, results, and health checks. +- **Middleware Stack:** Manages API keys, user authentication, CORS, logging, and error handling. + +### Orchestration Layer + +- **Prefect Server:** Schedules and tracks workflows, backed by PostgreSQL. +- **Prefect Workers:** Execute workflows in Docker containers. Can be scaled horizontally. +- **Workflow Scheduler:** Balances load, manages priorities, and enforces resource limits. + +### Execution Layer + +- **Docker Engine:** Runs workflow containers, enforcing isolation and resource limits. +- **Workflow Containers:** Custom images with security tools, mounting code and results volumes. +- **Docker Registry:** Stores and distributes workflow images. + +### Storage Layer + +- **PostgreSQL Database:** Stores workflow metadata, state, and results. +- **Docker Volumes:** Persist workflow results and artifacts. +- **Result Cache:** Speeds up access to recent results, with in-memory and disk persistence. + +## How Does Data Flow Through the System? + +### Submitting a Workflow + +1. **User submits a workflow** via CLI or API client. +2. **API validates** the request and creates a deployment in Prefect. +3. **Prefect schedules** the workflow and assigns it to a worker. +4. **Worker launches a container** to run the workflow. +5. **Results are stored** in Docker volumes and the database. +6. **Status updates** flow back through Prefect and the API to the user. + +```mermaid +sequenceDiagram + participant User + participant API + participant Prefect + participant Worker + participant Container + participant Storage + + User->>API: Submit workflow + API->>API: Validate parameters + API->>Prefect: Create deployment + Prefect->>Worker: Schedule execution + Worker->>Container: Create and start + Container->>Container: Execute security tools + Container->>Storage: Store SARIF results + Worker->>Prefect: Update status + Prefect->>API: Workflow complete + API->>User: Return results +``` + +### Retrieving Results + +1. **User requests status or results** via the API. +2. **API queries the database** for workflow metadata. +3. **If complete,** results are fetched from storage and returned to the user. + +## How Do Services Communicate? 
+ +- **Internally:** FastAPI talks to Prefect via REST; Prefect coordinates with workers over HTTP; workers manage containers via the Docker Engine API. All core services use pooled connections to PostgreSQL. +- **Externally:** Users interact via CLI or API clients (HTTP REST). The MCP server can automate workflows via its own protocol. + +## How Is Security Enforced? + +- **Container Isolation:** Each workflow runs in its own Docker network, as a non-root user, with strict resource limits and only necessary volumes mounted. +- **Volume Security:** Source code is mounted read-only; results are written to dedicated, temporary volumes. +- **API Security:** All endpoints require API keys, validate inputs, enforce rate limits, and log requests for auditing. + +## How Does FuzzForge Scale? + +- **Horizontally:** Add more Prefect workers to handle more workflows in parallel. Scale the database with read replicas and connection pooling. +- **Vertically:** Adjust CPU and memory limits for containers and services as needed. + +Example Docker Compose scaling: +```yaml +services: + prefect-worker: + deploy: + resources: + limits: + memory: 4G + cpus: '2.0' + reservations: + memory: 1G + cpus: '0.5' +``` + +## How Is It Deployed? + +- **Development:** All services run via Docker Composeโ€”backend, Prefect, workers, database, and registry. +- **Production:** Add load balancers, database clustering, and multiple worker instances for high availability. Health checks, metrics, and centralized logging support monitoring and troubleshooting. + +## How Is Configuration Managed? + +- **Environment Variables:** Control core settings like database URLs, registry location, and Prefect API endpoints. +- **Service Discovery:** Docker Composeโ€™s internal DNS lets services find each other by name, with consistent port mapping and health check endpoints. + +Example configuration: +```bash +COMPOSE_PROJECT_NAME=fuzzforge_alpha +DATABASE_URL=postgresql://postgres:postgres@postgres:5432/fuzzforge +PREFECT_API_URL=http://prefect-server:4200/api +DOCKER_REGISTRY=localhost:5001 +DOCKER_INSECURE_REGISTRY=true +``` + +## How Are Failures Handled? + +- **Failure Isolation:** Each service is independent; failures donโ€™t cascade. Circuit breakers and graceful degradation keep the system stable. +- **Recovery:** Automatic retries with backoff for transient errors, dead letter queues for persistent failures, and workflow state recovery after restarts. + +## Implementation Details + +- **Tech Stack:** FastAPI (Python async), Prefect 3.x, Docker, Docker Compose, PostgreSQL (asyncpg), and Docker networking. +- **Performance:** Workflows start in 2โ€“5 seconds; results are retrieved quickly thanks to caching and database indexing. +- **Extensibility:** Add new workflows by deploying new Docker images; extend the API with new endpoints; configure storage backends as needed. + +--- + +## In Summary + +FuzzForgeโ€™s architecture is designed to be robust, scalable, and secureโ€”ready to handle demanding security analysis workflows in a modern, distributed environment. As the platform evolves, expect even more modularity and flexibility, making it easier to adapt to new requirements and technologies. 
diff --git a/docs/docs/concept/concept.tmpl b/docs/docs/concept/concept.tmpl new file mode 100644 index 0000000..b6a7f86 --- /dev/null +++ b/docs/docs/concept/concept.tmpl @@ -0,0 +1,20 @@ +# {Concept Title} + +{Brief introduction of the concept, including its origin and general purpose.} + +## Purpose + +- {The primary purpose and its relevance in its field.} + +## Common Usage + +- {Usage 1}: {Brief description.} +- {Usage 2}: {Brief description.} + +## Benefits + +- {Key benefit and why it's preferred in certain scenarios.} + +## Conclusion + +{Summary of its importance and role in its respective field.} diff --git a/docs/docs/concept/docker-containers.md b/docs/docs/concept/docker-containers.md new file mode 100644 index 0000000..ecc4489 --- /dev/null +++ b/docs/docs/concept/docker-containers.md @@ -0,0 +1,217 @@ +# Docker Containers in FuzzForge: Concept and Design + +Docker containers are at the heart of FuzzForgeโ€™s execution model. They provide the isolation, consistency, and flexibility needed to run security workflows reliablyโ€”no matter where FuzzForge is deployed. This page explains the core concepts behind container usage in FuzzForge, why containers are used, and how they shape the platformโ€™s behavior. + +--- + +## Why Use Docker Containers? + +FuzzForge relies on Docker containers for several key reasons: + +- **Isolation:** Each workflow runs in its own container, so tools and processes canโ€™t interfere with each other or the host. +- **Consistency:** The environment inside a container is always the same, regardless of the underlying system. +- **Security:** Containers restrict access to host resources and run as non-root users. +- **Reproducibility:** Results are deterministic, since the environment is controlled and versioned. +- **Scalability:** Containers can be started, stopped, and scaled up or down as needed. + +--- + +## How Does FuzzForge Use Containers? + +### The Container Model + +Every workflow in FuzzForge is executed inside a Docker container. Hereโ€™s what that means in practice: + +- **Workflow containers** are built from language-specific base images (like Python or Node.js), with security tools and workflow code pre-installed. +- **Infrastructure containers** (API server, Prefect, database) use official images and are configured for the platformโ€™s needs. + +### Container Lifecycle: From Build to Cleanup + +The lifecycle of a workflow container looks like this: + +1. **Image Build:** A Docker image is built with all required tools and code. +2. **Image Push/Pull:** The image is pushed to (and later pulled from) a local or remote registry. +3. **Container Creation:** The container is created with the right volumes and environment. +4. **Execution:** The workflow runs inside the container. +5. **Result Storage:** Results are written to mounted volumes. +6. **Cleanup:** The container and any temporary data are removed. + +```mermaid +graph TB + Build[Build Image] --> Push[Push to Registry] + Push --> Pull[Pull Image] + Pull --> Create[Create Container] + Create --> Mount[Mount Volumes] + Mount --> Start[Start Container] + Start --> Execute[Run Workflow] + Execute --> Results[Store Results] + Execute --> Stop[Stop Container] + Stop --> Cleanup[Cleanup Data] + Cleanup --> Remove[Remove Container] +``` + +--- + +## Whatโ€™s Inside a Workflow Container? + +A typical workflow container is structured like this: + +- **Base Image:** Usually a slim language image (e.g., `python:3.11-slim`). +- **System Dependencies:** Installed as needed (e.g., `git`, `curl`). 
+- **Security Tools:** Pre-installed (e.g., `semgrep`, `bandit`, `safety`). +- **Workflow Code:** Copied into the container. +- **Non-root User:** Created for execution. +- **Entrypoint:** Runs the workflow code. + +Example Dockerfile snippet: + +```dockerfile +FROM python:3.11-slim +RUN apt-get update && apt-get install -y git curl && rm -rf /var/lib/apt/lists/* +RUN pip install semgrep bandit safety +COPY ./toolbox /app/toolbox +WORKDIR /app +RUN useradd -m -u 1000 fuzzforge +USER fuzzforge +CMD ["python", "-m", "toolbox.main"] +``` + +--- + +## How Are Containers Networked and Connected? + +- **Docker Compose Network:** All containers are attached to a custom bridge network for internal communication. +- **Internal DNS:** Services communicate using Docker Compose service names. +- **Port Exposure:** Only necessary ports are exposed to the host. +- **Network Isolation:** Workflow containers are isolated from infrastructure containers when possible. + +Example network config: + +```yaml +networks: + fuzzforge: + driver: bridge + ipam: + config: + - subnet: 172.20.0.0/16 +``` + +--- + +## How Is Data Managed with Volumes? + +### Volume Types + +- **Target Code Volume:** Mounts the code to be analyzed, read-only, into the container. +- **Result Volume:** Stores workflow results and artifacts, persists after container exit. +- **Temporary Volumes:** Used for scratch space, destroyed with the container. + +Example volume mount: + +```yaml +volumes: + - "/host/path/to/code:/app/target:ro" + - "fuzzforge_alpha_prefect_storage:/app/prefect" +``` + +### Volume Security + +- **Read-only Mounts:** Prevent workflows from modifying source code. +- **Isolated Results:** Each workflow writes to its own result directory. +- **No Arbitrary Host Access:** Only explicitly mounted paths are accessible. + +--- + +## How Are Images Built and Managed? + +- **Automated Builds:** Images are built and pushed to a local registry for development, or a secure registry for production. +- **Build Optimization:** Use layer caching, multi-stage builds, and minimal base images. +- **Versioning:** Use tags (`latest`, semantic versions, or SHA digests) to track images. + +Example build and push: + +```bash +docker build -t localhost:5001/fuzzforge-static-analysis:latest . +docker push localhost:5001/fuzzforge-static-analysis:latest +``` + +--- + +## How Are Resources Controlled? + +- **Memory and CPU Limits:** Set per container to prevent resource exhaustion. +- **Resource Monitoring:** Use `docker stats` and platform APIs to track usage. +- **Alerts:** Detect and handle out-of-memory or CPU throttling events. + +Example resource config: + +```yaml +services: + prefect-worker: + deploy: + resources: + limits: + memory: 4G + cpus: '2.0' + reservations: + memory: 1G + cpus: '0.5' +``` + +--- + +## How Is Security Enforced? + +- **Non-root Execution:** Containers run as a dedicated, non-root user. +- **Capability Restrictions:** Drop unnecessary Linux capabilities. +- **Filesystem Protection:** Use read-only filesystems and tmpfs for temporary data. +- **Network Isolation:** Restrict network access to only whatโ€™s needed. +- **No Privileged Mode:** Containers never run with elevated privileges. + +Example security options: + +```yaml +services: + prefect-worker: + security_opt: + - no-new-privileges:true + cap_drop: + - ALL + cap_add: + - CHOWN + - SETGID + - SETUID +``` + +--- + +## How Is Performance Optimized? + +- **Image Layering:** Structure Dockerfiles for efficient caching. 
+- **Dependency Preinstallation:** Reduce startup time by pre-installing dependencies. +- **Warm Containers:** Optionally pre-create containers for faster workflow startup. +- **Horizontal Scaling:** Scale worker containers to handle more workflows in parallel. + +--- + +## How Are Containers Monitored and Debugged? + +- **Health Checks:** Each service and workflow container has a health endpoint or check. +- **Logging:** All container logs are collected and can be accessed via `docker logs` or the FuzzForge API. +- **Debug Access:** Use `docker exec` to access running containers for troubleshooting. +- **Resource Stats:** Monitor with `docker stats` or platform dashboards. + +--- + +## How Does This All Fit Into FuzzForge? + +- **Prefect Workers:** Manage the full lifecycle of workflow containers. +- **API Integration:** Exposes container status, logs, and resource metrics. +- **Volume Management:** Ensures results and artifacts are collected and persisted. +- **Security and Resource Controls:** Enforced automatically for every workflow. + +--- + +## In Summary + +Docker containers are the foundation of FuzzForgeโ€™s execution model. They provide the isolation, security, and reproducibility needed for robust security analysis workflowsโ€”while making it easy to scale, monitor, and extend the platform. diff --git a/docs/docs/concept/fuzzforge-ai.md b/docs/docs/concept/fuzzforge-ai.md new file mode 100644 index 0000000..5ea3127 --- /dev/null +++ b/docs/docs/concept/fuzzforge-ai.md @@ -0,0 +1,83 @@ +# FuzzForge AI: Conceptual Overview + +Welcome to FuzzForge AIโ€”a multi-agent orchestration platform designed to supercharge your intelligent automation, security workflows, and project knowledge management. This document provides a high-level conceptual introduction to what FuzzForge AI is, what problems it solves, and how its architecture enables powerful, context-aware agent collaboration. + +--- + +## What is FuzzForge AI? + +FuzzForge AI is a multi-agent orchestration system that implements the A2A (Agent-to-Agent) protocol for intelligent agent routing, persistent memory management, and project-scoped knowledge graphs. Think of it as an intelligent hub that coordinates a team of specialized agents, each with their own skills, while maintaining context and knowledge across sessions and projects. + +**Key Goals:** +- Seamlessly route requests to the right agent for the job +- Preserve and leverage project-specific knowledge +- Enable secure, auditable, and extensible automation workflows +- Make multi-agent collaboration as easy as talking to a single assistant + +--- + +## Core Concepts + +### 1. **Agent Orchestration** +FuzzForge AI acts as a conductor, automatically routing your requests to the most capable registered agent. Agents can be local or remote, and each advertises its skills and capabilities via the A2A protocol. + +### 2. **Memory & Knowledge Management** +The system features a three-layer memory architecture: +- **Session Persistence:** Keeps track of ongoing sessions and conversations. +- **Semantic Memory:** Archives conversations and enables semantic search. +- **Knowledge Graphs:** Maintains structured, project-scoped knowledge for deep context. + +### 3. **Artifact System** +Artifacts are files or structured content generated, processed, or shared by agents. The artifact system supports creation, storage, and secure sharing of code, configs, reports, and moreโ€”enabling reproducible, auditable workflows. + +### 4. 
**A2A Protocol Compliance** +FuzzForge AI fully implements the A2A (Agent-to-Agent) protocol (spec 0.3.0), ensuring standardized, interoperable communication between agentsโ€”whether they're running locally or across the network. + +--- + +## High-Level Architecture + +Here's how the main components fit together: + +``` +FuzzForge AI System +โ”œโ”€โ”€ CLI Interface (cli.py) +โ”‚ โ”œโ”€โ”€ Commands & Session Management +โ”‚ โ””โ”€โ”€ Agent Registry Persistence +โ”œโ”€โ”€ Agent Core (agent.py) +โ”‚ โ”œโ”€โ”€ Main Coordinator +โ”‚ โ””โ”€โ”€ Memory Manager Integration +โ”œโ”€โ”€ Agent Executor (agent_executor.py) +โ”‚ โ”œโ”€โ”€ Tool Management & Orchestration +โ”‚ โ”œโ”€โ”€ ROUTE_TO Pattern Implementation +โ”‚ โ””โ”€โ”€ Artifact Creation & Management +โ”œโ”€โ”€ Memory Architecture (Three Layers) +โ”‚ โ”œโ”€โ”€ Session Persistence +โ”‚ โ”œโ”€โ”€ Semantic Memory +โ”‚ โ””โ”€โ”€ Knowledge Graphs +โ”œโ”€โ”€ A2A Communication Layer +โ”‚ โ”œโ”€โ”€ Remote Agent Connection +โ”‚ โ”œโ”€โ”€ Agent Card Management +โ”‚ โ””โ”€โ”€ Protocol Compliance +โ””โ”€โ”€ A2A Server (a2a_server.py) + โ”œโ”€โ”€ HTTP/SSE Server + โ”œโ”€โ”€ Artifact HTTP Serving + โ””โ”€โ”€ Task Store & Queue Management +``` + +**How it works:** +1. **User Input:** You interact via CLI or API, using natural language or commands. +2. **Agent Routing:** The system decides whether to handle the request itself or route it to a specialist agent. +3. **Tool Execution:** Built-in and agent-provided tools perform operations. +4. **Memory Integration:** Results and context are stored for future use. +5. **Response Generation:** The system returns results, often with artifacts or actionable insights. + +--- + +## Why FuzzForge AI? + +- **Extensible:** Easily add new agents, tools, and workflows. +- **Context-Aware:** Remembers project history, conversations, and knowledge. +- **Secure:** Project isolation, input validation, and artifact management. +- **Collaborative:** Enables multi-agent workflows and knowledge sharing. +- **Fun & Productive:** Designed to make automation and security tasks less tedious and more interactive. diff --git a/docs/docs/concept/sarif-format.md b/docs/docs/concept/sarif-format.md new file mode 100644 index 0000000..058de88 --- /dev/null +++ b/docs/docs/concept/sarif-format.md @@ -0,0 +1,618 @@ +# SARIF Format + +FuzzForge uses the Static Analysis Results Interchange Format (SARIF) as the standardized output format for all security analysis results. SARIF provides a consistent, machine-readable format that enables tool interoperability and comprehensive result analysis. + +## What is SARIF? + +### Overview + +SARIF (Static Analysis Results Interchange Format) is an OASIS-approved standard (SARIF 2.1.0) designed to standardize the output of static analysis tools. FuzzForge extends this standard to cover dynamic analysis, secret detection, infrastructure analysis, and fuzzing results. 
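+
+For orientation, a minimal but complete SARIF log (one run, one finding) can be as small as the sketch below; the tool name and rule ID are placeholders, and the optional `$schema` pointer simply aids editor validation:
+
+```json
+{
+  "$schema": "https://json.schemastore.org/sarif-2.1.0.json",
+  "version": "2.1.0",
+  "runs": [
+    {
+      "tool": { "driver": { "name": "ExampleTool" } },
+      "results": [
+        {
+          "ruleId": "example.rule",
+          "level": "warning",
+          "message": { "text": "Example finding" }
+        }
+      ]
+    }
+  ]
+}
```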
+ +### Key Benefits + +- **Standardization**: Consistent format across all security tools and workflows +- **Interoperability**: Integration with existing security tools and platforms +- **Rich Metadata**: Comprehensive information about findings, tools, and analysis runs +- **Tool Agnostic**: Works with any security tool that produces structured results +- **IDE Integration**: Native support in modern development environments + +### SARIF Structure + +```json +{ + "version": "2.1.0", + "schema": "https://json.schemastore.org/sarif-2.1.0.json", + "runs": [ + { + "tool": { /* Tool information */ }, + "invocations": [ /* How the tool was run */ ], + "artifacts": [ /* Files analyzed */ ], + "results": [ /* Security findings */ ] + } + ] +} +``` + +## FuzzForge SARIF Implementation + +### Run Structure + +Each FuzzForge workflow produces a SARIF "run" containing: + +```json +{ + "tool": { + "driver": { + "name": "FuzzForge", + "version": "1.0.0", + "informationUri": "https://github.com/FuzzingLabs/fuzzforge", + "organization": "FuzzingLabs", + "rules": [ /* Security rules applied */ ] + }, + "extensions": [ + { + "name": "semgrep", + "version": "1.45.0", + "rules": [ /* Semgrep-specific rules */ ] + } + ] + }, + "invocations": [ + { + "executionSuccessful": true, + "startTimeUtc": "2025-09-25T12:00:00.000Z", + "endTimeUtc": "2025-09-25T12:05:30.000Z", + "workingDirectory": { + "uri": "file:///app/target/" + }, + "commandLine": "python -m toolbox.workflows.static_analysis", + "environmentVariables": { + "WORKFLOW_TYPE": "static_analysis_scan" + } + } + ] +} +``` + +### Result Structure + +Each security finding is represented as a SARIF result: + +```json +{ + "ruleId": "semgrep.security.audit.sqli.pg-sqli", + "ruleIndex": 42, + "level": "error", + "message": { + "text": "Potential SQL injection vulnerability detected" + }, + "locations": [ + { + "physicalLocation": { + "artifactLocation": { + "uri": "src/database/queries.py", + "uriBaseId": "SRCROOT" + }, + "region": { + "startLine": 156, + "startColumn": 20, + "endLine": 156, + "endColumn": 45, + "snippet": { + "text": "cursor.execute(query)" + } + } + } + } + ], + "properties": { + "tool": "semgrep", + "confidence": "high", + "severity": "high", + "cwe": ["CWE-89"], + "owasp": ["A03:2021"], + "references": [ + "https://owasp.org/Top10/A03_2021-Injection/" + ] + } +} +``` + +## Finding Categories and Severity + +### Severity Levels + +FuzzForge maps tool-specific severity levels to SARIF standard levels: + +#### SARIF Level Mapping +- **error**: Critical and High severity findings +- **warning**: Medium severity findings +- **note**: Low severity findings +- **info**: Informational findings + +#### Extended Severity Properties +```json +{ + "properties": { + "severity": "high", // FuzzForge severity + "confidence": "medium", // Tool confidence + "exploitability": "high", // Likelihood of exploitation + "impact": "data_breach" // Potential impact + } +} +``` + +### Vulnerability Classification + +#### CWE (Common Weakness Enumeration) +```json +{ + "properties": { + "cwe": ["CWE-89", "CWE-79"], + "cwe_category": "Injection" + } +} +``` + +#### OWASP Top 10 Mapping +```json +{ + "properties": { + "owasp": ["A03:2021", "A06:2021"], + "owasp_category": "Injection" + } +} +``` + +#### Tool-Specific Classifications +```json +{ + "properties": { + "tool_category": "security", + "rule_type": "semantic_grep", + "finding_type": "sql_injection" + } +} +``` + +## Multi-Tool Result Aggregation + +### Tool Extension Model + +FuzzForge aggregates results 
from multiple tools using SARIF's extension model: + +```json +{ + "tool": { + "driver": { + "name": "FuzzForge", + "version": "1.0.0" + }, + "extensions": [ + { + "name": "semgrep", + "version": "1.45.0", + "guid": "semgrep-extension-guid" + }, + { + "name": "bandit", + "version": "1.7.5", + "guid": "bandit-extension-guid" + } + ] + } +} +``` + +### Result Correlation + +#### Cross-Tool Finding Correlation +```json +{ + "ruleId": "fuzzforge.correlation.sql-injection", + "level": "error", + "message": { + "text": "SQL injection vulnerability confirmed by multiple tools" + }, + "locations": [ /* Primary location */ ], + "relatedLocations": [ /* Additional contexts */ ], + "properties": { + "correlation_id": "corr-001", + "confirming_tools": ["semgrep", "bandit"], + "confidence_score": 0.95, + "aggregated_severity": "critical" + } +} +``` + +#### Finding Relationships +```json +{ + "ruleId": "semgrep.security.audit.xss.direct-use-of-jinja2", + "properties": { + "related_findings": [ + { + "correlation_type": "same_vulnerability_class", + "related_rule": "bandit.B703", + "relationship": "confirms" + }, + { + "correlation_type": "attack_chain", + "related_rule": "nuclei.xss.reflected", + "relationship": "exploits" + } + ] + } +} +``` + +## Workflow-Specific Extensions + +### Static Analysis Results +```json +{ + "properties": { + "analysis_type": "static", + "language": "python", + "complexity_score": 3.2, + "coverage": { + "lines_analyzed": 15420, + "functions_analyzed": 892, + "classes_analyzed": 156 + } + } +} +``` + +### Dynamic Analysis Results +```json +{ + "properties": { + "analysis_type": "dynamic", + "test_method": "web_application_scan", + "target_url": "https://example.com", + "http_method": "POST", + "request_payload": "user_input=", + "response_code": 200, + "exploitation_proof": "alert_box_displayed" + } +} +``` + +### Secret Detection Results +```json +{ + "properties": { + "analysis_type": "secret_detection", + "secret_type": "api_key", + "entropy_score": 4.2, + "commit_hash": "abc123def456", + "commit_date": "2025-09-20T10:30:00Z", + "author": "developer@example.com", + "exposure_duration": "30_days" + } +} +``` + +### Infrastructure Analysis Results +```json +{ + "properties": { + "analysis_type": "infrastructure", + "resource_type": "docker_container", + "policy_violation": "privileged_container", + "compliance_framework": ["CIS", "NIST"], + "remediation_effort": "low", + "deployment_risk": "high" + } +} +``` + +### Fuzzing Results +```json +{ + "properties": { + "analysis_type": "fuzzing", + "fuzzer": "afl++", + "crash_type": "segmentation_fault", + "crash_address": "0x7fff8b2a1000", + "exploitability": "likely_exploitable", + "test_case": "base64:SGVsbG8gV29ybGQ=", + "coverage_achieved": "85%" + } +} +``` + +## SARIF Processing and Analysis + +### Result Filtering + +#### Severity-Based Filtering +```python +def filter_by_severity(sarif_results, min_severity="medium"): + """Filter SARIF results by minimum severity level""" + severity_order = {"info": 0, "note": 1, "warning": 2, "error": 3} + min_level = severity_order.get(min_severity, 1) + + filtered_results = [] + for result in sarif_results["runs"][0]["results"]: + result_level = severity_order.get(result.get("level", "note"), 1) + if result_level >= min_level: + filtered_results.append(result) + + return filtered_results +``` + +#### Rule-Based Filtering +```python +def filter_by_rules(sarif_results, rule_patterns): + """Filter results by rule ID patterns""" + import re + + filtered_results = [] + for result in 
+### Statistical Analysis
+
+#### Severity Distribution
+```python
+def analyze_severity_distribution(sarif_results):
+    """Analyze the distribution of findings by SARIF level."""
+    distribution = {"error": 0, "warning": 0, "note": 0, "none": 0}
+
+    for result in sarif_results["runs"][0]["results"]:
+        level = result.get("level", "note")
+        distribution[level] = distribution.get(level, 0) + 1
+
+    return distribution
+```
+
+#### Tool Coverage Analysis
+```python
+def analyze_tool_coverage(sarif_results):
+    """Analyze which tools contributed findings."""
+    tool_stats = {}
+
+    for result in sarif_results["runs"][0]["results"]:
+        tool = result.get("properties", {}).get("tool", "unknown")
+        if tool not in tool_stats:
+            tool_stats[tool] = {
+                "count": 0,
+                "severities": {"error": 0, "warning": 0, "note": 0, "none": 0},
+            }
+
+        tool_stats[tool]["count"] += 1
+        level = result.get("level", "note")
+        tool_stats[tool]["severities"][level] += 1
+
+    return tool_stats
+```
+
+## SARIF Export and Integration
+
+### Export Formats
+
+#### JSON Export
+```python
+def export_sarif_json(sarif_results, output_path):
+    """Export SARIF results as JSON."""
+    import json
+
+    with open(output_path, 'w') as f:
+        json.dump(sarif_results, f, indent=2, ensure_ascii=False)
+```
+
+#### CSV Export for Spreadsheets
+```python
+def export_sarif_csv(sarif_results, output_path):
+    """Export SARIF results as CSV for spreadsheet analysis."""
+    import csv
+
+    with open(output_path, 'w', newline='') as f:
+        writer = csv.writer(f)
+        writer.writerow(['Rule ID', 'Level', 'Message', 'File', 'Line', 'Tool'])
+
+        for result in sarif_results["runs"][0]["results"]:
+            rule_id = result.get("ruleId", "unknown")
+            level = result.get("level", "note")
+            message = result.get("message", {}).get("text", "")
+            tool = result.get("properties", {}).get("tool", "unknown")
+
+            # One row per location; findings without locations produce no rows.
+            for location in result.get("locations", []):
+                physical_location = location.get("physicalLocation", {})
+                file_path = physical_location.get("artifactLocation", {}).get("uri", "")
+                line = physical_location.get("region", {}).get("startLine", "")
+
+                writer.writerow([rule_id, level, message, file_path, line, tool])
+```
+
+### IDE Integration
+
+#### Visual Studio Code
+SARIF files can be opened directly in VS Code with the SARIF Viewer extension:
+
+```json
+{
+  "recommendations": ["MS-SarifVSCode.sarif-viewer"],
+  "sarif.viewer.connectToGitHub": true,
+  "sarif.viewer.showResultsInExplorer": true
+}
+```
+
+#### GitHub Integration
+GitHub automatically processes SARIF files uploaded through Actions:
+
+```yaml
+- name: Upload SARIF results
+  uses: github/codeql-action/upload-sarif@v3
+  with:
+    sarif_file: fuzzforge-results.sarif
+    category: security-analysis
+```
+
+### API Integration
+
+#### SARIF Result Access
+```python
+# Example: Accessing SARIF results via the FuzzForge API
+async with FuzzForgeClient() as client:
+    # run_id comes from a previously submitted workflow run
+    result = await client.get_workflow_result(run_id)
+
+    # Access SARIF data
+    sarif_data = result["sarif"]
+    findings = sarif_data["runs"][0]["results"]
+
+    # Filter critical findings
+    critical_findings = [
+        f for f in findings
+        if f.get("level") == "error" and
+        f.get("properties", {}).get("severity") == "critical"
+    ]
+```
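+
+Putting these pieces together, here is a short sketch that pulls a run's SARIF via the client above, prints summary statistics, and exports a spreadsheet. It assumes the helper functions defined earlier on this page are in scope:
+
+```python
+async def summarize_run(run_id: str) -> None:
+    """Fetch a run's SARIF, print level counts, and export a CSV."""
+    async with FuzzForgeClient() as client:
+        result = await client.get_workflow_result(run_id)
+
+    sarif_data = result["sarif"]
+    print(analyze_severity_distribution(sarif_data))
+    print(analyze_tool_coverage(sarif_data))
+    export_sarif_csv(sarif_data, "findings.csv")
+```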
+## SARIF Validation and Quality
+
+### Schema Validation
+```python
+import jsonschema
+import requests
+
+def validate_sarif(sarif_data):
+    """Validate SARIF data against the official schema."""
+    schema_url = "https://json.schemastore.org/sarif-2.1.0.json"
+    schema = requests.get(schema_url, timeout=30).json()
+
+    try:
+        jsonschema.validate(sarif_data, schema)
+        return True, "Valid SARIF 2.1.0 format"
+    except jsonschema.ValidationError as e:
+        return False, f"SARIF validation error: {e.message}"
+```
+
+### Quality Metrics
+```python
+def calculate_sarif_quality_metrics(sarif_data):
+    """Calculate quality metrics for SARIF results."""
+    results = sarif_data["runs"][0]["results"]
+
+    metrics = {
+        "total_findings": len(results),
+        "findings_with_location": len([r for r in results if r.get("locations")]),
+        "findings_with_message": len([r for r in results if r.get("message", {}).get("text")]),
+        "findings_with_remediation": len([r for r in results if r.get("fixes")]),
+        "unique_rules": len(set(r.get("ruleId") for r in results)),
+        # calculate_coverage() is a helper assumed to be defined elsewhere
+        # (e.g., analyzed lines / total lines, as a percentage).
+        "coverage_percentage": calculate_coverage(sarif_data)
+    }
+
+    metrics["quality_score"] = (
+        metrics["findings_with_location"] / max(metrics["total_findings"], 1) * 0.3 +
+        metrics["findings_with_message"] / max(metrics["total_findings"], 1) * 0.3 +
+        metrics["findings_with_remediation"] / max(metrics["total_findings"], 1) * 0.2 +
+        min(metrics["coverage_percentage"] / 100, 1.0) * 0.2
+    )
+
+    return metrics
+```
+
+## Advanced SARIF Features
+
+### Fixes and Remediation
+```json
+{
+  "ruleId": "semgrep.security.audit.sqli.pg-sqli",
+  "fixes": [
+    {
+      "description": {
+        "text": "Use parameterized queries to prevent SQL injection"
+      },
+      "artifactChanges": [
+        {
+          "artifactLocation": {
+            "uri": "src/database/queries.py"
+          },
+          "replacements": [
+            {
+              "deletedRegion": {
+                "startLine": 156,
+                "startColumn": 20,
+                "endLine": 156,
+                "endColumn": 45
+              },
+              "insertedContent": {
+                "text": "cursor.execute(query, params)"
+              }
+            }
+          ]
+        }
+      ]
+    }
+  ]
+}
+```
+
+### Code Flows for Complex Vulnerabilities
+```json
+{
+  "ruleId": "dataflow.taint.sql-injection",
+  "codeFlows": [
+    {
+      "message": {
+        "text": "Tainted data flows from user input to SQL query"
+      },
+      "threadFlows": [
+        {
+          "locations": [
+            {
+              "location": {
+                "physicalLocation": {
+                  "artifactLocation": {"uri": "src/api/handlers.py"},
+                  "region": {"startLine": 45}
+                }
+              },
+              "state": {"source": "user_input"},
+              "nestingLevel": 0
+            },
+            {
+              "location": {
+                "physicalLocation": {
+                  "artifactLocation": {"uri": "src/database/queries.py"},
+                  "region": {"startLine": 156}
+                }
+              },
+              "state": {"sink": "sql_query"},
+              "nestingLevel": 0
+            }
+          ]
+        }
+      ]
+    }
+  ]
+}
+```
+
+---
+
+## SARIF Best Practices
+
+### Result Quality
+- **Precise Locations**: Always include accurate file paths and line numbers
+- **Clear Messages**: Write descriptive, actionable finding messages
+- **Remediation Guidance**: Include fix suggestions when possible
+- **Severity Consistency**: Use consistent severity mappings across tools
+
+### Performance
+- **Efficient Processing**: Process SARIF results efficiently for large result sets
+- **Streaming**: Use streaming for very large SARIF files
+- **Caching**: Cache processed results for faster repeated access
+- **Compression**: Compress SARIF files for storage and transmission (see the sketch below)
+
+### Integration
+- **Tool Interoperability**: Ensure SARIF compatibility with existing tools
+- **Standard Compliance**: Follow SARIF 2.1.0 specification precisely
+- **Extension Documentation**: Document any custom extensions clearly
+- **Version Management**: Handle SARIF schema version differences
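+
+As a small illustration of the compression recommendation above, a minimal sketch that gzips a SARIF file for storage or transmission (the file path is illustrative):
+
+```python
+import gzip
+import shutil
+
+def compress_sarif(path: str) -> str:
+    """Write a gzipped copy of a SARIF file and return its path."""
+    gz_path = path + ".gz"
+    with open(path, "rb") as src, gzip.open(gz_path, "wb") as dst:
+        shutil.copyfileobj(src, dst)
+    return gz_path
+```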
diff --git a/docs/docs/concept/security-analysis.md b/docs/docs/concept/security-analysis.md
new file mode 100644
index 0000000..a10033b
--- /dev/null
+++ b/docs/docs/concept/security-analysis.md
@@ -0,0 +1,174 @@
+# Security Analysis in FuzzForge: Concepts and Approach
+
+Security analysis is at the core of FuzzForge's mission. This page explains the philosophy, methodologies, and integration patterns that shape how FuzzForge discovers vulnerabilities and helps teams secure their software. If you're curious about what "security analysis" really means in this platform, and why it's designed this way, read on.
+
+---
+
+## Why Does FuzzForge Approach Security Analysis This Way?
+
+FuzzForge's security analysis is built on a few guiding principles:
+
+- **Defense in Depth:** No single tool or method catches everything. FuzzForge layers multiple analysis types (static, dynamic, secret detection, infrastructure checks, and fuzzing) to maximize coverage.
+- **Tool Diversity:** Different tools find different issues. Running several tools for each analysis type reduces blind spots and increases confidence in results.
+- **Standardized Results:** All findings are normalized into SARIF, a widely adopted format. This makes results easy to aggregate, review, and integrate with other tools.
+- **Automation and Integration:** Security analysis is only useful if it fits into real-world workflows. FuzzForge is designed for CI/CD, developer feedback, and automated reporting.
+
+---
+
+## What Types of Security Analysis Does FuzzForge Perform?
+
+### Static Analysis
+
+- **What it is:** Examines source code without running it, looking for vulnerabilities, anti-patterns, and risky constructs.
+- **How it works:** Parses code, analyzes control and data flow, and matches patterns against known vulnerabilities.
+- **Tools:** Semgrep, Bandit, CodeQL, ESLint, and more.
+- **Strengths:** Fast, broad coverage, no runtime needed.
+- **Limitations:** Can't see runtime issues; may produce false positives.
+
+### Dynamic Analysis
+
+- **What it is:** Tests running applications to find vulnerabilities that only appear at runtime.
+- **How it works:** Deploys the app in a test environment, probes entry points, and observes behavior under attack.
+- **Tools:** Nuclei, OWASP ZAP, Nmap, SQLMap.
+- **Strengths:** Finds real, exploitable issues; validates actual behavior.
+- **Limitations:** Needs a working environment; slower; may not cover all code.
+
+### Secret Detection
+
+- **What it is:** Scans code and configuration for exposed credentials, API keys, and sensitive data.
+- **How it works:** Uses pattern matching, entropy analysis, and context checks, sometimes even scanning git history.
+- **Tools:** TruffleHog, Gitleaks, detect-secrets, GitGuardian.
+- **Strengths:** Fast, critical for preventing leaks.
+- **Limitations:** Can't find encrypted/encoded secrets; needs regular pattern updates.
+
+### Infrastructure Analysis
+
+- **What it is:** Analyzes infrastructure-as-code, container configs, and deployment manifests for security misconfigurations.
+- **How it works:** Parses config files, applies security policies, checks compliance, and assesses risk.
+- **Tools:** Checkov, Hadolint, Kubesec, Terrascan.
+- **Strengths:** Prevents misconfigurations before deployment; automates compliance.
+- **Limitations:** Can't see runtime changes; depends on up-to-date policies.
+
+### Fuzzing
+
+- **What it is:** Automatically generates and sends unexpected or random inputs to code, looking for crashes or unexpected behavior.
+- **How it works:** Identifies targets, generates inputs, monitors execution, and analyzes crashes.
+- **Tools:** AFL++, libFuzzer, Cargo Fuzz, Jazzer.
+- **Strengths:** Finds deep, complex bugs; great for memory safety.
+- **Limitations:** Resource-intensive; may need manual setup.
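+
+To make the idea concrete, here is a minimal coverage-guided harness sketch. It uses Google's Atheris (a Python fuzzer not on the tool list above), and `parse_record` is a hypothetical target function:
+
+```python
+import sys
+
+import atheris
+
+def parse_record(data: bytes) -> None:
+    """Hypothetical target: any uncaught exception counts as a finding."""
+    text = data.decode("utf-8", errors="ignore")
+    fields = text.split(",")
+    if len(fields) > 2 and fields[0] == "id":
+        int(fields[1])  # raises ValueError on malformed input
+
+def TestOneInput(data: bytes) -> None:
+    parse_record(data)
+
+if __name__ == "__main__":
+    atheris.instrument_all()
+    atheris.Setup(sys.argv, TestOneInput)
+    atheris.Fuzz()
+```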
+### Comprehensive Assessment
+
+- **What it is:** Combines all the above for a holistic view, correlating findings and prioritizing risks.
+- **How it works:** Runs multiple analyses, aggregates and correlates results, and generates unified reports.
+- **Benefits:** Complete coverage, better context, prioritized remediation, and compliance support.
+
+---
+
+## How Does FuzzForge Integrate and Orchestrate Analysis?
+
+### Workflow Composition
+
+FuzzForge composes analysis workflows by combining different analysis types, each running in its own containerized environment. Inputs (code, configs, parameters) are fed into the appropriate tools, and results are normalized and aggregated.
+
+```mermaid
+graph TB
+    subgraph "Input"
+        Target[Target Codebase]
+        Config[Analysis Configuration]
+    end
+
+    subgraph "Analysis Workflows"
+        Static[Static Analysis]
+        Dynamic[Dynamic Analysis]
+        Secrets[Secret Detection]
+        Infra[Infrastructure Analysis]
+        Fuzz[Fuzzing Analysis]
+    end
+
+    subgraph "Processing"
+        Normalize[Result Normalization]
+        Merge[Finding Aggregation]
+        Correlate[Cross-Tool Correlation]
+    end
+
+    subgraph "Output"
+        SARIF[SARIF Results]
+        Report[Security Report]
+        Metrics[Analysis Metrics]
+    end
+
+    Target --> Static
+    Target --> Dynamic
+    Target --> Secrets
+    Target --> Infra
+    Target --> Fuzz
+    Config --> Static
+    Config --> Dynamic
+    Config --> Secrets
+    Config --> Infra
+    Config --> Fuzz
+
+    Static --> Normalize
+    Dynamic --> Normalize
+    Secrets --> Normalize
+    Infra --> Normalize
+    Fuzz --> Normalize
+
+    Normalize --> Merge
+    Merge --> Correlate
+    Correlate --> SARIF
+    Correlate --> Report
+    Correlate --> Metrics
+```
+
+### Orchestration Patterns
+
+- **Parallel Execution:** Tools of the same type (e.g., multiple static analyzers) run in parallel for speed and redundancy.
+- **Sequential Execution:** Some analyses depend on previous results (e.g., dynamic analysis using endpoints found by static analysis).
+- **Result Normalization:** All findings are converted to SARIF for consistency.
+- **Correlation:** Related findings from different tools are grouped and prioritized.
+
+---
+
+## How Is Quality Ensured?
+
+### Metrics and Measurement
+
+- **Coverage:** How much code, how many rules, and how many vulnerability types are analyzed.
+- **Accuracy:** False positive/negative rates, confidence scores, and validation rates.
+- **Performance:** Analysis duration, resource usage, and scalability.
+
+### Quality Assurance
+
+- **Cross-Tool Validation:** Findings are confirmed by multiple tools when possible.
+- **Manual Review:** High-severity findings can be flagged for expert review.
+- **Continuous Improvement:** Tools and rules are updated regularly, and user feedback is incorporated.
+
+---
+
+## How Does Security Analysis Fit Into Development Workflows?
+
+### CI/CD Integration
+
+- **Pre-commit Hooks:** Run security checks before code is committed.
+- **Pipeline Integration:** Block deployments if high/critical issues are found.
+- **Quality Gates:** Enforce severity thresholds and track trends over time (a minimal gate script is sketched at the end of this section).
+
+### Developer Experience
+
+- **IDE Integration:** Import SARIF findings into supported IDEs for inline feedback.
+- **Real-Time Analysis:** Optionally run background checks during development.
+- **Reporting:** Executive dashboards, technical reports, and compliance summaries.
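+
+As a minimal illustration of a severity gate, here is a sketch of a script that fails a pipeline when a SARIF file contains error-level findings. The file name and the blocking threshold are illustrative, not a FuzzForge-provided tool:
+
+```python
+import json
+import sys
+
+BLOCKING_LEVELS = frozenset({"error"})
+
+def gate(sarif_path):
+    """Return exit code 1 if any finding is at a blocking level, else 0."""
+    with open(sarif_path) as f:
+        log = json.load(f)
+
+    blocked = [
+        result
+        for run in log.get("runs", [])
+        for result in run.get("results", [])
+        if result.get("level") in BLOCKING_LEVELS
+    ]
+    for result in blocked:
+        message = result.get("message", {}).get("text", "")
+        print(f"[blocked] {result.get('ruleId')}: {message}")
+    return 1 if blocked else 0
+
+if __name__ == "__main__":
+    path = sys.argv[1] if len(sys.argv) > 1 else "fuzzforge-results.sarif"
+    sys.exit(gate(path))
+```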
+
+---
+
+## What's Next for Security Analysis in FuzzForge?
+
+FuzzForge is designed to evolve. Advanced techniques like machine learning for pattern recognition, contextual analysis, and business logic checks are on the roadmap. The goal: keep raising the bar for automated, actionable, and developer-friendly security analysis.
+
+---
+
+## In Summary
+
+FuzzForge's security analysis is comprehensive, layered, and designed for real-world integration. By combining multiple analysis types, normalizing results, and focusing on automation and developer experience, FuzzForge helps teams find and fix vulnerabilities before attackers do.
diff --git a/docs/docs/concept/workflow.md b/docs/docs/concept/workflow.md
new file mode 100644
index 0000000..d46f9cc
--- /dev/null
+++ b/docs/docs/concept/workflow.md
@@ -0,0 +1,128 @@
+# Understanding Workflows in FuzzForge
+
+Workflows are the backbone of FuzzForge's security analysis platform. If you want to get the most out of FuzzForge, it's essential to understand what workflows are, how they're designed, and how they operate from start to finish. This page explains the core concepts, design principles, and execution models behind FuzzForge workflows, so you can use them confidently and effectively.
+
+---
+
+## What Is a Workflow?
+
+A **workflow** in FuzzForge is a containerized process that orchestrates one or more security tools to analyze a target codebase or application. Each workflow is tailored for a specific type of security analysis (like static analysis, secret detection, or fuzzing) and is designed to be:
+
+- **Isolated:** Runs in its own Docker container for security and reproducibility.
+- **Integrated:** Can combine multiple tools for comprehensive results.
+- **Standardized:** Always produces SARIF-compliant output.
+- **Configurable:** Accepts parameters to customize analysis.
+- **Scalable:** Can run in parallel and scale horizontally.
+
+---
+
+## How Does a Workflow Operate?
+
+### High-Level Architecture
+
+Here's how a workflow moves through the FuzzForge system:
+
+```mermaid
+graph TB
+    User[User/CLI/API] --> API[FuzzForge API]
+    API --> Prefect[Prefect Orchestrator]
+    Prefect --> Worker[Prefect Worker]
+    Worker --> Container[Docker Container]
+    Container --> Tools[Security Tools]
+    Tools --> Results[SARIF Results]
+    Results --> Storage[Persistent Storage]
+```
+
+**Key roles:**
+- **User/CLI/API:** Submits and manages workflows.
+- **FuzzForge API:** Validates, orchestrates, and tracks workflows.
+- **Prefect Orchestrator:** Schedules and manages workflow execution.
+- **Prefect Worker:** Runs the workflow in a Docker container.
+- **Security Tools:** Perform the actual analysis.
+- **Persistent Storage:** Stores results and artifacts.
+
+---
+
+## Workflow Lifecycle: From Idea to Results
+
+1. **Design:** Choose tools, define integration logic, set up parameters, and build the Docker image.
+2. **Deployment:** Build and push the image, register the workflow, and configure defaults.
+3. **Execution:** User submits a workflow; parameters and target are validated; the workflow is scheduled and executed in a container; tools run as designed.
+4. **Completion:** Results are collected, normalized, and stored; status is updated; temporary resources are cleaned up; results are made available via API/CLI.
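+
+A sketch of what this lifecycle looks like from the client side, reusing the `FuzzForgeClient` shown in the SARIF reference. The `submit_workflow` method, the `status` field, and the polling interval are assumptions for illustration; consult the API reference for the actual client surface:
+
+```python
+import asyncio
+
+async def run_and_wait(workflow: str, target: str) -> dict:
+    """Submit a workflow, then poll until results are available."""
+    async with FuzzForgeClient() as client:
+        run_id = await client.submit_workflow(workflow, target)  # hypothetical method
+        while True:
+            result = await client.get_workflow_result(run_id)
+            if result.get("status") in {"completed", "failed"}:
+                return result
+            await asyncio.sleep(10)  # illustrative polling interval
+```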
+
+---
+
+## Types of Workflows
+
+FuzzForge supports several workflow types, each optimized for a specific security need:
+
+- **Static Analysis:** Examines source code without running it (e.g., Semgrep, Bandit).
+- **Dynamic Analysis:** Tests running applications for runtime vulnerabilities (e.g., OWASP ZAP, Nuclei).
+- **Secret Detection:** Finds exposed credentials and sensitive data (e.g., TruffleHog, Gitleaks).
+- **Infrastructure Analysis:** Checks infrastructure-as-code and configs for misconfigurations (e.g., Checkov, Hadolint).
+- **Fuzzing:** Generates unexpected inputs to find crashes and edge cases (e.g., AFL++, libFuzzer).
+- **Comprehensive Assessment:** Combines multiple analysis types for full coverage.
+
+---
+
+## Workflow Design Principles
+
+- **Tool Agnostic:** Workflows abstract away the specifics of underlying tools, providing a consistent interface.
+- **Fail-Safe Execution:** If one tool fails, others continue; partial results are still valuable.
+- **Configurable:** Users can adjust parameters to control tool behavior, output, and execution.
+- **Resource-Aware:** Workflows specify and respect resource limits (CPU, memory).
+- **Standardized Output:** All results are normalized to SARIF for easy integration and reporting.
+
+---
+
+## Execution Models
+
+- **Synchronous:** Wait for the workflow to finish and get results immediately; great for interactive use.
+- **Asynchronous:** Submit a workflow and check back later for results; ideal for long-running or batch jobs.
+- **Parallel:** Run multiple workflows at once for comprehensive or time-sensitive analysis.
+
+---
+
+## Data Flow and Storage
+
+- **Input:** Target code and parameters are validated and mounted as read-only volumes.
+- **Processing:** Tools are initialized and run (often in parallel); outputs are collected and normalized.
+- **Output:** Results are stored in persistent volumes and indexed for fast retrieval; metadata is saved in the database; intermediate results may be cached for performance.
+
+---
+
+## Error Handling and Recovery
+
+- **Tool-Level:** Timeouts, resource exhaustion, and crashes are handled gracefully; failed tools don't stop the workflow.
+- **Workflow-Level:** Container failures, volume issues, and network problems are detected and reported.
+- **Recovery:** Automatic retries for transient errors; partial results are returned when possible; workflows degrade gracefully if some tools are unavailable.
+
+---
+
+## Performance and Optimization
+
+- **Container Efficiency:** Docker images are layered and cached for fast startup; containers may be reused when safe.
+- **Parallel Processing:** Independent tools run concurrently to maximize CPU usage and minimize wait times.
+- **Caching:** Images, dependencies, and intermediate results are cached to avoid unnecessary recomputation.
+
+---
+
+## Monitoring and Observability
+
+- **Metrics:** Track execution time, resource usage, and success/failure rates.
+- **Logging:** Structured logs and tool outputs are captured for debugging and analysis.
+- **Real-Time Monitoring:** Live status updates and progress indicators are available via API/WebSocket.
+
+---
+
+## Integration Patterns
+
+- **CI/CD:** Integrate workflows into pipelines to block deployments on critical findings.
+- **API:** Programmatically submit and track workflows from your own tools or scripts.
+- **Event-Driven:** Use webhooks or event listeners to trigger actions on workflow completion (sketched below).
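+
+As an illustration of the event-driven pattern, here is a minimal FastAPI receiver sketch. The endpoint path, the event payload fields (`status`, `critical_findings`, `run_id`), and the notification helper are all hypothetical:
+
+```python
+from fastapi import FastAPI, Request
+
+app = FastAPI()
+
+def notify_security_team(event: dict) -> None:
+    """Placeholder: open a ticket or post to a chat channel."""
+    print(f"Critical findings in run {event.get('run_id')}")
+
+@app.post("/hooks/fuzzforge")
+async def on_workflow_complete(request: Request):
+    """Receive a completion event; the payload shape is hypothetical."""
+    event = await request.json()
+    if event.get("status") == "completed" and event.get("critical_findings", 0) > 0:
+        notify_security_team(event)
+    return {"ok": True}
+```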
+
+---
+
+## In Summary
+
+Workflows in FuzzForge are designed to be robust, flexible, and easy to integrate into your security and development processes. By combining containerization, orchestration, and a standardized interface, FuzzForge workflows help you automate and scale security analysis, so you can focus on fixing issues, not just finding them.
diff --git a/docs/docs/concept/working-with-documentation.md b/docs/docs/concept/working-with-documentation.md
new file mode 100644
index 0000000..7e72ed8
--- /dev/null
+++ b/docs/docs/concept/working-with-documentation.md
@@ -0,0 +1,72 @@
+# Working with documentation
+
+To update the documentation for any of the sections, just add a new markdown file to the designated subfolder below:
+
+```
+├─concepts
+├─tutorials
+├─how-to
+│ └─troubleshooting
+└─reference
+  ├─architecture
+  ├─decisions
+  └─faq
+```
+
+:::note Templates
+
+Each folder contains templates that can be used as quickstarts. Those are named `