Mirror of https://github.com/FuzzingLabs/fuzzforge_ai.git (synced 2026-02-12 22:32:45 +00:00)
* feat: Complete migration from Prefect to Temporal

  BREAKING CHANGE: Replaces Prefect workflow orchestration with Temporal

  ## Major Changes
  - Replace Prefect with Temporal for workflow orchestration
  - Implement vertical worker architecture (rust, android)
  - Replace Docker registry with MinIO for unified storage
  - Refactor activities to be co-located with workflows
  - Update all API endpoints for Temporal compatibility

  ## Infrastructure
  - New: docker-compose.temporal.yaml (Temporal + MinIO + workers)
  - New: workers/ directory with rust and android vertical workers
  - New: backend/src/temporal/ (manager, discovery)
  - New: backend/src/storage/ (S3-cached storage with MinIO)
  - New: backend/toolbox/common/ (shared storage activities)
  - Deleted: docker-compose.yaml (old Prefect setup)
  - Deleted: backend/src/core/prefect_manager.py
  - Deleted: backend/src/services/prefect_stats_monitor.py
  - Deleted: Docker registry and insecure-registries requirement

  ## Workflows
  - Migrated: security_assessment workflow to Temporal
  - New: rust_test workflow (example/test workflow)
  - Deleted: secret_detection_scan (Prefect-based, to be reimplemented)
  - Activities now co-located with workflows for independent testing

  ## API Changes
  - Updated: backend/src/api/workflows.py (Temporal submission)
  - Updated: backend/src/api/runs.py (Temporal status/results)
  - Updated: backend/src/main.py (727 lines, TemporalManager integration)
  - Updated: All 16 MCP tools to use TemporalManager

  ## Testing
  - ✅ All services healthy (Temporal, PostgreSQL, MinIO, workers, backend)
  - ✅ All API endpoints functional
  - ✅ End-to-end workflow test passed (72 findings from vulnerable_app)
  - ✅ MinIO storage integration working (target upload/download, results)
  - ✅ Worker activity discovery working (6 activities registered)
  - ✅ Tarball extraction working
  - ✅ SARIF report generation working

  ## Documentation
  - ARCHITECTURE.md: Complete Temporal architecture documentation
  - QUICKSTART_TEMPORAL.md: Getting started guide
  - MIGRATION_DECISION.md: Why we chose Temporal over Prefect
  - IMPLEMENTATION_STATUS.md: Migration progress tracking
  - workers/README.md: Worker development guide

  ## Dependencies
  - Added: temporalio>=1.6.0
  - Added: boto3>=1.34.0 (MinIO S3 client)
  - Removed: prefect>=3.4.18

* feat: Add Python fuzzing vertical with Atheris integration

  This commit implements a complete Python fuzzing workflow using Atheris:

  ## Python Worker (workers/python/)
  - Dockerfile with Python 3.11, Atheris, and build tools
  - Generic worker.py for dynamic workflow discovery
  - requirements.txt with temporalio, boto3, atheris dependencies
  - Added to docker-compose.temporal.yaml with dedicated cache volume

  ## AtherisFuzzer Module (backend/toolbox/modules/fuzzer/)
  - Reusable module extending BaseModule
  - Auto-discovers fuzz targets (fuzz_*.py, *_fuzz.py, fuzz_target.py)
  - Recursive search to find targets in nested directories
  - Dynamically loads TestOneInput() function
  - Configurable max_iterations and timeout
  - Real-time stats callback support for live monitoring
  - Returns findings as ModuleFinding objects

  ## Atheris Fuzzing Workflow (backend/toolbox/workflows/atheris_fuzzing/)
  - Temporal workflow for orchestrating fuzzing
  - Downloads user code from MinIO
  - Executes AtherisFuzzer module
  - Uploads results to MinIO
  - Cleans up cache after execution
  - metadata.yaml with vertical: python for routing

  ## Test Project (test_projects/python_fuzz_waterfall/)
  - Demonstrates stateful waterfall vulnerability
  - main.py with check_secret() that leaks progress
  - fuzz_target.py with Atheris TestOneInput() harness
  - Complete README with usage instructions

  ## Backend Fixes
  - Fixed parameter merging in REST API endpoints (workflows.py)
  - Changed workflow parameter passing from positional args to kwargs (manager.py)
  - Default parameters now properly merged with user parameters

  ## Testing
  ✅ Worker discovered AtherisFuzzingWorkflow
  ✅ Workflow executed end-to-end successfully
  ✅ Fuzz target auto-discovered in nested directories
  ✅ Atheris ran 100,000 iterations
  ✅ Results uploaded and cache cleaned
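The harness contract Atheris expects is a module-level `TestOneInput(data: bytes)` entry point. As a minimal sketch (not the repo's exact file), a `fuzz_target.py` for the waterfall test project might look like this; the `check_secret` import assumes the layout described above, and the consumed-string length is illustrative:

```python
import sys

import atheris

# Instrument imports so Atheris collects coverage from the code under test
with atheris.instrument_imports():
    from main import check_secret  # the test project's progress-leaking checker

def TestOneInput(data: bytes):
    # Atheris calls this once per mutated input; uncaught exceptions become findings
    fdp = atheris.FuzzedDataProvider(data)
    check_secret(fdp.ConsumeUnicodeNoSurrogates(64))

if __name__ == "__main__":
    atheris.Setup(sys.argv, TestOneInput)
    atheris.Fuzz()
```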
* chore: Complete Temporal migration with updated CLI/SDK/docs

  This commit includes all remaining Temporal migration changes:

  ## CLI Updates (cli/)
  - Updated workflow execution commands for Temporal
  - Enhanced error handling and exceptions
  - Updated dependencies in uv.lock

  ## SDK Updates (sdk/)
  - Client methods updated for Temporal workflows
  - Updated models for new workflow execution
  - Updated dependencies in uv.lock

  ## Documentation Updates (docs/)
  - Architecture documentation for Temporal
  - Workflow concept documentation
  - Resource management documentation (new)
  - Debugging guide (new)
  - Updated tutorials and how-to guides
  - Troubleshooting updates

  ## README Updates
  - Main README with Temporal instructions
  - Backend README
  - CLI README
  - SDK README

  ## Other
  - Updated IMPLEMENTATION_STATUS.md
  - Removed old vulnerable_app.tar.gz

  These changes complete the Temporal migration and ensure the CLI/SDK work correctly with the new backend.

* fix: Use positional args instead of kwargs for Temporal workflows

  The Temporal Python SDK's start_workflow() method doesn't accept a 'kwargs' parameter. Workflows must receive parameters as positional arguments via the 'args' parameter.

  Changed to:
      args=workflow_args  # positional arguments

  This fixes the error:
      TypeError: Client.start_workflow() got an unexpected keyword argument 'kwargs'

  Workflows now correctly receive parameters in order:
  - security_assessment: [target_id, scanner_config, analyzer_config, reporter_config]
  - atheris_fuzzing: [target_id, target_file, max_iterations, timeout_seconds]
  - rust_test: [target_id, test_message]

* fix: Filter metadata-only parameters from workflow arguments

  SecurityAssessmentWorkflow was receiving 7 arguments instead of 2-5. The issue was that target_path and volume_mode from default_parameters were being passed to the workflow, when they should only be used by the system for configuration.

  Now filters out metadata-only parameters (target_path, volume_mode) before passing arguments to workflow execution.

* refactor: Remove Prefect leftovers and volume mounting legacy

  Complete cleanup of Prefect migration artifacts:

  Backend:
  - Delete registry.py and workflow_discovery.py (Prefect-specific files)
  - Remove Docker validation from setup.py (no longer needed)
  - Remove ResourceLimits and VolumeMount models
  - Remove target_path and volume_mode from WorkflowSubmission
  - Remove supported_volume_modes from API and discovery
  - Clean up metadata.yaml files (remove volume/path fields)
  - Simplify parameter filtering in manager.py

  SDK:
  - Remove volume_mode parameter from client methods
  - Remove ResourceLimits and VolumeMount models
  - Remove Prefect error patterns from docker_logs.py
  - Clean up WorkflowSubmission and WorkflowMetadata models

  CLI:
  - Remove Volume Modes display from workflow info

  All removed features are Prefect-specific or Docker volume mounting artifacts. Temporal workflows use MinIO storage exclusively.
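For context on the positional-args fix above: `temporalio.client.Client.start_workflow()` accepts multiple workflow arguments only through its `args` parameter. A minimal sketch of the corrected submission path, using the atheris_fuzzing argument order listed above (the workflow ID, task queue, and parameter values are illustrative, not the exact manager.py code):

```python
from temporalio.client import Client

async def submit_atheris_run(target_id: str) -> str:
    client = await Client.connect("localhost:7233")
    handle = await client.start_workflow(
        "AtherisFuzzingWorkflow",                           # workflow type name
        args=[target_id, "fuzz_target.py", 100_000, 600],   # positional, in declared order
        id=f"atheris-{target_id}",
        task_queue="python-queue",
    )
    return handle.id
```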
* feat: Add comprehensive test suite and benchmark infrastructure

  - Add 68 unit tests for fuzzer, scanner, and analyzer modules
  - Implement pytest-based test infrastructure with fixtures
  - Add 6 performance benchmarks with category-specific thresholds
  - Configure GitHub Actions for automated testing and benchmarking
  - Add test and benchmark documentation

  Test coverage:
  - AtherisFuzzer: 8 tests
  - CargoFuzzer: 14 tests
  - FileScanner: 22 tests
  - SecurityAnalyzer: 24 tests

  All tests passing (68/68)
  All benchmarks passing (6/6)

* fix: Resolve all ruff linting violations across codebase

  Fixed 27 ruff violations in 12 files:
  - Removed unused imports (Depends, Dict, Any, Optional, etc.)
  - Fixed undefined workflow_info variable in workflows.py
  - Removed dead code with undefined variables in atheris_fuzzer.py
  - Changed f-string to regular string where no placeholders used

  All files now pass ruff checks for CI/CD compliance.

* fix: Configure CI for unit tests only

  - Renamed docker-compose.temporal.yaml → docker-compose.yml for CI compatibility
  - Commented out integration-tests job (no integration tests yet)
  - Updated test-summary to only depend on lint and unit-tests

  CI will now run successfully with 68 unit tests. Integration tests can be added later.

* feat: Add CI/CD integration with ephemeral deployment model

  Implements comprehensive CI/CD support for FuzzForge with on-demand worker management:

  **Worker Management (v0.7.0)**
  - Add WorkerManager for automatic worker lifecycle control
  - Auto-start workers from stopped state when workflows execute
  - Auto-stop workers after workflow completion
  - Health checks and startup timeout handling (90s default)

  **CI/CD Features**
  - `--fail-on` flag: Fail builds based on SARIF severity levels (error/warning/note/info)
  - `--export-sarif` flag: Export findings in SARIF 2.1.0 format
  - `--auto-start`/`--auto-stop` flags: Control worker lifecycle
  - Exit code propagation: Returns 1 on blocking findings, 0 on success

  **Exit Code Fix**
  - Add `except typer.Exit: raise` handlers at 3 critical locations
  - Move worker cleanup to finally block for guaranteed execution
  - Exit codes now propagate correctly even when build fails

  **CI Scripts & Examples**
  - ci-start.sh: Start FuzzForge services with health checks
  - ci-stop.sh: Clean shutdown with volume preservation option
  - GitHub Actions workflow example (security-scan.yml)
  - GitLab CI pipeline example (.gitlab-ci.example.yml)
  - docker-compose.ci.yml: CI-optimized compose file with profiles

  **OSS-Fuzz Integration**
  - New ossfuzz_campaign workflow for running OSS-Fuzz projects
  - OSS-Fuzz worker with Docker-in-Docker support
  - Configurable campaign duration and project selection

  **Documentation**
  - Comprehensive CI/CD integration guide (docs/how-to/cicd-integration.md)
  - Updated architecture docs with worker lifecycle details
  - Updated workspace isolation documentation
  - CLI README with worker management examples

  **SDK Enhancements**
  - Add get_workflow_worker_info() endpoint
  - Worker vertical metadata in workflow responses

  **Testing**
  - All workflows tested: security_assessment, atheris_fuzzing, secret_detection, cargo_fuzzing
  - All monitoring commands tested: stats, crashes, status, finding
  - Full CI pipeline simulation verified
  - Exit codes verified for success/failure scenarios

  Ephemeral CI/CD model: ~3-4GB RAM, ~60-90s startup, runs entirely in CI containers.
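The `--fail-on` gate described above reduces to comparing each SARIF result's `level` against a severity threshold and propagating an exit code. A minimal standalone sketch under that reading (the severity ordering follows the flag's documented levels, and the CLI's actual implementation may differ):

```python
import json
import sys

# Least to most severe, per the --fail-on flag's documented levels
SEVERITY_ORDER = ["info", "note", "warning", "error"]

def has_blocking_findings(sarif_path: str, fail_on: str) -> bool:
    """Return True if any SARIF result meets or exceeds the fail-on level."""
    threshold = SEVERITY_ORDER.index(fail_on)
    with open(sarif_path) as f:
        sarif = json.load(f)
    for run in sarif.get("runs", []):
        for result in run.get("results", []):
            level = result.get("level", "warning")  # SARIF's default level
            if level in SEVERITY_ORDER and SEVERITY_ORDER.index(level) >= threshold:
                return True
    return False

if __name__ == "__main__":
    # Exit 1 on blocking findings, 0 on success — the CLI's documented contract
    sys.exit(1 if has_blocking_findings(sys.argv[1], sys.argv[2]) else 0)
```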
* fix: Resolve ruff linting violations in CI/CD code

  - Remove unused variables (run_id, defaults, result)
  - Remove unused imports
  - Fix f-string without placeholders

  All CI/CD integration files now pass ruff checks.
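To illustrate the style of unit test the suite earlier in this series adds, here is a minimal pytest sketch of the AtherisFuzzer's documented target-discovery rule (fuzz_*.py, *_fuzz.py, fuzz_target.py); `is_fuzz_target` is a hypothetical stand-in, not the module's actual API:

```python
import fnmatch

import pytest

def is_fuzz_target(filename: str) -> bool:
    """Hypothetical helper mirroring AtherisFuzzer's discovery patterns."""
    patterns = ("fuzz_*.py", "*_fuzz.py", "fuzz_target.py")
    return any(fnmatch.fnmatch(filename, p) for p in patterns)

@pytest.mark.parametrize("name,expected", [
    ("fuzz_parser.py", True),
    ("parser_fuzz.py", True),
    ("fuzz_target.py", True),
    ("main.py", False),
])
def test_fuzz_target_discovery(name, expected):
    assert is_fuzz_target(name) == expected
```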
310 lines · 10 KiB · Python
"""
|
|
FuzzForge Vertical Worker: Rust/Native Security
|
|
|
|
This worker:
|
|
1. Discovers workflows for the 'rust' vertical from mounted toolbox
|
|
2. Dynamically imports and registers workflow classes
|
|
3. Connects to Temporal and processes tasks
|
|
4. Handles activities for target download/upload from MinIO
|
|
"""
|
|
|
|
import asyncio
import importlib
import inspect
import logging
import os
import sys
from pathlib import Path
from typing import List, Any

import yaml
from temporalio.client import Client
from temporalio.worker import Worker

# Add toolbox to path for workflow and activity imports
sys.path.insert(0, '/app/toolbox')

# Import common storage activities
from toolbox.common.storage_activities import (
    get_target_activity,
    cleanup_cache_activity,
    upload_results_activity
)

# Configure logging
logging.basicConfig(
    level=os.getenv('LOG_LEVEL', 'INFO'),
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)

async def discover_workflows(vertical: str) -> List[Any]:
    """
    Discover workflows for this vertical from mounted toolbox.

    Args:
        vertical: The vertical name (e.g., 'rust', 'android', 'web')

    Returns:
        List of workflow classes decorated with @workflow.defn
    """
    workflows = []
    toolbox_path = Path("/app/toolbox/workflows")

    if not toolbox_path.exists():
        logger.warning(f"Toolbox path does not exist: {toolbox_path}")
        return workflows

    logger.info(f"Scanning for workflows in: {toolbox_path}")

    for workflow_dir in toolbox_path.iterdir():
        if not workflow_dir.is_dir():
            continue

        # Skip special directories
        if workflow_dir.name.startswith('.') or workflow_dir.name == '__pycache__':
            continue

        metadata_file = workflow_dir / "metadata.yaml"
        if not metadata_file.exists():
            logger.debug(f"No metadata.yaml in {workflow_dir.name}, skipping")
            continue

        try:
            # Parse metadata
            with open(metadata_file) as f:
                metadata = yaml.safe_load(f)

            # Check if workflow is for this vertical
            workflow_vertical = metadata.get("vertical")
            if workflow_vertical != vertical:
                logger.debug(
                    f"Workflow {workflow_dir.name} is for vertical '{workflow_vertical}', "
                    f"not '{vertical}', skipping"
                )
                continue

            # Check if workflow.py exists
            workflow_file = workflow_dir / "workflow.py"
            if not workflow_file.exists():
                logger.warning(
                    f"Workflow {workflow_dir.name} has metadata but no workflow.py, skipping"
                )
                continue

            # Dynamically import workflow module
            module_name = f"toolbox.workflows.{workflow_dir.name}.workflow"
            logger.info(f"Importing workflow module: {module_name}")

            try:
                module = importlib.import_module(module_name)
            except Exception as e:
                logger.error(
                    f"Failed to import workflow module {module_name}: {e}",
                    exc_info=True
                )
                continue

            # Find @workflow.defn decorated classes
            found_workflows = False
            for name, obj in inspect.getmembers(module, inspect.isclass):
                # Check if class has Temporal workflow definition
                if hasattr(obj, '__temporal_workflow_definition'):
                    workflows.append(obj)
                    found_workflows = True
                    logger.info(
                        f"✓ Discovered workflow: {name} from {workflow_dir.name} "
                        f"(vertical: {vertical})"
                    )

            if not found_workflows:
                logger.warning(
                    f"Workflow {workflow_dir.name} has no @workflow.defn decorated classes"
                )

        except Exception as e:
            logger.error(
                f"Error processing workflow {workflow_dir.name}: {e}",
                exc_info=True
            )
            continue

    logger.info(f"Discovered {len(workflows)} workflows for vertical '{vertical}'")
    return workflows

async def discover_activities(workflows_dir: Path) -> List[Any]:
    """
    Discover activities from workflow directories.

    Looks for activities.py files alongside workflow.py in each workflow directory.

    Args:
        workflows_dir: Path to workflows directory

    Returns:
        List of activity functions decorated with @activity.defn
    """
    activities = []

    if not workflows_dir.exists():
        logger.warning(f"Workflows directory does not exist: {workflows_dir}")
        return activities

    logger.info(f"Scanning for workflow activities in: {workflows_dir}")

    for workflow_dir in workflows_dir.iterdir():
        if not workflow_dir.is_dir():
            continue

        # Skip special directories
        if workflow_dir.name.startswith('.') or workflow_dir.name == '__pycache__':
            continue

        # Check if activities.py exists
        activities_file = workflow_dir / "activities.py"
        if not activities_file.exists():
            logger.debug(f"No activities.py in {workflow_dir.name}, skipping")
            continue

        try:
            # Dynamically import activities module
            module_name = f"toolbox.workflows.{workflow_dir.name}.activities"
            logger.info(f"Importing activities module: {module_name}")

            try:
                module = importlib.import_module(module_name)
            except Exception as e:
                logger.error(
                    f"Failed to import activities module {module_name}: {e}",
                    exc_info=True
                )
                continue

            # Find @activity.defn decorated functions
            found_activities = False
            for name, obj in inspect.getmembers(module, inspect.isfunction):
                # Check if function has Temporal activity definition
                if hasattr(obj, '__temporal_activity_definition'):
                    activities.append(obj)
                    found_activities = True
                    logger.info(
                        f"✓ Discovered activity: {name} from {workflow_dir.name}"
                    )

            if not found_activities:
                logger.warning(
                    f"Workflow {workflow_dir.name} has activities.py but no @activity.defn decorated functions"
                )

        except Exception as e:
            logger.error(
                f"Error processing activities from {workflow_dir.name}: {e}",
                exc_info=True
            )
            continue

    logger.info(f"Discovered {len(activities)} workflow-specific activities")
    return activities

async def main():
    """Main worker entry point"""
    # Get configuration from environment
    vertical = os.getenv("WORKER_VERTICAL", "rust")
    temporal_address = os.getenv("TEMPORAL_ADDRESS", "localhost:7233")
    temporal_namespace = os.getenv("TEMPORAL_NAMESPACE", "default")
    task_queue = os.getenv("WORKER_TASK_QUEUE", f"{vertical}-queue")
    max_concurrent_activities = int(os.getenv("MAX_CONCURRENT_ACTIVITIES", "5"))

    logger.info("=" * 60)
    logger.info(f"FuzzForge Vertical Worker: {vertical}")
    logger.info("=" * 60)
    logger.info(f"Temporal Address: {temporal_address}")
    logger.info(f"Temporal Namespace: {temporal_namespace}")
    logger.info(f"Task Queue: {task_queue}")
    logger.info(f"Max Concurrent Activities: {max_concurrent_activities}")
    logger.info("=" * 60)

    # Discover workflows for this vertical
    logger.info(f"Discovering workflows for vertical: {vertical}")
    workflows = await discover_workflows(vertical)

    if not workflows:
        logger.error(f"No workflows found for vertical: {vertical}")
        logger.error("Worker cannot start without workflows. Exiting...")
        sys.exit(1)

    # Discover activities from workflow directories
    logger.info("Discovering workflow-specific activities...")
    workflows_dir = Path("/app/toolbox/workflows")
    workflow_activities = await discover_activities(workflows_dir)

    # Combine common storage activities with workflow-specific activities
    activities = [
        get_target_activity,
        cleanup_cache_activity,
        upload_results_activity
    ] + workflow_activities

    logger.info(
        f"Total activities registered: {len(activities)} "
        f"(3 common + {len(workflow_activities)} workflow-specific)"
    )

    # Connect to Temporal
    logger.info(f"Connecting to Temporal at {temporal_address}...")
    try:
        client = await Client.connect(
            temporal_address,
            namespace=temporal_namespace
        )
        logger.info("✓ Connected to Temporal successfully")
    except Exception as e:
        logger.error(f"Failed to connect to Temporal: {e}", exc_info=True)
        sys.exit(1)

    # Create worker with discovered workflows and activities
    logger.info(f"Creating worker on task queue: {task_queue}")

    try:
        worker = Worker(
            client,
            task_queue=task_queue,
            workflows=workflows,
            activities=activities,
            max_concurrent_activities=max_concurrent_activities
        )
        logger.info("✓ Worker created successfully")
    except Exception as e:
        logger.error(f"Failed to create worker: {e}", exc_info=True)
        sys.exit(1)

    # Start worker
    logger.info("=" * 60)
    logger.info(f"🚀 Worker started for vertical '{vertical}'")
    logger.info(f"📦 Registered {len(workflows)} workflows")
    logger.info(f"⚙️ Registered {len(activities)} activities")
    logger.info(f"📨 Listening on task queue: {task_queue}")
    logger.info("=" * 60)
    logger.info("Worker is ready to process tasks...")

    try:
        await worker.run()
    except KeyboardInterrupt:
        logger.info("Shutting down worker (keyboard interrupt)...")
    except Exception as e:
        logger.error(f"Worker error: {e}", exc_info=True)
        raise

if __name__ == "__main__":
|
|
try:
|
|
asyncio.run(main())
|
|
except KeyboardInterrupt:
|
|
logger.info("Worker stopped")
|
|
except Exception as e:
|
|
logger.error(f"Fatal error: {e}", exc_info=True)
|
|
sys.exit(1)
|
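To make the discovery contract concrete: the `@workflow.defn` and `@activity.defn` decorators attach the `__temporal_workflow_definition` / `__temporal_activity_definition` attributes this worker checks for. A minimal sketch of a workflow/activities pair it would register from `toolbox/workflows/<name>/` (given a `metadata.yaml` with `vertical: rust`; all names here are illustrative, not files from the repo):

```python
# workflow.py and activities.py contents, collapsed into one sketch
from datetime import timedelta

from temporalio import activity, workflow

@activity.defn
async def echo_activity(message: str) -> str:
    return f"rust worker processed: {message}"

@workflow.defn
class ExampleWorkflow:
    @workflow.run
    async def run(self, message: str) -> str:
        # Workflows orchestrate; the actual work runs in activities
        return await workflow.execute_activity(
            echo_activity,
            message,
            start_to_close_timeout=timedelta(seconds=30),
        )
```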