Mirror of https://github.com/FuzzingLabs/fuzzforge_ai.git, synced 2026-03-15 01:15:57 +00:00
chore: remove old fuzzforge_ai files
48 .github/ISSUE_TEMPLATE/bug_report.md (vendored)
@@ -1,48 +0,0 @@
---
name: 🐛 Bug Report
about: Create a report to help us improve FuzzForge
title: "[BUG] "
labels: bug
assignees: ''
---

## Description

A clear and concise description of the bug you encountered.

## Environment

Please provide details about your environment:

- **OS**: (e.g., macOS 14.0, Ubuntu 22.04, Windows 11)
- **Python version**: (e.g., 3.9.7)
- **Docker version**: (e.g., 24.0.6)
- **FuzzForge version**: (e.g., 0.6.0)

## Steps to Reproduce

Clear steps to recreate the issue:

1. Go to '...'
2. Run command '...'
3. Click on '...'
4. See error

## Expected Behavior

A clear and concise description of what should happen.

## Actual Behavior

A clear and concise description of what actually happens.

## Logs

Please include relevant error messages and stack traces:

```
Paste logs here
```

## Screenshots

If applicable, add screenshots to help explain your problem.

## Additional Context

Add any other context about the problem here (workflow used, specific target, configuration, etc.).

---

💬 **Need help?** Join our [Discord Community](https://discord.com/invite/acqv9FVG) for real-time support.
8 .github/ISSUE_TEMPLATE/config.yml (vendored)
@@ -1,8 +0,0 @@
blank_issues_enabled: false
contact_links:
  - name: 💬 Community Discord
    url: https://discord.com/invite/acqv9FVG
    about: Join our Discord to discuss ideas, workflows, and security research with the community.
  - name: 📖 Documentation
    url: https://github.com/FuzzingLabs/fuzzforge_ai/tree/main/docs
    about: Check our documentation for guides, tutorials, and API reference.
38 .github/ISSUE_TEMPLATE/feature_request.md (vendored)
@@ -1,38 +0,0 @@
---
name: ✨ Feature Request
about: Suggest an idea for FuzzForge
title: "[FEATURE] "
labels: enhancement
assignees: ''
---

## Use Case

Why is this feature needed? Describe the problem you're trying to solve or the improvement you'd like to see.

## Proposed Solution

How should it work? Describe your ideal solution in detail.

## Alternatives

What other approaches have you considered? List any alternative solutions or features you've thought about.

## Implementation

**(Optional)** Do you have any technical considerations or implementation ideas?

## Category

What area of FuzzForge would this feature enhance?

- [ ] 🤖 AI Agents for Security
- [ ] 🛠 Workflow Automation
- [ ] 📈 Vulnerability Research
- [ ] 🔗 Fuzzer Integration
- [ ] 🌐 Community Marketplace
- [ ] 🔒 Enterprise Features
- [ ] 📚 Documentation
- [ ] 🎯 Other

## Additional Context

Add any other context, screenshots, references, or examples about the feature request here.

---

💬 **Want to discuss this idea?** Join our [Discord Community](https://discord.com/invite/acqv9FVG) to collaborate with other contributors!
67 .github/ISSUE_TEMPLATE/workflow_submission.md (vendored)
@@ -1,67 +0,0 @@
---
name: 🔄 Workflow Submission
about: Contribute a security workflow or module to the FuzzForge community
title: "[WORKFLOW] "
labels: workflow, community
assignees: ''
---

## Workflow Name

Provide a short, descriptive name for your workflow.

## Description

Explain what this workflow does and what security problems it solves.

## Category

What type of security workflow is this?

- [ ] 🛡️ **Security Assessment** - Static analysis, vulnerability scanning
- [ ] 🔍 **Secret Detection** - Credential and secret scanning
- [ ] 🎯 **Fuzzing** - Dynamic testing and fuzz testing
- [ ] 🔄 **Reverse Engineering** - Binary analysis and decompilation
- [ ] 🌐 **Infrastructure Security** - Container, cloud, network security
- [ ] 🔒 **Penetration Testing** - Offensive security testing
- [ ] 📋 **Other** - Please describe

## Files

Please attach or provide links to your workflow files:

- [ ] `workflow.py` - Main Temporal workflow implementation
- [ ] `Dockerfile` - Container definition
- [ ] `metadata.yaml` - Workflow metadata
- [ ] Test files or examples
- [ ] Documentation

## Testing

How did you test this workflow? Please describe:

- **Test targets used**: (e.g., vulnerable_app, custom test cases)
- **Expected outputs**: (e.g., SARIF format, specific vulnerabilities detected)
- **Validation results**: (e.g., X vulnerabilities found, Y false positives)

## SARIF Compliance

- [ ] My workflow outputs results in SARIF format
- [ ] Results include severity levels and descriptions
- [ ] Code flow information is provided where applicable

## Security Guidelines

- [ ] This workflow focuses on **defensive security** purposes only
- [ ] I have not included any malicious tools or capabilities
- [ ] All secrets/credentials are parameterized (no hardcoded values)
- [ ] I have followed responsible disclosure practices

## Registry Integration

Have you updated the workflow registry?

- [ ] Added import statement to `backend/toolbox/workflows/registry.py`
- [ ] Added registry entry with proper metadata
- [ ] Tested workflow registration and deployment

## Additional Notes

Anything else the maintainers should know about this workflow?

---

🚀 **Thank you for contributing to FuzzForge!** Your workflow will help the security community automate and scale their testing efforts.

💬 **Questions?** Join our [Discord Community](https://discord.com/invite/acqv9FVG) to discuss your contribution!
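The SARIF compliance checklist in the template above can be illustrated with a minimal example. This is a hedged sketch of a SARIF 2.1.0 log with a severity level per result; the rule ID, file path, and tool name below are hypothetical placeholders, not part of any FuzzForge schema.

```python
import json

def make_sarif(tool_name: str, results: list) -> dict:
    """Build a minimal SARIF 2.1.0 log wrapping a list of result objects."""
    return {
        "version": "2.1.0",
        "$schema": "https://json.schemastore.org/sarif-2.1.0.json",
        "runs": [{
            "tool": {"driver": {"name": tool_name}},
            "results": results,
        }],
    }

# Hypothetical finding: a hardcoded credential reported at "error" severity.
finding = {
    "ruleId": "hardcoded-secret",  # hypothetical rule ID
    "level": "error",              # SARIF severity: none / note / warning / error
    "message": {"text": "Hardcoded credential detected"},
    "locations": [{
        "physicalLocation": {
            "artifactLocation": {"uri": "app/config.py"},
            "region": {"startLine": 12},
        }
    }],
}

log = make_sarif("example-workflow", [finding])
print(json.dumps(log, indent=2)[:60])
```

A log shaped like this is what the "severity levels and descriptions" checkbox is asking for: each entry in `results` carries a `level` and a human-readable `message`.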
79 .github/pull_request_template.md (vendored)
@@ -1,79 +0,0 @@
## Description

<!-- Provide a brief description of the changes in this PR -->

## Type of Change

<!-- Mark the appropriate option with an 'x' -->

- [ ] 🐛 Bug fix (non-breaking change which fixes an issue)
- [ ] ✨ New feature (non-breaking change which adds functionality)
- [ ] 💥 Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] 📝 Documentation update
- [ ] 🔧 Configuration change
- [ ] ♻️ Refactoring (no functional changes)
- [ ] 🎨 Style/formatting changes
- [ ] ✅ Test additions or updates

## Related Issues

<!-- Link to related issues using #issue_number -->
<!-- Example: Closes #123, Relates to #456 -->

## Changes Made

<!-- List the specific changes made in this PR -->

-
-
-

## Testing

<!-- Describe the tests you ran to verify your changes -->

### Tested Locally

- [ ] All tests pass (`pytest`, `uv build`, etc.)
- [ ] Linting passes (`ruff check`)
- [ ] Code builds successfully

### Worker Changes (if applicable)

- [ ] Docker images build successfully (`docker compose build`)
- [ ] Worker containers start correctly
- [ ] Tested with actual workflow execution

### Documentation

- [ ] Documentation updated (if needed)
- [ ] README updated (if needed)
- [ ] CHANGELOG.md updated (if user-facing changes)

## Pre-Merge Checklist

<!-- Ensure all items are completed before requesting review -->

- [ ] My code follows the project's coding standards
- [ ] I have performed a self-review of my code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] I have made corresponding changes to the documentation
- [ ] My changes generate no new warnings
- [ ] I have added tests that prove my fix is effective or that my feature works
- [ ] New and existing unit tests pass locally with my changes
- [ ] Any dependent changes have been merged and published

### Worker-Specific Checks (if workers/ modified)

- [ ] All worker files properly tracked by git (not gitignored)
- [ ] Worker validation script passes (`.github/scripts/validate-workers.sh`)
- [ ] Docker images build without errors
- [ ] Worker configuration updated in `docker-compose.yml` (if needed)

## Screenshots (if applicable)

<!-- Add screenshots to help explain your changes -->

## Additional Notes

<!-- Any additional information that reviewers should know -->
127 .github/scripts/validate-workers.sh (vendored)
@@ -1,127 +0,0 @@
#!/bin/bash
# Worker Validation Script
# Ensures all workers defined in docker-compose.yml exist in the repository
# and are properly tracked by git.

set -e

echo "🔍 Validating worker completeness..."

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

ERRORS=0
WARNINGS=0

# Extract worker service names from docker-compose.yml
echo ""
echo "📋 Checking workers defined in docker-compose.yml..."
WORKERS=$(grep -E "^\s+worker-" docker-compose.yml | grep -v "#" | cut -d: -f1 | tr -d ' ' | sort -u)

if [ -z "$WORKERS" ]; then
    echo -e "${RED}❌ No workers found in docker-compose.yml${NC}"
    exit 1
fi

echo "Found workers:"
for worker in $WORKERS; do
    echo "  - $worker"
done

# Check each worker
echo ""
echo "🔎 Validating worker files..."
for worker in $WORKERS; do
    WORKER_DIR="workers/${worker#worker-}"

    echo ""
    echo "Checking $worker ($WORKER_DIR)..."

    # Check if directory exists
    if [ ! -d "$WORKER_DIR" ]; then
        echo -e "${RED}  ❌ Directory not found: $WORKER_DIR${NC}"
        ERRORS=$((ERRORS + 1))
        continue
    fi

    # Check Dockerfile (single file or multi-platform pattern)
    if [ -f "$WORKER_DIR/Dockerfile" ]; then
        # Single Dockerfile
        if ! git ls-files --error-unmatch "$WORKER_DIR/Dockerfile" &> /dev/null; then
            echo -e "${RED}  ❌ File not tracked by git: $WORKER_DIR/Dockerfile${NC}"
            echo -e "${YELLOW}     Check .gitignore patterns!${NC}"
            ERRORS=$((ERRORS + 1))
        else
            echo -e "${GREEN}  ✓ Dockerfile (tracked)${NC}"
        fi
    elif compgen -G "$WORKER_DIR/Dockerfile.*" > /dev/null; then
        # Multi-platform Dockerfiles (e.g., Dockerfile.amd64, Dockerfile.arm64)
        PLATFORM_DOCKERFILES=$(ls "$WORKER_DIR"/Dockerfile.* 2>/dev/null)
        DOCKERFILE_FOUND=false
        for dockerfile in $PLATFORM_DOCKERFILES; do
            if git ls-files --error-unmatch "$dockerfile" &> /dev/null; then
                echo -e "${GREEN}  ✓ $(basename "$dockerfile") (tracked)${NC}"
                DOCKERFILE_FOUND=true
            else
                echo -e "${RED}  ❌ File not tracked by git: $dockerfile${NC}"
                ERRORS=$((ERRORS + 1))
            fi
        done
        if [ "$DOCKERFILE_FOUND" = false ]; then
            echo -e "${RED}  ❌ No platform-specific Dockerfiles found${NC}"
            ERRORS=$((ERRORS + 1))
        fi
    else
        echo -e "${RED}  ❌ Missing Dockerfile or Dockerfile.* files${NC}"
        ERRORS=$((ERRORS + 1))
    fi

    # Check other required files
    REQUIRED_FILES=("requirements.txt" "worker.py")
    for file in "${REQUIRED_FILES[@]}"; do
        FILE_PATH="$WORKER_DIR/$file"

        if [ ! -f "$FILE_PATH" ]; then
            echo -e "${RED}  ❌ Missing file: $FILE_PATH${NC}"
            ERRORS=$((ERRORS + 1))
        else
            # Check if file is tracked by git
            if ! git ls-files --error-unmatch "$FILE_PATH" &> /dev/null; then
                echo -e "${RED}  ❌ File not tracked by git: $FILE_PATH${NC}"
                echo -e "${YELLOW}     Check .gitignore patterns!${NC}"
                ERRORS=$((ERRORS + 1))
            else
                echo -e "${GREEN}  ✓ $file (tracked)${NC}"
            fi
        fi
    done
done

# Check for any ignored worker files
echo ""
echo "🚫 Checking for gitignored worker files..."
IGNORED_FILES=$(git check-ignore workers/*/* 2>/dev/null || true)
if [ -n "$IGNORED_FILES" ]; then
    echo -e "${YELLOW}⚠️  Warning: Some worker files are being ignored:${NC}"
    echo "$IGNORED_FILES" | while read -r file; do
        echo -e "${YELLOW}  - $file${NC}"
    done
    WARNINGS=$((WARNINGS + 1))
fi

# Summary
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
if [ $ERRORS -eq 0 ] && [ $WARNINGS -eq 0 ]; then
    echo -e "${GREEN}✅ All workers validated successfully!${NC}"
    exit 0
elif [ $ERRORS -eq 0 ]; then
    echo -e "${YELLOW}⚠️  Validation passed with $WARNINGS warning(s)${NC}"
    exit 0
else
    echo -e "${RED}❌ Validation failed with $ERRORS error(s) and $WARNINGS warning(s)${NC}"
    exit 1
fi
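The grep pipeline at the top of the script, which pulls `worker-` service names out of docker-compose.yml, can be sketched in Python for clarity. The compose snippet below is a made-up example, not the repository's actual file.

```python
import re

def find_workers(compose_text: str) -> list:
    """Mirror the script's grep: indented, non-comment lines whose
    service name starts with 'worker-', deduplicated and sorted."""
    names = set()
    for line in compose_text.splitlines():
        m = re.match(r"^\s+(worker-[\w-]+):", line)
        if m and "#" not in line:
            names.add(m.group(1))
    return sorted(names)

# Hypothetical docker-compose.yml fragment for illustration.
compose = """\
services:
  backend:
    image: ff-backend
  worker-python:
    build: workers/python
  worker-rust:
    build: workers/rust
#  worker-old:
#    build: workers/old
"""
print(find_workers(compose))  # ['worker-python', 'worker-rust']
```

Like the shell pipeline, the commented-out `worker-old` service is skipped, and duplicates collapse via the set (matching `sort -u`).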
165 .github/workflows/benchmark.yml (vendored)
@@ -1,165 +0,0 @@
name: Benchmarks

on:
  # Disabled automatic runs - benchmarks not ready for CI/CD yet
  # schedule:
  #   - cron: '0 2 * * *'  # 2 AM UTC every day

  # Allow manual trigger for testing
  workflow_dispatch:
    inputs:
      compare_with:
        description: 'Baseline commit to compare against (optional)'
        required: false
        default: ''

  # pull_request:
  #   paths:
  #     - 'backend/benchmarks/**'
  #     - 'backend/toolbox/modules/**'
  #     - '.github/workflows/benchmark.yml'

jobs:
  benchmark:
    name: Run Benchmarks
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # Fetch all history for comparison

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install system dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y build-essential

      - name: Install Python dependencies
        working-directory: ./backend
        run: |
          python -m pip install --upgrade pip
          pip install -e ".[dev]"
          pip install pytest pytest-asyncio pytest-benchmark "pytest-benchmark[histogram]"
          pip install -e ../sdk  # Install SDK for benchmarks

      - name: Run benchmarks
        working-directory: ./backend
        run: |
          pytest benchmarks/ \
            -v \
            --benchmark-only \
            --benchmark-json=benchmark-results.json \
            --benchmark-histogram=benchmark-histogram

      - name: Store benchmark results
        uses: actions/upload-artifact@v4
        with:
          name: benchmark-results-${{ github.run_number }}
          path: |
            backend/benchmark-results.json
            backend/benchmark-histogram.svg

      - name: Download baseline benchmarks
        if: github.event_name == 'pull_request'
        uses: dawidd6/action-download-artifact@v3
        continue-on-error: true
        with:
          workflow: benchmark.yml
          branch: ${{ github.base_ref }}
          name: benchmark-results-*
          path: ./baseline
          search_artifacts: true

      - name: Compare with baseline
        if: github.event_name == 'pull_request' && hashFiles('baseline/benchmark-results.json') != ''
        run: |
          python -c "
          import json
          import sys

          with open('backend/benchmark-results.json') as f:
              current = json.load(f)

          with open('baseline/benchmark-results.json') as f:
              baseline = json.load(f)

          print('\\n## Benchmark Comparison\\n')
          print('| Benchmark | Current | Baseline | Change |')
          print('|-----------|---------|----------|--------|')

          regressions = []

          for bench in current['benchmarks']:
              name = bench['name']
              current_time = bench['stats']['mean']

              # Find matching baseline
              baseline_bench = next((b for b in baseline['benchmarks'] if b['name'] == name), None)
              if baseline_bench:
                  baseline_time = baseline_bench['stats']['mean']
                  change = ((current_time - baseline_time) / baseline_time) * 100

                  print(f'| {name} | {current_time:.4f}s | {baseline_time:.4f}s | {change:+.2f}% |')

                  # Flag regressions > 10%
                  if change > 10:
                      regressions.append((name, change))
              else:
                  print(f'| {name} | {current_time:.4f}s | N/A | NEW |')

          if regressions:
              print('\\n⚠️ **Performance Regressions Detected:**')
              for name, change in regressions:
                  print(f'- {name}: +{change:.2f}%')
              sys.exit(1)
          else:
              print('\\n✅ No significant performance regressions detected')
          "

      - name: Comment PR with results
        if: github.event_name == 'pull_request'
        uses: actions/github-script@v7
        with:
          script: |
            const fs = require('fs');
            const results = JSON.parse(fs.readFileSync('backend/benchmark-results.json', 'utf8'));

            let body = '## Benchmark Results\n\n';
            body += '| Category | Benchmark | Mean Time | Std Dev |\n';
            body += '|----------|-----------|-----------|---------|\n';

            for (const bench of results.benchmarks) {
              const group = bench.group || 'ungrouped';
              const name = bench.name.split('::').pop();
              const mean = bench.stats.mean.toFixed(4);
              const stddev = bench.stats.stddev.toFixed(4);
              body += `| ${group} | ${name} | ${mean}s | ${stddev}s |\n`;
            }

            body += '\n📊 Full benchmark results available in artifacts.';

            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: body
            });

  benchmark-summary:
    name: Benchmark Summary
    runs-on: ubuntu-latest
    needs: benchmark
    if: always()
    steps:
      - name: Check results
        run: |
          if [ "${{ needs.benchmark.result }}" != "success" ]; then
            echo "Benchmarks failed or detected regressions"
            exit 1
          fi
          echo "Benchmarks completed successfully!"
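The "Compare with baseline" step above boils down to a mean-time delta check with a 10% regression threshold. Here is that logic as a standalone function, exercised with made-up benchmark data in the pytest-benchmark JSON shape:

```python
def find_regressions(current: dict, baseline: dict, threshold: float = 10.0) -> list:
    """Return (name, % change) pairs for benchmarks that are slower than
    the baseline by more than `threshold` percent. New benchmarks with no
    baseline entry are ignored, mirroring the workflow step."""
    base = {b["name"]: b["stats"]["mean"] for b in baseline["benchmarks"]}
    regressions = []
    for bench in current["benchmarks"]:
        name, mean = bench["name"], bench["stats"]["mean"]
        if name in base:
            change = (mean - base[name]) / base[name] * 100
            if change > threshold:
                regressions.append((name, change))
    return regressions

# Illustrative data: 'scan' got 25% slower, 'parse' is unchanged.
current = {"benchmarks": [{"name": "scan", "stats": {"mean": 1.25}},
                          {"name": "parse", "stats": {"mean": 0.5}}]}
baseline = {"benchmarks": [{"name": "scan", "stats": {"mean": 1.0}},
                           {"name": "parse", "stats": {"mean": 0.5}}]}
print(find_regressions(current, baseline))  # [('scan', 25.0)]
```

In the real workflow a non-empty regression list triggers `sys.exit(1)`, which fails the job; the function form just makes that decision testable in isolation.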
70 .github/workflows/ci-python.yml (vendored)
@@ -1,70 +0,0 @@
name: Python CI

# This is a simple CI to ensure that the Python client and backend build correctly.
# It could be optimized to run faster, building, testing, and linting only changed code,
# but for now it is good enough. It runs on every push and PR to any branch.
# It also runs on demand.

on:
  workflow_dispatch:

  push:
    paths:
      - "ai/**"
      - "backend/**"
      - "cli/**"
      - "sdk/**"
      - "src/**"
  pull_request:
    paths:
      - "ai/**"
      - "backend/**"
      - "cli/**"
      - "sdk/**"
      - "src/**"

jobs:
  ci:
    name: ci
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v5

      - name: Setup uv
        uses: astral-sh/setup-uv@v6
        with:
          enable-cache: true

      - name: Set up Python
        run: uv python install

      # Validate no obvious issues
      # Quick hack because CLI returns non-zero exit code when no args are provided
      - name: Run base command
        run: |
          set +e
          uv run ff
          rc=$?
          if [ $rc -ne 2 ]; then
            echo "Expected exit code 2 from 'uv run ff', got $rc"
            exit 1
          fi

      - name: Build fuzzforge_ai package
        run: uv build

      - name: Build ai package
        working-directory: ai
        run: uv build

      - name: Build cli package
        working-directory: cli
        run: uv build

      - name: Build sdk package
        working-directory: sdk
        run: uv build

      - name: Build backend package
        working-directory: backend
        run: uv build
57 .github/workflows/docs-deploy.yml (vendored)
@@ -1,57 +0,0 @@
name: Deploy Docusaurus to GitHub Pages

on:
  workflow_dispatch:

  push:
    branches:
      - master
    paths:
      - "docs/**"

jobs:
  build:
    name: Build Docusaurus
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./docs
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: npm
          cache-dependency-path: "**/package-lock.json"

      - name: Install dependencies
        run: npm ci
      - name: Build website
        run: npm run build

      - name: Upload Build Artifact
        uses: actions/upload-pages-artifact@v3
        with:
          path: ./docs/build

  deploy:
    name: Deploy to GitHub Pages
    needs: build

    # Grant GITHUB_TOKEN the permissions required to make a Pages deployment
    permissions:
      pages: write      # to deploy to Pages
      id-token: write   # to verify the deployment originates from an appropriate source

    # Deploy to the github-pages environment
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}

    runs-on: ubuntu-latest
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v4
33 .github/workflows/docs-test-deploy.yml (vendored)
@@ -1,33 +0,0 @@
name: Docusaurus test deployment

on:
  workflow_dispatch:

  push:
    paths:
      - "docs/**"
  pull_request:
    paths:
      - "docs/**"

jobs:
  test-deploy:
    name: Test deployment
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./docs
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: npm
          cache-dependency-path: "**/package-lock.json"

      - name: Install dependencies
        run: npm ci
      - name: Test build website
        run: npm run build
152 .github/workflows/examples/security-scan.yml (vendored)
@@ -1,152 +0,0 @@
# FuzzForge CI/CD Example - Security Scanning
#
# This workflow demonstrates how to integrate FuzzForge into your CI/CD pipeline
# for automated security testing on pull requests and pushes.
#
# Features:
# - Runs entirely in GitHub Actions (no external infrastructure needed)
# - Auto-starts FuzzForge services on-demand
# - Fails builds on error-level SARIF findings
# - Uploads SARIF results to GitHub Security tab
# - Exports findings as artifacts
#
# Prerequisites:
# - Ubuntu runner with Docker support
# - At least 4GB RAM available
# - ~90 seconds startup time

name: Security Scan Example

on:
  pull_request:
    branches: [main, develop]
  push:
    branches: [main]

jobs:
  security-scan:
    name: Security Assessment
    runs-on: ubuntu-latest
    timeout-minutes: 30

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Start FuzzForge
        run: |
          bash scripts/ci-start.sh

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install FuzzForge CLI
        run: |
          pip install ./cli

      - name: Initialize FuzzForge
        run: |
          ff init --api-url http://localhost:8000 --name "GitHub Actions Security Scan"

      - name: Run Security Assessment
        run: |
          ff workflow run security_assessment . \
            --wait \
            --fail-on error \
            --export-sarif results.sarif

      - name: Upload SARIF to GitHub Security
        if: always()
        uses: github/codeql-action/upload-sarif@v3
        with:
          sarif_file: results.sarif

      - name: Upload findings as artifact
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: security-findings
          path: results.sarif
          retention-days: 30

      - name: Stop FuzzForge
        if: always()
        run: |
          bash scripts/ci-stop.sh

  secret-scan:
    name: Secret Detection
    runs-on: ubuntu-latest
    timeout-minutes: 15

    steps:
      - uses: actions/checkout@v4

      - name: Start FuzzForge
        run: bash scripts/ci-start.sh

      - name: Install CLI
        run: |
          pip install ./cli

      - name: Initialize & Scan
        run: |
          ff init --api-url http://localhost:8000 --name "Secret Detection"
          ff workflow run secret_detection . \
            --wait \
            --fail-on all \
            --export-sarif secrets.sarif

      - name: Upload results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: secret-scan-results
          path: secrets.sarif
          retention-days: 30

      - name: Cleanup
        if: always()
        run: bash scripts/ci-stop.sh

  # Example: Nightly fuzzing campaign (long-running)
  nightly-fuzzing:
    name: Nightly Fuzzing
    runs-on: ubuntu-latest
    timeout-minutes: 120
    # Only run on schedule
    if: github.event_name == 'schedule'

    steps:
      - uses: actions/checkout@v4

      - name: Start FuzzForge
        run: bash scripts/ci-start.sh

      - name: Install CLI
        run: pip install ./cli

      - name: Run Fuzzing Campaign
        run: |
          ff init --api-url http://localhost:8000
          ff workflow run atheris_fuzzing . \
            max_iterations=100000000 \
            timeout_seconds=7200 \
            --wait \
            --export-sarif fuzzing-results.sarif
        # Don't fail on fuzzing findings, just report
        continue-on-error: true

      - name: Upload fuzzing results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: fuzzing-results
          path: fuzzing-results.sarif
          retention-days: 90

      - name: Cleanup
        if: always()
        run: bash scripts/ci-stop.sh
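The `--fail-on error` behavior the example workflow relies on amounts to scanning the exported SARIF for results at a given severity. This is a hedged sketch of such a gate, not the actual `ff` CLI implementation; the sample SARIF fragment is made up.

```python
def has_findings_at(sarif: dict, level: str) -> bool:
    """Return True if any result in any run matches the given SARIF level
    ('none', 'note', 'warning', or 'error')."""
    return any(
        result.get("level") == level
        for run in sarif.get("runs", [])
        for result in run.get("results", [])
    )

# Illustrative SARIF fragment with one warning and one error.
sarif = {"runs": [{"results": [
    {"ruleId": "x", "level": "warning"},
    {"ruleId": "y", "level": "error"},
]}]}
print(has_findings_at(sarif, "error"))  # True
```

A CI gate would load `results.sarif`, call a check like this, and exit nonzero when error-level findings are present, which is what fails the build.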
248 .github/workflows/test.yml (vendored)
@@ -1,248 +0,0 @@
name: Tests

on:
  push:
    branches: [ main, master, dev, develop, feature/** ]
  pull_request:
    branches: [ main, master, dev, develop ]

jobs:
  validate-workers:
    name: Validate Workers
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Run worker validation
        run: |
          chmod +x .github/scripts/validate-workers.sh
          .github/scripts/validate-workers.sh

  build-workers:
    name: Build Worker Docker Images
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # Fetch all history for proper diff

      - name: Check which workers were modified
        id: check-workers
        run: |
          if [ "${{ github.event_name }}" == "pull_request" ]; then
            # For PRs, check changed files
            CHANGED_FILES=$(git diff --name-only origin/${{ github.base_ref }}...HEAD)
            echo "Changed files:"
            echo "$CHANGED_FILES"
          else
            # For direct pushes, check last commit
            CHANGED_FILES=$(git diff --name-only HEAD~1 HEAD)
          fi

          # Check if docker-compose.yml changed (build all workers)
          if echo "$CHANGED_FILES" | grep -q "^docker-compose.yml"; then
            echo "workers_to_build=worker-python worker-secrets worker-rust worker-android worker-ossfuzz" >> $GITHUB_OUTPUT
            echo "workers_modified=true" >> $GITHUB_OUTPUT
            echo "✅ docker-compose.yml modified - building all workers"
            exit 0
          fi

          # Detect which specific workers changed
          WORKERS_TO_BUILD=""

          if echo "$CHANGED_FILES" | grep -q "^workers/python/"; then
            WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-python"
            echo "✅ Python worker modified"
          fi

          if echo "$CHANGED_FILES" | grep -q "^workers/secrets/"; then
            WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-secrets"
            echo "✅ Secrets worker modified"
          fi

          if echo "$CHANGED_FILES" | grep -q "^workers/rust/"; then
            WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-rust"
            echo "✅ Rust worker modified"
          fi

          if echo "$CHANGED_FILES" | grep -q "^workers/android/"; then
            WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-android"
            echo "✅ Android worker modified"
          fi

          if echo "$CHANGED_FILES" | grep -q "^workers/ossfuzz/"; then
            WORKERS_TO_BUILD="$WORKERS_TO_BUILD worker-ossfuzz"
            echo "✅ OSS-Fuzz worker modified"
          fi

          if [ -z "$WORKERS_TO_BUILD" ]; then
            echo "workers_modified=false" >> $GITHUB_OUTPUT
            echo "⏭️ No worker changes detected - skipping build"
          else
            echo "workers_to_build=$WORKERS_TO_BUILD" >> $GITHUB_OUTPUT
            echo "workers_modified=true" >> $GITHUB_OUTPUT
            echo "Building workers:$WORKERS_TO_BUILD"
          fi

      - name: Set up Docker Buildx
        if: steps.check-workers.outputs.workers_modified == 'true'
        uses: docker/setup-buildx-action@v3

      - name: Build worker images
        if: steps.check-workers.outputs.workers_modified == 'true'
        run: |
          WORKERS="${{ steps.check-workers.outputs.workers_to_build }}"
          echo "Building worker Docker images: $WORKERS"
          docker compose build $WORKERS --no-cache
        continue-on-error: false

  lint:
    name: Lint
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install ruff mypy

      - name: Run ruff
        run: ruff check backend/src backend/toolbox backend/tests backend/benchmarks --output-format=github

      - name: Run mypy (continue on error)
        run: mypy backend/src backend/toolbox || true
        continue-on-error: true

  unit-tests:
    name: Unit Tests
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.11', '3.12']

    steps:
      - uses: actions/checkout@v4

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install system dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y build-essential

      - name: Install Python dependencies
        working-directory: ./backend
        run: |
          python -m pip install --upgrade pip
          pip install -e ".[dev]"
          pip install pytest pytest-asyncio pytest-cov pytest-xdist

      - name: Run unit tests
        working-directory: ./backend
        run: |
          pytest tests/unit/ \
            -v \
            --cov=toolbox/modules \
            --cov=src \
            --cov-report=xml \
            --cov-report=term \
            --cov-report=html \
            -n auto

      - name: Upload coverage to Codecov
        if: matrix.python-version == '3.11'
        uses: codecov/codecov-action@v4
        with:
          file: ./backend/coverage.xml
          flags: unittests
          name: codecov-backend

      - name: Upload coverage HTML
        if: matrix.python-version == '3.11'
        uses: actions/upload-artifact@v4
        with:
          name: coverage-report
          path: ./backend/htmlcov/

  # integration-tests:
  #   name: Integration Tests
  #   runs-on: ubuntu-latest
  #   needs: unit-tests
  #
  #   services:
  #     postgres:
  #       image: postgres:15
  #       env:
  #         POSTGRES_USER: postgres
  #         POSTGRES_PASSWORD: postgres
  #         POSTGRES_DB: fuzzforge_test
  #       options: >-
  #         --health-cmd pg_isready
  #         --health-interval 10s
  #         --health-timeout 5s
  #         --health-retries 5
  #       ports:
  #         - 5432:5432
  #
  #   steps:
  #     - uses: actions/checkout@v4
  #
  #     - name: Set up Python
  #       uses: actions/setup-python@v5
  #       with:
  #         python-version: '3.11'
  #
  #     - name: Set up Docker Buildx
  #       uses: docker/setup-buildx-action@v3
  #
  #     - name: Install Python dependencies
  #       working-directory: ./backend
  #       run: |
  #         python -m pip install --upgrade pip
  #         pip install -e ".[dev]"
  #         pip install pytest pytest-asyncio
  #
  #     - name: Start services (Temporal, MinIO)
  #       run: |
  #         docker-compose -f docker-compose.yml up -d temporal minio
  #         sleep 30
  #
  #     - name: Run integration tests
  #       working-directory: ./backend
  #       run: |
  #         pytest tests/integration/ -v --tb=short
  #       env:
  #         DATABASE_URL: postgresql://postgres:postgres@localhost:5432/fuzzforge_test
  #         TEMPORAL_ADDRESS: localhost:7233
  #         MINIO_ENDPOINT: localhost:9000
  #
  #     - name: Shutdown services
  #       if: always()
  #       run: docker-compose down

  test-summary:
    name: Test Summary
    runs-on: ubuntu-latest
    needs: [validate-workers, lint, unit-tests]
    if: always()
    steps:
      - name: Check test results
        run: |
          if [ "${{ needs.validate-workers.result }}" != "success" ]; then
            echo "Worker validation failed"
            exit 1
          fi
          if [ "${{ needs.unit-tests.result }}" != "success" ]; then
            echo "Unit tests failed"
            exit 1
          fi
          echo "All tests passed!"
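The change-detection step in the `build-workers` job maps modified paths to worker services via a series of `grep` checks, with `docker-compose.yml` changes triggering a full rebuild. The same decision logic in a short Python sketch (the file paths below are illustrative):

```python
# Path prefix -> docker-compose service name, matching the workflow's grep checks.
WORKER_PREFIXES = {
    "workers/python/": "worker-python",
    "workers/secrets/": "worker-secrets",
    "workers/rust/": "worker-rust",
    "workers/android/": "worker-android",
    "workers/ossfuzz/": "worker-ossfuzz",
}

def workers_to_build(changed_files: list) -> list:
    """Return the worker services to rebuild for a set of changed files.
    A docker-compose.yml change rebuilds everything."""
    if "docker-compose.yml" in changed_files:
        return sorted(WORKER_PREFIXES.values())
    hit = {worker for prefix, worker in WORKER_PREFIXES.items()
           if any(f.startswith(prefix) for f in changed_files)}
    return sorted(hit)

print(workers_to_build(["workers/rust/worker.py", "README.md"]))  # ['worker-rust']
print(workers_to_build(["docs/index.md"]))                        # []
```

An empty result corresponds to `workers_modified=false` in the step output, which makes the later Buildx and build steps skip entirely.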