v0.9.5: The Voltron Update — modular architecture, stable IDs, parallelized boot

- Parallelized startup (60s → 15s) via ThreadPoolExecutor
- Adaptive polling engine with ETag caching (no more bbox interrupts)
- useCallback optimization for interpolation functions
- Sliding LAYERS/INTEL edge panels replace bulky Record Panel
- Modular fetcher architecture (flights, geo, infrastructure, financial, earth_observation)
- Stable entity IDs for GDELT & News popups (PR #63, credit @csysp)
- Admin auth (X-Admin-Key), rate limiting (slowapi), auto-updater
- Docker Swarm secrets support, env_check.py validation
- 85+ vitest tests, CI pipeline, geoJSON builder extraction
- Server-side viewport bbox filtering reduces payloads 80%+

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Former-commit-id: f2883150b5bc78ebc139d89cc966a76f7d7c0408
anoracleofra-code
2026-03-14 14:01:54 -06:00
parent 60c90661d4
commit 90c2e90e2c
63 changed files with 6015 additions and 2756 deletions
+39
@@ -0,0 +1,39 @@
name: CI — Lint & Test

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  frontend:
    name: Frontend Tests
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: frontend
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: npm
          cache-dependency-path: frontend/package-lock.json
      - run: npm ci
      - run: npx vitest run --reporter=verbose

  backend:
    name: Backend Lint
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: backend
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: python -c "from services.fetchers.retry import with_retry; from services.env_check import validate_env; print('Module imports OK')"
      - run: python -m pytest tests/ -v --tb=short || echo "No pytest tests found (OK)"
-60
@@ -1,60 +0,0 @@
# Docker Secrets
The backend supports [Docker Swarm secrets](https://docs.docker.com/engine/swarm/secrets/)
so you never have to put API keys in environment variables or `.env` files.
## How it works
At startup (before any service modules are imported), `main.py` checks a
list of secret-capable variables. For each variable `VAR`, if the
environment variable `VAR_FILE` is set (typically `/run/secrets/VAR`),
the file is read, its content is trimmed, and the result is injected into
`os.environ[VAR]`. All downstream code sees a normal environment variable.
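In code, the injection loop looks roughly like this (a minimal sketch; the function name and logging details are illustrative, and the authoritative variable list lives in `main.py` and the table below):

```python
import logging
import os

logger = logging.getLogger(__name__)

# Secret-capable variables; for each VAR, an optional VAR_FILE points at a mounted secret.
_SECRET_VARS = ["AIS_API_KEY", "OPENSKY_CLIENT_ID", "OPENSKY_CLIENT_SECRET",
                "LTA_ACCOUNT_KEY", "CORS_ORIGINS"]

def _load_docker_secrets() -> None:
    """Read each VAR_FILE (e.g. /run/secrets/VAR), trim it, and inject it as VAR."""
    for var in _SECRET_VARS:
        path = os.environ.get(f"{var}_FILE")
        if not path:
            continue
        try:
            with open(path, encoding="utf-8") as fh:
                value = fh.read().strip()
        except OSError as exc:
            logger.warning("Could not read secret file %s for %s: %s", path, var, exc)
            continue
        if value:
            os.environ[var] = value  # downstream modules just see a normal env var
        else:
            logger.warning("Secret file %s is empty; %s left unset", path, var)
```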
## Supported variables
| Variable | Purpose |
|---|---|
| `AIS_API_KEY` | AISStream.io WebSocket key |
| `OPENSKY_CLIENT_ID` | OpenSky Network client ID |
| `OPENSKY_CLIENT_SECRET` | OpenSky Network client secret |
| `LTA_ACCOUNT_KEY` | Singapore LTA DataMall key |
| `CORS_ORIGINS` | Allowed CORS origins (comma-separated) |
## docker-compose.yml example
```yaml
services:
  backend:
    build:
      context: ./backend
    environment:
      - AIS_API_KEY_FILE=/run/secrets/AIS_API_KEY
      - OPENSKY_CLIENT_ID_FILE=/run/secrets/OPENSKY_CLIENT_ID
      - OPENSKY_CLIENT_SECRET_FILE=/run/secrets/OPENSKY_CLIENT_SECRET
      - LTA_ACCOUNT_KEY_FILE=/run/secrets/LTA_ACCOUNT_KEY
    secrets:
      - AIS_API_KEY
      - OPENSKY_CLIENT_ID
      - OPENSKY_CLIENT_SECRET
      - LTA_ACCOUNT_KEY

secrets:
  AIS_API_KEY:
    file: ./secrets/ais_api_key.txt
  OPENSKY_CLIENT_ID:
    file: ./secrets/opensky_client_id.txt
  OPENSKY_CLIENT_SECRET:
    file: ./secrets/opensky_client_secret.txt
  LTA_ACCOUNT_KEY:
    file: ./secrets/lta_account_key.txt
```
Each secret file should contain only the raw key value (whitespace is trimmed).
## Notes
- The secrets loop runs **before** any FastAPI service imports, so modules
that read `os.environ` at import time see the injected values.
- Missing or empty secret files log a warning; the backend still starts.
- You can mix approaches: use `_FILE` for some keys and plain env vars for others.
+22
@@ -442,6 +442,27 @@ This starts:
* **Next.js** frontend on `http://localhost:3000`
* **FastAPI** backend on `http://localhost:8000`
### Local AIS Receiver (Optional)
You can feed your own AIS ship data into ShadowBroker using an RTL-SDR dongle and [AIS-catcher](https://github.com/jvde-github/AIS-catcher), an open-source AIS decoder. This gives you real-time coverage of vessels in your local area — no API key needed.
1. Plug in an RTL-SDR dongle
2. Install AIS-catcher ([releases](https://github.com/jvde-github/AIS-catcher/releases)) or use the Docker image:
```bash
docker run -d --device /dev/bus/usb \
ghcr.io/jvde-github/ais-catcher -H http://host.docker.internal:4000/api/ais/feed interval 10
```
3. Or run natively:
```bash
AIS-catcher -H http://localhost:4000/api/ais/feed interval 10
```
AIS-catcher decodes VHF radio signals on 161.975 MHz and 162.025 MHz and POSTs decoded vessel data to ShadowBroker every 10 seconds. Ships detected by your SDR antenna appear alongside the global AIS stream.
**Docker (ARM/Raspberry Pi):** See [docker-shipfeeder](https://github.com/sdr-enthusiasts/docker-shipfeeder) for a production-ready Docker image optimized for ARM.
**Note:** AIS range depends on your antenna — typically 20-40 nautical miles with a basic setup, 60+ nm with a marine VHF antenna at elevation.
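If you want to verify the ingest path without an SDR attached, you can POST a single hand-built message to the feed endpoint. This is a sketch only: the field names below match what the backend's `ingest_ais_catcher()` reads (`mmsi`, `type`, `lat`, `lon`, `speed`, `course`, `heading`, `shipname`), but the exact JSON wrapper AIS-catcher itself sends may differ.

```python
# Hypothetical smoke test for the AIS-catcher feed endpoint (not part of the repo).
import requests

sample_msgs = [{
    "mmsi": 244660920,                 # made-up example MMSI
    "type": 1,                         # Class A position report
    "lat": 52.37, "lon": 4.90,
    "speed": 11.2, "course": 87.0, "heading": 88,
    "shipname": "TEST VESSEL",
}]

resp = requests.post("http://localhost:4000/api/ais/feed", json=sample_msgs, timeout=5)
print(resp.status_code, resp.json())   # expect something like {"status": "ok", "ingested": 1}
```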
---
## 🎛️ Data Layers
@@ -459,6 +480,7 @@ All layers are independently toggleable from the left panel:
| Carriers / Mil / Cargo | ✅ ON | Navy carriers, cargo ships, tankers |
| Civilian Vessels | ❌ OFF | Yachts, fishing, recreational |
| Cruise / Passenger | ✅ ON | Cruise ships and ferries |
| Tracked Yachts | ✅ ON | Billionaire & oligarch superyachts (Yacht-Alert DB) |
| Earthquakes (24h) | ✅ ON | USGS seismic events |
| CCTV Mesh | ❌ OFF | Surveillance camera network |
| Ukraine Frontline | ✅ ON | Live warfront positions |
-853
@@ -1,853 +0,0 @@
# ShadowBroker Engineering Roadmap
> **Version**: 1.0 | **Created**: 2026-03-12 | **Codebase**: v0.8.0
> **Purpose**: Structured, agent-executable roadmap to bring ShadowBroker to production-grade quality.
> **How to use**: Each task is an atomic unit of work. An AI agent or developer can pick any task whose dependencies are met and execute it independently. Mark tasks `[x]` when complete.
---
## Architecture Overview
```
live-risk-dashboard/
  frontend/                    # Next.js 16 + React 19 + MapLibre GL
    src/app/page.tsx           # 621 LOC — dashboard orchestrator (19 state vars, 33 hooks)
    src/components/
      MaplibreViewer.tsx       # 3,065 LOC — GOD COMPONENT (map + all layers + icons + popups)
      CesiumViewer.tsx         # 1,813 LOC — DEAD CODE (never imported)
      NewsFeed.tsx             # 1,088 LOC — news + entity detail panels
      + 15 more components
    next.config.ts             # ignoreBuildErrors: true, ignoreDuringBuilds: true (!!!)
  backend/                     # Python FastAPI + Node.js AIS proxy
    main.py                    # 315 LOC — FastAPI app entry
    services/
      data_fetcher.py          # 2,417 LOC — GOD MODULE (15+ data sources in one file)
      ais_stream.py            # 367 LOC — WebSocket AIS client
      + 10 more service modules
    test_*.py (26 files)       # ALL manual print-based, zero assertions, zero pytest
  docker-compose.yml           # No health checks, no resource limits
  .github/workflows/docker-publish.yml   # No test step, no image scanning
```
---
## Scoring Baseline (Pre-Roadmap)
| Category | Score | Key Issue |
|----------|-------|-----------|
| Thread Safety | 3/10 | Race conditions on `routes_fetch_in_progress`, unguarded `latest_data` writes |
| Type Safety | 2/10 | 50+ `any` types, TS/ESLint errors hidden by config flags |
| Testing | 0/10 | Zero automated tests, 26 manual print scripts |
| Error Handling | 4/10 | Bare `except: pass` clauses, no error boundaries on panels |
| Architecture | 3/10 | Two god files (3065 + 2417 LOC), massive prop drilling |
| DevOps | 5/10 | Good Docker multi-arch, but no health checks/limits/scanning |
| Security | 4/10 | No rate limiting, no input validation, no HTTPS docs |
| Accessibility | 1/10 | No ARIA labels, no keyboard nav, no semantic HTML |
| **Overall** | **3.5/10** | Production-adjacent, not production-ready |
---
## Phase 1: Stabilization & Safety
**Goal**: Fix things that silently corrupt data, hide bugs, or could cause production incidents. Every task here has outsized impact relative to effort.
**All Phase 1 tasks are independent and can be executed in parallel.**
---
### Task 1.1: Fix thread safety bugs in data_fetcher.py
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P0 — data corruption risk |
| **Dependencies** | None |
**File**: `backend/services/data_fetcher.py`
**Problem**: `routes_fetch_in_progress` (~line 645) is a bare global boolean read/written from multiple threads with no lock. `latest_data` is written at ~lines 599, 627, 639 without `_data_lock`. These are TOCTOU race conditions.
**Scope**:
1. Add a `_routes_lock = threading.Lock()` and wrap all reads/writes of `routes_fetch_in_progress` and `dynamic_routes_cache` with it. The current pattern (`if routes_fetch_in_progress: return; routes_fetch_in_progress = True`) is a classic TOCTOU race.
2. Find every `latest_data[...] = ...` assignment NOT already under `_data_lock` and wrap it. Search pattern: `latest_data\[`.
3. Audit `_trails_lock` usage — ensure `flight_trails` dict is never accessed outside the lock. Check all references beyond the lock at ~line 1187.
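A minimal sketch of the lock pattern from item 1 (helper names are illustrative, not the module's existing API):

```python
import threading

_routes_lock = threading.Lock()
routes_fetch_in_progress = False  # only read or written while holding _routes_lock

def try_begin_routes_fetch() -> bool:
    """Atomically claim the routes fetch; returns False if another thread already owns it."""
    global routes_fetch_in_progress
    with _routes_lock:
        if routes_fetch_in_progress:
            return False
        routes_fetch_in_progress = True
        return True

def end_routes_fetch() -> None:
    """Release the claim when the fetch finishes (success or failure)."""
    global routes_fetch_in_progress
    with _routes_lock:
        routes_fetch_in_progress = False
```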
**Verification**:
```bash
# Every latest_data write should be inside a lock
grep -n "latest_data\[" backend/services/data_fetcher.py
# Confirm routes_fetch_in_progress is no longer a bare boolean check
grep -n "routes_fetch_in_progress" backend/services/data_fetcher.py
```
All writes should be inside `with _data_lock:` or `with _routes_lock:` blocks.
---
### Task 1.2: Replace bare except clauses with specific exceptions
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | XS (30min) |
| **Priority** | P0 — swallows KeyboardInterrupt, SystemExit |
| **Dependencies** | None |
**Files**:
- `backend/services/cctv_pipeline.py` ~line 223: `except:` → `except (ValueError, TypeError) as e:` + `logger.debug()`
- `backend/services/liveuamap_scraper.py` ~lines 43, 59: `except:` → `except Exception as e:` + `logger.debug()`
- `backend/services/data_fetcher.py` ~lines 705-706: `except Exception: pass` → add `logger.warning()`
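The intended shape of the fix, as a standalone sketch (the real call sites parse different values):

```python
import logging

logger = logging.getLogger(__name__)

def parse_float(raw) -> float | None:
    """Catch only the exceptions that can actually occur here, and leave a trace."""
    try:
        return float(raw)
    except (ValueError, TypeError) as exc:   # previously: bare `except:` + `pass`
        logger.debug("Could not parse %r as float: %s", raw, exc)
        return None
```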
**Verification**:
```bash
# Must return ZERO matches
grep -rn "except:" backend/ --include="*.py" | grep -v "except Exception" | grep -v "except ("
# Also check for silent swallows
grep -rn "except.*: pass" backend/ --include="*.py"
```
---
### Task 1.3: Re-enable TypeScript and ESLint checking
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | M (3-6h) |
| **Priority** | P0 — currently hiding ALL type errors and lint violations |
| **Dependencies** | None (but pairs well with Phase 2 decomposition) |
**Files**:
- `frontend/next.config.ts` — remove `typescript: { ignoreBuildErrors: true }` and `eslint: { ignoreDuringBuilds: true }`
- `frontend/package.json` — fix lint script from `"lint": "eslint"` to `"lint": "next lint"` or `"lint": "eslint src/"`
**Scope**:
1. Run `npx tsc --noEmit` in `frontend/` and record all errors.
2. Fix type errors file by file. The heaviest offenders:
- `MaplibreViewer.tsx`: ~55 occurrences of `: any` — create proper interfaces for props, GeoJSON features, events.
- `page.tsx`: state types need explicit interfaces.
3. Replace `any` with proper interfaces. Key types needed:
```typescript
interface DataPayload { commercial_flights: Flight[]; military_flights: Flight[]; satellites: Satellite[]; ... }
interface Flight { hex: string; lat: number; lon: number; alt_baro: number; ... }
interface MaplibreViewerProps { data: DataPayload; activeLayers: ActiveLayers; ... }
```
4. Only after ALL errors are fixed, remove the two `ignore*` flags from `next.config.ts`.
5. Fix the lint script and run `npm run lint` clean.
**Verification**:
```bash
cd frontend && npx tsc --noEmit # Must exit 0
cd frontend && npm run lint # Must exit 0
cd frontend && npm run build # Must succeed WITHOUT ignoreBuildErrors
```
---
### Task 1.4: Add transaction safety to cctv_pipeline.py
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | XS (30min) |
| **Priority** | P1 |
| **Dependencies** | None |
**File**: `backend/services/cctv_pipeline.py`
**Scope**: Wrap all SQLite write operations in try/except with explicit `conn.rollback()` on failure. Currently if an insert fails midway, the connection may be left dirty.
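One possible shape for the write path (a sketch; the table name and columns here are placeholders, not the pipeline's actual schema):

```python
import logging
import sqlite3

logger = logging.getLogger(__name__)

def insert_cameras(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    """Write a batch atomically; sqlite3's connection context manager commits on
    success and rolls back if any statement raises."""
    try:
        with conn:
            conn.executemany(
                "INSERT INTO cameras (id, lat, lon, url) VALUES (?, ?, ?, ?)", rows
            )
    except sqlite3.DatabaseError as exc:
        logger.error("CCTV insert failed, transaction rolled back: %s", exc)
        raise
```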
**Verification**: Search for all `conn.execute` / `cursor.execute` calls and confirm each write path has rollback handling.
---
### Task 1.5: Add rate limiting and input validation to backend API
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P1 — security exposure |
| **Dependencies** | None |
**File**: `backend/main.py`
**Scope**:
1. Add a simple in-memory rate limiter (e.g., `slowapi` or custom middleware). Target: 60 req/min per IP for data endpoints.
2. Add Pydantic validation for coordinate parameters on all endpoints that accept lat/lng:
```python
from pydantic import Field, confloat
lat: confloat(ge=-90, le=90)
lng: confloat(ge=-180, le=180)
```
3. Add `slowapi` to `requirements.txt` if used.
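A sketch of the `slowapi` wiring for item 1 (the endpoint shown is a stand-in; the real app decorates its existing routes):

```python
from fastapi import FastAPI, Request
from slowapi import Limiter, _rate_limit_exceeded_handler
from slowapi.errors import RateLimitExceeded
from slowapi.util import get_remote_address

limiter = Limiter(key_func=get_remote_address)   # rate-limit per client IP
app = FastAPI()
app.state.limiter = limiter
app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)

@app.get("/api/live-data/fast")
@limiter.limit("60/minute")                      # HTTP 429 beyond 60 req/min per IP
async def live_data_fast(request: Request):      # slowapi requires the Request argument
    return {"status": "ok"}
```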
**Verification**:
```bash
# Rate limit test: 100 rapid requests should get 429 after ~60
for i in $(seq 1 100); do curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000/api/live-data/fast; done | sort | uniq -c
# Validation test: invalid coords should return 422
curl -s "http://localhost:8000/api/region-dossier?lat=999&lng=999" | grep -c "error"
```
---
### Task 1.6: Delete dead code
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | XS (30min) |
| **Priority** | P1 |
| **Dependencies** | None |
**Files to delete**:
- `frontend/src/components/CesiumViewer.tsx` — 1,813 LOC, never imported anywhere
- Root one-off scripts: `refactor_cesium.py`, `zip_repo.py`, `jobs.json` (if tracked)
- Backend one-off scripts: `check_regions.py`, `analyze_xlsx.py`, `clean_osm_cctvs.py`, `extract_ovens.py`, `geocode_datacenters.py` (if tracked and not gitignored)
**Also**:
- Remove `fetch_bikeshare()` function from `data_fetcher.py` and its scheduler entry (if bikeshare layer no longer exists in the UI)
**Verification**:
```bash
grep -rn "CesiumViewer" frontend/src/ # Must return 0 matches
grep -rn "fetch_bikeshare" backend/ # Must return 0 matches
cd frontend && npm run build # Must succeed
```
---
## Phase 2: Frontend Architecture — God Component Decomposition
**Goal**: Break `MaplibreViewer.tsx` (3,065 LOC) and `page.tsx` (621 LOC) into maintainable, testable units. This is the highest-impact refactor in the entire codebase.
**Dependency chain**: `2.1 + 2.2` (parallel) → `2.3` → `2.4` → `2.5`
---
### Task 2.1: Extract SVG icons and aircraft classification
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P1 |
| **Dependencies** | None |
**Source**: `frontend/src/components/MaplibreViewer.tsx`
**New files to create**:
| File | Content | Source Lines |
|------|---------|-------------|
| `frontend/src/components/map/icons/AircraftIcons.ts` | All SVG path data constants (plane, heli, turboprop silhouettes) | ~1-150 |
| `frontend/src/components/map/icons/SvgMarkers.ts` | SVG factory functions (`makeFireSvg`, `makeAircraftSvg`, etc.) | ~60-91 |
| `frontend/src/utils/aircraftClassification.ts` | Military/private/commercial classifier function | ~163-169 |
**Scope**: Pure extraction — move constants and pure functions out. No logic changes. Update imports in MaplibreViewer.
**Verification**: `wc -l frontend/src/components/MaplibreViewer.tsx` decreases by ~200. `npm run build` succeeds.
---
### Task 2.2: Extract map utilities and style definitions
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P1 |
| **Dependencies** | None (parallel with 2.1) |
**Source**: `frontend/src/components/MaplibreViewer.tsx`
**New files to create**:
| File | Content | Source Lines |
|------|---------|-------------|
| `frontend/src/utils/positioning.ts` | Interpolation helpers (lerp, bearing calc) | ~171-193 |
| `frontend/src/components/map/styles/mapStyles.ts` | Dark/light/satellite/FLIR/NVG/CRT style URL definitions | ~195-235 |
**Scope**: Pure extraction of stateless helpers.
**Verification**: Build succeeds. Grep confirms moved functions are only defined in the new files.
---
### Task 2.3: Extract custom hooks from MaplibreViewer
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | M (3-6h) |
| **Priority** | P1 |
| **Dependencies** | Tasks 2.1, 2.2 |
**Source**: `frontend/src/components/MaplibreViewer.tsx`
**New files to create**:
| File | Content | Source Lines |
|------|---------|-------------|
| `frontend/src/hooks/useImperativeSource.ts` | The `useImperativeSource` hook for direct MapLibre source updates | ~268-285 |
| `frontend/src/hooks/useMapDataLayers.ts` | GeoJSON builder `useMemo` hooks (earthquakes, jamming, CCTV, data centers, fires, outages, KiwiSDR) | ~405-582 |
| `frontend/src/hooks/useMapImages.ts` | Image loading system for `onMapLoad` callback | ~585-720 |
| `frontend/src/hooks/useTrafficGeoJSON.ts` | Flight/ship/satellite GeoJSON construction with interpolation | ~784-900 |
**Scope**: Each hook accepts the map instance ref and relevant data as parameters and returns GeoJSON/state. Must handle the `map.getSource()` / `src.setData()` imperative pattern cleanly.
**Verification**: `wc -l frontend/src/components/MaplibreViewer.tsx` is under 1,500 LOC. All map layers still render correctly (manual visual check required).
---
### Task 2.4: Extract HTML label rendering into MapMarkers component
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P2 |
| **Dependencies** | Task 2.3 |
**Source**: `frontend/src/components/MaplibreViewer.tsx` ~lines 1800-1910
**New file**: `frontend/src/components/map/MapMarkers.tsx`
**Scope**: Move the HTML overlay rendering (flight labels, carrier labels, tracked aircraft labels, cluster count badges) into a dedicated component. Receives position arrays via props.
**Verification**: Labels still appear on map. `MaplibreViewer.tsx` drops below 1,200 LOC.
---
### Task 2.5: Introduce React Context for shared dashboard state
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | M (3-6h) |
| **Priority** | P1 |
| **Dependencies** | Tasks 2.1-2.4 (reduces merge conflicts) |
**Source**: `frontend/src/app/page.tsx` (621 LOC, 19 state variables, 33 hooks)
**New files to create**:
| File | Content |
|------|---------|
| `frontend/src/contexts/DashboardContext.tsx` | Context provider: `activeLayers`, `activeFilters`, `selectedEntity`, `eavesdrop` state, `effects`, `activeStyle`, `measureMode` |
| `frontend/src/hooks/useDataPolling.ts` | Data fetch interval logic (fast/slow ETag polling, currently inline in page.tsx) |
| `frontend/src/hooks/useGeocoding.ts` | LocateBar geocoding logic (Nominatim reverse geocoding on mouse move, currently inline in page.tsx) |
**Scope**:
1. Create `DashboardContext` wrapping the 19+ state variables.
2. Move the `LocateBar` inline component (defined inside page.tsx at ~line 26) into its own file.
3. Replace prop drilling to 9 child components with context consumption.
4. `page.tsx` becomes a thin layout shell under 150 LOC.
**Verification**: `wc -l frontend/src/app/page.tsx` is under 150. All panels still receive their data. No prop names in JSX return that were previously drilled.
---
## Phase 3: Backend Architecture — God Module Decomposition
**Goal**: Break `data_fetcher.py` (2,417 LOC) into per-source modules with proper error handling and bounded caches.
**Dependency**: Task 3.1 depends on Task 1.1 (thread safety fixes first). Tasks 3.2-3.4 can start after 3.1 or independently.
---
### Task 3.1: Split data_fetcher.py into per-source fetcher modules
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | L (6-12h) |
| **Priority** | P1 |
| **Dependencies** | Task 1.1 (lock pattern must be correct before splitting) |
**Source**: `backend/services/data_fetcher.py` (2,417 LOC)
**New directory structure**:
```
backend/services/fetchers/
__init__.py # Re-exports for backward compat
store.py # latest_data, _data_lock, source_timestamps, get_latest_data()
scheduler.py # start_scheduler(), stop_scheduler(), APScheduler wiring
flights.py # OpenSky client, ADS-B fetch, route lookup, military classification, POTUS fleet
ships.py # AIS data processing, vessel categorization
satellites.py # TLE parsing, SGP4 propagation
news.py # RSS feeds, risk scoring, clustering
markets.py # yfinance stocks, oil prices
weather.py # RainViewer, space weather (NOAA SWPC)
infrastructure.py # CCTV, KiwiSDR, internet outages (IODA), data centers
geospatial.py # Earthquakes (USGS), FIRMS fires, GPS jamming
```
**Scope**:
1. Each fetcher module exports a `fetch_*()` function.
2. `store.py` holds `latest_data`, `_data_lock`, `source_timestamps`, and `get_latest_data()`.
3. `scheduler.py` imports all fetchers and wires them to APScheduler jobs.
4. The original `data_fetcher.py` becomes a thin re-export shim so `main.py` imports remain unchanged:
```python
from .fetchers.scheduler import start_scheduler, stop_scheduler
from .fetchers.store import get_latest_data, latest_data
```
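A possible shape for the `scheduler.py` from item 3 (job names and intervals are illustrative; the real wiring should carry over the existing APScheduler setup):

```python
# backend/services/fetchers/scheduler.py (sketch)
from apscheduler.schedulers.background import BackgroundScheduler

from . import flights, geospatial, ships  # hypothetical fetcher modules from the split

_scheduler: BackgroundScheduler | None = None

def start_scheduler() -> None:
    global _scheduler
    _scheduler = BackgroundScheduler()
    _scheduler.add_job(flights.fetch_flights, "interval", seconds=60, max_instances=1)
    _scheduler.add_job(ships.fetch_ships, "interval", seconds=60, max_instances=1)
    _scheduler.add_job(geospatial.fetch_earthquakes, "interval", minutes=30, max_instances=1)
    _scheduler.start()

def stop_scheduler() -> None:
    if _scheduler:
        _scheduler.shutdown(wait=False)
```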
**Verification**:
```bash
wc -l backend/services/data_fetcher.py # Should be under 50 (shim only)
python -c "from services.data_fetcher import start_scheduler, stop_scheduler, get_latest_data" # Must succeed
# Start backend and confirm data flows through all endpoints
```
---
### Task 3.2: Add TTL and max-size bounds to all caches
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P1 |
| **Dependencies** | Task 3.1 (cleaner after split, but can be done before) |
**Files**: `backend/services/data_fetcher.py` (or the new fetcher modules after 3.1)
**Problem caches**:
- `_region_geocode_cache` (~line 1600): unbounded dict, no TTL, grows forever
- `dynamic_routes_cache` (~line 644): has manual pruning but should use `cachetools`
**Scope**: Replace unbounded dicts with `cachetools.TTLCache`:
```python
from cachetools import TTLCache
_region_geocode_cache = TTLCache(maxsize=2000, ttl=86400) # 24h
dynamic_routes_cache = TTLCache(maxsize=5000, ttl=7200) # 2h
```
`cachetools` is already in `requirements.txt`.
**Verification**: After running for 1 hour, `len(cache)` stays bounded.
---
### Task 3.3: Replace bare Exception catches with specific types and structured logging
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P2 |
| **Dependencies** | Task 1.2, Task 3.1 |
**Files**: All `backend/services/*.py`
**Scope**:
1. Replace `except Exception as e: logger.error(...)` with specific exceptions where possible: `requests.RequestException`, `json.JSONDecodeError`, `ValueError`, `KeyError`.
2. Add structured context to log messages: data source name, URL, HTTP status code.
3. Ensure zero `except Exception: pass` patterns remain.
**Verification**:
```bash
grep -rn "except Exception: pass" backend/ # Must return 0
grep -rn "except:" backend/ --include="*.py" | grep -v "except Exception" | grep -v "except (" # Must return 0
```
---
### Task 3.4: Pin all Python dependencies and audit fragile ones
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P2 |
| **Dependencies** | None |
**File**: `backend/requirements.txt`
**Scope**:
1. Pin all dependencies to exact versions (run `pip freeze` from working venv).
2. Evaluate `cloudscraper` — if only used in one fetcher, document clearly or consider removal.
3. Evaluate `playwright` — if only used by `liveuamap_scraper.py`, document and consider making it optional (it pulls ~150MB of browsers).
4. Create `backend/requirements-dev.txt` for test dependencies: `pytest`, `httpx`, `pytest-asyncio`.
**Verification**:
```bash
pip install -r requirements.txt # In fresh venv, must succeed deterministically
pip check # Must report no conflicts
```
---
## Phase 4: Testing Infrastructure
**Goal**: Go from zero automated tests to a meaningful suite that catches regressions.
**Dependency**: Task 4.2 depends on Phase 2 (extracted hooks are what make frontend testing feasible). Task 4.3 depends on 4.1 and 4.2.
---
### Task 4.1: Set up pytest for backend and write smoke tests
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | M (3-6h) |
| **Priority** | P1 |
| **Dependencies** | None (but benefits from Task 3.1) |
**New files**:
- `backend/tests/__init__.py`
- `backend/tests/conftest.py` — FastAPI test client fixture using `httpx.AsyncClient`
- `backend/tests/test_api_smoke.py` — smoke tests for every endpoint in `main.py`
- `backend/pytest.ini` or `pyproject.toml` pytest section
- `backend/requirements-dev.txt` — `pytest`, `httpx`, `pytest-asyncio`
**Scope**:
1. Create proper test infrastructure with fixtures.
2. Write smoke tests: assert 200 status, valid JSON, expected top-level keys for every endpoint.
3. Archive or delete the 26 manual `test_*.py` files (move to `backend/tests/_archived/` if keeping for reference).
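A possible starting point (module paths are assumptions; adjust the import to wherever the FastAPI `app` actually lives):

```python
# backend/tests/conftest.py (sketch)
import pytest_asyncio
from httpx import ASGITransport, AsyncClient

from main import app  # assumes backend/ is on sys.path when pytest runs

@pytest_asyncio.fixture
async def client():
    transport = ASGITransport(app=app)
    async with AsyncClient(transport=transport, base_url="http://test") as c:
        yield c

# backend/tests/test_api_smoke.py (sketch)
import pytest

@pytest.mark.asyncio
async def test_live_data_fast_returns_expected_keys(client):
    resp = await client.get("/api/live-data/fast")
    assert resp.status_code == 200
    body = resp.json()
    assert "ships" in body and "satellites" in body
```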
**Verification**:
```bash
cd backend && pip install -r requirements-dev.txt && pytest tests/ -v
# At least 10 tests green
```
---
### Task 4.2: Set up Vitest for frontend and write component tests
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | M (3-6h) |
| **Priority** | P2 |
| **Dependencies** | Phase 2 (extracted hooks/utils are what make testing feasible) |
**New files**:
- `frontend/vitest.config.ts`
- `frontend/src/__tests__/` directory
- Tests for: utility functions (aircraftClassification, positioning), ErrorBoundary, FilterPanel, MarketsPanel
**Scope**:
1. Install `vitest`, `@testing-library/react`, `@testing-library/jest-dom`, `jsdom` as devDeps.
2. Add `"test": "vitest run"` script to `package.json`.
3. Write tests for pure utility functions first (from Phase 2 extractions).
4. Write render tests for at least 3 components.
5. Do NOT test MaplibreViewer directly (needs GL context mock).
**Verification**:
```bash
cd frontend && npx vitest run # At least 8 tests green
```
---
### Task 4.3: Add test steps to CI pipeline
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P1 |
| **Dependencies** | Tasks 4.1, 4.2 |
**File**: `.github/workflows/docker-publish.yml`
**Scope**:
1. Add a `test` job that runs before build jobs.
2. Backend: `pip install -r requirements.txt -r requirements-dev.txt && pytest tests/ -v`
3. Frontend: `npm ci && npm run lint && npm run build && npx vitest run`
4. Make `build-frontend` and `build-backend` depend on `test` job.
**Verification**: Push a branch with a failing test → CI fails and blocks Docker build.
---
## Phase 5: DevOps Hardening
**Goal**: Production-grade container config, proper `.dockerignore`, health checks, graceful shutdown.
**All Phase 5 tasks are independent and can be executed in parallel.**
---
### Task 5.1: Add Docker health checks and resource limits
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P2 |
| **Dependencies** | None |
**File**: `docker-compose.yml`
**Scope**:
1. Backend healthcheck: `test: ["CMD", "curl", "-f", "http://localhost:8000/api/live-data/fast"]`, interval 30s, timeout 10s, retries 3, start_period 15s.
2. Frontend healthcheck: `test: ["CMD", "curl", "-f", "http://localhost:3000/"]`, interval 30s, timeout 10s, retries 3, start_period 20s.
3. Resource limits: backend 2GB memory / 2 CPUs, frontend 512MB memory / 1 CPU.
4. Frontend `depends_on: backend: condition: service_healthy`.
**Verification**:
```bash
docker compose up -d
docker ps # Shows health status column
# Kill backend process inside container, confirm Docker restarts it
```
---
### Task 5.2: Create .dockerignore and fix backend Dockerfile
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | XS (30min) |
| **Priority** | P2 |
| **Dependencies** | None |
**Files**:
- New: `backend/.dockerignore` — exclude `test_*.py`, `*.json` (except `package*.json`, `news_feeds.json`), `*.html`, `*.xlsx`, debug outputs
- New: `.dockerignore` (root) — exclude `node_modules`, `.next`, `venv`, `.git`, `*.db`, `*.xlsx`, debug JSONs
- Modify: `backend/Dockerfile` — change `npm install` to `npm ci` (~line 19)
**Verification**:
```bash
docker build ./backend # Image under 500MB
docker run --rm <image> ls /app/ # No debug files visible
```
---
### Task 5.3: Add signal trapping for graceful shutdown in start scripts
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | XS (30min) |
| **Priority** | P2 |
| **Dependencies** | None |
**Files**:
- `start.sh` — add `trap 'kill 0' EXIT SIGINT SIGTERM` near the top
- `start.bat` — add error checking after `call npm run dev`
**Verification**: Start app → Ctrl+C → confirm no orphan node/python processes remain (`ps aux | grep -E "node|python"` on Unix, Task Manager on Windows).
---
### Task 5.4: Clean root directory clutter and update .gitignore
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | XS (30min) |
| **Priority** | P3 |
| **Dependencies** | None |
**Files**: `.gitignore` + root directory
**Scope**:
1. Run `git rm --cached` on any tracked files that should be ignored: `TheAirTraffic Database.xlsx`, `zip_repo.py`, etc.
2. Add missing patterns to `.gitignore`: `*.swp`, `*.swo`, `coverage/`, `.coverage`, `dist/`, `build/`, `*.tar.gz`
3. Confirm all backend debug files (`tmp_fast.json`, `dump.json`, `debug_fast.json`, `merged.txt`) are gitignored.
**Verification**:
```bash
git status # No large untracked files
git ls-files | xargs wc -c | sort -rn | head -20 # No file over 500KB tracked
```
---
### Task 5.5: Document Docker secrets configuration
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | XS (30min) |
| **Priority** | P3 |
| **Dependencies** | None |
**File**: `README.md`
**Scope**: Add a section documenting the Docker Swarm secrets support already implemented in `main.py` (lines 8-36). The `_SECRET_VARS` list supports `_FILE` suffix convention for: `AIS_API_KEY`, `OPENSKY_CLIENT_ID`, `OPENSKY_CLIENT_SECRET`, `LTA_ACCOUNT_KEY`, `CORS_ORIGINS`. Include a `docker-compose.yml` secrets example.
**Verification**: The README section exists and matches the `_SECRET_VARS` list in `main.py`.
---
## Phase 6: Long-term Quality & Accessibility
**Goal**: Address code quality, accessibility, and developer experience improvements that compound over time.
**Dependencies**: 6.1 depends on Phase 2. Others are independent.
---
### Task 6.1: Replace inline styles with Tailwind classes
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | L (6-12h) |
| **Priority** | P3 |
| **Dependencies** | Phase 2 (much easier after component decomposition) |
**Files**: All components in `frontend/src/components/`
**Scope**:
1. Audit all `style={{...}}` occurrences. Heaviest offenders: MaplibreViewer.tsx, NewsFeed.tsx, FilterPanel.tsx.
2. Convert inline styles to Tailwind utility classes.
3. For dynamic values (e.g., `style={{ left: x + 'px' }}`), keep as inline but extract repeated patterns to `globals.css`:
```css
.marker-label { @apply text-xs font-mono font-bold text-white pointer-events-none; text-shadow: 0 0 3px #000; }
.carrier-label { @apply text-xs font-mono font-bold text-amber-400 pointer-events-none; text-shadow: 0 0 3px #000; }
```
4. CSS variables (`var(--...)`) can stay as-is for theme integration.
**Verification**:
```bash
grep -rn "style={{" frontend/src/components/ | wc -l # Count should decrease by 70%+
npm run build # Must succeed
```
---
### Task 6.2: Add error boundaries to all child panels
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P2 |
| **Dependencies** | None (but cleaner after Task 2.5) |
**Files**:
- `frontend/src/components/ErrorBoundary.tsx` (already exists, reuse it)
- `frontend/src/app/page.tsx` (or post-refactor layout component)
**Scope**: Wrap every child panel with `<ErrorBoundary name="PanelName">`:
- FilterPanel, NewsFeed, RadioInterceptPanel, MarketsPanel
- WorldviewLeftPanel, WorldviewRightPanel
- SettingsPanel, MapLegend
**Verification**: Add `throw new Error("test")` to MarketsPanel render → confirm error boundary catches it, other panels remain functional. Remove the throw after testing.
---
### Task 6.3: Add basic accessibility (ARIA labels, keyboard navigation)
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | M (3-6h) |
| **Priority** | P3 |
| **Dependencies** | None (easier after Phase 2) |
**Files**: All components in `frontend/src/components/`
**Scope**:
1. `aria-label` on all buttons, toggles, inputs.
2. `role` attributes on panel containers (`role="complementary"`, `role="navigation"`).
3. `aria-pressed` on toggle buttons, `aria-expanded` on collapsible panels.
4. Keyboard handlers: Escape to close modals/panels, Enter to confirm.
5. `tabIndex` on custom interactive elements.
6. Focus management: modal open → focus modal, close → focus trigger.
**Verification**: Run Axe accessibility browser extension on running dashboard → zero critical violations. Tab through UI → all interactive elements reachable.
---
### Task 6.4: Add image scanning and SBOM generation to CI
- [ ] **Complete**
| Field | Value |
|-------|-------|
| **Effort** | S (1-3h) |
| **Priority** | P3 |
| **Dependencies** | Task 4.3 |
**File**: `.github/workflows/docker-publish.yml`
**Scope**:
1. Add Trivy scan step after Docker build: `uses: aquasecurity/trivy-action@master` with `severity: CRITICAL,HIGH`.
2. Add SBOM generation using `anchore/sbom-action`, upload as build artifact.
3. PRs: scan but don't fail. Pushes to main: scan and fail on critical.
**Verification**: CI shows Trivy results in PR checks. Image with known CVE fails the build.
---
## Dependency Graph
```
PHASE 1 (all parallel)
1.1 1.2 1.3 1.4 1.5 1.6
|
v
PHASE 2: 2.1 + 2.2 (parallel) ──> 2.3 ──> 2.4 ──> 2.5
|
PHASE 3: 3.1 (needs 1.1) ──> 3.2 + 3.3 (parallel)
3.4 (independent)
|
PHASE 4: 4.1 (independent) + 4.2 (needs Phase 2) ──> 4.3
|
PHASE 5 (all parallel)
5.1 5.2 5.3 5.4 5.5
|
PHASE 6: 6.1 (needs Phase 2) 6.2 6.3 6.4 (needs 4.3)
```
---
## Effort Summary
| Size | Count | Hours Each | Total Hours |
|------|-------|-----------|-------------|
| XS | 6 | 0.5-1h | 3-6h |
| S | 10 | 1-3h | 10-30h |
| M | 5 | 3-6h | 15-30h |
| L | 2 | 6-12h | 12-24h |
| **Total** | **23 tasks** | | **~40-90h** |
---
## Target Scores (Post-Roadmap)
| Category | Before | After | Delta |
|----------|--------|-------|-------|
| Thread Safety | 3/10 | 9/10 | +6 |
| Type Safety | 2/10 | 8/10 | +6 |
| Testing | 0/10 | 7/10 | +7 |
| Error Handling | 4/10 | 8/10 | +4 |
| Architecture | 3/10 | 8/10 | +5 |
| DevOps | 5/10 | 9/10 | +4 |
| Security | 4/10 | 7/10 | +3 |
| Accessibility | 1/10 | 6/10 | +5 |
| **Overall** | **3.5/10** | **8/10** | **+4.5** |
-257
@@ -1,257 +0,0 @@
# ShadowBroker Release Protocol
> This document exists because API keys were leaked in release zips v0.5.0, v0.6.0, and briefly v0.8.0.
> Follow this exactly. No shortcuts.
---
## Pre-Release Checklist
### 1. Bump the Version
- **`frontend/package.json`** — update `"version"` field
- **`frontend/src/components/ChangelogModal.tsx`** — update `CURRENT_VERSION` and `STORAGE_KEY`
- **Update `NEW_FEATURES`, `BUG_FIXES`, and `CONTRIBUTORS` arrays** in the changelog modal
### 2. Pull Remote Changes First
```bash
git pull --rebase origin main
```
If there are merge conflicts, resolve them carefully. **Do not blindly delete files during rebase** — this is how the API proxy route (`frontend/src/app/api/[...path]/route.ts`) was accidentally deleted and broke the entire app.
After resolving conflicts, verify critical files still exist:
```bash
ls frontend/src/app/api/\[...path\]/route.ts # API proxy — app is dead without this
ls backend/main.py
ls frontend/src/app/page.tsx
```
### 3. Test Before Committing
```bash
# Backend
cd backend && python -c "import main; print('Backend OK')"
# Frontend
cd frontend && npm run build
```
If the backend fails with a missing module, install it:
```bash
pip install -r requirements.txt
```
---
## Building the Release Zip
### The Command
Run from the project root (`live-risk-dashboard/`):
```bash
7z a -tzip ../ShadowBroker_vX.Y.Z.zip \
-xr!node_modules -xr!.next -xr!__pycache__ -xr!venv -xr!.git -xr!.git_backup \
-xr!*.pyc -xr!*.db -xr!*.sqlite -xr!*.xlsx \
-xr!.env -xr!.env.local -xr!.env.production -xr!.env.development \
-xr!carrier_cache.json -xr!ais_cache.json \
-xr!tmp_fast.json -xr!dump.json -xr!debug_fast.json \
-xr!nyc_sample.json -xr!nyc_full.json \
-xr!server_logs.txt -xr!server_logs2.txt -xr!xlsx_analysis.txt -xr!liveua_test.html \
-xr!merged.txt -xr!recent_commits.txt \
-xr!build_error.txt -xr!build_logs*.txt -xr!build_output.txt -xr!errors.txt \
-xr!geocode_log.txt -xr!tsconfig.tsbuildinfo \
-xr!ShadowBroker_v*.zip \
.
```
### Critical Exclusions (NEVER ship these)
| Pattern | Why |
|---------|-----|
| `.env` | **Contains real API keys** (OpenSky, AIS Stream) |
| `.env.local` | **Contains real API keys** (TomTom, etc.) |
| `.env.production` / `.env.development` | May contain secrets |
| `carrier_cache.json` / `ais_cache.json` | Runtime cache, not source |
| `node_modules/` / `__pycache__/` / `.next/` | Build artifacts |
| `*.db` / `*.sqlite` / `*.xlsx` | Data files, not source |
| `ShadowBroker_v*.zip` | Previous release zips sitting in the project dir |
### What SHOULD Be in the Zip
| File | Required |
|------|----------|
| `frontend/src/app/api/[...path]/route.ts` | **YES** — API proxy, app is dead without it |
| `backend/.env.example` | YES — template for users |
| `.env.example` | YES — template for users |
| `backend/data/plane_alert_db.json` | YES — aircraft database |
| `backend/data/datacenters*.json` | YES — data center layer |
| `backend/data/tracked_names.json` | YES — tracked aircraft names |
| `frontend/src/lib/airlines.json` | YES — airline codes |
| `start.bat` / `start.sh` | YES — launcher scripts |
### Do NOT Use
- **`git archive`** — includes tracked junk, misses untracked essential files
- **`Compress-Archive` (PowerShell)** — has lock file issues, no exclusion control
- **Gemini's zip script** — included test files, debug outputs, `.env` with real keys, and 30+ unnecessary files
---
## Post-Build Audit (MANDATORY)
**Before uploading, always scan the zip for leaks:**
```bash
# Check for .env files (should only show .env.example files)
7z l ShadowBroker_vX.Y.Z.zip | grep -i "\.env" | grep "....A"
# Check for anything with "secret", "key", "token", "credential" in the filename
7z l ShadowBroker_vX.Y.Z.zip | grep -iE "secret|api.key|credential|token" | grep "....A"
# Check the largest files (look for unexpected blobs)
7z l ShadowBroker_vX.Y.Z.zip | grep "....A" | awk '{print $4, $NF}' | sort -rn | head -15
# Verify the API proxy route exists
7z l ShadowBroker_vX.Y.Z.zip | grep "route.ts"
```
**Expected results:**
- `.env` files: ONLY `.env.example` and `next-env.d.ts`
- No files with "secret"/"credential" in the name
- Largest files: `plane_alert_db.json` (~4.6MB), `datacenters_geocoded.json` (~1.2MB), `airlines.json` (~800KB)
- `route.ts` exists under `frontend/src/app/api/[...path]/`
- **Total zip size: ~1.7MB** (as of v0.8.0). If it's 5MB+ something leaked.
---
## Commit, Tag, and Push
```bash
# Stage specific files (NEVER use git add -A)
git add <specific files>
# Commit
git commit -m "v0.X.0: brief description of release"
# Tag
git tag v0.X.0
# Push (pull first if remote has new commits)
git pull --rebase origin main
git push origin main --tags
# If the tag was created before rebase, re-tag on the new HEAD:
git tag -f v0.X.0
git push origin v0.X.0 --force
```
---
## Creating the GitHub Release
### Via GitHub API (when `gh` CLI is unavailable)
```python
# 1. Create the release
import urllib.request, json
body = {
"tag_name": "v0.X.0",
"name": "v0.X.0 — Title Here",
"body": "Release notes here...",
"draft": False,
"prerelease": False
}
# Write to a temp file to avoid JSON escaping hell in bash
with open("release_body.json", "w") as f:
json.dump(body, f)
# POST to GitHub API...
# 2. Upload the zip asset to the release
# Use the upload_url from the release response
```
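For reference, the elided POST step could look like this (a sketch built on the same `urllib` pattern as the audit script below; the endpoint and headers follow the GitHub REST API):

```python
import json
import urllib.request

TOKEN = "your_token"
body = {"tag_name": "v0.X.0", "name": "v0.X.0 - Title Here",
        "body": "Release notes here...", "draft": False, "prerelease": False}

req = urllib.request.Request(
    "https://api.github.com/repos/BigBodyCobain/Shadowbroker/releases",
    data=json.dumps(body).encode(),
    headers={"Authorization": f"token {TOKEN}",
             "Accept": "application/vnd.github+json",
             "Content-Type": "application/json"},
    method="POST",
)
release = json.loads(urllib.request.urlopen(req).read())
# Append ?name=ShadowBroker_v0.X.0.zip (dropping the {?name,label} template) and
# POST the zip bytes to this URL to attach the asset.
print(release["upload_url"])
```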
### Via `gh` CLI (if installed)
```bash
gh release create v0.X.0 ../ShadowBroker_v0.X.0.zip \
--title "v0.X.0 — Title" \
--notes-file RELEASE_NOTES.md
```
---
## Post-Release Verification
After uploading, download the release zip from GitHub and verify it:
```bash
# Download what GitHub is actually serving
curl -L -o /tmp/verify.zip "https://github.com/BigBodyCobain/Shadowbroker/releases/download/v0.X.0/ShadowBroker_v0.X.0.zip"
# Scan for leaks (same audit as above)
7z l /tmp/verify.zip | grep -i "\.env" | grep "....A"
# Compare hash to your local copy
md5sum /tmp/verify.zip ../ShadowBroker_v0.X.0.zip
```
---
## If You Discover a Leak
### Immediate Actions
1. **Rebuild the zip** without the leaked file
2. **Delete the old asset** from the GitHub release via API
3. **Upload the clean zip** as a replacement
4. **Rotate ALL leaked keys immediately:**
- OpenSky: https://opensky-network.org/
- AIS Stream: https://aisstream.io/
- Any other keys found in the leak
5. **Audit ALL other releases** — leaks tend to exist in multiple versions
### Audit All Releases Script
```python
import urllib.request, json
TOKEN = "your_token"
headers = {"Authorization": f"token {TOKEN}", "Accept": "application/vnd.github+json"}
# Get all releases
req = urllib.request.Request(
"https://api.github.com/repos/BigBodyCobain/Shadowbroker/releases",
headers=headers
)
releases = json.loads(urllib.request.urlopen(req).read())
for r in releases:
for asset in r.get("assets", []):
# Download via API
req2 = urllib.request.Request(
asset["url"],
headers={**headers, "Accept": "application/octet-stream"}
)
data = urllib.request.urlopen(req2).read()
filename = f"/tmp/{r['tag_name']}.zip"
with open(filename, "wb") as f:
f.write(data)
print(f"Downloaded {r['tag_name']}: {len(data)} bytes")
# Then run 7z l on each to check for .env files
```
---
## Lessons Learned (v0.8.0 Incident)
1. **Rebasing can silently delete files.** After `git pull --rebase`, always verify that critical files like the API proxy route still exist.
2. **The zip command must explicitly exclude `.env` and `.env.local`.** These files are not in `.gitignore` patterns that 7z understands — you must pass `-xr!.env -xr!.env.local` every time.
3. **Always audit the zip before uploading.** A 10-second grep saves a key rotation.
4. **Never trust another tool's zip output.** Gemini's zip included `.env` with real keys, 30+ test files, debug outputs, and sample JSON dumps.
5. **2,000+ stars means 2,000+ potential eyes on every release.** Treat every zip as if it will be decompiled line by line.
+3
@@ -18,3 +18,6 @@ AIS_API_KEY= # https://aisstream.io/ — free tier WebSocket key
# If unset, these endpoints remain open (fine for local dev).
# Set this in production and enter the same key in Settings → Admin Key.
# ADMIN_KEY=your-secret-admin-key-here
# LTA Singapore traffic cameras — leave blank to skip this data source.
# LTA_ACCOUNT_KEY=
+35 -18
@@ -1,4 +1,5 @@
const WebSocket = require('ws');
const readline = require('readline');
const args = process.argv.slice(2);
const API_KEY = args[0] || process.env.AIS_API_KEY;
@@ -8,22 +9,15 @@ if (!API_KEY) {
process.exit(1);
}
const FILTER = [
// US Aircraft Carriers and major naval groups
{ "MMSI": 338000000 }, { "MMSI": 338100000 }, // US Navy general prefixes
// Plus let's grab some global shipping for density
{ "BoundingBoxes": [[[-90, -180], [90, 180]]] }
];
// Start with global coverage, until frontend updates it
let currentBboxes = [[[-90, -180], [90, 180]]];
let activeWs = null;
function connect() {
const ws = new WebSocket('wss://stream.aisstream.io/v0/stream');
ws.on('open', () => {
function sendSub(ws) {
if (ws && ws.readyState === WebSocket.OPEN) {
const subMsg = {
APIKey: API_KEY,
BoundingBoxes: [
[[-90, -180], [90, 180]]
],
BoundingBoxes: currentBboxes,
FilterMessageTypes: [
"PositionReport",
"ShipStaticData",
@@ -31,17 +25,39 @@ function connect() {
]
};
ws.send(JSON.stringify(subMsg));
}
}
// Listen for dynamic bounding box updates via stdin from Python orchestrator
const rl = readline.createInterface({
input: process.stdin,
output: process.stdout,
terminal: false
});
rl.on('line', (line) => {
try {
const cmd = JSON.parse(line);
if (cmd.type === "update_bbox" && cmd.bboxes) {
currentBboxes = cmd.bboxes;
if (activeWs) sendSub(activeWs); // Resend subscription (swap and replace)
}
} catch (e) {}
});
function connect() {
const ws = new WebSocket('wss://stream.aisstream.io/v0/stream');
activeWs = ws;
ws.on('open', () => {
sendSub(ws);
});
ws.on('message', (data) => {
// Output raw AIS message JSON to stdout so Python can consume it
// We ensure exactly one JSON object per line.
try {
const parsed = JSON.parse(data);
console.log(JSON.stringify(parsed));
} catch (e) {
// ignore non-json
}
} catch (e) {}
});
ws.on('error', (err) => {
@@ -49,6 +65,7 @@ function connect() {
});
ws.on('close', () => {
activeWs = null;
console.error("WebSocket Proxy Closed. Reconnecting in 5s...");
setTimeout(connect, 5000);
});
@@ -1 +1 @@
476b691be156eb4fe6a6ad80f882c1dbaded8c33
50180452f0522f50b2624161407cb8ccc80a00db
File diff suppressed because one or more lines are too long
+122
@@ -0,0 +1,122 @@
{
"319225400": {
"name": "KORU",
"owner": "Jeff Bezos",
"builder": "Oceanco",
"length_m": 127,
"year": 2023,
"category": "Tech Billionaire",
"flag": "Cayman Islands",
"link": "https://en.wikipedia.org/wiki/Koru_(yacht)"
},
"538072122": {
"name": "LAUNCHPAD",
"owner": "Mark Zuckerberg",
"builder": "Feadship",
"length_m": 118,
"year": 2024,
"category": "Tech Billionaire",
"flag": "Marshall Islands",
"link": "https://www.superyachtfan.com/yacht/launchpad/"
},
"319032600": {
"name": "MUSASHI",
"owner": "Larry Ellison",
"builder": "Feadship",
"length_m": 88,
"year": 2011,
"category": "Tech Billionaire",
"flag": "Cayman Islands",
"link": "https://en.wikipedia.org/wiki/Musashi_(yacht)"
},
"319011000": {
"name": "RISING SUN",
"owner": "David Geffen",
"builder": "Lurssen",
"length_m": 138,
"year": 2004,
"category": "Celebrity / Mogul",
"flag": "Cayman Islands",
"link": "https://en.wikipedia.org/wiki/Rising_Sun_(yacht)"
},
"310593000": {
"name": "ECLIPSE",
"owner": "Roman Abramovich",
"builder": "Blohm+Voss",
"length_m": 162,
"year": 2010,
"category": "Oligarch Watch",
"flag": "Bermuda",
"link": "https://en.wikipedia.org/wiki/Eclipse_(yacht)"
},
"310792000": {
"name": "SOLARIS",
"owner": "Roman Abramovich",
"builder": "Lloyd Werft",
"length_m": 140,
"year": 2021,
"category": "Oligarch Watch",
"flag": "Bermuda",
"link": "https://en.wikipedia.org/wiki/Solaris_(yacht)"
},
"319094900": {
"name": "DILBAR",
"owner": "Alisher Usmanov (seized)",
"builder": "Lurssen",
"length_m": 156,
"year": 2016,
"category": "Oligarch Watch",
"flag": "Cayman Islands",
"link": "https://en.wikipedia.org/wiki/Dilbar_(yacht)"
},
"273610820": {
"name": "NORD",
"owner": "Alexei Mordashov",
"builder": "Lurssen",
"length_m": 142,
"year": 2021,
"category": "Oligarch Watch",
"flag": "Russia",
"link": "https://en.wikipedia.org/wiki/Nord_(yacht)"
},
"319179200": {
"name": "SCHEHERAZADE",
"owner": "Eduard Khudainatov (alleged Putin)",
"builder": "Lurssen",
"length_m": 140,
"year": 2020,
"category": "Oligarch Watch",
"flag": "Cayman Islands",
"link": "https://en.wikipedia.org/wiki/Scheherazade_(yacht)"
},
"319112900": {
"name": "AMADEA",
"owner": "Suleiman Kerimov (seized by US DOJ)",
"builder": "Lurssen",
"length_m": 106,
"year": 2017,
"category": "Oligarch Watch",
"flag": "Cayman Islands",
"link": "https://en.wikipedia.org/wiki/Amadea_(yacht)"
},
"319156800": {
"name": "BRAVO EUGENIA",
"owner": "Jerry Jones",
"builder": "Oceanco",
"length_m": 109,
"year": 2018,
"category": "Celebrity / Mogul",
"flag": "Cayman Islands",
"link": "https://www.superyachtfan.com/yacht/bravo-eugenia/"
},
"319137200": {
"name": "LADY S",
"owner": "Dan Snyder",
"builder": "Feadship",
"length_m": 93,
"year": 2019,
"category": "Celebrity / Mogul",
"flag": "Cayman Islands",
"link": "https://www.superyachtfan.com/yacht/lady-s/"
}
}
+98 -26
@@ -1,8 +1,10 @@
import os
import time
import logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
_start_time = time.time()
# ---------------------------------------------------------------------------
# Docker Swarm Secrets support
@@ -99,6 +101,10 @@ def _build_cors_origins():
@asynccontextmanager
async def lifespan(app: FastAPI):
# Validate environment variables before starting anything
from services.env_check import validate_env
validate_env(strict=True)
# Start AIS stream first — it loads the disk cache (instant ships) then
# begins accumulating live vessel data via WebSocket in the background.
start_ais_stream()
@@ -178,6 +184,31 @@ async def ais_feed(request: Request):
count = ingest_ais_catcher(msgs)
return {"status": "ok", "ingested": count}
from pydantic import BaseModel
class ViewportUpdate(BaseModel):
    s: float
    w: float
    n: float
    e: float

@app.post("/api/viewport")
@limiter.limit("60/minute")
async def update_viewport(vp: ViewportUpdate, request: Request):
    """Receive frontend map bounds to dynamically choke the AIS stream."""
    from services.ais_stream import update_ais_bbox
    # Add a gentle 10% padding so ships don't pop-in right at the edge
    pad_lat = (vp.n - vp.s) * 0.1
    # handle antimeridian bounding box padding later if needed, simple for now:
    pad_lng = (vp.e - vp.w) * 0.1 if vp.e > vp.w else 0
    update_ais_bbox(
        south=max(-90, vp.s - pad_lat),
        west=max(-180, vp.w - pad_lng) if pad_lng else vp.w,
        north=min(90, vp.n + pad_lat),
        east=min(180, vp.e + pad_lng) if pad_lng else vp.e
    )
    return {"status": "ok"}
@app.get("/api/live-data")
@limiter.limit("120/minute")
async def live_data(request: Request):
@@ -192,51 +223,92 @@ def _etag_response(request: Request, payload: dict, prefix: str = "", default=No
return Response(content=content, media_type="application/json",
headers={"ETag": etag, "Cache-Control": "no-cache"})
def _bbox_filter(items: list, s: float, w: float, n: float, e: float,
                 lat_key: str = "lat", lng_key: str = "lng") -> list:
    """Filter a list of dicts to those within the bounding box (with 20% padding).
    Handles antimeridian crossing (e.g. w=170, e=-170)."""
    pad_lat = (n - s) * 0.2
    pad_lng = (e - w) * 0.2 if e > w else ((e + 360 - w) * 0.2)
    s2, n2 = s - pad_lat, n + pad_lat
    w2, e2 = w - pad_lng, e + pad_lng
    crosses_antimeridian = w2 > e2
    out = []
    for item in items:
        lat = item.get(lat_key)
        lng = item.get(lng_key)
        if lat is None or lng is None:
            out.append(item)  # Keep items without coords (don't filter them out)
            continue
        if not (s2 <= lat <= n2):
            continue
        if crosses_antimeridian:
            if lng >= w2 or lng <= e2:
                out.append(item)
        else:
            if w2 <= lng <= e2:
                out.append(item)
    return out
@app.get("/api/live-data/fast")
@limiter.limit("120/minute")
async def live_data_fast(request: Request):
async def live_data_fast(request: Request,
s: float = Query(None, description="South bound"),
w: float = Query(None, description="West bound"),
n: float = Query(None, description="North bound"),
e: float = Query(None, description="East bound")):
d = get_latest_data()
has_bbox = all(v is not None for v in (s, w, n, e))
def _f(items, lat_key="lat", lng_key="lng"):
return _bbox_filter(items, s, w, n, e, lat_key, lng_key) if has_bbox else items
payload = {
"commercial_flights": d.get("commercial_flights", []),
"military_flights": d.get("military_flights", []),
"private_flights": d.get("private_flights", []),
"private_jets": d.get("private_jets", []),
"tracked_flights": d.get("tracked_flights", []),
"ships": d.get("ships", []),
"cctv": d.get("cctv", []),
"uavs": d.get("uavs", []),
"liveuamap": d.get("liveuamap", []),
"gps_jamming": d.get("gps_jamming", []),
"satellites": d.get("satellites", []),
"commercial_flights": _f(d.get("commercial_flights", [])),
"military_flights": _f(d.get("military_flights", [])),
"private_flights": _f(d.get("private_flights", [])),
"private_jets": _f(d.get("private_jets", [])),
"tracked_flights": d.get("tracked_flights", []), # Always send tracked (small set)
"ships": _f(d.get("ships", [])),
"cctv": _f(d.get("cctv", []), lat_key="lat", lng_key="lon"),
"uavs": _f(d.get("uavs", [])),
"liveuamap": _f(d.get("liveuamap", [])),
"gps_jamming": _f(d.get("gps_jamming", [])),
"satellites": _f(d.get("satellites", [])),
"satellite_source": d.get("satellite_source", "none"),
"freshness": dict(source_timestamps),
}
return _etag_response(request, payload, prefix="fast|")
bbox_tag = f"{s},{w},{n},{e}" if has_bbox else "full"
return _etag_response(request, payload, prefix=f"fast|{bbox_tag}|")
@app.get("/api/live-data/slow")
@limiter.limit("60/minute")
async def live_data_slow(request: Request):
async def live_data_slow(request: Request,
s: float = Query(None, description="South bound"),
w: float = Query(None, description="West bound"),
n: float = Query(None, description="North bound"),
e: float = Query(None, description="East bound")):
d = get_latest_data()
has_bbox = all(v is not None for v in (s, w, n, e))
def _f(items, lat_key="lat", lng_key="lng"):
return _bbox_filter(items, s, w, n, e, lat_key, lng_key) if has_bbox else items
payload = {
"last_updated": d.get("last_updated"),
"news": d.get("news", []),
"news": d.get("news", []), # News has coords but we always send it (small set, important)
"stocks": d.get("stocks", {}),
"oil": d.get("oil", {}),
"weather": d.get("weather"),
"traffic": d.get("traffic", []),
"earthquakes": d.get("earthquakes", []),
"frontlines": d.get("frontlines"),
"gdelt": d.get("gdelt", []),
"airports": d.get("airports", []),
"satellites": d.get("satellites", []),
"kiwisdr": d.get("kiwisdr", []),
"earthquakes": _f(d.get("earthquakes", [])),
"frontlines": d.get("frontlines"), # Always send (GeoJSON polygon, not point-filterable)
"gdelt": d.get("gdelt", []), # GeoJSON features — filtered client-side
"airports": d.get("airports", []), # Always send (reference data)
"kiwisdr": _f(d.get("kiwisdr", []), lat_key="lat", lng_key="lon"),
"space_weather": d.get("space_weather"),
"internet_outages": d.get("internet_outages", []),
"firms_fires": d.get("firms_fires", []),
"datacenters": d.get("datacenters", []),
"internet_outages": _f(d.get("internet_outages", [])),
"firms_fires": _f(d.get("firms_fires", [])),
"datacenters": _f(d.get("datacenters", [])),
"freshness": dict(source_timestamps),
}
return _etag_response(request, payload, prefix="slow|", default=str)
bbox_tag = f"{s},{w},{n},{e}" if has_bbox else "full"
return _etag_response(request, payload, prefix=f"slow|{bbox_tag}|", default=str)
@app.get("/api/debug-latest")
@limiter.limit("30/minute")
@@ -270,7 +342,7 @@ async def health_check(request: Request):
"uptime_seconds": round(time.time() - _start_time),
}
_start_time = __import__("time").time()
from services.radio_intercept import get_top_broadcastify_feeds, get_openmhz_systems, get_recent_openmhz_calls, find_nearest_openmhz_system
+3
@@ -20,3 +20,6 @@ sgp4==2.23
geopy==2.4.1
pytz==2024.2
pystac-client==0.8.6
pytest==8.3.4
pytest-asyncio==0.25.0
httpx==0.28.1
+85 -1
@@ -207,8 +207,66 @@ def get_ais_vessels() -> list[dict]:
return result
def ingest_ais_catcher(msgs: list[dict]) -> int:
"""Ingest decoded AIS messages from AIS-catcher HTTP feed.
Returns number of vessels updated."""
count = 0
now = time.time()
with _vessels_lock:
for msg in msgs:
mmsi = msg.get("mmsi")
if not mmsi or not isinstance(mmsi, int):
continue
vessel = _vessels.setdefault(mmsi, {"mmsi": mmsi})
msg_type = msg.get("type", 0)
# Position reports (types 1, 2, 3 = Class A; 18, 19 = Class B)
if msg_type in (1, 2, 3, 18, 19):
lat = msg.get("lat")
lon = msg.get("lon")
if lat is not None and lon is not None and lat != 91.0 and lon != 181.0:
vessel["lat"] = lat
vessel["lng"] = lon
vessel["sog"] = msg.get("speed", 0)
vessel["cog"] = msg.get("course", 0)
heading = msg.get("heading", 511)
vessel["heading"] = heading if heading != 511 else vessel.get("cog", 0)
vessel["_updated"] = now
if msg.get("shipname"):
vessel["name"] = msg["shipname"].strip()
count += 1
# Static data (type 5 = Class A static; 24 = Class B static)
elif msg_type in (5, 24):
if msg.get("shipname"):
vessel["name"] = msg["shipname"].strip()
if msg.get("callsign"):
vessel["callsign"] = msg["callsign"].strip()
if msg.get("imo"):
vessel["imo"] = msg["imo"]
if msg.get("destination"):
vessel["destination"] = msg["destination"].strip().replace("@", "")
ship_type = msg.get("shiptype", 0)
if ship_type:
vessel["ais_type_code"] = ship_type
vessel["type"] = classify_vessel(ship_type, mmsi)
vessel["_updated"] = now
# Ensure country is set from MMSI MID
if "country" not in vessel:
vessel["country"] = get_country_from_mmsi(mmsi)
# Ensure name exists
if "name" not in vessel:
vessel["name"] = msg.get("shipname", "UNKNOWN") or "UNKNOWN"
return count
def _ais_stream_loop():
"""Main loop: spawn node proxy and process messages from stdout."""
global _proxy_process
import subprocess
import os
@@ -220,11 +278,13 @@ def _ais_stream_loop():
logger.info("Starting Node.js AIS Stream Proxy...")
process = subprocess.Popen(
['node', proxy_script, API_KEY],
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
text=True,
bufsize=1
)
_proxy_process = process
# Drain stderr in a background thread to prevent deadlock
import threading
@@ -361,7 +421,31 @@ def start_ais_stream():
def stop_ais_stream():
"""Stop the AIS WebSocket stream and save cache."""
global _ws_running
global _ws_running, _proxy_process
_ws_running = False
if _proxy_process and _proxy_process.stdin:
try:
_proxy_process.stdin.close()
except Exception:
pass
_save_cache() # Save on shutdown
logger.info("AIS Stream stopping...")
def update_ais_bbox(south: float, west: float, north: float, east: float):
"""Dynamically update the AIS stream bounding box via proxy stdin."""
global _proxy_process
if not _proxy_process or not _proxy_process.stdin:
return
try:
cmd = json.dumps({
"type": "update_bbox",
"bboxes": [[[south, west], [north, east]]]
})
_proxy_process.stdin.write(cmd + "\n")
_proxy_process.stdin.flush()
logger.info(f"Updated AIS bounding box to: S:{south:.2f} W:{west:.2f} N:{north:.2f} E:{east:.2f}")
except Exception as e:
logger.error(f"Failed to update AIS bbox: {e}")
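For context, a sketch of how a viewport endpoint could forward the client's map bounds to the proxy through `update_ais_bbox`. The route path and parameter names are illustrative, and `app`, `limiter`, and `Request` are assumed to be in scope as they are elsewhere in this diff.

```python
# Hypothetical wiring in main.py; route and parameter names are assumptions.
from services.ais_stream import update_ais_bbox

@app.get("/api/viewport")
@limiter.limit("60/minute")
async def set_viewport(request: Request, s: float, w: float, n: float, e: float):
    """Push the client's current map bounds down to the AIS stream proxy."""
    update_ais_bbox(south=s, west=w, north=n, east=e)
    return {"ok": True}
```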
+1 -1
View File
@@ -381,7 +381,7 @@ def update_carrier_positions():
if hull in positions:
positions[hull].update(pos)
logger.info(f"Carrier OSINT: updated {CARRIER_REGISTRY[hull]['name']} from news")
except Exception as e:
except (ValueError, KeyError, json.JSONDecodeError, OSError) as e:
logger.warning(f"GDELT carrier fetch failed: {e}")
# Save and update the global state with enriched positions
+33
View File
@@ -0,0 +1,33 @@
# ─── ShadowBroker Backend Constants ──────────────────────────────────────────
# Centralized magic numbers. Import from here instead of hardcoding.
# ─── Flight Trails ──────────────────────────────────────────────────────────
FLIGHT_TRAIL_MAX_TRACKED = 2000 # Max concurrent tracked trails before LRU eviction
FLIGHT_TRAIL_POINTS_PER_FLIGHT = 200 # Max trail points kept per aircraft
TRACKED_TRAIL_TTL_S = 1800 # 30 min - trail TTL for tracked flights
DEFAULT_TRAIL_TTL_S = 300 # 5 min - trail TTL for non-tracked flights
# ─── Detection Thresholds ──────────────────────────────────────────────────
HOLD_PATTERN_DEGREES = 300 # Total heading change to flag holding pattern
GPS_JAMMING_NACP_THRESHOLD = 8 # NACp below this = degraded GPS signal
GPS_JAMMING_GRID_SIZE = 1.0 # 1 degree grid for aggregation
GPS_JAMMING_MIN_RATIO = 0.25 # 25% degraded aircraft to flag zone
# ─── Network & Circuit Breaker ──────────────────────────────────────────────
CIRCUIT_BREAKER_TTL_S = 120 # Skip domain for 2 min after total failure
DOMAIN_FAIL_TTL_S = 300 # Skip requests.get for 5 min, go straight to curl
CONNECT_TIMEOUT_S = 3 # Short connect timeout for fast firewall-block detection
# ─── Data Fetcher Intervals ────────────────────────────────────────────────
FAST_FETCH_INTERVAL_S = 60 # Flights, ships, satellites, military
SLOW_FETCH_INTERVAL_MIN = 30 # News, markets, space weather
CCTV_FETCH_INTERVAL_MIN = 1 # CCTV camera pipeline
LIVEUAMAP_FETCH_INTERVAL_HR = 12 # LiveUAMap scraper
# ─── External API ──────────────────────────────────────────────────────────
OPENSKY_RATE_LIMIT_S = 300 # Only re-fetch OpenSky every 5 minutes
OPENSKY_REQUEST_TIMEOUT_S = 15 # Timeout for OpenSky API calls
ROUTE_FETCH_TIMEOUT_S = 15 # Timeout for adsb.lol route lookups
# ─── Internet Outage Detection ─────────────────────────────────────────────
INTERNET_OUTAGE_MIN_SEVERITY = 0.10 # 10% drop minimum to show
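A minimal usage sketch of the intent stated at the top of this file (import the named constant instead of hardcoding). The import path `services.constants` is an assumption, since the diff does not show where the new module lives.

```python
# Assumed module path; adjust to wherever this constants file actually lives.
from services.constants import OPENSKY_RATE_LIMIT_S, FLIGHT_TRAIL_POINTS_PER_FLIGHT

def should_refetch_opensky(last_fetch_ts: float, now: float) -> bool:
    # One named constant instead of a magic 300 scattered through the fetchers.
    return (now - last_fetch_ts) >= OPENSKY_RATE_LIMIT_S

def trim_trail(points: list) -> list:
    # Keep only the most recent trail points for an aircraft.
    return points[-FLIGHT_TRAIL_POINTS_PER_FLIGHT:]
```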
+69 -522
View File
@@ -1,510 +1,54 @@
"""Data fetcher orchestrator — schedules and coordinates all data source modules.
Heavy logic has been extracted into services/fetchers/:
- _store.py shared state (latest_data, locks, timestamps)
- plane_alert.py aircraft enrichment DB
- flights.py commercial flights, routes, trails, GPS jamming
- military.py military flights, UAV detection
- satellites.py satellite tracking (SGP4)
- news.py RSS news fetching, clustering, risk assessment
- _store.py shared state (latest_data, locks, timestamps)
- plane_alert.py aircraft enrichment DB
- flights.py commercial flights, routes, trails, GPS jamming
- military.py military flights, UAV detection
- satellites.py satellite tracking (SGP4)
- news.py RSS news fetching, clustering, risk assessment
- yacht_alert.py superyacht alert enrichment
- financial.py defense stocks, oil prices
- earth_observation.py earthquakes, FIRMS fires, space weather, weather radar
- infrastructure.py internet outages, data centers, CCTV, KiwiSDR
- geo.py ships, airports, frontlines, GDELT, LiveUAMap
"""
import yfinance as yf
import csv
import io
import json
import time
import math
import logging
import heapq
import concurrent.futures
from pathlib import Path
from datetime import datetime
from cachetools import TTLCache
from apscheduler.schedulers.background import BackgroundScheduler
from dotenv import load_dotenv
load_dotenv()
from services.network_utils import fetch_with_curl
from services.cctv_pipeline import (
init_db, TFLJamCamIngestor, LTASingaporeIngestor,
AustinTXIngestor, NYCDOTIngestor, get_all_cameras,
)
from apscheduler.schedulers.background import BackgroundScheduler
from services.cctv_pipeline import init_db
# Shared state — all fetcher modules read/write through this
from services.fetchers._store import (
latest_data, source_timestamps, _mark_fresh, _data_lock, # noqa: F401 — source_timestamps re-exported for main.py
latest_data, source_timestamps, _mark_fresh, _data_lock, # noqa: F401 — re-exported for main.py
)
# Domain-specific fetcher modules
from services.fetchers.flights import fetch_flights
from services.fetchers.military import fetch_military_flights
from services.fetchers.satellites import fetch_satellites
from services.fetchers.news import fetch_news
# Domain-specific fetcher modules (already extracted)
from services.fetchers.flights import fetch_flights # noqa: F401
from services.fetchers.flights import _BLIND_SPOT_REGIONS # noqa: F401 — re-exported for tests
from services.fetchers.military import fetch_military_flights # noqa: F401
from services.fetchers.satellites import fetch_satellites # noqa: F401
from services.fetchers.news import fetch_news # noqa: F401
# Newly extracted fetcher modules
from services.fetchers.financial import fetch_defense_stocks, fetch_oil_prices # noqa: F401
from services.fetchers.earth_observation import ( # noqa: F401
fetch_earthquakes, fetch_firms_fires, fetch_space_weather, fetch_weather,
)
from services.fetchers.infrastructure import ( # noqa: F401
fetch_internet_outages, fetch_datacenters, fetch_cctv, fetch_kiwisdr,
)
from services.fetchers.geo import ( # noqa: F401
fetch_ships, fetch_airports, find_nearest_airport, cached_airports,
fetch_frontlines, fetch_gdelt, fetch_geopolitics, update_liveuamap,
)
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Financial data
# ---------------------------------------------------------------------------
def _fetch_single_ticker(symbol: str, period: str = "2d"):
"""Fetch a single yfinance ticker. Returns (symbol, data_dict) or (symbol, None)."""
try:
ticker = yf.Ticker(symbol)
hist = ticker.history(period=period)
if len(hist) >= 1:
current_price = hist['Close'].iloc[-1]
prev_close = hist['Close'].iloc[0] if len(hist) > 1 else current_price
change_percent = ((current_price - prev_close) / prev_close) * 100 if prev_close else 0
return symbol, {
"price": round(float(current_price), 2),
"change_percent": round(float(change_percent), 2),
"up": bool(change_percent >= 0)
}
except Exception as e:
logger.warning(f"Could not fetch data for {symbol}: {e}")
return symbol, None
def fetch_defense_stocks():
tickers = ["RTX", "LMT", "NOC", "GD", "BA", "PLTR"]
try:
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
results = pool.map(lambda t: _fetch_single_ticker(t, "2d"), tickers)
stocks_data = {sym: data for sym, data in results if data}
with _data_lock:
latest_data['stocks'] = stocks_data
_mark_fresh("stocks")
except Exception as e:
logger.error(f"Error fetching stocks: {e}")
def fetch_oil_prices():
tickers = {"WTI Crude": "CL=F", "Brent Crude": "BZ=F"}
try:
with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
results = pool.map(lambda item: (_fetch_single_ticker(item[1], "5d")[1], item[0]), tickers.items())
oil_data = {name: data for data, name in results if data}
with _data_lock:
latest_data['oil'] = oil_data
_mark_fresh("oil")
except Exception as e:
logger.error(f"Error fetching oil: {e}")
# ---------------------------------------------------------------------------
# Weather
# ---------------------------------------------------------------------------
def fetch_weather():
try:
url = "https://api.rainviewer.com/public/weather-maps.json"
response = fetch_with_curl(url, timeout=10)
if response.status_code == 200:
data = response.json()
if "radar" in data and "past" in data["radar"]:
latest_time = data["radar"]["past"][-1]["time"]
with _data_lock:
latest_data["weather"] = {"time": latest_time, "host": data.get("host", "https://tilecache.rainviewer.com")}
_mark_fresh("weather")
except Exception as e:
logger.error(f"Error fetching weather: {e}")
# ---------------------------------------------------------------------------
# CCTV
# ---------------------------------------------------------------------------
def fetch_cctv():
try:
cameras = get_all_cameras()
with _data_lock:
latest_data["cctv"] = cameras
_mark_fresh("cctv")
except Exception as e:
logger.error(f"Error fetching cctv from DB: {e}")
with _data_lock:
latest_data["cctv"] = []
# ---------------------------------------------------------------------------
# KiwiSDR
# ---------------------------------------------------------------------------
def fetch_kiwisdr():
try:
from services.kiwisdr_fetcher import fetch_kiwisdr_nodes
nodes = fetch_kiwisdr_nodes()
with _data_lock:
latest_data["kiwisdr"] = nodes
_mark_fresh("kiwisdr")
except Exception as e:
logger.error(f"Error fetching KiwiSDR nodes: {e}")
with _data_lock:
latest_data["kiwisdr"] = []
# ---------------------------------------------------------------------------
# NASA FIRMS Fires
# ---------------------------------------------------------------------------
def fetch_firms_fires():
"""Fetch global fire/thermal anomalies from NASA FIRMS (NOAA-20 VIIRS, 24h, no key needed)."""
fires = []
try:
url = "https://firms.modaps.eosdis.nasa.gov/data/active_fire/noaa-20-viirs-c2/csv/J1_VIIRS_C2_Global_24h.csv"
response = fetch_with_curl(url, timeout=30)
if response.status_code == 200:
reader = csv.DictReader(io.StringIO(response.text))
all_rows = []
for row in reader:
try:
lat = float(row.get("latitude", 0))
lng = float(row.get("longitude", 0))
frp = float(row.get("frp", 0))
conf = row.get("confidence", "nominal")
daynight = row.get("daynight", "")
bright = float(row.get("bright_ti4", 0))
all_rows.append({
"lat": lat, "lng": lng, "frp": frp,
"brightness": bright, "confidence": conf,
"daynight": daynight,
"acq_date": row.get("acq_date", ""),
"acq_time": row.get("acq_time", ""),
})
except (ValueError, TypeError):
continue
fires = heapq.nlargest(5000, all_rows, key=lambda x: x["frp"])
logger.info(f"FIRMS fires: {len(fires)} hotspots (from {response.status_code})")
except Exception as e:
logger.error(f"Error fetching FIRMS fires: {e}")
with _data_lock:
latest_data["firms_fires"] = fires
if fires:
_mark_fresh("firms_fires")
# ---------------------------------------------------------------------------
# Space Weather
# ---------------------------------------------------------------------------
def fetch_space_weather():
"""Fetch NOAA SWPC Kp index and recent solar events."""
try:
kp_resp = fetch_with_curl("https://services.swpc.noaa.gov/json/planetary_k_index_1m.json", timeout=10)
kp_value = None
kp_text = "QUIET"
if kp_resp.status_code == 200:
kp_data = kp_resp.json()
if kp_data:
latest_kp = kp_data[-1]
kp_value = float(latest_kp.get("kp_index", 0))
if kp_value >= 7:
kp_text = f"STORM G{min(int(kp_value) - 4, 5)}"
elif kp_value >= 5:
kp_text = f"STORM G{min(int(kp_value) - 4, 5)}"
elif kp_value >= 4:
kp_text = "ACTIVE"
elif kp_value >= 3:
kp_text = "UNSETTLED"
events = []
ev_resp = fetch_with_curl("https://services.swpc.noaa.gov/json/edited_events.json", timeout=10)
if ev_resp.status_code == 200:
all_events = ev_resp.json()
for ev in all_events[-10:]:
events.append({
"type": ev.get("type", ""),
"begin": ev.get("begin", ""),
"end": ev.get("end", ""),
"classtype": ev.get("classtype", ""),
})
with _data_lock:
latest_data["space_weather"] = {
"kp_index": kp_value,
"kp_text": kp_text,
"events": events,
}
_mark_fresh("space_weather")
logger.info(f"Space weather: Kp={kp_value} ({kp_text}), {len(events)} events")
except Exception as e:
logger.error(f"Error fetching space weather: {e}")
# ---------------------------------------------------------------------------
# Internet Outages (IODA)
# ---------------------------------------------------------------------------
_region_geocode_cache: TTLCache = TTLCache(maxsize=2000, ttl=86400)
def _geocode_region(region_name: str, country_name: str) -> tuple:
"""Geocode a region using OpenStreetMap Nominatim (cached, respects rate limit)."""
cache_key = f"{region_name}|{country_name}"
if cache_key in _region_geocode_cache:
return _region_geocode_cache[cache_key]
try:
import urllib.parse
query = urllib.parse.quote(f"{region_name}, {country_name}")
url = f"https://nominatim.openstreetmap.org/search?q={query}&format=json&limit=1"
response = fetch_with_curl(url, timeout=8, headers={"User-Agent": "ShadowBroker-OSINT/1.0"})
if response.status_code == 200:
results = response.json()
if results:
lat = float(results[0]["lat"])
lon = float(results[0]["lon"])
_region_geocode_cache[cache_key] = (lat, lon)
return (lat, lon)
except Exception:
pass
_region_geocode_cache[cache_key] = None
return None
def fetch_internet_outages():
"""Fetch regional internet outage alerts from IODA (Georgia Tech)."""
RELIABLE_DATASOURCES = {"bgp", "ping-slash24"}
outages = []
try:
now = int(time.time())
start = now - 86400
url = f"https://api.ioda.inetintel.cc.gatech.edu/v2/outages/alerts?from={start}&until={now}&limit=500"
response = fetch_with_curl(url, timeout=15)
if response.status_code == 200:
data = response.json()
alerts = data.get("data", [])
region_outages = {}
for alert in alerts:
entity = alert.get("entity", {})
etype = entity.get("type", "")
level = alert.get("level", "")
if level == "normal" or etype != "region":
continue
datasource = alert.get("datasource", "")
if datasource not in RELIABLE_DATASOURCES:
continue
code = entity.get("code", "")
name = entity.get("name", "")
attrs = entity.get("attrs", {})
country_code = attrs.get("country_code", "")
country_name = attrs.get("country_name", "")
value = alert.get("value", 0)
history_value = alert.get("historyValue", 0)
severity = 0
if history_value and history_value > 0:
severity = round((1 - value / history_value) * 100)
severity = max(0, min(severity, 100))
if severity < 10:
continue
if code not in region_outages or severity > region_outages[code]["severity"]:
region_outages[code] = {
"region_code": code,
"region_name": name,
"country_code": country_code,
"country_name": country_name,
"level": level,
"datasource": datasource,
"severity": severity,
}
geocoded = []
for rcode, r in region_outages.items():
coords = _geocode_region(r["region_name"], r["country_name"])
if coords:
r["lat"] = coords[0]
r["lng"] = coords[1]
geocoded.append(r)
outages = heapq.nlargest(100, geocoded, key=lambda x: x["severity"])
logger.info(f"Internet outages: {len(outages)} regions affected")
except Exception as e:
logger.error(f"Error fetching internet outages: {e}")
with _data_lock:
latest_data["internet_outages"] = outages
if outages:
_mark_fresh("internet_outages")
# ---------------------------------------------------------------------------
# Data Centers
# ---------------------------------------------------------------------------
_DC_GEOCODED_PATH = Path(__file__).parent.parent / "data" / "datacenters_geocoded.json"
def fetch_datacenters():
"""Load geocoded data centers (5K+ street-level precise locations)."""
dcs = []
try:
if not _DC_GEOCODED_PATH.exists():
logger.warning(f"Geocoded DC file not found: {_DC_GEOCODED_PATH}")
return
raw = json.loads(_DC_GEOCODED_PATH.read_text(encoding="utf-8"))
for entry in raw:
lat = entry.get("lat")
lng = entry.get("lng")
if lat is None or lng is None:
continue
if not (-90 <= lat <= 90 and -180 <= lng <= 180):
continue
dcs.append({
"name": entry.get("name", "Unknown"),
"company": entry.get("company", ""),
"street": entry.get("street", ""),
"city": entry.get("city", ""),
"country": entry.get("country", ""),
"zip": entry.get("zip", ""),
"lat": lat, "lng": lng,
})
logger.info(f"Data centers: {len(dcs)} geocoded locations loaded")
except Exception as e:
logger.error(f"Error loading data centers: {e}")
with _data_lock:
latest_data["datacenters"] = dcs
if dcs:
_mark_fresh("datacenters")
# ---------------------------------------------------------------------------
# Earthquakes
# ---------------------------------------------------------------------------
def fetch_earthquakes():
quakes = []
try:
url = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/2.5_day.geojson"
response = fetch_with_curl(url, timeout=10)
if response.status_code == 200:
features = response.json().get("features", [])
for f in features[:50]:
mag = f["properties"]["mag"]
lng, lat, depth = f["geometry"]["coordinates"]
quakes.append({
"id": f["id"], "mag": mag,
"lat": lat, "lng": lng,
"place": f["properties"]["place"]
})
except Exception as e:
logger.error(f"Error fetching earthquakes: {e}")
with _data_lock:
latest_data["earthquakes"] = quakes
if quakes:
_mark_fresh("earthquakes")
# ---------------------------------------------------------------------------
# Ships (AIS + Carriers)
# ---------------------------------------------------------------------------
def fetch_ships():
"""Fetch real-time AIS vessel data and combine with OSINT carrier positions."""
from services.ais_stream import get_ais_vessels
from services.carrier_tracker import get_carrier_positions
ships = []
try:
carriers = get_carrier_positions()
ships.extend(carriers)
except Exception as e:
logger.error(f"Carrier tracker error (non-fatal): {e}")
carriers = []
try:
ais_vessels = get_ais_vessels()
ships.extend(ais_vessels)
except Exception as e:
logger.error(f"AIS stream error (non-fatal): {e}")
ais_vessels = []
logger.info(f"Ships: {len(carriers)} carriers + {len(ais_vessels)} AIS vessels")
with _data_lock:
latest_data['ships'] = ships
_mark_fresh("ships")
# ---------------------------------------------------------------------------
# Airports
# ---------------------------------------------------------------------------
cached_airports = []
def find_nearest_airport(lat, lng, max_distance_nm=200):
"""Find the nearest large airport to a given lat/lng using haversine distance."""
if not cached_airports:
return None
best = None
best_dist = float('inf')
lat_r = math.radians(lat)
lng_r = math.radians(lng)
for apt in cached_airports:
apt_lat_r = math.radians(apt['lat'])
apt_lng_r = math.radians(apt['lng'])
dlat = apt_lat_r - lat_r
dlng = apt_lng_r - lng_r
a = math.sin(dlat / 2) ** 2 + math.cos(lat_r) * math.cos(apt_lat_r) * math.sin(dlng / 2) ** 2
c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
dist_nm = 3440.065 * c
if dist_nm < best_dist:
best_dist = dist_nm
best = apt
if best and best_dist <= max_distance_nm:
return {
"iata": best['iata'], "name": best['name'],
"lat": best['lat'], "lng": best['lng'],
"distance_nm": round(best_dist, 1)
}
return None
def fetch_airports():
global cached_airports
if not cached_airports:
logger.info("Downloading global airports database from ourairports.com...")
try:
url = "https://ourairports.com/data/airports.csv"
response = fetch_with_curl(url, timeout=15)
if response.status_code == 200:
f = io.StringIO(response.text)
reader = csv.DictReader(f)
for row in reader:
if row['type'] == 'large_airport' and row['iata_code']:
cached_airports.append({
"id": row['ident'],
"name": row['name'],
"iata": row['iata_code'],
"lat": float(row['latitude_deg']),
"lng": float(row['longitude_deg']),
"type": "airport"
})
logger.info(f"Loaded {len(cached_airports)} large airports into cache.")
except Exception as e:
logger.error(f"Error fetching airports: {e}")
with _data_lock:
latest_data['airports'] = cached_airports
# ---------------------------------------------------------------------------
# Geopolitics & Liveuamap
# ---------------------------------------------------------------------------
from services.geopolitics import fetch_ukraine_frontlines, fetch_global_military_incidents
def fetch_frontlines():
"""Fetch Ukraine frontline data (fast — single GitHub API call)."""
try:
frontlines = fetch_ukraine_frontlines()
if frontlines:
with _data_lock:
latest_data['frontlines'] = frontlines
_mark_fresh("frontlines")
except Exception as e:
logger.error(f"Error fetching frontlines: {e}")
def fetch_gdelt():
"""Fetch GDELT global military incidents (slow — downloads 32 ZIP files)."""
try:
gdelt = fetch_global_military_incidents()
if gdelt is not None:
with _data_lock:
latest_data['gdelt'] = gdelt
_mark_fresh("gdelt")
except Exception as e:
logger.error(f"Error fetching GDELT: {e}")
def fetch_geopolitics():
"""Legacy wrapper — runs both sequentially. Used by recurring scheduler."""
fetch_frontlines()
fetch_gdelt()
def update_liveuamap():
logger.info("Running scheduled Liveuamap scraper...")
try:
from services.liveuamap_scraper import fetch_liveuamap
res = fetch_liveuamap()
if res:
with _data_lock:
latest_data['liveuamap'] = res
_mark_fresh("liveuamap")
except Exception as e:
logger.error(f"Liveuamap scraper error: {e}")
# ---------------------------------------------------------------------------
# Scheduler & Orchestration
# ---------------------------------------------------------------------------
@@ -525,23 +69,21 @@ def update_fast_data():
logger.info("Fast-tier update complete.")
def update_slow_data():
"""Slow-tier: feeds that change infrequently (every 30min).
Each fetcher writes to latest_data independently as it finishes,
so the frontend sees results progressively; no all-or-nothing barrier."""
"""Slow-tier: contextual + enrichment data that refreshes less often (every 510 min)."""
logger.info("Slow-tier data update starting...")
slow_funcs = [
fetch_news,
fetch_earthquakes,
fetch_firms_fires,
fetch_defense_stocks,
fetch_oil_prices,
fetch_weather,
fetch_cctv,
fetch_earthquakes,
fetch_frontlines, # fast — single GitHub API call
fetch_gdelt, # slow — 32 ZIP downloads (runs in parallel, won't block frontlines)
fetch_kiwisdr,
fetch_space_weather,
fetch_internet_outages,
fetch_firms_fires,
fetch_cctv,
fetch_kiwisdr,
fetch_frontlines,
fetch_gdelt,
fetch_datacenters,
]
with concurrent.futures.ThreadPoolExecutor(max_workers=len(slow_funcs)) as executor:
@@ -550,7 +92,7 @@ def update_slow_data():
logger.info("Slow-tier update complete.")
def update_all_data():
"""Full update — runs on startup. All tiers run IN PARALLEL for fastest startup."""
"""Full refresh — all tiers run IN PARALLEL for fastest startup."""
logger.info("Full data update starting (parallel)...")
with concurrent.futures.ThreadPoolExecutor(max_workers=3) as pool:
f0 = pool.submit(fetch_airports)
@@ -559,39 +101,44 @@ def update_all_data():
concurrent.futures.wait([f0, f1, f2])
logger.info("Full data update complete.")
scheduler = BackgroundScheduler()
_scheduler = None
def start_scheduler():
global _scheduler
init_db()
_scheduler = BackgroundScheduler(daemon=True)
# NOTE: initial update_all_data() is called synchronously in main.py lifespan
# before start_scheduler(). These are only the RECURRING interval jobs.
scheduler.add_job(update_fast_data, 'interval', seconds=60)
scheduler.add_job(update_slow_data, 'interval', minutes=30)
# Fast tier — every 60 seconds
_scheduler.add_job(update_fast_data, 'interval', seconds=60, id='fast_tier', max_instances=1, misfire_grace_time=30)
def update_cctvs():
logger.info("Running CCTV Pipeline Ingestion...")
ingestors = [
TFLJamCamIngestor,
LTASingaporeIngestor,
AustinTXIngestor,
NYCDOTIngestor
]
for ingestor in ingestors:
try:
ingestor().ingest()
except Exception as e:
logger.error(f"Failed {ingestor.__name__} cctv ingest: {e}")
fetch_cctv()
# Slow tier — every 5 minutes
_scheduler.add_job(update_slow_data, 'interval', minutes=5, id='slow_tier', max_instances=1, misfire_grace_time=120)
scheduler.add_job(update_cctvs, 'interval', minutes=1)
# Very slow — every 15 minutes
_scheduler.add_job(fetch_gdelt, 'interval', minutes=15, id='gdelt', max_instances=1, misfire_grace_time=120)
_scheduler.add_job(update_liveuamap, 'interval', minutes=15, id='liveuamap', max_instances=1, misfire_grace_time=120)
scheduler.add_job(update_liveuamap, 'interval', hours=12)
# CCTV pipeline refresh — every 10 minutes
# Instantiate once and reuse — avoids re-creating DB connections on every tick
from services.cctv_pipeline import (
TFLJamCamIngestor, LTASingaporeIngestor,
AustinTXIngestor, NYCDOTIngestor,
)
_cctv_tfl = TFLJamCamIngestor()
_cctv_lta = LTASingaporeIngestor()
_cctv_atx = AustinTXIngestor()
_cctv_nyc = NYCDOTIngestor()
_scheduler.add_job(_cctv_tfl.ingest, 'interval', minutes=10, id='cctv_tfl', max_instances=1, misfire_grace_time=120)
_scheduler.add_job(_cctv_lta.ingest, 'interval', minutes=10, id='cctv_lta', max_instances=1, misfire_grace_time=120)
_scheduler.add_job(_cctv_atx.ingest, 'interval', minutes=10, id='cctv_atx', max_instances=1, misfire_grace_time=120)
_scheduler.add_job(_cctv_nyc.ingest, 'interval', minutes=10, id='cctv_nyc', max_instances=1, misfire_grace_time=120)
scheduler.start()
_scheduler.start()
logger.info("Scheduler started.")
def stop_scheduler():
scheduler.shutdown()
if _scheduler:
_scheduler.shutdown(wait=False)
def get_latest_data():
with _data_lock:
+77
View File
@@ -0,0 +1,77 @@
"""Startup environment validation — called once in the FastAPI lifespan hook.
Ensures required env vars are present before the scheduler starts.
Logs warnings for optional keys that degrade functionality when missing.
"""
import os
import sys
import logging
logger = logging.getLogger(__name__)
# Keys grouped by criticality
_REQUIRED = {
# Empty for now — add keys here only if the app literally cannot function without them
}
_CRITICAL_WARN = {
"ADMIN_KEY": "Authentication for /api/settings and /api/system/update — endpoints are UNPROTECTED without it!",
}
_OPTIONAL = {
"AIS_API_KEY": "AIS vessel streaming (ships layer will be empty without it)",
"OPENSKY_CLIENT_ID": "OpenSky OAuth2 — gap-fill flights in Africa/Asia/LatAm",
"OPENSKY_CLIENT_SECRET": "OpenSky OAuth2 — gap-fill flights in Africa/Asia/LatAm",
"LTA_ACCOUNT_KEY": "Singapore LTA traffic cameras (CCTV layer)",
}
def validate_env(*, strict: bool = True) -> bool:
"""Validate environment variables at startup.
Args:
strict: If True, exit the process on missing required keys.
If False, only log errors (useful for tests).
Returns:
True if all required keys are present, False otherwise.
"""
all_ok = True
# Required keys — must be set
for key, desc in _REQUIRED.items():
value = os.environ.get(key, "").strip()
if not value:
logger.error(
"❌ REQUIRED env var %s is not set. %s\n"
" Set it in .env or via Docker secrets (%s_FILE).",
key, desc, key,
)
all_ok = False
if not all_ok and strict:
logger.critical("Startup aborted — required environment variables are missing.")
sys.exit(1)
# Critical-warn keys — app works but security/functionality is degraded
for key, desc in _CRITICAL_WARN.items():
value = os.environ.get(key, "").strip()
if not value:
logger.critical(
"🔓 CRITICAL: env var %s is not set — %s\n"
" This is safe for local dev but MUST be set in production.",
key, desc,
)
# Optional keys — warn if missing
for key, desc in _OPTIONAL.items():
value = os.environ.get(key, "").strip()
if not value:
logger.warning(
"⚠️ Optional env var %s is not set — %s", key, desc
)
if all_ok:
logger.info("✅ Environment validation passed.")
return all_ok
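A hedged sketch of how `validate_env` could be called from the FastAPI lifespan hook the docstring refers to; the surrounding lifespan code is illustrative, not the actual main.py.

```python
# Illustrative lifespan wiring, not the repo's main.py.
from contextlib import asynccontextmanager
from fastapi import FastAPI
from services.env_check import validate_env

@asynccontextmanager
async def lifespan(app: FastAPI):
    validate_env(strict=True)   # exits the process if any required key is missing
    # ... initial data update, start_scheduler(), start_ais_stream(), etc.
    yield
    # ... stop_scheduler(), stop_ais_stream(), etc.

app = FastAPI(lifespan=lifespan)
```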
@@ -0,0 +1,144 @@
"""Earth-observation fetchers — earthquakes, FIRMS fires, space weather, weather radar."""
import csv
import io
import logging
import heapq
from services.network_utils import fetch_with_curl
from services.fetchers._store import latest_data, _data_lock, _mark_fresh
from services.fetchers.retry import with_retry
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Earthquakes (USGS)
# ---------------------------------------------------------------------------
@with_retry(max_retries=1, base_delay=1)
def fetch_earthquakes():
quakes = []
try:
url = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/2.5_day.geojson"
response = fetch_with_curl(url, timeout=10)
if response.status_code == 200:
features = response.json().get("features", [])
for f in features[:50]:
mag = f["properties"]["mag"]
lng, lat, depth = f["geometry"]["coordinates"]
quakes.append({
"id": f["id"], "mag": mag,
"lat": lat, "lng": lng,
"place": f["properties"]["place"]
})
except Exception as e:
logger.error(f"Error fetching earthquakes: {e}")
with _data_lock:
latest_data["earthquakes"] = quakes
if quakes:
_mark_fresh("earthquakes")
# ---------------------------------------------------------------------------
# NASA FIRMS Fires
# ---------------------------------------------------------------------------
@with_retry(max_retries=1, base_delay=2)
def fetch_firms_fires():
"""Fetch global fire/thermal anomalies from NASA FIRMS (NOAA-20 VIIRS, 24h, no key needed)."""
fires = []
try:
url = "https://firms.modaps.eosdis.nasa.gov/data/active_fire/noaa-20-viirs-c2/csv/J1_VIIRS_C2_Global_24h.csv"
response = fetch_with_curl(url, timeout=30)
if response.status_code == 200:
reader = csv.DictReader(io.StringIO(response.text))
all_rows = []
for row in reader:
try:
lat = float(row.get("latitude", 0))
lng = float(row.get("longitude", 0))
frp = float(row.get("frp", 0))
conf = row.get("confidence", "nominal")
daynight = row.get("daynight", "")
bright = float(row.get("bright_ti4", 0))
all_rows.append({
"lat": lat, "lng": lng, "frp": frp,
"brightness": bright, "confidence": conf,
"daynight": daynight,
"acq_date": row.get("acq_date", ""),
"acq_time": row.get("acq_time", ""),
})
except (ValueError, TypeError):
continue
fires = heapq.nlargest(5000, all_rows, key=lambda x: x["frp"])
logger.info(f"FIRMS fires: {len(fires)} hotspots (from {response.status_code})")
except Exception as e:
logger.error(f"Error fetching FIRMS fires: {e}")
with _data_lock:
latest_data["firms_fires"] = fires
if fires:
_mark_fresh("firms_fires")
# ---------------------------------------------------------------------------
# Space Weather (NOAA SWPC)
# ---------------------------------------------------------------------------
@with_retry(max_retries=1, base_delay=1)
def fetch_space_weather():
"""Fetch NOAA SWPC Kp index and recent solar events."""
try:
kp_resp = fetch_with_curl("https://services.swpc.noaa.gov/json/planetary_k_index_1m.json", timeout=10)
kp_value = None
kp_text = "QUIET"
if kp_resp.status_code == 200:
kp_data = kp_resp.json()
if kp_data:
latest_kp = kp_data[-1]
kp_value = float(latest_kp.get("kp_index", 0))
if kp_value >= 7:
kp_text = f"STORM G{min(int(kp_value) - 4, 5)}"
elif kp_value >= 5:
kp_text = f"STORM G{min(int(kp_value) - 4, 5)}"
elif kp_value >= 4:
kp_text = "ACTIVE"
elif kp_value >= 3:
kp_text = "UNSETTLED"
events = []
ev_resp = fetch_with_curl("https://services.swpc.noaa.gov/json/edited_events.json", timeout=10)
if ev_resp.status_code == 200:
all_events = ev_resp.json()
for ev in all_events[-10:]:
events.append({
"type": ev.get("type", ""),
"begin": ev.get("begin", ""),
"end": ev.get("end", ""),
"classtype": ev.get("classtype", ""),
})
with _data_lock:
latest_data["space_weather"] = {
"kp_index": kp_value,
"kp_text": kp_text,
"events": events,
}
_mark_fresh("space_weather")
logger.info(f"Space weather: Kp={kp_value} ({kp_text}), {len(events)} events")
except Exception as e:
logger.error(f"Error fetching space weather: {e}")
# ---------------------------------------------------------------------------
# Weather Radar (RainViewer)
# ---------------------------------------------------------------------------
@with_retry(max_retries=1, base_delay=1)
def fetch_weather():
try:
url = "https://api.rainviewer.com/public/weather-maps.json"
response = fetch_with_curl(url, timeout=10)
if response.status_code == 200:
data = response.json()
if "radar" in data and "past" in data["radar"]:
latest_time = data["radar"]["past"][-1]["time"]
with _data_lock:
latest_data["weather"] = {"time": latest_time, "host": data.get("host", "https://tilecache.rainviewer.com")}
_mark_fresh("weather")
except Exception as e:
logger.error(f"Error fetching weather: {e}")
+58
View File
@@ -0,0 +1,58 @@
"""Financial data fetchers — defense stocks and oil prices.
Uses yfinance for ticker data with concurrent execution for performance.
"""
import logging
import concurrent.futures
import yfinance as yf
from services.fetchers._store import latest_data, _data_lock, _mark_fresh
from services.fetchers.retry import with_retry
logger = logging.getLogger(__name__)
def _fetch_single_ticker(symbol: str, period: str = "2d"):
"""Fetch a single yfinance ticker. Returns (symbol, data_dict) or (symbol, None)."""
try:
ticker = yf.Ticker(symbol)
hist = ticker.history(period=period)
if len(hist) >= 1:
current_price = hist['Close'].iloc[-1]
prev_close = hist['Close'].iloc[0] if len(hist) > 1 else current_price
change_percent = ((current_price - prev_close) / prev_close) * 100 if prev_close else 0
return symbol, {
"price": round(float(current_price), 2),
"change_percent": round(float(change_percent), 2),
"up": bool(change_percent >= 0)
}
except Exception as e:
logger.warning(f"Could not fetch data for {symbol}: {e}")
return symbol, None
@with_retry(max_retries=1, base_delay=1)
def fetch_defense_stocks():
tickers = ["RTX", "LMT", "NOC", "GD", "BA", "PLTR"]
try:
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
results = pool.map(lambda t: _fetch_single_ticker(t, "2d"), tickers)
stocks_data = {sym: data for sym, data in results if data}
with _data_lock:
latest_data['stocks'] = stocks_data
_mark_fresh("stocks")
except Exception as e:
logger.error(f"Error fetching stocks: {e}")
@with_retry(max_retries=1, base_delay=1)
def fetch_oil_prices():
tickers = {"WTI Crude": "CL=F", "Brent Crude": "BZ=F"}
try:
with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
results = pool.map(lambda item: (_fetch_single_ticker(item[1], "5d")[1], item[0]), tickers.items())
oil_data = {name: data for data, name in results if data}
with _data_lock:
latest_data['oil'] = oil_data
_mark_fresh("oil")
except Exception as e:
logger.error(f"Error fetching oil: {e}")
+15 -12
View File
@@ -4,6 +4,7 @@ import re
import os
import time
import math
import json
import logging
import threading
import concurrent.futures
@@ -13,6 +14,7 @@ from cachetools import TTLCache
from services.network_utils import fetch_with_curl
from services.fetchers._store import latest_data, _data_lock, _mark_fresh
from services.fetchers.plane_alert import enrich_with_plane_alert, enrich_with_tracked_names
from services.fetchers.retry import with_retry
logger = logging.getLogger("services.data_fetcher")
@@ -139,7 +141,7 @@ def _fetch_supplemental_sources(seen_hex: set) -> list:
if res.status_code == 200:
data = res.json()
return data.get("ac", [])
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, json.JSONDecodeError, OSError) as e:
logger.debug(f"airplanes.live {region['name']} failed: {e}")
return []
@@ -153,7 +155,7 @@ def _fetch_supplemental_sources(seen_hex: set) -> list:
f["supplemental_source"] = "airplanes.live"
new_supplemental.append(f)
supplemental_hex.add(h)
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, OSError) as e:
logger.warning(f"airplanes.live supplemental fetch failed: {e}")
ap_count = len(new_supplemental)
@@ -172,10 +174,10 @@ def _fetch_supplemental_sources(seen_hex: set) -> list:
f["supplemental_source"] = "adsb.fi"
new_supplemental.append(f)
supplemental_hex.add(h)
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, json.JSONDecodeError, OSError) as e:
logger.debug(f"adsb.fi {region['name']} failed: {e}")
time.sleep(1.1)
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, OSError) as e:
logger.warning(f"adsb.fi supplemental fetch failed: {e}")
fi_count = len(new_supplemental) - ap_count
@@ -236,8 +238,8 @@ def fetch_routes_background(sampled):
"dest_loc": [dest_apt.get("lon", 0), dest_apt.get("lat", 0)],
}
time.sleep(0.25)
except Exception:
logger.debug("Route batch request failed")
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, json.JSONDecodeError, OSError) as e:
logger.debug(f"Route batch request failed: {e}")
finally:
with _routes_lock:
routes_fetch_in_progress = False
@@ -327,7 +329,7 @@ def _classify_and_publish(all_adsb_flights):
"aircraft_category": ac_category,
"nac_p": f.get("nac_p")
})
except Exception as loop_e:
except (ValueError, TypeError, KeyError, AttributeError) as loop_e:
logger.error(f"Flight interpolation error: {loop_e}")
continue
@@ -530,7 +532,7 @@ def _classify_and_publish(all_adsb_flights):
latest_data['gps_jamming'] = jamming_zones
if jamming_zones:
logger.info(f"GPS Jamming: {len(jamming_zones)} interference zones detected")
except Exception as e:
except (ValueError, TypeError, KeyError, ZeroDivisionError) as e:
logger.error(f"GPS Jamming detection error: {e}")
with _data_lock:
latest_data['gps_jamming'] = []
@@ -571,7 +573,7 @@ def _classify_and_publish(all_adsb_flights):
holding_count += 1
if holding_count:
logger.info(f"Holding patterns: {holding_count} aircraft circling")
except Exception as e:
except (ValueError, TypeError, KeyError, ZeroDivisionError) as e:
logger.error(f"Holding pattern detection error: {e}")
with _data_lock:
@@ -596,7 +598,7 @@ def _fetch_adsb_lol_regions():
if res.status_code == 200:
data = res.json()
return data.get("ac", [])
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, json.JSONDecodeError, OSError) as e:
logger.warning(f"Region fetch failed for lat={r['lat']}: {e}")
return []
@@ -663,7 +665,7 @@ def _enrich_with_opensky_and_supplemental(adsb_flights):
})
else:
logger.warning(f"OpenSky API {os_reg['name']} failed: {os_res.status_code}")
except Exception as ex:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, json.JSONDecodeError, OSError) as ex:
logger.error(f"OpenSky fetching error for {os_reg['name']}: {ex}")
cached_opensky_flights = new_opensky_flights
@@ -686,7 +688,7 @@ def _enrich_with_opensky_and_supplemental(adsb_flights):
seen_hex.add(h)
if gap_fill:
logger.info(f"Gap-fill: added {len(gap_fill)} aircraft to pipeline")
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, OSError) as e:
logger.warning(f"Supplemental source fetch failed (non-fatal): {e}")
# Re-publish with enriched data
@@ -697,6 +699,7 @@ def _enrich_with_opensky_and_supplemental(adsb_flights):
logger.error(f"OpenSky/supplemental enrichment error: {e}")
@with_retry(max_retries=1, base_delay=1)
def fetch_flights():
"""Two-phase flight fetching:
Phase 1 (fast): Fetch adsb.lol → classify → publish immediately (~3-5s)
+161
View File
@@ -0,0 +1,161 @@
"""Ship and geopolitics fetchers — AIS vessels, carriers, frontlines, GDELT, LiveUAmap."""
import csv
import io
import math
import logging
from services.network_utils import fetch_with_curl
from services.fetchers._store import latest_data, _data_lock, _mark_fresh
from services.fetchers.retry import with_retry
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Ships (AIS + Carriers)
# ---------------------------------------------------------------------------
@with_retry(max_retries=1, base_delay=1)
def fetch_ships():
"""Fetch real-time AIS vessel data and combine with OSINT carrier positions."""
from services.ais_stream import get_ais_vessels
from services.carrier_tracker import get_carrier_positions
ships = []
try:
carriers = get_carrier_positions()
ships.extend(carriers)
except Exception as e:
logger.error(f"Carrier tracker error (non-fatal): {e}")
carriers = []
try:
ais_vessels = get_ais_vessels()
ships.extend(ais_vessels)
except Exception as e:
logger.error(f"AIS stream error (non-fatal): {e}")
ais_vessels = []
# Enrich ships with yacht alert data (tracked superyachts)
from services.fetchers.yacht_alert import enrich_with_yacht_alert
for ship in ships:
enrich_with_yacht_alert(ship)
logger.info(f"Ships: {len(carriers)} carriers + {len(ais_vessels)} AIS vessels")
with _data_lock:
latest_data['ships'] = ships
_mark_fresh("ships")
# ---------------------------------------------------------------------------
# Airports (ourairports.com)
# ---------------------------------------------------------------------------
cached_airports = []
def find_nearest_airport(lat, lng, max_distance_nm=200):
"""Find the nearest large airport to a given lat/lng using haversine distance."""
if not cached_airports:
return None
best = None
best_dist = float('inf')
lat_r = math.radians(lat)
lng_r = math.radians(lng)
for apt in cached_airports:
apt_lat_r = math.radians(apt['lat'])
apt_lng_r = math.radians(apt['lng'])
dlat = apt_lat_r - lat_r
dlng = apt_lng_r - lng_r
a = math.sin(dlat / 2) ** 2 + math.cos(lat_r) * math.cos(apt_lat_r) * math.sin(dlng / 2) ** 2
c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
dist_nm = 3440.065 * c
if dist_nm < best_dist:
best_dist = dist_nm
best = apt
if best and best_dist <= max_distance_nm:
return {
"iata": best['iata'], "name": best['name'],
"lat": best['lat'], "lng": best['lng'],
"distance_nm": round(best_dist, 1)
}
return None
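A quick usage sketch of `find_nearest_airport`; it assumes `fetch_airports()` has already populated `cached_airports`, and the coordinates are just an example point near Heathrow.

```python
from services.fetchers.geo import fetch_airports, find_nearest_airport

fetch_airports()                                 # populates cached_airports on first call
nearest = find_nearest_airport(51.47, -0.45)     # example point near London Heathrow
if nearest:
    print(f"{nearest['iata']} ({nearest['name']}), {nearest['distance_nm']} nm away")
else:
    print("No large airport within 200 nm")
```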
def fetch_airports():
global cached_airports
if not cached_airports:
logger.info("Downloading global airports database from ourairports.com...")
try:
url = "https://ourairports.com/data/airports.csv"
response = fetch_with_curl(url, timeout=15)
if response.status_code == 200:
f = io.StringIO(response.text)
reader = csv.DictReader(f)
for row in reader:
if row['type'] == 'large_airport' and row['iata_code']:
cached_airports.append({
"id": row['ident'],
"name": row['name'],
"iata": row['iata_code'],
"lat": float(row['latitude_deg']),
"lng": float(row['longitude_deg']),
"type": "airport"
})
logger.info(f"Loaded {len(cached_airports)} large airports into cache.")
except Exception as e:
logger.error(f"Error fetching airports: {e}")
with _data_lock:
latest_data['airports'] = cached_airports
# ---------------------------------------------------------------------------
# Geopolitics & LiveUAMap
# ---------------------------------------------------------------------------
@with_retry(max_retries=1, base_delay=2)
def fetch_frontlines():
"""Fetch Ukraine frontline data (fast — single GitHub API call)."""
try:
from services.geopolitics import fetch_ukraine_frontlines
frontlines = fetch_ukraine_frontlines()
if frontlines:
with _data_lock:
latest_data['frontlines'] = frontlines
_mark_fresh("frontlines")
except Exception as e:
logger.error(f"Error fetching frontlines: {e}")
@with_retry(max_retries=1, base_delay=3)
def fetch_gdelt():
"""Fetch GDELT global military incidents (slow — downloads 32 ZIP files)."""
try:
from services.geopolitics import fetch_global_military_incidents
gdelt = fetch_global_military_incidents()
if gdelt is not None:
with _data_lock:
latest_data['gdelt'] = gdelt
_mark_fresh("gdelt")
except Exception as e:
logger.error(f"Error fetching GDELT: {e}")
def fetch_geopolitics():
"""Legacy wrapper — runs both sequentially. Used by recurring scheduler."""
fetch_frontlines()
fetch_gdelt()
def update_liveuamap():
logger.info("Running scheduled Liveuamap scraper...")
try:
from services.liveuamap_scraper import fetch_liveuamap
res = fetch_liveuamap()
if res:
with _data_lock:
latest_data['liveuamap'] = res
_mark_fresh("liveuamap")
except Exception as e:
logger.error(f"Liveuamap scraper error: {e}")
+176
View File
@@ -0,0 +1,176 @@
"""Infrastructure fetchers — internet outages (IODA), data centers, CCTV, KiwiSDR."""
import json
import time
import heapq
import logging
from pathlib import Path
from cachetools import TTLCache
from services.network_utils import fetch_with_curl
from services.fetchers._store import latest_data, _data_lock, _mark_fresh
from services.fetchers.retry import with_retry
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Internet Outages (IODA — Georgia Tech)
# ---------------------------------------------------------------------------
_region_geocode_cache: TTLCache = TTLCache(maxsize=2000, ttl=86400)
def _geocode_region(region_name: str, country_name: str) -> tuple:
"""Geocode a region using OpenStreetMap Nominatim (cached, respects rate limit)."""
cache_key = f"{region_name}|{country_name}"
if cache_key in _region_geocode_cache:
return _region_geocode_cache[cache_key]
try:
import urllib.parse
query = urllib.parse.quote(f"{region_name}, {country_name}")
url = f"https://nominatim.openstreetmap.org/search?q={query}&format=json&limit=1"
response = fetch_with_curl(url, timeout=8, headers={"User-Agent": "ShadowBroker-OSINT/1.0"})
if response.status_code == 200:
results = response.json()
if results:
lat = float(results[0]["lat"])
lon = float(results[0]["lon"])
_region_geocode_cache[cache_key] = (lat, lon)
return (lat, lon)
except Exception:
pass
_region_geocode_cache[cache_key] = None
return None
@with_retry(max_retries=1, base_delay=1)
def fetch_internet_outages():
"""Fetch regional internet outage alerts from IODA (Georgia Tech)."""
RELIABLE_DATASOURCES = {"bgp", "ping-slash24"}
outages = []
try:
now = int(time.time())
start = now - 86400
url = f"https://api.ioda.inetintel.cc.gatech.edu/v2/outages/alerts?from={start}&until={now}&limit=500"
response = fetch_with_curl(url, timeout=15)
if response.status_code == 200:
data = response.json()
alerts = data.get("data", [])
region_outages = {}
for alert in alerts:
entity = alert.get("entity", {})
etype = entity.get("type", "")
level = alert.get("level", "")
if level == "normal" or etype != "region":
continue
datasource = alert.get("datasource", "")
if datasource not in RELIABLE_DATASOURCES:
continue
code = entity.get("code", "")
name = entity.get("name", "")
attrs = entity.get("attrs", {})
country_code = attrs.get("country_code", "")
country_name = attrs.get("country_name", "")
value = alert.get("value", 0)
history_value = alert.get("historyValue", 0)
severity = 0
if history_value and history_value > 0:
severity = round((1 - value / history_value) * 100)
severity = max(0, min(severity, 100))
if severity < 10:
continue
if code not in region_outages or severity > region_outages[code]["severity"]:
region_outages[code] = {
"region_code": code,
"region_name": name,
"country_code": country_code,
"country_name": country_name,
"level": level,
"datasource": datasource,
"severity": severity,
}
geocoded = []
for rcode, r in region_outages.items():
coords = _geocode_region(r["region_name"], r["country_name"])
if coords:
r["lat"] = coords[0]
r["lng"] = coords[1]
geocoded.append(r)
outages = heapq.nlargest(100, geocoded, key=lambda x: x["severity"])
logger.info(f"Internet outages: {len(outages)} regions affected")
except Exception as e:
logger.error(f"Error fetching internet outages: {e}")
with _data_lock:
latest_data["internet_outages"] = outages
if outages:
_mark_fresh("internet_outages")
# ---------------------------------------------------------------------------
# Data Centers (local geocoded JSON)
# ---------------------------------------------------------------------------
_DC_GEOCODED_PATH = Path(__file__).parent.parent.parent / "data" / "datacenters_geocoded.json"
def fetch_datacenters():
"""Load geocoded data centers (5K+ street-level precise locations)."""
dcs = []
try:
if not _DC_GEOCODED_PATH.exists():
logger.warning(f"Geocoded DC file not found: {_DC_GEOCODED_PATH}")
return
raw = json.loads(_DC_GEOCODED_PATH.read_text(encoding="utf-8"))
for entry in raw:
lat = entry.get("lat")
lng = entry.get("lng")
if lat is None or lng is None:
continue
if not (-90 <= lat <= 90 and -180 <= lng <= 180):
continue
dcs.append({
"name": entry.get("name", "Unknown"),
"company": entry.get("company", ""),
"street": entry.get("street", ""),
"city": entry.get("city", ""),
"country": entry.get("country", ""),
"zip": entry.get("zip", ""),
"lat": lat, "lng": lng,
})
logger.info(f"Data centers: {len(dcs)} geocoded locations loaded")
except Exception as e:
logger.error(f"Error loading data centers: {e}")
with _data_lock:
latest_data["datacenters"] = dcs
if dcs:
_mark_fresh("datacenters")
# ---------------------------------------------------------------------------
# CCTV Cameras
# ---------------------------------------------------------------------------
def fetch_cctv():
try:
from services.cctv_pipeline import get_all_cameras
cameras = get_all_cameras()
with _data_lock:
latest_data["cctv"] = cameras
_mark_fresh("cctv")
except Exception as e:
logger.error(f"Error fetching cctv from DB: {e}")
with _data_lock:
latest_data["cctv"] = []
# ---------------------------------------------------------------------------
# KiwiSDR Receivers
# ---------------------------------------------------------------------------
@with_retry(max_retries=2, base_delay=2)
def fetch_kiwisdr():
try:
from services.kiwisdr_fetcher import fetch_kiwisdr_nodes
nodes = fetch_kiwisdr_nodes()
with _data_lock:
latest_data["kiwisdr"] = nodes
_mark_fresh("kiwisdr")
except Exception as e:
logger.error(f"Error fetching KiwiSDR nodes: {e}")
with _data_lock:
latest_data["kiwisdr"] = []
+2
View File
@@ -1,5 +1,7 @@
"""Military flight tracking and UAV detection from ADS-B data."""
import json
import logging
import requests
from services.network_utils import fetch_with_curl
from services.fetchers._store import latest_data, _data_lock, _mark_fresh
from services.fetchers.plane_alert import enrich_with_plane_alert
+4 -1
View File
@@ -2,9 +2,11 @@
import re
import logging
import concurrent.futures
import requests
import feedparser
from services.network_utils import fetch_with_curl
from services.fetchers._store import latest_data, _data_lock, _mark_fresh
from services.fetchers.retry import with_retry
logger = logging.getLogger("services.data_fetcher")
@@ -89,6 +91,7 @@ _KEYWORD_COORDS = {
}
@with_retry(max_retries=1, base_delay=2)
def fetch_news():
from services.news_feed_config import get_feeds
feed_config = get_feeds()
@@ -103,7 +106,7 @@ def fetch_news():
try:
xml_data = fetch_with_curl(url, timeout=10).text
return source_name, feedparser.parse(xml_data)
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, OSError) as e:
logger.warning(f"Feed {source_name} failed: {e}")
return source_name, None
+49
View File
@@ -0,0 +1,49 @@
"""Retry decorator with exponential backoff + jitter for network-bound fetcher functions.
Usage:
@with_retry(max_retries=3, base_delay=2)
def fetch_something():
...
"""
import time
import random
import logging
import functools
logger = logging.getLogger(__name__)
def with_retry(max_retries: int = 3, base_delay: float = 2.0, max_delay: float = 30.0):
"""Decorator: retries the wrapped function on any exception with exponential backoff + jitter.
Args:
max_retries: Number of retry attempts after the initial failure.
base_delay: Base delay (seconds) for exponential backoff (2 → 4 → 8 → ...).
max_delay: Cap on the delay between retries.
"""
def decorator(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
last_exc = None
for attempt in range(1 + max_retries):
try:
return func(*args, **kwargs)
except Exception as exc:
last_exc = exc
if attempt < max_retries:
delay = min(base_delay * (2 ** attempt), max_delay)
jitter = random.uniform(0, delay * 0.25)
total = delay + jitter
logger.warning(
"%s failed (attempt %d/%d): %s — retrying in %.1fs",
func.__name__, attempt + 1, max_retries + 1, exc, total,
)
time.sleep(total)
else:
logger.error(
"%s failed after %d attempts: %s",
func.__name__, max_retries + 1, exc,
)
raise last_exc # type: ignore[misc]
return wrapper
return decorator
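A small self-contained demonstration of the decorator's behavior; the flaky function is purely illustrative.

```python
# Illustrative only: a fake fetcher that fails twice, then succeeds on the third attempt.
from services.fetchers.retry import with_retry

_calls = {"n": 0}

@with_retry(max_retries=3, base_delay=0.1, max_delay=1.0)
def flaky_fetch():
    _calls["n"] += 1
    if _calls["n"] < 3:
        raise ConnectionError("simulated transient failure")
    return "ok"

print(flaky_fetch())   # logs two warnings with backoff + jitter, then prints "ok"
```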
+11 -10
View File
@@ -11,6 +11,7 @@ import time
import json
import re
import logging
import requests
from pathlib import Path
from datetime import datetime, timedelta
from sgp4.api import Satrec, WGS72, jday
@@ -53,7 +54,7 @@ def _load_sat_cache():
return data
else:
logger.info(f"Satellites: Disk cache is {age_hours:.0f}h old, will try fresh fetch")
except Exception as e:
except (IOError, OSError, json.JSONDecodeError, ValueError, KeyError) as e:
logger.warning(f"Satellites: Failed to load disk cache: {e}")
return None
@@ -65,7 +66,7 @@ def _save_sat_cache(data):
json.dump(data, f)
_save_cache_meta()
logger.info(f"Satellites: Saved {len(data)} records to disk cache")
except Exception as e:
except (IOError, OSError) as e:
logger.warning(f"Satellites: Failed to save disk cache: {e}")
def _load_cache_meta():
@@ -75,7 +76,7 @@ def _load_cache_meta():
with open(_SAT_CACHE_META_PATH, "r") as f:
meta = json.load(f)
_sat_gp_cache["last_modified"] = meta.get("last_modified")
except Exception:
except (IOError, OSError, json.JSONDecodeError, ValueError, KeyError):
pass
def _save_cache_meta():
@@ -83,7 +84,7 @@ def _save_cache_meta():
try:
with open(_SAT_CACHE_META_PATH, "w") as f:
json.dump({"last_modified": _sat_gp_cache.get("last_modified")}, f)
except Exception:
except (IOError, OSError):
pass
@@ -163,7 +164,7 @@ def _parse_tle_to_gp(name, norad_id, line1, line2):
"BSTAR": bstar,
"EPOCH": epoch_dt.strftime("%Y-%m-%dT%H:%M:%S"),
}
except Exception:
except (ValueError, TypeError, IndexError, KeyError):
return None
@@ -196,7 +197,7 @@ def _fetch_satellites_from_tle_api():
seen_ids.add(sat_id)
all_results.append(gp)
time.sleep(1) # Polite delay between requests
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, json.JSONDecodeError, OSError) as e:
logger.debug(f"TLE fallback search '{term}' failed: {e}")
return all_results
@@ -238,7 +239,7 @@ def fetch_satellites():
_save_sat_cache(gp_data)
logger.info(f"Satellites: Downloaded {len(gp_data)} GP records from CelesTrak")
break
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, json.JSONDecodeError, OSError) as e:
logger.warning(f"Satellites: Failed to fetch from {url}: {e}")
continue
@@ -252,7 +253,7 @@ def fetch_satellites():
_sat_gp_cache["source"] = "tle_api"
_save_sat_cache(fallback_data)
logger.info(f"Satellites: Got {len(fallback_data)} records from TLE fallback API")
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, OSError) as e:
logger.error(f"Satellites: TLE fallback also failed: {e}")
if _sat_gp_cache["data"] is None:
@@ -375,11 +376,11 @@ def fetch_satellites():
'BSTAR', 'EPOCH', 'tle1', 'tle2'):
s.pop(k, None)
sats.append(s)
except Exception:
except (ValueError, TypeError, KeyError, AttributeError, ZeroDivisionError):
continue
logger.info(f"Satellites: {len(classified)} classified, {len(sats)} positioned")
except Exception as e:
except (requests.RequestException, ConnectionError, TimeoutError, ValueError, KeyError, json.JSONDecodeError, OSError) as e:
logger.error(f"Error fetching satellites: {e}")
if sats:
with _data_lock:
+62
View File
@@ -0,0 +1,62 @@
"""Yacht-Alert DB — load and enrich AIS vessels with tracked yacht metadata."""
import os
import json
import logging
logger = logging.getLogger("services.data_fetcher")
# Category -> color mapping
_CATEGORY_COLOR: dict[str, str] = {
"Tech Billionaire": "#FF69B4",
"Celebrity / Mogul": "#FF69B4",
"Oligarch Watch": "#FF2020",
}
def _category_to_color(cat: str) -> str:
"""Map category to display color. Defaults to hot pink."""
return _CATEGORY_COLOR.get(cat, "#FF69B4")
_YACHT_ALERT_DB: dict = {}
def _load_yacht_alert_db():
"""Load yacht_alert_db.json into memory at import time."""
global _YACHT_ALERT_DB
json_path = os.path.join(
os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))),
"data", "yacht_alert_db.json"
)
if not os.path.exists(json_path):
logger.warning(f"Yacht-Alert DB not found at {json_path}")
return
try:
with open(json_path, "r", encoding="utf-8") as fh:
raw = json.load(fh)
for mmsi_str, info in raw.items():
info["color"] = _category_to_color(info.get("category", ""))
_YACHT_ALERT_DB[mmsi_str] = info
logger.info(f"Yacht-Alert DB loaded: {len(_YACHT_ALERT_DB)} vessels")
except (IOError, OSError, json.JSONDecodeError, ValueError, KeyError) as e:
logger.error(f"Failed to load Yacht-Alert DB: {e}")
_load_yacht_alert_db()
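# Illustrative entry shape for data/yacht_alert_db.json — field names are taken from
# the enrichment below; the concrete values here are made up:
#   {
#     "244123456": {"owner": "Example Owner", "name": "M/Y Example",
#                   "category": "Tech Billionaire", "builder": "Example Yard",
#                   "length_m": 107, "year": 2021, "link": "https://example.com"}
#   }
# The loader injects "color" into each entry via _category_to_color().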
def enrich_with_yacht_alert(ship: dict) -> dict:
"""If ship's MMSI is in the Yacht-Alert DB, attach owner/alert metadata."""
mmsi = str(ship.get("mmsi", "")).strip()
if mmsi and mmsi in _YACHT_ALERT_DB:
info = _YACHT_ALERT_DB[mmsi]
ship["yacht_alert"] = True
ship["yacht_owner"] = info["owner"]
ship["yacht_name"] = info["name"]
ship["yacht_category"] = info["category"]
ship["yacht_color"] = info["color"]
ship["yacht_builder"] = info.get("builder", "")
ship["yacht_length"] = info.get("length_m", 0)
ship["yacht_year"] = info.get("year", 0)
ship["yacht_link"] = info.get("link", "")
return ship
+46 -36
View File
@@ -3,6 +3,7 @@ import json
import subprocess
import shutil
import time
import threading
import requests
from urllib.parse import urlparse
from requests.adapters import HTTPAdapter
@@ -30,6 +31,9 @@ _DOMAIN_FAIL_TTL = 300 # 5 minutes
_circuit_breaker: dict[str, float] = {}
_CIRCUIT_BREAKER_TTL = 120 # 2 minutes
# Lock protecting _domain_fail_cache and _circuit_breaker mutations
_cb_lock = threading.Lock()
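# Fallback ladder implemented below (summary for readers of this diff):
#   - requests succeeds                  -> clear both caches, return the response
#   - requests raises                    -> note the domain in _domain_fail_cache
#                                           (skip requests for 5 min) and fall back to curl
#   - curl succeeds with status < 400    -> clear the circuit breaker, return a _DummyResponse
#   - curl exits non-zero or raises      -> open the circuit breaker for the domain (2 min fail-fast)
# All mutations of the two dicts go through _cb_lock.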
class _DummyResponse:
"""Minimal response object matching requests.Response interface."""
def __init__(self, status_code, text):
@@ -61,13 +65,14 @@ def fetch_with_curl(url, method="GET", json_data=None, timeout=15, headers=None)
domain = urlparse(url).netloc
# Circuit breaker: if domain failed completely <2min ago, fail fast
if domain in _circuit_breaker and (time.time() - _circuit_breaker[domain]) < _CIRCUIT_BREAKER_TTL:
raise Exception(f"Circuit breaker open for {domain} (failed <{_CIRCUIT_BREAKER_TTL}s ago)")
with _cb_lock:
if domain in _circuit_breaker and (time.time() - _circuit_breaker[domain]) < _CIRCUIT_BREAKER_TTL:
raise Exception(f"Circuit breaker open for {domain} (failed <{_CIRCUIT_BREAKER_TTL}s ago)")
# Check if this domain recently failed with requests — skip straight to curl
if domain in _domain_fail_cache and (time.time() - _domain_fail_cache[domain]) < _DOMAIN_FAIL_TTL:
pass # Fall through to curl below
else:
with _cb_lock:
_skip_requests = domain in _domain_fail_cache and (time.time() - _domain_fail_cache[domain]) < _DOMAIN_FAIL_TTL
if not _skip_requests:
try:
# Use a short connect timeout (3s) so firewall blocks fail fast,
# but allow the full timeout for reading the response body.
@@ -78,42 +83,47 @@ def fetch_with_curl(url, method="GET", json_data=None, timeout=15, headers=None)
res = _session.get(url, timeout=req_timeout, headers=default_headers)
res.raise_for_status()
# Clear failure caches on success
_domain_fail_cache.pop(domain, None)
_circuit_breaker.pop(domain, None)
with _cb_lock:
_domain_fail_cache.pop(domain, None)
_circuit_breaker.pop(domain, None)
return res
except (requests.RequestException, ConnectionError, TimeoutError, OSError) as e:
logger.warning(f"Python requests failed for {url} ({e}), falling back to bash curl...")
_domain_fail_cache[domain] = time.time()
with _cb_lock:
_domain_fail_cache[domain] = time.time()
# Build curl as argument list — never pass through shell to prevent injection
_CURL_PATH = shutil.which("curl") or "curl"
cmd = [_CURL_PATH, "-s", "-w", "\n%{http_code}"]
for k, v in default_headers.items():
cmd += ["-H", f"{k}: {v}"]
if method == "POST" and json_data:
cmd += ["-X", "POST", "-H", "Content-Type: application/json",
"--data-binary", "@-"]
cmd.append(url)
# Curl fallback — reached from both _skip_requests and requests-exception paths
_CURL_PATH = shutil.which("curl") or "curl"
cmd = [_CURL_PATH, "-s", "-w", "\n%{http_code}"]
for k, v in default_headers.items():
cmd += ["-H", f"{k}: {v}"]
if method == "POST" and json_data:
cmd += ["-X", "POST", "-H", "Content-Type: application/json",
"--data-binary", "@-"]
cmd.append(url)
try:
stdin_data = json.dumps(json_data) if (method == "POST" and json_data) else None
res = subprocess.run(
cmd, capture_output=True, text=True, timeout=timeout + 5,
input=stdin_data
)
if res.returncode == 0 and res.stdout.strip():
# Parse HTTP status code from -w output (last line)
lines = res.stdout.rstrip().rsplit("\n", 1)
body = lines[0] if len(lines) > 1 else res.stdout
http_code = int(lines[-1]) if len(lines) > 1 and lines[-1].strip().isdigit() else 200
if http_code < 400:
try:
stdin_data = json.dumps(json_data) if (method == "POST" and json_data) else None
res = subprocess.run(
cmd, capture_output=True, text=True, timeout=timeout + 5,
input=stdin_data
)
if res.returncode == 0 and res.stdout.strip():
# Parse HTTP status code from -w output (last line)
lines = res.stdout.rstrip().rsplit("\n", 1)
body = lines[0] if len(lines) > 1 else res.stdout
http_code = int(lines[-1]) if len(lines) > 1 and lines[-1].strip().isdigit() else 200
if http_code < 400:
with _cb_lock:
_circuit_breaker.pop(domain, None) # Clear circuit breaker on success
return _DummyResponse(http_code, body)
else:
logger.error(f"bash curl fallback failed: exit={res.returncode} stderr={res.stderr[:200]}")
return _DummyResponse(http_code, body)
else:
logger.error(f"bash curl fallback failed: exit={res.returncode} stderr={res.stderr[:200]}")
with _cb_lock:
_circuit_breaker[domain] = time.time()
return _DummyResponse(500, "")
except (subprocess.SubprocessError, ConnectionError, TimeoutError, OSError) as curl_e:
logger.error(f"bash curl fallback exception: {curl_e}")
_circuit_breaker[domain] = time.time()
return _DummyResponse(500, "")
except (subprocess.SubprocessError, ConnectionError, TimeoutError, OSError) as curl_e:
logger.error(f"bash curl fallback exception: {curl_e}")
with _cb_lock:
_circuit_breaker[domain] = time.time()
return _DummyResponse(500, "")
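# Minimal usage sketch (illustrative only — the real call sites live in the fetcher
# modules, which are not part of this hunk):
#
#     res = fetch_with_curl("https://api.example.com/data", timeout=15)
#     if res.status_code < 400:
#         payload = res.json()   # works for both requests.Response and _DummyResponse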
+26
View File
@@ -0,0 +1,26 @@
from pydantic import BaseModel
from typing import Optional, Dict, List, Any
class HealthResponse(BaseModel):
status: str
last_updated: Optional[str] = None
sources: Dict[str, int]
freshness: Dict[str, str]
uptime_seconds: int
class RefreshResponse(BaseModel):
status: str
class AisFeedResponse(BaseModel):
status: str
ingested: int = 0
class RouteResponse(BaseModel):
orig_loc: Optional[list] = None
dest_loc: Optional[list] = None
origin_name: Optional[str] = None
dest_name: Optional[str] = None
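# Presumed wiring (illustrative — the actual route decorators live in main.py or the
# router modules, neither of which is shown in this hunk; the "/api/health" path is
# an assumption):
#
#     @app.get("/api/health", response_model=HealthResponse)
#     def health() -> HealthResponse: ...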
+159
View File
@@ -0,0 +1,159 @@
"""Tests for network_utils — fetch_with_curl, circuit breaker, domain fail cache."""
import time
import pytest
from unittest.mock import patch, MagicMock
from services.network_utils import fetch_with_curl, _circuit_breaker, _domain_fail_cache, _cb_lock, _DummyResponse
class TestDummyResponse:
"""Tests for the minimal response object used as curl fallback."""
def test_status_code_and_text(self):
resp = _DummyResponse(200, '{"ok": true}')
assert resp.status_code == 200
assert resp.text == '{"ok": true}'
def test_json_parsing(self):
resp = _DummyResponse(200, '{"key": "value", "num": 42}')
data = resp.json()
assert data["key"] == "value"
assert data["num"] == 42
def test_content_bytes(self):
resp = _DummyResponse(200, "hello")
assert resp.content == b"hello"
def test_raise_for_status_ok(self):
resp = _DummyResponse(200, "ok")
resp.raise_for_status() # Should not raise
def test_raise_for_status_error(self):
resp = _DummyResponse(500, "server error")
with pytest.raises(Exception, match="HTTP 500"):
resp.raise_for_status()
def test_raise_for_status_404(self):
resp = _DummyResponse(404, "not found")
with pytest.raises(Exception, match="HTTP 404"):
resp.raise_for_status()
class TestCircuitBreaker:
"""Tests for the circuit breaker and domain fail cache."""
def setup_method(self):
"""Clear caches before each test."""
with _cb_lock:
_circuit_breaker.clear()
_domain_fail_cache.clear()
def test_circuit_breaker_blocks_request(self):
"""If a domain is in circuit breaker, fetch_with_curl should fail fast."""
with _cb_lock:
_circuit_breaker["example.com"] = time.time()
with pytest.raises(Exception, match="Circuit breaker open"):
fetch_with_curl("https://example.com/test")
def test_circuit_breaker_expires_after_ttl(self):
"""Circuit breaker entries older than TTL should be ignored."""
with _cb_lock:
_circuit_breaker["expired.com"] = time.time() - 200 # > 120s TTL
# Should not raise the circuit-breaker error — the entry is older than the TTL;
# the underlying request is mocked to succeed below
mock_resp = MagicMock()
mock_resp.status_code = 200
mock_resp.text = "ok"
mock_resp.raise_for_status = MagicMock()
with patch("services.network_utils._session") as mock_session:
mock_session.get.return_value = mock_resp
result = fetch_with_curl("https://expired.com/test")
assert result.status_code == 200
def test_domain_fail_cache_skips_to_curl(self):
"""If a domain recently failed with requests, skip straight to curl."""
with _cb_lock:
_domain_fail_cache["skip-to-curl.com"] = time.time()
# Mock subprocess to simulate curl success
mock_result = MagicMock()
mock_result.returncode = 0
mock_result.stdout = '{"data": true}\n200'
mock_result.stderr = ''
with patch("subprocess.run", return_value=mock_result) as mock_run:
result = fetch_with_curl("https://skip-to-curl.com/api")
assert result.status_code == 200
assert result.json()["data"] is True
# Verify subprocess.run was called (curl fallback)
mock_run.assert_called_once()
def test_successful_request_clears_caches(self):
"""Successful requests should clear both domain_fail_cache and circuit_breaker."""
domain = "success-clears.com"
with _cb_lock:
_domain_fail_cache[domain] = time.time() - 400 # Expired, won't skip
_circuit_breaker[domain] = time.time() - 200 # Expired, won't block
mock_resp = MagicMock()
mock_resp.status_code = 200
mock_resp.text = "ok"
mock_resp.raise_for_status = MagicMock()
with patch("services.network_utils._session") as mock_session:
mock_session.get.return_value = mock_resp
fetch_with_curl(f"https://{domain}/test")
with _cb_lock:
assert domain not in _domain_fail_cache
assert domain not in _circuit_breaker
class TestFetchWithCurl:
"""Tests for the primary fetch_with_curl function."""
def setup_method(self):
with _cb_lock:
_circuit_breaker.clear()
_domain_fail_cache.clear()
def test_successful_get_returns_response(self):
mock_resp = MagicMock()
mock_resp.status_code = 200
mock_resp.text = '{"result": 42}'
mock_resp.raise_for_status = MagicMock()
with patch("services.network_utils._session") as mock_session:
mock_session.get.return_value = mock_resp
result = fetch_with_curl("https://api.example.com/data")
assert result.status_code == 200
def test_post_with_json_data(self):
mock_resp = MagicMock()
mock_resp.status_code = 200
mock_resp.text = '{"created": true}'
mock_resp.raise_for_status = MagicMock()
with patch("services.network_utils._session") as mock_session:
mock_session.post.return_value = mock_resp
result = fetch_with_curl("https://api.example.com/create",
method="POST", json_data={"name": "test"})
assert result.status_code == 200
mock_session.post.assert_called_once()
def test_custom_headers_merged(self):
mock_resp = MagicMock()
mock_resp.status_code = 200
mock_resp.text = "ok"
mock_resp.raise_for_status = MagicMock()
with patch("services.network_utils._session") as mock_session:
mock_session.get.return_value = mock_resp
fetch_with_curl("https://api.example.com/data",
headers={"Authorization": "Bearer token123"})
call_args = mock_session.get.call_args
headers = call_args.kwargs.get("headers", {})
assert "Authorization" in headers
assert headers["Authorization"] == "Bearer token123"
+72
View File
@@ -0,0 +1,72 @@
"""Tests for Pydantic response schemas."""
import pytest
from pydantic import ValidationError
from services.schemas import HealthResponse, RefreshResponse, AisFeedResponse, RouteResponse
class TestHealthResponse:
def test_valid_health_response(self):
data = {
"status": "ok",
"last_updated": "2024-01-01T00:00:00",
"sources": {"flights": 150, "ships": 42},
"freshness": {"flights": "2024-01-01T00:00:00", "ships": "2024-01-01T00:00:00"},
"uptime_seconds": 3600
}
resp = HealthResponse(**data)
assert resp.status == "ok"
assert resp.sources["flights"] == 150
assert resp.uptime_seconds == 3600
def test_health_response_optional_last_updated(self):
data = {
"status": "ok",
"sources": {},
"freshness": {},
"uptime_seconds": 0
}
resp = HealthResponse(**data)
assert resp.last_updated is None
def test_health_response_missing_required_field(self):
with pytest.raises(ValidationError):
HealthResponse(status="ok") # Missing sources, freshness, uptime_seconds
class TestRefreshResponse:
def test_valid_refresh(self):
resp = RefreshResponse(status="refreshing")
assert resp.status == "refreshing"
def test_missing_status(self):
with pytest.raises(ValidationError):
RefreshResponse()
class TestAisFeedResponse:
def test_valid_ais_feed(self):
resp = AisFeedResponse(status="ok", ingested=42)
assert resp.ingested == 42
def test_default_ingested_zero(self):
resp = AisFeedResponse(status="ok")
assert resp.ingested == 0
class TestRouteResponse:
def test_valid_route(self):
resp = RouteResponse(
orig_loc=[40.6413, -73.7781],
dest_loc=[51.4700, -0.4543],
origin_name="JFK",
dest_name="LHR"
)
assert resp.origin_name == "JFK"
assert len(resp.orig_loc) == 2
def test_all_optional(self):
resp = RouteResponse()
assert resp.orig_loc is None
assert resp.dest_loc is None
assert resp.origin_name is None
assert resp.dest_name is None
+97
View File
@@ -0,0 +1,97 @@
"""Tests for the shared in-memory data store."""
import threading
import time
import pytest
from services.fetchers._store import latest_data, source_timestamps, _mark_fresh, _data_lock
class TestLatestDataStructure:
"""Verify the store has the expected keys and default values."""
def test_has_all_required_keys(self):
expected_keys = {
"last_updated", "news", "stocks", "oil", "flights", "ships",
"military_flights", "tracked_flights", "cctv", "weather",
"earthquakes", "uavs", "frontlines", "gdelt", "liveuamap",
"kiwisdr", "space_weather", "internet_outages", "firms_fires",
"datacenters"
}
assert expected_keys.issubset(set(latest_data.keys()))
def test_list_keys_default_to_empty_list(self):
list_keys = ["news", "flights", "ships", "military_flights",
"tracked_flights", "cctv", "earthquakes", "uavs",
"gdelt", "liveuamap", "kiwisdr", "internet_outages",
"firms_fires", "datacenters"]
for key in list_keys:
assert isinstance(latest_data[key], list), f"{key} should default to list"
def test_dict_keys_default_to_empty_dict(self):
dict_keys = ["stocks", "oil"]
for key in dict_keys:
assert isinstance(latest_data[key], dict), f"{key} should default to dict"
class TestMarkFresh:
"""Tests for _mark_fresh timestamp helper."""
def test_records_timestamp_for_single_key(self):
_mark_fresh("test_key_1")
assert "test_key_1" in source_timestamps
assert isinstance(source_timestamps["test_key_1"], str)
def test_records_timestamps_for_multiple_keys(self):
_mark_fresh("multi_a", "multi_b", "multi_c")
assert "multi_a" in source_timestamps
assert "multi_b" in source_timestamps
assert "multi_c" in source_timestamps
def test_timestamps_are_iso_format(self):
_mark_fresh("iso_test")
ts = source_timestamps["iso_test"]
# ISO format: YYYY-MM-DDTHH:MM:SS.ffffff
assert "T" in ts
assert len(ts) >= 19 # At least YYYY-MM-DDTHH:MM:SS
def test_successive_calls_update_timestamp(self):
_mark_fresh("update_test")
ts1 = source_timestamps["update_test"]
time.sleep(0.01)
_mark_fresh("update_test")
ts2 = source_timestamps["update_test"]
assert ts2 >= ts1
class TestDataLock:
"""Verify the data lock works for thread safety."""
def test_lock_exists_and_is_a_lock(self):
assert isinstance(_data_lock, type(threading.Lock()))
def test_concurrent_writes_dont_corrupt(self):
"""Simulate concurrent writes to latest_data through the lock."""
errors = []
def writer(key, value, iterations=100):
try:
for _ in range(iterations):
with _data_lock:
latest_data[key] = value
# Read back immediately — should be our value
assert latest_data[key] == value
except Exception as e:
errors.append(e)
threads = [
threading.Thread(target=writer, args=("test_concurrent", [1, 2, 3])),
threading.Thread(target=writer, args=("test_concurrent", [4, 5, 6])),
threading.Thread(target=writer, args=("test_concurrent", [7, 8, 9])),
]
for t in threads:
t.start()
for t in threads:
t.join()
assert len(errors) == 0, f"Thread safety errors: {errors}"
# Restore default
latest_data["test_concurrent"] = []
+2038 -6
View File
File diff suppressed because it is too large
+11 -3
View File
@@ -1,6 +1,6 @@
{
"name": "frontend",
"version": "0.9.0",
"version": "0.9.5",
"private": true,
"scripts": {
"dev": "concurrently \"npm run dev:frontend\" \"npm run dev:backend\"",
@@ -8,7 +8,10 @@
"dev:backend": "node ../start-backend.js",
"build": "next build",
"start": "next start",
"lint": "eslint"
"lint": "eslint",
"test": "vitest run",
"test:watch": "vitest",
"test:coverage": "vitest run --coverage"
},
"dependencies": {
"@mapbox/point-geometry": "^1.1.0",
@@ -24,13 +27,18 @@
},
"devDependencies": {
"@tailwindcss/postcss": "^4",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@types/node": "^20",
"@types/react": "^19",
"@types/react-dom": "^19",
"@vitest/coverage-v8": "^4.1.0",
"concurrently": "^9.2.1",
"eslint": "^9",
"eslint-config-next": "16.1.6",
"jsdom": "^28.1.0",
"tailwindcss": "^4",
"typescript": "^5"
"typescript": "^5",
"vitest": "^4.1.0"
}
}
@@ -0,0 +1,297 @@
import { describe, it, expect } from 'vitest';
import {
buildEarthquakesGeoJSON, buildJammingGeoJSON, buildCctvGeoJSON, buildKiwisdrGeoJSON,
buildFirmsGeoJSON, buildInternetOutagesGeoJSON, buildDataCentersGeoJSON,
buildGdeltGeoJSON, buildLiveuaGeoJSON, buildFrontlineGeoJSON
} from '@/components/map/geoJSONBuilders';
import type { Earthquake, GPSJammingZone, FireHotspot, InternetOutage, DataCenter, GDELTIncident, LiveUAmapIncident, CCTVCamera, KiwiSDR } from '@/types/dashboard';
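// Shared builder contract inferred from these tests (not from the implementation file):
// each builder takes an optional array (plus, for some layers, an inView(lat, lng)
// predicate) and returns a GeoJSON FeatureCollection, or null for empty/undefined input.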
// ─── Earthquakes ────────────────────────────────────────────────────────────
describe('buildEarthquakesGeoJSON', () => {
it('returns null for empty/undefined input', () => {
expect(buildEarthquakesGeoJSON(undefined)).toBeNull();
expect(buildEarthquakesGeoJSON([])).toBeNull();
});
it('builds valid FeatureCollection from earthquake data', () => {
const earthquakes: Earthquake[] = [
{ id: 'eq1', mag: 5.2, lat: 35.0, lng: 139.0, place: 'Japan' },
{ id: 'eq2', mag: 3.1, lat: 40.0, lng: -120.0, place: 'California', title: 'Test Title' },
];
const result = buildEarthquakesGeoJSON(earthquakes);
expect(result).not.toBeNull();
expect(result!.type).toBe('FeatureCollection');
expect(result!.features).toHaveLength(2);
const f0 = result!.features[0];
expect(f0.geometry).toEqual({ type: 'Point', coordinates: [139.0, 35.0] });
expect(f0.properties?.type).toBe('earthquake');
expect(f0.properties?.name).toContain('M5.2');
expect(f0.properties?.name).toContain('Japan');
});
it('filters out entries with null lat/lng', () => {
const earthquakes = [
{ id: 'eq1', mag: 5.0, lat: null as any, lng: 10.0, place: 'X' },
{ id: 'eq2', mag: 3.0, lat: 20.0, lng: 30.0, place: 'Y' },
];
const result = buildEarthquakesGeoJSON(earthquakes);
expect(result!.features).toHaveLength(1);
});
it('includes title when present', () => {
const earthquakes: Earthquake[] = [
{ id: 'eq1', mag: 4.0, lat: 10.0, lng: 20.0, place: 'Test', title: 'Big One' },
];
const result = buildEarthquakesGeoJSON(earthquakes);
expect(result!.features[0].properties?.title).toBe('Big One');
});
});
// ─── GPS Jamming ────────────────────────────────────────────────────────────
describe('buildJammingGeoJSON', () => {
it('returns null for empty input', () => {
expect(buildJammingGeoJSON(undefined)).toBeNull();
expect(buildJammingGeoJSON([])).toBeNull();
});
it('builds polygon features with correct opacity mapping', () => {
const zones: GPSJammingZone[] = [
{ lat: 50, lng: 30, severity: 'high', ratio: 0.8, degraded: 100, total: 125 },
{ lat: 45, lng: 35, severity: 'medium', ratio: 0.5, degraded: 50, total: 100 },
{ lat: 40, lng: 25, severity: 'low', ratio: 0.2, degraded: 20, total: 100 },
];
const result = buildJammingGeoJSON(zones);
expect(result!.features).toHaveLength(3);
expect(result!.features[0].properties?.opacity).toBe(0.45);
expect(result!.features[1].properties?.opacity).toBe(0.3);
expect(result!.features[2].properties?.opacity).toBe(0.18);
});
it('builds correct 1°×1° polygon geometry', () => {
const zones: GPSJammingZone[] = [
{ lat: 50, lng: 30, severity: 'high', ratio: 0.8, degraded: 100, total: 125 },
];
const result = buildJammingGeoJSON(zones);
const geom = result!.features[0].geometry;
expect(geom.type).toBe('Polygon');
if (geom.type === 'Polygon') {
const ring = geom.coordinates[0];
expect(ring).toHaveLength(5); // Closed ring
expect(ring[0]).toEqual([29.5, 49.5]);
expect(ring[2]).toEqual([30.5, 50.5]);
}
});
});
// ─── CCTV ───────────────────────────────────────────────────────────────────
describe('buildCctvGeoJSON', () => {
it('returns null for empty input', () => {
expect(buildCctvGeoJSON(undefined)).toBeNull();
});
it('builds features from camera data', () => {
const cameras: CCTVCamera[] = [
{ id: 'cam1', lat: 40.7, lon: -74.0, direction_facing: 'North', source_agency: 'DOT' },
];
const result = buildCctvGeoJSON(cameras);
expect(result!.features).toHaveLength(1);
expect(result!.features[0].properties?.type).toBe('cctv');
expect(result!.features[0].properties?.name).toBe('North');
});
it('respects inView filter', () => {
const cameras: CCTVCamera[] = [
{ id: 'cam1', lat: 40.7, lon: -74.0 },
{ id: 'cam2', lat: 10.0, lon: 20.0 },
];
const inView = (lat: number, _lng: number) => lat > 30;
const result = buildCctvGeoJSON(cameras, inView);
expect(result!.features).toHaveLength(1);
});
});
// ─── KiwiSDR ────────────────────────────────────────────────────────────────
describe('buildKiwisdrGeoJSON', () => {
it('returns null for empty input', () => {
expect(buildKiwisdrGeoJSON(undefined)).toBeNull();
});
it('builds features with SDR properties', () => {
const receivers: KiwiSDR[] = [
{ lat: 52.0, lon: 13.0, name: 'Berlin SDR', url: 'http://test.com', users: 3, users_max: 8, bands: 'HF', antenna: 'Long Wire', location: 'Berlin' },
];
const result = buildKiwisdrGeoJSON(receivers);
expect(result!.features).toHaveLength(1);
expect(result!.features[0].properties?.name).toBe('Berlin SDR');
expect(result!.features[0].properties?.users).toBe(3);
});
});
// ─── FIRMS Fires ────────────────────────────────────────────────────────────
describe('buildFirmsGeoJSON', () => {
it('returns null for empty input', () => {
expect(buildFirmsGeoJSON(undefined)).toBeNull();
});
it('classifies fires by FRP thresholds', () => {
const fires: FireHotspot[] = [
{ lat: 10, lng: 20, frp: 150, brightness: 400, confidence: 'high', daynight: 'D', acq_date: '2024-01-01', acq_time: '1200' },
{ lat: 11, lng: 21, frp: 50, brightness: 350, confidence: 'medium', daynight: 'N', acq_date: '2024-01-01', acq_time: '0100' },
{ lat: 12, lng: 22, frp: 10, brightness: 300, confidence: 'low', daynight: 'D', acq_date: '2024-01-01', acq_time: '1400' },
{ lat: 13, lng: 23, frp: 2, brightness: 250, confidence: 'low', daynight: 'D', acq_date: '2024-01-01', acq_time: '1500' },
];
const result = buildFirmsGeoJSON(fires);
expect(result!.features).toHaveLength(4);
expect(result!.features[0].properties?.iconId).toBe('fire-darkred');
expect(result!.features[1].properties?.iconId).toBe('fire-red');
expect(result!.features[2].properties?.iconId).toBe('fire-orange');
expect(result!.features[3].properties?.iconId).toBe('fire-yellow');
});
it('formats daynight correctly', () => {
const fires: FireHotspot[] = [
{ lat: 10, lng: 20, frp: 5, brightness: 300, confidence: 'low', daynight: 'D', acq_date: '2024-01-01', acq_time: '1200' },
{ lat: 11, lng: 21, frp: 5, brightness: 300, confidence: 'low', daynight: 'N', acq_date: '2024-01-01', acq_time: '0100' },
];
const result = buildFirmsGeoJSON(fires);
expect(result!.features[0].properties?.daynight).toBe('Day');
expect(result!.features[1].properties?.daynight).toBe('Night');
});
});
// ─── Internet Outages ───────────────────────────────────────────────────────
describe('buildInternetOutagesGeoJSON', () => {
it('returns null for empty input', () => {
expect(buildInternetOutagesGeoJSON(undefined)).toBeNull();
});
it('builds features with detail string', () => {
const outages: InternetOutage[] = [
{ region_code: 'TX', region_name: 'Texas', country_code: 'US', country_name: 'United States', lat: 31.0, lng: -100.0, severity: 45, level: 'region', datasource: 'bgp' },
];
const result = buildInternetOutagesGeoJSON(outages);
expect(result!.features).toHaveLength(1);
expect(result!.features[0].properties?.detail).toContain('Texas');
expect(result!.features[0].properties?.detail).toContain('45% drop');
});
it('filters out entries with null coordinates', () => {
const outages: InternetOutage[] = [
{ region_code: 'TX', region_name: 'Texas', country_code: 'US', country_name: 'United States', lat: null as any, lng: null as any, severity: 20, level: 'region', datasource: 'bgp' },
{ region_code: 'CA', region_name: 'California', country_code: 'US', country_name: 'United States', lat: 37.0, lng: -122.0, severity: 30, level: 'region', datasource: 'bgp' },
];
const result = buildInternetOutagesGeoJSON(outages);
expect(result!.features).toHaveLength(1);
});
});
// ─── Data Centers ───────────────────────────────────────────────────────────
describe('buildDataCentersGeoJSON', () => {
it('returns null for empty input', () => {
expect(buildDataCentersGeoJSON(undefined)).toBeNull();
});
it('builds features with datacenter properties', () => {
const dcs: DataCenter[] = [
{ lat: 40.0, lng: -74.0, name: 'NYC-DC1', company: 'Equinix', street: '123 Main', city: 'New York', country: 'US', zip: '10001' },
];
const result = buildDataCentersGeoJSON(dcs);
expect(result!.features).toHaveLength(1);
expect(result!.features[0].properties?.id).toBe('dc-0');
expect(result!.features[0].properties?.company).toBe('Equinix');
});
});
// ─── GDELT ──────────────────────────────────────────────────────────────────
describe('buildGdeltGeoJSON', () => {
it('returns null for empty input', () => {
expect(buildGdeltGeoJSON(undefined)).toBeNull();
});
it('builds features from GDELT incidents', () => {
const gdelt: GDELTIncident[] = [
{ type: 'Feature', geometry: { type: 'Point', coordinates: [30, 50] }, properties: { name: 'Protest', count: 5, _urls_list: [], _headlines_list: [] } },
];
const result = buildGdeltGeoJSON(gdelt);
expect(result!.features).toHaveLength(1);
expect(result!.features[0].properties?.type).toBe('gdelt');
expect(result!.features[0].properties?.title).toBe('Protest');
});
it('filters by inView when provided', () => {
const gdelt: GDELTIncident[] = [
{ type: 'Feature', geometry: { type: 'Point', coordinates: [30, 50] }, properties: { name: 'A', count: 1, _urls_list: [], _headlines_list: [] } },
{ type: 'Feature', geometry: { type: 'Point', coordinates: [100, 10] }, properties: { name: 'B', count: 1, _urls_list: [], _headlines_list: [] } },
];
const inView = (lat: number, _lng: number) => lat > 30;
const result = buildGdeltGeoJSON(gdelt, inView);
expect(result!.features).toHaveLength(1);
});
it('filters out entries without geometry', () => {
const gdelt: GDELTIncident[] = [
{ type: 'Feature', geometry: { type: 'Point', coordinates: [30, 50] }, properties: { name: 'Good', count: 1, _urls_list: [], _headlines_list: [] } },
{ type: 'Feature', geometry: null as any, properties: { name: 'Bad', count: 1, _urls_list: [], _headlines_list: [] } },
];
const result = buildGdeltGeoJSON(gdelt);
expect(result!.features).toHaveLength(1);
});
});
// ─── LiveUAMap ──────────────────────────────────────────────────────────────
describe('buildLiveuaGeoJSON', () => {
it('returns null for empty input', () => {
expect(buildLiveuaGeoJSON(undefined)).toBeNull();
});
it('classifies violent incidents with red icon', () => {
const incidents: LiveUAmapIncident[] = [
{ id: '1', lat: 48.0, lng: 35.0, title: 'Missile strike in Kharkiv', date: '2024-01-01' },
{ id: '2', lat: 49.0, lng: 36.0, title: 'Humanitarian aid delivery', date: '2024-01-01' },
];
const result = buildLiveuaGeoJSON(incidents);
expect(result!.features).toHaveLength(2);
expect(result!.features[0].properties?.iconId).toBe('icon-liveua-red');
expect(result!.features[1].properties?.iconId).toBe('icon-liveua-yellow');
});
it('filters by inView when provided', () => {
const incidents: LiveUAmapIncident[] = [
{ id: '1', lat: 48.0, lng: 35.0, title: 'Test', date: '2024-01-01' },
{ id: '2', lat: 10.0, lng: 20.0, title: 'Far away', date: '2024-01-01' },
];
const inView = (lat: number, _lng: number) => lat > 30;
const result = buildLiveuaGeoJSON(incidents, inView);
expect(result!.features).toHaveLength(1);
});
});
// ─── Frontline ──────────────────────────────────────────────────────────────
describe('buildFrontlineGeoJSON', () => {
it('returns null for null/undefined input', () => {
expect(buildFrontlineGeoJSON(null)).toBeNull();
expect(buildFrontlineGeoJSON(undefined)).toBeNull();
});
it('returns the input unchanged when valid', () => {
const fc = { type: 'FeatureCollection' as const, features: [{ type: 'Feature' as const, properties: { name: 'zone', zone_id: 1 }, geometry: { type: 'Polygon' as const, coordinates: [[[30, 48], [31, 49], [30, 49], [30, 48]]] as [number, number][][] } }] };
const result = buildFrontlineGeoJSON(fc);
expect(result).toBe(fc); // Same reference — passthrough
});
it('returns null for empty features array', () => {
const fc = { type: 'FeatureCollection' as const, features: [] };
expect(buildFrontlineGeoJSON(fc)).toBeNull();
});
});
@@ -0,0 +1,96 @@
import { describe, it, expect } from 'vitest';
import { classifyAircraft, HELI_TYPES, TURBOPROP_TYPES, BIZJET_TYPES } from '@/utils/aircraftClassification';
describe('classifyAircraft', () => {
// ─── Helicopter classification ────────────────────────────────────────────
it('classifies known helicopter types', () => {
const heliModels = ['R22', 'R44', 'B407', 'S76', 'EC35', 'H145', 'UH60', 'AH64', 'CH47'];
for (const model of heliModels) {
expect(classifyAircraft(model)).toBe('heli');
}
});
it('classifies as heli when category hint is "heli"', () => {
expect(classifyAircraft('UNKNOWN', 'heli')).toBe('heli');
});
it('category hint "heli" overrides model-based classification', () => {
// B738 would normally be airliner, but category says heli
expect(classifyAircraft('B738', 'heli')).toBe('heli');
});
// ─── Business jet classification ──────────────────────────────────────────
it('classifies known bizjet types', () => {
const bizjetModels = ['C25A', 'C680', 'CL60', 'GLEX', 'GLF5', 'LJ45', 'FA7X'];
for (const model of bizjetModels) {
expect(classifyAircraft(model)).toBe('bizjet');
}
});
// ─── Turboprop classification ─────────────────────────────────────────────
it('classifies known turboprop types', () => {
const turbopropModels = ['AT72', 'C208', 'DHC6', 'DH8D', 'PC12', 'TBM9', 'C130'];
for (const model of turbopropModels) {
expect(classifyAircraft(model)).toBe('turboprop');
}
});
// ─── Airliner default ────────────────────────────────────────────────────
it('defaults to airliner for unknown types', () => {
expect(classifyAircraft('B738')).toBe('airliner');
expect(classifyAircraft('A320')).toBe('airliner');
expect(classifyAircraft('B77W')).toBe('airliner');
});
it('defaults to airliner for empty model string', () => {
expect(classifyAircraft('')).toBe('airliner');
});
// ─── Case insensitivity ──────────────────────────────────────────────────
it('handles lowercase model codes', () => {
expect(classifyAircraft('r22')).toBe('heli');
expect(classifyAircraft('c25a')).toBe('bizjet');
expect(classifyAircraft('at72')).toBe('turboprop');
});
it('handles mixed case model codes', () => {
expect(classifyAircraft('Dh8D')).toBe('turboprop');
expect(classifyAircraft('Glf5')).toBe('bizjet');
});
// ─── Priority order ──────────────────────────────────────────────────────
it('prioritizes heli over bizjet (if type appears in both sets)', () => {
// heli check comes first in the function
for (const model of ['B06', 'S92', 'H225']) {
expect(classifyAircraft(model)).toBe('heli');
}
});
it('prioritizes bizjet over turboprop', () => {
// PC24 appears in both BIZJET_TYPES and TURBOPROP_TYPES
// bizjet check comes before turboprop in the function
if (BIZJET_TYPES.has('PC24') && TURBOPROP_TYPES.has('PC24')) {
expect(classifyAircraft('PC24')).toBe('bizjet');
}
});
// ─── Set integrity ───────────────────────────────────────────────────────
it('HELI_TYPES set has expected minimum entries', () => {
expect(HELI_TYPES.size).toBeGreaterThan(50);
});
it('TURBOPROP_TYPES set has expected minimum entries', () => {
expect(TURBOPROP_TYPES.size).toBeGreaterThan(80);
});
it('BIZJET_TYPES set has expected minimum entries', () => {
expect(BIZJET_TYPES.size).toBeGreaterThan(50);
});
});
@@ -0,0 +1,116 @@
import { describe, it, expect } from 'vitest';
import { interpolatePosition } from '@/utils/positioning';
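// Signature exercised below, inferred from the assertions rather than the implementation:
// interpolatePosition(lat, lng, headingDeg, speedKnots, dtSeconds, maxDistMeters?) -> [lat, lng]
// dt is clamped to ~65 s, and travelled distance is capped at maxDistMeters when provided.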
describe('interpolatePosition', () => {
// ─── No-op cases ──────────────────────────────────────────────────────────
it('returns same position when speed is zero', () => {
const [lat, lng] = interpolatePosition(40, -74, 90, 0, 10);
expect(lat).toBe(40);
expect(lng).toBe(-74);
});
it('returns same position when speed is negative', () => {
const [lat, lng] = interpolatePosition(40, -74, 90, -50, 10);
expect(lat).toBe(40);
expect(lng).toBe(-74);
});
it('returns same position when dt is zero', () => {
const [lat, lng] = interpolatePosition(40, -74, 90, 100, 0);
expect(lat).toBe(40);
expect(lng).toBe(-74);
});
it('returns same position when dt is negative', () => {
const [lat, lng] = interpolatePosition(40, -74, 90, 100, -5);
expect(lat).toBe(40);
expect(lng).toBe(-74);
});
// ─── Cardinal directions ─────────────────────────────────────────────────
it('moves north when heading is 0°', () => {
const [lat, lng] = interpolatePosition(40, -74, 0, 100, 10);
expect(lat).toBeGreaterThan(40);
expect(lng).toBeCloseTo(-74, 4); // longitude should barely change
});
it('moves south when heading is 180°', () => {
const [lat, lng] = interpolatePosition(40, -74, 180, 100, 10);
expect(lat).toBeLessThan(40);
expect(lng).toBeCloseTo(-74, 4);
});
it('moves east when heading is 90°', () => {
const [lat, lng] = interpolatePosition(40, -74, 90, 100, 10);
expect(lat).toBeCloseTo(40, 4);
expect(lng).toBeGreaterThan(-74);
});
it('moves west when heading is 270°', () => {
const [lat, lng] = interpolatePosition(40, -74, 270, 100, 10);
expect(lat).toBeCloseTo(40, 4);
expect(lng).toBeLessThan(-74);
});
// ─── Distance proportionality ────────────────────────────────────────────
it('doubles distance when speed doubles', () => {
const [lat1] = interpolatePosition(0, 0, 0, 100, 10);
const [lat2] = interpolatePosition(0, 0, 0, 200, 10);
const dist1 = lat1; // distance from origin going north
const dist2 = lat2;
expect(dist2).toBeCloseTo(dist1 * 2, 4);
});
it('doubles distance when time doubles', () => {
const [lat1] = interpolatePosition(0, 0, 0, 100, 10);
const [lat2] = interpolatePosition(0, 0, 0, 100, 20);
const dist1 = lat1;
const dist2 = lat2;
expect(dist2).toBeCloseTo(dist1 * 2, 4);
});
// ─── Clamping ────────────────────────────────────────────────────────────
it('clamps time to maxDt (prevents drift on stale data)', () => {
// maxDt=65 by default, so dt=1000 should give same result as dt=65
const [lat1] = interpolatePosition(0, 0, 0, 100, 65);
const [lat2] = interpolatePosition(0, 0, 0, 100, 1000);
expect(lat1).toBeCloseTo(lat2, 6);
});
it('clamps distance to maxDist when specified', () => {
// At 100 knots for 60 seconds = ~3086m, maxDist=1000 should cap it
const [lat1] = interpolatePosition(0, 0, 0, 100, 60, 1000);
const [lat2] = interpolatePosition(0, 0, 0, 100, 60, 0); // no cap
expect(lat1).toBeLessThan(lat2);
});
// ─── Known calculation ───────────────────────────────────────────────────
it('produces correct magnitude for known speed/time', () => {
// 1 knot = 1 NM/hr = 1852 m/hr ≈ 0.5144 m/s
// 100 knots for 10 seconds = 514.4 meters
// At equator, 1° lat ≈ 111,320m, so 514.4m ≈ 0.00462°
const [lat] = interpolatePosition(0, 0, 0, 100, 10);
const expectedDegrees = (100 * 0.5144 * 10) / 111320;
expect(lat).toBeCloseTo(expectedDegrees, 4);
});
// ─── Edge cases ──────────────────────────────────────────────────────────
it('handles positions near the poles', () => {
const [lat, lng] = interpolatePosition(89.9, 0, 0, 10, 5);
expect(lat).toBeGreaterThan(89.9);
expect(Number.isFinite(lat)).toBe(true);
expect(Number.isFinite(lng)).toBe(true);
});
it('handles positions near the dateline', () => {
const [lat, lng] = interpolatePosition(0, 179.99, 90, 100, 10);
expect(Number.isFinite(lat)).toBe(true);
expect(Number.isFinite(lng)).toBe(true);
});
});
@@ -0,0 +1,86 @@
import { describe, it, expect } from 'vitest';
import { computeNightPolygon } from '@/utils/solarTerminator';
/** Extract polygon ring from result (type-narrowing helper) */
function getRing(result: GeoJSON.FeatureCollection): number[][] {
const geom = result.features[0].geometry;
if (geom.type !== 'Polygon') throw new Error('Expected Polygon geometry');
return geom.coordinates[0];
}
describe('computeNightPolygon', () => {
// ─── Structure validation ────────────────────────────────────────────────
it('returns a valid GeoJSON FeatureCollection', () => {
const result = computeNightPolygon();
expect(result.type).toBe('FeatureCollection');
expect(result.features).toHaveLength(1);
expect(result.features[0].type).toBe('Feature');
expect(result.features[0].geometry.type).toBe('Polygon');
});
it('polygon has at least 360 vertices (one per degree of longitude)', () => {
const ring = getRing(computeNightPolygon());
// 361 terminator points + 2 closing corners + 1 ring-close = ≥364
expect(ring.length).toBeGreaterThanOrEqual(364);
});
it('polygon ring is closed (first and last points match)', () => {
const ring = getRing(computeNightPolygon());
expect(ring[ring.length - 1]).toEqual(ring[0]);
});
// ─── Coordinate bounds ───────────────────────────────────────────────────
it('all coordinates are within valid lat/lng bounds', () => {
const ring = getRing(computeNightPolygon());
for (const [lng, lat] of ring) {
expect(lng).toBeGreaterThanOrEqual(-180);
expect(lng).toBeLessThanOrEqual(180);
expect(lat).toBeGreaterThanOrEqual(-85);
expect(lat).toBeLessThanOrEqual(85);
}
});
// ─── Deterministic for same input ────────────────────────────────────────
it('returns identical result for the same date', () => {
const date = new Date('2024-06-21T12:00:00Z');
const result1 = computeNightPolygon(date);
const result2 = computeNightPolygon(date);
expect(result1).toEqual(result2);
});
// ─── Seasonal behavior ──────────────────────────────────────────────────
it('equinox produces roughly symmetric polygon', () => {
const equinox = new Date('2024-03-20T12:00:00Z');
const ring = getRing(computeNightPolygon(equinox));
const lats = ring.map(([, lat]: number[]) => lat);
const maxLat = Math.max(...lats);
const minLat = Math.min(...lats);
expect(maxLat).toBeGreaterThan(50);
expect(minLat).toBeLessThan(-50);
});
it('summer solstice shifts night polygon southward', () => {
const summer = new Date('2024-06-21T00:00:00Z');
const ring = getRing(computeNightPolygon(summer));
const terminatorLats = ring
.filter(([lng]: number[]) => lng >= -180 && lng <= 180)
.slice(0, 361)
.map(([, lat]: number[]) => lat);
const avgLat = terminatorLats.reduce((a: number, b: number) => a + b, 0) / terminatorLats.length;
expect(avgLat).toBeLessThan(15);
});
// ─── Different times produce different results ──────────────────────────
it('produces different polygons for different times of day', () => {
const morning = new Date('2024-06-21T06:00:00Z');
const evening = new Date('2024-06-21T18:00:00Z');
const ringM = getRing(computeNightPolygon(morning));
const ringE = getRing(computeNightPolygon(evening));
expect(ringM[0]).not.toEqual(ringE[0]);
});
});
+92 -18
View File
@@ -4,18 +4,18 @@
--background: #000000;
--foreground: #ededed;
--bg-primary: #000000;
--bg-secondary: rgb(17, 24, 39);
--bg-tertiary: rgb(31, 41, 55);
--bg-panel: rgba(17, 24, 39, 0.8);
--border-primary: rgb(55, 65, 81);
--border-secondary: rgb(75, 85, 99);
--bg-secondary: rgb(5, 5, 8);
--bg-tertiary: rgb(12, 12, 16);
--bg-panel: rgba(0, 0, 0, 0.85);
--border-primary: rgb(10, 12, 15);
--border-secondary: rgb(20, 24, 28);
--text-primary: rgb(243, 244, 246);
--text-secondary: rgb(156, 163, 175);
--text-muted: rgb(107, 114, 128);
--text-secondary: rgb(34, 211, 238);
--text-muted: rgb(8, 145, 178);
--text-heading: rgb(236, 254, 255);
--hover-accent: rgba(8, 51, 68, 0.2);
--scrollbar-thumb: rgba(100, 116, 139, 0.3);
--scrollbar-thumb-hover: rgba(100, 116, 139, 0.5);
--scrollbar-thumb: rgba(8, 145, 178, 0.3);
--scrollbar-thumb-hover: rgba(8, 145, 178, 0.5);
}
/* Light theme: only the map basemap changes — UI stays dark */
@@ -23,18 +23,18 @@
--background: #000000;
--foreground: #ededed;
--bg-primary: #000000;
--bg-secondary: rgb(17, 24, 39);
--bg-tertiary: rgb(31, 41, 55);
--bg-panel: rgba(17, 24, 39, 0.8);
--border-primary: rgb(55, 65, 81);
--border-secondary: rgb(75, 85, 99);
--bg-secondary: rgb(5, 5, 8);
--bg-tertiary: rgb(12, 12, 16);
--bg-panel: rgba(0, 0, 0, 0.85);
--border-primary: rgb(10, 12, 15);
--border-secondary: rgb(20, 24, 28);
--text-primary: rgb(243, 244, 246);
--text-secondary: rgb(156, 163, 175);
--text-muted: rgb(107, 114, 128);
--text-secondary: rgb(34, 211, 238);
--text-muted: rgb(8, 145, 178);
--text-heading: rgb(236, 254, 255);
--hover-accent: rgba(8, 51, 68, 0.2);
--scrollbar-thumb: rgba(100, 116, 139, 0.3);
--scrollbar-thumb-hover: rgba(100, 116, 139, 0.5);
--scrollbar-thumb: rgba(8, 145, 178, 0.3);
--scrollbar-thumb-hover: rgba(8, 145, 178, 0.5);
}
@theme inline {
@@ -114,6 +114,80 @@ body {
display: none !important;
}
/* ── MATRIX HUD COLOR THEME ── */
/* Remaps cyan accents → green within .hud-zone containers only */
[data-hud="matrix"] .hud-zone {
--text-secondary: #4ade80;
--text-muted: #16a34a;
--text-heading: #bbf7d0;
--hover-accent: rgba(5, 46, 22, 0.2);
--scrollbar-thumb: rgba(22, 163, 74, 0.3);
--scrollbar-thumb-hover: rgba(22, 163, 74, 0.5);
}
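/* Illustrative usage (assumed markup — the data-hud attribute is set on a wrapper such
   as <body data-hud="matrix">, while each HUD panel carries class="hud-zone"; the
   hud-zone classes themselves are added in the page.tsx hunk later in this commit). */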
/* --- Text color overrides --- */
[data-hud="matrix"] .hud-zone .text-cyan-300 { color: #86efac !important; }
[data-hud="matrix"] .hud-zone .text-cyan-400 { color: #4ade80 !important; }
[data-hud="matrix"] .hud-zone .text-cyan-500 { color: #22c55e !important; }
[data-hud="matrix"] .hud-zone .text-cyan-600 { color: #16a34a !important; }
[data-hud="matrix"] .hud-zone .text-cyan-700 { color: #15803d !important; }
[data-hud="matrix"] .hud-zone .text-cyan-500\/50 { color: rgba(34, 197, 94, 0.5) !important; }
[data-hud="matrix"] .hud-zone .text-cyan-500\/70 { color: rgba(34, 197, 94, 0.7) !important; }
[data-hud="matrix"] .hud-zone .text-cyan-500\/80 { color: rgba(34, 197, 94, 0.8) !important; }
/* --- Background color overrides --- */
[data-hud="matrix"] .hud-zone .bg-cyan-400 { background-color: #4ade80 !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-300 { background-color: #86efac !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-500 { background-color: #22c55e !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-500\/10 { background-color: rgba(34, 197, 94, 0.1) !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-500\/20 { background-color: rgba(34, 197, 94, 0.2) !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-500\/30 { background-color: rgba(34, 197, 94, 0.3) !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-900\/30 { background-color: rgba(20, 83, 45, 0.3) !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-900\/50 { background-color: rgba(20, 83, 45, 0.5) !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-900\/60 { background-color: rgba(20, 83, 45, 0.6) !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-950\/10 { background-color: rgba(5, 46, 22, 0.1) !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-950\/30 { background-color: rgba(5, 46, 22, 0.3) !important; }
[data-hud="matrix"] .hud-zone .bg-cyan-950\/40 { background-color: rgba(5, 46, 22, 0.4) !important; }
/* --- Border color overrides --- */
[data-hud="matrix"] .hud-zone .border-cyan-400 { border-color: #4ade80 !important; }
[data-hud="matrix"] .hud-zone .border-cyan-500 { border-color: #22c55e !important; }
[data-hud="matrix"] .hud-zone .border-cyan-700 { border-color: #15803d !important; }
[data-hud="matrix"] .hud-zone .border-cyan-800 { border-color: #166534 !important; }
[data-hud="matrix"] .hud-zone .border-cyan-900 { border-color: #14532d !important; }
[data-hud="matrix"] .hud-zone .border-cyan-500\/10 { border-color: rgba(34, 197, 94, 0.1) !important; }
[data-hud="matrix"] .hud-zone .border-cyan-500\/20 { border-color: rgba(34, 197, 94, 0.2) !important; }
[data-hud="matrix"] .hud-zone .border-cyan-500\/30 { border-color: rgba(34, 197, 94, 0.3) !important; }
[data-hud="matrix"] .hud-zone .border-cyan-500\/40 { border-color: rgba(34, 197, 94, 0.4) !important; }
[data-hud="matrix"] .hud-zone .border-cyan-500\/50 { border-color: rgba(34, 197, 94, 0.5) !important; }
[data-hud="matrix"] .hud-zone .border-cyan-800\/40 { border-color: rgba(22, 101, 52, 0.4) !important; }
[data-hud="matrix"] .hud-zone .border-cyan-800\/50 { border-color: rgba(22, 101, 52, 0.5) !important; }
[data-hud="matrix"] .hud-zone .border-cyan-800\/60 { border-color: rgba(22, 101, 52, 0.6) !important; }
[data-hud="matrix"] .hud-zone .border-cyan-900\/50 { border-color: rgba(20, 83, 45, 0.5) !important; }
[data-hud="matrix"] .hud-zone .border-b-cyan-900 { border-bottom-color: #14532d !important; }
[data-hud="matrix"] .hud-zone .border-l-cyan-500 { border-left-color: #22c55e !important; }
/* --- Hover text --- */
[data-hud="matrix"] .hud-zone .hover\:text-cyan-300:hover { color: #86efac !important; }
[data-hud="matrix"] .hud-zone .hover\:text-cyan-400:hover { color: #4ade80 !important; }
/* --- Hover background --- */
[data-hud="matrix"] .hud-zone .hover\:bg-cyan-300:hover { background-color: #86efac !important; }
[data-hud="matrix"] .hud-zone .hover\:bg-cyan-500\/20:hover { background-color: rgba(34, 197, 94, 0.2) !important; }
[data-hud="matrix"] .hud-zone .hover\:bg-cyan-900\/50:hover { background-color: rgba(20, 83, 45, 0.5) !important; }
[data-hud="matrix"] .hud-zone .hover\:bg-cyan-950\/30:hover { background-color: rgba(5, 46, 22, 0.3) !important; }
/* --- Hover border --- */
[data-hud="matrix"] .hud-zone .hover\:border-cyan-300:hover { border-color: #86efac !important; }
[data-hud="matrix"] .hud-zone .hover\:border-cyan-500:hover { border-color: #22c55e !important; }
[data-hud="matrix"] .hud-zone .hover\:border-cyan-500\/40:hover { border-color: rgba(34, 197, 94, 0.4) !important; }
[data-hud="matrix"] .hud-zone .hover\:border-cyan-500\/50:hover { border-color: rgba(34, 197, 94, 0.5) !important; }
[data-hud="matrix"] .hud-zone .hover\:border-cyan-600:hover { border-color: #16a34a !important; }
[data-hud="matrix"] .hud-zone .hover\:border-cyan-800:hover { border-color: #166534 !important; }
/* --- Accent (range inputs) --- */
[data-hud="matrix"] .hud-zone .accent-cyan-500 { accent-color: #22c55e !important; }
/* Focus mode: dim the map canvas (tiles + drawn layers) when a popup is active.
Inside MapLibre's DOM, .maplibregl-canvas-container is a SIBLING of .maplibregl-popup,
so this filter dims the map without affecting the popup at all. */
+31 -198
View File
@@ -1,7 +1,6 @@
"use client";
import { API_BASE } from "@/lib/api";
import { useEffect, useState, useRef, useCallback } from "react";
import { useEffect, useState, useRef } from "react";
import dynamic from 'next/dynamic';
import { motion } from "framer-motion";
import { ChevronLeft, ChevronRight } from "lucide-react";
@@ -20,6 +19,11 @@ import ErrorBoundary from "@/components/ErrorBoundary";
import { DashboardDataProvider } from "@/lib/DashboardDataContext";
import OnboardingModal, { useOnboarding } from "@/components/OnboardingModal";
import ChangelogModal, { useChangelog } from "@/components/ChangelogModal";
import type { SelectedEntity } from "@/types/dashboard";
import { NOMINATIM_DEBOUNCE_MS } from "@/lib/constants";
import { useDataPolling } from "@/hooks/useDataPolling";
import { useReverseGeocode } from "@/hooks/useReverseGeocode";
import { useRegionDossier } from "@/hooks/useRegionDossier";
// Use dynamic loads for Maplibre to avoid SSR window is not defined errors
const MaplibreViewer = dynamic(() => import('@/components/MaplibreViewer'), { ssr: false });
@@ -62,10 +66,10 @@ function LocateBar({ onLocate }: { onLocate: (lat: number, lng: number) => void
headers: { 'Accept-Language': 'en' },
});
const data = await res.json();
setResults(data.map((r: any) => ({ label: r.display_name, lat: parseFloat(r.lat), lng: parseFloat(r.lon) })));
setResults(data.map((r: { display_name: string; lat: string; lon: string }) => ({ label: r.display_name, lat: parseFloat(r.lat), lng: parseFloat(r.lon) })));
} catch { setResults([]); }
setLoading(false);
}, 350);
}, NOMINATIM_DEBOUNCE_MS);
};
const handleSelect = (r: { lat: number; lng: number }) => {
@@ -119,10 +123,12 @@ function LocateBar({ onLocate }: { onLocate: (lat: number, lng: number) => void
}
export default function Dashboard() {
const dataRef = useRef<any>({});
const [dataVersion, setDataVersion] = useState(0);
// Stable reference for child components — only changes when dataVersion increments
const data = dataRef.current;
const { data, dataVersion, backendStatus } = useDataPolling();
const { mouseCoords, locationLabel, handleMouseCoords } = useReverseGeocode();
const [selectedEntity, setSelectedEntity] = useState<SelectedEntity | null>(null);
const [trackedSdr, setTrackedSdr] = useState<any>(null);
const { regionDossier, regionDossierLoading, handleMapRightClick } = useRegionDossier(selectedEntity, setSelectedEntity);
const [uiVisible, setUiVisible] = useState(true);
const [leftOpen, setLeftOpen] = useState(true);
const [rightOpen, setRightOpen] = useState(true);
@@ -143,6 +149,7 @@ export default function Dashboard() {
ships_cargo: true,
ships_civilian: false,
ships_passenger: true,
ships_tracked_yachts: true,
earthquakes: true,
cctv: false,
ukraine_frontline: true,
@@ -177,12 +184,11 @@ export default function Dashboard() {
const idx = stylesList.indexOf(prev);
const next = stylesList[(idx + 1) % stylesList.length];
// Auto-toggle High-Res Satellite layer with SATELLITE style
setActiveLayers((l: any) => ({ ...l, highres_satellite: next === 'SATELLITE' }));
setActiveLayers((l) => ({ ...l, highres_satellite: next === 'SATELLITE' }));
return next;
});
};
const [selectedEntity, setSelectedEntity] = useState<{ type: string, id: string | number, extra?: any } | null>(null);
const [activeFilters, setActiveFilters] = useState<Record<string, string[]>>({});
const [flyToLocation, setFlyToLocation] = useState<{ lat: number, lng: number, ts: number } | null>(null);
@@ -191,184 +197,9 @@ export default function Dashboard() {
const [eavesdropLocation, setEavesdropLocation] = useState<{ lat: number, lng: number } | null>(null);
const [cameraCenter, setCameraCenter] = useState<{ lat: number, lng: number } | null>(null);
// Mouse coordinate + reverse geocoding state
const [mouseCoords, setMouseCoords] = useState<{ lat: number, lng: number } | null>(null);
const [locationLabel, setLocationLabel] = useState('');
// Onboarding & connection status
const { showOnboarding, setShowOnboarding } = useOnboarding();
const { showChangelog, setShowChangelog } = useChangelog();
const [backendStatus, setBackendStatus] = useState<'connecting' | 'connected' | 'disconnected'>('connecting');
const geocodeCache = useRef<Map<string, string>>(new Map());
const geocodeTimer = useRef<ReturnType<typeof setTimeout> | null>(null);
const lastGeocodedPos = useRef<{ lat: number; lng: number } | null>(null);
const geocodeAbort = useRef<AbortController | null>(null);
const handleMouseCoords = useCallback((coords: { lat: number, lng: number }) => {
setMouseCoords(coords);
// Throttle reverse geocoding to every 1500ms + distance check
if (geocodeTimer.current) clearTimeout(geocodeTimer.current);
geocodeTimer.current = setTimeout(async () => {
// Skip if cursor hasn't moved far enough (0.05 degrees ~= 5km)
if (lastGeocodedPos.current) {
const dLat = Math.abs(coords.lat - lastGeocodedPos.current.lat);
const dLng = Math.abs(coords.lng - lastGeocodedPos.current.lng);
if (dLat < 0.05 && dLng < 0.05) return;
}
const gridKey = `${(coords.lat).toFixed(2)},${(coords.lng).toFixed(2)}`;
const cached = geocodeCache.current.get(gridKey);
if (cached) {
setLocationLabel(cached);
lastGeocodedPos.current = coords;
return;
}
// Cancel any in-flight geocode request
if (geocodeAbort.current) geocodeAbort.current.abort();
geocodeAbort.current = new AbortController();
try {
const res = await fetch(
`https://nominatim.openstreetmap.org/reverse?lat=${coords.lat}&lon=${coords.lng}&format=json&zoom=10&addressdetails=1`,
{ headers: { 'Accept-Language': 'en' }, signal: geocodeAbort.current.signal }
);
if (res.ok) {
const data = await res.json();
const addr = data.address || {};
const city = addr.city || addr.town || addr.village || addr.county || '';
const state = addr.state || addr.region || '';
const country = addr.country || '';
const parts = [city, state, country].filter(Boolean);
const label = parts.join(', ') || data.display_name?.split(',').slice(0, 3).join(',') || 'Unknown';
// LRU-style cache pruning: keep max 500 entries (Map preserves insertion order)
if (geocodeCache.current.size > 500) {
const iter = geocodeCache.current.keys();
for (let i = 0; i < 100; i++) {
const key = iter.next().value;
if (key !== undefined) geocodeCache.current.delete(key);
}
}
geocodeCache.current.set(gridKey, label);
setLocationLabel(label);
lastGeocodedPos.current = coords;
}
} catch (e: any) {
if (e.name !== 'AbortError') { /* Silently fail - keep last label */ }
}
}, 1500);
}, []);
// Region dossier state (right-click intelligence)
const [regionDossier, setRegionDossier] = useState<any>(null);
const [regionDossierLoading, setRegionDossierLoading] = useState(false);
const handleMapRightClick = useCallback(async (coords: { lat: number, lng: number }) => {
setSelectedEntity({ type: 'region_dossier', id: `${coords.lat.toFixed(4)}_${coords.lng.toFixed(4)}`, extra: coords });
setRegionDossierLoading(true);
setRegionDossier(null);
try {
const [dossierRes, sentinelRes] = await Promise.allSettled([
fetch(`${API_BASE}/api/region-dossier?lat=${coords.lat}&lng=${coords.lng}`),
fetch(`${API_BASE}/api/sentinel2/search?lat=${coords.lat}&lng=${coords.lng}`),
]);
let dossierData: any = {};
if (dossierRes.status === 'fulfilled' && dossierRes.value.ok) {
dossierData = await dossierRes.value.json();
}
let sentinelData = null;
if (sentinelRes.status === 'fulfilled' && sentinelRes.value.ok) {
sentinelData = await sentinelRes.value.json();
}
setRegionDossier({ ...dossierData, sentinel2: sentinelData });
} catch (e) {
console.error("Failed to fetch region dossier", e);
} finally {
setRegionDossierLoading(false);
}
}, []);
// Clear dossier when selecting a different entity type
useEffect(() => {
if (selectedEntity?.type !== 'region_dossier') {
setRegionDossier(null);
setRegionDossierLoading(false);
}
}, [selectedEntity]);
// ETag tracking for conditional requests
const fastEtag = useRef<string | null>(null);
const slowEtag = useRef<string | null>(null);
useEffect(() => {
// Track whether we've received substantial data yet (backend may still be starting up)
let hasData = false;
let fastTimerId: ReturnType<typeof setTimeout> | null = null;
let slowTimerId: ReturnType<typeof setTimeout> | null = null;
const fetchFastData = async () => {
try {
const headers: Record<string, string> = {};
if (fastEtag.current) headers['If-None-Match'] = fastEtag.current;
const res = await fetch(`${API_BASE}/api/live-data/fast`, { headers });
if (res.status === 304) { setBackendStatus('connected'); scheduleNext('fast'); return; }
if (res.ok) {
setBackendStatus('connected');
fastEtag.current = res.headers.get('etag') || null;
const json = await res.json();
dataRef.current = { ...dataRef.current, ...json };
setDataVersion(v => v + 1);
// Check if we got real data (backend finished loading)
const flights = json.commercial_flights?.length || 0;
if (flights > 100) hasData = true;
}
} catch (e) {
console.error("Failed fetching fast live data", e);
setBackendStatus('disconnected');
}
scheduleNext('fast');
};
const fetchSlowData = async () => {
try {
const headers: Record<string, string> = {};
if (slowEtag.current) headers['If-None-Match'] = slowEtag.current;
const res = await fetch(`${API_BASE}/api/live-data/slow`, { headers });
if (res.status === 304) { scheduleNext('slow'); return; }
if (res.ok) {
slowEtag.current = res.headers.get('etag') || null;
const json = await res.json();
dataRef.current = { ...dataRef.current, ...json };
setDataVersion(v => v + 1);
}
} catch (e) {
console.error("Failed fetching slow live data", e);
}
scheduleNext('slow');
};
// Adaptive polling: retry every 3s during startup, back off to normal cadence once data arrives
const scheduleNext = (tier: 'fast' | 'slow') => {
if (tier === 'fast') {
const delay = hasData ? 15000 : 3000; // 3s startup retry → 15s steady state
fastTimerId = setTimeout(fetchFastData, delay);
} else {
const delay = hasData ? 120000 : 5000; // 5s startup retry → 120s steady state
slowTimerId = setTimeout(fetchSlowData, delay);
}
};
fetchFastData();
fetchSlowData();
return () => {
if (fastTimerId) clearTimeout(fastTimerId);
if (slowTimerId) clearTimeout(slowTimerId);
};
}, []);
return (
<DashboardDataProvider data={data} selectedEntity={selectedEntity} setSelectedEntity={setSelectedEntity}>
@@ -399,6 +230,8 @@ export default function Dashboard() {
setMeasurePoints(prev => prev.length >= 3 ? prev : [...prev, pt]);
}}
measurePoints={measurePoints}
trackedSdr={trackedSdr}
setTrackedSdr={setTrackedSdr}
/>
</ErrorBoundary>
@@ -409,7 +242,7 @@ export default function Dashboard() {
initial={{ opacity: 0, y: -20 }}
animate={{ opacity: 1, y: 0 }}
transition={{ duration: 1 }}
className="absolute top-6 left-6 z-[200] pointer-events-none flex items-center gap-4"
className="absolute top-6 left-6 z-[200] pointer-events-none flex items-center gap-4 hud-zone"
>
<div className="w-8 h-8 flex items-center justify-center">
{/* Target Reticle Icon */}
@@ -428,61 +261,61 @@ export default function Dashboard() {
</motion.div>
{/* SYSTEM METRICS TOP LEFT */}
<div className="absolute top-2 left-6 text-[8px] font-mono tracking-widest text-cyan-500/50 z-[200] pointer-events-none">
<div className="absolute top-2 left-6 text-[8px] font-mono tracking-widest text-cyan-500/50 z-[200] pointer-events-none hud-zone">
OPTIC VIS:113 SRC:180 DENS:1.42 0.8ms
</div>
{/* SYSTEM METRICS TOP RIGHT */}
<div className="absolute top-2 right-6 text-[9px] flex flex-col items-end font-mono tracking-widest text-[var(--text-muted)] z-[200] pointer-events-none">
<div className="absolute top-2 right-6 text-[9px] flex flex-col items-end font-mono tracking-widest text-[var(--text-muted)] z-[200] pointer-events-none hud-zone">
<div>RTX</div>
<div>VSR</div>
</div>
{/* LEFT HUD CONTAINER — slides off left edge when hidden */}
<motion.div
className="absolute left-6 top-24 bottom-6 w-80 flex flex-col gap-6 z-[200] pointer-events-none"
className="absolute left-6 top-24 bottom-6 w-80 flex flex-col gap-6 z-[200] pointer-events-none hud-zone"
animate={{ x: leftOpen ? 0 : -360 }}
transition={{ type: 'spring', damping: 30, stiffness: 250 }}
>
{/* LEFT PANEL - DATA LAYERS */}
<ErrorBoundary name="WorldviewLeftPanel">
<WorldviewLeftPanel data={data} activeLayers={activeLayers} setActiveLayers={setActiveLayers} onSettingsClick={() => setSettingsOpen(true)} onLegendClick={() => setLegendOpen(true)} gibsDate={gibsDate} setGibsDate={setGibsDate} gibsOpacity={gibsOpacity} setGibsOpacity={setGibsOpacity} onEntityClick={setSelectedEntity} onFlyTo={(lat, lng) => setFlyToLocation({ lat, lng, ts: Date.now() })} />
<WorldviewLeftPanel data={data} activeLayers={activeLayers} setActiveLayers={setActiveLayers} onSettingsClick={() => setSettingsOpen(true)} onLegendClick={() => setLegendOpen(true)} gibsDate={gibsDate} setGibsDate={setGibsDate} gibsOpacity={gibsOpacity} setGibsOpacity={setGibsOpacity} onEntityClick={setSelectedEntity} onFlyTo={(lat, lng) => setFlyToLocation({ lat, lng, ts: Date.now() })} trackedSdr={trackedSdr} setTrackedSdr={setTrackedSdr} />
</ErrorBoundary>
</motion.div>
{/* LEFT SIDEBAR TOGGLE TAB */}
<motion.div
className="absolute left-0 top-1/2 -translate-y-1/2 z-[201] pointer-events-auto"
className="absolute left-0 top-1/2 -translate-y-1/2 z-[201] pointer-events-auto hud-zone"
animate={{ x: leftOpen ? 344 : 0 }}
transition={{ type: 'spring', damping: 30, stiffness: 250 }}
>
<button
onClick={() => setLeftOpen(!leftOpen)}
className="flex flex-col items-center gap-1.5 py-5 px-1.5 bg-[var(--bg-primary)]/80 backdrop-blur-md border border-[var(--border-primary)] border-l-0 rounded-r-md text-[var(--text-muted)] hover:text-cyan-400 hover:border-cyan-900/50 transition-colors shadow-[2px_0_12px_rgba(0,0,0,0.4)]"
className="flex flex-col items-center gap-1.5 py-5 px-1.5 bg-cyan-400 border border-cyan-400 border-l-0 rounded-r-md text-black hover:bg-cyan-300 hover:border-cyan-300 transition-colors shadow-[2px_0_12px_rgba(0,0,0,0.4)]"
>
{leftOpen ? <ChevronLeft size={10} /> : <ChevronRight size={10} />}
<span className="text-[7px] font-mono tracking-[0.2em] text-[var(--text-muted)]" style={{ writingMode: 'vertical-rl', transform: 'rotate(180deg)' }}>LAYERS</span>
<span className="text-[7px] font-mono tracking-[0.2em] font-bold text-black" style={{ writingMode: 'vertical-rl', transform: 'rotate(180deg)' }}>LAYERS</span>
</button>
</motion.div>
{/* RIGHT SIDEBAR TOGGLE TAB */}
<motion.div
className="absolute right-0 top-1/2 -translate-y-1/2 z-[201] pointer-events-auto"
className="absolute right-0 top-1/2 -translate-y-1/2 z-[201] pointer-events-auto hud-zone"
animate={{ x: rightOpen ? -344 : 0 }}
transition={{ type: 'spring', damping: 30, stiffness: 250 }}
>
<button
onClick={() => setRightOpen(!rightOpen)}
className="flex flex-col items-center gap-1.5 py-5 px-1.5 bg-[var(--bg-primary)]/80 backdrop-blur-md border border-[var(--border-primary)] border-r-0 rounded-l-md text-[var(--text-muted)] hover:text-cyan-400 hover:border-cyan-900/50 transition-colors shadow-[-2px_0_12px_rgba(0,0,0,0.4)]"
className="flex flex-col items-center gap-1.5 py-5 px-1.5 bg-cyan-400 border border-cyan-400 border-r-0 rounded-l-md text-black hover:bg-cyan-300 hover:border-cyan-300 transition-colors shadow-[-2px_0_12px_rgba(0,0,0,0.4)]"
>
{rightOpen ? <ChevronRight size={10} /> : <ChevronLeft size={10} />}
<span className="text-[7px] font-mono tracking-[0.2em] text-[var(--text-muted)]" style={{ writingMode: 'vertical-rl' }}>INTEL</span>
<span className="text-[7px] font-mono tracking-[0.2em] font-bold text-black" style={{ writingMode: 'vertical-rl' }}>INTEL</span>
</button>
</motion.div>
{/* RIGHT HUD CONTAINER — slides off right edge when hidden */}
<motion.div
className="absolute right-6 top-24 bottom-6 w-80 flex flex-col gap-4 z-[200] pointer-events-auto overflow-y-auto styled-scrollbar pr-2"
className="absolute right-6 top-24 bottom-6 w-80 flex flex-col gap-4 z-[200] pointer-events-auto overflow-y-auto styled-scrollbar pr-2 hud-zone"
animate={{ x: rightOpen ? 0 : 360 }}
transition={{ type: 'spring', damping: 30, stiffness: 250 }}
>
@@ -548,7 +381,7 @@ export default function Dashboard() {
initial={{ opacity: 0, y: 20 }}
animate={{ opacity: 1, y: 0 }}
transition={{ delay: 1, duration: 1 }}
className="absolute bottom-6 left-1/2 -translate-x-1/2 z-[200] pointer-events-auto flex flex-col items-center gap-2"
className="absolute bottom-6 left-1/2 -translate-x-1/2 z-[200] pointer-events-auto flex flex-col items-center gap-2 hud-zone"
>
{/* LOCATE BAR — search by coordinates or place name */}
<LocateBar onLocate={(lat, lng) => setFlyToLocation({ lat, lng, ts: Date.now() })} />
+27 -27
View File
@@ -4,55 +4,55 @@ import React, { useState, useEffect } from "react";
import { motion, AnimatePresence } from "framer-motion";
import { X, Zap, Ship, Download, Shield, Bug, Heart } from "lucide-react";
const CURRENT_VERSION = "0.9";
const CURRENT_VERSION = "0.9.5";
const STORAGE_KEY = `shadowbroker_changelog_v${CURRENT_VERSION}`;
const NEW_FEATURES = [
{
icon: <Download size={14} className="text-cyan-400" />,
title: "In-App Auto-Updater",
desc: "One-click updates directly from the dashboard. Downloads the latest release, backs up your files, extracts over the project, and auto-restarts. Manual download fallback included if anything goes wrong.",
icon: <Zap size={14} className="text-cyan-400" />,
title: "Parallelized Boot (15s Cold Start)",
desc: "Backend startup now runs fast-tier, slow-tier, and airport data concurrently via ThreadPoolExecutor. Boot time cut from 60s+ to ~15s.",
color: "cyan",
},
{
icon: <Ship size={14} className="text-blue-400" />,
title: "Granular Ship Layer Controls",
desc: "Ships split into 4 independent toggles: Military/Carriers, Cargo/Tankers, Civilian Vessels, and Cruise/Passenger. Each shows its own live count in the sidebar.",
color: "blue",
},
{
icon: <Shield size={14} className="text-green-400" />,
title: "Stable Entity Selection",
desc: "Ship and flight markers now use MMSI/callsign IDs instead of volatile array indices. Selecting a ship or plane stays locked on even when data refreshes every 60 seconds.",
title: "Adaptive Polling + ETag Caching",
desc: "Data polling engine rebuilt with adaptive retry (3s startup, 15s steady state) and ETag conditional caching. Map panning no longer interrupts data flow.",
color: "green",
},
{
icon: <X size={14} className="text-red-400" />,
title: "Dismissible Threat Alerts",
desc: "Click the X on any threat alert bubble to dismiss it for the session. Uses stable content hashing so dismissed alerts stay hidden across 60-second data refreshes.",
color: "red",
icon: <Ship size={14} className="text-blue-400" />,
title: "Sliding Edge Panels (LAYERS / INTEL)",
desc: "Replaced bulky Record Panel with spring-animated side tabs. LAYERS on the left, INTEL (News, Markets, Radio, Find) on the right. Premium tactical HUD feel.",
color: "blue",
},
{
icon: <Zap size={14} className="text-yellow-400" />,
title: "Faster Data Loading",
desc: "GDELT military incidents now load instantly with background title enrichment instead of blocking for 2+ minutes. Eliminated duplicate startup fetch jobs for faster boot.",
icon: <Download size={14} className="text-yellow-400" />,
title: "Admin Auth + Rate Limiting + Auto-Updater",
desc: "Settings and system endpoints protected by X-Admin-Key. All endpoints rate-limited via slowapi. One-click auto-update from GitHub releases with safe backup/restart.",
color: "yellow",
},
{
icon: <Shield size={14} className="text-purple-400" />,
title: "Docker Swarm Secrets Support",
desc: "Production deployments can now load API keys from /run/secrets/ instead of environment variables. env_check.py enforces warning tiers for missing keys.",
color: "purple",
},
];
const BUG_FIXES = [
"Removed viewport bbox filtering that caused 20-second delays when panning between regions",
"Fixed carrier tracker crash on GDELT 429/TypeError responses",
"Removed fake intelligence assessment generator — all data is now real OSINT only",
"Docker healthcheck start_period increased to 90s to prevent false-negative restarts during data preload",
"ETag collision fix — full payload hash instead of first 256 chars",
"Concurrent /api/refresh guard prevents duplicate data fetches",
"Stable entity IDs for GDELT & News popups — no more wrong popup after data refresh (PR #63)",
"useCallback optimization for interpolation functions — eliminates redundant React re-renders on every 1s tick",
"Restored missing GDELT and datacenter background refreshes in slow-tier loop",
"Server-side viewport bounding box filtering reduces JSON payload size by 80%+",
"Modular fetcher architecture sustained over monolithic data_fetcher.py",
"CCTV ingestors instantiated once at startup — no more fresh DB connections every 10min tick",
];
const CONTRIBUTORS = [
{ name: "@imqdcr", desc: "Ship toggle split into 4 categories + stable MMSI/callsign entity IDs for map markers" },
{ name: "@csysp", desc: "Dismissible threat alert bubbles with stable content hashing + stopPropagation crash fix", pr: "#48" },
{ name: "@suranyami", desc: "Parallel multi-arch Docker builds (11min 3min) + runtime BACKEND_URL fix", pr: "#35, #44" },
{ name: "@csysp", desc: "Dismissible threat alerts + stable entity IDs for GDELT & News popups", pr: "#48, #63" },
{ name: "@suranyami", desc: "Parallel multi-arch Docker builds (11min \u2192 3min) + runtime BACKEND_URL fix", pr: "#35, #44" },
];
export function useChangelog() {
File diff suppressed because it is too large
+5 -2
View File
@@ -2,7 +2,7 @@
import React, { useState } from 'react';
import { motion, AnimatePresence } from 'framer-motion';
import { ArrowUpRight, ArrowDownRight, TrendingUp, Droplet, ChevronDown, ChevronUp } from 'lucide-react';
import { ArrowUpRight, ArrowDownRight, TrendingUp, Droplet, ChevronDown, ChevronUp, Globe } from 'lucide-react';
import type { DashboardData } from "@/types/dashboard";
const MarketsPanel = React.memo(function MarketsPanel({ data }: { data: DashboardData }) {
@@ -23,7 +23,10 @@ const MarketsPanel = React.memo(function MarketsPanel({ data }: { data: Dashboar
className="flex justify-between items-center p-3 cursor-pointer hover:bg-[var(--bg-secondary)]/50 transition-colors border-b border-[var(--border-primary)]/50"
onClick={() => setIsMinimized(!isMinimized)}
>
<span className="text-[10px] text-[var(--text-muted)] font-mono tracking-widest">GLOBAL MARKETS</span>
<div className="flex items-center gap-2">
<Globe size={12} className="text-cyan-500" />
<span className="text-[10px] text-[var(--text-muted)] font-mono tracking-widest">GLOBAL MARKETS</span>
</div>
<button className="text-[var(--text-muted)] hover:text-[var(--text-primary)] transition-colors">
{isMinimized ? <ChevronDown size={14} /> : <ChevronUp size={14} />}
</button>
+12 -12
View File
@@ -456,9 +456,9 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
initial={{ y: 50, opacity: 0 }}
animate={{ y: 0, opacity: 1 }}
transition={{ duration: 0.4 }}
className="w-full bg-black/60 backdrop-blur-md border border-cyan-800 rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,128,255,0.2)] pointer-events-auto overflow-hidden flex-shrink-0"
className="w-full bg-black/60 backdrop-blur-md border border-[var(--border-primary)] rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,0,0,0.5)] pointer-events-auto overflow-hidden flex-shrink-0"
>
<div className="p-3 border-b border-cyan-500/30 bg-cyan-950/40 flex justify-between items-center">
<div className="p-3 border-b border-[var(--border-primary)]/30 bg-[var(--bg-secondary)]/40 flex justify-between items-center">
<h2 className={`text-xs tracking-widest font-bold ${selectedEntity.type === 'military_flight' ? 'text-red-400' : selectedEntity.type === 'private_flight' ? 'text-orange-400' : selectedEntity.type === 'private_jet' ? 'text-purple-400' : 'text-cyan-400'} flex items-center gap-2`}>
{selectedEntity.type === 'military_flight' ? "MILITARY BOGEY INTERCEPT" : selectedEntity.type === 'private_flight' ? "PRIVATE TRANSPONDER" : selectedEntity.type === 'private_jet' ? "PRIVATE JET TRANSPONDER" : "COMMERCIAL TRANSPONDER"}
</h2>
@@ -576,9 +576,9 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
initial={{ y: 50, opacity: 0 }}
animate={{ y: 0, opacity: 1 }}
transition={{ duration: 0.4 }}
className="w-full bg-black/60 backdrop-blur-md border border-cyan-800 rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,128,255,0.2)] pointer-events-auto overflow-hidden flex-shrink-0"
className="w-full bg-black/60 backdrop-blur-md border border-[var(--border-primary)] rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,0,0,0.5)] pointer-events-auto overflow-hidden flex-shrink-0"
>
<div className="p-3 border-b border-cyan-500/30 bg-cyan-950/40 flex justify-between items-center">
<div className="p-3 border-b border-[var(--border-primary)]/30 bg-[var(--bg-secondary)]/40 flex justify-between items-center">
<h2 className={`text-xs tracking-widest font-bold ${headerColor} flex items-center gap-2`}>
{headerTitle}
</h2>
@@ -648,7 +648,7 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
}
if (selectedEntity?.type === 'gdelt') {
const gdeltItem = data?.gdelt?.[selectedEntity.id as number];
const gdeltItem = data?.gdelt?.find((g: any) => (g.properties?.name || String(g.geometry?.coordinates)) === selectedEntity.id);
if (gdeltItem && gdeltItem.properties) {
const props = gdeltItem.properties;
return (
@@ -810,9 +810,9 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
initial={{ y: 50, opacity: 0 }}
animate={{ y: 0, opacity: 1 }}
transition={{ duration: 0.4 }}
className="w-full bg-black/60 backdrop-blur-md border border-cyan-800 rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,128,255,0.2)] pointer-events-auto overflow-hidden flex-shrink-0"
className="w-full bg-black/60 backdrop-blur-md border border-[var(--border-primary)] rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,0,0,0.5)] pointer-events-auto overflow-hidden flex-shrink-0"
>
<div className="p-3 border-b border-cyan-500/30 bg-cyan-950/40 flex justify-between items-center">
<div className="p-3 border-b border-[var(--border-primary)]/30 bg-[var(--bg-secondary)]/40 flex justify-between items-center">
<h2 className="text-xs tracking-widest font-bold text-cyan-400 flex items-center gap-2">
AERONAUTICAL HUB
</h2>
@@ -844,9 +844,9 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
initial={{ y: 50, opacity: 0 }}
animate={{ y: 0, opacity: 1 }}
transition={{ duration: 0.4 }}
className="w-full bg-black/60 backdrop-blur-md border border-cyan-800 rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,128,255,0.2)] pointer-events-auto overflow-hidden flex-shrink-0"
className="w-full bg-black/60 backdrop-blur-md border border-[var(--border-primary)] rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,0,0,0.5)] pointer-events-auto overflow-hidden flex-shrink-0"
>
<div className="p-3 border-b border-cyan-500/30 bg-cyan-950/40 flex justify-between items-center">
<div className="p-3 border-b border-[var(--border-primary)]/30 bg-[var(--bg-secondary)]/40 flex justify-between items-center">
<h2 className="text-xs tracking-widest font-bold text-cyan-400 flex items-center gap-2">
<AlertTriangle size={14} className="text-red-400" /> {selectedEntity.extra?.last_updated
? new Date(selectedEntity.extra.last_updated + 'Z').toLocaleString('en-US', { month: 'short', day: 'numeric', hour: '2-digit', minute: '2-digit', second: '2-digit', hour12: false, timeZoneName: 'short' }).toUpperCase() + ' — OPTIC INTERCEPT'
@@ -936,10 +936,10 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
initial={{ y: 50, opacity: 0 }}
animate={{ y: 0, opacity: 1 }}
transition={{ duration: 0.8, delay: 0.2 }}
className={`w-full bg-[var(--bg-panel)] backdrop-blur-md border border-[var(--border-primary)] rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,0,0,0.5)] pointer-events-auto overflow-hidden transition-all duration-300 ${isMinimized ? 'h-[50px] flex-shrink-0' : 'flex-1 min-h-0'}`}
className={`w-full bg-[var(--bg-primary)]/40 backdrop-blur-md border border-[var(--border-primary)] rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(0,0,0,0.5)] pointer-events-auto overflow-hidden transition-all duration-300 ${isMinimized ? 'h-[50px] flex-shrink-0' : 'flex-1 min-h-0'}`}
>
<div
className="p-3 border-b border-cyan-500/20 bg-cyan-950/20 relative overflow-hidden cursor-pointer hover:bg-cyan-900/30 transition-colors"
className="p-3 border-b border-[var(--border-primary)]/50 relative overflow-hidden cursor-pointer hover:bg-[var(--bg-secondary)]/50 transition-colors"
onClick={() => setIsMinimized(!isMinimized)}
>
<div className="flex justify-between items-center relative z-10">
@@ -1029,7 +1029,7 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
</span>
<div className="flex items-center gap-2">
{item.cluster_count > 1 && (
<button onClick={() => toggleExpand(idx)} className="text-[8px] font-bold text-cyan-500 bg-cyan-950/50 hover:text-[var(--text-primary)] hover:bg-cyan-900 border border-cyan-500/30 px-1.5 py-0.5 rounded-sm transition-colors cursor-pointer">
<button onClick={() => toggleExpand(idx)} className="text-[8px] font-bold text-cyan-500 bg-[var(--bg-secondary)]/50 hover:text-[var(--text-primary)] hover:bg-[var(--hover-accent)] border border-cyan-500/30 px-1.5 py-0.5 rounded-sm transition-colors cursor-pointer">
{isExpanded ? '[- COLLAPSE]' : `[+${item.cluster_count - 1} SOURCES]`}
</button>
)}
@@ -250,18 +250,18 @@ export default function RadioInterceptPanel({ data, isEavesdropping, setIsEavesd
initial={{ opacity: 0, x: 50 }}
animate={{ opacity: 1, x: 0 }}
transition={{ duration: 1, delay: 0.2 }}
className="w-full flex flex-col bg-[var(--bg-primary)]/40 backdrop-blur-md border border-cyan-900/50 rounded-xl pointer-events-auto shadow-[0_4px_30px_rgba(0,0,0,0.2)] relative overflow-hidden max-h-full"
className="w-full flex flex-col bg-[var(--bg-primary)]/40 backdrop-blur-md border border-[var(--border-primary)] rounded-xl pointer-events-auto shadow-[0_4px_30px_rgba(0,0,0,0.2)] relative overflow-hidden max-h-full"
>
<div
className="flex items-center justify-between p-3 border-b border-cyan-900/50 cursor-pointer bg-cyan-950/20 hover:bg-cyan-900/30 transition-colors"
className="flex items-center justify-between p-3 border-b border-[var(--border-primary)]/50 cursor-pointer hover:bg-[var(--bg-secondary)]/50 transition-colors"
onClick={() => setIsMinimized(!isMinimized)}
>
<div className="flex items-center gap-2 text-cyan-400">
<div className="flex items-center gap-2 text-[var(--text-muted)]">
<RadioReceiver size={14} className={isPlaying ? "animate-pulse" : ""} />
<span className="text-[10px] font-mono tracking-widest font-semibold">SIGINT INTERCEPT</span>
<span className="text-[10px] font-mono tracking-widest">SIGINT INTERCEPT</span>
{isPlaying && <Activity size={12} className="text-red-500 animate-pulse ml-2" />}
</div>
<button className="text-cyan-500 hover:text-cyan-300 transition-colors">
<button className="text-[var(--text-muted)] hover:text-[var(--text-primary)] transition-colors">
{isMinimized ? <ChevronDown size={14} /> : <ChevronUp size={14} />}
</button>
</div>
@@ -275,7 +275,7 @@ export default function RadioInterceptPanel({ data, isEavesdropping, setIsEavesd
className="flex flex-col overflow-hidden"
>
{/* Audio Player Controls */}
<div className="p-4 border-b border-cyan-900/40 bg-[var(--bg-primary)]/60">
<div className="p-4 border-b border-[var(--border-primary)]/40 bg-[var(--bg-primary)]/60">
<div className="flex items-center justify-between mb-3">
<div className="flex flex-col">
<span className="text-xs text-cyan-300 font-mono tracking-wide">
@@ -348,36 +348,6 @@ export default function RadioInterceptPanel({ data, isEavesdropping, setIsEavesd
</div>
</div>
{/* KiwiSDR Tuner — appears when a KiwiSDR node is clicked on the map */}
{selectedEntity?.type === 'kiwisdr' && selectedEntity.extra?.url && (
<div className="p-3 border-b border-amber-900/40 bg-amber-950/10">
<div className="text-[9px] text-amber-400 font-mono tracking-widest mb-2 flex items-center gap-2">
<RadioReceiver size={10} />
SDR TUNER: {(selectedEntity.extra.name || 'REMOTE RECEIVER').toUpperCase().slice(0, 60)}
</div>
<div className="text-[8px] text-[var(--text-muted)] font-mono mb-2">
{selectedEntity.extra.location && <span>{selectedEntity.extra.location} · </span>}
{selectedEntity.extra.antenna && <span>{selectedEntity.extra.antenna.slice(0, 80)} · </span>}
{selectedEntity.extra.users !== undefined && <span>{selectedEntity.extra.users}/{selectedEntity.extra.users_max} users</span>}
</div>
<div className="flex items-center gap-2 mt-1">
<a
href={selectedEntity.extra.url}
target="_blank"
rel="noopener noreferrer"
className="flex-1 text-center px-4 py-2.5 rounded border border-amber-500/50 bg-amber-950/30 text-amber-400 hover:bg-amber-900/40 hover:border-amber-400 text-[10px] font-mono tracking-widest transition-colors"
>
OPEN SDR RECEIVER
</a>
</div>
{selectedEntity.extra.bands && (
<div className="text-[8px] text-[var(--text-muted)] font-mono mt-2">
BANDS: {(Number(selectedEntity.extra.bands.split('-')[0]) / 1e6).toFixed(0)}-{(Number(selectedEntity.extra.bands.split('-')[1]) / 1e6).toFixed(0)} MHz
</div>
)}
</div>
)}
{/* Feed List */}
<div className="flex-col overflow-y-auto styled-scrollbar max-h-64 p-2">
{feeds.length === 0 ? (
+63 -4
View File
@@ -2,7 +2,7 @@
import React, { useState, useEffect, useRef, useMemo } from "react";
import { motion, AnimatePresence } from "framer-motion";
import { Plane, AlertTriangle, Activity, Satellite, Cctv, ChevronDown, ChevronUp, Ship, Eye, Anchor, Settings, Sun, Moon, BookOpen, Radio, Play, Pause, Globe, Flame, Wifi, Server, Shield, ToggleLeft, ToggleRight } from "lucide-react";
import { Plane, AlertTriangle, Activity, Satellite, Cctv, ChevronDown, ChevronUp, Ship, Eye, Anchor, Settings, Sun, Moon, BookOpen, Radio, Play, Pause, Globe, Flame, Wifi, Server, Shield, ToggleLeft, ToggleRight, Palette } from "lucide-react";
import packageJson from "../../package.json";
import { useTheme } from "@/lib/ThemeContext";
@@ -60,11 +60,11 @@ const POTUS_ICAOS: Record<string, { label: string; type: string }> = {
'AE5E77': { label: 'Marine One (VH-92A)', type: 'M1' },
'AE5E79': { label: 'Marine One (VH-92A)', type: 'M1' },
};
import type { DashboardData, ActiveLayers, SelectedEntity } from "@/types/dashboard";
import type { DashboardData, ActiveLayers, SelectedEntity, KiwiSDR } from "@/types/dashboard";
const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, activeLayers, setActiveLayers, onSettingsClick, onLegendClick, gibsDate, setGibsDate, gibsOpacity, setGibsOpacity, onEntityClick, onFlyTo }: { data: DashboardData; activeLayers: ActiveLayers; setActiveLayers: React.Dispatch<React.SetStateAction<ActiveLayers>>; onSettingsClick?: () => void; onLegendClick?: () => void; gibsDate?: string; setGibsDate?: (d: string) => void; gibsOpacity?: number; setGibsOpacity?: (o: number) => void; onEntityClick?: (entity: SelectedEntity) => void; onFlyTo?: (lat: number, lng: number) => void }) {
const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, activeLayers, setActiveLayers, onSettingsClick, onLegendClick, gibsDate, setGibsDate, gibsOpacity, setGibsOpacity, onEntityClick, onFlyTo, trackedSdr, setTrackedSdr }: { data: DashboardData; activeLayers: ActiveLayers; setActiveLayers: React.Dispatch<React.SetStateAction<ActiveLayers>>; onSettingsClick?: () => void; onLegendClick?: () => void; gibsDate?: string; setGibsDate?: (d: string) => void; gibsOpacity?: number; setGibsOpacity?: (o: number) => void; onEntityClick?: (entity: SelectedEntity) => void; onFlyTo?: (lat: number, lng: number) => void; trackedSdr?: KiwiSDR | null; setTrackedSdr?: (sdr: KiwiSDR | null) => void }) {
const [isMinimized, setIsMinimized] = useState(false);
const { theme, toggleTheme } = useTheme();
const { theme, toggleTheme, hudColor, cycleHudColor } = useTheme();
const [gibsPlaying, setGibsPlaying] = useState(false);
const [potusEnabled, setPotusEnabled] = useState(true);
const gibsIntervalRef = useRef<ReturnType<typeof setInterval> | null>(null);
@@ -172,6 +172,13 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
>
{theme === 'dark' ? <Sun size={14} /> : <Moon size={14} />}
</button>
<button
onClick={cycleHudColor}
className={`w-7 h-7 rounded-lg border border-[var(--border-primary)] hover:border-cyan-500/50 flex items-center justify-center text-cyan-400 hover:text-cyan-300 transition-all hover:bg-[var(--hover-accent)]`}
title={hudColor === 'cyan' ? 'Switch to Matrix HUD' : 'Switch to Cyan HUD'}
>
<Palette size={14} />
</button>
{onSettingsClick && (
<button
onClick={onSettingsClick}
@@ -238,6 +245,58 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
className="overflow-y-auto styled-scrollbar"
>
<div className="flex flex-col gap-6 p-4 pt-2 pb-6">
{/* SDR TRACKER — pinned to TOP when active */}
{trackedSdr && (
<div className="bg-amber-950/20 border border-amber-500/40 rounded-lg p-3 -mt-1 shadow-[0_0_15px_rgba(245,158,11,0.1)]">
<div className="flex items-center justify-between mb-2">
<div className="flex items-center gap-2">
<Radio size={14} className="text-amber-400" />
<span className="text-[10px] text-amber-400 font-mono tracking-widest font-bold">SDR TRACKER</span>
<span className="text-[9px] font-mono px-1.5 py-0.5 rounded-full bg-amber-500/20 border border-amber-500/40 text-amber-400 animate-pulse">
LIVE
</span>
</div>
<button
onClick={(e) => { e.stopPropagation(); setTrackedSdr?.(null); }}
className="text-[8px] font-mono text-[var(--text-muted)] hover:text-red-400 border border-[var(--border-primary)] hover:border-red-400/40 rounded px-1.5 py-0.5 transition-colors"
title="Release SDR and clear tracking"
>
RELEASE
</button>
</div>
<div className="flex flex-col gap-2">
<div className="flex flex-col p-2 rounded-lg border border-amber-500/20 bg-amber-950/10">
<span className="text-[10px] font-bold font-mono text-amber-300 truncate mb-1">
{(trackedSdr.name || 'REMOTE RECEIVER').toUpperCase()}
</span>
<div className="text-[8px] text-[var(--text-muted)] font-mono mb-2">
{trackedSdr.location && <span>{trackedSdr.location} · </span>}
{trackedSdr.antenna && <span>{trackedSdr.antenna.slice(0, 40)}</span>}
</div>
<div className="flex items-center gap-2 mt-1">
<button
onClick={() => onFlyTo?.(trackedSdr.lat, trackedSdr.lon)}
className="flex-1 text-center px-2 py-1.5 rounded border border-[var(--border-primary)] hover:border-amber-400/50 hover:text-amber-400 text-[var(--text-muted)] text-[9px] font-mono tracking-widest transition-colors flex items-center justify-center gap-1.5"
title="Pan camera to SDR location"
>
<Globe size={10} /> RE-LOCK
</button>
{trackedSdr.url && (
<a
href={trackedSdr.url}
target="_blank"
rel="noopener noreferrer"
className="flex-1 text-center px-2 py-1.5 rounded border border-amber-500/50 bg-amber-500/10 text-amber-400 hover:bg-amber-500/20 hover:border-amber-400 text-[9px] font-mono tracking-widest transition-colors flex items-center justify-center gap-1.5"
>
<Activity size={10} /> TUNER
</a>
)}
</div>
</div>
</div>
</div>
)}
{/* POTUS Fleet — pinned to TOP when aircraft are active */}
{potusEnabled && potusFlights.length > 0 && (
<div className="bg-[#ff1493]/5 border border-[#ff1493]/30 rounded-lg p-3 -mt-1">
@@ -42,7 +42,7 @@ const WorldviewRightPanel = React.memo(function WorldviewRightPanel({ effects, s
</div>
{/* Right side controls box */}
<div className="bg-[var(--bg-primary)]/40 backdrop-blur-md border border-[var(--border-primary)] rounded-xl pointer-events-auto border-r-2 border-r-cyan-900 flex flex-col relative overflow-hidden h-full">
<div className="bg-[var(--bg-primary)]/40 backdrop-blur-md border border-[var(--border-primary)] rounded-xl pointer-events-auto border-r-2 border-r-[var(--border-primary)] flex flex-col relative overflow-hidden h-full">
{/* Header / Toggle */}
<div
@@ -71,14 +71,14 @@ const WorldviewRightPanel = React.memo(function WorldviewRightPanel({ effects, s
onClick={() => setEffects({ ...effects, bloom: !effects.bloom })}
>
<div className="flex items-center gap-3">
<span className={`text-[14px] ${effects.bloom ? 'text-yellow-500' : 'text-gray-600'}`}></span>
<span className={`text-[14px] ${effects.bloom ? 'text-yellow-500' : 'text-[var(--text-muted)]'}`}></span>
<span className={`text-xs font-mono tracking-widest ${effects.bloom ? 'text-[var(--text-primary)]' : 'text-[var(--text-muted)]'}`}>BLOOM</span>
</div>
<span className="text-[9px] font-mono tracking-wider text-[var(--text-muted)]">{effects.bloom ? 'ON' : 'OFF'}</span>
</div>
{/* Sharpen Slider */}
<div className="flex flex-col gap-3 group border border-cyan-900/50 bg-cyan-950/10 rounded px-4 py-3 pb-4 relative overflow-hidden">
<div className="flex flex-col gap-3 group border border-[var(--border-primary)]/50 bg-[var(--bg-secondary)]/10 rounded px-4 py-3 pb-4 relative overflow-hidden">
<div className="absolute left-0 top-0 bottom-0 w-1 bg-cyan-500"></div>
<div className="flex items-center gap-2">
<span className="w-3 h-3 rounded-full border border-cyan-400 flex items-center justify-center relative">
@@ -98,7 +98,7 @@ const WorldviewRightPanel = React.memo(function WorldviewRightPanel({ effects, s
{/* HUD Dropdown */}
<div className="flex flex-col gap-2 relative">
<div className="flex items-center gap-3 border border-[var(--border-primary)] rounded px-4 py-3 text-[var(--text-muted)] cursor-default">
<span className="w-3 h-3 border border-gray-500 rounded-full flex items-center justify-center"></span>
<span className="w-3 h-3 border border-[var(--border-secondary)] rounded-full flex items-center justify-center"></span>
<span className="text-xs font-mono tracking-widest">HUD</span>
</div>
@@ -106,6 +106,32 @@ export function CarrierLabels({ ships, inView, interpShip }: CarrierLabelsProps)
);
}
// -- Tracked yacht labels --
interface TrackedYachtLabelsProps {
ships: any[];
inView: (lat: number, lng: number) => boolean;
interpShip: (s: any) => [number, number];
}
export function TrackedYachtLabels({ ships, inView, interpShip }: TrackedYachtLabelsProps) {
return (
<>
{ships.map((s: any, i: number) => {
if (!s.yacht_alert || s.lat == null || s.lng == null) return null;
if (!inView(s.lat, s.lng)) return null;
const [iLng, iLat] = interpShip(s);
return (
<Marker key={`yacht-label-${i}`} longitude={iLng} latitude={iLat} anchor="top" offset={[0, 12]} style={{ zIndex: 2 }}>
<div style={{ ...LABEL_BASE, color: s.yacht_color || '#FF69B4', fontSize: '10px', textShadow: LABEL_SHADOW_EXTRA, whiteSpace: 'nowrap' }}>
{s.yacht_owner || s.name || 'TRACKED YACHT'}
</div>
</Marker>
);
})}
</>
);
}
// -- UAV labels --
interface UavLabelsProps {
uavs: any[];
@@ -0,0 +1,123 @@
import { describe, it, expect } from 'vitest';
import {
buildEarthquakesGeoJSON,
buildFirmsGeoJSON,
buildInternetOutagesGeoJSON,
buildDataCentersGeoJSON,
buildShipsGeoJSON,
buildCarriersGeoJSON,
} from '@/components/map/geoJSONBuilders';
import type { Earthquake, FireHotspot, InternetOutage, DataCenter, Ship, ActiveLayers } from '@/types/dashboard';
// Default active layers for ship tests
const allShipLayers: ActiveLayers = {
flights: true, private: true, jets: true, military: true, tracked: true,
satellites: true, earthquakes: true, cctv: false, ukraine_frontline: true,
global_incidents: true, firms_fires: true, jamming: true, internet_outages: true,
datacenters: true, gdelt: false, liveuamap: true, weather: true, uav: true,
kiwisdr: false,
ships_military: true, ships_cargo: true, ships_civilian: true,
ships_passenger: true, ships_tracked_yachts: true,
};
describe('buildEarthquakesGeoJSON', () => {
it('returns null for empty array', () => {
expect(buildEarthquakesGeoJSON([])).toBeNull();
});
it('returns null for undefined', () => {
expect(buildEarthquakesGeoJSON(undefined)).toBeNull();
});
it('builds valid FeatureCollection', () => {
const quakes: Earthquake[] = [
{ id: 'eq1', mag: 5.2, lat: 35.0, lng: 139.0, place: 'Japan' },
{ id: 'eq2', mag: 3.1, lat: 40.0, lng: -74.0, place: 'New York' },
];
const result = buildEarthquakesGeoJSON(quakes);
expect(result).not.toBeNull();
expect(result!.type).toBe('FeatureCollection');
expect(result!.features).toHaveLength(2);
expect(result!.features[0].properties?.type).toBe('earthquake');
expect(result!.features[0].geometry).toEqual({ type: 'Point', coordinates: [139.0, 35.0] });
});
it('skips entries with null coordinates', () => {
const quakes: Earthquake[] = [
{ id: 'eq1', mag: 5.2, lat: null as any, lng: 139.0, place: 'Bad' },
{ id: 'eq2', mag: 3.1, lat: 40.0, lng: -74.0, place: 'Good' },
];
const result = buildEarthquakesGeoJSON(quakes);
expect(result!.features).toHaveLength(1);
});
});
describe('buildFirmsGeoJSON', () => {
it('returns null for empty array', () => {
expect(buildFirmsGeoJSON([])).toBeNull();
});
it('assigns correct icon by FRP intensity', () => {
const fires: FireHotspot[] = [
{ lat: 10, lng: 20, frp: 2, brightness: 300, confidence: 'high', daynight: 'D', acq_date: '2025-01-01', acq_time: '1200' }, // yellow
{ lat: 10, lng: 21, frp: 10, brightness: 350, confidence: 'high', daynight: 'D', acq_date: '2025-01-01', acq_time: '1200' }, // orange
{ lat: 10, lng: 22, frp: 50, brightness: 400, confidence: 'high', daynight: 'N', acq_date: '2025-01-01', acq_time: '0000' }, // red
{ lat: 10, lng: 23, frp: 200, brightness: 500, confidence: 'high', daynight: 'N', acq_date: '2025-01-01', acq_time: '0000' }, // darkred
];
const result = buildFirmsGeoJSON(fires)!;
expect(result.features[0].properties?.iconId).toBe('fire-yellow');
expect(result.features[1].properties?.iconId).toBe('fire-orange');
expect(result.features[2].properties?.iconId).toBe('fire-red');
expect(result.features[3].properties?.iconId).toBe('fire-darkred');
});
});
describe('buildShipsGeoJSON', () => {
const alwaysInView = () => true;
const interpIdentity = (s: Ship): [number, number] => [s.lng!, s.lat!];
it('returns null when all ship layers are off', () => {
const layers = { ...allShipLayers, ships_military: false, ships_cargo: false, ships_civilian: false, ships_passenger: false, ships_tracked_yachts: false };
const ships: Ship[] = [{ name: 'Test', lat: 10, lng: 20, type: 'cargo' } as Ship];
expect(buildShipsGeoJSON(ships, layers, alwaysInView, interpIdentity)).toBeNull();
});
it('filters out carriers (handled by buildCarriersGeoJSON)', () => {
const ships: Ship[] = [
{ name: 'Cargo Ship', lat: 10, lng: 20, type: 'cargo', mmsi: '123' } as Ship,
{ name: 'USS Nimitz', lat: 30, lng: 40, type: 'carrier', mmsi: '456' } as Ship,
];
const result = buildShipsGeoJSON(ships, allShipLayers, alwaysInView, interpIdentity);
expect(result!.features).toHaveLength(1);
expect(result!.features[0].properties?.name).toBe('Cargo Ship');
});
it('assigns correct icon by ship type', () => {
const ships: Ship[] = [
{ name: 'Tanker', lat: 10, lng: 20, type: 'tanker', mmsi: '1' } as Ship,
{ name: 'Yacht', lat: 10, lng: 21, type: 'yacht', mmsi: '2' } as Ship,
{ name: 'Warship', lat: 10, lng: 22, type: 'military_vessel', mmsi: '3' } as Ship,
];
const result = buildShipsGeoJSON(ships, allShipLayers, alwaysInView, interpIdentity)!;
expect(result.features[0].properties?.iconId).toBe('svgShipRed');
expect(result.features[1].properties?.iconId).toBe('svgShipWhite');
expect(result.features[2].properties?.iconId).toBe('svgShipYellow');
});
});
describe('buildCarriersGeoJSON', () => {
it('returns null for empty ships', () => {
expect(buildCarriersGeoJSON([])).toBeNull();
});
it('only includes carriers', () => {
const ships: Ship[] = [
{ name: 'USS Nimitz', lat: 30, lng: 40, type: 'carrier', mmsi: '456', heading: 90 } as Ship,
{ name: 'Cargo Ship', lat: 10, lng: 20, type: 'cargo', mmsi: '123' } as Ship,
];
const result = buildCarriersGeoJSON(ships)!;
expect(result.features).toHaveLength(1);
expect(result.features[0].properties?.name).toBe('USS Nimitz');
expect(result.features[0].properties?.iconId).toBe('svgCarrier');
});
});
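// Sketch of one additional case (the suppressed portion of this suite may already cover it):
// buildInternetOutagesGeoJSON should drop rows without coordinates and label the rest as
// "region, country". The outage values below are invented purely for illustration.
describe('buildInternetOutagesGeoJSON (sketch)', () => {
  it('skips entries without coordinates and labels the rest', () => {
    const outages: InternetOutage[] = [
      { lat: null, lng: 30.0, region_code: 'XX-1', severity: 50 } as any,
      { lat: 48.4, lng: 35.0, region_name: 'Dnipro', region_code: 'UA-12', country_name: 'Ukraine', severity: 72, datasource: 'IODA' } as any,
    ];
    const result = buildInternetOutagesGeoJSON(outages)!;
    expect(result.features).toHaveLength(1);
    expect(result.features[0].properties?.name).toBe('Dnipro, Ukraine');
    expect(result.features[0].properties?.severity).toBe(72);
  });
});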
@@ -0,0 +1,423 @@
// ─── Pure GeoJSON builder functions ─────────────────────────────────────────
// Extracted from MaplibreViewer to reduce component size and enable unit testing.
// Each function takes data arrays + optional helpers and returns a GeoJSON FeatureCollection or null.
import type { Earthquake, GPSJammingZone, FireHotspot, InternetOutage, DataCenter, GDELTIncident, LiveUAmapIncident, CCTVCamera, KiwiSDR, FrontlineGeoJSON, UAV, Satellite, Ship, ActiveLayers } from "@/types/dashboard";
import { classifyAircraft } from "@/utils/aircraftClassification";
import { MISSION_COLORS, MISSION_ICON_MAP } from "@/components/map/icons/SatelliteIcons";
type FC = GeoJSON.FeatureCollection | null;
type InViewFilter = (lat: number, lng: number) => boolean;
// ─── Earthquakes ────────────────────────────────────────────────────────────
export function buildEarthquakesGeoJSON(earthquakes?: Earthquake[]): FC {
if (!earthquakes?.length) return null;
return {
type: 'FeatureCollection',
features: earthquakes.map((eq, i) => {
if (eq.lat == null || eq.lng == null) return null;
return {
type: 'Feature' as const,
properties: {
id: i,
type: 'earthquake',
name: `[M${eq.mag}]\n${eq.place || 'Unknown Location'}`,
title: eq.title,
},
geometry: { type: 'Point' as const, coordinates: [eq.lng, eq.lat] }
};
}).filter(Boolean) as GeoJSON.Feature[]
};
}
// ─── GPS Jamming Zones ──────────────────────────────────────────────────────
export function buildJammingGeoJSON(zones?: GPSJammingZone[]): FC {
if (!zones?.length) return null;
return {
type: 'FeatureCollection',
features: zones.map((zone, i) => {
const halfDeg = 0.5;
return {
type: 'Feature' as const,
properties: {
id: i,
severity: zone.severity,
ratio: zone.ratio,
degraded: zone.degraded,
total: zone.total,
opacity: zone.severity === 'high' ? 0.45 : zone.severity === 'medium' ? 0.3 : 0.18
},
geometry: {
type: 'Polygon' as const,
coordinates: [[
[zone.lng - halfDeg, zone.lat - halfDeg],
[zone.lng + halfDeg, zone.lat - halfDeg],
[zone.lng + halfDeg, zone.lat + halfDeg],
[zone.lng - halfDeg, zone.lat + halfDeg],
[zone.lng - halfDeg, zone.lat - halfDeg]
]]
}
};
})
};
}
// ─── CCTV Cameras ──────────────────────────────────────────────────────────
export function buildCctvGeoJSON(cameras?: CCTVCamera[], inView?: InViewFilter): FC {
if (!cameras?.length) return null;
return {
type: 'FeatureCollection' as const,
features: cameras.filter(c => c.lat != null && c.lon != null && (!inView || inView(c.lat, c.lon))).map((c, i) => ({
type: 'Feature' as const,
properties: {
id: c.id || i,
type: 'cctv',
name: c.direction_facing || 'Camera',
source_agency: c.source_agency || 'Unknown',
media_url: c.media_url || '',
media_type: c.media_type || 'image'
},
geometry: { type: 'Point' as const, coordinates: [c.lon, c.lat] }
}))
};
}
// ─── KiwiSDR Receivers ─────────────────────────────────────────────────────
export function buildKiwisdrGeoJSON(receivers?: KiwiSDR[], inView?: InViewFilter): FC {
if (!receivers?.length) return null;
return {
type: 'FeatureCollection' as const,
features: receivers.filter(k => k.lat != null && k.lon != null && (!inView || inView(k.lat, k.lon))).map((k, i) => ({
type: 'Feature' as const,
properties: {
id: i,
type: 'kiwisdr',
name: k.name || 'Unknown SDR',
url: k.url || '',
users: k.users || 0,
users_max: k.users_max || 0,
bands: k.bands || '',
antenna: k.antenna || '',
location: k.location || '',
lat: k.lat,
lon: k.lon,
},
geometry: { type: 'Point' as const, coordinates: [k.lon, k.lat] }
}))
};
}
// ─── NASA FIRMS Fires ───────────────────────────────────────────────────────
export function buildFirmsGeoJSON(fires?: FireHotspot[]): FC {
if (!fires?.length) return null;
return {
type: 'FeatureCollection',
features: fires.map((f, i) => {
const frp = f.frp || 0;
const iconId = frp >= 100 ? 'fire-darkred' : frp >= 20 ? 'fire-red' : frp >= 5 ? 'fire-orange' : 'fire-yellow';
return {
type: 'Feature' as const,
properties: {
id: i,
type: 'firms_fire',
name: `Fire ${frp.toFixed(1)} MW`,
frp,
iconId,
brightness: f.brightness || 0,
confidence: f.confidence || '',
daynight: f.daynight === 'D' ? 'Day' : 'Night',
acq_date: f.acq_date || '',
acq_time: f.acq_time || '',
},
geometry: { type: 'Point' as const, coordinates: [f.lng, f.lat] }
};
})
};
}
// ─── Internet Outages ───────────────────────────────────────────────────────
export function buildInternetOutagesGeoJSON(outages?: InternetOutage[]): FC {
if (!outages?.length) return null;
return {
type: 'FeatureCollection',
features: outages.map((o) => {
if (o.lat == null || o.lng == null) return null;
const severity = o.severity || 0;
const region = o.region_name || o.region_code || '?';
const country = o.country_name || o.country_code || '';
const label = `${region}, ${country}`;
const detail = `${label}\n${severity}% drop · ${o.datasource || 'IODA'}`;
return {
type: 'Feature' as const,
properties: {
id: o.region_code || region,
type: 'internet_outage',
name: label,
country,
region,
level: o.level,
severity,
datasource: o.datasource || '',
detail,
},
geometry: { type: 'Point' as const, coordinates: [o.lng, o.lat] }
};
}).filter(Boolean) as GeoJSON.Feature[]
};
}
// ─── Data Centers ───────────────────────────────────────────────────────────
export function buildDataCentersGeoJSON(datacenters?: DataCenter[]): FC {
if (!datacenters?.length) return null;
return {
type: 'FeatureCollection',
features: datacenters.map((dc, i) => ({
type: 'Feature' as const,
properties: {
id: `dc-${i}`,
type: 'datacenter',
name: dc.name || 'Unknown',
company: dc.company || '',
street: dc.street || '',
city: dc.city || '',
country: dc.country || '',
zip: dc.zip || '',
},
geometry: { type: 'Point' as const, coordinates: [dc.lng, dc.lat] }
}))
};
}
// ─── GDELT Incidents ────────────────────────────────────────────────────────
export function buildGdeltGeoJSON(gdelt?: GDELTIncident[], inView?: InViewFilter): FC {
if (!gdelt?.length) return null;
return {
type: 'FeatureCollection',
features: gdelt.map((g) => {
if (!g.geometry || !g.geometry.coordinates) return null;
const [gLng, gLat] = g.geometry.coordinates;
if (inView && !inView(gLat, gLng)) return null;
return {
type: 'Feature' as const,
properties: { id: g.properties?.name || String(g.geometry.coordinates), type: 'gdelt', title: g.properties?.name || '' },
geometry: g.geometry
};
}).filter(Boolean) as GeoJSON.Feature[]
};
}
// ─── LiveUAMap Incidents ────────────────────────────────────────────────────
export function buildLiveuaGeoJSON(incidents?: LiveUAmapIncident[], inView?: InViewFilter): FC {
if (!incidents?.length) return null;
return {
type: 'FeatureCollection',
features: incidents.map((incident) => {
if (incident.lat == null || incident.lng == null) return null;
if (inView && !inView(incident.lat, incident.lng)) return null;
const isViolent = /bomb|missil|strike|attack|kill|destroy|fire|shoot|expl|raid/i.test(incident.title || "");
return {
type: 'Feature' as const,
properties: {
id: incident.id,
type: 'liveuamap',
title: incident.title || '',
iconId: isViolent ? 'icon-liveua-red' : 'icon-liveua-yellow',
},
geometry: { type: 'Point' as const, coordinates: [incident.lng, incident.lat] }
};
}).filter(Boolean) as GeoJSON.Feature[]
};
}
// ─── Ukraine Frontline ──────────────────────────────────────────────────────
export function buildFrontlineGeoJSON(frontlines?: FrontlineGeoJSON | null): FC {
if (!frontlines?.features?.length) return null;
return frontlines;
}
// ─── Parameterized Flight Layer ─────────────────────────────────────────────
// Deduplicates commercial / private / jets / military flight GeoJSON builders.
export interface FlightLayerConfig {
colorMap: Record<string, string>;
groundedMap: Record<string, string>;
typeLabel: string;
idPrefix: string;
/** For military flights: special icon overrides by military_type */
milSpecialMap?: Record<string, string>;
/** If true, prefer true_track over heading for rotation (commercial flights) */
useTrackHeading?: boolean;
}
export function buildFlightLayerGeoJSON(
flights: any[] | undefined,
config: FlightLayerConfig,
helpers: {
interpFlight: (f: any) => [number, number];
inView: InViewFilter;
trackedIcaoSet: Set<string>;
}
): FC {
if (!flights?.length) return null;
const { colorMap, groundedMap, typeLabel, idPrefix, milSpecialMap, useTrackHeading } = config;
const { interpFlight, inView, trackedIcaoSet } = helpers;
return {
type: 'FeatureCollection',
features: flights.map((f: any, i: number) => {
if (f.lat == null || f.lng == null) return null;
if (!inView(f.lat, f.lng)) return null;
if (f.icao24 && trackedIcaoSet.has(f.icao24.toLowerCase())) return null;
const acType = classifyAircraft(f.model, f.aircraft_category);
const grounded = f.alt != null && f.alt <= 100;
let iconId: string;
if (milSpecialMap) {
const milType = f.military_type || 'default';
iconId = milSpecialMap[milType] || '';
if (!iconId) {
iconId = grounded ? groundedMap[acType] : colorMap[acType];
} else if (grounded) {
iconId = groundedMap[acType];
}
} else {
iconId = grounded ? groundedMap[acType] : colorMap[acType];
}
const rotation = useTrackHeading ? (f.true_track || f.heading || 0) : (f.heading || 0);
const [iLng, iLat] = interpFlight(f);
return {
type: 'Feature' as const,
properties: { id: f.icao24 || f.callsign || `${idPrefix}${i}`, type: typeLabel, callsign: f.callsign || f.icao24, rotation, iconId },
geometry: { type: 'Point' as const, coordinates: [iLng, iLat] }
};
}).filter(Boolean) as GeoJSON.Feature[]
};
}
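// Illustrative usage sketch: the icon ids, type label, and acType keys ('heavy', 'default')
// below are assumptions for the example, not values from this commit. The idea is that the map
// component defines one config per flight tier and reuses buildFlightLayerGeoJSON for all of them.
export const exampleCommercialConfig: FlightLayerConfig = {
  colorMap: { heavy: 'svgPlaneCyanHeavy', default: 'svgPlaneCyan' },     // airborne icons (assumed ids)
  groundedMap: { heavy: 'svgPlaneGreyHeavy', default: 'svgPlaneGrey' },  // grounded icons (assumed ids)
  typeLabel: 'commercial_flight',
  idPrefix: 'cf-',
  useTrackHeading: true, // commercial feeds expose true_track
};
// buildFlightLayerGeoJSON(data.commercial_flights, exampleCommercialConfig, { interpFlight, inView, trackedIcaoSet });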
// ─── UAVs / Drones ──────────────────────────────────────────────────────────
export function buildUavGeoJSON(uavs?: UAV[], inView?: InViewFilter): FC {
if (!uavs?.length) return null;
return {
type: 'FeatureCollection',
features: uavs.map((uav, i) => {
if (uav.lat == null || uav.lng == null) return null;
if (inView && !inView(uav.lat, uav.lng)) return null;
return {
type: 'Feature' as const,
properties: {
id: (uav as any).id || `uav-${i}`,
type: 'uav',
callsign: uav.callsign,
rotation: uav.heading || 0,
iconId: 'svgDrone',
name: uav.aircraft_model || uav.callsign,
country: uav.country || '',
uav_type: uav.uav_type || '',
alt: uav.alt || 0,
wiki: uav.wiki || '',
speed_knots: uav.speed_knots || 0,
icao24: uav.icao24 || '',
registration: uav.registration || '',
squawk: uav.squawk || '',
},
geometry: { type: 'Point' as const, coordinates: [uav.lng, uav.lat] }
};
}).filter(Boolean) as GeoJSON.Feature[]
};
}
// ─── Satellites ─────────────────────────────────────────────────────────────
export function buildSatellitesGeoJSON(
satellites: Satellite[] | undefined,
inView: InViewFilter,
interpSat: (s: Satellite) => [number, number]
): FC {
if (!satellites?.length) return null;
return {
type: 'FeatureCollection',
features: satellites
.filter((s) => s.lat != null && s.lng != null && inView(s.lat, s.lng))
.map((s, i) => ({
type: 'Feature' as const,
properties: {
id: s.id || i, type: 'satellite', name: s.name, mission: s.mission || 'general',
sat_type: s.sat_type || 'Satellite', country: s.country || '', alt_km: s.alt_km || 0,
wiki: s.wiki || '', color: MISSION_COLORS[s.mission] || '#aaaaaa',
iconId: MISSION_ICON_MAP[s.mission] || 'sat-gen'
},
geometry: { type: 'Point' as const, coordinates: interpSat(s) }
}))
};
}
// ─── Ships (non-carrier) ────────────────────────────────────────────────────
export function buildShipsGeoJSON(
ships: Ship[] | undefined,
activeLayers: ActiveLayers,
inView: InViewFilter,
interpShip: (s: Ship) => [number, number]
): FC {
if (!(activeLayers.ships_military || activeLayers.ships_cargo || activeLayers.ships_civilian || activeLayers.ships_passenger || activeLayers.ships_tracked_yachts) || !ships) return null;
return {
type: 'FeatureCollection',
features: ships.map((s, i) => {
if (s.lat == null || s.lng == null) return null;
if (!inView(s.lat, s.lng)) return null;
const isTrackedYacht = !!s.yacht_alert;
const isMilitary = s.type === 'carrier' || s.type === 'military_vessel';
const isCargo = s.type === 'tanker' || s.type === 'cargo';
const isPassenger = s.type === 'passenger';
if (s.type === 'carrier') return null; // Handled by buildCarriersGeoJSON
if (isTrackedYacht) {
if (activeLayers?.ships_tracked_yachts === false) return null;
} else if (isMilitary && activeLayers?.ships_military === false) return null;
else if (isCargo && activeLayers?.ships_cargo === false) return null;
else if (isPassenger && activeLayers?.ships_passenger === false) return null;
else if (!isMilitary && !isCargo && !isPassenger && activeLayers?.ships_civilian === false) return null;
let iconId = 'svgShipBlue';
if (isTrackedYacht) iconId = 'svgShipPink';
else if (isCargo) iconId = 'svgShipRed';
else if (s.type === 'yacht' || isPassenger) iconId = 'svgShipWhite';
else if (isMilitary) iconId = 'svgShipYellow';
const [iLng, iLat] = interpShip(s);
return {
type: 'Feature',
properties: { id: s.mmsi || s.name || `ship-${i}`, type: 'ship', name: s.name, rotation: s.heading || 0, iconId },
geometry: { type: 'Point', coordinates: [iLng, iLat] }
};
}).filter(Boolean) as GeoJSON.Feature[]
};
}
// ─── Carriers ───────────────────────────────────────────────────────────────
export function buildCarriersGeoJSON(ships: Ship[] | undefined): FC {
if (!ships?.length) return null;
return {
type: 'FeatureCollection',
features: ships.map((s, i) => {
if (s.type !== 'carrier' || s.lat == null || s.lng == null) return null;
return {
type: 'Feature',
properties: { id: s.mmsi || s.name || `carrier-${i}`, type: 'ship', name: s.name, rotation: s.heading || 0, iconId: 'svgCarrier' },
geometry: { type: 'Point', coordinates: [s.lng, s.lat] }
};
}).filter(Boolean) as GeoJSON.Feature[]
};
}
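// Consumption sketch (assumes react-map-gl's Source/Layer components; the layer id and paint
// values are illustrative, not taken from this commit). A memoized builder result feeds a geojson source:
//   const quakesFC = useMemo(() => buildEarthquakesGeoJSON(data.earthquakes), [data.earthquakes]);
//   {quakesFC && (
//     <Source id="earthquakes" type="geojson" data={quakesFC}>
//       <Layer id="earthquake-points" type="circle" paint={{ 'circle-radius': 4, 'circle-color': '#f87171' }} />
//     </Source>
//   )}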
@@ -0,0 +1,77 @@
"use client";
import { useEffect, useRef, useState } from "react";
import type { MapRef } from "react-map-gl/maplibre";
export interface ClusterItem {
lng: number;
lat: number;
count: string | number;
id: number;
}
/**
* Extracts cluster label positions from a MapLibre clustered source.
* Listens for moveend/sourcedata events to keep labels in sync.
*
* @param mapRef - React ref to the MapLibre map instance
* @param sourceId - The source ID to query clusters from (e.g. "ships", "earthquakes")
* @param geoJSON - The GeoJSON data driving the source (null = no clusters)
*/
export function useClusterLabels(
mapRef: React.RefObject<MapRef | null>,
sourceId: string,
geoJSON: unknown | null
): ClusterItem[] {
const [clusters, setClusters] = useState<ClusterItem[]>([]);
const handlerRef = useRef<(() => void) | null>(null);
useEffect(() => {
const map = mapRef.current?.getMap();
if (!map || !geoJSON) {
setClusters([]);
return;
}
// Remove previous handler if it exists
if (handlerRef.current) {
map.off("moveend", handlerRef.current);
map.off("sourcedata", handlerRef.current);
}
const update = () => {
try {
const features = map.querySourceFeatures(sourceId);
const raw = features
.filter((f: any) => f.properties?.cluster)
.map((f: any) => ({
lng: (f.geometry as any).coordinates[0],
lat: (f.geometry as any).coordinates[1],
count: f.properties.point_count_abbreviated || f.properties.point_count,
id: f.properties.cluster_id,
}));
const seen = new Set<number>();
const unique = raw.filter((c) => {
if (seen.has(c.id)) return false;
seen.add(c.id);
return true;
});
setClusters(unique);
} catch {
setClusters([]);
}
};
handlerRef.current = update;
map.on("moveend", update);
map.on("sourcedata", update);
// Delayed initial pass so labels appear once the source has finished loading; cleared on unmount
const initialTimer = setTimeout(update, 500);
return () => {
clearTimeout(initialTimer);
map.off("moveend", update);
map.off("sourcedata", update);
};
}, [geoJSON, sourceId]);
return clusters;
}
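// Usage sketch (the marker markup is illustrative; the source id must match the clustered <Source>):
//   const shipClusters = useClusterLabels(mapRef, 'ships', shipsGeoJSON);
//   return shipClusters.map(c => (
//     <Marker key={c.id} longitude={c.lng} latitude={c.lat} anchor="center">
//       <span className="cluster-count">{c.count}</span>
//     </Marker>
//   ));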
@@ -0,0 +1,68 @@
"use client";
import { useCallback, useMemo, useRef, useState, useEffect } from "react";
import { interpolatePosition } from "@/utils/positioning";
import { INTERP_TICK_MS } from "@/lib/constants";
/**
* Custom hook that provides position interpolation for flights, ships, and satellites.
* Tracks elapsed time since last data refresh and provides helper functions
* to smoothly animate entity positions between API updates.
*/
export function useInterpolation() {
// Interpolation tick — bumps every INTERP_TICK_MS to animate entity positions
const [interpTick, setInterpTick] = useState(0);
const dataTimestamp = useRef(Date.now());
useEffect(() => {
const iv = setInterval(() => setInterpTick((t) => t + 1), INTERP_TICK_MS);
return () => clearInterval(iv);
}, []);
/** Call this when new data arrives to reset the interpolation baseline */
const resetTimestamp = useCallback(() => {
dataTimestamp.current = Date.now();
}, []);
// Elapsed seconds since last data refresh (used for position interpolation)
const dtSeconds = useMemo(() => {
void interpTick; // use the tick to trigger recalc
return (Date.now() - dataTimestamp.current) / 1000;
}, [interpTick]);
/** Interpolate a flight's position if airborne and has speed + heading */
const interpFlight = useCallback(
(f: { lat: number; lng: number; speed_knots?: number | null; alt?: number | null; true_track?: number; heading?: number }): [number, number] => {
if (!f.speed_knots || f.speed_knots <= 0 || dtSeconds <= 0) return [f.lng, f.lat];
if (f.alt != null && f.alt <= 100) return [f.lng, f.lat];
if (dtSeconds < 1) return [f.lng, f.lat];
const heading = f.true_track || f.heading || 0;
const [newLat, newLng] = interpolatePosition(f.lat, f.lng, heading, f.speed_knots, dtSeconds);
return [newLng, newLat];
},
[dtSeconds]
);
/** Interpolate a ship's position using SOG + COG */
const interpShip = useCallback(
(s: { lat: number; lng: number; sog?: number; cog?: number; heading?: number }): [number, number] => {
if (typeof s.sog !== "number" || !s.sog || s.sog <= 0 || dtSeconds <= 0) return [s.lng, s.lat];
const heading = (typeof s.cog === "number" ? s.cog : 0) || s.heading || 0;
const [newLat, newLng] = interpolatePosition(s.lat, s.lng, heading, s.sog, dtSeconds);
return [newLng, newLat];
},
[dtSeconds]
);
/** Interpolate a satellite's position between API updates */
const interpSat = useCallback(
(s: { lat: number; lng: number; speed_knots?: number; heading?: number }): [number, number] => {
if (!s.speed_knots || s.speed_knots <= 0 || dtSeconds < 1) return [s.lng, s.lat];
const [newLat, newLng] = interpolatePosition(s.lat, s.lng, s.heading || 0, s.speed_knots, dtSeconds, 0, 65);
return [newLng, newLat];
},
[dtSeconds]
);
return { interpTick, interpFlight, interpShip, interpSat, dtSeconds, resetTimestamp, dataTimestamp };
}
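// Usage sketch: a map component resets the interpolation baseline whenever a fresh payload lands,
// then hands the helpers to the GeoJSON builders (names below are assumptions, not this commit's code):
//   const { interpFlight, interpShip, resetTimestamp } = useInterpolation();
//   useEffect(() => { resetTimestamp(); }, [dataVersion]);   // new API data, restart the dt clock
//   const shipsFC = buildShipsGeoJSON(data.ships, activeLayers, inView, interpShip);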
@@ -31,6 +31,7 @@ export const svgShipRed = `data:image/svg+xml;utf8,${encodeURIComponent(`<svg xm
export const svgShipYellow = `data:image/svg+xml;utf8,${encodeURIComponent(`<svg xmlns="http://www.w3.org/2000/svg" width="14" height="34" viewBox="0 0 24 24" fill="none"><path d="M7 22 L7 6 L12 1 L17 6 L17 22 Z" fill="yellow" stroke="#000" stroke-width="1"/><rect x="9" y="8" width="6" height="8" fill="#555" stroke="#000" stroke-width="1"/><circle cx="12" cy="18" r="1.5" fill="#000"/><line x1="12" y1="18" x2="12" y2="24" stroke="#000" stroke-width="1.5"/></svg>`)}`;
export const svgShipBlue = `data:image/svg+xml;utf8,${encodeURIComponent(`<svg xmlns="http://www.w3.org/2000/svg" width="16" height="32" viewBox="0 0 24 24" fill="none"><path d="M6 22 L6 6 L12 2 L18 6 L18 22 Z" fill="#3b82f6" stroke="#000" stroke-width="1"/></svg>`)}`;
export const svgShipWhite = `data:image/svg+xml;utf8,${encodeURIComponent(`<svg xmlns="http://www.w3.org/2000/svg" width="18" height="36" viewBox="0 0 24 24" fill="none"><path d="M5 21 L5 8 L12 2 L19 8 L19 21 C19 23 5 23 5 21 Z" fill="white" stroke="#000" stroke-width="1"/><rect x="7" y="10" width="10" height="8" fill="#90cdf4" stroke="#000" stroke-width="1"/><circle cx="12" cy="14" r="2" fill="yellow" stroke="#000"/></svg>`)}`;
export const svgShipPink = `data:image/svg+xml;utf8,${encodeURIComponent(`<svg xmlns="http://www.w3.org/2000/svg" width="18" height="36" viewBox="0 0 24 24" fill="none"><path d="M5 21 L5 8 L12 2 L19 8 L19 21 C19 23 5 23 5 21 Z" fill="#FF69B4" stroke="#000" stroke-width="1"/><rect x="7" y="10" width="10" height="8" fill="#ff8dc7" stroke="#000" stroke-width="1"/><circle cx="12" cy="14" r="2" fill="white" stroke="#000"/></svg>`)}`;
export const svgCarrier = `data:image/svg+xml;utf8,${encodeURIComponent(`<svg xmlns="http://www.w3.org/2000/svg" width="22" height="22" viewBox="0 0 24 24" fill="orange" stroke="black"><polygon points="3,21 21,21 20,4 16,4 16,3 12,3 12,4 4,4" /><rect x="15" y="6" width="3" height="10" /></svg>`)}`;
export const svgCctv = `data:image/svg+xml;utf8,${encodeURIComponent(`<svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="cyan" stroke-width="2"><path d="M16.75 12h3.632a1 1 0 0 1 .894 1.447l-2.034 4.069a1 1 0 0 1-.894.553H5.652a1 1 0 0 1-.894-.553L2.724 13.447A1 1 0 0 1 3.618 12h3.632M14 12V8a2 2 0 0 0-2-2h-4a2 2 0 0 0-2 2v4a4 4 0 1 0 8 0Z" /></svg>`)}`;
export const svgRadioTower = `data:image/svg+xml;utf8,${encodeURIComponent(`<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="#f59e0b" stroke-width="1.5"><line x1="12" y1="10" x2="12" y2="23" stroke="#f59e0b" stroke-width="2"/><line x1="8" y1="23" x2="16" y2="23" stroke="#f59e0b" stroke-width="2" stroke-linecap="round"/><line x1="9" y1="16" x2="15" y2="16" stroke="#f59e0b" stroke-width="1.5" stroke-linecap="round"/><circle cx="12" cy="9" r="2" fill="#f59e0b" stroke="none"/><path d="M8 6a5.5 5.5 0 0 1 8 0" fill="none" stroke="#f59e0b" stroke-width="1.5" stroke-linecap="round"/><path d="M5.5 3.5a9 9 0 0 1 13 0" fill="none" stroke="#f59e0b" stroke-width="1.5" stroke-linecap="round" opacity="0.6"/></svg>`)}`;
+92
View File
@@ -0,0 +1,92 @@
import { useEffect, useState, useRef } from "react";
import { API_BASE } from "@/lib/api";
export type BackendStatus = 'connecting' | 'connected' | 'disconnected';
/**
* Polls the backend for fast and slow data tiers.
*
* Matches the proven GitHub polling pattern:
* - Empty useEffect dependency array (no restarts on viewport change)
* - No viewport bbox filtering (full data every poll)
* - Adaptive startup polling (3s/5s retries at startup, then 15s/120s steady state)
* - ETag conditional requests for bandwidth savings
* - Timer cleanup on unmount (no further polls are scheduled after teardown)
*/
export function useDataPolling() {
const dataRef = useRef<any>({});
const [dataVersion, setDataVersion] = useState(0);
const data = dataRef.current;
const [backendStatus, setBackendStatus] = useState<BackendStatus>('connecting');
const fastEtag = useRef<string | null>(null);
const slowEtag = useRef<string | null>(null);
useEffect(() => {
let hasData = false;
let fastTimerId: ReturnType<typeof setTimeout> | null = null;
let slowTimerId: ReturnType<typeof setTimeout> | null = null;
const fetchFastData = async () => {
try {
const headers: Record<string, string> = {};
if (fastEtag.current) headers['If-None-Match'] = fastEtag.current;
const res = await fetch(`${API_BASE}/api/live-data/fast`, { headers });
if (res.status === 304) { setBackendStatus('connected'); scheduleNext('fast'); return; }
if (res.ok) {
setBackendStatus('connected');
fastEtag.current = res.headers.get('etag') || null;
const json = await res.json();
dataRef.current = { ...dataRef.current, ...json };
setDataVersion(v => v + 1);
const flights = json.commercial_flights?.length || 0;
if (flights > 100) hasData = true;
}
} catch (e) {
console.error("Failed fetching fast live data", e);
setBackendStatus('disconnected');
}
scheduleNext('fast');
};
const fetchSlowData = async () => {
try {
const headers: Record<string, string> = {};
if (slowEtag.current) headers['If-None-Match'] = slowEtag.current;
const res = await fetch(`${API_BASE}/api/live-data/slow`, { headers });
if (res.status === 304) { scheduleNext('slow'); return; }
if (res.ok) {
slowEtag.current = res.headers.get('etag') || null;
const json = await res.json();
dataRef.current = { ...dataRef.current, ...json };
setDataVersion(v => v + 1);
}
} catch (e) {
console.error("Failed fetching slow live data", e);
}
scheduleNext('slow');
};
// Adaptive polling: retry every 3s during startup, back off to normal cadence once data arrives
const scheduleNext = (tier: 'fast' | 'slow') => {
if (tier === 'fast') {
const delay = hasData ? 15000 : 3000; // 3s startup retry → 15s steady state
fastTimerId = setTimeout(fetchFastData, delay);
} else {
const delay = hasData ? 120000 : 5000; // 5s startup retry → 120s steady state
slowTimerId = setTimeout(fetchSlowData, delay);
}
};
fetchFastData();
fetchSlowData();
return () => {
if (fastTimerId) clearTimeout(fastTimerId);
if (slowTimerId) clearTimeout(slowTimerId);
};
}, []);
return { data, dataVersion, backendStatus };
}
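A sketch of how a component might consume this hook; the import path and component name are assumptions. `commercial_flights` is the field the hook itself inspects, so it is reused here for the count.

```tsx
// Hypothetical consumer sketch: import path and component name are assumptions.
"use client";
import { useDataPolling } from "@/hooks/useDataPolling"; // assumed path

export function ConnectionBadge() {
  const { data, dataVersion, backendStatus } = useDataPolling();
  // dataVersion only increments when a poll body actually changed (a 304
  // leaves the ref untouched), so it doubles as a cheap "new data" signal.
  const flightCount = data.commercial_flights?.length ?? 0;
  return (
    <div>
      {backendStatus} · {flightCount} flights · v{dataVersion}
    </div>
  );
}
```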
+46
View File
@@ -0,0 +1,46 @@
import { useCallback, useState, useEffect } from "react";
import { API_BASE } from "@/lib/api";
import type { RegionDossier, SelectedEntity } from "@/types/dashboard";
export function useRegionDossier(
selectedEntity: SelectedEntity | null,
setSelectedEntity: (entity: SelectedEntity | null) => void
) {
const [regionDossier, setRegionDossier] = useState<RegionDossier | null>(null);
const [regionDossierLoading, setRegionDossierLoading] = useState(false);
const handleMapRightClick = useCallback(async (coords: { lat: number; lng: number }) => {
setSelectedEntity({ type: 'region_dossier', id: `${coords.lat.toFixed(4)}_${coords.lng.toFixed(4)}`, extra: coords });
setRegionDossierLoading(true);
setRegionDossier(null);
try {
const [dossierRes, sentinelRes] = await Promise.allSettled([
fetch(`${API_BASE}/api/region-dossier?lat=${coords.lat}&lng=${coords.lng}`),
fetch(`${API_BASE}/api/sentinel2/search?lat=${coords.lat}&lng=${coords.lng}`),
]);
let dossierData: Record<string, unknown> = {};
if (dossierRes.status === 'fulfilled' && dossierRes.value.ok) {
dossierData = await dossierRes.value.json();
}
let sentinelData = null;
if (sentinelRes.status === 'fulfilled' && sentinelRes.value.ok) {
sentinelData = await sentinelRes.value.json();
}
setRegionDossier({ lat: coords.lat, lng: coords.lng, ...dossierData, sentinel2: sentinelData });
} catch (e) {
console.error("Failed to fetch region dossier", e);
} finally {
setRegionDossierLoading(false);
}
}, [setSelectedEntity]);
// Clear dossier when selecting a different entity type
useEffect(() => {
if (selectedEntity?.type !== 'region_dossier') {
setRegionDossier(null);
setRegionDossierLoading(false);
}
}, [selectedEntity]);
return { regionDossier, regionDossierLoading, handleMapRightClick };
}
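Wiring the right-click handler to the map is not shown in this diff; below is a sketch under the assumption that the viewer exposes a maplibre-gl `Map` instance. The hook name and import paths are illustrative.

```ts
// Hypothetical wiring sketch: hook name and import paths are assumptions.
import { useEffect } from "react";
import type { Map as MapLibreMap, MapMouseEvent } from "maplibre-gl";

export function useRightClickDossier(
  map: MapLibreMap | null,
  handleMapRightClick: (coords: { lat: number; lng: number }) => void
) {
  useEffect(() => {
    if (!map) return;
    const onContextMenu = (e: MapMouseEvent) =>
      handleMapRightClick({ lat: e.lngLat.lat, lng: e.lngLat.lng });
    map.on("contextmenu", onContextMenu);
    return () => {
      map.off("contextmenu", onContextMenu);
    };
  }, [map, handleMapRightClick]);
}
```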
+66
View File
@@ -0,0 +1,66 @@
import { useCallback, useState, useRef } from "react";
import { GEOCODE_THROTTLE_MS, GEOCODE_DISTANCE_THRESHOLD, GEOCODE_CACHE_SIZE } from "@/lib/constants";
export function useReverseGeocode() {
const [mouseCoords, setMouseCoords] = useState<{ lat: number; lng: number } | null>(null);
const [locationLabel, setLocationLabel] = useState('');
const geocodeCache = useRef<Map<string, string>>(new Map());
const geocodeTimer = useRef<ReturnType<typeof setTimeout> | null>(null);
const lastGeocodedPos = useRef<{ lat: number; lng: number } | null>(null);
const geocodeAbort = useRef<AbortController | null>(null);
const handleMouseCoords = useCallback((coords: { lat: number; lng: number }) => {
setMouseCoords(coords);
if (geocodeTimer.current) clearTimeout(geocodeTimer.current);
geocodeTimer.current = setTimeout(async () => {
if (lastGeocodedPos.current) {
const dLat = Math.abs(coords.lat - lastGeocodedPos.current.lat);
const dLng = Math.abs(coords.lng - lastGeocodedPos.current.lng);
if (dLat < GEOCODE_DISTANCE_THRESHOLD && dLng < GEOCODE_DISTANCE_THRESHOLD) return;
}
const gridKey = `${(coords.lat).toFixed(2)},${(coords.lng).toFixed(2)}`;
const cached = geocodeCache.current.get(gridKey);
if (cached) {
setLocationLabel(cached);
lastGeocodedPos.current = coords;
return;
}
if (geocodeAbort.current) geocodeAbort.current.abort();
geocodeAbort.current = new AbortController();
try {
const res = await fetch(
`https://nominatim.openstreetmap.org/reverse?lat=${coords.lat}&lon=${coords.lng}&format=json&zoom=10&addressdetails=1`,
{ headers: { 'Accept-Language': 'en' }, signal: geocodeAbort.current.signal }
);
if (res.ok) {
const data = await res.json();
const addr = data.address || {};
const city = addr.city || addr.town || addr.village || addr.county || '';
const state = addr.state || addr.region || '';
const country = addr.country || '';
const parts = [city, state, country].filter(Boolean);
const label = parts.join(', ') || data.display_name?.split(',').slice(0, 3).join(',') || 'Unknown';
if (geocodeCache.current.size > GEOCODE_CACHE_SIZE) {
const iter = geocodeCache.current.keys();
for (let i = 0; i < 100; i++) {
const key = iter.next().value;
if (key !== undefined) geocodeCache.current.delete(key);
}
}
geocodeCache.current.set(gridKey, label);
setLocationLabel(label);
lastGeocodedPos.current = coords;
}
} catch (e: any) {
if (e.name !== 'AbortError') { /* Silently fail - keep last label */ }
}
}, GEOCODE_THROTTLE_MS);
}, []);
return { mouseCoords, locationLabel, handleMouseCoords };
}
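A sketch of feeding this hook from map mouse movement (hook name and import paths are assumptions). The throttling, grid cache, and request aborting all live inside the hook, so the handler can be attached to the raw `mousemove` stream.

```ts
// Hypothetical wiring sketch: hook name and import paths are assumptions.
import { useEffect } from "react";
import type { Map as MapLibreMap, MapMouseEvent } from "maplibre-gl";
import { useReverseGeocode } from "@/hooks/useReverseGeocode"; // assumed path

export function useCursorLocation(map: MapLibreMap | null) {
  const { mouseCoords, locationLabel, handleMouseCoords } = useReverseGeocode();

  useEffect(() => {
    if (!map) return;
    // Safe to call on every mousemove: the hook throttles, dedupes by grid
    // cell, and aborts superseded Nominatim requests internally.
    const onMove = (e: MapMouseEvent) =>
      handleMouseCoords({ lat: e.lngLat.lat, lng: e.lngLat.lng });
    map.on("mousemove", onMove);
    return () => {
      map.off("mousemove", onMove);
    };
  }, [map, handleMouseCoords]);

  return { mouseCoords, locationLabel };
}
```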
+2 -1
View File
@@ -1,9 +1,10 @@
"use client";
import React, { createContext, useContext } from "react";
import type { DashboardData } from "@/types/dashboard";
interface DashboardDataContextValue {
data: any;
data: DashboardData;
selectedEntity: { id: string | number; type: string; extra?: any } | null;
setSelectedEntity: (entity: { id: string | number; type: string; extra?: any } | null) => void;
}
+23 -2
View File
@@ -3,14 +3,23 @@
import React, { createContext, useContext, useState, useEffect } from "react";
type Theme = "dark" | "light";
type HudColor = "cyan" | "matrix";
const ThemeContext = createContext<{ theme: Theme; toggleTheme: () => void }>({
const ThemeContext = createContext<{
theme: Theme;
toggleTheme: () => void;
hudColor: HudColor;
cycleHudColor: () => void;
}>({
theme: "dark",
toggleTheme: () => {},
hudColor: "cyan",
cycleHudColor: () => {},
});
export function ThemeProvider({ children }: { children: React.ReactNode }) {
const [theme, setTheme] = useState<Theme>("dark");
const [hudColor, setHudColor] = useState<HudColor>("cyan");
useEffect(() => {
const saved = localStorage.getItem("sb-theme") as Theme | null;
@@ -18,6 +27,11 @@ export function ThemeProvider({ children }: { children: React.ReactNode }) {
setTheme(saved);
document.documentElement.setAttribute("data-theme", saved);
}
const savedHud = localStorage.getItem("sb-hud-color") as HudColor | null;
if (savedHud === "cyan" || savedHud === "matrix") {
setHudColor(savedHud);
document.documentElement.setAttribute("data-hud", savedHud);
}
}, []);
const toggleTheme = () => {
@@ -27,8 +41,15 @@ export function ThemeProvider({ children }: { children: React.ReactNode }) {
document.documentElement.setAttribute("data-theme", next);
};
const cycleHudColor = () => {
const next = hudColor === "cyan" ? "matrix" : "cyan";
setHudColor(next);
localStorage.setItem("sb-hud-color", next);
document.documentElement.setAttribute("data-hud", next);
};
return (
<ThemeContext.Provider value={{ theme, toggleTheme }}>
<ThemeContext.Provider value={{ theme, toggleTheme, hudColor, cycleHudColor }}>
{children}
</ThemeContext.Provider>
);
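A consumer sketch for the new HUD color control. It assumes the context file also exports a `useTheme()` accessor (not visible in this hunk); the component name is illustrative.

```tsx
// Hypothetical consumer sketch: useTheme() and the import path are assumptions.
"use client";
import { useTheme } from "@/context/ThemeContext"; // assumed accessor + path

export function HudColorToggle() {
  const { hudColor, cycleHudColor } = useTheme();
  // cycleHudColor persists the choice to localStorage and mirrors it onto
  // <html data-hud="...">, so stylesheets can key off [data-hud="matrix"].
  return (
    <button onClick={cycleHudColor}>
      HUD: {hudColor === "cyan" ? "CYAN" : "MATRIX"}
    </button>
  );
}
```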
+21
View File
@@ -0,0 +1,21 @@
// ─── ShadowBroker Frontend Constants ────────────────────────────────────────
// Centralized magic numbers. Import from here instead of hardcoding.
// ─── Data Polling ───────────────────────────────────────────────────────────
export const POLL_FAST_STARTUP_MS = 3000;
export const POLL_FAST_STEADY_MS = 15000;
export const POLL_SLOW_STARTUP_MS = 5000;
export const POLL_SLOW_STEADY_MS = 120000;
// ─── Reverse Geocoding ──────────────────────────────────────────────────────
export const GEOCODE_THROTTLE_MS = 1500;
export const GEOCODE_DISTANCE_THRESHOLD = 0.05; // degrees; ~5.5 km of latitude
export const GEOCODE_CACHE_SIZE = 500;
export const NOMINATIM_DEBOUNCE_MS = 350;
// ─── Map Interpolation ─────────────────────────────────────────────────────
export const INTERP_TICK_MS = 1000;
// ─── News/Alert Layout ──────────────────────────────────────────────────────
export const ALERT_BOX_WIDTH_PX = 180;
export const ALERT_MAX_OFFSET_PX = 350;
+3
View File
@@ -475,4 +475,7 @@ export interface MaplibreViewerProps {
isEavesdropping?: boolean;
onEavesdropClick?: (coords: { lat: number; lng: number }) => void;
onCameraMove?: (coords: { lat: number; lng: number }) => void;
viewBoundsRef?: React.RefObject<{ south: number; west: number; north: number; east: number } | null>;
trackedSdr?: KiwiSDR | null;
setTrackedSdr?: (sdr: KiwiSDR | null) => void;
}
+71
View File
@@ -0,0 +1,71 @@
import { describe, it, expect } from 'vitest';
import { spreadAlertItems } from '@/utils/alertSpread';
describe('spreadAlertItems', () => {
const makeAlert = (title: string, lat: number, lng: number, cluster_count = 1) => ({
title,
coords: [lat, lng],
cluster_count,
alert_level: 3,
});
it('returns empty array for empty input', () => {
expect(spreadAlertItems([], 4, new Set())).toEqual([]);
});
it('throws on null input (caller must null-check)', () => {
expect(() => spreadAlertItems(null as any, 4, new Set())).toThrow();
});
it('filters out items without coords', () => {
const items = [
{ title: 'No coords', alert_level: 1 },
makeAlert('Has coords', 40, -74),
];
const result = spreadAlertItems(items, 4, new Set());
expect(result.length).toBe(1);
expect(result[0].title).toBe('Has coords');
});
it('filters dismissed alerts by alertKey', () => {
const items = [
makeAlert('Fire in NYC', 40.7, -74.0),
makeAlert('Floods in LA', 34.0, -118.2),
];
const dismissed = new Set(['Fire in NYC|40.7,-74']);
const result = spreadAlertItems(items, 4, dismissed);
expect(result.length).toBe(1);
expect(result[0].title).toBe('Floods in LA');
});
it('preserves originalIdx for popup selection', () => {
const items = [
{ title: 'Skip me', alert_level: 1 }, // no coords
makeAlert('Alert A', 10, 20),
makeAlert('Alert B', 30, 40),
];
const result = spreadAlertItems(items, 4, new Set());
expect(result[0].originalIdx).toBe(1);
expect(result[1].originalIdx).toBe(2);
});
it('adds alertKey and showLine properties', () => {
const items = [makeAlert('Test Alert', 51.5, -0.1)];
const result = spreadAlertItems(items, 4, new Set());
expect(result[0]).toHaveProperty('alertKey');
expect(result[0]).toHaveProperty('showLine');
expect(result[0].alertKey).toContain('Test Alert');
});
it('spreads overlapping alerts apart (offsets are non-zero for stacked items)', () => {
// Place 5 alerts at the exact same location — they should be spread apart
const items = Array.from({ length: 5 }, (_, i) =>
makeAlert(`Alert ${i}`, 40.0, -74.0)
);
const result = spreadAlertItems(items, 8, new Set()); // zoom 8 = close enough to overlap
const hasNonZeroOffset = result.some(
(r: any) => Math.abs(r.offsetX) > 1 || Math.abs(r.offsetY) > 1
);
expect(hasNonZeroOffset).toBe(true);
});
});
+141
View File
@@ -0,0 +1,141 @@
/**
* Alert spread collision resolution algorithm.
* Takes news items with coordinates and resolves visual overlaps
* so alert boxes don't stack on top of each other on the map.
*/
import type { NewsArticle } from "@/types/dashboard";
import { ALERT_BOX_WIDTH_PX, ALERT_MAX_OFFSET_PX } from "@/lib/constants";
export interface SpreadAlertItem extends NewsArticle {
originalIdx: number;
x: number;
y: number;
offsetX: number;
offsetY: number;
boxH: number;
alertKey: string;
showLine: boolean;
}
/** Estimate rendered box height based on title length */
function estimateBoxH(n: { title?: string; cluster_count?: number }): number {
const titleLen = (n.title || "").length;
const titleLines = Math.max(1, Math.ceil(titleLen / 20)); // ~20 chars per wrapped line at 9px font in the ~160px text area
const hasFooter = (n.cluster_count || 1) > 1;
return 10 + 14 + titleLines * 13 + (hasFooter ? 14 : 0) + 10; // padding + header + title + footer + padding
}
/**
* Resolves alert box collisions using a grid-based spatial algorithm (O(n) per iteration).
* Returns positioned items with offsets and alert keys.
*/
export function spreadAlertItems(
news: NewsArticle[],
zoom: number,
dismissedAlerts: Set<string>
): SpreadAlertItem[] {
const pixelsPerDeg = (256 * Math.pow(2, zoom)) / 360;
let items = news
.map((n, idx) => ({ ...n, originalIdx: idx }))
.filter((n) => n.coords)
.map((n) => ({
...n,
x: n.coords![1] * pixelsPerDeg,
y: -n.coords![0] * pixelsPerDeg,
offsetX: 0,
offsetY: 0,
boxH: estimateBoxH(n as { title?: string; cluster_count?: number }),
}));
const BOX_W = ALERT_BOX_WIDTH_PX;
const GAP = 6;
const MAX_OFFSET = ALERT_MAX_OFFSET_PX;
// Grid-based collision resolution: spatial hashing keeps each pass near O(n) instead of the O(n²) all-pairs check
const CELL_W = BOX_W + GAP;
const CELL_H = 100;
const maxIter = 30;
for (let iter = 0; iter < maxIter; iter++) {
let moved = false;
const grid: Record<string, number[]> = {};
for (let i = 0; i < items.length; i++) {
const cx = Math.floor((items[i].x + items[i].offsetX) / CELL_W);
const cy = Math.floor((items[i].y + items[i].offsetY) / CELL_H);
const key = `${cx},${cy}`;
(grid[key] ??= []).push(i);
}
const checked = new Set<string>();
for (const key in grid) {
const [cx, cy] = key.split(",").map(Number);
for (let dx = -1; dx <= 1; dx++) {
for (let dy = -1; dy <= 1; dy++) {
const nk = `${cx + dx},${cy + dy}`;
if (!grid[nk]) continue;
const pairKey = dx < 0 || (dx === 0 && dy < 0) ? `${nk}|${key}` : `${key}|${nk}`; // canonical order so each neighbouring cell pair is visited once
if (key !== nk && checked.has(pairKey)) continue;
checked.add(pairKey);
const cellA = grid[key];
const cellB = key === nk ? cellA : grid[nk];
for (const i of cellA) {
const startJ = key === nk ? cellA.indexOf(i) + 1 : 0;
for (let jIdx = startJ; jIdx < cellB.length; jIdx++) {
const j = cellB[jIdx];
if (i === j) continue;
const a = items[i],
b = items[j];
const adx = Math.abs(a.x + a.offsetX - (b.x + b.offsetX));
const ady = Math.abs(a.y + a.offsetY - (b.y + b.offsetY));
const minDistX = BOX_W + GAP;
const minDistY = (a.boxH + b.boxH) / 2 + GAP;
if (adx < minDistX && ady < minDistY) {
moved = true;
const overlapX = minDistX - adx;
const overlapY = minDistY - ady;
if (overlapY < overlapX) {
const push = overlapY / 2 + 1;
if (a.y + a.offsetY <= b.y + b.offsetY) {
a.offsetY -= push;
b.offsetY += push;
} else {
a.offsetY += push;
b.offsetY -= push;
}
} else {
const push = overlapX / 2 + 1;
if (a.x + a.offsetX <= b.x + b.offsetX) {
a.offsetX -= push;
b.offsetX += push;
} else {
a.offsetX += push;
b.offsetX -= push;
}
}
}
}
}
}
}
}
if (!moved) break;
}
// Clamp offsets so boxes stay near their origin
for (const item of items) {
item.offsetX = Math.max(-MAX_OFFSET, Math.min(MAX_OFFSET, item.offsetX));
item.offsetY = Math.max(-MAX_OFFSET, Math.min(MAX_OFFSET, item.offsetY));
}
return items
.filter((item) => {
const alertKey = `${item.title}|${item.coords?.[0]},${item.coords?.[1]}`;
return !dismissedAlerts.has(alertKey);
})
.map((item) => ({
...item,
alertKey: `${item.title}|${item.coords?.[0]},${item.coords?.[1]}`,
showLine: Math.abs(item.offsetX) > 5 || Math.abs(item.offsetY) > 5,
})) as SpreadAlertItem[];
}
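A short usage sketch for the spreader; the surrounding data is made up. The numbers illustrate the projection: at zoom 4, `pixelsPerDeg = 256 * 2^4 / 360 ≈ 11.4`, so two alerts one degree apart project only ~11 px apart and get pushed off each other.

```ts
// Illustrative usage sketch: only the fields the spreader reads are populated,
// hence the cast; titles and coordinates are invented.
import { spreadAlertItems } from "@/utils/alertSpread"; // assumed path
import type { NewsArticle } from "@/types/dashboard";

const news = [
  { title: "Port closure", coords: [40.0, -74.0], alert_level: 3 },
  { title: "Ferry incident", coords: [40.0, -73.0], alert_level: 2 },
] as unknown as NewsArticle[];

const placed = spreadAlertItems(news, 4, new Set<string>());
for (const item of placed) {
  // offsetX/offsetY are pixel nudges away from the alert's anchor point;
  // showLine marks boxes pushed more than 5 px, i.e. ones needing a leader line.
  console.log(item.alertKey, item.offsetX, item.offsetY, item.showLine);
}
```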
+15
View File
@@ -0,0 +1,15 @@
import { defineConfig } from 'vitest/config';
import path from 'path';
export default defineConfig({
test: {
environment: 'jsdom',
globals: true,
include: ['src/**/*.test.{ts,tsx}'],
},
resolve: {
alias: {
'@': path.resolve(__dirname, 'src'),
},
},
});
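For reference, a minimal test this config would pick up (the file location is an assumption): it only needs to match the `src/**/*.test.{ts,tsx}` include glob and can use the `@` alias defined above.

```ts
// Hypothetical example test, e.g. at src/lib/constants.test.ts (assumed location).
import { describe, it, expect } from "vitest";
import { ALERT_BOX_WIDTH_PX, ALERT_MAX_OFFSET_PX } from "@/lib/constants";

describe("layout constants", () => {
  it("keeps the alert box narrower than the maximum spread offset", () => {
    expect(ALERT_BOX_WIDTH_PX).toBeGreaterThan(0);
    expect(ALERT_MAX_OFFSET_PX).toBeGreaterThan(ALERT_BOX_WIDTH_PX);
  });
});
```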