Compare commits


31 Commits

Author SHA1 Message Date
anoracleofra-code 34db99deaf v0.8.0: POTUS fleet tracking, full aircraft color-coding, carrier fidelity, UI overhaul
New features:
- POTUS fleet (AF1, AF2, Marine One) with hot-pink icons + gold halo ring
- 9-color aircraft system: military, medical, police, VIP, privacy, dictators
- Sentinel-2 fullscreen overlay with download/copy/open buttons (green themed)
- Carrier homeport deconfliction — distinct pier positions instead of stacking
- Toggle all data layers button (cyan when active, excludes MODIS Terra)
- Version badge + update checker + Discussions shortcut in UI
- Overhauled MapLegend with POTUS fleet, wildfires, infrastructure sections
- Data center map layer with ~700 global DCs from curated dataset

Fixes:
- All Air Force Two ICAO hex codes now correctly identified
- POTUS icon priority over grounded state
- Sentinel-2 no longer overlaps bottom coordinate bar
- Region dossier Nominatim 429 rate-limit retry/backoff
- Docker ENV legacy format warnings resolved
- UI buttons cyan in dark mode, grey in light mode
- Circuit breaker for flaky upstream APIs
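The circuit-breaker line above names a standard pattern: after repeated upstream failures, stop calling the flaky API for a cooldown period and serve the last good payload instead. A minimal hypothetical TypeScript sketch; the threshold, cooldown, and cached-fallback behavior are illustrative assumptions, not the project's actual implementation:

```typescript
// Hypothetical circuit breaker for a flaky upstream API (not the repo's code).
// After `threshold` consecutive failures the breaker opens and, for `cooldownMs`,
// short-circuits calls by returning the last good payload instead of re-hitting
// the upstream.
type State = "closed" | "open";

class CircuitBreaker<T> {
  private failures = 0;
  private state: State = "closed";
  private openedAt = 0;
  private lastGood: T | null = null;

  constructor(private threshold = 3, private cooldownMs = 60_000) {}

  async call(fn: () => Promise<T>): Promise<T | null> {
    // While open, serve the cached payload until the cooldown elapses.
    if (this.state === "open") {
      if (Date.now() - this.openedAt < this.cooldownMs) return this.lastGood;
      this.state = "closed"; // cooldown over: allow one probe request
      this.failures = 0;
    }
    try {
      const result = await fn();
      this.failures = 0;
      this.lastGood = result;
      return result;
    } catch {
      if (++this.failures >= this.threshold) {
        this.state = "open";
        this.openedAt = Date.now();
      }
      return this.lastGood; // degrade gracefully instead of erroring
    }
  }
}
```

The payoff is that one bad upstream cannot cascade: callers keep getting stale-but-valid data while the breaker is open.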

Community: @suranyami — parallel multi-arch Docker builds + runtime BACKEND_URL fix (PR #35, #44)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Former-commit-id: 7c523df70a2d26f675603166e3513d29230592cd
2026-03-12 09:31:37 -06:00
Shadowbroker a0d0a449eb Merge pull request #44 from suranyami/fix-backend-url-regression-speed-up-docker-builds
ci: speed up multi-arch Docker builds + fix BACKEND_URL baked in at build time
Former-commit-id: 54ca8d59aede7e47df315ac526bde35f4e4d0622
2026-03-11 19:34:57 -06:00
David Parry 26a72f4f95 chore: untrack local config files (.claude, .mise.local.toml)
These are already covered by the .gitignore added in this branch.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

Former-commit-id: dcfdd7bb329ef7e63ee5755ccbe403bf951903f6
2026-03-12 12:11:09 +11:00
David Parry 3eff24c6ed Merge branch 'main' of github.com:suranyami/Shadowbroker
Former-commit-id: 8e9607c7adaf4f1b4b5013fab10429787671ec03
2026-03-12 12:08:19 +11:00
anoracleofra-code bb345ed665 feat: add TopRightControls component
Former-commit-id: e75da4288a
2026-03-11 18:39:26 -06:00
anoracleofra-code dec5b0da9c chore: bump version to 0.7.0
Former-commit-id: 8ee47f52ab
2026-03-11 18:30:49 -06:00
David Parry 68cacc0fed Merge pull request #6 from suranyami/fix-regression-BACKEND_URL
Fix regression, BACKEND_URL now only processed at request-time

Former-commit-id: 4131a0cadb3f17398ccaf7d14704e4399e9fa7b8
2026-03-12 11:22:03 +11:00
David Parry 40e89ac30b Fix regression, BACKEND_URL now only processed at request-time
Former-commit-id: da14f44e910786e9e21b5968b77e97a94f2876ab
2026-03-12 11:18:23 +11:00
David Parry 350ec11725 Merge pull request #5 from suranyami/speed-up-docker-builds
Ensure lower case image name

Former-commit-id: dc43a87ef0
2026-03-12 10:59:41 +11:00
David Parry 5d4dd0560d Ensure lower case image name
Former-commit-id: f98cafd987
2026-03-12 10:34:33 +11:00
David Parry 345f3c7451 Merge pull request #4 from suranyami/speed-up-docker-builds
Add optimizations for separate arm64/x86_64 builds

Former-commit-id: 50d265fcf0
2026-03-12 10:30:01 +11:00
David Parry dde527821c Merge branch 'BigBodyCobain:main' into main
Former-commit-id: 5c49568921
2026-03-12 10:29:30 +11:00
David Parry 5bee764614 Add optimizations for separate arm64/x86_64 builds
Former-commit-id: aff71e6cd7
2026-03-12 10:25:33 +11:00
anoracleofra-code c986de9e35 fix: legend - earthquake icon yellow, outage zone grey
Former-commit-id: 85478250c3
2026-03-11 14:57:51 -06:00
anoracleofra-code d2fa45c6a6 Merge branch 'main' of https://github.com/BigBodyCobain/Shadowbroker
Former-commit-id: cbc506242d
2026-03-11 14:30:25 -06:00
anoracleofra-code d78bf61256 fix: aircraft categorization, fullscreen satellite imagery, region dossier rate-limit, updated map legend
- Fixed 288+ miscategorized aircraft in plane_alert_db.json (gov/police/medical)
- data_fetcher.py: tracked_names enrichment now assigns blue/lime colors for gov/law/medical operators
- region_dossier.py: fixed Nominatim 429 rate-limiting with retry/backoff
- MaplibreViewer.tsx: Sentinel-2 popup replaced with fullscreen overlay + download/copy buttons
- MapLegend.tsx: updated to show all 9 tracked aircraft color categories + POTUS fleet + wildfires + infrastructure


Former-commit-id: d109434616
2026-03-11 14:29:18 -06:00
Shadowbroker b10d6e6e00 Update README.md
Former-commit-id: b1cb267da3
2026-03-11 14:09:50 -06:00
Shadowbroker afdc626bdb Update README.md
Former-commit-id: a3a0f5e990
2026-03-11 14:07:46 -06:00
anoracleofra-code 5ab02e821f feat: POTUS Fleet tracker, Docker secrets, route fix, SQLite->JSON migration
- Add Docker Swarm secrets _FILE support (AIS_API_KEY_FILE, etc.)
- Fix flight route lookup: pass lat/lng to adsb.lol routeset API, return airport names
- Replace SQLite plane_alert DB with JSON file + O(1) category color mapping
- Add POTUS Fleet (AF1, AF2, Marine One) with hardcoded ICAO overrides
- Add tracked_names enrichment from Excel data with POTUS protection
- Add oversized gold-ringed POTUS SVG icons on map
- Add POTUS Fleet tracker panel in WorldviewLeftPanel with fly-to
- Overhaul tracked flight labels: zoom-gated, PIA hidden, color-mapped
- Add orange color to trackedIconMap, soften white icon strokes
- Fix NewsFeed Wikipedia links to use alert_wiki slug
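The O(1) category color mapping mentioned above amounts to replacing a per-aircraft SQLite lookup with a plain object access. A minimal sketch; the category names and hex values here are illustrative, not the project's actual palette:

```typescript
// Hypothetical category -> color table. A JS object (or Map) gives constant-time
// lookup per aircraft, versus issuing a database query for each icon render.
const CATEGORY_COLORS: Record<string, string> = {
  military: "#5a8f3c",
  medical: "#32cd32",
  police: "#1e90ff",
  vip: "#ffd700",
  potus: "#ff69b4",
};

function colorFor(category: string): string {
  // Fall back to white for categories the table does not know about.
  return CATEGORY_COLORS[category] ?? "#ffffff";
}
```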


Former-commit-id: 6f952104c1
2026-03-11 12:28:04 -06:00
anoracleofra-code ac62e4763f chore: update ChangelogModal for v0.7.0
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Former-commit-id: a771fe8cfb
2026-03-11 06:37:15 -06:00
anoracleofra-code cf68f1978d v0.7.0: performance hardening — parallel fetches, deferred icons, AIS stability
Optimizations:
- Parallelized yfinance stock/oil fetches via ThreadPoolExecutor (~2s vs ~8s)
- AIS backoff reset after 200 successes; removed hot-loop pruning (lock contention)
- Single-pass ETag serialization (was double-serializing JSON)
- Deferred ~50 non-critical map icons via setTimeout(0)
- News feed animation capped at 15 items (was 100+ simultaneous)
- heapq.nlargest() for FIRMS fires (60K→5K) and internet outages
- Removed satellite duplication from fast endpoint
- Geopolitics interval 5min → 30min
- Ship counts single-pass memoized; color maps module-level constants
- Improved GDELT URL-to-headline extraction (skip gibberish slugs)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Former-commit-id: 4a14a2f078
2026-03-11 06:25:31 -06:00
David Parry beadce5dae Merge pull request #3 from suranyami/feat/multi-arch-docker-and-backend-proxy
fix: resolve proxy gzip decoding and BACKEND_URL Docker override issues
Former-commit-id: 7af4af1507
2026-03-11 15:58:05 +11:00
Shadowbroker 10f376d4d7 Merge pull request #35 from suranyami/feat/multi-arch-docker-and-backend-proxy
fix: resolve proxy gzip decoding and BACKEND_URL Docker override issues
Former-commit-id: c539a05d20
2026-03-10 22:45:11 -06:00
David Parry ff168150c9 Merge branch 'main' into feat/multi-arch-docker-and-backend-proxy
Former-commit-id: 7ead58d453
2026-03-11 15:05:55 +11:00
David Parry 782225ff99 fix: resolve proxy gzip decoding and BACKEND_URL Docker override issues
Two bugs introduced by the Next.js proxy Route Handler:

1. ERR_CONTENT_DECODING_FAILED — Node.js fetch() automatically
   decompresses gzip/br responses from the backend, but the proxy was
   still forwarding Content-Encoding and Content-Length headers to the
   browser. The browser would then try to decompress already-decompressed
   data and fail. Fixed by stripping Content-Encoding and Content-Length
   from upstream response headers.

2. BACKEND_URL shell env leak into Docker Compose — docker-compose.yml
   used ${BACKEND_URL:-http://backend:8000}, which was being overridden
   by BACKEND_URL=http://localhost:8000 set in .mise.local.toml for local
   dev. Inside the frontend container, localhost:8000 does not exist,
   causing all proxied requests to return 502. Fixed by hardcoding
   http://backend:8000 in docker-compose.yml so the shell environment
   cannot override it.
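The first fix reduces to deleting two headers before re-emitting the upstream response. A hedged sketch (`stripUpstreamHeaders` is a hypothetical helper name, not code from this PR):

```typescript
// Node's fetch() has already decompressed the upstream body, so forwarding
// Content-Encoding / Content-Length would make the browser try to decompress
// plain data (ERR_CONTENT_DECODING_FAILED) or truncate it.
function stripUpstreamHeaders(upstream: Headers): Headers {
  const headers = new Headers(upstream); // copy; leave the original untouched
  headers.delete("content-encoding"); // body is no longer compressed
  headers.delete("content-length"); // length changed after decompression
  return headers;
}

// Illustrative use inside a proxy Route Handler:
// const upstream = await fetch(`${backendUrl}${path}`);
// return new Response(upstream.body, {
//   status: upstream.status,
//   headers: stripUpstreamHeaders(upstream.headers),
// });
```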

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

Former-commit-id: 036c62d2c0
2026-03-11 15:00:50 +11:00
David Parry f99cc669f5 Merge pull request #2 from suranyami/feat/multi-arch-docker-and-backend-proxy
feat: proxy backend API through Next.js using runtime BACKEND_URL
Former-commit-id: d930001673
2026-03-11 14:22:58 +11:00
David Parry 25262323f5 feat: proxy backend API through Next.js using runtime BACKEND_URL
Previously, NEXT_PUBLIC_API_URL was a build-time Next.js variable, making
it impossible to configure the backend URL in docker-compose `environment`
without rebuilding the image.

This change introduces a proper server-side proxy:
- next.config.ts: adds a rewrite rule that forwards all /api/* requests
  to BACKEND_URL (read at server startup, not baked at build time).
  Defaults to http://localhost:8000 so local dev works without config.
- api.ts: API_BASE is now an empty string — all fetch calls use relative
  /api/... paths, which the Next.js server proxies to the backend.
- docker-compose.yml: replaces NEXT_PUBLIC_API_URL build arg with a
  runtime BACKEND_URL env var defaulting to http://backend:8000, using
  Docker's internal networking. Port 8000 no longer needs to be exposed.
- README: updates Docker setup docs, standalone compose example, and
  environment variable reference to reflect BACKEND_URL.
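The rewrite rule described in the first bullet can be sketched roughly as follows (hypothetical shape; the project's actual next.config.ts may differ):

```typescript
// BACKEND_URL is read once when the Next.js server starts, not at build time,
// so changing it only requires restarting the container, never a rebuild.
const backendUrl = process.env.BACKEND_URL ?? "http://localhost:8000";

// Shape matches the async rewrites() option in a Next.js config.
async function rewrites() {
  return [
    {
      source: "/api/:path*",
      destination: `${backendUrl}/api/:path*`, // proxy every /api/* call to the backend
    },
  ];
}
```

In a real next.config.ts this function would be the `rewrites` field of the exported config object.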

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

Former-commit-id: a3b18e23c1
2026-03-11 14:18:30 +11:00
Shadowbroker bad50b8924 Merge pull request #33 from suranyami/feat/multi-arch-docker-and-backend-proxy
Feat/multi arch docker and backend URL as env var

Former-commit-id: 4c92fbe990
2026-03-10 21:02:33 -06:00
David Parry 82715c79a6 Merge pull request #1 from suranyami/feat/multi-arch-docker-and-backend-proxy
Feat/multi arch docker and backend proxy

Former-commit-id: 82e0033239
2026-03-11 13:56:22 +11:00
David Parry e2a9ef9bbf feat: proxy backend API through Next.js using runtime BACKEND_URL
Previously, NEXT_PUBLIC_API_URL was a build-time Next.js variable, making
it impossible to configure the backend URL in docker-compose `environment`
without rebuilding the image.

This change introduces a proper server-side proxy:
- next.config.ts: adds a rewrite rule that forwards all /api/* requests
  to BACKEND_URL (read at server startup, not baked at build time).
  Defaults to http://localhost:8000 so local dev works without config.
- api.ts: API_BASE is now an empty string — all fetch calls use relative
  /api/... paths, which the Next.js server proxies to the backend.
- docker-compose.yml: replaces NEXT_PUBLIC_API_URL build arg with a
  runtime BACKEND_URL env var defaulting to http://backend:8000, using
  Docker's internal networking. Port 8000 no longer needs to be exposed.
- README: updates Docker setup docs, standalone compose example, and
  environment variable reference to reflect BACKEND_URL.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

Former-commit-id: b4c9e78cdd
2026-03-11 13:49:00 +11:00
David Parry 3c16071fcd ci: build and publish multi-arch Docker images (amd64 + arm64)
Add `platforms: linux/amd64,linux/arm64` to both the frontend and
backend build-and-push steps. The existing setup-buildx-action already
enables QEMU-based cross-compilation, so no additional steps are needed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

Former-commit-id: e3e0db6f3d
2026-03-11 13:48:24 +11:00
37 changed files with 2377 additions and 882 deletions
+16
@@ -0,0 +1,16 @@
# ShadowBroker — Docker Compose Environment Variables
# Copy this file to .env and fill in your keys:
# cp .env.example .env
# ── Required for backend container ─────────────────────────────
OPENSKY_CLIENT_ID=
OPENSKY_CLIENT_SECRET=
AIS_API_KEY=
# ── Optional ───────────────────────────────────────────────────
# LTA (Singapore traffic cameras) — leave blank to skip
# LTA_ACCOUNT_KEY=
# Override the backend URL the frontend uses (leave blank for auto-detect)
# NEXT_PUBLIC_API_URL=http://192.168.1.50:8000
+166 -24
@@ -13,17 +13,29 @@ env:
IMAGE_NAME: ${{ github.repository }}
jobs:
build-and-push-frontend:
runs-on: ubuntu-latest
build-frontend:
runs-on: ${{ matrix.runner }}
permissions:
contents: read
packages: write
id-token: write
strategy:
fail-fast: false
matrix:
include:
- platform: linux/amd64
runner: ubuntu-latest
- platform: linux/arm64
runner: ubuntu-24.04-arm
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Lowercase image name
run: echo "IMAGE_NAME=${IMAGE_NAME,,}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3.0.0
@@ -35,6 +47,69 @@ jobs:
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract Docker metadata
id: meta
uses: docker/metadata-action@v5.0.0
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}-frontend
- name: Build and push Docker image by digest
id: build
uses: docker/build-push-action@v5.0.0
with:
context: ./frontend
platforms: ${{ matrix.platform }}
push: ${{ github.event_name != 'pull_request' }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha,scope=frontend-${{ matrix.platform }}
cache-to: type=gha,mode=max,scope=frontend-${{ matrix.platform }}
outputs: type=image,name=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}-frontend,push-by-digest=true,name-canonical=true,push=${{ github.event_name != 'pull_request' }}
- name: Export digest
if: github.event_name != 'pull_request'
run: |
mkdir -p /tmp/digests/frontend
digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/frontend/${digest#sha256:}"
- name: Upload digest
if: github.event_name != 'pull_request'
uses: actions/upload-artifact@v4
with:
name: digests-frontend-${{ matrix.platform == 'linux/amd64' && 'amd64' || 'arm64' }}
path: /tmp/digests/frontend/*
if-no-files-found: error
retention-days: 1
merge-frontend:
runs-on: ubuntu-latest
if: github.event_name != 'pull_request'
needs: build-frontend
permissions:
contents: read
packages: write
steps:
- name: Lowercase image name
run: echo "IMAGE_NAME=${IMAGE_NAME,,}" >> $GITHUB_ENV
- name: Download digests
uses: actions/download-artifact@v4
with:
path: /tmp/digests/frontend
pattern: digests-frontend-*
merge-multiple: true
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3.0.0
- name: Log into registry ${{ env.REGISTRY }}
uses: docker/login-action@v3.0.0
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract Docker metadata
id: meta
uses: docker/metadata-action@v5.0.0
@@ -45,28 +120,36 @@ jobs:
type=semver,pattern={{major}}.{{minor}}
type=raw,value=latest,enable={{is_default_branch}}
- name: Build and push Docker image
id: build-and-push
uses: docker/build-push-action@v5.0.0
with:
context: ./frontend
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Create and push manifest
working-directory: /tmp/digests/frontend
run: |
docker buildx imagetools create \
$(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf '${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}-frontend@sha256:%s ' *)
build-and-push-backend:
runs-on: ubuntu-latest
build-backend:
runs-on: ${{ matrix.runner }}
permissions:
contents: read
packages: write
id-token: write
strategy:
fail-fast: false
matrix:
include:
- platform: linux/amd64
runner: ubuntu-latest
- platform: linux/arm64
runner: ubuntu-24.04-arm
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Lowercase image name
run: echo "IMAGE_NAME=${IMAGE_NAME,,}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3.0.0
@@ -78,6 +161,69 @@ jobs:
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract Docker metadata
id: meta
uses: docker/metadata-action@v5.0.0
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}-backend
- name: Build and push Docker image by digest
id: build
uses: docker/build-push-action@v5.0.0
with:
context: ./backend
platforms: ${{ matrix.platform }}
push: ${{ github.event_name != 'pull_request' }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha,scope=backend-${{ matrix.platform }}
cache-to: type=gha,mode=max,scope=backend-${{ matrix.platform }}
outputs: type=image,name=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}-backend,push-by-digest=true,name-canonical=true,push=${{ github.event_name != 'pull_request' }}
- name: Export digest
if: github.event_name != 'pull_request'
run: |
mkdir -p /tmp/digests/backend
digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/backend/${digest#sha256:}"
- name: Upload digest
if: github.event_name != 'pull_request'
uses: actions/upload-artifact@v4
with:
name: digests-backend-${{ matrix.platform == 'linux/amd64' && 'amd64' || 'arm64' }}
path: /tmp/digests/backend/*
if-no-files-found: error
retention-days: 1
merge-backend:
runs-on: ubuntu-latest
if: github.event_name != 'pull_request'
needs: build-backend
permissions:
contents: read
packages: write
steps:
- name: Lowercase image name
run: echo "IMAGE_NAME=${IMAGE_NAME,,}" >> $GITHUB_ENV
- name: Download digests
uses: actions/download-artifact@v4
with:
path: /tmp/digests/backend
pattern: digests-backend-*
merge-multiple: true
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3.0.0
- name: Log into registry ${{ env.REGISTRY }}
uses: docker/login-action@v3.0.0
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract Docker metadata
id: meta
uses: docker/metadata-action@v5.0.0
@@ -88,13 +234,9 @@ jobs:
type=semver,pattern={{major}}.{{minor}}
type=raw,value=latest,enable={{is_default_branch}}
- name: Build and push Docker image
id: build-and-push
uses: docker/build-push-action@v5.0.0
with:
context: ./backend
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Create and push manifest
working-directory: /tmp/digests/backend
run: |
docker buildx imagetools create \
$(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf '${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}-backend@sha256:%s ' *)
+3
@@ -94,3 +94,6 @@ clean_zip.py
zip_repo.py
refactor_cesium.py
jobs.json
.claude
.mise.local.toml
+62 -21
@@ -9,7 +9,11 @@
---
![560645594-989008ee-c690-4cc0-aade-14c24ca82874](https://github.com/user-attachments/assets/5a879552-327f-4f66-81e9-4ae1cec9c468)
https://github.com/user-attachments/assets/248208ec-62f7-49d1-831d-4bd0a1fa6852
@@ -21,7 +25,7 @@ Built with **Next.js**, **MapLibre GL**, **FastAPI**, and **Python**, it's desig
## Interesting Use Cases
* Track private jets of billionaires
* Track everything from Air Force One to the private jets of billionaires, dictators, and corporations
* Monitor satellites passing overhead and see high-resolution satellite imagery
* Nose around local emergency scanners
* Watch naval traffic worldwide
@@ -219,36 +223,76 @@ cd Shadowbroker
Open `http://localhost:3000` to view the dashboard.
> **Deploying publicly or on a LAN?** The frontend **auto-detects** the
> backend — it uses your browser's hostname with port `8000`
> (e.g. if you visit `http://192.168.1.50:3000`, API calls go to
> `http://192.168.1.50:8000`). **No configuration needed** for most setups.
> **Deploying publicly or on a LAN?** No configuration needed for most setups.
> The frontend proxies all API calls through the Next.js server to `BACKEND_URL`,
> which defaults to `http://backend:8000` (Docker internal networking).
> Port 8000 does not need to be exposed externally.
>
> If your backend runs on a **different port or host** (reverse proxy,
> custom Docker port mapping, separate server), set `NEXT_PUBLIC_API_URL`:
> If your backend runs on a **different host or port**, set `BACKEND_URL` at runtime — no rebuild required:
>
> ```bash
> # Linux / macOS
> NEXT_PUBLIC_API_URL=http://myserver.com:9096 docker-compose up -d --build
> BACKEND_URL=http://myserver.com:9096 docker-compose up -d
>
> # Podman (via compose.sh wrapper)
> NEXT_PUBLIC_API_URL=http://192.168.1.50:9096 ./compose.sh up -d --build
> BACKEND_URL=http://192.168.1.50:9096 ./compose.sh up -d
>
> # Windows (PowerShell)
> $env:NEXT_PUBLIC_API_URL="http://myserver.com:9096"; docker-compose up -d --build
> $env:BACKEND_URL="http://myserver.com:9096"; docker-compose up -d
>
> # Or add to a .env file next to docker-compose.yml:
> # NEXT_PUBLIC_API_URL=http://myserver.com:9096
> # BACKEND_URL=http://myserver.com:9096
> ```
>
> This is a **build-time** variable (Next.js limitation) — it gets baked into
> the frontend during `npm run build`. Changing it requires a rebuild.
If you prefer to call the container engine directly, Podman users can run `podman compose up -d`, or force the wrapper to use Podman with `./compose.sh --engine podman up -d`.
Depending on your local Podman configuration, `podman compose` may still delegate to an external compose provider while talking to the Podman socket.
---
### 🐋 Standalone Deploy (Portainer, Uncloud, NAS, etc.)
No need to clone the repo. Use the pre-built images published to the GitHub Container Registry.
Create a `docker-compose.yml` with the following content and deploy it directly — paste it into Portainer's stack editor, `uncloud deploy`, or any Docker host:
```yaml
services:
  backend:
    image: ghcr.io/bigbodycobain/shadowbroker-backend:latest
    container_name: shadowbroker-backend
    ports:
      - "8000:8000"
    environment:
      - AIS_API_KEY=your_aisstream_key # Required — get one free at aisstream.io
      - OPENSKY_CLIENT_ID= # Optional — higher flight data rate limits
      - OPENSKY_CLIENT_SECRET= # Optional — paired with Client ID above
      - LTA_ACCOUNT_KEY= # Optional — Singapore CCTV cameras
      - CORS_ORIGINS= # Optional — comma-separated allowed origins
    volumes:
      - backend_data:/app/data
    restart: unless-stopped
  frontend:
    image: ghcr.io/bigbodycobain/shadowbroker-frontend:latest
    container_name: shadowbroker-frontend
    ports:
      - "3000:3000"
    environment:
      - BACKEND_URL=http://backend:8000 # Docker internal networking — no rebuild needed
    depends_on:
      - backend
    restart: unless-stopped
volumes:
  backend_data:
```
> **How it works:** The frontend container proxies all `/api/*` requests through the Next.js server to `BACKEND_URL` using Docker's internal networking. The browser only ever talks to port 3000 — port 8000 does not need to be exposed externally.
>
> `BACKEND_URL` is a plain runtime environment variable (not a build-time `NEXT_PUBLIC_*`), so you can change it in Portainer, Uncloud, or any compose editor without rebuilding the image. Set it to the address where your backend is reachable from inside the Docker network (e.g. `http://backend:8000`, `http://192.168.1.50:8000`).
---
### 📦 Quick Start (No Code Required)
If you just want to run the dashboard without dealing with terminal commands:
@@ -420,16 +464,13 @@ OPENSKY_CLIENT_SECRET=your_opensky_secret # OAuth2 — paired with Client ID
LTA_ACCOUNT_KEY=your_lta_key # Singapore CCTV cameras
```
### Frontend (optional)
### Frontend
| Variable | Where to set | Purpose |
|---|---|---|
| `NEXT_PUBLIC_API_URL` | `.env` next to `docker-compose.yml`, or shell env | Override backend URL when deploying publicly or behind a reverse proxy. Leave unset for auto-detection. |
| `BACKEND_URL` | `environment` in `docker-compose.yml`, or shell env | URL the Next.js server uses to proxy API calls to the backend. Defaults to `http://backend:8000`. **Runtime variable — no rebuild needed.** |
**How auto-detection works:** When `NEXT_PUBLIC_API_URL` is not set, the frontend
reads `window.location.hostname` in the browser and calls `{protocol}//{hostname}:8000`.
This means the dashboard works on `localhost`, LAN IPs, and public domains without
any configuration — as long as the backend is reachable on port 8000 of the same host.
**How it works:** The frontend proxies all `/api/*` requests through the Next.js server to `BACKEND_URL` using Docker's internal networking. Browsers only talk to port 3000; port 8000 never needs to be exposed externally. For local dev without Docker, `BACKEND_URL` defaults to `http://localhost:8000`.
---
+15
@@ -0,0 +1,15 @@
# ShadowBroker Backend — Environment Variables
# Copy this file to .env and fill in your keys:
# cp .env.example .env
# ── Required Keys ──────────────────────────────────────────────
# Without these, the corresponding data layers will be empty.
OPENSKY_CLIENT_ID= # https://opensky-network.org/ — free account, OAuth2 client ID
OPENSKY_CLIENT_SECRET= # OAuth2 client secret from your OpenSky dashboard
AIS_API_KEY= # https://aisstream.io/ — free tier WebSocket key
# ── Optional ───────────────────────────────────────────────────
# Override allowed CORS origins (comma-separated). Defaults to localhost + LAN auto-detect.
# CORS_ORIGINS=http://192.168.1.50:3000,https://my-domain.com
+6 -4
@@ -9,16 +9,18 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
&& apt-get install -y --no-install-recommends nodejs \
&& rm -rf /var/lib/apt/lists/*
# Install dependencies
# Install Python dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Install Node.js dependencies (ws module for AIS WebSocket proxy)
# Copy manifests first so this layer is cached unless deps change
COPY package*.json ./
RUN npm install --omit=dev
# Copy source code
COPY . .
# Install Node.js dependencies (ws module for AIS WebSocket proxy)
RUN npm install --omit=dev
# Create a non-root user for security
RUN adduser --system --uid 1001 backenduser \
&& chown -R backenduser /app
+41 -9
@@ -1,12 +1,44 @@
{
"feeds": [
{ "name": "NPR", "url": "https://feeds.npr.org/1004/rss.xml", "weight": 4 },
{ "name": "BBC", "url": "http://feeds.bbci.co.uk/news/world/rss.xml", "weight": 3 },
{ "name": "AlJazeera", "url": "https://www.aljazeera.com/xml/rss/all.xml", "weight": 2 },
{ "name": "NYT", "url": "https://rss.nytimes.com/services/xml/rss/nyt/World.xml", "weight": 1 },
{ "name": "GDACS", "url": "https://www.gdacs.org/xml/rss.xml", "weight": 5 },
{ "name": "NHK", "url": "https://www3.nhk.or.jp/nhkworld/rss/world.xml", "weight": 3 },
{ "name": "CNA", "url": "https://www.channelnewsasia.com/rssfeed/8395986", "weight": 3 },
{ "name": "Mercopress", "url": "https://en.mercopress.com/rss/", "weight": 3 }
{
"name": "NPR",
"url": "https://feeds.npr.org/1004/rss.xml",
"weight": 4
},
{
"name": "BBC",
"url": "http://feeds.bbci.co.uk/news/world/rss.xml",
"weight": 3
},
{
"name": "AlJazeera",
"url": "https://www.aljazeera.com/xml/rss/all.xml",
"weight": 2
},
{
"name": "NYT",
"url": "https://rss.nytimes.com/services/xml/rss/nyt/World.xml",
"weight": 1
},
{
"name": "GDACS",
"url": "https://www.gdacs.org/xml/rss.xml",
"weight": 5
},
{
"name": "NHK",
"url": "https://www3.nhk.or.jp/nhkworld/rss/world.xml",
"weight": 3
},
{
"name": "CNA",
"url": "https://www.channelnewsasia.com/rssfeed/8395986",
"weight": 3
},
{
"name": "Mercopress",
"url": "https://en.mercopress.com/rss/",
"weight": 3
}
]
}
}
@@ -0,0 +1 @@
430ac93c4f7c4fb5a3e596ec38e3b7794c731cc1
@@ -0,0 +1 @@
476b691be156eb4fe6a6ad80f882c1dbaded8c33
File diff suppressed because one or more lines are too long
-1
@@ -1 +0,0 @@
{}
@@ -0,0 +1 @@
38a18cbbf1acbec5eb9266b809c28d31e2941c53
File diff suppressed because one or more lines are too long
+166
@@ -0,0 +1,166 @@
"""
Geocode data center street addresses via Nominatim (OpenStreetMap).
Rate limit: 1 request/second (Nominatim policy).
Resumable: caches results in geocode_cache.json so interrupted runs can continue.
"""
import json
import time
import urllib.request
import urllib.parse
import os
import sys
# Fix Windows console encoding + force unbuffered output
if sys.platform == "win32":
sys.stdout.reconfigure(encoding="utf-8", errors="replace")
sys.stderr.reconfigure(encoding="utf-8", errors="replace")
# Force line-buffered stdout for detached processes
class Unbuffered:
def __init__(self, stream):
self.stream = stream
def write(self, data):
self.stream.write(data)
self.stream.flush()
def writelines(self, datas):
self.stream.writelines(datas)
self.stream.flush()
def __getattr__(self, attr):
return getattr(self.stream, attr)
sys.stdout = Unbuffered(sys.stdout)
DATA_FILE = os.path.join(os.path.dirname(__file__), "data", "datacenters.json")
CACHE_FILE = os.path.join(os.path.dirname(__file__), "data", "geocode_cache.json")
OUTPUT_FILE = os.path.join(os.path.dirname(__file__), "data", "datacenters_geocoded.json")
NOMINATIM_URL = "https://nominatim.openstreetmap.org/search"
USER_AGENT = "ShadowBroker-DataCenterGeocoder/1.0"
def geocode_address(address: str, retries: int = 3) -> tuple[float, float] | None:
"""Geocode a single address via Nominatim. Returns (lat, lng) or None."""
params = urllib.parse.urlencode({"q": address, "format": "json", "limit": 1})
url = f"{NOMINATIM_URL}?{params}"
for attempt in range(retries):
req = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
try:
resp = urllib.request.urlopen(req, timeout=15)
data = json.loads(resp.read())
if data:
return float(data[0]["lat"]), float(data[0]["lon"])
return None # Valid response but no results
except Exception as e:
if attempt < retries - 1:
wait = 2 ** (attempt + 1)
print(f" RETRY ({attempt+1}/{retries}): {e} — waiting {wait}s")
time.sleep(wait)
else:
print(f" ERROR (gave up after {retries} attempts): {e}")
return None
def main():
with open(DATA_FILE, "r", encoding="utf-8") as f:
dcs = json.load(f)
# Load cache
cache = {}
if os.path.exists(CACHE_FILE):
with open(CACHE_FILE, "r", encoding="utf-8") as f:
cache = json.load(f)
print(f"Loaded {len(cache)} cached geocode results")
# Filter to DCs with real street addresses
to_geocode = []
skipped = 0
for i, dc in enumerate(dcs):
street = (dc.get("street") or "").strip()
if not street or len(street) <= 3 or street.lower() in ("tbc", "n/a", "na", "-"):
skipped += 1
continue
to_geocode.append((i, dc))
print(f"Total DCs: {len(dcs)}")
print(f"Skipped (no real address): {skipped}")
print(f"To geocode: {len(to_geocode)}")
# Count how many already cached
already_cached = sum(1 for _, dc in to_geocode if dc.get("address", "") in cache)
need_api = len(to_geocode) - already_cached
print(f"Already cached: {already_cached}")
print(f"Need API calls: {need_api}")
if need_api > 0:
print(f"Estimated time: {need_api // 60}m {need_api % 60}s")
print()
geocoded = 0
failed = 0
api_calls = 0
save_interval = 50 # Save cache every 50 API calls
for idx, (i, dc) in enumerate(to_geocode):
address = dc.get("address", "").strip()
if not address:
# Build address from parts
parts = [dc.get("street", ""), dc.get("zip", ""), dc.get("city", ""), dc.get("country", "")]
address = " ".join(p.strip() for p in parts if p and p.strip())
if not address:
failed += 1
continue
# Check cache first
if address in cache:
result = cache[address]
if result:
dcs[i]["lat"] = result[0]
dcs[i]["lng"] = result[1]
dcs[i]["geocode_source"] = "nominatim"
geocoded += 1
else:
failed += 1
continue
# API call — Nominatim requires 1 req/s, use 1.5s to avoid 429s after heavy use
time.sleep(1.5)
coords = geocode_address(address)
api_calls += 1
if coords:
cache[address] = coords
dcs[i]["lat"] = coords[0]
dcs[i]["lng"] = coords[1]
dcs[i]["geocode_source"] = "nominatim"
geocoded += 1
print(f"[{api_calls}/{need_api}] OK: {dc.get('name', '?')} -> ({coords[0]:.4f}, {coords[1]:.4f})")
else:
cache[address] = None
failed += 1
print(f"[{api_calls}/{need_api}] FAIL: {dc.get('name', '?')} | {address}")
# Periodic cache save
if api_calls % save_interval == 0:
with open(CACHE_FILE, "w", encoding="utf-8") as f:
json.dump(cache, f)
print(f" -- Cache saved ({len(cache)} entries) --")
# Final save
with open(CACHE_FILE, "w", encoding="utf-8") as f:
json.dump(cache, f)
# Write output - only DCs with real coordinates
output = [dc for dc in dcs if dc.get("lat") is not None and dc.get("lng") is not None]
with open(OUTPUT_FILE, "w", encoding="utf-8") as f:
json.dump(output, f, indent=2)
print(f"\nDone!")
print(f"Geocoded: {geocoded}")
print(f"Failed: {failed}")
print(f"API calls made: {api_calls}")
print(f"Output: {len(output)} DCs with coordinates -> {OUTPUT_FILE}")
if __name__ == "__main__":
main()
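The cache-first loop above boils down to one reusable pattern: check the cache, sleep to respect the rate limit, call the API, and cache failures as well as successes so they are never retried. A minimal sketch of that pattern; `geocode_fn` is a stub standing in for the real Nominatim call:

```python
import time

def geocode_cached(address, cache, geocode_fn, min_interval=1.5):
    """Return (lat, lng) or None, consulting the cache before the API.
    Failed lookups are cached as None so they are never retried."""
    if address in cache:
        return cache[address]      # hit: no API call, no sleep
    time.sleep(min_interval)       # stay under Nominatim's 1 req/s limit
    coords = geocode_fn(address)   # may return None on failure
    cache[address] = coords        # cache successes AND failures
    return coords

# Usage with a stub geocoder (the real one wraps Nominatim)
cache = {"Known St 1, Berlin": (52.52, 13.40)}
stub = lambda addr: (1.0, 2.0) if "Main" in addr else None
print(geocode_cached("Known St 1, Berlin", cache, stub))  # cache hit, no sleep
```

Caching `None` for failed addresses is what keeps re-runs fast: the second pass only pays the 1.5s sleep for genuinely new addresses.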
+59 -33
@@ -1,3 +1,40 @@
import os
import logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Docker Swarm Secrets support
# For each VAR below, if VAR_FILE is set (e.g. AIS_API_KEY_FILE=/run/secrets/AIS_API_KEY),
# the file is read and its trimmed content is placed into VAR.
# This MUST run before service imports — modules read os.environ at import time.
# ---------------------------------------------------------------------------
_SECRET_VARS = [
"AIS_API_KEY",
"OPENSKY_CLIENT_ID",
"OPENSKY_CLIENT_SECRET",
"LTA_ACCOUNT_KEY",
"CORS_ORIGINS",
]
for _var in _SECRET_VARS:
_file_var = f"{_var}_FILE"
_file_path = os.environ.get(_file_var)
if _file_path:
try:
with open(_file_path, "r") as _f:
_value = _f.read().strip()
if _value:
os.environ[_var] = _value
logger.info(f"Loaded secret {_var} from {_file_path}")
else:
logger.warning(f"Secret file {_file_path} for {_var} is empty")
except FileNotFoundError:
logger.error(f"Secret file {_file_path} for {_var} not found")
except Exception as _e:
logger.error(f"Failed to read secret file {_file_path} for {_var}: {_e}")
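The `*_FILE` indirection above follows the standard Docker secrets convention. The pattern can be checked in isolation with a throwaway file standing in for `/run/secrets/...` and a plain dict standing in for `os.environ`:

```python
import tempfile

def load_file_secrets(var_names, environ):
    """For each VAR, if VAR_FILE points at a readable file,
    copy its stripped contents into VAR."""
    for var in var_names:
        path = environ.get(f"{var}_FILE")
        if not path:
            continue
        try:
            with open(path, "r") as fh:
                value = fh.read().strip()
            if value:
                environ[var] = value
        except OSError:
            pass  # missing/unreadable file: leave VAR untouched

# Demo: a temp file plays the role of /run/secrets/AIS_API_KEY
with tempfile.NamedTemporaryFile("w", suffix=".secret", delete=False) as f:
    f.write("s3cret\n")
env = {"AIS_API_KEY_FILE": f.name}
load_file_secrets(["AIS_API_KEY"], env)
print(env["AIS_API_KEY"])  # s3cret
```

As the comment in the module notes, this must run before any service import, because downstream modules read `os.environ` at import time.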
from fastapi import FastAPI, Request, Response
from fastapi.middleware.cors import CORSMiddleware
from contextlib import asynccontextmanager
@@ -5,14 +42,10 @@ from services.data_fetcher import start_scheduler, stop_scheduler, get_latest_da
from services.ais_stream import start_ais_stream, stop_ais_stream
from services.carrier_tracker import start_carrier_tracker, stop_carrier_tracker
import uvicorn
import logging
import hashlib
import json as json_mod
import os
import socket
logging.basicConfig(level=logging.INFO)
def _build_cors_origins():
"""Build a CORS origins whitelist: localhost + LAN IPs + env overrides.
@@ -77,6 +110,15 @@ async def force_refresh():
async def live_data():
return get_latest_data()
def _etag_response(request: Request, payload: dict, prefix: str = "", default=None):
"""Serialize once, hash the bytes for ETag, return 304 or full response."""
content = json_mod.dumps(payload, default=default)
etag = hashlib.md5(f"{prefix}{content}".encode()).hexdigest()[:16]  # hash the full body; truncating would let deep changes slip past the ETag
if request.headers.get("if-none-match") == etag:
return Response(status_code=304, headers={"ETag": etag, "Cache-Control": "no-cache"})
return Response(content=content, media_type="application/json",
headers={"ETag": etag, "Cache-Control": "no-cache"})
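The 304 handshake used by `_etag_response` can be exercised without FastAPI. This standalone sketch is a simplification of the helper: it hashes the full serialized body and assumes the client echoes the `ETag` back in `If-None-Match`:

```python
import hashlib
import json

def make_etag(payload, prefix=""):
    # Hash the full serialized body so any change produces a new tag
    content = json.dumps(payload, sort_keys=True)
    return hashlib.md5((prefix + content).encode()).hexdigest()[:16]

def respond(payload, if_none_match=None, prefix=""):
    """Return (status, body, etag): 304 with no body on a tag match."""
    etag = make_etag(payload, prefix)
    if if_none_match == etag:
        return 304, None, etag   # client cache is still valid
    return 200, json.dumps(payload, sort_keys=True), etag

# First request gets the full body; the second echoes the tag and gets 304
status, body, tag = respond({"ships": [1, 2]})
status2, body2, _ = respond({"ships": [1, 2]}, if_none_match=tag)
print(status, status2)  # 200 304
```

The `prefix` argument plays the same role as the `"fast|"` / `"slow|"` prefixes in the endpoints: two endpoints serving overlapping payloads can never collide on a tag.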
@app.get("/api/live-data/fast")
async def live_data_fast(request: Request):
d = get_latest_data()
@@ -87,25 +129,15 @@ async def live_data_fast(request: Request):
"private_jets": d.get("private_jets", []),
"tracked_flights": d.get("tracked_flights", []),
"ships": d.get("ships", []),
"satellites": d.get("satellites", []),
"cctv": d.get("cctv", []),
"uavs": d.get("uavs", []),
"liveuamap": d.get("liveuamap", []),
"gps_jamming": d.get("gps_jamming", []),
"satellites": d.get("satellites", []),
"satellite_source": d.get("satellite_source", "none"),
"freshness": dict(source_timestamps),
}
# ETag includes last_updated timestamp so it changes on every data refresh,
# not just when item counts change (old bug: positions went stale)
last_updated = d.get("last_updated", "")
counts = "|".join(f"{k}:{len(v) if isinstance(v, list) else 0}" for k, v in payload.items() if k != "freshness")
etag = hashlib.md5(f"{last_updated}|{counts}".encode()).hexdigest()[:16]
if request.headers.get("if-none-match") == etag:
return Response(status_code=304, headers={"ETag": etag, "Cache-Control": "no-cache"})
return Response(
content=json_mod.dumps(payload),
media_type="application/json",
headers={"ETag": etag, "Cache-Control": "no-cache"}
)
return _etag_response(request, payload, prefix="fast|")
@app.get("/api/live-data/slow")
async def live_data_slow(request: Request):
@@ -129,17 +161,7 @@ async def live_data_slow(request: Request):
"datacenters": d.get("datacenters", []),
"freshness": dict(source_timestamps),
}
# ETag based on last_updated + item counts
last_updated = d.get("last_updated", "")
counts = "|".join(f"{k}:{len(v) if isinstance(v, list) else 0}" for k, v in payload.items() if k != "freshness")
etag = hashlib.md5(f"slow|{last_updated}|{counts}".encode()).hexdigest()[:16]
if request.headers.get("if-none-match") == etag:
return Response(status_code=304, headers={"ETag": etag, "Cache-Control": "no-cache"})
return Response(
content=json_mod.dumps(payload, default=str),
media_type="application/json",
headers={"ETag": etag, "Cache-Control": "no-cache"}
)
return _etag_response(request, payload, prefix="slow|", default=str)
@app.get("/api/debug-latest")
async def debug_latest_data():
@@ -200,9 +222,9 @@ async def api_get_nearest_radios_list(lat: float, lng: float, limit: int = 5):
from services.network_utils import fetch_with_curl
@app.get("/api/route/{callsign}")
async def get_flight_route(callsign: str):
r = fetch_with_curl("https://api.adsb.lol/api/0/routeset", method="POST", json_data={"planes": [{"callsign": callsign}]}, timeout=10)
if r.status_code == 200:
async def get_flight_route(callsign: str, lat: float = 0.0, lng: float = 0.0):
r = fetch_with_curl("https://api.adsb.lol/api/0/routeset", method="POST", json_data={"planes": [{"callsign": callsign, "lat": lat, "lng": lng}]}, timeout=10)
if r and r.status_code == 200:
data = r.json()
route_list = []
if isinstance(data, dict):
@@ -214,9 +236,13 @@ async def get_flight_route(callsign: str):
route = route_list[0]
airports = route.get("_airports", [])
if len(airports) >= 2:
orig = airports[0]
dest = airports[-1]
return {
"orig_loc": [airports[0].get("lon", 0), airports[0].get("lat", 0)],
"dest_loc": [airports[-1].get("lon", 0), airports[-1].get("lat", 0)]
"orig_loc": [orig.get("lon", 0), orig.get("lat", 0)],
"dest_loc": [dest.get("lon", 0), dest.get("lat", 0)],
"origin_name": f"{orig.get('iata', '') or orig.get('icao', '')}: {orig.get('name', 'Unknown')}",
"dest_name": f"{dest.get('iata', '') or dest.get('icao', '')}: {dest.get('name', 'Unknown')}",
}
return {}
+27 -24
@@ -238,49 +238,51 @@ def _ais_stream_loop():
logger.info("AIS Stream proxy started — receiving vessel data")
msg_count = 0
ok_streak = 0 # Track consecutive successful messages for backoff reset
last_log_time = time.time()
for raw_msg in iter(process.stdout.readline, ''):
if not _ws_running:
process.terminate()
break
raw_msg = raw_msg.strip()
if not raw_msg:
continue
try:
data = json.loads(raw_msg)
except json.JSONDecodeError:
continue
if "error" in data:
logger.error(f"AIS Stream error: {data['error']}")
continue
msg_type = data.get("MessageType", "")
metadata = data.get("MetaData", {})
message = data.get("Message", {})
mmsi = metadata.get("MMSI", 0)
if not mmsi:
continue
with _vessels_lock:
if mmsi not in _vessels:
_vessels[mmsi] = {"_updated": time.time()}
vessel = _vessels[mmsi]
# Update position from PositionReport or StandardClassBPositionReport
if msg_type in ("PositionReport", "StandardClassBPositionReport"):
report = message.get(msg_type, {})
lat = report.get("Latitude", metadata.get("latitude", 0))
lng = report.get("Longitude", metadata.get("longitude", 0))
# Skip invalid positions
if lat == 0 and lng == 0:
continue
if abs(lat) > 90 or abs(lng) > 180:
continue
with _vessels_lock:
vessel["lat"] = lat
vessel["lng"] = lng
@@ -292,12 +294,12 @@ def _ais_stream_loop():
# Use metadata name if we don't have one yet
if not vessel.get("name") or vessel["name"] == "UNKNOWN":
vessel["name"] = metadata.get("ShipName", "UNKNOWN").strip() or "UNKNOWN"
# Update static data from ShipStaticData
elif msg_type == "ShipStaticData":
static = message.get("ShipStaticData", {})
ais_type = static.get("Type", 0)
with _vessels_lock:
vessel["name"] = (static.get("Name", "") or metadata.get("ShipName", "UNKNOWN")).strip() or "UNKNOWN"
vessel["callsign"] = (static.get("CallSign", "") or "").strip()
@@ -306,21 +308,24 @@ def _ais_stream_loop():
vessel["ais_type_code"] = ais_type
vessel["type"] = classify_vessel(ais_type, mmsi)
vessel["_updated"] = time.time()
msg_count += 1
if msg_count % 5000 == 0:
ok_streak += 1
# Reset backoff after 200 consecutive successful messages
if ok_streak >= 200 and backoff > 1:
backoff = 1
ok_streak = 0
# Periodic logging + cache save (time-based instead of count-based to avoid lock in hot loop)
now = time.time()
if now - last_log_time >= 60:
with _vessels_lock:
# Inline pruning: remove vessels not updated in 15 minutes
prune_cutoff = time.time() - 900
stale = [k for k, v in _vessels.items() if v.get("_updated", 0) < prune_cutoff]
for k in stale:
del _vessels[k]
count = len(_vessels)
if stale:
logger.info(f"AIS pruned {len(stale)} stale vessels")
logger.info(f"AIS Stream: processed {msg_count} messages, tracking {count} vessels")
_save_cache() # Auto-save every 5000 messages (~60 seconds)
_save_cache()
last_log_time = now
except Exception as e:
logger.error(f"AIS proxy connection error: {e}")
if _ws_running:
@@ -328,8 +333,6 @@ def _ais_stream_loop():
time.sleep(backoff)
backoff = min(backoff * 2, 60) # Double up to 60s max
continue
# Reset backoff on successful connection (got at least some messages)
backoff = 1
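The reconnect strategy above (double the delay on each failure, cap it, and reset after a long enough streak of good messages) is classic exponential backoff. A hedged sketch of just that state machine, with the 60s cap and 200-message streak from the code:

```python
def next_backoff(current, failed, ok_streak, max_delay=60, streak_reset=200):
    """Return (new_delay, new_streak) after one iteration.
    Failure doubles the delay up to max_delay; a long enough
    run of successes resets it to 1s."""
    if failed:
        return min(current * 2, max_delay), 0
    ok_streak += 1
    if ok_streak >= streak_reset:
        return 1, 0
    return current, ok_streak

delay, streak = 1, 0
for _ in range(8):  # 8 consecutive failures
    delay, streak = next_backoff(delay, True, streak)
print(delay)  # 60 (delay doubles 1, 2, 4, 8, 16, 32, then caps at 60)
```

Resetting only after a streak, rather than on the first good message, avoids flapping back to 1s when the upstream is intermittently healthy.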
def _run_ais_loop():
+150 -86
@@ -26,104 +26,116 @@ logger = logging.getLogger(__name__)
# Carrier registry: hull number → metadata + fallback position
# -----------------------------------------------------------------
CARRIER_REGISTRY: Dict[str, dict] = {
# Fallback positions sourced from USNI News Fleet & Marine Tracker (Mar 9, 2026)
# https://news.usni.org/2026/03/09/usni-news-fleet-and-marine-tracker-march-9-2026
# --- Bremerton, WA (Naval Base Kitsap) ---
# Distinct pier positions along Sinclair Inlet so carriers don't stack
"CVN-68": {
"name": "USS Nimitz (CVN-68)",
"wiki": "https://en.wikipedia.org/wiki/USS_Nimitz",
"homeport": "Bremerton, WA",
"homeport_lat": 47.56, "homeport_lng": -122.63,
"fallback_lat": 21.35, "fallback_lng": -157.95,
"fallback_heading": 270,
"fallback_desc": "Pacific Fleet / Pearl Harbor"
},
"CVN-69": {
"name": "USS Dwight D. Eisenhower (CVN-69)",
"wiki": "https://en.wikipedia.org/wiki/USS_Dwight_D._Eisenhower",
"homeport": "Norfolk, VA",
"homeport_lat": 36.95, "homeport_lng": -76.33,
"fallback_lat": 18.0, "fallback_lng": 39.5,
"fallback_heading": 120,
"fallback_desc": "Red Sea / CENTCOM AOR"
},
"CVN-78": {
"name": "USS Gerald R. Ford (CVN-78)",
"wiki": "https://en.wikipedia.org/wiki/USS_Gerald_R._Ford",
"homeport": "Norfolk, VA",
"homeport_lat": 36.95, "homeport_lng": -76.33,
"fallback_lat": 34.0, "fallback_lng": 25.0,
"homeport_lat": 47.5535, "homeport_lng": -122.6400,
"fallback_lat": 47.5535, "fallback_lng": -122.6400,
"fallback_heading": 90,
"fallback_desc": "Eastern Mediterranean deterrence"
},
"CVN-70": {
"name": "USS Carl Vinson (CVN-70)",
"wiki": "https://en.wikipedia.org/wiki/USS_Carl_Vinson",
"homeport": "San Diego, CA",
"homeport_lat": 32.68, "homeport_lng": -117.15,
"fallback_lat": 15.0, "fallback_lng": 115.0,
"fallback_heading": 45,
"fallback_desc": "South China Sea patrol"
},
"CVN-71": {
"name": "USS Theodore Roosevelt (CVN-71)",
"wiki": "https://en.wikipedia.org/wiki/USS_Theodore_Roosevelt_(CVN-71)",
"homeport": "San Diego, CA",
"homeport_lat": 32.68, "homeport_lng": -117.15,
"fallback_lat": 22.0, "fallback_lng": 122.0,
"fallback_heading": 300,
"fallback_desc": "Philippine Sea / Taiwan Strait"
},
"CVN-72": {
"name": "USS Abraham Lincoln (CVN-72)",
"wiki": "https://en.wikipedia.org/wiki/USS_Abraham_Lincoln_(CVN-72)",
"homeport": "San Diego, CA",
"homeport_lat": 32.68, "homeport_lng": -117.15,
"fallback_lat": 21.0, "fallback_lng": -158.0,
"fallback_heading": 270,
"fallback_desc": "Pacific deployment"
},
"CVN-73": {
"name": "USS George Washington (CVN-73)",
"wiki": "https://en.wikipedia.org/wiki/USS_George_Washington_(CVN-73)",
"homeport": "Yokosuka, Japan",
"homeport_lat": 35.28, "homeport_lng": 139.67,
"fallback_lat": 35.0, "fallback_lng": 139.0,
"fallback_heading": 0,
"fallback_desc": "Yokosuka, Japan (Forward deployed)"
},
"CVN-74": {
"name": "USS John C. Stennis (CVN-74)",
"wiki": "https://en.wikipedia.org/wiki/USS_John_C._Stennis",
"homeport": "Norfolk, VA",
"homeport_lat": 36.95, "homeport_lng": -76.33,
"fallback_lat": 36.95, "fallback_lng": -76.33,
"fallback_heading": 0,
"fallback_desc": "RCOH / Norfolk (maintenance)"
},
"CVN-75": {
"name": "USS Harry S. Truman (CVN-75)",
"wiki": "https://en.wikipedia.org/wiki/USS_Harry_S._Truman",
"homeport": "Norfolk, VA",
"homeport_lat": 36.95, "homeport_lng": -76.33,
"fallback_lat": 36.0, "fallback_lng": 15.0,
"fallback_heading": 90,
"fallback_desc": "Mediterranean deployment"
"fallback_desc": "Bremerton, WA (Maintenance)"
},
"CVN-76": {
"name": "USS Ronald Reagan (CVN-76)",
"wiki": "https://en.wikipedia.org/wiki/USS_Ronald_Reagan",
"homeport": "Bremerton, WA",
"homeport_lat": 47.56, "homeport_lng": -122.63,
"fallback_lat": 47.56, "fallback_lng": -122.63,
"homeport_lat": 47.5580, "homeport_lng": -122.6360,
"fallback_lat": 47.5580, "fallback_lng": -122.6360,
"fallback_heading": 90,
"fallback_desc": "Bremerton, WA (Decommissioning)"
},
# --- Norfolk, VA (Naval Station Norfolk) ---
# Piers run N-S along Willoughby Bay; each carrier gets a distinct berth
"CVN-69": {
"name": "USS Dwight D. Eisenhower (CVN-69)",
"wiki": "https://en.wikipedia.org/wiki/USS_Dwight_D._Eisenhower",
"homeport": "Norfolk, VA",
"homeport_lat": 36.9465, "homeport_lng": -76.3265,
"fallback_lat": 36.9465, "fallback_lng": -76.3265,
"fallback_heading": 0,
"fallback_desc": "Bremerton, WA (Homeport)"
"fallback_desc": "Norfolk, VA (Post-deployment maintenance)"
},
"CVN-78": {
"name": "USS Gerald R. Ford (CVN-78)",
"wiki": "https://en.wikipedia.org/wiki/USS_Gerald_R._Ford",
"homeport": "Norfolk, VA",
"homeport_lat": 36.9505, "homeport_lng": -76.3250,
"fallback_lat": 18.0, "fallback_lng": 39.5,
"fallback_heading": 0,
"fallback_desc": "Red Sea — Operation Epic Fury (USNI Mar 9)"
},
"CVN-74": {
"name": "USS John C. Stennis (CVN-74)",
"wiki": "https://en.wikipedia.org/wiki/USS_John_C._Stennis",
"homeport": "Norfolk, VA",
"homeport_lat": 36.9540, "homeport_lng": -76.3235,
"fallback_lat": 36.98, "fallback_lng": -76.43,
"fallback_heading": 0,
"fallback_desc": "Newport News, VA (RCOH refueling overhaul)"
},
"CVN-75": {
"name": "USS Harry S. Truman (CVN-75)",
"wiki": "https://en.wikipedia.org/wiki/USS_Harry_S._Truman",
"homeport": "Norfolk, VA",
"homeport_lat": 36.9580, "homeport_lng": -76.3220,
"fallback_lat": 36.0, "fallback_lng": 15.0,
"fallback_heading": 0,
"fallback_desc": "Mediterranean Sea deployment (USNI Mar 9)"
},
"CVN-77": {
"name": "USS George H.W. Bush (CVN-77)",
"wiki": "https://en.wikipedia.org/wiki/USS_George_H.W._Bush",
"homeport": "Norfolk, VA",
"homeport_lat": 36.95, "homeport_lng": -76.33,
"fallback_lat": 36.95, "fallback_lng": -76.33,
"homeport_lat": 36.9620, "homeport_lng": -76.3210,
"fallback_lat": 36.5, "fallback_lng": -74.0,
"fallback_heading": 0,
"fallback_desc": "Norfolk, VA (Homeport)"
"fallback_desc": "Atlantic — Pre-deployment workups (USNI Mar 9)"
},
# --- San Diego, CA (Naval Base San Diego) ---
# Carrier piers along the east shore of San Diego Bay, spread N-S
"CVN-70": {
"name": "USS Carl Vinson (CVN-70)",
"wiki": "https://en.wikipedia.org/wiki/USS_Carl_Vinson",
"homeport": "San Diego, CA",
"homeport_lat": 32.6840, "homeport_lng": -117.1290,
"fallback_lat": 32.6840, "fallback_lng": -117.1290,
"fallback_heading": 180,
"fallback_desc": "San Diego, CA (Homeport)"
},
"CVN-71": {
"name": "USS Theodore Roosevelt (CVN-71)",
"wiki": "https://en.wikipedia.org/wiki/USS_Theodore_Roosevelt_(CVN-71)",
"homeport": "San Diego, CA",
"homeport_lat": 32.6885, "homeport_lng": -117.1280,
"fallback_lat": 32.6885, "fallback_lng": -117.1280,
"fallback_heading": 180,
"fallback_desc": "San Diego, CA (Maintenance)"
},
"CVN-72": {
"name": "USS Abraham Lincoln (CVN-72)",
"wiki": "https://en.wikipedia.org/wiki/USS_Abraham_Lincoln_(CVN-72)",
"homeport": "San Diego, CA",
"homeport_lat": 32.6925, "homeport_lng": -117.1275,
"fallback_lat": 20.0, "fallback_lng": 64.0,
"fallback_heading": 0,
"fallback_desc": "Arabian Sea — Operation Epic Fury (USNI Mar 9)"
},
# --- Yokosuka, Japan (CFAY) ---
"CVN-73": {
"name": "USS George Washington (CVN-73)",
"wiki": "https://en.wikipedia.org/wiki/USS_George_Washington_(CVN-73)",
"homeport": "Yokosuka, Japan",
"homeport_lat": 35.2830, "homeport_lng": 139.6700,
"fallback_lat": 35.2830, "fallback_lng": 139.6700,
"fallback_heading": 180,
"fallback_desc": "Yokosuka, Japan (Forward deployed)"
},
}
@@ -302,7 +314,8 @@ def _parse_carrier_positions_from_news(articles: List[dict]) -> Dict[str, dict]:
"lat": coords[0],
"lng": coords[1],
"desc": title[:100],
"source": "GDELT OSINT",
"source": "GDELT News API",
"source_url": article.get("url", "https://api.gdeltproject.org"),
"updated": datetime.now(timezone.utc).isoformat()
}
logger.info(f"Carrier update: {CARRIER_REGISTRY[hull]['name']} -> {coords} (from: {title[:80]})")
@@ -316,7 +329,7 @@ def update_carrier_positions():
logger.info("Carrier tracker: updating positions from OSINT sources...")
# Start with fallback positions
# Start with fallback positions (sourced from USNI News Fleet Tracker)
positions: Dict[str, dict] = {}
for hull, info in CARRIER_REGISTRY.items():
positions[hull] = {
@@ -326,7 +339,8 @@ def update_carrier_positions():
"heading": info["fallback_heading"],
"desc": info["fallback_desc"],
"wiki": info["wiki"],
"source": "Static OSINT estimate",
"source": "USNI News Fleet & Marine Tracker",
"source_url": "https://news.usni.org/category/fleet-tracker",
"updated": datetime.now(timezone.utc).isoformat()
}
@@ -370,6 +384,55 @@ def update_carrier_positions():
logger.info(f"Carrier tracker: {len(positions)} carriers updated. Sources: {sources}")
def _deconflict_positions(result: List[dict]) -> List[dict]:
"""Offset carriers that share identical coordinates so they don't stack.
At port: snap each carrier to its own registry pier position (~500m apart).
At sea: spread carriers east-west in a line (~0.08° / ~9km apart)
so they're visibly separate but clearly operating together.
"""
# Group by rounded lat/lng (within ~0.01° ≈ 1km = same spot)
from collections import defaultdict
groups: dict[str, list[int]] = defaultdict(list)
for i, c in enumerate(result):
key = f"{round(c['lat'], 2)},{round(c['lng'], 2)}"
groups[key].append(i)
for indices in groups.values():
if len(indices) < 2:
continue
n = len(indices)
# Determine if this is a port (near a homeport) or at sea
sample = result[indices[0]]
at_port = any(
abs(sample["lat"] - info.get("homeport_lat", 0)) < 0.05
and abs(sample["lng"] - info.get("homeport_lng", 0)) < 0.05
for info in CARRIER_REGISTRY.values()
)
if at_port:
# Use each carrier's distinct homeport pier coordinates
for idx in indices:
carrier = result[idx]
hull = None
for h, info in CARRIER_REGISTRY.items():
if info["name"] == carrier["name"]:
hull = h
break
if hull:
info = CARRIER_REGISTRY[hull]
carrier["lat"] = info["homeport_lat"]
carrier["lng"] = info["homeport_lng"]
else:
# At sea: spread east-west in a line (~0.08° apart)
spacing = 0.08 # ~9km — close enough to see they're together
start_offset = -(n - 1) * spacing / 2
for j, idx in enumerate(indices):
result[idx]["lng"] += start_offset + j * spacing
return result
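The grouping key above quantizes coordinates so anything within roughly a kilometre hashes to the same bucket. A standalone demo of that bucketing plus the at-sea spread; the points and spacing are illustrative, not the real registry:

```python
from collections import defaultdict

def spread_colocated(points, spacing=0.08):
    """Fan out points that share a ~1km-rounded position,
    offsetting longitude symmetrically around the shared spot."""
    groups = defaultdict(list)
    for i, p in enumerate(points):
        groups[(round(p["lat"], 2), round(p["lng"], 2))].append(i)
    for indices in groups.values():
        n = len(indices)
        if n < 2:
            continue  # unique position, nothing to deconflict
        start = -(n - 1) * spacing / 2
        for j, idx in enumerate(indices):
            points[idx]["lng"] += start + j * spacing
    return points

# Two carriers ~100m apart collapse into one bucket and get spread ~9km
fleet = [{"lat": 15.0, "lng": 115.0}, {"lat": 15.001, "lng": 115.002}]
print([p["lng"] for p in spread_colocated(fleet)])
```

The symmetric `start` offset keeps the group centred on the original position, so the spread never drifts the formation away from the reported location.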
def get_carrier_positions() -> List[dict]:
"""Return current carrier positions for the data pipeline."""
with _positions_lock:
@@ -381,7 +444,7 @@ def get_carrier_positions() -> List[dict]:
"type": "carrier",
"lat": pos["lat"],
"lng": pos["lng"],
"heading": pos.get("heading", 0),
"heading": None, # Heading unknown for carriers — OSINT cannot determine true heading
"sog": 0,
"cog": 0,
"country": "United States",
@@ -389,9 +452,10 @@ def get_carrier_positions() -> List[dict]:
"wiki": pos.get("wiki", info.get("wiki", "")),
"estimated": True,
"source": pos.get("source", "OSINT estimated position"),
"source_url": pos.get("source_url", "https://news.usni.org/category/fleet-tracker"),
"last_osint_update": pos.get("updated", "")
})
return result
return _deconflict_positions(result)
# -----------------------------------------------------------------
+445 -261
@@ -15,6 +15,7 @@ import threading
import io
from apscheduler.schedulers.background import BackgroundScheduler
import concurrent.futures
import heapq
from sgp4.api import Satrec, WGS72
from sgp4.api import jday
from datetime import datetime
@@ -81,6 +82,25 @@ opensky_client = OpenSkyClient(
last_opensky_fetch = 0
cached_opensky_flights = []
# ---------------------------------------------------------------------------
# Supplemental ADS-B sources for blind-spot gap-filling (Russia/China/Africa)
# These aggregators have different feeder pools than adsb.lol and can surface
# aircraft invisible to our primary source. Only gap-fill planes are kept.
# ---------------------------------------------------------------------------
_BLIND_SPOT_REGIONS = [
{"name": "Yekaterinburg", "lat": 56.8, "lon": 60.6, "radius_nm": 250},
{"name": "Novosibirsk", "lat": 55.0, "lon": 82.9, "radius_nm": 250},
{"name": "Krasnoyarsk", "lat": 56.0, "lon": 92.9, "radius_nm": 250},
{"name": "Vladivostok", "lat": 43.1, "lon": 131.9, "radius_nm": 250},
{"name": "Urumqi", "lat": 43.8, "lon": 87.6, "radius_nm": 250},
{"name": "Chengdu", "lat": 30.6, "lon": 104.1, "radius_nm": 250},
{"name": "Lagos-Accra", "lat": 6.5, "lon": 3.4, "radius_nm": 250},
{"name": "Addis Ababa", "lat": 9.0, "lon": 38.7, "radius_nm": 250},
]
_SUPPLEMENTAL_FETCH_INTERVAL = 120 # seconds — only query every 2 min
last_supplemental_fetch = 0
cached_supplemental_flights = []
# In-memory store
@@ -122,56 +142,132 @@ def _mark_fresh(*keys):
_data_lock = threading.Lock()
# ---------------------------------------------------------------------------
# Plane-Alert DB — load tracked aircraft from CSV on startup
# Plane-Alert DB — load tracked aircraft from JSON on startup
# ---------------------------------------------------------------------------
# Category → color mapping
_PINK_CATEGORIES = {
"Dictator Alert", "Head of State", "Da Comrade", "Oligarch",
"Governments", "Royal Aircraft", "Quango",
}
_RED_CATEGORIES = {
"Don't you know who I am?", "As Seen on TV", "Joe Cool",
"Vanity Plate", "Football", "Bizjets",
}
_DARKBLUE_CATEGORIES = {
"USAF", "United States Navy", "United States Marine Corps",
"Special Forces", "Hired Gun", "Oxcart", "Gunship", "Nuclear",
"CAP", "Zoomies",
# Exact category → color mapping for all 53 known categories.
# O(1) dict lookup — no keyword scanning, no false positives.
_CATEGORY_COLOR: dict[str, str] = {
# YELLOW — Military / Intelligence / Defense
"USAF": "yellow",
"Other Air Forces": "yellow",
"Toy Soldiers": "yellow",
"Oxcart": "yellow",
"United States Navy": "yellow",
"GAF": "yellow",
"Hired Gun": "yellow",
"United States Marine Corps": "yellow",
"Gunship": "yellow",
"RAF": "yellow",
"Other Navies": "yellow",
"Special Forces": "yellow",
"Zoomies": "yellow",
"Royal Navy Fleet Air Arm": "yellow",
"Army Air Corps": "yellow",
"Aerobatic Teams": "yellow",
"UAV": "yellow",
"Ukraine": "yellow",
"Nuclear": "yellow",
# LIME — Emergency / Medical / Rescue / Fire
"Flying Doctors": "#32cd32",
"Aerial Firefighter": "#32cd32",
"Coastguard": "#32cd32",
# BLUE — Government / Law Enforcement / Civil
"Police Forces": "blue",
"Governments": "blue",
"Quango": "blue",
"UK National Police Air Service": "blue",
"CAP": "blue",
# BLACK — Privacy / PIA
"PIA": "black",
# RED — Dictator / Oligarch
"Dictator Alert": "red",
"Da Comrade": "red",
"Oligarch": "red",
# HOT PINK — High Value Assets / VIP / Celebrity
"Head of State": "#ff1493",
"Royal Aircraft": "#ff1493",
"Don't you know who I am?": "#ff1493",
"As Seen on TV": "#ff1493",
"Bizjets": "#ff1493",
"Vanity Plate": "#ff1493",
"Football": "#ff1493",
# ORANGE — Joe Cool
"Joe Cool": "orange",
# WHITE — Climate Crisis
"Climate Crisis": "white",
# PURPLE — General Tracked / Other Notable
"Historic": "purple",
"Jump Johnny Jump": "purple",
"Ptolemy would be proud": "purple",
"Distinctive": "purple",
"Dogs with Jobs": "purple",
"You came here in that thing?": "purple",
"Big Hello": "purple",
"Watch Me Fly": "purple",
"Perfectly Serviceable Aircraft": "purple",
"Jesus he Knows me": "purple",
"Gas Bags": "purple",
"Radiohead": "purple",
}
def _category_to_color(cat: str) -> str:
if cat in _PINK_CATEGORIES:
return "pink"
if cat in _RED_CATEGORIES:
return "red"
if cat in _DARKBLUE_CATEGORIES:
return "darkblue"
return "white"
"""O(1) exact lookup. Unknown categories default to purple."""
return _CATEGORY_COLOR.get(cat, "purple")
# Load once on module import
_PLANE_ALERT_DB: dict = {} # uppercase ICAO hex → dict of aircraft info
_PLANE_ALERT_DB: dict = {}
# ---------------------------------------------------------------------------
# POTUS Fleet — override colors and operator names for presidential aircraft.
# These are hardcoded ICAO hexes verified against FAA registry + plane-alert.
# ---------------------------------------------------------------------------
_POTUS_FLEET: dict[str, dict] = {
# Air Force One — Boeing VC-25A (747-200B)
"ADFDF8": {"color": "#ff1493", "operator": "Air Force One (82-8000)", "category": "Head of State", "wiki": "Air_Force_One", "fleet": "AF1"},
"ADFDF9": {"color": "#ff1493", "operator": "Air Force One (92-9000)", "category": "Head of State", "wiki": "Air_Force_One", "fleet": "AF1"},
# Air Force Two — Boeing C-32A (757-200)
"ADFEB7": {"color": "blue", "operator": "Air Force Two (98-0001)", "category": "Governments", "wiki": "Air_Force_Two", "fleet": "AF2"},
"ADFEB8": {"color": "blue", "operator": "Air Force Two (98-0002)", "category": "Governments", "wiki": "Air_Force_Two", "fleet": "AF2"},
"ADFEB9": {"color": "blue", "operator": "Air Force Two (99-0003)", "category": "Governments", "wiki": "Air_Force_Two", "fleet": "AF2"},
"ADFEBA": {"color": "blue", "operator": "Air Force Two (99-0004)", "category": "Governments", "wiki": "Air_Force_Two", "fleet": "AF2"},
"AE4AE6": {"color": "blue", "operator": "Air Force Two (09-0015)", "category": "Governments", "wiki": "Air_Force_Two", "fleet": "AF2"},
"AE4AE8": {"color": "blue", "operator": "Air Force Two (09-0016)", "category": "Governments", "wiki": "Air_Force_Two", "fleet": "AF2"},
"AE4AEA": {"color": "blue", "operator": "Air Force Two (09-0017)", "category": "Governments", "wiki": "Air_Force_Two", "fleet": "AF2"},
"AE4AEC": {"color": "blue", "operator": "Air Force Two (19-0018)", "category": "Governments", "wiki": "Air_Force_Two", "fleet": "AF2"},
# Marine One — VH-3D Sea King / VH-92A Patriot
"AE0865": {"color": "#ff1493", "operator": "Marine One (VH-3D)", "category": "Head of State", "wiki": "Marine_One", "fleet": "M1"},
"AE5E76": {"color": "#ff1493", "operator": "Marine One (VH-92A)", "category": "Head of State", "wiki": "Marine_One", "fleet": "M1"},
"AE5E77": {"color": "#ff1493", "operator": "Marine One (VH-92A)", "category": "Head of State", "wiki": "Marine_One", "fleet": "M1"},
"AE5E79": {"color": "#ff1493", "operator": "Marine One (VH-92A)", "category": "Head of State", "wiki": "Marine_One", "fleet": "M1"},
}
def _load_plane_alert_db():
"""Parse plane_alert_db.json into a dict keyed by uppercase ICAO hex."""
"""Load plane_alert_db.json (exported from SQLite) into memory."""
global _PLANE_ALERT_DB
import json
json_path = os.path.join(
os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
"data", "plane_alert_db.json"
)
if not os.path.exists(json_path):
logger.warning(f"Plane-Alert JSON DB not found at {json_path}")
logger.warning(f"Plane-Alert DB not found at {json_path}")
return
try:
with open(json_path, "r", encoding="utf-8") as fh:
data = json.load(fh)
for icao_hex, info in data.items():
info["color"] = _category_to_color(info.get("category", ""))
_PLANE_ALERT_DB[icao_hex] = info
logger.info(f"Plane-Alert JSON DB loaded: {len(_PLANE_ALERT_DB)} aircraft")
raw = json.load(fh)
for icao_hex, info in raw.items():
info["color"] = _category_to_color(info.get("category", ""))
# Apply POTUS fleet overrides (correct colors + clean operator names)
override = _POTUS_FLEET.get(icao_hex)
if override:
info["color"] = override["color"]
info["operator"] = override["operator"]
info["category"] = override["category"]
info["wiki"] = override.get("wiki", "")
info["potus_fleet"] = override.get("fleet", "")
_PLANE_ALERT_DB[icao_hex] = info
logger.info(f"Plane-Alert DB loaded: {len(_PLANE_ALERT_DB)} aircraft")
except Exception as e:
logger.error(f"Failed to load Plane-Alert JSON DB: {e}")
logger.error(f"Failed to load Plane-Alert DB: {e}")
_load_plane_alert_db()
@@ -184,11 +280,12 @@ def enrich_with_plane_alert(flight: dict) -> dict:
flight["alert_color"] = info["color"]
flight["alert_operator"] = info["operator"]
flight["alert_type"] = info["ac_type"]
flight["alert_tag1"] = info["tag1"]
flight["alert_tag2"] = info["tag2"]
flight["alert_tag3"] = info["tag3"]
flight["alert_tags"] = info["tags"]
flight["alert_link"] = info["link"]
# Override registration if DB has a better one
if info.get("wiki"):
flight["alert_wiki"] = info["wiki"]
if info.get("potus_fleet"):
flight["potus_fleet"] = info["potus_fleet"]
if info["registration"]:
flight["registration"] = info["registration"]
@@ -225,21 +322,37 @@ _load_tracked_names()
def enrich_with_tracked_names(flight: dict) -> dict:
"""If flight's registration matches our Excel extraction, tag it as tracked."""
# POTUS fleet overrides are authoritative — never let Excel overwrite them
icao = flight.get("icao24", "").strip().upper()
if icao in _POTUS_FLEET:
return flight
reg = flight.get("registration", "").strip().upper()
callsign = flight.get("callsign", "").strip().upper()
match = None
if reg and reg in _TRACKED_NAMES_DB:
match = _TRACKED_NAMES_DB[reg]
elif callsign and callsign in _TRACKED_NAMES_DB:
match = _TRACKED_NAMES_DB[callsign]
if match:
# Don't overwrite Plane-Alert DB operator if it exists unless we want Excel to take precedence.
# Let's let Excel take precedence as it has cleaner individual names (e.g. Elon Musk instead of FALCON LANDING LLC).
flight["alert_operator"] = match["name"]
name = match["name"]
# Let Excel take precedence as it has cleaner individual names (e.g. Elon Musk instead of FALCON LANDING LLC).
flight["alert_operator"] = name
flight["alert_category"] = match["category"]
if "alert_color" not in flight:
# Override pink default if the name implies a specific function
name_lower = name.lower()
is_gov = any(w in name_lower for w in ['state of ', 'government', 'republic', 'ministry', 'department', 'federal', 'cia'])
is_law = any(w in name_lower for w in ['police', 'marshal', 'sheriff', 'douane', 'customs', 'patrol', 'gendarmerie', 'guardia', 'law enforcement'])
is_med = any(w in name_lower for w in ['fire', 'bomberos', 'ambulance', 'paramedic', 'medevac', 'rescue', 'hospital', 'medical', 'lifeflight'])
if is_gov or is_law:
flight["alert_color"] = "blue"
elif is_med:
flight["alert_color"] = "#32cd32" # lime
elif "alert_color" not in flight:
flight["alert_color"] = "pink"
return flight
@@ -354,7 +467,8 @@ def fetch_news():
source_weights = {f["name"]: f["weight"] for f in feed_config}
clusters = {}
_cluster_grid = {} # spatial hash grid: (cell_x, cell_y) → [cluster_keys]
# Fetch all feeds in parallel for speed (each has a 10s timeout)
def _fetch_feed(item):
source_name, url = item
@@ -427,20 +541,25 @@ def fetch_news():
break
# If mapped, check if there is an existing cluster within ~400km (4 degrees) to merge them
# Uses spatial hash grid (4° cells) for O(1) lookup instead of O(n) scan
if lat is not None:
    key = None
    cell_x, cell_y = int(lng // 4), int(lat // 4)
    for dx in range(-1, 2):
        for dy in range(-1, 2):
            for ckey in _cluster_grid.get((cell_x + dx, cell_y + dy), []):
                parts = ckey.split(",")
                try:
                    elat, elng = float(parts[0]), float(parts[1])
                    if ((lat - elat)**2 + (lng - elng)**2)**0.5 < 4.0:
                        key = ckey
                        break
                except ValueError:
                    pass
            if key:
                break
        if key:
            break
    if key is None:
        key = f"{lat},{lng}"
        _cluster_grid.setdefault((cell_x, cell_y), []).append(key)
else:
    key = title
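The 4-degree spatial hash used above can be sketched in isolation: the grid maps each cell to the points stored in it, and a lookup only scans the 3x3 neighbourhood instead of every cluster. `SpatialGrid` is an illustrative name, not from the codebase:

```python
class SpatialGrid:
    """Hash points into 4-degree cells; near-neighbour lookup scans only 9 cells."""
    def __init__(self, cell_deg: float = 4.0):
        self.cell = cell_deg
        self.grid: dict[tuple[int, int], list[tuple[float, float]]] = {}

    def _cell(self, lat: float, lng: float) -> tuple[int, int]:
        return int(lng // self.cell), int(lat // self.cell)

    def add(self, lat: float, lng: float) -> None:
        self.grid.setdefault(self._cell(lat, lng), []).append((lat, lng))

    def find_near(self, lat: float, lng: float, radius_deg: float = 4.0):
        cx, cy = self._cell(lat, lng)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for (elat, elng) in self.grid.get((cx + dx, cy + dy), []):
                    if ((lat - elat) ** 2 + (lng - elng) ** 2) ** 0.5 < radius_deg:
                        return elat, elng
        return None

g = SpatialGrid()
g.add(50.4, 30.5)               # Kyiv-ish
print(g.find_near(51.0, 31.0))  # (50.4, 30.5), within 4 degrees
print(g.find_near(10.0, 10.0))  # None, no cluster nearby
```

Because the merge radius equals the cell size, any match is guaranteed to live in the queried cell or one of its 8 neighbours, which is why the backend's triple loop over `dx`/`dy` is sufficient.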
@@ -480,27 +599,31 @@ def fetch_news():
latest_data['news'] = news_items
_mark_fresh("news")
def _fetch_single_ticker(symbol: str, period: str = "2d"):
"""Fetch a single yfinance ticker. Returns (symbol, data_dict) or (symbol, None)."""
try:
ticker = yf.Ticker(symbol)
hist = ticker.history(period=period)
if len(hist) >= 1:
current_price = hist['Close'].iloc[-1]
prev_close = hist['Close'].iloc[0] if len(hist) > 1 else current_price
change_percent = ((current_price - prev_close) / prev_close) * 100 if prev_close else 0
return symbol, {
"price": round(float(current_price), 2),
"change_percent": round(float(change_percent), 2),
"up": bool(change_percent >= 0)
}
except Exception as e:
logger.warning(f"Could not fetch data for {symbol}: {e}")
return symbol, None
def fetch_defense_stocks():
tickers = ["RTX", "LMT", "NOC", "GD", "BA", "PLTR"]
stocks_data = {}
try:
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
        results = pool.map(lambda t: _fetch_single_ticker(t, "2d"), tickers)
        stocks_data = {sym: data for sym, data in results if data}
latest_data['stocks'] = stocks_data
_mark_fresh("stocks")
except Exception as e:
@@ -509,25 +632,10 @@ def fetch_defense_stocks():
def fetch_oil_prices():
# CL=F is Crude Oil, BZ=F is Brent Crude
tickers = {"WTI Crude": "CL=F", "Brent Crude": "BZ=F"}
oil_data = {}
try:
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        results = pool.map(lambda item: (_fetch_single_ticker(item[1], "5d")[1], item[0]), tickers.items())
        oil_data = {name: data for data, name in results if data}
latest_data['oil'] = oil_data
_mark_fresh("oil")
except Exception as e:
@@ -612,6 +720,87 @@ _HELI_TYPES_BACKEND = {
"B47G", "HUEY", "GAMA", "CABR", "EXE",
}
def _fetch_supplemental_sources(seen_hex: set) -> list:
"""Fetch from airplanes.live and adsb.fi to fill blind-spot gaps.
Only returns aircraft whose ICAO hex is NOT already in seen_hex.
Throttled to run every _SUPPLEMENTAL_FETCH_INTERVAL seconds.
Fully wrapped in try/except; returns [] on any failure.
"""
global last_supplemental_fetch, cached_supplemental_flights
now = time.time()
if now - last_supplemental_fetch < _SUPPLEMENTAL_FETCH_INTERVAL:
# Return cached results, but still filter against current seen_hex
return [f for f in cached_supplemental_flights
if f.get("hex", "").lower().strip() not in seen_hex]
new_supplemental = []
supplemental_hex = set() # track hex within supplemental to avoid internal dupes
# --- airplanes.live (parallel, all hotspots) ---
def _fetch_airplaneslive(region):
try:
url = (f"https://api.airplanes.live/v2/point/"
f"{region['lat']}/{region['lon']}/{region['radius_nm']}")
res = fetch_with_curl(url, timeout=10)
if res.status_code == 200:
data = res.json()
return data.get("ac", [])
except Exception as e:
logger.debug(f"airplanes.live {region['name']} failed: {e}")
return []
try:
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
results = list(pool.map(_fetch_airplaneslive, _BLIND_SPOT_REGIONS))
for region_flights in results:
for f in region_flights:
h = f.get("hex", "").lower().strip()
if h and h not in seen_hex and h not in supplemental_hex:
f["supplemental_source"] = "airplanes.live"
new_supplemental.append(f)
supplemental_hex.add(h)
except Exception as e:
logger.warning(f"airplanes.live supplemental fetch failed: {e}")
ap_count = len(new_supplemental)
# --- adsb.fi (sequential, 1.1s between requests to respect 1 req/sec limit) ---
try:
for region in _BLIND_SPOT_REGIONS:
try:
url = (f"https://opendata.adsb.fi/api/v3/lat/"
f"{region['lat']}/lon/{region['lon']}/dist/{region['radius_nm']}")
res = fetch_with_curl(url, timeout=10)
if res.status_code == 200:
data = res.json()
for f in data.get("ac", []):
h = f.get("hex", "").lower().strip()
if h and h not in seen_hex and h not in supplemental_hex:
f["supplemental_source"] = "adsb.fi"
new_supplemental.append(f)
supplemental_hex.add(h)
except Exception as e:
logger.debug(f"adsb.fi {region['name']} failed: {e}")
time.sleep(1.1) # Rate limit: 1 req/sec
except Exception as e:
logger.warning(f"adsb.fi supplemental fetch failed: {e}")
fi_count = len(new_supplemental) - ap_count
cached_supplemental_flights = new_supplemental
last_supplemental_fetch = now
if new_supplemental:
_mark_fresh("supplemental_flights")
logger.info(f"Supplemental: +{len(new_supplemental)} new aircraft from blind-spot "
f"hotspots (airplanes.live: {ap_count}, adsb.fi: {fi_count})")
return new_supplemental
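The throttle pattern used above (serve the cache inside the interval, refresh outside it) can be sketched generically; `Throttled` and the injected fake clock are illustrative, not from the codebase:

```python
import time

class Throttled:
    """Call `fn` at most once per `interval` seconds; otherwise serve the cached value."""
    def __init__(self, fn, interval: float, clock=time.time):
        self.fn, self.interval, self.clock = fn, interval, clock
        self.last = float("-inf")
        self.cached = None

    def __call__(self):
        now = self.clock()
        if now - self.last >= self.interval:
            self.cached = self.fn()
            self.last = now
        return self.cached

calls = []
t = [0.0]  # fake clock so the demo is deterministic
fetch = Throttled(lambda: calls.append(1) or len(calls), interval=60, clock=lambda: t[0])
print(fetch())  # 1 (first call fetches)
t[0] = 30
print(fetch())  # 1 (within interval: cached)
t[0] = 90
print(fetch())  # 2 (interval elapsed: refetched)
```

Unlike this sketch, the backend also re-filters the cached list against the current `seen_hex` on every cached return, since the primary feed's contents change between supplemental refreshes.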
def fetch_flights():
# OpenSky Network public API for flights. We want to demonstrate global coverage.
flights = []
@@ -712,7 +901,22 @@ def fetch_flights():
all_adsb_flights.append(osf)
seen_hex.add(h.lower().strip())
# -------------------------------------------------------------------
# Supplemental Sources: airplanes.live + adsb.fi (blind-spot gap-fill)
# Only adds aircraft whose ICAO hex is NOT already in seen_hex.
# -------------------------------------------------------------------
try:
gap_fill = _fetch_supplemental_sources(seen_hex)
for f in gap_fill:
all_adsb_flights.append(f)
h = f.get("hex", "").lower().strip()
if h:
seen_hex.add(h)
if gap_fill:
logger.info(f"Gap-fill: added {len(gap_fill)} aircraft to pipeline")
except Exception as e:
logger.warning(f"Supplemental source fetch failed (non-fatal): {e}")
if all_adsb_flights:
# The user requested maximum flight density. Rendering all available aircraft.
@@ -995,11 +1199,11 @@ def fetch_flights():
if hex_id:
seen_hexes.add(hex_id)
# Prune stale trails (5 min for non-tracked, 30 min for tracked)
tracked_hexes = {t.get('icao24', '').lower() for t in latest_data.get('tracked_flights', [])}
stale_keys = []
for k, v in flight_trails.items():
cutoff = now_ts - 1800 if k in tracked_hexes else now_ts - 300
if v['last_seen'] < cutoff:
stale_keys.append(k)
for k in stale_keys:
@@ -1110,17 +1314,25 @@ def fetch_ships():
"""Fetch real-time AIS vessel data and combine with OSINT carrier positions."""
from services.ais_stream import get_ais_vessels
from services.carrier_tracker import get_carrier_positions
ships = []
# Dynamic OSINT carrier positions (updated from GDELT + cache)
try:
    carriers = get_carrier_positions()
    ships.extend(carriers)
except Exception as e:
    logger.error(f"Carrier tracker error (non-fatal): {e}")
    carriers = []
# Real AIS vessel data from aisstream.io
try:
    ais_vessels = get_ais_vessels()
    ships.extend(ais_vessels)
except Exception as e:
    logger.error(f"AIS stream error (non-fatal): {e}")
    ais_vessels = []
logger.info(f"Ships: {len(carriers)} carriers + {len(ais_vessels)} AIS vessels")
latest_data['ships'] = ships
_mark_fresh("ships")
@@ -1333,9 +1545,8 @@ def fetch_firms_fires():
})
except (ValueError, TypeError):
continue
# Keep top 5000 by FRP (most intense fires first) — heapq.nlargest is O(n log k) vs a full O(n log n) sort
fires = heapq.nlargest(5000, all_rows, key=lambda x: x["frp"])
logger.info(f"FIRMS fires: {len(fires)} hotspots (from {response.status_code})")
except Exception as e:
logger.error(f"Error fetching FIRMS fires: {e}")
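The top-k selection above is documented to give exactly the same result as a full descending sort followed by slicing, just cheaper when k is much smaller than n:

```python
import heapq
import random

random.seed(7)
rows = [{"frp": random.uniform(0, 500)} for _ in range(10_000)]

# Same result either way; nlargest is O(n log k), the full sort is O(n log n)
top_sorted = sorted(rows, key=lambda x: x["frp"], reverse=True)[:5]
top_heap = heapq.nlargest(5, rows, key=lambda x: x["frp"])
print(top_heap == top_sorted)  # True
```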
@@ -1471,9 +1682,8 @@ def fetch_internet_outages():
r["lat"] = coords[0]
r["lng"] = coords[1]
geocoded.append(r)
# Keep top 100 by severity
outages = heapq.nlargest(100, geocoded, key=lambda x: x["severity"])
logger.info(f"Internet outages: {len(outages)} regions affected")
except Exception as e:
logger.error(f"Error fetching internet outages: {e}")
@@ -1481,123 +1691,37 @@ def fetch_internet_outages():
if outages:
_mark_fresh("internet_outages")
_DC_CACHE_PATH = Path(__file__).parent.parent / "data" / "datacenters.json"
_DC_URL = "https://raw.githubusercontent.com/Ringmast4r/Data-Center-Map---Global/1f290297c6a11454dc7a47bf95aef7cf0fe1d34c/datacenters_cleaned.json"
# Country bounding boxes (lat_min, lat_max, lng_min, lng_max) for coordinate validation.
# The source dataset has abs(lat) for all Southern Hemisphere entries, so we fix the sign
# and then validate the result falls within the country's bounding box.
_COUNTRY_BBOX: dict[str, tuple[float, float, float, float]] = {
"Argentina": (-55, -21, -74, -53), "Australia": (-44, -10, 112, 154),
"Bolivia": (-23, -9, -70, -57), "Brazil": (-34, 6, -74, -34),
"Chile": (-56, -17, -76, -66), "Colombia": (-5, 13, -82, -66),
"Ecuador": (-5, 2, -81, -75), "Indonesia": (-11, 6, 95, 141),
"Kenya": (-5, 5, 34, 42), "Madagascar": (-26, -12, 43, 51),
"Mozambique": (-27, -10, 30, 41), "New Zealand": (-47, -34, 166, 179),
"Paraguay": (-28, -19, -63, -54), "Peru": (-18, 0, -82, -68),
"South Africa": (-35, -22, 16, 33), "Tanzania": (-12, -1, 29, 41),
"Uruguay": (-35, -30, -59, -53), "Zimbabwe": (-23, -15, 25, 34),
# Northern-hemisphere countries for validation only
"United States": (24, 72, -180, -65), "Canada": (41, 84, -141, -52),
"United Kingdom": (49, 61, -9, 2), "Germany": (47, 55, 5, 16),
"France": (41, 51, -5, 10), "Japan": (24, 46, 123, 146),
"India": (6, 36, 68, 98), "China": (18, 54, 73, 135),
"Singapore": (1, 2, 103, 105), "Spain": (36, 44, -10, 5),
"Netherlands": (50, 54, 3, 8), "Sweden": (55, 70, 11, 25),
"Italy": (36, 47, 6, 19), "Russia": (41, 82, 19, 180),
"Mexico": (14, 33, -118, -86), "Nigeria": (4, 14, 2, 15),
"Thailand": (5, 21, 97, 106), "Malaysia": (0, 8, 99, 120),
"Philippines": (4, 21, 116, 127), "South Korea": (33, 39, 124, 132),
"Taiwan": (21, 26, 119, 123), "Hong Kong": (22, 23, 113, 115),
"Vietnam": (8, 24, 102, 110), "Poland": (49, 55, 14, 25),
"Switzerland": (45, 48, 5, 11), "Austria": (46, 49, 9, 17),
"Belgium": (49, 52, 2, 7), "Denmark": (54, 58, 8, 16),
"Finland": (59, 70, 20, 32), "Norway": (57, 72, 4, 32),
"Ireland": (51, 56, -11, -5), "Portugal": (36, 42, -10, -6),
"Turkey": (35, 42, 25, 45), "Israel": (29, 34, 34, 36),
"UAE": (22, 27, 51, 56), "Saudi Arabia": (16, 33, 34, 56),
}
# Countries whose DCs always sit south of the equator
_SOUTHERN_COUNTRIES = {
"Argentina", "Australia", "Bolivia", "Brazil", "Chile", "Madagascar",
"Mozambique", "New Zealand", "Paraguay", "Peru", "South Africa",
"Tanzania", "Uruguay", "Zimbabwe",
}
def _fix_dc_coords(lat: float, lng: float, country: str) -> tuple[float, float] | None:
"""Fix and validate data-center coordinates against the stated country.
The source dataset stores abs(lat) for Southern-Hemisphere entries.
We negate lat when the country is in the Southern Hemisphere, then
validate the result falls within the country bounding box (if known).
Returns corrected (lat, lng) or None if the coords are clearly wrong.
"""
# Fix Southern Hemisphere sign
if country in _SOUTHERN_COUNTRIES and lat > 0:
lat = -lat
bbox = _COUNTRY_BBOX.get(country)
if bbox:
lat_min, lat_max, lng_min, lng_max = bbox
if lat_min <= lat <= lat_max and lng_min <= lng <= lng_max:
return lat, lng
# Try swapping sign as last resort (some entries are just wrong sign)
if lat_min <= -lat <= lat_max and lng_min <= lng <= lng_max:
return -lat, lng
# Coords don't match country at all — drop the entry
return None
# No bbox for this country — basic sanity only
return lat, lng
_DC_GEOCODED_PATH = Path(__file__).parent.parent / "data" / "datacenters_geocoded.json"
def fetch_datacenters():
"""Load geocoded data centers (5K+ street-level precise locations)."""
dcs = []
try:
    if not _DC_GEOCODED_PATH.exists():
        logger.warning(f"Geocoded DC file not found: {_DC_GEOCODED_PATH}")
        return
    raw = json.loads(_DC_GEOCODED_PATH.read_text(encoding="utf-8"))
    for entry in raw:
lat = entry.get("lat")
lng = entry.get("lng")
if lat is None or lng is None:
continue
if not (-90 <= lat <= 90 and -180 <= lng <= 180):
continue
dcs.append({
"name": entry.get("name", "Unknown"),
"company": entry.get("company", ""),
"street": entry.get("street", ""),
"city": entry.get("city", ""),
"country": entry.get("country", ""),
"zip": entry.get("zip", ""),
"lat": lat,
"lng": lng,
})
logger.info(f"Data centers: {len(dcs)} geocoded locations loaded")
except Exception as e:
logger.error(f"Error loading data centers: {e}")
latest_data["datacenters"] = dcs
if dcs:
_mark_fresh("datacenters")
@@ -1663,7 +1787,37 @@ def fetch_earthquakes():
_mark_fresh("earthquakes")
# Satellite GP data cache — re-download from CelesTrak only every 30 minutes
_sat_gp_cache = {"data": None, "last_fetch": 0, "source": "none"}
_sat_classified_cache = {"data": None, "gp_fetch_ts": 0} # Cache classified sat list (skip re-classification when TLEs unchanged)
_SAT_CACHE_PATH = Path(__file__).parent.parent / "data" / "sat_gp_cache.json"
def _load_sat_cache():
"""Load satellite GP data from local disk cache."""
try:
if _SAT_CACHE_PATH.exists():
import os
age_hours = (time.time() - os.path.getmtime(str(_SAT_CACHE_PATH))) / 3600
if age_hours < 48: # Use cache if less than 48 hours old
with open(_SAT_CACHE_PATH, "r") as f:
data = json.load(f)
if isinstance(data, list) and len(data) > 10:
logger.info(f"Satellites: Loaded {len(data)} records from disk cache ({age_hours:.1f}h old)")
return data
else:
logger.info(f"Satellites: Disk cache is {age_hours:.0f}h old, will try fresh fetch")
except Exception as e:
logger.warning(f"Satellites: Failed to load disk cache: {e}")
return None
def _save_sat_cache(data):
"""Save satellite GP data to local disk cache."""
try:
_SAT_CACHE_PATH.parent.mkdir(parents=True, exist_ok=True)
with open(_SAT_CACHE_PATH, "w") as f:
json.dump(data, f)
logger.info(f"Satellites: Saved {len(data)} records to disk cache")
except Exception as e:
logger.warning(f"Satellites: Failed to save disk cache: {e}")
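The cache pair above follows a simple age-gated load pattern: trust the file only if it exists, is fresh enough, and holds a non-trivial payload. A standalone sketch (illustrative names, tempfile path):

```python
import json
import tempfile
import time
from pathlib import Path

CACHE = Path(tempfile.gettempdir()) / "gp_cache_demo.json"
MAX_AGE_H = 48

def save_cache(data: list) -> None:
    CACHE.write_text(json.dumps(data))

def load_cache():
    """Return cached data only if present, non-trivial (>10 records), and younger than MAX_AGE_H."""
    if not CACHE.exists():
        return None
    age_h = (time.time() - CACHE.stat().st_mtime) / 3600
    if age_h >= MAX_AGE_H:
        return None
    data = json.loads(CACHE.read_text())
    return data if isinstance(data, list) and len(data) > 10 else None

save_cache([{"id": i} for i in range(20)])
print(len(load_cache()))  # 20 (fresh cache, more than 10 records)
save_cache([1, 2, 3])
print(load_cache())       # None (too few records to trust)
```

The size check guards against persisting a truncated or error response and then serving it for two days.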
# Satellite intelligence classification database — module-level constant.
# Key: substring to match in OBJECT_NAME → {country, mission, sat_type, wiki}
@@ -1765,31 +1919,39 @@ def _fetch_satellites_from_tle_api():
term = key.split()[0] if len(key.split()) > 1 and key.split()[0] in ("USA", "NROL") else key
search_terms.add(term)
def _fetch_term(term):
    """Fetch a single search term from TLE API."""
    results = []
    try:
        url = f"https://tle.ivanstanojevic.me/api/tle/?search={term}&page_size=100&format=json"
        response = fetch_with_curl(url, timeout=8)
        if response.status_code != 200:
            return results
        data = response.json()
        for member in data.get("member", []):
            gp = _parse_tle_to_gp(
                member.get("name", "UNKNOWN"),
                member.get("satelliteId"),
                member.get("line1", ""),
                member.get("line2", ""),
            )
            if gp:
                results.append(gp)
    except Exception as e:
        logger.debug(f"TLE fallback search '{term}' failed: {e}")
    return results
# Fetch ALL search terms in parallel (was sequential — 35+ requests taking forever)
all_results = []
seen_ids = set()
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
future_map = {executor.submit(_fetch_term, term): term for term in search_terms}
for future in concurrent.futures.as_completed(future_map):
for gp in future.result():
sat_id = gp.get("NORAD_CAT_ID")
if sat_id not in seen_ids:
seen_ids.add(sat_id)
all_results.append(gp)
return all_results
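The fan-out-and-merge pattern above (submit every term, consume results as they complete, keep the first occurrence per ID) can be sketched with a stub fetcher; `fake_fetch` and the term catalog are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def fake_fetch(term: str) -> list[dict]:
    # Stub standing in for a per-term API call; terms return overlapping IDs
    catalog = {"USA": [{"id": 1}, {"id": 2}], "NROL": [{"id": 2}, {"id": 3}]}
    return catalog.get(term, [])

def fetch_all(terms: list[str]) -> list[dict]:
    results, seen = [], set()
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = {pool.submit(fake_fetch, t): t for t in terms}
        for fut in as_completed(futures):
            for rec in fut.result():
                if rec["id"] not in seen:  # first occurrence wins
                    seen.add(rec["id"])
                    results.append(rec)
    return results

ids = sorted(r["id"] for r in fetch_all(["USA", "NROL"]))
print(ids)  # [1, 2, 3], duplicate id 2 merged
```

Deduplicating in the single consumer thread (rather than inside each worker) keeps `seen` free of races without any locking.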
@@ -1802,25 +1964,28 @@ def fetch_satellites():
now_ts = time.time()
if _sat_gp_cache["data"] is None or (now_ts - _sat_gp_cache["last_fetch"]) > 1800:
# Try multiple CelesTrak mirrors — .org is often blocked/banned by some networks
# Short timeout (5s) so we fail fast and hit the TLE fallback quickly
gp_urls = [
"https://celestrak.org/NORAD/elements/gp.php?GROUP=active&FORMAT=json",
"https://celestrak.com/NORAD/elements/gp.php?GROUP=active&FORMAT=json",
]
for url in gp_urls:
try:
response = fetch_with_curl(url, timeout=5)
if response.status_code == 200:
gp_data = response.json()
if isinstance(gp_data, list) and len(gp_data) > 100:
_sat_gp_cache["data"] = gp_data
_sat_gp_cache["last_fetch"] = now_ts
_sat_gp_cache["source"] = "celestrak"
_save_sat_cache(gp_data)
logger.info(f"Satellites: Downloaded {len(gp_data)} GP records from {url}")
break
except Exception as e:
logger.warning(f"Satellites: Failed to fetch from {url}: {e}")
continue
# Fallback 1: tle.ivanstanojevic.me TLE API (parallel fetch), used when CelesTrak is blocked
if _sat_gp_cache["data"] is None:
logger.info("Satellites: CelesTrak unreachable, trying TLE fallback API...")
try:
@@ -1828,10 +1993,20 @@ def fetch_satellites():
if fallback_data and len(fallback_data) > 10:
_sat_gp_cache["data"] = fallback_data
_sat_gp_cache["last_fetch"] = now_ts
_sat_gp_cache["source"] = "tle_api"
_save_sat_cache(fallback_data)
logger.info(f"Satellites: Got {len(fallback_data)} records from TLE fallback API")
except Exception as e:
logger.error(f"Satellites: TLE fallback also failed: {e}")
# Fallback 2: local disk cache (survives API outages / rate limits)
if _sat_gp_cache["data"] is None:
disk_data = _load_sat_cache()
if disk_data:
_sat_gp_cache["data"] = disk_data
_sat_gp_cache["last_fetch"] = now_ts - 1500 # Mark as slightly stale so we retry sooner
_sat_gp_cache["source"] = "disk_cache"
data = _sat_gp_cache["data"]
if not data:
logger.warning("No satellite GP data available from any source")
@@ -1839,33 +2014,40 @@ def fetch_satellites():
return
# Only keep satellites matching the intel classification DB
# Skip re-classification if TLEs haven't changed (saves O(n*m) scan)
if _sat_classified_cache["gp_fetch_ts"] == _sat_gp_cache["last_fetch"] and _sat_classified_cache["data"]:
classified = _sat_classified_cache["data"]
logger.info(f"Satellites: Using cached classification ({len(classified)} sats, TLEs unchanged)")
else:
classified = []
for sat in data:
name = sat.get("OBJECT_NAME", "UNKNOWN").upper()
intel = None
for key, meta in _SAT_INTEL_DB:
if key.upper() in name:
intel = dict(meta)
break
if not intel:
continue # Skip junk, debris, CubeSats, bulk constellations
entry = {
"id": sat.get("NORAD_CAT_ID"),
"name": sat.get("OBJECT_NAME", "UNKNOWN"),
"MEAN_MOTION": sat.get("MEAN_MOTION"),
"ECCENTRICITY": sat.get("ECCENTRICITY"),
"INCLINATION": sat.get("INCLINATION"),
"RA_OF_ASC_NODE": sat.get("RA_OF_ASC_NODE"),
"ARG_OF_PERICENTER": sat.get("ARG_OF_PERICENTER"),
"MEAN_ANOMALY": sat.get("MEAN_ANOMALY"),
"BSTAR": sat.get("BSTAR"),
"EPOCH": sat.get("EPOCH"),
}
entry.update(intel)
classified.append(entry)
_sat_classified_cache["data"] = classified
_sat_classified_cache["gp_fetch_ts"] = _sat_gp_cache["last_fetch"]
logger.info(f"Satellites: {len(classified)} intel-classified out of {len(data)} total in catalog")
all_sats = classified
# Propagate orbital elements to get current lat/lng/alt using SGP4
now = datetime.utcnow()
@@ -1960,9 +2142,11 @@ def fetch_satellites():
# Only overwrite if we got data — don't wipe the map on API timeout
if sats:
latest_data["satellites"] = sats
latest_data["satellite_source"] = _sat_gp_cache.get("source", "none")
_mark_fresh("satellites")
elif not latest_data.get("satellites"):
latest_data["satellites"] = []
latest_data["satellite_source"] = "none"
# ---------------------------------------------------------------------------
# Real UAV detection from ADS-B data — filters military drone transponders
@@ -2172,14 +2356,14 @@ def update_slow_data():
logger.info("Slow-tier update complete.")
def update_all_data():
"""Full update — runs on startup. All tiers run IN PARALLEL for fastest startup."""
logger.info("Full data update starting (parallel)...")
# Run airports, fast, and slow ALL in parallel so the user sees data ASAP
with concurrent.futures.ThreadPoolExecutor(max_workers=3) as pool:
    f0 = pool.submit(fetch_airports)  # Cached after first download
    f1 = pool.submit(update_fast_data)
    f2 = pool.submit(update_slow_data)
    concurrent.futures.wait([f0, f1, f2])
logger.info("Full data update complete.")
scheduler = BackgroundScheduler()
@@ -2219,8 +2403,8 @@ def start_scheduler():
scheduler.add_job(update_liveuamap, 'date', run_date=datetime.now())
scheduler.add_job(update_liveuamap, 'interval', hours=12)
# Geopolitics (frontlines) aligned with slow-data tier
scheduler.add_job(fetch_geopolitics, 'interval', minutes=30)
scheduler.start()
@@ -86,8 +86,10 @@ def _extract_domain(url):
def _url_to_headline(url):
"""Extract a human-readable headline from a URL path.
e.g. 'https://nytimes.com/2026/03/us-strikes-iran-nuclear-sites.html' -> 'Us Strikes Iran Nuclear Sites'
Falls back to domain name if the URL slug is gibberish (hex IDs, UUIDs, etc.).
"""
import re
try:
from urllib.parse import urlparse, unquote
parsed = urlparse(url)
@@ -100,43 +102,151 @@ def _url_to_headline(url):
if not path:
return domain
# Try the last path segment first, then walk backwards
segments = [s for s in path.split('/') if s]
slug = ''
for seg in reversed(segments):
# Remove file extensions
for ext in ['.html', '.htm', '.php', '.asp', '.aspx', '.shtml']:
if seg.lower().endswith(ext):
seg = seg[:-len(ext)]
# Skip segments that are clearly not headlines
if _is_gibberish(seg):
continue
slug = seg
break
if not slug:
return domain
# Remove common ID patterns at start/end
slug = re.sub(r'^[\d]+-', '', slug)     # leading "13847569-"
slug = re.sub(r'-[\da-f]{6,}$', '', slug)  # trailing hex IDs
slug = re.sub(r'[-_]c-\d+$', '', slug)  # trailing "-c-21803431"
slug = re.sub(r'^p=\d+$', '', slug)     # WordPress ?p=1234
# Convert slug separators to spaces
slug = slug.replace('-', ' ').replace('_', ' ')
# Clean up multiple spaces
slug = re.sub(r'\s+', ' ', slug).strip()
# Final gibberish check after cleanup
if len(slug) < 8 or _is_gibberish(slug.replace(' ', '-')):
return domain
# Title case and truncate
headline = slug.title()
if len(headline) > 90:
    headline = headline[:87] + '...'
return headline
except Exception:
return url[:60]
def _is_gibberish(text):
"""Detect if a URL segment is gibberish (hex IDs, UUIDs, numeric IDs, etc.)
rather than a real human-readable slug like 'us-strikes-iran'."""
import re
t = text.strip()
if not t:
return True
# Pure numbers
if re.match(r'^\d+$', t):
return True
# UUID pattern (with or without dashes)
if re.match(r'^[0-9a-f]{8}[_-]?[0-9a-f]{4}[_-]?[0-9a-f]{4}[_-]?[0-9a-f]{4}[_-]?[0-9a-f]{12}$', t, re.I):
return True
# Hex-heavy string: more than 40% hex digits among alphanumeric chars
alnum = re.sub(r'[^a-zA-Z0-9]', '', t)
if alnum:
hex_chars = sum(1 for c in alnum if c in '0123456789abcdefABCDEF')
if hex_chars / len(alnum) > 0.4 and len(alnum) > 6:
return True
# Mostly digits with a few alpha (like "article8efa6c53")
digits = sum(1 for c in alnum if c.isdigit())
if alnum and digits / len(alnum) > 0.5:
return True
# Too short to be a headline slug
if len(t) < 5:
return True
# Query-param style segments
if '=' in t:
return True
return False
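For a runnable spot-check of the heuristic above, here is a condensed, slightly reordered restatement (the reordering does not change the verdicts on these inputs; the sample slugs are illustrative):

```python
import re

def is_gibberish(t: str) -> bool:
    """Condensed version of the slug heuristic, for demonstration only."""
    t = t.strip()
    if not t or len(t) < 5 or '=' in t:
        return True
    if re.fullmatch(r'\d+', t):
        return True  # pure numeric ID
    if re.fullmatch(r'[0-9a-f]{8}[_-]?[0-9a-f]{4}[_-]?[0-9a-f]{4}[_-]?'
                    r'[0-9a-f]{4}[_-]?[0-9a-f]{12}', t, re.I):
        return True  # UUID, with or without dashes
    alnum = re.sub(r'[^a-zA-Z0-9]', '', t)
    if alnum:
        hexc = sum(c in '0123456789abcdefABCDEF' for c in alnum)
        if hexc / len(alnum) > 0.4 and len(alnum) > 6:
            return True  # hex-heavy string
        if sum(c.isdigit() for c in alnum) / len(alnum) > 0.5:
            return True  # digit-heavy string like "article8123456"
    return False

print(is_gibberish("us-strikes-iran-nuclear-sites"))          # False
print(is_gibberish("8efa6c53-9b1d-4e2a-b0c3-1f2a3b4c5d6e"))   # True
print(is_gibberish("13847569"))                               # True
```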
# Persistent cache for article titles — survives across GDELT cache refreshes
_article_title_cache = {}
def _fetch_article_title(url):
"""Fetch the real headline from an article's HTML <title> or og:title tag.
Returns the title string, or None if it can't be fetched.
Uses a persistent cache to avoid refetching."""
if url in _article_title_cache:
return _article_title_cache[url]
import re
try:
# Only read the first 32KB — the <title> is always in <head>
resp = requests.get(url, timeout=4, headers={
'User-Agent': 'Mozilla/5.0 (compatible; OSINT Dashboard/1.0)'
}, stream=True)
if resp.status_code != 200:
_article_title_cache[url] = None
return None
chunk = resp.raw.read(32768).decode('utf-8', errors='replace')
resp.close()
title = None
# Try og:title first (usually the cleanest)
og_match = re.search(r'<meta[^>]+property=["\']og:title["\'][^>]+content=["\']([^"\'>]+)["\']', chunk, re.I)
if not og_match:
og_match = re.search(r'<meta[^>]+content=["\']([^"\'>]+)["\'][^>]+property=["\']og:title["\']', chunk, re.I)
if og_match:
title = og_match.group(1).strip()
# Fall back to <title> tag
if not title:
title_match = re.search(r'<title[^>]*>([^<]+)</title>', chunk, re.I)
if title_match:
title = title_match.group(1).strip()
if title:
# Clean up HTML entities
import html as html_mod
title = html_mod.unescape(title)
# Remove site name suffixes like " | CNN" or " - BBC News"
title = re.sub(r'\s*[|\-–—]\s*[^|\-–—]{2,30}$', '', title).strip()
# Truncate very long titles
if len(title) > 120:
title = title[:117] + '...'
if len(title) > 10:
_article_title_cache[url] = title
return title
_article_title_cache[url] = None
return None
except Exception:
_article_title_cache[url] = None
return None
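The two-step extraction above (og:title first, then the <title> tag) can be checked against a static HTML snippet, with no network involved; the snippet content is illustrative:

```python
import html
import re

chunk = '''<html><head>
<meta property="og:title" content="US Strikes Iran Nuclear Sites" />
<title>US Strikes Iran Nuclear Sites | Example News</title>
</head><body></body></html>'''

# og:title is usually the cleanest: no site-name suffix to strip
m = re.search(r'<meta[^>]+property=["\']og:title["\'][^>]+content=["\']([^"\'>]+)["\']',
              chunk, re.I)
title = html.unescape(m.group(1).strip()) if m else None
print(title)  # US Strikes Iran Nuclear Sites
```

When og:title is missing, the fallback <title> text here would still carry the " | Example News" suffix, which is exactly what the suffix-stripping regex in the function above removes.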
def _batch_fetch_titles(urls):
"""Fetch real article titles for a list of URLs in parallel.
Returns a dict of url -> title (or None if fetch failed)."""
from concurrent.futures import ThreadPoolExecutor
results = {}
with ThreadPoolExecutor(max_workers=16) as executor:
futures = {executor.submit(_fetch_article_title, u): u for u in urls}
for future in futures:
url = futures[future]
try:
results[url] = future.result()
except Exception:
results[url] = None
return results
def _parse_gdelt_export_zip(zip_bytes, conflict_codes, seen_locs, features, loc_index):
"""Parse a single GDELT export ZIP and append conflict features.
loc_index maps loc_key -> index in features list for fast duplicate merging.
@@ -278,11 +388,27 @@ def fetch_global_military_incidents():
if zip_bytes:
_parse_gdelt_export_zip(zip_bytes, CONFLICT_CODES, seen_locs, features, loc_index)
# Collect all unique article URLs for batch title fetching
all_article_urls = set()
for f in features:
for u in f["properties"].get("_urls", []):
if u:
all_article_urls.add(u)
logger.info(f"Fetching real article titles for {len(all_article_urls)} unique URLs...")
fetched_titles = _batch_fetch_titles(all_article_urls)
fetched_count = sum(1 for v in fetched_titles.values() if v)
logger.info(f"Resolved {fetched_count}/{len(all_article_urls)} article titles from HTML")
# Build URL + headline arrays for frontend rendering
for f in features:
urls = f["properties"].pop("_urls", [])
f["properties"].pop("_domains", None)
headlines = []
for u in urls:
    # Try the real fetched title first, then fall back to URL slug parsing
    real_title = fetched_titles.get(u)
    headlines.append(real_title if real_title else _url_to_headline(u))
f["properties"]["_urls_list"] = urls
f["properties"]["_headlines_list"] = headlines
import html
@@ -24,6 +24,11 @@ _BASH_PATH = shutil.which("bash") or "bash"
_domain_fail_cache: dict[str, float] = {}
_DOMAIN_FAIL_TTL = 300 # 5 minutes
# Circuit breaker: track domains where BOTH requests AND curl fail
# If a domain failed completely within the last 2 minutes, skip it entirely
_circuit_breaker: dict[str, float] = {}
_CIRCUIT_BREAKER_TTL = 120 # 2 minutes
class _DummyResponse:
"""Minimal response object matching requests.Response interface."""
def __init__(self, status_code, text):
@@ -54,6 +59,10 @@ def fetch_with_curl(url, method="GET", json_data=None, timeout=15, headers=None)
domain = urlparse(url).netloc
+# Circuit breaker: if domain failed completely <2min ago, fail fast
+if domain in _circuit_breaker and (time.time() - _circuit_breaker[domain]) < _CIRCUIT_BREAKER_TTL:
+raise Exception(f"Circuit breaker open for {domain} (failed <{_CIRCUIT_BREAKER_TTL}s ago)")
# Check if this domain recently failed with requests — skip straight to curl
if domain in _domain_fail_cache and (time.time() - _domain_fail_cache[domain]) < _DOMAIN_FAIL_TTL:
pass # Fall through to curl below
@@ -64,8 +73,9 @@ def fetch_with_curl(url, method="GET", json_data=None, timeout=15, headers=None)
else:
res = _session.get(url, timeout=timeout, headers=default_headers)
res.raise_for_status()
-# Clear failure cache on success
+# Clear failure caches on success
_domain_fail_cache.pop(domain, None)
+_circuit_breaker.pop(domain, None)
return res
except Exception as e:
logger.warning(f"Python requests failed for {url} ({e}), falling back to bash curl...")
@@ -92,10 +102,14 @@ def fetch_with_curl(url, method="GET", json_data=None, timeout=15, headers=None)
lines = res.stdout.rstrip().rsplit("\n", 1)
body = lines[0] if len(lines) > 1 else res.stdout
http_code = int(lines[-1]) if len(lines) > 1 and lines[-1].strip().isdigit() else 200
+if http_code < 400:
+_circuit_breaker.pop(domain, None) # Clear circuit breaker on success
return _DummyResponse(http_code, body)
else:
logger.error(f"bash curl fallback failed: exit={res.returncode} stderr={res.stderr[:200]}")
+_circuit_breaker[domain] = time.time()
return _DummyResponse(500, "")
except Exception as curl_e:
logger.error(f"bash curl fallback exception: {curl_e}")
+_circuit_breaker[domain] = time.time()
return _DummyResponse(500, "")
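The circuit-breaker logic spread through this file can be distilled into a small standalone sketch. The class below is illustrative only (the diff uses module-level dicts, not a class); names mirror the diff's semantics:

```python
import time

class DomainCircuitBreaker:
    """Skip domains that failed recently instead of retrying them immediately."""

    def __init__(self, ttl_seconds: float = 120.0):
        self.ttl = ttl_seconds
        self._failed_at: dict[str, float] = {}

    def is_open(self, domain: str) -> bool:
        # "Open" means: fail fast, do not attempt a request
        ts = self._failed_at.get(domain)
        return ts is not None and (time.time() - ts) < self.ttl

    def record_failure(self, domain: str) -> None:
        self._failed_at[domain] = time.time()

    def record_success(self, domain: str) -> None:
        # Any successful fetch closes the breaker immediately
        self._failed_at.pop(domain, None)
```

The TTL gives the upstream a recovery window: after `ttl_seconds` the breaker "half-opens" automatically because the timestamp ages out, so the next request is allowed through as a probe.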
+36 -14
@@ -1,6 +1,8 @@
import logging
import time
import concurrent.futures
from urllib.parse import quote
import requests as _requests
from cachetools import TTLCache
from services.network_utils import fetch_with_curl
@@ -10,26 +12,46 @@ logger = logging.getLogger(__name__)
# Key: rounded lat/lng grid (0.1 degree ≈ 11km)
dossier_cache = TTLCache(maxsize=500, ttl=86400)
+# Nominatim requires max 1 req/sec — track last call time
+_nominatim_last_call = 0.0
def _reverse_geocode(lat: float, lng: float) -> dict:
global _nominatim_last_call
url = (
f"https://nominatim.openstreetmap.org/reverse?"
f"lat={lat}&lon={lng}&format=json&zoom=10&addressdetails=1&accept-language=en"
)
-try:
-res = fetch_with_curl(url, timeout=10)
-if res.status_code == 200:
-data = res.json()
-addr = data.get("address", {})
-return {
-"city": addr.get("city") or addr.get("town") or addr.get("village") or addr.get("county") or "",
-"state": addr.get("state") or addr.get("region") or "",
-"country": addr.get("country") or "",
-"country_code": (addr.get("country_code") or "").upper(),
-"display_name": data.get("display_name", ""),
-}
-except Exception as e:
-logger.warning(f"Reverse geocode failed: {e}")
+headers = {"User-Agent": "ShadowBroker-OSINT/1.0 (live-risk-dashboard; contact@shadowbroker.app)"}
+for attempt in range(2):
+# Enforce Nominatim's 1 req/sec policy
+elapsed = time.time() - _nominatim_last_call
+if elapsed < 1.1:
+time.sleep(1.1 - elapsed)
+_nominatim_last_call = time.time()
+try:
+# Use requests directly — fetch_with_curl raises on non-200 which breaks 429 handling
+res = _requests.get(url, timeout=10, headers=headers)
+if res.status_code == 200:
+data = res.json()
+addr = data.get("address", {})
+return {
+"city": addr.get("city") or addr.get("town") or addr.get("village") or addr.get("county") or "",
+"state": addr.get("state") or addr.get("region") or "",
+"country": addr.get("country") or "",
+"country_code": (addr.get("country_code") or "").upper(),
+"display_name": data.get("display_name", ""),
+}
+elif res.status_code == 429:
+logger.warning(f"Nominatim 429 rate-limited, retrying after 2s (attempt {attempt+1})")
+time.sleep(2)
+continue
+else:
+logger.warning(f"Nominatim returned {res.status_code}")
+except Exception as e:
+logger.warning(f"Reverse geocode failed: {e}")
return {}
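The throttle-plus-429-retry shape added above can be factored into a reusable helper. This is a sketch under the same assumptions as the diff (a requests-style response object with `status_code`, blocking sleeps); `fetch` is a hypothetical callable standing in for the actual HTTP call:

```python
import time

_last_call = 0.0

def rate_limited_get(fetch, min_interval=1.1, retries=2, backoff=2.0):
    """Call fetch() at most once per min_interval; retry on HTTP 429 with a backoff."""
    global _last_call
    res = None
    for attempt in range(retries):
        # Throttle: never start a call sooner than min_interval after the last one
        elapsed = time.time() - _last_call
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)
        _last_call = time.time()
        res = fetch()
        if res.status_code == 429:
            time.sleep(backoff)  # rate-limited: back off, then retry
            continue
        return res
    return res  # exhausted retries: return the last (rate-limited) response
```

Note the design choice mirrored from the diff: the throttle sleeps *before* every attempt, so even the retry after a 429 respects the provider's request spacing.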
+4 -4
@@ -21,13 +21,13 @@ services:
frontend:
build:
context: ./frontend
args:
# Optional: set this to your backend's external URL if using custom ports
# e.g. http://192.168.1.50:9096 — leave empty to auto-detect from browser
NEXT_PUBLIC_API_URL: ${NEXT_PUBLIC_API_URL:-}
container_name: shadowbroker-frontend
ports:
- "3000:3000"
environment:
# Points the Next.js server-side proxy at the backend container via Docker networking.
# Change this if your backend runs on a different host or port.
- BACKEND_URL=http://backend:8000
depends_on:
- backend
restart: unless-stopped
+5 -5
@@ -10,7 +10,7 @@ FROM base AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
-ENV NEXT_TELEMETRY_DISABLED 1
+ENV NEXT_TELEMETRY_DISABLED=1
# NEXT_PUBLIC_* vars must exist at build time for Next.js to inline them.
# Default empty = auto-detect from browser hostname at runtime.
ARG NEXT_PUBLIC_API_URL=""
@@ -19,8 +19,8 @@ RUN npm run build
FROM base AS runner
WORKDIR /app
-ENV NODE_ENV production
-ENV NEXT_TELEMETRY_DISABLED 1
+ENV NODE_ENV=production
+ENV NEXT_TELEMETRY_DISABLED=1
RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs
@@ -36,7 +36,7 @@ COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static
USER nextjs
EXPOSE 3000
-ENV PORT 3000
-ENV HOSTNAME "0.0.0.0"
+ENV PORT=3000
+ENV HOSTNAME="0.0.0.0"
CMD ["node", "server.js"]
+5
@@ -1,5 +1,10 @@
import type { NextConfig } from "next";
+// /api/* requests are proxied to the backend by the catch-all route handler at
+// src/app/api/[...path]/route.ts, which reads BACKEND_URL at request time.
+// Do NOT add rewrites for /api/* here — next.config is evaluated at build time,
+// so any URL baked in here ignores the runtime BACKEND_URL env var.
const nextConfig: NextConfig = {
transpilePackages: ['react-map-gl', 'mapbox-gl', 'maplibre-gl'],
output: "standalone",
+1 -1
@@ -1,6 +1,6 @@
{
"name": "frontend",
-"version": "0.5.0",
+"version": "0.8.0",
"private": true,
"scripts": {
"dev": "concurrently \"npm run dev:frontend\" \"npm run dev:backend\"",
+35 -16
@@ -10,6 +10,7 @@ import NewsFeed from "@/components/NewsFeed";
import MarketsPanel from "@/components/MarketsPanel";
import FilterPanel from "@/components/FilterPanel";
import FindLocateBar from "@/components/FindLocateBar";
+import TopRightControls from "@/components/TopRightControls";
import RadioInterceptPanel from "@/components/RadioInterceptPanel";
import SettingsPanel from "@/components/SettingsPanel";
import MapLegend from "@/components/MapLegend";
@@ -28,7 +29,7 @@ function LocateBar({ onLocate }: { onLocate: (lat: number, lng: number) => void
const [results, setResults] = useState<{ label: string; lat: number; lng: number }[]>([]);
const [loading, setLoading] = useState(false);
const inputRef = useRef<HTMLInputElement>(null);
-const timerRef = useRef<ReturnType<typeof setTimeout>>();
+const timerRef = useRef<ReturnType<typeof setTimeout> | null>(null);
useEffect(() => { if (open) inputRef.current?.focus(); }, [open]);
@@ -50,7 +51,7 @@ function LocateBar({ onLocate }: { onLocate: (lat: number, lng: number) => void
return;
}
// Geocode with Nominatim (debounced)
-clearTimeout(timerRef.current);
+if (timerRef.current) clearTimeout(timerRef.current);
if (q.trim().length < 2) { setResults([]); return; }
timerRef.current = setTimeout(async () => {
setLoading(true);
@@ -298,23 +299,32 @@ export default function Dashboard() {
const slowEtag = useRef<string | null>(null);
useEffect(() => {
+// Track whether we've received substantial data yet (backend may still be starting up)
+let hasData = false;
+let fastTimerId: ReturnType<typeof setTimeout> | null = null;
+let slowTimerId: ReturnType<typeof setTimeout> | null = null;
const fetchFastData = async () => {
try {
const headers: Record<string, string> = {};
if (fastEtag.current) headers['If-None-Match'] = fastEtag.current;
const res = await fetch(`${API_BASE}/api/live-data/fast`, { headers });
-if (res.status === 304) { setBackendStatus('connected'); return; }
+if (res.status === 304) { setBackendStatus('connected'); scheduleNext('fast'); return; }
if (res.ok) {
setBackendStatus('connected');
fastEtag.current = res.headers.get('etag') || null;
const json = await res.json();
dataRef.current = { ...dataRef.current, ...json };
setDataVersion(v => v + 1);
+// Check if we got real data (backend finished loading)
+const flights = json.commercial_flights?.length || 0;
+if (flights > 100) hasData = true;
}
} catch (e) {
console.error("Failed fetching fast live data", e);
setBackendStatus('disconnected');
}
+scheduleNext('fast');
};
const fetchSlowData = async () => {
@@ -322,7 +332,7 @@ export default function Dashboard() {
const headers: Record<string, string> = {};
if (slowEtag.current) headers['If-None-Match'] = slowEtag.current;
const res = await fetch(`${API_BASE}/api/live-data/slow`, { headers });
-if (res.status === 304) return;
+if (res.status === 304) { scheduleNext('slow'); return; }
if (res.ok) {
slowEtag.current = res.headers.get('etag') || null;
const json = await res.json();
@@ -332,19 +342,26 @@ export default function Dashboard() {
} catch (e) {
console.error("Failed fetching slow live data", e);
}
+scheduleNext('slow');
};
+// Adaptive polling: retry every 3s during startup, back off to normal cadence once data arrives
+const scheduleNext = (tier: 'fast' | 'slow') => {
+if (tier === 'fast') {
+const delay = hasData ? 15000 : 3000; // 3s startup retry → 15s steady state
+fastTimerId = setTimeout(fetchFastData, delay);
+} else {
+const delay = hasData ? 120000 : 5000; // 5s startup retry → 120s steady state
+slowTimerId = setTimeout(fetchSlowData, delay);
+}
+};
fetchFastData();
fetchSlowData();
-// Fast polling: 60s (matches backend update cadence — was 15s, wasting 75% on 304s)
-// Slow polling: 120s (backend updates every 30min)
-const fastInterval = setInterval(fetchFastData, 60000);
-const slowInterval = setInterval(fetchSlowData, 120000);
return () => {
-clearInterval(fastInterval);
-clearInterval(slowInterval);
+if (fastTimerId) clearTimeout(fastTimerId);
+if (slowTimerId) clearTimeout(slowTimerId);
};
}, []);
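The ETag flow in this hook follows standard HTTP conditional requests: remember the last `ETag`, send it back as `If-None-Match`, and treat `304 Not Modified` as "keep the cached payload". A minimal transport-agnostic sketch of the same state machine (the `get` callable is a hypothetical stand-in for the browser's `fetch`):

```python
class EtagPoller:
    """Poll a resource with If-None-Match; a 304 response keeps the cached body."""

    def __init__(self, get):
        self.get = get    # get(headers) -> (status, etag, body)
        self.etag = None
        self.body = None

    def poll(self):
        headers = {}
        if self.etag:
            headers["If-None-Match"] = self.etag
        status, etag, body = self.get(headers)
        if status == 304:
            return self.body      # unchanged: reuse cached payload, save bandwidth
        if status == 200:
            self.etag = etag      # remember validator for the next conditional request
            self.body = body
        return self.body
```

This is why the diff adds `scheduleNext(...)` on the 304 branches: a 304 is a successful poll and must still re-arm the timer, otherwise polling silently stops the first time nothing has changed.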
@@ -418,7 +435,7 @@ export default function Dashboard() {
{/* LEFT HUD CONTAINER */}
<div className="absolute left-6 top-24 bottom-6 w-80 flex flex-col gap-6 z-[200] pointer-events-none">
{/* LEFT PANEL - DATA LAYERS */}
-<WorldviewLeftPanel data={data} activeLayers={activeLayers} setActiveLayers={setActiveLayers} onSettingsClick={() => setSettingsOpen(true)} onLegendClick={() => setLegendOpen(true)} gibsDate={gibsDate} setGibsDate={setGibsDate} gibsOpacity={gibsOpacity} setGibsOpacity={setGibsOpacity} />
+<WorldviewLeftPanel data={data} activeLayers={activeLayers} setActiveLayers={setActiveLayers} onSettingsClick={() => setSettingsOpen(true)} onLegendClick={() => setLegendOpen(true)} gibsDate={gibsDate} setGibsDate={setGibsDate} gibsOpacity={gibsOpacity} setGibsOpacity={setGibsOpacity} onEntityClick={setSelectedEntity} onFlyTo={(lat, lng) => setFlyToLocation({ lat, lng, ts: Date.now() })} />
{/* LEFT BOTTOM - DISPLAY CONFIG */}
<WorldviewRightPanel effects={effects} setEffects={setEffects} setUiVisible={setUiVisible} />
@@ -426,6 +443,8 @@ export default function Dashboard() {
{/* RIGHT HUD CONTAINER */}
<div className="absolute right-6 top-24 bottom-6 w-80 flex flex-col gap-4 z-[200] pointer-events-auto overflow-y-auto styled-scrollbar pr-2">
+<TopRightControls />
{/* FIND / LOCATE */}
<div className="flex-shrink-0">
<FindLocateBar
@@ -473,8 +492,8 @@ export default function Dashboard() {
</div>
</div>
-{/* BOTTOM CENTER COORDINATE / LOCATION BAR */}
-<motion.div
+{/* BOTTOM CENTER COORDINATE / LOCATION BAR — hidden when Sentinel-2 imagery overlay is open */}
+{!(selectedEntity?.type === 'region_dossier' && regionDossier?.sentinel2) && <motion.div
initial={{ opacity: 0, y: 20 }}
animate={{ opacity: 1, y: 0 }}
transition={{ delay: 1, duration: 1 }}
@@ -530,7 +549,7 @@ export default function Dashboard() {
</div>
</div>
</div>
-</motion.div>
+</motion.div>}
</>
)}
@@ -592,7 +611,7 @@ export default function Dashboard() {
{backendStatus === 'disconnected' && (
<div className="absolute top-0 left-0 right-0 z-[9000] flex items-center justify-center py-2 bg-red-950/90 border-b border-red-500/40 backdrop-blur-sm">
<span className="text-[10px] font-mono tracking-widest text-red-400">
-BACKEND OFFLINE Cannot reach {API_BASE}. Start the backend server or check your connection.
+BACKEND OFFLINE Cannot reach backend server. Check that the backend container is running and BACKEND_URL is correct.
</span>
</div>
)}
+2 -4
@@ -656,13 +656,11 @@ export default function CesiumViewer({ data, activeLayers, activeFilters, effect
}
if (filters.tracked_owner?.length) {
const op = (f.alert_operator || '').toLowerCase();
-const t1 = (f.alert_tag1 || '').toLowerCase();
-const t2 = (f.alert_tag2 || '').toLowerCase();
-const t3 = (f.alert_tag3 || '').toLowerCase();
+const tags = (f.alert_tags || '').toLowerCase();
const cs = (f.callsign || '').toLowerCase();
if (!filters.tracked_owner.some(sv => {
const q = sv.toLowerCase();
-return op.includes(q) || t1.includes(q) || t2.includes(q) || t3.includes(q) || cs.includes(q);
+return op.includes(q) || tags.includes(q) || cs.includes(q);
})) return false;
}
return true;
+59 -24
@@ -2,45 +2,60 @@
import React, { useState, useEffect } from "react";
import { motion, AnimatePresence } from "framer-motion";
-import { X, Rss, Server, Zap, Shield, Bug } from "lucide-react";
+import { X, Zap, Shield, Satellite, MapPin, Palette, ToggleRight, Bug, Heart } from "lucide-react";
-const CURRENT_VERSION = "0.6";
+const CURRENT_VERSION = "0.8";
const STORAGE_KEY = `shadowbroker_changelog_v${CURRENT_VERSION}`;
const NEW_FEATURES = [
{
-icon: <Rss size={14} className="text-orange-400" />,
-title: "Custom News Feed Manager",
-desc: "Add, remove, and prioritize up to 20 RSS intelligence sources directly from the Settings panel. Assign weight levels (1-5) to control feed importance. No more editing Python files — your custom feeds persist across restarts.",
-color: "orange",
+icon: <Shield size={14} className="text-pink-400" />,
+title: "POTUS Fleet Tracking",
+desc: "Air Force One, Air Force Two, and Marine One aircraft now display with oversized hot-pink icons and a gold dashed halo ring — instantly recognizable on the map.",
+color: "pink",
},
-{
-icon: <Server size={14} className="text-purple-400" />,
-title: "Global Data Center Map Layer",
-desc: "2,000+ data centers plotted worldwide from a curated dataset. Click any DC for operator details — and if an internet outage is detected in the same country, the popup flags it automatically.",
-color: "purple",
-},
{
-icon: <Zap size={14} className="text-yellow-400" />,
-title: "Imperative Map Rendering",
-desc: "High-volume layers (flights, satellites, fire hotspots) now bypass React reconciliation and update the map directly via setData(). Debounced updates on dense layers. Smoother panning and zooming under load.",
+icon: <Palette size={14} className="text-yellow-400" />,
+title: "Full Aircraft Color-Coding",
+desc: "9-color system: military (yellow), medical/rescue (lime), police/government (blue), privacy (black), VIPs (hot pink), dictators/oligarchs (red), and more — all enriched from plane_alert_db.",
color: "yellow",
},
{
-icon: <Shield size={14} className="text-cyan-400" />,
-title: "Enhanced Health Observability",
-desc: "The /api/health endpoint now reports per-source freshness timestamps and counts for all data layers — UAVs, FIRMS fires, LiveUAMap, GDELT, and more. Better uptime monitoring for self-hosters.",
+icon: <Satellite size={14} className="text-green-400" />,
+title: "Sentinel-2 Satellite Overhaul",
+desc: "Replaced the tiny satellite popup with a fullscreen image overlay. Added Download, Copy to Clipboard, and Open Full Res buttons. Green dossier-themed UI.",
+color: "green",
},
+{
+icon: <MapPin size={14} className="text-blue-400" />,
+title: "Region Dossier & Carrier Fidelity",
+desc: "Fixed Nominatim 429 rate-limit errors with retry/backoff. Carriers at shared homeports now dock at distinct pier positions instead of stacking.",
+color: "blue",
+},
+{
+icon: <Zap size={14} className="text-cyan-400" />,
+title: "Overhauled Map Legend & Controls",
+desc: "Full 9-color aircraft legend with POTUS fleet, wildfires, and infrastructure sections. New version badge, update checker, and Discussions shortcut in the UI.",
+color: "cyan",
+},
+{
+icon: <ToggleRight size={14} className="text-purple-400" />,
+title: "Toggle All Data Layers",
+desc: "One-click button to enable/disable all data layers at once. Turns cyan when active. MODIS Terra excluded from bulk toggle to prevent accidental imagery load.",
+color: "purple",
+},
];
const BUG_FIXES = [
-"Settings panel now has tabbed UI — API Keys and News Feeds on separate tabs",
-"Data center coordinates fixed for 187 Southern Hemisphere entries (were mirrored north of equator)",
-"Docker networking: CORS_ORIGINS env var properly passed through docker-compose",
-"Start scripts warn on Python 3.13+ compatibility issues before install",
-"Satellite and fire hotspot layers debounced (2s) to prevent render thrashing",
-"Entries with invalid geocoded coordinates automatically filtered out",
+"POTUS fleet ICAO codes expanded — all Air Force Two (C-32A/B) airframes now correctly identified with gold halo",
+"POTUS icon priority fixed — presidential aircraft always show the POTUS icon even when grounded",
+"Sentinel-2 imagery no longer overlaps the bottom coordinate bar",
+"Docker ENV format warnings resolved (legacy syntax → key=value)",
+"Settings/Key/Version buttons now cyan in dark mode, grey only in light mode",
+const CONTRIBUTORS = [
+{ name: "@suranyami", desc: "Parallel multi-arch Docker builds (11min → 3min) + runtime BACKEND_URL fix", pr: "#35, #44" },
+];
export function useChangelog() {
@@ -145,6 +160,26 @@ const ChangelogModal = React.memo(function ChangelogModal({ onClose }: Changelog
))}
</div>
</div>
+{/* Contributors */}
+<div>
+<div className="text-[9px] font-mono tracking-[0.2em] text-pink-400 font-bold mb-3 flex items-center gap-2">
+<Heart size={10} className="text-pink-400" />
+COMMUNITY CONTRIBUTORS
+</div>
+<div className="space-y-1.5">
+{CONTRIBUTORS.map((c, i) => (
+<div key={i} className="flex items-start gap-2 px-3 py-2 rounded-lg border border-pink-500/20 bg-pink-500/5">
+<span className="text-pink-400 text-[10px] mt-0.5 flex-shrink-0">&hearts;</span>
+<div>
+<span className="text-[10px] font-mono text-pink-300 font-bold">{c.name}</span>
+<span className="text-[9px] font-mono text-[var(--text-muted)]"> {c.desc}</span>
+<span className="text-[8px] font-mono text-[var(--text-muted)]"> (PR {c.pr})</span>
+</div>
+</div>
+))}
+</div>
+</div>
</div>
{/* Footer */}
+1 -2
@@ -106,8 +106,7 @@ export default function FilterPanel({ data, activeFilters, setActiveFilters }: F
const ops = new Set<string>(trackedOperators);
for (const f of data?.tracked_flights || []) {
if (f.alert_operator) ops.add(f.alert_operator);
-if (f.alert_tag1) ops.add(f.alert_tag1);
-if (f.alert_tag2) ops.add(f.alert_tag2);
+if (f.alert_tags) ops.add(f.alert_tags);
}
return Array.from(ops).sort();
}, [data?.tracked_flights]);
+6 -4
@@ -89,12 +89,13 @@ export default function FindLocateBar({ data, onLocate, onFilter }: FindLocateBa
});
}
-// Tracked flights
+// Tracked flights — include tags/owner/name for broad search (first name, last name, etc.)
for (const f of data?.tracked_flights || []) {
const uid = f.icao24 || f.registration || f.callsign || '';
const operator = f.alert_operator || 'Unknown Operator';
const category = f.alert_category || 'Tracked';
const type = f.alert_type || f.model || 'Unknown';
+const extras = [f.alert_tags, f.owner, f.name, f.callsign].filter(Boolean).join(' ');
results.push({
id: `tracked-${uid}`,
label: operator,
@@ -104,7 +105,8 @@ export default function FindLocateBar({ data, onLocate, onFilter }: FindLocateBa
lat: f.lat,
lng: f.lng,
entityType: "tracked_flight",
-});
+_extra: extras,
+} as any);
}
// Ships
@@ -144,7 +146,7 @@ export default function FindLocateBar({ data, onLocate, onFilter }: FindLocateBa
const q = query.toLowerCase();
return allEntities
.filter(e => {
-const searchable = `${e.label} ${e.sublabel} ${e.id}`.toLowerCase();
+const searchable = `${e.label} ${e.sublabel} ${e.id} ${(e as any)._extra || ''}`.toLowerCase();
return searchable.includes(q);
})
.slice(0, 12);
@@ -177,7 +179,7 @@ export default function FindLocateBar({ data, onLocate, onFilter }: FindLocateBa
ref={inputRef}
type="text"
value={query}
-placeholder="Find aircraft or vessel..."
+placeholder="Find aircraft, person or vessel..."
className="flex-1 bg-transparent text-[10px] text-[var(--text-secondary)] font-mono tracking-wider outline-none placeholder:text-[var(--text-muted)]"
onChange={(e) => {
setQuery(e.target.value);
+34 -5
@@ -101,10 +101,23 @@ const LEGEND: LegendCategory[] = [
name: "TRACKED AIRCRAFT (ALERT)",
color: "text-pink-400 border-pink-500/30",
items: [
-{ svg: airliner("#FF1493"), label: "Alert — Low Priority (pink)" },
-{ svg: airliner("#FF2020"), label: "Alert — High Priority (red)" },
-{ svg: airliner("#1A3A8A"), label: "Alert — Government (navy)" },
-{ svg: airliner("white"), label: "Alert — General (white)" },
+{ svg: airliner("#FF1493"), label: "VIP / Celebrity / Bizjet (hot pink)" },
+{ svg: airliner("#FF2020"), label: "Dictator / Oligarch (red)" },
+{ svg: airliner("#3b82f6"), label: "Government / Police / Customs (blue)" },
+{ svg: heli("#32CD32"), label: "Medical / Fire / Rescue (lime)" },
+{ svg: airliner("yellow"), label: "Military / Intelligence (yellow)" },
+{ svg: airliner("#222"), label: "PIA — Privacy / Stealth (black)" },
+{ svg: airliner("#FF8C00"), label: "Private Flights / Joe Cool (orange)" },
+{ svg: airliner("white"), label: "Climate Crisis (white)" },
+{ svg: airliner("#9B59B6"), label: "Private Jets / Historic / Other (purple)" },
],
},
+{
+name: "POTUS FLEET",
+color: "text-yellow-400 border-yellow-500/30",
+items: [
+{ svg: `<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 32 32"><circle cx="16" cy="16" r="14" fill="none" stroke="gold" stroke-width="2" stroke-dasharray="4 2"/><g transform="translate(6,6)"><path d="M12 2C11.2 2 10.5 2.8 10.5 3.5V8.5L3 13V15L10.5 12.5V18L8 19.5V21L12 19.5L16 21V19.5L13.5 18V12.5L21 15V13L13.5 8.5V3.5C13.5 2.8 12.8 2 12 2Z" fill="#FF1493" stroke="black" stroke-width="0.5"/></g></svg>`, label: "Air Force One / Two (gold ring)" },
+{ svg: `<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 32 32"><circle cx="16" cy="16" r="14" fill="none" stroke="gold" stroke-width="2" stroke-dasharray="4 2"/><g transform="translate(8,6)"><path d="M10 6L10 14L8 16L8 18L10 17L12 22L14 17L16 18L16 16L14 14L14 6C14 4 13 2 12 2C11 2 10 4 10 6Z" fill="#FF1493" stroke="black" stroke-width="0.5"/></g></svg>`, label: "Marine One (gold ring + heli)" },
+],
+},
{
@@ -138,7 +151,15 @@ const LEGEND: LegendCategory[] = [
name: "GEOPHYSICAL",
color: "text-orange-400 border-orange-500/30",
items: [
-{ svg: circle("#ff6600"), label: "Earthquake (size = magnitude)" },
+{ svg: circle("#ffcc00"), label: "Earthquake (yellow blob, size = magnitude)" },
],
},
+{
+name: "WILDFIRES",
+color: "text-red-400 border-red-500/30",
+items: [
+{ svg: `<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 24 24"><path d="M12 1C8 7 5 10 5 14a7 7 0 0 0 14 0c0-4-3-7-7-13z" fill="#ff6600" stroke="#ffcc00" stroke-width="1"/></svg>`, label: "Active wildfire / hotspot" },
+{ svg: clusterCircle("#cc0000", "#ff3300"), label: "Fire cluster (grouped hotspots)" },
+],
+},
{
@@ -166,6 +187,14 @@ const LEGEND: LegendCategory[] = [
{ svg: `<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 24 24"><rect x="3" y="3" width="18" height="18" fill="#ff0040" stroke="#000" stroke-width="1" opacity="0.2" rx="2"/></svg>`, label: "Low severity (25-50% degraded)" },
],
},
+{
+name: "INFRASTRUCTURE",
+color: "text-purple-400 border-purple-500/30",
+items: [
+{ svg: `<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 24 24" fill="none" stroke="#a78bfa" stroke-width="1.5"><rect x="3" y="3" width="18" height="6" rx="1" fill="#2e1065"/><rect x="3" y="11" width="18" height="6" rx="1" fill="#2e1065"/><circle cx="7" cy="6" r="1" fill="#a78bfa"/><circle cx="7" cy="14" r="1" fill="#a78bfa"/></svg>`, label: "Data Center" },
+{ svg: circle("#888"), label: "Internet Outage Zone (grey)" },
+],
+},
{
name: "SURVEILLANCE / CCTV",
color: "text-green-400 border-green-500/30",
File diff suppressed because it is too large
+81 -49
@@ -260,28 +260,40 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
if (flight) {
const callsign = flight.callsign || "UNKNOWN";
const alertColorMap: Record<string, string> = {
-'pink': 'text-pink-400', 'red': 'text-red-400',
-'darkblue': 'text-blue-400', 'white': 'text-white'
+'#ff1493': 'text-[#ff1493]', pink: 'text-[#ff1493]', red: 'text-red-400', yellow: 'text-yellow-400',
+blue: 'text-blue-400', orange: 'text-orange-400', '#32cd32': 'text-[#32cd32]', purple: 'text-purple-400',
+black: 'text-gray-400', white: 'text-white'
};
const alertBorderMap: Record<string, string> = {
-'pink': 'border-pink-500/30', 'red': 'border-red-500/30',
-'darkblue': 'border-blue-500/30', 'white': 'border-[var(--border-primary)]/30'
+'#ff1493': 'border-[#ff1493]/30', pink: 'border-[#ff1493]/30', red: 'border-red-500/30', yellow: 'border-yellow-500/30',
+blue: 'border-blue-500/30', orange: 'border-orange-500/30', '#32cd32': 'border-[#32cd32]/30', purple: 'border-purple-500/30',
+black: 'border-gray-500/30', white: 'border-[var(--border-primary)]/30'
};
const alertBgMap: Record<string, string> = {
-'pink': 'bg-pink-950/40', 'red': 'bg-red-950/40',
-'darkblue': 'bg-blue-950/40', 'white': 'bg-[var(--bg-panel)]'
+'#ff1493': 'bg-[#ff1493]/10', pink: 'bg-[#ff1493]/10', red: 'bg-red-950/40', yellow: 'bg-yellow-950/40',
+blue: 'bg-blue-950/40', orange: 'bg-orange-950/40', '#32cd32': 'bg-lime-950/40', purple: 'bg-purple-950/40',
+black: 'bg-gray-900/40', white: 'bg-[var(--bg-panel)]'
};
const ac = flight.alert_color || 'white';
const headerColor = alertColorMap[ac] || 'text-white';
const borderColor = alertBorderMap[ac] || 'border-[var(--border-primary)]/30';
const bgColor = alertBgMap[ac] || 'bg-[var(--bg-panel)]';
+const shadowColor = (ac === 'pink' || ac === '#ff1493') ? 'rgba(255,20,147,0.4)'
+: ac === 'red' ? 'rgba(255,32,32,0.2)'
+: ac === 'yellow' ? 'rgba(255,255,0,0.2)'
+: ac === 'blue' ? 'rgba(59,130,246,0.2)'
+: ac === 'orange' ? 'rgba(255,140,0,0.3)'
+: ac === '#32cd32' ? 'rgba(50,205,50,0.2)'
+: ac === 'purple' ? 'rgba(155,89,182,0.2)'
+: 'rgba(255,255,255,0.1)';
return (
<motion.div
initial={{ y: 50, opacity: 0 }}
animate={{ y: 0, opacity: 1 }}
transition={{ duration: 0.4 }}
-className={`w-full bg-black/60 backdrop-blur-md border ${ac === 'pink' ? 'border-pink-800' : ac === 'red' ? 'border-red-800' : ac === 'darkblue' ? 'border-blue-800' : 'border-[var(--border-secondary)]'} rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_rgba(255,20,147,0.2)] pointer-events-auto overflow-hidden flex-shrink-0`}
+className={`w-full bg-black/60 backdrop-blur-md border ${(ac === 'pink' || ac === '#ff1493') ? 'border-[#ff1493]' : ac === 'red' ? 'border-red-800' : ac === 'yellow' ? 'border-yellow-800' : ac === 'blue' ? 'border-blue-800' : ac === 'orange' ? 'border-orange-800' : ac === '#32cd32' ? 'border-lime-800' : ac === 'purple' ? 'border-purple-800' : 'border-[var(--border-secondary)]'} rounded-xl flex flex-col z-10 font-mono shadow-[0_4px_30px_${shadowColor}] pointer-events-auto overflow-hidden flex-shrink-0`}
>
<div className={`p-3 border-b ${borderColor} ${bgColor} flex justify-between items-center`}>
<h2 className={`text-xs tracking-widest font-bold ${headerColor} flex items-center gap-2`}>
@@ -293,31 +305,39 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
<div className="p-4 flex flex-col gap-3">
<div className="flex justify-between items-center border-b border-[var(--border-primary)] pb-2">
<span className="text-[var(--text-muted)] text-[10px]">OPERATOR</span>
-{flight.alert_operator && flight.alert_operator !== "UNKNOWN" ? (
-<a
-href={`https://en.wikipedia.org/wiki/${encodeURIComponent(flight.alert_operator.replace(/ /g, '_'))}`}
-target="_blank"
-rel="noreferrer"
-className={`text-xs font-bold underline ${headerColor} hover:opacity-80 transition-opacity`}
-title={`Search Wikipedia for ${flight.alert_operator}`}
->
-{flight.alert_operator}
-</a>
-) : (
+{flight.alert_operator && flight.alert_operator !== "UNKNOWN" ? (() => {
+const wikiSlug = flight.alert_wiki || flight.alert_operator.replace(/\s*\(.*?\)\s*/g, '').trim().replace(/ /g, '_');
+const wikiHref = `https://en.wikipedia.org/wiki/${encodeURIComponent(wikiSlug)}`;
+return (
+<a
+href={wikiHref}
+target="_blank"
+rel="noreferrer"
+className={`text-xs font-bold underline ${headerColor} hover:opacity-80 transition-opacity`}
+title={`Search Wikipedia for ${flight.alert_operator}`}
+>
+{flight.alert_operator}
+</a>
+);
+})() : (
<span className={`text-xs font-bold ${headerColor}`}>UNKNOWN</span>
)}
</div>
{/* Owner/Operator Wikipedia photo */}
-{flight.alert_operator && flight.alert_operator !== "UNKNOWN" && (
-<div className="border-b border-[var(--border-primary)] pb-2">
-<WikiImage
-wikiUrl={`https://en.wikipedia.org/wiki/${encodeURIComponent(flight.alert_operator.replace(/ /g, '_'))}`}
-label={flight.alert_operator}
-maxH="max-h-36"
-accent={ac === 'pink' ? 'hover:border-pink-500/50' : ac === 'red' ? 'hover:border-red-500/50' : 'hover:border-cyan-500/50'}
-/>
-</div>
-)}
+{flight.alert_operator && flight.alert_operator !== "UNKNOWN" && (() => {
+const wikiSlug = flight.alert_wiki || flight.alert_operator.replace(/\s*\(.*?\)\s*/g, '').trim().replace(/ /g, '_');
+const wikiHref = `https://en.wikipedia.org/wiki/${encodeURIComponent(wikiSlug)}`;
+return (
+<div className="border-b border-[var(--border-primary)] pb-2">
+<WikiImage
+wikiUrl={wikiHref}
+label={flight.alert_operator}
+maxH="max-h-36"
+accent={ac === 'pink' ? 'hover:border-pink-500/50' : ac === 'red' ? 'hover:border-red-500/50' : 'hover:border-cyan-500/50'}
+/>
+</div>
+);
+})()}
{/* Aircraft model Wikipedia photo */}
{aircraftImgUrl && (
<div className="border-b border-[var(--border-primary)] pb-2">
@@ -348,22 +368,10 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
<span className="text-[var(--text-muted)] text-[10px]">REGISTRATION</span>
<span className="text-[var(--text-primary)] text-xs font-bold">{flight.registration || "N/A"}</span>
</div>
-{flight.alert_tag1 && (
+{flight.alert_tags && (
<div className="flex justify-between items-center border-b border-[var(--border-primary)] pb-2">
-<span className="text-[var(--text-muted)] text-[10px]">INTEL TAG</span>
-<span className={`text-xs font-bold ${headerColor}`}>{flight.alert_tag1}</span>
-</div>
-)}
-{flight.alert_tag2 && (
-<div className="flex justify-between items-center border-b border-[var(--border-primary)] pb-2">
-<span className="text-[var(--text-muted)] text-[10px]">SECONDARY</span>
-<span className="text-[var(--text-primary)] text-xs font-bold">{flight.alert_tag2}</span>
-</div>
-)}
-{flight.alert_tag3 && (
-<div className="flex justify-between items-center border-b border-[var(--border-primary)] pb-2">
-<span className="text-[var(--text-muted)] text-[10px]">DETAIL</span>
-<span className="text-[var(--text-secondary)] text-xs">{flight.alert_tag3}</span>
+<span className="text-[var(--text-muted)] text-[10px]">INTEL TAGS</span>
+<span className={`text-xs font-bold text-right max-w-[200px] ${headerColor}`}>{flight.alert_tags}</span>
</div>
)}
<div className="flex justify-between items-center border-b border-[var(--border-primary)] pb-2">
@@ -667,10 +675,34 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
</div>
<div className="flex flex-col gap-2 mt-2">
<span className="text-[var(--text-muted)] text-[10px]">LATEST REPORTS:</span>
-<div
-className="text-[var(--text-primary)] text-xs whitespace-normal [&_a]:text-orange-400 [&_a]:underline hover:[&_a]:text-orange-300 [&_br]:mb-2"
-dangerouslySetInnerHTML={{ __html: props.html || 'No articles available.' }}
-/>
+<div className="flex flex-col gap-1 max-h-[250px] overflow-y-auto styled-scrollbar">
+{(() => {
+const urls: string[] = props._urls_list || [];
+const headlines: string[] = props._headlines_list || [];
+if (urls.length === 0) return <span className="text-[var(--text-muted)] text-[10px]">No articles available.</span>;
+return urls.map((url: string, idx: number) => {
+const headline = headlines[idx] || '';
+let domain = '';
+try { domain = new URL(url).hostname.replace('www.', ''); } catch { domain = ''; }
+return (
+<a
+key={idx}
+href={url}
+target="_blank"
+rel="noopener noreferrer"
+className="block py-1.5 border-b border-[var(--border-primary)]/50 last:border-0 cursor-pointer group"
+>
+<span className="text-orange-400 text-[11px] font-bold leading-tight group-hover:text-orange-300 block">
+{headline || domain || 'View Article'}
+</span>
+{headline && domain && (
+<span className="text-[var(--text-muted)] text-[9px] block mt-0.5">{domain}</span>
+)}
+</a>
+);
+});
+})()}
+</div>
</div>
</div>
</motion.div>
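The article-list rewrite above derives a display domain inline via `new URL(url).hostname.replace('www.', '')`. As a standalone sketch (hypothetical helper name; note this variant anchors the regex so only a leading "www." is stripped, whereas the inline string replace removes the first occurrence anywhere):

```typescript
// Hypothetical helper mirroring the inline logic in the diff above.
// Uses an anchored regex so only a leading "www." prefix is removed.
function displayDomain(url: string): string {
  try {
    return new URL(url).hostname.replace(/^www\./, "");
  } catch {
    return ""; // invalid URL — caller falls back to the headline text
  }
}
```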
@@ -966,9 +998,9 @@ function NewsFeedInner({ data, selectedEntity, regionDossier, regionDossierLoadi
<motion.div
key={idx}
ref={(el) => { itemRefs.current[idx] = el; }}
-initial={{ opacity: 0, x: -10 }}
+initial={idx < 15 ? { opacity: 0, x: -10 } : { opacity: 1, x: 0 }}
animate={{ opacity: 1, x: 0 }}
-transition={{ delay: 0.1 + (idx * 0.05) }}
+transition={idx < 15 ? { delay: 0.1 + (idx * 0.05) } : { duration: 0 }}
className={`p-2 rounded-sm border-l-[2px] border-r border-t border-b ${bgClass} flex flex-col gap-1 relative group shrink-0`}
>
<div className="flex items-center justify-between text-[8px] text-[var(--text-secondary)] uppercase tracking-widest">
@@ -0,0 +1,81 @@
"use client";
import { useState } from "react";
import { Github, MessageSquare, Download, AlertCircle, CheckCircle2 } from "lucide-react";
import packageJson from "../../package.json";
export default function TopRightControls() {
const [updateStatus, setUpdateStatus] = useState<"idle" | "checking" | "available" | "uptodate" | "error">("idle");
const [latestVersion, setLatestVersion] = useState<string>("");
const currentVersion = packageJson.version;
const checkForUpdates = async () => {
setUpdateStatus("checking");
try {
const res = await fetch("https://api.github.com/repos/BigBodyCobain/Shadowbroker/releases/latest");
if (!res.ok) throw new Error("Failed to fetch");
const data = await res.json();
// Remove 'v' prefix if it exists to compare semver cleanly
const latest = data.tag_name?.replace('v', '') || data.name?.replace('v', '');
const current = currentVersion.replace('v', '');
if (latest && latest !== current) {
setLatestVersion(latest);
setUpdateStatus("available");
} else {
setUpdateStatus("uptodate");
setTimeout(() => setUpdateStatus("idle"), 3000);
}
} catch (err) {
console.error("Update check failed:", err);
setUpdateStatus("error");
setTimeout(() => setUpdateStatus("idle"), 3000);
}
};
return (
<div className="flex items-center gap-2 mb-1 justify-end">
<a
href="https://github.com/BigBodyCobain/Shadowbroker/discussions"
target="_blank"
rel="noreferrer"
className="flex items-center gap-1.5 px-2.5 py-1.5 bg-[var(--bg-primary)]/50 backdrop-blur-md border border-[var(--border-primary)] rounded-lg hover:border-cyan-500/50 hover:bg-[var(--hover-accent)] transition-all text-[10px] text-[var(--text-secondary)] font-mono cursor-pointer"
>
<MessageSquare size={12} className="text-cyan-400 w-3 h-3" />
<span className="tracking-widest">DISCUSSIONS</span>
</a>
{updateStatus === "available" ? (
<a
href="https://github.com/BigBodyCobain/Shadowbroker/releases/latest"
target="_blank"
rel="noreferrer"
className="flex items-center gap-1.5 px-2.5 py-1.5 bg-green-500/10 backdrop-blur-md border border-green-500/50 rounded-lg hover:bg-green-500/20 transition-all text-[10px] text-green-400 font-mono cursor-pointer shadow-[0_0_15px_rgba(34,197,94,0.3)]"
>
<Download size={12} className="w-3 h-3" />
<span className="tracking-widest animate-pulse">v{latestVersion} UPDATE!</span>
</a>
) : (
<button
onClick={checkForUpdates}
disabled={updateStatus === "checking"}
className="flex items-center gap-1.5 px-2.5 py-1.5 bg-[var(--bg-primary)]/50 backdrop-blur-md border border-[var(--border-primary)] rounded-lg hover:border-cyan-500/50 hover:bg-[var(--hover-accent)] transition-all text-[10px] text-[var(--text-secondary)] font-mono cursor-pointer disabled:opacity-50 disabled:cursor-not-allowed"
>
{updateStatus === "checking" && <Github size={12} className="w-3 h-3 animate-spin text-cyan-400" />}
{updateStatus === "idle" && <Github size={12} className="w-3 h-3 text-cyan-400" />}
{updateStatus === "uptodate" && <CheckCircle2 size={12} className="w-3 h-3 text-green-400" />}
{updateStatus === "error" && <AlertCircle size={12} className="w-3 h-3 text-red-400" />}
<span className="tracking-widest">
{updateStatus === "checking" ? "CHECKING..." :
updateStatus === "uptodate" ? "UP TO DATE" :
updateStatus === "error" ? "CHECK FAILED" :
"CHECK UPDATES"}
</span>
</button>
)}
</div>
);
}
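`checkForUpdates` above flags an update whenever the tag differs from the local version, so a local build that is *ahead* of the latest release would also trigger the banner. A stricter numeric comparison (hypothetical helper, assuming plain MAJOR.MINOR.PATCH tags with no pre-release suffixes) might look like:

```typescript
// Hypothetical: true only when `latest` is strictly newer than `current`.
// Assumes plain numeric MAJOR.MINOR.PATCH tags (no pre-release/build metadata).
function isNewer(latest: string, current: string): boolean {
  const a = latest.replace(/^v/, "").split(".").map(Number);
  const b = current.replace(/^v/, "").split(".").map(Number);
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const x = a[i] ?? 0; // treat missing segments as 0 (e.g. "1.0" vs "1.0.0")
    const y = b[i] ?? 0;
    if (x !== y) return x > y;
  }
  return false; // versions are equal
}
```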
+2 -2
@@ -14,7 +14,7 @@ const _cache: Record<string, { url: string | null; done: boolean }> = {};
* maxH: Max height class (default "max-h-32")
* accent: Border hover color class (default "hover:border-cyan-500/50")
*/
-export default function WikiImage({ wikiUrl, label, maxH = 'max-h-32', accent = 'hover:border-cyan-500/50' }: {
+export default function WikiImage({ wikiUrl, label, maxH = 'max-h-52', accent = 'hover:border-cyan-500/50' }: {
wikiUrl: string;
label?: string;
maxH?: string;
@@ -56,7 +56,7 @@ export default function WikiImage({ wikiUrl, label, maxH = 'max-h-32', accent =
<img
src={imgUrl}
alt={label || title.replace(/_/g, ' ')}
-className={`w-full h-auto ${maxH} object-cover rounded border border-[var(--border-primary)]/50 ${accent} transition-colors`}
+className={`w-full h-auto ${maxH} object-contain rounded border border-[var(--border-primary)]/50 ${accent} transition-colors`}
/>
</a>
)}
+158 -16
@@ -1,8 +1,9 @@
"use client";
-import React, { useState, useEffect, useRef } from "react";
+import React, { useState, useEffect, useRef, useMemo } from "react";
import { motion, AnimatePresence } from "framer-motion";
-import { Plane, AlertTriangle, Activity, Satellite, Cctv, ChevronDown, ChevronUp, Ship, Eye, Anchor, Settings, Sun, Moon, BookOpen, Radio, Play, Pause, Globe, Flame, Wifi, Server } from "lucide-react";
+import { Plane, AlertTriangle, Activity, Satellite, Cctv, ChevronDown, ChevronUp, Ship, Eye, Anchor, Settings, Sun, Moon, BookOpen, Radio, Play, Pause, Globe, Flame, Wifi, Server, Shield, ToggleLeft, ToggleRight } from "lucide-react";
+import packageJson from "../../package.json";
import { useTheme } from "@/lib/ThemeContext";
function relativeTime(iso: string | undefined): string {
@@ -40,10 +41,29 @@ const FRESHNESS_MAP: Record<string, string> = {
datacenters: "datacenters",
};
-const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, activeLayers, setActiveLayers, onSettingsClick, onLegendClick, gibsDate, setGibsDate, gibsOpacity, setGibsOpacity }: { data: any; activeLayers: any; setActiveLayers: any; onSettingsClick?: () => void; onLegendClick?: () => void; gibsDate?: string; setGibsDate?: (d: string) => void; gibsOpacity?: number; setGibsOpacity?: (o: number) => void }) {
+// POTUS fleet ICAO hex codes for client-side filtering
+const POTUS_ICAOS: Record<string, { label: string; type: string }> = {
+'ADFDF8': { label: 'Air Force One (82-8000)', type: 'AF1' },
+'ADFDF9': { label: 'Air Force One (92-9000)', type: 'AF1' },
+'ADFEB7': { label: 'Air Force Two (98-0001)', type: 'AF2' },
+'ADFEB8': { label: 'Air Force Two (98-0002)', type: 'AF2' },
+'ADFEB9': { label: 'Air Force Two (99-0003)', type: 'AF2' },
+'ADFEBA': { label: 'Air Force Two (99-0004)', type: 'AF2' },
+'AE4AE6': { label: 'Air Force Two (09-0015)', type: 'AF2' },
+'AE4AE8': { label: 'Air Force Two (09-0016)', type: 'AF2' },
+'AE4AEA': { label: 'Air Force Two (09-0017)', type: 'AF2' },
+'AE4AEC': { label: 'Air Force Two (19-0018)', type: 'AF2' },
+'AE0865': { label: 'Marine One (VH-3D)', type: 'M1' },
+'AE5E76': { label: 'Marine One (VH-92A)', type: 'M1' },
+'AE5E77': { label: 'Marine One (VH-92A)', type: 'M1' },
+'AE5E79': { label: 'Marine One (VH-92A)', type: 'M1' },
+};
+const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, activeLayers, setActiveLayers, onSettingsClick, onLegendClick, gibsDate, setGibsDate, gibsOpacity, setGibsOpacity, onEntityClick, onFlyTo }: { data: any; activeLayers: any; setActiveLayers: any; onSettingsClick?: () => void; onLegendClick?: () => void; gibsDate?: string; setGibsDate?: (d: string) => void; gibsOpacity?: number; setGibsOpacity?: (o: number) => void; onEntityClick?: (entity: { type: string; id: number; extra?: any }) => void; onFlyTo?: (lat: number, lng: number) => void }) {
const [isMinimized, setIsMinimized] = useState(false);
const { theme, toggleTheme } = useTheme();
const [gibsPlaying, setGibsPlaying] = useState(false);
+const [potusEnabled, setPotusEnabled] = useState(true);
const gibsIntervalRef = useRef<ReturnType<typeof setInterval> | null>(null);
// GIBS time slider play/pause animation
@@ -70,10 +90,34 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
return () => { if (gibsIntervalRef.current) clearInterval(gibsIntervalRef.current); };
}, [gibsPlaying, gibsDate, setGibsDate]);
-// Compute ship category counts
-const importantShipCount = data?.ships?.filter((s: any) => ['carrier', 'military_vessel', 'tanker', 'cargo'].includes(s.type))?.length || 0;
-const passengerShipCount = data?.ships?.filter((s: any) => s.type === 'passenger')?.length || 0;
-const civilianShipCount = data?.ships?.filter((s: any) => !['carrier', 'military_vessel', 'tanker', 'cargo', 'passenger'].includes(s.type))?.length || 0;
+// Compute ship category counts (memoized — ships array can be 1000+ items)
+const { importantShipCount, passengerShipCount, civilianShipCount } = useMemo(() => {
+const ships = data?.ships;
+if (!ships || !ships.length) return { importantShipCount: 0, passengerShipCount: 0, civilianShipCount: 0 };
+let important = 0, passenger = 0, civilian = 0;
+for (const s of ships) {
+const t = s.type;
+if (t === 'carrier' || t === 'military_vessel' || t === 'tanker' || t === 'cargo') important++;
+else if (t === 'passenger') passenger++;
+else civilian++;
+}
+return { importantShipCount: important, passengerShipCount: passenger, civilianShipCount: civilian };
+}, [data?.ships]);
+// Find POTUS fleet planes currently airborne from tracked flights
+const potusFlights = useMemo(() => {
+const tracked = data?.tracked_flights;
+if (!tracked) return [];
+const results: { index: number; flight: any; meta: { label: string; type: string } }[] = [];
+for (let i = 0; i < tracked.length; i++) {
+const f = tracked[i];
+const icao = (f.icao24 || '').toUpperCase();
+if (POTUS_ICAOS[icao]) {
+results.push({ index: i, flight: f, meta: POTUS_ICAOS[icao] });
+}
+}
+return results;
+}, [data?.tracked_flights]);
const layers = [
{ id: "flights", name: "Commercial Flights", source: "adsb.lol", count: data?.commercial_flights?.length || 0, icon: Plane },
@@ -82,7 +126,7 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
{ id: "military", name: "Military Flights", source: "adsb.lol", count: data?.military_flights?.length || 0, icon: AlertTriangle },
{ id: "tracked", name: "Tracked Aircraft", source: "Plane-Alert DB", count: data?.tracked_flights?.length || 0, icon: Eye },
{ id: "earthquakes", name: "Earthquakes (24h)", source: "USGS", count: data?.earthquakes?.length || 0, icon: Activity },
-{ id: "satellites", name: "Satellites", source: "CelesTrak SGP4", count: data?.satellites?.length || 0, icon: Satellite },
+{ id: "satellites", name: "Satellites", source: data?.satellite_source === "celestrak" ? "CelesTrak SGP4" : data?.satellite_source === "tle_api" ? "TLE API · SGP4" : data?.satellite_source === "disk_cache" ? "Cached · SGP4 (est.)" : "CelesTrak SGP4", count: data?.satellites?.length || 0, icon: Satellite },
{ id: "ships_important", name: "Carriers / Mil / Cargo", source: "AIS Stream", count: importantShipCount, icon: Ship },
{ id: "ships_civilian", name: "Civilian Vessels", source: "AIS Stream", count: civilianShipCount, icon: Anchor },
{ id: "ships_passenger", name: "Cruise / Passenger", source: "AIS Stream", count: passengerShipCount, icon: Anchor },
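The satellites entry above encodes the source label as a nested ternary; the same mapping can be written as a lookup table (a sketch with hypothetical names, keeping the "CelesTrak SGP4" fallback for unknown sources):

```typescript
// Sketch: map backend satellite_source values to the UI labels used above.
// Unknown or missing sources fall back to the default label, as in the ternary.
const SAT_SOURCE_LABELS: Record<string, string> = {
  celestrak: "CelesTrak SGP4",
  tle_api: "TLE API · SGP4",
  disk_cache: "Cached · SGP4 (est.)",
};

function satSourceLabel(source?: string): string {
  return SAT_SOURCE_LABELS[source ?? ""] ?? "CelesTrak SGP4";
}
```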
@@ -116,7 +160,7 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
<h1 className="text-2xl font-bold tracking-[0.2em] text-[var(--text-heading)]">FLIR</h1>
<button
onClick={toggleTheme}
-className="w-7 h-7 rounded-lg border border-[var(--border-primary)] hover:border-cyan-500/50 flex items-center justify-center text-[var(--text-muted)] hover:text-cyan-400 transition-all hover:bg-[var(--hover-accent)]"
+className={`w-7 h-7 rounded-lg border border-[var(--border-primary)] hover:border-cyan-500/50 flex items-center justify-center ${theme === 'dark' ? 'text-cyan-400' : 'text-[var(--text-muted)]'} hover:text-cyan-300 transition-all hover:bg-[var(--hover-accent)]`}
title={theme === 'dark' ? 'Switch to Light Mode' : 'Switch to Dark Mode'}
>
{theme === 'dark' ? <Sun size={14} /> : <Moon size={14} />}
@@ -124,7 +168,7 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
{onSettingsClick && (
<button
onClick={onSettingsClick}
-className="w-7 h-7 rounded-lg border border-[var(--border-primary)] hover:border-cyan-500/50 flex items-center justify-center text-[var(--text-muted)] hover:text-cyan-400 transition-all hover:bg-[var(--hover-accent)] group"
+className={`w-7 h-7 rounded-lg border border-[var(--border-primary)] hover:border-cyan-500/50 flex items-center justify-center ${theme === 'dark' ? 'text-cyan-400' : 'text-[var(--text-muted)]'} hover:text-cyan-300 transition-all hover:bg-[var(--hover-accent)] group`}
title="System Settings"
>
<Settings size={14} className="group-hover:rotate-90 transition-transform duration-300" />
@@ -133,13 +177,16 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
{onLegendClick && (
<button
onClick={onLegendClick}
-className="h-7 px-2 rounded-lg border border-[var(--border-primary)] hover:border-cyan-500/50 flex items-center justify-center gap-1 text-[var(--text-muted)] hover:text-cyan-400 transition-all hover:bg-[var(--hover-accent)]"
+className={`h-7 px-2 rounded-lg border border-[var(--border-primary)] hover:border-cyan-500/50 flex items-center justify-center gap-1 ${theme === 'dark' ? 'text-cyan-400' : 'text-[var(--text-muted)]'} hover:text-cyan-300 transition-all hover:bg-[var(--hover-accent)]`}
title="Map Legend / Icon Key"
>
<BookOpen size={12} />
<span className="text-[8px] font-mono tracking-widest font-bold">KEY</span>
</button>
)}
+<span className={`h-7 px-2 rounded-lg border border-[var(--border-primary)] flex items-center justify-center text-[8px] ${theme === 'dark' ? 'text-cyan-400' : 'text-[var(--text-muted)]'} font-mono tracking-widest select-none`}>
+v{packageJson.version}
+</span>
</div>
</div>
@@ -149,12 +196,30 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
{/* Header / Toggle */}
<div
className="flex justify-between items-center p-4 cursor-pointer hover:bg-[var(--bg-secondary)]/50 transition-colors border-b border-[var(--border-primary)]/50"
-onClick={() => setIsMinimized(!isMinimized)}
>
-<span className="text-[10px] text-[var(--text-muted)] font-mono tracking-widest">DATA LAYERS</span>
-<button className="text-[var(--text-muted)] hover:text-[var(--text-primary)] transition-colors">
-{isMinimized ? <ChevronDown size={14} /> : <ChevronUp size={14} />}
-</button>
+<span className="text-[10px] text-[var(--text-muted)] font-mono tracking-widest" onClick={() => setIsMinimized(!isMinimized)}>DATA LAYERS</span>
+<div className="flex items-center gap-2">
+<button
+title={Object.entries(activeLayers).filter(([k]) => k !== 'gibs_imagery').every(([, v]) => v) ? "Disable all layers" : "Enable all layers"}
+className={`${Object.entries(activeLayers).filter(([k]) => k !== 'gibs_imagery').every(([, v]) => v) ? 'text-cyan-400' : 'text-[var(--text-muted)]'} hover:text-cyan-400 transition-colors`}
+onClick={(e) => {
+e.stopPropagation();
+const allOn = Object.entries(activeLayers).filter(([k]) => k !== 'gibs_imagery').every(([, v]) => v);
+setActiveLayers((prev: any) => {
+const next: any = {};
+for (const k of Object.keys(prev)) {
+next[k] = k === 'gibs_imagery' ? false : !allOn;
+}
+return next;
+});
+}}
+>
+{Object.entries(activeLayers).filter(([k]) => k !== 'gibs_imagery').every(([, v]) => v) ? <ToggleRight size={16} /> : <ToggleLeft size={16} />}
+</button>
+<button className="text-[var(--text-muted)] hover:text-[var(--text-primary)] transition-colors" onClick={() => setIsMinimized(!isMinimized)}>
+{isMinimized ? <ChevronDown size={14} /> : <ChevronUp size={14} />}
+</button>
+</div>
</div>
<AnimatePresence>
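The toggle-all button added above evaluates `Object.entries(activeLayers).filter(...).every(...)` three separate times per render (for the title, the className, and the onClick). Hoisting that predicate into a helper (hypothetical name; a sketch, not the committed code) computes it once:

```typescript
// Hypothetical helper: true when every layer except the GIBS imagery
// overlay is enabled — the same predicate the toggle-all button repeats inline.
function allLayersOn(activeLayers: Record<string, boolean>): boolean {
  return Object.entries(activeLayers)
    .filter(([key]) => key !== "gibs_imagery")
    .every(([, enabled]) => enabled);
}
```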
@@ -166,6 +231,61 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
className="overflow-y-auto styled-scrollbar"
>
<div className="flex flex-col gap-6 p-4 pt-2 pb-6">
+{/* POTUS Fleet — pinned to TOP when aircraft are active */}
+{potusEnabled && potusFlights.length > 0 && (
+<div className="bg-[#ff1493]/5 border border-[#ff1493]/30 rounded-lg p-3 -mt-1">
+<div className="flex items-center justify-between mb-2">
+<div className="flex items-center gap-2">
+<Shield size={14} className="text-[#ff1493]" />
+<span className="text-[10px] text-[#ff1493] font-mono tracking-widest font-bold">POTUS FLEET</span>
+<span className="text-[9px] font-mono px-1.5 py-0.5 rounded-full bg-[#ff1493]/20 border border-[#ff1493]/40 text-[#ff1493] animate-pulse">
+{potusFlights.length} ACTIVE
+</span>
+</div>
+<button
+onClick={(e) => { e.stopPropagation(); setPotusEnabled(false); }}
+className="text-[8px] font-mono text-[var(--text-muted)] hover:text-[#ff1493] border border-[var(--border-primary)] hover:border-[#ff1493]/40 rounded px-1.5 py-0.5 transition-colors"
+title="Hide POTUS Fleet tracker"
+>
+HIDE
+</button>
+</div>
+<div className="flex flex-col gap-2">
+{potusFlights.map((pf) => {
+const color = pf.meta.type === 'AF1' ? '#ff1493' : pf.meta.type === 'M1' ? '#ff1493' : '#3b82f6';
+const alt = pf.flight.alt_baro || pf.flight.alt || 0;
+const speed = pf.flight.gs || pf.flight.speed || 0;
+return (
+<div
+key={pf.flight.icao24}
+className="flex items-center justify-between p-2 rounded-lg border cursor-pointer transition-all hover:bg-[var(--bg-secondary)]/60"
+style={{ borderColor: `${color}40`, background: `${color}10` }}
+onClick={() => {
+if (onFlyTo && pf.flight.lat != null && pf.flight.lng != null) {
+onFlyTo(pf.flight.lat, pf.flight.lng);
+}
+if (onEntityClick) {
+onEntityClick({ type: 'tracked_flight', id: pf.index });
+}
+}}
+>
+<div className="flex flex-col">
+<span className="text-[10px] font-bold font-mono" style={{ color }}>{pf.meta.label}</span>
+<span className="text-[8px] text-[var(--text-muted)] font-mono mt-0.5">
+{alt > 0 ? `${Math.round(alt).toLocaleString()} ft` : 'GND'} · {speed > 0 ? `${Math.round(speed)} kts` : 'STATIC'}
+</span>
+</div>
+<div className="flex items-center gap-1.5">
+<div className="w-1.5 h-1.5 rounded-full animate-pulse" style={{ backgroundColor: color }} />
+<span className="text-[8px] font-mono" style={{ color }}>TRACK</span>
+</div>
+</div>
+);
+})}
+</div>
+</div>
+)}
{layers.map((layer, idx) => {
const Icon = layer.icon;
const active = activeLayers[layer.id as keyof typeof activeLayers] || false;
@@ -251,6 +371,28 @@ const WorldviewLeftPanel = React.memo(function WorldviewLeftPanel({ data, active
</div>
)
})}
+{/* POTUS Fleet — bottom section when inactive or hidden */}
+{(potusFlights.length === 0 || !potusEnabled) && (
+<div className="border-t border-[var(--border-primary)]/50 pt-4 mt-2">
+<div className="flex items-center justify-between">
+<div className="flex items-center gap-2">
+<Shield size={14} className="text-[var(--text-muted)]" />
+<span className="text-[10px] text-[var(--text-muted)] font-mono tracking-widest">POTUS FLEET</span>
+</div>
+{!potusEnabled ? (
+<button
+onClick={(e) => { e.stopPropagation(); setPotusEnabled(true); }}
+className="text-[8px] font-mono text-[var(--text-muted)] hover:text-[#ff1493] border border-[var(--border-primary)] hover:border-[#ff1493]/40 rounded px-1.5 py-0.5 transition-colors"
+>
+SHOW
+</button>
+) : (
+<span className="text-[8px] font-mono text-[var(--text-muted)]">NO ACTIVE AIRCRAFT</span>
+)}
+</div>
+</div>
+)}
</div>
</motion.div>
)}
+8 -28
@@ -1,28 +1,8 @@
-// NEXT_PUBLIC_* vars are baked at build time in Next.js, so setting them
-// in docker-compose `environment` has no effect at runtime. Instead we
-// auto-detect: use the browser's current hostname with a configurable port
-// so the dashboard works on localhost, LAN IPs, and custom Docker port maps
-// without any code changes.
-//
-// Override order:
-// 1. Build-time NEXT_PUBLIC_API_URL (for advanced users who rebuild the image)
-// 2. Runtime auto-detect from window.location.hostname + port 8000
-function resolveApiBase(): string {
-// Build-time override (works when image is rebuilt with the env var)
-if (process.env.NEXT_PUBLIC_API_URL) {
-return process.env.NEXT_PUBLIC_API_URL;
-}
-// Server-side rendering: fall back to localhost
-if (typeof window === "undefined") {
-return "http://localhost:8000";
-}
-// Client-side: use the same hostname the user is browsing on
-const proto = window.location.protocol;
-const host = window.location.hostname;
-return `${proto}//${host}:8000`;
-}
-export const API_BASE = resolveApiBase();
+// All API calls use relative paths (e.g. /api/flights).
+// The catch-all route handler at src/app/api/[...path]/route.ts proxies them
+// to BACKEND_URL at runtime (set in docker-compose or .env.local for dev).
+// This means:
+// - No build-time baking of the backend URL into the client bundle
+// - BACKEND_URL=http://backend:8000 works via Docker internal networking
+// - Only port 3000 needs to be exposed externally
+export const API_BASE = "";
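A minimal sketch of the catch-all proxy the comment refers to. This is an assumption of how src/app/api/[...path]/route.ts could be written, not the committed implementation (the `params` shape also varies between Next.js versions):

```typescript
// Sketch of a Next.js App Router catch-all proxy — an assumption of how
// src/app/api/[...path]/route.ts could forward requests to BACKEND_URL.
const BACKEND = process.env.BACKEND_URL ?? "http://localhost:8000";

// Pure helper so the upstream URL construction is testable in isolation.
export function buildTarget(backend: string, segments: string[], search: string): string {
  return `${backend}/api/${segments.join("/")}${search}`;
}

export async function GET(req: Request, ctx: { params: { path: string[] } }) {
  const url = new URL(req.url);
  const upstream = await fetch(buildTarget(BACKEND, ctx.params.path, url.search), {
    cache: "no-store", // always hit the live backend
  });
  // Stream the backend response through unchanged.
  return new Response(upstream.body, { status: upstream.status, headers: upstream.headers });
}
```

Because the client bundle only ever sees relative `/api/...` paths, the backend address stays a pure runtime concern of the Node server process.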