Compare commits

...

18 Commits

Author SHA1 Message Date
dependabot[bot] 29b8e35eaf chore(deps): bump requests from 2.31.0 to 2.34.1 in /backend
Bumps [requests](https://github.com/psf/requests) from 2.31.0 to 2.34.1.
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](https://github.com/psf/requests/compare/v2.31.0...v2.34.1)

---
updated-dependencies:
- dependency-name: requests
  dependency-version: 2.34.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-05-14 07:25:24 +00:00
BigBodyCobain 25a98a9869 Harden Infonet DM address flow and seed sync
Allow local-operator DM invite import without requiring a full admin session.

Prioritize bundled/bootstrap seed peers and shorten stale seed cooldowns for faster Infonet recovery.

Replace raw DM invite dumps with copyable signed-address controls, contact request handling, and safer sealed-send behavior while the private delivery route connects.
2026-05-12 21:23:38 -06:00
BigBodyCobain 2ce0e43ee5 Fix secure messaging test expectations 2026-05-12 12:46:56 -06:00
BigBodyCobain b86a258535 Release v0.9.79 runtime and messaging update
Ship the v0.9.79 runtime refresh with transport lane isolation, Infonet secure-message address management, MeshChat MQTT controls, selected asset trail behavior, telemetry panel refinements, onboarding updates, and desktop/package metadata alignment.

Also ignore local graphify work products so analysis folders do not leak into future commits.
2026-05-12 11:49:46 -06:00
BigBodyCobain 85636ce95c Stabilize secure mail warmup test 2026-05-06 22:54:11 -06:00
BigBodyCobain 5ee4f8ecd7 Stabilize Infonet private sync and selected telemetry 2026-05-06 22:10:04 -06:00
BigBodyCobain b8ac0fb9e7 Harden v0.9.75 wormhole node sync and telemetry panels
Add Tor/onion runtime wiring and faster Infonet node status refresh.

Keep node bootstrap state clearer across Docker and local runtimes.

Use selected aircraft trail history for cumulative tracked-aircraft emissions.
2026-05-06 14:04:16 -06:00
BigBodyCobain 8926e08009 Fix cached satellite propagation 2026-05-06 02:25:10 -06:00
BigBodyCobain 585a08bbac Fix MeshChat decomposition release gate 2026-05-06 01:46:26 -06:00
BigBodyCobain 6ffd54931c Release v0.9.75 runtime and onboarding update
Ship the 0.9.75 source update with improved startup/runtime hardening, operator API key onboarding, Meshtastic MQTT controls, Infonet/MeshChat separation, desktop package versioning, and aircraft telemetry refinements.

Also updates focused backend/frontend tests for node settings, Meshtastic MQTT settings, and desktop runtime behavior.
2026-05-06 01:15:54 -06:00
BigBodyCobain a017ba86d6 Fix desktop release packaging without signing keys 2026-05-04 21:54:29 -06:00
BigBodyCobain 9427935c7f Align CSP tests with hydration-safe policy 2026-05-04 13:04:31 -06:00
BigBodyCobain 63043b32b5 Stabilize Docker startup and runtime proxy
Reduce cold-start stalls by raising the default backend memory limit, bounding heavy feed concurrency, preserving non-empty startup caches, and refreshing working news feeds. Fix the Next.js API proxy for Docker control-plane writes by stripping unsupported hop/body headers and forwarding small request bodies safely. Keep the dashboard dynamic so production users do not get stuck on a cached startup shell.
2026-05-04 12:37:23 -06:00
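The hop-header stripping this commit message describes can be sketched as follows. This is an illustration of the general technique, not the project's actual proxy code (which is not shown in this diff); the helper name is hypothetical.

```python
# Sketch of hop-by-hop header stripping for a reverse proxy (RFC 9110 §7.6.1).
# Hypothetical helper illustrating the technique the commit describes.
HOP_BY_HOP = {
    "connection", "keep-alive", "proxy-authenticate", "proxy-authorization",
    "te", "trailer", "transfer-encoding", "upgrade",
}

def forwardable_headers(headers: dict[str, str]) -> dict[str, str]:
    # The Connection header may name additional hop-by-hop headers to drop.
    listed = {
        token.strip().lower()
        for token in headers.get("connection", "").split(",")
        if token.strip()
    }
    # Host and Content-Length are recomputed by the upstream HTTP client,
    # so they must not be forwarded verbatim either.
    drop = HOP_BY_HOP | listed | {"host", "content-length"}
    return {k: v for k, v in headers.items() if k.lower() not in drop}
```

Forwarding only the surviving headers avoids the class of failures the commit fixes, where an upstream rejects chunked transfer or stale connection tokens.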
BigBodyCobain 1e34fa53b2 Make Docker backend port configurable 2026-05-03 21:13:31 -06:00
BigBodyCobain d69602be9e Align CSP test with production hydration policy 2026-05-03 14:06:39 -06:00
BigBodyCobain ce9ba39cd2 Fix production CSP hydration 2026-05-03 13:59:07 -06:00
BigBodyCobain 3eafb622ed Clarify Podman compose setup 2026-05-03 08:44:56 -06:00
Shadowbroker eb5564ca0e Update README.md 2026-05-03 02:59:03 -06:00
93 changed files with 5388 additions and 1358 deletions
+15
@@ -58,6 +58,21 @@ ADMIN_KEY=
# FAST_STARTUP_CACHE_MAX_AGE_S=21600
# INTEL_STARTUP_CACHE_MAX_AGE_S=21600
# Docker resource tuning. The backend synthesizes large geospatial feeds; keep
# this at 4G or higher on hosts that run AIS, OpenSky, CCTV, satellites, and
# threat feeds together. Lower caps can cause Docker OOM restarts and empty
# slow layers such as news, UAP sightings, military bases, and wastewater.
# BACKEND_MEMORY_LIMIT=4G
# SHADOWBROKER_FETCH_WORKERS=8
# SHADOWBROKER_SLOW_FETCH_CONCURRENCY=4
# SHADOWBROKER_STARTUP_HEAVY_CONCURRENCY=2
# Infonet bootstrap/sync responsiveness. Defaults favor fast seed failure
# detection so stale onion peers do not make the terminal look hung.
# MESH_SYNC_TIMEOUT_S=5
# MESH_SYNC_MAX_PEERS_PER_CYCLE=3
# MESH_BOOTSTRAP_SEED_FAILURE_COOLDOWN_S=15
# Google Earth Engine for VIIRS night lights change detection (optional).
# pip install earthengine-api
# GEE_SERVICE_ACCOUNT_KEY=
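Tunables like the ones commented out above are typically read with a defaulting parser so that an unset or malformed value falls back safely. The variable names below come from this file; the `env_int` helper itself is an illustrative sketch, not the project's loader.

```python
import os

def env_int(name: str, default: int) -> int:
    # Fall back to the default when the variable is unset, blank, or malformed.
    raw = os.environ.get(name, "").strip()
    try:
        return int(raw) if raw else default
    except ValueError:
        return default

# Defaults mirror the commented values in the .env template above.
MESH_SYNC_TIMEOUT_S = env_int("MESH_SYNC_TIMEOUT_S", 5)
MESH_SYNC_MAX_PEERS_PER_CYCLE = env_int("MESH_SYNC_MAX_PEERS_PER_CYCLE", 3)
MESH_BOOTSTRAP_SEED_FAILURE_COOLDOWN_S = env_int(
    "MESH_BOOTSTRAP_SEED_FAILURE_COOLDOWN_S", 15
)
```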
+2
@@ -173,6 +173,8 @@ backend/services/test_*.py
# Local analysis & dev tools
backend/analyze_xlsx.py
backend/services/ais_cache.json
graphify/
graphify-out/
# ========================
# Internal docs & brainstorming (never commit)
+42 -9
@@ -17,7 +17,7 @@
**ShadowBroker** is a decentralized real-time, multi-domain OSINT dashboard that fuses 60+ live intelligence feeds into a single dark-ops map interface. Aircraft, ships, satellites, conflict zones, CCTV networks, GPS jamming, internet-connected devices, police scanners, mesh radio nodes, and breaking geopolitical events — all updating in real time on one screen as well as an obfuscated communications protocol and information exchange infrastructure.
**ShadowBroker** is a decentralized intelligence platform that aggregates real-time, multi-domain OSINT telemetry from 60+ live intelligence feeds into a single dark-ops map interface. Aircraft, ships, satellites, conflict zones, CCTV networks, GPS jamming, internet-connected devices, police scanners, mesh radio nodes, and breaking geopolitical events — all updating in real time on one screen as well as an obfuscated communications protocol and information exchange infrastructure.
Built with **Next.js**, **MapLibre GL**, **FastAPI**, and **Python**. 35+ toggleable data layers, including SAR ground-change detection. Multiple visual modes (DEFAULT / SATELLITE / FLIR / NVG / CRT). Right-click any point on Earth for a country dossier, head-of-state lookup, and the latest Sentinel-2 satellite photo. No user data is collected or transmitted — the dashboard runs entirely in your browser against a self-hosted backend.
@@ -70,7 +70,11 @@ docker compose up -d
Open `http://localhost:3000` to view the dashboard! *(Requires [Docker Desktop](https://www.docker.com/products/docker-desktop/) or Docker Engine)*
> **Podman users:** Replace `docker compose` with `podman compose`, or use the `compose.sh` wrapper which auto-detects your engine. Force Podman with `./compose.sh --engine podman up -d`.
> **Backend port already in use?** The browser only needs port `3000`, but the backend API is also published on host port `8000` for local diagnostics. If another app already uses `8000`, create or edit `.env` next to `docker-compose.yml` and set `BACKEND_PORT=8001`, then run `docker compose up -d`.
> **Blank news/UAP/bases/wastewater after several minutes?** Check for backend OOM restarts with `docker events --since 30m --filter container=shadowbroker-backend --filter event=oom`. The default compose file gives the backend 4GB; if your host has less memory, reduce enabled feeds or set `BACKEND_MEMORY_LIMIT=3G` and expect slower/heavier layers to warm more gradually.
> **Podman users:** Podman works, but `podman compose` is a wrapper and still needs a Compose provider installed. On Windows/WSL, if you see `looking up compose provider failed`, install `podman-compose` and run `podman-compose pull` followed by `podman-compose up -d` from inside the cloned `Shadowbroker` folder. On Linux/macOS/WSL shells you can also use `./compose.sh --engine podman pull` and `./compose.sh --engine podman up -d`.
---
@@ -93,6 +97,8 @@ That's it. `pull` grabs the latest images, `up -d` restarts the containers.
> docker compose pull
> docker compose up -d
> ```
>
> Podman users should run the equivalent provider command, for example `podman-compose pull` and `podman-compose up -d`, or use `./compose.sh --engine podman pull` and `./compose.sh --engine podman up -d` from a bash-compatible shell.
### ⚠️ **Stuck on the old version?**
@@ -560,25 +566,51 @@ Open `http://localhost:3000` to view the dashboard.
> **Deploying publicly or on a LAN?** No configuration needed for most setups.
> The frontend proxies all API calls through the Next.js server to `BACKEND_URL`,
> which defaults to `http://backend:8000` (Docker internal networking).
> Port 8000 does not need to be exposed externally.
> Host port `8000` is only published for local API/debug access. If it conflicts
> with another service, set `BACKEND_PORT=8001` in `.env`; leave `BACKEND_URL`
> as `http://backend:8000` because that is the Docker-internal port.
> The backend memory cap is controlled by `BACKEND_MEMORY_LIMIT` and defaults
> to `4G`. If Docker reports OOM events, the backend will restart and slow
> layers can look empty until they repopulate.
>
> If your backend runs on a **different host or port**, set `BACKEND_URL` at runtime — no rebuild required:
>
> ```bash
> # Linux / macOS
> BACKEND_URL=http://myserver.com:9096 docker-compose up -d
> BACKEND_URL=http://myserver.com:9096 docker compose up -d
>
> # Podman (via compose.sh wrapper)
> BACKEND_URL=http://192.168.1.50:9096 ./compose.sh up -d
>
> # Windows (PowerShell)
> $env:BACKEND_URL="http://myserver.com:9096"; docker-compose up -d
> $env:BACKEND_URL="http://myserver.com:9096"; docker compose up -d
>
> # Or add to a .env file next to docker-compose.yml:
> # BACKEND_URL=http://myserver.com:9096
> ```
**Podman users:** Replace `docker compose` with `podman compose`, or use the `compose.sh` wrapper which auto-detects your engine.
**Podman users:** Do not pass the GitHub URL to `podman compose pull`; clone the repo first, `cd Shadowbroker`, then run compose from that folder. `podman compose` also requires a Compose provider. If Podman reports `looking up compose provider failed`, install one:
```bash
# Linux / macOS / WSL
python3 -m pip install --user podman-compose
podman-compose pull
podman-compose up -d
```
```powershell
# Windows PowerShell
py -m pip install --user podman-compose
podman-compose pull
podman-compose up -d
```
If you are in a bash-compatible shell, the included wrapper can auto-detect Docker or Podman:
```bash
./compose.sh --engine podman pull
./compose.sh --engine podman up -d
```
---
@@ -600,7 +632,7 @@ services:
# image: registry.gitlab.com/bigbodycobain/shadowbroker/backend:latest
container_name: shadowbroker-backend
ports:
- "8000:8000"
- "${BACKEND_PORT:-8000}:8000"
environment:
- AIS_API_KEY=your_aisstream_key # Required — get one free at aisstream.io
- OPENSKY_CLIENT_ID= # Optional — higher flight data rate limits
@@ -630,7 +662,7 @@ volumes:
backend_data:
```
> **How it works:** The frontend container proxies all `/api/*` requests through the Next.js server to `BACKEND_URL` using Docker's internal networking. The browser only ever talks to port 3000 — port 8000 does not need to be exposed externally.
> **How it works:** The frontend container proxies all `/api/*` requests through the Next.js server to `BACKEND_URL` using Docker's internal networking. The browser only ever talks to port 3000. The backend's host port is for local API/debug access and can be changed with `BACKEND_PORT=8001` without changing `BACKEND_URL`.
>
> `BACKEND_URL` is a plain runtime environment variable (not a build-time `NEXT_PUBLIC_*`), so you can change it in Portainer, Uncloud, or any compose editor without rebuilding the image. Set it to the address where your backend is reachable from inside the Docker network (e.g. `http://backend:8000`, `http://192.168.1.50:8000`).
@@ -933,8 +965,9 @@ Then confirm authenticated `GET /api/wormhole/status` or `GET /api/settings/worm
| Variable | Where to set | Purpose |
|---|---|---|
| `BACKEND_URL` | `environment` in `docker-compose.yml`, or shell env | URL the Next.js server uses to proxy API calls to the backend. Defaults to `http://backend:8000`. **Runtime variable — no rebuild needed.** |
| `BACKEND_PORT` | repo-root `.env` or shell env before `docker compose up` | Host port used to expose the backend API for local diagnostics. Defaults to `8000`; set `BACKEND_PORT=8001` if port 8000 is already in use. Does not change Docker-internal `BACKEND_URL`. |
**How it works:** The frontend proxies all `/api/*` requests through the Next.js server to `BACKEND_URL` using Docker's internal networking. Browsers only talk to port 3000; port 8000 never needs to be exposed externally. For local dev without Docker, `BACKEND_URL` defaults to `http://localhost:8000`.
**How it works:** The frontend proxies all `/api/*` requests through the Next.js server to `BACKEND_URL` using Docker's internal networking. Browsers only talk to port 3000; the backend host port is only for local diagnostics. For local dev without Docker, `BACKEND_URL` defaults to `http://localhost:8000`.
---
+9 -3
@@ -54,8 +54,9 @@ AIS_API_KEY= # https://aisstream.io/ — free tier WebSocket key
# MESH_MQTT_INCLUDE_DEFAULT_ROOTS=true
# MESH_MQTT_BROKER=mqtt.meshtastic.org
# MESH_MQTT_PORT=1883
# MESH_MQTT_USER=meshdev
# MESH_MQTT_PASS=large4cats
# Leave user/pass blank for the public Meshtastic broker default.
# MESH_MQTT_USER=
# MESH_MQTT_PASS=
# Optional Meshtastic node ID (e.g. "!abcd1234"). When set, included in the
# User-Agent sent to meshtastic.liamcottle.net so the upstream service operator
@@ -127,7 +128,12 @@ AIS_API_KEY= # https://aisstream.io/ — free tier WebSocket key
# MESH_BOOTSTRAP_DISABLED=false
# MESH_BOOTSTRAP_MANIFEST_PATH=data/bootstrap_peers.json
# MESH_BOOTSTRAP_SIGNER_PUBLIC_KEY=
# MESH_DEFAULT_SYNC_PEERS=https://node.shadowbroker.info # bundled pull-only public seed for fresh installs
# Infonet/Wormhole fails closed to onion/RNS by default. Only enable clearnet
# sync for local relay development or an explicitly public testnet.
# MESH_INFONET_ALLOW_CLEARNET_SYNC=false
# MESH_BOOTSTRAP_SEED_PEERS=http://gqpbunqbgtkcqilvclm3xrkt3zowjyl3s62kkktvojgvxzizamvbrqid.onion:8000
# Add comma-separated http://*.onion peers as more private seed/relay nodes come online.
# MESH_DEFAULT_SYNC_PEERS= # legacy alias; prefer MESH_BOOTSTRAP_SEED_PEERS
# MESH_RELAY_PEERS= # comma-separated operator-trusted sync/push peers (empty by default)
# MESH_PEER_PUSH_SECRET= # REQUIRED when relay/RNS peers are configured (min 16 chars, generate with: python -c "import secrets; print(secrets.token_urlsafe(32))")
# MESH_SYNC_INTERVAL_S=300
+3 -1
@@ -22,10 +22,12 @@ FROM python:3.11-slim-bookworm
WORKDIR /app
# Install Node.js (for AIS WebSocket proxy) and curl (for network fallback)
# Install Node.js (for AIS WebSocket proxy), curl (for network fallback), and
# Tor (for Wormhole/remote-agent .onion transport).
RUN apt-get update && apt-get install -y --no-install-recommends \
ca-certificates \
curl \
tor \
&& curl -fsSL https://deb.nodesource.com/setup_20.x | bash - \
&& apt-get install -y --no-install-recommends nodejs \
&& rm -rf /var/lib/apt/lists/*
+2
@@ -361,6 +361,8 @@ async def _verify_openclaw_hmac(request: Request) -> bool:
# Bind request body: digest the raw bytes so any body tampering
# invalidates the signature. Empty/absent bodies hash as sha256(b"").
body_bytes = await request.body()
# Keep the cached body available for downstream handlers that call request.json().
request._body = body_bytes
body_digest = _hashlib_mod.sha256(body_bytes).hexdigest()
# Compute expected signature: HMAC-SHA256(secret, METHOD|path|ts|nonce|body_digest)
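The signing scheme those comments spell out — HMAC-SHA256 over `METHOD|path|ts|nonce|body_digest`, with the body digested as raw bytes so an empty body hashes as `sha256(b"")` — can be sketched client-side like this. The canonical `|`-joined string and hex digests follow the comments above; the function names are illustrative, not the project's API.

```python
import hashlib
import hmac

def sign_request(secret: bytes, method: str, path: str,
                 ts: str, nonce: str, body: bytes) -> str:
    # Digest the raw body bytes; an absent body hashes as sha256(b"").
    body_digest = hashlib.sha256(body).hexdigest()
    # Canonical string mirrors the server: METHOD|path|ts|nonce|body_digest
    message = "|".join([method.upper(), path, ts, nonce, body_digest])
    return hmac.new(secret, message.encode(), hashlib.sha256).hexdigest()

def verify(secret: bytes, provided_sig: str, method: str, path: str,
           ts: str, nonce: str, body: bytes) -> bool:
    expected = sign_request(secret, method, path, ts, nonce, body)
    # Constant-time comparison avoids leaking the signature via timing.
    return hmac.compare_digest(expected, provided_sig)
```

Because the body digest is part of the signed string, any tampering with the payload invalidates the signature, which is exactly the property the commit's comment calls out.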
+20 -10
@@ -1,15 +1,5 @@
{
"feeds": [
{
"name": "Reuters",
"url": "https://www.reutersagency.com/feed/?best-topics=world",
"weight": 5
},
{
"name": "AP News",
"url": "https://rsshub.app/apnews/topics/world-news",
"weight": 5
},
{
"name": "NPR",
"url": "https://feeds.npr.org/1004/rss.xml",
@@ -99,6 +89,26 @@
"name": "Japan Times",
"url": "https://www.japantimes.co.jp/feed/",
"weight": 3
},
{
"name": "CSM",
"url": "https://www.csmonitor.com/rss/world",
"weight": 4
},
{
"name": "PBS NewsHour",
"url": "https://www.pbs.org/newshour/feeds/rss/world",
"weight": 4
},
{
"name": "France 24",
"url": "https://www.france24.com/en/rss",
"weight": 4
},
{
"name": "DW",
"url": "https://rss.dw.com/xml/rss-en-world",
"weight": 4
}
]
}
+5
@@ -14,4 +14,9 @@ if [ -d /app/image-data ]; then
done
fi
if [ -z "${PRIVACY_CORE_ALLOWED_SHA256:-}" ] && [ -f /app/libprivacy_core.so ]; then
PRIVACY_CORE_ALLOWED_SHA256="$(sha256sum /app/libprivacy_core.so | awk '{print $1}')"
export PRIVACY_CORE_ALLOWED_SHA256
fi
exec "$@"
+331 -68
@@ -14,7 +14,7 @@ from dataclasses import dataclass, field
from typing import Any
from json import JSONDecodeError
APP_VERSION = "0.9.7"
APP_VERSION = "0.9.79"
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
@@ -1084,6 +1084,7 @@ _WORMHOLE_PUBLIC_PROFILE_FIELDS = {"profile", "wormhole_enabled"}
_PRIVATE_LANE_CONTROL_FIELDS = {"private_lane_tier", "private_lane_policy"}
_PUBLIC_RNS_STATUS_FIELDS = {"enabled", "ready", "configured_peers", "active_peers"}
_NODE_PUBLIC_EVENT_HOOK_REGISTERED = False
_INFONET_PRIVATE_TRANSPORT_LOCK = threading.Lock()
def _current_node_mode() -> str:
@@ -1114,9 +1115,10 @@ def _participant_node_enabled() -> bool:
def _node_runtime_snapshot() -> dict[str, Any]:
with _NODE_RUNTIME_LOCK:
return {
"node_mode": _NODE_BOOTSTRAP_STATE.get("node_mode", "participant"),
"node_mode": _current_node_mode(),
"node_enabled": _participant_node_enabled(),
"bootstrap": dict(_NODE_BOOTSTRAP_STATE),
"private_transport_required": _infonet_private_transport_required(),
"bootstrap": {**dict(_NODE_BOOTSTRAP_STATE), "node_mode": _current_node_mode()},
"sync_runtime": get_sync_state().to_dict(),
"push_runtime": dict(_NODE_PUSH_STATE),
}
@@ -1149,6 +1151,79 @@ def _set_participant_node_enabled(enabled: bool) -> dict[str, Any]:
}
def _infonet_private_transport_required() -> bool:
return not bool(getattr(get_settings(), "MESH_INFONET_ALLOW_CLEARNET_SYNC", False))
def _infonet_private_transport_error() -> str:
return "private Infonet requires onion/RNS transport; no clearnet sync fallback"
def _is_private_infonet_transport(transport: str) -> bool:
return str(transport or "").strip().lower() in {"onion", "rns"}
def _filter_infonet_sync_records(records: list[Any]) -> list[Any]:
if not _infonet_private_transport_required():
return records
return [
record
for record in records
if _is_private_infonet_transport(str(getattr(record, "transport", "") or ""))
]
def _ensure_infonet_private_transport_ready(reason: str = "") -> bool:
"""Warm the local onion transport before private Infonet sync.
Infonet may know about an onion seed before the Wormhole UI is opened. The
sync worker still needs Arti marked enabled and a ready SOCKS listener, so
do that lazily in the worker instead of making users manually open another
panel just to participate in the Infonet.
"""
if not _infonet_private_transport_required():
return True
try:
from services.wormhole_supervisor import _check_arti_ready
if bool(get_settings().MESH_ARTI_ENABLED) and _check_arti_ready():
return True
except Exception:
pass
if not _INFONET_PRIVATE_TRANSPORT_LOCK.acquire(blocking=False):
return False
try:
from routers.ai_intel import _write_env_value
from services.tor_hidden_service import tor_service
from services.wormhole_supervisor import _check_arti_ready
label = f" ({reason})" if reason else ""
logger.info("Infonet private transport warmup starting%s", label)
tor_result = tor_service.start(target_port=8000)
if tor_result.get("ok"):
_write_env_value("MESH_ARTI_ENABLED", "true")
get_settings.cache_clear()
if _check_arti_ready():
logger.info("Infonet private transport ready%s", label)
return True
logger.warning("Infonet private transport warmup incomplete%s: %s", label, tor_result)
return False
except Exception as exc:
logger.warning("Infonet private transport warmup failed: %s", exc)
return False
finally:
_INFONET_PRIVATE_TRANSPORT_LOCK.release()
def _configured_bootstrap_seed_peer_urls() -> list[str]:
settings = get_settings()
primary = str(getattr(settings, "MESH_BOOTSTRAP_SEED_PEERS", "") or "").strip()
legacy = str(getattr(settings, "MESH_DEFAULT_SYNC_PEERS", "") or "").strip()
return parse_configured_relay_peers(primary or legacy)
def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
from services.mesh.mesh_bootstrap_manifest import load_bootstrap_manifest_from_settings
from services.mesh.mesh_peer_store import (
@@ -1167,14 +1242,17 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
except Exception:
store = PeerStore(DEFAULT_PEER_STORE_PATH)
private_transport_required = _infonet_private_transport_required()
operator_peers = configured_relay_peer_urls()
default_sync_peers = parse_configured_relay_peers(
str(getattr(get_settings(), "MESH_DEFAULT_SYNC_PEERS", "") or "")
)
bootstrap_seed_peers = _configured_bootstrap_seed_peer_urls()
skipped_clearnet_peers = 0
for peer_url in operator_peers:
transport = peer_transport_kind(peer_url)
if not transport:
continue
if private_transport_required and not _is_private_infonet_transport(transport):
skipped_clearnet_peers += 1
continue
store.upsert(
make_sync_peer_record(
peer_url=peer_url,
@@ -1195,19 +1273,22 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
)
operator_peer_set = set(operator_peers)
for peer_url in default_sync_peers:
for peer_url in bootstrap_seed_peers:
if peer_url in operator_peer_set:
continue
transport = peer_transport_kind(peer_url)
if not transport:
continue
if private_transport_required and not _is_private_infonet_transport(transport):
skipped_clearnet_peers += 1
continue
store.upsert(
make_bootstrap_peer_record(
peer_url=peer_url,
transport=transport,
role="seed",
label="ShadowBroker default seed",
signer_id="shadowbroker-default",
label="ShadowBroker bootstrap seed",
signer_id="shadowbroker-bootstrap",
now=timestamp,
)
)
@@ -1217,8 +1298,8 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
transport=transport,
role="seed",
source="bundle",
label="ShadowBroker default seed",
signer_id="shadowbroker-default",
label="ShadowBroker bootstrap seed",
signer_id="shadowbroker-bootstrap",
now=timestamp,
)
)
@@ -1232,6 +1313,9 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
if manifest is not None:
for peer in manifest.peers:
if private_transport_required and not _is_private_infonet_transport(peer.transport):
skipped_clearnet_peers += 1
continue
store.upsert(
make_bootstrap_peer_record(
peer_url=peer.peer_url,
@@ -1254,17 +1338,30 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
)
)
if private_transport_required and skipped_clearnet_peers and not bootstrap_error:
bootstrap_error = _infonet_private_transport_error()
store.save()
bootstrap_records = store.records_for_bucket("bootstrap")
sync_records = store.records_for_bucket("sync")
push_records = store.records_for_bucket("push")
if private_transport_required:
bootstrap_records = [record for record in bootstrap_records if _is_private_infonet_transport(record.transport)]
sync_records = [record for record in sync_records if _is_private_infonet_transport(record.transport)]
push_records = [record for record in push_records if _is_private_infonet_transport(record.transport)]
snapshot = {
"node_mode": mode,
"private_transport_required": private_transport_required,
"skipped_clearnet_peer_count": skipped_clearnet_peers,
"manifest_loaded": manifest is not None,
"manifest_signer_id": manifest.signer_id if manifest is not None else "",
"manifest_valid_until": int(manifest.valid_until or 0) if manifest is not None else 0,
"bootstrap_peer_count": len(store.records_for_bucket("bootstrap")),
"sync_peer_count": len(store.records_for_bucket("sync")),
"push_peer_count": len(store.records_for_bucket("push")),
"bootstrap_peer_count": len(bootstrap_records),
"sync_peer_count": len(sync_records),
"push_peer_count": len(push_records),
"operator_peer_count": len(operator_peers),
"default_sync_peer_count": len(default_sync_peers),
"bootstrap_seed_peer_count": len(bootstrap_seed_peers),
"default_sync_peer_count": len(bootstrap_seed_peers),
"last_bootstrap_error": bootstrap_error,
}
with _NODE_RUNTIME_LOCK:
@@ -1285,14 +1382,22 @@ def _peer_sync_response(peer_url: str, body: dict[str, Any]) -> dict[str, Any]:
normalized = normalize_peer_url(peer_url)
if not normalized:
raise ValueError("invalid peer URL")
transport = peer_transport_kind(normalized)
if _infonet_private_transport_required() and not _is_private_infonet_transport(transport):
raise RuntimeError(_infonet_private_transport_error())
timeout = int(get_settings().MESH_RELAY_PUSH_TIMEOUT_S or 10)
settings = get_settings()
timeout = int(
getattr(settings, "MESH_SYNC_TIMEOUT_S", 0)
or getattr(settings, "MESH_RELAY_PUSH_TIMEOUT_S", 0)
or 10
)
kwargs: dict[str, Any] = {
"json": body,
"timeout": timeout,
"headers": {"Content-Type": "application/json"},
}
if peer_transport_kind(normalized) == "onion":
if transport == "onion":
if not bool(get_settings().MESH_ARTI_ENABLED):
raise RuntimeError("onion sync requires Arti to be enabled")
if not _check_arti_ready():
@@ -1407,20 +1512,39 @@ def _run_public_sync_cycle() -> SyncWorkerState:
except Exception:
store = PeerStore(DEFAULT_PEER_STORE_PATH)
peers = eligible_sync_peers(store.records(), now=time.time())
records = _filter_infonet_sync_records(store.records())
peers = eligible_sync_peers(records, now=time.time())
max_peers = max(1, int(getattr(get_settings(), "MESH_SYNC_MAX_PEERS_PER_CYCLE", 0) or 3))
peers = peers[:max_peers]
with _NODE_RUNTIME_LOCK:
current_state = get_sync_state()
if not peers:
updated = finish_solo_sync(
current_state,
now=time.time(),
current_head=infonet.head_hash,
interval_s=int(get_settings().MESH_SYNC_INTERVAL_S or 300),
)
if _infonet_private_transport_required():
updated = finish_sync(
current_state,
ok=False,
error=_infonet_private_transport_error(),
now=time.time(),
current_head=infonet.head_hash,
failure_backoff_s=int(get_settings().MESH_SYNC_FAILURE_BACKOFF_S or 60),
)
else:
updated = finish_solo_sync(
current_state,
now=time.time(),
current_head=infonet.head_hash,
interval_s=int(get_settings().MESH_SYNC_INTERVAL_S or 300),
)
with _NODE_RUNTIME_LOCK:
set_sync_state(updated)
return updated
if _infonet_private_transport_required() and any(
str(getattr(record, "transport", "") or "").strip().lower() == "onion"
for record in peers
):
_ensure_infonet_private_transport_ready("sync")
last_error = "sync failed"
for record in peers:
started = begin_sync(
@@ -1454,14 +1578,25 @@ def _run_public_sync_cycle() -> SyncWorkerState:
return updated
last_error = error
settings = get_settings()
is_seed_peer = str(getattr(record, "role", "") or "").strip().lower() == "seed"
cooldown_s = int(getattr(settings, "MESH_RELAY_FAILURE_COOLDOWN_S", 120) or 120)
if is_seed_peer:
cooldown_s = int(
getattr(settings, "MESH_BOOTSTRAP_SEED_FAILURE_COOLDOWN_S", cooldown_s)
or cooldown_s
)
store.mark_failure(
record.peer_url,
"sync",
error=error,
cooldown_s=int(get_settings().MESH_RELAY_FAILURE_COOLDOWN_S or 120),
cooldown_s=cooldown_s,
now=time.time(),
)
store.save()
failure_backoff_s = int(settings.MESH_SYNC_FAILURE_BACKOFF_S or 60)
if is_seed_peer:
failure_backoff_s = min(failure_backoff_s, max(1, cooldown_s))
updated = finish_sync(
started,
ok=False,
@@ -1471,7 +1606,7 @@ def _run_public_sync_cycle() -> SyncWorkerState:
fork_detected=forked,
now=time.time(),
interval_s=int(get_settings().MESH_SYNC_INTERVAL_S or 300),
failure_backoff_s=int(get_settings().MESH_SYNC_FAILURE_BACKOFF_S or 60),
failure_backoff_s=failure_backoff_s,
)
with _NODE_RUNTIME_LOCK:
set_sync_state(updated)
@@ -1489,6 +1624,33 @@ def _run_public_sync_cycle() -> SyncWorkerState:
)
_NODE_SYNC_KICK_LOCK = threading.Lock()
def _kick_public_sync_background(reason: str = "") -> None:
"""Start one immediate Infonet sync attempt without waiting for the poll loop."""
if not _node_runtime_supported() or not _participant_node_enabled():
return
def _runner() -> None:
if not _NODE_SYNC_KICK_LOCK.acquire(blocking=False):
return
try:
label = f" ({reason})" if reason else ""
logger.info("Infonet sync kick starting%s", label)
_run_public_sync_cycle()
except Exception:
logger.exception("Infonet sync kick failed")
finally:
_NODE_SYNC_KICK_LOCK.release()
threading.Thread(
target=_runner,
daemon=True,
name="infonet-sync-kick",
).start()
def _public_infonet_sync_loop() -> None:
from services.mesh.mesh_hashchain import infonet
@@ -2202,9 +2364,16 @@ async def lifespan(app: FastAPI):
_refresh_node_peer_store()
if _node_runtime_supported():
if not _participant_node_enabled():
set_sync_state(_set_node_sync_disabled_state())
logger.info("Infonet participant auto-enabled for private seed sync")
_set_participant_node_enabled(True)
threading.Thread(
target=lambda: _ensure_infonet_private_transport_ready("startup"),
daemon=True,
name="infonet-private-transport-warmup",
).start()
_NODE_SYNC_STOP.clear()
threading.Thread(target=_public_infonet_sync_loop, daemon=True).start()
_kick_public_sync_background("startup")
threading.Thread(target=_http_peer_push_loop, daemon=True).start()
threading.Thread(target=_http_gate_push_loop, daemon=True).start()
threading.Thread(target=_http_gate_pull_loop, daemon=True).start()
@@ -2910,6 +3079,24 @@ def _resume_private_delivery_background_work(*, current_tier: str, reason: str)
)
def _is_public_meshtastic_lane_path(path: str, method: str) -> bool:
"""Routes for the public Meshtastic MQTT lane.
These are intentionally outside the Wormhole/Infonet private transport
lifecycle. Polling public MeshChat must not wake or re-enable Wormhole.
"""
normalized_path = str(path or "").strip()
method_name = str(method or "").upper()
if method_name == "POST" and normalized_path == "/api/mesh/meshtastic/send":
return True
if method_name == "GET" and normalized_path in {
"/api/mesh/messages",
"/api/mesh/channels",
}:
return True
return False
def _upgrade_invite_scoped_contact_preferences_background() -> dict[str, Any]:
try:
from services.mesh.mesh_wormhole_contacts import upgrade_invite_scoped_contact_preferences
@@ -2941,7 +3128,11 @@ def _refresh_lookup_handle_rotation_background(*, reason: str) -> dict[str, Any]
@app.middleware("http")
async def enforce_high_privacy_mesh(request: Request, call_next):
path = request.url.path
if path.startswith("/api/mesh") or path.startswith("/api/wormhole/gate/") or path.startswith("/api/wormhole/dm/"):
private_mesh_path = path.startswith("/api/mesh") and not _is_public_meshtastic_lane_path(
path,
request.method,
)
if private_mesh_path or path.startswith("/api/wormhole/gate/") or path.startswith("/api/wormhole/dm/"):
request.state._private_lane_started_at = time.perf_counter()
current_tier = "public_degraded"
try:
@@ -3042,7 +3233,7 @@ async def enforce_high_privacy_mesh(request: Request, call_next):
# Don't block the request on the upgrade — the transport
# manager will converge in the background.
if (
path.startswith("/api/mesh")
private_mesh_path
and str(data.get("privacy_profile", "default")).lower() == "high"
and not bool(data.get("enabled"))
):
@@ -3275,8 +3466,16 @@ async def update_layers(update: LayerUpdate, request: Request):
from services.sigint_bridge import sigint_grid
if old_mesh and not new_mesh:
sigint_grid.mesh.stop()
logger.info("Meshtastic MQTT bridge stopped (layer disabled)")
try:
from services.meshtastic_mqtt_settings import mqtt_bridge_enabled
keep_chat_running = mqtt_bridge_enabled()
except Exception:
keep_chat_running = False
if keep_chat_running:
logger.info("Meshtastic map layer disabled; MQTT bridge kept running for MeshChat")
else:
sigint_grid.mesh.stop()
logger.info("Meshtastic MQTT bridge stopped (layer disabled)")
elif not old_mesh and new_mesh:
# Respect the global MESH_MQTT_ENABLED gate even when the UI layer is
# toggled on. The layer toggle should not bypass the opt-in flag that
@@ -4210,9 +4409,11 @@ async def mesh_send(request: Request):
any_ok = any(r.ok for r in results)
# ─── Mirror to Meshtastic bridge feed ────────────────────────
# The MQTT broker won't echo our own publishes back to our subscriber,
# so inject successfully-sent messages into the bridge's deque directly.
if any_ok and envelope.routed_via == "meshtastic":
# The MQTT broker won't echo our own publishes back to our subscriber, so
# inject successfully-sent channel broadcasts into the bridge directly.
# Node-targeted packets must not appear in the public channel feed.
is_direct_destination = MeshtasticTransport._parse_node_id(destination) is not None
if any_ok and envelope.routed_via == "meshtastic" and not is_direct_destination:
try:
from services.sigint_bridge import sigint_grid
@@ -4220,16 +4421,22 @@ async def mesh_send(request: Request):
if bridge:
from datetime import datetime
bridge.messages.appendleft(
append_text = getattr(bridge, "append_text_message", None)
message_record = (
{
"from": MeshtasticTransport.mesh_address_for_sender(node_id),
"to": destination if MeshtasticTransport._parse_node_id(destination) is not None else "broadcast",
"to": "broadcast",
"text": message,
"region": credentials.get("mesh_region", "US"),
"root": credentials.get("mesh_region", "US"),
"channel": body.get("channel", "LongFast"),
"timestamp": datetime.utcnow().isoformat() + "Z",
}
)
if callable(append_text):
append_text(message_record)
else:
bridge.messages.appendleft(message_record)
except Exception:
pass # Non-critical
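Because the MQTT broker does not echo a client's own publishes back to its subscription, the hunk above injects successfully-sent channel broadcasts into the bridge feed locally, preferring a dedicated `append_text_message` hook when the bridge exposes one and falling back to the raw deque otherwise. A minimal sketch of that fallback pattern (the `Bridge` class is a stand-in, not the project's bridge):

```python
from collections import deque

class Bridge:
    """Stand-in for the MQTT bridge: newest messages sit at the left."""
    def __init__(self):
        self.messages = deque(maxlen=500)

def echo_locally(bridge, record: dict) -> None:
    # Prefer a dedicated hook if the bridge provides one; otherwise
    # prepend onto the raw message deque directly.
    append_text = getattr(bridge, "append_text_message", None)
    if callable(append_text):
        append_text(record)
    else:
        bridge.messages.appendleft(record)

bridge = Bridge()
echo_locally(bridge, {"from": "!abcd1234", "to": "broadcast", "text": "hi"})
```

The `getattr`/`callable` probe lets newer bridge implementations add validation or dedup inside the hook without breaking older deployments.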
@@ -4239,6 +4446,8 @@ async def mesh_send(request: Request):
"event_id": "",
"routed_via": envelope.routed_via,
"route_reason": envelope.route_reason,
"direct": is_direct_destination,
"channel_echo": not is_direct_destination,
"results": [r.to_dict() for r in results],
}
@@ -4337,6 +4546,7 @@ async def mesh_messages(
root: str = "",
channel: str = "",
limit: int = 30,
include_direct: bool = False,
):
"""Get recent Meshtastic text messages from the MQTT bridge."""
from services.sigint_bridge import sigint_grid
@@ -4358,6 +4568,12 @@ async def mesh_messages(
msgs = [m for m in msgs if m.get("root", "").upper() == root_filter]
if channel:
msgs = [m for m in msgs if m.get("channel", "").lower() == channel.lower()]
if not include_direct:
msgs = [
m
for m in msgs
if str(m.get("to") or "broadcast").strip().lower() in {"", "broadcast", "^all"}
]
return msgs[: min(limit, 100)]
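The new `include_direct` guard keeps node-targeted packets out of the public feed by comparing each record's `to` field against the broadcast sentinels, treating a missing field as broadcast. Sketched in isolation (the sample records are made up):

```python
BROADCAST_SENTINELS = {"", "broadcast", "^all"}

def broadcast_only(msgs: list[dict]) -> list[dict]:
    """Drop direct (node-addressed) messages from a channel feed."""
    return [
        m for m in msgs
        if str(m.get("to") or "broadcast").strip().lower() in BROADCAST_SENTINELS
    ]

feed = [
    {"to": "broadcast", "text": "public"},
    {"to": "!deadbeef", "text": "direct"},
    {"to": "^all", "text": "also public"},
    {"text": "no to-field counts as broadcast"},
]
```

The `m.get("to") or "broadcast"` idiom maps both a missing key and an explicit `None` to the broadcast sentinel before normalization.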
@@ -8638,6 +8854,16 @@ export_wormhole_dm_invite = getattr(
"export_wormhole_dm_invite",
_wormhole_identity_unavailable,
)
list_prekey_lookup_handle_records_for_ui = getattr(
_mesh_wormhole_identity,
"list_prekey_lookup_handle_records_for_ui",
_wormhole_identity_unavailable,
)
revoke_prekey_lookup_handle = getattr(
_mesh_wormhole_identity,
"revoke_prekey_lookup_handle",
_wormhole_identity_unavailable,
)
import_wormhole_dm_invite = getattr(
_mesh_wormhole_identity,
"import_wormhole_dm_invite",
@@ -8784,7 +9010,17 @@ async def api_get_node_settings(request: Request):
@limiter.limit("10/minute")
async def api_set_node_settings(request: Request, body: NodeSettingsUpdate):
_refresh_node_peer_store()
return _set_participant_node_enabled(bool(body.enabled))
if bool(body.enabled):
try:
from services.transport_lane_isolation import disable_public_mesh_lane
disable_public_mesh_lane(reason="private_node_enabled")
except Exception as exc:
logger.warning("Failed to disable public Mesh while enabling private node: %s", exc)
result = _set_participant_node_enabled(bool(body.enabled))
if bool(body.enabled):
_kick_public_sync_background("operator_enable")
return result
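The node-settings handler above enforces lane exclusivity in a fixed order: best-effort teardown of the public Mesh lane first, then the participant-flag flip, then a background sync kick. A toy state machine showing that ordering (all names and the `state` dict are illustrative stand-ins for the real services):

```python
state = {"public_mesh": True, "private_node": False, "sync_kicked": False}

def disable_public_mesh_lane(reason: str) -> None:
    state["public_mesh"] = False

def set_participant_node_enabled(enabled: bool) -> dict:
    state["private_node"] = enabled
    return {"node_enabled": enabled}

def kick_public_sync_background(reason: str) -> None:
    state["sync_kicked"] = True

def set_node_settings(enabled: bool) -> dict:
    if enabled:
        try:
            disable_public_mesh_lane(reason="private_node_enabled")
        except Exception:
            pass  # best-effort: lane teardown must never block the toggle
    result = set_participant_node_enabled(enabled)
    if enabled:
        kick_public_sync_background("operator_enable")
    return result
```

Wrapping only the teardown in `try`/`except` mirrors the handler's choice: a failed lane disable is logged and tolerated, while the enable itself always proceeds.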
@app.get("/api/settings/wormhole")
@@ -9505,24 +9741,35 @@ async def api_get_wormhole_status(request: Request):
)
@app.post("/api/wormhole/join", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/join")
@limiter.limit("10/minute")
async def api_wormhole_join(request: Request):
existing = read_wormhole_settings()
updated = write_wormhole_settings(
enabled=True,
transport="direct",
socks_proxy="",
transport="tor_arti",
socks_proxy=f"socks5h://127.0.0.1:{int(get_settings().MESH_ARTI_SOCKS_PORT or 9050)}",
socks_dns=True,
anonymous_mode=False,
anonymous_mode=True,
)
transport_changed = (
str(existing.get("transport", "direct")) != "direct"
or str(existing.get("socks_proxy", "")) != ""
str(existing.get("transport", "direct")) != "tor_arti"
or str(existing.get("socks_proxy", "")) != str(updated.get("socks_proxy", ""))
or bool(existing.get("socks_dns", True)) is not True
or bool(existing.get("anonymous_mode", False)) is not False
or bool(existing.get("anonymous_mode", False)) is not True
or bool(existing.get("enabled", False)) is not True
)
tor_result: dict[str, Any] = {"ok": False, "detail": "not started"}
try:
from services.tor_hidden_service import tor_service
from routers.ai_intel import _write_env_value
tor_result = await asyncio.to_thread(tor_service.start)
if tor_result.get("ok"):
_write_env_value("MESH_ARTI_ENABLED", "true")
get_settings.cache_clear()
except Exception as exc:
tor_result = {"ok": False, "detail": str(exc or type(exc).__name__)}
bootstrap_wormhole_identity()
bootstrap_wormhole_persona_state()
state = (
@@ -9544,19 +9791,19 @@ async def api_wormhole_join(request: Request):
"identity": get_transport_identity(),
"runtime": state,
"settings": updated,
"tor": tor_result,
}
@app.post("/api/wormhole/leave", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/leave")
@limiter.limit("10/minute")
async def api_wormhole_leave(request: Request):
updated = write_wormhole_settings(enabled=False)
state = disconnect_wormhole(reason="leave_wormhole")
# Disable node participation when the user leaves the Wormhole.
from services.node_settings import write_node_settings
write_node_settings(enabled=False)
# Leaving private DM mode must not disable Infonet participation. Infonet
# sync has its own private transport warmup and can remain connected to
# seed/peer nodes while MeshChat stays separately opt-in.
return {
"ok": True,
@@ -9565,7 +9812,7 @@ async def api_wormhole_leave(request: Request):
}
@app.get("/api/wormhole/identity", dependencies=[Depends(require_local_operator)])
@app.get("/api/wormhole/identity")
@limiter.limit("30/minute")
async def api_wormhole_identity(request: Request):
try:
@@ -9578,7 +9825,7 @@ async def api_wormhole_identity(request: Request):
raise HTTPException(status_code=500, detail="wormhole_identity_failed") from exc
@app.post("/api/wormhole/identity/bootstrap", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/identity/bootstrap")
@limiter.limit("10/minute")
async def api_wormhole_identity_bootstrap(request: Request):
bootstrap_wormhole_identity()
@@ -9611,11 +9858,27 @@ async def api_wormhole_dm_identity(request: Request):
@app.get("/api/wormhole/dm/invite", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
async def api_wormhole_dm_invite(request: Request):
return export_wormhole_dm_invite()
async def api_wormhole_dm_invite(
request: Request,
label: str = Query("", max_length=96),
expires_in_s: int = Query(0, ge=0, le=2_592_000),
):
return export_wormhole_dm_invite(label=label, expires_in_s=expires_in_s)
@app.post("/api/wormhole/dm/invite/import", dependencies=[Depends(require_admin)])
@app.get("/api/wormhole/dm/invite/handles", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
async def api_wormhole_dm_invite_handles(request: Request):
return list_prekey_lookup_handle_records_for_ui()
@app.delete("/api/wormhole/dm/invite/handles/{handle}", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
async def api_wormhole_dm_invite_handle_revoke(request: Request, handle: str):
return revoke_prekey_lookup_handle(handle)
@app.post("/api/wormhole/dm/invite/import", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
async def api_wormhole_dm_invite_import(request: Request, body: WormholeDmInviteImportRequest):
return import_wormhole_dm_invite(
@@ -10342,7 +10605,7 @@ async def api_wormhole_sign(request: Request, body: WormholeSignRequest):
)
@app.post("/api/wormhole/gate/enter", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/enter")
@limiter.limit("20/minute")
async def api_wormhole_gate_enter(request: Request, body: WormholeGateRequest):
gate_id = str(body.gate_id or "")
@@ -10356,25 +10619,25 @@ async def api_wormhole_gate_enter(request: Request, body: WormholeGateRequest):
return result
@app.post("/api/wormhole/gate/leave", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/leave")
@limiter.limit("20/minute")
async def api_wormhole_gate_leave(request: Request, body: WormholeGateRequest):
return leave_gate(str(body.gate_id or ""))
@app.get("/api/wormhole/gate/{gate_id}/identity", dependencies=[Depends(require_local_operator)])
@app.get("/api/wormhole/gate/{gate_id}/identity")
@limiter.limit("30/minute")
async def api_wormhole_gate_identity(request: Request, gate_id: str):
return get_active_gate_identity(gate_id)
@app.get("/api/wormhole/gate/{gate_id}/personas", dependencies=[Depends(require_local_operator)])
@app.get("/api/wormhole/gate/{gate_id}/personas")
@limiter.limit("30/minute")
async def api_wormhole_gate_personas(request: Request, gate_id: str):
return list_gate_personas(gate_id)
@app.get("/api/wormhole/gate/{gate_id}/key", dependencies=[Depends(require_local_operator)])
@app.get("/api/wormhole/gate/{gate_id}/key")
@limiter.limit("30/minute")
async def api_wormhole_gate_key_status(request: Request, gate_id: str):
exposure = metadata_exposure_for_request(request, authenticated=True)
@@ -10398,7 +10661,7 @@ async def api_wormhole_gate_key_rotate(request: Request, body: WormholeGateRotat
return result
@app.post("/api/wormhole/gate/persona/create", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/persona/create")
@limiter.limit("20/minute")
async def api_wormhole_gate_persona_create(
request: Request, body: WormholeGatePersonaCreateRequest
@@ -10414,7 +10677,7 @@ async def api_wormhole_gate_persona_create(
return result
@app.post("/api/wormhole/gate/persona/activate", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/persona/activate")
@limiter.limit("20/minute")
async def api_wormhole_gate_persona_activate(
request: Request, body: WormholeGatePersonaActivateRequest
@@ -10430,7 +10693,7 @@ async def api_wormhole_gate_persona_activate(
return result
@app.post("/api/wormhole/gate/persona/clear", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/persona/clear")
@limiter.limit("20/minute")
async def api_wormhole_gate_persona_clear(request: Request, body: WormholeGateRequest):
gate_id = str(body.gate_id or "")
@@ -10444,7 +10707,7 @@ async def api_wormhole_gate_persona_clear(request: Request, body: WormholeGateRe
return result
@app.post("/api/wormhole/gate/persona/retire", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/persona/retire")
@limiter.limit("20/minute")
async def api_wormhole_gate_persona_retire(
request: Request, body: WormholeGatePersonaActivateRequest
@@ -10525,7 +10788,7 @@ async def api_wormhole_gate_message_compose(request: Request, body: WormholeGate
return composed
@app.post("/api/wormhole/gate/message/sign-encrypted", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/message/sign-encrypted")
@limiter.limit("30/minute")
async def api_wormhole_gate_message_sign_encrypted(
request: Request,
@@ -10557,7 +10820,7 @@ async def api_wormhole_gate_message_sign_encrypted(
return signed
@app.post("/api/wormhole/gate/message/post-encrypted", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/message/post-encrypted")
@limiter.limit("30/minute")
async def api_wormhole_gate_message_post_encrypted(
request: Request,
@@ -10737,13 +11000,13 @@ async def api_wormhole_gate_messages_decrypt(request: Request, body: WormholeGat
return {"ok": True, "results": results}
@app.post("/api/wormhole/gate/state/export", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/state/export")
@limiter.limit("30/minute")
async def api_wormhole_gate_state_export(request: Request, body: WormholeGateRequest):
return export_gate_state_snapshot_with_repair(str(body.gate_id or ""))
@app.post("/api/wormhole/gate/proof", dependencies=[Depends(require_local_operator)])
@app.post("/api/wormhole/gate/proof")
@limiter.limit("30/minute")
async def api_wormhole_gate_proof(request: Request, body: WormholeGateRequest):
proof = _sign_gate_access_proof(str(body.gate_id or ""))
@@ -11290,7 +11553,7 @@ async def api_wormhole_health(request: Request):
return _redact_wormhole_status(full_state, authenticated=ok)
@app.post("/api/wormhole/connect", dependencies=[Depends(require_admin)])
@app.post("/api/wormhole/connect")
@limiter.limit("10/minute")
async def api_wormhole_connect(request: Request):
settings = read_wormhole_settings()
+61 -14
@@ -96,9 +96,10 @@ def _participant_node_enabled() -> bool:
def _node_runtime_snapshot() -> dict[str, Any]:
with _NODE_RUNTIME_LOCK:
return {
"node_mode": _NODE_BOOTSTRAP_STATE.get("node_mode", "participant"),
"node_mode": _current_node_mode(),
"node_enabled": _participant_node_enabled(),
"bootstrap": dict(_NODE_BOOTSTRAP_STATE),
"private_transport_required": _infonet_private_transport_required(),
"bootstrap": {**dict(_NODE_BOOTSTRAP_STATE), "node_mode": _current_node_mode()},
"sync_runtime": get_sync_state().to_dict(),
"push_runtime": dict(_NODE_PUSH_STATE),
}
@@ -131,6 +132,30 @@ def _set_participant_node_enabled(enabled: bool) -> dict[str, Any]:
}
def _infonet_private_transport_required() -> bool:
from services.config import get_settings
return not bool(getattr(get_settings(), "MESH_INFONET_ALLOW_CLEARNET_SYNC", False))
def _infonet_private_transport_error() -> str:
return "private Infonet requires onion/RNS transport; no clearnet sync fallback"
def _is_private_infonet_transport(transport: str) -> bool:
return str(transport or "").strip().lower() in {"onion", "rns"}
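The private-transport gate above reduces to a normalized set-membership test, and the peer-store refresh counts every clearnet peer it skips so the snapshot can surface the error. A standalone sketch of both pieces (the `(url, transport)` tuples are invented inputs):

```python
PRIVATE_TRANSPORTS = {"onion", "rns"}

def is_private_transport(transport: str) -> bool:
    """Normalize then test membership, tolerating None and stray whitespace."""
    return str(transport or "").strip().lower() in PRIVATE_TRANSPORTS

def filter_peers(peers: list[tuple[str, str]], private_required: bool):
    """Return (kept_urls, skipped_clearnet_count) for (url, transport) pairs."""
    kept, skipped = [], 0
    for url, transport in peers:
        if private_required and not is_private_transport(transport):
            skipped += 1
            continue
        kept.append(url)
    return kept, skipped
```

When `private_required` is false the filter degrades to a pass-through, which matches the `MESH_INFONET_ALLOW_CLEARNET_SYNC` escape hatch in the hunk above.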
def _configured_bootstrap_seed_peer_urls() -> list[str]:
from services.config import get_settings
from services.mesh.mesh_router import parse_configured_relay_peers
settings = get_settings()
primary = str(getattr(settings, "MESH_BOOTSTRAP_SEED_PEERS", "") or "").strip()
legacy = str(getattr(settings, "MESH_DEFAULT_SYNC_PEERS", "") or "").strip()
return parse_configured_relay_peers(primary or legacy)
def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
from services.config import get_settings
from services.mesh.mesh_bootstrap_manifest import load_bootstrap_manifest_from_settings
@@ -155,14 +180,17 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
except Exception:
store = PeerStore(DEFAULT_PEER_STORE_PATH)
private_transport_required = _infonet_private_transport_required()
operator_peers = configured_relay_peer_urls()
default_sync_peers = parse_configured_relay_peers(
str(getattr(get_settings(), "MESH_DEFAULT_SYNC_PEERS", "") or "")
)
bootstrap_seed_peers = _configured_bootstrap_seed_peer_urls()
skipped_clearnet_peers = 0
for peer_url in operator_peers:
transport = peer_transport_kind(peer_url)
if not transport:
continue
if private_transport_required and not _is_private_infonet_transport(transport):
skipped_clearnet_peers += 1
continue
store.upsert(
make_sync_peer_record(
peer_url=peer_url,
@@ -183,19 +211,22 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
)
operator_peer_set = set(operator_peers)
for peer_url in default_sync_peers:
for peer_url in bootstrap_seed_peers:
if peer_url in operator_peer_set:
continue
transport = peer_transport_kind(peer_url)
if not transport:
continue
if private_transport_required and not _is_private_infonet_transport(transport):
skipped_clearnet_peers += 1
continue
store.upsert(
make_bootstrap_peer_record(
peer_url=peer_url,
transport=transport,
role="seed",
label="ShadowBroker default seed",
signer_id="shadowbroker-default",
label="ShadowBroker bootstrap seed",
signer_id="shadowbroker-bootstrap",
now=timestamp,
)
)
@@ -205,8 +236,8 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
transport=transport,
role="seed",
source="bundle",
label="ShadowBroker default seed",
signer_id="shadowbroker-default",
label="ShadowBroker bootstrap seed",
signer_id="shadowbroker-bootstrap",
now=timestamp,
)
)
@@ -220,6 +251,9 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
if manifest is not None:
for peer in manifest.peers:
if private_transport_required and not _is_private_infonet_transport(peer.transport):
skipped_clearnet_peers += 1
continue
store.upsert(
make_bootstrap_peer_record(
peer_url=peer.peer_url,
@@ -242,17 +276,30 @@ def _refresh_node_peer_store(*, now: float | None = None) -> dict[str, Any]:
)
)
if private_transport_required and skipped_clearnet_peers and not bootstrap_error:
bootstrap_error = _infonet_private_transport_error()
store.save()
bootstrap_records = store.records_for_bucket("bootstrap")
sync_records = store.records_for_bucket("sync")
push_records = store.records_for_bucket("push")
if private_transport_required:
bootstrap_records = [record for record in bootstrap_records if _is_private_infonet_transport(record.transport)]
sync_records = [record for record in sync_records if _is_private_infonet_transport(record.transport)]
push_records = [record for record in push_records if _is_private_infonet_transport(record.transport)]
snapshot = {
"node_mode": mode,
"private_transport_required": private_transport_required,
"skipped_clearnet_peer_count": skipped_clearnet_peers,
"manifest_loaded": manifest is not None,
"manifest_signer_id": manifest.signer_id if manifest is not None else "",
"manifest_valid_until": int(manifest.valid_until or 0) if manifest is not None else 0,
"bootstrap_peer_count": len(store.records_for_bucket("bootstrap")),
"sync_peer_count": len(store.records_for_bucket("sync")),
"push_peer_count": len(store.records_for_bucket("push")),
"bootstrap_peer_count": len(bootstrap_records),
"sync_peer_count": len(sync_records),
"push_peer_count": len(push_records),
"operator_peer_count": len(operator_peers),
"default_sync_peer_count": len(default_sync_peers),
"bootstrap_seed_peer_count": len(bootstrap_seed_peers),
"default_sync_peer_count": len(bootstrap_seed_peers),
"last_bootstrap_error": bootstrap_error,
}
with _NODE_RUNTIME_LOCK:
+4 -3
@@ -7,7 +7,7 @@ py-modules = []
[project]
name = "backend"
version = "0.9.7"
version = "0.9.79"
requires-python = ">=3.10"
dependencies = [
"apscheduler==3.10.3",
@@ -24,7 +24,8 @@ dependencies = [
"pydantic-settings==2.8.1",
"pystac-client==0.8.6",
"python-dotenv==1.2.2",
"requests==2.31.0",
"requests==2.34.1",
"PySocks==1.7.1",
"reverse-geocoder==1.5.1",
"sgp4==2.25",
"meshtastic>=2.5.0",
@@ -42,7 +43,7 @@ dev = ["pytest>=8.3.4", "pytest-asyncio==0.25.0", "ruff>=0.9.0", "black>=24.0.0"
[tool.ruff.lint]
# The current backend carries historical style debt in large legacy modules.
# Keep CI focused on actionable correctness checks for the v0.9.7 release.
# Keep CI focused on actionable correctness checks for the v0.9.79 release.
ignore = ["E401", "E402", "E701", "E731", "E741", "F401", "F402", "F541", "F811", "F841"]
[tool.black]
+90 -3
@@ -28,6 +28,18 @@ class TimeMachineToggle(BaseModel):
enabled: bool
class MeshtasticMqttUpdate(BaseModel):
enabled: bool | None = None
broker: str | None = None
port: int | None = None
username: str | None = None
password: str | None = None
psk: str | None = None
include_default_roots: bool | None = None
extra_roots: str | None = None
extra_topics: str | None = None
@router.get("/api/settings/api-keys", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
async def api_get_keys(request: Request):
@@ -120,7 +132,82 @@ async def api_get_node_settings(request: Request):
@limiter.limit("10/minute")
async def api_set_node_settings(request: Request, body: NodeSettingsUpdate):
_refresh_node_peer_store()
return _set_participant_node_enabled(bool(body.enabled))
if bool(body.enabled):
try:
from services.transport_lane_isolation import disable_public_mesh_lane
disable_public_mesh_lane(reason="private_node_enabled")
except Exception as exc:
logger.warning("Failed to disable public Mesh while enabling private node: %s", exc)
result = _set_participant_node_enabled(bool(body.enabled))
if bool(body.enabled):
try:
import main as _main
_main._kick_public_sync_background("operator_enable")
except Exception:
logger.debug("Unable to kick Infonet sync after node enable", exc_info=True)
return result
def _meshtastic_runtime_snapshot() -> dict[str, Any]:
from services.meshtastic_mqtt_settings import redacted_meshtastic_mqtt_settings
from services.sigint_bridge import sigint_grid
return {
**redacted_meshtastic_mqtt_settings(),
"runtime": sigint_grid.mesh.status(),
}
@router.get("/api/settings/meshtastic-mqtt", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
async def api_get_meshtastic_mqtt_settings(request: Request):
return _meshtastic_runtime_snapshot()
@router.put("/api/settings/meshtastic-mqtt", dependencies=[Depends(require_local_operator)])
@limiter.limit("10/minute")
async def api_set_meshtastic_mqtt_settings(request: Request, body: MeshtasticMqttUpdate):
from services.meshtastic_mqtt_settings import write_meshtastic_mqtt_settings
from services.sigint_bridge import sigint_grid
updates = body.model_dump(exclude_unset=True)
# Empty secret fields mean "keep existing"; explicit non-empty values replace.
if updates.get("password") == "":
updates.pop("password", None)
if updates.get("psk") == "":
updates.pop("psk", None)
enabled_requested = updates.get("enabled")
settings = write_meshtastic_mqtt_settings(**updates)
if isinstance(enabled_requested, bool):
logger.info("Meshtastic MQTT settings update: enabled=%s", enabled_requested)
if enabled_requested is True:
# Public MQTT and Wormhole are intentionally mutually exclusive lanes.
try:
from services.node_settings import write_node_settings
from services.wormhole_settings import write_wormhole_settings
from services.wormhole_supervisor import disconnect_wormhole
write_wormhole_settings(enabled=False)
disconnect_wormhole(reason="public_mesh_enabled")
write_node_settings(enabled=False)
_set_participant_node_enabled(False)
except Exception as exc:
logger.warning("Failed to disable private mesh lane while enabling public mesh: %s", exc)
if bool(settings.get("enabled")):
if sigint_grid.mesh.is_running():
sigint_grid.mesh.stop()
threading.Timer(1.0, sigint_grid.mesh.start).start()
else:
sigint_grid.mesh.start()
else:
sigint_grid.mesh.stop()
return _meshtastic_runtime_snapshot()
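The PUT handler above distinguishes "field not sent" from "field sent empty": pydantic's `model_dump(exclude_unset=True)` drops fields the client never provided, and empty-string secrets (`password`, `psk`) are popped afterwards so stored credentials survive a round-trip through a redacted settings form. A dependency-free sketch of the resulting merge semantics (keyword arguments stand in for the `exclude_unset` dump; field names mirror the update model):

```python
def apply_update(stored: dict, **updates) -> dict:
    """Merge only explicitly-provided fields; empty secrets keep old values."""
    merged = dict(stored)
    for key, value in updates.items():
        if key in ("password", "psk") and value == "":
            continue  # empty secret means "keep existing"
        merged[key] = value
    return merged

stored = {"enabled": False, "broker": "mqtt.example.net", "password": "old-secret"}
merged = apply_update(stored, enabled=True, password="")
```

Without the empty-secret rule, a UI that never echoes secrets back would silently blank them on every unrelated settings save.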
@router.get("/api/settings/timemachine")
@@ -282,8 +369,8 @@ async def api_reset_all_agent_credentials(request: Request):
return {
"ok": True,
"new_hmac_secret": new_secret,
"detail": "All agent credentials have been reset. Reconfigure your agent with the new credentials.",
"hmac_regenerated": True,
"detail": "All agent credentials have been reset. Use the agent connection screen to generate or reveal replacement credentials.",
**results,
}
+2 -2
@@ -1585,7 +1585,7 @@ async def agent_tool_manifest(request: Request):
return {
"ok": True,
"version": "0.9.7",
"version": "0.9.79",
"access_tier": access_tier,
"available_commands": available_commands,
"transport": {
@@ -2221,7 +2221,7 @@ async def api_capabilities(request: Request):
access_tier = str(get_settings().OPENCLAW_ACCESS_TIER or "restricted").strip().lower()
return {
"ok": True,
"version": "0.9.7",
"version": "0.9.79",
"auth": {
"method": "HMAC-SHA256",
"headers": ["X-SB-Timestamp", "X-SB-Nonce", "X-SB-Signature"],
+26 -4
@@ -282,6 +282,20 @@ async def ais_feed(request: Request):
return {"status": "ok", "ingested": count}
@router.get("/api/trail/flight/{icao24}")
@limiter.limit("120/minute")
async def get_selected_flight_trail(icao24: str, request: Request): # noqa: ARG001
from services.fetchers.flights import get_flight_trail
return {"id": icao24, "trail": get_flight_trail(icao24)}
@router.get("/api/trail/ship/{mmsi}")
@limiter.limit("120/minute")
async def get_selected_ship_trail(mmsi: int, request: Request): # noqa: ARG001
from services.ais_stream import get_vessel_trail
return {"id": mmsi, "trail": get_vessel_trail(mmsi)}
@router.post("/api/viewport")
@limiter.limit("60/minute")
async def update_viewport(vp: ViewportUpdate, request: Request): # noqa: ARG001
@@ -321,12 +335,20 @@ async def update_layers(update: LayerUpdate, request: Request):
logger.info("AIS stream started (ship layer enabled)")
from services.sigint_bridge import sigint_grid
if old_mesh and not new_mesh:
sigint_grid.mesh.stop()
logger.info("Meshtastic MQTT bridge stopped (layer disabled)")
try:
from services.meshtastic_mqtt_settings import mqtt_bridge_enabled
keep_chat_running = mqtt_bridge_enabled()
except Exception:
keep_chat_running = False
if keep_chat_running:
logger.info("Meshtastic map layer disabled; MQTT bridge kept running for MeshChat")
else:
sigint_grid.mesh.stop()
logger.info("Meshtastic MQTT bridge stopped (layer disabled)")
elif not old_mesh and new_mesh:
try:
from services.config import get_settings
mqtt_enabled = bool(getattr(get_settings(), "MESH_MQTT_ENABLED", False))
from services.meshtastic_mqtt_settings import mqtt_bridge_enabled
mqtt_enabled = mqtt_bridge_enabled()
except Exception:
mqtt_enabled = False
if mqtt_enabled:
+1 -1
@@ -8,7 +8,7 @@ from services.data_fetcher import get_latest_data
from services.schemas import HealthResponse
import os
APP_VERSION = os.environ.get("_HEALTH_APP_VERSION", "0.9.7")
APP_VERSION = os.environ.get("_HEALTH_APP_VERSION", "0.9.79")
router = APIRouter()
+129 -4
@@ -721,9 +721,11 @@ async def mesh_send(request: Request):
any_ok = any(r.ok for r in results)
# ─── Mirror to Meshtastic bridge feed ────────────────────────
# The MQTT broker won't echo our own publishes back to our subscriber,
# so inject successfully-sent messages into the bridge's deque directly.
if any_ok and envelope.routed_via == "meshtastic":
# The MQTT broker won't echo our own publishes back to our subscriber, so
# inject successfully-sent channel broadcasts into the bridge directly.
# Node-targeted packets must not appear in the public channel feed.
is_direct_destination = MeshtasticTransport._parse_node_id(destination) is not None
if any_ok and envelope.routed_via == "meshtastic" and not is_direct_destination:
try:
from services.sigint_bridge import sigint_grid
@@ -734,7 +736,7 @@ async def mesh_send(request: Request):
bridge.messages.appendleft(
{
"from": MeshtasticTransport.mesh_address_for_sender(node_id),
"to": destination if MeshtasticTransport._parse_node_id(destination) is not None else "broadcast",
"to": "broadcast",
"text": message,
"region": credentials.get("mesh_region", "US"),
"channel": body.get("channel", "LongFast"),
@@ -750,6 +752,122 @@ async def mesh_send(request: Request):
"event_id": "",
"routed_via": envelope.routed_via,
"route_reason": envelope.route_reason,
"direct": is_direct_destination,
"channel_echo": not is_direct_destination,
"results": [r.to_dict() for r in results],
}
@router.post("/api/mesh/meshtastic/send", dependencies=[Depends(require_local_operator)])
@limiter.limit("10/minute")
@mesh_write_exempt(MeshWriteExemption.LOCAL_OPERATOR_ONLY)
async def meshtastic_public_send(request: Request):
"""Local public-MQTT send path for standalone Meshtastic-style identities."""
body = await request.json()
destination = str(body.get("destination", "") or "").strip() or "broadcast"
message = str(body.get("message", "") or "")
sender_id = str(body.get("sender_id", "") or "").strip().lower()
if not message:
return {"ok": False, "detail": "Missing required field: message"}
from services.mesh.mesh_router import (
MeshEnvelope,
MeshtasticTransport,
Priority,
TransportResult,
mesh_router,
)
from services.meshtastic_mqtt_settings import mqtt_bridge_enabled
if MeshtasticTransport._parse_node_id(sender_id) is None:
return {"ok": False, "detail": "Missing or invalid public Meshtastic address"}
if not mqtt_bridge_enabled():
return {"ok": False, "detail": "Meshtastic MQTT bridge is disabled"}
payload_bytes = len(message.encode("utf-8"))
payload_type = str(body.get("payload_type", "text") or "text")
max_bytes = _BYTE_LIMITS.get(payload_type, 200)
if payload_bytes > max_bytes:
return {
"ok": False,
"detail": f"Message too long ({payload_bytes} bytes). Maximum: {max_bytes} bytes for {payload_type} messages.",
}
priority_str = str(body.get("priority", "normal") or "normal").lower()
throttle_ok, throttle_reason = _check_throttle(sender_id, priority_str, "meshtastic")
if not throttle_ok:
return {"ok": False, "detail": throttle_reason}
priority_map = {
"emergency": Priority.EMERGENCY,
"high": Priority.HIGH,
"normal": Priority.NORMAL,
"low": Priority.LOW,
}
priority = priority_map.get(priority_str, Priority.NORMAL)
envelope = MeshEnvelope(
sender_id=sender_id,
destination=destination,
channel=str(body.get("channel", "LongFast") or "LongFast"),
priority=priority,
payload=message,
ephemeral=bool(body.get("ephemeral", False)),
trust_tier="public_degraded",
)
if not mesh_router.meshtastic.can_reach(envelope):
results = [TransportResult(False, "meshtastic", "Message exceeds Meshtastic payload limit")]
else:
cb_ok, cb_reason = mesh_router.breakers["meshtastic"].check_and_record(envelope.priority)
if not cb_ok:
results = [TransportResult(False, "meshtastic", cb_reason)]
else:
is_direct_destination = MeshtasticTransport._parse_node_id(destination) is not None
envelope.route_reason = (
"Local public Meshtastic MQTT path"
if not is_direct_destination
else "Local public Meshtastic direct node path"
)
credentials = {"mesh_region": str(body.get("mesh_region", "US") or "US")}
result = mesh_router.meshtastic.send(envelope, credentials)
if result.ok:
envelope.routed_via = mesh_router.meshtastic.NAME
results = [result]
any_ok = any(r.ok for r in results)
is_direct_destination = MeshtasticTransport._parse_node_id(destination) is not None
if any_ok and envelope.routed_via == "meshtastic" and not is_direct_destination:
try:
from datetime import datetime
from services.sigint_bridge import sigint_grid
bridge = sigint_grid.mesh
if bridge:
record = {
"from": MeshtasticTransport.mesh_address_for_sender(sender_id),
"to": "broadcast",
"text": message,
"region": str(body.get("mesh_region", "US") or "US"),
"root": str(body.get("mesh_region", "US") or "US"),
"channel": str(body.get("channel", "LongFast") or "LongFast"),
"timestamp": datetime.utcnow().isoformat() + "Z",
}
append_text = getattr(bridge, "append_text_message", None)
if callable(append_text):
append_text(record)
else:
bridge.messages.appendleft(record)
except Exception:
pass
return {
"ok": any_ok,
"message_id": envelope.message_id,
"event_id": "",
"routed_via": envelope.routed_via,
"route_reason": envelope.route_reason,
"direct": is_direct_destination,
"channel_echo": not is_direct_destination,
"results": [r.to_dict() for r in results],
}
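The public send path above enforces payload limits on UTF-8 byte length rather than character count, so multi-byte characters consume more of the budget. A standalone sketch of that check (the limit table values are illustrative; only the 200-byte text fallback is taken from the hunk above):

```python
BYTE_LIMITS = {"text": 200, "position": 64}  # illustrative per-type limits

def check_payload(message: str, payload_type: str = "text") -> tuple[bool, str]:
    """Validate message size in UTF-8 bytes against the per-type limit."""
    payload_bytes = len(message.encode("utf-8"))
    max_bytes = BYTE_LIMITS.get(payload_type, 200)
    if payload_bytes > max_bytes:
        return False, (
            f"Message too long ({payload_bytes} bytes). "
            f"Maximum: {max_bytes} bytes for {payload_type} messages."
        )
    return True, ""
```

A 150-character string of two-byte characters (e.g. `"é" * 150`) is 300 bytes and fails the 200-byte text limit even though it is well under 200 characters.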
@@ -848,6 +966,7 @@ async def mesh_messages(
root: str = "",
channel: str = "",
limit: int = 30,
include_direct: bool = False,
):
"""Get recent Meshtastic text messages from the MQTT bridge."""
from services.sigint_bridge import sigint_grid
@@ -869,6 +988,12 @@ async def mesh_messages(
msgs = [m for m in msgs if m.get("root", "").upper() == root_filter]
if channel:
msgs = [m for m in msgs if m.get("channel", "").lower() == channel.lower()]
if not include_direct:
msgs = [
m
for m in msgs
if str(m.get("to") or "broadcast").strip().lower() in {"", "broadcast", "^all"}
]
return msgs[: min(limit, 100)]
+99 -336
@@ -78,6 +78,21 @@ export_wormhole_dm_invite = getattr(
"export_wormhole_dm_invite",
_wormhole_identity_unavailable,
)
list_prekey_lookup_handle_records_for_ui = getattr(
_mesh_wormhole_identity,
"list_prekey_lookup_handle_records_for_ui",
_wormhole_identity_unavailable,
)
rename_prekey_lookup_handle = getattr(
_mesh_wormhole_identity,
"rename_prekey_lookup_handle",
_wormhole_identity_unavailable,
)
revoke_prekey_lookup_handle = getattr(
_mesh_wormhole_identity,
"revoke_prekey_lookup_handle",
_wormhole_identity_unavailable,
)
import_wormhole_dm_invite = getattr(
_mesh_wormhole_identity,
"import_wormhole_dm_invite",
@@ -311,6 +326,10 @@ class WormholeDmInviteImportRequest(BaseModel):
alias: str = ""
class WormholeDmInviteHandleUpdateRequest(BaseModel):
label: str = ""
class WormholeDmSenderTokenRequest(BaseModel):
recipient_id: str
delivery_class: str
@@ -477,6 +496,7 @@ def decrypt_wormhole_dm_envelope(
remote_alias: str | None = None,
session_welcome: str | None = None,
) -> dict[str, Any]:
"""Delegate to main.py, which owns current MLS/alias/legacy gating behavior."""
import main as _m
return _m.decrypt_wormhole_dm_envelope(
@@ -489,71 +509,13 @@ def decrypt_wormhole_dm_envelope(
session_welcome=session_welcome,
)
resolved_local, resolved_remote = _resolve_dm_aliases(
peer_id=peer_id,
local_alias=local_alias,
remote_alias=remote_alias,
)
normalized_format = str(payload_format or "dm1").strip().lower() or "dm1"
if normalized_format != "mls1" and is_dm_locked_to_mls(resolved_local, resolved_remote):
return {
"ok": False,
"detail": "DM session is locked to MLS format",
"required_format": "mls1",
"current_format": normalized_format,
}
if normalized_format == "mls1":
has_session = has_mls_dm_session(resolved_local, resolved_remote)
if not has_session.get("ok"):
return has_session
if not has_session.get("exists"):
ensured = ensure_mls_dm_session(resolved_local, resolved_remote, str(session_welcome or ""))
if not ensured.get("ok"):
return ensured
decrypted = decrypt_mls_dm(
resolved_local,
resolved_remote,
str(ciphertext or ""),
str(nonce or ""),
)
if not decrypted.get("ok"):
return decrypted
return {
"ok": True,
"peer_id": str(peer_id or "").strip(),
"local_alias": resolved_local,
"remote_alias": resolved_remote,
"plaintext": str(decrypted.get("plaintext", "") or ""),
"format": "mls1",
}
from services.wormhole_supervisor import get_transport_tier
current_tier = get_transport_tier()
if str(current_tier or "").startswith("private_"):
return {
"ok": False,
"detail": "MLS format required in private transport mode — legacy DM decrypt blocked",
}
logger.warning("legacy dm decrypt path used")
legacy = decrypt_wormhole_dm(peer_id=str(peer_id or ""), ciphertext=str(ciphertext or ""))
if not legacy.get("ok"):
return legacy
return {
"ok": True,
"peer_id": str(peer_id or "").strip(),
"local_alias": resolved_local,
"remote_alias": resolved_remote,
"plaintext": str(legacy.get("result", "") or ""),
"format": "dm1",
}
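The decrypt path above applies three gates in order: a session locked to MLS rejects non-`mls1` payloads, `mls1` payloads require (or bootstrap) an MLS session, and the legacy `dm1` path is refused outright on `private_*` transport tiers. A hedged sketch of just that decision order, with hypothetical names and string outcomes standing in for the real result dicts:

```python
# Pure decision function mirroring the gating order; inputs are assumed to
# be precomputed (lock state and transport tier), not looked up here.
def gate_dm_decrypt(payload_format: str, locked_to_mls: bool, transport_tier: str) -> str:
    fmt = str(payload_format or "dm1").strip().lower() or "dm1"
    if fmt != "mls1" and locked_to_mls:
        return "reject:mls_required"      # session already upgraded to MLS
    if fmt == "mls1":
        return "decrypt:mls"              # ensure/bootstrap session, then decrypt
    if str(transport_tier or "").startswith("private_"):
        return "reject:legacy_blocked"    # fail closed in private transport mode
    return "decrypt:legacy"               # dm1 fallback, logged as a warning upstream
```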
# --- Routes ---
@router.get("/api/settings/wormhole")
@limiter.limit("30/minute")
@limiter.limit("240/minute")
async def api_get_wormhole_settings(request: Request):
settings = await asyncio.to_thread(read_wormhole_settings)
return _redact_wormhole_settings(settings, authenticated=_scoped_view_authenticated(request, "wormhole"))
@@ -582,248 +544,9 @@ async def api_set_wormhole_settings(request: Request, body: WormholeUpdate):
return {**updated, "requires_restart": False, "runtime": state}
class PrivacyProfileUpdate(BaseModel):
profile: str
class WormholeSignRequest(BaseModel):
event_type: str
payload: dict
sequence: int | None = None
gate_id: str | None = None
class WormholeSignRawRequest(BaseModel):
message: str
class WormholeDmEncryptRequest(BaseModel):
peer_id: str
peer_dh_pub: str = ""
plaintext: str
local_alias: str | None = None
remote_alias: str | None = None
remote_prekey_bundle: dict[str, Any] | None = None
class WormholeDmComposeRequest(BaseModel):
peer_id: str
peer_dh_pub: str = ""
plaintext: str
local_alias: str | None = None
remote_alias: str | None = None
remote_prekey_bundle: dict[str, Any] | None = None
class WormholeDmDecryptRequest(BaseModel):
peer_id: str
ciphertext: str
format: str = "dm1"
nonce: str = ""
local_alias: str | None = None
remote_alias: str | None = None
session_welcome: str | None = None
class WormholeDmResetRequest(BaseModel):
peer_id: str | None = None
class WormholeDmBootstrapEncryptRequest(BaseModel):
peer_id: str
plaintext: str
class WormholeDmBootstrapDecryptRequest(BaseModel):
sender_id: str = ""
ciphertext: str
class WormholeDmSenderTokenRequest(BaseModel):
recipient_id: str
delivery_class: str
recipient_token: str = ""
count: int = 1
class WormholeOpenSealRequest(BaseModel):
sender_seal: str
candidate_dh_pub: str = ""
recipient_id: str
expected_msg_id: str
class WormholeBuildSealRequest(BaseModel):
recipient_id: str
recipient_dh_pub: str = ""
msg_id: str
timestamp: int
class WormholeDeadDropTokenRequest(BaseModel):
peer_id: str
peer_dh_pub: str = ""
peer_ref: str = ""
class WormholePairwiseAliasRequest(BaseModel):
peer_id: str
peer_dh_pub: str = ""
class WormholePairwiseAliasRotateRequest(BaseModel):
peer_id: str
peer_dh_pub: str = ""
grace_ms: int = 45_000
class WormholeDeadDropContactsRequest(BaseModel):
contacts: list[dict[str, Any]]
limit: int = 24
class WormholeSasRequest(BaseModel):
peer_id: str
peer_dh_pub: str = ""
words: int = 8
peer_ref: str = ""
class WormholeGateRequest(BaseModel):
gate_id: str
rotate: bool = False
class WormholeGatePersonaCreateRequest(BaseModel):
gate_id: str
label: str = ""
class WormholeGatePersonaActivateRequest(BaseModel):
gate_id: str
persona_id: str
class WormholeGateKeyGrantRequest(BaseModel):
gate_id: str
recipient_node_id: str
recipient_dh_pub: str
recipient_scope: str = "member"
class WormholeGateComposeRequest(BaseModel):
gate_id: str
plaintext: str
reply_to: str = ""
compat_plaintext: bool = False
class WormholeGateDecryptRequest(BaseModel):
gate_id: str
epoch: int = 0
ciphertext: str
nonce: str = ""
sender_ref: str = ""
format: str = "mls1"
gate_envelope: str = ""
envelope_hash: str = ""
recovery_envelope: bool = False
compat_decrypt: bool = False
event_id: str = ""
class WormholeGateDecryptBatchRequest(BaseModel):
messages: list[WormholeGateDecryptRequest]
class WormholeGateRotateRequest(BaseModel):
gate_id: str
reason: str = "manual_rotate"
def decrypt_wormhole_dm_envelope(
*,
peer_id: str,
ciphertext: str,
payload_format: str = "dm1",
nonce: str = "",
local_alias: str | None = None,
remote_alias: str | None = None,
session_welcome: str | None = None,
) -> dict[str, Any]:
import main as _m
return _m.decrypt_wormhole_dm_envelope(
peer_id=peer_id,
ciphertext=ciphertext,
payload_format=payload_format,
nonce=nonce,
local_alias=local_alias,
remote_alias=remote_alias,
session_welcome=session_welcome,
)
resolved_local, resolved_remote = _resolve_dm_aliases(
peer_id=peer_id,
local_alias=local_alias,
remote_alias=remote_alias,
)
normalized_format = str(payload_format or "dm1").strip().lower() or "dm1"
if normalized_format != "mls1" and is_dm_locked_to_mls(resolved_local, resolved_remote):
return {
"ok": False,
"detail": "DM session is locked to MLS format",
"required_format": "mls1",
"current_format": normalized_format,
}
if normalized_format == "mls1":
has_session = has_mls_dm_session(resolved_local, resolved_remote)
if not has_session.get("ok"):
return has_session
if not has_session.get("exists"):
ensured = ensure_mls_dm_session(resolved_local, resolved_remote, str(session_welcome or ""))
if not ensured.get("ok"):
return ensured
decrypted = decrypt_mls_dm(
resolved_local,
resolved_remote,
str(ciphertext or ""),
str(nonce or ""),
)
if not decrypted.get("ok"):
return decrypted
return {
"ok": True,
"peer_id": str(peer_id or "").strip(),
"local_alias": resolved_local,
"remote_alias": resolved_remote,
"plaintext": str(decrypted.get("plaintext", "") or ""),
"format": "mls1",
}
from services.wormhole_supervisor import get_transport_tier
current_tier = get_transport_tier()
if str(current_tier or "").startswith("private_"):
return {
"ok": False,
"detail": "MLS format required in private transport mode — legacy DM decrypt blocked",
}
logger.warning("legacy dm decrypt path used")
legacy = decrypt_wormhole_dm(peer_id=str(peer_id or ""), ciphertext=str(ciphertext or ""))
if not legacy.get("ok"):
return legacy
return {
"ok": True,
"peer_id": str(peer_id or "").strip(),
"local_alias": resolved_local,
"remote_alias": resolved_remote,
"plaintext": str(legacy.get("result", "") or ""),
"format": "dm1",
}
@router.get("/api/settings/privacy-profile")
@limiter.limit("30/minute")
@limiter.limit("240/minute")
async def api_get_privacy_profile(request: Request):
data = await asyncio.to_thread(read_wormhole_settings)
return _redact_privacy_profile_settings(
@@ -833,7 +556,7 @@ async def api_get_privacy_profile(request: Request):
@router.get("/api/settings/wormhole-status")
@limiter.limit("30/minute")
@limiter.limit("240/minute")
async def api_get_wormhole_status(request: Request):
state = await asyncio.to_thread(get_wormhole_state)
transport_tier = _current_private_lane_tier(state)
@@ -866,24 +589,38 @@ async def api_get_wormhole_status(request: Request):
)
@router.post("/api/wormhole/join", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/join")
@limiter.limit("10/minute")
async def api_wormhole_join(request: Request):
from services.config import get_settings
existing = read_wormhole_settings()
updated = write_wormhole_settings(
enabled=True,
transport="direct",
socks_proxy="",
transport="tor_arti",
socks_proxy=f"socks5h://127.0.0.1:{int(get_settings().MESH_ARTI_SOCKS_PORT or 9050)}",
socks_dns=True,
anonymous_mode=False,
anonymous_mode=True,
)
transport_changed = (
str(existing.get("transport", "direct")) != "direct"
or str(existing.get("socks_proxy", "")) != ""
str(existing.get("transport", "direct")) != "tor_arti"
or str(existing.get("socks_proxy", "")) != str(updated.get("socks_proxy", ""))
or bool(existing.get("socks_dns", True)) is not True
or bool(existing.get("anonymous_mode", False)) is not False
or bool(existing.get("anonymous_mode", False)) is not True
or bool(existing.get("enabled", False)) is not True
)
tor_result: dict[str, Any] = {"ok": False, "detail": "not started"}
try:
import asyncio
from routers.ai_intel import _write_env_value
from services.tor_hidden_service import tor_service
tor_result = await asyncio.to_thread(tor_service.start)
if tor_result.get("ok"):
_write_env_value("MESH_ARTI_ENABLED", "true")
get_settings.cache_clear()
except Exception as exc:
tor_result = {"ok": False, "detail": str(exc or type(exc).__name__)}
bootstrap_wormhole_identity()
bootstrap_wormhole_persona_state()
state = (
@@ -893,7 +630,7 @@ async def api_wormhole_join(request: Request):
)
# Enable node participation so the sync/push workers connect to peers.
# This is the voluntary opt-in the node only joins the network when
# This is the voluntary opt-in — the node only joins the network when
# the user explicitly opens the Wormhole.
from services.node_settings import write_node_settings
@@ -905,19 +642,19 @@ async def api_wormhole_join(request: Request):
"identity": get_transport_identity(),
"runtime": state,
"settings": updated,
"tor": tor_result,
}
@router.post("/api/wormhole/leave", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/leave")
@limiter.limit("10/minute")
async def api_wormhole_leave(request: Request):
updated = write_wormhole_settings(enabled=False)
state = disconnect_wormhole(reason="leave_wormhole")
# Disable node participation when the user leaves the Wormhole.
from services.node_settings import write_node_settings
write_node_settings(enabled=False)
# Leaving private DM mode must not disable Infonet participation. Infonet
# sync has its own private transport warmup and can remain connected to
# seed/peer nodes while MeshChat stays separately opt-in.
return {
"ok": True,
@@ -926,8 +663,8 @@ async def api_wormhole_leave(request: Request):
}
@router.get("/api/wormhole/identity", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
@router.get("/api/wormhole/identity")
@limiter.limit("240/minute")
async def api_wormhole_identity(request: Request):
try:
bootstrap_wormhole_persona_state()
@@ -937,7 +674,7 @@ async def api_wormhole_identity(request: Request):
raise HTTPException(status_code=500, detail="wormhole_identity_failed") from exc
@router.post("/api/wormhole/identity/bootstrap", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/identity/bootstrap")
@limiter.limit("10/minute")
async def api_wormhole_identity_bootstrap(request: Request):
bootstrap_wormhole_identity()
@@ -956,7 +693,7 @@ async def api_wormhole_identity_bootstrap(request: Request):
@router.get("/api/wormhole/dm/identity", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
@limiter.limit("240/minute")
async def api_wormhole_dm_identity(request: Request):
try:
bootstrap_wormhole_persona_state()
@@ -968,11 +705,37 @@ async def api_wormhole_dm_identity(request: Request):
@router.get("/api/wormhole/dm/invite", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
async def api_wormhole_dm_invite(request: Request):
return export_wormhole_dm_invite()
async def api_wormhole_dm_invite(
request: Request,
label: str = Query("", max_length=96),
expires_in_s: int = Query(0, ge=0, le=2_592_000),
):
return export_wormhole_dm_invite(label=label, expires_in_s=expires_in_s)
@router.post("/api/wormhole/dm/invite/import", dependencies=[Depends(require_admin)])
@router.get("/api/wormhole/dm/invite/handles", dependencies=[Depends(require_local_operator)])
@limiter.limit("240/minute")
async def api_wormhole_dm_invite_handles(request: Request):
return list_prekey_lookup_handle_records_for_ui()
@router.patch("/api/wormhole/dm/invite/handles/{handle}", dependencies=[Depends(require_local_operator)])
@limiter.limit("60/minute")
async def api_wormhole_dm_invite_handle_update(
request: Request,
handle: str,
body: WormholeDmInviteHandleUpdateRequest,
):
return rename_prekey_lookup_handle(handle, str(body.label or "").strip())
@router.delete("/api/wormhole/dm/invite/handles/{handle}", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
async def api_wormhole_dm_invite_handle_revoke(request: Request, handle: str):
return revoke_prekey_lookup_handle(handle)
@router.post("/api/wormhole/dm/invite/import", dependencies=[Depends(require_local_operator)])
@limiter.limit("30/minute")
async def api_wormhole_dm_invite_import(request: Request, body: WormholeDmInviteImportRequest):
return import_wormhole_dm_invite(
@@ -1010,7 +773,7 @@ async def api_wormhole_sign(request: Request, body: WormholeSignRequest):
)
@router.post("/api/wormhole/gate/enter", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/enter")
@limiter.limit("20/minute")
async def api_wormhole_gate_enter(request: Request, body: WormholeGateRequest):
gate_id = str(body.gate_id or "")
@@ -1024,25 +787,25 @@ async def api_wormhole_gate_enter(request: Request, body: WormholeGateRequest):
return result
@router.post("/api/wormhole/gate/leave", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/leave")
@limiter.limit("20/minute")
async def api_wormhole_gate_leave(request: Request, body: WormholeGateRequest):
return leave_gate(str(body.gate_id or ""))
@router.get("/api/wormhole/gate/{gate_id}/identity", dependencies=[Depends(require_local_operator)])
@router.get("/api/wormhole/gate/{gate_id}/identity")
@limiter.limit("30/minute")
async def api_wormhole_gate_identity(request: Request, gate_id: str):
return get_active_gate_identity(gate_id)
@router.get("/api/wormhole/gate/{gate_id}/personas", dependencies=[Depends(require_local_operator)])
@router.get("/api/wormhole/gate/{gate_id}/personas")
@limiter.limit("30/minute")
async def api_wormhole_gate_personas(request: Request, gate_id: str):
return list_gate_personas(gate_id)
@router.get("/api/wormhole/gate/{gate_id}/key", dependencies=[Depends(require_local_operator)])
@router.get("/api/wormhole/gate/{gate_id}/key")
@limiter.limit("30/minute")
async def api_wormhole_gate_key_status(request: Request, gate_id: str):
import main as _m
@@ -1066,7 +829,7 @@ async def api_wormhole_gate_key_rotate(request: Request, body: WormholeGateRotat
return result
@router.post("/api/wormhole/gate/persona/create", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/persona/create")
@limiter.limit("20/minute")
async def api_wormhole_gate_persona_create(
request: Request, body: WormholeGatePersonaCreateRequest
@@ -1082,7 +845,7 @@ async def api_wormhole_gate_persona_create(
return result
@router.post("/api/wormhole/gate/persona/activate", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/persona/activate")
@limiter.limit("20/minute")
async def api_wormhole_gate_persona_activate(
request: Request, body: WormholeGatePersonaActivateRequest
@@ -1098,7 +861,7 @@ async def api_wormhole_gate_persona_activate(
return result
@router.post("/api/wormhole/gate/persona/clear", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/persona/clear")
@limiter.limit("20/minute")
async def api_wormhole_gate_persona_clear(request: Request, body: WormholeGateRequest):
gate_id = str(body.gate_id or "")
@@ -1112,7 +875,7 @@ async def api_wormhole_gate_persona_clear(request: Request, body: WormholeGateRe
return result
@router.post("/api/wormhole/gate/persona/retire", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/persona/retire")
@limiter.limit("20/minute")
async def api_wormhole_gate_persona_retire(
request: Request, body: WormholeGatePersonaActivateRequest
@@ -1181,7 +944,7 @@ async def api_wormhole_gate_message_compose(request: Request, body: WormholeGate
return await _m.api_wormhole_gate_message_compose(request, body)
@router.post("/api/wormhole/gate/message/sign-encrypted", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/message/sign-encrypted")
@limiter.limit("30/minute")
async def api_wormhole_gate_message_sign_encrypted(
request: Request,
@@ -1191,7 +954,7 @@ async def api_wormhole_gate_message_sign_encrypted(
return await _m.api_wormhole_gate_message_sign_encrypted(request, body)
@router.post("/api/wormhole/gate/message/post-encrypted", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/message/post-encrypted")
@limiter.limit("30/minute")
async def api_wormhole_gate_message_post_encrypted(
request: Request,
@@ -1241,14 +1004,14 @@ async def api_wormhole_gate_messages_decrypt(request: Request, body: WormholeGat
return await _m.api_wormhole_gate_messages_decrypt(request, body)
@router.post("/api/wormhole/gate/state/export", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/state/export")
@limiter.limit("30/minute")
async def api_wormhole_gate_state_export(request: Request, body: WormholeGateRequest):
import main as _m
return await _m.api_wormhole_gate_state_export(request, body)
@router.post("/api/wormhole/gate/proof", dependencies=[Depends(require_local_operator)])
@router.post("/api/wormhole/gate/proof")
@limiter.limit("30/minute")
async def api_wormhole_gate_proof(request: Request, body: WormholeGateRequest):
proof = _sign_gate_access_proof(str(body.gate_id or ""))
@@ -1533,7 +1296,7 @@ class PrivateDeliveryActionRequest(BaseModel):
@router.get("/api/wormhole/status")
@limiter.limit("30/minute")
@limiter.limit("240/minute")
async def api_wormhole_status(request: Request):
import main as _m
@@ -1576,7 +1339,7 @@ async def api_wormhole_private_delivery_action(
@router.get("/api/wormhole/health")
@limiter.limit("30/minute")
@limiter.limit("240/minute")
async def api_wormhole_health(request: Request):
state = get_wormhole_state()
transport_tier = _current_private_lane_tier(state)
@@ -1597,7 +1360,7 @@ async def api_wormhole_health(request: Request):
return _redact_wormhole_status(full_state, authenticated=ok)
@router.post("/api/wormhole/connect", dependencies=[Depends(require_admin)])
@router.post("/api/wormhole/connect")
@limiter.limit("10/minute")
async def api_wormhole_connect(request: Request):
settings = read_wormhole_settings()
@@ -339,16 +339,61 @@ def get_country_from_mmsi(mmsi: int) -> str:
# Global vessel store: MMSI → vessel dict
_vessels: dict[int, dict] = {}
_vessel_trails: dict[int, dict] = {}
_vessels_lock = threading.Lock()
_ws_thread: threading.Thread | None = None
_ws_running = False
_proxy_process = None
_VESSEL_TRAIL_INTERVAL_S = 120
_VESSEL_TRAIL_MAX_POINTS = 240
import os
CACHE_FILE = os.path.join(os.path.dirname(__file__), "ais_cache.json")
def _record_vessel_trail_locked(mmsi: int, lat, lng, sog=0, now_ts: float | None = None) -> None:
"""Append a sampled AIS trail point. Caller must hold _vessels_lock."""
if lat is None or lng is None:
return
try:
lat_f = float(lat)
lng_f = float(lng)
except (TypeError, ValueError):
return
if abs(lat_f) > 90 or abs(lng_f) > 180 or (lat_f == 0 and lng_f == 0):
return
now = now_ts or time.time()
trail_data = _vessel_trails.setdefault(int(mmsi), {"points": [], "last_seen": now})
point = [round(lat_f, 5), round(lng_f, 5), round(float(sog or 0), 1), round(now)]
last_point_ts = trail_data["points"][-1][3] if trail_data["points"] else 0
if now - last_point_ts < _VESSEL_TRAIL_INTERVAL_S:
trail_data["last_seen"] = now
return
if (
trail_data["points"]
and trail_data["points"][-1][0] == point[0]
and trail_data["points"][-1][1] == point[1]
):
trail_data["last_seen"] = now
return
trail_data["points"].append(point)
trail_data["last_seen"] = now
if len(trail_data["points"]) > _VESSEL_TRAIL_MAX_POINTS:
trail_data["points"] = trail_data["points"][-_VESSEL_TRAIL_MAX_POINTS:]
def get_vessel_trail(mmsi: int) -> list:
"""Return the accumulated trail for a single vessel without expanding live payloads."""
try:
key = int(mmsi)
except (TypeError, ValueError):
return []
with _vessels_lock:
points = _vessel_trails.get(key, {}).get("points", [])
return [list(point) for point in points]
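`_record_vessel_trail_locked` applies two suppression rules before appending: a minimum sampling interval and a same-position dedupe, then a ring-buffer cap. A self-contained sketch of that sampling logic on a single trail dict (hypothetical `record_point` helper; the real function additionally validates coordinates and requires the vessels lock):

```python
_TRAIL_INTERVAL_S = 120   # minimum seconds between stored points
_TRAIL_MAX_POINTS = 240   # ring-buffer cap per vessel

def record_point(trail: dict, lat: float, lng: float, sog: float, now: float) -> bool:
    """Append [lat, lng, sog, ts]; return True only if a point was stored."""
    point = [round(lat, 5), round(lng, 5), round(sog, 1), round(now)]
    last_ts = trail["points"][-1][3] if trail["points"] else 0
    trail["last_seen"] = now                      # liveness updates even when suppressed
    if now - last_ts < _TRAIL_INTERVAL_S:
        return False                              # interval not elapsed
    if trail["points"] and trail["points"][-1][:2] == point[:2]:
        return False                              # stationary vessel, skip duplicate
    trail["points"].append(point)
    del trail["points"][:-_TRAIL_MAX_POINTS]      # keep only the newest N points
    return True
```

Updating `last_seen` even on suppressed samples is what lets the prune pass drop trails together with their stale vessels.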
def _save_cache():
"""Save vessel data to disk for persistence across restarts."""
try:
@@ -391,6 +436,7 @@ def prune_stale_vessels():
stale_keys = [k for k, v in _vessels.items() if v.get("_updated", 0) < stale_cutoff]
for k in stale_keys:
del _vessels[k]
_vessel_trails.pop(k, None)
if stale_keys:
logger.info(f"AIS pruned {len(stale_keys)} stale vessels")
@@ -459,6 +505,7 @@ def ingest_ais_catcher(msgs: list[dict]) -> int:
heading = msg.get("heading", 511)
vessel["heading"] = heading if heading != 511 else vessel.get("cog", 0)
vessel["_updated"] = now
_record_vessel_trail_locked(mmsi, lat, lon, vessel["sog"], now)
if msg.get("shipname"):
vessel["name"] = msg["shipname"].strip()
count += 1
@@ -595,7 +642,9 @@ def _ais_stream_loop():
vessel["cog"] = report.get("Cog", 0)
heading = report.get("TrueHeading", 511)
vessel["heading"] = heading if heading != 511 else report.get("Cog", 0)
vessel["_updated"] = time.time()
now_ts = time.time()
vessel["_updated"] = now_ts
_record_vessel_trail_locked(mmsi, lat, lng, vessel["sog"], now_ts)
# Use metadata name if we don't have one yet
if not vessel.get("name") or vessel["name"] == "UNKNOWN":
vessel["name"] = (
@@ -32,16 +32,26 @@ class Settings(BaseSettings):
MESH_ARTI_ENABLED: bool = False
MESH_ARTI_SOCKS_PORT: int = 9050
MESH_RELAY_PEERS: str = ""
MESH_DEFAULT_SYNC_PEERS: str = "https://node.shadowbroker.info"
# Bootstrap seeds are discovery hints, not authoritative network roots.
# Nodes promote healthy discovered peers from the store/manifest over time.
MESH_BOOTSTRAP_SEED_PEERS: str = "http://gqpbunqbgtkcqilvclm3xrkt3zowjyl3s62kkktvojgvxzizamvbrqid.onion:8000"
# Legacy name kept for older compose/.env files.
MESH_DEFAULT_SYNC_PEERS: str = ""
# Infonet/Wormhole must fail closed to private transports by default.
# Set true only for local relay development or explicitly public testnets.
MESH_INFONET_ALLOW_CLEARNET_SYNC: bool = False
MESH_BOOTSTRAP_DISABLED: bool = False
MESH_BOOTSTRAP_MANIFEST_PATH: str = "data/bootstrap_peers.json"
MESH_BOOTSTRAP_SIGNER_PUBLIC_KEY: str = ""
MESH_NODE_MODE: str = "participant"
MESH_SYNC_INTERVAL_S: int = 300
MESH_SYNC_FAILURE_BACKOFF_S: int = 60
MESH_SYNC_TIMEOUT_S: int = 5
MESH_SYNC_MAX_PEERS_PER_CYCLE: int = 3
MESH_RELAY_PUSH_TIMEOUT_S: int = 10
MESH_RELAY_MAX_FAILURES: int = 3
MESH_RELAY_FAILURE_COOLDOWN_S: int = 120
MESH_BOOTSTRAP_SEED_FAILURE_COOLDOWN_S: int = 15
MESH_PEER_PUSH_SECRET: str = ""
MESH_RNS_APP_NAME: str = "shadowbroker"
MESH_RNS_ASPECT: str = "infonet"
@@ -134,15 +134,22 @@ _INTEL_STARTUP_CACHE_KEYS = (
"trending_markets",
"correlations",
"fimi",
"crowdthreat",
"uap_sightings",
"military_bases",
"wastewater",
)
_STARTUP_PRIORITY_TIMEOUT_S = float(os.environ.get("SHADOWBROKER_STARTUP_PRIORITY_TIMEOUT_S", "18"))
_STARTUP_HEAVY_REFRESH_DELAY_S = float(os.environ.get("SHADOWBROKER_STARTUP_HEAVY_REFRESH_DELAY_S", "90"))
_STARTUP_HEAVY_REFRESH_STARTED = False
_STARTUP_HEAVY_REFRESH_LOCK = threading.Lock()
_FETCH_WORKERS = int(os.environ.get("SHADOWBROKER_FETCH_WORKERS", "8"))
_SLOW_FETCH_CONCURRENCY = int(os.environ.get("SHADOWBROKER_SLOW_FETCH_CONCURRENCY", "4"))
_STARTUP_HEAVY_CONCURRENCY = int(os.environ.get("SHADOWBROKER_STARTUP_HEAVY_CONCURRENCY", "2"))
# Shared thread pool — reused across all fetch cycles instead of creating/destroying per tick
_SHARED_EXECUTOR = concurrent.futures.ThreadPoolExecutor(
max_workers=20, thread_name_prefix="fetch"
max_workers=max(2, _FETCH_WORKERS), thread_name_prefix="fetch"
)
@@ -156,6 +163,14 @@ def _cache_json_safe(value):
return value
def _has_cache_value(value) -> bool:
if value is None:
return False
if isinstance(value, (list, tuple, dict, set)):
return bool(value)
return True
def _load_fast_startup_cache_if_available() -> bool:
"""Seed moving layers from a recent disk cache while live fetches warm up."""
if _FAST_STARTUP_CACHE_MAX_AGE_S <= 0 or not _FAST_STARTUP_CACHE_PATH.exists():
@@ -200,10 +215,15 @@ def _save_fast_startup_cache() -> None:
"""Persist recent moving layers for the next cold start."""
try:
with _data_lock:
layers = {
key: latest_data.get(key)
for key in _FAST_STARTUP_CACHE_KEYS
if _has_cache_value(latest_data.get(key))
}
payload = {
"cached_at": time.time(),
"last_updated": latest_data.get("last_updated"),
"layers": {key: latest_data.get(key) for key in _FAST_STARTUP_CACHE_KEYS},
"layers": layers,
"freshness": {
key: source_timestamps.get(key)
for key in _FAST_STARTUP_CACHE_KEYS
@@ -264,10 +284,15 @@ def _save_intel_startup_cache() -> None:
"""Persist compact right-side intelligence data for the next cold start."""
try:
with _data_lock:
layers = {
key: latest_data.get(key)
for key in _INTEL_STARTUP_CACHE_KEYS
if _has_cache_value(latest_data.get(key))
}
payload = {
"cached_at": time.time(),
"last_updated": latest_data.get("last_updated"),
"layers": {key: latest_data.get(key) for key in _INTEL_STARTUP_CACHE_KEYS},
"layers": layers,
"freshness": {
key: source_timestamps.get(key)
for key in _INTEL_STARTUP_CACHE_KEYS
@@ -294,11 +319,27 @@ def seed_startup_caches() -> None:
# ---------------------------------------------------------------------------
# Scheduler & Orchestration
# ---------------------------------------------------------------------------
def _run_tasks(label: str, funcs: list):
def _run_tasks(label: str, funcs: list, *, max_concurrency: int | None = None):
"""Run tasks concurrently and log any exceptions (do not fail silently)."""
if not funcs:
return
futures = {_SHARED_EXECUTOR.submit(func): (func.__name__, time.perf_counter()) for func in funcs}
if max_concurrency is None:
if label.startswith("slow-tier"):
max_concurrency = _SLOW_FETCH_CONCURRENCY
elif label.startswith("startup-heavy"):
max_concurrency = _STARTUP_HEAVY_CONCURRENCY
else:
max_concurrency = len(funcs)
max_concurrency = max(1, min(max_concurrency, len(funcs)))
remaining_funcs = list(funcs)
while remaining_funcs:
batch, remaining_funcs = remaining_funcs[:max_concurrency], remaining_funcs[max_concurrency:]
futures = {_SHARED_EXECUTOR.submit(func): (func.__name__, time.perf_counter()) for func in batch}
_drain_task_futures(label, futures)
def _drain_task_futures(label: str, futures: dict):
# Iterate directly so future.result(timeout=...) is the blocking call.
# as_completed() blocks inside __next__() waiting for completion — the timeout
# on result() would never be reached for a hanging task under that pattern.
@@ -486,6 +527,7 @@ def update_all_data(*, startup_mode: bool = False):
priority_funcs = [
fetch_airports,
update_fast_data,
fetch_news,
fetch_gdelt,
fetch_crowdthreat,
fetch_firms_fires,
@@ -520,55 +562,36 @@ def update_all_data(*, startup_mode: bool = False):
except Exception as e:
_record_fetch_failure("startup-priority", name, start, e)
logger.info("Startup preload: deferring Playwright Liveuamap scraper to scheduled cadence")
_save_intel_startup_cache()
_schedule_delayed_startup_heavy_refresh()
logger.info("Startup priority preload complete; slow synthesis is warming in background.")
return
futures = {
_SHARED_EXECUTOR.submit(fetch_airports): ("fetch_airports", time.perf_counter()),
_SHARED_EXECUTOR.submit(update_fast_data): ("update_fast_data", time.perf_counter()),
_SHARED_EXECUTOR.submit(update_slow_data): ("update_slow_data", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_volcanoes): ("fetch_volcanoes", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_viirs_change_nodes): ("fetch_viirs_change_nodes", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_unusual_whales): ("fetch_unusual_whales", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_fimi): ("fetch_fimi", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_gdelt): ("fetch_gdelt", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_uap_sightings): ("fetch_uap_sightings", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_wastewater): ("fetch_wastewater", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_crowdthreat): ("fetch_crowdthreat", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_sar_catalog): ("fetch_sar_catalog", time.perf_counter()),
_SHARED_EXECUTOR.submit(fetch_sar_products): ("fetch_sar_products", time.perf_counter()),
}
refresh_funcs = [
fetch_airports,
update_fast_data,
update_slow_data,
fetch_volcanoes,
fetch_viirs_change_nodes,
fetch_unusual_whales,
fetch_fimi,
fetch_gdelt,
fetch_uap_sightings,
fetch_wastewater,
fetch_crowdthreat,
fetch_sar_catalog,
fetch_sar_products,
]
if not startup_mode or not meshtastic_seeded:
futures[_SHARED_EXECUTOR.submit(fetch_meshtastic_nodes)] = (
"fetch_meshtastic_nodes",
time.perf_counter(),
)
refresh_funcs.append(fetch_meshtastic_nodes)
else:
logger.info(
"Startup preload: Meshtastic cache already loaded, deferring remote map refresh to scheduled cadence"
)
if not startup_mode:
futures[_SHARED_EXECUTOR.submit(update_liveuamap)] = (
"update_liveuamap",
time.perf_counter(),
)
refresh_funcs.append(update_liveuamap)
else:
logger.info("Startup preload: deferring Playwright Liveuamap scraper to scheduled cadence")
for future, (name, start) in futures.items():
try:
future.result(timeout=_TASK_HARD_TIMEOUT_S)
duration = time.perf_counter() - start
from services.fetch_health import record_success
record_success(name, duration_s=duration)
if duration > _SLOW_FETCH_S:
logger.warning(f"full-refresh task slow: {name} took {duration:.2f}s")
except Exception as e:
duration = time.perf_counter() - start
from services.fetch_health import record_failure
record_failure(name, error=e, duration_s=duration)
logger.exception(f"full-refresh task failed: {name}")
_run_tasks("full-refresh", refresh_funcs, max_concurrency=_STARTUP_HEAVY_CONCURRENCY)
# Run CCTV ingest immediately so cameras are available on first request
# (the scheduled job also runs every 10 min for ongoing refresh).
if startup_mode:
@@ -32,7 +32,7 @@ _REFRESH_INTERVAL_S = 5 * 24 * 3600
_LIST_TIMEOUT_S = 30
_DOWNLOAD_TIMEOUT_S = 600
_USER_AGENT = (
"ShadowBroker-OSINT/0.9.7 "
"ShadowBroker-OSINT/0.9.79 "
"(+https://github.com/BigBodyCobain/Shadowbroker; "
"contact: bigbodycobain@gmail.com)"
)
+45 -15
@@ -256,7 +256,17 @@ PRIVATE_JET_TYPES = {
# Flight trails state
flight_trails = {} # {icao_hex: {points: [[lat, lng, alt, ts], ...], last_seen: ts}}
_trails_lock = threading.Lock()
_MAX_TRACKED_TRAILS = 2000
_MAX_TRACKED_TRAILS = 20000
def get_flight_trail(icao24: str) -> list:
"""Return the accumulated trail for a single aircraft without expanding live payloads."""
hex_id = str(icao24 or "").strip().lower()
if not hex_id:
return []
with _trails_lock:
points = flight_trails.get(hex_id, {}).get("points", [])
return [list(point) for point in points]
# Route enrichment is now served from services.fetchers.route_database, which
# bulk-loads vrs-standing-data.adsb.lol/routes.csv.gz once per day and looks up
@@ -612,24 +622,30 @@ def _classify_and_publish(all_adsb_flights):
)
# --- Trail Accumulation ---
_TRAIL_INTERVAL_S = 600 # only record a new trail point every 10 minutes
_TRAIL_INTERVAL_S = 60 # selected trails need enough resolution to show where unknown-route traffic came from
def _accumulate_trail(f, now_ts, check_route=True):
def _accumulate_trail(f, now_ts, attach_known_route_trail=False):
hex_id = f.get("icao24", "").lower()
if not hex_id:
return 0, None
if check_route and f.get("origin_name", "UNKNOWN") != "UNKNOWN":
f["trail"] = []
return 0, hex_id
def _known_route_name(value):
normalized = str(value or "").strip().upper()
return bool(normalized and normalized != "UNKNOWN")
has_known_route = bool(
(f.get("origin_loc") and f.get("dest_loc"))
or (_known_route_name(f.get("origin_name")) and _known_route_name(f.get("dest_name")))
)
lat, lng, alt = f.get("lat"), f.get("lng"), f.get("alt", 0)
if lat is None or lng is None:
f["trail"] = flight_trails.get(hex_id, {}).get("points", [])
f["trail"] = [] if has_known_route and not attach_known_route_trail else flight_trails.get(hex_id, {}).get("points", [])
return 0, hex_id
point = [round(lat, 5), round(lng, 5), round(alt, 1), round(now_ts)]
if hex_id not in flight_trails:
flight_trails[hex_id] = {"points": [], "last_seen": now_ts}
trail_data = flight_trails[hex_id]
# Only append a new point if 10 minutes have passed since the last one
# Only append a new point if enough time has passed since the last one
last_point_ts = trail_data["points"][-1][3] if trail_data["points"] else 0
if now_ts - last_point_ts < _TRAIL_INTERVAL_S:
trail_data["last_seen"] = now_ts
@@ -644,32 +660,39 @@ def _classify_and_publish(all_adsb_flights):
trail_data["last_seen"] = now_ts
if len(trail_data["points"]) > 200:
trail_data["points"] = trail_data["points"][-200:]
f["trail"] = trail_data["points"]
# Keep known-route flights visually clean in the main payload; selected
# detail panels can still fetch this server-side trail to compute
# observed fuel/CO2 burn.
f["trail"] = [] if has_known_route and not attach_known_route_trail else trail_data["points"]
return 1, hex_id
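The throttling above (one point per `_TRAIL_INTERVAL_S`, capped at 200 points per aircraft) amounts to a small interval-gated ring buffer. A standalone sketch of the same behavior, using illustrative local state rather than the module's real `flight_trails` globals:

```python
# Interval-throttled trail buffer, mirroring the cadence and cap in the diff.
# The `trails` dict here is illustrative standalone state, not the module's globals.
_TRAIL_INTERVAL_S = 60   # matches the new per-point cadence above
_MAX_POINTS = 200        # matches the per-aircraft cap above

trails: dict[str, dict] = {}

def record_point(hex_id: str, lat: float, lng: float, alt: float, now_ts: float) -> bool:
    """Append a trail point only if the interval has elapsed; return True when appended."""
    data = trails.setdefault(hex_id, {"points": [], "last_seen": now_ts})
    last_ts = data["points"][-1][3] if data["points"] else 0
    data["last_seen"] = now_ts
    if now_ts - last_ts < _TRAIL_INTERVAL_S:
        return False  # too soon: refresh last_seen but do not densify the trail
    data["points"].append([round(lat, 5), round(lng, 5), round(alt, 1), round(now_ts)])
    if len(data["points"]) > _MAX_POINTS:
        data["points"] = data["points"][-_MAX_POINTS:]  # keep only the newest points
    return True
```

The gate keeps `last_seen` fresh on every update (so pruning works) while only densifying the stored path once per interval.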
now_ts = datetime.utcnow().timestamp()
with _data_lock:
commercial_snapshot = copy.deepcopy(latest_data.get("commercial_flights", []))
private_jets_snapshot = copy.deepcopy(latest_data.get("private_jets", []))
private_ga_snapshot = copy.deepcopy(latest_data.get("private_flights", []))
military_snapshot = copy.deepcopy(latest_data.get("military_flights", []))
tracked_snapshot = copy.deepcopy(latest_data.get("tracked_flights", []))
raw_flights_snapshot = list(latest_data.get("flights", []))
# Commercial/private: skip trail if route is known (route line replaces trail)
route_check_lists = [commercial, private_jets, private_ga]
# Tracked + military: ALWAYS accumulate trails (high-interest flights)
always_trail_lists = [existing_tracked, military_snapshot]
# Accumulate trails for every aircraft so selected details can estimate
# observed fuel/CO2 burn. Known-route flights keep an empty payload trail so
# the route line, not historical breadcrumbs, remains the visible map path.
route_check_lists = [commercial_snapshot, private_jets_snapshot, private_ga_snapshot]
always_trail_lists = [tracked_snapshot, military_snapshot]
seen_hexes = set()
trail_count = 0
with _trails_lock:
for flist in route_check_lists:
for f in flist:
count, hex_id = _accumulate_trail(f, now_ts, check_route=True)
count, hex_id = _accumulate_trail(f, now_ts, attach_known_route_trail=False)
trail_count += count
if hex_id:
seen_hexes.add(hex_id)
for flist in always_trail_lists:
for f in flist:
count, hex_id = _accumulate_trail(f, now_ts, check_route=False)
count, hex_id = _accumulate_trail(f, now_ts, attach_known_route_trail=False)
trail_count += count
if hex_id:
seen_hexes.add(hex_id)
@@ -693,6 +716,13 @@ def _classify_and_publish(all_adsb_flights):
f"Trail accumulation: {trail_count} active trails, {len(stale_keys)} pruned, {len(flight_trails)} total"
)
with _data_lock:
latest_data["commercial_flights"] = commercial_snapshot
latest_data["private_jets"] = private_jets_snapshot
latest_data["private_flights"] = private_ga_snapshot
latest_data["tracked_flights"] = tracked_snapshot
latest_data["military_flights"] = military_snapshot
# --- GPS Jamming Detection ---
# Uses NACp (Navigation Accuracy Category Position) from ADS-B to infer
# GPS interference zones, similar to GPSJam.org / Flightradar24.
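As a rough illustration of that NACp-based approach: bin aircraft into coarse grid cells and flag cells where a large share of reporters show degraded position accuracy. The threshold of 7 and the 1-degree grid below are illustrative assumptions, not values taken from this codebase:

```python
# Hedged sketch of NACp-based interference binning. NACp runs 0-11 (higher is
# more accurate); the cutoff and cell size here are illustrative choices.
from collections import defaultdict

LOW_NACP = 7  # below this, treat the reported position accuracy as degraded

def jamming_cells(flights: list[dict]) -> dict[tuple[int, int], float]:
    """Return {(lat_bin, lng_bin): fraction of degraded aircraft} per 1-degree cell."""
    seen: dict[tuple[int, int], list[int]] = defaultdict(lambda: [0, 0])  # [degraded, total]
    for f in flights:
        lat, lng, nacp = f.get("lat"), f.get("lng"), f.get("nacp")
        if lat is None or lng is None or nacp is None:
            continue
        cell = (int(lat), int(lng))
        seen[cell][1] += 1
        if nacp < LOW_NACP:
            seen[cell][0] += 1
    return {cell: bad / total for cell, (bad, total) in seen.items() if total}
```

Cells with a high degraded fraction are candidate interference zones; a real implementation would also require a minimum sample count per cell before flagging it.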
+1 -1
@@ -182,7 +182,7 @@ def fetch_meshtastic_nodes():
callsign = str(getattr(get_settings(), "MESHTASTIC_OPERATOR_CALLSIGN", "") or "").strip()
except Exception:
callsign = ""
ua_base = "ShadowBroker-OSINT/0.9.7 (+https://github.com/BigBodyCobain/Shadowbroker; contact: bigbodycobain@gmail.com; 24h polling)"
ua_base = "ShadowBroker-OSINT/0.9.79 (+https://github.com/BigBodyCobain/Shadowbroker; contact: bigbodycobain@gmail.com; 24h polling)"
user_agent = f"{ua_base}; node={callsign}" if callsign else ua_base
try:
+8
@@ -6,6 +6,7 @@ import time
import requests
from services.network_utils import fetch_with_curl
from services.fetchers._store import latest_data, _data_lock, _mark_fresh
from services.fetchers.emissions import get_emissions_info
from services.fetchers.plane_alert import enrich_with_plane_alert
logger = logging.getLogger("services.data_fetcher")
@@ -289,6 +290,13 @@ def fetch_military_flights():
remaining_mil = []
for mf in military_flights:
enrich_with_plane_alert(mf)
model = mf.get("model")
if not model or str(model).strip().lower() in {"", "unknown"}:
model = mf.get("alert_type") or ""
if model:
emissions = get_emissions_info(model)
if emissions:
mf["emissions"] = emissions
if mf.get("alert_category"):
mf["type"] = "tracked_flight"
tracked_mil.append(mf)
+1 -1
@@ -25,7 +25,7 @@ _REFRESH_INTERVAL_S = 5 * 24 * 3600
_HTTP_TIMEOUT_S = 60
_USER_AGENT = (
"ShadowBroker-OSINT/0.9.7 "
"ShadowBroker-OSINT/0.9.79 "
"(+https://github.com/BigBodyCobain/Shadowbroker; "
"contact: bigbodycobain@gmail.com)"
)
+5 -1
@@ -930,7 +930,11 @@ def fetch_satellites():
now.year, now.month, now.day, now.hour, now.minute, now.second + now.microsecond / 1e6
)
for s in all_sats:
for source_sat in all_sats:
# Keep the classified cache immutable. The render payload below
# strips orbital fields after propagation, and mutating the cached
# entry would make the next refresh unable to position satellites.
s = dict(source_sat)
try:
mean_motion = s.get("MEAN_MOTION")
ecc = s.get("ECCENTRICITY")
+15
@@ -1264,6 +1264,21 @@ class DMRelay:
)
self._save()
def unregister_prekey_lookup_alias(self, alias: str) -> bool:
"""Remove an invite-scoped lookup alias from the local relay."""
handle = str(alias or "").strip()
if not handle:
return False
removed = False
with self._lock:
self._refresh_from_shared_relay()
if handle in self._prekey_lookup_aliases:
del self._prekey_lookup_aliases[handle]
removed = True
if removed:
self._save()
return removed
def consume_one_time_prekey(self, agent_id: str) -> dict[str, Any] | None:
"""Atomically claim the next published one-time prekey for a peer bundle."""
claimed: dict[str, Any] | None = None
@@ -30,10 +30,19 @@ def eligible_sync_peers(records: list[PeerRecord], *, now: float | None = None)
for record in records
if record.bucket == "sync" and record.enabled and int(record.cooldown_until or 0) <= current_time
]
def _seed_priority(record: PeerRecord) -> int:
role = str(record.role or "").strip().lower()
source = str(record.source or "").strip().lower()
if role == "seed" and source in {"bundle", "bootstrap_promoted"}:
return 0
return 1
return sorted(
candidates,
key=lambda record: (
-int(record.last_sync_ok_at or 0),
_seed_priority(record),
int(record.failure_count or 0),
int(record.added_at or 0),
record.peer_url,
+9 -3
@@ -258,6 +258,12 @@ class PeerStore:
self._records[record.record_key()] = record
return record
explicit_seed_refresh = (
record.bucket == "sync"
and record.role == "seed"
and record.source in {"bundle", "bootstrap_promoted"}
)
merged = PeerRecord(
bucket=record.bucket,
source=record.source,
@@ -272,9 +278,9 @@ class PeerStore:
last_seen_at=max(existing.last_seen_at, record.last_seen_at),
last_sync_ok_at=max(existing.last_sync_ok_at, record.last_sync_ok_at),
last_push_ok_at=max(existing.last_push_ok_at, record.last_push_ok_at),
last_error=record.last_error or existing.last_error,
failure_count=max(existing.failure_count, record.failure_count),
cooldown_until=max(existing.cooldown_until, record.cooldown_until),
last_error="" if explicit_seed_refresh else record.last_error or existing.last_error,
failure_count=0 if explicit_seed_refresh else max(existing.failure_count, record.failure_count),
cooldown_until=0 if explicit_seed_refresh else max(existing.cooldown_until, record.cooldown_until),
metadata={**existing.metadata, **record.metadata},
)
self._records[record.record_key()] = merged
+15 -18
@@ -390,15 +390,9 @@ class MeshtasticTransport:
def _mqtt_config() -> tuple[str, int, str, str]:
"""Return (broker, port, user, password) from settings."""
try:
from services.config import get_settings
from services.meshtastic_mqtt_settings import mqtt_connection_config
s = get_settings()
return (
str(s.MESH_MQTT_BROKER or "mqtt.meshtastic.org"),
int(s.MESH_MQTT_PORT or 1883),
str(s.MESH_MQTT_USER or "meshdev"),
str(s.MESH_MQTT_PASS or "large4cats"),
)
return mqtt_connection_config()
except Exception:
return ("mqtt.meshtastic.org", 1883, "meshdev", "large4cats")
@@ -433,8 +427,9 @@ class MeshtasticTransport:
def _resolve_psk(cls) -> bytes:
"""Return the PSK from config, or the default LongFast key if empty."""
try:
from services.config import get_settings
raw = str(getattr(get_settings(), "MESH_MQTT_PSK", "") or "").strip()
from services.meshtastic_mqtt_settings import mqtt_psk_hex
raw = mqtt_psk_hex()
except Exception:
raw = ""
if not raw:
@@ -449,7 +444,10 @@ class MeshtasticTransport:
@staticmethod
def mesh_address_for_sender(sender_id: str) -> str:
"""Return the synthetic public mesh address used for MQTT-originated sends."""
"""Return the public mesh address used for MQTT-originated sends."""
parsed = MeshtasticTransport._parse_node_id(sender_id)
if parsed is not None:
return f"!{parsed:08x}"
return f"!{MeshtasticTransport._stable_node_id(sender_id):08x}"
@staticmethod
@@ -489,7 +487,8 @@ class MeshtasticTransport:
# Generate IDs
packet_id = random.randint(1, 0xFFFFFFFF)
from_node = self._stable_node_id(envelope.sender_id)
parsed_sender = self._parse_node_id(envelope.sender_id)
from_node = parsed_sender if parsed_sender is not None else self._stable_node_id(envelope.sender_id)
direct_node = self._parse_node_id(envelope.destination)
to_node = direct_node if direct_node is not None else 0xFFFFFFFF
@@ -521,7 +520,7 @@ class MeshtasticTransport:
def _on_connect(client, userdata, flags, rc):
if rc == 0:
info = client.publish(topic, payload, qos=0)
info = client.publish(topic, payload, qos=1)
info.wait_for_publish(timeout=5)
published[0] = True
client.disconnect()
@@ -529,9 +528,7 @@ class MeshtasticTransport:
error_msg[0] = f"MQTT connect refused: rc={rc}"
client.disconnect()
client = mqtt.Client(
client_id=f"shadowbroker-tx-{envelope.message_id[:8]}", protocol=mqtt.MQTTv311
)
client = mqtt.Client(client_id=f"meshchat-tx-{envelope.message_id[:8]}", protocol=mqtt.MQTTv311)
broker, port, user, pw = self._mqtt_config()
client.username_pw_set(user, pw)
client.on_connect = _on_connect
@@ -553,9 +550,9 @@ class MeshtasticTransport:
True,
self.NAME,
(
f"Published direct to !{to_node:08x} via {region}/{channel}"
f"Broker accepted direct publish to !{to_node:08x} via {region}/{channel}"
if direct_node is not None
else f"Published to {region}/{channel} ({len(payload)}B protobuf)"
else f"Broker accepted channel publish to {region}/{channel} ({len(payload)}B protobuf)"
),
)
except Exception as e:
+178 -4
@@ -11,6 +11,7 @@ import base64
import hmac
import hashlib
import json
import logging
import secrets
import time
from typing import Any
@@ -51,6 +52,8 @@ PREKEY_LOOKUP_ROTATE_BEFORE_REMAINING_USES = 8
PREKEY_LOOKUP_ROTATION_OVERLAP_S = 12 * 60 * 60
PREKEY_LOOKUP_ROTATION_ACTIVE_CAP = 4
logger = logging.getLogger(__name__)
def _safe_int(val, default=0) -> int:
try:
@@ -107,6 +110,7 @@ def _default_identity() -> dict[str, Any]:
def _prekey_lookup_handle_record(
handle: str,
*,
label: str = "",
issued_at: int = 0,
expires_at: int = 0,
max_uses: int = 0,
@@ -125,6 +129,7 @@ def _prekey_lookup_handle_record(
bounded_max_uses = max(1, _safe_int(max_uses or PREKEY_LOOKUP_HANDLE_MAX_USES, PREKEY_LOOKUP_HANDLE_MAX_USES))
return {
"handle": str(handle or "").strip(),
"label": str(label or "").strip()[:96],
"issued_at": issued,
"expires_at": bounded_expires_at,
"max_uses": bounded_max_uses,
@@ -152,8 +157,10 @@ def _coerce_prekey_lookup_handle_record(
max_uses = _safe_int(value.get("max_uses", PREKEY_LOOKUP_HANDLE_MAX_USES) or PREKEY_LOOKUP_HANDLE_MAX_USES)
use_count = _safe_int(value.get("use_count", value.get("uses", 0)) or 0, 0)
last_used_at = _safe_int(value.get("last_used_at", value.get("last_used", 0)) or 0, 0)
label = str(value.get("label", "") or "").strip()
return _prekey_lookup_handle_record(
handle,
label=label,
issued_at=issued_at,
expires_at=expires_at,
max_uses=max_uses,
@@ -228,6 +235,23 @@ def _fresh_prekey_lookup_handle_record(*, now: int | None = None) -> dict[str, A
)
def _prekey_registration_failure_blocks_dm_invite(detail: str) -> bool:
"""Only trust-root failures block address export; transport warm-up can finish later."""
lowered = str(detail or "").lower()
critical_markers = (
"root transparency",
"external root witness",
"stable root",
"witness threshold",
"witness finality",
"root manifest",
"root witness",
"manifest_fingerprint",
"policy fingerprint",
)
return any(marker in lowered for marker in critical_markers)
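The gate is a plain case-insensitive substring scan over the failure detail. A trimmed standalone copy (markers abbreviated from the full list above) makes the blocking/non-blocking split easy to check:

```python
# Standalone copy of the substring gate, with a shortened marker tuple.
CRITICAL_MARKERS = ("root transparency", "witness threshold", "root manifest")

def blocks_dm_invite(detail: str) -> bool:
    """True when the failure touches the trust root; transport warm-up failures pass."""
    lowered = str(detail or "").lower()
    return any(marker in lowered for marker in CRITICAL_MARKERS)
```

Only trust-root failures abort the address export; anything else (e.g. a transport still connecting) is logged as pending and the invite is still issued.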
def _bounded_lookup_handle_records(
records: list[dict[str, Any]],
*,
@@ -884,6 +908,7 @@ def export_wormhole_dm_invite(*, label: str = "", expires_in_s: int = 0) -> dict
existing_handles.append(
_prekey_lookup_handle_record(
lookup_handle,
label=str(label or "").strip(),
issued_at=issued_at,
expires_at=expires_at,
)
@@ -920,14 +945,25 @@ def export_wormhole_dm_invite(*, label: str = "", expires_in_s: int = 0) -> dict
except Exception:
pass
prekey_registration: dict[str, Any] = {"ok": False, "detail": "prekey bundle publish not attempted"}
try:
from services.mesh.mesh_wormhole_prekey import register_wormhole_prekey_bundle
registered = register_wormhole_prekey_bundle()
if not registered.get("ok"):
return {"ok": False, "detail": str(registered.get("detail", "") or "prekey bundle registration failed")}
prekey_registration = register_wormhole_prekey_bundle()
if not prekey_registration.get("ok"):
detail = str(prekey_registration.get("detail", "") or "prekey bundle registration failed")
if _prekey_registration_failure_blocks_dm_invite(detail):
return {"ok": False, "detail": detail}
logger.warning(
"DM invite prekey publish pending: %s",
detail,
)
except Exception as exc:
return {"ok": False, "detail": str(exc) or "prekey bundle registration failed"}
prekey_registration = {"ok": False, "detail": str(exc) or "prekey bundle registration failed"}
detail = str(prekey_registration.get("detail", "") or "")
if _prekey_registration_failure_blocks_dm_invite(detail):
return {"ok": False, "detail": detail}
logger.warning("DM invite prekey publish pending: %s", prekey_registration["detail"])
invite_node_id, invite_public_key, invite_private_key = _generate_invite_signing_identity()
payload = _attach_dm_invite_root_distribution(payload)
@@ -958,6 +994,8 @@ def export_wormhole_dm_invite(*, label: str = "", expires_in_s: int = 0) -> dict
"peer_id": str(invite_node_id or ""),
"trust_fingerprint": str(payload.get("identity_commitment", "") or ""),
"invite": invite,
"prekey_publish_pending": not bool(prekey_registration.get("ok")),
"prekey_registration": prekey_registration,
}
@@ -980,6 +1018,140 @@ def get_prekey_lookup_handle_records() -> list[dict[str, Any]]:
]
def list_prekey_lookup_handle_records_for_ui(*, now: int | None = None) -> dict[str, Any]:
"""Return shareable DM address records without exposing local identity secrets."""
current_time = _safe_int(now or time.time(), int(time.time()))
addresses: list[dict[str, Any]] = []
for record in get_prekey_lookup_handle_records():
handle = str(record.get("handle", "") or "").strip()
if not handle:
continue
expires_at = _effective_prekey_lookup_handle_expires_at(record)
max_uses = max(
1,
_safe_int(
record.get("max_uses", PREKEY_LOOKUP_HANDLE_MAX_USES) or PREKEY_LOOKUP_HANDLE_MAX_USES,
PREKEY_LOOKUP_HANDLE_MAX_USES,
),
)
use_count = max(0, _safe_int(record.get("use_count", 0) or 0, 0))
addresses.append(
{
"handle": handle,
"label": str(record.get("label", "") or "").strip(),
"issued_at": _safe_int(record.get("issued_at", 0) or 0, 0),
"expires_at": expires_at,
"max_uses": max_uses,
"use_count": use_count,
"remaining_uses": max(0, max_uses - use_count),
"last_used_at": _safe_int(record.get("last_used_at", 0) or 0, 0),
"expired": bool(expires_at > 0 and current_time >= expires_at),
"exhausted": bool(use_count >= max_uses),
}
)
addresses.sort(key=lambda item: _safe_int(item.get("issued_at", 0) or 0, 0), reverse=True)
return {"ok": True, "addresses": addresses}
def rename_prekey_lookup_handle(handle: str, label: str) -> dict[str, Any]:
"""Rename an active invite-scoped DM lookup handle without changing the handle."""
lookup_handle = str(handle or "").strip()
next_label = str(label or "").strip()[:96]
if not lookup_handle:
return {"ok": False, "detail": "missing_lookup_handle"}
current_time = int(time.time())
data = read_wormhole_identity()
existing, _ = _normalize_prekey_lookup_handles(
data.get("prekey_lookup_handles", []),
fallback_issued_at=current_time,
now=current_time,
)
updated = False
next_records: list[dict[str, Any]] = []
for record in existing:
current = dict(record)
if str(current.get("handle", "") or "").strip() == lookup_handle:
current["label"] = next_label
updated = True
next_records.append(current)
if not updated:
return {
"ok": False,
"handle": lookup_handle,
"label": next_label,
"updated": False,
"detail": "lookup_handle_not_found",
}
normalized_records, _ = _normalize_prekey_lookup_handles(
next_records,
fallback_issued_at=current_time,
now=current_time,
)
_write_identity({"prekey_lookup_handles": normalized_records})
return {
"ok": True,
"handle": lookup_handle,
"label": next_label,
"updated": True,
}
def revoke_prekey_lookup_handle(handle: str) -> dict[str, Any]:
"""Revoke an invite-scoped DM lookup handle for future first-contact attempts."""
lookup_handle = str(handle or "").strip()
if not lookup_handle:
return {"ok": False, "detail": "missing_lookup_handle"}
current_time = int(time.time())
data = read_wormhole_identity()
existing, _ = _normalize_prekey_lookup_handles(
data.get("prekey_lookup_handles", []),
fallback_issued_at=current_time,
now=current_time,
)
next_records = [
dict(record)
for record in existing
if str(record.get("handle", "") or "").strip() != lookup_handle
]
identity_removed = len(next_records) != len(existing)
if identity_removed:
_write_identity({"prekey_lookup_handles": next_records})
relay_removed = False
try:
from services.mesh.mesh_dm_relay import dm_relay
relay_removed = bool(dm_relay.unregister_prekey_lookup_alias(lookup_handle))
except Exception:
relay_removed = False
republished = False
detail = ""
if identity_removed:
try:
from services.mesh.mesh_wormhole_prekey import register_wormhole_prekey_bundle
registered = register_wormhole_prekey_bundle()
republished = bool(registered.get("ok"))
if not republished:
detail = str(registered.get("detail", "") or "prekey bundle republish failed")
except Exception as exc:
detail = str(exc) or "prekey bundle republish failed"
return {
"ok": True,
"handle": lookup_handle,
"revoked": bool(identity_removed or relay_removed),
"identity_removed": identity_removed,
"relay_removed": relay_removed,
"republished": republished,
"detail": detail,
}
def record_prekey_lookup_handle_use(handle: str, *, now: int | None = None) -> dict[str, Any] | None:
lookup_handle = str(handle or "").strip()
if not lookup_handle:
@@ -999,6 +1171,7 @@ def record_prekey_lookup_handle_use(handle: str, *, now: int | None = None) -> d
if str(current.get("handle", "") or "").strip() == lookup_handle:
current = _prekey_lookup_handle_record(
lookup_handle,
label=str(current.get("label", "") or "").strip(),
issued_at=_safe_int(current.get("issued_at", 0) or 0, current_time),
expires_at=_safe_int(current.get("expires_at", 0) or 0, 0),
max_uses=_safe_int(current.get("max_uses", PREKEY_LOOKUP_HANDLE_MAX_USES) or PREKEY_LOOKUP_HANDLE_MAX_USES),
@@ -1129,6 +1302,7 @@ def maybe_rotate_prekey_lookup_handles(*, now: int | None = None) -> dict[str, A
candidate_records.append(
_prekey_lookup_handle_record(
old_handle,
label=str(record.get("label", "") or "").strip(),
issued_at=_safe_int(record.get("issued_at", 0) or 0, current_time),
expires_at=overlap_expires_at,
max_uses=_safe_int(record.get("max_uses", PREKEY_LOOKUP_HANDLE_MAX_USES) or PREKEY_LOOKUP_HANDLE_MAX_USES),
@@ -12,6 +12,7 @@ from __future__ import annotations
import base64
import hashlib
import json
import logging
import time
from pathlib import Path
from typing import Any
@@ -23,7 +24,7 @@ from cryptography.hazmat.primitives.asymmetric import ed25519
from services.mesh.mesh_crypto import build_signature_payload, derive_node_id, verify_node_binding, verify_signature
from services.mesh.mesh_protocol import PROTOCOL_VERSION
from services.mesh.mesh_secure_storage import read_domain_json, write_domain_json
from services.mesh.mesh_secure_storage import SecureStorageError, read_domain_json, write_domain_json
from services.mesh.mesh_wormhole_identity import root_identity_fingerprint_for_material
from services.mesh.mesh_wormhole_persona import (
bootstrap_wormhole_persona_state,
@@ -51,6 +52,7 @@ DEFAULT_ROOT_WITNESS_THRESHOLD = 2
DEFAULT_ROOT_WITNESS_MANAGEMENT_SCOPE = "local"
DEFAULT_ROOT_WITNESS_INDEPENDENCE_GROUP = "local_system"
DEFAULT_ROOT_EXTERNAL_WITNESS_MAX_AGE_S = 3600
logger = logging.getLogger(__name__)
def _safe_int(val: Any, default: int = 0) -> int:
@@ -461,12 +463,22 @@ def witness_policy_fingerprint(policy: dict[str, Any]) -> str:
def read_root_distribution_state() -> dict[str, Any]:
raw = read_domain_json(
ROOT_DISTRIBUTION_DOMAIN,
ROOT_DISTRIBUTION_FILE,
_default_state,
base_dir=DATA_DIR,
)
try:
raw = read_domain_json(
ROOT_DISTRIBUTION_DOMAIN,
ROOT_DISTRIBUTION_FILE,
_default_state,
base_dir=DATA_DIR,
)
except SecureStorageError as exc:
detail = str(exc)
if "Failed to decrypt domain JSON" not in detail:
raise
logger.warning(
"Root distribution state could not decrypt; regenerating local witness distribution: %s",
detail,
)
raw = _default_state()
state = {**_default_state(), **dict(raw or {})}
state["witness_identity"] = {**_empty_witness_identity(), **dict(state.get("witness_identity") or {})}
witness_identities, witness_changed = _normalize_witness_identities(
+21 -2
@@ -8,6 +8,7 @@ from typing import Iterable
# Default subscription roots — US-only to avoid flooding the public broker.
# Users can opt into additional regions via MESH_MQTT_EXTRA_ROOTS.
DEFAULT_ROOTS: tuple[str, ...] = ("US",)
DEFAULT_CHANNEL = "LongFast"
# Every known official region root (for UI dropdowns / manual opt-in).
ALL_OFFICIAL_ROOTS: tuple[str, ...] = (
@@ -107,6 +108,20 @@ def normalize_topic_filter(value: str) -> str | None:
return "/".join(parts)
def _default_topics_for_root(root: str) -> list[str]:
"""Return the default LongFast subscriptions for a region root.
The public broker carries protobuf/encrypted traffic under ``/e/`` and
companion decoded JSON traffic under ``/json/``. Positions often arrive on
the protobuf path, while public text is commonly easiest to observe on the
JSON path.
"""
return [
f"msh/{root}/2/e/{DEFAULT_CHANNEL}/#",
f"msh/{root}/2/json/{DEFAULT_CHANNEL}/#",
]
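A self-contained copy of the builder above shows the concrete two-lane filters produced for a single region root:

```python
# Mirrors _default_topics_for_root: one encrypted/protobuf lane (/e/) and one
# decoded-JSON lane (/json/) per region root, both on the default channel.
DEFAULT_CHANNEL = "LongFast"

def default_topics_for_root(root: str) -> list[str]:
    return [
        f"msh/{root}/2/e/{DEFAULT_CHANNEL}/#",
        f"msh/{root}/2/json/{DEFAULT_CHANNEL}/#",
    ]
```

Compared with the previous `msh/{root}/#` wildcard, these two filters subscribe only to the default-channel lanes, which sharply cuts broker traffic while still covering positions (protobuf lane) and public text (JSON lane).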
def build_subscription_topics(
extra_roots: str = "",
extra_topics: str = "",
@@ -119,7 +134,11 @@ def build_subscription_topics(
# via MESH_MQTT_EXTRA_ROOTS to avoid flooding the public broker.
roots.extend(root for root in (normalize_root(item) for item in _split_config_values(extra_roots)) if root)
topics = [f"msh/{root}/#" for root in _dedupe(roots)]
topics = [
topic
for root in _dedupe(roots)
for topic in _default_topics_for_root(root)
]
topics.extend(
topic
for topic in (
@@ -137,7 +156,7 @@ def known_roots(extra_roots: str = "", include_defaults: bool = True) -> list[st
for topic in topics:
if not topic.startswith("msh/") or not topic.endswith("/#"):
continue
root = normalize_root(topic[4:-2])
root = normalize_root(parse_topic_metadata(topic)["root"])
if root:
roots.append(root)
return _dedupe(roots)
@@ -0,0 +1,172 @@
from __future__ import annotations
import json
import os
import time
from pathlib import Path
from typing import Any
from services.config import get_settings
PUBLIC_DEFAULT_USER = "meshdev"
PUBLIC_DEFAULT_PASS = "large4cats"
DATA_DIR = Path(os.environ.get("SB_DATA_DIR", str(Path(__file__).parent.parent / "data")))
if not DATA_DIR.is_absolute():
DATA_DIR = Path(__file__).parent.parent / DATA_DIR
SETTINGS_FILE = DATA_DIR / "meshtastic_mqtt.json"
_cache: dict[str, Any] | None = None
_cache_ts: float = 0.0
_CACHE_TTL = 2.0
def _settings_defaults() -> dict[str, Any]:
try:
s = get_settings()
return {
"enabled": bool(getattr(s, "MESH_MQTT_ENABLED", False)),
"broker": str(getattr(s, "MESH_MQTT_BROKER", "") or "mqtt.meshtastic.org"),
"port": int(getattr(s, "MESH_MQTT_PORT", 1883) or 1883),
"username": str(getattr(s, "MESH_MQTT_USER", "") or PUBLIC_DEFAULT_USER),
"password": str(getattr(s, "MESH_MQTT_PASS", "") or PUBLIC_DEFAULT_PASS),
"psk": str(getattr(s, "MESH_MQTT_PSK", "") or ""),
"include_default_roots": bool(getattr(s, "MESH_MQTT_INCLUDE_DEFAULT_ROOTS", True)),
"extra_roots": str(getattr(s, "MESH_MQTT_EXTRA_ROOTS", "") or ""),
"extra_topics": str(getattr(s, "MESH_MQTT_EXTRA_TOPICS", "") or ""),
}
except Exception:
return {
"enabled": False,
"broker": "mqtt.meshtastic.org",
"port": 1883,
"username": PUBLIC_DEFAULT_USER,
"password": PUBLIC_DEFAULT_PASS,
"psk": "",
"include_default_roots": True,
"extra_roots": "",
"extra_topics": "",
}
def _safe_int(value: Any, default: int) -> int:
try:
parsed = int(value)
except (TypeError, ValueError):
return default
if parsed < 1 or parsed > 65535:
return default
return parsed
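The port validator above rejects non-numeric input and anything outside the valid TCP port range, falling back to the supplied default. A standalone copy (renamed `safe_port` here for clarity):

```python
# Standalone copy of the module's _safe_int port validation.
from typing import Any

def safe_port(value: Any, default: int) -> int:
    try:
        parsed = int(value)
    except (TypeError, ValueError):
        return default
    if parsed < 1 or parsed > 65535:
        return default
    return parsed
```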
def _normalize(data: dict[str, Any]) -> dict[str, Any]:
defaults = _settings_defaults()
return {
"enabled": bool(data.get("enabled", defaults["enabled"])),
"broker": str(data.get("broker", defaults["broker"]) or defaults["broker"]).strip(),
"port": _safe_int(data.get("port", defaults["port"]), defaults["port"]),
"username": str(data.get("username", defaults["username"]) or "").strip(),
"password": str(data.get("password", defaults["password"]) or ""),
"psk": str(data.get("psk", defaults["psk"]) or "").strip(),
"include_default_roots": bool(data.get("include_default_roots", defaults["include_default_roots"])),
"extra_roots": str(data.get("extra_roots", defaults["extra_roots"]) or "").strip(),
"extra_topics": str(data.get("extra_topics", defaults["extra_topics"]) or "").strip(),
"updated_at": _safe_int(data.get("updated_at", 0), 0),
}
def read_meshtastic_mqtt_settings() -> dict[str, Any]:
global _cache, _cache_ts
now = time.monotonic()
if _cache is not None and (now - _cache_ts) < _CACHE_TTL:
return dict(_cache)
if not SETTINGS_FILE.exists():
result = {**_settings_defaults(), "updated_at": 0}
else:
try:
loaded = json.loads(SETTINGS_FILE.read_text(encoding="utf-8"))
except Exception:
loaded = {}
result = _normalize(loaded if isinstance(loaded, dict) else {})
_cache = result
_cache_ts = now
return dict(result)
def write_meshtastic_mqtt_settings(**updates: Any) -> dict[str, Any]:
DATA_DIR.mkdir(parents=True, exist_ok=True)
existing = read_meshtastic_mqtt_settings()
next_data = dict(existing)
for key in (
"enabled",
"broker",
"port",
"username",
"password",
"psk",
"include_default_roots",
"extra_roots",
"extra_topics",
):
if key in updates and updates[key] is not None:
next_data[key] = updates[key]
if "username" in updates and not str(updates.get("username") or "").strip() and "password" not in updates:
next_data["password"] = PUBLIC_DEFAULT_PASS
next_data["updated_at"] = int(time.time())
normalized = _normalize(next_data)
SETTINGS_FILE.write_text(json.dumps(normalized, indent=2), encoding="utf-8")
if os.name != "nt":
os.chmod(SETTINGS_FILE, 0o600)
global _cache, _cache_ts
_cache = normalized
_cache_ts = time.monotonic()
return dict(normalized)
def redacted_meshtastic_mqtt_settings(data: dict[str, Any] | None = None) -> dict[str, Any]:
source = read_meshtastic_mqtt_settings() if data is None else dict(data)
username = str(source.get("username", "") or "")
uses_default_credentials = username in ("", PUBLIC_DEFAULT_USER) and str(source.get("password", "") or "") in (
"",
PUBLIC_DEFAULT_PASS,
)
return {
"enabled": bool(source.get("enabled")),
"broker": str(source.get("broker", "")),
"port": int(source.get("port", 1883) or 1883),
"username": "" if uses_default_credentials else username,
"uses_default_credentials": uses_default_credentials,
"has_password": bool(str(source.get("password", "") or "")),
"has_psk": bool(str(source.get("psk", "") or "")),
"include_default_roots": bool(source.get("include_default_roots", True)),
"extra_roots": str(source.get("extra_roots", "") or ""),
"extra_topics": str(source.get("extra_topics", "") or ""),
"updated_at": int(source.get("updated_at", 0) or 0),
}
def mqtt_connection_config() -> tuple[str, int, str, str]:
data = read_meshtastic_mqtt_settings()
return (
str(data.get("broker") or "mqtt.meshtastic.org"),
int(data.get("port") or 1883),
str(data.get("username") or PUBLIC_DEFAULT_USER),
str(data.get("password") or PUBLIC_DEFAULT_PASS),
)
def mqtt_bridge_enabled() -> bool:
return bool(read_meshtastic_mqtt_settings().get("enabled"))
def mqtt_psk_hex() -> str:
return str(read_meshtastic_mqtt_settings().get("psk", "") or "").strip()
def mqtt_subscription_settings() -> tuple[str, str, bool]:
data = read_meshtastic_mqtt_settings()
return (
str(data.get("extra_roots", "") or ""),
str(data.get("extra_topics", "") or ""),
bool(data.get("include_default_roots", True)),
)
+1 -1
@@ -73,7 +73,7 @@ def fetch_with_curl(url, method="GET", json_data=None, timeout=15, headers=None,
both Python requests and the barebones Windows system curl.
"""
default_headers = {
"User-Agent": "ShadowBroker-OSINT/0.9.7 (+https://github.com/BigBodyCobain/Shadowbroker; contact: bigbodycobain@gmail.com)",
"User-Agent": "ShadowBroker-OSINT/0.9.79 (+https://github.com/BigBodyCobain/Shadowbroker; contact: bigbodycobain@gmail.com)",
}
if headers:
default_headers.update(headers)
+10 -1
@@ -15,6 +15,8 @@ _FEED_URL_REPLACEMENTS = {
"https://www.channelnewsasia.com/rssfeed/8395986": "https://www.channelnewsasia.com/api/v1/rss-outbound-feed?_format=xml",
}
_DEAD_FEED_URLS = {
"https://www.reutersagency.com/feed/?best-topics=world",
"https://rsshub.app/apnews/topics/world-news",
"https://www3.nhk.or.jp/nhkworld/rss/world.xml",
"https://focustaiwan.tw/rss",
"https://english.kyodonews.net/rss/news.xml",
@@ -29,6 +31,11 @@ DEFAULT_FEEDS = [
{"name": "AlJazeera", "url": "https://www.aljazeera.com/xml/rss/all.xml", "weight": 2},
{"name": "NYT", "url": "https://rss.nytimes.com/services/xml/rss/nyt/World.xml", "weight": 1},
{"name": "GDACS", "url": "https://www.gdacs.org/xml/rss.xml", "weight": 5},
{"name": "The War Zone", "url": "https://www.twz.com/feed", "weight": 4},
{"name": "Bellingcat", "url": "https://www.bellingcat.com/feed/", "weight": 4},
{"name": "Guardian", "url": "https://www.theguardian.com/world/rss", "weight": 3},
{"name": "TASS", "url": "https://tass.com/rss/v2.xml", "weight": 2},
{"name": "Xinhua", "url": "http://www.news.cn/english/rss/worldrss.xml", "weight": 2},
{"name": "CNA", "url": "https://www.channelnewsasia.com/api/v1/rss-outbound-feed?_format=xml", "weight": 3},
{"name": "Mercopress", "url": "https://en.mercopress.com/rss/", "weight": 3},
{"name": "SCMP", "url": "https://www.scmp.com/rss/91/feed", "weight": 4},
@@ -73,7 +80,9 @@ def get_feeds() -> list[dict]:
normalised = _normalise_feeds(feeds)
if normalised != feeds:
save_feeds(normalised)
return normalised
if normalised:
return normalised
logger.warning("News feed configuration contained no usable feeds; falling back to defaults")
except (IOError, OSError, json.JSONDecodeError, ValueError) as e:
logger.warning(f"Failed to read news feed config: {e}")
return list(DEFAULT_FEEDS)
+14 -3
View File
@@ -10,7 +10,8 @@ _cache: dict | None = None
_cache_ts: float = 0.0
_CACHE_TTL = 5.0
_DEFAULTS = {
"enabled": False,
"enabled": True,
"operator_disabled": False,
"timemachine_enabled": False,
}
@@ -35,8 +36,16 @@ def read_node_settings() -> dict:
except Exception:
result = {**_DEFAULTS, "updated_at": 0}
else:
operator_disabled = bool(data.get("operator_disabled", False))
raw_enabled = data.get("enabled", _DEFAULTS["enabled"])
# v0.9.7 initially wrote enabled:false as a default/offline state,
# which accidentally blocked InfoNet participation. Treat legacy
# false-without-marker as auto-enabled; only an explicit operator
# disable should keep the participant sync loop off.
enabled = False if operator_disabled else bool(raw_enabled or "operator_disabled" not in data)
result = {
"enabled": bool(data.get("enabled", _DEFAULTS["enabled"])),
"enabled": enabled,
"operator_disabled": operator_disabled,
"timemachine_enabled": bool(data.get("timemachine_enabled", _DEFAULTS["timemachine_enabled"])),
"updated_at": _safe_int(data.get("updated_at", 0) or 0),
}
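The legacy-migration rule in the comment above reduces to a small truth table. This sketch (not the shipped code) reproduces only the `enabled`/`operator_disabled` resolution:

```python
def resolve_enabled(data: dict) -> bool:
    # Explicit operator disable always wins; a legacy settings file that
    # stored enabled=false without the operator_disabled marker is
    # treated as auto-enabled so old installs rejoin InfoNet.
    operator_disabled = bool(data.get("operator_disabled", False))
    raw_enabled = data.get("enabled", True)
    return False if operator_disabled else bool(raw_enabled or "operator_disabled" not in data)

assert resolve_enabled({}) is True                     # fresh install
assert resolve_enabled({"enabled": False}) is True     # legacy false, no marker
assert resolve_enabled({"enabled": False, "operator_disabled": False}) is False
assert resolve_enabled({"enabled": False, "operator_disabled": True}) is False
```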
@@ -48,8 +57,10 @@ def read_node_settings() -> dict:
def write_node_settings(*, enabled: bool | None = None, timemachine_enabled: bool | None = None) -> dict:
DATA_DIR.mkdir(parents=True, exist_ok=True)
existing = read_node_settings()
next_enabled = bool(existing.get("enabled", _DEFAULTS["enabled"])) if enabled is None else bool(enabled)
payload = {
"enabled": bool(existing.get("enabled", _DEFAULTS["enabled"])) if enabled is None else bool(enabled),
"enabled": next_enabled,
"operator_disabled": bool(existing.get("operator_disabled", _DEFAULTS["operator_disabled"])) if enabled is None else not next_enabled,
"timemachine_enabled": bool(existing.get("timemachine_enabled", _DEFAULTS["timemachine_enabled"])) if timemachine_enabled is None else bool(timemachine_enabled),
"updated_at": int(time.time()),
}
+1 -1
View File
@@ -20,7 +20,7 @@ from cachetools import TTLCache
logger = logging.getLogger(__name__)
_SHODAN_BASE = "https://api.shodan.io"
_USER_AGENT = "ShadowBroker/0.9.7 local Shodan connector"
_USER_AGENT = "ShadowBroker/0.9.79 local Shodan connector"
_REQUEST_TIMEOUT = 15
_MIN_INTERVAL_SECONDS = 1.05 # Shodan docs say API plans are rate limited to ~1 req/sec.
_DEFAULT_SEARCH_PAGES = 1
+263 -27
View File
@@ -22,6 +22,12 @@ from collections import deque
from datetime import datetime, timezone
from services.config import get_settings
from services.meshtastic_mqtt_settings import (
mqtt_bridge_enabled,
mqtt_connection_config,
mqtt_psk_hex,
mqtt_subscription_settings,
)
from services.mesh.meshtastic_topics import all_available_roots, build_subscription_topics, known_roots, parse_topic_metadata
logger = logging.getLogger("services.sigint")
@@ -477,22 +483,13 @@ class MeshtasticBridge:
@staticmethod
def _mqtt_config() -> tuple[str, int, str, str]:
"""Return (broker, port, user, password) from settings."""
try:
s = get_settings()
return (
str(s.MESH_MQTT_BROKER or "mqtt.meshtastic.org"),
int(s.MESH_MQTT_PORT or 1883),
str(s.MESH_MQTT_USER or "meshdev"),
str(s.MESH_MQTT_PASS or "large4cats"),
)
except Exception:
return ("mqtt.meshtastic.org", 1883, "meshdev", "large4cats")
return mqtt_connection_config()
@classmethod
def _resolve_psk(cls) -> bytes:
"""Return the PSK from config, or the default LongFast key if empty."""
try:
raw = str(getattr(get_settings(), "MESH_MQTT_PSK", "") or "").strip()
raw = mqtt_psk_hex()
except Exception:
raw = ""
if not raw:
@@ -506,6 +503,11 @@ class MeshtasticBridge:
self._thread: threading.Thread | None = None
self._stop = threading.Event()
self._client_id = self._build_client_id()
self._connected = False
self._last_error = ""
self._last_connected_at = 0.0
self._last_disconnected_at = 0.0
self._last_broker = ""
# Rate-limiter: sliding window of receive timestamps
self._rx_timestamps: deque[float] = deque()
self._rx_dropped = 0
@@ -518,10 +520,11 @@ class MeshtasticBridge:
second client connects with the same id. Using a fixed id made separate
ShadowBroker instances kick each other off the broker.
Includes the app version so the Meshtastic team can track our footprint.
This is deliberately not tied to the user's public mesh address or
ShadowBroker node identity; it is only an MQTT session handle.
"""
suffix = uuid.uuid4().hex[:8]
return f"sb096-{suffix}"
return f"meshchat-{suffix}"
def _dedupe_message(
self,
@@ -542,9 +545,206 @@ class MeshtasticBridge:
self._message_dedupe[key] = now
return False
@staticmethod
def _message_dedupe_key(message: dict) -> str:
sender = str(message.get("from") or "???").strip().lower()
recipient = str(message.get("to") or "broadcast").strip().lower()
text = str(message.get("text") or "").strip()
channel = str(message.get("channel") or "LongFast").strip().lower()
root = str(message.get("root") or message.get("region") or "").strip().lower()
return f"{sender}:{recipient}:{root}:{channel}:{text}"
def append_text_message(self, message: dict, *, dedupe_window_s: float = 5.0) -> bool:
"""Append a Meshtastic text message unless it is a near-immediate echo."""
if not str(message.get("text") or "").strip():
return False
now = time.time()
cutoff = now - max(1.0, dedupe_window_s)
next_message = dict(message)
next_message.setdefault("to", "broadcast")
next_message.setdefault("channel", "LongFast")
next_message.setdefault("timestamp", datetime.utcnow().isoformat() + "Z")
key = self._message_dedupe_key(next_message)
for existing in list(self.messages)[:40]:
if self._message_dedupe_key(existing) != key:
continue
try:
existing_ts_raw = existing.get("timestamp")
existing_ts = (
datetime.fromisoformat(str(existing_ts_raw).replace("Z", "+00:00")).timestamp()
if existing_ts_raw
else now
)
except Exception:
existing_ts = now
if existing_ts >= cutoff:
if not existing.get("root") and next_message.get("root"):
existing["root"] = next_message.get("root")
if not existing.get("region") and next_message.get("region"):
existing["region"] = next_message.get("region")
return False
self.messages.appendleft(next_message)
return True
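Because the dedupe key lower-cases identity fields and fills in defaults, a near-immediate echo of the same text collapses onto one entry. A reduced sketch of the key derivation, with field names matching the message dicts above:

```python
def dedupe_key(msg: dict) -> str:
    # Case is normalized so "!AABBCCDD" and "!aabbccdd" dedupe together;
    # root falls back to region, recipient to broadcast.
    sender = str(msg.get("from") or "???").strip().lower()
    recipient = str(msg.get("to") or "broadcast").strip().lower()
    channel = str(msg.get("channel") or "LongFast").strip().lower()
    root = str(msg.get("root") or msg.get("region") or "").strip().lower()
    text = str(msg.get("text") or "").strip()
    return f"{sender}:{recipient}:{root}:{channel}:{text}"

echo = {"from": "!AABBCCDD", "text": "hi", "root": "US"}
original = {"from": "!aabbccdd", "to": "broadcast", "text": "hi", "region": "us"}
assert dedupe_key(echo) == dedupe_key(original)
```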
@staticmethod
def _coerce_node_ref(value) -> str:
"""Normalize Meshtastic node identifiers into the public !xxxxxxxx form."""
if value is None:
return ""
if isinstance(value, int):
return f"!{value & 0xFFFFFFFF:08x}"
raw = str(value).strip()
if not raw:
return ""
if raw.startswith("!"):
return raw
lowered = raw.lower()
if lowered.startswith("0x"):
try:
return f"!{int(lowered, 16) & 0xFFFFFFFF:08x}"
except ValueError:
return raw
if raw.isdigit():
try:
return f"!{int(raw) & 0xFFFFFFFF:08x}"
except ValueError:
return raw
if len(raw) == 8 and all(ch in "0123456789abcdefABCDEF" for ch in raw):
return f"!{raw.lower()}"
return raw
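The accepted spellings are easiest to see by example. This condensed copy of the logic above (error handling omitted) shows four identifier forms converging on the canonical `!xxxxxxxx` shape:

```python
def coerce_node_ref(value) -> str:
    # Ints, 0x-hex strings, decimal strings, and bare 8-char hex all
    # normalize to the public !xxxxxxxx form; anything else passes through.
    if isinstance(value, int):
        return f"!{value & 0xFFFFFFFF:08x}"
    raw = str(value or "").strip()
    if raw.lower().startswith("0x"):
        return f"!{int(raw, 16) & 0xFFFFFFFF:08x}"
    if raw.isdigit():
        return f"!{int(raw) & 0xFFFFFFFF:08x}"
    if len(raw) == 8 and all(c in "0123456789abcdefABCDEF" for c in raw):
        return f"!{raw.lower()}"
    return raw

for spelling in (3735928559, "0xDEADBEEF", "3735928559", "DeadBeef", "!deadbeef"):
    assert coerce_node_ref(spelling) == "!deadbeef"
```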
@staticmethod
def _first_text_value(*values) -> str:
for value in values:
if isinstance(value, bytes):
value = value.decode("utf-8", errors="replace")
if isinstance(value, str):
text = value.strip()
if text:
return MeshtasticBridge._repair_text_mojibake(text)
return ""
@staticmethod
def _repair_text_mojibake(text: str) -> str:
"""Repair common UTF-8-as-Latin-1 mojibake from MQTT JSON bridges."""
if not text or not any(marker in text for marker in ("Ã", "Ð", "Ñ")):
return text
try:
repaired = text.encode("latin-1").decode("utf-8").strip()
except UnicodeError:
return text
if repaired and repaired != text:
return repaired
return text
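The repair relies on the round trip being lossless: UTF-8 bytes mis-decoded as Latin-1 produce the telltale "Ã"/"Ð"/"Ñ" markers, and re-encoding as Latin-1 recovers the original bytes. A quick demonstration:

```python
# "café" sent as UTF-8 but decoded as Latin-1 by a JSON bridge:
mangled = "café".encode("utf-8").decode("latin-1")
assert mangled == "cafÃ©"  # the "Ã" marker the guard above looks for

# Reverse the mis-decode to recover the original text.
repaired = mangled.encode("latin-1").decode("utf-8")
assert repaired == "café"
```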
@staticmethod
def _first_present(*values):
for value in values:
if value is not None and value != "":
return value
return None
def _extract_json_text_message(self, data: dict, topic: str) -> dict | None:
"""Extract a public Meshtastic text event from decoded MQTT JSON.
Meshtastic JSON brokers are not perfectly uniform. Some packets expose
text at the top level, some under ``decoded`` or ``payload``. Keep this
permissive for receive, but only return messages with non-empty text.
"""
if not isinstance(data, dict):
return None
topic_meta = parse_topic_metadata(topic)
packet = data.get("packet") if isinstance(data.get("packet"), dict) else {}
decoded = data.get("decoded") if isinstance(data.get("decoded"), dict) else {}
payload_obj = data.get("payload")
payload = payload_obj if isinstance(payload_obj, dict) else {}
decoded_payload_obj = decoded.get("payload") if decoded else None
decoded_payload = decoded_payload_obj if isinstance(decoded_payload_obj, dict) else {}
text = self._first_text_value(
data.get("text"),
data.get("message"),
data.get("msg"),
payload_obj if isinstance(payload_obj, str) else "",
payload.get("text"),
payload.get("message"),
payload.get("msg"),
payload.get("payload") if isinstance(payload.get("payload"), str) else "",
decoded.get("text"),
decoded.get("message"),
decoded.get("payload") if isinstance(decoded.get("payload"), str) else "",
decoded_payload.get("text"),
decoded_payload.get("message"),
decoded_payload.get("msg"),
)
if not text:
return None
sender = self._coerce_node_ref(
self._first_present(
data.get("from"),
data.get("fromId"),
data.get("from_id"),
data.get("sender"),
data.get("senderId"),
data.get("sender_id"),
packet.get("from"),
packet.get("fromId"),
packet.get("from_id"),
decoded.get("from"),
)
)
recipient = self._coerce_node_ref(
self._first_present(
data.get("to"),
data.get("toId"),
data.get("to_id"),
data.get("recipient"),
data.get("recipientId"),
data.get("recipient_id"),
packet.get("to"),
packet.get("toId"),
packet.get("to_id"),
decoded.get("to"),
)
)
if not recipient or recipient in {"!ffffffff", "broadcast"}:
recipient = "broadcast"
timestamp = datetime.utcnow().isoformat() + "Z"
rx_time = self._first_present(
data.get("rxTime"),
data.get("rx_time"),
data.get("timestamp"),
packet.get("rxTime"),
packet.get("timestamp"),
)
if isinstance(rx_time, (int, float)) and rx_time > 0:
try:
timestamp = datetime.fromtimestamp(float(rx_time), tz=timezone.utc).isoformat()
except (OSError, ValueError):
pass
return {
"from": sender or topic.split("/")[-1],
"to": recipient,
"text": text[:500],
"region": topic_meta["region"],
"root": topic_meta["root"],
"channel": topic_meta["channel"],
"timestamp": timestamp,
}
def start(self):
if self._thread and self._thread.is_alive():
return
if not self._stop.is_set():
return
self._thread.join(timeout=2.0)
if self._thread.is_alive():
logger.warning("Meshtastic MQTT bridge is still stopping; start deferred")
return
self._stop.clear()
self._thread = threading.Thread(target=self._run, daemon=True, name="mesh-bridge")
self._thread.start()
@@ -552,13 +752,37 @@ class MeshtasticBridge:
def stop(self):
self._stop.set()
self._connected = False
def is_running(self) -> bool:
return bool(self._thread and self._thread.is_alive() and not self._stop.is_set())
def status(self) -> dict:
broker, port, user, _pw = self._mqtt_config()
display_user = "" if user == "meshdev" else user
return {
"enabled": mqtt_bridge_enabled(),
"running": self.is_running(),
"connected": bool(self._connected),
"broker": broker,
"port": port,
"username": display_user,
"client_id": self._client_id,
"message_log_size": len(self.messages),
"signal_log_size": len(self.signals),
"last_error": self._last_error,
"last_broker": self._last_broker,
"last_connected_at": self._last_connected_at,
"last_disconnected_at": self._last_disconnected_at,
"rx_dropped": self._rx_dropped,
}
def _subscription_topics(self) -> list[str]:
settings = get_settings()
extra_roots, extra_topics, include_defaults = mqtt_subscription_settings()
return build_subscription_topics(
extra_roots=str(getattr(settings, "MESH_MQTT_EXTRA_ROOTS", "") or ""),
extra_topics=str(getattr(settings, "MESH_MQTT_EXTRA_TOPICS", "") or ""),
include_defaults=bool(getattr(settings, "MESH_MQTT_INCLUDE_DEFAULT_ROOTS", True)),
extra_roots=extra_roots,
extra_topics=extra_topics,
include_defaults=include_defaults,
)
def _run(self):
@@ -582,6 +806,9 @@ class MeshtasticBridge:
def _on_connect(client, userdata, flags, rc):
if rc == 0:
self._connected = True
self._last_error = ""
self._last_connected_at = time.time()
logger.info(
"Meshtastic MQTT connected (%s), subscribing to %s",
self._client_id,
@@ -590,6 +817,8 @@ class MeshtasticBridge:
for topic in topics:
client.subscribe(topic, qos=0)
else:
self._connected = False
self._last_error = f"connect_refused:{rc}"
logger.error(
"Meshtastic MQTT connection refused (%s): rc=%s",
self._client_id,
@@ -597,7 +826,10 @@ class MeshtasticBridge:
)
def _on_disconnect(client, userdata, rc):
self._connected = False
self._last_disconnected_at = time.time()
if rc != 0:
self._last_error = f"disconnect:{rc}"
logger.warning(
"Meshtastic MQTT disconnected unexpectedly (%s, rc=%s), will auto-reconnect",
self._client_id,
@@ -607,6 +839,7 @@ class MeshtasticBridge:
logger.info("Meshtastic MQTT disconnected cleanly (%s)", self._client_id)
broker, port, user, pw = self._mqtt_config()
self._last_broker = f"{broker}:{port}"
client = mqtt.Client(client_id=self._client_id, protocol=mqtt.MQTTv311)
client.username_pw_set(user, pw)
client.on_connect = _on_connect
@@ -645,9 +878,6 @@ class MeshtasticBridge:
def _on_message(self, client, userdata, msg):
"""Parse Meshtastic MQTT messages — protobuf + AES decryption."""
try:
if self._rate_limited():
return
payload = msg.payload
topic = msg.topic
@@ -655,6 +885,11 @@ class MeshtasticBridge:
if "/json/" in topic:
try:
data = json.loads(payload)
text_message = self._extract_json_text_message(data, topic)
if text_message:
self.append_text_message(text_message, dedupe_window_s=30.0)
if self._rate_limited():
return
self._ingest_data(data, topic)
return
except (json.JSONDecodeError, UnicodeDecodeError):
@@ -675,7 +910,7 @@ class MeshtasticBridge:
topic_meta["root"],
):
return
self.messages.appendleft(
self.append_text_message(
{
"from": data.get("from", "???"),
"to": recipient,
@@ -687,6 +922,8 @@ class MeshtasticBridge:
}
)
else:
if self._rate_limited():
return
self._ingest_data(data, topic)
except Exception as e:
@@ -1011,7 +1248,7 @@ class SIGINTGrid:
self._started = True
self.aprs.start()
try:
mqtt_enabled = bool(getattr(get_settings(), "MESH_MQTT_ENABLED", False))
mqtt_enabled = mqtt_bridge_enabled()
except Exception:
mqtt_enabled = False
if mqtt_enabled:
@@ -1123,13 +1360,12 @@ class SIGINTGrid:
ch = msg.get("channel", "LongFast")
channel_msgs[ch] = channel_msgs.get(ch, 0) + 1
extra_roots, _extra_topics, include_defaults = mqtt_subscription_settings()
return {
"regions": regions,
"roots": roots,
"known_roots": known_roots(
str(getattr(get_settings(), "MESH_MQTT_EXTRA_ROOTS", "") or ""),
include_defaults=bool(getattr(get_settings(), "MESH_MQTT_INCLUDE_DEFAULT_ROOTS", True)),
),
"known_roots": known_roots(extra_roots, include_defaults=include_defaults),
"all_roots": all_available_roots(),
"channel_messages": channel_msgs,
"total_nodes": len(seen_callsigns),
+69 -94
View File
@@ -1,13 +1,9 @@
"""Tor Hidden Service auto-provisioner.
"""Tor hidden-service auto-provisioner.
Manages a Tor hidden service that points to the local ShadowBroker backend.
Tor is started as a subprocess with a generated torrc; no manual config is needed.
Auto-installs the Tor Expert Bundle on Windows if not present.
Usage:
from services.tor_hidden_service import tor_service
status = tor_service.start() # -> {"ok": True, "onion_address": "http://xxxx.onion:8000"}
tor_service.stop()
Tor is started as a subprocess with a generated torrc. Windows source installs
can download the Tor Expert Bundle into backend/data without admin rights.
Docker images should already include the `tor` package.
"""
from __future__ import annotations
@@ -31,31 +27,33 @@ HOSTNAME_PATH = TOR_DIR / "hidden_service" / "hostname"
TOR_DATA_DIR = TOR_DIR / "data"
PIDFILE_PATH = TOR_DIR / "tor.pid"
# Bundled Tor install location (inside our data dir so no admin rights needed)
# Bundled Tor install location (inside data dir so no admin rights are needed).
TOR_INSTALL_DIR = TOR_DIR / "tor_bin"
# How long to wait for Tor to generate the hostname file
_STARTUP_TIMEOUT_S = 90
_POLL_INTERVAL_S = 1.0
# Tor Expert Bundle download URL (Windows x86_64)
_TOR_EXPERT_BUNDLE_URL = "https://dist.torproject.org/torbrowser/15.0.8/tor-expert-bundle-windows-x86_64-15.0.8.tar.gz"
# Windows x86_64 Tor Expert Bundle URLs. Keep a fallback so first-run
# onboarding does not break when Tor rotates point releases.
_TOR_EXPERT_BUNDLE_URLS = [
"https://dist.torproject.org/torbrowser/15.0.11/tor-expert-bundle-windows-x86_64-15.0.11.tar.gz",
"https://dist.torproject.org/torbrowser/15.0.8/tor-expert-bundle-windows-x86_64-15.0.8.tar.gz",
]
def _find_tor_binary() -> str | None:
"""Locate the tor binary on the system, including our bundled install."""
# Check our bundled install first
bundled = TOR_INSTALL_DIR / "tor" / "tor.exe"
if bundled.exists():
return str(bundled)
# Also check for extracted layout variants
for sub in TOR_INSTALL_DIR.rglob("tor.exe"):
return str(sub)
tor = shutil.which("tor")
if tor:
return tor
# Common locations on Windows
for candidate in [
r"C:\Program Files\Tor Browser\Browser\TorBrowser\Tor\tor.exe",
r"C:\Program Files (x86)\Tor Browser\Browser\TorBrowser\Tor\tor.exe",
@@ -67,77 +65,65 @@ def _find_tor_binary() -> str | None:
def _auto_install_tor() -> str | None:
"""Download and extract the Tor Expert Bundle. Returns path to tor binary or None."""
"""Install or download Tor when it is safe to do so."""
if os.name != "nt":
# On Linux/Mac, try package manager
try:
if shutil.which("apt-get"):
subprocess.run(["sudo", "apt-get", "install", "-y", "tor"], check=True, capture_output=True, timeout=120)
elif shutil.which("brew"):
subprocess.run(["brew", "install", "tor"], check=True, capture_output=True, timeout=120)
elif shutil.which("pacman"):
subprocess.run(["sudo", "pacman", "-S", "--noconfirm", "tor"], check=True, capture_output=True, timeout=120)
else:
logger.warning("No supported package manager found for auto-install")
return None
return shutil.which("tor")
except Exception as exc:
logger.error("Failed to auto-install Tor via package manager: %s", exc)
return None
# In Docker this should already be baked into the image. For source
# installs we avoid unattended sudo prompts from a web request path.
logger.warning("Tor is not installed. Install the tor package or use the Docker image with Tor baked in.")
return None
# Windows: download Tor Expert Bundle (no admin needed)
TOR_INSTALL_DIR.mkdir(parents=True, exist_ok=True)
archive_path = TOR_INSTALL_DIR / "tor-expert-bundle.tar.gz"
try:
logger.info("Downloading Tor Expert Bundle over HTTPS from dist.torproject.org...")
urlretrieve(_TOR_EXPERT_BUNDLE_URL, str(archive_path))
# Verify SHA-256 of the downloaded archive
sha256_url = _TOR_EXPERT_BUNDLE_URL + ".sha256sum"
sha256_file = TOR_INSTALL_DIR / "sha256sum.txt"
for bundle_url in _TOR_EXPERT_BUNDLE_URLS:
archive_path = TOR_INSTALL_DIR / "tor-expert-bundle.tar.gz"
try:
urlretrieve(sha256_url, str(sha256_file))
expected_hash = sha256_file.read_text().strip().split()[0].lower()
import hashlib
actual_hash = hashlib.sha256(archive_path.read_bytes()).hexdigest().lower()
sha256_file.unlink(missing_ok=True)
if actual_hash != expected_hash:
logger.error("SHA-256 MISMATCH — download may be compromised! Expected %s, got %s", expected_hash, actual_hash)
archive_path.unlink(missing_ok=True)
return None
logger.info("SHA-256 verified: %s", actual_hash[:16] + "...")
except Exception as hash_err:
# If we can't fetch the hash file, warn but proceed (HTTPS provides baseline integrity)
logger.warning("Could not verify SHA-256 (hash file unavailable): %s — proceeding with HTTPS-only verification", hash_err)
logger.info("Downloading Tor Expert Bundle over HTTPS from %s...", bundle_url)
urlretrieve(bundle_url, str(archive_path))
logger.info("Download complete, extracting...")
sha256_url = bundle_url + ".sha256sum"
sha256_file = TOR_INSTALL_DIR / "sha256sum.txt"
try:
urlretrieve(sha256_url, str(sha256_file))
expected_hash = sha256_file.read_text().strip().split()[0].lower()
import hashlib
# Extract .tar.gz with path traversal protection
import tarfile
with tarfile.open(str(archive_path), "r:gz") as tar:
for member in tar.getmembers():
member_path = (TOR_INSTALL_DIR / member.name).resolve()
if not str(member_path).startswith(str(TOR_INSTALL_DIR.resolve())):
logger.error("Tar path traversal blocked: %s", member.name)
actual_hash = hashlib.sha256(archive_path.read_bytes()).hexdigest().lower()
sha256_file.unlink(missing_ok=True)
if actual_hash != expected_hash:
logger.error("SHA-256 mismatch for Tor download. Expected %s, got %s", expected_hash, actual_hash)
archive_path.unlink(missing_ok=True)
return None
tar.extractall(path=str(TOR_INSTALL_DIR))
continue
logger.info("SHA-256 verified: %s", actual_hash[:16] + "...")
except Exception as hash_err:
logger.warning(
"Could not verify SHA-256 (hash file unavailable): %s; proceeding with HTTPS-only verification",
hash_err,
)
# Clean up archive
archive_path.unlink(missing_ok=True)
logger.info("Download complete, extracting...")
import tarfile
# Find the tor.exe in extracted files
for p in TOR_INSTALL_DIR.rglob("tor.exe"):
logger.info("Tor installed at: %s", p)
return str(p)
with tarfile.open(str(archive_path), "r:gz") as tar:
for member in tar.getmembers():
member_path = (TOR_INSTALL_DIR / member.name).resolve()
if not str(member_path).startswith(str(TOR_INSTALL_DIR.resolve())):
logger.error("Tar path traversal blocked: %s", member.name)
archive_path.unlink(missing_ok=True)
return None
tar.extractall(path=str(TOR_INSTALL_DIR))
logger.error("tor.exe not found after extraction")
return None
except Exception as exc:
logger.error("Failed to download/extract Tor: %s", exc)
archive_path.unlink(missing_ok=True)
return None
archive_path.unlink(missing_ok=True)
for p in TOR_INSTALL_DIR.rglob("tor.exe"):
logger.info("Tor installed at: %s", p)
return str(p)
logger.error("tor.exe not found after extracting %s", bundle_url)
except Exception as exc:
logger.error("Failed to download/extract Tor from %s: %s", bundle_url, exc)
finally:
archive_path.unlink(missing_ok=True)
return None
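The digest check above compares the first token of the published `.sha256sum` file against a locally computed hash. A self-contained sketch of that verification step, using a stand-in payload instead of the real archive:

```python
import hashlib

def verify_sha256(payload: bytes, sha256sum_text: str) -> bool:
    # The .sha256sum format is "<hex digest>  <filename>"; only the
    # first whitespace-delimited token matters for the comparison.
    expected = sha256sum_text.strip().split()[0].lower()
    actual = hashlib.sha256(payload).hexdigest().lower()
    return actual == expected

payload = b"stand-in archive bytes"
published = hashlib.sha256(payload).hexdigest() + "  tor-expert-bundle.tar.gz"
assert verify_sha256(payload, published)
assert not verify_sha256(b"tampered bytes", published)
```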
class TorHiddenService:
@@ -150,7 +136,6 @@ class TorHiddenService:
self._running = False
self._error: str = ""
# Check if we already have a hostname from a previous run
if HOSTNAME_PATH.exists():
try:
hostname = HOSTNAME_PATH.read_text().strip()
@@ -198,19 +183,20 @@ class TorHiddenService:
self._error = ""
tor_bin = _find_tor_binary()
if not tor_bin:
logger.info("Tor not found, attempting auto-install...")
logger.info("Tor not found, attempting bootstrap...")
tor_bin = _auto_install_tor()
if not tor_bin:
self._error = "Failed to auto-install Tor. Please install it manually."
self._error = (
"Could not prepare Tor automatically. Check network access to dist.torproject.org "
"or install Tor, then try again."
)
return {"ok": False, "detail": self._error}
# Create directories
TOR_DIR.mkdir(parents=True, exist_ok=True)
TOR_DATA_DIR.mkdir(parents=True, exist_ok=True)
hidden_service_dir = TOR_DIR / "hidden_service"
hidden_service_dir.mkdir(parents=True, exist_ok=True)
# On non-Windows, Tor requires strict permissions on HiddenServiceDir
if os.name != "nt":
try:
os.chmod(str(hidden_service_dir), 0o700)
@@ -218,19 +204,15 @@ class TorHiddenService:
except OSError:
pass
# Write torrc — enables both hidden service (inbound) and SOCKS proxy
# (outbound) so the mesh/wormhole system can route node-to-node
# traffic through Tor as well.
torrc_content = (
f"DataDirectory {TOR_DATA_DIR.as_posix()}\n"
f"HiddenServiceDir {hidden_service_dir.as_posix()}\n"
f"HiddenServicePort {target_port} 127.0.0.1:{target_port}\n"
f"SocksPort 9050\n"
f"Log notice stderr\n"
"SocksPort 9050\n"
"Log notice stderr\n"
)
TORRC_PATH.write_text(torrc_content, encoding="utf-8")
# Start Tor
try:
self._process = subprocess.Popen(
[tor_bin, "-f", str(TORRC_PATH)],
@@ -245,15 +227,12 @@ class TorHiddenService:
logger.error(self._error)
return {"ok": False, "detail": self._error}
# Wait for hostname file to appear
deadline = time.monotonic() + _STARTUP_TIMEOUT_S
while time.monotonic() < deadline:
if self._process.poll() is not None:
# Tor exited prematurely
stdout = self._process.stdout.read() if self._process.stdout else ""
self._error = f"Tor exited with code {self._process.returncode}"
if stdout:
# Get last few lines for error context
lines = stdout.strip().split("\n")
self._error += ": " + " | ".join(lines[-3:])
self._running = False
@@ -273,7 +252,6 @@ class TorHiddenService:
time.sleep(_POLL_INTERVAL_S)
# Timeout
self._error = f"Tor did not generate hostname within {_STARTUP_TIMEOUT_S}s"
self.stop()
return {"ok": False, "detail": self._error}
@@ -292,10 +270,7 @@ class TorHiddenService:
pass
self._process = None
self._running = False
# Keep the onion_address — it persists across restarts
# since the key is stored in hidden_service_dir
return {"ok": True, "detail": "stopped"}
# Singleton
tor_service = TorHiddenService()
@@ -0,0 +1,57 @@
from __future__ import annotations
import logging
from typing import Any
logger = logging.getLogger(__name__)
def disable_public_mesh_lane(*, reason: str = "private_lane_enabled") -> dict[str, Any]:
"""Disable public Meshtastic MQTT before private Wormhole/Infonet starts."""
result: dict[str, Any] = {
"ok": True,
"reason": reason,
"settings_disabled": False,
"runtime_stopped": False,
}
# Scheduled Wormhole prewarm must not mutate the user's explicit public
# MeshChat session. Only a deliberate private-lane activation should sever
# the public MQTT lane.
normalized_reason = str(reason or "").strip().lower()
if normalized_reason == "wormhole_scheduled_prewarm" or normalized_reason.endswith(":scheduled_prewarm"):
try:
from services.meshtastic_mqtt_settings import mqtt_bridge_enabled
if mqtt_bridge_enabled():
logger.info("Keeping public Mesh lane active during Wormhole prewarm: %s", reason)
result["skipped"] = True
result["skip_reason"] = "public_mesh_user_enabled"
return result
except Exception as exc:
logger.debug("Could not inspect public Mesh state during %s: %s", reason, exc)
logger.info("Disabling public Mesh lane: %s", reason)
try:
from services.meshtastic_mqtt_settings import write_meshtastic_mqtt_settings
settings = write_meshtastic_mqtt_settings(enabled=False)
result["settings_disabled"] = not bool(settings.get("enabled"))
except Exception as exc:
logger.warning("Failed to disable public Mesh settings during %s: %s", reason, exc)
result["ok"] = False
result["settings_error"] = str(exc)
try:
from services.sigint_bridge import sigint_grid
if sigint_grid.mesh.is_running():
sigint_grid.mesh.stop()
result["runtime_stopped"] = not sigint_grid.mesh.is_running()
except Exception as exc:
logger.warning("Failed to stop public Mesh runtime during %s: %s", reason, exc)
result["ok"] = False
result["runtime_error"] = str(exc)
return result
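The prewarm guard keys off the reason string alone. A standalone restatement of that predicate shows which reasons leave a user-enabled public MeshChat session untouched:

```python
def is_scheduled_prewarm(reason: str) -> bool:
    # Exact match or a ":scheduled_prewarm" suffix marks a background
    # prewarm that must not sever a user-enabled public MQTT lane.
    normalized = str(reason or "").strip().lower()
    return normalized == "wormhole_scheduled_prewarm" or normalized.endswith(":scheduled_prewarm")

assert is_scheduled_prewarm("wormhole_scheduled_prewarm")
assert is_scheduled_prewarm("wormhole_connect:scheduled_prewarm")
assert not is_scheduled_prewarm("private_lane_enabled")
```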
+1 -1
View File
@@ -24,7 +24,7 @@ from cachetools import TTLCache
logger = logging.getLogger(__name__)
_FINNHUB_BASE = "https://finnhub.io/api/v1"
_USER_AGENT = "ShadowBroker/0.9.7 Finnhub connector"
_USER_AGENT = "ShadowBroker/0.9.79 Finnhub connector"
_REQUEST_TIMEOUT = 12
_MIN_INTERVAL_SECONDS = 0.35 # Stay well under 60 calls/min
+80 -16
View File
@@ -243,6 +243,48 @@ def _pid_alive(pid: int) -> bool:
return True
def _find_wormhole_server_pid() -> int:
if os.name == "nt":
return 0
proc_dir = Path("/proc")
if not proc_dir.exists():
return 0
current_pid = os.getpid()
script_name = WORMHOLE_SCRIPT.name
script_path = str(WORMHOLE_SCRIPT)
for entry in proc_dir.iterdir():
if not entry.name.isdigit():
continue
pid = int(entry.name)
if pid == current_pid:
continue
try:
raw = (entry / "cmdline").read_bytes()
except OSError:
continue
cmdline = raw.replace(b"\x00", b" ").decode("utf-8", errors="replace")
if script_path in cmdline or script_name in cmdline:
return pid
return 0
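`/proc/<pid>/cmdline` stores argv NUL-delimited; the scan above flattens it to a space-joined string before substring matching. A sketch of just that decode step (`wormhole_server.py` is a hypothetical script name):

```python
# argv as the kernel exposes it: NUL-separated, usually NUL-terminated.
raw = b"/usr/bin/python3\x00wormhole_server.py\x00--port\x008765\x00"
cmdline = raw.replace(b"\x00", b" ").decode("utf-8", errors="replace")
assert cmdline.startswith("/usr/bin/python3 wormhole_server.py")
assert "wormhole_server.py" in cmdline
```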
def _terminate_pid(pid: int, *, timeout_s: float = 5.0) -> None:
if os.name == "nt" or pid <= 0:
return
try:
os.kill(pid, signal.SIGTERM)
except Exception:
return
deadline = time.monotonic() + timeout_s
while time.monotonic() < deadline and _pid_alive(pid):
time.sleep(0.1)
if _pid_alive(pid):
try:
os.kill(pid, signal.SIGKILL)
except Exception:
pass
def _probe_ready(timeout_s: float = 1.5) -> bool:
try:
with urlopen(f"http://{WORMHOLE_HOST}:{WORMHOLE_PORT}/api/health", timeout=timeout_s) as resp:
@@ -266,17 +308,32 @@ def _probe_json(path: str, timeout_s: float = 1.5) -> dict[str, Any] | None:
def _current_runtime_state() -> dict[str, Any]:
settings = read_wormhole_settings()
status = read_wormhole_status()
configured = bool(settings.get("enabled"))
running = False
ready = False
pid = int(status.get("pid", 0) or 0)
if _PROCESS and _PROCESS.poll() is None:
if not configured:
# Disabled private transport must stay disabled even if a stale local
# wormhole process is still answering on the health port. Public
# MeshChat relies on this state to keep the MQTT and Wormhole lanes
# mutually exclusive.
pid = 0
ready = False
elif _PROCESS and _PROCESS.poll() is None:
running = True
pid = int(_PROCESS.pid or 0)
elif _pid_alive(pid):
running = True
elif _probe_ready(timeout_s=0.35):
running = True
pid = 0
ready = running and _probe_ready()
else:
if _pid_alive(pid):
running = True
else:
discovered_pid = _find_wormhole_server_pid()
if discovered_pid > 0:
running = True
pid = discovered_pid
if not running and _probe_ready(timeout_s=0.35):
running = True
pid = 0
ready = running and _probe_ready()
if not running:
pid = 0
transport_active = status.get("transport_active", "") if ready else ""
@@ -319,13 +376,13 @@ def _current_runtime_state() -> dict[str, Any]:
anonymous_mode = bool(settings.get("anonymous_mode"))
anonymous_mode_ready = bool(
anonymous_mode
and settings.get("enabled")
and configured
and ready
and effective_transport in {"tor", "tor_arti", "i2p", "mixnet"}
)
snapshot = {
"installed": _installed(),
"configured": bool(settings.get("enabled")),
"configured": configured,
"running": running,
"ready": ready,
"transport_configured": str(settings.get("transport", "direct") or "direct"),
@@ -395,6 +452,12 @@ def get_wormhole_state() -> dict[str, Any]:
def connect_wormhole(*, reason: str = "connect") -> dict[str, Any]:
with _LOCK:
_invalidate_state_cache()
try:
from services.transport_lane_isolation import disable_public_mesh_lane
disable_public_mesh_lane(reason=f"wormhole_{reason}")
except Exception as exc:
logger.warning("Failed to enforce public/private lane isolation during %s: %s", reason, exc)
settings = read_wormhole_settings()
if not settings.get("enabled"):
settings = settings.copy()
@@ -487,8 +550,8 @@ def connect_wormhole(*, reason: str = "connect") -> dict[str, Any]:
def disconnect_wormhole(*, reason: str = "disconnect") -> dict[str, Any]:
with _LOCK:
_invalidate_state_cache()
current = _current_runtime_state()
pid = int(current.get("pid", 0) or 0)
status = read_wormhole_status()
pid = int(status.get("pid", 0) or 0)
global _PROCESS
if _PROCESS and _PROCESS.poll() is None:
try:
@@ -499,14 +562,15 @@ def disconnect_wormhole(*, reason: str = "disconnect") -> dict[str, Any]:
_PROCESS.kill()
except Exception:
pass
elif os.name != "nt" and _pid_alive(pid):
try:
os.kill(pid, signal.SIGTERM)
except Exception:
pass
if os.name != "nt":
_terminate_pid(pid)
discovered_pid = _find_wormhole_server_pid()
if discovered_pid > 0 and discovered_pid != pid:
_terminate_pid(discovered_pid)
_PROCESS = None
write_wormhole_status(
reason=reason,
configured=False,
running=False,
ready=False,
pid=0,
@@ -37,6 +37,30 @@ def test_eligible_sync_peers_filters_bucket_and_cooldown():
assert [record.peer_url for record in candidates] == ["https://active.example"]
def test_eligible_sync_peers_prioritizes_explicit_bootstrap_seed():
old_runtime = make_sync_peer_record(
peer_url="https://old-runtime.example",
transport="clearnet",
role="participant",
source="runtime",
now=100,
)
seed = make_sync_peer_record(
peer_url="https://node.shadowbroker.info",
transport="clearnet",
role="seed",
source="bundle",
now=200,
)
candidates = eligible_sync_peers([old_runtime, seed], now=300)
assert [record.peer_url for record in candidates] == [
"https://node.shadowbroker.info",
"https://old-runtime.example",
]
def test_finish_sync_success_updates_schedule():
state = begin_sync(SyncWorkerState(), peer_url="https://seed.example", now=100)
finished = finish_sync(
@@ -52,7 +52,9 @@ def test_refresh_node_peer_store_promotes_manifest_peers_to_sync_only(tmp_path,
monkeypatch.setenv("MESH_BOOTSTRAP_SIGNER_PUBLIC_KEY", manifest_pub)
monkeypatch.setenv("MESH_BOOTSTRAP_MANIFEST_PATH", str(manifest_path))
monkeypatch.setenv("MESH_RELAY_PEERS", "https://operator.example")
monkeypatch.setenv("MESH_BOOTSTRAP_SEED_PEERS", "")
monkeypatch.setenv("MESH_DEFAULT_SYNC_PEERS", "")
monkeypatch.setenv("MESH_INFONET_ALLOW_CLEARNET_SYNC", "true")
get_settings.cache_clear()
try:
@@ -74,7 +76,7 @@ def test_refresh_node_peer_store_promotes_manifest_peers_to_sync_only(tmp_path,
assert [record.peer_url for record in store.records_for_bucket("push")] == ["https://operator.example"]
def test_refresh_node_peer_store_adds_default_seed_as_pull_only_peer(tmp_path, monkeypatch):
def test_refresh_node_peer_store_adds_bootstrap_seed_as_pull_only_peer(tmp_path, monkeypatch):
import main
from services.config import get_settings
from services.mesh import mesh_peer_store as peer_store_mod
@@ -82,7 +84,9 @@ def test_refresh_node_peer_store_adds_default_seed_as_pull_only_peer(tmp_path, m
peer_store_path = tmp_path / "peer_store.json"
monkeypatch.setattr(peer_store_mod, "DEFAULT_PEER_STORE_PATH", peer_store_path)
monkeypatch.setenv("MESH_RELAY_PEERS", "")
monkeypatch.setenv("MESH_DEFAULT_SYNC_PEERS", "https://node.shadowbroker.info")
monkeypatch.setenv("MESH_BOOTSTRAP_SEED_PEERS", "https://node.shadowbroker.info")
monkeypatch.setenv("MESH_DEFAULT_SYNC_PEERS", "")
monkeypatch.setenv("MESH_INFONET_ALLOW_CLEARNET_SYNC", "true")
monkeypatch.setenv("MESH_BOOTSTRAP_SIGNER_PUBLIC_KEY", "")
get_settings.cache_clear()
@@ -94,6 +98,7 @@ def test_refresh_node_peer_store_adds_default_seed_as_pull_only_peer(tmp_path, m
get_settings.cache_clear()
assert snapshot["manifest_loaded"] is False
assert snapshot["bootstrap_seed_peer_count"] == 1
assert snapshot["default_sync_peer_count"] == 1
assert snapshot["bootstrap_peer_count"] == 1
assert snapshot["sync_peer_count"] == 1
@@ -107,6 +112,36 @@ def test_refresh_node_peer_store_adds_default_seed_as_pull_only_peer(tmp_path, m
assert store.records_for_bucket("sync")[0].source == "bundle"
def test_refresh_node_peer_store_suppresses_clearnet_seed_by_default(tmp_path, monkeypatch):
import main
from services.config import get_settings
from services.mesh import mesh_peer_store as peer_store_mod
peer_store_path = tmp_path / "peer_store.json"
monkeypatch.setattr(peer_store_mod, "DEFAULT_PEER_STORE_PATH", peer_store_path)
monkeypatch.setenv("MESH_RELAY_PEERS", "")
monkeypatch.setenv("MESH_BOOTSTRAP_SEED_PEERS", "https://node.shadowbroker.info")
monkeypatch.setenv("MESH_DEFAULT_SYNC_PEERS", "")
monkeypatch.delenv("MESH_INFONET_ALLOW_CLEARNET_SYNC", raising=False)
monkeypatch.setenv("MESH_BOOTSTRAP_SIGNER_PUBLIC_KEY", "")
get_settings.cache_clear()
try:
snapshot = main._refresh_node_peer_store(now=1_750_000_000)
store = peer_store_mod.PeerStore(peer_store_path)
store.load()
finally:
get_settings.cache_clear()
assert snapshot["private_transport_required"] is True
assert snapshot["skipped_clearnet_peer_count"] == 1
assert snapshot["bootstrap_peer_count"] == 0
assert snapshot["sync_peer_count"] == 0
assert "no clearnet sync fallback" in snapshot["last_bootstrap_error"]
assert store.records_for_bucket("bootstrap") == []
assert store.records_for_bucket("sync") == []
def test_verify_peer_push_hmac_requires_allowlisted_peer(monkeypatch):
import hashlib
import hmac
@@ -172,13 +207,19 @@ def test_infonet_status_includes_node_runtime_snapshot(monkeypatch):
def test_public_sync_cycle_allows_first_node_without_peers(tmp_path, monkeypatch):
import main
from services.config import get_settings
from services.mesh import mesh_peer_store as peer_store_mod
peer_store_path = tmp_path / "peer_store.json"
monkeypatch.setattr(peer_store_mod, "DEFAULT_PEER_STORE_PATH", peer_store_path)
monkeypatch.setattr(main, "_participant_node_enabled", lambda: True)
monkeypatch.setenv("MESH_INFONET_ALLOW_CLEARNET_SYNC", "true")
get_settings.cache_clear()
result = main._run_public_sync_cycle()
try:
result = main._run_public_sync_cycle()
finally:
get_settings.cache_clear()
assert result.last_outcome == "solo"
assert result.last_error == ""
@@ -96,3 +96,38 @@ def test_peer_store_failure_and_success_lifecycle(tmp_path):
assert recovered.cooldown_until == 0
assert recovered.last_error == ""
assert recovered.last_sync_ok_at == 250
def test_upsert_explicit_seed_clears_stale_cooldown(tmp_path):
store = PeerStore(tmp_path / "peer_store.json")
store.upsert(
make_sync_peer_record(
peer_url="https://node.shadowbroker.info",
transport="clearnet",
role="seed",
source="bundle",
now=100,
)
)
failed = store.mark_failure(
"https://node.shadowbroker.info",
"sync",
error="timed out",
cooldown_s=120,
now=110,
)
assert failed.cooldown_until == 230
refreshed = store.upsert(
make_sync_peer_record(
peer_url="https://node.shadowbroker.info",
transport="clearnet",
role="seed",
source="bundle",
now=120,
)
)
assert refreshed.failure_count == 0
assert refreshed.cooldown_until == 0
assert refreshed.last_error == ""
@@ -0,0 +1,54 @@
import importlib
def test_meshtastic_mqtt_settings_redacts_secrets(tmp_path, monkeypatch):
monkeypatch.setenv("SB_DATA_DIR", str(tmp_path))
from services import meshtastic_mqtt_settings
settings = importlib.reload(meshtastic_mqtt_settings)
saved = settings.write_meshtastic_mqtt_settings(
enabled=True,
broker="mqtt.example.test",
port=1884,
username="mesh-user",
password="mesh-pass",
psk="001122",
include_default_roots=False,
extra_roots="EU,US",
)
redacted = settings.redacted_meshtastic_mqtt_settings(saved)
assert saved["password"] == "mesh-pass"
assert saved["psk"] == "001122"
assert redacted["enabled"] is True
assert redacted["broker"] == "mqtt.example.test"
assert redacted["port"] == 1884
assert redacted["username"] == "mesh-user"
assert redacted["has_password"] is True
assert redacted["has_psk"] is True
assert "password" not in redacted
assert "psk" not in redacted
assert settings.mqtt_connection_config() == ("mqtt.example.test", 1884, "mesh-user", "mesh-pass")
assert settings.mqtt_bridge_enabled() is True
assert settings.mqtt_psk_hex() == "001122"
assert settings.mqtt_subscription_settings() == ("EU,US", "", False)
def test_meshtastic_mqtt_settings_hide_public_defaults(tmp_path, monkeypatch):
monkeypatch.setenv("SB_DATA_DIR", str(tmp_path))
from services import meshtastic_mqtt_settings
settings = importlib.reload(meshtastic_mqtt_settings)
saved = settings.write_meshtastic_mqtt_settings(
enabled=True,
broker="mqtt.meshtastic.org",
username="",
password="",
)
redacted = settings.redacted_meshtastic_mqtt_settings(saved)
assert redacted["username"] == ""
assert redacted["uses_default_credentials"] is True
assert settings.mqtt_connection_config() == ("mqtt.meshtastic.org", 1883, "meshdev", "large4cats")
@@ -0,0 +1,27 @@
from services.mesh.meshtastic_topics import build_subscription_topics, known_roots, parse_topic_metadata
def test_default_subscription_is_longfast_only():
assert build_subscription_topics() == [
"msh/US/2/e/LongFast/#",
"msh/US/2/json/LongFast/#",
]
assert known_roots() == ["US"]
def test_extra_roots_are_longfast_only():
assert build_subscription_topics(extra_roots="EU_868,ANZ") == [
"msh/US/2/e/LongFast/#",
"msh/US/2/json/LongFast/#",
"msh/EU_868/2/e/LongFast/#",
"msh/EU_868/2/json/LongFast/#",
"msh/ANZ/2/e/LongFast/#",
"msh/ANZ/2/json/LongFast/#",
]
def test_parse_longfast_topic_root():
meta = parse_topic_metadata("msh/US/2/e/LongFast/!12345678")
assert meta["region"] == "US"
assert meta["root"] == "US"
assert meta["channel"] == "LongFast"
@@ -7,9 +7,44 @@ def test_node_settings_roundtrip(tmp_path, monkeypatch):
monkeypatch.setattr(node_settings, "_cache_ts", 0.0)
initial = node_settings.read_node_settings()
disabled = node_settings.write_node_settings(enabled=False)
updated = node_settings.write_node_settings(enabled=True)
reread = node_settings.read_node_settings()
assert initial["enabled"] is False
assert initial["enabled"] is True
assert initial["operator_disabled"] is False
assert disabled["enabled"] is False
assert disabled["operator_disabled"] is True
assert updated["enabled"] is True
assert updated["operator_disabled"] is False
assert reread["enabled"] is True
def test_legacy_disabled_node_settings_auto_enable(tmp_path, monkeypatch):
from services import node_settings
settings_path = tmp_path / "node.json"
settings_path.write_text('{"enabled": false, "updated_at": 123}', encoding="utf-8")
monkeypatch.setattr(node_settings, "NODE_FILE", settings_path)
monkeypatch.setattr(node_settings, "_cache", None)
monkeypatch.setattr(node_settings, "_cache_ts", 0.0)
reread = node_settings.read_node_settings()
assert reread["enabled"] is True
assert reread["operator_disabled"] is False
def test_explicit_operator_disabled_stays_disabled(tmp_path, monkeypatch):
from services import node_settings
settings_path = tmp_path / "node.json"
settings_path.write_text('{"enabled": false, "operator_disabled": true, "updated_at": 123}', encoding="utf-8")
monkeypatch.setattr(node_settings, "NODE_FILE", settings_path)
monkeypatch.setattr(node_settings, "_cache", None)
monkeypatch.setattr(node_settings, "_cache_ts", 0.0)
reread = node_settings.read_node_settings()
assert reread["enabled"] is False
assert reread["operator_disabled"] is True
@@ -145,6 +145,10 @@ class TestFeedConfig:
def test_new_east_asia_feeds_present(self):
names = {f["name"] for f in DEFAULT_FEEDS}
expected = {"FocusTaiwan", "Kyodo", "SCMP", "The Diplomat", "Stars and Stripes",
"Yonhap", "Nikkei Asia", "Taipei Times", "Asia Times", "Defense News", "Japan Times"}
expected = {"SCMP", "The Diplomat", "Yonhap", "Asia Times", "Defense News", "Japan Times"}
assert expected.issubset(names)
def test_known_dead_feeds_are_not_defaulted(self):
urls = {f["url"] for f in DEFAULT_FEEDS}
assert "https://www.reutersagency.com/feed/?best-topics=world" not in urls
assert "https://rsshub.app/apnews/topics/world-news" not in urls
@@ -1,12 +1,12 @@
{
"name": "@shadowbroker/desktop-shell",
"version": "0.9.7",
"version": "0.9.79",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "@shadowbroker/desktop-shell",
"version": "0.9.7",
"version": "0.9.79",
"devDependencies": {
"typescript": "^5.6.0"
}
@@ -1,6 +1,6 @@
{
"name": "@shadowbroker/desktop-shell",
"version": "0.9.7",
"version": "0.9.79",
"private": true,
"description": "ShadowBroker desktop shell packaging, runtime bridge, and release tooling",
"scripts": {
@@ -9,6 +9,7 @@ $repoRoot = Resolve-Path (Join-Path $scriptDir "..\..")
$frontendDir = Join-Path $repoRoot "frontend"
$frontendOut = Join-Path $frontendDir "out"
$srcTauriDir = Join-Path $scriptDir "src-tauri"
$tauriConfigPath = Join-Path $srcTauriDir "tauri.conf.json"
$companionDir = Join-Path $srcTauriDir "companion-www"
$backendRuntimeDir = Join-Path $srcTauriDir "backend-runtime"
$iconsScript = Join-Path $scriptDir "scripts\generate-icons.cjs"
@@ -43,6 +44,18 @@ function Invoke-External {
}
}
function Write-Utf8NoBom {
param(
[Parameter(Mandatory = $true)]
[string]$Path,
[Parameter(Mandatory = $true)]
[string]$Content
)
$encoding = New-Object System.Text.UTF8Encoding($false)
[System.IO.File]::WriteAllText($Path, $Content, $encoding)
}
foreach ($tool in @("cargo", "npm", "node")) {
if (-not (Get-Command $tool -ErrorAction SilentlyContinue)) {
throw "$tool is required for desktop packaging."
@@ -107,6 +120,7 @@ Write-Host " -> $fileCount files"
Write-Host ""
Push-Location $srcTauriDir
$tauriConfigBackup = $null
try {
if (-not $env:SHADOWBROKER_BACKEND_URL) {
$env:SHADOWBROKER_BACKEND_URL = "http://127.0.0.1:8000"
@@ -131,6 +145,14 @@ try {
Write-Host "Updater signing: enabled"
} else {
Write-Host "Updater signing: disabled (set TAURI_SIGNING_PRIVATE_KEY_PATH to emit update signatures)"
$tauriConfigBackup = Get-Content -LiteralPath $tauriConfigPath -Raw
$tauriConfig = $tauriConfigBackup | ConvertFrom-Json
if ($tauriConfig.bundle.createUpdaterArtifacts) {
$tauriConfig.bundle.createUpdaterArtifacts = $false
$tauriConfig |
ConvertTo-Json -Depth 100 |
ForEach-Object { Write-Utf8NoBom -Path $tauriConfigPath -Content ($_ + "`n") }
}
}
Write-Host ""
@@ -147,5 +169,8 @@ try {
}
}
finally {
if ($null -ne $tauriConfigBackup) {
Write-Utf8NoBom -Path $tauriConfigPath -Content $tauriConfigBackup
}
Pop-Location
}
@@ -43,6 +43,18 @@ function prepareBuildTree() {
filter: shouldCopy,
});
const stagedLayoutPath = path.join(buildFrontendDir, 'src', 'app', 'layout.tsx');
if (fs.existsSync(stagedLayoutPath)) {
const layoutSource = fs.readFileSync(stagedLayoutPath, 'utf8');
fs.writeFileSync(
stagedLayoutPath,
layoutSource
.replace(/\n\/\/ The dashboard is a live local runtime[\s\S]*?client polling ever hydrates\.\n/g, '\n')
.replace(/\nexport const dynamic = ['"]force-dynamic['"];\n/g, '\n')
.replace(/\nexport const revalidate = 0;\n/g, '\n'),
);
}
const liveNodeModules = path.join(frontendDir, 'node_modules');
const stagedNodeModules = path.join(buildFrontendDir, 'node_modules');
if (!fs.existsSync(liveNodeModules)) {
@@ -4201,7 +4201,7 @@ dependencies = [
[[package]]
name = "shadowbroker-tauri-shell"
version = "0.9.7"
version = "0.9.79"
dependencies = [
"axum",
"base64 0.22.1",
@@ -1,6 +1,6 @@
[package]
name = "shadowbroker-tauri-shell"
version = "0.9.7"
version = "0.9.79"
edition = "2021"
[build-dependencies]
@@ -1,7 +1,7 @@
{
"$schema": "https://schema.tauri.app/config/2",
"productName": "ShadowBroker",
"version": "0.9.7",
"version": "0.9.79",
"identifier": "com.shadowbroker.desktop",
"build": {
"frontendDist": "../../../frontend/out",
@@ -21,7 +21,6 @@ services:
resources:
limits:
memory: 2G
cpus: '2'
volumes:
relay_data:
@@ -11,7 +11,7 @@ services:
image: ghcr.io/bigbodycobain/shadowbroker-backend:latest
container_name: shadowbroker-backend
ports:
- "${BIND:-127.0.0.1}:8000:8000"
- "${BIND:-127.0.0.1}:${BACKEND_PORT:-8000}:8000"
environment:
- AIS_API_KEY=${AIS_API_KEY:-}
- OPENSKY_CLIENT_ID=${OPENSKY_CLIENT_ID:-}
@@ -21,12 +21,25 @@ services:
- FINNHUB_API_KEY=${FINNHUB_API_KEY:-}
# Override allowed CORS origins (comma-separated). Auto-detects LAN IPs if empty.
- CORS_ORIGINS=${CORS_ORIGINS:-}
# Default public Infonet seed used for pull-only sync by fresh installs.
- MESH_DEFAULT_SYNC_PEERS=${MESH_DEFAULT_SYNC_PEERS:-https://node.shadowbroker.info}
# Private Infonet bootstrap seeds. Seeds are discovery hints, not fixed roots.
- MESH_BOOTSTRAP_SEED_PEERS=${MESH_BOOTSTRAP_SEED_PEERS:-http://gqpbunqbgtkcqilvclm3xrkt3zowjyl3s62kkktvojgvxzizamvbrqid.onion:8000}
- MESH_DEFAULT_SYNC_PEERS=${MESH_DEFAULT_SYNC_PEERS:-}
# Operator-trusted sync/push peers. Leave empty unless you control the peer secret on both sides.
- MESH_RELAY_PEERS=${MESH_RELAY_PEERS:-}
# Shared transport auth for operator peer push. Must be set to a unique secret per deployment.
- MESH_PEER_PUSH_SECRET=${MESH_PEER_PUSH_SECRET:-}
# Meshtastic MQTT is opt-in to avoid passive load on the public broker.
# Set MESH_MQTT_ENABLED=true in .env only when this node should join live MQTT.
- MESH_MQTT_ENABLED=${MESH_MQTT_ENABLED:-false}
- MESH_MQTT_BROKER=${MESH_MQTT_BROKER:-mqtt.meshtastic.org}
- MESH_MQTT_PORT=${MESH_MQTT_PORT:-1883}
- MESH_MQTT_USER=${MESH_MQTT_USER:-meshdev}
- MESH_MQTT_PASS=${MESH_MQTT_PASS:-large4cats}
- MESH_MQTT_PSK=${MESH_MQTT_PSK:-}
- MESH_MQTT_INCLUDE_DEFAULT_ROOTS=${MESH_MQTT_INCLUDE_DEFAULT_ROOTS:-true}
- MESH_MQTT_EXTRA_ROOTS=${MESH_MQTT_EXTRA_ROOTS:-}
- MESH_MQTT_EXTRA_TOPICS=${MESH_MQTT_EXTRA_TOPICS:-}
- MESHTASTIC_OPERATOR_CALLSIGN=${MESHTASTIC_OPERATOR_CALLSIGN:-}
# The bundled Docker UI talks to the backend across Docker's private bridge.
# Treat that bridge as local operator access while ports remain bound to 127.0.0.1 by default.
- SHADOWBROKER_TRUST_DOCKER_BRIDGE_LOCAL_OPERATOR=${SHADOWBROKER_TRUST_DOCKER_BRIDGE_LOCAL_OPERATOR:-1}
@@ -42,7 +55,7 @@ services:
deploy:
resources:
limits:
memory: 2G
memory: ${BACKEND_MEMORY_LIMIT:-4G}
cpus: '2'
frontend:
@@ -1,12 +1,12 @@
{
"name": "frontend",
"version": "0.9.7",
"version": "0.9.79",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "frontend",
"version": "0.9.7",
"version": "0.9.79",
"dependencies": {
"@mapbox/point-geometry": "^1.1.0",
"@tauri-apps/plugin-process": "^2.3.1",
@@ -1,6 +1,6 @@
{
"name": "frontend",
"version": "0.9.7",
"version": "0.9.79",
"private": true,
"scripts": {
"dev": "node scripts/dev-all.cjs",
@@ -2,8 +2,8 @@
* Phase 5F-A: CSP nonce plumbing tests.
*
* Validates:
* 1. Nonce appears in document CSP header
* 2. Nonce differs across repeated requests
* 1. Document CSP remains hydration-safe for the Next.js runtime
* 2. CSP is deterministic across repeated requests
* 3. next.config.ts no longer owns a static CSP header
* 4. Middleware does not break API/static routes (matcher exclusion)
* 5. Google Fonts domains are preserved in CSP
@@ -41,58 +41,46 @@ function matcherExcludes(path: string): boolean {
}
// ---------------------------------------------------------------------------
// 1. Nonce appears in document CSP header
// 1. Document CSP remains hydration-safe
// ---------------------------------------------------------------------------
describe('nonce in CSP header', () => {
it('CSP header contains a nonce-<value> token in script-src', () => {
describe('hydration-safe CSP header', () => {
it('CSP header does not put nonce tokens in script-src', () => {
const csp = getCsp();
expect(csp).toMatch(/'nonce-[A-Za-z0-9+/=]+'/);
expect(csp).not.toMatch(/'nonce-[A-Za-z0-9+/=]+'/);
});
it('nonce value is a base64-encoded UUID', () => {
it('script-src keeps the inline compatibility fallback required by Next hydration', () => {
const csp = getCsp();
const match = csp.match(/'nonce-([A-Za-z0-9+/=]+)'/);
expect(match).not.toBeNull();
const decoded = Buffer.from(match![1], 'base64').toString();
// crypto.randomUUID() produces 8-4-4-4-12 hex with dashes
expect(decoded).toMatch(/^[0-9a-f]{8}-[0-9a-f]{4}-/);
expect(csp).toMatch(/script-src [^;]*'unsafe-inline'/);
});
it('x-nonce request header is set on the response', () => {
const res = callMiddleware();
// NextResponse.next({ request: { headers } }) merges into request headers.
// The CSP nonce in the header must match the one forwarded to server components.
const csp = res.headers.get('Content-Security-Policy') ?? '';
const nonceInCsp = csp.match(/'nonce-([A-Za-z0-9+/=]+)'/)?.[1];
expect(nonceInCsp).toBeTruthy();
it('middleware still returns a CSP header for document requests', () => {
const csp = getCsp();
expect(csp).toContain("default-src 'self'");
expect(csp).toContain("script-src 'self'");
});
});
// ---------------------------------------------------------------------------
// 2. Nonce differs across repeated requests
// 2. CSP is deterministic across repeated requests
// ---------------------------------------------------------------------------
describe('nonce uniqueness', () => {
it('two sequential requests produce different nonces', () => {
describe('CSP stability', () => {
it('two sequential requests produce the same document CSP', () => {
const csp1 = getCsp();
const csp2 = getCsp();
const nonce1 = csp1.match(/'nonce-([A-Za-z0-9+/=]+)'/)?.[1];
const nonce2 = csp2.match(/'nonce-([A-Za-z0-9+/=]+)'/)?.[1];
expect(nonce1).toBeTruthy();
expect(nonce2).toBeTruthy();
expect(nonce1).not.toBe(nonce2);
expect(csp1).toBe(csp2);
});
it('ten requests produce ten distinct nonces', () => {
const nonces = new Set<string>();
it('ten requests do not introduce nonce-bearing CSP variants', () => {
const csps = new Set<string>();
for (let i = 0; i < 10; i++) {
const csp = getCsp();
const nonce = csp.match(/'nonce-([A-Za-z0-9+/=]+)'/)?.[1];
expect(nonce).toBeTruthy();
nonces.add(nonce!);
expect(csp).not.toMatch(/'nonce-[A-Za-z0-9+/=]+'/);
csps.add(csp);
}
expect(nonces.size).toBe(10);
expect(csps.size).toBe(1);
});
});
@@ -185,8 +173,9 @@ describe('production CSP directive completeness', () => {
expect(csp).toContain("default-src 'self'");
});
it('has script-src with nonce', () => {
expect(csp).toMatch(/script-src [^;]*'nonce-/);
it('has script-src with hydration compatibility fallback', () => {
expect(csp).toMatch(/script-src [^;]*'unsafe-inline'/);
expect(csp).not.toMatch(/script-src [^;]*'nonce-/);
});
it('has style-src with unsafe-inline and fonts.googleapis.com', () => {
@@ -1,8 +1,9 @@
/**
* Phase 5F-B: Production script-src unsafe-inline removal tests.
* Phase 5F-B: Production script-src nonce hardening tests.
*
* Validates:
* 1. Production CSP omits script-src 'unsafe-inline'
* 1. Production CSP preserves hydration-safe script execution with a compatibility
* inline fallback required by the Next.js production runtime
* 2. Dev CSP retains 'unsafe-inline' and 'unsafe-eval'
* 3. Unchanged directives (style-src, font-src, worker-src, etc.) intact
* 4. API/static route exclusions remain intact
@@ -41,7 +42,7 @@ function matcherExcludes(path: string): boolean {
}
// ---------------------------------------------------------------------------
// 1. Production CSP omits script-src 'unsafe-inline'
// 1. Production CSP stays hardened without blocking Next hydration
// ---------------------------------------------------------------------------
describe('production script-src hardening', () => {
@@ -52,9 +53,9 @@ describe('production script-src hardening', () => {
vi.unstubAllEnvs();
});
it('production script-src does NOT contain unsafe-inline', () => {
it('production script-src contains unsafe-inline compatibility fallback', () => {
const scriptSrc = getDirective('script-src');
expect(scriptSrc).not.toContain("'unsafe-inline'");
expect(scriptSrc).toContain("'unsafe-inline'");
});
it('production script-src does NOT contain unsafe-eval', () => {
@@ -62,9 +63,9 @@ describe('production script-src hardening', () => {
expect(scriptSrc).not.toContain("'unsafe-eval'");
});
it('production script-src contains nonce', () => {
it('production script-src does not contain nonce until all Next inline scripts are wired', () => {
const scriptSrc = getDirective('script-src');
expect(scriptSrc).toMatch(/'nonce-[A-Za-z0-9+/=]+'/);
expect(scriptSrc).not.toMatch(/'nonce-[A-Za-z0-9+/=]+'/);
});
it('production script-src contains self and blob:', () => {
@@ -105,9 +106,9 @@ describe('dev script-src allowances', () => {
expect(scriptSrc).toContain("'unsafe-eval'");
});
it('dev script-src still contains nonce', () => {
it('dev script-src also omits nonce to match production hydration behavior', () => {
const scriptSrc = getDirective('script-src');
expect(scriptSrc).toMatch(/'nonce-[A-Za-z0-9+/=]+'/);
expect(scriptSrc).not.toMatch(/'nonce-[A-Za-z0-9+/=]+'/);
});
it('dev connect-src includes localhost backends', () => {
@@ -213,10 +214,12 @@ describe('per-request environment evaluation', () => {
it('switching NODE_ENV between calls changes script-src', () => {
vi.stubEnv('NODE_ENV', 'production');
const prodScriptSrc = getDirective('script-src');
expect(prodScriptSrc).not.toContain("'unsafe-inline'");
expect(prodScriptSrc).toContain("'unsafe-inline'");
expect(prodScriptSrc).not.toContain("'unsafe-eval'");
vi.stubEnv('NODE_ENV', 'development');
const devScriptSrc = getDirective('script-src');
expect(devScriptSrc).toContain("'unsafe-inline'");
expect(devScriptSrc).toContain("'unsafe-eval'");
});
});
@@ -9,12 +9,12 @@ import {
} from '@/lib/updateRuntime';
const RELEASE: GitHubLatestRelease = {
html_url: 'https://github.com/BigBodyCobain/Shadowbroker/releases/tag/v0.9.7',
html_url: 'https://github.com/BigBodyCobain/Shadowbroker/releases/tag/v0.9.79',
assets: [
{ name: 'ShadowBroker_0.9.7_x64_en-US.msi', browser_download_url: 'https://example.test/windows.msi' },
{ name: 'ShadowBroker_0.9.7_x64-setup.exe', browser_download_url: 'https://example.test/windows-setup.exe' },
{ name: 'ShadowBroker_0.9.7_aarch64.dmg', browser_download_url: 'https://example.test/macos.dmg' },
{ name: 'ShadowBroker_0.9.7_amd64.AppImage', browser_download_url: 'https://example.test/linux.AppImage' },
{ name: 'ShadowBroker_0.9.79_x64_en-US.msi', browser_download_url: 'https://example.test/windows.msi' },
{ name: 'ShadowBroker_0.9.79_x64-setup.exe', browser_download_url: 'https://example.test/windows-setup.exe' },
{ name: 'ShadowBroker_0.9.79_aarch64.dmg', browser_download_url: 'https://example.test/macos.dmg' },
{ name: 'ShadowBroker_0.9.79_amd64.AppImage', browser_download_url: 'https://example.test/linux.AppImage' },
],
};
@@ -179,8 +179,10 @@ describe('MeshChat decomposition — identity persistence', () => {
const controller = readFile('useMeshChatController.ts');
expect(controller).toMatch(/from\s+['"]@\/mesh\/meshIdentity['"]/);
expect(controller).toMatch(/getNodeIdentity/);
expect(controller).toMatch(/generateNodeKeys/);
expect(controller).toMatch(/signEvent/);
expect(controller).toMatch(/getStoredNodeDescriptor/);
expect(controller).toMatch(/nextSequence/);
expect(controller).toMatch(/verifyEventSignature/);
expect(controller).toMatch(/setSecureModeCached/);
});
it('storage module imports from meshIdentity for seal operations', () => {
@@ -2,7 +2,7 @@ import '@testing-library/jest-dom/vitest';
import React from 'react';
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
import { cleanup, fireEvent, render, screen } from '@testing-library/react';
import { cleanup, fireEvent, render, screen, waitFor } from '@testing-library/react';
let contactsState: Record<string, any> = {};
@@ -61,8 +61,29 @@ const mocks = vi.hoisted(() => ({
bootstrapDecryptAccessRequest: vi.fn(async () => 'offer'),
bootstrapEncryptAccessRequest: vi.fn(async () => 'x3dh1:bootstrap'),
canUseWormholeBootstrap: vi.fn(async () => false),
bootstrapWormholeIdentity: vi.fn(async () => ({
node_id: '!sb_local',
public_key: 'local-pub',
public_key_algo: 'Ed25519',
sequence: 1,
protocol_version: 'infonet/2',
})),
exportWormholeDmInvite: vi.fn(async () => ({
ok: true,
invite: {
event_type: 'dm_invite',
payload: {
prekey_lookup_handle: 'handle-123',
expires_at: 2_000_000_000,
},
},
peer_id: '!sb_local',
trust_fingerprint: 'trustfp123456',
prekey_publish_pending: false,
})),
fetchWormholeStatus: vi.fn(async () => ({ ready: true, transport_tier: 'private_strong' })),
fetchWormholeIdentity: vi.fn(async () => ({ node_id: '!sb_local', public_key: 'local-pub' })),
listWormholeDmInviteHandles: vi.fn(async () => ({ ok: true, addresses: [] })),
prepareWormholeInteractiveLane: vi.fn(async () => ({
ready: true,
settingsEnabled: true,
@@ -75,10 +96,13 @@ const mocks = vi.hoisted(() => ({
trust_fingerprint: 'invitefp',
trust_level: 'invite_pinned',
})),
renameWormholeDmInviteHandle: vi.fn(async () => ({ ok: true })),
revokeWormholeDmInviteHandle: vi.fn(async () => ({ ok: true, revoked: true })),
isWormholeReady: vi.fn(async () => true),
isWormholeSecureRequired: vi.fn(async () => false),
issueWormholePairwiseAlias: vi.fn(async () => ({ ok: true, shared_alias: 'alias-123' })),
openWormholeSenderSeal: vi.fn(async () => ({ sender_id: '!sb_peer', seal_verified: true })),
writeClipboard: vi.fn(async () => undefined),
}));
vi.mock('@/lib/api', () => ({
@@ -152,8 +176,10 @@ vi.mock('@/mesh/wormholeDmBootstrapClient', () => ({
}));
vi.mock('@/mesh/wormholeIdentityClient', () => ({
bootstrapWormholeIdentity: mocks.bootstrapWormholeIdentity,
fetchWormholeStatus: mocks.fetchWormholeStatus,
fetchWormholeIdentity: mocks.fetchWormholeIdentity,
exportWormholeDmInvite: mocks.exportWormholeDmInvite,
prepareWormholeInteractiveLane: mocks.prepareWormholeInteractiveLane,
getWormholeDmInviteImportErrorResult: (error: unknown) =>
error && typeof error === 'object' && 'result' in (error as Record<string, unknown>)
@@ -162,8 +188,11 @@ vi.mock('@/mesh/wormholeIdentityClient', () => ({
importWormholeDmInvite: mocks.importWormholeDmInvite,
isWormholeReady: mocks.isWormholeReady,
isWormholeSecureRequired: mocks.isWormholeSecureRequired,
listWormholeDmInviteHandles: mocks.listWormholeDmInviteHandles,
issueWormholePairwiseAlias: mocks.issueWormholePairwiseAlias,
openWormholeSenderSeal: mocks.openWormholeSenderSeal,
renameWormholeDmInviteHandle: mocks.renameWormholeDmInviteHandle,
revokeWormholeDmInviteHandle: mocks.revokeWormholeDmInviteHandle,
}));
import MessagesView from '@/components/InfonetTerminal/MessagesView';
@@ -191,10 +220,21 @@ describe('MessagesView first-contact trust UX', () => {
localStorage.clear();
contactsState = {};
vi.clearAllMocks();
Object.defineProperty(navigator, 'clipboard', {
value: { writeText: mocks.writeClipboard },
configurable: true,
});
mocks.getContacts.mockImplementation(() => contactsState);
mocks.hydrateWormholeContacts.mockImplementation(async () => contactsState);
mocks.fetchWormholeStatus.mockResolvedValue({ ready: true, transport_tier: 'private_strong' });
mocks.bootstrapWormholeIdentity.mockResolvedValue({
node_id: '!sb_local',
public_key: 'local-pub',
public_key_algo: 'Ed25519',
sequence: 1,
protocol_version: 'infonet/2',
});
mocks.prepareWormholeInteractiveLane.mockResolvedValue({
ready: true,
settingsEnabled: true,
@@ -215,6 +255,20 @@ describe('MessagesView first-contact trust UX', () => {
mocks.fetchDmPublicKey.mockResolvedValue({ dh_pub_key: 'peer-dh', dh_algo: 'X25519' });
mocks.sendOffLedgerConsentMessage.mockResolvedValue({ ok: true, transport: 'relay' });
mocks.canUseWormholeBootstrap.mockResolvedValue(false);
mocks.exportWormholeDmInvite.mockResolvedValue({
ok: true,
invite: {
event_type: 'dm_invite',
payload: {
prekey_lookup_handle: 'handle-123',
expires_at: 2_000_000_000,
},
},
peer_id: '!sb_local',
trust_fingerprint: 'trustfp123456',
prekey_publish_pending: false,
});
mocks.listWormholeDmInviteHandles.mockResolvedValue({ ok: true, addresses: [] });
});
afterEach(() => {
@@ -238,7 +292,7 @@ describe('MessagesView first-contact trust UX', () => {
fireEvent.click(screen.getByRole('button', { name: 'Import Signed Invite' }));
expect(await screen.findByText('Import Verified Invite')).toBeInTheDocument();
expect(await screen.findByText("Paste Someone's Address")).toBeInTheDocument();
expect(screen.getByLabelText(/Local Alias/i)).toHaveValue('!sb_unknown');
});
@@ -285,7 +339,7 @@ describe('MessagesView first-contact trust UX', () => {
expect(screen.getByRole('button', { name: 'Send Secure Mail' })).toBeEnabled();
});
it('warms the private lane in the background before sending secure mail', async () => {
it('sends sealed mail without waiting for the private delivery route', async () => {
contactsState = {
'!sb_pinned': {
alias: 'Pinned Peer',
@@ -296,15 +350,29 @@ describe('MessagesView first-contact trust UX', () => {
},
};
mocks.fetchWormholeStatus.mockResolvedValue({ ready: false, transport_tier: 'public_degraded' });
mocks.prepareWormholeInteractiveLane.mockImplementation(
() =>
new Promise(() => {
/* background route prep stays pending */
}),
);
mocks.sendDmMessage.mockResolvedValueOnce({
ok: true,
queued: true,
private_transport_pending: true,
});
renderMessagesView();
await openComposeForRecipient('!sb_pinned', 'hello after warmup');
- fireEvent.click(screen.getByRole('button', { name: 'Send Secure Mail' }));
+ const sendButton = screen.getByRole('button', { name: 'Send Secure Mail' });
+ await waitFor(() => expect(sendButton).toBeEnabled(), { timeout: 5000 });
+ fireEvent.click(sendButton);
- await screen.findByText(/Mail delivered to Pinned Peer/i, {}, { timeout: 5000 });
- expect(mocks.prepareWormholeInteractiveLane).toHaveBeenCalled();
- expect(mocks.sendDmMessage).toHaveBeenCalled();
+ await waitFor(() => expect(mocks.prepareWormholeInteractiveLane).toHaveBeenCalled(), { timeout: 5000 });
+ await waitFor(() => expect(mocks.sendDmMessage).toHaveBeenCalled(), { timeout: 5000 });
+ await screen.findByText(/Mail sealed locally for Pinned Peer/i, {}, { timeout: 5000 });
+ expect(screen.queryByText(/still warming up/i)).not.toBeInTheDocument();
}, 10000);
it('does not flatten witness policy not met into a generic witnessed root label', async () => {
@@ -358,6 +426,70 @@ describe('MessagesView first-contact trust UX', () => {
expect(screen.getByLabelText(/Local Alias/i)).toHaveValue('!sb_unpinned');
});
it('surfaces pending contact requests in the contact list with approve and deny actions', async () => {
localStorage.setItem(
'sb_infonet_mailbox_v1:!sb_local',
JSON.stringify({
version: 1,
items: [
{
id: 'request-1',
msgId: 'request-1',
folder: 'inbox',
kind: 'request',
direction: 'inbound',
senderId: '!sb_requester',
recipientId: '!sb_local',
subject: 'Contact request from !sb_requester',
body: '!sb_requester wants to open a secure mailbox.',
timestamp: 1_778_624_800,
read: false,
transport: 'relay',
deliveryClass: 'request',
requestStatus: 'pending',
requestDhPubKey: 'requester-dh',
requestDhAlgo: 'X25519',
},
],
}),
);
mocks.addContact.mockImplementation((peerId: string, dhPubKey: string, _alias?: string, dhAlgo?: string) => {
contactsState[peerId] = {
alias: 'Requester',
blocked: false,
dhPubKey,
dhAlgo,
trust_level: 'unpinned',
};
});
renderMessagesView();
fireEvent.click(screen.getByRole('button', { name: 'CONTACTS' }));
expect(await screen.findByText('Contact Requests')).toBeInTheDocument();
expect(await screen.findByText('1 pending')).toBeInTheDocument();
expect(await screen.findAllByText('!sb_requester')).toHaveLength(2);
expect(screen.getByRole('button', { name: 'Deny' })).toBeEnabled();
fireEvent.click(screen.getByRole('button', { name: 'Approve' }));
await waitFor(() => expect(mocks.addContact).toHaveBeenCalledWith(
'!sb_requester',
'peer-dh',
undefined,
'X25519',
));
await waitFor(() =>
expect(mocks.sendOffLedgerConsentMessage).toHaveBeenCalledWith(
expect.objectContaining({
recipientId: '!sb_requester',
recipientDhPub: 'peer-dh',
}),
),
);
expect(await screen.findByText(/Contact accepted: Requester\./i)).toBeInTheDocument();
});
it('routes continuity reverify from Secure Messages into Dead Drop with SAS visible', async () => {
contactsState = {
'!sb_reverify': {
@@ -461,18 +593,133 @@ describe('MessagesView first-contact trust UX', () => {
renderMessagesView();
fireEvent.click(screen.getByRole('button', { name: 'CONTACTS' }));
- expect(await screen.findByText('Import Verified Invite')).toBeInTheDocument();
+ expect(await screen.findByText("Paste Someone's Address")).toBeInTheDocument();
- fireEvent.change(screen.getByLabelText(/Signed Invite JSON/i), {
+ fireEvent.change(screen.getByPlaceholderText(/Paste the full text copied/i), {
target: { value: JSON.stringify({ invite: { event_type: 'dm_invite', payload: {} } }) },
});
- fireEvent.click(screen.getByRole('button', { name: 'Import Signed Invite' }));
+ fireEvent.click(screen.getByRole('button', { name: 'Import Address' }));
expect(
await screen.findByText(/INVITE PINNED for !sb_attested \(invitefp\.\.tested\)\./i),
).toBeInTheDocument();
});
it('generates and copies the full signed public address instead of the lookup handle', async () => {
renderMessagesView();
fireEvent.click(await screen.findByRole('button', { name: 'Generate Address' }));
await waitFor(() => expect(mocks.writeClipboard).toHaveBeenCalled());
const copied = String(mocks.writeClipboard.mock.calls[0][0] || '');
expect(copied).toContain('"type": "shadowbroker.infonet.dm.invite"');
expect(copied).toContain('"prekey_lookup_handle": "handle-123"');
expect(copied).not.toBe('handle-123');
expect(await screen.findByText(/Generated and copied/i)).toBeInTheDocument();
expect(screen.getByText(/Signed invite ready/i)).toBeInTheDocument();
expect(screen.queryByText(/shadowbroker\.infonet\.dm\.invite/i)).not.toBeInTheDocument();
});
it('does not advertise legacy handle-only addresses as copyable public addresses', async () => {
localStorage.setItem(
'sb_infonet_dm_addresses_v1:!sb_local',
JSON.stringify({
version: 1,
addresses: [
{
id: 'legacy-address',
label: 'Legacy handle',
handle: 'd8ce691f751817e137066f2a1858e21689b0118f8ec485c1',
peerId: '',
trustFingerprint: '',
inviteBlob: '',
createdAt: 1_700_000_000,
},
],
}),
);
renderMessagesView();
expect(await screen.findByText(/Generate an address, then send it to someone/i)).toBeInTheDocument();
fireEvent.click(screen.getByRole('button', { name: 'CONTACTS' }));
expect(await screen.findByText('Legacy handle')).toBeInTheDocument();
expect(screen.getByText('Address unavailable locally.')).toBeInTheDocument();
expect(screen.getByRole('button', { name: 'Copy' })).toBeDisabled();
});
it('explains raw lookup handles instead of showing a JSON parser error', async () => {
renderMessagesView();
fireEvent.click(screen.getByRole('button', { name: 'CONTACTS' }));
expect(await screen.findByText("Paste Someone's Address")).toBeInTheDocument();
fireEvent.change(screen.getByPlaceholderText(/Paste the full text copied/i), {
target: { value: 'f0eee9e9ccf849bcb2d86c0d7a1e0669c75be4e05533b0f6c67' },
});
expect(await screen.findByText(/only a short address ID/i)).toBeInTheDocument();
expect(screen.getByRole('button', { name: 'Import Address' })).toBeDisabled();
expect(screen.queryByText(/Unexpected number in JSON/i)).not.toBeInTheDocument();
expect(mocks.importWormholeDmInvite).not.toHaveBeenCalled();
});
it('hides pasted signed address JSON until advanced details are opened', async () => {
const signedAddress = JSON.stringify({
type: 'shadowbroker.infonet.dm.invite',
version: 1,
invite: { event_type: 'dm_invite', payload: {} },
});
renderMessagesView();
fireEvent.click(screen.getByRole('button', { name: 'CONTACTS' }));
expect(await screen.findByText("Paste Someone's Address")).toBeInTheDocument();
const addressField = screen.getByPlaceholderText(/Paste the full text copied/i);
fireEvent.paste(addressField, {
clipboardData: {
getData: () => signedAddress,
},
});
expect(screen.getByDisplayValue(/Copied address received\. Ready to import\./i)).toBeInTheDocument();
expect(screen.queryByDisplayValue(/shadowbroker\.infonet\.dm\.invite/i)).not.toBeInTheDocument();
fireEvent.click(screen.getByRole('button', { name: 'Advanced Details' }));
expect(screen.getByLabelText('Raw copied public address')).toHaveValue(signedAddress);
});
it('imports a copied address without waiting for secure mail warm-up', async () => {
mocks.fetchWormholeStatus.mockResolvedValue({ ready: false, transport_tier: 'public_degraded' });
mocks.prepareWormholeInteractiveLane.mockImplementation(
() =>
new Promise(() => {
/* background warm-up stays pending */
}),
);
mocks.importWormholeDmInvite.mockResolvedValueOnce({
ok: true,
peer_id: '!sb_now',
trust_fingerprint: 'invitefp-now',
trust_level: 'invite_pinned',
contact: {},
});
renderMessagesView();
fireEvent.click(screen.getByRole('button', { name: 'CONTACTS' }));
expect(await screen.findByText("Paste Someone's Address")).toBeInTheDocument();
fireEvent.change(screen.getByPlaceholderText(/Paste the full text copied/i), {
target: { value: JSON.stringify({ invite: { event_type: 'dm_invite', payload: {} } }) },
});
fireEvent.click(screen.getByRole('button', { name: 'Import Address' }));
expect(await screen.findByText(/INVITE PINNED for !sb_now \(invitefp-now\)\./i)).toBeInTheDocument();
expect(mocks.importWormholeDmInvite).toHaveBeenCalled();
expect(screen.queryByText(/Secure mail is still warming up/i)).not.toBeInTheDocument();
});
it('announces compat invite imports as TOFU PINNED with backend detail', async () => {
mocks.importWormholeDmInvite.mockResolvedValueOnce({
ok: true,
@@ -485,12 +732,12 @@ describe('MessagesView first-contact trust UX', () => {
renderMessagesView();
fireEvent.click(screen.getByRole('button', { name: 'CONTACTS' }));
- expect(await screen.findByText('Import Verified Invite')).toBeInTheDocument();
+ expect(await screen.findByText("Paste Someone's Address")).toBeInTheDocument();
- fireEvent.change(screen.getByLabelText(/Signed Invite JSON/i), {
+ fireEvent.change(screen.getByPlaceholderText(/Paste the full text copied/i), {
target: { value: JSON.stringify({ invite: { event_type: 'dm_invite', payload: {} } }) },
});
- fireEvent.click(screen.getByRole('button', { name: 'Import Signed Invite' }));
+ fireEvent.click(screen.getByRole('button', { name: 'Import Address' }));
expect(
await screen.findByText(/TOFU PINNED for !sb_compat \(invitefp\.\.compat\)\./i),
@@ -534,12 +781,12 @@ describe('MessagesView first-contact trust UX', () => {
renderMessagesView();
fireEvent.click(screen.getByRole('button', { name: 'CONTACTS' }));
- expect(await screen.findByText('Import Verified Invite')).toBeInTheDocument();
+ expect(await screen.findByText("Paste Someone's Address")).toBeInTheDocument();
- fireEvent.change(screen.getByLabelText(/Signed Invite JSON/i), {
+ fireEvent.change(screen.getByPlaceholderText(/Paste the full text copied/i), {
target: { value: JSON.stringify({ invite: { event_type: 'dm_invite', payload: {} } }) },
});
- fireEvent.click(screen.getByRole('button', { name: 'Import Signed Invite' }));
+ fireEvent.click(screen.getByRole('button', { name: 'Import Address' }));
expect(
await screen.findByText(/CONTINUITY BROKEN for Pinned Peer\. Stable root continuity changed\./i),
@@ -550,7 +797,7 @@ describe('MessagesView first-contact trust UX', () => {
});
it('uses non-blocking secure-mail startup language while the DM lane warms', async () => {
- mocks.fetchWormholeStatus.mockResolvedValueOnce({ ready: false, transport_tier: 'public_degraded' });
+ mocks.fetchWormholeStatus.mockResolvedValue({ ready: false, transport_tier: 'public_degraded' });
mocks.prepareWormholeInteractiveLane.mockImplementation(
() =>
new Promise(() => {
@@ -561,8 +808,9 @@ describe('MessagesView first-contact trust UX', () => {
renderMessagesView();
expect(
- await screen.findByText(/Preparing secure mail in the background/i),
+ await screen.findByText(/Private delivery route is connecting/i),
).toBeInTheDocument();
+ expect(screen.getByText(/Addresses, contacts, and sealed sends can proceed now/i)).toBeInTheDocument();
expect(screen.queryByText(/LOCKED/i)).not.toBeInTheDocument();
expect(screen.queryByText(/enter the Wormhole/i)).not.toBeInTheDocument();
});
@@ -1327,6 +1327,7 @@ describe('wormholeIdentityClient strict profile hints', () => {
expect.objectContaining({
method: 'POST',
headers: { 'Content-Type': 'application/json' },
requireAdminSession: false,
body: JSON.stringify({
invite: { event_type: 'dm_invite' },
alias: 'field contact',
@@ -1378,6 +1379,7 @@ describe('wormholeIdentityClient strict profile hints', () => {
const prepared = await mod.prepareWormholeInteractiveLane({ bootstrapIdentity: true });
expect(connectWormhole).toHaveBeenCalledTimes(1);
expect(connectWormhole).toHaveBeenCalledWith({ requireAdminSession: false });
expect(joinWormhole).not.toHaveBeenCalled();
expect(prepared).toEqual(
expect.objectContaining({
+14 -3
@@ -26,6 +26,8 @@ const STRIP_REQUEST = new Set([
'transfer-encoding',
'upgrade',
'host',
'content-length',
'expect',
]);
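The set above lists request headers the proxy must not forward upstream: hop-by-hop headers plus `content-length` and `expect`, which become wrong once the body is re-buffered. A minimal sketch of that filtering step, assuming a plain header map and a hypothetical helper name (`forwardableHeaders` is not the project's actual API):

```typescript
// Illustrative filter: copy only headers that are safe to forward upstream.
// The names mirror the STRIP_REQUEST set above; header comparison is
// case-insensitive, so everything is lowercased before the lookup.
const STRIP = new Set([
  'connection',
  'keep-alive',
  'transfer-encoding',
  'upgrade',
  'host',
  'content-length',
  'expect',
]);

function forwardableHeaders(incoming: Record<string, string>): Record<string, string> {
  const out: Record<string, string> = {};
  for (const [name, value] of Object.entries(incoming)) {
    if (!STRIP.has(name.toLowerCase())) out[name] = value;
  }
  return out;
}
```

Dropping `host` and `content-length` lets the fetch layer recompute both for the upstream target instead of replaying the browser's values.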
// Headers that must not be forwarded back to the browser.
@@ -64,6 +66,7 @@ function isSensitiveProxyPath(pathSegments: string[]): boolean {
if (joined === 'system/update') return true;
if (pathSegments[0] === 'settings') return true;
if (joined === 'mesh/infonet/ingest') return true;
if (joined === 'mesh/meshtastic/send') return true;
// mesh/peers and all tools/* use require_local_operator on the backend and
// need X-Admin-Key injected on the server-side proxy leg.
if (pathSegments[0] === 'mesh' && pathSegments[1] === 'peers') return true;
@@ -201,9 +204,10 @@ async function proxy(req: NextRequest, pathSegments: string[]): Promise<NextResp
cache: 'no-store',
};
if (!isBodyless) {
- requestInit.body = req.body;
- // Required for streaming request bodies in Node.js fetch
- requestInit.duplex = 'half';
+ const body = await req.text();
+ if (body.length > 0) {
+ requestInit.body = body;
+ }
}
const maxAttempts = isBodyless ? 18 : 1;
let fetchError: unknown = null;
@@ -214,6 +218,13 @@ async function proxy(req: NextRequest, pathSegments: string[]): Promise<NextResp
break;
} catch (error) {
fetchError = error;
+ if (attempt >= maxAttempts) {
+ console.error('api proxy upstream fetch failed', {
+ method: req.method,
+ target: targetUrl.toString(),
+ error,
+ });
+ }
if (attempt >= maxAttempts) break;
await sleep(250);
}
+6
@@ -8,6 +8,12 @@ export const metadata: Metadata = {
description: 'Advanced Geopolitical Risk Dashboard',
};
// The dashboard is a live local runtime, not a static landing page. If Next
// prerenders and caches the initial shell, Docker users can get stuck on the
// "prioritizing map feeds" markup before client polling ever hydrates.
export const dynamic = 'force-dynamic';
export const revalidate = 0;
export default function RootLayout({
children,
}: Readonly<{
+38 -2
@@ -20,11 +20,23 @@ import {
Heart,
} from 'lucide-react';
- const CURRENT_VERSION = '0.9.7';
+ const CURRENT_VERSION = '0.9.79';
const STORAGE_KEY = `shadowbroker_changelog_v${CURRENT_VERSION}`;
- const RELEASE_TITLE = 'Agentic AI Channel + InfoNet Decentralized Intelligence';
+ const RELEASE_TITLE = 'Onboarding, Live Feeds, Mesh, and Agent Hardening';
const HEADLINE_FEATURES = [
{
icon: <Bot size={20} className="text-purple-400" />,
accent: 'purple' as const,
title: 'Agentic onboarding for OpenClaw-compatible agents',
subtitle: 'First-time setup now includes local/direct agent connection, access-tier selection, copyable HMAC setup, and optional Tor hidden-service prep.',
details: [
'The onboarding flow can generate the local agent connection bundle through the existing HMAC API, point agents at /api/ai/tools, and let operators choose restricted read-only or full write access before connecting an agent.',
'Remote mode is labeled honestly: .onion exposes the signed HTTP agent API over Tor. Wormhole/MLS is not claimed as the current agent command transport.',
'The setup copy works for OpenClaw, Hermes, or any custom agent that implements the documented HMAC request contract.',
],
callToAction: 'OPEN FIRST-TIME SETUP -> AI AGENT',
},
{
icon: <Bot size={20} className="text-purple-400" />,
accent: 'purple' as const,
@@ -53,6 +65,26 @@ const HEADLINE_FEATURES = [
];
const NEW_FEATURES = [
{
icon: <Clock size={18} className="text-cyan-400" />,
title: 'Startup and Feed Responsiveness Pass',
desc: 'Map-critical feeds now lean on startup caches and priority preload behavior so the dashboard can paint before heavyweight synthesis jobs finish.',
},
{
icon: <Network size={18} className="text-green-400" />,
title: 'MeshChat MQTT Settings',
desc: 'Public MeshChat stays opt-in and now has an in-panel settings lane for broker, port, username, password, and channel PSK while remaining separated from Wormhole/private mode.',
},
{
icon: <Plane size={18} className="text-cyan-400" />,
title: 'Selected Entity Trails',
desc: 'Flight and vessel trails are drawn only for selected assets, reducing global clutter while still exposing movement history for unknown-route entities.',
},
{
icon: <Plane size={18} className="text-amber-400" />,
title: 'Aircraft Detail Cards',
desc: 'Commercial aircraft stay airline-first, while private and general aviation aircraft can show model-focused Wiki context and imagery when available.',
},
{
icon: <Cpu size={18} className="text-purple-400" />,
title: 'AI Batch Command Channel',
@@ -101,6 +133,10 @@ const NEW_FEATURES = [
];
const BUG_FIXES = [
'Docker proxy and backend port handling hardened so changing the host backend port does not require changing the internal service contract.',
'Global Threat Intercept and live-data startup paths no longer wait on slow-tier synthesis before cached data can paint the UI.',
'MeshChat and Infonet statuses now separate public MQTT participation, private Wormhole mode, and local node bootstrap so the UI does not imply the wrong connection state.',
'Commercial aircraft detail cards no longer show a confusing model image alongside the airline card.',
'Sovereign Shell adaptive polling — voting and challenge windows refresh every 8 seconds while active, every 30 to 60 seconds when idle. Voting feels live without a websocket layer.',
'Per-row write actions (petitions, upgrades, disputes) hold isolated submission state so concurrent forms no longer share a single in-flight slot.',
'Verbatim diagnostic surfacing on every write button. The backend reason text is always shown on rejection — no opaque "denied" toasts.',
@@ -11,7 +11,6 @@ import {
} from '@/mesh/infonetEconomyClient';
import { generateNodeKeys, getNodeIdentity } from '@/mesh/meshIdentity';
import {
DEFAULT_INFONET_SEED_URL,
fetchInfonetNodeStatusSnapshot,
setInfonetNodeEnabled,
type InfonetNodeStatusSnapshot,
@@ -57,9 +56,12 @@ export default function BootstrapView({ marketId, onBack }: BootstrapViewProps)
const nodeEnabled = Boolean(nodeStatus?.node_enabled);
const nodeMode = String(nodeStatus?.node_mode || 'participant').toUpperCase();
const syncOutcome = String(nodeStatus?.sync_runtime?.last_outcome || 'idle').toLowerCase();
- const seedPeerCount = Number(nodeStatus?.bootstrap?.default_sync_peer_count || 0);
+ const seedPeerCount = Number(
+ nodeStatus?.bootstrap?.bootstrap_seed_peer_count ?? nodeStatus?.bootstrap?.default_sync_peer_count ?? 0,
+ );
const syncPeerCount = Number(nodeStatus?.bootstrap?.sync_peer_count || 0);
const lastPeerUrl = String(nodeStatus?.sync_runtime?.last_peer_url || '').trim();
const privateTransportRequired = Boolean(nodeStatus?.private_transport_required);
const toggleNode = useCallback(async (enabled: boolean) => {
setNodeToggleBusy(true);
@@ -146,8 +148,10 @@ export default function BootstrapView({ marketId, onBack }: BootstrapViewProps)
</div>
<div className="grid grid-cols-1 md:grid-cols-3 gap-2 text-xs">
<div>
- <div className="text-gray-500">Default Seed</div>
- <div className="text-cyan-300 font-mono break-all">{DEFAULT_INFONET_SEED_URL}</div>
+ <div className="text-gray-500">Transport</div>
+ <div className="text-cyan-300 font-mono break-all">
+ {privateTransportRequired ? 'ONION / RNS ONLY' : 'CLEARNET DEV OVERRIDE'}
+ </div>
</div>
<div>
<div className="text-gray-500">Local Node</div>
@@ -158,15 +162,15 @@ export default function BootstrapView({ marketId, onBack }: BootstrapViewProps)
<div>
<div className="text-gray-500">Sync Path</div>
<div className="text-white font-mono">
- {syncPeerCount} peers / {seedPeerCount} default
+ {syncPeerCount} peers / {seedPeerCount} seeds
</div>
</div>
</div>
<div className="mt-3 flex flex-col md:flex-row md:items-center gap-3">
<div className="flex-1 text-[11px] text-gray-500 leading-relaxed">
{nodeEnabled
- ? `Public chain sync is ${syncOutcome || 'active'}${lastPeerUrl ? ` via ${lastPeerUrl}` : ''}.`
- : 'Start a local participant node to pull from the default seed and help carry the public Infonet chain while this backend is running.'}
+ ? `Infonet sync is ${syncOutcome || 'active'}${lastPeerUrl ? ` via ${lastPeerUrl}` : ''}.`
+ : 'Start a local participant node to sync through available Wormhole onion/RNS peers while this backend is running.'}
</div>
<button
type="button"
File diff suppressed because it is too large
@@ -61,7 +61,7 @@ export default function NetworkStats() {
const nodeColor = stats.syncOutcome === 'ok' ? 'text-green-400'
: stats.syncOutcome === 'running' ? 'text-amber-400'
: stats.nodeEnabled ? 'text-amber-400' : 'text-gray-600';
- const nodeLabel = stats.syncOutcome === 'ok' ? 'CONNECTED'
+ const nodeLabel = stats.syncOutcome === 'ok' ? 'SEED SYNCED'
: stats.syncOutcome === 'running' ? 'SYNCING'
: stats.syncOutcome === 'error' || stats.syncOutcome === 'fork' ? 'RETRYING'
: stats.nodeEnabled ? 'WAITING' : 'OFFLINE';
@@ -3,6 +3,11 @@
import React, { useEffect } from 'react';
import { AnimatePresence, motion } from 'framer-motion';
import { X } from 'lucide-react';
import {
fetchInfonetNodeStatusSnapshot,
setInfonetNodeEnabled,
startTorHiddenService,
} from '@/mesh/controlPlaneStatusClient';
import InfonetShell from './InfonetShell';
interface InfonetTerminalProps {
@@ -28,6 +33,33 @@ export default function InfonetTerminal({
return () => window.removeEventListener('keydown', handler);
}, [isOpen, onClose]);
useEffect(() => {
if (!isOpen) return;
let cancelled = false;
const connectParticipantNode = async () => {
try {
const nodeStatus = await fetchInfonetNodeStatusSnapshot(true).catch(() => null);
if (cancelled || nodeStatus?.node_enabled) return;
const torStatus = await startTorHiddenService().catch(() => null);
if (cancelled || !torStatus?.running || !torStatus?.onion_address) return;
await setInfonetNodeEnabled(true);
if (!cancelled) {
await fetchInfonetNodeStatusSnapshot(true).catch(() => null);
}
} catch {
// Remote/shared viewers may not have local-operator rights. Leave manual controls intact.
}
};
void connectParticipantNode();
return () => {
cancelled = true;
};
}, [isOpen]);
return (
<AnimatePresence>
{isOpen && (
+152 -16
@@ -159,7 +159,7 @@ import {
EarthquakeLabels,
ThreatMarkers,
} from '@/components/map/MapMarkers';
- import type { DashboardData, KiwiSDR, MaplibreViewerProps, Scanner, SigintSignal } from '@/types/dashboard';
+ import type { DashboardData, Flight, KiwiSDR, MaplibreViewerProps, Scanner, Ship, SigintSignal } from '@/types/dashboard';
import { useDataKeys } from '@/hooks/useDataStore';
import { useInterpolation } from '@/components/map/hooks/useInterpolation';
import { useClusterLabels } from '@/components/map/hooks/useClusterLabels';
@@ -225,6 +225,68 @@ type GeoExtras = {
type KiwiProps = Partial<KiwiSDR> & GeoExtras;
type ScannerProps = Partial<Scanner> & GeoExtras;
type SigintProps = Partial<SigintSignal> & GeoExtras;
type TrailPoint = { lng: number; lat: number; alt?: number; sog?: number; ts?: number };
type TrailKind = 'flight' | 'ship';
const FLIGHT_SELECTION_TYPES = new Set([
'flight',
'private_flight',
'military_flight',
'private_jet',
'tracked_flight',
]);
function parseTrailPoints(raw: unknown, kind: TrailKind): TrailPoint[] {
if (!Array.isArray(raw)) return [];
return raw
.map((p): TrailPoint | null => {
if (Array.isArray(p)) {
const lat = Number(p[0]);
const lng = Number(p[1]);
if (!Number.isFinite(lat) || !Number.isFinite(lng)) return null;
if (kind === 'ship') {
return { lat, lng, sog: Number(p[2]) || 0, ts: Number(p[3]) || 0 };
}
return { lat, lng, alt: Number(p[2]) || 0, ts: Number(p[3]) || 0 };
}
if (p && typeof p === 'object') {
const point = p as { lat?: number; lng?: number; alt?: number; sog?: number; ts?: number };
const lat = Number(point.lat);
const lng = Number(point.lng);
if (!Number.isFinite(lat) || !Number.isFinite(lng)) return null;
return {
lat,
lng,
alt: Number(point.alt) || 0,
sog: Number(point.sog) || 0,
ts: Number(point.ts) || 0,
};
}
return null;
})
.filter((p): p is TrailPoint => Boolean(p && (p.lat !== 0 || p.lng !== 0)));
}
function hasKnownRouteName(value?: string | null): boolean {
const normalized = String(value || '').trim().toUpperCase();
return Boolean(normalized && normalized !== 'UNKNOWN');
}
function flightHasKnownRoute(entity: ReturnType<typeof findSelectedEntity>, dynamicRoute: DynamicRoute | null): boolean {
if (!entity) return false;
if (dynamicRoute?.orig_loc && dynamicRoute?.dest_loc) return true;
return flightPayloadHasKnownRoute(entity);
}
function flightPayloadHasKnownRoute(entity: ReturnType<typeof findSelectedEntity>): boolean {
if (!entity) return false;
if (!('origin_loc' in entity) && !('origin_name' in entity)) return false;
const flight = entity as Flight;
return Boolean(
(flight.origin_loc && flight.dest_loc)
|| (hasKnownRouteName(flight.origin_name) && hasKnownRouteName(flight.dest_name)),
);
}
const MAP_EXTRA_DATA_KEYS = [
'air_quality',
@@ -479,6 +541,7 @@ const MaplibreViewer = ({
}, [activeLayers.viirs_nightlights, viirsProbeDayKey]);
const [dynamicRoute, setDynamicRoute] = useState<DynamicRoute | null>(null);
const [selectedTrailPoints, setSelectedTrailPoints] = useState<TrailPoint[]>([]);
const prevCallsign = useRef<string | null>(null);
// Oracle region intel for map entity popups
@@ -557,6 +620,7 @@ const MaplibreViewer = ({
if (callsign && callsign !== prevCallsign.current) {
prevCallsign.current = callsign;
setDynamicRoute(null);
fetch(`${API_BASE}/api/route/${callsign}?lat=${entityLat}&lng=${entityLng}`)
.then((res) => res.json())
.then((routeData) => {
@@ -575,6 +639,76 @@ const MaplibreViewer = ({
};
}, [selectedEntity, data]);
useEffect(() => {
let cancelled = false;
const entity = findSelectedEntity(selectedEntity, data);
if (!selectedEntity || !entity) {
setSelectedTrailPoints([]);
return () => {
cancelled = true;
};
}
const isFlight = FLIGHT_SELECTION_TYPES.has(selectedEntity.type);
const isShip = selectedEntity.type === 'ship';
if (!isFlight && !isShip) {
setSelectedTrailPoints([]);
return () => {
cancelled = true;
};
}
if (isFlight && flightPayloadHasKnownRoute(entity)) {
setSelectedTrailPoints([]);
return () => {
cancelled = true;
};
}
const kind: TrailKind = isShip ? 'ship' : 'flight';
const fallback = parseTrailPoints((entity as Flight | Ship).trail, kind);
if (fallback.length >= 2) {
setSelectedTrailPoints(fallback);
} else {
setSelectedTrailPoints([]);
}
const trailId = String(selectedEntity.id || '').trim();
if (!trailId) {
return () => {
cancelled = true;
};
}
if (isShip && !/^\d+$/.test(trailId)) {
return () => {
cancelled = true;
};
}
const endpoint = isShip
? `${API_BASE}/api/trail/ship/${encodeURIComponent(trailId)}`
: `${API_BASE}/api/trail/flight/${encodeURIComponent(trailId)}`;
const refreshSelectedTrail = () => {
fetch(endpoint, { cache: 'no-store' })
.then((res) => (res.ok ? res.json() : null))
.then((payload) => {
if (cancelled || !payload) return;
const points = parseTrailPoints(payload.trail, kind);
setSelectedTrailPoints(points.length >= 2 ? points : fallback);
})
.catch(() => {
if (!cancelled) setSelectedTrailPoints(fallback);
});
};
refreshSelectedTrail();
const trailRefreshTimer = window.setInterval(refreshSelectedTrail, 30000);
return () => {
cancelled = true;
window.clearInterval(trailRefreshTimer);
};
}, [selectedEntity, data, dynamicRoute]);
// Fetch oracle region intel for entity popups
useEffect(() => {
if (!selectedEntity) {
@@ -1349,27 +1483,29 @@ const MaplibreViewer = ({
return { type: 'FeatureCollection' as const, features };
}, [selectedEntity, data, dynamicRoute, getSelectedEntityLiveCoords, interpTick]);
- // Trail history GeoJSON: shows where the SELECTED aircraft has been
+ // Trail history GeoJSON: shows where the selected unknown-route aircraft or vessel has been.
const trailGeoJSON = useMemo(() => {
void interpTick;
const entity = findSelectedEntity(selectedEntity, data);
- if (!entity || !('trail' in entity) || !entity.trail || entity.trail.length < 2) return null;
+ if (!entity || selectedTrailPoints.length < 2) return null;
if (selectedEntity && FLIGHT_SELECTION_TYPES.has(selectedEntity.type) && flightPayloadHasKnownRoute(entity)) {
return null;
}
- // Parse trail points — backend sends [lat, lng, alt, ts] arrays
- type TrailPt = { lng: number; lat: number; alt: number; ts: number };
- const points: TrailPt[] = (
- entity.trail as Array<{ lat?: number; lng?: number; alt?: number; ts?: number } | number[]>
- ).map((p) => {
- if (Array.isArray(p)) {
- return { lat: p[0] as number, lng: p[1] as number, alt: (p[2] as number) || 0, ts: (p[3] as number) || 0 };
- }
- return { lat: p.lat ?? 0, lng: p.lng ?? 0, alt: p.alt ?? 0, ts: p.ts ?? 0 };
- }).filter((p) => p.lat !== 0 || p.lng !== 0);
+ // Trails are loaded only for the selected asset to avoid open-map clutter.
+ const isShipTrail = selectedEntity?.type === 'ship';
+ const points = [...selectedTrailPoints];
const currentLoc = getSelectedEntityLiveCoords(entity);
if (currentLoc && points.length > 0) {
const lastPt = points[points.length - 1];
- points.push({ lng: currentLoc[0], lat: currentLoc[1], alt: lastPt.alt, ts: Date.now() / 1000 });
+ points.push({
+ lng: currentLoc[0],
+ lat: currentLoc[1],
+ alt: lastPt.alt,
+ sog: lastPt.sog,
+ ts: Date.now() / 1000,
+ });
}
if (points.length < 2) return null;
@@ -1394,7 +1530,7 @@ const MaplibreViewer = ({
type: 'Feature' as const,
properties: {
type: 'trail',
- color: altToColor((a.alt + b.alt) / 2),
+ color: isShipTrail ? '#22d3ee' : altToColor(((a.alt ?? 0) + (b.alt ?? 0)) / 2),
opacity: 0.4 + progress * 0.5, // older segments more transparent
segIndex: i,
},
@@ -1406,7 +1542,7 @@ const MaplibreViewer = ({
}
return { type: 'FeatureCollection' as const, features };
- }, [selectedEntity, data, getSelectedEntityLiveCoords, interpTick]);
+ }, [selectedEntity, data, selectedTrailPoints, dynamicRoute, getSelectedEntityLiveCoords, interpTick]);
// Predictive vector GeoJSON: dotted line projecting ~5 min ahead based on heading + speed
// Skip when entity has a known route (origin+dest) — the route line already shows where it's going
+270 -27
@@ -113,15 +113,30 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
meshQuickStatus,
meshSessionActive,
publicMeshAddress,
activePublicMeshAddress,
meshView,
setMeshView,
meshDirectTarget,
setMeshDirectTarget,
meshAddressDraft,
setMeshAddressDraft,
meshMqttSettings,
meshMqttForm,
setMeshMqttForm,
meshMqttBusy,
meshMqttStatusText,
meshMqttEnabled,
meshMqttRunning,
meshMqttConnected,
meshMqttConnectionLabel,
saveMeshMqttSettings,
refreshMeshMqttSettings,
// Identity
identity,
publicIdentity,
hasStoredPublicLaneIdentity,
hasPublicLaneIdentity,
canUsePublicMeshInput,
hasId,
shouldShowIdentityWarning,
wormholeEnabled,
@@ -328,14 +343,13 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
void handleRequestAccess(targetId);
};
const meshActivationText =
- meshQuickStatus?.text ||
- (publicMeshBlockedByWormhole
+ publicMeshBlockedByWormhole
? hasStoredPublicLaneIdentity
? 'Wormhole is active. Turning MeshChat on will turn Wormhole off and use your saved public mesh key.'
: 'Wormhole is active. Turning MeshChat on will turn Wormhole off and mint a separate public mesh key.'
: hasStoredPublicLaneIdentity
? 'MeshChat is off. Turn it on to use your saved public mesh key.'
- : 'Public mesh posting needs a mesh key. One tap gets you a fresh address.');
+ : 'Public mesh posting needs a mesh key. One tap gets you a fresh address.';
const handleMeshActivationAction = () => {
if (hasStoredPublicLaneIdentity) {
void handleActivatePublicMeshSession();
@@ -347,6 +361,21 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
}
void handleQuickCreatePublicIdentity();
};
const normalizeMeshDirectAddress = (value: string) => {
const compact = value.trim().replace(/^!/, '').toLowerCase();
return /^[0-9a-f]{8}$/.test(compact) ? `!${compact}` : '';
};
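The normalizer above accepts a bang-prefixed or bare eight-hex-digit node ID and canonicalizes it to the `!`-prefixed lowercase form; anything else collapses to the empty string, which the submit handler treats as invalid. The same rule shown standalone (copied from the handler above):

```typescript
// Canonicalize a Meshtastic-style direct address: trim, strip one leading '!',
// lowercase, and require exactly eight hex digits. Invalid input returns ''.
function normalizeMeshDirectAddress(value: string): string {
  const compact = value.trim().replace(/^!/, '').toLowerCase();
  return /^[0-9a-f]{8}$/.test(compact) ? `!${compact}` : '';
}
```

Returning a sentinel empty string keeps the submit path to a single truthiness check before routing to the channel view.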
const handleMeshDirectTargetSubmit = () => {
const target = normalizeMeshDirectAddress(meshAddressDraft);
if (!target) {
setSendError('enter node address like !1ee21986');
window.setTimeout(() => setSendError(''), 4000);
return;
}
setMeshDirectTarget(target);
setMeshView('channel');
window.setTimeout(() => inputRef.current?.focus(), 0);
};
const meshActivationLabel = identityWizardBusy
? 'GETTING MESH KEY'
: hasStoredPublicLaneIdentity
@@ -471,7 +500,7 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
</div>
)}
- {anonymousModeEnabled && !anonymousModeReady && (
+ {activeTab !== 'meshtastic' && anonymousModeEnabled && !anonymousModeReady && (
<div className="px-3 py-2 text-sm font-mono text-red-400/90 border-b border-red-900/30 bg-red-950/20 leading-[1.65] shrink-0">
Anonymous mode is active, but hidden transport is not ready. Dead Drop is blocked
until Wormhole is running over Tor, I2P, or Mixnet.
@@ -1133,8 +1162,8 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
))}
</select>
</div>
<div className="flex items-center justify-between gap-2 px-3 py-1 border-b border-[var(--border-primary)]/20 shrink-0 bg-green-950/10">
<div className="flex items-center gap-1">
<div className="flex items-center gap-1 px-3 py-1 border-b border-[var(--border-primary)]/20 shrink-0 bg-green-950/10">
<div className="flex items-center gap-1 min-w-0 flex-wrap">
<button
onClick={() => setMeshView('channel')}
className={`px-2 py-0.5 text-[11px] font-mono tracking-wider border transition-colors ${
@@ -1155,36 +1184,245 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
>
INBOX
</button>
<button
onClick={() => setMeshView('settings')}
className={`px-2 py-0.5 text-[11px] font-mono tracking-wider border transition-colors ${
meshView === 'settings'
? 'border-cyan-500/40 text-cyan-300 bg-cyan-950/20'
: 'border-[var(--border-primary)]/40 text-[var(--text-muted)] hover:text-cyan-300'
}`}
>
SETTINGS
</button>
<button
onClick={() => {
setMeshAddressDraft(meshDirectTarget || '');
setMeshView('message');
}}
className={`px-2 py-0.5 text-[11px] font-mono tracking-wider border transition-colors ${
meshView === 'message'
? 'border-green-500/40 text-green-200 bg-green-950/25'
: 'border-[var(--border-primary)]/40 text-[var(--text-muted)] hover:text-green-300'
}`}
>
MESSAGE
</button>
</div>
</div>
<div className="flex-1 overflow-y-auto styled-scrollbar px-3 py-1.5 border-l-2 border-cyan-800/25">
{meshView === 'message' && (
<div className="space-y-2 py-1 text-[11px] font-mono">
<div className="border border-green-700/35 bg-green-950/10 p-2">
<div className="text-green-300 tracking-[0.18em]">DIRECT MESHTASTIC MESSAGE</div>
<div className="mt-1 text-[10px] text-[var(--text-muted)] leading-[1.5]">
Enter a public Meshtastic node address. Direct MQTT publishes are public and best-effort; delivery depends on the target mesh hearing the broker bridge.
</div>
</div>
<label className="block space-y-1">
<span className="text-[var(--text-muted)]">NODE ADDRESS</span>
<input
value={meshAddressDraft}
onChange={(e) => setMeshAddressDraft(e.target.value)}
onKeyDown={(e) => {
if (e.key === 'Enter') {
e.preventDefault();
handleMeshDirectTargetSubmit();
}
}}
placeholder="!1ee21986"
className="w-full border border-[var(--border-primary)] bg-black/30 px-2 py-1 text-green-200 outline-none placeholder:text-[var(--text-muted)] focus:border-green-500/50"
/>
</label>
<div className="grid grid-cols-2 gap-2">
<button
onClick={handleMeshDirectTargetSubmit}
className="border border-green-600/45 bg-green-950/20 px-2 py-1.5 text-green-300 hover:bg-green-950/35"
>
USE ADDRESS
</button>
<button
onClick={() => {
setMeshDirectTarget('');
setMeshAddressDraft('');
setMeshView('channel');
window.setTimeout(() => inputRef.current?.focus(), 0);
}}
className="border border-cyan-700/40 bg-cyan-950/15 px-2 py-1.5 text-cyan-300 hover:bg-cyan-950/25"
>
BROADCAST
</button>
</div>
{meshDirectTarget && (
<div className="border border-amber-600/30 bg-amber-950/10 p-2 text-amber-200/85 leading-[1.5]">
Active direct target: {meshDirectTarget.toUpperCase()}. Type in the input below and press send, or clear it to return to channel broadcast.
</div>
)}
</div>
)}
{meshView === 'settings' && (
<div className="space-y-2 py-1 text-[11px] font-mono">
<div className="border border-cyan-800/35 bg-cyan-950/10 p-2">
<div className="flex items-center justify-between gap-2">
<div>
<div className="text-cyan-300 tracking-[0.18em]">MESHTASTIC MQTT</div>
<div className="mt-1 text-[10px] text-[var(--text-muted)] leading-[1.5]">
Public Mesh is separate from Wormhole. Turning MQTT on disables the private Wormhole lane for MeshChat.
</div>
</div>
<span
className={`shrink-0 border px-2 py-1 text-[10px] tracking-[0.16em] ${
meshMqttConnected
? 'border-green-500/40 text-green-300'
: meshMqttEnabled
? 'border-amber-500/40 text-amber-300'
: 'border-red-500/35 text-red-300'
}`}
>
{meshMqttConnectionLabel}
</span>
</div>
{meshMqttSettings?.runtime?.last_error && (
<div className="mt-2 text-red-300/80">
LAST ERROR: {meshMqttSettings.runtime.last_error}
</div>
)}
{meshMqttRunning && !meshMqttConnected && !meshMqttSettings?.runtime?.last_error && (
<div className="mt-2 text-amber-300/80">
MQTT bridge is starting. Live messages appear after broker connect.
</div>
)}
</div>
<div className="grid grid-cols-[1fr_70px] gap-2">
<label className="space-y-1">
<span className="text-[var(--text-muted)]">BROKER</span>
<input
value={meshMqttForm.broker}
onChange={(e) => setMeshMqttForm((prev) => ({ ...prev, broker: e.target.value }))}
className="w-full border border-[var(--border-primary)] bg-black/30 px-2 py-1 text-cyan-200 outline-none focus:border-cyan-500/50"
/>
</label>
<label className="space-y-1">
<span className="text-[var(--text-muted)]">PORT</span>
<input
value={meshMqttForm.port}
onChange={(e) => setMeshMqttForm((prev) => ({ ...prev, port: e.target.value }))}
className="w-full border border-[var(--border-primary)] bg-black/30 px-2 py-1 text-cyan-200 outline-none focus:border-cyan-500/50"
/>
</label>
</div>
<label className="block space-y-1">
<span className="text-[var(--text-muted)]">BROKER LOGIN (optional)</span>
<input
value={meshMqttForm.username}
onChange={(e) => setMeshMqttForm((prev) => ({ ...prev, username: e.target.value }))}
placeholder="blank uses public Meshtastic default"
className="w-full border border-[var(--border-primary)] bg-black/30 px-2 py-1 text-cyan-200 outline-none focus:border-cyan-500/50"
/>
</label>
<label className="block space-y-1">
<span className="text-[var(--text-muted)]">
BROKER PASSWORD {meshMqttSettings?.uses_default_credentials ? '(public default)' : meshMqttSettings?.has_password ? '(saved)' : ''}
</span>
<input
type="password"
value={meshMqttForm.password}
onChange={(e) => setMeshMqttForm((prev) => ({ ...prev, password: e.target.value }))}
placeholder={
meshMqttSettings?.uses_default_credentials
? 'blank uses public Meshtastic default'
: meshMqttSettings?.has_password
? 'leave blank to keep saved password'
: 'blank uses public Meshtastic default'
}
className="w-full border border-[var(--border-primary)] bg-black/30 px-2 py-1 text-cyan-200 outline-none placeholder:text-[var(--text-muted)] focus:border-cyan-500/50"
/>
</label>
<label className="block space-y-1">
<span className="text-[var(--text-muted)]">
CHANNEL PSK HEX {meshMqttSettings?.has_psk ? '(saved)' : '(default LongFast if blank)'}
</span>
<input
type="password"
value={meshMqttForm.psk}
onChange={(e) => setMeshMqttForm((prev) => ({ ...prev, psk: e.target.value }))}
placeholder="blank uses default LongFast key"
className="w-full border border-[var(--border-primary)] bg-black/30 px-2 py-1 text-cyan-200 outline-none placeholder:text-[var(--text-muted)] focus:border-cyan-500/50"
/>
</label>
<label className="flex items-center gap-2 border border-[var(--border-primary)]/40 bg-black/20 px-2 py-1 text-cyan-200">
<input
type="checkbox"
checked={meshMqttForm.include_default_roots}
onChange={(e) =>
setMeshMqttForm((prev) => ({ ...prev, include_default_roots: e.target.checked }))
}
/>
DEFAULT PUBLIC ROOTS
</label>
<label className="block space-y-1">
<span className="text-[var(--text-muted)]">EXTRA ROOTS</span>
<input
value={meshMqttForm.extra_roots}
onChange={(e) => setMeshMqttForm((prev) => ({ ...prev, extra_roots: e.target.value }))}
placeholder="comma separated, optional"
className="w-full border border-[var(--border-primary)] bg-black/30 px-2 py-1 text-cyan-200 outline-none placeholder:text-[var(--text-muted)] focus:border-cyan-500/50"
/>
</label>
<div className="grid grid-cols-3 gap-2 pt-1">
<button
onClick={() => void saveMeshMqttSettings({ enabled: true })}
disabled={meshMqttBusy}
className="border border-green-600/40 bg-green-950/20 px-2 py-1.5 text-green-300 hover:bg-green-950/35 disabled:opacity-50"
>
ENABLE
</button>
<button
onClick={() => void saveMeshMqttSettings({ enabled: false })}
disabled={meshMqttBusy}
className="border border-red-600/35 bg-red-950/15 px-2 py-1.5 text-red-300 hover:bg-red-950/25 disabled:opacity-50"
>
DISABLE
</button>
<button
onClick={() => void refreshMeshMqttSettings()}
disabled={meshMqttBusy}
className="border border-cyan-700/40 bg-cyan-950/15 px-2 py-1.5 text-cyan-300 hover:bg-cyan-950/25 disabled:opacity-50"
>
REFRESH
</button>
</div>
{meshMqttStatusText && (
<div className="text-[10px] text-cyan-200/80 leading-[1.5]">{meshMqttStatusText}</div>
)}
</div>
)}
{!canUsePublicMeshInput && meshView !== 'settings' && (
<div className="text-[12px] font-mono text-green-300/70 text-center py-4 leading-[1.65]">
MeshChat is off. Turn it on to connect the public mesh lane.
</div>
)}
{canUsePublicMeshInput && meshView === 'channel' && filteredMeshMessages.length === 0 && (
<div className="text-[12px] font-mono text-[var(--text-muted)] text-center py-4 leading-[1.65]">
No messages from {meshRegion} / {meshChannel}
</div>
)}
{canUsePublicMeshInput && meshView === 'inbox' && (
<>
{!activePublicMeshAddress && (
<div className="text-[12px] font-mono text-[var(--text-muted)] text-center py-4 leading-[1.65]">
Create or load a public mesh identity to see direct Meshtastic traffic.
</div>
)}
{activePublicMeshAddress && meshInboxMessages.length === 0 && (
<div className="text-[12px] font-mono text-[var(--text-muted)] text-center py-4 leading-[1.65]">
No public direct messages addressed to {activePublicMeshAddress.toUpperCase()} yet.
</div>
)}
{meshInboxMessages.map((m, i) => (
@@ -1198,7 +1436,7 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
</button>
<div className="flex-1 min-w-0">
<div className="text-[10px] text-amber-200/70 mb-0.5">
TO {activePublicMeshAddress.toUpperCase()}
</div>
<div className="break-words whitespace-pre-wrap text-amber-100/90">
{m.text}
@@ -2091,10 +2329,12 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
? `→ INFONET${selectedGate ? ` / ${selectedGate}` : ''}${privateInfonetTransportReady ? '' : ' / EXPERIMENTAL ENCRYPTION'}`
: '→ PRIVATE LANE LOCKED'
: activeTab === 'meshtastic'
? canUsePublicMeshInput
? meshDirectTarget
? `→ MESH / TO ${meshDirectTarget.toUpperCase()} / FROM ${activePublicMeshAddress.toUpperCase()}`
: `→ MESH / ${meshRegion} / ${meshChannel} / ${activePublicMeshAddress.toUpperCase()}`
: publicMeshBlockedByWormhole
? '→ MESH BLOCKED / WORMHOLE ACTIVE'
: hasStoredPublicLaneIdentity
? '→ MESH OFF'
: '→ MESH LOCKED'
@@ -2106,7 +2346,7 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
</span>
)}
</div>
{activeTab === 'meshtastic' && !sendError && (!canUsePublicMeshInput || meshQuickStatus) && (
<div
className={`px-3 pt-1 text-[12px] font-mono leading-[1.5] ${
meshQuickStatus?.type === 'err'
@@ -2116,7 +2356,7 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
: 'text-green-300/70'
}`}
>
{meshQuickStatus?.text || meshActivationText}
</div>
)}
<div className="flex items-center gap-2 px-3 pb-2 pt-1">
@@ -2146,7 +2386,7 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
NEED WORMHOLE
</span>
</button>
) : activeTab === 'meshtastic' && !canUsePublicMeshInput ? (
<button
onClick={handleMeshActivationAction}
disabled={identityWizardBusy}
@@ -2162,7 +2402,10 @@ const MeshChat = React.memo(function MeshChat(props: MeshChatProps) {
</button>
) : activeTab === 'meshtastic' && meshDirectTarget ? (
<button
onClick={() => {
setMeshDirectTarget('');
setMeshAddressDraft('');
}}
className="w-full flex items-center justify-between gap-2 px-3 py-2 border border-amber-700/40 bg-amber-950/10 text-amber-200 hover:bg-amber-950/20 hover:border-amber-500/50 transition-colors"
>
<span className="inline-flex items-center gap-2 text-sm font-mono tracking-[0.2em]">
@@ -13,11 +13,8 @@ import {
extractNativeGateResyncTarget,
} from '@/lib/desktopControlContract';
import type { DesktopControlAuditReport } from '@/lib/desktopControlContract';
import { fetchPrivacyProfileSnapshot, setInfonetNodeEnabled } from '@/mesh/controlPlaneStatusClient';
import {
clearBrowserIdentityState,
getNodeIdentity,
getStoredNodeDescriptor,
getWormholeIdentityDescriptor,
@@ -31,9 +28,7 @@ import {
updateContact,
blockContact,
getDMNotify,
verifyEventSignature,
verifyRawSignature,
purgeBrowserContactGraph,
@@ -130,7 +125,6 @@ import {
preferredDmPeerId,
} from '@/mesh/meshDmConsent';
import { deriveSasPhrase } from '@/mesh/meshSas';
import { validateEventPayload } from '@/mesh/meshSchema';
import {
buildDmTrustHint,
@@ -223,6 +217,94 @@ interface GateCompatConsentPromptState {
reason: string;
}
interface MeshMqttRuntime {
enabled?: boolean;
running?: boolean;
connected?: boolean;
broker?: string;
port?: number;
username?: string;
client_id?: string;
message_log_size?: number;
signal_log_size?: number;
last_error?: string;
last_connected_at?: number;
last_disconnected_at?: number;
}
interface MeshMqttSettings {
enabled: boolean;
broker: string;
port: number;
username: string;
uses_default_credentials?: boolean;
has_password: boolean;
has_psk: boolean;
include_default_roots: boolean;
extra_roots: string;
extra_topics: string;
runtime?: MeshMqttRuntime;
}
interface MeshMqttForm {
broker: string;
port: string;
username: string;
password: string;
psk: string;
include_default_roots: boolean;
extra_roots: string;
extra_topics: string;
}
const PUBLIC_MESH_ADDRESS_KEY = 'sb_public_meshtastic_address';
function normalizePublicMeshAddress(value: string): string {
const raw = String(value || '').trim().toLowerCase();
const body = raw.startsWith('!') ? raw.slice(1) : raw;
if (!/^[0-9a-f]{8}$/.test(body)) return '';
return `!${body}`;
}
function readStoredPublicMeshAddress(): string {
if (typeof window === 'undefined') return '';
try {
return normalizePublicMeshAddress(window.localStorage.getItem(PUBLIC_MESH_ADDRESS_KEY) || '');
} catch {
return '';
}
}
function writeStoredPublicMeshAddress(address: string): void {
if (typeof window === 'undefined') return;
const normalized = normalizePublicMeshAddress(address);
if (!normalized) return;
try {
window.localStorage.setItem(PUBLIC_MESH_ADDRESS_KEY, normalized);
} catch {
/* ignore */
}
}
function clearStoredPublicMeshAddress(): void {
if (typeof window === 'undefined') return;
try {
window.localStorage.removeItem(PUBLIC_MESH_ADDRESS_KEY);
} catch {
/* ignore */
}
}
function createPublicMeshAddress(): string {
if (typeof window !== 'undefined' && window.crypto?.getRandomValues) {
const value = new Uint32Array(1);
window.crypto.getRandomValues(value);
if (value[0]) return `!${value[0].toString(16).padStart(8, '0')}`;
}
const fallback = Math.floor((Date.now() ^ Math.floor(Math.random() * 0xffffffff)) >>> 0);
return `!${fallback.toString(16).padStart(8, '0')}`;
}
function describeGateCompatConsentRequired(): string {
return 'Local gate runtime is unavailable for this room.';
}
@@ -315,8 +397,22 @@ export function useMeshChatController({
const [meshQuickStatus, setMeshQuickStatus] = useState<{ type: 'ok' | 'err'; text: string } | null>(null);
const [meshSessionActive, setMeshSessionActive] = useState(false);
const [publicMeshAddress, setPublicMeshAddress] = useState('');
const [meshView, setMeshView] = useState<'channel' | 'inbox' | 'settings' | 'message'>('channel');
const [meshDirectTarget, setMeshDirectTarget] = useState('');
const [meshAddressDraft, setMeshAddressDraft] = useState('');
const [meshMqttSettings, setMeshMqttSettings] = useState<MeshMqttSettings | null>(null);
const [meshMqttForm, setMeshMqttForm] = useState<MeshMqttForm>({
broker: 'mqtt.meshtastic.org',
port: '1883',
username: '',
password: '',
psk: '',
include_default_roots: true,
extra_roots: '',
extra_topics: '',
});
const [meshMqttBusy, setMeshMqttBusy] = useState(false);
const [meshMqttStatusText, setMeshMqttStatusText] = useState('');
// Identity
const [identity, setIdentity] = useState<NodeIdentity | null>(null);
@@ -329,31 +425,137 @@ export function useMeshChatController({
const [recentPrivateFallbackReason, setRecentPrivateFallbackReason] = useState('');
const [unresolvedSenderSealCount, setUnresolvedSenderSealCount] = useState(0);
const [privacyProfile, setPrivacyProfile] = useState<'default' | 'high'>('default');
const storedPublicMeshAddress = clientHydrated ? readStoredPublicMeshAddress() : '';
const hasStoredPublicLaneIdentity = clientHydrated && Boolean(storedPublicMeshAddress);
const publicIdentity = null;
const activePublicMeshAddress = publicMeshAddress || storedPublicMeshAddress;
const hasPublicLaneIdentity = meshSessionActive && Boolean(activePublicMeshAddress);
const hasId = Boolean(identity) && (hasSovereignty() || wormholeEnabled);
const shouldShowIdentityWarning = activeTab !== 'meshtastic' && !hasId;
const privateInfonetReady = wormholeEnabled && wormholeReadyState;
const publicMeshBlockedByWormhole = wormholeEnabled || wormholeReadyState;
const dmSendQueue = useRef<(() => Promise<void>)[]>([]);
const infonetAutoBootstrapRef = useRef(false);
const meshMqttRuntime = meshMqttSettings?.runtime;
const meshMqttEnabled = Boolean(meshMqttSettings?.enabled || meshMqttRuntime?.enabled);
const canUsePublicMeshInput = Boolean(activePublicMeshAddress) && meshMqttEnabled && !publicMeshBlockedByWormhole;
const meshMqttRunning = Boolean(meshMqttRuntime?.running);
const meshMqttConnected = Boolean(meshMqttRuntime?.connected);
const meshMqttConnectionLabel = !meshMqttEnabled
? 'MQTT OFF'
: meshMqttConnected
? 'MQTT LIVE'
: meshMqttRunning
? 'MQTT CONNECTING'
: 'MQTT STARTING';
const applyMeshMqttSettings = useCallback((data: MeshMqttSettings) => {
setMeshMqttSettings(data);
setMeshMqttForm((prev) => ({
broker: data.broker || prev.broker || 'mqtt.meshtastic.org',
port: String(data.port || prev.port || '1883'),
username: data.uses_default_credentials ? '' : data.username || prev.username || '',
password: '',
psk: '',
include_default_roots: Boolean(data.include_default_roots),
extra_roots: data.extra_roots || '',
extra_topics: data.extra_topics || '',
}));
}, []);
const refreshMeshMqttSettings = useCallback(async () => {
try {
const res = await fetch(`${API_BASE}/api/settings/meshtastic-mqtt`, { cache: 'no-store' });
if (!res.ok) return null;
const data = (await res.json()) as MeshMqttSettings;
applyMeshMqttSettings(data);
return data;
} catch {
return null;
}
}, [applyMeshMqttSettings]);
const saveMeshMqttSettings = useCallback(
async (updates: Partial<MeshMqttForm> & { enabled?: boolean } = {}) => {
setMeshMqttBusy(true);
setMeshMqttStatusText('');
try {
const nextForm = { ...meshMqttForm, ...updates };
const body: Record<string, unknown> = {
broker: nextForm.broker.trim() || 'mqtt.meshtastic.org',
port: Number.parseInt(nextForm.port, 10) || 1883,
username: nextForm.username.trim(),
include_default_roots: Boolean(nextForm.include_default_roots),
extra_roots: nextForm.extra_roots.trim(),
extra_topics: nextForm.extra_topics.trim(),
};
if (!nextForm.username.trim() && !nextForm.password.trim()) {
body.password = '';
}
if (typeof updates.enabled === 'boolean') {
body.enabled = updates.enabled;
}
if (nextForm.password.trim()) {
body.password = nextForm.password;
}
if (nextForm.psk.trim()) {
body.psk = nextForm.psk.trim();
}
const res = await fetch(`${API_BASE}/api/settings/meshtastic-mqtt`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(body),
});
if (!res.ok) {
const detail = await res.text().catch(() => '');
throw new Error(detail || `HTTP ${res.status}`);
}
const data = (await res.json()) as MeshMqttSettings;
applyMeshMqttSettings(data);
if (data.enabled) {
setWormholeEnabled(false);
setWormholeReadyState(false);
setWormholeRnsReady(false);
setWormholeRnsDirectReady(false);
setWormholeRnsPeers({ active: 0, configured: 0 });
setSecureModeCached(false);
}
const status = data.runtime?.connected
? 'MQTT bridge connected.'
: data.enabled
? 'MQTT bridge enabled. Connection may take a few seconds.'
: 'MQTT bridge disabled.';
setMeshMqttStatusText(status);
return { ok: true as const, text: status, data };
} catch (err) {
const text = err instanceof Error ? err.message : 'MQTT settings update failed';
setMeshMqttStatusText(text);
return { ok: false as const, text };
} finally {
setMeshMqttBusy(false);
}
},
[applyMeshMqttSettings, meshMqttForm],
);
const enableMeshMqttBridge = useCallback(async () => {
const result = await saveMeshMqttSettings({ enabled: true });
if (!result.ok) {
throw new Error(result.text);
}
return result;
}, [saveMeshMqttSettings]);
const dmSendTimer = useRef<ReturnType<typeof setTimeout> | null>(null);
const streamEnabledForSelectedGateRef = useRef(false);
const displayPublicMeshSender = useCallback(
(sender: string) => {
if (!sender) return '???';
if (activePublicMeshAddress && sender.toLowerCase() === activePublicMeshAddress.toLowerCase()) {
return activePublicMeshAddress.toUpperCase();
}
return sender;
},
[activePublicMeshAddress],
);
const openIdentityWizard = useCallback(
@@ -370,6 +572,7 @@ export function useMeshChatController({
useEffect(() => {
if (!clientHydrated) return;
setPublicMeshAddress(readStoredPublicMeshAddress());
setMeshSessionActive(false);
setMeshMessages([]);
setMeshQuickStatus(null);
@@ -525,25 +728,6 @@ export function useMeshChatController({
};
}, []);
const flushDmQueue = useCallback(async () => {
const queue = dmSendQueue.current.splice(0);
if (dmSendTimer.current) {
@@ -1037,6 +1221,7 @@ export function useMeshChatController({
const inputRef = useRef<HTMLTextAreaElement>(null);
const cursorMirrorRef = useRef<HTMLDivElement>(null);
const cursorMarkerRef = useRef<HTMLSpanElement>(null);
const publicMeshPrivacyEnforcedRef = useRef(false);
useEffect(() => {
const el = messagesEndRef.current;
@@ -1145,15 +1330,51 @@ export function useMeshChatController({
() => infoMessages.filter((m) => !m.node_id || !mutedUsers.has(m.node_id)),
[infoMessages, mutedUsers],
);
const isBroadcastMeshMessage = useCallback((m: MeshtasticMessage) => {
const target = String(m.to || 'broadcast').trim().toLowerCase();
return target === '' || target === 'broadcast' || target === '^all';
}, []);
const filteredMeshMessages = useMemo(
() => meshMessages.filter((m) => isBroadcastMeshMessage(m) && !mutedUsers.has(m.from)),
[isBroadcastMeshMessage, meshMessages, mutedUsers],
);
const meshInboxMessages = useMemo(() => {
if (!activePublicMeshAddress) return [];
const target = activePublicMeshAddress.toLowerCase();
return meshMessages.filter(
(m) => !mutedUsers.has(m.from) && String(m.to || '').toLowerCase() === target,
);
}, [activePublicMeshAddress, meshMessages, mutedUsers]);
useEffect(() => {
if (!expanded || activeTab !== 'meshtastic') return;
let alive = true;
const tick = async () => {
const data = await refreshMeshMqttSettings();
if (!alive || !data) return;
if (!data.enabled && meshSessionActive) {
setMeshQuickStatus({
type: 'err',
text: 'Public Mesh key is ready, but MQTT is off. Enable MQTT in Settings to join the live public lane.',
});
}
};
void tick();
const timer = window.setInterval(() => {
void tick();
}, meshMqttEnabled && !meshMqttConnected ? 5_000 : 15_000);
return () => {
alive = false;
window.clearInterval(timer);
};
}, [
activeTab,
expanded,
meshMqttConnected,
meshMqttEnabled,
meshSessionActive,
refreshMeshMqttSettings,
]);
// ─── InfoNet Polling ─────────────────────────────────────────────────────
@@ -1747,7 +1968,7 @@ export function useMeshChatController({
// ─── Meshtastic Channel Discovery ──────────────────────────────────────
useEffect(() => {
if (!expanded || activeTab !== 'meshtastic' || !canUsePublicMeshInput) return;
let cancelled = false;
const fetchChannels = async () => {
try {
@@ -1806,12 +2027,12 @@ export function useMeshChatController({
cancelled = true;
clearInterval(iv);
};
}, [expanded, activeTab, meshRegion, canUsePublicMeshInput]);
// ─── Meshtastic Polling ──────────────────────────────────────────────────
useEffect(() => {
if (!expanded || activeTab !== 'meshtastic' || !canUsePublicMeshInput) return;
let cancelled = false;
const poll = async () => {
try {
@@ -1820,6 +2041,7 @@ export function useMeshChatController({
region: meshRegion,
channel: meshChannel,
});
if (meshView === 'inbox') params.set('include_direct', '1');
const res = await fetch(`${API_BASE}/api/mesh/messages?${params}`);
if (res.ok && !cancelled) {
const data = await res.json();
@@ -1835,13 +2057,13 @@ export function useMeshChatController({
cancelled = true;
clearInterval(iv);
};
}, [expanded, activeTab, meshRegion, meshChannel, meshView, canUsePublicMeshInput]);
useEffect(() => {
if (canUsePublicMeshInput) return;
setMeshMessages([]);
setMeshQuickStatus(null);
}, [canUsePublicMeshInput]);
// ─── DM Polling ──────────────────────────────────────────────────────────
@@ -2326,7 +2548,7 @@ export function useMeshChatController({
if (!msg || busy) return;
if (activeTab !== 'meshtastic' && !hasId) return;
const cooldownMs = activeTab === 'dms' ? 0 : activeTab === 'meshtastic' ? 6_000 : 30_000;
const now = Date.now();
const elapsed = now - lastSendTime;
if (cooldownMs > 0 && elapsed < cooldownMs) {
@@ -2336,8 +2558,8 @@ export function useMeshChatController({
return;
}
if (anonymousPublicBlocked && activeTab === 'infonet') {
setSendError('hidden transport required for infonet posting');
setTimeout(() => setSendError(''), 4000);
return;
}
@@ -2411,10 +2633,11 @@ export function useMeshChatController({
]);
setGateReplyContext(null);
} else if (activeTab === 'meshtastic') {
const meshSenderAddress = activePublicMeshAddress;
if (!meshSenderAddress) {
setInputValue(msg);
setLastSendTime(0);
setSendError('public mesh identity needed');
openIdentityWizard({
type: 'err',
text: hasStoredPublicLaneIdentity
@@ -2425,8 +2648,24 @@ export function useMeshChatController({
setBusy(false);
return;
}
if (!meshSessionActive) {
setPublicMeshAddress(meshSenderAddress);
setMeshSessionActive(true);
}
if (!meshMqttEnabled) {
setInputValue(msg);
setLastSendTime(0);
setSendError('mqtt is off');
setMeshQuickStatus({
type: 'err',
text: 'Public Mesh key is ready, but MQTT is off. Open Settings and enable the public broker.',
});
setMeshView('settings');
setTimeout(() => setSendError(''), 4000);
setBusy(false);
return;
}
const meshDestination = meshDirectTarget.trim() || 'broadcast';
const payload = {
message: msg,
destination: meshDestination,
@@ -2444,8 +2683,7 @@ export function useMeshChatController({
setBusy(false);
return;
}
const sendRes = await fetch(`${API_BASE}/api/mesh/meshtastic/send`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
@@ -2455,14 +2693,8 @@ export function useMeshChatController({
priority: 'normal',
ephemeral: false,
transport_lock: 'meshtastic',
sender_id: meshSenderAddress,
mesh_region: meshRegion,
}),
});
if (!sendRes.ok) {
@@ -2476,25 +2708,33 @@ export function useMeshChatController({
if (!sendData.ok) {
setInputValue(msg);
setLastSendTime(0);
setSendError(sendData.detail || 'send failed');
setTimeout(() => setSendError(''), 4000);
return;
}
// Re-fetch — backend injects our msg into the bridge feed after publish
const directTarget = meshDestination !== 'broadcast'
? meshDestination.startsWith('!')
? meshDestination.toUpperCase()
: `!${meshDestination}`.toUpperCase()
: '';
const routeDetail = Array.isArray(sendData.results) && sendData.results[0]?.reason
? String(sendData.results[0].reason)
: String(sendData.route_reason || 'MQTT broker accepted publish');
setMeshQuickStatus({
type: 'ok',
text: directTarget
? `Direct message queued for ${directTarget}. ${routeDetail}`
: `Channel message published to ${meshRegion}/${meshChannel}. ${routeDetail}`,
});
window.setTimeout(() => setMeshQuickStatus(null), 6000);
await new Promise((r) => setTimeout(r, 500));
const params = new URLSearchParams({
limit: '30',
region: meshRegion,
channel: meshChannel,
});
if (directTarget) params.set('include_direct', '1');
const mRes = await fetch(`${API_BASE}/api/mesh/messages?${params}`);
if (mRes.ok) {
const data = await mRes.json();
@@ -3927,7 +4167,7 @@ export function useMeshChatController({
privateInfonetTransportReady,
});
const inputDisabled =
!hasId ||
(activeTab !== 'meshtastic' && !hasId) ||
busy ||
(activeTab === 'infonet' && !privateInfonetReady) ||
(activeTab === 'infonet' && !selectedGate) ||
@@ -3937,6 +4177,7 @@ export function useMeshChatController({
wormholeReadyState &&
!selectedGateAccessReady) ||
(activeTab === 'infonet' && anonymousPublicBlocked) ||
(activeTab === 'meshtastic' && !canUsePublicMeshInput) ||
(activeTab === 'dms' &&
(dmView !== 'chat' ||
!selectedContact ||
@@ -3980,6 +4221,10 @@ export function useMeshChatController({
[inputDisabled],
);
const disablePrivateNodeForPublicMesh = useCallback(async () => {
await setInfonetNodeEnabled(false);
}, []);
const disableWormholeForPublicMesh = useCallback(async () => {
const requireBackendLeave = wormholeEnabled || wormholeReadyState;
try {
@@ -3995,7 +4240,28 @@ export function useMeshChatController({
setWormholeRnsDirectReady(false);
setWormholeRnsPeers({ active: 0, configured: 0 });
setSecureModeCached(false);
}, [wormholeEnabled, wormholeReadyState]);
await disablePrivateNodeForPublicMesh();
}, [disablePrivateNodeForPublicMesh, wormholeEnabled, wormholeReadyState]);
useEffect(() => {
if (!meshSessionActive || !activePublicMeshAddress || !meshMqttEnabled) {
publicMeshPrivacyEnforcedRef.current = false;
return;
}
if (publicMeshPrivacyEnforcedRef.current) return;
publicMeshPrivacyEnforcedRef.current = true;
void disableWormholeForPublicMesh().catch((err) => {
publicMeshPrivacyEnforcedRef.current = false;
const message =
typeof err === 'object' && err !== null && 'message' in err
? String((err as { message?: string }).message)
: 'unknown error';
setMeshQuickStatus({
type: 'err',
text: `Could not isolate public Mesh lane: ${message}`,
});
});
}, [activePublicMeshAddress, disableWormholeForPublicMesh, meshMqttEnabled, meshSessionActive]);
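The effect above uses a ref as a one-shot guard: it marks enforcement before awaiting so re-renders cannot double-fire, and resets the flag on failure so a later render can retry. A framework-free sketch of that shape, with illustrative names (`makeOneShot` and `fired` are not in the source):

```typescript
// One-shot async guard: run the task at most once, but allow a retry
// if the task fails. `fired` stands in for publicMeshPrivacyEnforcedRef.current.
function makeOneShot(task: () => Promise<void>) {
  let fired = false;
  return async (): Promise<boolean> => {
    if (fired) return false; // already enforced this session
    fired = true;            // mark before awaiting to avoid a double fire
    try {
      await task();
      return true;
    } catch {
      fired = false;         // reset so a later attempt can retry
      return false;
    }
  };
}
```

The same reset-on-failure move appears in the Infonet auto-bootstrap effect further down.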
const createPublicMeshIdentity = useCallback(
async ({ closeWizardOnSuccess }: { closeWizardOnSuccess: boolean }) => {
@@ -4003,11 +4269,11 @@ export function useMeshChatController({
setIdentityWizardStatus(null);
try {
await disableWormholeForPublicMesh();
const nextIdentity = await generateNodeKeys();
const nextAddress = await derivePublicMeshAddress(nextIdentity.nodeId).catch(() => '');
const readyAddress = (nextAddress || nextIdentity.nodeId).toUpperCase();
setIdentity(nextIdentity);
setPublicMeshAddress(nextAddress || nextIdentity.nodeId);
const nextAddress = createPublicMeshAddress();
await enableMeshMqttBridge();
writeStoredPublicMeshAddress(nextAddress);
const readyAddress = nextAddress.toUpperCase();
setPublicMeshAddress(nextAddress);
setMeshSessionActive(true);
setMeshMessages([]);
setSendError('');
@@ -4038,7 +4304,7 @@ export function useMeshChatController({
setIdentityWizardBusy(false);
}
},
[disableWormholeForPublicMesh],
[disableWormholeForPublicMesh, enableMeshMqttBridge],
);
const handleCreatePublicIdentity = useCallback(async () => {
@@ -4059,8 +4325,8 @@ export function useMeshChatController({
setIdentityWizardStatus(null);
setMeshQuickStatus(null);
try {
const savedIdentity = getNodeIdentity();
if (!savedIdentity || !hasSovereignty()) {
const savedAddress = readStoredPublicMeshAddress();
if (!savedAddress) {
const text = 'No saved public mesh key is available. Create a mesh key first.';
setMeshSessionActive(false);
setIdentityWizardStatus({ type: 'err', text });
@@ -4068,16 +4334,15 @@ export function useMeshChatController({
return { ok: false as const, text };
}
await disableWormholeForPublicMesh();
const nextAddress = await derivePublicMeshAddress(savedIdentity.nodeId).catch(() => '');
const readyAddress = (nextAddress || savedIdentity.nodeId).toUpperCase();
setIdentity(savedIdentity);
setPublicMeshAddress(nextAddress || savedIdentity.nodeId);
await enableMeshMqttBridge();
const readyAddress = savedAddress.toUpperCase();
setPublicMeshAddress(savedAddress);
setMeshSessionActive(true);
setMeshMessages([]);
setSendError('');
const text = `MeshChat is on with saved address ${readyAddress}.`;
const text = `MeshChat is on. Address ${readyAddress}.`;
setIdentityWizardStatus({ type: 'ok', text });
setMeshQuickStatus({ type: 'ok', text });
setMeshQuickStatus(null);
return { ok: true as const, text };
} catch (err) {
const message =
@@ -4091,13 +4356,14 @@ export function useMeshChatController({
} finally {
setIdentityWizardBusy(false);
}
}, [disableWormholeForPublicMesh]);
}, [disableWormholeForPublicMesh, enableMeshMqttBridge]);
const handleReplyToMeshAddress = useCallback((address: string) => {
const target = String(address || '').trim();
if (!target) return;
setMeshDirectTarget(target);
setMeshView('inbox');
setMeshAddressDraft(target);
setMeshView('channel');
setSenderPopup(null);
setTimeout(() => inputRef.current?.focus(), 0);
}, []);
@@ -4108,7 +4374,7 @@ export function useMeshChatController({
: await createPublicMeshIdentity({ closeWizardOnSuccess: false });
const status = { type: result.ok ? 'ok' as const : 'err' as const, text: result.text };
setIdentityWizardStatus(status);
setMeshQuickStatus(status);
setMeshQuickStatus(result.ok ? null : status);
if (result.ok) {
window.setTimeout(() => setIdentityWizardOpen(false), 900);
}
@@ -4127,14 +4393,8 @@ export function useMeshChatController({
try {
setMeshSessionActive(false);
setMeshMessages([]);
await clearBrowserIdentityState();
setIdentity(null);
clearStoredPublicMeshAddress();
setPublicMeshAddress('');
setContacts({});
setSelectedContact('');
setDmMessages([]);
setAccessRequestsState([]);
setPendingSentState([]);
setIdentityWizardStatus({
type: 'ok',
text: 'Public mesh identity cleared. Start a fresh one when you are ready.',
@@ -4219,6 +4479,23 @@ export function useMeshChatController({
setIdentityWizardBusy(false);
}
}, [wormholeDescriptor?.nodeId, wormholeEnabled, wormholeReadyState]);
useEffect(() => {
if (!expanded || activeTab !== 'infonet') {
infonetAutoBootstrapRef.current = false;
return;
}
if (privateInfonetReady) {
infonetAutoBootstrapRef.current = false;
return;
}
if (identityWizardBusy || infonetAutoBootstrapRef.current) return;
infonetAutoBootstrapRef.current = true;
void handleBootstrapPrivateIdentity().catch(() => {
infonetAutoBootstrapRef.current = false;
});
}, [activeTab, expanded, handleBootstrapPrivateIdentity, identityWizardBusy, privateInfonetReady]);
return {
// UI state
expanded,
@@ -4242,15 +4519,30 @@ export function useMeshChatController({
meshQuickStatus,
meshSessionActive,
publicMeshAddress,
activePublicMeshAddress,
meshView,
setMeshView,
meshDirectTarget,
setMeshDirectTarget,
meshAddressDraft,
setMeshAddressDraft,
meshMqttSettings,
meshMqttForm,
setMeshMqttForm,
meshMqttBusy,
meshMqttStatusText,
meshMqttEnabled,
meshMqttRunning,
meshMqttConnected,
meshMqttConnectionLabel,
saveMeshMqttSettings,
refreshMeshMqttSettings,
// Identity
identity,
publicIdentity,
hasStoredPublicLaneIdentity,
hasPublicLaneIdentity,
canUsePublicMeshInput,
hasId,
shouldShowIdentityWarning,
wormholeEnabled,
@@ -5953,7 +5953,7 @@ export default function MeshTerminal({ isOpen, launchToken = 0, onClose, onDmCou
PARTICIPANT NODE
</div>
<div className="mt-1 text-sm leading-5 text-slate-400">
Backend bootstrap is configured; activate the participant node to sync the public testnet seed without Wormhole.
Backend bootstrap is configured; the participant node syncs the testnet seed over the private seed lane.
</div>
</div>
<div className="border border-cyan-500/20 bg-cyan-500/8 px-3 py-1.5 text-[13px] tracking-[0.22em] text-cyan-200">
@@ -6008,10 +6008,10 @@ export default function MeshTerminal({ isOpen, launchToken = 0, onClose, onDmCou
<div className="border border-amber-400/16 bg-amber-400/6 px-4 py-3 text-sm leading-6 text-amber-100/85">
<div className="text-[13px] font-mono tracking-[0.24em] text-amber-300">
WORMHOLE OPTIONAL FOR NODE SYNC
PRIVATE SEED LANE
</div>
<div className="mt-2">
Participant-node bootstrap, sync, and public chain hosting run on the backend lane without Wormhole.
Participant-node bootstrap, sync, and public chain hosting use the backend private seed lane.
</div>
<div className="mt-2 text-amber-200/75">
Turn Wormhole on for gates, obfuscated inbox, and the stronger obfuscated lane only.
@@ -7,6 +7,7 @@ import React, { useEffect, useRef, useCallback } from 'react';
import WikiImage from '@/components/WikiImage';
import type { SelectedEntity, RegionDossier, FimiData } from "@/types/dashboard";
import { useDataKeys } from '@/hooks/useDataStore';
import { API_BASE } from '@/lib/api';
import { lookupShodanHost } from '@/lib/shodanClient';
import type { ShodanHost } from '@/types/shodan';
@@ -100,6 +101,7 @@ const AIRCRAFT_WIKI: Record<string, string> = {
PA46: 'Piper PA-46 Malibu', BE36: 'Beechcraft Bonanza', BE9L: 'Beechcraft King Air',
BE20: 'Beechcraft Super King Air', B350: 'Beechcraft King Air 350', PC12: 'Pilatus PC-12',
PC24: 'Pilatus PC-24', TBM7: 'Daher TBM', TBM8: 'Daher TBM', TBM9: 'Daher TBM',
PIVI: 'Pipistrel Virus',
// Helicopters
R44: 'Robinson R44', R22: 'Robinson R22', R66: 'Robinson R66',
B06: 'Bell 206', B407: 'Bell 407', B412: 'Bell 412',
@@ -196,12 +198,17 @@ function resolveAcTypeWiki(acType: string): string | null {
return null;
}
function resolveAircraftWikiTitle(model: string | undefined): string | null {
if (!model) return null;
return AIRCRAFT_WIKI[model] || resolveAcTypeWiki(model);
}
// Module-level cache for Wikipedia thumbnails (persists across re-renders)
const _wikiThumbCache: Record<string, { url: string | null; loading: boolean }> = {};
function useAircraftImage(model: string | undefined): { imgUrl: string | null; wikiUrl: string | null; loading: boolean } {
const [, forceUpdate] = useState(0);
const wikiTitle = model ? AIRCRAFT_WIKI[model] : undefined;
const wikiTitle = resolveAircraftWikiTitle(model) || undefined;
const wikiUrl = wikiTitle ? `https://en.wikipedia.org/wiki/${wikiTitle.replace(/ /g, '_')}` : null;
useEffect(() => {
@@ -236,6 +243,42 @@ const VESSEL_TYPE_WIKI: Record<string, string> = {
'military_vessel': 'https://en.wikipedia.org/wiki/Warship',
};
type FlightTrailPoint = { lat?: number; lng?: number; alt?: number; ts?: number } | number[];
function EmissionsEstimateBlock({ flight }: { flight: any }) {
const emissions = flight?.emissions;
const context = emissions ? 'Model-based cruise estimate' : null;
return (
<div className="border-b border-[var(--border-primary)] pb-2">
<span className="text-[var(--text-muted)] text-[10px] block mb-1.5">EMISSIONS ESTIMATE</span>
<div className="flex gap-3">
<div className="flex-1 bg-[var(--bg-primary)]/50 border border-[var(--border-primary)] px-2 py-1.5">
<div className="text-[11px] text-[var(--text-muted)] tracking-widest">FUEL RATE</div>
<div className="text-xs font-bold text-orange-400">
{emissions ? (
<>{emissions.fuel_gph} <span className="text-[11px] text-[var(--text-muted)] font-normal">GPH</span></>
) : 'UNKNOWN'}
</div>
</div>
<div className="flex-1 bg-[var(--bg-primary)]/50 border border-[var(--border-primary)] px-2 py-1.5">
<div className="text-[11px] text-[var(--text-muted)] tracking-widest">CO2 RATE</div>
<div className="text-xs font-bold text-red-400">
{emissions ? (
<>{emissions.co2_kg_per_hour.toLocaleString()} <span className="text-[11px] text-[var(--text-muted)] font-normal">KG/HR</span></>
) : 'UNKNOWN'}
</div>
</div>
</div>
{context && (
<div className="mt-1.5 text-[10px] text-[var(--text-muted)] leading-relaxed">
{context}
</div>
)}
</div>
);
}
function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, onArticleClick }: { selectedEntity?: SelectedEntity | null, regionDossier?: RegionDossier | null, regionDossierLoading?: boolean, onArticleClick?: (idx: number, lat?: number, lng?: number, title?: string) => void }) {
const data = useDataKeys([
'news', 'fimi', 'commercial_flights', 'private_flights', 'private_jets',
@@ -243,6 +286,7 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
'airports', 'last_updated', 'threat_level',
] as const);
const [isMinimized, setIsMinimized] = useState(false);
const [selectedFlightTrail, setSelectedFlightTrail] = useState<FlightTrailPoint[]>([]);
const [expandedIndexes, setExpandedIndexes] = useState<number[]>([]);
const [fimiExpanded, setFimiExpanded] = useState(false);
const [aiSummaryOpen, setAiSummaryOpen] = useState(false);
@@ -277,15 +321,72 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
const selectedFlightModel = (() => {
if (!selectedEntity) return undefined;
const { type, id } = selectedEntity;
let flight: any = null;
if (type === 'flight') flight = data?.commercial_flights?.[id as number];
else if (type === 'private_flight') flight = data?.private_flights?.[id as number];
else if (type === 'private_jet') flight = data?.private_jets?.[id as number];
else if (type === 'military_flight') flight = data?.military_flights?.[id as number];
else if (type === 'tracked_flight') flight = data?.tracked_flights?.[id as number];
const findByIdOrIndex = (flights?: Array<{ icao24?: string; model?: string }>) => {
if (!flights) return null;
if (typeof id === 'number') return flights[id] || null;
return flights.find((flight) => flight.icao24 === id) || null;
};
let flight: { model?: string } | null = null;
if (type === 'flight') flight = findByIdOrIndex(data?.commercial_flights);
else if (type === 'private_flight') flight = findByIdOrIndex(data?.private_flights);
else if (type === 'private_jet') flight = findByIdOrIndex(data?.private_jets);
else if (type === 'military_flight') flight = findByIdOrIndex(data?.military_flights);
else if (type === 'tracked_flight') flight = findByIdOrIndex(data?.tracked_flights);
return flight?.model;
})();
const { imgUrl: aircraftImgUrl, wikiUrl: aircraftWikiUrl, loading: aircraftImgLoading } = useAircraftImage(selectedFlightModel);
useEffect(() => {
const flightSelectionTypes = new Set([
'flight',
'commercial_flight',
'private_flight',
'private_ga',
'private_jet',
'military_flight',
'tracked_flight',
]);
if (!selectedEntity || !flightSelectionTypes.has(selectedEntity.type)) {
setSelectedFlightTrail([]);
return;
}
const trailId = String(selectedEntity.id || '').trim();
if (!trailId) {
setSelectedFlightTrail([]);
return;
}
let cancelled = false;
const refreshSelectedFlightTrail = () => {
fetch(`${API_BASE}/api/trail/flight/${encodeURIComponent(trailId)}`, { cache: 'no-store' })
.then((res) => (res.ok ? res.json() : null))
.then((payload) => {
if (cancelled) return;
const trail = Array.isArray(payload?.trail) ? payload.trail as FlightTrailPoint[] : [];
setSelectedFlightTrail(trail);
})
.catch(() => {
if (!cancelled) setSelectedFlightTrail([]);
});
};
refreshSelectedFlightTrail();
const trailRefreshTimer = window.setInterval(refreshSelectedFlightTrail, 30000);
return () => {
cancelled = true;
window.clearInterval(trailRefreshTimer);
};
}, [selectedEntity?.id, selectedEntity?.type]);
const withSelectedTrail = useCallback((flight: any) => {
if (!flight || selectedFlightTrail.length < 2) return flight;
const selectedId = String(selectedEntity?.id || '').trim().toLowerCase();
const flightId = String(flight.icao24 || '').trim().toLowerCase();
if (!selectedId || !flightId || selectedId !== flightId) return flight;
return { ...flight, trail: selectedFlightTrail };
}, [selectedEntity?.id, selectedFlightTrail]);
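The trail effect above follows a fetch-then-poll shape: refresh once immediately, refresh again on a 30-second interval, and drop late responses after cleanup via a `cancelled` flag. A minimal framework-free sketch of that pattern (`pollTrail` and `fetchTrail` are illustrative names, not the source's API):

```typescript
// Poll a trail endpoint with cancellation: fetch once, refresh on an
// interval, and ignore responses that land after cleanup.
type TrailPoint = { lat?: number; lng?: number; alt?: number; ts?: number };

function pollTrail(
  fetchTrail: () => Promise<TrailPoint[]>, // stand-in for /api/trail/flight/<id>
  onTrail: (trail: TrailPoint[]) => void,
  intervalMs = 30_000,
): () => void {
  let cancelled = false;
  const refresh = () => {
    fetchTrail()
      .then((trail) => { if (!cancelled) onTrail(trail); })
      .catch(() => { if (!cancelled) onTrail([]); });
  };
  refresh();
  const timer = setInterval(refresh, intervalMs);
  return () => {           // cleanup, mirroring the effect's return value
    cancelled = true;
    clearInterval(timer);
  };
}
```

Returning the cleanup function directly matches how a `useEffect` body would consume it.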
const [shodanDetail, setShodanDetail] = useState<ShodanHost | null>(null);
const [shodanLoading, setShodanLoading] = useState(false);
const [shodanError, setShodanError] = useState<string | null>(null);
@@ -499,6 +600,7 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
if (selectedEntity?.type === 'tracked_flight') {
const flight = data?.tracked_flights?.find((f: any) => f.icao24 === selectedEntity.id);
if (flight) {
const flightForEmissions = withSelectedTrail(flight);
const callsign = flight.callsign || "UNKNOWN";
const alertColorMap: Record<string, string> = {
'#ff1493': 'text-[#ff1493]', pink: 'text-[#ff1493]', red: 'text-red-400', yellow: 'text-yellow-400',
@@ -684,19 +786,7 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
<span className={`text-xs font-bold ${flight.squawk === '7700' ? 'text-red-400 animate-pulse' : flight.squawk === '7600' ? 'text-yellow-400' : 'text-[var(--text-primary)]'}`}>{flight.squawk}{flight.squawk === '7700' ? ' ⚠ EMERGENCY' : flight.squawk === '7600' ? ' COMMS LOST' : ''}</span>
</div>
)}
<div className="border-b border-[var(--border-primary)] pb-2">
<span className="text-[var(--text-muted)] text-[10px] block mb-1.5">EMISSIONS ESTIMATE</span>
<div className="flex gap-3">
<div className="flex-1 bg-[var(--bg-primary)]/50 border border-[var(--border-primary)] px-2 py-1.5">
<div className="text-[11px] text-[var(--text-muted)] tracking-widest">FUEL BURN</div>
<div className="text-xs font-bold text-orange-400">{flight.emissions ? <>{flight.emissions.fuel_gph} <span className="text-[11px] text-[var(--text-muted)] font-normal">GPH</span></> : 'UNKNOWN'}</div>
</div>
<div className="flex-1 bg-[var(--bg-primary)]/50 border border-[var(--border-primary)] px-2 py-1.5">
<div className="text-[11px] text-[var(--text-muted)] tracking-widest">CO2 OUTPUT</div>
<div className="text-xs font-bold text-red-400">{flight.emissions ? <>{flight.emissions.co2_kg_per_hour.toLocaleString()} <span className="text-[11px] text-[var(--text-muted)] font-normal">KG/HR</span></> : 'UNKNOWN'}</div>
</div>
</div>
</div>
<EmissionsEstimateBlock flight={flightForEmissions} />
{flight.alert_link && (
<div className="flex justify-between items-center border-b border-[var(--border-primary)] pb-2">
<span className="text-[var(--text-muted)] text-[10px]">REFERENCE</span>
@@ -748,8 +838,15 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
const flight = flightsList?.find((f: any) => f.icao24 === selectedEntity.id);
if (flight) {
const flightForEmissions = withSelectedTrail(flight);
const callsign = flight.callsign || "UNKNOWN";
let airline = "UNKNOWN";
const isPrivateFlight = selectedEntity.type === 'private_flight' || selectedEntity.type === 'private_jet';
const aircraftWikiTitle = resolveAircraftWikiTitle(flight.model);
const aircraftModelWikiUrl = aircraftWikiTitle
? `https://en.wikipedia.org/wiki/${aircraftWikiTitle.replace(/ /g, '_')}`
: null;
const showModelWiki = isPrivateFlight || selectedEntity.type === 'military_flight';
if (selectedEntity.type === 'military_flight') {
const mil = flight as import('@/types/dashboard').MilitaryFlight;
@@ -798,7 +895,7 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
<div className="p-4 flex flex-col gap-3">
<div className="flex justify-between items-center border-b border-[var(--border-primary)] pb-2">
<span className="text-[var(--text-muted)] text-[10px]">OPERATOR</span>
{selectedEntity.type !== 'military_flight' && airline && airline !== 'COMMERCIAL FLIGHT' && airline !== 'UNKNOWN' ? (
{!isPrivateFlight && selectedEntity.type !== 'military_flight' && airline && airline !== 'COMMERCIAL FLIGHT' && airline !== 'UNKNOWN' ? (
<a
href={`https://en.wikipedia.org/wiki/${encodeURIComponent(airline.replace(/ /g, '_'))}`}
target="_blank"
@@ -812,7 +909,7 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
)}
</div>
{/* Commercial: Airline company Wikipedia image */}
{selectedEntity.type !== 'military_flight' && airline && airline !== 'COMMERCIAL FLIGHT' && airline !== 'UNKNOWN' && (
{!isPrivateFlight && selectedEntity.type !== 'military_flight' && airline && airline !== 'COMMERCIAL FLIGHT' && airline !== 'UNKNOWN' && (
<div className="border-b border-[var(--border-primary)] pb-2">
<WikiImage
wikiUrl={`https://en.wikipedia.org/wiki/${encodeURIComponent(airline.replace(/ /g, '_'))}`}
@@ -828,7 +925,18 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
</div>
<div className="flex justify-between items-center border-b border-[var(--border-primary)] pb-2">
<span className="text-[var(--text-muted)] text-[10px]">AIRCRAFT MODEL</span>
<span className="text-[var(--text-primary)] text-xs font-bold">{flight.model || "UNKNOWN"}</span>
{showModelWiki && aircraftModelWikiUrl ? (
<a
href={aircraftModelWikiUrl}
target="_blank"
rel="noreferrer"
className="text-xs font-bold text-cyan-400 hover:text-cyan-300 underline"
>
{aircraftWikiTitle || flight.model}
</a>
) : (
<span className="text-[var(--text-primary)] text-xs font-bold">{flight.model || "UNKNOWN"}</span>
)}
</div>
{/* Military: Aircraft model Wikipedia image (gold accent) */}
{selectedEntity.type === 'military_flight' && (() => {
@@ -878,8 +986,19 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
}
return null;
})()}
{/* Private/GA: aircraft model Wikipedia image as the primary visual */}
{isPrivateFlight && aircraftModelWikiUrl && (
<div className="border-b border-[var(--border-primary)] pb-3">
<WikiImage
wikiUrl={aircraftModelWikiUrl}
label={aircraftWikiTitle || flight.model}
maxH="max-h-36"
accent="hover:border-purple-400/60"
/>
</div>
)}
{/* Non-military: Aircraft model photo (secondary, below airline image) */}
{selectedEntity.type !== 'military_flight' && (aircraftImgUrl || aircraftImgLoading || aircraftWikiUrl) && (
{!isPrivateFlight && selectedEntity.type !== 'military_flight' && selectedEntity.type !== 'flight' && (aircraftImgUrl || aircraftImgLoading || aircraftWikiUrl) && (
<div className="border-b border-[var(--border-primary)] pb-3">
{aircraftImgLoading && (
<div className="w-full h-24 bg-[var(--bg-tertiary)]/60" />
@@ -924,19 +1043,7 @@ function NewsFeedInner({ selectedEntity, regionDossier, regionDossierLoading, on
<span className="text-[var(--text-muted)] text-[10px]">ROUTE</span>
<span className="text-cyan-400 text-xs font-bold">{flight.origin_name !== "UNKNOWN" ? `[${flight.origin_name}] → [${flight.dest_name}]` : "UNKNOWN"}</span>
</div>
<div className="border-b border-[var(--border-primary)] pb-2">
<span className="text-[var(--text-muted)] text-[10px] block mb-1.5">EMISSIONS ESTIMATE</span>
<div className="flex gap-3">
<div className="flex-1 bg-[var(--bg-primary)]/50 border border-[var(--border-primary)] px-2 py-1.5">
<div className="text-[11px] text-[var(--text-muted)] tracking-widest">FUEL BURN</div>
<div className="text-xs font-bold text-orange-400">{flight.emissions ? <>{flight.emissions.fuel_gph} <span className="text-[11px] text-[var(--text-muted)] font-normal">GPH</span></> : 'UNKNOWN'}</div>
</div>
<div className="flex-1 bg-[var(--bg-primary)]/50 border border-[var(--border-primary)] px-2 py-1.5">
<div className="text-[11px] text-[var(--text-muted)] tracking-widest">CO2 OUTPUT</div>
<div className="text-xs font-bold text-red-400">{flight.emissions ? <>{flight.emissions.co2_kg_per_hour.toLocaleString()} <span className="text-[11px] text-[var(--text-muted)] font-normal">KG/HR</span></> : 'UNKNOWN'}</div>
</div>
</div>
</div>
<EmissionsEstimateBlock flight={flightForEmissions} />
{flight.icao24 && (
<div className="flex justify-between items-center border-b border-[var(--border-primary)] pb-2">
<span className="text-[var(--text-muted)] text-[10px]">FLIGHT RECORD</span>
@@ -2,9 +2,9 @@
import React, { useState, useEffect } from 'react';
import { motion, AnimatePresence } from 'framer-motion';
import { X, ExternalLink, Key, Shield, Radar, Globe, Satellite, Ship, Radio } from 'lucide-react';
import { X, ExternalLink, Key, Shield, Radar, Globe, Satellite, Ship, Radio, Bot, Copy, Check, Network } from 'lucide-react';
const CURRENT_ONBOARDING_VERSION = '0.9.7-docker-keys-1';
const CURRENT_ONBOARDING_VERSION = '0.9.79-agentic-onboarding-1';
const STORAGE_KEY = `shadowbroker_onboarding_complete_v${CURRENT_ONBOARDING_VERSION}`;
const LEGACY_STORAGE_KEY = 'shadowbroker_onboarding_complete';
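Bumping `CURRENT_ONBOARDING_VERSION` changes `STORAGE_KEY`, so a dismissal recorded under an older version (or the un-versioned legacy key) no longer suppresses the modal after an upgrade. A small sketch of that lookup, using a plain object in place of `localStorage` (the helper name is illustrative):

```typescript
// Versioned dismissal check: only the key for the current onboarding
// version counts; older or legacy keys intentionally do not.
function onboardingDismissed(store: Record<string, string>, version: string): boolean {
  const key = `shadowbroker_onboarding_complete_v${version}`;
  return store[key] === 'true';
}
```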
@@ -68,6 +68,14 @@ const OnboardingModal = React.memo(function OnboardingModal({
});
const [setupSaving, setSetupSaving] = useState(false);
const [setupMsg, setSetupMsg] = useState<{ type: 'ok' | 'err'; text: string } | null>(null);
const [agentSecret, setAgentSecret] = useState('');
const [agentTier, setAgentTier] = useState<'restricted' | 'full'>('restricted');
const [agentMode, setAgentMode] = useState<'local' | 'remote'>('local');
const [agentLoading, setAgentLoading] = useState(false);
const [agentMsg, setAgentMsg] = useState<{ type: 'ok' | 'err'; text: string } | null>(null);
const [agentCopied, setAgentCopied] = useState(false);
const [torStarting, setTorStarting] = useState(false);
const [torAddress, setTorAddress] = useState('');
const handleDismiss = () => {
localStorage.setItem(STORAGE_KEY, 'true');
@@ -114,6 +122,110 @@ const OnboardingModal = React.memo(function OnboardingModal({
}
};
const agentEndpoint =
agentMode === 'local'
? 'http://localhost:8000'
: torAddress || '<prepare remote .onion link>';
const agentSnippet = [
`SHADOWBROKER_URL=${agentEndpoint}`,
agentSecret ? `SHADOWBROKER_KEY=${agentSecret}` : 'SHADOWBROKER_KEY=<generate in ShadowBroker>',
`SHADOWBROKER_ACCESS=${agentTier}`,
'',
'# FIRST: load available tools',
`GET ${agentEndpoint}/api/ai/tools`,
'',
'# Auth: HMAC-SHA256 signed requests.',
'# Restricted = read-only telemetry. Full = can write when asked.',
].join('\n');
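The snippet above tells the agent that authentication is via HMAC-SHA256 signed requests. A hedged sketch of what signing could look like on the agent side; the canonical string and header names here are assumptions, not the backend's documented scheme, so check the `/api/ai` docs for the real format:

```typescript
import { createHmac } from 'node:crypto';

// Sign a request with the SHADOWBROKER_KEY secret. The canonical form
// (method, path, timestamp joined by newlines) is an assumed layout.
function signRequest(secret: string, method: string, path: string, ts: number): string {
  const canonical = `${method}\n${path}\n${ts}`;
  return createHmac('sha256', secret).update(canonical).digest('hex');
}

// Usage (hypothetical header names):
//   X-Shadowbroker-Timestamp: <ts>
//   X-Shadowbroker-Signature: <signRequest(key, 'GET', '/api/ai/tools', ts)>
```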
const remoteAgentNeedsTor = agentMode === 'remote' && !torAddress;
const fetchAgentConnectInfo = async (reveal = true) => {
setAgentLoading(true);
setAgentMsg(null);
try {
const res = await fetch(`/api/ai/connect-info?reveal=${reveal ? 'true' : 'false'}`);
const data = await res.json().catch(() => ({}));
if (!res.ok || data?.ok === false) {
throw new Error(data?.detail || 'Could not prepare agent credentials.');
}
setAgentSecret(data.hmac_secret || '');
setAgentTier(data.access_tier === 'full' ? 'full' : 'restricted');
setAgentMsg({ type: 'ok', text: 'Agent key is ready. Copy it into your local or remote agent runtime.' });
} catch (error) {
setAgentMsg({
type: 'err',
text: error instanceof Error ? error.message : 'Could not prepare agent credentials.',
});
} finally {
setAgentLoading(false);
}
};
const saveAgentTier = async (tier: 'restricted' | 'full') => {
setAgentTier(tier);
setAgentMsg(null);
try {
const res = await fetch('/api/ai/connect-info/access-tier', {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ tier }),
});
const data = await res.json().catch(() => ({}));
if (!res.ok || data?.ok === false) {
throw new Error(data?.detail || 'Could not update agent access tier.');
}
setAgentMsg({
type: 'ok',
text: tier === 'full'
? 'Full access saved. The agent can write to the dashboard when authenticated.'
: 'Restricted access saved. The agent can read telemetry but cannot write.',
});
} catch (error) {
setAgentMsg({
type: 'err',
text: error instanceof Error ? error.message : 'Could not update agent access tier.',
});
}
};
const prepareTorAgentAddress = async () => {
setTorStarting(true);
setAgentMsg(null);
try {
const res = await fetch('/api/settings/tor/start', { method: 'POST' });
const data = await res.json().catch(() => ({}));
if (!res.ok || data?.ok === false || !data?.onion_address) {
throw new Error(data?.detail || 'Could not start Tor hidden service.');
}
setTorAddress(data.onion_address);
setAgentMsg({
type: 'ok',
text: 'Tor is ready. The remote agent link is private to your local ShadowBroker node.',
});
} catch (error) {
setAgentMsg({
type: 'err',
text:
error instanceof Error
? error.message
: 'ShadowBroker could not install or start Tor automatically. Check network access and try again.',
});
} finally {
setTorStarting(false);
}
};
const copyAgentSnippet = async () => {
if (remoteAgentNeedsTor) {
setAgentMsg({ type: 'err', text: 'Install Tor and create the remote link first, then copy the agent config.' });
return;
}
await navigator.clipboard.writeText(agentSnippet);
setAgentCopied(true);
setTimeout(() => setAgentCopied(false), 1600);
};
return (
<AnimatePresence>
{/* Backdrop */}
@@ -166,7 +278,7 @@ const OnboardingModal = React.memo(function OnboardingModal({
{/* Step Indicators */}
<div className="flex gap-2 px-6 pt-4">
{['API Keys', 'Trust Modes', 'Free Sources'].map((label, i) => (
{['API Keys', 'AI Agent', 'Trust Modes', 'Free Sources'].map((label, i) => (
<button
key={label}
onClick={() => setStep(i)}
@@ -183,7 +295,7 @@ const OnboardingModal = React.memo(function OnboardingModal({
{/* Content */}
<div className="flex-1 overflow-y-auto styled-scrollbar p-6">
{step === 1 && (
{step === 2 && (
<div className="space-y-4">
<div className="text-center py-4">
<div className="text-lg font-bold tracking-[0.3em] text-[var(--text-primary)] font-mono mb-2">
@@ -246,6 +358,157 @@ const OnboardingModal = React.memo(function OnboardingModal({
</div>
)}
{step === 1 && (
<div className="space-y-5">
<div>
<p className="text-[11px] text-violet-300 font-mono font-bold tracking-widest mb-2">
STEP 1 - WHERE IS YOUR AGENT?
</p>
<div className="grid grid-cols-2 gap-2">
<button
onClick={() => setAgentMode('local')}
className={`border px-4 py-3 text-left transition-all ${
agentMode === 'local'
? 'border-cyan-500/50 bg-cyan-950/40'
: 'border-[var(--border-primary)] hover:border-cyan-500/30'
}`}
>
<p className={`text-sm font-mono font-bold ${agentMode === 'local' ? 'text-cyan-300' : 'text-[var(--text-secondary)]'}`}>
Local
</p>
<p className="text-[10px] text-[var(--text-muted)] font-mono mt-1">
Same machine as ShadowBroker
</p>
</button>
<button
onClick={() => setAgentMode('remote')}
className={`border px-4 py-3 text-left transition-all ${
agentMode === 'remote'
? 'border-violet-500/50 bg-violet-950/40'
: 'border-[var(--border-primary)] hover:border-violet-500/30'
}`}
>
<p className={`text-sm font-mono font-bold ${agentMode === 'remote' ? 'text-violet-300' : 'text-[var(--text-secondary)]'}`}>
Remote
</p>
<p className="text-[10px] text-[var(--text-muted)] font-mono mt-1">
Different machine over Tor
</p>
</button>
</div>
</div>
<div>
<p className="text-[11px] text-violet-300 font-mono font-bold tracking-widest mb-2">
STEP 2 - WHAT CAN IT DO?
</p>
<div className="grid grid-cols-2 gap-2">
<button
onClick={() => void saveAgentTier('restricted')}
className={`border px-4 py-3 text-left transition-all ${
agentTier === 'restricted'
? 'border-green-500/50 bg-green-950/30'
: 'border-[var(--border-primary)] hover:border-green-500/30'
}`}
>
<p className="text-sm text-green-300 font-mono font-bold flex items-center gap-2">
<Shield size={14} /> Read Only
</p>
<p className="text-[10px] text-[var(--text-muted)] font-mono mt-2">
Can see live telemetry but cannot change anything
</p>
</button>
<button
onClick={() => void saveAgentTier('full')}
className={`border px-4 py-3 text-left transition-all ${
agentTier === 'full'
? 'border-amber-500/50 bg-amber-950/30'
: 'border-[var(--border-primary)] hover:border-amber-500/30'
}`}
>
<p className="text-sm text-amber-300 font-mono font-bold flex items-center gap-2">
<Network size={14} /> Full Access
</p>
<p className="text-[10px] text-[var(--text-muted)] font-mono mt-2">
Can place pins, create layers, and trigger display actions
</p>
</button>
</div>
</div>
<div>
<div className="flex items-center justify-between gap-3 mb-2">
<div>
<p className="text-[11px] text-violet-300 font-mono font-bold tracking-widest">
STEP 3 - COPY THIS INTO YOUR AGENT
</p>
<p className="text-[10px] text-[var(--text-muted)] font-mono mt-1">
Generate a local key, then copy these variables into OpenClaw, Hermes, or another HMAC agent.
</p>
</div>
<button
onClick={() => void fetchAgentConnectInfo(true)}
disabled={agentLoading}
className="px-3 py-2 border border-violet-500/40 text-violet-300 hover:bg-violet-500/10 disabled:opacity-50 text-[11px] font-mono tracking-widest"
>
{agentLoading ? 'GENERATING...' : 'GENERATE'}
</button>
</div>
{remoteAgentNeedsTor && (
<div className="mb-2 border border-violet-500/30 bg-violet-950/20 p-3">
<div className="flex items-start justify-between gap-3">
<div>
<p className="text-[11px] text-violet-200 font-mono font-bold tracking-widest">
TOR REQUIRED FOR REMOTE AGENTS
</p>
<p className="text-[10px] text-[var(--text-muted)] font-mono mt-1 leading-relaxed">
ShadowBroker will install or use Tor locally, then create a private .onion link for this backend.
</p>
</div>
<button
onClick={() => void prepareTorAgentAddress()}
disabled={torStarting}
className="shrink-0 px-3 py-2 border border-violet-500/40 text-violet-200 hover:bg-violet-500/10 disabled:opacity-50 text-[10px] font-mono tracking-widest flex items-center gap-2"
>
<Network size={13} />
{torStarting ? 'INSTALLING...' : 'INSTALL TOR'}
</button>
</div>
</div>
)}
<div className="relative">
<pre className="min-h-40 max-h-56 overflow-auto styled-scrollbar bg-[var(--bg-primary)] border border-violet-500/30 p-4 pr-24 text-[12px] text-violet-100 font-mono whitespace-pre-wrap leading-relaxed">
{agentSnippet}
</pre>
<button
onClick={() => void copyAgentSnippet()}
disabled={remoteAgentNeedsTor}
className="absolute top-3 right-3 px-3 py-2 border border-violet-500/50 bg-violet-950/50 text-violet-200 hover:bg-violet-800/30 disabled:opacity-45 disabled:hover:bg-violet-950/50 text-[11px] font-mono tracking-widest flex items-center gap-2"
>
{agentCopied ? <Check size={13} /> : <Copy size={13} />}
{agentCopied ? 'COPIED' : 'COPY'}
</button>
</div>
{agentMsg && (
<p
className={`mt-2 text-sm font-mono ${
agentMsg.type === 'ok' ? 'text-green-300' : 'text-red-300'
}`}
>
{agentMsg.text}
</p>
)}
</div>
<p className="text-[11px] text-orange-300/80 font-mono leading-relaxed">
Remote agent access uses the signed HTTP API over Tor. Wormhole uses the same Tor/Arti transport lane when it is available; MLS-native agent transport is still planned.
</p>
</div>
)}
{step === 0 && (
<div className="space-y-4">
<div className="bg-yellow-950/20 border border-yellow-500/20 p-4">
@@ -359,7 +622,7 @@ const OnboardingModal = React.memo(function OnboardingModal({
</div>
)}
-{step === 2 && (
+{step === 3 && (
<div className="space-y-3">
<p className="text-sm text-[var(--text-secondary)] font-mono mb-3">
These data sources are completely free and require no API keys. They activate
@@ -401,7 +664,7 @@ const OnboardingModal = React.memo(function OnboardingModal({
</button>
<div className="flex gap-1.5">
-{[0, 1, 2].map((i) => (
+{[0, 1, 2, 3].map((i) => (
<div
key={i}
className={`w-1.5 h-1.5 rounded-full transition-colors ${step === i ? 'bg-cyan-400' : 'bg-[var(--border-primary)]'}`}
@@ -409,7 +672,7 @@ const OnboardingModal = React.memo(function OnboardingModal({
))}
</div>
-{step < 2 ? (
+{step < 3 ? (
<button
onClick={() => setStep(step + 1)}
className="px-4 py-2 border border-cyan-500/40 text-cyan-400 hover:bg-cyan-500/10 text-sm font-mono tracking-widest transition-all"
+90 -46
@@ -500,6 +500,7 @@ const SettingsPanel = React.memo(function SettingsPanel({
// stored server-side, and never returned to the browser.
const [apis, setApis] = useState<ApiEntry[]>([]);
const [apiKeyInputs, setApiKeyInputs] = useState<Record<string, string>>({});
const [apiKeyEditing, setApiKeyEditing] = useState<Record<string, boolean>>({});
const [apiKeySaving, setApiKeySaving] = useState<string | null>(null);
const [apiKeyMsg, setApiKeyMsg] = useState<{ type: 'ok' | 'err'; text: string } | null>(null);
const [expandedCategories, setExpandedCategories] = useState<Set<string>>(
@@ -573,6 +574,7 @@ const SettingsPanel = React.memo(function SettingsPanel({
if (result.keys) setApis(result.keys);
if (result.env) setEnvMeta(result.env);
setApiKeyInputs((prev) => ({ ...prev, [envKey]: '' }));
setApiKeyEditing((prev) => ({ ...prev, [envKey]: false }));
setApiKeyMsg({ type: 'ok', text: `${envKey} saved locally. Restart or refresh feeds to use it.` });
} catch (e) {
const message = e instanceof Error ? e.message : 'Could not save API key';
@@ -2215,6 +2217,10 @@ const SettingsPanel = React.memo(function SettingsPanel({
aircraft and vessel feeds.
</p>
</div>
<div className="pl-5 text-[12px] font-mono text-cyan-200/80 leading-relaxed">
Configured keys stay hidden for shared dashboards. Unlock operator tools, then
use ROTATE only when you intentionally want to replace a working credential.
</div>
{envMeta && (
<div className="pl-5 text-[12px] font-mono text-[var(--text-muted)] leading-relaxed space-y-0.5">
<div>
@@ -2344,17 +2350,53 @@ const SettingsPanel = React.memo(function SettingsPanel({
{api.has_key && (
<div className="mt-2 space-y-2 text-[12px] font-mono">
{api.is_set ? (
-<div className="flex items-center gap-2">
-<span className="px-2 py-0.5 border border-green-500/40 bg-green-950/20 text-green-300 tracking-wider">
-CONFIGURED
-</span>
-<span className="text-[var(--text-muted)]">
-edit{' '}
-<span className="text-cyan-300 select-all break-all">
-{api.env_key}
-</span>{' '}
-Enter a replacement below if you need to rotate it.
-</span>
+<div className="space-y-2">
+<div className="flex items-start justify-between gap-2">
+<div className="min-w-0 flex items-center gap-2">
+<span className="px-2 py-0.5 border border-green-500/40 bg-green-950/20 text-green-300 tracking-wider">
+CONFIGURED
+</span>
+<span className="text-[var(--text-muted)] leading-relaxed">
+Secret hidden. Stored write-only on this backend as{' '}
+<span className="text-cyan-300 select-all break-all">
+{api.env_key}
+</span>
+.
+</span>
+</div>
+{api.env_key && (
+<button
+type="button"
+onClick={() => {
+if (!(nativeProtected || adminSessionReady)) {
+setApiKeyMsg({
+type: 'err',
+text: 'Unlock operator tools before rotating a configured key.',
+});
+return;
+}
+setApiKeyMsg(null);
+setApiKeyEditing((prev) => ({
+...prev,
+[api.env_key as string]: !prev[api.env_key as string],
+}));
+}}
+className={`shrink-0 px-2 py-1 border text-[11px] tracking-widest transition-colors ${
+nativeProtected || adminSessionReady
+? 'border-yellow-500/40 text-yellow-300 hover:bg-yellow-500/10'
+: 'border-[var(--border-primary)] text-[var(--text-muted)] hover:border-yellow-500/30 hover:text-yellow-300/80'
+}`}
+>
+{apiKeyEditing[api.env_key] ? 'CANCEL' : 'ROTATE'}
+</button>
+)}
+</div>
+{!(nativeProtected || adminSessionReady) && (
+<div className="text-[11px] text-yellow-300/70 leading-relaxed">
+Operator tools are locked. Viewers can see source status
+but cannot replace saved credentials.
+</div>
+)}
+</div>
</div>
) : (
<div className="flex items-center gap-2">
@@ -2366,40 +2408,42 @@ const SettingsPanel = React.memo(function SettingsPanel({
</span>
</div>
)}
-<div className="flex items-center gap-2">
-<input
-type="password"
-value={api.env_key ? apiKeyInputs[api.env_key] || '' : ''}
-onChange={(event) => {
-if (!api.env_key) return;
-setApiKeyInputs((prev) => ({
-...prev,
-[api.env_key as string]: event.target.value,
-}));
-}}
-placeholder={
-api.is_set
-? 'Enter replacement key...'
-: `Enter ${api.env_key}...`
-}
-className="min-w-0 flex-1 bg-[var(--bg-primary)] border border-[var(--border-primary)] px-2 py-1.5 text-sm text-[var(--text-primary)] outline-none focus:border-cyan-500/70 placeholder:text-[var(--text-muted)]/50"
-autoComplete="off"
-/>
-<button
-onClick={() => void saveApiKey(api.env_key)}
-disabled={
-!api.env_key ||
-apiKeySaving === api.env_key ||
-!String(
-api.env_key ? apiKeyInputs[api.env_key] || '' : '',
-).trim()
-}
-className="h-8 px-3 border border-cyan-500/40 bg-cyan-950/20 text-cyan-300 hover:bg-cyan-500/15 disabled:opacity-40 disabled:cursor-not-allowed flex items-center gap-1.5 tracking-widest"
->
-<Save size={12} />
-{apiKeySaving === api.env_key ? 'SAVING' : 'SAVE'}
-</button>
-</div>
+{(!api.is_set || (api.env_key && apiKeyEditing[api.env_key])) && (
+<div className="flex items-center gap-2">
+<input
+type="password"
+value={api.env_key ? apiKeyInputs[api.env_key] || '' : ''}
+onChange={(event) => {
+if (!api.env_key) return;
+setApiKeyInputs((prev) => ({
+...prev,
+[api.env_key as string]: event.target.value,
+}));
+}}
+placeholder={
+api.is_set
+? 'Enter replacement key...'
+: `Enter ${api.env_key}...`
+}
+className="min-w-0 flex-1 bg-[var(--bg-primary)] border border-[var(--border-primary)] px-2 py-1.5 text-sm text-[var(--text-primary)] outline-none focus:border-cyan-500/70 placeholder:text-[var(--text-muted)]/50"
+autoComplete="off"
+/>
+<button
+onClick={() => void saveApiKey(api.env_key)}
+disabled={
+!api.env_key ||
+apiKeySaving === api.env_key ||
+!String(
+api.env_key ? apiKeyInputs[api.env_key] || '' : '',
+).trim()
+}
+className="h-8 px-3 border border-cyan-500/40 bg-cyan-950/20 text-cyan-300 hover:bg-cyan-500/15 disabled:opacity-40 disabled:cursor-not-allowed flex items-center gap-1.5 tracking-widest"
+>
+<Save size={12} />
+{apiKeySaving === api.env_key ? 'SAVING' : 'SAVE'}
+</button>
+</div>
+)}
</div>
)}
</div>
@@ -2416,7 +2460,7 @@ const SettingsPanel = React.memo(function SettingsPanel({
<div className="p-4 border-t border-[var(--border-primary)]/80">
<div className="flex items-center justify-between text-[13px] text-[var(--text-muted)] font-mono">
<span>{apis.length} REGISTERED APIs</span>
-<span>{apis.filter((a) => a.has_key).length} KEYS CONFIGURED</span>
+<span>{apis.filter((a) => a.has_key && a.is_set).length} KEYS CONFIGURED</span>
</div>
</div>
</>
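The footer fix above narrows the count from `has_key` alone to `has_key && is_set`, so sources that merely accept a key no longer inflate the "KEYS CONFIGURED" tally. A small sketch with hypothetical entries:

```typescript
// Hypothetical ApiEntry shape, reduced to the two flags the footer reads.
interface ApiEntry { has_key: boolean; is_set: boolean }

const apis: ApiEntry[] = [
  { has_key: true, is_set: true },   // accepts a key and has one saved
  { has_key: true, is_set: false },  // accepts a key, nothing saved yet
  { has_key: false, is_set: false }, // keyless source
];

// The old footer counted every key-capable source; the corrected
// filter counts only sources with a key actually saved.
console.log(apis.filter((a) => a.has_key).length);             // 2
console.log(apis.filter((a) => a.has_key && a.is_set).length); // 1
```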
@@ -4,7 +4,7 @@ import React, { useEffect, useState } from 'react';
import { motion, AnimatePresence } from 'framer-motion';
import { Database, Clock, X } from 'lucide-react';
-const CURRENT_VERSION = '0.9.7';
+const CURRENT_VERSION = '0.9.79';
const STORAGE_KEY = `shadowbroker_startup_warmup_notice_v${CURRENT_VERSION}`;
interface StartupWarmupModalProps {
+13 -9
@@ -33,8 +33,8 @@ import {
import { purgeBrowserContactGraph, purgeBrowserSigningMaterial, setSecureModeCached, getNodeIdentity, generateNodeKeys } from '@/mesh/meshIdentity';
import { purgeBrowserDmState } from '@/mesh/meshDmWorkerClient';
import {
-DEFAULT_INFONET_SEED_URL,
fetchInfonetNodeStatusSnapshot,
+startTorHiddenService,
type InfonetNodeStatusSnapshot,
} from '@/mesh/controlPlaneStatusClient';
import {
@@ -263,6 +263,10 @@ export default function TopRightControls({
await generateNodeKeys();
}
setActivatingPhase('peers');
const torStatus = await startTorHiddenService();
if (!torStatus?.running || !torStatus?.onion_address) {
throw new Error(torStatus?.detail || 'Tor onion service did not start');
}
}
const res = await controlPlaneFetch('/api/settings/node', {
@@ -833,9 +837,9 @@ export default function TopRightControls({
: activatingPhase === 'peers' ? 'text-cyan-300'
: 'text-green-300'
}>
-{activatingPhase === 'keys' ? 'Connecting to default seed...'
-: activatingPhase === 'peers' ? 'Connecting to default seed...'
-: 'Default seed connected'}
+{activatingPhase === 'keys' ? 'Preparing onion transport...'
+: activatingPhase === 'peers' ? 'Finding bootstrap peers...'
+: 'Bootstrap peers ready'}
</span>
</div>
{/* Step: Sync chain */}
@@ -899,7 +903,7 @@ export default function TopRightControls({
<div className="border border-cyan-500/20 bg-cyan-950/10 px-4 py-4 text-[10px] font-mono text-cyan-100 leading-[1.8]">
Do you want to activate a node on this install?
<div className="mt-2 text-[9px] text-cyan-200/70 normal-case tracking-normal">
-This turns on your local participant node and lets this install keep syncing the public Infonet chain from <span className="text-cyan-300">{DEFAULT_INFONET_SEED_URL}</span>.
+This turns on your local participant node and syncs Infonet only through available Wormhole onion/RNS peers. Clearnet bootstrap is disabled by default.
</div>
</div>
{(bootstrapFailed || nodeStatusError || nodeToggleError) && (
@@ -930,10 +934,10 @@ export default function TopRightControls({
<div className="text-cyan-300 tracking-[0.18em]">BY CONTINUING YOU AGREE:</div>
<ul className="mt-3 space-y-2 list-disc pl-5">
<li>This install can keep a local copy of the public Infonet chain.</li>
-<li>Fresh installs pull from the bundled default seed at {DEFAULT_INFONET_SEED_URL}.</li>
-<li>Participant-node sync is public-facing unless you separately use obfuscated-lane features.</li>
-<li>Your backend may sync with configured or bundled bootstrap peers in the background.</li>
-<li>Wormhole provides gates (transitional private lane) and Dead Drop / DM (stronger private lane) as separate postures.</li>
+<li>Fresh installs do not use a clearnet Infonet seed.</li>
+<li>Participant-node sync requires an onion/RNS peer through Wormhole.</li>
+<li>Your backend may sync with configured private bootstrap peers in the background.</li>
+<li>Wormhole keeps Infonet, gates, Dead Drop, and DM traffic on the obfuscated lane.</li>
</ul>
</div>
<div className="text-[11px] font-mono uppercase tracking-[0.2em] text-cyan-300/80">
@@ -2,7 +2,6 @@ import { useCallback, useEffect, useRef, useState } from 'react';
import { Send } from 'lucide-react';
import { API_BASE } from '@/lib/api';
import {
derivePublicMeshAddress,
getNodeIdentity,
hasSovereignty,
signEvent,
@@ -13,11 +12,28 @@ import { PROTOCOL_VERSION } from '@/mesh/meshProtocol';
import { validateEventPayload } from '@/mesh/meshSchema';
const MESH_NODE_ID_RE = /^![0-9a-f]{8}$/i;
const PUBLIC_MESH_ADDRESS_KEY = 'sb_public_meshtastic_address';
function isMeshtasticNodeId(value: string | undefined | null): boolean {
return !!value && MESH_NODE_ID_RE.test(value.trim());
}
function normalizePublicMeshAddress(value: string | undefined | null): string {
const raw = String(value || '').trim().toLowerCase();
const body = raw.startsWith('!') ? raw.slice(1) : raw;
if (!/^[0-9a-f]{8}$/.test(body)) return '';
return `!${body}`;
}
function readStoredPublicMeshAddress(): string {
if (typeof window === 'undefined') return '';
try {
return normalizePublicMeshAddress(window.localStorage.getItem(PUBLIC_MESH_ADDRESS_KEY));
} catch {
return '';
}
}
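The normalization helper added above can be exercised on its own; a standalone sketch of its accept/reject behavior (the function body is copied verbatim from the diff, the sample inputs are illustrative):

```typescript
// Mirror of normalizePublicMeshAddress above: trims, lowercases,
// strips an optional leading '!', and accepts exactly 8 hex chars.
function normalizePublicMeshAddress(value: string | undefined | null): string {
  const raw = String(value || '').trim().toLowerCase();
  const body = raw.startsWith('!') ? raw.slice(1) : raw;
  if (!/^[0-9a-f]{8}$/.test(body)) return '';
  return `!${body}`;
}

console.log(normalizePublicMeshAddress('!A1B2C3D4'));  // "!a1b2c3d4"
console.log(normalizePublicMeshAddress('a1b2c3d4'));   // "!a1b2c3d4"
console.log(normalizePublicMeshAddress('not-a-node')); // ""
```

Because invalid input collapses to `''`, callers like `readStoredPublicMeshAddress` can treat an empty string as "no usable address" without extra error handling.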
/** Inline send-message form for SIGINT popups — routes via MeshRouter */
export function SigintSendForm({
destination,
@@ -40,26 +56,11 @@ export function SigintSendForm({
const isDirectMesh = isMesh && isMeshtasticNodeId(destination);
useEffect(() => {
let cancelled = false;
-if (!isMesh) {
-setPublicMeshAddress('');
-return;
-}
-const identity = getNodeIdentity();
-if (!identity?.nodeId || !globalThis.crypto?.subtle) {
-setPublicMeshAddress('');
-return;
-}
-derivePublicMeshAddress(identity.nodeId)
-.then((addr) => {
-if (!cancelled) setPublicMeshAddress(addr);
-})
-.catch(() => {
-if (!cancelled) setPublicMeshAddress('');
-});
-return () => {
-cancelled = true;
-};
+setPublicMeshAddress(readStoredPublicMeshAddress());
}, [isMesh]);
const handleSend = async () => {
@@ -71,6 +72,56 @@ export function SigintSendForm({
}
setStatus('sending');
try {
if (isMesh) {
const meshSender = normalizePublicMeshAddress(publicMeshAddress || readStoredPublicMeshAddress());
if (!meshSender) {
setStatus('error');
setDetail('public mesh key required');
return;
}
const payload = {
message: msg.trim(),
destination: destination || 'broadcast',
channel: channel || 'LongFast',
priority: 'normal',
ephemeral: false,
transport_lock: 'meshtastic',
};
const v = validateEventPayload('message', payload);
if (!v.ok) {
setStatus('error');
setDetail(`invalid payload: ${v.reason}`);
return;
}
const res = await fetch(`${API_BASE}/api/mesh/meshtastic/send`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
destination: destination || 'broadcast',
message: msg.trim(),
sender_id: meshSender,
channel: channel || 'LongFast',
priority: 'normal',
ephemeral: false,
transport_lock: 'meshtastic',
mesh_region: region || 'US',
}),
});
const data = await res.json().catch(() => ({}));
if (res.ok && data.ok) {
setStatus('sent');
const routeDetail = Array.isArray(data.results) && data.results[0]?.reason
? String(data.results[0].reason)
: String(data.route_reason || 'MQTT broker accepted publish');
setDetail(routeDetail);
setMsg('');
} else {
setStatus('error');
setDetail(String(data.detail || data.route_reason || 'send failed'));
}
return;
}
const identity = getNodeIdentity();
if (!identity || !hasSovereignty()) {
setStatus('error');
@@ -234,22 +285,7 @@ export function MeshtasticChannelFeed({ region, channel }: { region: string; cha
const intervalRef = useRef<ReturnType<typeof setInterval> | null>(null);
useEffect(() => {
let cancelled = false;
-const identity = getNodeIdentity();
-if (!identity?.nodeId || !globalThis.crypto?.subtle) {
-setPublicMeshAddress('');
-return;
-}
-derivePublicMeshAddress(identity.nodeId)
-.then((addr) => {
-if (!cancelled) setPublicMeshAddress(addr);
-})
-.catch(() => {
-if (!cancelled) setPublicMeshAddress('');
-});
-return () => {
-cancelled = true;
-};
+setPublicMeshAddress(readStoredPublicMeshAddress());
}, []);
const fetchData = useCallback(async () => {
@@ -281,6 +317,10 @@ export function MeshtasticChannelFeed({ region, channel }: { region: string; cha
const regionData = channelStats?.roots?.[region] || channelStats?.regions?.[region];
const regionChannels = regionData?.channels || {};
const sortedChannels = Object.entries(regionChannels).sort((a, b) => b[1] - a[1]);
const channelMessages = messages.filter((m) => {
const target = String(m.to || 'broadcast').trim().toLowerCase();
return target === '' || target === 'broadcast' || target === '^all';
});
if (loading)
return <div className="text-[11px] text-cyan-400/50 animate-pulse mt-1">Loading...</div>;
@@ -317,13 +357,13 @@ export function MeshtasticChannelFeed({ region, channel }: { region: string; cha
)}
{/* Message feed */}
-{messages.length > 0 ? (
+{channelMessages.length > 0 ? (
<>
<div className="text-[11px] text-green-400/60 tracking-widest mb-1">
MESSAGES {channel} ({region})
</div>
<div className="max-h-[140px] overflow-y-auto space-y-0.5 scrollbar-thin">
-{messages.map((m: MeshtasticMessage, i: number) => {
+{channelMessages.map((m: MeshtasticMessage, i: number) => {
const directedToYou =
!!publicMeshAddress &&
typeof m.to === 'string' &&
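The `channelMessages` filter introduced for this feed keeps only broadcast-addressed traffic, so directed DMs no longer appear in the public channel list. A minimal sketch of that predicate (the logic is copied from the diff; the sample targets are illustrative):

```typescript
// Same predicate as the channelMessages filter above: a missing
// target defaults to broadcast, and only ''/'broadcast'/'^all' pass.
function isBroadcastTarget(to: unknown): boolean {
  const target = String(to || 'broadcast').trim().toLowerCase();
  return target === '' || target === 'broadcast' || target === '^all';
}

console.log(isBroadcastTarget(undefined));   // true  (defaults to broadcast)
console.log(isBroadcastTarget('^ALL'));      // true  (case-insensitive)
console.log(isBroadcastTarget('!a1b2c3d4')); // false (directed message, hidden)
```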
+5 -1
@@ -46,7 +46,11 @@ export async function controlPlaneJson<T>(
const res = await controlPlaneFetch(path, options);
const data = await res.json().catch(() => ({}));
if (!res.ok || data?.ok === false) {
-throw new Error(data?.detail || data?.message || 'control_plane_request_failed');
+const fallback =
+res.status === 429
+? 'control_plane_rate_limited'
+: `control_plane_request_failed:${res.status || 'unknown'}`;
+throw new Error(data?.detail || data?.message || fallback);
}
return data as T;
}
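The new fallback string can be isolated as a pure function; a sketch of the status-to-message mapping (grounded in the diff, the helper name is illustrative):

```typescript
// 429 gets a dedicated rate-limit code; other failures carry the
// HTTP status so callers can tell failure classes apart.
function controlPlaneFallback(status: number): string {
  return status === 429
    ? 'control_plane_rate_limited'
    : `control_plane_request_failed:${status || 'unknown'}`;
}

console.log(controlPlaneFallback(429)); // "control_plane_rate_limited"
console.log(controlPlaneFallback(503)); // "control_plane_request_failed:503"
console.log(controlPlaneFallback(0));   // "control_plane_request_failed:unknown"
```

Note the fallback only applies when the backend supplied no `detail` or `message` of its own.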
+16 -2
@@ -21,6 +21,7 @@ export interface InfonetBootstrapSnapshot {
sync_peer_count?: number;
push_peer_count?: number;
operator_peer_count?: number;
bootstrap_seed_peer_count?: number;
default_sync_peer_count?: number;
last_bootstrap_error?: string;
}
@@ -69,6 +70,7 @@ export interface InfonetNodeStatusSnapshot {
sync_runtime?: InfonetSyncRuntimeSnapshot;
push_runtime?: InfonetPushRuntimeSnapshot;
private_lane_tier?: string;
private_transport_required?: boolean;
}
export interface NodeSettingsSnapshot {
@@ -79,9 +81,14 @@ export interface NodeSettingsSnapshot {
node_enabled?: boolean;
}
-export const DEFAULT_INFONET_SEED_URL = 'https://node.shadowbroker.info';
export interface TorHiddenServiceSnapshot {
ok?: boolean;
running?: boolean;
onion_address?: string;
detail?: string;
}
-const CACHE_TTL_MS = 15000;
+const CACHE_TTL_MS = 5000;
type CacheEntry<T> = {
value: T;
@@ -220,3 +227,10 @@ export async function setInfonetNodeEnabled(enabled: boolean): Promise<NodeSetti
invalidateInfonetNodeStatusCache();
return result;
}
export async function startTorHiddenService(): Promise<TorHiddenServiceSnapshot> {
return controlPlaneJson<TorHiddenServiceSnapshot>('/api/settings/tor/start', {
method: 'POST',
requireAdminSession: false,
});
}
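Callers such as the node-activation flow treat the snapshot as ready only when both `running` and `onion_address` are present. A caller-side sketch of that gate (the interface is from the diff; the helper name and sample values are illustrative):

```typescript
// Shape returned by POST /api/settings/tor/start (from the diff).
interface TorHiddenServiceSnapshot {
  ok?: boolean;
  running?: boolean;
  onion_address?: string;
  detail?: string;
}

// Mirrors the activation check: anything short of running +
// onion_address is a hard failure, surfacing the backend detail.
function assertTorReady(status: TorHiddenServiceSnapshot | null): string {
  if (!status?.running || !status?.onion_address) {
    throw new Error(status?.detail || 'Tor onion service did not start');
  }
  return status.onion_address;
}

console.log(assertTorReady({ running: true, onion_address: 'abc123.onion' })); // "abc123.onion"
try {
  assertTorReady({ running: false, detail: 'tor binary missing' });
} catch (e) {
  console.log((e as Error).message); // "tor binary missing"
}
```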
+4 -1
@@ -212,10 +212,13 @@ export async function fetchWormholeSettings(
return inflight;
}
-export async function connectWormhole(): Promise<WormholeState> {
+export async function connectWormhole(
+options: { requireAdminSession?: boolean } = {},
+): Promise<WormholeState> {
resetWormholeCaches();
const res = await controlPlaneFetch('/api/wormhole/connect', {
method: 'POST',
requireAdminSession: options.requireAdminSession,
});
const state = await parseState(res);
wormholeStateCache = {
+103 -3
@@ -91,6 +91,9 @@ export interface WormholeDmInviteExport {
peer_id: string;
trust_fingerprint: string;
invite: WormholeDmInviteEnvelope;
prekey_publish_pending?: boolean;
prekey_registration?: Record<string, unknown>;
detail?: string;
}
export interface WormholeDmInviteImportResult {
@@ -102,6 +105,44 @@ export interface WormholeDmInviteImportResult {
contact: Record<string, unknown>;
}
export interface WormholeDmAddressRecord {
handle: string;
label: string;
issued_at: number;
expires_at: number;
max_uses: number;
use_count: number;
remaining_uses: number;
last_used_at: number;
expired: boolean;
exhausted: boolean;
revoked?: boolean;
}
export interface WormholeDmInviteHandlesResponse {
ok: boolean;
addresses: WormholeDmAddressRecord[];
detail?: string;
}
export interface WormholeDmInviteHandleRevokeResult {
ok: boolean;
handle: string;
revoked: boolean;
identity_removed?: boolean;
relay_removed?: boolean;
republished?: boolean;
detail?: string;
}
export interface WormholeDmInviteHandleUpdateResult {
ok: boolean;
handle: string;
label: string;
updated: boolean;
detail?: string;
}
export type WormholeDmInviteImportFailure = Partial<WormholeDmInviteImportResult> & {
ok?: false;
};
@@ -840,7 +881,7 @@ export async function prepareWormholeInteractiveLane(
let settings = await fetchWormholeSettings(true).catch(() => null);
if (!runtime?.ready) {
if (settings?.enabled || runtime?.configured) {
-runtime = await connectWormhole().catch((error) => {
+runtime = await connectWormhole({ requireAdminSession: false }).catch((error) => {
throw new Error(
normalizeWormholeInteractivePrepError(
error instanceof Error ? error.message : 'wormhole_connect_failed',
@@ -939,12 +980,70 @@ export async function fetchWormholeIdentity(): Promise<WormholeIdentity> {
return value;
}
-export async function exportWormholeDmInvite(): Promise<WormholeDmInviteExport> {
-return controlPlaneJson<WormholeDmInviteExport>('/api/wormhole/dm/invite', {
+export async function exportWormholeDmInvite(options: {
+label?: string;
+expiresInSeconds?: number;
+} = {}): Promise<WormholeDmInviteExport> {
+const params = new URLSearchParams();
+if (options.label?.trim()) {
+params.set('label', options.label.trim());
+}
+if (options.expiresInSeconds && options.expiresInSeconds > 0) {
+params.set('expires_in_s', String(Math.floor(options.expiresInSeconds)));
+}
+const suffix = params.toString() ? `?${params.toString()}` : '';
+return controlPlaneJson<WormholeDmInviteExport>(`/api/wormhole/dm/invite${suffix}`, {
requireAdminSession: false,
});
}
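The optional label and expiry options above become query parameters only when present, leaving the invite path bare otherwise. A sketch of just the suffix construction (the logic is copied from the diff; the helper name and sample values are illustrative):

```typescript
// Builds the '?label=...&expires_in_s=...' suffix for the invite
// export request; empty/zero options produce no suffix at all.
function inviteQuerySuffix(options: { label?: string; expiresInSeconds?: number } = {}): string {
  const params = new URLSearchParams();
  if (options.label?.trim()) params.set('label', options.label.trim());
  if (options.expiresInSeconds && options.expiresInSeconds > 0) {
    params.set('expires_in_s', String(Math.floor(options.expiresInSeconds)));
  }
  const qs = params.toString();
  return qs ? `?${qs}` : '';
}

console.log(inviteQuerySuffix({ label: 'field kit', expiresInSeconds: 3600 }));
// "?label=field+kit&expires_in_s=3600"
console.log(inviteQuerySuffix()); // ""
```

Using `URLSearchParams` keeps label text safely percent-encoded (spaces become `+`) without hand-rolled escaping.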
export async function listWormholeDmInviteHandles(): Promise<WormholeDmInviteHandlesResponse> {
return controlPlaneJson<WormholeDmInviteHandlesResponse>('/api/wormhole/dm/invite/handles', {
requireAdminSession: false,
});
}
export async function revokeWormholeDmInviteHandle(
handle: string,
): Promise<WormholeDmInviteHandleRevokeResult> {
const response = await controlPlaneFetch(
`/api/wormhole/dm/invite/handles/${encodeURIComponent(handle)}`,
{
method: 'DELETE',
requireAdminSession: false,
},
);
const data = (await response.json().catch(() => ({}))) as WormholeDmInviteHandleRevokeResult & {
message?: string;
};
if (!response.ok || data?.ok === false) {
throw new Error(String(data?.detail || data?.message || 'DM address revoke failed'));
}
return data;
}
export async function renameWormholeDmInviteHandle(
handle: string,
label: string,
): Promise<WormholeDmInviteHandleUpdateResult> {
const response = await controlPlaneFetch(
`/api/wormhole/dm/invite/handles/${encodeURIComponent(handle)}`,
{
method: 'PATCH',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ label }),
requireAdminSession: false,
},
);
const data = (await response.json().catch(() => ({}))) as WormholeDmInviteHandleUpdateResult & {
message?: string;
};
if (!response.ok || data?.ok === false) {
throw new Error(String(data?.detail || data?.message || 'DM address label update failed'));
}
return data;
}
export async function importWormholeDmInvite(
invite: Record<string, unknown>,
alias: string = '',
@@ -956,6 +1055,7 @@ export async function importWormholeDmInvite(
invite,
alias,
}),
requireAdminSession: false,
});
const data = (await response.json().catch(() => ({}))) as WormholeDmInviteImportResult & {
message?: string;
+14 -6
@@ -8,13 +8,16 @@
import { NextRequest, NextResponse } from 'next/server';
-function buildCsp(nonce: string): string {
+function buildCsp(nonce: string, strictScripts = false): string {
const isDev = process.env.NODE_ENV !== 'production';
+const scriptSrc = isDev
+? "script-src 'self' 'unsafe-inline' 'unsafe-eval' blob:"
+: strictScripts
+? `script-src 'self' 'nonce-${nonce}' blob:`
+: "script-src 'self' 'unsafe-inline' blob:";
const directives = [
"default-src 'self'",
-isDev
-? `script-src 'self' 'unsafe-inline' 'unsafe-eval' 'nonce-${nonce}' blob:`
-: `script-src 'self' 'nonce-${nonce}' blob:`,
+scriptSrc,
"style-src 'self' 'unsafe-inline' https://fonts.googleapis.com",
"img-src 'self' data: blob: https:",
isDev
@@ -35,7 +38,8 @@ function buildCsp(nonce: string): string {
export function middleware(request: NextRequest) {
const nonce = Buffer.from(crypto.randomUUID()).toString('base64');
-// Forward nonce to server components via request header.
+// Forward a nonce for staged CSP support. Strict script-src is opt-in until
+// every Next inline bootstrap script is verified with the nonce in production.
const requestHeaders = new Headers(request.headers);
requestHeaders.set('x-nonce', nonce);
@@ -43,7 +47,11 @@ export function middleware(request: NextRequest) {
request: { headers: requestHeaders },
});
-response.headers.set('Content-Security-Policy', buildCsp(nonce));
+const strictCsp = process.env.SHADOWBROKER_STRICT_CSP === '1';
+response.headers.set('Content-Security-Policy', buildCsp(nonce, strictCsp));
+if (!strictCsp && process.env.NODE_ENV === 'production') {
+response.headers.set('Content-Security-Policy-Report-Only', buildCsp(nonce, true));
+}
return response;
}
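This staged rollout enforces the legacy permissive policy while shipping the strict nonce-only policy as `Content-Security-Policy-Report-Only` in production, until `SHADOWBROKER_STRICT_CSP=1` flips enforcement. A sketch of the script-src selection (directive strings copied from the diff; the helper name is illustrative):

```typescript
// Dev keeps unsafe-eval for HMR; production defaults to
// unsafe-inline; strict mode emits the nonce-only directive.
function scriptSrcDirective(isDev: boolean, strictScripts: boolean, nonce: string): string {
  return isDev
    ? "script-src 'self' 'unsafe-inline' 'unsafe-eval' blob:"
    : strictScripts
      ? `script-src 'self' 'nonce-${nonce}' blob:`
      : "script-src 'self' 'unsafe-inline' blob:";
}

// Production default: the enforced header stays permissive...
console.log(scriptSrcDirective(false, false, 'abc123'));
// "script-src 'self' 'unsafe-inline' blob:"
// ...while the Report-Only header carries the strict nonce form.
console.log(scriptSrcDirective(false, true, 'abc123'));
// "script-src 'self' 'nonce-abc123' blob:"
```

Report-Only lets violations of the strict policy surface in browser consoles and reports without breaking pages that still rely on inline scripts.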
+1
@@ -114,6 +114,7 @@ export interface Ship {
source_url?: string;
last_osint_update?: string;
desc?: string;
trail?: Array<{ lat: number; lng: number; sog?: number; ts?: number } | number[]>;
// Tracked yacht enrichment
yacht_alert?: boolean;
yacht_owner?: string;
+2 -1
@@ -1,7 +1,8 @@
---
apiVersion: v2
name: shadowbroker
-version: 0.0.1
+version: 0.9.79
+appVersion: "0.9.79"
description: simple shadowbroker installation
type: application
+1 -1
@@ -1,6 +1,6 @@
[project]
name = "shadowbroker"
-version = "0.9.7"
+version = "0.9.79"
readme = "README.md"
requires-python = ">=3.10"
dependencies = []
Generated
+321 -125
@@ -74,8 +74,8 @@ wheels = [
[[package]]
name = "backend"
-version = "0.9.7"
-source = { virtual = "backend" }
+version = "0.9.79"
+source = { editable = "backend" }
dependencies = [
{ name = "apscheduler" },
{ name = "beautifulsoup4" },
@@ -93,6 +93,7 @@ dependencies = [
{ name = "pydantic" },
{ name = "pydantic-settings" },
{ name = "pynacl" },
{ name = "pysocks" },
{ name = "pystac-client" },
{ name = "python-dotenv" },
{ name = "requests" },
@@ -125,20 +126,21 @@ requires-dist = [
{ name = "meshtastic", specifier = ">=2.5.0" },
{ name = "orjson", specifier = ">=3.10.0" },
{ name = "paho-mqtt", specifier = ">=1.6.0,<2.0.0" },
-{ name = "playwright", specifier = "==1.50.0" },
+{ name = "playwright", specifier = "==1.59.0" },
{ name = "playwright-stealth", specifier = "==1.0.6" },
-{ name = "pydantic", specifier = "==2.11.1" },
+{ name = "pydantic", specifier = "==2.13.3" },
{ name = "pydantic-settings", specifier = "==2.8.1" },
{ name = "pynacl", specifier = ">=1.5.0" },
{ name = "pysocks", specifier = "==1.7.1" },
{ name = "pystac-client", specifier = "==0.8.6" },
{ name = "python-dotenv", specifier = "==1.2.2" },
{ name = "requests", specifier = "==2.31.0" },
{ name = "reverse-geocoder", specifier = "==1.5.1" },
-{ name = "sgp4", specifier = "==2.23" },
+{ name = "sgp4", specifier = "==2.25" },
{ name = "slowapi", specifier = "==0.1.9" },
{ name = "uvicorn", specifier = "==0.34.0" },
{ name = "vadersentiment", specifier = ">=3.3.0" },
-{ name = "yfinance", specifier = "==0.2.54" },
+{ name = "yfinance", specifier = "==1.3.0" },
]
[package.metadata.requires-dev]
@@ -532,6 +534,39 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/1a/89/843b53614b47f97fe1abc13f9a86efa5ec9e275292c457af1d4a60dc80e0/cryptography-46.0.6-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:6728c49e3b2c180ef26f8e9f0a883a2c585638db64cf265b49c9ba10652d430e", size = 3409955, upload-time = "2026-03-25T23:34:48.465Z" },
]
[[package]]
name = "curl-cffi"
version = "0.15.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
{ name = "cffi" },
{ name = "rich" },
]
sdist = { url = "https://files.pythonhosted.org/packages/48/5b/89fcfebd3e5e85134147ac99e9f2b2271165fd4d71984fc65da5f17819b7/curl_cffi-0.15.0.tar.gz", hash = "sha256:ea0c67652bf6893d34ee0f82c944f37e488f6147e9421bef1771cc6545b02ded", size = 196437, upload-time = "2026-04-03T11:12:31.525Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5e/42/54ddd442c795f30ce5dd4e49f87ce77505958d3777cd96a91567a3975d2a/curl_cffi-0.15.0-cp310-abi3-macosx_10_9_x86_64.whl", hash = "sha256:bda66404010e9ed743b1b83c20c86f24fe21a9a6873e17479d6e67e29d8ded28", size = 2795267, upload-time = "2026-04-03T11:11:46.48Z" },
{ url = "https://files.pythonhosted.org/packages/83/2d/3915e238579b3c5a92cead5c79130c3b8d20caaba7616cc4d894650e1d6b/curl_cffi-0.15.0-cp310-abi3-macosx_11_0_arm64.whl", hash = "sha256:a25620d9bf989c9c029a7d1642999c4c265abb0bad811deb2f77b0b5b2b12e5b", size = 2573544, upload-time = "2026-04-03T11:11:47.951Z" },
{ url = "https://files.pythonhosted.org/packages/2a/b3/9d2f1057749a1b07ba1989db3c1503ce8bed998310bae9aea2c43aa64f20/curl_cffi-0.15.0-cp310-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:582e570aa2586b96ed47cf4a17586b9a3c462cbe43f780487c3dc245c6ef1527", size = 10515369, upload-time = "2026-04-03T11:11:50.126Z" },
{ url = "https://files.pythonhosted.org/packages/b5/1d/6d10dded5ce3fd8157e558ebd97d09e551b77a62cdc1c31e93d0a633cee5/curl_cffi-0.15.0-cp310-abi3-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:838e48212447d9c81364b04707a5c861daf08f8320f9ecb3406a8919d1d5c3b3", size = 10160045, upload-time = "2026-04-03T11:11:52.664Z" },
{ url = "https://files.pythonhosted.org/packages/5c/12/c70b835487ace3b9ba1502631912e3440082b8ae3a162f60b59cb0b6444d/curl_cffi-0.15.0-cp310-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2b6c847d86283b07ae69bb72c82eb8a59242277142aa35b89850f89e792a02fc", size = 11090433, upload-time = "2026-04-03T11:11:55.049Z" },
{ url = "https://files.pythonhosted.org/packages/ea/0d/78edcc4f71934225db99df68197a107386d59080742fc7bf6bb4d007924f/curl_cffi-0.15.0-cp310-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:9e5e69eee735f659287e2c84444319d68a1fa68dd37abf228943a4074864283a", size = 10479178, upload-time = "2026-04-03T11:11:57.685Z" },
{ url = "https://files.pythonhosted.org/packages/5b/84/1e101c1acb1ea2f0b4992f5c3024f596d8e21db0d53540b9d583f673c4e7/curl_cffi-0.15.0-cp310-abi3-manylinux_2_34_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:aa1323950224db24f4c510d010b3affa02196ca853fb424191fa917a513d3f4b", size = 10317051, upload-time = "2026-04-03T11:12:00.295Z" },
{ url = "https://files.pythonhosted.org/packages/28/42/8ef236b22a6c23d096c85a1dc507efe37bfdfc7a2f8a4b34efb590197369/curl_cffi-0.15.0-cp310-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:41f80170ba844009273b2660da1964ec31e99e5719d16b3422ada87177e32e13", size = 11299660, upload-time = "2026-04-03T11:12:02.791Z" },
{ url = "https://files.pythonhosted.org/packages/1d/01/56aeb055d962da87a1be0d74c6c644e251c7e88129b5471dc44ac724e678/curl_cffi-0.15.0-cp310-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:1977e1e12cfb5c11352cbb74acef1bed24eb7d226dab61ca57c168c21acd4d61", size = 11945049, upload-time = "2026-04-03T11:12:05.912Z" },
{ url = "https://files.pythonhosted.org/packages/d8/8c/2abf99a38d6340d66cf0557e0c750ef3f8883dfc5d450087e01c85861343/curl_cffi-0.15.0-cp310-abi3-win_amd64.whl", hash = "sha256:5a0c1896a0d5a5ac1eb89cd24b008d2b718dd1df6fd2f75451b59ca66e49e572", size = 1661649, upload-time = "2026-04-03T11:12:07.948Z" },
{ url = "https://files.pythonhosted.org/packages/3d/39/dfd54f2240d3a9b96d77bacc62b97813b35e2aa8ecf5cd5013c683f1ba96/curl_cffi-0.15.0-cp310-abi3-win_arm64.whl", hash = "sha256:a6d57f8389273a3a1f94370473c74897467bcc36af0a17336989780c507fa43d", size = 1410741, upload-time = "2026-04-03T11:12:10.073Z" },
{ url = "https://files.pythonhosted.org/packages/19/6a/c24df8a4fc22fa84070dcd94abeba43c15e08cc09e35869565c0bad196fd/curl_cffi-0.15.0-cp313-abi3-android_24_arm64_v8a.whl", hash = "sha256:4682dc38d4336e0eb0b185374db90a760efde63cbea994b4e63f3521d44c4c92", size = 7190427, upload-time = "2026-04-03T11:12:12.142Z" },
{ url = "https://files.pythonhosted.org/packages/11/56/132225cb3491d07cc6adcce5fe395e059bde87c68cff1ef87a31c88c7819/curl_cffi-0.15.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:967ad7355bd8e9586f8c2d02eaa99953747549e7ea4a9b25cd53353e6b67fe6d", size = 2795723, upload-time = "2026-04-03T11:12:13.668Z" },
{ url = "https://files.pythonhosted.org/packages/07/8f/f4f83cd303bef7e8f1749512e5dd157e7e5d08b0a36c8211f9640a2757bf/curl_cffi-0.15.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7e63539d0d839d0a8c5eacf86229bc68c57803547f35e0db7ee0986328b478c3", size = 2573739, upload-time = "2026-04-03T11:12:15.08Z" },
{ url = "https://files.pythonhosted.org/packages/e8/5c/643d65c7fc9acd742876aa55c2d7823c438cb7665810acd2e66c9976c4d9/curl_cffi-0.15.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:08c799b89740b9bc49c09fbc3d5907f13ac1f845ca52620507ef9466d4639dd5", size = 10521046, upload-time = "2026-04-03T11:12:17.034Z" },
{ url = "https://files.pythonhosted.org/packages/7f/0b/9b8037113c93f4c5323096163471fa7c35c7676c3f608eeaf1287cd99d58/curl_cffi-0.15.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7b7a92767a888ee90147e18964b396d8435ff42737030d6fb00824ffd6094805", size = 11096115, upload-time = "2026-04-03T11:12:19.694Z" },
{ url = "https://files.pythonhosted.org/packages/5f/96/fff2fcbd924ef4042e0d67379f751a8a4e3186a91e75e35a4cf218b306ee/curl_cffi-0.15.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:829cc357061ecb99cc2d406301f609a039e05665322f5c025ec67c38b0dc49ce", size = 11305346, upload-time = "2026-04-03T11:12:22.151Z" },
{ url = "https://files.pythonhosted.org/packages/53/1b/304b253a45ab28691c8c5e8cca1e6cbb9cf8e46dfceae4648dd536f75e73/curl_cffi-0.15.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:408d6f14e346841cd889c2e0962832bb235ba3b6749ebf609f347f747da5e60f", size = 11949834, upload-time = "2026-04-03T11:12:24.986Z" },
{ url = "https://files.pythonhosted.org/packages/5a/ff/4723d92f08259c707a974aba27a08d0a822b9555e35ca581bf18d055a364/curl_cffi-0.15.0-cp314-cp314t-win_amd64.whl", hash = "sha256:b624c7ce087bfda967a013ed0a64702a525444e5b6e97d23534d567ccc6525aa", size = 1702771, upload-time = "2026-04-03T11:12:28.201Z" },
{ url = "https://files.pythonhosted.org/packages/59/8c/36bbe06d66fa2b765e4a07199f643a59a9cd1a754207a96335402a9520f4/curl_cffi-0.15.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0b6c0543b993996670e9e4b78e305a2d60809d5681903ffb5568e21a387434d3", size = 1466312, upload-time = "2026-04-03T11:12:30.054Z" },
]
[[package]]
name = "dbus-fast"
version = "4.0.0"
@@ -800,6 +835,27 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b9/98/cb5ca20618d205a09d5bec7591fbc4130369c7e6308d9a676a28ff3ab22c/limits-5.8.0-py3-none-any.whl", hash = "sha256:ae1b008a43eb43073c3c579398bd4eb4c795de60952532dc24720ab45e1ac6b8", size = 60954, upload-time = "2026-02-05T07:17:34.425Z" },
]
[[package]]
name = "markdown-it-py"
version = "4.1.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "mdurl" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5c/5c/f3aedc83549aae71cd52b9e9687fe896e3dc6e966ba20eba04718605d198/markdown_it_py-4.1.0.tar.gz", hash = "sha256:760e3f87b2787c044c5138a5ba107b7c2be26c03b13cc7f8fe42756b65b1df6c", size = 81613, upload-time = "2026-05-06T16:32:13.649Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a8/88/802c82060c54bc7dde21eb0033e337838b8181a1323254aa9ec41cbfc3d1/markdown_it_py-4.1.0-py3-none-any.whl", hash = "sha256:d4939a62a2dd0cd9cb80a191a711ba1d39bac8ed5ef9e9966895b0171c01c46d", size = 90955, upload-time = "2026-05-06T16:32:12.184Z" },
]
[[package]]
name = "mdurl"
version = "0.1.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
]
[[package]]
name = "meshtastic"
version = "2.7.8"
@@ -1243,20 +1299,21 @@ wheels = [
[[package]]
name = "playwright"
-version = "1.50.0"
+version = "1.59.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "greenlet" },
{ name = "pyee" },
]
wheels = [
{ url = "https://files.pythonhosted.org/packages/0d/5e/068dea3c96e9c09929b45c92cf7e573403b52a89aa463f89b9da9b87b7a4/playwright-1.50.0-py3-none-macosx_10_13_x86_64.whl", hash = "sha256:f36d754a6c5bd9bf7f14e8f57a2aea6fd08f39ca4c8476481b9c83e299531148", size = 40277564, upload-time = "2025-02-03T14:57:22.774Z" },
{ url = "https://files.pythonhosted.org/packages/78/85/b3deb3d2add00d2a6ee74bf6f57ccefb30efc400fd1b7b330ba9a3626330/playwright-1.50.0-py3-none-macosx_11_0_arm64.whl", hash = "sha256:40f274384591dfd27f2b014596250b2250c843ed1f7f4ef5d2960ecb91b4961e", size = 39521844, upload-time = "2025-02-03T14:57:29.372Z" },
{ url = "https://files.pythonhosted.org/packages/f3/f6/002b3d98df9c84296fea84f070dc0d87c2270b37f423cf076a913370d162/playwright-1.50.0-py3-none-macosx_11_0_universal2.whl", hash = "sha256:9922ef9bcd316995f01e220acffd2d37a463b4ad10fd73e388add03841dfa230", size = 40277563, upload-time = "2025-02-03T14:57:36.291Z" },
{ url = "https://files.pythonhosted.org/packages/b9/63/c9a73736e434df894e484278dddc0bf154312ff8d0f16d516edb790a7d42/playwright-1.50.0-py3-none-manylinux1_x86_64.whl", hash = "sha256:8fc628c492d12b13d1f347137b2ac6c04f98197ff0985ef0403a9a9ee0d39131", size = 45076712, upload-time = "2025-02-03T14:57:43.581Z" },
{ url = "https://files.pythonhosted.org/packages/bd/2c/a54b5a64cc7d1a62f2d944c5977fb3c88e74d76f5cdc7966e717426bce66/playwright-1.50.0-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcff35f72db2689a79007aee78f1b0621a22e6e3d6c1f58aaa9ac805bf4497c", size = 44493111, upload-time = "2025-02-03T14:57:50.226Z" },
{ url = "https://files.pythonhosted.org/packages/2b/4a/047cbb2ffe1249bd7a56441fc3366fb4a8a1f44bc36a9061d10edfda2c86/playwright-1.50.0-py3-none-win32.whl", hash = "sha256:3b906f4d351260016a8c5cc1e003bb341651ae682f62213b50168ed581c7558a", size = 34784543, upload-time = "2025-02-03T14:57:55.942Z" },
{ url = "https://files.pythonhosted.org/packages/bc/2b/e944e10c9b18e77e43d3bb4d6faa323f6cc27597db37b75bc3fd796adfd5/playwright-1.50.0-py3-none-win_amd64.whl", hash = "sha256:1859423da82de631704d5e3d88602d755462b0906824c1debe140979397d2e8d", size = 34784546, upload-time = "2025-02-03T14:58:01.664Z" },
{ url = "https://files.pythonhosted.org/packages/5b/48/abab23f40643b4de8f2665816f0a1bf0994eeecda39d6d62f0f292b2ad01/playwright-1.59.0-py3-none-macosx_10_13_x86_64.whl", hash = "sha256:bfc6940100b57423175c819ce2422ec5880d55fa2769987f62ab7a1f5fe6783e", size = 43156922, upload-time = "2026-04-29T08:11:08.921Z" },
{ url = "https://files.pythonhosted.org/packages/08/71/5e4d98b2ce3641b4343623c6450ff33b9de1c979d12a957505e392338b07/playwright-1.59.0-py3-none-macosx_11_0_arm64.whl", hash = "sha256:af068143a0c045ec11608b67d6c42e58db7e9cf65a742dd21fddedc1a9802c47", size = 41947177, upload-time = "2026-04-29T08:11:12.867Z" },
{ url = "https://files.pythonhosted.org/packages/80/91/fd219aa78ca03d37e93aaedaed4e224131e3090a9264f9bb773c8271d67e/playwright-1.59.0-py3-none-macosx_11_0_universal2.whl", hash = "sha256:4a4a2d4842b0e4120de3fa48636e4b69085a05b81d8a35ad4353f530ade72ed6", size = 43156922, upload-time = "2026-04-29T08:11:16.595Z" },
{ url = "https://files.pythonhosted.org/packages/73/0c/1e513d37c5be07d12829ebce93dbfe7baee230084cb66966c423432799c4/playwright-1.59.0-py3-none-manylinux1_x86_64.whl", hash = "sha256:c5792aad9e22b91a09264b9edbc18553cf05ea5a39404d65dc19a012c6b2e51d", size = 47151793, upload-time = "2026-04-29T08:11:19.979Z" },
{ url = "https://files.pythonhosted.org/packages/a3/2d/15f72288cb65d690134e18fefb9483cc4976f7579b580648c45e494481a7/playwright-1.59.0-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c881a19377d2b900af855fb525b5f22a27bf3cfbecba6d1edb36766d56cb100", size = 46877615, upload-time = "2026-04-29T08:11:23.863Z" },
{ url = "https://files.pythonhosted.org/packages/72/a1/717ac5bc99f387c0f60def91271ea4262125c0815d764a5d1776a272275c/playwright-1.59.0-py3-none-win32.whl", hash = "sha256:6989c476be2b9cd3e24a18cc9dcf202e266fb3d91e3e5395cd668c54ea54b119", size = 37713698, upload-time = "2026-04-29T08:11:27.251Z" },
{ url = "https://files.pythonhosted.org/packages/0f/a5/4e630ee05d8b46b840f943268e86d6063703e8dadb2d3eb405c7b9b2e48c/playwright-1.59.0-py3-none-win_amd64.whl", hash = "sha256:d5a5cc064b82ca92996080025710844e417f44df8fda9001102c28f44174171c", size = 37713704, upload-time = "2026-04-29T08:11:30.41Z" },
{ url = "https://files.pythonhosted.org/packages/eb/0c/3ece41761ba13c8321009aefcaec7a016eb42799c42eef5e03ace7f2de5b/playwright-1.59.0-py3-none-win_arm64.whl", hash = "sha256:93581ad515728cadc8af39b288a5633ba6d36e7d72048e79d890ce01ea2156f9", size = 33956745, upload-time = "2026-04-29T08:11:34.738Z" },
]
[[package]]
@@ -1306,7 +1363,7 @@ wheels = [
[[package]]
name = "pydantic"
-version = "2.11.1"
+version = "2.13.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "annotated-types" },
@@ -1314,96 +1371,125 @@ dependencies = [
{ name = "typing-extensions" },
{ name = "typing-inspection" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/93/a3/698b87a4d4d303d7c5f62ea5fbf7a79cab236ccfbd0a17847b7f77f8163e/pydantic-2.11.1.tar.gz", hash = "sha256:442557d2910e75c991c39f4b4ab18963d57b9b55122c8b2a9cd176d8c29ce968", size = 782817, upload-time = "2025-03-28T21:14:58.347Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/d9/e4/40d09941a2cebcb20609b86a559817d5b9291c49dd6f8c87e5feffbe703a/pydantic-2.13.3.tar.gz", hash = "sha256:af09e9d1d09f4e7fe37145c1f577e1d61ceb9a41924bf0094a36506285d0a84d", size = 844068, upload-time = "2026-04-20T14:46:43.632Z" }
wheels = [
-    { url = "https://files.pythonhosted.org/packages/cc/12/f9221a949f2419e2e23847303c002476c26fbcfd62dc7f3d25d0bec5ca99/pydantic-2.11.1-py3-none-any.whl", hash = "sha256:5b6c415eee9f8123a14d859be0c84363fec6b1feb6b688d6435801230b56e0b8", size = 442648, upload-time = "2025-03-28T21:14:55.856Z" },
+    { url = "https://files.pythonhosted.org/packages/f3/0a/fd7d723f8f8153418fb40cf9c940e82004fce7e987026b08a68a36dd3fe7/pydantic-2.13.3-py3-none-any.whl", hash = "sha256:6db14ac8dfc9a1e57f87ea2c0de670c251240f43cb0c30a5130e9720dc612927", size = 471981, upload-time = "2026-04-20T14:46:41.402Z" },
]
[[package]]
name = "pydantic-core"
-version = "2.33.0"
+version = "2.46.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/b9/05/91ce14dfd5a3a99555fce436318cc0fd1f08c4daa32b3248ad63669ea8b4/pydantic_core-2.33.0.tar.gz", hash = "sha256:40eb8af662ba409c3cbf4a8150ad32ae73514cd7cb1f1a2113af39763dd616b3", size = 434080, upload-time = "2025-03-26T20:30:05.906Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/2a/ef/f7abb56c49382a246fd2ce9c799691e3c3e7175ec74b14d99e798bcddb1a/pydantic_core-2.46.3.tar.gz", hash = "sha256:41c178f65b8c29807239d47e6050262eb6bf84eb695e41101e62e38df4a5bc2c", size = 471412, upload-time = "2026-04-20T14:40:56.672Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/29/43/0649ad07e66b36a3fb21442b425bd0348ac162c5e686b36471f363201535/pydantic_core-2.33.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71dffba8fe9ddff628c68f3abd845e91b028361d43c5f8e7b3f8b91d7d85413e", size = 2042968, upload-time = "2025-03-26T20:26:38.341Z" },
{ url = "https://files.pythonhosted.org/packages/a0/a6/975fea4774a459e495cb4be288efd8b041ac756a0a763f0b976d0861334b/pydantic_core-2.33.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:abaeec1be6ed535a5d7ffc2e6c390083c425832b20efd621562fbb5bff6dc518", size = 1860347, upload-time = "2025-03-26T20:26:41.311Z" },
{ url = "https://files.pythonhosted.org/packages/aa/49/7858dadad305101a077ec4d0c606b6425a2b134ea8d858458a6d287fd871/pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:759871f00e26ad3709efc773ac37b4d571de065f9dfb1778012908bcc36b3a73", size = 1910060, upload-time = "2025-03-26T20:26:43.095Z" },
{ url = "https://files.pythonhosted.org/packages/8d/4f/6522527911d9c5fe6d76b084d8b388d5c84b09d113247b39f91937500b34/pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dcfebee69cd5e1c0b76a17e17e347c84b00acebb8dd8edb22d4a03e88e82a207", size = 1997129, upload-time = "2025-03-26T20:26:44.523Z" },
{ url = "https://files.pythonhosted.org/packages/75/d0/06f396da053e3d73001ea4787e56b4d7132a87c0b5e2e15a041e808c35cd/pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1b1262b912435a501fa04cd213720609e2cefa723a07c92017d18693e69bf00b", size = 2140389, upload-time = "2025-03-26T20:26:46.37Z" },
{ url = "https://files.pythonhosted.org/packages/f5/6b/b9ff5b69cd4ef007cf665463f3be2e481dc7eb26c4a55b2f57a94308c31a/pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4726f1f3f42d6a25678c67da3f0b10f148f5655813c5aca54b0d1742ba821b8f", size = 2754237, upload-time = "2025-03-26T20:26:48.729Z" },
{ url = "https://files.pythonhosted.org/packages/53/80/b4879de375cdf3718d05fcb60c9aa1f119d28e261dafa51b6a69c78f7178/pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e790954b5093dff1e3a9a2523fddc4e79722d6f07993b4cd5547825c3cbf97b5", size = 2007433, upload-time = "2025-03-26T20:26:50.642Z" },
{ url = "https://files.pythonhosted.org/packages/46/24/54054713dc0af98a94eab37e0f4294dfd5cd8f70b2ca9dcdccd15709fd7e/pydantic_core-2.33.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:34e7fb3abe375b5c4e64fab75733d605dda0f59827752debc99c17cb2d5f3276", size = 2123980, upload-time = "2025-03-26T20:26:52.296Z" },
{ url = "https://files.pythonhosted.org/packages/3a/4c/257c1cb89e14cfa6e95ebcb91b308eb1dd2b348340ff76a6e6fcfa9969e1/pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:ecb158fb9b9091b515213bed3061eb7deb1d3b4e02327c27a0ea714ff46b0760", size = 2087433, upload-time = "2025-03-26T20:26:54.22Z" },
{ url = "https://files.pythonhosted.org/packages/0c/62/927df8a39ad78ef7b82c5446e01dec9bb0043e1ad71d8f426062f5f014db/pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:4d9149e7528af8bbd76cc055967e6e04617dcb2a2afdaa3dea899406c5521faa", size = 2260242, upload-time = "2025-03-26T20:26:55.749Z" },
{ url = "https://files.pythonhosted.org/packages/74/f2/389414f7c77a100954e84d6f52a82bd1788ae69db72364376d8a73b38765/pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:e81a295adccf73477220e15ff79235ca9dcbcee4be459eb9d4ce9a2763b8386c", size = 2258227, upload-time = "2025-03-26T20:26:58.479Z" },
{ url = "https://files.pythonhosted.org/packages/53/99/94516313e15d906a1264bb40faf24a01a4af4e2ca8a7c10dd173b6513c5a/pydantic_core-2.33.0-cp310-cp310-win32.whl", hash = "sha256:f22dab23cdbce2005f26a8f0c71698457861f97fc6318c75814a50c75e87d025", size = 1925523, upload-time = "2025-03-26T20:26:59.838Z" },
{ url = "https://files.pythonhosted.org/packages/7d/67/cc789611c6035a0b71305a1ec6ba196256ced76eba8375f316f840a70456/pydantic_core-2.33.0-cp310-cp310-win_amd64.whl", hash = "sha256:9cb2390355ba084c1ad49485d18449b4242da344dea3e0fe10babd1f0db7dcfc", size = 1951872, upload-time = "2025-03-26T20:27:01.316Z" },
{ url = "https://files.pythonhosted.org/packages/f0/93/9e97af2619b4026596487a79133e425c7d3c374f0a7f100f3d76bcdf9c83/pydantic_core-2.33.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:a608a75846804271cf9c83e40bbb4dab2ac614d33c6fd5b0c6187f53f5c593ef", size = 2042784, upload-time = "2025-03-26T20:27:02.809Z" },
{ url = "https://files.pythonhosted.org/packages/42/b4/0bba8412fd242729feeb80e7152e24f0e1a1c19f4121ca3d4a307f4e6222/pydantic_core-2.33.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e1c69aa459f5609dec2fa0652d495353accf3eda5bdb18782bc5a2ae45c9273a", size = 1858179, upload-time = "2025-03-26T20:27:04.747Z" },
{ url = "https://files.pythonhosted.org/packages/69/1f/c1c40305d929bd08af863df64b0a26203b70b352a1962d86f3bcd52950fe/pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b9ec80eb5a5f45a2211793f1c4aeddff0c3761d1c70d684965c1807e923a588b", size = 1909396, upload-time = "2025-03-26T20:27:06.258Z" },
{ url = "https://files.pythonhosted.org/packages/0f/99/d2e727375c329c1e652b5d450fbb9d56e8c3933a397e4bd46e67c68c2cd5/pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e925819a98318d17251776bd3d6aa9f3ff77b965762155bdad15d1a9265c4cfd", size = 1998264, upload-time = "2025-03-26T20:27:08.439Z" },
{ url = "https://files.pythonhosted.org/packages/9c/2e/3119a33931278d96ecc2e9e1b9d50c240636cfeb0c49951746ae34e4de74/pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5bf68bb859799e9cec3d9dd8323c40c00a254aabb56fe08f907e437005932f2b", size = 2140588, upload-time = "2025-03-26T20:27:09.949Z" },
{ url = "https://files.pythonhosted.org/packages/35/bd/9267bd1ba55f17c80ef6cb7e07b3890b4acbe8eb6014f3102092d53d9300/pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1b2ea72dea0825949a045fa4071f6d5b3d7620d2a208335207793cf29c5a182d", size = 2746296, upload-time = "2025-03-26T20:27:11.824Z" },
{ url = "https://files.pythonhosted.org/packages/6f/ed/ef37de6478a412ee627cbebd73e7b72a680f45bfacce9ff1199de6e17e88/pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1583539533160186ac546b49f5cde9ffc928062c96920f58bd95de32ffd7bffd", size = 2005555, upload-time = "2025-03-26T20:27:13.872Z" },
{ url = "https://files.pythonhosted.org/packages/dd/84/72c8d1439585d8ee7bc35eb8f88a04a4d302ee4018871f1f85ae1b0c6625/pydantic_core-2.33.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:23c3e77bf8a7317612e5c26a3b084c7edeb9552d645742a54a5867635b4f2453", size = 2124452, upload-time = "2025-03-26T20:27:15.402Z" },
{ url = "https://files.pythonhosted.org/packages/a7/8f/cb13de30c6a3e303423751a529a3d1271c2effee4b98cf3e397a66ae8498/pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a7a7f2a3f628d2f7ef11cb6188bcf0b9e1558151d511b974dfea10a49afe192b", size = 2087001, upload-time = "2025-03-26T20:27:17.014Z" },
{ url = "https://files.pythonhosted.org/packages/83/d0/e93dc8884bf288a63fedeb8040ac8f29cb71ca52e755f48e5170bb63e55b/pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:f1fb026c575e16f673c61c7b86144517705865173f3d0907040ac30c4f9f5915", size = 2261663, upload-time = "2025-03-26T20:27:18.819Z" },
{ url = "https://files.pythonhosted.org/packages/4c/ba/4b7739c95efa0b542ee45fd872c8f6b1884ab808cf04ce7ac6621b6df76e/pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:635702b2fed997e0ac256b2cfbdb4dd0bf7c56b5d8fba8ef03489c03b3eb40e2", size = 2257786, upload-time = "2025-03-26T20:27:20.752Z" },
{ url = "https://files.pythonhosted.org/packages/cc/98/73cbca1d2360c27752cfa2fcdcf14d96230e92d7d48ecd50499865c56bf7/pydantic_core-2.33.0-cp311-cp311-win32.whl", hash = "sha256:07b4ced28fccae3f00626eaa0c4001aa9ec140a29501770a88dbbb0966019a86", size = 1925697, upload-time = "2025-03-26T20:27:22.688Z" },
{ url = "https://files.pythonhosted.org/packages/9a/26/d85a40edeca5d8830ffc33667d6fef329fd0f4bc0c5181b8b0e206cfe488/pydantic_core-2.33.0-cp311-cp311-win_amd64.whl", hash = "sha256:4927564be53239a87770a5f86bdc272b8d1fbb87ab7783ad70255b4ab01aa25b", size = 1949859, upload-time = "2025-03-26T20:27:24.371Z" },
{ url = "https://files.pythonhosted.org/packages/7e/0b/5a381605f0b9870465b805f2c86c06b0a7c191668ebe4117777306c2c1e5/pydantic_core-2.33.0-cp311-cp311-win_arm64.whl", hash = "sha256:69297418ad644d521ea3e1aa2e14a2a422726167e9ad22b89e8f1130d68e1e9a", size = 1907978, upload-time = "2025-03-26T20:27:25.964Z" },
{ url = "https://files.pythonhosted.org/packages/a9/c4/c9381323cbdc1bb26d352bc184422ce77c4bc2f2312b782761093a59fafc/pydantic_core-2.33.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:6c32a40712e3662bebe524abe8abb757f2fa2000028d64cc5a1006016c06af43", size = 2025127, upload-time = "2025-03-26T20:27:27.704Z" },
{ url = "https://files.pythonhosted.org/packages/6f/bd/af35278080716ecab8f57e84515c7dc535ed95d1c7f52c1c6f7b313a9dab/pydantic_core-2.33.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8ec86b5baa36f0a0bfb37db86c7d52652f8e8aa076ab745ef7725784183c3fdd", size = 1851687, upload-time = "2025-03-26T20:27:29.67Z" },
{ url = "https://files.pythonhosted.org/packages/12/e4/a01461225809c3533c23bd1916b1e8c2e21727f0fea60ab1acbffc4e2fca/pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4deac83a8cc1d09e40683be0bc6d1fa4cde8df0a9bf0cda5693f9b0569ac01b6", size = 1892232, upload-time = "2025-03-26T20:27:31.374Z" },
{ url = "https://files.pythonhosted.org/packages/51/17/3d53d62a328fb0a49911c2962036b9e7a4f781b7d15e9093c26299e5f76d/pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:175ab598fb457a9aee63206a1993874badf3ed9a456e0654273e56f00747bbd6", size = 1977896, upload-time = "2025-03-26T20:27:33.055Z" },
{ url = "https://files.pythonhosted.org/packages/30/98/01f9d86e02ec4a38f4b02086acf067f2c776b845d43f901bd1ee1c21bc4b/pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5f36afd0d56a6c42cf4e8465b6441cf546ed69d3a4ec92724cc9c8c61bd6ecf4", size = 2127717, upload-time = "2025-03-26T20:27:34.768Z" },
{ url = "https://files.pythonhosted.org/packages/3c/43/6f381575c61b7c58b0fd0b92134c5a1897deea4cdfc3d47567b3ff460a4e/pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0a98257451164666afafc7cbf5fb00d613e33f7e7ebb322fbcd99345695a9a61", size = 2680287, upload-time = "2025-03-26T20:27:36.826Z" },
{ url = "https://files.pythonhosted.org/packages/01/42/c0d10d1451d161a9a0da9bbef023b8005aa26e9993a8cc24dc9e3aa96c93/pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ecc6d02d69b54a2eb83ebcc6f29df04957f734bcf309d346b4f83354d8376862", size = 2008276, upload-time = "2025-03-26T20:27:38.609Z" },
{ url = "https://files.pythonhosted.org/packages/20/ca/e08df9dba546905c70bae44ced9f3bea25432e34448d95618d41968f40b7/pydantic_core-2.33.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1a69b7596c6603afd049ce7f3835bcf57dd3892fc7279f0ddf987bebed8caa5a", size = 2115305, upload-time = "2025-03-26T20:27:41.717Z" },
{ url = "https://files.pythonhosted.org/packages/03/1f/9b01d990730a98833113581a78e595fd40ed4c20f9693f5a658fb5f91eff/pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ea30239c148b6ef41364c6f51d103c2988965b643d62e10b233b5efdca8c0099", size = 2068999, upload-time = "2025-03-26T20:27:43.42Z" },
{ url = "https://files.pythonhosted.org/packages/20/18/fe752476a709191148e8b1e1139147841ea5d2b22adcde6ee6abb6c8e7cf/pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:abfa44cf2f7f7d7a199be6c6ec141c9024063205545aa09304349781b9a125e6", size = 2241488, upload-time = "2025-03-26T20:27:46.744Z" },
{ url = "https://files.pythonhosted.org/packages/81/22/14738ad0a0bf484b928c9e52004f5e0b81dd8dabbdf23b843717b37a71d1/pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:20d4275f3c4659d92048c70797e5fdc396c6e4446caf517ba5cad2db60cd39d3", size = 2248430, upload-time = "2025-03-26T20:27:48.458Z" },
{ url = "https://files.pythonhosted.org/packages/e8/27/be7571e215ac8d321712f2433c445b03dbcd645366a18f67b334df8912bc/pydantic_core-2.33.0-cp312-cp312-win32.whl", hash = "sha256:918f2013d7eadea1d88d1a35fd4a1e16aaf90343eb446f91cb091ce7f9b431a2", size = 1908353, upload-time = "2025-03-26T20:27:50.488Z" },
{ url = "https://files.pythonhosted.org/packages/be/3a/be78f28732f93128bd0e3944bdd4b3970b389a1fbd44907c97291c8dcdec/pydantic_core-2.33.0-cp312-cp312-win_amd64.whl", hash = "sha256:aec79acc183865bad120b0190afac467c20b15289050648b876b07777e67ea48", size = 1955956, upload-time = "2025-03-26T20:27:52.239Z" },
{ url = "https://files.pythonhosted.org/packages/21/26/b8911ac74faa994694b76ee6a22875cc7a4abea3c381fdba4edc6c6bef84/pydantic_core-2.33.0-cp312-cp312-win_arm64.whl", hash = "sha256:5461934e895968655225dfa8b3be79e7e927e95d4bd6c2d40edd2fa7052e71b6", size = 1903259, upload-time = "2025-03-26T20:27:54.06Z" },
{ url = "https://files.pythonhosted.org/packages/79/20/de2ad03ce8f5b3accf2196ea9b44f31b0cd16ac6e8cfc6b21976ed45ec35/pydantic_core-2.33.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f00e8b59e1fc8f09d05594aa7d2b726f1b277ca6155fc84c0396db1b373c4555", size = 2032214, upload-time = "2025-03-26T20:27:56.197Z" },
{ url = "https://files.pythonhosted.org/packages/f9/af/6817dfda9aac4958d8b516cbb94af507eb171c997ea66453d4d162ae8948/pydantic_core-2.33.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1a73be93ecef45786d7d95b0c5e9b294faf35629d03d5b145b09b81258c7cd6d", size = 1852338, upload-time = "2025-03-26T20:27:57.876Z" },
{ url = "https://files.pythonhosted.org/packages/44/f3/49193a312d9c49314f2b953fb55740b7c530710977cabe7183b8ef111b7f/pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ff48a55be9da6930254565ff5238d71d5e9cd8c5487a191cb85df3bdb8c77365", size = 1896913, upload-time = "2025-03-26T20:27:59.719Z" },
{ url = "https://files.pythonhosted.org/packages/06/e0/c746677825b2e29a2fa02122a8991c83cdd5b4c5f638f0664d4e35edd4b2/pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:26a4ea04195638dcd8c53dadb545d70badba51735b1594810e9768c2c0b4a5da", size = 1986046, upload-time = "2025-03-26T20:28:01.583Z" },
{ url = "https://files.pythonhosted.org/packages/11/ec/44914e7ff78cef16afb5e5273d480c136725acd73d894affdbe2a1bbaad5/pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:41d698dcbe12b60661f0632b543dbb119e6ba088103b364ff65e951610cb7ce0", size = 2128097, upload-time = "2025-03-26T20:28:03.437Z" },
{ url = "https://files.pythonhosted.org/packages/fe/f5/c6247d424d01f605ed2e3802f338691cae17137cee6484dce9f1ac0b872b/pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ae62032ef513fe6281ef0009e30838a01057b832dc265da32c10469622613885", size = 2681062, upload-time = "2025-03-26T20:28:05.498Z" },
{ url = "https://files.pythonhosted.org/packages/f0/85/114a2113b126fdd7cf9a9443b1b1fe1b572e5bd259d50ba9d5d3e1927fa9/pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f225f3a3995dbbc26affc191d0443c6c4aa71b83358fd4c2b7d63e2f6f0336f9", size = 2007487, upload-time = "2025-03-26T20:28:07.879Z" },
{ url = "https://files.pythonhosted.org/packages/e6/40/3c05ed28d225c7a9acd2b34c5c8010c279683a870219b97e9f164a5a8af0/pydantic_core-2.33.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5bdd36b362f419c78d09630cbaebc64913f66f62bda6d42d5fbb08da8cc4f181", size = 2121382, upload-time = "2025-03-26T20:28:09.651Z" },
{ url = "https://files.pythonhosted.org/packages/8a/22/e70c086f41eebd323e6baa92cc906c3f38ddce7486007eb2bdb3b11c8f64/pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:2a0147c0bef783fd9abc9f016d66edb6cac466dc54a17ec5f5ada08ff65caf5d", size = 2072473, upload-time = "2025-03-26T20:28:11.69Z" },
{ url = "https://files.pythonhosted.org/packages/3e/84/d1614dedd8fe5114f6a0e348bcd1535f97d76c038d6102f271433cd1361d/pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:c860773a0f205926172c6644c394e02c25421dc9a456deff16f64c0e299487d3", size = 2249468, upload-time = "2025-03-26T20:28:13.651Z" },
{ url = "https://files.pythonhosted.org/packages/b0/c0/787061eef44135e00fddb4b56b387a06c303bfd3884a6df9bea5cb730230/pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:138d31e3f90087f42aa6286fb640f3c7a8eb7bdae829418265e7e7474bd2574b", size = 2254716, upload-time = "2025-03-26T20:28:16.105Z" },
{ url = "https://files.pythonhosted.org/packages/ae/e2/27262eb04963201e89f9c280f1e10c493a7a37bc877e023f31aa72d2f911/pydantic_core-2.33.0-cp313-cp313-win32.whl", hash = "sha256:d20cbb9d3e95114325780f3cfe990f3ecae24de7a2d75f978783878cce2ad585", size = 1916450, upload-time = "2025-03-26T20:28:18.252Z" },
{ url = "https://files.pythonhosted.org/packages/13/8d/25ff96f1e89b19e0b70b3cd607c9ea7ca27e1dcb810a9cd4255ed6abf869/pydantic_core-2.33.0-cp313-cp313-win_amd64.whl", hash = "sha256:ca1103d70306489e3d006b0f79db8ca5dd3c977f6f13b2c59ff745249431a606", size = 1956092, upload-time = "2025-03-26T20:28:20.129Z" },
{ url = "https://files.pythonhosted.org/packages/1b/64/66a2efeff657b04323ffcd7b898cb0354d36dae3a561049e092134a83e9c/pydantic_core-2.33.0-cp313-cp313-win_arm64.whl", hash = "sha256:6291797cad239285275558e0a27872da735b05c75d5237bbade8736f80e4c225", size = 1908367, upload-time = "2025-03-26T20:28:22.498Z" },
{ url = "https://files.pythonhosted.org/packages/52/54/295e38769133363d7ec4a5863a4d579f331728c71a6644ff1024ee529315/pydantic_core-2.33.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:7b79af799630af263eca9ec87db519426d8c9b3be35016eddad1832bac812d87", size = 1813331, upload-time = "2025-03-26T20:28:25.004Z" },
{ url = "https://files.pythonhosted.org/packages/4c/9c/0c8ea02db8d682aa1ef48938abae833c1d69bdfa6e5ec13b21734b01ae70/pydantic_core-2.33.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eabf946a4739b5237f4f56d77fa6668263bc466d06a8036c055587c130a46f7b", size = 1986653, upload-time = "2025-03-26T20:28:27.02Z" },
{ url = "https://files.pythonhosted.org/packages/8e/4f/3fb47d6cbc08c7e00f92300e64ba655428c05c56b8ab6723bd290bae6458/pydantic_core-2.33.0-cp313-cp313t-win_amd64.whl", hash = "sha256:8a1d581e8cdbb857b0e0e81df98603376c1a5c34dc5e54039dcc00f043df81e7", size = 1931234, upload-time = "2025-03-26T20:28:29.237Z" },
{ url = "https://files.pythonhosted.org/packages/44/77/85e173b715e1a277ce934f28d877d82492df13e564fa68a01c96f36a47ad/pydantic_core-2.33.0-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:e2762c568596332fdab56b07060c8ab8362c56cf2a339ee54e491cd503612c50", size = 2040129, upload-time = "2025-03-26T20:28:58.992Z" },
{ url = "https://files.pythonhosted.org/packages/33/e7/33da5f8a94bbe2191cfcd15bd6d16ecd113e67da1b8c78d3cc3478112dab/pydantic_core-2.33.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:5bf637300ff35d4f59c006fff201c510b2b5e745b07125458a5389af3c0dff8c", size = 1872656, upload-time = "2025-03-26T20:29:01.255Z" },
{ url = "https://files.pythonhosted.org/packages/b4/7a/9600f222bea840e5b9ba1f17c0acc79b669b24542a78c42c6a10712c0aae/pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:62c151ce3d59ed56ebd7ce9ce5986a409a85db697d25fc232f8e81f195aa39a1", size = 1903731, upload-time = "2025-03-26T20:29:03.78Z" },
{ url = "https://files.pythonhosted.org/packages/81/d2/94c7ca4e24c5dcfb74df92e0836c189e9eb6814cf62d2f26a75ea0a906db/pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9ee65f0cc652261744fd07f2c6e6901c914aa6c5ff4dcfaf1136bc394d0dd26b", size = 2083966, upload-time = "2025-03-26T20:29:06.157Z" },
{ url = "https://files.pythonhosted.org/packages/b8/74/a0259989d220e8865ed6866a6d40539e40fa8f507e587e35d2414cc081f8/pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:024d136ae44d233e6322027bbf356712b3940bee816e6c948ce4b90f18471b3d", size = 2118951, upload-time = "2025-03-26T20:29:08.241Z" },
{ url = "https://files.pythonhosted.org/packages/13/4c/87405ed04d6d07597920b657f082a8e8e58bf3034178bb9044b4d57a91e2/pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:e37f10f6d4bc67c58fbd727108ae1d8b92b397355e68519f1e4a7babb1473442", size = 2079632, upload-time = "2025-03-26T20:29:10.394Z" },
{ url = "https://files.pythonhosted.org/packages/5a/4c/bcb02970ef91d4cd6de7c6893101302637da456bc8b52c18ea0d047b55ce/pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:502ed542e0d958bd12e7c3e9a015bce57deaf50eaa8c2e1c439b512cb9db1e3a", size = 2250541, upload-time = "2025-03-26T20:29:12.898Z" },
{ url = "https://files.pythonhosted.org/packages/a3/2b/dbe5450c4cd904be5da736dcc7f2357b828199e29e38de19fc81f988b288/pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:715c62af74c236bf386825c0fdfa08d092ab0f191eb5b4580d11c3189af9d330", size = 2255685, upload-time = "2025-03-26T20:29:15.023Z" },
{ url = "https://files.pythonhosted.org/packages/ca/a6/ca1d35f695d81f639c5617fc9efb44caad21a9463383fa45364b3044175a/pydantic_core-2.33.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:bccc06fa0372151f37f6b69834181aa9eb57cf8665ed36405fb45fbf6cac3bae", size = 2082395, upload-time = "2025-03-26T20:29:17.113Z" },
{ url = "https://files.pythonhosted.org/packages/2b/b2/553e42762e7b08771fca41c0230c1ac276f9e79e78f57628e1b7d328551d/pydantic_core-2.33.0-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5d8dc9f63a26f7259b57f46a7aab5af86b2ad6fbe48487500bb1f4b27e051e4c", size = 2041207, upload-time = "2025-03-26T20:29:20.111Z" },
{ url = "https://files.pythonhosted.org/packages/85/81/a91a57bbf3efe53525ab75f65944b8950e6ef84fe3b9a26c1ec173363263/pydantic_core-2.33.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:30369e54d6d0113d2aa5aee7a90d17f225c13d87902ace8fcd7bbf99b19124db", size = 1873736, upload-time = "2025-03-26T20:29:22.811Z" },
{ url = "https://files.pythonhosted.org/packages/9c/d2/5ab52e9f551cdcbc1ee99a0b3ef595f56d031f66f88e5ca6726c49f9ce65/pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3eb479354c62067afa62f53bb387827bee2f75c9c79ef25eef6ab84d4b1ae3b", size = 1903794, upload-time = "2025-03-26T20:29:25.369Z" },
{ url = "https://files.pythonhosted.org/packages/2f/5f/a81742d3f3821b16f1265f057d6e0b68a3ab13a814fe4bffac536a1f26fd/pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0310524c833d91403c960b8a3cf9f46c282eadd6afd276c8c5edc617bd705dc9", size = 2083457, upload-time = "2025-03-26T20:29:27.551Z" },
{ url = "https://files.pythonhosted.org/packages/b5/2f/e872005bc0fc47f9c036b67b12349a8522d32e3bda928e82d676e2a594d1/pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:eddb18a00bbb855325db27b4c2a89a4ba491cd6a0bd6d852b225172a1f54b36c", size = 2119537, upload-time = "2025-03-26T20:29:29.763Z" },
{ url = "https://files.pythonhosted.org/packages/d3/13/183f13ce647202eaf3dada9e42cdfc59cbb95faedd44d25f22b931115c7f/pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:ade5dbcf8d9ef8f4b28e682d0b29f3008df9842bb5ac48ac2c17bc55771cc976", size = 2080069, upload-time = "2025-03-26T20:29:32.31Z" },
{ url = "https://files.pythonhosted.org/packages/23/8b/b6be91243da44a26558d9c3a9007043b3750334136c6550551e8092d6d96/pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:2c0afd34f928383e3fd25740f2050dbac9d077e7ba5adbaa2227f4d4f3c8da5c", size = 2251618, upload-time = "2025-03-26T20:29:34.967Z" },
{ url = "https://files.pythonhosted.org/packages/aa/c5/fbcf1977035b834f63eb542e74cd6c807177f383386175b468f0865bcac4/pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:7da333f21cd9df51d5731513a6d39319892947604924ddf2e24a4612975fb936", size = 2255374, upload-time = "2025-03-26T20:29:37.132Z" },
{ url = "https://files.pythonhosted.org/packages/2f/f8/66f328e411f1c9574b13c2c28ab01f308b53688bbbe6ca8fb981e6cabc42/pydantic_core-2.33.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:4b6d77c75a57f041c5ee915ff0b0bb58eabb78728b69ed967bc5b780e8f701b8", size = 2082099, upload-time = "2025-03-26T20:29:39.227Z" },
{ url = "https://files.pythonhosted.org/packages/22/98/b50eb9a411e87483b5c65dba4fa430a06bac4234d3403a40e5a9905ebcd0/pydantic_core-2.46.3-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:1da3786b8018e60349680720158cc19161cc3b4bdd815beb0a321cd5ce1ad5b1", size = 2108971, upload-time = "2026-04-20T14:43:51.945Z" },
{ url = "https://files.pythonhosted.org/packages/08/4b/f364b9d161718ff2217160a4b5d41ce38de60aed91c3689ebffa1c939d23/pydantic_core-2.46.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:cc0988cb29d21bf4a9d5cf2ef970b5c0e38d8d8e107a493278c05dc6c1dda69f", size = 1949588, upload-time = "2026-04-20T14:44:10.386Z" },
{ url = "https://files.pythonhosted.org/packages/8f/8b/30bd03ee83b2f5e29f5ba8e647ab3c456bf56f2ec72fdbcc0215484a0854/pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27f9067c3bfadd04c55484b89c0d267981b2f3512850f6f66e1e74204a4e4ce3", size = 1975986, upload-time = "2026-04-20T14:43:57.106Z" },
{ url = "https://files.pythonhosted.org/packages/3c/54/13ccf954d84ec275d5d023d5786e4aa48840bc9f161f2838dc98e1153518/pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a642ac886ecf6402d9882d10c405dcf4b902abeb2972cd5fb4a48c83cd59279a", size = 2055830, upload-time = "2026-04-20T14:44:15.499Z" },
{ url = "https://files.pythonhosted.org/packages/be/0e/65f38125e660fdbd72aa858e7dfae893645cfa0e7b13d333e174a367cd23/pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:79f561438481f28681584b89e2effb22855e2179880314bcddbf5968e935e807", size = 2222340, upload-time = "2026-04-20T14:41:51.353Z" },
{ url = "https://files.pythonhosted.org/packages/d1/88/f3ab7739efe0e7e80777dbb84c59eb98518e3f57ea433206194c2e425272/pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:57a973eae4665352a47cf1a99b4ee864620f2fe663a217d7a8da68a1f3a5bfda", size = 2280727, upload-time = "2026-04-20T14:41:30.461Z" },
{ url = "https://files.pythonhosted.org/packages/2a/6d/c228219080817bec4982f9531cadb18da6aaa770fdeb114f49c237ac2c9f/pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:83d002b97072a53ea150d63e0a3adfae5670cef5aa8a6e490240e482d3b22e57", size = 2092158, upload-time = "2026-04-20T14:44:07.305Z" },
{ url = "https://files.pythonhosted.org/packages/0f/b1/525a16711e7c6d61635fac3b0bd54600b5c5d9f60c6fc5aaab26b64a2297/pydantic_core-2.46.3-cp310-cp310-manylinux_2_31_riscv64.whl", hash = "sha256:b40ddd51e7c44b28cfaef746c9d3c506d658885e0a46f9eeef2ee815cbf8e045", size = 2116626, upload-time = "2026-04-20T14:42:34.118Z" },
{ url = "https://files.pythonhosted.org/packages/ef/7c/17d30673351439a6951bf54f564cf2443ab00ae264ec9df00e2efd710eb5/pydantic_core-2.46.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ac5ec7fb9b87f04ee839af2d53bcadea57ded7d229719f56c0ed895bff987943", size = 2160691, upload-time = "2026-04-20T14:41:14.023Z" },
{ url = "https://files.pythonhosted.org/packages/86/66/af8adbcbc0886ead7f1a116606a534d75a307e71e6e08226000d51b880d2/pydantic_core-2.46.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:a3b11c812f61b3129c4905781a2601dfdfdea5fe1e6c1cfb696b55d14e9c054f", size = 2182543, upload-time = "2026-04-20T14:40:48.886Z" },
{ url = "https://files.pythonhosted.org/packages/b0/37/6de71e0f54c54a4190010f57deb749e1ddf75c568ada3b1320b70067f121/pydantic_core-2.46.3-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:1108da631e602e5b3c38d6d04fe5bb3bfa54349e6918e3ca6cf570b2e2b2f9d4", size = 2324513, upload-time = "2026-04-20T14:42:36.121Z" },
{ url = "https://files.pythonhosted.org/packages/51/b1/9fc74ce94f603d5ef59ff258ca9c2c8fb902fb548d340a96f77f4d1c3b7f/pydantic_core-2.46.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:de885175515bcfa98ae618c1df7a072f13d179f81376c8007112af20567fd08a", size = 2361853, upload-time = "2026-04-20T14:43:24.886Z" },
{ url = "https://files.pythonhosted.org/packages/40/d0/4c652fc592db35f100279ee751d5a145aca1b9a7984b9684ba7c1b5b0535/pydantic_core-2.46.3-cp310-cp310-win32.whl", hash = "sha256:d11058e3201527d41bc6b545c79187c9e4bf85e15a236a6007f0e991518882b7", size = 1980465, upload-time = "2026-04-20T14:44:46.239Z" },
{ url = "https://files.pythonhosted.org/packages/27/b8/a920453c38afbe1f355e1ea0b0d94a0a3e0b0879d32d793108755fa171d5/pydantic_core-2.46.3-cp310-cp310-win_amd64.whl", hash = "sha256:3612edf65c8ea67ac13616c4d23af12faef1ae435a8a93e5934c2a0cbbdd1fd6", size = 2073884, upload-time = "2026-04-20T14:43:01.201Z" },
{ url = "https://files.pythonhosted.org/packages/22/a2/1ba90a83e85a3f94c796b184f3efde9c72f2830dcda493eea8d59ba78e6d/pydantic_core-2.46.3-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ab124d49d0459b2373ecf54118a45c28a1e6d4192a533fbc915e70f556feb8e5", size = 2106740, upload-time = "2026-04-20T14:41:20.932Z" },
{ url = "https://files.pythonhosted.org/packages/b6/f6/99ae893c89a0b9d3daec9f95487aa676709aa83f67643b3f0abaf4ab628a/pydantic_core-2.46.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:cca67d52a5c7a16aed2b3999e719c4bcf644074eac304a5d3d62dd70ae7d4b2c", size = 1948293, upload-time = "2026-04-20T14:43:42.115Z" },
{ url = "https://files.pythonhosted.org/packages/3e/b8/2e8e636dc9e3f16c2e16bf0849e24be82c5ee82c603c65fc0326666328fc/pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5c024e08c0ba23e6fd68c771a521e9d6a792f2ebb0fa734296b36394dc30390e", size = 1973222, upload-time = "2026-04-20T14:41:57.841Z" },
{ url = "https://files.pythonhosted.org/packages/34/36/0e730beec4d83c5306f417afbd82ff237d9a21e83c5edf675f31ed84c1fe/pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6645ce7eec4928e29a1e3b3d5c946621d105d3e79f0c9cddf07c2a9770949287", size = 2053852, upload-time = "2026-04-20T14:40:43.077Z" },
{ url = "https://files.pythonhosted.org/packages/4b/f0/3071131f47e39136a17814576e0fada9168569f7f8c0e6ac4d1ede6a4958/pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a712c7118e6c5ea96562f7b488435172abb94a3c53c22c9efc1412264a45cbbe", size = 2221134, upload-time = "2026-04-20T14:43:03.349Z" },
{ url = "https://files.pythonhosted.org/packages/2f/a9/a2dc023eec5aa4b02a467874bad32e2446957d2adcab14e107eab502e978/pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:69a868ef3ff206343579021c40faf3b1edc64b1cc508ff243a28b0a514ccb050", size = 2279785, upload-time = "2026-04-20T14:41:19.285Z" },
{ url = "https://files.pythonhosted.org/packages/0a/44/93f489d16fb63fbd41c670441536541f6e8cfa1e5a69f40bc9c5d30d8c90/pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cc7e8c32db809aa0f6ea1d6869ebc8518a65d5150fdfad8bcae6a49ae32a22e2", size = 2089404, upload-time = "2026-04-20T14:43:10.108Z" },
{ url = "https://files.pythonhosted.org/packages/2a/78/8692e3aa72b2d004f7a5d937f1dfdc8552ba26caf0bec75f342c40f00dec/pydantic_core-2.46.3-cp311-cp311-manylinux_2_31_riscv64.whl", hash = "sha256:3481bd1341dc85779ee506bc8e1196a277ace359d89d28588a9468c3ecbe63fa", size = 2114898, upload-time = "2026-04-20T14:44:51.475Z" },
{ url = "https://files.pythonhosted.org/packages/6a/62/e83133f2e7832532060175cebf1f13748f4c7e7e7165cdd1f611f174494b/pydantic_core-2.46.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8690eba565c6d68ffd3a8655525cbdd5246510b44a637ee2c6c03a7ebfe64d3c", size = 2157856, upload-time = "2026-04-20T14:43:46.64Z" },
{ url = "https://files.pythonhosted.org/packages/6d/ec/6a500e3ad7718ee50583fae79c8651f5d37e3abce1fa9ae177ae65842c53/pydantic_core-2.46.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:4de88889d7e88d50d40ee5b39d5dac0bcaef9ba91f7e536ac064e6b2834ecccf", size = 2180168, upload-time = "2026-04-20T14:42:00.302Z" },
{ url = "https://files.pythonhosted.org/packages/d8/53/8267811054b1aa7fc1dc7ded93812372ef79a839f5e23558136a6afbfde1/pydantic_core-2.46.3-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:e480080975c1ef7f780b8f99ed72337e7cc5efea2e518a20a692e8e7b278eb8b", size = 2322885, upload-time = "2026-04-20T14:41:05.253Z" },
{ url = "https://files.pythonhosted.org/packages/c8/c1/1c0acdb3aa0856ddc4ecc55214578f896f2de16f400cf51627eb3c26c1c4/pydantic_core-2.46.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:de3a5c376f8cd94da9a1b8fd3dd1c16c7a7b216ed31dc8ce9fd7a22bf13b836e", size = 2360328, upload-time = "2026-04-20T14:41:43.991Z" },
{ url = "https://files.pythonhosted.org/packages/f0/d0/ef39cd0f4a926814f360e71c1adeab48ad214d9727e4deb48eedfb5bce1a/pydantic_core-2.46.3-cp311-cp311-win32.whl", hash = "sha256:fc331a5314ffddd5385b9ee9d0d2fee0b13c27e0e02dad71b1ae5d6561f51eeb", size = 1979464, upload-time = "2026-04-20T14:43:12.215Z" },
{ url = "https://files.pythonhosted.org/packages/18/9c/f41951b0d858e343f1cf09398b2a7b3014013799744f2c4a8ad6a3eec4f2/pydantic_core-2.46.3-cp311-cp311-win_amd64.whl", hash = "sha256:b5b9c6cf08a8a5e502698f5e153056d12c34b8fb30317e0c5fd06f45162a6346", size = 2070837, upload-time = "2026-04-20T14:41:47.707Z" },
{ url = "https://files.pythonhosted.org/packages/9f/1e/264a17cd582f6ed50950d4d03dd5fefd84e570e238afe1cb3e25cf238769/pydantic_core-2.46.3-cp311-cp311-win_arm64.whl", hash = "sha256:5dfd51cf457482f04ec49491811a2b8fd5b843b64b11eecd2d7a1ee596ea78a6", size = 2053647, upload-time = "2026-04-20T14:42:27.535Z" },
{ url = "https://files.pythonhosted.org/packages/4b/cb/5b47425556ecc1f3fe18ed2a0083188aa46e1dd812b06e406475b3a5d536/pydantic_core-2.46.3-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:b11b59b3eee90a80a36701ddb4576d9ae31f93f05cb9e277ceaa09e6bf074a67", size = 2101946, upload-time = "2026-04-20T14:40:52.581Z" },
{ url = "https://files.pythonhosted.org/packages/a1/4f/2fb62c2267cae99b815bbf4a7b9283812c88ca3153ef29f7707200f1d4e5/pydantic_core-2.46.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:af8653713055ea18a3abc1537fe2ebc42f5b0bbb768d1eb79fd74eb47c0ac089", size = 1951612, upload-time = "2026-04-20T14:42:42.996Z" },
{ url = "https://files.pythonhosted.org/packages/50/6e/b7348fd30d6556d132cddd5bd79f37f96f2601fe0608afac4f5fb01ec0b3/pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:75a519dab6d63c514f3a81053e5266c549679e4aa88f6ec57f2b7b854aceb1b0", size = 1977027, upload-time = "2026-04-20T14:42:02.001Z" },
{ url = "https://files.pythonhosted.org/packages/82/11/31d60ee2b45540d3fb0b29302a393dbc01cd771c473f5b5147bcd353e593/pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a6cd87cb1575b1ad05ba98894c5b5c96411ef678fa2f6ed2576607095b8d9789", size = 2063008, upload-time = "2026-04-20T14:44:17.952Z" },
{ url = "https://files.pythonhosted.org/packages/8a/db/3a9d1957181b59258f44a2300ab0f0be9d1e12d662a4f57bb31250455c52/pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f80a55484b8d843c8ada81ebf70a682f3f00a3d40e378c06cf17ecb44d280d7d", size = 2233082, upload-time = "2026-04-20T14:40:57.934Z" },
{ url = "https://files.pythonhosted.org/packages/9c/e1/3277c38792aeb5cfb18c2f0c5785a221d9ff4e149abbe1184d53d5f72273/pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3861f1731b90c50a3266316b9044f5c9b405eecb8e299b0a7120596334e4fe9c", size = 2304615, upload-time = "2026-04-20T14:42:12.584Z" },
{ url = "https://files.pythonhosted.org/packages/5e/d5/e3d9717c9eba10855325650afd2a9cba8e607321697f18953af9d562da2f/pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fb528e295ed31570ac3dcc9bfdd6e0150bc11ce6168ac87a8082055cf1a67395", size = 2094380, upload-time = "2026-04-20T14:43:05.522Z" },
{ url = "https://files.pythonhosted.org/packages/a1/20/abac35dedcbfd66c6f0b03e4e3564511771d6c9b7ede10a362d03e110d9b/pydantic_core-2.46.3-cp312-cp312-manylinux_2_31_riscv64.whl", hash = "sha256:367508faa4973b992b271ba1494acaab36eb7e8739d1e47be5035fb1ea225396", size = 2135429, upload-time = "2026-04-20T14:41:55.549Z" },
{ url = "https://files.pythonhosted.org/packages/6c/a5/41bfd1df69afad71b5cf0535055bccc73022715ad362edbc124bc1e021d7/pydantic_core-2.46.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5ad3c826fe523e4becf4fe39baa44286cff85ef137c729a2c5e269afbfd0905d", size = 2174582, upload-time = "2026-04-20T14:41:45.96Z" },
{ url = "https://files.pythonhosted.org/packages/79/65/38d86ea056b29b2b10734eb23329b7a7672ca604df4f2b6e9c02d4ee22fe/pydantic_core-2.46.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ec638c5d194ef8af27db69f16c954a09797c0dc25015ad6123eb2c73a4d271ca", size = 2187533, upload-time = "2026-04-20T14:40:55.367Z" },
{ url = "https://files.pythonhosted.org/packages/b6/55/a1129141678a2026badc539ad1dee0a71d06f54c2f06a4bd68c030ac781b/pydantic_core-2.46.3-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:28ed528c45446062ee66edb1d33df5d88828ae167de76e773a3c7f64bd14e976", size = 2332985, upload-time = "2026-04-20T14:44:13.05Z" },
{ url = "https://files.pythonhosted.org/packages/d7/60/cb26f4077719f709e54819f4e8e1d43f4091f94e285eb6bd21e1190a7b7c/pydantic_core-2.46.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:aed19d0c783886d5bd86d80ae5030006b45e28464218747dcf83dabfdd092c7b", size = 2373670, upload-time = "2026-04-20T14:41:53.421Z" },
{ url = "https://files.pythonhosted.org/packages/6b/7e/c3f21882bdf1d8d086876f81b5e296206c69c6082551d776895de7801fa0/pydantic_core-2.46.3-cp312-cp312-win32.whl", hash = "sha256:06d5d8820cbbdb4147578c1fe7ffcd5b83f34508cb9f9ab76e807be7db6ff0a4", size = 1966722, upload-time = "2026-04-20T14:44:30.588Z" },
{ url = "https://files.pythonhosted.org/packages/57/be/6b5e757b859013ebfbd7adba02f23b428f37c86dcbf78b5bb0b4ffd36e99/pydantic_core-2.46.3-cp312-cp312-win_amd64.whl", hash = "sha256:c3212fda0ee959c1dd04c60b601ec31097aaa893573a3a1abd0a47bcac2968c1", size = 2072970, upload-time = "2026-04-20T14:42:54.248Z" },
{ url = "https://files.pythonhosted.org/packages/bf/f8/a989b21cc75e9a32d24192ef700eea606521221a89faa40c919ce884f2b1/pydantic_core-2.46.3-cp312-cp312-win_arm64.whl", hash = "sha256:f1f8338dd7a7f31761f1f1a3c47503a9a3b34eea3c8b01fa6ee96408affb5e72", size = 2035963, upload-time = "2026-04-20T14:44:20.4Z" },
{ url = "https://files.pythonhosted.org/packages/9b/3c/9b5e8eb9821936d065439c3b0fb1490ffa64163bfe7e1595985a47896073/pydantic_core-2.46.3-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:12bc98de041458b80c86c56b24df1d23832f3e166cbaff011f25d187f5c62c37", size = 2102109, upload-time = "2026-04-20T14:41:24.219Z" },
{ url = "https://files.pythonhosted.org/packages/91/97/1c41d1f5a19f241d8069f1e249853bcce378cdb76eec8ab636d7bc426280/pydantic_core-2.46.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:85348b8f89d2c3508b65b16c3c33a4da22b8215138d8b996912bb1532868885f", size = 1951820, upload-time = "2026-04-20T14:42:14.236Z" },
{ url = "https://files.pythonhosted.org/packages/30/b4/d03a7ae14571bc2b6b3c7b122441154720619afe9a336fa3a95434df5e2f/pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1105677a6df914b1fb71a81b96c8cce7726857e1717d86001f29be06a25ee6f8", size = 1977785, upload-time = "2026-04-20T14:42:31.648Z" },
{ url = "https://files.pythonhosted.org/packages/ae/0c/4086f808834b59e3c8f1aa26df8f4b6d998cdcf354a143d18ef41529d1fe/pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:87082cd65669a33adeba5470769e9704c7cf026cc30afb9cc77fd865578ebaad", size = 2062761, upload-time = "2026-04-20T14:40:37.093Z" },
{ url = "https://files.pythonhosted.org/packages/fa/71/a649be5a5064c2df0db06e0a512c2281134ed2fcc981f52a657936a7527c/pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:60e5f66e12c4f5212d08522963380eaaeac5ebd795826cfd19b2dfb0c7a52b9c", size = 2232989, upload-time = "2026-04-20T14:42:59.254Z" },
{ url = "https://files.pythonhosted.org/packages/a2/84/7756e75763e810b3a710f4724441d1ecc5883b94aacb07ca71c5fb5cfb69/pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b6cdf19bf84128d5e7c37e8a73a0c5c10d51103a650ac585d42dd6ae233f2b7f", size = 2303975, upload-time = "2026-04-20T14:41:32.287Z" },
{ url = "https://files.pythonhosted.org/packages/6c/35/68a762e0c1e31f35fa0dac733cbd9f5b118042853698de9509c8e5bf128b/pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:031bb17f4885a43773c8c763089499f242aee2ea85cf17154168775dccdecf35", size = 2095325, upload-time = "2026-04-20T14:42:47.685Z" },
{ url = "https://files.pythonhosted.org/packages/77/bf/1bf8c9a8e91836c926eae5e3e51dce009bf495a60ca56060689d3df3f340/pydantic_core-2.46.3-cp313-cp313-manylinux_2_31_riscv64.whl", hash = "sha256:bcf2a8b2982a6673693eae7348ef3d8cf3979c1d63b54fca7c397a635cc68687", size = 2133368, upload-time = "2026-04-20T14:41:22.766Z" },
{ url = "https://files.pythonhosted.org/packages/e5/50/87d818d6bab915984995157ceb2380f5aac4e563dddbed6b56f0ed057aba/pydantic_core-2.46.3-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:28e8cf2f52d72ced402a137145923a762cbb5081e48b34312f7a0c8f55928ec3", size = 2173908, upload-time = "2026-04-20T14:42:52.044Z" },
{ url = "https://files.pythonhosted.org/packages/91/88/a311fb306d0bd6185db41fa14ae888fb81d0baf648a761ae760d30819d33/pydantic_core-2.46.3-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:17eaface65d9fc5abb940003020309c1bf7a211f5f608d7870297c367e6f9022", size = 2186422, upload-time = "2026-04-20T14:43:29.55Z" },
{ url = "https://files.pythonhosted.org/packages/8f/79/28fd0d81508525ab2054fef7c77a638c8b5b0afcbbaeee493cf7c3fef7e1/pydantic_core-2.46.3-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:93fd339f23408a07e98950a89644f92c54d8729719a40b30c0a30bb9ebc55d23", size = 2332709, upload-time = "2026-04-20T14:42:16.134Z" },
{ url = "https://files.pythonhosted.org/packages/b3/21/795bf5fe5c0f379308b8ef19c50dedab2e7711dbc8d0c2acf08f1c7daa05/pydantic_core-2.46.3-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:23cbdb3aaa74dfe0837975dbf69b469753bbde8eacace524519ffdb6b6e89eb7", size = 2372428, upload-time = "2026-04-20T14:41:10.974Z" },
{ url = "https://files.pythonhosted.org/packages/45/b3/ed14c659cbe7605e3ef063077680a64680aec81eb1a04763a05190d49b7f/pydantic_core-2.46.3-cp313-cp313-win32.whl", hash = "sha256:610eda2e3838f401105e6326ca304f5da1e15393ae25dacae5c5c63f2c275b13", size = 1965601, upload-time = "2026-04-20T14:41:42.128Z" },
{ url = "https://files.pythonhosted.org/packages/ef/bb/adb70d9a762ddd002d723fbf1bd492244d37da41e3af7b74ad212609027e/pydantic_core-2.46.3-cp313-cp313-win_amd64.whl", hash = "sha256:68cc7866ed863db34351294187f9b729964c371ba33e31c26f478471c52e1ed0", size = 2071517, upload-time = "2026-04-20T14:43:36.096Z" },
{ url = "https://files.pythonhosted.org/packages/52/eb/66faefabebfe68bd7788339c9c9127231e680b11906368c67ce112fdb47f/pydantic_core-2.46.3-cp313-cp313-win_arm64.whl", hash = "sha256:f64b5537ac62b231572879cd08ec05600308636a5d63bcbdb15063a466977bec", size = 2035802, upload-time = "2026-04-20T14:43:38.507Z" },
{ url = "https://files.pythonhosted.org/packages/7f/db/a7bcb4940183fda36022cd18ba8dd12f2dff40740ec7b58ce7457befa416/pydantic_core-2.46.3-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:afa3aa644f74e290cdede48a7b0bee37d1c35e71b05105f6b340d484af536d9b", size = 2097614, upload-time = "2026-04-20T14:44:38.374Z" },
{ url = "https://files.pythonhosted.org/packages/24/35/e4066358a22e3e99519db370494c7528f5a2aa1367370e80e27e20283543/pydantic_core-2.46.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ced3310e51aa425f7f77da8bbbb5212616655bedbe82c70944320bc1dbe5e018", size = 1951896, upload-time = "2026-04-20T14:40:53.996Z" },
{ url = "https://files.pythonhosted.org/packages/87/92/37cf4049d1636996e4b888c05a501f40a43ff218983a551d57f9d5e14f0d/pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e29908922ce9da1a30b4da490bd1d3d82c01dcfdf864d2a74aacee674d0bfa34", size = 1979314, upload-time = "2026-04-20T14:41:49.446Z" },
{ url = "https://files.pythonhosted.org/packages/d8/36/9ff4d676dfbdfb2d591cf43f3d90ded01e15b1404fd101180ed2d62a2fd3/pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0c9ff69140423eea8ed2d5477df3ba037f671f5e897d206d921bc9fdc39613e7", size = 2056133, upload-time = "2026-04-20T14:42:23.574Z" },
{ url = "https://files.pythonhosted.org/packages/bc/f0/405b442a4d7ba855b06eec8b2bf9c617d43b8432d099dfdc7bf999293495/pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b675ab0a0d5b1c8fdb81195dc5bcefea3f3c240871cdd7ff9a2de8aa50772eb2", size = 2228726, upload-time = "2026-04-20T14:44:22.816Z" },
{ url = "https://files.pythonhosted.org/packages/e7/f8/65cd92dd5a0bd89ba277a98ecbfaf6fc36bbd3300973c7a4b826d6ab1391/pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0087084960f209a9a4af50ecd1fb063d9ad3658c07bb81a7a53f452dacbfb2ba", size = 2301214, upload-time = "2026-04-20T14:44:48.792Z" },
{ url = "https://files.pythonhosted.org/packages/fd/86/ef96a4c6e79e7a2d0410826a68fbc0eccc0fd44aa733be199d5fcac3bb87/pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ed42e6cc8e1b0e2b9b96e2276bad70ae625d10d6d524aed0c93de974ae029f9f", size = 2099927, upload-time = "2026-04-20T14:41:40.196Z" },
{ url = "https://files.pythonhosted.org/packages/6d/53/269caf30e0096e0a8a8f929d1982a27b3879872cca2d917d17c2f9fdf4fe/pydantic_core-2.46.3-cp314-cp314-manylinux_2_31_riscv64.whl", hash = "sha256:f1771ce258afb3e4201e67d154edbbae712a76a6081079fe247c2f53c6322c22", size = 2128789, upload-time = "2026-04-20T14:41:15.868Z" },
{ url = "https://files.pythonhosted.org/packages/00/b0/1a6d9b6a587e118482910c244a1c5acf4d192604174132efd12bf0ac486f/pydantic_core-2.46.3-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a7610b6a5242a6c736d8ad47fd5fff87fcfe8f833b281b1c409c3d6835d9227f", size = 2173815, upload-time = "2026-04-20T14:44:25.152Z" },
{ url = "https://files.pythonhosted.org/packages/87/56/e7e00d4041a7e62b5a40815590114db3b535bf3ca0bf4dca9f16cef25246/pydantic_core-2.46.3-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:ff5e7783bcc5476e1db448bf268f11cb257b1c276d3e89f00b5727be86dd0127", size = 2181608, upload-time = "2026-04-20T14:41:28.933Z" },
{ url = "https://files.pythonhosted.org/packages/e8/22/4bd23c3d41f7c185d60808a1de83c76cf5aeabf792f6c636a55c3b1ec7f9/pydantic_core-2.46.3-cp314-cp314-musllinux_1_1_armv7l.whl", hash = "sha256:9d2e32edcc143bc01e95300671915d9ca052d4f745aa0a49c48d4803f8a85f2c", size = 2326968, upload-time = "2026-04-20T14:42:03.962Z" },
{ url = "https://files.pythonhosted.org/packages/24/ac/66cd45129e3915e5ade3b292cb3bc7fd537f58f8f8dbdaba6170f7cabb74/pydantic_core-2.46.3-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:6e42d83d1c6b87fa56b521479cff237e626a292f3b31b6345c15a99121b454c1", size = 2369842, upload-time = "2026-04-20T14:41:35.52Z" },
{ url = "https://files.pythonhosted.org/packages/a2/51/dd4248abb84113615473aa20d5545b7c4cd73c8644003b5259686f93996c/pydantic_core-2.46.3-cp314-cp314-win32.whl", hash = "sha256:07bc6d2a28c3adb4f7c6ae46aa4f2d2929af127f587ed44057af50bf1ce0f505", size = 1959661, upload-time = "2026-04-20T14:41:00.042Z" },
{ url = "https://files.pythonhosted.org/packages/20/eb/59980e5f1ae54a3b86372bd9f0fa373ea2d402e8cdcd3459334430f91e91/pydantic_core-2.46.3-cp314-cp314-win_amd64.whl", hash = "sha256:8940562319bc621da30714617e6a7eaa6b98c84e8c685bcdc02d7ed5e7c7c44e", size = 2071686, upload-time = "2026-04-20T14:43:16.471Z" },
{ url = "https://files.pythonhosted.org/packages/8c/db/1cf77e5247047dfee34bc01fa9bca134854f528c8eb053e144298893d370/pydantic_core-2.46.3-cp314-cp314-win_arm64.whl", hash = "sha256:5dcbbcf4d22210ced8f837c96db941bdb078f419543472aca5d9a0bb7cddc7df", size = 2026907, upload-time = "2026-04-20T14:43:31.732Z" },
{ url = "https://files.pythonhosted.org/packages/57/c0/b3df9f6a543276eadba0a48487b082ca1f201745329d97dbfa287034a230/pydantic_core-2.46.3-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:d0fe3dce1e836e418f912c1ad91c73357d03e556a4d286f441bf34fed2dbeecf", size = 2095047, upload-time = "2026-04-20T14:42:37.982Z" },
{ url = "https://files.pythonhosted.org/packages/66/57/886a938073b97556c168fd99e1a7305bb363cd30a6d2c76086bf0587b32a/pydantic_core-2.46.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:9ce92e58abc722dac1bf835a6798a60b294e48eb0e625ec9fd994b932ac5feee", size = 1934329, upload-time = "2026-04-20T14:43:49.655Z" },
{ url = "https://files.pythonhosted.org/packages/0b/7c/b42eaa5c34b13b07ecb51da21761297a9b8eb43044c864a035999998f328/pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a03e6467f0f5ab796a486146d1b887b2dc5e5f9b3288898c1b1c3ad974e53e4a", size = 1974847, upload-time = "2026-04-20T14:42:10.737Z" },
{ url = "https://files.pythonhosted.org/packages/e6/9b/92b42db6543e7de4f99ae977101a2967b63122d4b6cf7773812da2d7d5b5/pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2798b6ba041b9d70acfb9071a2ea13c8456dd1e6a5555798e41ba7b0790e329c", size = 2041742, upload-time = "2026-04-20T14:40:44.262Z" },
{ url = "https://files.pythonhosted.org/packages/0f/19/46fbe1efabb5aa2834b43b9454e70f9a83ad9c338c1291e48bdc4fecf167/pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9be3e221bdc6d69abf294dcf7aff6af19c31a5cdcc8f0aa3b14be29df4bd03b1", size = 2236235, upload-time = "2026-04-20T14:41:27.307Z" },
{ url = "https://files.pythonhosted.org/packages/77/da/b3f95bc009ad60ec53120f5d16c6faa8cabdbe8a20d83849a1f2b8728148/pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f13936129ce841f2a5ddf6f126fea3c43cd128807b5a59588c37cf10178c2e64", size = 2282633, upload-time = "2026-04-20T14:44:33.271Z" },
{ url = "https://files.pythonhosted.org/packages/cc/6e/401336117722e28f32fb8220df676769d28ebdf08f2f4469646d404c43a3/pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:28b5f2ef03416facccb1c6ef744c69793175fd27e44ef15669201601cf423acb", size = 2109679, upload-time = "2026-04-20T14:44:41.065Z" },
{ url = "https://files.pythonhosted.org/packages/fc/53/b289f9bc8756a32fe718c46f55afaeaf8d489ee18d1a1e7be1db73f42cc4/pydantic_core-2.46.3-cp314-cp314t-manylinux_2_31_riscv64.whl", hash = "sha256:830d1247d77ad23852314f069e9d7ddafeec5f684baf9d7e7065ed46a049c4e6", size = 2108342, upload-time = "2026-04-20T14:42:50.144Z" },
{ url = "https://files.pythonhosted.org/packages/10/5b/8292fc7c1f9111f1b2b7c1b0dcf1179edcd014fc3ea4517499f50b829d71/pydantic_core-2.46.3-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d0793c90c1a3c74966e7975eaef3ed30ebdff3260a0f815a62a22adc17e4c01c", size = 2157208, upload-time = "2026-04-20T14:42:08.133Z" },
{ url = "https://files.pythonhosted.org/packages/2b/9e/f80044e9ec07580f057a89fc131f78dda7a58751ddf52bbe05eaf31db50f/pydantic_core-2.46.3-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:d2d0aead851b66f5245ec0c4fb2612ef457f8bbafefdf65a2bf9d6bac6140f47", size = 2167237, upload-time = "2026-04-20T14:42:25.412Z" },
{ url = "https://files.pythonhosted.org/packages/f8/84/6781a1b037f3b96be9227edbd1101f6d3946746056231bf4ac48cdff1a8d/pydantic_core-2.46.3-cp314-cp314t-musllinux_1_1_armv7l.whl", hash = "sha256:2f40e4246676beb31c5ce77c38a55ca4e465c6b38d11ea1bd935420568e0b1ab", size = 2312540, upload-time = "2026-04-20T14:40:40.313Z" },
{ url = "https://files.pythonhosted.org/packages/3e/db/19c0839feeb728e7df03255581f198dfdf1c2aeb1e174a8420b63c5252e5/pydantic_core-2.46.3-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:cf489cf8986c543939aeee17a09c04d6ffb43bfef8ca16fcbcc5cfdcbed24dba", size = 2369556, upload-time = "2026-04-20T14:41:09.427Z" },
{ url = "https://files.pythonhosted.org/packages/e0/15/3228774cb7cd45f5f721ddf1b2242747f4eb834d0c491f0c02d606f09fed/pydantic_core-2.46.3-cp314-cp314t-win32.whl", hash = "sha256:ffe0883b56cfc05798bf994164d2b2ff03efe2d22022a2bb080f3b626176dd56", size = 1949756, upload-time = "2026-04-20T14:41:25.717Z" },
{ url = "https://files.pythonhosted.org/packages/b8/2a/c79cf53fd91e5a87e30d481809f52f9a60dd221e39de66455cf04deaad37/pydantic_core-2.46.3-cp314-cp314t-win_amd64.whl", hash = "sha256:706d9d0ce9cf4593d07270d8e9f53b161f90c57d315aeec4fb4fd7a8b10240d8", size = 2051305, upload-time = "2026-04-20T14:43:18.627Z" },
{ url = "https://files.pythonhosted.org/packages/0b/db/d8182a7f1d9343a032265aae186eb063fe26ca4c40f256b21e8da4498e89/pydantic_core-2.46.3-cp314-cp314t-win_arm64.whl", hash = "sha256:77706aeb41df6a76568434701e0917da10692da28cb69d5fb6919ce5fdb07374", size = 2026310, upload-time = "2026-04-20T14:41:01.778Z" },
{ url = "https://files.pythonhosted.org/packages/66/7f/03dbad45cd3aa9083fbc93c210ae8b005af67e4136a14186950a747c6874/pydantic_core-2.46.3-graalpy311-graalpy242_311_native-macosx_10_12_x86_64.whl", hash = "sha256:9715525891ed524a0a1eb6d053c74d4d4ad5017677fb00af0b7c2644a31bae46", size = 2105683, upload-time = "2026-04-20T14:42:19.779Z" },
{ url = "https://files.pythonhosted.org/packages/26/22/4dc186ac8ea6b257e9855031f51b62a9637beac4d68ac06bee02f046f836/pydantic_core-2.46.3-graalpy311-graalpy242_311_native-macosx_11_0_arm64.whl", hash = "sha256:9d2f400712a99a013aff420ef1eb9be077f8189a36c1e3ef87660b4e1088a874", size = 1940052, upload-time = "2026-04-20T14:43:59.274Z" },
{ url = "https://files.pythonhosted.org/packages/0d/ca/d376391a5aff1f2e8188960d7873543608130a870961c2b6b5236627c116/pydantic_core-2.46.3-graalpy311-graalpy242_311_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bd2aab0e2e9dc2daf36bd2686c982535d5e7b1d930a1344a7bb6e82baab42a76", size = 1988172, upload-time = "2026-04-20T14:41:17.469Z" },
{ url = "https://files.pythonhosted.org/packages/0e/6b/523b9f85c23788755d6ab949329de692a2e3a584bc6beb67fef5e035aa9d/pydantic_core-2.46.3-graalpy311-graalpy242_311_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e9d76736da5f362fabfeea6a69b13b7f2be405c6d6966f06b2f6bfff7e64531", size = 2128596, upload-time = "2026-04-20T14:40:41.707Z" },
{ url = "https://files.pythonhosted.org/packages/34/42/f426db557e8ab2791bc7562052299944a118655496fbff99914e564c0a94/pydantic_core-2.46.3-graalpy312-graalpy250_312_native-macosx_10_12_x86_64.whl", hash = "sha256:b12dd51f1187c2eb489af8e20f880362db98e954b54ab792fa5d92e8bcc6b803", size = 2091877, upload-time = "2026-04-20T14:43:27.091Z" },
{ url = "https://files.pythonhosted.org/packages/5c/4f/86a832a9d14df58e663bfdf4627dc00d3317c2bd583c4fb23390b0f04b8e/pydantic_core-2.46.3-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:f00a0961b125f1a47af7bcc17f00782e12f4cd056f83416006b30111d941dfa3", size = 1932428, upload-time = "2026-04-20T14:40:45.781Z" },
{ url = "https://files.pythonhosted.org/packages/11/1a/fe857968954d93fb78e0d4b6df5c988c74c4aaa67181c60be7cfe327c0ca/pydantic_core-2.46.3-graalpy312-graalpy250_312_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:57697d7c056aca4bbb680200f96563e841a6386ac1129370a0102592f4dddff5", size = 1997550, upload-time = "2026-04-20T14:44:02.425Z" },
{ url = "https://files.pythonhosted.org/packages/17/eb/9d89ad2d9b0ba8cd65393d434471621b98912abb10fbe1df08e480ba57b5/pydantic_core-2.46.3-graalpy312-graalpy250_312_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd35aa21299def8db7ef4fe5c4ff862941a9a158ca7b63d61e66fe67d30416b4", size = 2137657, upload-time = "2026-04-20T14:42:45.149Z" },
{ url = "https://files.pythonhosted.org/packages/1f/da/99d40830684f81dec901cac521b5b91c095394cc1084b9433393cde1c2df/pydantic_core-2.46.3-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:13afdd885f3d71280cf286b13b310ee0f7ccfefd1dbbb661514a474b726e2f25", size = 2107973, upload-time = "2026-04-20T14:42:06.175Z" },
{ url = "https://files.pythonhosted.org/packages/99/a5/87024121818d75bbb2a98ddbaf638e40e7a18b5e0f5492c9ca4b1b316107/pydantic_core-2.46.3-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:f91c0aff3e3ee0928edd1232c57f643a7a003e6edf1860bc3afcdc749cb513f3", size = 1947191, upload-time = "2026-04-20T14:43:14.319Z" },
{ url = "https://files.pythonhosted.org/packages/60/62/0c1acfe10945b83a6a59d19fbaa92f48825381509e5701b855c08f13db76/pydantic_core-2.46.3-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6529d1d128321a58d30afcc97b49e98836542f68dd41b33c2e972bb9e5290536", size = 2123791, upload-time = "2026-04-20T14:43:22.766Z" },
{ url = "https://files.pythonhosted.org/packages/75/3e/3b2393b4c8f44285561dc30b00cf307a56a2eff7c483a824db3b8221ca51/pydantic_core-2.46.3-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:975c267cff4f7e7272eacbe50f6cc03ca9a3da4c4fbd66fffd89c94c1e311aa1", size = 2153197, upload-time = "2026-04-20T14:44:27.932Z" },
{ url = "https://files.pythonhosted.org/packages/ba/75/5af02fb35505051eee727c061f2881c555ab4f8ddb2d42da715a42c9731b/pydantic_core-2.46.3-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:2b8e4f2bbdf71415c544b4b1138b8060db7b6611bc927e8064c769f64bed651c", size = 2181073, upload-time = "2026-04-20T14:43:20.729Z" },
{ url = "https://files.pythonhosted.org/packages/10/92/7e0e1bd9ca3c68305db037560ca2876f89b2647deb2f8b6319005de37505/pydantic_core-2.46.3-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e61ea8e9fff9606d09178f577ff8ccdd7206ff73d6552bcec18e1033c4254b85", size = 2315886, upload-time = "2026-04-20T14:44:04.826Z" },
{ url = "https://files.pythonhosted.org/packages/b8/d8/101655f27eaf3e44558ead736b2795d12500598beed4683f279396fa186e/pydantic_core-2.46.3-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:b504bda01bafc69b6d3c7a0c7f039dcf60f47fab70e06fe23f57b5c75bdc82b8", size = 2360528, upload-time = "2026-04-20T14:40:47.431Z" },
{ url = "https://files.pythonhosted.org/packages/07/0f/1c34a74c8d07136f0d729ffe5e1fdab04fbdaa7684f61a92f92511a84a15/pydantic_core-2.46.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:b00b76f7142fc60c762ce579bd29c8fa44aaa56592dd3c54fab3928d0d4ca6ff", size = 2184144, upload-time = "2026-04-20T14:42:57Z" },
]
[[package]]
@@ -1421,14 +1507,23 @@ wheels = [
[[package]]
name = "pyee"
version = "13.0.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/8b/04/e7c1fe4dc78a6fdbfd6c337b1c3732ff543b8a397683ab38378447baa331/pyee-13.0.1.tar.gz", hash = "sha256:0b931f7c14535667ed4c7e0d531716368715e860b988770fc7eb8578d1f67fc8", size = 31655, upload-time = "2026-02-14T21:12:28.044Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a0/c4/b4d4827c93ef43c01f599ef31453ccc1c132b353284fc6c87d535c233129/pyee-13.0.1-py3-none-any.whl", hash = "sha256:af2f8fede4171ef667dfded53f96e2ed0d6e6bd7ee3bb46437f77e3b57689228", size = 15659, upload-time = "2026-02-14T21:12:26.263Z" },
]
[[package]]
name = "pygments"
version = "2.20.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/c3/b2/bc9c9196916376152d655522fdcebac55e66de6603a76a02bca1b6414f6c/pygments-2.20.0.tar.gz", hash = "sha256:6757cd03768053ff99f3039c1a36d6c0aa0b263438fcab17520b30a303a82b5f", size = 4955991, upload-time = "2026-03-29T13:29:33.898Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f4/7e/a72dd26f3b0f4f2bf1dd8923c85f7ceb43172af56d63c7383eb62b332364/pygments-2.20.0-py3-none-any.whl", hash = "sha256:81a9e26dd42fd28a23a2d169d86d7ac03b46e2f8b59ed4698fb4785f946d0176", size = 1231151, upload-time = "2026-03-29T13:29:30.038Z" },
]
[[package]]
@@ -1564,6 +1659,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/07/bc/587a445451b253b285629263eb51c2d8e9bcea4fc97826266d186f96f558/pyserial-3.5-py2.py3-none-any.whl", hash = "sha256:c4451db6ba391ca6ca299fb3ec7bae67a5c55dde170964c7a14ceefec02f2cf0", size = 90585, upload-time = "2020-11-23T03:59:13.41Z" },
]
[[package]]
name = "pysocks"
version = "1.7.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/bd/11/293dd436aea955d45fc4e8a35b6ae7270f5b8e00b53cf6c024c83b657a11/PySocks-1.7.1.tar.gz", hash = "sha256:3f8804571ebe159c380ac6de37643bb4685970655d3bba243530d6558b799aa0", size = 284429, upload-time = "2019-09-20T02:07:35.714Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8d/59/b4572118e098ac8e46e399a1dd0f2d85403ce8bbaad9ec79373ed6badaf9/PySocks-1.7.1-py3-none-any.whl", hash = "sha256:2725bd0a9925919b9b51739eea5f9e2bae91e83288108a9ad338b2e3a4435ee5", size = 16725, upload-time = "2019-09-20T02:06:22.938Z" },
]
[[package]]
name = "pystac"
version = "1.14.3"
@@ -1810,6 +1914,19 @@ dependencies = [
]
sdist = { url = "https://files.pythonhosted.org/packages/0b/0f/b7d5d4b36553731f11983e19e1813a1059ad0732c5162c01b3220c927d31/reverse_geocoder-1.5.1.tar.gz", hash = "sha256:2a2e781b5f69376d922b78fe8978f1350c84fce0ddb07e02c834ecf98b57c75c", size = 2246559, upload-time = "2016-09-15T16:46:46.277Z" }
[[package]]
name = "rich"
version = "15.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "markdown-it-py" },
{ name = "pygments" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c0/8f/0722ca900cc807c13a6a0c696dacf35430f72e0ec571c4275d2371fca3e9/rich-15.0.0.tar.gz", hash = "sha256:edd07a4824c6b40189fb7ac9bc4c52536e9780fbbfbddf6f1e2502c31b068c36", size = 230680, upload-time = "2026-04-12T08:24:00.75Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/82/3b/64d4899d73f91ba49a8c18a8ff3f0ea8f1c1d75481760df8c68ef5235bf5/rich-15.0.0-py3-none-any.whl", hash = "sha256:33bd4ef74232fb73fe9279a257718407f169c09b78a87ad3d296f548e27de0bb", size = 310654, upload-time = "2026-04-12T08:24:02.83Z" },
]
[[package]]
name = "rpds-py"
version = "0.30.0"
@@ -2103,36 +2220,44 @@ sdist = { url = "https://files.pythonhosted.org/packages/9e/bd/3704a8c3e0942d711
[[package]]
name = "sgp4"
version = "2.25"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/6e/d0/fc467010d17742321f73b16a71acac88439a88f2b166641942a6566c9b2a/sgp4-2.25.tar.gz", hash = "sha256:e19edc6dcc25d69fb8fde0a267b8f0c44d7e915c7bcbeacf5d3a8b595baf0674", size = 181016, upload-time = "2025-08-04T18:02:33.765Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6a/86/f329af1f37f88693a7e8db92f9c0bf92a7e7dc44c272f89a2808b0582766/sgp4-2.25-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:29fd9ad2ded9517f6ba10f91e2d993144400c6a925e2b7931198646625beafd4", size = 162967, upload-time = "2025-08-04T18:01:39.296Z" },
{ url = "https://files.pythonhosted.org/packages/87/f0/c258a86bf9f50b46295bdc32af15954719ba88a12267572f7123ffc66f0d/sgp4-2.25-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9ad88a8ced4b78f337765e8463f7f11c5f86d9267f83fc8e3dd8982df67bff45", size = 161934, upload-time = "2025-08-04T18:01:40.99Z" },
{ url = "https://files.pythonhosted.org/packages/a6/4f/f437ca8daaba0e3c4ce67a997f30eead0489908527d6b358418ed4a3e8b1/sgp4-2.25-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc0c6ccb0f83e670e50dcd8a90b8a5bfe5bbf4225ce8450f807e14acc517ab21", size = 235757, upload-time = "2025-08-04T18:01:42.164Z" },
{ url = "https://files.pythonhosted.org/packages/29/19/e8f990fc80d508e7957f9b621366db05d384f5af70d05f77376be74e3007/sgp4-2.25-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3282ec0931e57692f3bf875342f28f41b1155cb575cbe24a30c3cd272ea46fb5", size = 232737, upload-time = "2025-08-04T18:01:43.456Z" },
{ url = "https://files.pythonhosted.org/packages/26/e3/898d7a6d31309cb1be2b2b68dd14141da300ee7257bc30774759a9c1a323/sgp4-2.25-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:18e44f66670c61ae2372d6fecde076cb655f76d211b34b8de440cad5a273409f", size = 235207, upload-time = "2025-08-04T18:01:44.532Z" },
{ url = "https://files.pythonhosted.org/packages/87/c2/bb42e252846d29f668ff26b42adf65c842c8985c033b58d61d1c810728aa/sgp4-2.25-cp310-cp310-win32.whl", hash = "sha256:a2cc50b72b7d2b04c4012b492ec0e76f085e84de45f5e56d3baa4d3ef5f65dac", size = 161870, upload-time = "2025-08-04T18:01:45.461Z" },
{ url = "https://files.pythonhosted.org/packages/23/e8/3e44b47d3ccab7ae02a03d8107651c080da671a0c6e49cb97dd953f10809/sgp4-2.25-cp310-cp310-win_amd64.whl", hash = "sha256:2b92506eef5c07063ab7595db58373bd965f8969fb1fb5b76cbffeb39027ba93", size = 164101, upload-time = "2025-08-04T18:01:46.439Z" },
{ url = "https://files.pythonhosted.org/packages/03/1c/388f1b70a637e3bee179fae1031dcdd8ce09bd040bbbe80fcba20a2e2b86/sgp4-2.25-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:93b22b9ae35db33664f2ddc37955a8d86c3a28f5c668d201e8c6f195a184496f", size = 162964, upload-time = "2025-08-04T18:01:47.866Z" },
{ url = "https://files.pythonhosted.org/packages/47/42/7f27ceca4730d2af31b20928b1c0f1f924cd942c2709d11fc52d9e02f48e/sgp4-2.25-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:33048ff064a4c0b6d8e3c2c79449a49ff45f5dabe8594622f0fb7ed17fa27c0e", size = 161937, upload-time = "2025-08-04T18:01:48.951Z" },
{ url = "https://files.pythonhosted.org/packages/d5/9f/99b1587bd3e6c17405efbd9e48603ae194c9d53938e1f8946c6c5d18c33f/sgp4-2.25-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ac94f1d6fae120beeb40f2af587b351f9cb198837ae0fb3678e3bce44334a2a2", size = 235730, upload-time = "2025-08-04T18:01:49.905Z" },
{ url = "https://files.pythonhosted.org/packages/4f/5a/76f56a0466c3a916403b320363a0f10e71db5a34d2b3627d6a92d2eb9d08/sgp4-2.25-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1b164e636c4f1c64e09c6164b85985395c28c8556bc72ea56e42a889826287a0", size = 232736, upload-time = "2025-08-04T18:01:50.973Z" },
{ url = "https://files.pythonhosted.org/packages/16/f8/c1216d3c85341e30d79c9ca27b2c27dba6ed0238c56e8b04ef568ea92c50/sgp4-2.25-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfaddc20c4d6aa2e86119d13e3fd94a1d05e5bd17cb4fddb2ca5116842bc9228", size = 235203, upload-time = "2025-08-04T18:01:52.275Z" },
{ url = "https://files.pythonhosted.org/packages/52/c6/846a8039a0e8c5b653265e0248ee9836c75f420cfe99a65215a23f035595/sgp4-2.25-cp311-cp311-win32.whl", hash = "sha256:0ecd7d8833f83fe426d7926149665f4f23f4dab34b844e50876a1df88ee9aa7b", size = 161868, upload-time = "2025-08-04T18:01:53.256Z" },
{ url = "https://files.pythonhosted.org/packages/f9/ff/a74904468464d01dd52affb4b15ee1fe5ea804e5128a3f5f5f38c954765d/sgp4-2.25-cp311-cp311-win_amd64.whl", hash = "sha256:6b023f81fb20e62f8fa0b6f506201539ca8306779ef8565422bbf000f1e5a3dc", size = 164096, upload-time = "2025-08-04T18:01:54.264Z" },
{ url = "https://files.pythonhosted.org/packages/0a/71/864524bde46a02e636cc5de47b9a4e1f1ed18c7acc3f1319cf9fe1db3c7a/sgp4-2.25-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:170ec2882cd166ff9d8dccfb8018f86d5cc033ea8a07c27a1825999c62439f05", size = 162985, upload-time = "2025-08-04T18:01:55.646Z" },
{ url = "https://files.pythonhosted.org/packages/e3/cd/022aa419d9570d494dafd5103a71dda64c6ffc956a1c7f5b096a58a23a6a/sgp4-2.25-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:64c7597a60b770caac51566b1f621d1cd74df0409ef19c5e7ea3505d0dfbc677", size = 161951, upload-time = "2025-08-04T18:01:56.745Z" },
{ url = "https://files.pythonhosted.org/packages/3a/1c/76dbf2190d30a770fe8ac57474d212e005f56f47e65dd6fcecdb546d454f/sgp4-2.25-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e1d18b8972643dd29e758e67c062cfb68fbe2421fe3f6398f1957a9825119f6", size = 236340, upload-time = "2025-08-04T18:01:57.778Z" },
{ url = "https://files.pythonhosted.org/packages/97/a4/2fc9bf9cb75571222bd453407317e91193a3db1c559333c5e46ce7a014c9/sgp4-2.25-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:35649388a06cbee7def24cbb789f452c31d42ed9e87bddd89935ed78f19451ed", size = 233080, upload-time = "2025-08-04T18:01:58.812Z" },
{ url = "https://files.pythonhosted.org/packages/fc/40/50ecdc518edd3a85ad74bda7a2196b53d5901256e3d7ab34225c96e8edc8/sgp4-2.25-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:911460477f1c52dcda2b3eb20538435b89b0a43668bcb5edd1e7700b7a1a0225", size = 235729, upload-time = "2025-08-04T18:01:59.83Z" },
{ url = "https://files.pythonhosted.org/packages/1b/dd/c1ee8571828debfd3e0f2297379a2a2af75024062c70cf76bdc121e77623/sgp4-2.25-cp312-cp312-win32.whl", hash = "sha256:128edd3d6061e833600d93e77d4c08d1a5002293997e368256b0b777ea525dda", size = 161899, upload-time = "2025-08-04T18:02:00.882Z" },
{ url = "https://files.pythonhosted.org/packages/c8/f8/7dae15af520dfe5def1f8620c2817203cbbf1a1bf154b2079add1200acd3/sgp4-2.25-cp312-cp312-win_amd64.whl", hash = "sha256:979eb60e74aff5dc318cfe1a6c817db884486bdfc8496d2c5bc07b05fe833280", size = 164137, upload-time = "2025-08-04T18:02:01.817Z" },
{ url = "https://files.pythonhosted.org/packages/02/0f/daf4a70829be7c1550b914c98b3abbd15404d00899835432ae8d4a9be502/sgp4-2.25-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c4d4eab0f2c94aad3a0ab0bedd59f2137484af5480a3b40df8e4ab5a1fbc6b86", size = 162974, upload-time = "2025-08-04T18:02:02.816Z" },
{ url = "https://files.pythonhosted.org/packages/27/88/af20e342590c3ede18cc8dc6a1e1da708f576e1a97dcb69e2870e739ae21/sgp4-2.25-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2822ca25f3724694bfced16cad8b3018678bee47fa3baf4eea20876d0e35ad33", size = 161957, upload-time = "2025-08-04T18:02:03.835Z" },
{ url = "https://files.pythonhosted.org/packages/6e/14/81f0df0cc39bdc95336a6f5834c84a6e5f79b5e728918cb9dadff3278017/sgp4-2.25-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7beca36492eb6d20ef15eeedd9520b8af4fa0cbaaae46a9269d5a2e7c8e56e46", size = 236195, upload-time = "2025-08-04T18:02:05.121Z" },
{ url = "https://files.pythonhosted.org/packages/a3/a7/3740791f656d9b7ad78da7c0d9f6f842a18642fead2d26b2d69fb701892e/sgp4-2.25-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8e9dfd18cacf6bfb1faad29c89a6cec98a642558f805851080dea9c394520db2", size = 232992, upload-time = "2025-08-04T18:02:06.086Z" },
{ url = "https://files.pythonhosted.org/packages/62/45/0e35398ef8d4b07ecfa9f7f680e183b2b6af9215a56af34f9e621c29b495/sgp4-2.25-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5789b7add136362684dfcbf0862919f8c3018f74ab11a05a9964edd5fdd4d2a7", size = 235584, upload-time = "2025-08-04T18:02:07.152Z" },
{ url = "https://files.pythonhosted.org/packages/47/b7/1b680b5514586b3860500109ef37fd1761f21e6787f20a2597baa44d91a0/sgp4-2.25-cp313-cp313-win32.whl", hash = "sha256:94219b486def29aa1246f42de8bea05ccb8e98a5458dd08ce42b9811c79ca814", size = 161896, upload-time = "2025-08-04T18:02:08.062Z" },
{ url = "https://files.pythonhosted.org/packages/67/c1/a22be2e7886db40d1512969f3b8cc3ce989167e69ea8f308f26afd1dfd31/sgp4-2.25-cp313-cp313-win_amd64.whl", hash = "sha256:dec2f6c842d9bf40c67d5764bd752980844f91f338020d2af7f85847364d0ff7", size = 164142, upload-time = "2025-08-04T18:02:09Z" },
{ url = "https://files.pythonhosted.org/packages/3a/47/8231e3d4a88341316ec8d0eb98d3a8a972477d8b038555259522735a8371/sgp4-2.25-py3-none-any.whl", hash = "sha256:4f39ecf6c2663109fed04adfe9982815ac83893271b521d92d5b186820f8c78e", size = 137376, upload-time = "2026-04-27T18:29:23.71Z" },
]
[[package]]
name = "shadowbroker"
version = "0.9.79"
source = { virtual = "." }
[package.metadata]
@@ -2322,6 +2447,74 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/76/fc/310e16254683c1ed35eeb97386986d6c00bc29df17ce280aed64d55537e9/vaderSentiment-3.3.2-py2.py3-none-any.whl", hash = "sha256:3bf1d243b98b1afad575b9f22bc2cb1e212b94ff89ca74f8a23a588d024ea311", size = 125950, upload-time = "2020-05-22T15:07:00.052Z" },
]
[[package]]
name = "websockets"
version = "16.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/04/24/4b2031d72e840ce4c1ccb255f693b15c334757fc50023e4db9537080b8c4/websockets-16.0.tar.gz", hash = "sha256:5f6261a5e56e8d5c42a4497b364ea24d94d9563e8fbd44e78ac40879c60179b5", size = 179346, upload-time = "2026-01-10T09:23:47.181Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/20/74/221f58decd852f4b59cc3354cccaf87e8ef695fede361d03dc9a7396573b/websockets-16.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:04cdd5d2d1dacbad0a7bf36ccbcd3ccd5a30ee188f2560b7a62a30d14107b31a", size = 177343, upload-time = "2026-01-10T09:22:21.28Z" },
{ url = "https://files.pythonhosted.org/packages/19/0f/22ef6107ee52ab7f0b710d55d36f5a5d3ef19e8a205541a6d7ffa7994e5a/websockets-16.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:8ff32bb86522a9e5e31439a58addbb0166f0204d64066fb955265c4e214160f0", size = 175021, upload-time = "2026-01-10T09:22:22.696Z" },
{ url = "https://files.pythonhosted.org/packages/10/40/904a4cb30d9b61c0e278899bf36342e9b0208eb3c470324a9ecbaac2a30f/websockets-16.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:583b7c42688636f930688d712885cf1531326ee05effd982028212ccc13e5957", size = 175320, upload-time = "2026-01-10T09:22:23.94Z" },
{ url = "https://files.pythonhosted.org/packages/9d/2f/4b3ca7e106bc608744b1cdae041e005e446124bebb037b18799c2d356864/websockets-16.0-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:7d837379b647c0c4c2355c2499723f82f1635fd2c26510e1f587d89bc2199e72", size = 183815, upload-time = "2026-01-10T09:22:25.469Z" },
{ url = "https://files.pythonhosted.org/packages/86/26/d40eaa2a46d4302becec8d15b0fc5e45bdde05191e7628405a19cf491ccd/websockets-16.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:df57afc692e517a85e65b72e165356ed1df12386ecb879ad5693be08fac65dde", size = 185054, upload-time = "2026-01-10T09:22:27.101Z" },
{ url = "https://files.pythonhosted.org/packages/b0/ba/6500a0efc94f7373ee8fefa8c271acdfd4dca8bd49a90d4be7ccabfc397e/websockets-16.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:2b9f1e0d69bc60a4a87349d50c09a037a2607918746f07de04df9e43252c77a3", size = 184565, upload-time = "2026-01-10T09:22:28.293Z" },
{ url = "https://files.pythonhosted.org/packages/04/b4/96bf2cee7c8d8102389374a2616200574f5f01128d1082f44102140344cc/websockets-16.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:335c23addf3d5e6a8633f9f8eda77efad001671e80b95c491dd0924587ece0b3", size = 183848, upload-time = "2026-01-10T09:22:30.394Z" },
{ url = "https://files.pythonhosted.org/packages/02/8e/81f40fb00fd125357814e8c3025738fc4ffc3da4b6b4a4472a82ba304b41/websockets-16.0-cp310-cp310-win32.whl", hash = "sha256:37b31c1623c6605e4c00d466c9d633f9b812ea430c11c8a278774a1fde1acfa9", size = 178249, upload-time = "2026-01-10T09:22:32.083Z" },
{ url = "https://files.pythonhosted.org/packages/b4/5f/7e40efe8df57db9b91c88a43690ac66f7b7aa73a11aa6a66b927e44f26fa/websockets-16.0-cp310-cp310-win_amd64.whl", hash = "sha256:8e1dab317b6e77424356e11e99a432b7cb2f3ec8c5ab4dabbcee6add48f72b35", size = 178685, upload-time = "2026-01-10T09:22:33.345Z" },
{ url = "https://files.pythonhosted.org/packages/f2/db/de907251b4ff46ae804ad0409809504153b3f30984daf82a1d84a9875830/websockets-16.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:31a52addea25187bde0797a97d6fc3d2f92b6f72a9370792d65a6e84615ac8a8", size = 177340, upload-time = "2026-01-10T09:22:34.539Z" },
{ url = "https://files.pythonhosted.org/packages/f3/fa/abe89019d8d8815c8781e90d697dec52523fb8ebe308bf11664e8de1877e/websockets-16.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:417b28978cdccab24f46400586d128366313e8a96312e4b9362a4af504f3bbad", size = 175022, upload-time = "2026-01-10T09:22:36.332Z" },
{ url = "https://files.pythonhosted.org/packages/58/5d/88ea17ed1ded2079358b40d31d48abe90a73c9e5819dbcde1606e991e2ad/websockets-16.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:af80d74d4edfa3cb9ed973a0a5ba2b2a549371f8a741e0800cb07becdd20f23d", size = 175319, upload-time = "2026-01-10T09:22:37.602Z" },
{ url = "https://files.pythonhosted.org/packages/d2/ae/0ee92b33087a33632f37a635e11e1d99d429d3d323329675a6022312aac2/websockets-16.0-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:08d7af67b64d29823fed316505a89b86705f2b7981c07848fb5e3ea3020c1abe", size = 184631, upload-time = "2026-01-10T09:22:38.789Z" },
{ url = "https://files.pythonhosted.org/packages/c8/c5/27178df583b6c5b31b29f526ba2da5e2f864ecc79c99dae630a85d68c304/websockets-16.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7be95cfb0a4dae143eaed2bcba8ac23f4892d8971311f1b06f3c6b78952ee70b", size = 185870, upload-time = "2026-01-10T09:22:39.893Z" },
{ url = "https://files.pythonhosted.org/packages/87/05/536652aa84ddc1c018dbb7e2c4cbcd0db884580bf8e95aece7593fde526f/websockets-16.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d6297ce39ce5c2e6feb13c1a996a2ded3b6832155fcfc920265c76f24c7cceb5", size = 185361, upload-time = "2026-01-10T09:22:41.016Z" },
{ url = "https://files.pythonhosted.org/packages/6d/e2/d5332c90da12b1e01f06fb1b85c50cfc489783076547415bf9f0a659ec19/websockets-16.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:1c1b30e4f497b0b354057f3467f56244c603a79c0d1dafce1d16c283c25f6e64", size = 184615, upload-time = "2026-01-10T09:22:42.442Z" },
{ url = "https://files.pythonhosted.org/packages/77/fb/d3f9576691cae9253b51555f841bc6600bf0a983a461c79500ace5a5b364/websockets-16.0-cp311-cp311-win32.whl", hash = "sha256:5f451484aeb5cafee1ccf789b1b66f535409d038c56966d6101740c1614b86c6", size = 178246, upload-time = "2026-01-10T09:22:43.654Z" },
{ url = "https://files.pythonhosted.org/packages/54/67/eaff76b3dbaf18dcddabc3b8c1dba50b483761cccff67793897945b37408/websockets-16.0-cp311-cp311-win_amd64.whl", hash = "sha256:8d7f0659570eefb578dacde98e24fb60af35350193e4f56e11190787bee77dac", size = 178684, upload-time = "2026-01-10T09:22:44.941Z" },
{ url = "https://files.pythonhosted.org/packages/84/7b/bac442e6b96c9d25092695578dda82403c77936104b5682307bd4deb1ad4/websockets-16.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:71c989cbf3254fbd5e84d3bff31e4da39c43f884e64f2551d14bb3c186230f00", size = 177365, upload-time = "2026-01-10T09:22:46.787Z" },
{ url = "https://files.pythonhosted.org/packages/b0/fe/136ccece61bd690d9c1f715baaeefd953bb2360134de73519d5df19d29ca/websockets-16.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:8b6e209ffee39ff1b6d0fa7bfef6de950c60dfb91b8fcead17da4ee539121a79", size = 175038, upload-time = "2026-01-10T09:22:47.999Z" },
{ url = "https://files.pythonhosted.org/packages/40/1e/9771421ac2286eaab95b8575b0cb701ae3663abf8b5e1f64f1fd90d0a673/websockets-16.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:86890e837d61574c92a97496d590968b23c2ef0aeb8a9bc9421d174cd378ae39", size = 175328, upload-time = "2026-01-10T09:22:49.809Z" },
{ url = "https://files.pythonhosted.org/packages/18/29/71729b4671f21e1eaa5d6573031ab810ad2936c8175f03f97f3ff164c802/websockets-16.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:9b5aca38b67492ef518a8ab76851862488a478602229112c4b0d58d63a7a4d5c", size = 184915, upload-time = "2026-01-10T09:22:51.071Z" },
{ url = "https://files.pythonhosted.org/packages/97/bb/21c36b7dbbafc85d2d480cd65df02a1dc93bf76d97147605a8e27ff9409d/websockets-16.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e0334872c0a37b606418ac52f6ab9cfd17317ac26365f7f65e203e2d0d0d359f", size = 186152, upload-time = "2026-01-10T09:22:52.224Z" },
{ url = "https://files.pythonhosted.org/packages/4a/34/9bf8df0c0cf88fa7bfe36678dc7b02970c9a7d5e065a3099292db87b1be2/websockets-16.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a0b31e0b424cc6b5a04b8838bbaec1688834b2383256688cf47eb97412531da1", size = 185583, upload-time = "2026-01-10T09:22:53.443Z" },
{ url = "https://files.pythonhosted.org/packages/47/88/4dd516068e1a3d6ab3c7c183288404cd424a9a02d585efbac226cb61ff2d/websockets-16.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:485c49116d0af10ac698623c513c1cc01c9446c058a4e61e3bf6c19dff7335a2", size = 184880, upload-time = "2026-01-10T09:22:55.033Z" },
{ url = "https://files.pythonhosted.org/packages/91/d6/7d4553ad4bf1c0421e1ebd4b18de5d9098383b5caa1d937b63df8d04b565/websockets-16.0-cp312-cp312-win32.whl", hash = "sha256:eaded469f5e5b7294e2bdca0ab06becb6756ea86894a47806456089298813c89", size = 178261, upload-time = "2026-01-10T09:22:56.251Z" },
{ url = "https://files.pythonhosted.org/packages/c3/f0/f3a17365441ed1c27f850a80b2bc680a0fa9505d733fe152fdf5e98c1c0b/websockets-16.0-cp312-cp312-win_amd64.whl", hash = "sha256:5569417dc80977fc8c2d43a86f78e0a5a22fee17565d78621b6bb264a115d4ea", size = 178693, upload-time = "2026-01-10T09:22:57.478Z" },
{ url = "https://files.pythonhosted.org/packages/cc/9c/baa8456050d1c1b08dd0ec7346026668cbc6f145ab4e314d707bb845bf0d/websockets-16.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:878b336ac47938b474c8f982ac2f7266a540adc3fa4ad74ae96fea9823a02cc9", size = 177364, upload-time = "2026-01-10T09:22:59.333Z" },
{ url = "https://files.pythonhosted.org/packages/7e/0c/8811fc53e9bcff68fe7de2bcbe75116a8d959ac699a3200f4847a8925210/websockets-16.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:52a0fec0e6c8d9a784c2c78276a48a2bdf099e4ccc2a4cad53b27718dbfd0230", size = 175039, upload-time = "2026-01-10T09:23:01.171Z" },
{ url = "https://files.pythonhosted.org/packages/aa/82/39a5f910cb99ec0b59e482971238c845af9220d3ab9fa76dd9162cda9d62/websockets-16.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:e6578ed5b6981005df1860a56e3617f14a6c307e6a71b4fff8c48fdc50f3ed2c", size = 175323, upload-time = "2026-01-10T09:23:02.341Z" },
{ url = "https://files.pythonhosted.org/packages/bd/28/0a25ee5342eb5d5f297d992a77e56892ecb65e7854c7898fb7d35e9b33bd/websockets-16.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:95724e638f0f9c350bb1c2b0a7ad0e83d9cc0c9259f3ea94e40d7b02a2179ae5", size = 184975, upload-time = "2026-01-10T09:23:03.756Z" },
{ url = "https://files.pythonhosted.org/packages/f9/66/27ea52741752f5107c2e41fda05e8395a682a1e11c4e592a809a90c6a506/websockets-16.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c0204dc62a89dc9d50d682412c10b3542d748260d743500a85c13cd1ee4bde82", size = 186203, upload-time = "2026-01-10T09:23:05.01Z" },
{ url = "https://files.pythonhosted.org/packages/37/e5/8e32857371406a757816a2b471939d51c463509be73fa538216ea52b792a/websockets-16.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:52ac480f44d32970d66763115edea932f1c5b1312de36df06d6b219f6741eed8", size = 185653, upload-time = "2026-01-10T09:23:06.301Z" },
{ url = "https://files.pythonhosted.org/packages/9b/67/f926bac29882894669368dc73f4da900fcdf47955d0a0185d60103df5737/websockets-16.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6e5a82b677f8f6f59e8dfc34ec06ca6b5b48bc4fcda346acd093694cc2c24d8f", size = 184920, upload-time = "2026-01-10T09:23:07.492Z" },
{ url = "https://files.pythonhosted.org/packages/3c/a1/3d6ccdcd125b0a42a311bcd15a7f705d688f73b2a22d8cf1c0875d35d34a/websockets-16.0-cp313-cp313-win32.whl", hash = "sha256:abf050a199613f64c886ea10f38b47770a65154dc37181bfaff70c160f45315a", size = 178255, upload-time = "2026-01-10T09:23:09.245Z" },
{ url = "https://files.pythonhosted.org/packages/6b/ae/90366304d7c2ce80f9b826096a9e9048b4bb760e44d3b873bb272cba696b/websockets-16.0-cp313-cp313-win_amd64.whl", hash = "sha256:3425ac5cf448801335d6fdc7ae1eb22072055417a96cc6b31b3861f455fbc156", size = 178689, upload-time = "2026-01-10T09:23:10.483Z" },
{ url = "https://files.pythonhosted.org/packages/f3/1d/e88022630271f5bd349ed82417136281931e558d628dd52c4d8621b4a0b2/websockets-16.0-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:8cc451a50f2aee53042ac52d2d053d08bf89bcb31ae799cb4487587661c038a0", size = 177406, upload-time = "2026-01-10T09:23:12.178Z" },
{ url = "https://files.pythonhosted.org/packages/f2/78/e63be1bf0724eeb4616efb1ae1c9044f7c3953b7957799abb5915bffd38e/websockets-16.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:daa3b6ff70a9241cf6c7fc9e949d41232d9d7d26fd3522b1ad2b4d62487e9904", size = 175085, upload-time = "2026-01-10T09:23:13.511Z" },
{ url = "https://files.pythonhosted.org/packages/bb/f4/d3c9220d818ee955ae390cf319a7c7a467beceb24f05ee7aaaa2414345ba/websockets-16.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:fd3cb4adb94a2a6e2b7c0d8d05cb94e6f1c81a0cf9dc2694fb65c7e8d94c42e4", size = 175328, upload-time = "2026-01-10T09:23:14.727Z" },
{ url = "https://files.pythonhosted.org/packages/63/bc/d3e208028de777087e6fb2b122051a6ff7bbcca0d6df9d9c2bf1dd869ae9/websockets-16.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:781caf5e8eee67f663126490c2f96f40906594cb86b408a703630f95550a8c3e", size = 185044, upload-time = "2026-01-10T09:23:15.939Z" },
{ url = "https://files.pythonhosted.org/packages/ad/6e/9a0927ac24bd33a0a9af834d89e0abc7cfd8e13bed17a86407a66773cc0e/websockets-16.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:caab51a72c51973ca21fa8a18bd8165e1a0183f1ac7066a182ff27107b71e1a4", size = 186279, upload-time = "2026-01-10T09:23:17.148Z" },
{ url = "https://files.pythonhosted.org/packages/b9/ca/bf1c68440d7a868180e11be653c85959502efd3a709323230314fda6e0b3/websockets-16.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:19c4dc84098e523fd63711e563077d39e90ec6702aff4b5d9e344a60cb3c0cb1", size = 185711, upload-time = "2026-01-10T09:23:18.372Z" },
{ url = "https://files.pythonhosted.org/packages/c4/f8/fdc34643a989561f217bb477cbc47a3a07212cbda91c0e4389c43c296ebf/websockets-16.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:a5e18a238a2b2249c9a9235466b90e96ae4795672598a58772dd806edc7ac6d3", size = 184982, upload-time = "2026-01-10T09:23:19.652Z" },
{ url = "https://files.pythonhosted.org/packages/dd/d1/574fa27e233764dbac9c52730d63fcf2823b16f0856b3329fc6268d6ae4f/websockets-16.0-cp314-cp314-win32.whl", hash = "sha256:a069d734c4a043182729edd3e9f247c3b2a4035415a9172fd0f1b71658a320a8", size = 177915, upload-time = "2026-01-10T09:23:21.458Z" },
{ url = "https://files.pythonhosted.org/packages/8a/f1/ae6b937bf3126b5134ce1f482365fde31a357c784ac51852978768b5eff4/websockets-16.0-cp314-cp314-win_amd64.whl", hash = "sha256:c0ee0e63f23914732c6d7e0cce24915c48f3f1512ec1d079ed01fc629dab269d", size = 178381, upload-time = "2026-01-10T09:23:22.715Z" },
{ url = "https://files.pythonhosted.org/packages/06/9b/f791d1db48403e1f0a27577a6beb37afae94254a8c6f08be4a23e4930bc0/websockets-16.0-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:a35539cacc3febb22b8f4d4a99cc79b104226a756aa7400adc722e83b0d03244", size = 177737, upload-time = "2026-01-10T09:23:24.523Z" },
{ url = "https://files.pythonhosted.org/packages/bd/40/53ad02341fa33b3ce489023f635367a4ac98b73570102ad2cdd770dacc9a/websockets-16.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:b784ca5de850f4ce93ec85d3269d24d4c82f22b7212023c974c401d4980ebc5e", size = 175268, upload-time = "2026-01-10T09:23:25.781Z" },
{ url = "https://files.pythonhosted.org/packages/74/9b/6158d4e459b984f949dcbbb0c5d270154c7618e11c01029b9bbd1bb4c4f9/websockets-16.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:569d01a4e7fba956c5ae4fc988f0d4e187900f5497ce46339c996dbf24f17641", size = 175486, upload-time = "2026-01-10T09:23:27.033Z" },
{ url = "https://files.pythonhosted.org/packages/e5/2d/7583b30208b639c8090206f95073646c2c9ffd66f44df967981a64f849ad/websockets-16.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:50f23cdd8343b984957e4077839841146f67a3d31ab0d00e6b824e74c5b2f6e8", size = 185331, upload-time = "2026-01-10T09:23:28.259Z" },
{ url = "https://files.pythonhosted.org/packages/45/b0/cce3784eb519b7b5ad680d14b9673a31ab8dcb7aad8b64d81709d2430aa8/websockets-16.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:152284a83a00c59b759697b7f9e9cddf4e3c7861dd0d964b472b70f78f89e80e", size = 186501, upload-time = "2026-01-10T09:23:29.449Z" },
{ url = "https://files.pythonhosted.org/packages/19/60/b8ebe4c7e89fb5f6cdf080623c9d92789a53636950f7abacfc33fe2b3135/websockets-16.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:bc59589ab64b0022385f429b94697348a6a234e8ce22544e3681b2e9331b5944", size = 186062, upload-time = "2026-01-10T09:23:31.368Z" },
{ url = "https://files.pythonhosted.org/packages/88/a8/a080593f89b0138b6cba1b28f8df5673b5506f72879322288b031337c0b8/websockets-16.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:32da954ffa2814258030e5a57bc73a3635463238e797c7375dc8091327434206", size = 185356, upload-time = "2026-01-10T09:23:32.627Z" },
{ url = "https://files.pythonhosted.org/packages/c2/b6/b9afed2afadddaf5ebb2afa801abf4b0868f42f8539bfe4b071b5266c9fe/websockets-16.0-cp314-cp314t-win32.whl", hash = "sha256:5a4b4cc550cb665dd8a47f868c8d04c8230f857363ad3c9caf7a0c3bf8c61ca6", size = 178085, upload-time = "2026-01-10T09:23:33.816Z" },
{ url = "https://files.pythonhosted.org/packages/9f/3e/28135a24e384493fa804216b79a6a6759a38cc4ff59118787b9fb693df93/websockets-16.0-cp314-cp314t-win_amd64.whl", hash = "sha256:b14dc141ed6d2dde437cddb216004bcac6a1df0935d79656387bd41632ba0bbd", size = 178531, upload-time = "2026-01-10T09:23:35.016Z" },
{ url = "https://files.pythonhosted.org/packages/72/07/c98a68571dcf256e74f1f816b8cc5eae6eb2d3d5cfa44d37f801619d9166/websockets-16.0-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:349f83cd6c9a415428ee1005cadb5c2c56f4389bc06a9af16103c3bc3dcc8b7d", size = 174947, upload-time = "2026-01-10T09:23:36.166Z" },
{ url = "https://files.pythonhosted.org/packages/7e/52/93e166a81e0305b33fe416338be92ae863563fe7bce446b0f687b9df5aea/websockets-16.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:4a1aba3340a8dca8db6eb5a7986157f52eb9e436b74813764241981ca4888f03", size = 175260, upload-time = "2026-01-10T09:23:37.409Z" },
{ url = "https://files.pythonhosted.org/packages/56/0c/2dbf513bafd24889d33de2ff0368190a0e69f37bcfa19009ef819fe4d507/websockets-16.0-pp311-pypy311_pp73-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:f4a32d1bd841d4bcbffdcb3d2ce50c09c3909fbead375ab28d0181af89fd04da", size = 176071, upload-time = "2026-01-10T09:23:39.158Z" },
{ url = "https://files.pythonhosted.org/packages/a5/8f/aea9c71cc92bf9b6cc0f7f70df8f0b420636b6c96ef4feee1e16f80f75dd/websockets-16.0-pp311-pypy311_pp73-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0298d07ee155e2e9fda5be8a9042200dd2e3bb0b8a38482156576f863a9d457c", size = 176968, upload-time = "2026-01-10T09:23:41.031Z" },
{ url = "https://files.pythonhosted.org/packages/9a/3f/f70e03f40ffc9a30d817eef7da1be72ee4956ba8d7255c399a01b135902a/websockets-16.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:a653aea902e0324b52f1613332ddf50b00c06fdaf7e92624fbf8c77c78fa5767", size = 178735, upload-time = "2026-01-10T09:23:42.259Z" },
{ url = "https://files.pythonhosted.org/packages/6f/28/258ebab549c2bf3e64d2b0217b973467394a9cea8c42f70418ca2c5d0d2e/websockets-16.0-py3-none-any.whl", hash = "sha256:1637db62fad1dc833276dded54215f2c7fa46912301a24bd94d45d46a011ceec", size = 171598, upload-time = "2026-01-10T09:23:45.395Z" },
]

[[package]]
name = "winrt-runtime"
version = "3.2.1"
@@ -2644,10 +2837,11 @@ wheels = [
[[package]]
name = "yfinance"
-version = "0.2.54"
+version = "1.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "beautifulsoup4" },
+{ name = "curl-cffi" },
{ name = "frozendict" },
{ name = "multitasking" },
{ name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
@@ -2656,10 +2850,12 @@ dependencies = [
{ name = "pandas", version = "3.0.1", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
{ name = "peewee" },
{ name = "platformdirs" },
{ name = "protobuf" },
{ name = "pytz" },
{ name = "requests" },
{ name = "websockets" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/83/a7/592e5074f5f72be7b4b9fafd2f66e4d20e10afd83a134de63638980fcd19/yfinance-0.2.54.tar.gz", hash = "sha256:a4ab8e2ecba4fda5a36bff0bdc602a014adc732e5eda5d3ac283836ce40356e8", size = 117872, upload-time = "2025-02-18T22:19:21.232Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/9b/fd/943a7d71ce98a40b9006daccba96a83837acadb8e55361f41c7a81873013/yfinance-1.3.0.tar.gz", hash = "sha256:42c4e64a889dab8eeaffd3a66d4ccf1baffd566910ca63fb6332283f8f9b8a40", size = 145297, upload-time = "2026-04-16T19:51:05.785Z" }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/09/05/28664524fcc67c078313d482bf25fe403e9399130622cfc89e185ec0abf6/yfinance-0.2.54-py2.py3-none-any.whl", hash = "sha256:8754f90332158d5d19bf754c1b230864ca2d1d313182a3f94a7bc7718bbe7d90", size = 108707, upload-time = "2025-02-18T22:19:19.883Z" },
+{ url = "https://files.pythonhosted.org/packages/dc/bc/e46ed5dfb88c6f7af0f641ffb6227d32f484ea989a2987a52a9c35d17aa9/yfinance-1.3.0-py2.py3-none-any.whl", hash = "sha256:c89539f0cf6af026d570131189bd659a962e8fb942376ef8ff8913e77c9fca75", size = 133706, upload-time = "2026-04-16T19:51:04.298Z" },
]