diff --git a/CHANGELOG.md b/CHANGELOG.md index c073b906..a194e4b0 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,60 @@ # Changelog +## [1.17.0.0] - 2026-04-26 + +## **Your gstack memory now actually lives in gbrain.** + +For everyone who ran `/setup-gbrain` in the last month and noticed `gbrain search` couldn't find their CEO plans, learnings, or retros: that's because Step 7 wrote a placeholder `consumers.json` with `status: "pending"` and called it done. The HTTP endpoint that placeholder pointed at was never built on the gbrain side. This release scraps that approach and uses the gbrain v0.18.0 federation surface (`gbrain sources` + `gbrain sync`) instead. + +After upgrading, `/setup-gbrain` adds a `git worktree` of your brain repo, registers it as a federated source on your gbrain (Supabase or PGLite), and runs an initial sync. Subsequent gstack skill end-of-run cycles also run `gbrain sync` so new artifacts land in the index automatically. Local-Mac only. No cloud agent required. `/gstack-upgrade` runs a one-shot migration for existing users. + +### Verify after upgrade + +```bash +gbrain sources list --json | jq '.sources[] | {id, page_count, federated}' +# Expect: two entries, your default brain plus a "gstack-brain-{user}" +# entry, both federated=true. + +gbrain search "ethos" --source gstack-brain-{user} | head -5 +# Expect: hits from your gstack repo content (readme, ethos, designs, etc). +``` + +### What shipped + +`bin/gstack-gbrain-source-wireup` is the new helper. It derives a per-user source id from `~/.gstack/.git`'s origin URL (with multi-fallback to `~/.gstack-brain-remote.txt` and a `--source-id` flag), creates a detached `git worktree` at `~/.gstack-brain-worktree/`, registers it as a federated source on gbrain, runs initial backfill, and supports `--strict` (Step 7 strictness), `--uninstall` (full teardown including future-launchd plist), and `--probe` (read-only state inspection). All idempotent. The helper depends on `jq` (transitive via `gstack-gbrain-detect`). + +The helper locks the database URL at startup (precedence: `--database-url` flag > `GBRAIN_DATABASE_URL`/`DATABASE_URL` env > read once from `~/.gbrain/config.json`) and exports it as `GBRAIN_DATABASE_URL` for every child `gbrain` invocation. This means external rewrites of `~/.gbrain/config.json` mid-sync (e.g., a concurrent `gbrain init --non-interactive` running in another workspace) cannot redirect the wireup at a different brain. Per gbrain's `loadConfig()`, env-var URLs override the file. Step 7 of `/setup-gbrain` reads the URL out of `config.json` once and passes it explicitly via `--database-url`, so the wireup is robust against config flips during the seconds-to-minutes sync window. + +`/setup-gbrain` Step 7 now invokes the helper with `--strict` after `gstack-brain-init`. `/gstack-upgrade` invokes the helper without `--strict` via `gstack-upgrade/migrations/v1.12.3.0.sh` so missing/old gbrain is a benign skip during batch upgrade. `bin/gstack-brain-restore` invokes the helper after the initial clone so a 2nd Mac gets the wireup automatically. `bin/gstack-brain-uninstall` invokes `--uninstall` plus removes legacy `consumers.json`. + +`bin/gstack-brain-init` drops 60 lines of dead consumer-registration code (the HTTP POST block, the `consumers.json` writer, the chore commit). `bin/gstack-brain-restore` drops the 18-line `consumers.json` token-rehydration block (the only consumer that used it never had real tokens). `bin/gstack-brain-consumer` is marked deprecated in its header docstring; removal in v1.18.0.0 after one cycle of grace. + +`test/gstack-gbrain-source-wireup.test.ts` is new: 13 unit tests with a fake `gbrain` binary on `$PATH` covering fresh-state registration, idempotent re-runs, drift recovery (gbrain has no `sources update`, only `remove + add`), `--strict` failure modes, source-id fallback chain (`.git` → remote-file → flag), `--probe` non-mutation, sync errors, and `--uninstall`. + +### The numbers that matter + +These are reproducible on any machine after upgrade. Run the verify commands above to see your own delta. + +| Metric | Before (v1.16.0.0) | After (v1.17.0.0) | +|---|---|---| +| `gbrain sources list` size | 1 (default `/data/brain`) | 2 (default + `gstack-brain-{user}`) | +| `consumers.json` status | `"pending"`, ingest_url `""` | file deleted from new installs | +| Manual steps to wire up | 4 (clone + sources add + sync + cron) | 0, automatic in Step 7 | +| Helper test coverage | 0 unit tests | 13 unit tests (`bun test test/gstack-gbrain-source-wireup.test.ts`) | +| `bin/gstack-brain-init` size | 363 lines | 300 lines (60 lines of dead code removed) | + +Local Mac is the producer of artifacts and the worktree advances automatically with `~/.gstack/`'s commits. Cross-machine sync runs through GitHub via the existing `gstack-brain-sync --once` push hook. No new cron infrastructure needed today; when gbrain v0.21 code-graph features ship, the helper's `--enable-cron` flag is a clean extension. + +### What this means for builders + +Your gstack memory is searchable now. Run a CEO plan review or office-hours session, sync runs at skill-end automatically, and `gbrain search` finds the plan content from any gbrain client (this Claude Code session, future Macs, optional cloud agents like OpenClaw). One source of truth across machines. The placeholder is dead. + +### For contributors + +- `bin/gstack-brain-consumer` is deprecated in this release; removal in v1.18.0.0. +- The `gbrain_url` and `gbrain_token` config keys are now no-ops. They remain readable for one cycle for back-compat, removed in v1.18.0.0. +- Three pre-existing test failures on this branch (`gstack-config gbrain keys > GSTACK_HOME overrides real config dir`, `no compiled binaries in git > git tracks no files larger than 2MB`, `Opus 4.7 overlay — pacing directive`) were verified to fail on the base branch too. Out of scope for this PR; flagged for a follow-up. + ## [1.16.0.0] - 2026-04-28 ## **Paired-agent tunnel allowlist now matches what the docs already promised. Catch-22 resolved, gate is unit-testable.** @@ -47,8 +102,6 @@ Three things change immediately. **First**, paired agents can actually open and - The plan was reviewed under `/plan-eng-review` plus 2 sequential codex outside-voice passes during plan mode. Round-1 codex caught a doc-target mistake (we were going to update `SIDEBAR_MESSAGE_FLOW.md` instead of `REMOTE_BROWSER_ACCESS.md`) and a wrong-layer test design. Round-2 codex caught that the round-1 correction was still wrong (the chosen test harness only binds the local listener) AND that the docs promised 6 more commands than the allowlist had. All 6 of 7 substantive findings landed in the implementation; the 7th (a pre-existing `/pair-agent` `/health` probe mismatch at `cli.ts:656-668`) is logged as out of scope. - One known accepted risk: `tabs` over the tunnel returns metadata for ALL tabs in the browser, not just tabs the agent owns. The user authored the trust relationship when they paired the agent, the agent already can't read CONTENT of unowned tabs (write commands blocked, the active tab can't be switched without a `tab ` command that's NOT in the allowlist), and tab IDs already leak via the 403 `hint` field on disallowed `goto`. Codex noted that tightening this requires touching the ownership gate itself (the gate falls back to `getActiveTabId()` BEFORE dispatch in `server.ts:603-614`), which is materially out of scope for a catch-22 fix. Logged in the plan failure-mode table as accepted. - - ## [1.15.0.0] - 2026-04-26 ## **Real-PTY test harness ships. 11 plan-mode E2E tests, 23 unit tests, and 50K fewer tokens per invocation.** diff --git a/USING_GBRAIN_WITH_GSTACK.md b/USING_GBRAIN_WITH_GSTACK.md index f0dfb14c..17dea2b0 100644 --- a/USING_GBRAIN_WITH_GSTACK.md +++ b/USING_GBRAIN_WITH_GSTACK.md @@ -159,6 +159,7 @@ The skill re-collects a PAT (one-time, discarded after), lists every project in | `gstack-gbrain-supabase-verify` | Structural URL check. Rejects direct-connection URLs (`db.*.supabase.co:5432`) with exit 3 | | `gstack-gbrain-supabase-provision` | Management API wrapper. Subcommands: `list-orgs`, `create`, `wait`, `pooler-url`, `list-orphans`, `delete-project`. All require `SUPABASE_ACCESS_TOKEN` in env. `create` and `pooler-url` also require `DB_PASS`. `--json` mode available on every subcommand. | | `gstack-gbrain-repo-policy` | Per-remote trust triad. Subcommands: `get`, `set`, `list`, `normalize` | +| `gstack-gbrain-source-wireup` | Registers your `~/.gstack/` brain repo with gbrain as a federated source via `gbrain sources add` + `git worktree`, then runs an initial `gbrain sync`. Idempotent. Replaces the dead `consumers.json + /ingest-repo` HTTP wireup from v1.12.x. Flags: `--strict`, `--source-id `, `--no-pull`, `--uninstall`, `--probe`. | ### gbrain CLI (upstream tool) diff --git a/VERSION b/VERSION index 6d98661f..706a8a06 100644 --- a/VERSION +++ b/VERSION @@ -1 +1 @@ -1.16.0.0 +1.17.0.0 diff --git a/bin/gstack-brain-consumer b/bin/gstack-brain-consumer index cf92ea3e..12403ae5 100755 --- a/bin/gstack-brain-consumer +++ b/bin/gstack-brain-consumer @@ -1,6 +1,11 @@ #!/usr/bin/env bash # gstack-brain-consumer — manage the consumer (reader) registry. # +# DEPRECATED in v1.17.0.0. This binary targets a gbrain HTTP /ingest-repo +# endpoint that never shipped on the gbrain side. Live federation now uses +# `gbrain sources` directly via bin/gstack-gbrain-source-wireup. This file +# stays for one cycle to avoid breaking external scripts; removal in v1.18.0.0. +# # Consumer = a reader that ingests the gstack-brain git repo as a source of # session memory. v1 primary consumer is GBrain; later versions can register # Codex, OpenClaw, or third-party readers. diff --git a/bin/gstack-brain-init b/bin/gstack-brain-init index 3ed48559..4bf665cc 100755 --- a/bin/gstack-brain-init +++ b/bin/gstack-brain-init @@ -22,11 +22,9 @@ # 8. Prompt for remote (default: gh repo create --private gstack-brain-$USER) # 9. Initial commit + push # 10. Write ~/.gstack-brain-remote.txt (URL-only, safe to share) -# 11. Register GBrain consumer (HTTP POST if GBRAIN_URL set; else defer) # # Env: # GSTACK_HOME — override ~/.gstack -# GBRAIN_URL — GBrain ingest endpoint base URL (for consumer registration) set -euo pipefail @@ -34,7 +32,6 @@ GSTACK_HOME="${GSTACK_HOME:-$HOME/.gstack}" SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)" CONFIG_BIN="$SCRIPT_DIR/gstack-config" REMOTE_FILE="$HOME/.gstack-brain-remote.txt" -CONSUMERS_FILE="$GSTACK_HOME/consumers.json" REMOTE_URL="" while [ $# -gt 0 ]; do @@ -280,68 +277,6 @@ fi echo "$REMOTE_URL" > "$REMOTE_FILE" chmod 600 "$REMOTE_FILE" -# ---- register GBrain consumer ---- -mkdir -p "$GSTACK_HOME" -CONSUMER_STATUS="pending" -GBRAIN_URL_VAL="${GBRAIN_URL:-$("$CONFIG_BIN" get gbrain_url 2>/dev/null || echo "")}" -GBRAIN_TOKEN_VAL="${GBRAIN_TOKEN:-$("$CONFIG_BIN" get gbrain_token 2>/dev/null || echo "")}" - -if [ -n "$GBRAIN_URL_VAL" ] && [ -n "$GBRAIN_TOKEN_VAL" ]; then - # Try the HTTP handoff. - HTTP_RESP=$(curl -sS -X POST "${GBRAIN_URL_VAL%/}/ingest-repo" \ - -H "Authorization: Bearer $GBRAIN_TOKEN_VAL" \ - -H "Content-Type: application/json" \ - --data "{\"repo_url\":\"$REMOTE_URL\"}" \ - -w "\n%{http_code}" 2>&1 || echo -e "\ncurl-error") - HTTP_CODE=$(echo "$HTTP_RESP" | tail -1) - if [ "$HTTP_CODE" = "200" ] || [ "$HTTP_CODE" = "201" ] || [ "$HTTP_CODE" = "204" ]; then - CONSUMER_STATUS="ok" - echo "GBrain consumer registered: $GBRAIN_URL_VAL" - else - echo "GBrain ingest endpoint returned HTTP $HTTP_CODE; will retry on next skill run." - fi -elif [ -z "$GBRAIN_URL_VAL" ]; then - echo "(GBRAIN_URL not configured; skipping consumer registration. Set it with:" - echo " gstack-config set gbrain_url " - echo " gstack-config set gbrain_token " - echo " then run: gstack-brain-consumer add gbrain --ingest-url --token )" -fi - -# Write consumers.json — the canonical registry. Tokens are NOT stored here; -# they stay in gstack-config (machine-local). This file IS synced so a new -# machine knows which consumers exist and can prompt for tokens. -python3 - "$CONSUMERS_FILE" "$GBRAIN_URL_VAL" "$CONSUMER_STATUS" <<'PYEOF' -import sys, json, os -path, url, status = sys.argv[1:4] -try: - with open(path) as f: - data = json.load(f) -except (FileNotFoundError, json.JSONDecodeError): - data = {"consumers": []} -# Upsert GBrain entry. -entry = {"name": "gbrain", "ingest_url": url, "status": status, "token_ref": "gbrain_token"} -updated = False -for i, c in enumerate(data.get("consumers", [])): - if c.get("name") == "gbrain": - data["consumers"][i] = entry - updated = True - break -if not updated: - data.setdefault("consumers", []).append(entry) -with open(path, "w") as f: - json.dump(data, f, indent=2) - f.write("\n") -PYEOF - -# Stage and commit consumers.json in the same session. -cd "$GSTACK_HOME" -git add -f consumers.json 2>/dev/null || true -if ! git diff --cached --quiet 2>/dev/null; then - git -c user.email="gstack@localhost" -c user.name="gstack-brain-init" \ - commit -q -m "chore: register GBrain consumer" - git push -q origin HEAD 2>/dev/null || true -fi - # ---- done ---- cat <") -PYEOF -fi - # ---- write remote helper file if missing ---- if [ ! -f "$REMOTE_FILE" ]; then echo "$REMOTE_URL" > "$REMOTE_FILE" @@ -222,6 +204,12 @@ if [ ! -f "$REMOTE_FILE" ]; then echo "Wrote $REMOTE_FILE for future skill-run auto-detection." fi +# ---- wire the cloned brain into gbrain (best-effort) ---- +WIREUP_BIN="$SCRIPT_DIR/gstack-gbrain-source-wireup" +if [ -x "$WIREUP_BIN" ]; then + "$WIREUP_BIN" || >&2 echo "WARNING: gbrain wireup failed; run $WIREUP_BIN manually after fixing prereqs" +fi + cat </dev/null || true rm -f "$GSTACK_HOME/.brain-skip.txt" 2>/dev/null || true rm -f "$GSTACK_HOME/.brain-sync-status.json" 2>/dev/null || true rm -rf "$GSTACK_HOME/.brain-sync.lock.d" 2>/dev/null || true + +# ---- unregister gbrain federated source + remove worktree (best-effort) ---- +# The wireup helper handles: gbrain sources remove, git worktree remove, +# launchd plist (future). All best-effort; uninstall continues on failure. +WIREUP_BIN="$SCRIPT_DIR/gstack-gbrain-source-wireup" +if [ -x "$WIREUP_BIN" ]; then + "$WIREUP_BIN" --uninstall 2>/dev/null || true +fi + +# ---- legacy consumers.json (no longer written by gstack-brain-init since v1.17.0.0) ---- rm -f "$GSTACK_HOME/consumers.json" 2>/dev/null || true # ---- clear config keys ---- diff --git a/bin/gstack-gbrain-source-wireup b/bin/gstack-gbrain-source-wireup new file mode 100755 index 00000000..3b175482 --- /dev/null +++ b/bin/gstack-gbrain-source-wireup @@ -0,0 +1,357 @@ +#!/usr/bin/env bash +# gstack-gbrain-source-wireup — register the gstack brain repo as a gbrain +# federated source via `git worktree`, run an initial sync, hook into +# subsequent skill-end syncs. +# +# Replaces the v1.12.2.0 dead `consumers.json + ingest_url + /ingest-repo` +# wireup which depended on a gbrain HTTP endpoint that never shipped. +# +# Usage: +# gstack-gbrain-source-wireup [--strict] [--source-id ] [--no-pull] +# [--database-url ] +# gstack-gbrain-source-wireup --uninstall [--source-id ] +# [--database-url ] +# gstack-gbrain-source-wireup --probe +# gstack-gbrain-source-wireup --help +# +# Exit codes: +# 0 — success, OR benign skip without --strict +# 1 — hard failure (gbrain or git op errored on a real call) +# 2 — missing prereqs (no gbrain >= 0.18.0, no .git or remote-file) +# 3 — source-id derivation failed in --uninstall, no fallback worked +# +# Env: +# GSTACK_HOME — override ~/.gstack (test harness) +# GSTACK_BRAIN_WORKTREE — override worktree path (default ~/.gstack-brain-worktree) +# GSTACK_BRAIN_SOURCE_ID — id override; --source-id flag takes precedence +# GSTACK_BRAIN_NO_SYNC — skip the gbrain sync step (tests; helper still +# ensures source registration) +# +# Defense against external rewrites of ~/.gbrain/config.json: +# At helper startup we capture the database URL ONCE — from --database-url, +# from GBRAIN_DATABASE_URL/DATABASE_URL env, or from ~/.gbrain/config.json — +# and export it as GBRAIN_DATABASE_URL for every child `gbrain` invocation. +# That env var overrides whatever's in config.json (per gbrain's loadConfig +# at src/core/config.ts:53), so a process that flips config.json mid-sync +# can't redirect us at a different brain mid-stream. +# +# Depends on: jq (transitive via gstack-gbrain-detect). + +set -euo pipefail + +SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)" +CONFIG_BIN="$SCRIPT_DIR/gstack-config" + +GSTACK_HOME="${GSTACK_HOME:-$HOME/.gstack}" +WORKTREE="${GSTACK_BRAIN_WORKTREE:-$HOME/.gstack-brain-worktree}" +REMOTE_FILE="$HOME/.gstack-brain-remote.txt" +PLIST_PATH="$HOME/Library/LaunchAgents/com.gstack.brain-sync.plist" +GBRAIN_CONFIG="$HOME/.gbrain/config.json" + +# ---- arg parse ---- +MODE="wireup" +STRICT=0 +NO_PULL=0 +SOURCE_ID="" +DATABASE_URL_ARG="" + +while [ $# -gt 0 ]; do + case "$1" in + --uninstall) MODE="uninstall"; shift ;; + --probe) MODE="probe"; shift ;; + --strict) STRICT=1; shift ;; + --no-pull) NO_PULL=1; shift ;; + --source-id) SOURCE_ID="$2"; shift 2 ;; + --database-url) DATABASE_URL_ARG="$2"; shift 2 ;; + --help|-h) sed -n '2,40p' "$0" | sed 's/^# \{0,1\}//'; exit 0 ;; + *) echo "Unknown flag: $1" >&2; exit 1 ;; + esac +done + +# ---- lock the database URL at startup ---- +# Precedence: --database-url flag > existing GBRAIN_DATABASE_URL/DATABASE_URL +# env > read once from ~/.gbrain/config.json. Whichever wins gets exported as +# GBRAIN_DATABASE_URL so every child `gbrain` invocation uses THAT brain even +# if config.json is rewritten by another process during the wireup. +_locked_url="" +if [ -n "$DATABASE_URL_ARG" ]; then + _locked_url="$DATABASE_URL_ARG" +elif [ -n "${GBRAIN_DATABASE_URL:-}" ]; then + _locked_url="$GBRAIN_DATABASE_URL" +elif [ -n "${DATABASE_URL:-}" ]; then + _locked_url="$DATABASE_URL" +elif [ -f "$GBRAIN_CONFIG" ]; then + # Python heredoc reads config.json. On JSON parse failure or any IO error, + # we WARN (not silently swallow) so the user knows the URL lock fell back + # to gbrain's own loadConfig (which would still read this same file). + _py_err=$(mktemp -t wireup-pyerr 2>/dev/null || mktemp /tmp/wireup-pyerr.XXXXXX) + _locked_url=$(GBRAIN_CONFIG_PATH="$GBRAIN_CONFIG" python3 -c ' +import json, os, sys +try: + c = json.load(open(os.environ["GBRAIN_CONFIG_PATH"])) + print(c.get("database_url","")) +except FileNotFoundError: + sys.exit(0) +except Exception as e: + print(f"config.json parse error: {e}", file=sys.stderr) + sys.exit(1) +' "$_py_err") || warn "could not read $GBRAIN_CONFIG ($(cat "$_py_err" 2>/dev/null)); URL not locked" + rm -f "$_py_err" 2>/dev/null +fi +if [ -n "$_locked_url" ]; then + export GBRAIN_DATABASE_URL="$_locked_url" +fi + +prefix() { sed 's/^/gstack-gbrain-source-wireup: /' >&2; } +warn() { echo "$*" | prefix; } +# die [exit_code]: warn with just the message, exit with code (default 1). +die() { warn "$1"; exit "${2:-1}"; } + +# Refuse to rm anything outside $HOME/. Defends against GSTACK_BRAIN_WORKTREE=/ +# or empty-string overrides that would otherwise have line 169 / 161 nuke the +# user's home or root. +safe_rm_worktree() { + local target="$1" + case "$target" in + "" | "/" | "/Users" | "/Users/" | "$HOME" | "$HOME/" ) + die "refusing to rm dangerous path: $target" 1 ;; + esac + case "$target" in + "$HOME"/*) rm -rf "$target" ;; + *) die "refusing to rm path outside \$HOME: $target" 1 ;; + esac +} + +# ---- source-id derivation (D6 multi-fallback) ---- +derive_source_id() { + if [ -n "$SOURCE_ID" ]; then + echo "$SOURCE_ID"; return 0 + fi + if [ -n "${GSTACK_BRAIN_SOURCE_ID:-}" ]; then + echo "$GSTACK_BRAIN_SOURCE_ID"; return 0 + fi + local remote_url="" + remote_url=$(git -C "$GSTACK_HOME" remote get-url origin 2>/dev/null) || true + if [ -z "$remote_url" ] && [ -f "$REMOTE_FILE" ]; then + remote_url=$(head -1 "$REMOTE_FILE" 2>/dev/null | tr -d '[:space:]') + fi + [ -z "$remote_url" ] && return 3 + basename "$remote_url" .git \ + | tr '[:upper:]' '[:lower:]' \ + | tr -c 'a-z0-9-' '-' \ + | sed 's/--*/-/g; s/^-//; s/-$//' \ + | cut -c1-32 +} + +# ---- gbrain version gate ---- +gbrain_version_ok() { + if ! command -v gbrain >/dev/null 2>&1; then + return 1 + fi + local v + v=$(gbrain --version 2>/dev/null | awk '{print $2}') + [ -z "$v" ] && return 1 + # 0.18.0 minimum (gbrain sources shipped here). Put the floor first in stdin + # so equal or greater $v sorts to position 2 — head -1 == "0.18.0" iff $v >= floor. + [ "$(printf '0.18.0\n%s\n' "$v" | sort -V | head -1)" = "0.18.0" ] +} + +# ---- worktree management ---- +# A worktree is always created `--detach`ed at $GSTACK_HOME's HEAD. Detached +# because a branch (main) can only be checked out in ONE worktree, and the +# parent at $GSTACK_HOME already has it. To advance, we re-checkout the +# parent's current HEAD into the detached worktree. +_worktree_add_detached() { + local sha + sha=$(git -C "$GSTACK_HOME" rev-parse HEAD 2>/dev/null) || return 1 + git -C "$GSTACK_HOME" worktree prune 2>/dev/null || true + # Surface git errors via prefix so users see WHY the add failed (disk, perms, etc). + git -C "$GSTACK_HOME" worktree add --detach "$WORKTREE" "$sha" 2>&1 | prefix + return "${PIPESTATUS[0]}" +} + +ensure_worktree() { + if [ ! -d "$GSTACK_HOME/.git" ]; then + return 2 + fi + if [ -d "$WORKTREE/.git" ] || [ -f "$WORKTREE/.git" ]; then + # already exists; advance the detached HEAD to parent's current HEAD + if [ "$NO_PULL" = "0" ]; then + local sha + sha=$(git -C "$GSTACK_HOME" rev-parse HEAD 2>/dev/null) || return 1 + # Surface checkout errors via prefix so users see WHY the advance failed + # (uncommitted changes in the detached worktree, ref ambiguity, etc). + ( cd "$WORKTREE" && git checkout --detach "$sha" 2>&1 | prefix; exit "${PIPESTATUS[0]}" ) || { + warn "worktree at $WORKTREE could not advance to $sha; resetting via remove + re-add" + git -C "$GSTACK_HOME" worktree remove --force "$WORKTREE" 2>/dev/null || safe_rm_worktree "$WORKTREE" + _worktree_add_detached || return 1 + } + fi + return 0 + fi + # Stray non-git dir? Remove first. + [ -e "$WORKTREE" ] && safe_rm_worktree "$WORKTREE" + _worktree_add_detached || return 1 +} + +# ---- gbrain sources operations ---- +# Returns 0 if source with id exists at expected path. 1 if exists but path differs. 2 if absent. +# Hard-fails (exits non-zero via die) if jq is missing — without jq we cannot +# distinguish "absent" from "missing-tool" and would falsely re-add an existing +# source. jq is documented as a dependency of gstack-gbrain-detect (transitive) +# but adversarial review flagged the silent-fall-through path; this probe makes +# the failure mode loud. +check_source_state() { + local id="$1" + if ! command -v jq >/dev/null 2>&1; then + die "jq required for source state detection. Install jq (brew install jq) and re-run." 1 + fi + local existing_path + existing_path=$(gbrain sources list --json 2>/dev/null \ + | jq -r --arg id "$id" '.sources[] | select(.id==$id) | .local_path' 2>/dev/null \ + | tr -d '[:space:]') || existing_path="" + if [ -z "$existing_path" ]; then + return 2 + fi + if [ "$existing_path" = "$WORKTREE" ]; then + return 0 + fi + return 1 +} + +# ---- modes ---- +do_probe() { + local id worktree_status="absent" gbrain_status="missing" source_status="absent" + id=$(derive_source_id 2>/dev/null) || id="(unknown)" + # Use explicit if-block so [ -d ] || [ -f ] doesn't get short-circuited by && + # precedence (the `||` and `&&` chain has trap behavior in bash test syntax). + if [ -d "$WORKTREE/.git" ] || [ -f "$WORKTREE/.git" ]; then + worktree_status="present" + fi + if gbrain_version_ok; then + gbrain_status="ok ($(gbrain --version 2>/dev/null | awk '{print $2}'))" + # Capture check_source_state's return code explicitly. Relying on $? after + # an `if`-elif chain is fragile under set -e and undefined under some shells. + set +e + check_source_state "$id" + local css_rc=$? + set -e + case "$css_rc" in + 0) source_status="registered ($WORKTREE)" ;; + 1) source_status="registered (different path)" ;; + esac + fi + echo "source_id=$id" + echo "worktree=$WORKTREE" + echo "worktree_status=$worktree_status" + echo "gbrain=$gbrain_status" + echo "source_status=$source_status" +} + +do_wireup() { + local id + id=$(derive_source_id) || die "cannot derive source id (no .git, no remote-file, no --source-id)" 2 + + if ! gbrain_version_ok; then + if [ "$STRICT" = "1" ]; then + die "gbrain not installed or < 0.18.0; install/upgrade gbrain and re-run" 2 + fi + warn "gbrain not installed or < 0.18.0; skipping wireup (benign skip)" + exit 0 + fi + + # Capture ensure_worktree's return code explicitly. `$?` after `||` reflects + # the LAST command in the function under set -e, which is unreliable when the + # function has multiple internal exit paths. + set +e + ensure_worktree + ew_rc=$? + set -e + case "$ew_rc" in + 0) : ;; # success + 2) + [ "$STRICT" = "1" ] && die "no $GSTACK_HOME/.git; run /setup-gbrain Step 7 (gstack-brain-init) first" 2 + warn "no $GSTACK_HOME/.git; skipping (benign skip)" + exit 0 + ;; + *) die "git worktree creation failed at $WORKTREE" 1 ;; + esac + + # Source registration: probe state, then act. + set +e + check_source_state "$id" + local sstate=$? + set -e + case "$sstate" in + 0) : ;; # already correctly registered + 1) + # Multi-Mac case: if the existing path also looks like another machine's + # brain-worktree (same basename, different parent), don't ping-pong the + # registration. Just sync from our local worktree — gbrain stores pages + # by content, not by local_path. The metadata is informational only. + local existing_path + existing_path=$(gbrain sources list --json 2>/dev/null \ + | jq -r --arg id "$id" '.sources[] | select(.id==$id) | .local_path' 2>/dev/null \ + | tr -d '[:space:]') || existing_path="" + if [ "$(basename "$existing_path")" = "$(basename "$WORKTREE")" ] \ + && [ "$existing_path" != "$WORKTREE" ]; then + warn "source $id is registered at $existing_path (likely another machine's local copy of the same brain repo). Skipping re-registration; will sync from local worktree." + else + warn "source $id registered with different path; recreating (gbrain has no 'sources update')" + gbrain sources remove "$id" --yes 2>&1 | prefix || die "gbrain sources remove failed" 1 + gbrain sources add "$id" --path "$WORKTREE" --federated 2>&1 | prefix \ + || die "gbrain sources add failed" 1 + fi + ;; + 2) + gbrain sources add "$id" --path "$WORKTREE" --federated 2>&1 | prefix \ + || die "gbrain sources add failed" 1 + ;; + esac + + if [ "${GSTACK_BRAIN_NO_SYNC:-0}" = "1" ]; then + echo "source_id=$id" + echo "worktree=$WORKTREE" + echo "pages_synced=skipped" + exit 0 + fi + + local sync_out sync_redacted + sync_out=$(gbrain sync --repo "$WORKTREE" 2>&1) || { + # Redact any postgres:// URLs from the error message in case gbrain logged + # a connection error containing the full DSN with password. The user sees + # "***REDACTED***" instead of credentials in their stderr or any log. + sync_redacted=$(echo "$sync_out" | tail -10 | sed -E 's#postgres(ql)?://[^[:space:]]+#postgres://***REDACTED***#g') + die "gbrain sync failed (last 10 lines, secrets redacted): $sync_redacted" 1 + } + echo "$sync_out" | tail -3 | prefix + + echo "source_id=$id" + echo "worktree=$WORKTREE" + echo "pages_synced=$(echo "$sync_out" | grep -oE '[0-9]+ pages? imported' | head -1 || echo 'incremental')" +} + +do_uninstall() { + local id + id=$(derive_source_id) || die "cannot derive source id; pass --source-id explicitly" 3 + + if command -v gbrain >/dev/null 2>&1; then + gbrain sources remove "$id" --yes 2>&1 | prefix || warn "gbrain sources remove failed (continuing)" + fi + + if [ -d "$WORKTREE/.git" ] || [ -f "$WORKTREE/.git" ]; then + git -C "$GSTACK_HOME" worktree remove --force "$WORKTREE" 2>/dev/null \ + || safe_rm_worktree "$WORKTREE" + fi + + # Cron-stub: future launchd plist (not created today; safety net for D9 future). + rm -f "$PLIST_PATH" 2>/dev/null || true + + echo "uninstalled source=$id worktree=$WORKTREE" +} + +case "$MODE" in + probe) do_probe ;; + wireup) do_wireup ;; + uninstall) do_uninstall ;; +esac diff --git a/docs/gbrain-sync.md b/docs/gbrain-sync.md index 02e9dd4c..e5f1d700 100644 --- a/docs/gbrain-sync.md +++ b/docs/gbrain-sync.md @@ -43,9 +43,13 @@ The command: 3. Pushes an initial commit with just the config. 4. Writes `~/.gstack-brain-remote.txt` (URL-only, no secrets — safe to copy to another machine). -5. Registers GBrain as a reader if `GBRAIN_URL` + `GBRAIN_TOKEN` are - configured. Otherwise you can add readers later with - `gstack-brain-reader add --ingest-url --token `. +5. Wires the gstack-brain repo into your local gbrain as a federated + source (via `gbrain sources add` + `git worktree`) so `gbrain search` + can index your synced learnings, plans, and designs. Implementation + lives in `bin/gstack-gbrain-source-wireup`. The old + `gstack-brain-reader add --ingest-url ...` HTTP path was removed in + v1.15.1.0 — it depended on a `/ingest-repo` endpoint gbrain never + shipped. After init, the **next skill you run** will ask you ONE question about privacy mode: diff --git a/gstack-upgrade/migrations/v1.17.0.0.sh b/gstack-upgrade/migrations/v1.17.0.0.sh new file mode 100755 index 00000000..5b8f1dd9 --- /dev/null +++ b/gstack-upgrade/migrations/v1.17.0.0.sh @@ -0,0 +1,56 @@ +#!/usr/bin/env bash +# Migration: v1.17.0.0 — Wire existing brain-sync repos as gbrain federated sources +# +# Pre-1.17.0.0 /setup-gbrain wrote ~/.gstack/consumers.json with a placeholder +# `status: "pending"` and an empty `ingest_url`, expecting a gbrain HTTP +# /ingest-repo endpoint that never shipped. This migration runs the real +# wireup (gbrain sources add + worktree + initial sync) for users who +# already opted into brain-sync but never got the gbrain side connected. +# +# Idempotent: safe to re-run. Skips when: +# - User never opted into brain-sync (gbrain_sync_mode = off or unset) +# - No ~/.gstack/.git (brain-init never ran) +# - The wireup helper is missing on disk (broken install — defensive) +# +# Failure mode: invokes the helper WITHOUT --strict, so a missing/old gbrain +# CLI is a benign skip rather than blocking the rest of /gstack-upgrade. +set -euo pipefail + +if [ -z "${HOME:-}" ]; then + echo " [v1.17.0.0] HOME is unset or empty — skipping migration." >&2 + exit 0 +fi + +SKILLS_DIR="${HOME}/.claude/skills" +BIN_DIR="${SKILLS_DIR}/gstack/bin" +CONFIG_BIN="${BIN_DIR}/gstack-config" +WIREUP_BIN="${BIN_DIR}/gstack-gbrain-source-wireup" + +# Skip if user never opted into brain-sync. +SYNC_MODE="" +if [ -x "$CONFIG_BIN" ]; then + # Trim whitespace defensively: gstack-config can emit trailing newlines, + # which would mis-classify "off\n" as a non-empty non-off mode. + SYNC_MODE=$("$CONFIG_BIN" get gbrain_sync_mode 2>/dev/null | tr -d '[:space:]' || echo "") +fi +if [ "$SYNC_MODE" = "off" ] || [ -z "$SYNC_MODE" ]; then + exit 0 +fi + +# Skip if no brain-sync git repo exists. +if [ ! -d "${HOME}/.gstack/.git" ]; then + exit 0 +fi + +# Skip if helper missing (defensive — should always be present post-upgrade). +if [ ! -x "$WIREUP_BIN" ]; then + echo " [v1.17.0.0] $WIREUP_BIN missing or non-executable — skipping wireup." >&2 + exit 0 +fi + +echo " [v1.17.0.0] Wiring brain-sync repo into gbrain (federated source + initial sync)..." + +# No --strict: missing/old gbrain is a benign skip during a batch upgrade. +"$WIREUP_BIN" || { + echo " [v1.17.0.0] Wireup exited non-zero — re-run manually with: $WIREUP_BIN" >&2 +} diff --git a/package.json b/package.json index 4aac18f0..5326f311 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "gstack", - "version": "1.16.0.0", + "version": "1.17.0.0", "description": "Garry's Stack — Claude Code skills + fast headless browser. One repo, one install, entire AI engineering workflow.", "license": "MIT", "type": "module", diff --git a/setup-gbrain/SKILL.md b/setup-gbrain/SKILL.md index 77e297b4..1ee78dac 100644 --- a/setup-gbrain/SKILL.md +++ b/setup-gbrain/SKILL.md @@ -986,7 +986,7 @@ For `/setup-gbrain --repo` invocations, execute ONLY Step 6 and exit. --- -## Step 7: Offer gstack-brain-sync +## Step 7: Offer gstack-brain-sync + wire it into gbrain Separate AskUserQuestion: "Also sync your gstack session memory (learnings, plans, retros) to a private git repo that gbrain can index across machines?" @@ -1004,6 +1004,37 @@ If yes: # or "full" if user picked yes-full ``` +Then wire the brain repo into gbrain so its content is searchable from any +gbrain client (this Claude Code session, future Macs, optional cloud agents). +The helper creates a `git worktree` of `~/.gstack/`, registers it as a +federated source on the user's gbrain (Supabase or PGLite), and runs an +initial `gbrain sync`. Local-Mac only. No cloud agent required. Subsequent +skill runs trigger incremental sync via the existing skill-end push hook. + +Capture the database URL out of `~/.gbrain/config.json` first and pass it +explicitly so the wireup is robust against any other process rewriting +`~/.gbrain/config.json` mid-sync (e.g., concurrent `gbrain init` runs +elsewhere on the machine): + +```bash +GBRAIN_URL=$(python3 -c " +import json, os, sys +try: + c = json.load(open(os.path.expanduser('~/.gbrain/config.json'))) + print(c.get('database_url', '')) +except Exception: + pass +") +~/.claude/skills/gstack/bin/gstack-gbrain-source-wireup --strict \ + ${GBRAIN_URL:+--database-url "$GBRAIN_URL"} +``` + +`--strict` exits non-zero on missing prereqs (gbrain not installed, < 0.18.0, +or no `~/.gstack/.git` yet) so the user sees the failure rather than silently +ending up with an unwired brain. On non-zero exit, surface the helper's +output and STOP per skill rules — search-across-machines won't work until +the prereq is fixed. + --- ## Step 8: Persist `## GBrain Configuration` in CLAUDE.md diff --git a/setup-gbrain/SKILL.md.tmpl b/setup-gbrain/SKILL.md.tmpl index 685e15e0..3bbf9b12 100644 --- a/setup-gbrain/SKILL.md.tmpl +++ b/setup-gbrain/SKILL.md.tmpl @@ -347,7 +347,7 @@ For `/setup-gbrain --repo` invocations, execute ONLY Step 6 and exit. --- -## Step 7: Offer gstack-brain-sync +## Step 7: Offer gstack-brain-sync + wire it into gbrain Separate AskUserQuestion: "Also sync your gstack session memory (learnings, plans, retros) to a private git repo that gbrain can index across machines?" @@ -365,6 +365,37 @@ If yes: # or "full" if user picked yes-full ``` +Then wire the brain repo into gbrain so its content is searchable from any +gbrain client (this Claude Code session, future Macs, optional cloud agents). +The helper creates a `git worktree` of `~/.gstack/`, registers it as a +federated source on the user's gbrain (Supabase or PGLite), and runs an +initial `gbrain sync`. Local-Mac only. No cloud agent required. Subsequent +skill runs trigger incremental sync via the existing skill-end push hook. + +Capture the database URL out of `~/.gbrain/config.json` first and pass it +explicitly so the wireup is robust against any other process rewriting +`~/.gbrain/config.json` mid-sync (e.g., concurrent `gbrain init` runs +elsewhere on the machine): + +```bash +GBRAIN_URL=$(python3 -c " +import json, os, sys +try: + c = json.load(open(os.path.expanduser('~/.gbrain/config.json'))) + print(c.get('database_url', '')) +except Exception: + pass +") +~/.claude/skills/gstack/bin/gstack-gbrain-source-wireup --strict \ + ${GBRAIN_URL:+--database-url "$GBRAIN_URL"} +``` + +`--strict` exits non-zero on missing prereqs (gbrain not installed, < 0.18.0, +or no `~/.gstack/.git` yet) so the user sees the failure rather than silently +ending up with an unwired brain. On non-zero exit, surface the helper's +output and STOP per skill rules — search-across-machines won't work until +the prereq is fixed. + --- ## Step 8: Persist `## GBrain Configuration` in CLAUDE.md diff --git a/test/gstack-gbrain-source-wireup.test.ts b/test/gstack-gbrain-source-wireup.test.ts new file mode 100644 index 00000000..d7a30b76 --- /dev/null +++ b/test/gstack-gbrain-source-wireup.test.ts @@ -0,0 +1,440 @@ +/** + * gstack-gbrain-source-wireup — unit tests with mocked gbrain CLI. + * + * The helper registers the gstack brain repo as a gbrain federated source + * via `git worktree`, runs an initial sync, and exposes --uninstall + --probe. + * + * Strategy: put a fake `gbrain` binary on PATH that records every call into + * a log file and reads/writes its "registered sources" state from a JSON + * file in the test's tmp dir. The helper sees a consistent gbrain-CLI surface + * but no real database, no real gbrain. + */ + +import { describe, test, expect, beforeEach, afterEach } from 'bun:test'; +import * as fs from 'fs'; +import * as os from 'os'; +import * as path from 'path'; +import { spawnSync } from 'child_process'; + +const ROOT = path.resolve(import.meta.dir, '..'); +const BIN_DIR = path.join(ROOT, 'bin'); +const WIREUP_BIN = path.join(BIN_DIR, 'gstack-gbrain-source-wireup'); + +let tmpHome: string; +let gstackHome: string; +let worktreeDir: string; +let fakeBinDir: string; +let gbrainCallLog: string; +let gbrainStateFile: string; + +function makeFakeGbrain(opts: { + version?: string | null; // null = "binary missing" (don't write the file) + syncFails?: boolean; +}) { + const version = opts.version ?? '0.18.2'; + if (version === null) return; // simulate missing binary by NOT writing one + const syncFails = opts.syncFails ?? false; + + // Stub gbrain reads/writes state from a JSON file. Fields: + // sources: [{id, local_path, federated}] + fs.writeFileSync(gbrainStateFile, JSON.stringify({ sources: [] }, null, 2)); + + const script = `#!/bin/bash +LOG="${gbrainCallLog}" +STATE="${gbrainStateFile}" +# Record the call AND any GBRAIN_DATABASE_URL that the parent passed via env. +# Format: "gbrain [GBRAIN_DATABASE_URL=]" so tests can assert +# the wireup helper exported the locked URL into our env. +LINE="gbrain $@" +[ -n "\${GBRAIN_DATABASE_URL:-}" ] && LINE="\$LINE [GBRAIN_DATABASE_URL=\$GBRAIN_DATABASE_URL]" +echo "\$LINE" >> "$LOG" + +# --version +if [ "$1" = "--version" ]; then + echo "gbrain ${version}" + exit 0 +fi + +# sources list --json → emits state +if [ "$1" = "sources" ] && [ "$2" = "list" ]; then + cat "$STATE" + exit 0 +fi + +# sources add --path

--federated → adds entry +if [ "$1" = "sources" ] && [ "$2" = "add" ]; then + shift 2 + ID="$1"; shift + PATH_VAL="" + FED="false" + while [ $# -gt 0 ]; do + case "$1" in + --path) PATH_VAL="$2"; shift 2 ;; + --federated) FED="true"; shift ;; + *) shift ;; + esac + done + python3 -c " +import json, sys +state = json.load(open('$STATE')) +state['sources'].append({'id': '$ID', 'local_path': '$PATH_VAL', 'federated': '$FED' == 'true'}) +json.dump(state, open('$STATE','w'), indent=2) +" || exit 1 + exit 0 +fi + +# sources remove --yes → drops entry +if [ "$1" = "sources" ] && [ "$2" = "remove" ]; then + shift 2 + ID="$1" + python3 -c " +import json +state = json.load(open('$STATE')) +state['sources'] = [s for s in state['sources'] if s['id'] != '$ID'] +json.dump(state, open('$STATE','w'), indent=2) +" + exit 0 +fi + +# sync --repo

→ records, optionally fails +if [ "$1" = "sync" ]; then + ${syncFails ? 'echo "sync failed: connection error" >&2; exit 1' : 'echo "1 page imported"; exit 0'} +fi + +echo "fake gbrain: unhandled subcommand: $@" >&2 +exit 99 +`; + const gbrainPath = path.join(fakeBinDir, 'gbrain'); + fs.writeFileSync(gbrainPath, script, { mode: 0o755 }); +} + +function run( + argv: string[], + opts: { env?: Record } = {} +) { + const env = { + PATH: `${fakeBinDir}:${process.env.PATH || '/usr/bin:/bin:/opt/homebrew/bin'}`, + HOME: tmpHome, + GSTACK_HOME: gstackHome, + GSTACK_BRAIN_WORKTREE: worktreeDir, + GSTACK_BRAIN_NO_SYNC: '0', + ...(opts.env || {}), + }; + return spawnSync(WIREUP_BIN, argv, { + env, + encoding: 'utf-8', + cwd: ROOT, + }); +} + +function readState(): { sources: Array<{ id: string; local_path: string; federated: boolean }> } { + if (!fs.existsSync(gbrainStateFile)) return { sources: [] }; + return JSON.parse(fs.readFileSync(gbrainStateFile, 'utf-8')); +} + +function gbrainCalls(): string[] { + if (!fs.existsSync(gbrainCallLog)) return []; + return fs.readFileSync(gbrainCallLog, 'utf-8') + .split('\n') + .filter((l) => l.trim()); +} + +function setupGstackRepo(remoteUrl: string) { + // Real git repo at gstackHome with at least one commit + an origin remote. + fs.mkdirSync(gstackHome, { recursive: true }); + spawnSync('git', ['-C', gstackHome, 'init', '-q', '-b', 'main'], { stdio: 'pipe' }); + spawnSync('git', ['-C', gstackHome, 'config', 'user.email', 'test@example.com'], { stdio: 'pipe' }); + spawnSync('git', ['-C', gstackHome, 'config', 'user.name', 'test'], { stdio: 'pipe' }); + fs.writeFileSync(path.join(gstackHome, '.brain-allowlist'), '# allowlist\n'); + spawnSync('git', ['-C', gstackHome, 'add', '.'], { stdio: 'pipe' }); + spawnSync('git', ['-C', gstackHome, 'commit', '-q', '-m', 'init'], { stdio: 'pipe' }); + spawnSync('git', ['-C', gstackHome, 'remote', 'add', 'origin', remoteUrl], { stdio: 'pipe' }); +} + +beforeEach(() => { + tmpHome = fs.mkdtempSync(path.join(os.tmpdir(), 'gstack-wireup-test-')); + gstackHome = path.join(tmpHome, '.gstack'); + worktreeDir = path.join(tmpHome, '.gstack-brain-worktree'); + fakeBinDir = path.join(tmpHome, 'fake-bin'); + fs.mkdirSync(fakeBinDir, { recursive: true }); + gbrainCallLog = path.join(tmpHome, 'gbrain-calls.log'); + gbrainStateFile = path.join(tmpHome, 'gbrain-state.json'); +}); + +afterEach(() => { + try { + fs.rmSync(tmpHome, { recursive: true, force: true }); + } catch {} +}); + +describe('gstack-gbrain-source-wireup — wireup mode', () => { + test('fresh state: registers source + creates worktree + syncs', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + const r = run([], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + expect(r.status).toBe(0); + expect(fs.existsSync(worktreeDir)).toBe(true); + const state = readState(); + expect(state.sources).toHaveLength(1); + expect(state.sources[0].id).toBe('gstack-brain-user'); + expect(state.sources[0].local_path).toBe(worktreeDir); + expect(state.sources[0].federated).toBe(true); + }); + + test('idempotent re-run after success: no new sources add call', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + run([], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + const callsAfterFirst = gbrainCalls().filter((c) => c.startsWith('gbrain sources add')).length; + expect(callsAfterFirst).toBe(1); + run([], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + const callsAfterSecond = gbrainCalls().filter((c) => c.startsWith('gbrain sources add')).length; + expect(callsAfterSecond).toBe(1); // no new add + }); + + test('drift recovery: existing source with different path triggers remove + add', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + // Pre-seed the fake gbrain state with a source at the wrong path + fs.writeFileSync( + gbrainStateFile, + JSON.stringify({ + sources: [{ id: 'gstack-brain-user', local_path: '/old/stale/path', federated: true }], + }) + ); + const r = run([], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + expect(r.status).toBe(0); + const calls = gbrainCalls(); + expect(calls.some((c) => c.startsWith('gbrain sources remove gstack-brain-user'))).toBe(true); + expect(calls.some((c) => c.includes(`gbrain sources add gstack-brain-user --path ${worktreeDir}`))).toBe(true); + const state = readState(); + expect(state.sources[0].local_path).toBe(worktreeDir); + }); + + test('--strict + gbrain too old: exits 2', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({ version: '0.17.0' }); + const r = run(['--strict']); + expect(r.status).toBe(2); + expect(r.stderr).toContain('< 0.18.0'); + }); + + test('non-strict + gbrain too old: warn + exit 0', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({ version: '0.17.0' }); + const r = run([]); + expect(r.status).toBe(0); + expect(r.stderr).toContain('benign skip'); + }); + + test('--strict + gbrain missing on PATH: exits 2', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + // Don't make a fake gbrain — fakeBinDir is empty. Keep system dirs on PATH + // so basic commands (git, awk, sed, etc.) work; only `gbrain` is absent. + const r = run(['--strict'], { + env: { PATH: `${fakeBinDir}:/usr/bin:/bin:/opt/homebrew/bin` }, + }); + expect(r.status).toBe(2); + }); + + test('source-id derived from origin URL', () => { + setupGstackRepo('git@github.com:user/gstack-brain-alice.git'); + makeFakeGbrain({}); + const r = run([], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + expect(r.status).toBe(0); + expect(readState().sources[0].id).toBe('gstack-brain-alice'); + }); + + test('source-id fallback to ~/.gstack-brain-remote.txt when .git is gone', () => { + // No git repo at gstackHome; just the remote-file + fs.mkdirSync(tmpHome, { recursive: true }); + fs.writeFileSync( + path.join(tmpHome, '.gstack-brain-remote.txt'), + 'git@github.com:user/gstack-brain-bob.git\n' + ); + makeFakeGbrain({}); + // No --strict: helper should benign-skip because .gstack/.git is missing + const r = run([]); + // ensure_worktree returns 2 → benign skip, exit 0 + expect(r.status).toBe(0); + }); + + test('source-id from --source-id flag overrides everything', () => { + setupGstackRepo('git@github.com:user/gstack-brain-different.git'); + makeFakeGbrain({}); + run(['--source-id', 'custom-id'], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + const state = readState(); + expect(state.sources[0].id).toBe('custom-id'); + }); + + test('--probe: read-only, prints state without mutating', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + const r = run(['--probe']); + expect(r.status).toBe(0); + expect(r.stdout).toContain('source_id=gstack-brain-user'); + expect(r.stdout).toContain('worktree='); + expect(r.stdout).toContain('gbrain=ok'); + expect(r.stdout).toContain('source_status=absent'); + // Probe should NOT call sources add / sync + const calls = gbrainCalls(); + expect(calls.some((c) => c.startsWith('gbrain sources add'))).toBe(false); + expect(calls.some((c) => c.startsWith('gbrain sync'))).toBe(false); + }); + + test('gbrain sync failure: exits 1 with stderr', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({ syncFails: true }); + const r = run([]); + expect(r.status).toBe(1); + expect(r.stderr).toContain('sync failed'); + }); +}); + +describe('gstack-gbrain-source-wireup — --database-url lock (defends against external config rewrites)', () => { + test('--database-url flag is exported as GBRAIN_DATABASE_URL to child gbrain calls', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + const TARGET = 'postgresql://postgres.abc:pw@aws.pooler.supabase.com:5432/postgres'; + const r = run(['--database-url', TARGET], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + expect(r.status).toBe(0); + const calls = gbrainCalls(); + // every gbrain invocation should carry the locked URL + const writingCalls = calls.filter((c) => c.includes('sources') || c.includes('sync')); + expect(writingCalls.length).toBeGreaterThan(0); + for (const c of writingCalls) { + expect(c).toContain(`[GBRAIN_DATABASE_URL=${TARGET}]`); + } + }); + + test('falls back to ~/.gbrain/config.json database_url when no flag and no env', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + const FILE_URL = 'postgresql://postgres.xyz:pw@aws.pooler.supabase.com:5432/postgres'; + fs.mkdirSync(path.join(tmpHome, '.gbrain'), { recursive: true }); + fs.writeFileSync( + path.join(tmpHome, '.gbrain', 'config.json'), + JSON.stringify({ engine: 'postgres', database_url: FILE_URL }) + ); + // Important: don't pass GBRAIN_DATABASE_URL or DATABASE_URL in env; helper + // should read from $HOME/.gbrain/config.json (HOME is tmpHome here). + const r = run([], { + env: { + GSTACK_BRAIN_NO_SYNC: '1', + GBRAIN_DATABASE_URL: '', + DATABASE_URL: '', + }, + }); + expect(r.status).toBe(0); + const calls = gbrainCalls(); + const writingCalls = calls.filter((c) => c.includes('sources add')); + expect(writingCalls.length).toBe(1); + expect(writingCalls[0]).toContain(`[GBRAIN_DATABASE_URL=${FILE_URL}]`); + }); + + test('--database-url overrides env GBRAIN_DATABASE_URL and config.json', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + const FLAG_URL = 'postgresql://postgres.flag:pw@a.b:5432/postgres'; + const ENV_URL = 'postgresql://postgres.env:pw@x.y:5432/postgres'; + const FILE_URL = 'postgresql://postgres.file:pw@p.q:5432/postgres'; + fs.mkdirSync(path.join(tmpHome, '.gbrain'), { recursive: true }); + fs.writeFileSync( + path.join(tmpHome, '.gbrain', 'config.json'), + JSON.stringify({ engine: 'postgres', database_url: FILE_URL }) + ); + const r = run(['--database-url', FLAG_URL], { + env: { + GSTACK_BRAIN_NO_SYNC: '1', + GBRAIN_DATABASE_URL: ENV_URL, + }, + }); + expect(r.status).toBe(0); + const calls = gbrainCalls(); + const writingCalls = calls.filter((c) => c.includes('sources add')); + expect(writingCalls.length).toBe(1); + expect(writingCalls[0]).toContain(`[GBRAIN_DATABASE_URL=${FLAG_URL}]`); + expect(writingCalls[0]).not.toContain(ENV_URL); + expect(writingCalls[0]).not.toContain(FILE_URL); + }); +}); + +describe('gstack-gbrain-source-wireup — uninstall mode', () => { + test('after wireup: removes source + worktree', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + run([], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + expect(readState().sources).toHaveLength(1); + expect(fs.existsSync(worktreeDir)).toBe(true); + const r = run(['--uninstall']); + expect(r.status).toBe(0); + expect(readState().sources).toHaveLength(0); + expect(fs.existsSync(worktreeDir)).toBe(false); + }); + + test('with no prior state: exits 3 (cannot derive id)', () => { + // No git repo, no remote file. --uninstall must fail with code 3. + fs.mkdirSync(tmpHome, { recursive: true }); + makeFakeGbrain({}); + const r = run(['--uninstall']); + expect(r.status).toBe(3); + }); + + test('--uninstall when gbrain is missing: exits 0 (best-effort), still removes worktree', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + // First wireup with fake gbrain to create the worktree + register source + makeFakeGbrain({}); + run([], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + expect(fs.existsSync(worktreeDir)).toBe(true); + // Now remove the fake gbrain so uninstall sees gbrain missing + fs.rmSync(path.join(fakeBinDir, 'gbrain'), { force: true }); + const r = run(['--uninstall'], { + env: { PATH: `${fakeBinDir}:/usr/bin:/bin:/opt/homebrew/bin` }, + }); + expect(r.status).toBe(0); // best-effort, never fails on gbrain absence + expect(fs.existsSync(worktreeDir)).toBe(false); // worktree still cleaned up + }); +}); + +describe('gstack-gbrain-source-wireup — defensive paths', () => { + test('--no-pull skips HEAD advance on existing worktree', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + // First run to create worktree + run([], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + // Make a new commit on parent so worktree HEAD is "behind" + fs.writeFileSync(path.join(gstackHome, 'newfile.md'), 'new'); + spawnSync('git', ['-C', gstackHome, 'add', '.'], { stdio: 'pipe' }); + spawnSync('git', ['-C', gstackHome, 'commit', '-q', '-m', 'second commit'], { stdio: 'pipe' }); + const parentHeadAfter = spawnSync('git', ['-C', gstackHome, 'rev-parse', 'HEAD'], { + encoding: 'utf-8', + }).stdout.trim(); + const worktreeHeadBefore = spawnSync('git', ['-C', worktreeDir, 'rev-parse', 'HEAD'], { + encoding: 'utf-8', + }).stdout.trim(); + expect(parentHeadAfter).not.toBe(worktreeHeadBefore); // sanity: parent advanced + // --no-pull should leave worktree HEAD where it was + const r = run(['--no-pull'], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + expect(r.status).toBe(0); + const worktreeHeadAfter = spawnSync('git', ['-C', worktreeDir, 'rev-parse', 'HEAD'], { + encoding: 'utf-8', + }).stdout.trim(); + expect(worktreeHeadAfter).toBe(worktreeHeadBefore); + expect(worktreeHeadAfter).not.toBe(parentHeadAfter); + }); + + test('stray non-git directory at worktree path is cleaned up + worktree created', () => { + setupGstackRepo('git@github.com:user/gstack-brain-user.git'); + makeFakeGbrain({}); + // Plant a stray non-git directory at the worktree path + fs.mkdirSync(worktreeDir, { recursive: true }); + fs.writeFileSync(path.join(worktreeDir, 'unrelated.txt'), 'not a worktree'); + expect(fs.existsSync(path.join(worktreeDir, 'unrelated.txt'))).toBe(true); + expect(fs.existsSync(path.join(worktreeDir, '.git'))).toBe(false); + // Helper should remove the stray dir + create a real worktree + const r = run([], { env: { GSTACK_BRAIN_NO_SYNC: '1' } }); + expect(r.status).toBe(0); + expect(fs.existsSync(path.join(worktreeDir, '.git'))).toBe(true); // real worktree + expect(fs.existsSync(path.join(worktreeDir, 'unrelated.txt'))).toBe(false); // stray gone + }); +}); diff --git a/test/gstack-upgrade-migration-v1_17_0_0.test.ts b/test/gstack-upgrade-migration-v1_17_0_0.test.ts new file mode 100644 index 00000000..e1d20a95 --- /dev/null +++ b/test/gstack-upgrade-migration-v1_17_0_0.test.ts @@ -0,0 +1,151 @@ +/** + * gstack-upgrade/migrations/v1.17.0.0.sh — migration script unit tests. + * + * The migration runs on /gstack-upgrade for users with brain-sync configured but + * never wired up to gbrain. It has 4 skip conditions and one happy path. + * + * Strategy: stub gstack-config and gstack-gbrain-source-wireup binaries on PATH + * so each skip condition can be triggered independently. The migration script + * itself is plain bash — we exercise it directly. + */ + +import { describe, test, expect, beforeEach, afterEach } from 'bun:test'; +import * as fs from 'fs'; +import * as os from 'os'; +import * as path from 'path'; +import { spawnSync } from 'child_process'; + +const ROOT = path.resolve(import.meta.dir, '..'); +const MIGRATION = path.join(ROOT, 'gstack-upgrade', 'migrations', 'v1.17.0.0.sh'); + +let tmpHome: string; +let fakeBinDir: string; +let stubLog: string; + +function makeFakeStubs(opts: { + configValue?: string; // value gstack-config returns for gbrain_sync_mode + configMissing?: boolean; // gstack-config binary itself missing (test edge) + wireupMissing?: boolean; // wireup binary missing + wireupExitCode?: number; +}) { + const skillsBin = path.join(tmpHome, '.claude', 'skills', 'gstack', 'bin'); + fs.mkdirSync(skillsBin, { recursive: true }); + + if (!opts.configMissing) { + const cfg = `#!/bin/bash +echo "gstack-config $@" >> "${stubLog}" +[ "$1" = "get" ] && [ "$2" = "gbrain_sync_mode" ] && echo "${opts.configValue ?? ''}" +exit 0 +`; + fs.writeFileSync(path.join(skillsBin, 'gstack-config'), cfg, { mode: 0o755 }); + } + + if (!opts.wireupMissing) { + const wu = `#!/bin/bash +echo "gstack-gbrain-source-wireup $@" >> "${stubLog}" +exit ${opts.wireupExitCode ?? 0} +`; + fs.writeFileSync(path.join(skillsBin, 'gstack-gbrain-source-wireup'), wu, { mode: 0o755 }); + } +} + +function makeBrainGitRepo() { + const gstackHome = path.join(tmpHome, '.gstack'); + fs.mkdirSync(path.join(gstackHome, '.git'), { recursive: true }); +} + +function run(opts: { env?: Record } = {}) { + const env = { + PATH: '/usr/bin:/bin:/opt/homebrew/bin', + HOME: tmpHome, + ...(opts.env || {}), + }; + return spawnSync('bash', [MIGRATION], { + env, + encoding: 'utf-8', + cwd: tmpHome, + }); +} + +function stubCalls(): string[] { + if (!fs.existsSync(stubLog)) return []; + return fs.readFileSync(stubLog, 'utf-8').split('\n').filter((l) => l.trim()); +} + +beforeEach(() => { + tmpHome = fs.mkdtempSync(path.join(os.tmpdir(), 'gstack-migration-test-')); + fakeBinDir = path.join(tmpHome, 'fake-bin'); + fs.mkdirSync(fakeBinDir, { recursive: true }); + stubLog = path.join(tmpHome, 'stub-calls.log'); +}); + +afterEach(() => { + try { + fs.rmSync(tmpHome, { recursive: true, force: true }); + } catch {} +}); + +describe('migrations/v1.17.0.0.sh', () => { + test('HOME unset: prints message + exit 0 (defensive)', () => { + // Override HOME to empty string. Bash's [ -z "${HOME:-}" ] guard should fire. + const r = run({ env: { HOME: '' } }); + expect(r.status).toBe(0); + expect(r.stderr).toContain('HOME is unset or empty'); + }); + + test('gbrain_sync_mode = off: exit 0 silently (no helper invoked)', () => { + makeFakeStubs({ configValue: 'off' }); + const r = run(); + expect(r.status).toBe(0); + // Helper should not have been invoked + const calls = stubCalls(); + expect(calls.some((c) => c.startsWith('gstack-gbrain-source-wireup'))).toBe(false); + }); + + test('gbrain_sync_mode unset/empty: exit 0 silently', () => { + makeFakeStubs({ configValue: '' }); // empty string return + const r = run(); + expect(r.status).toBe(0); + const calls = stubCalls(); + expect(calls.some((c) => c.startsWith('gstack-gbrain-source-wireup'))).toBe(false); + }); + + test('no ~/.gstack/.git: exit 0 silently (no brain-sync configured)', () => { + makeFakeStubs({ configValue: 'full' }); + // Do NOT call makeBrainGitRepo() — no .gstack/.git directory exists + const r = run(); + expect(r.status).toBe(0); + const calls = stubCalls(); + expect(calls.some((c) => c.startsWith('gstack-gbrain-source-wireup'))).toBe(false); + }); + + test('helper missing on PATH: prints warning, exit 0 (defensive)', () => { + makeFakeStubs({ configValue: 'full', wireupMissing: true }); + makeBrainGitRepo(); + const r = run(); + expect(r.status).toBe(0); + expect(r.stderr).toContain('missing or non-executable'); + }); + + test('happy path: invokes the helper', () => { + makeFakeStubs({ configValue: 'full' }); + makeBrainGitRepo(); + const r = run(); + expect(r.status).toBe(0); + const calls = stubCalls(); + expect(calls.some((c) => c.startsWith('gstack-gbrain-source-wireup'))).toBe(true); + // Note: migration invokes WITHOUT --strict (benign-skip semantics for batch upgrade) + const helperCall = calls.find((c) => c.startsWith('gstack-gbrain-source-wireup')); + expect(helperCall).not.toContain('--strict'); + }); + + test('helper exits non-zero: migration prints retry hint, exit 0 (non-blocking)', () => { + // The migration uses `|| { echo retry-hint; }` so non-zero helper still + // exits 0 and prints a retry hint to stderr. + makeFakeStubs({ configValue: 'full', wireupExitCode: 2 }); + makeBrainGitRepo(); + const r = run(); + expect(r.status).toBe(0); // migration is non-blocking + expect(r.stderr).toContain('Wireup exited non-zero'); + }); +});