v1.26.3.0 feat: /sync-gbrain skill + native code-surface orchestrator (#1314)

* feat: native gbrain code-surface orchestrator + ensureSourceRegistered helper

Replaces the markdown-only `gbrain import` path with `gbrain sources add` +
`gbrain sync --strategy code` (or `reindex-code` on `--full`). Adds
lib/gbrain-sources.ts exporting ensureSourceRegistered/probeSource/
sourcePageCount, and gives the orchestrator a lock file, tmp-rename atomic
state writes, and a dry-run write skip.
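The tmp-rename write the orchestrator relies on can be sketched as follows. This is a minimal illustration with a hypothetical helper name, not the actual lib code:

```typescript
import { mkdtempSync, readFileSync, renameSync, writeFileSync } from "fs";
import { tmpdir } from "os";
import { join } from "path";

// Hypothetical helper: write JSON to `path` by writing a sibling tmp file and
// renaming it into place. rename(2) is atomic on POSIX within one filesystem,
// so concurrent readers never observe a half-written state file.
function atomicWriteJson(path: string, data: unknown): void {
  const tmp = `${path}.tmp-${process.pid}`;
  writeFileSync(tmp, JSON.stringify(data, null, 2));
  renameSync(tmp, path); // atomic replace; no window with partial content
}

const dir = mkdtempSync(join(tmpdir(), "sync-demo-"));
const statePath = join(dir, ".gbrain-sync-state.json");
atomicWriteJson(statePath, { schema_version: 1, sources: [] });
console.log(JSON.parse(readFileSync(statePath, "utf-8")).schema_version);
```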

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* feat: setup-gbrain Step 8 writes ## GBrain Search Guidance after smoke test

Extends Step 8 to write a machine-agnostic guidance block that teaches the
agent when to prefer the gbrain CLI (search/query/code-def/code-refs/
code-callers/code-callees) over Grep. Gated on the smoke test passing.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* feat: /sync-gbrain skill — keep gbrain current and refresh agent guidance

New top-level skill that wraps gstack-gbrain-sync with state probing, a
capability check (a write+search round-trip, not `gbrain doctor`), a
CLAUDE.md guidance lifecycle (write iff healthy, remove iff broken), and a
per-source verdict block. Re-runnable and idempotent.
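The shape of a write+search round-trip check can be sketched like this, assuming an injected runner so the logic runs without a live gbrain. The subcommand names below are placeholders, not the real CLI surface:

```typescript
// Hypothetical capability probe: the skill treats gbrain as healthy only if a
// full write+search round-trip succeeds, not merely a doctor-style self-check.
type Runner = (args: string[]) => { exitCode: number; stdout: string };

function gbrainCapabilityOk(run: Runner): boolean {
  const token = `gstack-capcheck-${Date.now()}`;
  const write = run(["ingest", token]); // placeholder write path
  if (write.exitCode !== 0) return false;
  const search = run(["search", token]);
  return search.exitCode === 0 && search.stdout.includes(token);
}

// Stub runner standing in for the CLI: remembers writes, echoes matches.
const store: string[] = [];
const stubRun: Runner = (args) =>
  args[0] === "ingest"
    ? (store.push(args[1]), { exitCode: 0, stdout: "" })
    : { exitCode: 0, stdout: store.filter((s) => s.includes(args[1])).join("\n") };

console.log(gbrainCapabilityOk(stubRun)); // true
```

The point of the round-trip over a doctor command is that it exercises the same write and search paths the agent will actually use.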

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* feat: preamble emits gbrain-availability block when capability ok

Extends generate-brain-sync-block.ts to emit Variant A (steady-state, 4
lines) when the cwd page_count is > 0, Variant B (empty-corpus emergency, 3
lines) when it is 0, and the empty string otherwise. Reads the cached
page_count from .gbrain-sync-state.json (handling both pretty-printed and
compact JSON). Refreshes the shipped golden fixtures and bumps the
plan-review preamble byte budget to 35K to absorb the new block.
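The variant selection reduces to a small pure function. Line text is abbreviated here and the function name is hypothetical; the real block in generate-brain-sync-block.ts carries the full guidance wording:

```typescript
// Variant A (4 lines) when the cwd code source has pages, Variant B (3 lines)
// when the corpus is empty, empty string when gbrain isn't configured at all.
function gbrainAvailabilityBlock(configured: boolean, pageCount: number): string {
  if (!configured) return ""; // zero context cost for non-gbrain users
  if (pageCount > 0) {
    // Variant A: steady-state
    return [
      "GBrain configured. Prefer `gbrain search`/`gbrain query` over Grep...",
      "semantic questions; ...",
      "symbol-aware code lookup; ...",
      "Run /sync-gbrain to refresh.",
    ].join("\n");
  }
  // Variant B: empty-corpus emergency
  return [
    "GBrain configured but this repo isn't indexed yet.",
    "Run `/sync-gbrain --full` before relying on `gbrain search`.",
    "Falls back to Grep until indexed.",
  ].join("\n");
}

console.log(gbrainAvailabilityBlock(true, 1247).split("\n").length); // 4
console.log(gbrainAvailabilityBlock(true, 0).split("\n").length);    // 3
console.log(gbrainAvailabilityBlock(false, 0).length);               // 0
```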

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* docs: register /sync-gbrain in AGENTS.md and docs/skills.md

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* chore: regenerate SKILL.md across all hosts (gen:skill-docs)

Mechanical regeneration after the preamble, setup-gbrain template, and new
sync-gbrain skill changes. Run via: `bun run gen:skill-docs --host all`.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* chore: bump version and changelog (v1.26.3.0)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* docs: add /sync-gbrain to README skills table and gbrain section

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Authored by Garry Tan on 2026-05-04 09:29:48 -07:00, committed by GitHub.
parent 30fe6bb11c, commit db9447c333
59 changed files with 3412 additions and 79 deletions
+29
@@ -343,6 +343,35 @@ _BRAIN_REMOTE_FILE="$HOME/.gstack-brain-remote.txt"
_BRAIN_SYNC_BIN="~/.claude/skills/gstack/bin/gstack-brain-sync"
_BRAIN_CONFIG_BIN="~/.claude/skills/gstack/bin/gstack-config"
# /sync-gbrain context-load: teach the agent to use gbrain when it's available.
# Mutually exclusive variants per /plan-eng-review §4. Empty string when gbrain
# is not configured (zero context cost for non-gbrain users).
_GBRAIN_CONFIG="$HOME/.gbrain/config.json"
if [ -f "$_GBRAIN_CONFIG" ] && command -v gbrain >/dev/null 2>&1; then
_GBRAIN_VERSION_OK=$(gbrain --version 2>/dev/null | grep -c '^gbrain ' || echo 0)
if [ "$_GBRAIN_VERSION_OK" -gt 0 ] 2>/dev/null; then
_SYNC_STATE="$_GSTACK_HOME/.gbrain-sync-state.json"
_CWD_PAGES=0
if [ -f "$_SYNC_STATE" ]; then
# Flatten newlines so the regex works against pretty-printed JSON too.
_CWD_PAGES=$(tr -d '\n' < "$_SYNC_STATE" 2>/dev/null \
| grep -o '"name": *"code"[^}]*"detail": *{[^}]*"page_count": *[0-9]*' \
| grep -o '"page_count": *[0-9]*' | grep -o '[0-9]\+' | head -1)
_CWD_PAGES=${_CWD_PAGES:-0}
fi
if [ "$_CWD_PAGES" -gt 0 ] 2>/dev/null; then
echo "GBrain configured. Prefer \`gbrain search\`/\`gbrain query\` over Grep for"
echo "semantic questions; use \`gbrain code-def\`/\`code-refs\`/\`code-callers\` for"
echo "symbol-aware code lookup. See \"## GBrain Search Guidance\" in CLAUDE.md."
echo "Run /sync-gbrain to refresh."
else
echo "GBrain configured but this repo isn't indexed yet. Run \`/sync-gbrain --full\`"
echo "before relying on \`gbrain search\` for code questions in this repo."
echo "Falls back to Grep until indexed."
fi
fi
fi
_BRAIN_SYNC_MODE=$("$_BRAIN_CONFIG_BIN" get gbrain_sync_mode 2>/dev/null || echo off)
if [ -f "$_BRAIN_REMOTE_FILE" ] && [ ! -d "$_GSTACK_HOME/.git" ] && [ "$_BRAIN_SYNC_MODE" = "off" ]; then
+29
@@ -332,6 +332,35 @@ _BRAIN_REMOTE_FILE="$HOME/.gstack-brain-remote.txt"
_BRAIN_SYNC_BIN="$GSTACK_BIN/gstack-brain-sync"
_BRAIN_CONFIG_BIN="$GSTACK_BIN/gstack-config"
# /sync-gbrain context-load: teach the agent to use gbrain when it's available.
# Mutually exclusive variants per /plan-eng-review §4. Empty string when gbrain
# is not configured (zero context cost for non-gbrain users).
_GBRAIN_CONFIG="$HOME/.gbrain/config.json"
if [ -f "$_GBRAIN_CONFIG" ] && command -v gbrain >/dev/null 2>&1; then
_GBRAIN_VERSION_OK=$(gbrain --version 2>/dev/null | grep -c '^gbrain ' || echo 0)
if [ "$_GBRAIN_VERSION_OK" -gt 0 ] 2>/dev/null; then
_SYNC_STATE="$_GSTACK_HOME/.gbrain-sync-state.json"
_CWD_PAGES=0
if [ -f "$_SYNC_STATE" ]; then
# Flatten newlines so the regex works against pretty-printed JSON too.
_CWD_PAGES=$(tr -d '\n' < "$_SYNC_STATE" 2>/dev/null \
| grep -o '"name": *"code"[^}]*"detail": *{[^}]*"page_count": *[0-9]*' \
| grep -o '"page_count": *[0-9]*' | grep -o '[0-9]\+' | head -1)
_CWD_PAGES=${_CWD_PAGES:-0}
fi
if [ "$_CWD_PAGES" -gt 0 ] 2>/dev/null; then
echo "GBrain configured. Prefer \`gbrain search\`/\`gbrain query\` over Grep for"
echo "semantic questions; use \`gbrain code-def\`/\`code-refs\`/\`code-callers\` for"
echo "symbol-aware code lookup. See \"## GBrain Search Guidance\" in CLAUDE.md."
echo "Run /sync-gbrain to refresh."
else
echo "GBrain configured but this repo isn't indexed yet. Run \`/sync-gbrain --full\`"
echo "before relying on \`gbrain search\` for code questions in this repo."
echo "Falls back to Grep until indexed."
fi
fi
fi
_BRAIN_SYNC_MODE=$("$_BRAIN_CONFIG_BIN" get gbrain_sync_mode 2>/dev/null || echo off)
if [ -f "$_BRAIN_REMOTE_FILE" ] && [ ! -d "$_GSTACK_HOME/.git" ] && [ "$_BRAIN_SYNC_MODE" = "off" ]; then
+29
@@ -334,6 +334,35 @@ _BRAIN_REMOTE_FILE="$HOME/.gstack-brain-remote.txt"
_BRAIN_SYNC_BIN="$GSTACK_BIN/gstack-brain-sync"
_BRAIN_CONFIG_BIN="$GSTACK_BIN/gstack-config"
# /sync-gbrain context-load: teach the agent to use gbrain when it's available.
# Mutually exclusive variants per /plan-eng-review §4. Empty string when gbrain
# is not configured (zero context cost for non-gbrain users).
_GBRAIN_CONFIG="$HOME/.gbrain/config.json"
if [ -f "$_GBRAIN_CONFIG" ] && command -v gbrain >/dev/null 2>&1; then
_GBRAIN_VERSION_OK=$(gbrain --version 2>/dev/null | grep -c '^gbrain ' || echo 0)
if [ "$_GBRAIN_VERSION_OK" -gt 0 ] 2>/dev/null; then
_SYNC_STATE="$_GSTACK_HOME/.gbrain-sync-state.json"
_CWD_PAGES=0
if [ -f "$_SYNC_STATE" ]; then
# Flatten newlines so the regex works against pretty-printed JSON too.
_CWD_PAGES=$(tr -d '\n' < "$_SYNC_STATE" 2>/dev/null \
| grep -o '"name": *"code"[^}]*"detail": *{[^}]*"page_count": *[0-9]*' \
| grep -o '"page_count": *[0-9]*' | grep -o '[0-9]\+' | head -1)
_CWD_PAGES=${_CWD_PAGES:-0}
fi
if [ "$_CWD_PAGES" -gt 0 ] 2>/dev/null; then
echo "GBrain configured. Prefer \`gbrain search\`/\`gbrain query\` over Grep for"
echo "semantic questions; use \`gbrain code-def\`/\`code-refs\`/\`code-callers\` for"
echo "symbol-aware code lookup. See \"## GBrain Search Guidance\" in CLAUDE.md."
echo "Run /sync-gbrain to refresh."
else
echo "GBrain configured but this repo isn't indexed yet. Run \`/sync-gbrain --full\`"
echo "before relying on \`gbrain search\` for code questions in this repo."
echo "Falls back to Grep until indexed."
fi
fi
fi
_BRAIN_SYNC_MODE=$("$_BRAIN_CONFIG_BIN" get gbrain_sync_mode 2>/dev/null || echo off)
if [ -f "$_BRAIN_REMOTE_FILE" ] && [ ! -d "$_GSTACK_HOME/.git" ] && [ "$_BRAIN_SYNC_MODE" = "off" ]; then
+219
@@ -0,0 +1,219 @@
/**
* Unit tests for lib/gbrain-sources.ts (per /plan-eng-review D3 DRY extraction).
*
* The helper shells out to the real `gbrain` CLI. To test idempotency
* deterministically without a live brain, we put a fake `gbrain` binary on
* PATH that emits canned `sources list --json` output and records its
* invocations. The same trick `test/gstack-gbrain-source-wireup.test.ts` uses.
*/
import { describe, it, expect } from "bun:test";
import { mkdtempSync, writeFileSync, readFileSync, existsSync, mkdirSync, rmSync, chmodSync } from "fs";
import { tmpdir } from "os";
import { join } from "path";
import { ensureSourceRegistered, probeSource, sourcePageCount } from "../lib/gbrain-sources";
interface FakeGbrainSetup {
bindir: string;
statePath: string;
logPath: string;
/**
* Env to pass to helper calls. Bun's execFileSync does NOT respect runtime
* mutations of process.env.PATH; we have to pass env explicitly. Production
* callers leave this unset (inherit process.env) — the helper signature has
* an optional `env` param specifically for tests.
*/
env: NodeJS.ProcessEnv;
cleanup: () => void;
}
/**
* Build a temp dir with a fake `gbrain` shell script on PATH. The fake honors:
* gbrain sources list --json → cat $STATE_PATH
* gbrain sources add <id> --path <p> [--federated] → append to state, log
* gbrain sources remove <id> --yes → drop from state, log
* gbrain --version → echo "gbrain 0.25.1"
* Anything else exits 1.
*/
function makeFakeGbrain(initialState: { sources: Array<{ id: string; local_path: string; federated?: boolean; page_count?: number }> }): FakeGbrainSetup {
const tmp = mkdtempSync(join(tmpdir(), "gbrain-sources-test-"));
const bindir = join(tmp, "bin");
mkdirSync(bindir, { recursive: true });
const statePath = join(tmp, "state.json");
const logPath = join(tmp, "calls.log");
writeFileSync(statePath, JSON.stringify(initialState));
writeFileSync(logPath, "");
const fake = `#!/bin/sh
echo "$@" >> "${logPath}"
case "$1 $2" in
"--version ")
echo "gbrain 0.25.1"
exit 0
;;
"sources list")
cat "${statePath}"
exit 0
;;
"sources add")
ID="$3"
shift 3
PATH_VAL=""
FED="false"
while [ $# -gt 0 ]; do
case "$1" in
--path) PATH_VAL="$2"; shift 2 ;;
--federated) FED="true"; shift ;;
*) shift ;;
esac
done
NEW=$(jq --arg id "$ID" --arg path "$PATH_VAL" --argjson fed "$FED" \
'.sources += [{id: $id, local_path: $path, federated: $fed, page_count: 0}]' "${statePath}")
echo "$NEW" > "${statePath}"
exit 0
;;
"sources remove")
ID="$3"
NEW=$(jq --arg id "$ID" '.sources = (.sources | map(select(.id != $id)))' "${statePath}")
echo "$NEW" > "${statePath}"
exit 0
;;
esac
echo "fake gbrain: unknown command: $@" >&2
exit 1
`;
const fakePath = join(bindir, "gbrain");
writeFileSync(fakePath, fake);
chmodSync(fakePath, 0o755);
// Build the env override we'll pass to helper calls. We do NOT mutate
// process.env globally because Bun's execFileSync caches PATH at process
// start; explicit env is the only reliable way to redirect spawn-time PATH.
const env: NodeJS.ProcessEnv = { ...process.env, PATH: `${bindir}:${process.env.PATH || ""}` };
return {
bindir,
statePath,
logPath,
env,
cleanup: () => {
rmSync(tmp, { recursive: true, force: true });
},
};
}
describe("probeSource", () => {
it("returns absent when source id is not in the list", () => {
const fake = makeFakeGbrain({ sources: [{ id: "other-source", local_path: "/x" }] });
const state = probeSource("gstack-code-foo", fake.env);
expect(state.status).toBe("absent");
expect(state.registered_path).toBeUndefined();
fake.cleanup();
});
it("returns match when source id is registered (path included)", () => {
const fake = makeFakeGbrain({
sources: [{ id: "gstack-code-foo", local_path: "/Users/me/repo" }],
});
const state = probeSource("gstack-code-foo", fake.env);
expect(state.status).toBe("match");
expect(state.registered_path).toBe("/Users/me/repo");
fake.cleanup();
});
});
describe("ensureSourceRegistered", () => {
it("adds source when absent, returns changed=true", async () => {
const fake = makeFakeGbrain({ sources: [] });
const result = await ensureSourceRegistered("gstack-code-foo", "/Users/me/repo", {
federated: true,
env: fake.env,
});
expect(result.changed).toBe(true);
expect(result.state.status).toBe("match");
expect(result.state.registered_path).toBe("/Users/me/repo");
const log = readFileSync(fake.logPath, "utf-8");
expect(log).toContain("sources add gstack-code-foo --path /Users/me/repo --federated");
expect(log).not.toContain("sources remove");
fake.cleanup();
});
it("is a no-op when source is already at the correct path, returns changed=false", async () => {
const fake = makeFakeGbrain({
sources: [{ id: "gstack-code-foo", local_path: "/Users/me/repo" }],
});
const result = await ensureSourceRegistered("gstack-code-foo", "/Users/me/repo", { env: fake.env });
expect(result.changed).toBe(false);
expect(result.state.status).toBe("match");
const log = readFileSync(fake.logPath, "utf-8");
expect(log).toContain("sources list --json");
expect(log).not.toContain("sources add");
expect(log).not.toContain("sources remove");
fake.cleanup();
});
it("recreates source when path differs (gbrain has no `sources update`), returns changed=true", async () => {
const fake = makeFakeGbrain({
sources: [{ id: "gstack-code-foo", local_path: "/old/path" }],
});
const result = await ensureSourceRegistered("gstack-code-foo", "/new/path", {
federated: true,
env: fake.env,
});
expect(result.changed).toBe(true);
expect(result.state.status).toBe("match");
expect(result.state.registered_path).toBe("/new/path");
const log = readFileSync(fake.logPath, "utf-8");
expect(log).toContain("sources remove gstack-code-foo --yes");
expect(log).toContain("sources add gstack-code-foo --path /new/path --federated");
fake.cleanup();
});
it("when reregister_on_drift=false and source is at different path, returns changed=false", async () => {
const fake = makeFakeGbrain({
sources: [{ id: "gstack-code-foo", local_path: "/old/path" }],
});
const result = await ensureSourceRegistered("gstack-code-foo", "/new/path", {
reregister_on_drift: false,
env: fake.env,
});
expect(result.changed).toBe(false);
expect(result.state.status).toBe("drift");
expect(result.state.registered_path).toBe("/old/path");
const log = readFileSync(fake.logPath, "utf-8");
expect(log).not.toContain("sources remove");
expect(log).not.toContain("sources add");
fake.cleanup();
});
});
describe("sourcePageCount", () => {
it("returns the page_count when the source is registered", () => {
const fake = makeFakeGbrain({
sources: [
{ id: "gstack-code-foo", local_path: "/x", page_count: 1247 },
{ id: "other-source", local_path: "/y", page_count: 99 },
],
});
expect(sourcePageCount("gstack-code-foo", fake.env)).toBe(1247);
expect(sourcePageCount("other-source", fake.env)).toBe(99);
fake.cleanup();
});
it("returns null when the source is absent", () => {
const fake = makeFakeGbrain({ sources: [{ id: "other", local_path: "/x", page_count: 5 }] });
expect(sourcePageCount("missing", fake.env)).toBeNull();
fake.cleanup();
});
it("returns null when page_count is missing from the source object", () => {
const fake = makeFakeGbrain({ sources: [{ id: "no-count", local_path: "/x" } as { id: string; local_path: string }] });
expect(sourcePageCount("no-count", fake.env)).toBeNull();
fake.cleanup();
});
});
+3 -1
@@ -316,10 +316,12 @@ describe('gen-skill-docs', () => {
// (Brain Sync, Context Recovery, Routing Injection are load-bearing
// functionality, not optional). Budget is set to current size + small
// headroom; ratchet down if a future slim trims real bytes.
// Ratcheted from 33000 → 35000 when the gbrain context-load block was
// added to generate-brain-sync-block.ts (per /sync-gbrain plan §4).
for (const skill of reviewSkills) {
const content = fs.readFileSync(skill.path, 'utf-8');
const preamble = extractPreambleBeforeWorkflow(content, skill.markers);
expect(Buffer.byteLength(preamble, 'utf-8')).toBeLessThan(34_000);
expect(Buffer.byteLength(preamble, 'utf-8')).toBeLessThan(35_000);
}
});
+81 -3
@@ -55,7 +55,11 @@ describe("gstack-gbrain-sync CLI", () => {
const r = runScript(["--dry-run", "--code-only", "--quiet"], { HOME: home, GSTACK_HOME: gstackHome });
expect(r.exitCode).toBe(0);
expect(r.stdout).toContain("would: gbrain import");
// Code stage now uses native code surface: sources add + sync --strategy code
// (NOT gbrain import — that's the markdown-only path that was rejected post-codex).
expect(r.stdout).toContain("would: gbrain sources add");
expect(r.stdout).toContain("gbrain sync --strategy code");
expect(r.stdout).not.toContain("gbrain import");
// memory + brain-sync stages should not appear
expect(r.stdout).not.toContain("gstack-memory-ingest --probe");
expect(r.stdout).not.toContain("gstack-brain-sync --discover-new");
@@ -69,7 +73,8 @@ describe("gstack-gbrain-sync CLI", () => {
const r = runScript(["--dry-run"], { HOME: home, GSTACK_HOME: gstackHome });
expect(r.exitCode).toBe(0);
expect(r.stdout).toContain("would: gbrain import");
expect(r.stdout).toContain("would: gbrain sources add");
expect(r.stdout).toContain("gbrain sync --strategy code");
expect(r.stdout).toContain("would: gstack-memory-ingest");
expect(r.stdout).toContain("would: gstack-brain-sync");
rmSync(home, { recursive: true, force: true });
@@ -82,11 +87,84 @@ describe("gstack-gbrain-sync CLI", () => {
const r = runScript(["--dry-run", "--no-code"], { HOME: home, GSTACK_HOME: gstackHome });
expect(r.exitCode).toBe(0);
expect(r.stdout).not.toContain("would: gbrain import");
expect(r.stdout).not.toContain("would: gbrain sources add");
expect(r.stdout).toContain("would: gstack-memory-ingest");
rmSync(home, { recursive: true, force: true });
});
it("dry-run derives a stable source id from the canonical git remote", () => {
// The source id pattern is `gstack-code-<canonicalized-remote>`. For this
// repo (github.com/garrytan/gstack), the slug should appear in the dry-run
// preview line. We don't pin the exact slug — just verify the prefix +
// that the preview command would target a source with id gstack-code-*.
const home = makeTestHome();
const gstackHome = join(home, ".gstack");
mkdirSync(gstackHome, { recursive: true });
const r = runScript(["--dry-run", "--code-only", "--quiet"], { HOME: home, GSTACK_HOME: gstackHome });
expect(r.exitCode).toBe(0);
expect(r.stdout).toMatch(/gbrain sources add gstack-code-[a-z0-9-]+/);
expect(r.stdout).toMatch(/gbrain sync --strategy code --source gstack-code-[a-z0-9-]+/);
rmSync(home, { recursive: true, force: true });
});
it("dry-run does NOT acquire the lock file (lock is for write paths only)", () => {
const home = makeTestHome();
const gstackHome = join(home, ".gstack");
mkdirSync(gstackHome, { recursive: true });
const r = runScript(["--dry-run"], { HOME: home, GSTACK_HOME: gstackHome });
expect(r.exitCode).toBe(0);
// Lock file should not exist after a dry-run (it's a write-only safety primitive).
const lockPath = join(gstackHome, ".sync-gbrain.lock");
expect(existsSync(lockPath)).toBe(false);
rmSync(home, { recursive: true, force: true });
});
it("a stale lock file (older than 5 min) is taken over, not blocking", () => {
const home = makeTestHome();
const gstackHome = join(home, ".gstack");
mkdirSync(gstackHome, { recursive: true });
// Plant a stale lock file (mtime 6 min ago).
const lockPath = join(gstackHome, ".sync-gbrain.lock");
writeFileSync(lockPath, JSON.stringify({ pid: 99999, started_at: new Date(Date.now() - 6 * 60 * 1000).toISOString() }));
const sixMinAgo = (Date.now() - 6 * 60 * 1000) / 1000;
// Set mtime explicitly via Bun's fs.utimes
const fs = require("fs");
fs.utimesSync(lockPath, sixMinAgo, sixMinAgo);
// Run with all stages disabled so we don't actually invoke anything heavy.
const r = runScript(["--incremental", "--no-code", "--no-memory", "--no-brain-sync", "--quiet"], {
HOME: home,
GSTACK_HOME: gstackHome,
});
expect(r.exitCode).toBe(0);
// Lock should be cleared after the run (we took it over and released).
expect(existsSync(lockPath)).toBe(false);
rmSync(home, { recursive: true, force: true });
});
it("a fresh lock file (less than 5 min old) blocks a second invocation with exit 2", () => {
const home = makeTestHome();
const gstackHome = join(home, ".gstack");
mkdirSync(gstackHome, { recursive: true });
// Plant a fresh lock file (mtime now).
const lockPath = join(gstackHome, ".sync-gbrain.lock");
writeFileSync(lockPath, JSON.stringify({ pid: 99999, started_at: new Date().toISOString() }));
const r = runScript(["--incremental", "--no-code", "--no-memory", "--no-brain-sync", "--quiet"], {
HOME: home,
GSTACK_HOME: gstackHome,
});
expect(r.exitCode).toBe(2);
expect(r.stderr).toContain("another /sync-gbrain is running");
// Lock should still be there — the second invocation didn't take it over.
expect(existsSync(lockPath)).toBe(true);
rmSync(home, { recursive: true, force: true });
});
it("writes a state file with schema_version: 1 after a non-dry run", () => {
const home = makeTestHome();
const gstackHome = join(home, ".gstack");
+4 -1
@@ -183,7 +183,10 @@ describe("V1 /gbrain-sync orchestrator E2E", () => {
const r = runBun(SYNC, ["--dry-run"], env);
expect(r.exitCode).toBe(0);
expect(r.stdout).toContain("would: gbrain import");
// Code stage uses native gbrain code surfaces (sources add + sync --strategy code)
// post-codex review; NOT `gbrain import` (markdown-only path).
expect(r.stdout).toContain("would: gbrain sources add");
expect(r.stdout).toContain("gbrain sync --strategy code");
expect(r.stdout).toContain("would: gstack-memory-ingest");
expect(r.stdout).toContain("would: gstack-brain-sync");