Files
gstack/bin/gstack-developer-profile
Garry Tan 9dbaf906cf feat(v1.9.0.0): gbrain-sync — cross-machine gstack memory (#1151)
* feat(gbrain-sync): queue primitives + writer shims

Adds bin/gstack-brain-enqueue (atomic append to sync queue) and
bin/gstack-jsonl-merge (git merge driver, ts-sort with SHA-256 fallback).
Wires one backgrounded enqueue call into learnings-log, timeline-log,
review-log, and developer-profile --migrate. question-log and
question-preferences stay local per Codex v2 decision.
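The enqueue shim's core move is a single appending write of one JSON line per artifact event. A minimal sketch of that pattern, with illustrative paths and field names (the real shim also JSON-escapes the path, per the test suite below):

```shell
# Hypothetical sketch of an enqueue shim; paths and JSON shape are illustrative.
set -euo pipefail
QUEUE_DIR=$(mktemp -d)
QUEUE_FILE="$QUEUE_DIR/sync-queue.jsonl"

enqueue() {
  local path="$1" line
  line=$(printf '{"path":"%s","ts":"%s"}' "$path" "$(date -u +%Y-%m-%dT%H:%M:%SZ)")
  # A single short O_APPEND write keeps concurrent writers from interleaving
  # within a line, which is all a line-oriented queue needs.
  printf '%s\n' "$line" >> "$QUEUE_FILE"
}

enqueue "learnings-log.jsonl"
enqueue "developer-profile.json"
```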

gstack-config gains gbrain_sync_mode (off/artifacts-only/full) and
gbrain_sync_mode_prompted keys, plus GSTACK_HOME env alignment so
tests don't leak into real ~/.gstack/config.yaml.

* feat(gbrain-sync): --once drain + secret scan + push

bin/gstack-brain-sync is the core sync binary. Subcommands: --once
(drain queue, allowlist-filter, privacy-class-filter, secret-scan
staged diff, commit with template, push with fetch+merge retry),
--status, --skip-file <path>, --drop-queue --yes, --discover-new
(cursor-based detection of artifact writes that skip the shim).
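The filtering step of the `--once` drain can be sketched as dedupe-then-allowlist, with fabricated file names (the real binary also applies the privacy-class filter and secret scan before committing):

```shell
# Hypothetical sketch of the drain's allowlist filter; file names are made up.
set -euo pipefail
WORK=$(mktemp -d)
printf '%s\n' "learnings.jsonl" "secrets.env" "learnings.jsonl" > "$WORK/queue"
printf '%s\n' "learnings.jsonl" "timeline.jsonl" > "$WORK/allowlist"

drain_once() {
  # sort -u dedupes queued paths; grep -Fxf keeps only exact allowlist matches.
  sort -u "$WORK/queue" | grep -Fxf "$WORK/allowlist" || true
}

drain_once
```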

Secret regex families: AWS keys, GitHub tokens (ghp_/gho_/ghu_/ghs_/
ghr_/github_pat_), OpenAI sk-, PEM blocks, JWTs, bearer-token-in-JSON.
On hit: unstage, preserve queue, print remediation hint (--skip-file
or edit), exit clean. No daemon — invoked by preamble at skill
boundaries.
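Representative regexes for the listed families — these are plausible shapes, not the actual patterns shipped in gstack-brain-sync:

```shell
# Illustrative (NOT gstack's actual) patterns for the listed token families:
# AWS access keys, GitHub tokens, OpenAI keys, PEM blocks, JWTs.
set -euo pipefail
SECRET_RE='AKIA[0-9A-Z]{16}|gh[pousr]_[A-Za-z0-9]{20,}|github_pat_[A-Za-z0-9_]{20,}|sk-[A-Za-z0-9]{20,}|-----BEGIN [A-Z ]*PRIVATE KEY-----|eyJ[A-Za-z0-9_-]+\.eyJ[A-Za-z0-9_-]+\.[A-Za-z0-9_-]+'

scan() { grep -E -c "$SECRET_RE" || true; }   # count of matching lines on stdin

printf 'token=ghp_abcdefghijklmnopqrstuvwx\n' | scan   # a hit
printf 'plain text, no secrets\n' | scan               # no hit
```

In the real flow the count gates the commit: nonzero means unstage, keep the queue, and print the `--skip-file` remediation hint.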

* feat(gbrain-sync): init, restore, uninstall, consumer registry

bin/gstack-brain-init: idempotent first-run. git init ~/.gstack/,
.gitignore=*, canonical .brain-allowlist + .brain-privacy-map.json,
pre-commit secret-scan hook (defense-in-depth), merge driver registration
via git config, gh repo create --private OR arbitrary --remote <url>,
initial push, ~/.gstack-brain-remote.txt for new-machine discovery,
GBrain consumer registration via HTTP POST.
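The idempotent-init and merge-driver-registration pieces can be sketched as follows; `BRAIN_DIR` and the driver name are illustrative stand-ins for the real `~/.gstack/` setup:

```shell
# Hypothetical sketch of idempotent first-run init; names are illustrative.
set -euo pipefail
BRAIN_DIR=$(mktemp -d)

brain_init() {
  if [ ! -d "$BRAIN_DIR/.git" ]; then
    git -C "$BRAIN_DIR" init -q
  fi
  # Merge driver registration lives in .git/config, so it survives re-runs
  # of init but must be re-done on every fresh clone (it is never pushed).
  git -C "$BRAIN_DIR" config merge.jsonl.driver 'gstack-jsonl-merge %O %A %B'
  printf '*\n' > "$BRAIN_DIR/.gitignore"
}

brain_init
brain_init   # second run is a no-op apart from rewriting the same config
```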

bin/gstack-brain-restore: safe new-machine bootstrap. Refuses clobber
of existing allowlisted files, clones to staging, rsync-copies tracked
files, re-registers merge drivers (required — not cloned from remote),
rehydrates consumers.json, prompts for per-consumer tokens.
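The refuse-clobber guard amounts to a pre-flight existence check over the allowlisted files; a minimal sketch with a fabricated target dir:

```shell
# Hypothetical sketch of restore's refuse-clobber guard; paths are made up.
set -euo pipefail
TARGET=$(mktemp -d)
touch "$TARGET/learnings.jsonl"   # simulate a pre-existing local file

refuse_clobber() {
  local f
  for f in "$@"; do
    if [ -e "$TARGET/$f" ]; then
      echo "RESTORE: refusing to clobber existing $f" >&2
      return 1
    fi
  done
  echo "RESTORE: ok to proceed"
}

refuse_clobber "learnings.jsonl" "timeline.jsonl" \
  || echo "RESTORE: aborted; move the conflicting file aside first"
```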

bin/gstack-brain-uninstall: clean off-ramp. Removes .git + .brain-*
files + consumers.json + config keys. Preserves user data (learnings,
plans, retros, profile). Optional --delete-remote for GitHub repos.

bin/gstack-brain-consumer + bin/gstack-brain-reader (symlink alias):
registry management. Internal 'consumer' term; user-facing 'reader'
per DX review decision.

* feat(gbrain-sync): preamble block — privacy gate + boundary sync

scripts/resolvers/preamble/generate-brain-sync-block.ts emits bash that
runs at every skill invocation:
- Detects ~/.gstack-brain-remote.txt on machines without local .git
  and surfaces a restore-available hint (does NOT auto-run restore).
- Runs gstack-brain-sync --once at skill start to drain any pending
  writes (and at skill end via prose instruction).
- Once-per-day auto-pull (cached via .brain-last-pull) for append-only
  JSONL files.
- Emits BRAIN_SYNC: status line every skill run.

Also emits prose for the host LLM to fire the one-time privacy
stop-gate (full / artifacts-only / off) when gbrain is detected and
gbrain_sync_mode_prompted is false. Wired into preamble.ts composition.
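The once-per-day pull gate reduces to a day-stamp marker file; a sketch under the assumption that `.brain-last-pull` simply records the last pull date:

```shell
# Hypothetical sketch of the daily pull gate; marker semantics are assumed.
set -euo pipefail
STATE=$(mktemp -d)
MARKER="$STATE/.brain-last-pull"

should_pull() {
  local today
  today=$(date -u +%Y-%m-%d)
  if [ -f "$MARKER" ] && [ "$(cat "$MARKER")" = "$today" ]; then
    return 1   # already pulled today
  fi
  printf '%s' "$today" > "$MARKER"
  return 0
}

should_pull && echo "PULL: running daily pull"
should_pull || echo "PULL: skipped (already pulled today)"
```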

* test(gbrain-sync): 27-test consolidated suite

test/brain-sync.test.ts covers:
- Config: validation, defaults, GSTACK_HOME env isolation
- Enqueue: no-op gates, skip list, concurrent atomicity, JSON escape
- JSONL merge driver: 3-way + ts-sort + SHA-256 fallback
- Init + sync: canonical file creation, merge driver registration,
  push-reject + fetch+merge retry path
- Init refuses different remote (idempotency)
- Cross-machine restore round-trip (machine A write → machine B sees)
- Secret scan across all 6 regex families (AWS, GH, OpenAI, PEM, JWT,
  bearer-JSON). --skip-file unblock remediation
- Uninstall removes sync config, preserves user data
- --discover-new idempotence via mtime+size cursor

Behaviors were verified via integration smoke runs during implementation.
Known follow-up: bun test's 5s default per-test timeout needs a 30s
override for the spawnSync-heavy tests.

* docs(gbrain-sync): user guide + error lookup + README section

docs/gbrain-sync.md: setup walkthrough, privacy modes, cross-machine
workflow, secret protection, two-machine conflict handling, uninstall,
troubleshooting reference.

docs/gbrain-sync-errors.md: problem/cause/fix index for every
user-visible error. Patterned on Rust's error docs + Stripe's API
error reference.

README.md: 'Cross-machine memory with GBrain sync' section near the
top (discovery moment), plus docs-table entry.

* chore: bump version and changelog (v1.7.0.0)

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* chore: regenerate SKILL.md files for gbrain-sync preamble block

Re-runs bun run gen:skill-docs after adding generateBrainSyncBlock
to scripts/resolvers/preamble.ts in a2aa8a07. CI check-freshness
caught the drift. All 36 SKILL.md files regenerated with the new
skill-start bash block + privacy-gate prose + skill-end sync
instructions baked in.

* fix(test): session-awareness reads AskUserQuestion Format from a Tier 2+ SKILL.md

The test was reading ROOT/SKILL.md (browse skill, Tier 1) which never
contained '## AskUserQuestion Format' — that section is only emitted
for Tier 2+ skills by scripts/resolvers/preamble.ts. As a result the
agent was prompted with an empty format guide and only emitted
'RECOMMENDATION' intermittently, making the test flaky.

Pre-existing on main (ROOT/SKILL.md has the same shape there); it surfaced
now because this particular agent run never emitted any of the
'RECOMMENDATION'/'recommend'/'option a' fallback strings.

Fix: read from office-hours/SKILL.md (Tier 3, always has the section)
with a fallback that scans for the first top-level skill dir whose
SKILL.md contains the header. Future template moves won't break this
test again.
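The fallback scan described above can be sketched as a loop over top-level skill dirs; the directory layout here is fabricated for the demo:

```shell
# Hypothetical sketch of the SKILL.md fallback scan; layout is fabricated.
set -euo pipefail
ROOT=$(mktemp -d)
mkdir -p "$ROOT/browse" "$ROOT/office-hours"
printf '# browse\n' > "$ROOT/browse/SKILL.md"
printf '# office-hours\n\n## AskUserQuestion Format\n' > "$ROOT/office-hours/SKILL.md"

find_format_skill() {
  local d
  for d in "$ROOT"/*/; do
    # Return the first skill dir whose SKILL.md contains the section header.
    if grep -q '^## AskUserQuestion Format$' "$d/SKILL.md" 2>/dev/null; then
      basename "$d"
      return 0
    fi
  done
  return 1
}

find_format_skill
```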

* chore: bump to v1.9.0.0 for gbrain-sync landing

Changes just the VERSION + package.json + CHANGELOG header (1.7.0.0 → 1.9.0.0
and date 2026-04-22 → 2026-04-23). No code changes. User call: land gbrain-sync
as a bigger-signal release above main's 1.6.4.0, skipping 1.8.0.0.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-23 17:54:54 -07:00

451 lines
16 KiB
Bash
Executable File

#!/usr/bin/env bash
# gstack-developer-profile — unified developer profile access and derivation.
#
# Supersedes bin/gstack-builder-profile. The old binary remains as a legacy
# shim that delegates to `gstack-developer-profile --read`.
#
# Subcommands:
#   --read (default)    emit KEY: VALUE pairs in builder-profile format
#                       for /office-hours compatibility.
#   --derive            recompute inferred dimensions from question events;
#                       write updated ~/.gstack/developer-profile.json.
#   --profile           emit the full profile as JSON (all fields).
#   --gap               emit declared-vs-inferred gap as JSON.
#   --trace <dim>       show events that contributed to a dimension.
#   --narrative         (v2 stub) output a coach bio paragraph.
#   --vibe              (v2 stub) output the one-word archetype.
#   --check-mismatch    detect meaningful gaps between declared and observed.
#   --migrate           migrate builder-profile.jsonl → developer-profile.json.
#                       Idempotent; archives the source file on success.
#
# Profile file: ~/.gstack/developer-profile.json (unified schema — see
# docs/designs/PLAN_TUNING_V0.md). Event file: ~/.gstack/projects/{SLUG}/
# question-log.jsonl.
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
ROOT_DIR="$(cd "$SCRIPT_DIR/.." && pwd)"
GSTACK_HOME="${GSTACK_HOME:-$HOME/.gstack}"
PROFILE_FILE="$GSTACK_HOME/developer-profile.json"
LEGACY_FILE="$GSTACK_HOME/builder-profile.jsonl"
eval "$("$SCRIPT_DIR/gstack-slug" 2>/dev/null || true)"
SLUG="${SLUG:-unknown}"
CMD="${1:---read}"
shift || true
# -----------------------------------------------------------------------
# Migration: builder-profile.jsonl → developer-profile.json
# -----------------------------------------------------------------------
do_migrate() {
  if [ ! -f "$LEGACY_FILE" ]; then
    echo "MIGRATE: no legacy file to migrate"
    return 0
  fi
  if [ -f "$PROFILE_FILE" ]; then
    # Already migrated — no-op (idempotent).
    echo "MIGRATE: already migrated (developer-profile.json exists)"
    return 0
  fi
  # Run migration in a temp file, then atomic rename. (GNU mktemp requires
  # the X's to be trailing, so the random suffix goes last.)
  local TMPOUT
  TMPOUT=$(mktemp "$GSTACK_HOME/developer-profile.json.tmp.XXXXXX")
  trap 'rm -f "$TMPOUT"' EXIT
  cat "$LEGACY_FILE" | bun -e "
    const lines = (await Bun.stdin.text()).trim().split('\n').filter(Boolean);
    const sessions = [];
    const signalsAcc = {};
    const resources = new Set();
    const topics = new Set();
    for (const line of lines) {
      try {
        const e = JSON.parse(line);
        sessions.push(e);
        for (const s of (e.signals || [])) {
          signalsAcc[s] = (signalsAcc[s] || 0) + 1;
        }
        for (const r of (e.resources_shown || [])) resources.add(r);
        for (const t of (e.topics || [])) topics.add(t);
      } catch {}
    }
    const profile = {
      identity: {},
      declared: {},
      inferred: {
        values: {
          scope_appetite: 0.5,
          risk_tolerance: 0.5,
          detail_preference: 0.5,
          autonomy: 0.5,
          architecture_care: 0.5,
        },
        sample_size: 0,
        diversity: { skills_covered: 0, question_ids_covered: 0, days_span: 0 },
      },
      gap: {},
      overrides: {},
      sessions,
      signals_accumulated: signalsAcc,
      resources_shown: Array.from(resources),
      topics: Array.from(topics),
      migrated_at: new Date().toISOString(),
      schema_version: 1,
    };
    console.log(JSON.stringify(profile, null, 2));
  " > "$TMPOUT"
  # Atomic rename.
  mv "$TMPOUT" "$PROFILE_FILE"
  trap - EXIT
  # gbrain-sync: enqueue the migrated file for cross-machine sync (no-op if off).
  "$SCRIPT_DIR/gstack-brain-enqueue" "developer-profile.json" 2>/dev/null &
  # Archive the legacy file.
  local TS
  TS="$(date +%Y-%m-%d-%H%M%S)"
  mv "$LEGACY_FILE" "$LEGACY_FILE.migrated-$TS"
  local COUNT
  COUNT=$(bun -e "console.log(JSON.parse(require('fs').readFileSync('$PROFILE_FILE','utf-8')).sessions.length)" 2>/dev/null || echo "?")
  echo "MIGRATE: ok — migrated $COUNT sessions from builder-profile.jsonl"
}
# -----------------------------------------------------------------------
# Load-or-migrate helper: ensure developer-profile.json exists.
# Auto-migrates from builder-profile.jsonl if present.
# Returns path to profile file via stdout. Creates a minimal stub if nothing exists.
# -----------------------------------------------------------------------
ensure_profile() {
  if [ -f "$PROFILE_FILE" ]; then
    return 0
  fi
  if [ -f "$LEGACY_FILE" ]; then
    do_migrate >/dev/null
    return 0
  fi
  # Nothing yet — create a stub.
  mkdir -p "$GSTACK_HOME"
  cat > "$PROFILE_FILE" <<EOF
{
  "identity": {},
  "declared": {},
  "inferred": {
    "values": {
      "scope_appetite": 0.5,
      "risk_tolerance": 0.5,
      "detail_preference": 0.5,
      "autonomy": 0.5,
      "architecture_care": 0.5
    },
    "sample_size": 0,
    "diversity": { "skills_covered": 0, "question_ids_covered": 0, "days_span": 0 }
  },
  "gap": {},
  "overrides": {},
  "sessions": [],
  "signals_accumulated": {},
  "schema_version": 1
}
EOF
}
# -----------------------------------------------------------------------
# Read: emit legacy KEY: VALUE output for /office-hours compat.
# -----------------------------------------------------------------------
do_read() {
  ensure_profile
  cat "$PROFILE_FILE" | bun -e "
    const p = JSON.parse(await Bun.stdin.text());
    const sessions = p.sessions || [];
    const count = sessions.length;
    let tier = 'introduction';
    if (count >= 8) tier = 'inner_circle';
    else if (count >= 4) tier = 'regular';
    else if (count >= 1) tier = 'welcome_back';
    const last = sessions[count - 1] || {};
    const prev = sessions[count - 2] || {};
    const crossProject = prev.project_slug && last.project_slug
      ? prev.project_slug !== last.project_slug
      : false;
    const designs = sessions.map(e => e.design_doc || '').filter(Boolean);
    const designTitles = sessions
      .map(e => (e.design_doc ? (e.project_slug || 'unknown') : ''))
      .filter(Boolean);
    const signalCounts = p.signals_accumulated || {};
    let totalSignals = 0;
    for (const v of Object.values(signalCounts)) totalSignals += v;
    const signalStr = Object.entries(signalCounts).map(([k,v]) => k + ':' + v).join(',');
    const builderSessions = sessions.filter(e => e.mode !== 'startup').length;
    const nudgeEligible = builderSessions >= 3 && totalSignals >= 5;
    const resources = p.resources_shown || [];
    const topics = p.topics || [];
    console.log('SESSION_COUNT: ' + count);
    console.log('TIER: ' + tier);
    console.log('LAST_PROJECT: ' + (last.project_slug || ''));
    console.log('LAST_ASSIGNMENT: ' + (last.assignment || ''));
    console.log('LAST_DESIGN_TITLE: ' + (last.design_doc || ''));
    console.log('DESIGN_COUNT: ' + designs.length);
    console.log('DESIGN_TITLES: ' + JSON.stringify(designTitles));
    console.log('ACCUMULATED_SIGNALS: ' + signalStr);
    console.log('TOTAL_SIGNAL_COUNT: ' + totalSignals);
    console.log('CROSS_PROJECT: ' + crossProject);
    console.log('NUDGE_ELIGIBLE: ' + nudgeEligible);
    console.log('RESOURCES_SHOWN: ' + resources.join(','));
    console.log('RESOURCES_SHOWN_COUNT: ' + resources.length);
    console.log('TOPICS: ' + topics.join(','));
  "
}
# -----------------------------------------------------------------------
# Profile: emit the full JSON
# -----------------------------------------------------------------------
do_profile() {
  ensure_profile
  cat "$PROFILE_FILE"
}
# -----------------------------------------------------------------------
# Gap: declared vs inferred diff
# -----------------------------------------------------------------------
do_gap() {
  ensure_profile
  cat "$PROFILE_FILE" | bun -e "
    const p = JSON.parse(await Bun.stdin.text());
    const declared = p.declared || {};
    const inferred = (p.inferred && p.inferred.values) || {};
    const dims = ['scope_appetite','risk_tolerance','detail_preference','autonomy','architecture_care'];
    const gap = {};
    for (const d of dims) {
      if (declared[d] !== undefined && inferred[d] !== undefined) {
        gap[d] = +(Math.abs(declared[d] - inferred[d])).toFixed(3);
      }
    }
    console.log(JSON.stringify({ declared, inferred, gap }, null, 2));
  "
}
# -----------------------------------------------------------------------
# Derive: recompute inferred dimensions from question-log.jsonl
# -----------------------------------------------------------------------
do_derive() {
  ensure_profile
  local EVENTS="$GSTACK_HOME/projects/$SLUG/question-log.jsonl"
  local REGISTRY="$ROOT_DIR/scripts/question-registry.ts"
  local SIGNALS="$ROOT_DIR/scripts/psychographic-signals.ts"
  if [ ! -f "$REGISTRY" ] || [ ! -f "$SIGNALS" ]; then
    echo "DERIVE: registry or signals file missing, cannot derive" >&2
    exit 1
  fi
  cd "$ROOT_DIR"
  PROFILE_FILE_PATH="$PROFILE_FILE" EVENTS_PATH="$EVENTS" bun -e "
    import('./scripts/question-registry.ts').then(async (regmod) => {
      const sigmod = await import('./scripts/psychographic-signals.ts');
      const fs = require('fs');
      const { QUESTIONS } = regmod;
      const { SIGNAL_MAP, applySignal, newDimensionTotals, normalizeToDimensionValue } = sigmod;
      const profilePath = process.env.PROFILE_FILE_PATH;
      const eventsPath = process.env.EVENTS_PATH;
      const profile = JSON.parse(fs.readFileSync(profilePath, 'utf-8'));
      let lines = [];
      if (fs.existsSync(eventsPath)) {
        lines = fs.readFileSync(eventsPath, 'utf-8').trim().split('\n').filter(Boolean);
      }
      const totals = newDimensionTotals();
      const skills = new Set();
      const qids = new Set();
      const days = new Set();
      let count = 0;
      for (const line of lines) {
        let e;
        try { e = JSON.parse(line); } catch { continue; }
        if (!e.question_id || !e.user_choice) continue;
        count++;
        skills.add(e.skill);
        qids.add(e.question_id);
        if (e.ts) days.add(String(e.ts).slice(0,10));
        const def = QUESTIONS[e.question_id];
        if (def && def.signal_key) {
          applySignal(totals, def.signal_key, e.user_choice);
        }
      }
      const values = {};
      for (const [dim, total] of Object.entries(totals)) {
        values[dim] = +normalizeToDimensionValue(total).toFixed(3);
      }
      profile.inferred = {
        values,
        sample_size: count,
        diversity: {
          skills_covered: skills.size,
          question_ids_covered: qids.size,
          days_span: days.size,
        },
      };
      // Recompute gap.
      const gap = {};
      for (const d of Object.keys(values)) {
        if (profile.declared && profile.declared[d] !== undefined) {
          gap[d] = +(Math.abs(profile.declared[d] - values[d])).toFixed(3);
        }
      }
      profile.gap = gap;
      profile.derived_at = new Date().toISOString();
      const tmp = profilePath + '.tmp';
      fs.writeFileSync(tmp, JSON.stringify(profile, null, 2));
      fs.renameSync(tmp, profilePath);
      console.log('DERIVE: ok — ' + count + ' events, ' + skills.size + ' skills, ' + qids.size + ' questions');
    }).catch(err => { console.error('DERIVE:', err.message); process.exit(1); });
  "
}
# -----------------------------------------------------------------------
# Trace: show events contributing to a dimension
# -----------------------------------------------------------------------
do_trace() {
  local DIM="${1:-}"
  if [ -z "$DIM" ]; then
    echo "TRACE: missing dimension argument" >&2
    exit 1
  fi
  local EVENTS="$GSTACK_HOME/projects/$SLUG/question-log.jsonl"
  if [ ! -f "$EVENTS" ]; then
    echo "TRACE: no events for this project"
    return 0
  fi
  cd "$ROOT_DIR"
  EVENTS_PATH="$EVENTS" TRACE_DIM="$DIM" bun -e "
    import('./scripts/question-registry.ts').then(async (regmod) => {
      const sigmod = await import('./scripts/psychographic-signals.ts');
      const fs = require('fs');
      const { QUESTIONS } = regmod;
      const { SIGNAL_MAP } = sigmod;
      const target = process.env.TRACE_DIM;
      const lines = fs.readFileSync(process.env.EVENTS_PATH, 'utf-8').trim().split('\n').filter(Boolean);
      const rows = [];
      for (const line of lines) {
        let e;
        try { e = JSON.parse(line); } catch { continue; }
        const def = QUESTIONS[e.question_id];
        if (!def || !def.signal_key) continue;
        const deltas = SIGNAL_MAP[def.signal_key]?.[e.user_choice] || [];
        for (const d of deltas) {
          if (d.dim === target) {
            rows.push({ ts: e.ts, question_id: e.question_id, choice: e.user_choice, delta: d.delta });
          }
        }
      }
      if (rows.length === 0) {
        console.log('TRACE: no events contribute to ' + target);
      } else {
        console.log('TRACE: ' + rows.length + ' events for ' + target);
        for (const r of rows) {
          console.log('  ' + (r.ts || '').slice(0,19) + ' ' + r.question_id + ' → ' + r.choice + ' (' + (r.delta > 0 ? '+' : '') + r.delta + ')');
        }
      }
    });
  "
}
# -----------------------------------------------------------------------
# Check mismatch: flag when declared ≠ inferred by > threshold
# -----------------------------------------------------------------------
do_check_mismatch() {
  ensure_profile
  cat "$PROFILE_FILE" | bun -e "
    const p = JSON.parse(await Bun.stdin.text());
    const declared = p.declared || {};
    const inferred = (p.inferred && p.inferred.values) || {};
    const sampleSize = (p.inferred && p.inferred.sample_size) || 0;
    // Require enough data before reporting mismatch.
    if (sampleSize < 10) {
      console.log('MISMATCH: not enough data (' + sampleSize + ' events; need 10+)');
      process.exit(0);
    }
    const THRESHOLD = 0.3;
    const flagged = [];
    for (const d of Object.keys(declared)) {
      if (inferred[d] === undefined) continue;
      const gap = Math.abs(declared[d] - inferred[d]);
      if (gap > THRESHOLD) {
        flagged.push({ dim: d, declared: declared[d], inferred: inferred[d], gap: +gap.toFixed(3) });
      }
    }
    if (flagged.length === 0) {
      console.log('MISMATCH: none');
    } else {
      console.log('MISMATCH: ' + flagged.length + ' dimension(s) disagree (gap > ' + THRESHOLD + ')');
      for (const f of flagged) {
        console.log('  ' + f.dim + ': declared ' + f.declared + ' vs inferred ' + f.inferred + ' (gap ' + f.gap + ')');
      }
    }
  "
}
# -----------------------------------------------------------------------
# Narrative + Vibe (v2 stubs)
# -----------------------------------------------------------------------
do_narrative() {
  echo "NARRATIVE: (v2 — not yet implemented; use /plan-tune profile for now)"
}
do_vibe() {
  ensure_profile
  cd "$ROOT_DIR"
  # Profile is passed via env; bun reads PROFILE_DATA, not stdin.
  PROFILE_DATA="$(cat "$PROFILE_FILE")" bun -e "
    import('./scripts/archetypes.ts').then(async (mod) => {
      const p = JSON.parse(process.env.PROFILE_DATA);
      const dims = (p.inferred && p.inferred.values) || {
        scope_appetite: 0.5, risk_tolerance: 0.5, detail_preference: 0.5,
        autonomy: 0.5, architecture_care: 0.5,
      };
      const arch = mod.matchArchetype(dims);
      console.log(arch.name);
      console.log(arch.description);
    });
  "
}
# -----------------------------------------------------------------------
# Dispatch
# -----------------------------------------------------------------------
case "$CMD" in
  --read)           do_read ;;
  --profile)        do_profile ;;
  --gap)            do_gap ;;
  --derive)         do_derive ;;
  --trace)          do_trace "$@" ;;
  --narrative)      do_narrative ;;
  --vibe)           do_vibe ;;
  --check-mismatch) do_check_mismatch ;;
  --migrate)        do_migrate ;;
  # Print the header comment as usage. (\? is a GNU-only BRE extension, so
  # strip '# ' and bare '#' in two portable steps.)
  --help|-h)        sed -n '1,/^set -euo/p' "$0" | sed -e 's|^# ||' -e 's|^#||' ;;
  *)
    echo "gstack-developer-profile: unknown subcommand '$CMD'" >&2
    echo "run --help for usage" >&2
    exit 1
    ;;
esac