mirror of
https://github.com/garrytan/gstack.git
synced 2026-05-01 19:25:10 +02:00
feat: v0.3.2 — project-local state, diff-aware QA, Greptile integration (#36)
* fix: cookie import picker returns JSON instead of HTML
  jsonResponse() was defined at module scope but referenced `url`, which only existed as a parameter of handleCookiePickerRoute(). Every API call crashed, the catch block also crashed, and Bun returned a default HTML page that the frontend couldn't parse as JSON. Thread the port via a corsOrigin() helper and options objects. Add route-level tests to prevent this class of bug from shipping again.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: add help command to browse server
  Agents that don't have SKILL.md loaded (or misread flags) had no way to self-discover the CLI. The help command returns a formatted reference of all commands and snapshot flags.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: version-aware find-browse with META signal protocol
  Agents in other workspaces found stale browse binaries that were missing newer flags. find-browse now compares the local binary's git SHA against origin/main via git ls-remote (4-hour cache) and emits META:UPDATE_AVAILABLE when behind. SKILL.md setup checks parse META signals and prompt the user to update.
  - New compiled binary: browse/dist/find-browse (TypeScript, testable)
  - Bash shim at browse/bin/find-browse delegates to the compiled binary
  - .version file written at build time with the git commit SHA
  - Build script compiles both browse and find-browse binaries
  - Graceful degradation: offline, missing .version, and corrupt cache all skip the check
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: clean up .bun-build temp files after compile
  bun build --compile leaves ~58MB temp files in the working directory. Add rm -f .*.bun-build to the build script to clean up after each build.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: make help command reachable by removing it from META_COMMANDS
  help was in META_COMMANDS, so it dispatched to handleMetaCommand(), which threw "Unknown meta command: help". Removing it from the set lets the dedicated else-if handler in handleCommand() execute correctly.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: bump version and changelog (v0.3.2)
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: add shared Greptile comment triage reference doc
  Shared reference for fetching, filtering, and classifying Greptile review comments on GitHub PRs, used by both the /review and /ship skills. Includes parallel API fetching, a suppressions check, classification logic, reply APIs, and history file writes.

* feat: make /review and /ship Greptile-aware
  /review: Step 2.5 fetches and classifies Greptile comments; Step 5 resolves them with AskUserQuestion for valid issues and false positives. /ship: Step 3.75 triages Greptile comments between the pre-landing review and the version bump, adds a Greptile Review section to the PR body in Step 8, and re-runs tests if any Greptile fixes are applied.

* feat: add Greptile batting average to /retro
  Reads ~/.gstack/greptile-history.md, computes the signal ratio (valid catches vs false positives), and includes it in the metrics table, the JSON snapshot, and the Code Quality Signals narrative.

* docs: add Greptile integration section to README
  Personal endorsement, two-layer review narrative, full UX walkthrough transcript, and skills table updates. Add a Greptile training feedback loop to TODO.md future ideas.

* feat: add local dev mode for testing skills from within the repo
  bin/dev-setup creates a .claude/skills/gstack symlink to the working tree so Claude Code discovers skills locally. bin/dev-teardown cleans up. DEVELOPING_GSTACK.md documents the workflow.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: narrow gitignore to .claude/skills/ instead of all .claude/
  Avoids ignoring legitimate Claude Code config like settings.json or CLAUDE.md.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs: rename DEVELOPING_GSTACK.md to CONTRIBUTING.md
  Rewritten as a contributor-friendly guide instead of a dry plan doc.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs: explain why dev-setup is needed in the CONTRIBUTING.md quick start
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: add browser interaction guidance to CLAUDE.md
  Prevents Claude from using mcp__claude-in-chrome__* tools instead of /browse.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: add shared config module for project-local browse state
  Centralizes path resolution (git root detection, state dir, log paths) into config.ts. Both cli.ts and server.ts import from it, eliminating duplicated PORT_OFFSET/BROWSE_PORT/STATE_FILE logic.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: rewrite port selection to use random ports
  Replace the CONDUCTOR_PORT magic offset and the 9400-9409 scan with a random port in 10000-60000. Atomic state file writes, log paths from the config module, and a binaryVersion field for auto-restart on update.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: move browse state from /tmp to project-local .gstack/
  The CLI now uses the config module for state paths and passes BROWSE_STATE_FILE to the spawned server. Adds version-mismatch auto-restart and legacy /tmp cleanup with PID verification, and removes the stale global install fallback.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: update crash log path reference to .gstack/
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* test: add config tests and update CLI lifecycle test
  14 new tests for config resolution, ensureStateDir, readVersionHash, resolveServerScript, and version mismatch detection. Remove obsolete CONDUCTOR_PORT/BROWSE_PORT filtering from commands.test.ts.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs: update BROWSER.md and TODO.md for project-local state
  Replace /tmp paths with .gstack/, remove CONDUCTOR_PORT docs, and document random port selection and per-project isolation. Add a server bundling TODO.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs: update README, CHANGELOG, and CONTRIBUTING for v0.3.2
  - README: replace Conductor-aware language with project-local isolation, add a Greptile setup note
  - CHANGELOG: comprehensive v0.3.2 entry with all state management changes
  - CONTRIBUTING: add instructions for testing branches in other repos
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: add diff-aware mode to /qa — auto-tests affected pages from the branch diff
  When on a feature branch, /qa now reads git diff main, identifies affected pages/routes from the changed files, and tests them automatically. No URL required. The most natural flow: write code, /ship, /qa.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: update CHANGELOG for complete v0.3.2 coverage
  Add missing entries: diff-aware QA mode, Greptile integration, local dev mode, crash log path fix, README/SKILL.md updates.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
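The META signal protocol is line-oriented: line 1 of find-browse's stdout is the binary path, and any further lines are `META:<TYPE> <json>`. A minimal consumer-side sketch of how a setup check could parse that output (the `parseFindBrowseOutput` helper and `MetaSignal` type are hypothetical, not part of the repo):

```typescript
// Hypothetical sketch of a META-signal consumer; names are illustrative.
interface MetaSignal {
  type: string;     // e.g. "UPDATE_AVAILABLE"
  payload: unknown; // parsed JSON payload, or null if absent/unparsable
}

function parseFindBrowseOutput(stdout: string): { binaryPath: string; metas: MetaSignal[] } {
  const lines = stdout.trim().split('\n');
  const binaryPath = lines[0] ?? '';
  const metas: MetaSignal[] = [];
  for (const line of lines.slice(1)) {
    const m = line.match(/^META:(\S+)\s*(.*)$/);
    if (!m) continue; // non-META lines are ignored, matching the protocol's best-effort spirit
    let payload: unknown = null;
    try { payload = m[2] ? JSON.parse(m[2]) : null; } catch { /* keep null */ }
    metas.push({ type: m[1]!, payload });
  }
  return { binaryPath, metas };
}
```

A caller would run find-browse, feed its stdout to a parser like this, and prompt the user to update when an `UPDATE_AVAILABLE` signal is present.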
@@ -1,5 +1,11 @@
 #!/bin/bash
-# Find the gstack browse binary. Echoes path and exits 0, or exits 1 if not found.
+# Shim: delegates to compiled find-browse binary, falls back to basic discovery.
+# The compiled binary adds version checking and META signal support.
+DIR="$(cd "$(dirname "$0")/.." && pwd)/dist"
+if test -x "$DIR/find-browse"; then
+  exec "$DIR/find-browse" "$@"
+fi
+# Fallback: basic discovery (no version check)
 ROOT=$(git rev-parse --show-toplevel 2>/dev/null)
 if [ -n "$ROOT" ] && test -x "$ROOT/.claude/skills/gstack/browse/dist/browse"; then
   echo "$ROOT/.claude/skills/gstack/browse/dist/browse"
@@ -47,7 +47,7 @@ export class BrowserManager {
     // Chromium crash → exit with clear message
     this.browser.on('disconnected', () => {
       console.error('[browse] FATAL: Chromium process crashed or was killed. Server exiting.');
-      console.error('[browse] Console/network logs flushed to /tmp/browse-*.log');
+      console.error('[browse] Console/network logs flushed to .gstack/browse-*.log');
       process.exit(1);
     });

+80 -13
@@ -2,22 +2,18 @@
  * gstack CLI — thin wrapper that talks to the persistent server
  *
  * Flow:
- * 1. Read /tmp/browse-server.json for port + token
+ * 1. Read .gstack/browse.json for port + token
  * 2. If missing or stale PID → start server in background
- * 3. Health check
+ * 3. Health check + version mismatch detection
  * 4. Send command via HTTP POST
  * 5. Print response to stdout (or stderr for errors)
  */

 import * as fs from 'fs';
 import * as path from 'path';
+import { resolveConfig, ensureStateDir, readVersionHash } from './config';

-const PORT_OFFSET = 45600;
-const BROWSE_PORT = process.env.CONDUCTOR_PORT
-  ? parseInt(process.env.CONDUCTOR_PORT, 10) - PORT_OFFSET
-  : parseInt(process.env.BROWSE_PORT || '0', 10);
-const INSTANCE_SUFFIX = BROWSE_PORT ? `-${BROWSE_PORT}` : '';
-const STATE_FILE = process.env.BROWSE_STATE_FILE || `/tmp/browse-server${INSTANCE_SUFFIX}.json`;
+const config = resolveConfig();
 const MAX_START_WAIT = 8000; // 8 seconds to start

 export function resolveServerScript(
@@ -45,8 +41,9 @@ export function resolveServerScript(
     }
   }

-  // Legacy fallback for user-level installs
-  return path.resolve(env.HOME || '/tmp', '.claude/skills/gstack/browse/src/server.ts');
+  throw new Error(
+    'Cannot find server.ts. Set BROWSE_SERVER_SCRIPT env or run from the browse source tree.'
+  );
 }

 const SERVER_SCRIPT = resolveServerScript();
@@ -57,12 +54,13 @@ interface ServerState {
   token: string;
   startedAt: string;
   serverPath: string;
+  binaryVersion?: string;
 }

 // ─── State File ────────────────────────────────────────────────
 function readState(): ServerState | null {
   try {
-    const data = fs.readFileSync(STATE_FILE, 'utf-8');
+    const data = fs.readFileSync(config.stateFile, 'utf-8');
     return JSON.parse(data);
   } catch {
     return null;
@@ -78,15 +76,73 @@ function isProcessAlive(pid: number): boolean {
   }
 }

+// ─── Process Management ─────────────────────────────────────────
+async function killServer(pid: number): Promise<void> {
+  if (!isProcessAlive(pid)) return;
+
+  try { process.kill(pid, 'SIGTERM'); } catch { return; }
+
+  // Wait up to 2s for graceful shutdown
+  const deadline = Date.now() + 2000;
+  while (Date.now() < deadline && isProcessAlive(pid)) {
+    await Bun.sleep(100);
+  }
+
+  // Force kill if still alive
+  if (isProcessAlive(pid)) {
+    try { process.kill(pid, 'SIGKILL'); } catch {}
+  }
+}
+
+/**
+ * Clean up legacy /tmp/browse-server*.json files from before project-local state.
+ * Verifies PID ownership before sending signals.
+ */
+function cleanupLegacyState(): void {
+  try {
+    const files = fs.readdirSync('/tmp').filter(f => f.startsWith('browse-server') && f.endsWith('.json'));
+    for (const file of files) {
+      const fullPath = `/tmp/${file}`;
+      try {
+        const data = JSON.parse(fs.readFileSync(fullPath, 'utf-8'));
+        if (data.pid && isProcessAlive(data.pid)) {
+          // Verify this is actually a browse server before killing
+          const check = Bun.spawnSync(['ps', '-p', String(data.pid), '-o', 'command='], {
+            stdout: 'pipe', stderr: 'pipe', timeout: 2000,
+          });
+          const cmd = check.stdout.toString().trim();
+          if (cmd.includes('bun') || cmd.includes('server.ts')) {
+            try { process.kill(data.pid, 'SIGTERM'); } catch {}
+          }
+        }
+        fs.unlinkSync(fullPath);
+      } catch {
+        // Best effort — skip files we can't parse or clean up
+      }
+    }
+    // Clean up legacy log files too
+    const logFiles = fs.readdirSync('/tmp').filter(f =>
+      f.startsWith('browse-console') || f.startsWith('browse-network') || f.startsWith('browse-dialog')
+    );
+    for (const file of logFiles) {
+      try { fs.unlinkSync(`/tmp/${file}`); } catch {}
+    }
+  } catch {
+    // /tmp read failed — skip legacy cleanup
+  }
+}
+
 // ─── Server Lifecycle ──────────────────────────────────────────
 async function startServer(): Promise<ServerState> {
+  ensureStateDir(config);
+
   // Clean up stale state file
-  try { fs.unlinkSync(STATE_FILE); } catch {}
+  try { fs.unlinkSync(config.stateFile); } catch {}

   // Start server as detached background process
   const proc = Bun.spawn(['bun', 'run', SERVER_SCRIPT], {
     stdio: ['ignore', 'pipe', 'pipe'],
-    env: { ...process.env },
+    env: { ...process.env, BROWSE_STATE_FILE: config.stateFile },
   });

   // Don't hold the CLI open
@@ -120,6 +176,14 @@ async function ensureServer(): Promise<ServerState> {
   const state = readState();

   if (state && isProcessAlive(state.pid)) {
+    // Check for binary version mismatch (auto-restart on update)
+    const currentVersion = readVersionHash();
+    if (currentVersion && state.binaryVersion && currentVersion !== state.binaryVersion) {
+      console.error('[browse] Binary updated, restarting server...');
+      await killServer(state.pid);
+      return startServer();
+    }
+
     // Server appears alive — do a health check
     try {
       const resp = await fetch(`http://127.0.0.1:${state.port}/health`, {
@@ -237,6 +301,9 @@ Refs: After 'snapshot', use @e1, @e2... as selectors:
   process.exit(0);
 }

+// One-time cleanup of legacy /tmp state files
+cleanupLegacyState();
+
 const command = args[0];
 const commandArgs = args.slice(1);
@@ -0,0 +1,105 @@
+/**
+ * Shared config for browse CLI + server.
+ *
+ * Resolution:
+ *   1. BROWSE_STATE_FILE env → derive stateDir from parent
+ *   2. git rev-parse --show-toplevel → projectDir/.gstack/
+ *   3. process.cwd() fallback (non-git environments)
+ *
+ * The CLI computes the config and passes BROWSE_STATE_FILE to the
+ * spawned server. The server derives all paths from that env var.
+ */
+
+import * as fs from 'fs';
+import * as path from 'path';
+
+export interface BrowseConfig {
+  projectDir: string;
+  stateDir: string;
+  stateFile: string;
+  consoleLog: string;
+  networkLog: string;
+  dialogLog: string;
+}
+
+/**
+ * Detect the git repository root, or null if not in a repo / git unavailable.
+ */
+export function getGitRoot(): string | null {
+  try {
+    const proc = Bun.spawnSync(['git', 'rev-parse', '--show-toplevel'], {
+      stdout: 'pipe',
+      stderr: 'pipe',
+      timeout: 2_000, // Don't hang if .git is broken
+    });
+    if (proc.exitCode !== 0) return null;
+    return proc.stdout.toString().trim() || null;
+  } catch {
+    return null;
+  }
+}
+
+/**
+ * Resolve all browse config paths.
+ *
+ * If BROWSE_STATE_FILE is set (e.g. by CLI when spawning server, or by
+ * tests for isolation), all paths are derived from it. Otherwise, the
+ * project root is detected via git or cwd.
+ */
+export function resolveConfig(
+  env: Record<string, string | undefined> = process.env,
+): BrowseConfig {
+  let stateFile: string;
+  let stateDir: string;
+  let projectDir: string;
+
+  if (env.BROWSE_STATE_FILE) {
+    stateFile = env.BROWSE_STATE_FILE;
+    stateDir = path.dirname(stateFile);
+    projectDir = path.dirname(stateDir); // parent of .gstack/
+  } else {
+    projectDir = getGitRoot() || process.cwd();
+    stateDir = path.join(projectDir, '.gstack');
+    stateFile = path.join(stateDir, 'browse.json');
+  }
+
+  return {
+    projectDir,
+    stateDir,
+    stateFile,
+    consoleLog: path.join(stateDir, 'browse-console.log'),
+    networkLog: path.join(stateDir, 'browse-network.log'),
+    dialogLog: path.join(stateDir, 'browse-dialog.log'),
+  };
+}
+
+/**
+ * Create the .gstack/ state directory if it doesn't exist.
+ * Throws with a clear message on permission errors.
+ */
+export function ensureStateDir(config: BrowseConfig): void {
+  try {
+    fs.mkdirSync(config.stateDir, { recursive: true });
+  } catch (err: any) {
+    if (err.code === 'EACCES') {
+      throw new Error(`Cannot create state directory ${config.stateDir}: permission denied`);
+    }
+    if (err.code === 'ENOTDIR') {
+      throw new Error(`Cannot create state directory ${config.stateDir}: a file exists at that path`);
+    }
+    throw err;
+  }
+}
+
+/**
+ * Read the binary version (git SHA) from browse/dist/.version.
+ * Returns null if the file doesn't exist or can't be read.
+ */
+export function readVersionHash(execPath: string = process.execPath): string | null {
+  try {
+    const versionFile = path.resolve(path.dirname(execPath), '.version');
+    return fs.readFileSync(versionFile, 'utf-8').trim() || null;
+  } catch {
+    return null;
+  }
+}
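The resolution order documented in config.ts reduces to a small pure derivation over paths. A minimal sketch, assuming POSIX paths; `derivePaths` is a hypothetical stand-in for the derivation inside resolveConfig, not an export of config.ts:

```typescript
import * as path from 'path';

// Hypothetical helper mirroring the documented resolution order.
function derivePaths(stateFileEnv: string | undefined, projectRootFallback: string) {
  if (stateFileEnv) {
    // BROWSE_STATE_FILE wins: stateDir is its parent, projectDir the grandparent.
    const stateDir = path.dirname(stateFileEnv);
    return { projectDir: path.dirname(stateDir), stateDir, stateFile: stateFileEnv };
  }
  // Otherwise: <project-root>/.gstack/browse.json
  const stateDir = path.join(projectRootFallback, '.gstack');
  return { projectDir: projectRootFallback, stateDir, stateFile: path.join(stateDir, 'browse.json') };
}
```

Deriving projectDir as the grandparent of the state file is what lets the spawned server recover the full config from a single env var.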
@@ -26,18 +26,25 @@ const importedCounts = new Map<string, number>();

 // ─── JSON Helpers ───────────────────────────────────────────────

-function jsonResponse(data: any, status = 200): Response {
+function corsOrigin(port: number): string {
+  return `http://127.0.0.1:${port}`;
+}
+
+function jsonResponse(data: any, opts: { port: number; status?: number }): Response {
   return new Response(JSON.stringify(data), {
-    status,
+    status: opts.status ?? 200,
     headers: {
       'Content-Type': 'application/json',
-      'Access-Control-Allow-Origin': `http://127.0.0.1:${parseInt(url.port, 10) || 9400}`,
+      'Access-Control-Allow-Origin': corsOrigin(opts.port),
     },
   });
 }

-function errorResponse(message: string, code: string, status = 400, action?: string): Response {
-  return jsonResponse({ error: message, code, ...(action ? { action } : {}) }, status);
+function errorResponse(message: string, code: string, opts: { port: number; status?: number; action?: string }): Response {
+  return jsonResponse(
+    { error: message, code, ...(opts.action ? { action: opts.action } : {}) },
+    { port: opts.port, status: opts.status ?? 400 },
+  );
 }

 // ─── Route Handler ──────────────────────────────────────────────
@@ -48,13 +55,14 @@ export async function handleCookiePickerRoute(
   bm: BrowserManager,
 ): Promise<Response> {
   const pathname = url.pathname;
+  const port = parseInt(url.port, 10) || 9400;

   // CORS preflight
   if (req.method === 'OPTIONS') {
     return new Response(null, {
       status: 204,
       headers: {
-        'Access-Control-Allow-Origin': `http://127.0.0.1:${parseInt(url.port, 10) || 9400}`,
+        'Access-Control-Allow-Origin': corsOrigin(port),
         'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
         'Access-Control-Allow-Headers': 'Content-Type',
       },
@@ -64,7 +72,6 @@ export async function handleCookiePickerRoute(
   try {
     // GET /cookie-picker — serve the picker UI
     if (pathname === '/cookie-picker' && req.method === 'GET') {
-      const port = parseInt(url.port, 10) || 9400;
       const html = getCookiePickerHTML(port);
       return new Response(html, {
         status: 200,
@@ -80,20 +87,20 @@ export async function handleCookiePickerRoute(
           name: b.name,
           aliases: b.aliases,
         })),
-      });
+      }, { port });
     }

     // GET /cookie-picker/domains?browser=<name> — list domains + counts
     if (pathname === '/cookie-picker/domains' && req.method === 'GET') {
       const browserName = url.searchParams.get('browser');
       if (!browserName) {
-        return errorResponse("Missing 'browser' parameter", 'missing_param');
+        return errorResponse("Missing 'browser' parameter", 'missing_param', { port });
       }
       const result = listDomains(browserName);
       return jsonResponse({
         browser: result.browser,
         domains: result.domains,
-      });
+      }, { port });
     }

     // POST /cookie-picker/import — decrypt + import to Playwright session
@@ -102,13 +109,13 @@ export async function handleCookiePickerRoute(
       try {
         body = await req.json();
       } catch {
-        return errorResponse('Invalid JSON body', 'bad_request');
+        return errorResponse('Invalid JSON body', 'bad_request', { port });
       }

       const { browser, domains } = body;
-      if (!browser) return errorResponse("Missing 'browser' field", 'missing_param');
+      if (!browser) return errorResponse("Missing 'browser' field", 'missing_param', { port });
       if (!domains || !Array.isArray(domains) || domains.length === 0) {
-        return errorResponse("Missing or empty 'domains' array", 'missing_param');
+        return errorResponse("Missing or empty 'domains' array", 'missing_param', { port });
       }

       // Decrypt cookies from the browser DB
@@ -122,7 +129,7 @@ export async function handleCookiePickerRoute(
           message: result.failed > 0
             ? `All ${result.failed} cookies failed to decrypt`
            : 'No cookies found for the specified domains',
-        });
+        }, { port });
       }

       // Add to Playwright context
@@ -141,7 +148,7 @@ export async function handleCookiePickerRoute(
         imported: result.count,
         failed: result.failed,
         domainCounts: result.domainCounts,
-      });
+      }, { port });
     }

     // POST /cookie-picker/remove — clear cookies for domains
@@ -150,12 +157,12 @@ export async function handleCookiePickerRoute(
       try {
         body = await req.json();
       } catch {
-        return errorResponse('Invalid JSON body', 'bad_request');
+        return errorResponse('Invalid JSON body', 'bad_request', { port });
       }

       const { domains } = body;
       if (!domains || !Array.isArray(domains) || domains.length === 0) {
-        return errorResponse("Missing or empty 'domains' array", 'missing_param');
+        return errorResponse("Missing or empty 'domains' array", 'missing_param', { port });
       }

       const page = bm.getPage();
@@ -171,7 +178,7 @@ export async function handleCookiePickerRoute(
       return jsonResponse({
         removed: domains.length,
         domains,
-      });
+      }, { port });
     }

     // GET /cookie-picker/imported — currently imported domains + counts
@@ -186,15 +193,15 @@ export async function handleCookiePickerRoute(
         domains: entries,
         totalDomains: entries.length,
         totalCookies: entries.reduce((sum, e) => sum + e.count, 0),
-      });
+      }, { port });
     }

     return new Response('Not found', { status: 404 });
   } catch (err: any) {
     if (err instanceof CookieImportError) {
-      return errorResponse(err.message, err.code, 400, err.action);
+      return errorResponse(err.message, err.code, { port, status: 400, action: err.action });
     }
     console.error(`[cookie-picker] Error: ${err.message}`);
-    return errorResponse(err.message || 'Internal error', 'internal_error', 500);
+    return errorResponse(err.message || 'Internal error', 'internal_error', { port, status: 500 });
   }
 }
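The root cause fixed in this file is a general one: a module-scope helper closed over `url`, a variable that only exists inside the route handler. The cure is passing request-scoped values explicitly. A framework-free sketch of the pattern (hypothetical names; a plain object stands in for Response):

```typescript
// Hypothetical illustration: request-scoped values travel via an options object,
// so the helper has no free variables that only exist inside a handler.
type JsonOpts = { port: number; status?: number };

function makeJson(data: unknown, opts: JsonOpts) {
  return {
    status: opts.status ?? 200,
    headers: {
      'Content-Type': 'application/json',
      'Access-Control-Allow-Origin': `http://127.0.0.1:${opts.port}`,
    },
    body: JSON.stringify(data),
  };
}
```

Each call site computes the port from the request URL once and threads it down, which is what the `{ port }` arguments in the diff do.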
@@ -0,0 +1,181 @@
+/**
+ * find-browse — locate the gstack browse binary + check for updates.
+ *
+ * Compiled to browse/dist/find-browse (standalone binary, no bun runtime needed).
+ *
+ * Output protocol:
+ *   Line 1:  /path/to/binary              (always present)
+ *   Line 2+: META:<TYPE> <json-payload>   (optional, 0 or more)
+ *
+ * META types:
+ *   META:UPDATE_AVAILABLE — local binary is behind origin/main
+ *
+ * All version checks are best-effort: network failures, missing files, and
+ * cache errors degrade gracefully to outputting only the binary path.
+ */
+
+import { existsSync } from 'fs';
+import { readFileSync, writeFileSync } from 'fs';
+import { join, dirname } from 'path';
+import { homedir } from 'os';
+
+const REPO_URL = 'https://github.com/garrytan/gstack.git';
+const CACHE_PATH = '/tmp/gstack-latest-version';
+const CACHE_TTL = 14400; // 4 hours in seconds
+
+// ─── Binary Discovery ───────────────────────────────────────────
+
+function getGitRoot(): string | null {
+  try {
+    const proc = Bun.spawnSync(['git', 'rev-parse', '--show-toplevel'], {
+      stdout: 'pipe',
+      stderr: 'pipe',
+    });
+    if (proc.exitCode !== 0) return null;
+    return proc.stdout.toString().trim();
+  } catch {
+    return null;
+  }
+}
+
+export function locateBinary(): string | null {
+  const root = getGitRoot();
+  const home = homedir();
+
+  // Workspace-local takes priority (for development)
+  if (root) {
+    const local = join(root, '.claude', 'skills', 'gstack', 'browse', 'dist', 'browse');
+    if (existsSync(local)) return local;
+  }
+
+  // Global fallback
+  const global = join(home, '.claude', 'skills', 'gstack', 'browse', 'dist', 'browse');
+  if (existsSync(global)) return global;
+
+  return null;
+}
+
+// ─── Version Check ──────────────────────────────────────────────
+
+interface CacheEntry {
+  sha: string;
+  timestamp: number;
+}
+
+function readCache(): CacheEntry | null {
+  try {
+    const content = readFileSync(CACHE_PATH, 'utf-8').trim();
+    const parts = content.split(/\s+/);
+    if (parts.length < 2) return null;
+    const sha = parts[0];
+    const timestamp = parseInt(parts[1], 10);
+    if (!sha || isNaN(timestamp)) return null;
+    // Validate SHA is hex
+    if (!/^[0-9a-f]{40}$/i.test(sha)) return null;
+    return { sha, timestamp };
+  } catch {
+    return null;
+  }
+}
+
+function writeCache(sha: string, timestamp: number): void {
+  try {
+    writeFileSync(CACHE_PATH, `${sha} ${timestamp}\n`);
+  } catch {
+    // Cache write failure is non-fatal
+  }
+}
+
+function fetchRemoteSHA(): string | null {
+  try {
+    const proc = Bun.spawnSync(['git', 'ls-remote', REPO_URL, 'refs/heads/main'], {
+      stdout: 'pipe',
+      stderr: 'pipe',
+      timeout: 10_000, // 10s timeout
+    });
+    if (proc.exitCode !== 0) return null;
+    const output = proc.stdout.toString().trim();
+    const sha = output.split(/\s+/)[0];
+    if (!sha || !/^[0-9a-f]{40}$/i.test(sha)) return null;
+    return sha;
+  } catch {
+    return null;
+  }
+}
+
+function resolveSkillDir(binaryPath: string): string | null {
+  const home = homedir();
+  const globalPrefix = join(home, '.claude', 'skills', 'gstack');
+  if (binaryPath.startsWith(globalPrefix)) return globalPrefix;
+
+  // Workspace-local: binary is at $ROOT/.claude/skills/gstack/browse/dist/browse
+  // Skill dir is $ROOT/.claude/skills/gstack
+  const parts = binaryPath.split('/.claude/skills/gstack/');
+  if (parts.length === 2) return parts[0] + '/.claude/skills/gstack';
+
+  return null;
+}
+
+export function checkVersion(binaryDir: string): string | null {
+  // Read local version
+  const versionFile = join(binaryDir, '.version');
+  let localSHA: string;
+  try {
+    localSHA = readFileSync(versionFile, 'utf-8').trim();
+  } catch {
+    return null; // No .version file → skip check
+  }
+  if (!localSHA) return null;
+
+  const now = Math.floor(Date.now() / 1000);
+
+  // Check cache
+  let remoteSHA: string | null = null;
+  const cache = readCache();
+  if (cache && (now - cache.timestamp) < CACHE_TTL) {
+    remoteSHA = cache.sha;
+  }
+
+  // Fetch from remote if cache miss
+  if (!remoteSHA) {
+    remoteSHA = fetchRemoteSHA();
+    if (remoteSHA) {
+      writeCache(remoteSHA, now);
+    }
+  }
+
+  if (!remoteSHA) return null; // Offline or error → skip check
+
+  // Compare
+  if (localSHA === remoteSHA) return null; // Up to date
+
+  // Determine skill directory for update command
+  const binaryPath = join(binaryDir, 'browse');
+  const skillDir = resolveSkillDir(binaryPath);
+  if (!skillDir) return null;
+
+  const payload = JSON.stringify({
+    current: localSHA.slice(0, 8),
+    latest: remoteSHA.slice(0, 8),
+    command: `cd ${skillDir} && git stash && git fetch origin && git reset --hard origin/main && ./setup`,
+  });
+
+  return `META:UPDATE_AVAILABLE ${payload}`;
+}
+
+// ─── Main ───────────────────────────────────────────────────────
+
+function main() {
+  const bin = locateBinary();
+  if (!bin) {
+    process.stderr.write('ERROR: browse binary not found. Run: cd <skill-dir> && ./setup\n');
+    process.exit(1);
+  }
+
+  console.log(bin);
+
+  const meta = checkVersion(dirname(bin));
+  if (meta) console.log(meta);
+}
+
+main();
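The version cache above holds a single `<sha> <unix-timestamp>` line, validated as 40 hex characters and compared against a 4-hour TTL. A minimal sketch of that validation, extracted as pure functions (hypothetical helper names, same shape as readCache):

```typescript
// Hypothetical helpers mirroring the cache-entry validation described above.
function parseCacheEntry(content: string): { sha: string; timestamp: number } | null {
  const parts = content.trim().split(/\s+/);
  if (parts.length < 2) return null;
  const sha = parts[0]!;
  const timestamp = parseInt(parts[1]!, 10);
  if (!sha || Number.isNaN(timestamp)) return null;
  if (!/^[0-9a-f]{40}$/i.test(sha)) return null; // reject anything that isn't a full git SHA
  return { sha, timestamp };
}

function isFresh(entry: { timestamp: number }, nowSec: number, ttlSec = 14400): boolean {
  return nowSec - entry.timestamp < ttlSec;
}
```

Keeping the parse and freshness checks pure makes the degraded paths (corrupt cache, stale cache) trivially unit-testable, which matches the "graceful degradation" goal stated in the commit message.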
+60
-21
@@ -6,6 +6,11 @@
|
||||
* Console/network/dialog buffers: CircularBuffer in-memory + async disk flush
|
||||
* Chromium crash → server EXITS with clear error (CLI auto-restarts)
|
||||
* Auto-shutdown after BROWSE_IDLE_TIMEOUT (default 30 min)
|
||||
*
|
||||
* State:
|
||||
* State file: <project-root>/.gstack/browse.json (set via BROWSE_STATE_FILE env)
|
||||
* Log files: <project-root>/.gstack/browse-{console,network,dialog}.log
|
||||
* Port: random 10000-60000 (or BROWSE_PORT env for debug override)
|
||||
*/
|
||||
|
||||
import { BrowserManager } from './browser-manager';
|
||||
@@ -13,18 +18,18 @@ import { handleReadCommand } from './read-commands';
import { handleWriteCommand } from './write-commands';
import { handleMetaCommand } from './meta-commands';
import { handleCookiePickerRoute } from './cookie-picker-routes';
import { resolveConfig, ensureStateDir, readVersionHash } from './config';
import * as fs from 'fs';
import * as path from 'path';
import * as crypto from 'crypto';

// ─── Auth (inline) ─────────────────────────────────────────────
// ─── Config ─────────────────────────────────────────────────────
const config = resolveConfig();
ensureStateDir(config);

// ─── Auth ───────────────────────────────────────────────────────
const AUTH_TOKEN = crypto.randomUUID();
const PORT_OFFSET = 45600;
const BROWSE_PORT = process.env.CONDUCTOR_PORT
  ? parseInt(process.env.CONDUCTOR_PORT, 10) - PORT_OFFSET
  : parseInt(process.env.BROWSE_PORT || '0', 10); // 0 = auto-scan
const INSTANCE_SUFFIX = BROWSE_PORT ? `-${BROWSE_PORT}` : '';
const STATE_FILE = process.env.BROWSE_STATE_FILE || `/tmp/browse-server${INSTANCE_SUFFIX}.json`;
const BROWSE_PORT = parseInt(process.env.BROWSE_PORT || '0', 10);
const IDLE_TIMEOUT_MS = parseInt(process.env.BROWSE_IDLE_TIMEOUT || '1800000', 10); // 30 min

function validateAuth(req: Request): boolean {
@@ -36,9 +41,9 @@ function validateAuth(req: Request): boolean {
import { consoleBuffer, networkBuffer, dialogBuffer, addConsoleEntry, addNetworkEntry, addDialogEntry, type LogEntry, type NetworkEntry, type DialogEntry } from './buffers';
export { consoleBuffer, networkBuffer, dialogBuffer, addConsoleEntry, addNetworkEntry, addDialogEntry, type LogEntry, type NetworkEntry, type DialogEntry };

const CONSOLE_LOG_PATH = `/tmp/browse-console${INSTANCE_SUFFIX}.log`;
const NETWORK_LOG_PATH = `/tmp/browse-network${INSTANCE_SUFFIX}.log`;
const DIALOG_LOG_PATH = `/tmp/browse-dialog${INSTANCE_SUFFIX}.log`;
const CONSOLE_LOG_PATH = config.consoleLog;
const NETWORK_LOG_PATH = config.networkLog;
const DIALOG_LOG_PATH = config.dialogLog;
let lastConsoleFlushed = 0;
let lastNetworkFlushed = 0;
let lastDialogFlushed = 0;
@@ -132,22 +137,25 @@ export const META_COMMANDS = new Set([
const browserManager = new BrowserManager();
let isShuttingDown = false;

// Find port: deterministic from CONDUCTOR_PORT, or scan range
// Find port: explicit BROWSE_PORT, or random in 10000-60000
async function findPort(): Promise<number> {
  // Deterministic port from CONDUCTOR_PORT (e.g., 55040 - 45600 = 9440)
  // Explicit port override (for debugging)
  if (BROWSE_PORT) {
    try {
      const testServer = Bun.serve({ port: BROWSE_PORT, fetch: () => new Response('ok') });
      testServer.stop();
      return BROWSE_PORT;
    } catch {
      throw new Error(`[browse] Port ${BROWSE_PORT} (from CONDUCTOR_PORT ${process.env.CONDUCTOR_PORT}) is in use`);
      throw new Error(`[browse] Port ${BROWSE_PORT} (from BROWSE_PORT env) is in use`);
    }
  }

  // Fallback: scan range
  const start = parseInt(process.env.BROWSE_PORT_START || '9400', 10);
  for (let port = start; port < start + 10; port++) {
  // Random port with retry
  const MIN_PORT = 10000;
  const MAX_PORT = 60000;
  const MAX_RETRIES = 5;
  for (let attempt = 0; attempt < MAX_RETRIES; attempt++) {
    const port = MIN_PORT + Math.floor(Math.random() * (MAX_PORT - MIN_PORT));
    try {
      const testServer = Bun.serve({ port, fetch: () => new Response('ok') });
      testServer.stop();
@@ -156,7 +164,7 @@ async function findPort(): Promise<number> {
      continue;
    }
  }
  throw new Error(`[browse] No available port in range ${start}-${start + 9}`);
  throw new Error(`[browse] No available port after ${MAX_RETRIES} attempts in range ${MIN_PORT}-${MAX_PORT}`);
}

/**
@@ -201,6 +209,34 @@ async function handleCommand(body: any): Promise<Response> {
    result = await handleWriteCommand(command, args, browserManager);
  } else if (META_COMMANDS.has(command)) {
    result = await handleMetaCommand(command, args, browserManager, shutdown);
  } else if (command === 'help') {
    const helpText = [
      'gstack browse — headless browser for AI agents',
      '',
      'Commands:',
      ' Navigation: goto <url>, back, forward, reload',
      ' Interaction: click <sel>, fill <sel> <text>, select <sel> <val>, hover, type, press, scroll, wait',
      ' Read: text [sel], html [sel], links, forms, accessibility, cookies, storage, console, network, perf',
      ' Evaluate: js <expr>, eval <expr>, css <sel> <prop>, attrs <sel>, is <sel> <state>',
      ' Snapshot: snapshot [-i] [-c] [-d N] [-s sel] [-D] [-a] [-o path] [-C]',
      ' Screenshot: screenshot [path], pdf [path], responsive <widths>',
      ' Tabs: tabs, tab <id>, newtab [url], closetab [id]',
      ' State: cookie <set|get|clear>, cookie-import <json>, cookie-import-browser [browser]',
      ' Headers: header <set|clear> [name] [value], useragent [string]',
      ' Upload: upload <sel> <file1> [file2...]',
      ' Dialogs: dialog, dialog-accept [text], dialog-dismiss',
      ' Meta: status, stop, restart, diff, chain, help',
      '',
      'Snapshot flags:',
      ' -i interactive only -c compact (remove empty nodes)',
      ' -d N limit depth -s sel scope to CSS selector',
      ' -D diff vs previous -a annotated screenshot with ref labels',
      ' -o path output file -C cursor-interactive elements',
    ].join('\n');
    return new Response(helpText, {
      status: 200,
      headers: { 'Content-Type': 'text/plain' },
    });
  } else {
    return new Response(JSON.stringify({
      error: `Unknown command: ${command}`,
@@ -235,7 +271,7 @@ async function shutdown() {
  await browserManager.close();

  // Clean up state file
  try { fs.unlinkSync(STATE_FILE); } catch {}
  try { fs.unlinkSync(config.stateFile); } catch {}

  process.exit(0);
}

@@ -301,19 +337,22 @@ async function start() {
    },
  });

  // Write state file
  // Write state file (atomic: write .tmp then rename)
  const state = {
    pid: process.pid,
    port,
    token: AUTH_TOKEN,
    startedAt: new Date().toISOString(),
    serverPath: path.resolve(import.meta.dir, 'server.ts'),
    binaryVersion: readVersionHash() || undefined,
  };
  fs.writeFileSync(STATE_FILE, JSON.stringify(state, null, 2), { mode: 0o600 });
  const tmpFile = config.stateFile + '.tmp';
  fs.writeFileSync(tmpFile, JSON.stringify(state, null, 2), { mode: 0o600 });
  fs.renameSync(tmpFile, config.stateFile);

  browserManager.serverPort = port;
  console.log(`[browse] Server running on http://127.0.0.1:${port} (PID: ${process.pid})`);
  console.log(`[browse] State file: ${STATE_FILE}`);
  console.log(`[browse] State file: ${config.stateFile}`);
  console.log(`[browse] Idle timeout: ${IDLE_TIMEOUT_MS / 1000}s`);
}

@@ -457,14 +457,11 @@ describe('CLI lifecycle', () => {
    }));

    const cliPath = path.resolve(__dirname, '../src/cli.ts');
    // Build env without CONDUCTOR_PORT/BROWSE_PORT so BROWSE_PORT_START takes effect
    const cliEnv: Record<string, string> = {};
    for (const [k, v] of Object.entries(process.env)) {
      if (k !== 'CONDUCTOR_PORT' && k !== 'BROWSE_PORT' && v !== undefined) cliEnv[k] = v;
      if (v !== undefined) cliEnv[k] = v;
    }
    cliEnv.BROWSE_STATE_FILE = stateFile;
    // Use a random high port to avoid conflicts with running servers
    cliEnv.BROWSE_PORT_START = String(9600 + Math.floor(Math.random() * 100));
    const result = await new Promise<{ code: number; stdout: string; stderr: string }>((resolve) => {
      const proc = spawn('bun', ['run', cliPath, 'status'], {
        timeout: 15000,

@@ -0,0 +1,125 @@
import { describe, test, expect } from 'bun:test';
import { resolveConfig, ensureStateDir, readVersionHash, getGitRoot } from '../src/config';
import * as fs from 'fs';
import * as path from 'path';
import * as os from 'os';

describe('config', () => {
  describe('getGitRoot', () => {
    test('returns a path when in a git repo', () => {
      const root = getGitRoot();
      expect(root).not.toBeNull();
      expect(fs.existsSync(path.join(root!, '.git'))).toBe(true);
    });
  });

  describe('resolveConfig', () => {
    test('uses git root by default', () => {
      const config = resolveConfig({});
      const gitRoot = getGitRoot();
      expect(gitRoot).not.toBeNull();
      expect(config.projectDir).toBe(gitRoot);
      expect(config.stateDir).toBe(path.join(gitRoot!, '.gstack'));
      expect(config.stateFile).toBe(path.join(gitRoot!, '.gstack', 'browse.json'));
    });

    test('derives paths from BROWSE_STATE_FILE when set', () => {
      const stateFile = '/tmp/test-config/.gstack/browse.json';
      const config = resolveConfig({ BROWSE_STATE_FILE: stateFile });
      expect(config.stateFile).toBe(stateFile);
      expect(config.stateDir).toBe('/tmp/test-config/.gstack');
      expect(config.projectDir).toBe('/tmp/test-config');
    });

    test('log paths are in stateDir', () => {
      const config = resolveConfig({});
      expect(config.consoleLog).toBe(path.join(config.stateDir, 'browse-console.log'));
      expect(config.networkLog).toBe(path.join(config.stateDir, 'browse-network.log'));
      expect(config.dialogLog).toBe(path.join(config.stateDir, 'browse-dialog.log'));
    });
  });

  describe('ensureStateDir', () => {
    test('creates directory if it does not exist', () => {
      const tmpDir = path.join(os.tmpdir(), `browse-config-test-${Date.now()}`);
      const config = resolveConfig({ BROWSE_STATE_FILE: path.join(tmpDir, '.gstack', 'browse.json') });
      expect(fs.existsSync(config.stateDir)).toBe(false);
      ensureStateDir(config);
      expect(fs.existsSync(config.stateDir)).toBe(true);
      // Cleanup
      fs.rmSync(tmpDir, { recursive: true, force: true });
    });

    test('is a no-op if directory already exists', () => {
      const tmpDir = path.join(os.tmpdir(), `browse-config-test-${Date.now()}`);
      const stateDir = path.join(tmpDir, '.gstack');
      fs.mkdirSync(stateDir, { recursive: true });
      const config = resolveConfig({ BROWSE_STATE_FILE: path.join(stateDir, 'browse.json') });
      ensureStateDir(config); // should not throw
      expect(fs.existsSync(config.stateDir)).toBe(true);
      // Cleanup
      fs.rmSync(tmpDir, { recursive: true, force: true });
    });
  });

  describe('readVersionHash', () => {
    test('returns null when .version file does not exist', () => {
      const result = readVersionHash('/nonexistent/path/browse');
      expect(result).toBeNull();
    });

    test('reads version from .version file adjacent to execPath', () => {
      const tmpDir = path.join(os.tmpdir(), `browse-version-test-${Date.now()}`);
      fs.mkdirSync(tmpDir, { recursive: true });
      const versionFile = path.join(tmpDir, '.version');
      fs.writeFileSync(versionFile, 'abc123def\n');
      const result = readVersionHash(path.join(tmpDir, 'browse'));
      expect(result).toBe('abc123def');
      // Cleanup
      fs.rmSync(tmpDir, { recursive: true, force: true });
    });
  });
});

describe('resolveServerScript', () => {
  // Import the function from cli.ts
  const { resolveServerScript } = require('../src/cli');

  test('uses BROWSE_SERVER_SCRIPT env when set', () => {
    const result = resolveServerScript({ BROWSE_SERVER_SCRIPT: '/custom/server.ts' }, '', '');
    expect(result).toBe('/custom/server.ts');
  });

  test('finds server.ts adjacent to cli.ts in dev mode', () => {
    const srcDir = path.resolve(__dirname, '../src');
    const result = resolveServerScript({}, srcDir, '');
    expect(result).toBe(path.join(srcDir, 'server.ts'));
  });

  test('throws when server.ts cannot be found', () => {
    expect(() => resolveServerScript({}, '/nonexistent/$bunfs', '/nonexistent/browse'))
      .toThrow('Cannot find server.ts');
  });
});

describe('version mismatch detection', () => {
  test('detects when versions differ', () => {
    const stateVersion = 'abc123';
    const currentVersion = 'def456';
    expect(stateVersion !== currentVersion).toBe(true);
  });

  test('no mismatch when versions match', () => {
    const stateVersion = 'abc123';
    const currentVersion = 'abc123';
    expect(stateVersion !== currentVersion).toBe(false);
  });

  test('no mismatch when either version is null', () => {
    const currentVersion: string | null = null;
    const stateVersion: string | undefined = 'abc123';
    // Version mismatch only triggers when both are present
    const shouldRestart = currentVersion !== null && stateVersion !== undefined && currentVersion !== stateVersion;
    expect(shouldRestart).toBe(false);
  });
});
@@ -0,0 +1,205 @@
/**
 * Tests for cookie-picker route handler
 *
 * Tests the HTTP glue layer directly with mock BrowserManager objects.
 * Verifies that all routes return valid JSON (not HTML) with correct CORS headers.
 */

import { describe, test, expect } from 'bun:test';
import { handleCookiePickerRoute } from '../src/cookie-picker-routes';

// ─── Mock BrowserManager ──────────────────────────────────────

function mockBrowserManager() {
  const addedCookies: any[] = [];
  const clearedDomains: string[] = [];
  return {
    bm: {
      getPage: () => ({
        context: () => ({
          addCookies: (cookies: any[]) => { addedCookies.push(...cookies); },
          clearCookies: (opts: { domain: string }) => { clearedDomains.push(opts.domain); },
        }),
      }),
    } as any,
    addedCookies,
    clearedDomains,
  };
}

function makeUrl(path: string, port = 9470): URL {
  return new URL(`http://127.0.0.1:${port}${path}`);
}

function makeReq(method: string, body?: any): Request {
  const opts: RequestInit = { method };
  if (body) {
    opts.body = JSON.stringify(body);
    opts.headers = { 'Content-Type': 'application/json' };
  }
  return new Request('http://127.0.0.1:9470', opts);
}

// ─── Tests ──────────────────────────────────────────────────────

describe('cookie-picker-routes', () => {
  describe('CORS', () => {
    test('OPTIONS returns 204 with correct CORS headers', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/browsers');
      const req = new Request('http://127.0.0.1:9470', { method: 'OPTIONS' });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(204);
      expect(res.headers.get('Access-Control-Allow-Origin')).toBe('http://127.0.0.1:9470');
      expect(res.headers.get('Access-Control-Allow-Methods')).toContain('POST');
    });

    test('JSON responses include correct CORS origin with port', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/browsers', 9450);
      const req = new Request('http://127.0.0.1:9450', { method: 'GET' });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.headers.get('Access-Control-Allow-Origin')).toBe('http://127.0.0.1:9450');
    });
  });

  describe('JSON responses (not HTML)', () => {
    test('GET /cookie-picker/browsers returns JSON', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/browsers');
      const req = new Request('http://127.0.0.1:9470', { method: 'GET' });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(200);
      expect(res.headers.get('Content-Type')).toBe('application/json');
      const body = await res.json();
      expect(body).toHaveProperty('browsers');
      expect(Array.isArray(body.browsers)).toBe(true);
    });

    test('GET /cookie-picker/domains without browser param returns JSON error', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/domains');
      const req = new Request('http://127.0.0.1:9470', { method: 'GET' });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(400);
      expect(res.headers.get('Content-Type')).toBe('application/json');
      const body = await res.json();
      expect(body).toHaveProperty('error');
      expect(body).toHaveProperty('code', 'missing_param');
    });

    test('POST /cookie-picker/import with invalid JSON returns JSON error', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/import');
      const req = new Request('http://127.0.0.1:9470', {
        method: 'POST',
        body: 'not json',
        headers: { 'Content-Type': 'application/json' },
      });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(400);
      expect(res.headers.get('Content-Type')).toBe('application/json');
      const body = await res.json();
      expect(body.code).toBe('bad_request');
    });

    test('POST /cookie-picker/import missing browser field returns JSON error', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/import');
      const req = makeReq('POST', { domains: ['.example.com'] });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(400);
      const body = await res.json();
      expect(body.code).toBe('missing_param');
    });

    test('POST /cookie-picker/import missing domains returns JSON error', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/import');
      const req = makeReq('POST', { browser: 'Chrome' });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(400);
      const body = await res.json();
      expect(body.code).toBe('missing_param');
    });

    test('POST /cookie-picker/remove with invalid JSON returns JSON error', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/remove');
      const req = new Request('http://127.0.0.1:9470', {
        method: 'POST',
        body: '{bad',
        headers: { 'Content-Type': 'application/json' },
      });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(400);
      expect(res.headers.get('Content-Type')).toBe('application/json');
    });

    test('POST /cookie-picker/remove missing domains returns JSON error', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/remove');
      const req = makeReq('POST', {});

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(400);
      const body = await res.json();
      expect(body.code).toBe('missing_param');
    });

    test('GET /cookie-picker/imported returns JSON with domain list', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/imported');
      const req = new Request('http://127.0.0.1:9470', { method: 'GET' });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(200);
      expect(res.headers.get('Content-Type')).toBe('application/json');
      const body = await res.json();
      expect(body).toHaveProperty('domains');
      expect(body).toHaveProperty('totalDomains');
      expect(body).toHaveProperty('totalCookies');
    });
  });

  describe('routing', () => {
    test('GET /cookie-picker returns HTML', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker');
      const req = new Request('http://127.0.0.1:9470', { method: 'GET' });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(200);
      expect(res.headers.get('Content-Type')).toContain('text/html');
    });

    test('unknown path returns 404', async () => {
      const { bm } = mockBrowserManager();
      const url = makeUrl('/cookie-picker/nonexistent');
      const req = new Request('http://127.0.0.1:9470', { method: 'GET' });

      const res = await handleCookiePickerRoute(url, req, bm);

      expect(res.status).toBe(404);
    });
  });
});
@@ -0,0 +1,144 @@
/**
 * Tests for find-browse version check logic
 *
 * Tests the checkVersion() and locateBinary() functions directly.
 * Uses temp directories with mock .version files and cache files.
 */

import { describe, test, expect, beforeEach, afterEach } from 'bun:test';
import { checkVersion, locateBinary } from '../src/find-browse';
import { mkdtempSync, writeFileSync, rmSync, existsSync, mkdirSync } from 'fs';
import { join } from 'path';
import { tmpdir } from 'os';

let tempDir: string;

beforeEach(() => {
  tempDir = mkdtempSync(join(tmpdir(), 'find-browse-test-'));
});

afterEach(() => {
  rmSync(tempDir, { recursive: true, force: true });
  // Clean up test cache
  try { rmSync('/tmp/gstack-latest-version'); } catch {}
});

describe('checkVersion', () => {
  test('returns null when .version file is missing', () => {
    const result = checkVersion(tempDir);
    expect(result).toBeNull();
  });

  test('returns null when .version file is empty', () => {
    writeFileSync(join(tempDir, '.version'), '');
    const result = checkVersion(tempDir);
    expect(result).toBeNull();
  });

  test('returns null when .version has only whitespace', () => {
    writeFileSync(join(tempDir, '.version'), ' \n');
    const result = checkVersion(tempDir);
    expect(result).toBeNull();
  });

  test('returns null when local SHA matches remote (cache hit)', () => {
    const sha = 'a'.repeat(40);
    writeFileSync(join(tempDir, '.version'), sha);
    // Write cache with same SHA, recent timestamp
    const now = Math.floor(Date.now() / 1000);
    writeFileSync('/tmp/gstack-latest-version', `${sha} ${now}\n`);

    const result = checkVersion(tempDir);
    expect(result).toBeNull();
  });

  test('returns META:UPDATE_AVAILABLE when SHAs differ (cache hit)', () => {
    const localSha = 'a'.repeat(40);
    const remoteSha = 'b'.repeat(40);
    writeFileSync(join(tempDir, '.version'), localSha);
    // Create a fake browse binary path so resolveSkillDir works
    const browsePath = join(tempDir, 'browse');
    writeFileSync(browsePath, '');
    // Write cache with different SHA, recent timestamp
    const now = Math.floor(Date.now() / 1000);
    writeFileSync('/tmp/gstack-latest-version', `${remoteSha} ${now}\n`);

    const result = checkVersion(tempDir);
    // Result may be null if resolveSkillDir can't determine skill dir from temp path
    // That's expected — the META signal requires a known skill dir path
    if (result !== null) {
      expect(result).toStartWith('META:UPDATE_AVAILABLE');
      const jsonStr = result.replace('META:UPDATE_AVAILABLE ', '');
      const payload = JSON.parse(jsonStr);
      expect(payload.current).toBe('a'.repeat(8));
      expect(payload.latest).toBe('b'.repeat(8));
      expect(payload.command).toContain('git stash');
      expect(payload.command).toContain('git reset --hard origin/main');
      expect(payload.command).toContain('./setup');
    }
  });

  test('uses cached SHA when cache is fresh (< 4hr)', () => {
    const localSha = 'a'.repeat(40);
    const remoteSha = 'a'.repeat(40);
    writeFileSync(join(tempDir, '.version'), localSha);
    // Cache is 1 hour old — should still be valid
    const oneHourAgo = Math.floor(Date.now() / 1000) - 3600;
    writeFileSync('/tmp/gstack-latest-version', `${remoteSha} ${oneHourAgo}\n`);

    const result = checkVersion(tempDir);
    expect(result).toBeNull(); // SHAs match
  });

  test('treats expired cache as stale', () => {
    const localSha = 'a'.repeat(40);
    writeFileSync(join(tempDir, '.version'), localSha);
    // Cache is 5 hours old — should be stale
    const fiveHoursAgo = Math.floor(Date.now() / 1000) - 18000;
    writeFileSync('/tmp/gstack-latest-version', `${'b'.repeat(40)} ${fiveHoursAgo}\n`);

    // This will try git ls-remote which may fail in test env — that's OK
    // The important thing is it doesn't use the stale cache value
    const result = checkVersion(tempDir);
    // Result depends on whether git ls-remote succeeds in test environment
    // If offline, returns null (graceful degradation)
    expect(result === null || typeof result === 'string').toBe(true);
  });

  test('handles corrupt cache file gracefully', () => {
    const localSha = 'a'.repeat(40);
    writeFileSync(join(tempDir, '.version'), localSha);
    writeFileSync('/tmp/gstack-latest-version', 'garbage data here');

    // Should not throw, should treat as stale
    const result = checkVersion(tempDir);
    expect(result === null || typeof result === 'string').toBe(true);
  });

  test('handles cache with invalid SHA gracefully', () => {
    const localSha = 'a'.repeat(40);
    writeFileSync(join(tempDir, '.version'), localSha);
    writeFileSync('/tmp/gstack-latest-version', `not-a-sha ${Math.floor(Date.now() / 1000)}\n`);

    // Invalid SHA should be treated as no cache
    const result = checkVersion(tempDir);
    expect(result === null || typeof result === 'string').toBe(true);
  });
});

describe('locateBinary', () => {
  test('returns null when no binary exists at known paths', () => {
    // This test depends on the test environment — if a real binary exists at
    // ~/.claude/skills/gstack/browse/dist/browse, it will find it.
    // We mainly test that the function doesn't throw.
    const result = locateBinary();
    expect(result === null || typeof result === 'string').toBe(true);
  });

  test('returns string path when binary exists', () => {
    const result = locateBinary();
    if (result !== null) {
      expect(existsSync(result)).toBe(true);
    }
  });
});