gstack/scripts/models.ts
Garry Tan 6c8cf6774f feat: model overlays with explicit --model flag (no auto-detect)
Adds a per-model behavioral patch layer orthogonal to the host axis.
Different LLMs have different tendencies (GPT won't stop, Gemini
over-explains, o-series wants structured output). Overlays nudge each
model toward better defaults for gstack workflows.

Codex review caught three landmines the prior reviews missed:
1. Host != model — Claude Code can run any Claude model, Codex runs
   GPT/o-series, Cursor fronts multiple providers. Auto-detecting from
   host would lie. Dropped auto-detect. --model is explicit (default
   claude). Missing overlay file → empty string (graceful).
2. Import cycle — putting Model in resolvers/types.ts would cycle
   through hosts/index. Created neutral scripts/models.ts instead.
3. "Final say" is dangerous — overlay at the end of preamble could
   override STOP points, AskUserQuestion gates, /ship review gates.
   Placed overlay after spawned-session-check but before voice + tier
   sections. Wrapper heading adds explicit subordination language on
   every overlay: "subordinate to skill workflow, STOP points,
   AskUserQuestion gates, plan-mode safety, and /ship review gates."

Changes:
- scripts/models.ts: new neutral module. ALL_MODEL_NAMES, Model type,
  resolveModel() for family heuristics (gpt-5.4-mini → gpt-5.4, o3 →
  o-series, claude-opus-4-7 → claude), validateModel() helper.
- scripts/resolvers/types.ts: import Model, add ctx.model field.
- scripts/resolvers/model-overlay.ts: new resolver. Reads
  model-overlays/{model}.md. Supports {{INHERIT:base}} directive at
  top of file for concat (gpt-5.4 inherits gpt). Cycle guard.
- scripts/resolvers/index.ts: register MODEL_OVERLAY resolver.
- scripts/resolvers/preamble.ts: wire generateModelOverlay into
  composition before voice. Print MODEL_OVERLAY: {model} in preamble
  bash so users can see which overlay is active. Filter empty sections.
- scripts/gen-skill-docs.ts: parse --model CLI flag. Default claude.
  Unknown model → throw with list of valid options.
- model-overlays/{claude,gpt,gpt-5.4,gemini,o-series}.md: behavioral
  patches per model family. gpt-5.4.md uses {{INHERIT:gpt}} to extend
  gpt.md without duplication.
- test/gen-skill-docs.test.ts: fix qa-only guardrail regex scope.
  Was matching Edit/Glob/Grep anywhere after `allowed-tools:` in the
  whole file. Now scoped to the frontmatter only, so body prose (the
  Claude overlay references Edit as a tool) no longer trips it.
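The {{INHERIT:base}} expansion with a cycle guard described in the Changes list can be sketched roughly like this. The helper name is hypothetical and the real resolver reads model-overlays/{model}.md from disk; an in-memory map stands in here so the sketch is self-contained:

```typescript
// A leading {{INHERIT:base}} directive prepends the base overlay's content.
const INHERIT_RE = /^\{\{INHERIT:([a-z0-9.-]+)\}\}\s*\n/;

// Resolve one overlay, expanding inheritance recursively. `seen` guards
// against cycles (a inherits b inherits a would otherwise loop forever).
// Missing overlay file → empty string (graceful), as in the commit.
function resolveOverlay(
  model: string,
  files: Map<string, string>,
  seen: Set<string> = new Set(),
): string {
  if (seen.has(model)) {
    throw new Error(`overlay inheritance cycle: ${[...seen, model].join(' -> ')}`);
  }
  seen.add(model);

  const raw = files.get(model);
  if (raw === undefined) return ''; // missing overlay file

  const m = raw.match(INHERIT_RE);
  if (!m) return raw;

  const base = resolveOverlay(m[1], files, seen);
  const body = raw.slice(m[0].length);
  return base ? `${base}\n${body}` : body;
}

// gpt-5.4 inherits gpt without duplicating its content.
const overlays = new Map([
  ['gpt', 'Stop when the task is done.\n'],
  ['gpt-5.4', '{{INHERIT:gpt}}\nPrefer terse diffs.\n'],
]);
console.log(resolveOverlay('gpt-5.4', overlays));
```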

Verification:
- bun run gen:skill-docs --host all --dry-run → all fresh
- bun run gen:skill-docs --model gpt-5.4 → concat works, gpt.md +
  gpt-5.4.md content appears in order
- bun run gen:skill-docs --model unknown → errors with valid list
- All generated skills contain MODEL_OVERLAY: claude in preamble
- Golden ship fixtures regenerated
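The frontmatter-scoping fix in the test change can be sketched like so. Both helper names are hypothetical, not the actual helpers in test/gen-skill-docs.test.ts:

```typescript
// Extract the YAML frontmatter between the leading '---' fences, or '' if absent.
function frontmatter(doc: string): string {
  const m = doc.match(/^---\n([\s\S]*?)\n---/);
  return m ? m[1] : '';
}

// True only if the frontmatter's allowed-tools line grants a guarded tool.
// Scoping to the frontmatter means body prose mentioning Edit cannot match.
function grantsGuardedTools(doc: string): boolean {
  const tools = frontmatter(doc).match(/^allowed-tools:\s*(.*)$/m);
  return tools ? /\b(Edit|Glob|Grep)\b/.test(tools[1]) : false;
}

const skill = `---
name: qa-only
allowed-tools: Read, Bash
---
Body prose may mention the Edit tool without tripping the guardrail.`;

console.log(grantsGuardedTools(skill)); // → false
```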

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 06:01:27 +08:00

TypeScript

/**
 * Model taxonomy — neutral module with no imports from hosts/ or resolvers/.
 *
 * Model families supported by model overlays in model-overlays/{family}.md.
 * Host configs can reference these as `defaultModel` strings (validated at
 * generation time), but the model axis is independent of the host axis.
 *
 * IMPORTANT: host ≠ model. Claude Code can run any Claude model (Opus, Sonnet,
 * Haiku, future). Codex CLI runs GPT/o-series models. Cursor and OpenCode can
 * front multiple providers. We do NOT auto-detect the model from the host —
 * users pass --model explicitly. Default is 'claude'.
 */
export const ALL_MODEL_NAMES = [
  'claude',
  'gpt',
  'gpt-5.4',
  'gemini',
  'o-series',
] as const;

export type Model = (typeof ALL_MODEL_NAMES)[number];

/**
 * Resolve a model argument from CLI input to a known Model family.
 *
 * Precedence rules:
 * 1. Exact match against ALL_MODEL_NAMES → return as-is.
 * 2. Family heuristics for common variants:
 *    - `gpt-5.4-mini`, `gpt-5.4-turbo`, `gpt-5.4-*` → `gpt-5.4`
 *    - `gpt-*` (anything else GPT) → `gpt`
 *    - `o3`, `o4`, `o4-mini`, `o1`, `o1-mini`, `o1-pro` → `o-series`
 *    - `claude-*` (sonnet, opus, haiku, any version) → `claude`
 *    - `gemini-*` (2.5-pro, flash, etc.) → `gemini`
 * 3. Unknown input → returns null (caller decides: error, or fall back).
 *
 * The overlay resolver applies a further fallback when it reads
 * model-overlays/{model}.md (e.g., a missing gpt-5.4.md falls back to
 * gpt.md). This function only normalizes CLI input to a family name.
 */
export function resolveModel(input: string): Model | null {
  const s = input.trim();
  if (!s) return null;

  // Exact match first.
  if ((ALL_MODEL_NAMES as readonly string[]).includes(s)) {
    return s as Model;
  }

  // Family heuristics: test the most specific prefix first so that
  // 'gpt-5.4-*' is claimed before the generic 'gpt-*' pattern.
  if (/^gpt-5\.4(-|$)/.test(s)) return 'gpt-5.4';
  if (/^gpt(-|$)/.test(s)) return 'gpt';
  if (/^o[0-9]+(-|$)/.test(s)) return 'o-series';
  if (/^claude(-|$)/.test(s)) return 'claude';
  if (/^gemini(-|$)/.test(s)) return 'gemini';
  return null;
}

/**
 * Validate a string against ALL_MODEL_NAMES. Used by host-config validators
 * when a HostConfig declares `defaultModel`. Returns an error message for an
 * unknown model, or null if the input is valid.
 */
export function validateModel(input: string): string | null {
  if ((ALL_MODEL_NAMES as readonly string[]).includes(input)) return null;
  return `'${input}' is not a known model. Use ${ALL_MODEL_NAMES.join(', ')}.`;
}