mirror of
https://github.com/garrytan/gstack.git
synced 2026-05-17 01:31:26 +08:00
v1.34.2.0 fix wave: /codex review on CLI 0.130+, /investigate learnings, /sync-gbrain on Supabase (3 community-reported bugs) (#1478)
* fix(learnings): accept type:"investigation" in gstack-learnings-log The /investigate skill instructed agents to log learnings with type:"investigation", but bin/gstack-learnings-log:22 rejected anything not in [pattern, pitfall, preference, architecture, tool, operational]. Every investigation run exited 1 to stderr and the learning was dropped, silently to the user. Fix: add 'investigation' to ALLOWED_TYPES. Regression test: round-trips a learning with type:"investigation" and asserts exit 0 + file write; second test reads investigate/SKILL.md.tmpl and asserts it emits the literal type:"investigation" string, guarding the template/validator contract at both ends. Fixes #1423. Reported by diogolealassis. * fix(gbrain): engine detection survives gbrain ≥0.25 schema + non-zero doctor exit freshDetectEngineTier() in lib/gstack-memory-helpers.ts returned engine: "unknown" for every Supabase user on gbrain ≥0.25. Two stacking bugs: 1. execSync("gbrain doctor --json --fast 2>/dev/null") threw on non-zero exit. gbrain doctor exits 1 whenever health_score < 100, which is essentially every fresh install due to resolver_health warnings. The JSON output never reached the parser. 2. gbrain ≥0.25 shipped schema_version:2 doctor output that dropped the top-level 'engine' field entirely. Result: every /sync-gbrain on Supabase logged 'engine=unknown' and skipped all sync stages silently. Fix: - Replace execSync with execFileSync (no shell, no bash-specific 2>/dev/null redirect; portable to Windows). - Recover stdout from the thrown error object so non-zero exits still parse. - Fall back to reading gbrain's config.json (respecting GBRAIN_HOME env var, defaulting to ~/.gbrain/config.json) when doctor output doesn't surface an engine field. - Add logGbrainError() helper that appends one-line JSONL to ~/.gstack/.gbrain-errors.jsonl on parse failure, so future regressions leave a forensic trail. The "supabase" tier here means "remote postgres" in practice — gbrain config uses engine:"postgres" for both real Supabase and any other remote postgres (e.g. local-postgres-for-testing). Downstream sync code treats them identically, so the label compression is intentional and documented inline. Regression test: existing detectEngineTier suite now isolates HOME + GBRAIN_HOME + PATH to temp dirs (closes a flake source where the prior tests would read whatever was on the reviewer's machine). New test forces gbrain off PATH, writes a synthetic config.json with engine:"postgres", asserts detectEngineTier() returns engine:"supabase". Fixes #1415. Patch shape contributed by Shiv @shivasymbl (tested on gstack v1.31.0.0 + gbrain v0.31.3 + Supabase). * fix(codex): /codex review works on Codex CLI ≥0.130.0 Codex CLI 0.130.0 made [PROMPT] and --base <BRANCH> mutually exclusive at argv level. Step 2A of codex/SKILL.md.tmpl had always passed both (the filesystem boundary prefix as the prompt argument + the base branch), so every /codex review call died with: error: the argument '[PROMPT]' cannot be used with '--base <BRANCH>' Fix: split Step 2A into two paths. Default (no custom user instructions): bare 'codex review --base <base>'. Codex's review prompt is internally diff-scoped, so the model focuses on the changes against base. The filesystem boundary prefix is dropped here because Codex 0.130 has no documented system-prompt config key (probed -c 'system_prompt="..."' against 0.130 — the flag is silently accepted but the value isn't applied). Skill files under .claude/ and agents/ are public, so this is a token-efficiency concern, not a safety one. Custom instructions (/codex review <focus>): route through codex exec with the diff written to a tempfile, inlined into the prompt between explicit DIFF_START / DIFF_END markers. The boundary is preserved here because codex exec isn't auto-scoped to the diff. The DIFF_START/END delimiters tell the model where data ends and instructions resume, which materially reduces prompt-injection hijack rates when the diff contains adversarial content. Note on bash semantics: codex's earlier review flagged the exec route as "command injection via $_DIFF interpolation." That framing is wrong — bash parameter expansion does not re-evaluate $(...) or backticks inside the expanded value, so a diff containing $(rm -rf /) is plain string data to codex exec. The real risk is prompt injection (model-side, not shell-side), which the DIFF_START/END pattern mitigates. Regression tests in test/codex-hardening.test.ts assert across BOTH codex/SKILL.md.tmpl AND the generated codex/SKILL.md: 1. No 'codex review' invocation line combines a quoted-string OR variable positional argument with --base. 2. Step 2A still contains either bare 'codex review --base' OR 'codex exec' (guards against accidental deletion of both fix paths). Fixes #1428. Reported by Stashub. * test: raise timeouts for slow integration tests Two test files were timing out at the default 5s on developer machines, both pre-existing on origin/main but unrelated to this branch's bug fixes: - test/gstack-artifacts-init.test.ts: 13 tests spawning real subprocesses via fake gh/glab/git shims in PATH. bun's fork+exec overhead pushed these past 5s consistently. Added a local test-wrapper that aliases test() with a 30s timeout (matches the brain-sync.test.ts pattern already in the repo). - test/gstack-next-version.test.ts: one integration smoke test that spawns 'bun run ./bin/gstack-next-version' and parses the resulting JSON. The subprocess does a 'gh pr list' against the live GitHub API to enumerate claimed version slots. Network latency makes 5s tight; raised this single test to 30s. No production code changed. The tests already passed deterministically once given enough wall-clock time. * chore: bump version and changelog (v1.34.2.0) Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com> --------- Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
This commit is contained in:
@@ -244,21 +244,82 @@ export function detectEngineTier(): EngineDetect {
|
||||
return fresh;
|
||||
}
|
||||
|
||||
// Returns gbrain's config.json path, honoring GBRAIN_HOME env var with a
|
||||
// fallback to ~/.gbrain. gbrain >=0.25 dropped the top-level `engine` field
|
||||
// from doctor output, so this file is the only reliable source for engine
|
||||
// detection on that version. See #1415.
|
||||
function gbrainConfigPath(): string {
|
||||
const root = process.env.GBRAIN_HOME || join(homedir(), ".gbrain");
|
||||
return join(root, "config.json");
|
||||
}
|
||||
|
||||
// Best-effort JSONL append to ~/.gstack/.gbrain-errors.jsonl. Never throws.
|
||||
function logGbrainError(kind: string, detail: string): void {
|
||||
try {
|
||||
const path = errorLogPath();
|
||||
mkdirSync(dirname(path), { recursive: true });
|
||||
appendFileSync(
|
||||
path,
|
||||
JSON.stringify({ ts: new Date().toISOString(), kind, detail: detail.slice(0, 500) }) + "\n",
|
||||
"utf-8"
|
||||
);
|
||||
} catch { /* logging is best-effort */ }
|
||||
}
|
||||
|
||||
function freshDetectEngineTier(): EngineDetect {
|
||||
const now = Date.now();
|
||||
let parsed: Record<string, unknown> | null = null;
|
||||
|
||||
// execFileSync (not execSync) avoids shell redirection — portable to
|
||||
// environments where `2>/dev/null` is bash-specific. The stdio array
|
||||
// suppresses stderr without invoking a shell.
|
||||
try {
|
||||
const out = execSync("gbrain doctor --json --fast 2>/dev/null", { encoding: "utf-8", timeout: 5000 });
|
||||
const parsed = JSON.parse(out);
|
||||
const engine: EngineTier = parsed?.engine === "supabase" ? "supabase" : parsed?.engine === "pglite" ? "pglite" : "unknown";
|
||||
return {
|
||||
engine,
|
||||
supabase_url: parsed?.supabase_url || undefined,
|
||||
detected_at: now,
|
||||
schema_version: 1,
|
||||
};
|
||||
} catch {
|
||||
return { engine: "unknown", detected_at: now, schema_version: 1 };
|
||||
const out = execFileSync("gbrain", ["doctor", "--json", "--fast"], {
|
||||
encoding: "utf-8",
|
||||
timeout: 5000,
|
||||
stdio: ["ignore", "pipe", "ignore"],
|
||||
});
|
||||
parsed = JSON.parse(out);
|
||||
} catch (err: unknown) {
|
||||
// execFileSync throws on non-zero exit; stdout is still on the error
|
||||
// object. gbrain doctor exits 1 whenever health_score < 100, which is
|
||||
// essentially always on fresh installs (resolver_health warnings are
|
||||
// normal). Recover stdout and re-parse. See #1415.
|
||||
try {
|
||||
const stdout = (err as { stdout?: Buffer | string })?.stdout ?? "";
|
||||
const stdoutStr = typeof stdout === "string" ? stdout : stdout.toString("utf-8");
|
||||
if (stdoutStr) parsed = JSON.parse(stdoutStr);
|
||||
} catch (parseErr) {
|
||||
logGbrainError("doctor_parse_failure", String(parseErr));
|
||||
}
|
||||
}
|
||||
|
||||
let engine: EngineTier =
|
||||
parsed?.engine === "supabase" ? "supabase" :
|
||||
parsed?.engine === "pglite" ? "pglite" : "unknown";
|
||||
|
||||
// gbrain >=0.25 ships schema_version:2 doctor output which dropped the
|
||||
// top-level `engine` field. Fall back to gbrain's config.json (respects
|
||||
// GBRAIN_HOME). "supabase" here means "remote postgres" — gbrain config
|
||||
// uses engine:"postgres" for real Supabase AND any other remote postgres
|
||||
// (e.g. local-postgres-for-testing). Downstream sync code treats them the
|
||||
// same, so the label compression is intentional.
|
||||
if (engine === "unknown") {
|
||||
try {
|
||||
const cfg = JSON.parse(readFileSync(gbrainConfigPath(), "utf-8"));
|
||||
if (cfg?.engine === "pglite") engine = "pglite";
|
||||
else if (cfg?.engine === "postgres" || cfg?.database_url) engine = "supabase";
|
||||
} catch (cfgErr) {
|
||||
logGbrainError("config_read_failure", String(cfgErr));
|
||||
}
|
||||
}
|
||||
|
||||
return {
|
||||
engine,
|
||||
supabase_url: parsed?.supabase_url as string | undefined,
|
||||
detected_at: now,
|
||||
schema_version: 1,
|
||||
};
|
||||
}
|
||||
|
||||
// ── Public: parseSkillManifest ────────────────────────────────────────────
|
||||
|
||||
Reference in New Issue
Block a user