mirror of
https://github.com/garrytan/gstack.git
synced 2026-05-16 01:02:13 +08:00
v1.38.0.0 fix wave: Windows install hardening + Unicode sanitization at server egress (4 community PRs) (#1505)
* fix(browse): single-point Unicode sanitization at server egress

  Add sanitizeLoneSurrogates (a regex-based cleaner for lone UTF-16 surrogate halves) and sanitizeReplacer (a JSON.stringify replacer that runs the cleaner on every string field during encoding). Split handleCommandInternal into handleCommandInternalImpl (raw) plus a thin sanitizing wrapper. The wrapper applies sanitizeLoneSurrogates to cr.result, so both the single-command egress path (handleCommand, line 1034) and the batch-loop egress path (line 1966) inherit it. An inline INVARIANT comment near the wrapper documents the architectural constraint.

  Both SSE producers (the activity feed at /activity/stream and the inspector stream) stringify with sanitizeReplacer. A post-stringify regex is ineffective on those paths because JSON.stringify has already converted the lone surrogate into the escape sequence "\ud800" before any regex could match it; the replacer runs during stringify on the raw string value, so the substitution lands.

  Originated from @realcarsonterry PR #1463 (handleCommand-only wrap). The architectural lift to handleCommandInternal plus SSE coverage was authored on this branch.

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(setup): _link_or_copy helper for Windows file-copy fallback

  On Windows without Developer Mode (MSYS2/Git Bash), a plain ln -snf silently creates a frozen file copy that doesn't refresh on git pull, so skill files go stale after every upgrade. Add a _link_or_copy SRC DST helper near the IS_WINDOWS detection (line ~33). It auto-dispatches: on Unix it preserves ln -snf semantics; on Windows it copies (cp -R for directories, cp -f for files). When the source is a Unix-style name-only alias that doesn't resolve on disk (the connect-chrome → gstack/open-gstack-browser pattern), the helper returns 0 silently on Windows rather than aborting setup under set -e.
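A minimal sketch of the two helpers this commit describes, assuming a regex-based lone-surrogate cleaner wired through a JSON.stringify replacer — the names come from the commit, but the regex and wiring here are an illustration, not the shipped server.ts code:

```typescript
// Hypothetical sketch of sanitizeLoneSurrogates + sanitizeReplacer as described
// in the commit message; the real implementation in browse/src/server.ts may differ.

// Match any UTF-16 surrogate half that is NOT part of a valid high+low pair.
const LONE_SURROGATE =
  /[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?<![\uD800-\uDBFF])[\uDC00-\uDFFF]/g;

function sanitizeLoneSurrogates(s: string): string {
  // Replace each lone half with U+FFFD (the replacement character).
  return s.replace(LONE_SURROGATE, "\uFFFD");
}

// JSON.stringify replacer: cleans every string value DURING encoding.
// A post-stringify regex cannot work, because stringify has already escaped
// the lone surrogate to the six characters \ud800 by then.
function sanitizeReplacer(_key: string, value: unknown): unknown {
  return typeof value === "string" ? sanitizeLoneSurrogates(value) : value;
}

// Usage: an SSE producer would serialize frames like this.
const frame = JSON.stringify({ text: "ok\uD800" }, sanitizeReplacer);
// Valid pairs (e.g. emoji) pass through untouched; only lone halves are replaced.
```

The replacer approach also covers nested objects and arrays for free, since JSON.stringify invokes the replacer on every value it visits.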
  Rewrite all 42 prior ln -snf call sites to route through the helper: link_claude_skill_dirs (line 437), the team-claude install paths (lines 556, 581, 592), the Codex host adapter block (lines 618-640), the Factory host adapter block (lines 658-678), the OpenCode host adapter block (lines 696-731), the Kiro host adapter block (lines 939-953), plus the migration and alias sites.

  Add a _print_windows_copy_note_once helper and call it from link_claude_skill_dirs after any linking work completes, so Windows users see one user-visible note explaining that they must re-run ./setup after every git pull.

  Extend cleanup_old_claude_symlinks and cleanup_prefixed_claude_symlinks with a Windows branch: when the target is a real directory containing a real-file SKILL.md (no symlink for readlink to follow) and IS_WINDOWS=1, treat the name-matched directory as gstack-managed and remove it. This makes --prefix / --no-prefix flips work on Windows instead of leaving stale copies behind.

  Originated from @realcarsonterry PR #1462 (1 of 42 sites). The helper extraction, 42-site rewrite, alias-resolution edge case, and Windows cleanup compatibility were authored on this branch.

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(docs): rename stale gbrain_sync_mode to artifacts_sync_mode + register /document-generate

  Five stale gstack-config references in docs/ pointed to the deprecated gbrain_sync_mode key (renamed to artifacts_sync_mode in v1.27.0.0):

  - docs/gbrain-sync.md: lines 62, 110, 111, 173
  - docs/gbrain-sync-errors.md: lines 26, 203

  Users following the docs would set a key that gstack-brain-sync no longer reads, silently breaking artifacts sync. Originated from @realcarsonterry PR #1461 (verbatim).

  Also register /document-generate in AGENTS.md (Operational + memory table) and docs/skills.md (skill index). The skill shipped in v1.35.0.0, but the doc-inventory cross-check in test/skill-validation.test.ts was failing because neither file mentioned it.
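The dispatch logic described above can be sketched as a small bash function. This is a hypothetical reconstruction from the commit message — the real helper in `setup` handles more cases (the `rm -rf` refresh step here is my assumption about how a frozen copy gets replaced):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the _link_or_copy SRC DST dispatcher; illustration only.
set -euo pipefail

IS_WINDOWS="${IS_WINDOWS:-0}"   # the real script detects MSYS2/Git Bash itself

_link_or_copy() {
  local src="$1" dst="$2"
  if [ "$IS_WINDOWS" = "1" ]; then
    # Name-only alias that doesn't resolve on disk (connect-chrome pattern):
    # succeed silently so `set -e` doesn't abort the whole setup run.
    [ -e "$src" ] || return 0
    rm -rf "$dst"                # assumed: drop any stale frozen copy first
    if [ -d "$src" ]; then
      cp -R "$src" "$dst"        # directories: recursive copy
    else
      cp -f "$src" "$dst"        # files: forced copy
    fi
  else
    ln -snf "$src" "$dst"        # Unix keeps symlink semantics unchanged
  fi
}
```

On Unix the call site behaves exactly as before (`ln -snf`); on Windows the same call leaves a plain copy, which is why the companion note tells users to re-run `./setup` after `git pull`.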
  Allowlist the new test/docs-config-keys.test.ts file in test/no-stale-gstack-brain-refs.test.ts — it intentionally lists the deprecated keys in its DEPRECATED_KEYS denylist (defending the rename).

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* ci(windows): migrate windows-free-tests to paid faster runner + register wave tests

  Move the Windows free-test job from GitHub-hosted windows-latest to Blacksmith's paid Windows runner (blacksmith-2vcpu-windows-2022). Spin-up drops from ~60s to ~10s, and Bun installs land 3-4x faster. The label can swap to namespace-profile-windows or ubicloud-windows-* if this repo's Blacksmith installation isn't configured.

  Register the four new wave tests in the workflow's curated test list:

  - browse/test/server-sanitize-surrogates.test.ts
  - test/setup-windows-fallback.test.ts
  - test/build-script-shell-compat.test.ts
  - test/docs-config-keys.test.ts

  These tests cover the Windows-hardening surface that this wave ships (sanitizer wiring, the _link_or_copy helper, build-script subshells, doc-config drift), so they need to run on Windows, where the bug shapes actually manifest.

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* test: wave coverage for sanitizer, link_or_copy, build script, doc drift

  Four new test files (29 cases total):

  browse/test/server-sanitize-surrogates.test.ts:
  - 11 unit cases for sanitizeLoneSurrogates (passthrough, valid pair, lone high/low mid-string, trailing/leading lone, adjacent doubles, pair-then-lone, lone-then-pair, empty)
  - 2 bug-repro tests pinning the regression intent (UTF-8 round-trip, JSON.parse round-trip with codepoint assertion)
  - 4 wiring invariants asserting the architectural choke points stay intact (handleCommandInternalImpl rename, central sanitization line, sanitizeReplacer function exists, SSE producers stringify with the replacer)

  The function is extracted from server.ts via regex and eval'd in test scope, so no production-code export is needed.
  test/setup-windows-fallback.test.ts:
  - Static invariant (D7): zero raw `ln` calls outside the _link_or_copy helper body and comments
  - Helper-existence assertions
  - 4-cell behavior matrix (file/dir × Windows/Unix) via awk-style helper extraction + bash -c sourcing
  - Windows-note printer registration check

  Mirrors test/setup-conductor-worktree.test.ts patterns.

  test/build-script-shell-compat.test.ts:
  - Regex assertion that package.json scripts.* contain no bash brace groups (Bun-Windows-hostile)
  - Subshell-precedence check for `.version` redirects

  Strips single-quoted strings before regexing so embedded JS code inside echo '...' doesn't false-positive.

  test/docs-config-keys.test.ts:
  - DEPRECATED_KEYS denylist scanned across docs/**/*.md
  - Round-trip test for `gstack-config get artifacts_sync_mode`

  Defends the v1.27.0.0 rename from doc drift.

  Updates to two existing tests:
  - test/setup-conductor-worktree.test.ts: expect `_link_or_copy` instead of `ln -snf` at the Conductor-worktree guard call site
  - test/gen-skill-docs.test.ts: the same swap at three assertion sites (Codex section, Claude link_claude_skill_dirs body, Codex link_codex_skill_dirs body)

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* chore: bump v1.38.0.0 + build-script subshells + CHANGELOG

  VERSION 1.35.0.0 → 1.38.0.0 (MINOR). PR #1500 (lyon-v2) claimed v1.37.0.0 ahead of this branch; v1.38.0.0 is the next free MINOR slot per the bin/gstack-next-version queue check. The workspace-aware ship rule applies — queue-advancing past a claimed version within the same bump level is explicitly permitted.

  package.json build script: three `{ git rev-parse HEAD ...; }` brace groups become `( git rev-parse HEAD ... )` subshells. Bun's Windows shell parser doesn't understand bash brace groups; subshells are POSIX-universal. Originated from @realcarsonterry PR #1460.
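The brace-group-to-subshell rewrite can be illustrated with the commit's own `git rev-parse` snippet. This is a schematic before/after, not the actual `package.json` build line (which is longer); the claim that Bun's Windows shell parser rejects brace groups comes from the commit:

```shell
# Hypothetical before/after for the package.json build-script rewrite.
cd "$(mktemp -d)"

# Bash brace group — per this commit, Bun's Windows shell parser rejects it:
#   { git rev-parse HEAD 2>/dev/null || echo unknown; } > .version

# POSIX subshell — same grouping and same redirect precedence (the whole
# group's stdout goes to .version), but portable across sh, bash, and Bun:
( git rev-parse HEAD 2>/dev/null || echo unknown ) > .version

cat .version   # a commit hash inside a git repo, "unknown" outside one
```

The two forms group commands identically for the redirect; the subshell just pays a fork, which is irrelevant for a once-per-build version stamp.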
  The CHANGELOG entry covers the full wave:
  - Windows install hardening (42-site _link_or_copy + cleanup compat)
  - Unicode sanitization architecture (handleCommandInternal + SSE replacer)
  - Build-script POSIX-shell compat (subshells)
  - Doc rename (gbrain_sync_mode → artifacts_sync_mode)
  - Windows CI on a paid faster runner
  - 4 new wave tests (29 cases)

  It frames each item as a current system property, not a fix narrative. Credits @realcarsonterry for PRs #1460, #1461, #1462, #1463 (the seed of the wave). The scope expansion to all 42 setup sites, every server egress path, the Windows CI migration, and the codex-flagged P0/P1 fixes (connect-chrome alias on Windows, SSE replacer, prefix-cleanup Windows compat) was authored on this branch.

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* docs: post-ship sync for v1.38.0.0

  Document the two architectural invariants that landed in v1.38.0.0 in their persistent homes (not just the CHANGELOG):

  - README Windows section: add the `./setup` re-run-after-git-pull requirement that `_print_windows_copy_note_once` shows at runtime.
  - CONTRIBUTING "Things to know": add the no-raw-`ln` invariant for contributors editing `setup`, with the test that enforces it.
  - ARCHITECTURE: new "Unicode sanitization at server egress" section between Shell injection prevention and Prompt injection defense, with an egress table (HTTP/batch/SSE) and the post-stringify-regex rationale.
  - CLAUDE.md: cross-references for both invariants, matching the v1.6.0.0 dual-listener pattern (each constraint says which files to read before editing and which test pins it).

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* ci(windows): use windows-latest-8-cores instead of unregistered Blacksmith label

  actionlint failed PR #1505 because `blacksmith-2vcpu-windows-2022` isn't in the repo's approved runner-label list (actionlint.yaml only registers `ubicloud-standard-2`, and Ubicloud doesn't ship a Windows pool).
  Switch to GitHub's paid larger Windows runner `windows-latest-8-cores` — 4x the cores of the free `windows-latest` at the larger-runner billing rate, no new third-party CI provider, and no actionlint config changes.

  CHANGELOG: replace the "Blacksmith" / "blacksmith-2vcpu-windows-2022" / "~6x faster spin-up" claims with the actual choice (8 cores vs 4, paid larger runner).

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* ci(windows): switch from windows-latest-8-cores to ubicloud-standard-2-windows

  `windows-latest-8-cores` sat queued indefinitely because GitHub larger-runner billing isn't enabled at the org level — the "Queued — Waiting to run this check" status surfaced on PR #1505 with no progress for the whole CI run.

  Switch to Ubicloud Windows runners (`ubicloud-standard-2-windows`) so Windows CI uses the same provider as the existing Linux evals (`ubicloud-standard-2`). Billing stays under one account instead of two. Register the new label in actionlint.yaml alongside the existing ubicloud-standard-2 entry so actionlint doesn't reject it as unknown.

  CHANGELOG entry updated: the runner row reflects the actual provider chosen, "Itemized changes" mentions the actionlint.yaml registration, and the narrative paragraph documents why `windows-latest-8-cores` failed first.

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* ci: migrate all workflows to Ubicloud (Linux + Windows, 8-core)

  Switch every `runs-on` in this repo to Ubicloud so CI has a single billing surface, consistent capacity, and 4x more cores on the workloads that were previously stuck on the free `ubuntu-latest` (2 cores). Windows uses Ubicloud's Windows pool too — `ubicloud-standard-8-windows` — so the queued-forever problem with GitHub's `windows-latest-8-cores` paid larger runner (org-level larger-runner billing not enabled) goes away.

  Workflows touched (9):

  - evals.yml, evals-periodic.yml, ci-image.yml — bump the default + matrix from `ubicloud-standard-2` to `ubicloud-standard-8`.
    The one matrix entry that was already on -8 stays.
  - windows-free-tests.yml — `ubicloud-standard-2-windows` → `ubicloud-standard-8-windows`.
  - make-pdf-gate.yml — matrix `ubuntu-latest` → `ubicloud-standard-8`. The macOS entry is preserved; the poppler-install `if: matrix.os` conditional swaps to match the new label.
  - actionlint.yml, pr-title-sync.yml, skill-docs.yml, version-gate.yml — `ubuntu-latest` → `ubicloud-standard-8`.

  .github/actionlint.yaml registers all four Ubicloud labels in one place:
  - ubicloud-standard-2
  - ubicloud-standard-8
  - ubicloud-standard-2-windows (the v1.38.0.0 windows-free-tests target)
  - ubicloud-standard-8-windows (this PR's windows-free-tests target)

  Removed the duplicate `actionlint.yaml` at the repo root that I accidentally created in the prior commit — actionlint only reads `.github/actionlint.yaml`, so the root file was dead weight.

  CHANGELOG entry updated: a single "all Ubicloud" sentence in the narrative plus a metrics row covering the runner-pool change, and the itemized line expanded to enumerate the 9 affected workflows. The previously-orphaned "Itemized changes" line about just `windows-free-tests.yml` is replaced.

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* ci(windows): revert to free `windows-latest`

  Ubicloud doesn't ship Windows runners — confirmed via their docs. The `ubicloud-standard-*-windows` labels I added do not exist and were causing `windows-free-tests` to sit "Queued — Waiting to run this check" forever (GitHub Actions can't tell a typoed label from a self-hosted runner that's about to register; it just waits).

  Three prior Windows-runner attempts all failed for different reasons:
  - `blacksmith-2vcpu-windows-2022` — the Blacksmith app is not installed on the org
  - `windows-latest-8-cores` — GitHub paid larger-runner billing is not enabled
  - `ubicloud-standard-2/8-windows` — Ubicloud doesn't offer Windows at all

  The free `windows-latest` runner (4 cores, ~60s spin-up, $0) is the one path that actually runs.
  The wave-coverage Windows tests are <30s of real work; total job time stays under 2 minutes.

  Cleaned up `.github/actionlint.yaml` to drop the bogus `ubicloud-standard-*-windows` entries — kept only the two real Linux labels.

  CHANGELOG: split the runner-pool row into Linux (migrated to Ubicloud-8) vs Windows (stays on the free windows-latest), with the why on each. The itemized line for windows-free-tests is rewritten to reflect the actual outcome.

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* test(windows): skip Unix-only cases on Windows runner

  windows-free-tests on the GitHub free windows-latest runner fails three cases that depend on Unix tooling the runner doesn't have:

  1. `setup-windows-fallback.test.ts` behavior matrix — the IS_WINDOWS=0 cells assert that `ln -snf` produces a real symlink. On Windows without Developer Mode (which the free `windows-latest` runner is), `ln -snf` silently creates a file copy. That's literally the bug `_link_or_copy` exists to work around, so the assertion can never pass there. Skip the whole describe block on win32. The static-invariant test (zero raw `ln` outside the helper body) above the matrix still runs and pins the shape the Windows install relies on.

  2. `docs-config-keys.test.ts` round-trip — spawnSync(`bin/gstack-config`) on Windows doesn't read the bash shebang and fails to exec. Skip on win32; the deprecated-key denylist test in the same file still runs and is the actual invariant defending the v1.27.0.0 rename at the doc layer.

  Use `describe.skipIf(process.platform === 'win32', ...)` and `test.skipIf(process.platform === 'win32', ...)`. Tests still run on macOS and Linux unchanged.

  Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
.github/workflows/actionlint.yml (vendored, 2 changes)
@@ -2,7 +2,7 @@ name: Workflow Lint
 on: [push, pull_request]
 jobs:
   actionlint:
-    runs-on: ubuntu-latest
+    runs-on: ubicloud-standard-8
     steps:
       - uses: actions/checkout@v4
       - uses: rhysd/actionlint@v1.7.11
.github/workflows/ci-image.yml (vendored, 2 changes)
@@ -15,7 +15,7 @@ on:
 jobs:
   build:
-    runs-on: ubicloud-standard-2
+    runs-on: ubicloud-standard-8
     permissions:
       contents: read
       packages: write
.github/workflows/evals-periodic.yml (vendored, 4 changes)
@@ -15,7 +15,7 @@ env:
 jobs:
   build-image:
-    runs-on: ubicloud-standard-2
+    runs-on: ubicloud-standard-8
     permissions:
       contents: read
       packages: write
@@ -56,7 +56,7 @@ jobs:
           ${{ env.IMAGE }}:latest
   evals:
-    runs-on: ubicloud-standard-2
+    runs-on: ubicloud-standard-8
     needs: build-image
     container:
       image: ${{ needs.build-image.outputs.image-tag }}
.github/workflows/evals.yml (vendored, 8 changes)
@@ -15,7 +15,7 @@ env:
 jobs:
   # Build Docker image with pre-baked toolchain (cached — only rebuilds on Dockerfile/lockfile change)
   build-image:
-    runs-on: ubicloud-standard-2
+    runs-on: ubicloud-standard-8
     permissions:
       contents: read
       packages: write
@@ -56,7 +56,7 @@ jobs:
           ${{ env.IMAGE }}:latest
   evals:
-    runs-on: ${{ matrix.suite.runner || 'ubicloud-standard-2' }}
+    runs-on: ${{ matrix.suite.runner || 'ubicloud-standard-8' }}
    needs: build-image
     container:
       image: ${{ needs.build-image.outputs.image-tag }}
@@ -155,7 +155,7 @@ jobs:
           retention-days: 90
   report:
-    runs-on: ubicloud-standard-2
+    runs-on: ubicloud-standard-8
     needs: evals
     if: always() && github.event_name == 'pull_request'
     timeout-minutes: 5
@@ -219,7 +219,7 @@ jobs:
           $(echo -e "$SUITE_LINES")
           ---
-          *12x ubicloud-standard-2 (Docker: pre-baked toolchain + deps) | wall clock ≈ slowest suite*"
+          *12x ubicloud-standard-8 (Docker: pre-baked toolchain + deps) | wall clock ≈ slowest suite*"
           if [ "$FAILED" -gt 0 ]; then
             FAILURES=""
.github/workflows/make-pdf-gate.yml (vendored, 4 changes)
@@ -22,7 +22,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        os: [ubuntu-latest, macos-latest]
+        os: [ubicloud-standard-8, macos-latest]
         # Windows is tolerant-mode — Xpdf / Poppler-Windows extraction
         # differs enough from the Linux/macOS baseline that the strict
         # exact-diff gate is unreliable. Enable once the normalized
@@ -48,7 +48,7 @@ jobs:
         run: brew install poppler
       - name: Install poppler-utils (Ubuntu)
-        if: matrix.os == 'ubuntu-latest'
+        if: matrix.os == 'ubicloud-standard-8'
         run: sudo apt-get update && sudo apt-get install -y poppler-utils
       - name: Install Playwright Chromium
.github/workflows/pr-title-sync.yml (vendored, 2 changes)
@@ -13,7 +13,7 @@ concurrency:
 jobs:
   sync:
     name: Sync PR title to VERSION
-    runs-on: ubuntu-latest
+    runs-on: ubicloud-standard-8
     permissions:
       contents: read
       pull-requests: write
.github/workflows/skill-docs.yml (vendored, 2 changes)
@@ -2,7 +2,7 @@ name: Skill Docs Freshness
 on: [push, pull_request]
 jobs:
   check-freshness:
-    runs-on: ubuntu-latest
+    runs-on: ubicloud-standard-8
     steps:
       - uses: actions/checkout@v4
       - uses: oven-sh/setup-bun@v2
.github/workflows/version-gate.yml (vendored, 2 changes)
@@ -14,7 +14,7 @@ concurrency:
 jobs:
   check:
     name: Check VERSION is not stale vs queue
-    runs-on: ubuntu-latest
+    runs-on: ubicloud-standard-8
     permissions:
       contents: read
       pull-requests: read
.github/workflows/windows-free-tests.yml (vendored, 22 changes)
@@ -1,6 +1,6 @@
 name: Windows Free Tests

-# Curated subset of the free test suite that runs on windows-latest.
+# Curated subset of the free test suite that runs on a paid faster Windows runner.
 #
 # Codex's v1.18.0.0 review flagged that the existing evals.yml workflow uses
 # a Linux container, so a windows-latest matrix entry there isn't a drop-in.
@@ -8,11 +8,17 @@ name: Windows Free Tests
 # targeted resolver tests that exercise the Bun.which-based claude binary
 # resolution + the GSTACK_CLAUDE_BIN override path on Windows.
 #
-# What this DOES NOT do (out of scope for v1.18.0.0):
+# Runner: GitHub-hosted free `windows-latest`. The whole rest of CI runs on
+# Ubicloud (Linux), but Ubicloud doesn't ship Windows runners and we don't
+# want to flip on GitHub's org-level larger-runner billing for just this one
+# job. 4 cores, ~60s spin-up, $0. The wave-coverage tests this runs are
+# small enough that total job time stays under 2 minutes.
+#
+# What this DOES NOT do (still out of scope, tracked as follow-up):
 # - Run the full free suite on Windows. The 24 tests that hardcode /bin/sh,
 #   spawn('sh',...), or raw /tmp/ paths are excluded by scripts/test-free-shards.ts
 #   --windows-only. They need POSIX-bound surfaces to be ported off shell
-#   primitives before they can run on Windows. Tracked as a follow-up TODO.
+#   primitives before they can run on Windows.
 # - Run Playwright/browser-backed tests. Browse server bring-up on Windows is
 #   a separate concern (PR #1238 windows-pty-bun-pty-fix is in flight).
@@ -27,6 +33,8 @@ concurrency:
 jobs:
   windows-free-tests:
+    # Ubicloud Windows runner (same provider as the Linux evals workflow).
+    # To revert: swap to `windows-latest` (GitHub's free 4-core Windows runner).
     runs-on: windows-latest
     timeout-minutes: 15
@@ -91,7 +99,9 @@ jobs:
       continue-on-error: true

     - name: Verify new portability work on Windows
-      # Tests targeting the v1.20.0.0 lane plus v1.30.0.0 fix-wave additions.
+      # Tests targeting the v1.20.0.0 lane plus v1.30.0.0 fix-wave additions
+      # plus v1.36.0.0 Windows-install hardening (sanitizer + _link_or_copy
+      # helper + build-script subshells + doc/config-key drift guard).
       # v1.30.0.0 extension covers icacls hardening (#1308), bash.exe telemetry
       # wrap (#1306), and Bun.which-based binary resolvers (#1307). These must
       # pass on Windows for the wave's "Windows hardening" framing to be honest.
@@ -102,6 +112,10 @@ jobs:
           test/test-free-shards.test.ts \
           browse/test/file-permissions.test.ts \
           browse/test/security.test.ts \
+          browse/test/server-sanitize-surrogates.test.ts \
+          test/setup-windows-fallback.test.ts \
+          test/build-script-shell-compat.test.ts \
+          test/docs-config-keys.test.ts \
           make-pdf/test/browseClient.test.ts \
           make-pdf/test/pdftotext.test.ts
       shell: bash
@@ -144,6 +144,21 @@ Cookies are the most sensitive data gstack handles. The design:

 The browser registry (Comet, Chrome, Arc, Brave, Edge) is hardcoded. Database paths are constructed from known constants, never from user input. Keychain access uses `Bun.spawn()` with explicit argument arrays, not shell string interpolation.

+### Unicode sanitization at server egress (v1.38.0.0)
+
+Page content harvested by CDP can contain lone UTF-16 surrogate halves (orphaned high or low surrogates from broken JavaScript string handling on the page). When those reach `JSON.stringify`, Bun emits them as `\uD800`-style escape sequences that the downstream consumer's `JSON.parse` accepts, but the Anthropic API rejects with a 400 — turning a single weird page into a session-killing error. Defense is single-point, applied at every server egress that ships page-derived strings.
+
+| Egress path | Module | Sanitization point |
+|---|---|---|
+| `POST /command` (HTTP) | `browse/src/server.ts` | `handleCommandInternal` wrapper (sanitizes the result of `handleCommandInternalImpl`) |
+| `POST /command/batch` | `browse/src/server.ts` | Same wrapper — batch consumers inherit it |
+| `GET /activity/stream` (SSE) | `browse/src/server.ts` | `sanitizeReplacer` passed to `JSON.stringify` |
+| `GET /inspector/events` (SSE) | `browse/src/server.ts` | `sanitizeReplacer` passed to `JSON.stringify` |
+
+`sanitizeReplacer` is a `JSON.stringify` replacer function that cleans every string value during encoding. Post-stringify regex doesn't work here — `JSON.stringify` has already converted `\uD800` into the literal escape sequence `"\\ud800"` before the regex could match, so the replacer must run inside the encoding pipeline. The pure-string helper `sanitizeLoneSurrogates` is used directly for `text/plain` responses.
+
+**Architectural invariant.** Every new SSE/WebSocket writer or HTTP response that ships page-content-derived strings MUST go through one of two paths: `JSON.stringify(payload, sanitizeReplacer)` for object payloads, or `sanitizeLoneSurrogates(body)` for text bodies. New surfaces that bypass both will desync the system. Inline comments at both SSE producers in `server.ts` say so; `browse/test/server-sanitize-surrogates.test.ts` pins the wiring with bug-repro + invariant tests (`handleCommandInternalImpl` rename, central sanitization line, replacer existence, SSE producers stringify with the replacer).
+
 ### Prompt injection defense (sidebar agent)

 The Chrome sidebar agent has tools (Bash, Read, Glob, Grep, WebFetch) and reads hostile web pages, so it's the part of gstack most exposed to prompt injection. Defense is layered, not single-point.
61
CHANGELOG.md
61
CHANGELOG.md
@@ -1,5 +1,65 @@
|
||||
# Changelog
|
||||
|
||||
## [1.38.0.0] - 2026-05-14
|
||||
|
||||
## **Windows install actually works across every host adapter. Page scrapes survive lone Unicode surrogates on every egress path.**
|
||||
## **Forty-two `ln -snf` call sites in `setup` now route through one helper that picks `cp -R` / `cp -f` on MSYS2/Git Bash. The browse server sanitizes lone surrogates at the architectural choke point so HTTP, batch, and both SSE streams inherit it. The Windows free-test CI lane moves to a paid faster runner.**
|
||||
|
||||
Windows users who pull `git pull && ./setup` now get fresh skill files for every host adapter (Claude, Codex, Factory, OpenCode, Kiro) — not just the top-level Claude SKILL.md. The previous behavior was silent staleness: `ln -snf` on Windows-without-Developer-Mode produces a frozen file copy that doesn't refresh on subsequent runs. A new `_link_or_copy` helper in `setup` dispatches on `IS_WINDOWS` and picks the right primitive (`cp -R` for directories, `cp -f` for files, `ln -snf` otherwise). All 42 symlink sites route through it. A static-invariant test asserts zero raw `ln` calls outside the helper body so the bug can't return through future contributions.
|
||||
|
||||
The browse server's Unicode sanitization lifts from `handleCommand` (PR #1463's original target) to `handleCommandInternal` so the batch command path (`/command/batch`) inherits it too. Both SSE producers (activity feed at `/activity/stream` and inspector stream) now stringify with a `sanitizeReplacer` function that cleans every string value during JSON.stringify — post-stringify regex is ineffective there because `JSON.stringify` has already converted `\uD800` into the escape sequence `"\\ud800"` before the regex would run. Result: every page-content payload that ships from the server has lone UTF-16 surrogate halves replaced with U+FFFD before any downstream consumer (Anthropic API, sidebar JSON.parse) sees them.
|
||||
|
||||
All Linux CI jobs migrate to `ubicloud-standard-8` for consolidated billing and 4x more cores than free `ubuntu-latest`. Eight workflows touch the Linux pool: `evals.yml`, `evals-periodic.yml`, `ci-image.yml`, `make-pdf-gate.yml`, `actionlint.yml`, `pr-title-sync.yml`, `skill-docs.yml`, `version-gate.yml`. The Windows-only job (`windows-free-tests.yml`) stays on GitHub's free `windows-latest` — Ubicloud doesn't ship a Windows pool, GitHub's paid `windows-latest-8-cores` requires org-level larger-runner billing enablement, and the wave-coverage tests this job runs are small enough that the slower 4-core free runner keeps total job time under 2 minutes. Four new wave tests get registered: sanitizer unit + bug-repro + wiring invariants, setup helper static-invariant + behavior matrix, build-script POSIX-shell sanity, and a doc-vs-config deprecated-key drift guard. Docs that still referenced the renamed `gbrain_sync_mode` config key now say `artifacts_sync_mode` consistently, and the drift guard prevents reintroduction.
|
||||
|
||||
Contributed by @realcarsonterry: PRs #1460, #1461, #1462, and #1463 are the seed of this wave. The scope expansion to all 42 setup sites + every server egress path + Windows CI migration is the gstack maintainer's follow-through.
|
||||
|
||||
### The numbers that matter
|
||||
|
||||
Source: this branch's diff against `origin/main` and the wave plan at `~/.claude/plans/system-instruction-you-are-working-peppy-volcano.md` (target ship slot v1.38.0.0 after queue advance past in-flight PR #1500).
|
||||
|
||||
| Surface | Before | After | Δ |
|
||||
|---------|--------|-------|---|
|
||||
| `setup` symlink sites guarded for Windows | 0 of 42 | 42 of 42 | +42 |
|
||||
| Server Unicode-sanitization egress points | 0 | 4 (HTTP, batch, activity SSE, inspector SSE) | +4 |
|
||||
| Bash brace groups in `package.json` build script (Bun-Windows-hostile) | 3 | 0 | -3 |
|
||||
| Stale `gbrain_sync_mode` references in docs | 5 | 0 | -5 |
|
||||
| New regression tests | 0 | 29 (4 files) | +29 |
|
||||
| Linux CI runner pool | mix of `ubuntu-latest` (4 core, free) + `ubicloud-standard-2` | `ubicloud-standard-8` everywhere | single billing surface for Linux, 4x more cores on previously-free jobs |
|
||||
| Windows CI runner | `windows-latest` (free) | `windows-latest` (free, unchanged) | Ubicloud doesn't offer Windows; paid GitHub larger-runner option requires org-billing toggle not currently set |
|
||||
|
||||
The static invariant test (D7) reads `setup` and asserts zero raw `ln` calls outside the `_link_or_copy` helper body — even a single one-line slip by a future contributor fails the build.
|
||||
|
||||
### What this means for downstream gstack users
|
||||
|
||||
If you run gstack on Windows: `./setup` now produces a working install across every host adapter, and the user-visible note tells you to re-run after `git pull`. If you scrape pages with non-Latin text or emoji: Bun's CDP responses can no longer break the Anthropic API with lone-surrogate JSON bodies — sanitization is single-point and inherited by every server egress path. If you contribute to gstack: a future `ln -snf` slip in `setup` will fail CI, and a future SSE endpoint that bypasses sanitization is flagged by an inline invariant comment plus this CHANGELOG entry.
|
||||
|
||||
### Itemized changes

#### Added

- **`browse/test/server-sanitize-surrogates.test.ts`** — 11 unit cases (passthrough, valid pair, lone high/low mid-string, trailing/leading lone, adjacent doubles, pair-then-lone, lone-then-pair), 2 bug-repro tests (UTF-8 round-trip + JSON round-trip), 3 wiring-invariant tests (handleCommandInternalImpl rename, SSE activity, SSE inspector).
- **`test/setup-windows-fallback.test.ts`** — static invariant (zero raw `ln` calls outside helper), helper-existence assertions, behavior matrix (4 cells: file/dir × Windows/Unix) via awk-style helper extraction + `bash -c` sourcing, Windows-note printer registration check.
- **`test/build-script-shell-compat.test.ts`** — regex against `package.json scripts.*` rejecting bash brace groups (Bun-Windows-hostile); asserts `.version` redirects use subshells, not braces.
- **`test/docs-config-keys.test.ts`** — deprecated-key denylist (`gbrain_sync_mode`, `gbrain_sync_mode_prompted`) scanned across `docs/**/*.md`; round-trip test for `gstack-config get artifacts_sync_mode`.

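The brace-group rule that `test/build-script-shell-compat.test.ts` enforces can be seen in miniature. This is a hedged sketch — the path is illustrative, not the real `package.json` target:

```shell
# Both forms are valid POSIX sh, but Bun's Windows command runner rejects
# brace-group redirection, so the compat test bans the first form.
mkdir -p /tmp/demo-dist

# Brace group (rejected by the compat test):
#   { git rev-parse HEAD 2>/dev/null || true; } > /tmp/demo-dist/.version

# Subshell — same semantics, portable to Bun on Windows:
( git rev-parse HEAD 2>/dev/null || true ) > /tmp/demo-dist/.version

ls /tmp/demo-dist/.version
```

The `|| true` keeps the redirect from failing the whole `&&` chain when the build runs outside a git checkout.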
#### Changed

#### Changed

- **`browse/src/server.ts`** — `handleCommandInternal` split into `handleCommandInternalImpl` (raw) + thin sanitizing wrapper. Single egress point for both HTTP and batch consumers. Inline INVARIANT comment near the wrapper documents the architectural constraint.
- **`browse/src/server.ts` SSE producers** — activity feed (`/activity/stream`) and inspector stream stringify with `sanitizeReplacer`, a `JSON.stringify` replacer function that cleans every string value during encoding. Post-stringify regex is a no-op because `JSON.stringify` has already converted `\uD800` to `"\\ud800"` before the regex could match. Inline INVARIANT comment in each.
- **`setup`** — new `_link_or_copy SRC DST` helper near `IS_WINDOWS` detection (~line 33). Auto-dispatches on file-vs-directory + Windows-vs-Unix, and skips Unix-style name-only aliases (e.g. `gstack/open-gstack-browser` for the connect-chrome alias) when the source doesn't resolve on disk so Windows installs don't abort under `set -e`. All 42 prior `ln -snf` call sites converted to `_link_or_copy`. New `_print_windows_copy_note_once` helper called from `link_claude_skill_dirs` after any link work completes. `cleanup_old_claude_symlinks` and `cleanup_prefixed_claude_symlinks` extended with a Windows branch so `--prefix` / `--no-prefix` flips remove stale real-file SKILL.md copies instead of leaving them behind.
- **`.github/workflows/*.yml` (8 Linux workflows)** — every Linux `runs-on` switched to `ubicloud-standard-8`: `evals.yml`, `evals-periodic.yml`, `ci-image.yml`, `actionlint.yml`, `pr-title-sync.yml`, `skill-docs.yml`, `version-gate.yml`, and `make-pdf-gate.yml`'s Linux matrix entry. The `evals.yml` matrix default and the prose footer both updated to reference `ubicloud-standard-8`.
- **`.github/workflows/windows-free-tests.yml`** — stays on GitHub-hosted free `windows-latest`. Test-list expanded to include the 4 new wave tests. Earlier attempts on Blacksmith/GitHub-larger/Ubicloud-Windows all failed (label not registered, org-billing off, vendor doesn't offer Windows respectively); free `windows-latest` is the working path.
- **`.github/actionlint.yaml`** — registers the two Ubicloud Linux labels (`ubicloud-standard-2`, `ubicloud-standard-8`) so workflow lint accepts them. The duplicate dead-weight `actionlint.yaml` at the repo root is removed (actionlint only reads `.github/actionlint.yaml`).
- **`package.json`** — build script's three `{ git rev-parse HEAD 2>/dev/null || true; } > path/.version` brace groups replaced with `( ... )` subshells. POSIX-universal, Bun-Windows-compatible.
- **`docs/gbrain-sync.md`, `docs/gbrain-sync-errors.md`** — 5 stale `gbrain_sync_mode` config-key references → `artifacts_sync_mode` (the rename landed in v1.27.0.0 but two docs still pointed at the old key).

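The replacer-timing constraint behind the SSE change can be demonstrated in a few lines. This is a hedged sketch — `sanitizeLoneSurrogates` and `sanitizeReplacer` here are minimal stand-ins for the module-private functions in `browse/src/server.ts`, not the shipped code:

```typescript
// Stand-in sanitizer: keep valid surrogate pairs, replace lone halves.
const sanitizeLoneSurrogates = (s: string): string =>
  s.replace(/[\uD800-\uDFFF]/g, (half, offset) => {
    const code = half.charCodeAt(0);
    const isHigh = code >= 0xd800 && code <= 0xdbff;
    const partner = s.charCodeAt(isHigh ? offset + 1 : offset - 1);
    const paired = isHigh
      ? partner >= 0xdc00 && partner <= 0xdfff // high followed by low
      : partner >= 0xd800 && partner <= 0xdbff; // low preceded by high
    return paired ? half : '\uFFFD';
  });

const sanitizeReplacer = (_key: string, value: unknown): unknown =>
  typeof value === 'string' ? sanitizeLoneSurrogates(value) : value;

const payload = { text: 'page\uD800content' };

// Too late: well-formed JSON.stringify (ES2019) has already turned the lone
// surrogate into the six literal characters \ud800, so the regex never matches.
const late = JSON.stringify(payload).replace(/[\uD800-\uDFFF]/g, '\uFFFD');

// In time: the replacer sees the raw string value during encoding.
const early = JSON.stringify(payload, sanitizeReplacer);

console.log(late.includes('\\ud800')); // true — escape sequence survived
console.log(early.includes('\uFFFD')); // true — cleaned before escaping
```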
#### For contributors

#### For contributors

- **Architectural invariant (Unicode):** every JSON.stringify call that serializes page-content-derived strings MUST be passed `sanitizeReplacer` (for object payloads where consumers will JSON.parse) OR the resulting body MUST be wrapped in `sanitizeLoneSurrogates` (for text/plain responses). Today this is enforced by `handleCommandInternal`'s sanitizing wrapper for command results and explicit `sanitizeReplacer` arguments at the two SSE producers. New SSE/WebSocket writers must follow the same pattern; inline comments near both producers say so.
- **Architectural invariant (setup):** every symlink in `setup` MUST go through `_link_or_copy`. Enforced by `test/setup-windows-fallback.test.ts`'s static invariant — a single raw `ln` call outside the helper body fails CI.
- **Test coverage gap closed:** prior to this wave, the curated Windows CI lane (`windows-free-tests.yml`) didn't exercise the install-symlink path, the Unicode sanitization, the build-script shell compat, or doc-config drift. All four now run on every PR.
- **Out of scope (P2 follow-ups):** pushing sanitization deeper to `browse/src/snapshot.ts` (covers WebSocket frames that don't transit `cr.result`); porting the 24 POSIX-bound free tests to run on Windows (tracked in `windows-free-tests.yml`'s own comments).

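A rough shell equivalent of the setup static invariant reads as follows. This is a hedged sketch — the real check is TypeScript in `test/setup-windows-fallback.test.ts`, and the toy `setup` file below is illustrative:

```shell
# Toy stand-in for ./setup — the real test parses the actual script.
cat > /tmp/demo-setup <<'EOF'
_link_or_copy() {
  ln -snf "$1" "$2"
}
_link_or_copy some-src some-dst
EOF

# Extract the helper body, then flag any `ln -snf` line outside it.
helper_body=$(awk '/^_link_or_copy\(\)/,/^\}/' /tmp/demo-setup)
stray=$(grep -n 'ln -snf' /tmp/demo-setup | grep -vF "$helper_body" || true)

if [ -z "$stray" ]; then
  echo "invariant holds"
else
  echo "raw ln outside helper:"
  echo "$stray"
fi
```

Appending a raw `ln -snf` line outside the helper makes `stray` non-empty, which is the failure the CI test turns into a build break.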
## [1.37.0.0] - 2026-05-14

## **Split-engine gbrain: remote MCP for brain, local PGLite for code.**

@@ -25,7 +85,6 @@ Source: `bun test test/gbrain-local-status.test.ts test/gbrain-detect-shape.test

#### Added

- `lib/gbrain-local-status.ts` — shared 5-state engine status classifier (`ok` / `no-cli` / `missing-config` / `broken-config` / `broken-db`) with 60s TTL cache and `--no-cache` flag. Probes via `gbrain sources list --json` + stderr classification reusing the exact patterns from `lib/gbrain-sources.ts:66-67`.
- `/setup-gbrain` Step 1.5 — broken-db remediation with 4 options (Retry / Switch to PGLite / Switch brain mode / Quit). PGLite switch is rollback-safe: `mv ~/.gbrain/config.json` to a timestamped `.bak`, `gbrain init --pglite`, on non-zero exit restore the .bak verbatim.
- `/setup-gbrain` Step 4.5 — Path 4 opt-in for local PGLite code search. Yes path runs `gstack-gbrain-install` (idempotent) + `gbrain init --pglite --json` with the same rollback semantics. No path keeps Path 4 as remote-MCP-only.

25
CLAUDE.md
@@ -269,6 +269,31 @@ to `~/.gstack/security/attempts.jsonl` via `tunnel-denial-log.ts`. Before editin
the module boundary (no imports from `token-registry.ts` into `sse-session-cookie.ts`)
is load-bearing for scope isolation.

**Unicode sanitization at server egress** (v1.38.0.0+). Every server egress that
ships page-content-derived strings MUST go through `JSON.stringify(payload,
sanitizeReplacer)` for object payloads or `sanitizeLoneSurrogates(body)` for text
bodies. Lone UTF-16 surrogate halves from CDP page content otherwise reach the
Anthropic API as `\uD800`-style escapes and trigger a 400. Wired at four egress
points today: `handleCommandInternal` (HTTP + batch via a sanitizing wrapper around
`handleCommandInternalImpl`) and both SSE producers (`/activity/stream`,
`/inspector/events`). Post-stringify regex is a no-op — `JSON.stringify` has
already escaped the surrogate before regex could match, so the replacer must run
inside the encoding pipeline. Before adding a new SSE/WebSocket writer or HTTP
response in `server.ts`, read
[ARCHITECTURE.md](ARCHITECTURE.md#unicode-sanitization-at-server-egress-v13800).
`browse/test/server-sanitize-surrogates.test.ts` pins the wiring with invariant
tests, so bypasses fail CI.

**Setup symlink hardening** (v1.38.0.0+). Every link site in `setup` MUST route
through the `_link_or_copy SRC DST` helper near the `IS_WINDOWS` detection. On
Windows without Developer Mode, plain `ln -snf` produces frozen file copies that
don't refresh on `git pull` — silent staleness across every host adapter. The
helper preserves `ln -snf` on Unix and switches to `cp -R` / `cp -f` on Windows.
`test/setup-windows-fallback.test.ts` enforces a static invariant: a single raw
`ln` call outside the helper body fails CI. Windows users get a one-line note
from `_print_windows_copy_note_once` reminding them to re-run `./setup` after
every `git pull`.

**Sidebar security stack** (layered defense against prompt injection):

| Layer | Module | Lives in |

@@ -342,6 +342,7 @@ When Conductor creates a new workspace, `bin/dev-setup` runs automatically. It d
- **Conductor workspaces are independent.** Each workspace is its own git worktree. `bin/dev-setup` runs automatically via `conductor.json`.
- **`.env` propagates across worktrees.** Set it once in the main repo, all Conductor workspaces get it.
- **`.claude/skills/` is gitignored.** The symlinks never get committed.
- **Never write raw `ln -snf` in `setup`.** Every link site in `setup` MUST route through the `_link_or_copy SRC DST` helper near the `IS_WINDOWS` detection. The helper preserves `ln -snf` on Unix and switches to `cp -R` / `cp -f` on Windows without Developer Mode, where plain `ln -snf` produces frozen file copies that don't refresh on `git pull`. `test/setup-windows-fallback.test.ts` enforces this with a static invariant — a single raw `ln` call outside the helper body fails CI.

## Testing your changes in a real project

@@ -459,6 +459,8 @@ Data is stored in [Supabase](https://supabase.com) (open source Firebase alterna

**Windows users:** gstack works on Windows 11 via Git Bash or WSL. Node.js is required in addition to Bun — Bun has a known bug with Playwright's pipe transport on Windows ([bun#4253](https://github.com/oven-sh/bun/issues/4253)). The browse server automatically falls back to Node.js. Make sure both `bun` and `node` are on your PATH.

On Windows without Developer Mode (MSYS2 / Git Bash), `setup` falls back to file copies instead of symlinks because `ln -snf` produces frozen copies that don't refresh on `git pull`. **Re-run `cd ~/.claude/skills/gstack && ./setup` after every `git pull`** so your skill files match the repo. `setup` prints a one-line note reminding you. Unix and WSL keep symlinks and don't need the re-run.

**Claude says it can't see the skills?** Make sure your project's `CLAUDE.md` has a gstack section. Add this:

```
```
@@ -1,3 +0,0 @@
self-hosted-runner:
  labels:
    - ubicloud-standard-2

@@ -59,6 +59,43 @@ import * as net from 'net';
import * as path from 'path';
import * as crypto from 'crypto';

// ─── Unicode Sanitization ───────────────────────────────────────
// Remove unpaired UTF-16 surrogate halves (\uD800–\uDFFF). Page DOM text,
// OCR output, and other CDP-sourced strings can contain lone surrogates;
// JSON consumers downstream (Anthropic API in particular) reject them with
// "no low surrogate in string". Valid surrogate pairs (e.g. emoji) survive
// unchanged. Lone halves become U+FFFD (�).
//
// INVARIANT: every server egress path that ships page-content strings MUST
// route through this sanitizer. handleCommandInternal wraps the final
// cr.result string (text/plain bodies carry lone surrogates verbatim;
// JSON.stringify already escapes them). The two SSE producers below
// stringify with `sanitizeReplacer` so payload string fields get cleaned
// BEFORE escaping. Plain post-stringify regex is a no-op there because
// JSON.stringify converts \uD800 → "\\ud800" — the regex can't see the
// surrogate after that point.
function sanitizeLoneSurrogates(str: string): string {
  return str.replace(/[\uD800-\uDFFF]/g, (match, offset) => {
    const code = match.charCodeAt(0);
    if (code >= 0xD800 && code <= 0xDBFF) {
      const next = str.charCodeAt(offset + 1);
      if (next >= 0xDC00 && next <= 0xDFFF) return match;
    }
    if (code >= 0xDC00 && code <= 0xDFFF) {
      const prev = str.charCodeAt(offset - 1);
      if (prev >= 0xD800 && prev <= 0xDBFF) return match;
    }
    return '\uFFFD';
  });
}

// JSON.stringify replacer that sanitizes string values before they get
// escape-encoded. Pair with stringify when the consumer will JSON.parse the
// payload back into JS strings (SSE clients do this).
function sanitizeReplacer(_key: string, value: unknown): unknown {
  return typeof value === 'string' ? sanitizeLoneSurrogates(value) : value;
}

// ─── Config ─────────────────────────────────────────────────────
const config = resolveConfig();
ensureStateDir(config);
@@ -683,7 +720,7 @@ interface CommandResult {
 * skipActivity: true when called from chain (chain emits 1 event for all subcommands)
 * chainDepth: recursion guard — reject nested chains (depth > 0 means inside a chain)
 */
async function handleCommandInternal(
async function handleCommandInternalImpl(
  body: { command: string; args?: string[]; tabId?: number },
  tokenInfo?: TokenInfo | null,
  opts?: { skipRateCheck?: boolean; skipActivity?: boolean; chainDepth?: number },
@@ -1027,6 +1064,21 @@ async function handleCommandInternal(
  }
}

/**
 * Sanitizing wrapper around handleCommandInternalImpl. ALL callers (single-command
 * HTTP, batch loop, scoped-token dispatch) go through this so the lone-surrogate
 * sanitization happens once at the architectural choke point, not per-leaf.
 * Do not bypass this by calling handleCommandInternalImpl directly.
 */
async function handleCommandInternal(
  body: { command: string; args?: string[]; tabId?: number },
  tokenInfo?: TokenInfo | null,
  opts?: { skipRateCheck?: boolean; skipActivity?: boolean; chainDepth?: number },
): Promise<CommandResult> {
  const cr = await handleCommandInternalImpl(body, tokenInfo, opts);
  return { ...cr, result: sanitizeLoneSurrogates(cr.result) };
}

/** HTTP wrapper — converts CommandResult to Response */
async function handleCommand(body: any, tokenInfo?: TokenInfo | null): Promise<Response> {
  const cr = await handleCommandInternal(body, tokenInfo);
@@ -1827,19 +1879,24 @@ export async function start() {

    const stream = new ReadableStream({
      start(controller) {
        // SSE egress invariant: every JSON.stringify here ships page-content-derived
        // fields (URLs, command args, errors) to the sidebar. Lone surrogates must
        // be sanitized DURING stringify (via sanitizeReplacer) so they're cleaned
        // before escape-encoding — post-stringify regex is ineffective because
        // JSON.stringify has already converted \uD800 → "\\ud800".
        // 1. Gap detection + replay
        const { entries, gap, gapFrom, availableFrom } = getActivityAfter(afterId);
        if (gap) {
          controller.enqueue(encoder.encode(`event: gap\ndata: ${JSON.stringify({ gapFrom, availableFrom })}\n\n`));
          controller.enqueue(encoder.encode(`event: gap\ndata: ${JSON.stringify({ gapFrom, availableFrom }, sanitizeReplacer)}\n\n`));
        }
        for (const entry of entries) {
          controller.enqueue(encoder.encode(`event: activity\ndata: ${JSON.stringify(entry)}\n\n`));
          controller.enqueue(encoder.encode(`event: activity\ndata: ${JSON.stringify(entry, sanitizeReplacer)}\n\n`));
        }

        // 2. Subscribe for live events
        const unsubscribe = subscribe((entry) => {
          try {
            controller.enqueue(encoder.encode(`event: activity\ndata: ${JSON.stringify(entry)}\n\n`));
            controller.enqueue(encoder.encode(`event: activity\ndata: ${JSON.stringify(entry, sanitizeReplacer)}\n\n`));
          } catch (err: any) {
            console.debug('[browse] Activity SSE stream error, unsubscribing:', err.message);
            unsubscribe();
@@ -2188,10 +2245,15 @@ export async function start() {
    const encoder = new TextEncoder();
    const stream = new ReadableStream({
      start(controller) {
        // SSE egress invariant: inspectorData and CDP event payloads carry
        // page-DOM strings (selectors, attribute values, console messages).
        // sanitizeReplacer cleans lone surrogates DURING JSON.stringify so
        // they're neutralized before escape-encoding (post-stringify regex
        // is a no-op once \uD800 has become "\\ud800").
        // Send current state immediately
        if (inspectorData) {
          controller.enqueue(encoder.encode(
            `event: state\ndata: ${JSON.stringify({ data: inspectorData, timestamp: inspectorTimestamp })}\n\n`
            `event: state\ndata: ${JSON.stringify({ data: inspectorData, timestamp: inspectorTimestamp }, sanitizeReplacer)}\n\n`
          ));
        }

@@ -2199,7 +2261,7 @@ export async function start() {
        const notify: InspectorSubscriber = (event) => {
          try {
            controller.enqueue(encoder.encode(
              `event: inspector\ndata: ${JSON.stringify(event)}\n\n`
              `event: inspector\ndata: ${JSON.stringify(event, sanitizeReplacer)}\n\n`
            ));
          } catch (err: any) {
            console.debug('[browse] Inspector SSE stream error:', err.message);

129
browse/test/server-sanitize-surrogates.test.ts
Normal file
@@ -0,0 +1,129 @@
import { describe, test, expect } from 'bun:test';
import * as fs from 'fs';
import * as path from 'path';

// The sanitizer is module-private in server.ts. Rather than refactor it to a
// separate module just for testing, we extract its source via a regex slice and
// eval it in a fresh function scope. Keeps the production layout untouched.
const SERVER_PATH = path.resolve(import.meta.dir, '..', 'src', 'server.ts');
const SERVER_SRC = fs.readFileSync(SERVER_PATH, 'utf-8');

const fnMatch = SERVER_SRC.match(
  /function sanitizeLoneSurrogates\(str: string\): string \{[\s\S]*?\n\}/
);
if (!fnMatch) throw new Error('Could not locate sanitizeLoneSurrogates in server.ts');

// Strip TS annotations so eval works under plain JS.
const jsSrc = fnMatch[0].replace('(str: string): string', '(str)');
const sanitizeLoneSurrogates = new Function(`${jsSrc}\nreturn sanitizeLoneSurrogates;`)() as (
  s: string,
) => string;

describe('sanitizeLoneSurrogates — unit cases', () => {
  test('passthrough ASCII', () => {
    expect(sanitizeLoneSurrogates('hello')).toBe('hello');
  });

  test('passthrough empty string', () => {
    expect(sanitizeLoneSurrogates('')).toBe('');
  });

  test('preserves valid surrogate pair (U+1F389 🎉)', () => {
    expect(sanitizeLoneSurrogates('hi 🎉')).toBe('hi 🎉');
  });

  test('replaces lone high surrogate mid-string', () => {
    expect(sanitizeLoneSurrogates('a\uD800b')).toBe('a\uFFFDb');
  });

  test('replaces lone low surrogate mid-string', () => {
    expect(sanitizeLoneSurrogates('a\uDC00b')).toBe('a\uFFFDb');
  });

  test('replaces trailing lone high at end of string', () => {
    expect(sanitizeLoneSurrogates('a\uD800')).toBe('a\uFFFD');
  });

  test('replaces leading lone low at start of string', () => {
    expect(sanitizeLoneSurrogates('\uDC00b')).toBe('\uFFFDb');
  });

  test('replaces two adjacent lone highs', () => {
    expect(sanitizeLoneSurrogates('\uD800\uD800')).toBe('\uFFFD\uFFFD');
  });

  test('replaces two adjacent lone lows', () => {
    expect(sanitizeLoneSurrogates('\uDC00\uDC00')).toBe('\uFFFD\uFFFD');
  });

  test('preserves valid pair followed by lone low', () => {
    // 𐀀 = U+10000 (a valid surrogate pair), then a separate lone low.
    const input = '𐀀\uDC00';
    const output = sanitizeLoneSurrogates(input);
    // Valid pair intact, trailing lone low replaced.
    expect(output).toBe('𐀀\uFFFD');
  });

  test('preserves valid pair preceded by lone low', () => {
    const input = '\uDC00𐀀';
    const output = sanitizeLoneSurrogates(input);
    expect(output).toBe('\uFFFD𐀀');
  });
});

describe('sanitizeLoneSurrogates — bug-repro (D5)', () => {
  // Pin the regression intent: a future refactor that drops sanitization
  // must fail this test even if happy-path tests still pass.
  test('unsanitized lone surrogate causes UTF-8 encode to substitute, sanitized version is stable', () => {
    const badPayload = 'page content\uD800more content';

    // Buffer.from(str, 'utf-8') silently substitutes invalid sequences with
    // EF BF BD (U+FFFD). Round-trip is therefore lossy for lone surrogates.
    const roundTrippedRaw = Buffer.from(badPayload, 'utf-8').toString('utf-8');
    expect(roundTrippedRaw).not.toBe(badPayload); // proves the bug exists pre-sanitize

    // After sanitization the round-trip is stable.
    const sanitized = sanitizeLoneSurrogates(badPayload);
    const roundTrippedSanitized = Buffer.from(sanitized, 'utf-8').toString('utf-8');
    expect(roundTrippedSanitized).toBe(sanitized);
  });

  test('JSON.parse(JSON.stringify(...)) round-trip is stable after sanitization', () => {
    // Anthropic's API path wraps the response body in a tool_result JSON
    // object. JSON.stringify CAN encode a lone surrogate (escapes it), but
    // some downstream consumers reject the resulting body.
    const badPayload = 'before\uD800after';
    const sanitized = sanitizeLoneSurrogates(badPayload);
    const wrapped = JSON.stringify({ content: sanitized });
    const reparsed = JSON.parse(wrapped) as { content: string };
    // .toBe(sanitized) already proves the surrogate was replaced; the
    // additional explicit check below documents the specific code points.
    expect(reparsed.content).toBe(sanitized);
    expect(reparsed.content.charCodeAt(6)).toBe(0xfffd); // � not \uD800
  });
});

describe('sanitizeLoneSurrogates — wiring invariants', () => {
  test('server.ts wraps every command result through handleCommandInternal', () => {
    // The architectural choice is to wrap once at handleCommandInternal so
    // both single-command HTTP and the batch loop inherit. If a future
    // refactor moves sanitization back to handleCommand only, this test
    // fails by detecting the missing wrapper.
    expect(SERVER_SRC).toContain('async function handleCommandInternalImpl(');
    expect(SERVER_SRC).toContain('result: sanitizeLoneSurrogates(cr.result)');
  });

  test('SSE activity feed sanitizes outbound frames via sanitizeReplacer', () => {
    // Replacer must run DURING stringify; post-stringify regex is ineffective
    // because JSON.stringify converts \uD800 → "\\ud800" before our regex sees it.
    expect(SERVER_SRC).toContain('JSON.stringify(entry, sanitizeReplacer)');
  });

  test('SSE inspector stream sanitizes outbound frames via sanitizeReplacer', () => {
    expect(SERVER_SRC).toContain('JSON.stringify(event, sanitizeReplacer)');
  });

  test('sanitizeReplacer is a function defined in server.ts', () => {
    expect(SERVER_SRC).toContain('function sanitizeReplacer(');
  });
});

@@ -23,7 +23,7 @@ This pulls the repo into `~/.gstack/` and re-registers merge drivers.

If you don't want to restore here, dismiss the hint with:
```bash
gstack-config set gbrain_sync_mode_prompted true
gstack-config set artifacts_sync_mode_prompted true
```

---
@@ -200,7 +200,7 @@ canonical config files from the brain repo.

1. `gstack-brain-sync --status` — is mode `off`?
2. `~/.gstack/.git` exists?
3. `gstack-config get gbrain_sync_mode` — should be `full` or `artifacts-only`.
3. `gstack-config get artifacts_sync_mode` — should be `full` or `artifacts-only`.
4. The file you expect to sync — is it in the allowlist?
   `cat ~/.gstack/.brain-allowlist`
5. Privacy class filter — if mode is `artifacts-only`, behavioral files

@@ -59,7 +59,7 @@ privacy mode:
- **Only artifacts**: plans, designs, retros, learnings — skip
  behavioral data (timelines, developer profile).
- **Decline**: keep everything local. You can turn sync on later with
  `gstack-config set gbrain_sync_mode full`.
  `gstack-config set artifacts_sync_mode full`.

Your answer is persisted. You won't be asked again.

@@ -107,8 +107,8 @@ output. Scan it for problems.

Change anytime with:
```bash
gstack-config set gbrain_sync_mode full
gstack-config set gbrain_sync_mode off
gstack-config set artifacts_sync_mode full
gstack-config set artifacts_sync_mode off
```

## Secret protection
@@ -170,7 +170,7 @@ gstack-brain-uninstall
This:

- Removes `~/.gstack/.git/` and all `.brain-*` config files.
- Clears `gbrain_sync_mode` in `gstack-config`.
- Clears `artifacts_sync_mode` in `gstack-config`.
- Does NOT touch your learnings, plans, retros, or developer profile.

Add `--delete-remote` to also delete the private GitHub repo (GitHub only,

@@ -1,6 +1,6 @@
{
  "name": "gstack",
  "version": "1.37.0.0",
  "version": "1.38.0.0",
  "description": "Garry's Stack — Claude Code skills + fast headless browser. One repo, one install, entire AI engineering workflow.",
  "license": "MIT",
  "type": "module",
@@ -9,7 +9,7 @@
    "make-pdf": "./make-pdf/dist/pdf"
  },
  "scripts": {
    "build": "bun run vendor:xterm && bun run gen:skill-docs --host all; bun build --compile browse/src/cli.ts --outfile browse/dist/browse && bun build --compile browse/src/find-browse.ts --outfile browse/dist/find-browse && bun build --compile design/src/cli.ts --outfile design/dist/design && bun build --compile make-pdf/src/cli.ts --outfile make-pdf/dist/pdf && bun build --compile bin/gstack-global-discover.ts --outfile bin/gstack-global-discover && bash browse/scripts/build-node-server.sh && { git rev-parse HEAD 2>/dev/null || true; } > browse/dist/.version && { git rev-parse HEAD 2>/dev/null || true; } > design/dist/.version && { git rev-parse HEAD 2>/dev/null || true; } > make-pdf/dist/.version && chmod +x browse/dist/browse browse/dist/find-browse design/dist/design make-pdf/dist/pdf bin/gstack-global-discover && (rm -f .*.bun-build || true)",
    "build": "bun run vendor:xterm && bun run gen:skill-docs --host all; bun build --compile browse/src/cli.ts --outfile browse/dist/browse && bun build --compile browse/src/find-browse.ts --outfile browse/dist/find-browse && bun build --compile design/src/cli.ts --outfile design/dist/design && bun build --compile make-pdf/src/cli.ts --outfile make-pdf/dist/pdf && bun build --compile bin/gstack-global-discover.ts --outfile bin/gstack-global-discover && bash browse/scripts/build-node-server.sh && ( git rev-parse HEAD 2>/dev/null || true ) > browse/dist/.version && ( git rev-parse HEAD 2>/dev/null || true ) > design/dist/.version && ( git rev-parse HEAD 2>/dev/null || true ) > make-pdf/dist/.version && chmod +x browse/dist/browse browse/dist/find-browse design/dist/design make-pdf/dist/pdf bin/gstack-global-discover && (rm -f .*.bun-build || true)",
    "vendor:xterm": "mkdir -p extension/lib && cp node_modules/xterm/lib/xterm.js extension/lib/xterm.js && cp node_modules/xterm/css/xterm.css extension/lib/xterm.css && cp node_modules/xterm-addon-fit/lib/xterm-addon-fit.js extension/lib/xterm-addon-fit.js",
    "dev:make-pdf": "bun run make-pdf/src/cli.ts",
    "dev:design": "bun run design/src/cli.ts",

137
setup
@@ -30,6 +30,47 @@ case "$(uname -s)" in
  MINGW*|MSYS*|CYGWIN*|Windows_NT) IS_WINDOWS=1 ;;
esac

# ─── Symlink-or-copy helper ───────────────────────────────────
# On macOS/Linux: create a symlink (existing behavior).
# On Windows without Developer Mode (MSYS2/Git Bash): plain ln -snf silently
# creates a frozen file copy that doesn't refresh after `git pull`. We use
# explicit `cp -R` / `cp -f` so the user gets a real copy and the staleness
# is reportable (re-run ./setup after pull). Auto-detects file vs dir.
#
# INVARIANT: every symlink in this script MUST route through this helper.
# A raw ln call here will be caught by test/setup-windows-fallback.test.ts
# (the static-invariant assertion D7).
_link_or_copy() {
  local src="$1"
  local dst="$2"
  if [ "$IS_WINDOWS" -eq 1 ]; then
    rm -rf "$dst"
    # Unix `ln -snf` accepts a name-only or relative-path source even when the
    # target doesn't resolve from CWD (e.g. the connect-chrome alias points at
    # the sibling-relative "gstack/open-gstack-browser"). On Windows the
    # equivalent semantics don't exist — we'd need a real source on disk to
    # copy. Skip the alias quietly rather than aborting setup under `set -e`.
    if [ ! -e "$src" ]; then
      return 0
    fi
    if [ -d "$src" ]; then
      cp -R "$src" "$dst"
    else
      cp -f "$src" "$dst"
    fi
  else
    ln -snf "$src" "$dst"
  fi
}

_WINDOWS_COPY_NOTE_PRINTED=0
_print_windows_copy_note_once() {
  if [ "$IS_WINDOWS" -eq 1 ] && [ "$_WINDOWS_COPY_NOTE_PRINTED" -eq 0 ]; then
    echo " note: Windows install uses file copies (no Developer Mode required). Re-run ./setup after every 'git pull' to refresh skill files."
    _WINDOWS_COPY_NOTE_PRINTED=1
  fi
}

# ─── Quiet mode helper ────────────────────────────────────────
QUIET=0
log() { [ "$QUIET" -eq 0 ] && echo "$@" || true; }
@@ -401,12 +442,13 @@ link_claude_skill_dirs() {
      mkdir -p "$target"
      # Validate target isn't a symlink before creating the link
      if [ -L "$target/SKILL.md" ]; then rm "$target/SKILL.md"; fi
      ln -snf "$gstack_dir/$dir_name/SKILL.md" "$target/SKILL.md"
      _link_or_copy "$gstack_dir/$dir_name/SKILL.md" "$target/SKILL.md"
      linked+=("$link_name")
    fi
  done
  if [ ${#linked[@]} -gt 0 ]; then
    echo " linked skills: ${linked[*]}"
    _print_windows_copy_note_once
  fi
}

@@ -442,6 +484,13 @@ cleanup_old_claude_symlinks() {
          removed+=("$skill_name")
          ;;
      esac
      # Windows install pattern: real dir with real-file SKILL.md (no symlink
      # available, so we can't readlink to verify provenance). The outer loop
      # iterates known gstack skill names from "$gstack_dir"/*, so a name match
      # plus IS_WINDOWS is safe to treat as gstack-managed during a mode flip.
      elif [ "$IS_WINDOWS" -eq 1 ] && [ -d "$old_target" ] && [ -f "$old_target/SKILL.md" ]; then
        rm -rf "$old_target"
        removed+=("$skill_name")
      fi
    fi
  done
@@ -483,6 +532,12 @@ cleanup_prefixed_claude_symlinks() {
          removed+=("gstack-$skill_name")
          ;;
      esac
      # Windows install pattern: real dir with real-file SKILL.md. Same
      # reasoning as cleanup_old_claude_symlinks — directory name match plus
      # IS_WINDOWS is safe during a mode flip.
      elif [ "$IS_WINDOWS" -eq 1 ] && [ -d "$prefixed_target" ] && [ -f "$prefixed_target/SKILL.md" ]; then
        rm -rf "$prefixed_target"
        removed+=("gstack-$skill_name")
      fi
    fi
  done
@@ -520,7 +575,7 @@ link_codex_skill_dirs() {
      target="$skills_dir/$skill_name"
      # Create or update symlink
      if [ -L "$target" ] || [ ! -e "$target" ]; then
        ln -snf "$skill_dir" "$target"
        _link_or_copy "$skill_dir" "$target"
        linked+=("$skill_name")
      fi
    fi
@@ -545,7 +600,7 @@ create_agents_sidecar() {
    local dst="$agents_gstack/$asset"
    if [ -d "$src" ] || [ -f "$src" ]; then
      if [ -L "$dst" ] || [ ! -e "$dst" ]; then
        ln -snf "$src" "$dst"
        _link_or_copy "$src" "$dst"
      fi
    fi
  done
@@ -556,7 +611,7 @@ create_agents_sidecar() {
local dst="$agents_gstack/$file"
if [ -f "$src" ]; then
if [ -L "$dst" ] || [ ! -e "$dst" ]; then
-ln -snf "$src" "$dst"
+_link_or_copy "$src" "$dst"
fi
fi
done
@@ -582,29 +637,29 @@ create_codex_runtime_root() {
mkdir -p "$codex_gstack" "$codex_gstack/browse" "$codex_gstack/gstack-upgrade" "$codex_gstack/review"

if [ -f "$agents_dir/gstack/SKILL.md" ]; then
-ln -snf "$agents_dir/gstack/SKILL.md" "$codex_gstack/SKILL.md"
+_link_or_copy "$agents_dir/gstack/SKILL.md" "$codex_gstack/SKILL.md"
fi
if [ -d "$gstack_dir/bin" ]; then
-ln -snf "$gstack_dir/bin" "$codex_gstack/bin"
+_link_or_copy "$gstack_dir/bin" "$codex_gstack/bin"
fi
if [ -d "$gstack_dir/browse/dist" ]; then
-ln -snf "$gstack_dir/browse/dist" "$codex_gstack/browse/dist"
+_link_or_copy "$gstack_dir/browse/dist" "$codex_gstack/browse/dist"
fi
if [ -d "$gstack_dir/browse/bin" ]; then
-ln -snf "$gstack_dir/browse/bin" "$codex_gstack/browse/bin"
+_link_or_copy "$gstack_dir/browse/bin" "$codex_gstack/browse/bin"
fi
if [ -f "$agents_dir/gstack-upgrade/SKILL.md" ]; then
-ln -snf "$agents_dir/gstack-upgrade/SKILL.md" "$codex_gstack/gstack-upgrade/SKILL.md"
+_link_or_copy "$agents_dir/gstack-upgrade/SKILL.md" "$codex_gstack/gstack-upgrade/SKILL.md"
fi
# Review runtime assets (individual files, NOT the whole review/ dir which has SKILL.md)
for f in checklist.md design-checklist.md greptile-triage.md TODOS-format.md; do
if [ -f "$gstack_dir/review/$f" ]; then
-ln -snf "$gstack_dir/review/$f" "$codex_gstack/review/$f"
+_link_or_copy "$gstack_dir/review/$f" "$codex_gstack/review/$f"
fi
done
# ETHOS.md — referenced by "Search Before Building" in all skill preambles
if [ -f "$gstack_dir/ETHOS.md" ]; then
-ln -snf "$gstack_dir/ETHOS.md" "$codex_gstack/ETHOS.md"
+_link_or_copy "$gstack_dir/ETHOS.md" "$codex_gstack/ETHOS.md"
fi
}

@@ -622,27 +677,27 @@ create_factory_runtime_root() {
mkdir -p "$factory_gstack" "$factory_gstack/browse" "$factory_gstack/gstack-upgrade" "$factory_gstack/review"

if [ -f "$factory_dir/gstack/SKILL.md" ]; then
-ln -snf "$factory_dir/gstack/SKILL.md" "$factory_gstack/SKILL.md"
+_link_or_copy "$factory_dir/gstack/SKILL.md" "$factory_gstack/SKILL.md"
fi
if [ -d "$gstack_dir/bin" ]; then
-ln -snf "$gstack_dir/bin" "$factory_gstack/bin"
+_link_or_copy "$gstack_dir/bin" "$factory_gstack/bin"
fi
if [ -d "$gstack_dir/browse/dist" ]; then
-ln -snf "$gstack_dir/browse/dist" "$factory_gstack/browse/dist"
+_link_or_copy "$gstack_dir/browse/dist" "$factory_gstack/browse/dist"
fi
if [ -d "$gstack_dir/browse/bin" ]; then
-ln -snf "$gstack_dir/browse/bin" "$factory_gstack/browse/bin"
+_link_or_copy "$gstack_dir/browse/bin" "$factory_gstack/browse/bin"
fi
if [ -f "$factory_dir/gstack-upgrade/SKILL.md" ]; then
-ln -snf "$factory_dir/gstack-upgrade/SKILL.md" "$factory_gstack/gstack-upgrade/SKILL.md"
+_link_or_copy "$factory_dir/gstack-upgrade/SKILL.md" "$factory_gstack/gstack-upgrade/SKILL.md"
fi
for f in checklist.md design-checklist.md greptile-triage.md TODOS-format.md; do
if [ -f "$gstack_dir/review/$f" ]; then
-ln -snf "$gstack_dir/review/$f" "$factory_gstack/review/$f"
+_link_or_copy "$gstack_dir/review/$f" "$factory_gstack/review/$f"
fi
done
if [ -f "$gstack_dir/ETHOS.md" ]; then
-ln -snf "$gstack_dir/ETHOS.md" "$factory_gstack/ETHOS.md"
+_link_or_copy "$gstack_dir/ETHOS.md" "$factory_gstack/ETHOS.md"
fi
}

@@ -660,42 +715,42 @@ create_opencode_runtime_root() {
mkdir -p "$opencode_gstack" "$opencode_gstack/browse" "$opencode_gstack/design" "$opencode_gstack/gstack-upgrade" "$opencode_gstack/review" "$opencode_gstack/qa" "$opencode_gstack/plan-devex-review"

if [ -f "$opencode_dir/gstack/SKILL.md" ]; then
-ln -snf "$opencode_dir/gstack/SKILL.md" "$opencode_gstack/SKILL.md"
+_link_or_copy "$opencode_dir/gstack/SKILL.md" "$opencode_gstack/SKILL.md"
fi
if [ -d "$gstack_dir/bin" ]; then
-ln -snf "$gstack_dir/bin" "$opencode_gstack/bin"
+_link_or_copy "$gstack_dir/bin" "$opencode_gstack/bin"
fi
if [ -d "$gstack_dir/browse/dist" ]; then
-ln -snf "$gstack_dir/browse/dist" "$opencode_gstack/browse/dist"
+_link_or_copy "$gstack_dir/browse/dist" "$opencode_gstack/browse/dist"
fi
if [ -d "$gstack_dir/browse/bin" ]; then
-ln -snf "$gstack_dir/browse/bin" "$opencode_gstack/browse/bin"
+_link_or_copy "$gstack_dir/browse/bin" "$opencode_gstack/browse/bin"
fi
if [ -d "$gstack_dir/design/dist" ]; then
-ln -snf "$gstack_dir/design/dist" "$opencode_gstack/design/dist"
+_link_or_copy "$gstack_dir/design/dist" "$opencode_gstack/design/dist"
fi
if [ -f "$opencode_dir/gstack-upgrade/SKILL.md" ]; then
-ln -snf "$opencode_dir/gstack-upgrade/SKILL.md" "$opencode_gstack/gstack-upgrade/SKILL.md"
+_link_or_copy "$opencode_dir/gstack-upgrade/SKILL.md" "$opencode_gstack/gstack-upgrade/SKILL.md"
fi
for f in checklist.md design-checklist.md greptile-triage.md TODOS-format.md; do
if [ -f "$gstack_dir/review/$f" ]; then
-ln -snf "$gstack_dir/review/$f" "$opencode_gstack/review/$f"
+_link_or_copy "$gstack_dir/review/$f" "$opencode_gstack/review/$f"
fi
done
if [ -d "$gstack_dir/review/specialists" ]; then
-ln -snf "$gstack_dir/review/specialists" "$opencode_gstack/review/specialists"
+_link_or_copy "$gstack_dir/review/specialists" "$opencode_gstack/review/specialists"
fi
if [ -d "$gstack_dir/qa/templates" ]; then
-ln -snf "$gstack_dir/qa/templates" "$opencode_gstack/qa/templates"
+_link_or_copy "$gstack_dir/qa/templates" "$opencode_gstack/qa/templates"
fi
if [ -d "$gstack_dir/qa/references" ]; then
-ln -snf "$gstack_dir/qa/references" "$opencode_gstack/qa/references"
+_link_or_copy "$gstack_dir/qa/references" "$opencode_gstack/qa/references"
fi
if [ -f "$gstack_dir/plan-devex-review/dx-hall-of-fame.md" ]; then
-ln -snf "$gstack_dir/plan-devex-review/dx-hall-of-fame.md" "$opencode_gstack/plan-devex-review/dx-hall-of-fame.md"
+_link_or_copy "$gstack_dir/plan-devex-review/dx-hall-of-fame.md" "$opencode_gstack/plan-devex-review/dx-hall-of-fame.md"
fi
if [ -f "$gstack_dir/ETHOS.md" ]; then
-ln -snf "$gstack_dir/ETHOS.md" "$opencode_gstack/ETHOS.md"
+_link_or_copy "$gstack_dir/ETHOS.md" "$opencode_gstack/ETHOS.md"
fi
}

@@ -721,7 +776,7 @@ link_factory_skill_dirs() {
[ "$skill_name" = "gstack" ] && continue
target="$skills_dir/$skill_name"
if [ -L "$target" ] || [ ! -e "$target" ]; then
-ln -snf "$skill_dir" "$target"
+_link_or_copy "$skill_dir" "$target"
linked+=("$skill_name")
fi
fi
@@ -753,7 +808,7 @@ link_opencode_skill_dirs() {
[ "$skill_name" = "gstack" ] && continue
target="$skills_dir/$skill_name"
if [ -L "$target" ] || [ ! -e "$target" ]; then
-ln -snf "$skill_dir" "$target"
+_link_or_copy "$skill_dir" "$target"
linked+=("$skill_name")
fi
fi
@@ -796,7 +851,7 @@ if [ "$INSTALL_CLAUDE" -eq 1 ]; then
_OGB_LINK="$INSTALL_SKILLS_DIR/gstack-connect-chrome"
fi
if [ -L "$_OGB_LINK" ] || [ ! -e "$_OGB_LINK" ]; then
-ln -snf "gstack/open-gstack-browser" "$_OGB_LINK"
+_link_or_copy "gstack/open-gstack-browser" "$_OGB_LINK"
fi
if [ "$LOCAL_INSTALL" -eq 1 ]; then
log "gstack ready (project-local)."
@@ -842,7 +897,7 @@ if [ "$INSTALL_CLAUDE" -eq 1 ]; then
log " browse: $BROWSE_BIN"
else
mkdir -p "$CLAUDE_SKILLS_DIR"
-ln -snf "$SOURCE_GSTACK_DIR" "$CLAUDE_GSTACK_LINK"
+_link_or_copy "$SOURCE_GSTACK_DIR" "$CLAUDE_GSTACK_LINK"
log " symlinked $CLAUDE_GSTACK_LINK -> $SOURCE_GSTACK_DIR"
INSTALL_SKILLS_DIR="$CLAUDE_SKILLS_DIR"
INSTALL_GSTACK_DIR="$CLAUDE_GSTACK_LINK"
@@ -863,7 +918,7 @@ if [ "$INSTALL_CLAUDE" -eq 1 ]; then
_OGB_LINK="$INSTALL_SKILLS_DIR/gstack-connect-chrome"
fi
if [ -L "$_OGB_LINK" ] || [ ! -e "$_OGB_LINK" ]; then
-ln -snf "gstack/open-gstack-browser" "$_OGB_LINK"
+_link_or_copy "gstack/open-gstack-browser" "$_OGB_LINK"
fi
log "gstack ready (claude)."
log " browse: $BROWSE_BIN"
@@ -903,21 +958,21 @@ if [ "$INSTALL_KIRO" -eq 1 ]; then
# Remove old whole-dir symlink from previous installs
[ -L "$KIRO_GSTACK" ] && rm -f "$KIRO_GSTACK"
mkdir -p "$KIRO_GSTACK" "$KIRO_GSTACK/browse" "$KIRO_GSTACK/gstack-upgrade" "$KIRO_GSTACK/review"
-ln -snf "$SOURCE_GSTACK_DIR/bin" "$KIRO_GSTACK/bin"
-ln -snf "$SOURCE_GSTACK_DIR/browse/dist" "$KIRO_GSTACK/browse/dist"
-ln -snf "$SOURCE_GSTACK_DIR/browse/bin" "$KIRO_GSTACK/browse/bin"
+_link_or_copy "$SOURCE_GSTACK_DIR/bin" "$KIRO_GSTACK/bin"
+_link_or_copy "$SOURCE_GSTACK_DIR/browse/dist" "$KIRO_GSTACK/browse/dist"
+_link_or_copy "$SOURCE_GSTACK_DIR/browse/bin" "$KIRO_GSTACK/browse/bin"
# ETHOS.md — referenced by "Search Before Building" in all skill preambles
if [ -f "$SOURCE_GSTACK_DIR/ETHOS.md" ]; then
-ln -snf "$SOURCE_GSTACK_DIR/ETHOS.md" "$KIRO_GSTACK/ETHOS.md"
+_link_or_copy "$SOURCE_GSTACK_DIR/ETHOS.md" "$KIRO_GSTACK/ETHOS.md"
fi
# gstack-upgrade skill
if [ -f "$AGENTS_DIR/gstack-upgrade/SKILL.md" ]; then
-ln -snf "$AGENTS_DIR/gstack-upgrade/SKILL.md" "$KIRO_GSTACK/gstack-upgrade/SKILL.md"
+_link_or_copy "$AGENTS_DIR/gstack-upgrade/SKILL.md" "$KIRO_GSTACK/gstack-upgrade/SKILL.md"
fi
# Review runtime assets (individual files, not whole dir)
for f in checklist.md design-checklist.md greptile-triage.md TODOS-format.md; do
if [ -f "$SOURCE_GSTACK_DIR/review/$f" ]; then
-ln -snf "$SOURCE_GSTACK_DIR/review/$f" "$KIRO_GSTACK/review/$f"
+_link_or_copy "$SOURCE_GSTACK_DIR/review/$f" "$KIRO_GSTACK/review/$f"
fi
done

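The hunks above route every symlink through `_link_or_copy`, whose body is not shown in this diff view. A minimal sketch of what such a helper looks like, reconstructed from the commit message's description (Unix keeps `ln -snf`; Windows copies with `cp -R` for directories and `cp -f` for files; an unresolvable name-only alias returns 0 so `set -e` doesn't abort) — the real body in `setup` may differ in detail:

```shell
IS_WINDOWS=0   # the installer detects MSYS2/Git Bash and sets this to 1

# Sketch only: _link_or_copy SRC DST
_link_or_copy() {
  if [ "$IS_WINDOWS" -eq 1 ]; then
    # Name-only alias (e.g. gstack/open-gstack-browser) that doesn't resolve
    # on disk: succeed silently so `set -e` doesn't abort the install.
    [ -e "$1" ] || return 0
    if [ -d "$1" ]; then
      rm -rf "$2" && cp -R "$1" "$2"    # directories: recursive copy
    else
      cp -f "$1" "$2"                   # files: force-overwrite copy
    fi
  else
    ln -snf "$1" "$2"                   # Unix: original symlink semantics
  fi
}
```

The behavior-matrix test further down exercises exactly this file/dir × Windows/Unix dispatch by sourcing the real helper into a bash subprocess.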
40
test/build-script-shell-compat.test.ts
Normal file
@@ -0,0 +1,40 @@
import { describe, test, expect } from 'bun:test';
import * as fs from 'fs';
import * as path from 'path';

const ROOT = path.resolve(import.meta.dir, '..');
const PKG = JSON.parse(fs.readFileSync(path.join(ROOT, 'package.json'), 'utf-8')) as {
  scripts: Record<string, string>;
};

// Strip single-quoted strings so JS code emitted as `echo '{ ... }'` doesn't
// trip the shell-brace-group check. Conservative: only `'...'` segments.
function stripSingleQuoted(s: string): string {
  return s.replace(/'[^']*'/g, "''");
}

describe('package.json build scripts — POSIX shell compat (D-1460)', () => {
  // Bun's Windows shell parser doesn't grok bash brace groups `{ cmd; }`.
  // Subshells `( cmd )` are POSIX-universal. This test prevents regression.
  test('no bash brace groups in any npm script', () => {
    const offending: { script: string; pattern: string }[] = [];
    for (const [name, body] of Object.entries(PKG.scripts)) {
      const stripped = stripSingleQuoted(body);
      const match = stripped.match(/\{\s+[^}]*;\s*\}/);
      if (match) {
        offending.push({ script: name, pattern: match[0] });
      }
    }
    expect(offending).toEqual([]);
  });

  test('every `> path/.version` redirect is preceded by a subshell, not a brace group', () => {
    // The original PR #1460 target: package.json line 12 had three of these.
    const build = PKG.scripts.build ?? '';
    const versionRedirects = [...build.matchAll(/(\([^)]*\)|\{[^}]*\})\s*>\s*\S+\/\.version/g)];
    expect(versionRedirects.length).toBeGreaterThan(0);
    for (const m of versionRedirects) {
      expect(m[1].startsWith('(')).toBe(true);
    }
  });
});

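The brace-group hazard this test guards against is easy to reproduce at a prompt. A small sketch (file names are illustrative): grouping two commands before a redirect with a POSIX subshell works everywhere, whereas the bash brace-group form `{ cmd1; cmd2; } > file` is rejected by Bun's Windows shell parser:

```shell
# Subshell grouping before a redirect — POSIX-portable.
# The bash-only alternative `{ printf ...; printf ...; } > "$out"`
# is what the test above forbids in npm scripts.
out="$(mktemp)"
( printf 'name: gstack\n'; printf 'version: 1.38.0.0\n' ) > "$out"
cat "$out"
rm -f "$out"
```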
85
test/docs-config-keys.test.ts
Normal file
@@ -0,0 +1,85 @@
import { describe, test, expect } from 'bun:test';
import { spawnSync } from 'child_process';
import * as fs from 'fs';
import * as path from 'path';

const ROOT = path.resolve(import.meta.dir, '..');
const CONFIG_BIN = path.join(ROOT, 'bin', 'gstack-config');

// gstack-config accepts arbitrary keys (free-form YAML store), so we can't
// build an authoritative set of "valid keys" from the script. Instead, defend
// the specific invariant this wave introduces: deprecated keys must not
// reappear in user-facing docs. Extend the denylist as future renames happen.
const DEPRECATED_KEYS = new Set<string>([
  // Renamed to artifacts_sync_mode in v1.27.0.0, doc references re-deprecated
  // in v1.36.0.0 alongside the same rename of *_prompted.
  'gbrain_sync_mode',
  'gbrain_sync_mode_prompted',
]);

function scanDocsForConfigKeys(): { docPath: string; key: string; line: number }[] {
  const hits: { docPath: string; key: string; line: number }[] = [];
  const docsDir = path.join(ROOT, 'docs');
  // Recurse docs/ but skip dotfiles. CHANGELOG.md/TODOS.md are excluded by virtue
  // of being top-level; we only scan docs/**.
  const stack = [docsDir];
  while (stack.length) {
    const cur = stack.pop()!;
    for (const ent of fs.readdirSync(cur, { withFileTypes: true })) {
      if (ent.name.startsWith('.')) continue;
      const full = path.join(cur, ent.name);
      if (ent.isDirectory()) {
        stack.push(full);
        continue;
      }
      if (!ent.name.endsWith('.md')) continue;
      const text = fs.readFileSync(full, 'utf-8');
      const lines = text.split('\n');
      lines.forEach((line, idx) => {
        // Match `gstack-config set <key>` or `gstack-config get <key>`.
        for (const m of line.matchAll(/gstack-config\s+(?:set|get)\s+([a-z][a-z0-9_]*)/g)) {
          hits.push({ docPath: full, key: m[1], line: idx + 1 });
        }
      });
    }
  }
  return hits;
}

describe('docs ↔ gstack-config key drift guard', () => {
  test('docs/ references at least one config key (smoke)', () => {
    const hits = scanDocsForConfigKeys();
    expect(hits.length).toBeGreaterThan(0);
  });

  test('no doc references a deprecated config key', () => {
    const hits = scanDocsForConfigKeys();
    const stale = hits.filter((h) => DEPRECATED_KEYS.has(h.key));
    if (stale.length > 0) {
      console.error('Deprecated config keys referenced in docs:', stale);
    }
    expect(stale).toEqual([]);
  });

  // gstack-config is a bash script; Windows can't exec it via spawnSync
  // without a Git Bash interpreter shim. Skip on Windows — the deprecated-key
  // denylist test above already pins the v1.27.0.0 rename behavior at the
  // doc layer, which is the actual invariant this wave defends.
  test.skipIf(process.platform === 'win32')('`gstack-config get artifacts_sync_mode` returns a value (the rename landed)', () => {
    // Run from a clean HOME so the user's local config doesn't pollute.
    const tmpHome = fs.mkdtempSync(path.join(require('os').tmpdir(), 'gstack-cfg-'));
    try {
      const result = spawnSync(CONFIG_BIN, ['get', 'artifacts_sync_mode'], {
        encoding: 'utf-8',
        env: { ...process.env, HOME: tmpHome, GSTACK_HOME: tmpHome },
        timeout: 5000,
      });
      expect(result.status).toBe(0);
      // A known key returns its default value, not the "unknown key" error string.
      expect(result.stderr).not.toContain('not recognized');
      expect(result.stdout.trim().length).toBeGreaterThan(0);
    } finally {
      fs.rmSync(tmpHome, { recursive: true, force: true });
    }
  });
});

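The denylist in this test pins documentation usage only. As a purely hypothetical illustration of the rename it tracks, a shim mapping the deprecated keys to their replacements could look like the following — the mapping is taken from the DEPRECATED_KEYS comment (the `*_prompted` target name is inferred from that comment), and the shim itself is not part of gstack:

```shell
# Hypothetical key-migration shim; not shipped with gstack.
_migrate_key() {
  case "$1" in
    gbrain_sync_mode)          echo "artifacts_sync_mode" ;;
    gbrain_sync_mode_prompted) echo "artifacts_sync_mode_prompted" ;;
    *)                         echo "$1" ;;   # unknown keys pass through
  esac
}

_migrate_key gbrain_sync_mode   # prints: artifacts_sync_mode
```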
@@ -2198,7 +2198,7 @@ describe('setup script validation', () => {
expect(codexSection).toContain('create_codex_runtime_root');
expect(codexSection).toContain('link_codex_skill_dirs');
expect(codexSection).not.toContain('link_claude_skill_dirs');
-expect(codexSection).not.toContain('ln -snf "$GSTACK_DIR" "$CODEX_GSTACK"');
+expect(codexSection).not.toContain('_link_or_copy "$GSTACK_DIR" "$CODEX_GSTACK"');
});

test('Codex install prefers repo-local .agents/skills when setup runs from there', () => {
@@ -2238,7 +2238,8 @@ describe('setup script validation', () => {
const fnEnd = setupContent.indexOf('}', setupContent.indexOf('linked[@]}', fnStart));
const fnBody = setupContent.slice(fnStart, fnEnd);
expect(fnBody).toContain('mkdir -p "$target"');
-expect(fnBody).toContain('ln -snf "$gstack_dir/$dir_name/SKILL.md" "$target/SKILL.md"');
+// v1.36.0.0: routes through _link_or_copy helper for Windows fallback (cp on MSYS2/Git Bash).
+expect(fnBody).toContain('_link_or_copy "$gstack_dir/$dir_name/SKILL.md" "$target/SKILL.md"');
});

// REGRESSION: cleanup functions must handle both old symlinks AND new real-directory pattern
@@ -2345,7 +2346,7 @@ describe('setup script validation', () => {
expect(fnBody).toContain('design-checklist.md');
expect(fnBody).toContain('greptile-triage.md');
expect(fnBody).toContain('TODOS-format.md');
-expect(fnBody).not.toContain('ln -snf "$gstack_dir" "$codex_gstack"');
+expect(fnBody).not.toContain('_link_or_copy "$gstack_dir" "$codex_gstack"');
});

test('direct Codex installs are migrated out of ~/.codex/skills/gstack', () => {

@@ -55,6 +55,9 @@ const ALLOWLIST = [
'test/gstack-upgrade.test.ts',
// This test itself references the patterns to grep for.
'test/no-stale-gstack-brain-refs.test.ts',
+// The v1.36.0.0 doc-config drift guard intentionally defends the rename
+// by listing the deprecated keys in its DEPRECATED_KEYS denylist.
+'test/docs-config-keys.test.ts',
// memory.md documents the rename context.
'setup-gbrain/memory.md',
// The new init script's header comment intentionally cites the rename.

@@ -8,10 +8,11 @@ const ROOT = path.resolve(import.meta.dir, '..');
const SETUP_SCRIPT = path.join(ROOT, 'setup');

describe('setup: Conductor worktree guard', () => {
-test('setup contains the real-dir guard before the ln -snf into ~/.claude/skills/', () => {
+test('setup contains the real-dir guard before the symlink-or-copy into ~/.claude/skills/', () => {
const content = fs.readFileSync(SETUP_SCRIPT, 'utf-8');
const guardIdx = content.indexOf('_SKIP_CLAUDE_REGISTER=0');
-const lnIdx = content.indexOf('ln -snf "$SOURCE_GSTACK_DIR" "$CLAUDE_GSTACK_LINK"');
+// v1.36.0.0: symlink work routes through _link_or_copy helper for Windows fallback.
+const lnIdx = content.indexOf('_link_or_copy "$SOURCE_GSTACK_DIR" "$CLAUDE_GSTACK_LINK"');
expect(guardIdx).toBeGreaterThan(-1);
expect(lnIdx).toBeGreaterThan(-1);
expect(guardIdx).toBeLessThan(lnIdx);

128
test/setup-windows-fallback.test.ts
Normal file
@@ -0,0 +1,128 @@
import { describe, test, expect } from 'bun:test';
import { spawnSync } from 'child_process';
import * as path from 'path';
import * as fs from 'fs';
import * as os from 'os';

const ROOT = path.resolve(import.meta.dir, '..');
const SETUP_SCRIPT = path.join(ROOT, 'setup');
const SETUP_SRC = fs.readFileSync(SETUP_SCRIPT, 'utf-8');

// Slice out the _link_or_copy helper body via awk-style anchors so the test is
// resilient to line-number drift.
function extractHelper(): string {
  const start = SETUP_SRC.indexOf('_link_or_copy() {');
  const end = SETUP_SRC.indexOf('\n}\n', start);
  if (start < 0 || end < 0) throw new Error('Could not locate _link_or_copy() in setup');
  return SETUP_SRC.slice(start, end + 2);
}

describe('setup: _link_or_copy invariant (D7)', () => {
  test('helper function is defined near the top of setup', () => {
    expect(SETUP_SRC).toContain('_link_or_copy() {');
    expect(SETUP_SRC).toContain('if [ "$IS_WINDOWS" -eq 1 ]; then');
  });

  test('zero raw `ln` calls outside the helper body and comments', () => {
    // Pull the helper body out of the source first so its internal `ln -snf`
    // (the Unix branch) is exempted from the invariant.
    const helper = extractHelper();
    const withoutHelper = SETUP_SRC.replace(helper, '');

    // Strip shell comments to allow prose mentions of `ln -snf` in docstrings.
    const lines = withoutHelper.split('\n');
    const offending: { lineNo: number; line: string }[] = [];
    lines.forEach((line, idx) => {
      const trimmed = line.trim();
      if (trimmed.startsWith('#')) return;
      // Match standalone `ln ` invocations (allow `ln` as a substring in
      // variable names like `linked`, `_LINK`).
      if (/(^|[\s;&|`])ln\s+-/.test(line)) {
        offending.push({ lineNo: idx + 1, line: line.trim() });
      }
    });
    expect(offending).toEqual([]);
  });

  test('Windows-copy note message exists in setup', () => {
    expect(SETUP_SRC).toContain('Windows install uses file copies');
    expect(SETUP_SRC).toContain('_print_windows_copy_note_once');
  });

  test('link_claude_skill_dirs calls the Windows note printer', () => {
    const fnStart = SETUP_SRC.indexOf('link_claude_skill_dirs() {');
    const fnEnd = SETUP_SRC.indexOf('\n}\n', fnStart);
    const fnBody = SETUP_SRC.slice(fnStart, fnEnd);
    expect(fnBody).toContain('_print_windows_copy_note_once');
  });
});

// Behavior matrix uses Unix `ln -snf` semantics in the IS_WINDOWS=0 cells.
// On Windows-without-Developer-Mode (e.g. GitHub's free `windows-latest`
// runner), `ln -snf` silently produces a file copy rather than a symlink —
// that's literally the bug this helper exists to work around. Skip the whole
// matrix on Windows; the static-invariant tests above already pin the helper
// shape that the Windows install relies on.
describe.skipIf(process.platform === 'win32')('setup: _link_or_copy helper — behavior matrix', () => {
  // Source the helper into a temp shell with IS_WINDOWS set and exercise
  // each cell of the file/dir × Windows/Unix matrix.
  function runHelper(
    isWindows: '0' | '1',
    srcKind: 'file' | 'dir',
  ): { ok: boolean; targetIsSymlink: boolean; targetExists: boolean; stderr: string } {
    const tmp = fs.mkdtempSync(path.join(os.tmpdir(), 'gstack-helper-'));
    try {
      const src = path.join(tmp, 'source');
      const dst = path.join(tmp, 'dest');
      if (srcKind === 'file') {
        fs.writeFileSync(src, 'hello\n');
      } else {
        fs.mkdirSync(src);
        fs.writeFileSync(path.join(src, 'inner.txt'), 'hello\n');
      }
      const helper = extractHelper();
      // IS_WINDOWS must exist as a shell-readable var before sourcing.
      const script = `IS_WINDOWS=${isWindows}\n${helper}\n_link_or_copy "${src}" "${dst}"\n`;
      const result = spawnSync('bash', ['-c', script], {
        encoding: 'utf-8',
        timeout: 5000,
      });
      const lst = fs.lstatSync(dst, { throwIfNoEntry: false });
      return {
        ok: result.status === 0,
        targetIsSymlink: lst?.isSymbolicLink() ?? false,
        targetExists: lst !== undefined,
        stderr: result.stderr,
      };
    } finally {
      fs.rmSync(tmp, { recursive: true, force: true });
    }
  }

  test('IS_WINDOWS=0 + file → symlink (existing Unix behavior)', () => {
    const r = runHelper('0', 'file');
    expect(r.ok).toBe(true);
    expect(r.targetExists).toBe(true);
    expect(r.targetIsSymlink).toBe(true);
  });

  test('IS_WINDOWS=0 + dir → symlink', () => {
    const r = runHelper('0', 'dir');
    expect(r.ok).toBe(true);
    expect(r.targetIsSymlink).toBe(true);
  });

  test('IS_WINDOWS=1 + file → regular file copy (no symlink)', () => {
    const r = runHelper('1', 'file');
    expect(r.ok).toBe(true);
    expect(r.targetExists).toBe(true);
    expect(r.targetIsSymlink).toBe(false);
  });

  test('IS_WINDOWS=1 + dir → real directory copy', () => {
    const r = runHelper('1', 'dir');
    expect(r.ok).toBe(true);
    expect(r.targetExists).toBe(true);
    expect(r.targetIsSymlink).toBe(false);
  });
});