Mirror of https://github.com/affaan-m/everything-claude-code.git, synced 2026-05-11 23:27:25 +08:00

docs: salvage focused stale PR contributions

- add Vite and Redis pattern skills from closed stale PRs
- add frontend-slides support assets
- port skill-comply runner fixes and LLM prompt/provider regressions
- harden agent frontmatter validation and sync catalog counts

Committed by: Affaan Mustafa
Parent: d8f879e671
Commit: b39d2244cf
@@ -11,7 +11,7 @@
 {
   "name": "ecc",
   "source": "./",
-  "description": "The most comprehensive Claude Code plugin — 50 agents, 185 skills, 68 legacy command shims, selective install profiles, and production-ready hooks for TDD, security scanning, code review, and continuous learning",
+  "description": "The most comprehensive Claude Code plugin — 50 agents, 187 skills, 68 legacy command shims, selective install profiles, and production-ready hooks for TDD, security scanning, code review, and continuous learning",
   "version": "2.0.0-rc.1",
   "author": {
     "name": "Affaan Mustafa",
@@ -1,7 +1,7 @@
 {
   "name": "ecc",
   "version": "2.0.0-rc.1",
-  "description": "Battle-tested Claude Code plugin for engineering teams — 50 agents, 185 skills, 68 legacy command shims, production-ready hooks, and selective install workflows evolved through continuous real-world use",
+  "description": "Battle-tested Claude Code plugin for engineering teams — 50 agents, 187 skills, 68 legacy command shims, production-ready hooks, and selective install workflows evolved through continuous real-world use",
   "author": {
     "name": "Affaan Mustafa",
     "url": "https://x.com/affaanmustafa"
@@ -23,6 +23,10 @@
     "best-practices"
   ],
   "mcpServers": {},
-  "skills": ["./skills/"],
-  "commands": ["./commands/"]
+  "skills": [
+    "./skills/"
+  ],
+  "commands": [
+    "./commands/"
+  ]
 }
@@ -1,6 +1,6 @@
 # Everything Claude Code (ECC) — Agent Instructions
 
-This is a **production-ready AI coding plugin** providing 50 specialized agents, 185 skills, 68 commands, and automated hook workflows for software development.
+This is a **production-ready AI coding plugin** providing 50 specialized agents, 187 skills, 68 commands, and automated hook workflows for software development.
 
 **Version:** 2.0.0-rc.1
@@ -146,7 +146,7 @@ Troubleshoot failures: check test isolation → verify mocks → fix implementat
 
 ```
 agents/ — 50 specialized subagents
-skills/ — 185 workflow skills and domain knowledge
+skills/ — 187 workflow skills and domain knowledge
 commands/ — 68 slash commands
 hooks/ — Trigger-based automations
 rules/ — Always-follow guidelines (common + per-language)
@@ -350,7 +350,7 @@ If you stacked methods, clean up in this order:
 /plugin list ecc@ecc
 ```
 
-**That's it!** You now have access to 50 agents, 185 skills, and 68 legacy command shims.
+**That's it!** You now have access to 50 agents, 187 skills, and 68 legacy command shims.
 
 ### Dashboard GUI
@@ -1338,7 +1338,7 @@ The configuration is automatically detected from `.opencode/opencode.json`.
 |---------|-------------|----------|--------|
 | Agents | PASS: 50 agents | PASS: 12 agents | **Claude Code leads** |
 | Commands | PASS: 68 commands | PASS: 31 commands | **Claude Code leads** |
-| Skills | PASS: 185 skills | PASS: 37 skills | **Claude Code leads** |
+| Skills | PASS: 187 skills | PASS: 37 skills | **Claude Code leads** |
 | Hooks | PASS: 8 event types | PASS: 11 events | **OpenCode has more!** |
 | Rules | PASS: 29 rules | PASS: 13 instructions | **Claude Code leads** |
 | MCP Servers | PASS: 14 servers | PASS: Full | **Full parity** |
@@ -1443,7 +1443,7 @@ ECC is the **first plugin to maximize every major AI coding tool**. Here's how e
 |---------|------------|------------|-----------|----------|
 | **Agents** | 50 | Shared (AGENTS.md) | Shared (AGENTS.md) | 12 |
 | **Commands** | 68 | Shared | Instruction-based | 31 |
-| **Skills** | 185 | Shared | 10 (native format) | 37 |
+| **Skills** | 187 | Shared | 10 (native format) | 37 |
 | **Hook Events** | 8 types | 15 types | None yet | 11 types |
 | **Hook Scripts** | 20+ scripts | 16 scripts (DRY adapter) | N/A | Plugin hooks |
 | **Rules** | 34 (common + lang) | 34 (YAML frontmatter) | Instruction-based | 13 instructions |
@@ -160,7 +160,7 @@ Copy-Item -Recurse rules/typescript "$HOME/.claude/rules/"
 /plugin list ecc@ecc
 ```
 
-**Done!** You can now use 50 agents, 185 skills, and 68 commands.
+**Done!** You can now use 50 agents, 187 skills, and 68 commands.
 
 ### multi-* commands require extra configuration
@@ -3,7 +3,6 @@ name: a11y-architect
 description: Accessibility Architect specializing in WCAG 2.2 compliance for Web and Native platforms. Use PROACTIVELY when designing UI components, establishing design systems, or auditing code for inclusive user experiences.
-model: sonnet
 tools: ["Read", "Write", "Edit", "Bash", "Grep", "Glob"]
 model: opus
 ---
 
 You are a Senior Accessibility Architect. Your goal is to ensure that every digital product is Perceivable, Operable, Understandable, and Robust (POUR) for all users, including those with visual, auditory, motor, or cognitive disabilities.
@@ -1,6 +1,6 @@
 # Everything Claude Code (ECC) — Agent Instructions
 
-This is a **production-ready AI coding plugin** providing 50 specialized agents, 185 skills, 68 commands, and automated hook workflows for software development.
+This is a **production-ready AI coding plugin** providing 50 specialized agents, 187 skills, 68 commands, and automated hook workflows for software development.
 
 **Version:** 2.0.0-rc.1
@@ -147,7 +147,7 @@
 
 ```
 agents/ — 50 specialized subagents
-skills/ — 185 workflow skills and domain knowledge
+skills/ — 187 workflow skills and domain knowledge
 commands/ — 68 slash commands
 hooks/ — Trigger-based automations
 rules/ — Always-follow guidelines (common + per-language)
@@ -224,7 +224,7 @@ Copy-Item -Recurse rules/typescript "$HOME/.claude/rules/"
 /plugin list ecc@ecc
 ```
 
-**All set!** You can now use 50 agents, 185 skills, and 68 commands.
+**All set!** You can now use 50 agents, 187 skills, and 68 commands.
 
 ***
@@ -1134,7 +1134,7 @@ opencode
 |---------|-------------|----------|--------|
 | Agents | PASS: 50 | PASS: 12 | **Claude Code leads** |
 | Commands | PASS: 68 | PASS: 31 | **Claude Code leads** |
-| Skills | PASS: 185 | PASS: 37 | **Claude Code leads** |
+| Skills | PASS: 187 | PASS: 37 | **Claude Code leads** |
 | Hooks | PASS: 8 event types | PASS: 11 events | **OpenCode has more!** |
 | Rules | PASS: 29 rules | PASS: 13 instructions | **Claude Code leads** |
 | MCP Servers | PASS: 14 | PASS: Full | **Full parity** |
@@ -1242,7 +1242,7 @@ ECC is the **first plugin to maximize every major AI coding tool**.
 |---------|------------|------------|-----------|----------|
 | **Agents** | 50 | Shared (AGENTS.md) | Shared (AGENTS.md) | 12 |
 | **Commands** | 68 | Shared | Instruction-based | 31 |
-| **Skills** | 185 | Shared | 10 (native format) | 37 |
+| **Skills** | 187 | Shared | 10 (native format) | 37 |
 | **Hook Events** | 8 types | 15 types | None yet | 11 types |
 | **Hook Scripts** | 20+ scripts | 16 scripts (DRY adapter) | N/A | Plugin hooks |
 | **Rules** | 34 (common + lang) | 34 (YAML frontmatter) | Instruction-based | 13 instructions |
@@ -44,20 +44,29 @@ Equivalent local commands via `yarn prettier` or `npm exec prettier --` are fine
 
 ### Type Check
 
+Use `--incremental` so re-runs reuse the previous `.tsbuildinfo` (1-3s on unchanged code instead of 30-60s every time). Wrap in `timeout` so a stuck tsc gets reaped by the OS instead of accumulating across edits — this prevents the multi-process buildup that happens when edits fire faster than tsc finishes.
+
 ```json
 {
   "hooks": {
     "PostToolUse": [
       {
         "matcher": "Write|Edit",
-        "command": "pnpm tsc --noEmit --pretty false",
-        "description": "Type-check after frontend edits"
+        "command": "timeout 60 pnpm tsc --noEmit --pretty false --incremental --tsBuildInfoFile node_modules/.cache/tsc-hook.tsbuildinfo",
+        "description": "Type-check after frontend edits (incremental + timeout-capped)"
       }
     ]
   }
 }
 ```
 
+**Why both flags matter:**
+
+- Without `--incremental`, every edit re-checks the entire program from scratch. On a real Next.js project this stacks fast: edits at 5-10s intervals + 30-60s tsc runs = N concurrent tsc processes.
+- Without `timeout`, a tsc that hangs (transitive dep change, type-checker stuck on a recursive type) never exits and is orphaned when the parent shell exits.
+- `--tsBuildInfoFile` is required because `--noEmit` normally suppresses the buildinfo write; specifying the path explicitly keeps incremental working.
+
+If you're on Windows without GNU coreutils, swap `timeout 60` for a PowerShell wrapper or rely on a Stop/SessionEnd hook to sweep stale tsc processes.
+
 ### CSS Lint
 
 ```json
@@ -18,15 +18,25 @@ function extractFrontmatter(content) {
   if (!match) return null;
 
   const frontmatter = {};
+  const duplicates = [];
   const lines = match[1].split(/\r?\n/);
   for (const line of lines) {
+    // Only top-level keys are unique. Indented YAML belongs to nested values.
+    if (/^\s/.test(line)) continue;
     const colonIdx = line.indexOf(':');
     if (colonIdx > 0) {
       const key = line.slice(0, colonIdx).trim();
       const value = line.slice(colonIdx + 1).trim();
+      if (Object.prototype.hasOwnProperty.call(frontmatter, key)) {
+        duplicates.push(key);
+      }
       frontmatter[key] = value;
     }
   }
+  Object.defineProperty(frontmatter, '__duplicates__', {
+    value: duplicates,
+    enumerable: false,
+  });
   return frontmatter;
 }
@@ -57,6 +67,11 @@ function validateAgents() {
       continue;
     }
 
+    if (frontmatter.__duplicates__.length > 0) {
+      console.error(`ERROR: ${file} - Duplicate frontmatter keys: ${[...new Set(frontmatter.__duplicates__)].join(', ')}`);
+      hasErrors = true;
+    }
+
     for (const field of REQUIRED_FIELDS) {
       if (!frontmatter[field] || (typeof frontmatter[field] === 'string' && !frontmatter[field].trim())) {
         console.error(`ERROR: ${file} - Missing required field: ${field}`);
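A minimal, self-contained sketch of the duplicate-key behavior the hardened validator relies on. The parser below mirrors the shape shown in the hunks; the sample agent frontmatter is illustrative:

```javascript
// Stand-alone re-implementation of the duplicate-aware frontmatter parser
// (for illustration only; the real script lives in the repo's validation tooling).
function extractFrontmatter(content) {
  const match = content.match(/^---\r?\n([\s\S]*?)\r?\n---/);
  if (!match) return null;

  const frontmatter = {};
  const duplicates = [];
  for (const line of match[1].split(/\r?\n/)) {
    if (/^\s/.test(line)) continue; // indented lines belong to nested values
    const colonIdx = line.indexOf(':');
    if (colonIdx > 0) {
      const key = line.slice(0, colonIdx).trim();
      if (Object.prototype.hasOwnProperty.call(frontmatter, key)) duplicates.push(key);
      frontmatter[key] = line.slice(colonIdx + 1).trim();
    }
  }
  Object.defineProperty(frontmatter, '__duplicates__', { value: duplicates, enumerable: false });
  return frontmatter;
}

// An agent file with two `model:` keys is flagged, and last-one-wins applies.
const fm = extractFrontmatter('---\nname: a11y-architect\nmodel: sonnet\nmodel: opus\n---\nBody');
console.log(fm.model);          // "opus"
console.log(fm.__duplicates__); // [ 'model' ]
```

Because `__duplicates__` is non-enumerable, serializing or iterating the parsed frontmatter stays unaffected; only the validator reads it.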
skills/frontend-slides/animation-patterns.md (new file, 122 lines)
@@ -0,0 +1,122 @@
# Animation Patterns Reference

Use this reference when generating presentations. Match animations to the intended feeling.

## Effect-to-Feeling Guide

| Feeling | Animations | Visual Cues |
|---------|-----------|-------------|
| **Dramatic / Cinematic** | Slow fade-ins (1-1.5s), large-scale transitions (0.9 to 1), parallax scrolling | Dark backgrounds, spotlight effects, full-bleed images |
| **Techy / Futuristic** | Neon glow (box-shadow), glitch/scramble text, grid reveals | Particle systems (canvas), grid patterns, monospace accents, cyan/magenta/electric blue |
| **Playful / Friendly** | Bouncy easing (spring physics), floating/bobbing | Rounded corners, pastel/bright colors, hand-drawn elements |
| **Professional / Corporate** | Subtle fast animations (200-300ms), clean slides | Navy/slate/charcoal, precise spacing, data visualization focus |
| **Calm / Minimal** | Very slow subtle motion, gentle fades | High whitespace, muted palette, serif typography, generous padding |
| **Editorial / Magazine** | Staggered text reveals, image-text interplay | Strong type hierarchy, pull quotes, grid-breaking layouts, serif headlines + sans body |

## Entrance Animations

```css
/* Fade + Slide Up (most versatile) */
.reveal {
  opacity: 0;
  transform: translateY(30px);
  transition: opacity 0.6s var(--ease-out-expo),
              transform 0.6s var(--ease-out-expo);
}
.visible .reveal {
  opacity: 1;
  transform: translateY(0);
}

/* Scale In */
.reveal-scale {
  opacity: 0;
  transform: scale(0.9);
  transition: opacity 0.6s, transform 0.6s var(--ease-out-expo);
}
.visible .reveal-scale {
  opacity: 1;
  transform: scale(1);
}

/* Slide from Left */
.reveal-left {
  opacity: 0;
  transform: translateX(-50px);
  transition: opacity 0.6s, transform 0.6s var(--ease-out-expo);
}
.visible .reveal-left {
  opacity: 1;
  transform: translateX(0);
}

/* Blur In */
.reveal-blur {
  opacity: 0;
  filter: blur(10px);
  transition: opacity 0.8s, filter 0.8s var(--ease-out-expo);
}
.visible .reveal-blur {
  opacity: 1;
  filter: blur(0);
}
```

## Background Effects

```css
/* Gradient Mesh — layered radial gradients for depth */
.gradient-bg {
  background:
    radial-gradient(ellipse at 20% 80%, rgba(120, 0, 255, 0.3) 0%, transparent 50%),
    radial-gradient(ellipse at 80% 20%, rgba(0, 255, 200, 0.2) 0%, transparent 50%),
    var(--bg-primary);
}

/* Noise Texture — inline SVG for grain */
.noise-bg {
  background-image: url("data:image/svg+xml,..."); /* Inline SVG noise */
}

/* Grid Pattern — subtle structural lines */
.grid-bg {
  background-image:
    linear-gradient(rgba(255,255,255,0.03) 1px, transparent 1px),
    linear-gradient(90deg, rgba(255,255,255,0.03) 1px, transparent 1px);
  background-size: 50px 50px;
}
```

## Interactive Effects

```javascript
/* 3D Tilt on Hover — adds depth to cards/panels */
class TiltEffect {
  constructor(element) {
    this.element = element;
    this.element.style.transformStyle = 'preserve-3d';

    this.element.addEventListener('mousemove', (e) => {
      const rect = this.element.getBoundingClientRect();
      const x = (e.clientX - rect.left) / rect.width - 0.5;
      const y = (e.clientY - rect.top) / rect.height - 0.5;
      // perspective() must be part of the transform (or set on the parent)
      // for the rotation to render with depth.
      this.element.style.transform = `perspective(1000px) rotateY(${x * 10}deg) rotateX(${-y * 10}deg)`;
    });

    this.element.addEventListener('mouseleave', () => {
      this.element.style.transform = 'perspective(1000px) rotateY(0) rotateX(0)';
    });
  }
}
```

## Troubleshooting

| Problem | Fix |
|---------|-----|
| Fonts not loading | Check Fontshare/Google Fonts URL; ensure font names match in CSS |
| Animations not triggering | Verify Intersection Observer is running; check `.visible` class is being added |
| Scroll snap not working | Ensure `scroll-snap-type: y mandatory` on html; each slide needs `scroll-snap-align: start` |
| Mobile issues | Disable heavy effects at 768px breakpoint; test touch events; reduce particle count |
| Performance issues | Use `will-change` sparingly; prefer `transform`/`opacity` animations; throttle scroll handlers |
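The staggered-reveal patterns in this file use a simple 0.1s-per-item progression (see the `.reveal:nth-child` rules in the companion template). As a sketch, the same schedule can be computed when generating inline styles for lists longer than the hard-coded entries; the function name is illustrative:

```javascript
// Derive the 0.1s-per-item stagger schedule for an arbitrary child count.
function staggerDelays(count, stepMs = 100) {
  return Array.from({ length: count }, (_, i) => ((i + 1) * stepMs) / 1000);
}

console.log(staggerDelays(4)); // [ 0.1, 0.2, 0.3, 0.4 ] (matches the nth-child CSS)

// Browser-side application (sketch):
// document.querySelectorAll('.slide-content .reveal').forEach((el, i) => {
//   el.style.transitionDelay = `${staggerDelays(20)[i]}s`;
// });
```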
skills/frontend-slides/html-template.md (new file, 419 lines)
@@ -0,0 +1,419 @@
# HTML Presentation Template

Reference architecture for generating slide presentations. Every presentation follows this structure.

## Base HTML Structure

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0" />
  <title>Presentation Title</title>

  <!-- Fonts: use Fontshare or Google Fonts — never system fonts -->
  <link rel="stylesheet" href="https://api.fontshare.com/v2/css?f[]=..." />

  <style>
    /* ===========================================
       CSS CUSTOM PROPERTIES (THEME)
       Change these to change the whole look
       =========================================== */
    :root {
      /* Colors — from chosen style preset */
      --bg-primary: #0a0f1c;
      --bg-secondary: #111827;
      --text-primary: #ffffff;
      --text-secondary: #9ca3af;
      --accent: #00ffcc;
      --accent-glow: rgba(0, 255, 204, 0.3);

      /* Typography — MUST use clamp() */
      --font-display: "Clash Display", sans-serif;
      --font-body: "Satoshi", sans-serif;
      --title-size: clamp(2rem, 6vw, 5rem);
      --subtitle-size: clamp(0.875rem, 2vw, 1.25rem);
      --body-size: clamp(0.75rem, 1.2vw, 1rem);

      /* Spacing — MUST use clamp() */
      --slide-padding: clamp(1.5rem, 4vw, 4rem);
      --content-gap: clamp(1rem, 2vw, 2rem);

      /* Animation */
      --ease-out-expo: cubic-bezier(0.16, 1, 0.3, 1);
      --duration-normal: 0.6s;
    }

    /* ===========================================
       BASE STYLES
       =========================================== */
    * {
      margin: 0;
      padding: 0;
      box-sizing: border-box;
    }

    /* --- PASTE viewport-base.css CONTENTS HERE --- */

    /* ===========================================
       ANIMATIONS
       Trigger via .visible class (added by JS on scroll)
       =========================================== */
    .reveal {
      opacity: 0;
      transform: translateY(30px);
      transition:
        opacity var(--duration-normal) var(--ease-out-expo),
        transform var(--duration-normal) var(--ease-out-expo);
    }

    .slide.visible .reveal {
      opacity: 1;
      transform: translateY(0);
    }

    /* Stagger children for sequential reveal */
    .reveal:nth-child(1) { transition-delay: 0.1s; }
    .reveal:nth-child(2) { transition-delay: 0.2s; }
    .reveal:nth-child(3) { transition-delay: 0.3s; }
    .reveal:nth-child(4) { transition-delay: 0.4s; }

    /* ... preset-specific styles ... */
  </style>
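The `clamp()` variables above resolve differently per viewport width. A quick sketch of the math, assuming the browser default of 16px per rem (the example widths are illustrative):

```javascript
// Resolve CSS clamp(minRem, vw, maxRem) at a given viewport width,
// mirroring how --title-size: clamp(2rem, 6vw, 5rem) behaves (1rem = 16px).
function resolveClamp(minRem, vwPercent, maxRem, viewportPx) {
  const preferred = (vwPercent * viewportPx) / 100;
  return Math.min(Math.max(preferred, minRem * 16), maxRem * 16);
}

console.log(resolveClamp(2, 6, 5, 390));  // 32  (phone: clamped to the 2rem floor)
console.log(resolveClamp(2, 6, 5, 1000)); // 60  (laptop: the 6vw preferred value wins)
console.log(resolveClamp(2, 6, 5, 1920)); // 80  (desktop: capped at the 5rem ceiling)
```

This is why the template insists on `clamp()` for both type and spacing: one declaration covers the phone floor, the fluid middle, and the desktop cap without media queries.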
</head>
<body>
  <!-- Optional: Progress bar -->
  <div class="progress-bar"></div>

  <!-- Optional: Navigation dots -->
  <nav class="nav-dots"><!-- Generated by JS --></nav>

  <!-- Slides -->
  <section class="slide title-slide">
    <h1 class="reveal">Presentation Title</h1>
    <p class="reveal">Subtitle or author</p>
  </section>

  <section class="slide">
    <div class="slide-content">
      <h2 class="reveal">Slide Title</h2>
      <p class="reveal">Content...</p>
    </div>
  </section>

  <!-- More slides... -->

  <script>
    /* ===========================================
       SLIDE PRESENTATION CONTROLLER
       =========================================== */
    class SlidePresentation {
      constructor() {
        this.slides = document.querySelectorAll(".slide");
        this.navDotsContainer = document.querySelector(".nav-dots");
        this.currentSlide = 0;
        this.setupIntersectionObserver();
        this.setupKeyboardNav();
        this.setupTouchNav();
        this.setupProgressBar();
        this.setupNavDots();
      }

      setupIntersectionObserver() {
        // Add .visible class when slides enter viewport
        // Triggers CSS animations efficiently
      }

      setupKeyboardNav() {
        // Arrow keys, Space, Page Up/Down
      }

      setupTouchNav() {
        // Touch/swipe support for mobile
      }

      setupProgressBar() {
        // Update progress bar on scroll
      }

      setupNavDots() {
        // IMPORTANT: Always clear before building — if outerHTML was
        // captured while dots were rendered, re-opening the file would
        // append a duplicate set on top of the existing ones.
        this.navDotsContainer.innerHTML = "";
        // Generate and manage navigation dots
      }
    }

    new SlidePresentation();
  </script>
</body>
</html>
```
## Required JavaScript Features

Every presentation must include:

1. **SlidePresentation Class** — Main controller with:
   - Keyboard navigation (arrows, space, page up/down)
   - Touch/swipe support
   - Mouse wheel navigation
   - Progress bar updates
   - Navigation dots

2. **Intersection Observer** — For scroll-triggered animations:
   - Add `.visible` class when slides enter viewport
   - Trigger CSS transitions efficiently

3. **Optional Enhancements** (match to chosen style):
   - Custom cursor with trail
   - Particle system background (canvas)
   - Parallax effects
   - 3D tilt on hover
   - Magnetic buttons
   - Counter animations

4. **Inline Editing** (only if user opted in during Phase 1 — skip entirely if they said No):
   - Edit toggle button (hidden by default, revealed via hover hotzone or `E` key)
   - Auto-save to localStorage
   - Export/save file functionality
   - See "Inline Editing Implementation" section below
## Inline Editing Implementation (Opt-In Only)

**If the user chose "No" for inline editing in Phase 1, do NOT generate any edit-related HTML, CSS, or JS.**

**Do NOT use the CSS `~` sibling selector for hover-based show/hide.** The CSS-only approach (`.edit-hotzone:hover ~ .edit-toggle`) fails because `pointer-events: none` on the toggle button breaks the hover chain: the user hovers the hotzone -> the button becomes visible -> the mouse moves toward the button -> leaves the hotzone -> the button disappears before it can be clicked.

**Required approach: JS-based hover with a 400ms delay timeout.**

HTML:

```html
<div class="edit-hotzone"></div>
<button class="edit-toggle" id="editToggle" title="Edit mode (E)">Edit</button>
```

CSS (visibility controlled by JS classes only):

```css
/* Do NOT use the CSS ~ sibling selector for this!
   pointer-events: none breaks the hover chain.
   Must use JS with a delay timeout. */
.edit-hotzone {
  position: fixed;
  top: 0;
  left: 0;
  width: 80px;
  height: 80px;
  z-index: 10000;
  cursor: pointer;
}
.edit-toggle {
  opacity: 0;
  pointer-events: none;
  transition: opacity 0.3s ease;
  z-index: 10001;
}
.edit-toggle.show,
.edit-toggle.active {
  opacity: 1;
  pointer-events: auto;
}
```
JS (four interaction methods):

```javascript
// 1. Click handler on the toggle button
document.getElementById("editToggle").addEventListener("click", () => {
  editor.toggleEditMode();
});

// 2. Hotzone hover with 400ms grace period
const hotzone = document.querySelector(".edit-hotzone");
const editToggle = document.getElementById("editToggle");
let hideTimeout = null;

hotzone.addEventListener("mouseenter", () => {
  clearTimeout(hideTimeout);
  editToggle.classList.add("show");
});
hotzone.addEventListener("mouseleave", () => {
  hideTimeout = setTimeout(() => {
    if (!editor.isActive) editToggle.classList.remove("show");
  }, 400);
});
editToggle.addEventListener("mouseenter", () => {
  clearTimeout(hideTimeout);
});
editToggle.addEventListener("mouseleave", () => {
  hideTimeout = setTimeout(() => {
    if (!editor.isActive) editToggle.classList.remove("show");
  }, 400);
});

// 3. Hotzone direct click
hotzone.addEventListener("click", () => {
  editor.toggleEditMode();
});

// 4. Keyboard shortcut (E key, skip when editing text)
document.addEventListener("keydown", (e) => {
  if (
    (e.key === "e" || e.key === "E") &&
    !e.target.getAttribute("contenteditable")
  ) {
    editor.toggleEditMode();
  }
});
```
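The hover handlers above share one `hideTimeout` so moving the mouse from hotzone to button never hides the toggle mid-travel. As a sketch, the timing logic can be factored into a tiny helper to make the grace period explicit (the helper name is illustrative, not part of the commit):

```javascript
// Grace-period show/hide: entering either element cancels any pending hide,
// so the toggle survives the hop between the hotzone and the button.
function createGracePeriodToggle(show, hide, delayMs = 400) {
  let timer = null;
  return {
    enter() { clearTimeout(timer); show(); },
    leave() { clearTimeout(timer); timer = setTimeout(hide, delayMs); },
  };
}

// Browser wiring (sketch): both elements share the same toggle instance.
// const t = createGracePeriodToggle(
//   () => editToggle.classList.add('show'),
//   () => { if (!editor.isActive) editToggle.classList.remove('show'); }
// );
// [hotzone, editToggle].forEach((el) => {
//   el.addEventListener('mouseenter', t.enter);
//   el.addEventListener('mouseleave', t.leave);
// });
```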
**CRITICAL: `exportFile()` must strip edit state before capturing outerHTML.**

When the user presses Ctrl+S in edit mode, `document.documentElement.outerHTML` captures the live DOM — including `body.edit-active`, `contenteditable="true"` on every text element, and `.active`/`.show` classes on the toggle button and banner. Anyone opening the saved file sees dashed outlines, a checkmark button, and an edit banner, as if permanently stuck in edit mode.

Always implement `exportFile()` like this:

```javascript
exportFile() {
  // Temporarily strip edit state so the saved file opens cleanly
  const editableEls = Array.from(document.querySelectorAll('[contenteditable]'));
  editableEls.forEach(el => el.removeAttribute('contenteditable'));
  document.body.classList.remove('edit-active');

  // Also strip UI classes from toggle button and banner
  const editToggle = document.getElementById('editToggle');
  const editBanner = document.querySelector('.edit-banner');
  editToggle?.classList.remove('active', 'show');
  editBanner?.classList.remove('active', 'show');

  const html = '<!DOCTYPE html>\n' + document.documentElement.outerHTML;

  // Restore edit state so the user can keep editing
  document.body.classList.add('edit-active');
  editableEls.forEach(el => el.setAttribute('contenteditable', 'true'));
  editToggle?.classList.add('active');
  editBanner?.classList.add('active');

  const blob = new Blob([html], { type: 'text/html' });
  const a = document.createElement('a');
  a.href = URL.createObjectURL(blob);
  a.download = 'presentation.html';
  a.click();
  URL.revokeObjectURL(a.href);
}
```
## Image Pipeline (Skip If No Images)

If the user chose "No images" in Phase 1, skip this entirely. If images were provided, process them before generating HTML.

**Dependency:** `pip install Pillow`

### Image Processing

```python
from PIL import Image, ImageDraw

# Circular crop (for logos on modern/clean styles)
def crop_circle(input_path, output_path):
    img = Image.open(input_path).convert('RGBA')
    w, h = img.size
    size = min(w, h)
    left, top = (w - size) // 2, (h - size) // 2
    img = img.crop((left, top, left + size, top + size))
    mask = Image.new('L', (size, size), 0)
    ImageDraw.Draw(mask).ellipse([0, 0, size, size], fill=255)
    img.putalpha(mask)
    img.save(output_path, 'PNG')

# Resize (for oversized images that inflate HTML)
def resize_max(input_path, output_path, max_dim=1200):
    img = Image.open(input_path)
    img.thumbnail((max_dim, max_dim), Image.LANCZOS)
    img.save(output_path, quality=85)
```

| Situation | Operation |
| -------------------------------- | ----------------------------- |
| Square logo on rounded aesthetic | `crop_circle()` |
| Image > 1MB | `resize_max(max_dim=1200)` |
| Wrong aspect ratio | Manual crop with `img.crop()` |

Save processed images with a `_processed` suffix. Never overwrite originals.

### Image Placement

**Use direct file paths** (not base64) — presentations are viewed locally:

```html
<img src="assets/logo_round.png" alt="Logo" class="slide-image logo" />
<img
  src="assets/screenshot.png"
  alt="Screenshot"
  class="slide-image screenshot"
/>
```

```css
.slide-image {
  max-width: 100%;
  max-height: min(50vh, 400px);
  object-fit: contain;
  border-radius: 8px;
}
.slide-image.screenshot {
  border: 1px solid rgba(255, 255, 255, 0.1);
  border-radius: 12px;
  box-shadow: 0 8px 32px rgba(0, 0, 0, 0.3);
}
.slide-image.logo {
  max-height: min(30vh, 200px);
}
```

**Adapt border/shadow colors to match the chosen style's accent.** Never repeat the same image on multiple slides (except logos on title + closing).

**Placement patterns:** Logo centered on title slide. Screenshots in two-column layouts with text. Full-bleed images as slide backgrounds with text overlay (use sparingly).

---

## Code Quality

**Comments:** Every section needs clear comments explaining what it does and how to modify it.

**Accessibility:**

- Semantic HTML (`<section>`, `<nav>`, `<main>`)
- Keyboard navigation works fully
- ARIA labels where needed
- `prefers-reduced-motion` support (included in viewport-base.css)

## File Structure

Single presentations:

```
presentation.html    # Self-contained, all CSS/JS inline
assets/              # Images only, if any
```

Multiple presentations in one project:

```
[name].html
[name]-assets/
```
skills/frontend-slides/scripts/export-pdf.sh (new executable file, 418 lines)
@@ -0,0 +1,418 @@
#!/usr/bin/env bash
# export-pdf.sh - Export an HTML presentation to PDF
#
# Usage:
#   bash scripts/export-pdf.sh <path-to-html> [output.pdf] [--compact]
#
# Examples:
#   bash scripts/export-pdf.sh ./my-deck/index.html
#   bash scripts/export-pdf.sh ./presentation.html ./presentation.pdf
#
# What this does:
#   1. Starts a local server to serve the HTML (fonts and assets need HTTP)
#   2. Uses Playwright to screenshot each slide at 1920x1080
#   3. Combines all screenshots into a single PDF
#   4. Cleans up the server and temp files
#
# The PDF preserves colors, fonts, and layout - but not animations.
# Perfect for email attachments, printing, or embedding in documents.
set -euo pipefail

# --- Colors ---
RED='\033[0;31m'
GREEN='\033[0;32m'
CYAN='\033[0;36m'
YELLOW='\033[1;33m'
BOLD='\033[1m'
NC='\033[0m'

info() { echo -e "${CYAN}INFO:${NC} $*"; }
ok()   { echo -e "${GREEN}OK:${NC} $*"; }
warn() { echo -e "${YELLOW}WARNING:${NC} $*"; }
err()  { echo -e "${RED}ERROR:${NC} $*" >&2; }

# --- Parse flags ---

# Default resolution: 1920x1080 (full HD, ~1-2MB per slide)
# Compact resolution: 1280x720 (HD, ~50-70% smaller files)
VIEWPORT_W=1920
VIEWPORT_H=1080
COMPACT=false

POSITIONAL=()
for arg in "$@"; do
  case $arg in
    --compact)
      COMPACT=true
      VIEWPORT_W=1280
      VIEWPORT_H=720
      ;;
    *)
      POSITIONAL+=("$arg")
      ;;
  esac
done
set -- "${POSITIONAL[@]}"
# --- Input validation ---

if [[ $# -lt 1 ]]; then
  err "Usage: bash scripts/export-pdf.sh <path-to-html> [output.pdf] [--compact]"
  err ""
  err "Examples:"
  err "  bash scripts/export-pdf.sh ./my-deck/index.html"
  err "  bash scripts/export-pdf.sh ./presentation.html ./slides.pdf"
  err "  bash scripts/export-pdf.sh ./presentation.html --compact   # smaller file size"
  exit 1
fi

INPUT_HTML="$1"
if [[ ! -f "$INPUT_HTML" ]]; then
  err "File not found: $INPUT_HTML"
  exit 1
fi

# Resolve to absolute path
INPUT_HTML=$(cd "$(dirname "$INPUT_HTML")" && pwd)/$(basename "$INPUT_HTML")

# Output PDF path: use second argument or derive from input name
if [[ $# -ge 2 ]]; then
  OUTPUT_PDF="$2"
else
  OUTPUT_PDF="$(dirname "$INPUT_HTML")/$(basename "$INPUT_HTML" .html).pdf"
fi

# Resolve output to absolute path
OUTPUT_DIR=$(dirname "$OUTPUT_PDF")
mkdir -p "$OUTPUT_DIR"
OUTPUT_PDF="$OUTPUT_DIR/$(basename "$OUTPUT_PDF")"

echo ""
echo -e "${BOLD}========================================${NC}"
echo -e "${BOLD}  Export Slides to PDF${NC}"
echo -e "${BOLD}========================================${NC}"
echo ""
# --- Step 1: Check dependencies ---
|
||||
|
||||
info "Checking dependencies..."
|
||||
|
||||
if ! command -v npx &>/dev/null; then
|
||||
err "Node.js is required but not installed."
|
||||
err ""
|
||||
err "Install Node.js:"
|
||||
err " macOS: brew install node"
|
||||
err " or visit https://nodejs.org and download the installer"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
ok "Node.js found"
|
||||
|
||||
# --- Step 2: Create the export script ---
|
||||
|
||||
# We use a temporary Node.js script with Playwright to:
|
||||
# 1. Start a local server (so fonts load correctly)
|
||||
# 2. Navigate to each slide
|
||||
# 3. Screenshot each slide at 1920x1080 (16:9 landscape)
|
||||
# 4. Combine into a single PDF
|
||||
|
||||
TEMP_DIR=$(mktemp -d)
|
||||
TEMP_SCRIPT="$TEMP_DIR/export-slides.mjs"
|
||||
|
||||
# Figure out which directory to serve (the folder containing the HTML)
|
||||
SERVE_DIR=$(dirname "$INPUT_HTML")
|
||||
HTML_FILENAME=$(basename "$INPUT_HTML")
|
||||
|
||||
cat > "$TEMP_SCRIPT" << 'EXPORT_SCRIPT'
// export-slides.mjs - Playwright script to export HTML slides to PDF
//
// How it works:
//   1. Starts a local HTTP server (needed for fonts/assets to load)
//   2. Opens the presentation in a headless browser at 1920x1080
//   3. Counts the total number of slides
//   4. Screenshots each slide one by one
//   5. Generates a PDF with all slides as landscape pages

import { chromium } from 'playwright';
import { createServer } from 'http';
import { readFileSync, mkdirSync, unlinkSync } from 'fs';
import { join, extname } from 'path';

const SERVE_DIR = process.argv[2];
const HTML_FILE = process.argv[3];
const OUTPUT_PDF = process.argv[4];
const SCREENSHOT_DIR = process.argv[5];
const VP_WIDTH = parseInt(process.argv[6]) || 1920;
const VP_HEIGHT = parseInt(process.argv[7]) || 1080;

// --- Simple static file server ---
// (We need HTTP so that Google Fonts and relative assets load correctly)

const MIME_TYPES = {
  '.html': 'text/html',
  '.css': 'text/css',
  '.js': 'application/javascript',
  '.json': 'application/json',
  '.png': 'image/png',
  '.jpg': 'image/jpeg',
  '.jpeg': 'image/jpeg',
  '.gif': 'image/gif',
  '.svg': 'image/svg+xml',
  '.webp': 'image/webp',
  '.woff': 'font/woff',
  '.woff2': 'font/woff2',
  '.ttf': 'font/ttf',
  '.eot': 'application/vnd.ms-fontobject',
};

const server = createServer((req, res) => {
  // Decode URL-encoded characters (e.g., %20 -> space) so filenames with spaces resolve correctly
  const decodedUrl = decodeURIComponent(req.url);
  let filePath = join(SERVE_DIR, decodedUrl === '/' ? HTML_FILE : decodedUrl);
  try {
    const content = readFileSync(filePath);
    const ext = extname(filePath).toLowerCase();
    res.writeHead(200, { 'Content-Type': MIME_TYPES[ext] || 'application/octet-stream' });
    res.end(content);
  } catch {
    res.writeHead(404);
    res.end('Not found');
  }
});

// Find a free port
const port = await new Promise((resolve) => {
  server.listen(0, () => resolve(server.address().port));
});

console.log(`  Local server on port ${port}`);

// --- Screenshot each slide ---

const browser = await chromium.launch();
const page = await browser.newPage({
  viewport: { width: VP_WIDTH, height: VP_HEIGHT },
});

// Load the presentation
await page.goto(`http://localhost:${port}/`, { waitUntil: 'networkidle' });

// Wait for fonts to load
await page.evaluate(() => document.fonts.ready);

// Extra wait for animations to settle on the first slide
await page.waitForTimeout(1500);

// Count slides
const slideCount = await page.evaluate(() => {
  return document.querySelectorAll('.slide').length;
});

console.log(`  Found ${slideCount} slides`);

if (slideCount === 0) {
  console.error('  ERROR: No .slide elements found in the presentation.');
  console.error('  Make sure your HTML uses <div class="slide"> or <section class="slide">.');
  await browser.close();
  server.close();
  process.exit(1);
}

// Screenshot each slide
mkdirSync(SCREENSHOT_DIR, { recursive: true });
const screenshotPaths = [];

for (let i = 0; i < slideCount; i++) {
  // Navigate to slide by simulating the presentation's navigation
  // Most frontend-slides presentations use a currentSlide index and show/hide
  await page.evaluate((index) => {
    const slides = document.querySelectorAll('.slide');

    // Try multiple navigation strategies used by frontend-slides:

    // Strategy 1: Direct slide manipulation (most common in generated decks)
    slides.forEach((slide, idx) => {
      if (idx === index) {
        slide.style.display = '';
        slide.style.opacity = '1';
        slide.style.visibility = 'visible';
        slide.style.position = 'relative';
        slide.style.transform = 'none';
        slide.classList.add('active');
      } else {
        slide.style.display = 'none';
        slide.classList.remove('active');
      }
    });

    // Strategy 2: If there's a SlidePresentation class instance, use it
    if (window.presentation && typeof window.presentation.goToSlide === 'function') {
      window.presentation.goToSlide(index);
    }

    // Strategy 3: Scroll-based (some decks use scroll snapping)
    slides[index]?.scrollIntoView({ behavior: 'instant' });
  }, i);

  // Wait for any slide transition animations to finish
  await page.waitForTimeout(300);

  // Wait for intersection observer animations to trigger
  await page.waitForTimeout(200);

  // Force all .reveal elements on the current slide to be visible
  // (animations normally trigger on scroll/intersection, but we need them visible now)
  await page.evaluate((index) => {
    const slides = document.querySelectorAll('.slide');
    const currentSlide = slides[index];
    if (currentSlide) {
      currentSlide.querySelectorAll('.reveal').forEach(el => {
        el.style.opacity = '1';
        el.style.transform = 'none';
        el.style.visibility = 'visible';
      });
    }
  }, i);

  await page.waitForTimeout(100);

  const screenshotPath = join(SCREENSHOT_DIR, `slide-${String(i + 1).padStart(3, '0')}.png`);
  await page.screenshot({ path: screenshotPath, fullPage: false });
  screenshotPaths.push(screenshotPath);
  console.log(`  Captured slide ${i + 1}/${slideCount}`);
}

await browser.close();
server.close();

// --- Combine screenshots into PDF ---
// Use a second Playwright page to generate a PDF from the screenshots

console.log('  Assembling PDF...');

const browser2 = await chromium.launch();
const pdfPage = await browser2.newPage();

// Build an HTML page with all screenshots, one per page
const imagesHtml = screenshotPaths.map((p) => {
  const imgData = readFileSync(p).toString('base64');
  return `<div class="page"><img src="data:image/png;base64,${imgData}" /></div>`;
}).join('\n');

const pdfHtml = `<!DOCTYPE html>
<html>
<head>
<style>
  * { margin: 0; padding: 0; }
  @page { size: ${VP_WIDTH}px ${VP_HEIGHT}px; margin: 0; }
  .page {
    width: ${VP_WIDTH}px;
    height: ${VP_HEIGHT}px;
    page-break-after: always;
    overflow: hidden;
  }
  .page:last-child { page-break-after: auto; }
  img {
    width: ${VP_WIDTH}px;
    height: ${VP_HEIGHT}px;
    display: block;
    object-fit: contain;
  }
</style>
</head>
<body>${imagesHtml}</body>
</html>`;

await pdfPage.setContent(pdfHtml, { waitUntil: 'load' });
await pdfPage.pdf({
  path: OUTPUT_PDF,
  width: `${VP_WIDTH}px`,
  height: `${VP_HEIGHT}px`,
  printBackground: true,
  margin: { top: 0, right: 0, bottom: 0, left: 0 },
});

await browser2.close();

// Clean up screenshots
screenshotPaths.forEach(p => unlinkSync(p));

console.log(`  OK: PDF saved to: ${OUTPUT_PDF}`);
EXPORT_SCRIPT

# --- Step 3: Install Playwright in temp directory ---
# We install Playwright locally in the temp dir so the Node script can import it.
# This avoids polluting global packages and ensures the script is self-contained.

info "Setting up Playwright (headless browser for screenshots)..."
info "This may take a moment on first run..."
echo ""

cd "$TEMP_DIR"

# Create a minimal package.json so npm install works
cat > "$TEMP_DIR/package.json" << 'PKG'
{ "name": "slide-export", "private": true, "type": "module" }
PKG

# Install Playwright into the temp directory
npm install playwright &>/dev/null || {
  err "Failed to install Playwright."
  err "Try running: npm install playwright"
  rm -rf "$TEMP_DIR"
  exit 1
}

# Ensure Chromium browser binary is downloaded
npx playwright install chromium 2>/dev/null || {
  err "Failed to install Chromium browser for Playwright."
  err "Try running manually: npx playwright install chromium"
  rm -rf "$TEMP_DIR"
  exit 1
}
ok "Playwright ready"
echo ""

# --- Step 4: Run the export ---

SCREENSHOT_DIR="$TEMP_DIR/screenshots"

info "Exporting slides to PDF..."
echo ""

# Run from the temp dir so Node can find the locally-installed playwright
if [[ "$COMPACT" == "true" ]]; then
  info "Using compact mode (1280x720) for smaller file size"
fi

node "$TEMP_SCRIPT" "$SERVE_DIR" "$HTML_FILENAME" "$OUTPUT_PDF" "$SCREENSHOT_DIR" "$VIEWPORT_W" "$VIEWPORT_H" || {
  err "PDF export failed."
  rm -rf "$TEMP_DIR"
  exit 1
}

# --- Step 5: Cleanup and success ---

rm -rf "$TEMP_DIR"

echo ""
echo -e "${BOLD}========================================${NC}"
ok "PDF exported successfully!"
echo ""
echo -e "  ${BOLD}File:${NC} $OUTPUT_PDF"
echo ""
FILE_SIZE=$(du -h "$OUTPUT_PDF" | cut -f1 | xargs)
echo "  Size: $FILE_SIZE"
echo ""
echo "  This PDF works everywhere - email, Slack, Notion, print."
echo "  Note: Animations are not preserved (it's a static export)."
echo -e "${BOLD}========================================${NC}"
echo ""

# Open the PDF automatically
if command -v open &>/dev/null; then
  open "$OUTPUT_PDF"
elif command -v xdg-open &>/dev/null; then
  xdg-open "$OUTPUT_PDF"
fi

skills/frontend-slides/scripts/extract-pptx.py (new file, 96 lines)
@@ -0,0 +1,96 @@
#!/usr/bin/env python3
"""
Extract all content from a PowerPoint file (.pptx).
Returns a JSON structure with slides, text, and images.

Usage:
    python extract-pptx.py <input.pptx> [output_dir]

Requires: pip install python-pptx
"""

import json
import os
import sys

from pptx import Presentation


def extract_pptx(file_path, output_dir="."):
    """
    Extract all content from a PowerPoint file.
    Returns a list of slide data dicts with text, images, and notes.
    """
    prs = Presentation(file_path)
    slides_data = []

    # Create assets directory for extracted images
    assets_dir = os.path.join(output_dir, "assets")
    os.makedirs(assets_dir, exist_ok=True)

    for slide_num, slide in enumerate(prs.slides):
        slide_data = {
            "number": slide_num + 1,
            "title": "",
            "content": [],
            "images": [],
            "notes": "",
        }

        for shape in slide.shapes:
            # Extract text content
            if shape.has_text_frame:
                if shape == slide.shapes.title:
                    slide_data["title"] = shape.text
                else:
                    slide_data["content"].append(
                        {"type": "text", "content": shape.text}
                    )

            # Extract images
            if shape.shape_type == 13:  # MSO_SHAPE_TYPE.PICTURE
                image = shape.image
                image_bytes = image.blob
                image_ext = image.ext
                image_name = f"slide{slide_num + 1}_img{len(slide_data['images']) + 1}.{image_ext}"
                image_path = os.path.join(assets_dir, image_name)

                with open(image_path, "wb") as f:
                    f.write(image_bytes)

                slide_data["images"].append(
                    {
                        "path": f"assets/{image_name}",
                        "width": shape.width,
                        "height": shape.height,
                    }
                )

        # Extract speaker notes
        if slide.has_notes_slide:
            notes_frame = slide.notes_slide.notes_text_frame
            slide_data["notes"] = notes_frame.text

        slides_data.append(slide_data)

    return slides_data


if __name__ == "__main__":
    if len(sys.argv) < 2:
        print("Usage: python extract-pptx.py <input.pptx> [output_dir]")
        sys.exit(1)

    input_file = sys.argv[1]
    output_dir = sys.argv[2] if len(sys.argv) > 2 else "."

    slides = extract_pptx(input_file, output_dir)

    # Write extracted data as JSON
    output_path = os.path.join(output_dir, "extracted-slides.json")
    with open(output_path, "w") as f:
        json.dump(slides, f, indent=2)

    print(f"Extracted {len(slides)} slides to {output_path}")
    for s in slides:
        img_count = len(s["images"])
        print(f"  Slide {s['number']}: {s['title'] or '(no title)'} — {img_count} image(s)")

skills/frontend-slides/viewport-base.css (new file, 153 lines)
@@ -0,0 +1,153 @@
/* ===========================================
   VIEWPORT FITTING: MANDATORY BASE STYLES
   Include this ENTIRE file in every presentation.
   These styles ensure slides fit exactly in the viewport.
   =========================================== */

/* 1. Lock html/body to viewport */
html, body {
  height: 100%;
  overflow-x: hidden;
}

html {
  scroll-snap-type: y mandatory;
  scroll-behavior: smooth;
}

/* 2. Each slide = exact viewport height */
.slide {
  width: 100vw;
  height: 100vh;
  height: 100dvh; /* Dynamic viewport height for mobile browsers */
  overflow: hidden; /* CRITICAL: Prevent ANY overflow */
  scroll-snap-align: start;
  display: flex;
  flex-direction: column;
  position: relative;
}

/* 3. Content container with flex for centering */
.slide-content {
  flex: 1;
  display: flex;
  flex-direction: column;
  justify-content: center;
  max-height: 100%;
  overflow: hidden; /* Double-protection against overflow */
  padding: var(--slide-padding);
}

/* 4. ALL typography uses clamp() for responsive scaling */
:root {
  /* Titles scale from mobile to desktop */
  --title-size: clamp(1.5rem, 5vw, 4rem);
  --h2-size: clamp(1.25rem, 3.5vw, 2.5rem);
  --h3-size: clamp(1rem, 2.5vw, 1.75rem);

  /* Body text */
  --body-size: clamp(0.75rem, 1.5vw, 1.125rem);
  --small-size: clamp(0.65rem, 1vw, 0.875rem);

  /* Spacing scales with viewport */
  --slide-padding: clamp(1rem, 4vw, 4rem);
  --content-gap: clamp(0.5rem, 2vw, 2rem);
  --element-gap: clamp(0.25rem, 1vw, 1rem);
}

/* 5. Cards/containers use viewport-relative max sizes */
.card, .container, .content-box {
  max-width: min(90vw, 1000px);
  max-height: min(80vh, 700px);
}

/* 6. Lists auto-scale with viewport */
.feature-list, .bullet-list {
  gap: clamp(0.4rem, 1vh, 1rem);
}

.feature-list li, .bullet-list li {
  font-size: var(--body-size);
  line-height: 1.4;
}

/* 7. Grids adapt to available space */
.grid {
  display: grid;
  grid-template-columns: repeat(auto-fit, minmax(min(100%, 250px), 1fr));
  gap: clamp(0.5rem, 1.5vw, 1rem);
}

/* 8. Images constrained to viewport */
img, .image-container {
  max-width: 100%;
  max-height: min(50vh, 400px);
  object-fit: contain;
}

/* ===========================================
   RESPONSIVE BREAKPOINTS
   Aggressive scaling for smaller viewports
   =========================================== */

/* Short viewports (< 700px height) */
@media (max-height: 700px) {
  :root {
    --slide-padding: clamp(0.75rem, 3vw, 2rem);
    --content-gap: clamp(0.4rem, 1.5vw, 1rem);
    --title-size: clamp(1.25rem, 4.5vw, 2.5rem);
    --h2-size: clamp(1rem, 3vw, 1.75rem);
  }
}

/* Very short viewports (< 600px height) */
@media (max-height: 600px) {
  :root {
    --slide-padding: clamp(0.5rem, 2.5vw, 1.5rem);
    --content-gap: clamp(0.3rem, 1vw, 0.75rem);
    --title-size: clamp(1.1rem, 4vw, 2rem);
    --body-size: clamp(0.7rem, 1.2vw, 0.95rem);
  }

  /* Hide non-essential elements */
  .nav-dots, .keyboard-hint, .decorative {
    display: none;
  }
}

/* Extremely short (landscape phones, < 500px height) */
@media (max-height: 500px) {
  :root {
    --slide-padding: clamp(0.4rem, 2vw, 1rem);
    --title-size: clamp(1rem, 3.5vw, 1.5rem);
    --h2-size: clamp(0.9rem, 2.5vw, 1.25rem);
    --body-size: clamp(0.65rem, 1vw, 0.85rem);
  }
}

/* Narrow viewports (< 600px width) */
@media (max-width: 600px) {
  :root {
    --title-size: clamp(1.25rem, 7vw, 2.5rem);
  }

  /* Stack grids vertically */
  .grid {
    grid-template-columns: 1fr;
  }
}

/* ===========================================
   REDUCED MOTION
   Respect user preferences
   =========================================== */
@media (prefers-reduced-motion: reduce) {
  *, *::before, *::after {
    animation-duration: 0.01ms !important;
    transition-duration: 0.2s !important;
  }

  html {
    scroll-behavior: auto;
  }
}

skills/redis-patterns/SKILL.md (new file, 403 lines)
@@ -0,0 +1,403 @@
---
name: redis-patterns
description: Redis data structure patterns, caching strategies, distributed locks, rate limiting, pub/sub, and connection management for production applications.
origin: ECC
---

# Redis Patterns

Quick reference for Redis best practices across common backend use cases.

## How It Works

Redis is an in-memory data structure store that supports strings, hashes, lists, sets, sorted sets, streams, and more. Individual Redis commands are atomic on a single instance; multi-step workflows require Lua scripts, MULTI/EXEC transactions, or explicit synchronization to stay atomic. Data is optionally persisted via RDB snapshots or AOF logs. Clients communicate over TCP using the RESP protocol; connection pools are essential to avoid per-request handshake overhead.

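
The MULTI/EXEC point above can be sketched with redis-py's transactional pipeline. This is a minimal sketch: `record_payment`, the key names, and the `r` client (a `redis.Redis` instance, constructed as in the Core Patterns section below) are all illustrative.

```python
def record_payment(r, order_id: str) -> list:
    """Apply two related writes as one atomic MULTI/EXEC unit.

    pipeline(transaction=True) queues the commands client-side and the
    server applies them atomically when execute() sends MULTI/EXEC.
    """
    pipe = r.pipeline(transaction=True)
    pipe.hset(f"order:{order_id}", mapping={"status": "paid"})
    pipe.lpush("feed:orders", f"order:{order_id} paid")
    return pipe.execute()  # One reply per queued command
```

Either both writes land or neither does, so a reader never sees the feed entry without the updated order status.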
## When to Activate

- Adding caching to an application
- Implementing rate limiting or throttling
- Building distributed locks or coordination
- Setting up session or token storage
- Using Pub/Sub or Redis Streams for messaging
- Configuring Redis in production (pooling, eviction, clustering)

## Data Structure Cheat Sheet

| Use Case | Structure | Example Key |
|----------|-----------|-------------|
| Simple cache | String | `product:123` |
| User session | Hash | `session:abc` |
| Leaderboard | Sorted Set | `scores:weekly` |
| Unique visitors | Set | `visitors:2024-01-01` |
| Activity feed | List | `feed:user:456` |
| Event stream | Stream | `events:orders` |
| Counters / rate limits | String (INCR) | `ratelimit:user:123` |
| Approximate unique counts | HyperLogLog | `hll:pageviews` |

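
The Sorted Set leaderboard row can be sketched as follows; the `scores:{board}` key naming and the `r` client (a `redis.Redis` instance, constructed as in Core Patterns below) are illustrative assumptions.

```python
def add_score(r, board: str, player: str, points: float) -> float:
    # ZINCRBY accumulates points across calls; use ZADD to overwrite instead
    return r.zincrby(f"scores:{board}", points, player)

def top_players(r, board: str, n: int = 10) -> list:
    # Highest score first, with scores attached; the index range gives cheap pagination
    return r.zrevrange(f"scores:{board}", 0, n - 1, withscores=True)
```

Sorted sets keep members ordered by score on the server, so top-N reads are O(log N + N) with no client-side sorting.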
## Core Patterns

### Cache-Aside (Lazy Loading)

```python
import redis
import json

r = redis.Redis(host='localhost', port=6379, decode_responses=True)

def get_product(product_id: int):
    cache_key = f"product:{product_id}"
    cached = r.get(cache_key)

    if cached:
        return json.loads(cached)

    product = db.query("SELECT * FROM products WHERE id = %s", product_id)
    r.setex(cache_key, 3600, json.dumps(product))  # TTL: 1 hour
    return product
```

### Write-Through Cache

```python
def update_product(product_id: int, data: dict):
    # Write to DB first
    db.execute("UPDATE products SET ... WHERE id = %s", product_id)

    # Immediately update cache
    cache_key = f"product:{product_id}"
    r.setex(cache_key, 3600, json.dumps(data))
```

### Cache Invalidation

```python
# Tag-based invalidation — group related keys under a set
def cache_product(product_id: int, category_id: int, data: dict):
    key = f"product:{product_id}"
    tag = f"tag:category:{category_id}"
    pipe = r.pipeline(transaction=True)
    pipe.setex(key, 3600, json.dumps(data))
    pipe.sadd(tag, key)
    pipe.expire(tag, 3600)
    pipe.execute()

def invalidate_category(category_id: int):
    tag = f"tag:category:{category_id}"
    keys = r.smembers(tag)
    if keys:
        r.delete(*keys)
    r.delete(tag)
```

### Session Storage

```python
import time
import uuid

def create_session(user_id: int, ttl: int = 86400) -> str:
    session_id = str(uuid.uuid4())
    key = f"session:{session_id}"
    pipe = r.pipeline(transaction=True)
    pipe.hset(key, mapping={
        "user_id": user_id,
        "created_at": int(time.time()),
    })
    pipe.expire(key, ttl)
    pipe.execute()
    return session_id

def get_session(session_id: str) -> dict | None:
    data = r.hgetall(f"session:{session_id}")
    return data if data else None

def delete_session(session_id: str):
    r.delete(f"session:{session_id}")
```

## Rate Limiting

### Fixed Window (Simple)

```python
def is_rate_limited(user_id: int, limit: int = 100, window: int = 60) -> bool:
    key = f"ratelimit:{user_id}:{int(time.time()) // window}"
    pipe = r.pipeline(transaction=True)
    pipe.incr(key)
    pipe.expire(key, window)
    count, _ = pipe.execute()
    return count > limit
```

### Sliding Window (Lua — Atomic)

```lua
-- sliding_window.lua
local key = KEYS[1]
local now = tonumber(ARGV[1])
local window = tonumber(ARGV[2])
local limit = tonumber(ARGV[3])

redis.call('ZREMRANGEBYSCORE', key, 0, now - window)
local count = redis.call('ZCARD', key)

if count < limit then
  -- Use unique member (now + sequence) to avoid collisions within the same millisecond
  local seq_key = key .. ':seq'
  local seq = redis.call('INCR', seq_key)
  redis.call('EXPIRE', seq_key, math.ceil(window / 1000))
  redis.call('ZADD', key, now, now .. '-' .. seq)
  redis.call('EXPIRE', key, math.ceil(window / 1000))
  return 1
end
return 0
```

```python
sliding_window = r.register_script(open('sliding_window.lua').read())

def allow_request(user_id: int) -> bool:
    key = f"ratelimit:sliding:{user_id}"
    now = int(time.time() * 1000)
    return bool(sliding_window(keys=[key], args=[now, 60000, 100]))
```

## Distributed Locks

### Distributed Lock (Single Node — SET NX PX)

```python
import uuid

def acquire_lock(resource: str, ttl_ms: int = 5000) -> str | None:
    lock_key = f"lock:{resource}"
    token = str(uuid.uuid4())
    acquired = r.set(lock_key, token, px=ttl_ms, nx=True)
    return token if acquired else None

def release_lock(resource: str, token: str) -> bool:
    release_script = """
    if redis.call('get', KEYS[1]) == ARGV[1] then
        return redis.call('del', KEYS[1])
    else
        return 0
    end
    """
    result = r.eval(release_script, 1, f"lock:{resource}", token)
    return bool(result)

# Usage
token = acquire_lock("order:payment:123")
if token:
    try:
        process_payment()
    finally:
        release_lock("order:payment:123", token)
```

> For multi-node setups use the `redlock-py` library which implements the full Redlock algorithm.

## Pub/Sub & Streams

### Pub/Sub (Fire-and-Forget)

```python
# Publisher
def publish_event(channel: str, payload: dict):
    r.publish(channel, json.dumps(payload))

# Subscriber (blocking — run in separate thread/process)
def subscribe_events(channel: str):
    pubsub = r.pubsub()
    pubsub.subscribe(channel)
    for message in pubsub.listen():
        if message['type'] == 'message':
            handle(json.loads(message['data']))
```

### Redis Streams (Durable Queue)

```python
# Producer
def emit(stream: str, event: dict):
    r.xadd(stream, event, maxlen=10000)  # Cap stream length

# Consumer group — guarantees at-least-once delivery
try:
    r.xgroup_create('events:orders', 'processor', id='0', mkstream=True)
except Exception:
    pass  # Group already exists

def consume(stream: str, group: str, consumer: str):
    while True:
        messages = r.xreadgroup(group, consumer, {stream: '>'}, count=10, block=2000)
        for _, entries in (messages or []):
            for msg_id, data in entries:
                process(data)
                r.xack(stream, group, msg_id)
```

> Prefer **Streams** over Pub/Sub when you need delivery guarantees, consumer groups, or replay.

## Key Design

### Naming Conventions

```
# Pattern: resource:id:field
user:123:profile
order:456:status
cache:product:789

# Pattern: namespace:resource:id
myapp:session:abc123
myapp:ratelimit:user:123

# Pattern: resource:date (time-bound keys)
stats:pageviews:2024-01-01
```

### TTL Strategy

| Data Type | Suggested TTL |
|-----------|--------------|
| User session | 24h (`86400`) |
| API response cache | 5–15 min |
| Rate limit window | Match window size |
| Short-lived tokens | 5–10 min |
| Leaderboard | 1h–24h |
| Static/reference data | 1h–1 week |

Always set a TTL. Keys without TTL accumulate indefinitely and cause memory pressure.

## Connection Management

### Connection Pooling

```python
from redis import ConnectionPool, Redis

pool = ConnectionPool(
    host='localhost',
    port=6379,
    db=0,
    max_connections=20,
    decode_responses=True,
    socket_connect_timeout=2,
    socket_timeout=2,
)

r = Redis(connection_pool=pool)
```

### Cluster Mode

```python
from redis.cluster import RedisCluster

r = RedisCluster(
    startup_nodes=[{"host": "redis-1", "port": 6379}],
    decode_responses=True,
    skip_full_coverage_check=True,
)
```

### Sentinel (High Availability)

```python
from redis.sentinel import Sentinel

sentinel = Sentinel(
    [('sentinel-1', 26379), ('sentinel-2', 26379)],
    socket_timeout=0.5,
)
master = sentinel.master_for('mymaster', decode_responses=True)
replica = sentinel.slave_for('mymaster', decode_responses=True)
```

## Eviction Policies

| Policy | Behavior | Best For |
|--------|----------|----------|
| `noeviction` | Error on write when full | Queues / critical data |
| `allkeys-lru` | Evict least recently used | General cache |
| `volatile-lru` | LRU only among keys with TTL | Mixed data store |
| `allkeys-lfu` | Evict least frequently used | Skewed access patterns |
| `volatile-ttl` | Evict soonest-to-expire | Prioritize long-lived data |

Set via `redis.conf`: `maxmemory-policy allkeys-lru`

## Anti-Patterns

| Anti-Pattern | Problem | Fix |
|---|---|---|
| Keys with no TTL | Memory grows unbounded | Always set TTL |
| `KEYS *` in production | Blocks the server (O(N)) | Use `SCAN` cursor |
| Storing large blobs (>100KB) | Slow serialization, memory pressure | Store reference + fetch from object store |
| Single Redis for everything | No isolation between cache & queue | Use separate DBs or instances |
| Ignoring connection pool limits | Connection exhaustion under load | Size pool to workload |
| Not handling cache miss stampede | Thundering herd on cold start | Use locks or probabilistic early expiry |
| `FLUSHALL` without thought | Wipes entire instance | Scope deletes by key pattern |

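
The first two anti-patterns can be audited together by walking the keyspace with `SCAN` instead of `KEYS`. A minimal sketch, assuming the `r` client from the Connection Pooling section; `keys_without_ttl` and its defaults are illustrative:

```python
def keys_without_ttl(r, pattern: str = "*", batch: int = 500) -> list:
    """Incrementally find keys that never expire (TTL == -1)."""
    found = []
    # scan_iter wraps the SCAN cursor loop; unlike KEYS * it never
    # blocks the server, fetching `batch` keys per round trip
    for key in r.scan_iter(match=pattern, count=batch):
        if r.ttl(key) == -1:  # -1: no expiry set; -2: key vanished mid-scan
            found.append(key)
    return found
```

Run it periodically in staging to catch code paths that forgot `setex`/`expire`.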
### Cache Miss Stampede Prevention

```python
import json
import threading

_locks: dict[str, threading.Lock] = {}
_locks_mutex = threading.Lock()

def get_with_lock(key: str, fetch_fn, ttl: int = 300):
    cached = r.get(key)
    if cached:
        return json.loads(cached)

    with _locks_mutex:
        if key not in _locks:
            _locks[key] = threading.Lock()
        lock = _locks[key]
    with lock:
        cached = r.get(key)  # Re-check after acquiring lock
        if cached:
            return json.loads(cached)
        value = fetch_fn()
        r.setex(key, ttl, json.dumps(value))
        return value
```

> Note: for multi-process deployments, replace the in-process lock with `acquire_lock`/`release_lock` from the Distributed Locks section above.
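The anti-patterns table also names probabilistic early expiry as a stampede fix. A minimal sketch of that alternative (sometimes called x-fetch), assuming you store the recompute cost `delta` alongside each cached value; the function name is hypothetical:

```python
import math
import random
import time

def should_recompute(expiry_ts: float, delta: float, beta: float = 1.0) -> bool:
    """Decide whether this caller refreshes the value before it expires.

    expiry_ts: absolute expiry timestamp of the cached value.
    delta: how long the last recompute took, in seconds.
    beta: >1 refreshes more eagerly, <1 less eagerly.
    """
    # -log(uniform) is an exponential random variable, so a small random
    # subset of callers refreshes early — the rest keep serving the cached
    # value, and no thundering herd forms at the TTL boundary.
    return time.time() - delta * beta * math.log(random.random()) >= expiry_ts
```

Callers that get `True` recompute and rewrite the key; everyone else reads the still-valid cached value.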
## Examples

**Add caching to a Django/Flask API endpoint:**
Use cache-aside with `setex` and a 5-minute TTL on the response. Key on the request parameters.

**Rate-limit an API by user:**
Use fixed-window with `pipeline(transaction=True)` for low-traffic endpoints; use sliding-window Lua for accurate per-user throttling.

**Coordinate a background job across workers:**
Use `acquire_lock` with a TTL that exceeds the expected job duration. Always release in a `finally` block.

**Fan-out notifications to multiple subscribers:**
Use Pub/Sub for fire-and-forget. Switch to Streams if you need guaranteed delivery or replay for late consumers.
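The first example (cache-aside with `setex`, keyed on request parameters) can be sketched end to end. The `FakeRedis` class below is a dict-backed stand-in so the sketch runs without a server; in a real endpoint `r` is your redis-py client and `fetch_from_db` is the actual database query:

```python
import json

class FakeRedis:
    """Dict-backed stand-in for redis-py (illustration only; no real TTLs)."""
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data.get(key)
    def setex(self, key, ttl, value):
        self._data[key] = value  # real Redis would also arm the TTL

r = FakeRedis()
calls = {"db": 0}

def fetch_from_db(user_id: int) -> dict:
    calls["db"] += 1
    return {"id": user_id, "name": "ada"}

def get_user(user_id: int, ttl: int = 300) -> dict:
    key = f"user:{user_id}"           # key on the request parameters
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)     # cache hit: no DB round-trip
    value = fetch_from_db(user_id)
    r.setex(key, ttl, json.dumps(value))  # cache-aside write with TTL
    return value
```

The second `get_user(1)` call is served from the cache, so the database is hit exactly once.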
## Quick Reference

| Pattern | When to Use |
|---------|-------------|
| Cache-aside | Read-heavy, tolerate slight staleness |
| Write-through | Strong consistency required |
| Distributed lock | Prevent concurrent access to a resource |
| Sliding window rate limit | Accurate per-user throttling |
| Redis Streams | Durable event queue with consumer groups |
| Pub/Sub | Broadcast with no delivery guarantees needed |
| Sorted Set leaderboard | Ranked scoring, pagination |
| HyperLogLog | Approximate unique count at low memory |
## Related

- Skill: `postgres-patterns` — relational data patterns
- Skill: `backend-patterns` — API and service layer patterns
- Skill: `database-migrations` — schema versioning
- Skill: `django-patterns` — Django cache framework integration
- Agent: `database-reviewer` — full database review workflow
@@ -15,6 +15,10 @@ from scripts.scenario_generator import Scenario

 SANDBOX_BASE = Path("/tmp/skill-comply-sandbox")
 ALLOWED_MODELS = frozenset({"haiku", "sonnet", "opus"})
+# Shell builtins cannot be invoked via subprocess.run; cwd is already
+# controlled by the cwd= keyword. Scenarios that include these in
+# setup_commands (a common shell-style convention) must be tolerated.
+SHELL_BUILTINS = frozenset({"cd", "pushd", "popd"})


 @dataclass(frozen=True)
@@ -53,9 +57,22 @@ def run_scenario(
         cwd=sandbox_dir,
     )

-    if result.returncode != 0:
+    # claude -p returns rc=1 when --max-turns is reached, but the stream-json
+    # output is still complete and parseable. Treat this graceful termination
+    # as non-fatal so scenarios that hit the turn cap still produce usable
+    # observations.
+    nonfatal_max_turns = (
+        result.returncode == 1
+        and '"terminal_reason":"max_turns"' in result.stdout
+    )
+    if result.returncode != 0 and not nonfatal_max_turns:
+        # Include both stderr and stdout tails. claude -p often surfaces the
+        # actual failure context (model error JSON, partial stream-json) on
+        # stdout, while stderr carries generic transport / auth messages.
+        # Showing both dramatically reduces "rc=N: <empty>" debugging dead-ends.
         raise RuntimeError(
-            f"claude -p failed (rc={result.returncode}): {result.stderr[:500]}"
+            f"claude -p failed (rc={result.returncode}): "
+            f"stderr={result.stderr[:500]!r} stdout_tail={result.stdout[-500:]!r}"
         )

     observations = _parse_stream_json(result.stdout)
@@ -86,7 +103,15 @@ def _setup_sandbox(sandbox_dir: Path, scenario: Scenario) -> None:

     for cmd in scenario.setup_commands:
         parts = shlex.split(cmd)
-        subprocess.run(parts, cwd=sandbox_dir, capture_output=True)
+        if not parts or parts[0] in SHELL_BUILTINS:
+            # Shell builtins (cd/pushd/popd) cannot run as subprocess; skip.
+            continue
+        try:
+            subprocess.run(parts, cwd=sandbox_dir, capture_output=True)
+        except FileNotFoundError:
+            # Setup tool not installed in this environment; skip rather than
+            # crash the whole scenario. The compliance run continues.
+            continue


 def _parse_stream_json(stdout: str) -> list[ObservationEvent]:
skills/skill-comply/tests/test_runner.py (new file, 172 lines)
@@ -0,0 +1,172 @@
|
||||
"""Tests for runner module — scenario execution + subprocess error handling."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import subprocess
|
||||
from dataclasses import dataclass
|
||||
from unittest.mock import MagicMock, patch
|
||||
|
||||
import pytest
|
||||
|
||||
from scripts.runner import _setup_sandbox, run_scenario
|
||||
|
||||
|
||||
@dataclass(frozen=True)
|
||||
class _FakeScenario:
|
||||
"""Minimal Scenario-like object for runner tests (avoids generator deps)."""
|
||||
|
||||
id: str
|
||||
prompt: str = "do nothing"
|
||||
setup_commands: tuple[str, ...] = ()
|
||||
|
||||
|
||||
class TestSetupSandboxSkipsShellBuiltins:
|
||||
"""Setup commands containing shell builtins (cd/pushd/popd) must be skipped.
|
||||
|
||||
Regression: subprocess.run(["cd", ...]) raises FileNotFoundError because
|
||||
cd is a shell builtin, not an external binary. Real-world scenarios often
|
||||
include "cd subdir" in setup_commands assuming shell semantics, so the
|
||||
runner must tolerate this rather than crashing the whole scenario.
|
||||
"""
|
||||
|
||||
def test_skips_cd(self, tmp_path):
|
||||
scenario = _FakeScenario(
|
||||
id="t1",
|
||||
setup_commands=("cd subdir",),
|
||||
)
|
||||
called_args: list[list[str]] = []
|
||||
|
||||
def fake_run(args, **kwargs):
|
||||
called_args.append(args)
|
||||
return subprocess.CompletedProcess(args=args, returncode=0)
|
||||
|
||||
with patch("scripts.runner.subprocess.run", side_effect=fake_run):
|
||||
_setup_sandbox(tmp_path, scenario)
|
||||
|
||||
# git init runs once; "cd subdir" must NOT be passed to subprocess
|
||||
assert ["git", "init"] in called_args
|
||||
assert ["cd", "subdir"] not in called_args
|
||||
|
||||
def test_skips_pushd_popd(self, tmp_path):
|
||||
scenario = _FakeScenario(
|
||||
id="t2",
|
||||
setup_commands=("pushd dir", "popd"),
|
||||
)
|
||||
called_args: list[list[str]] = []
|
||||
|
||||
def fake_run(args, **kwargs):
|
||||
called_args.append(args)
|
||||
return subprocess.CompletedProcess(args=args, returncode=0)
|
||||
|
||||
with patch("scripts.runner.subprocess.run", side_effect=fake_run):
|
||||
_setup_sandbox(tmp_path, scenario)
|
||||
|
||||
assert ["pushd", "dir"] not in called_args
|
||||
assert ["popd"] not in called_args
|
||||
|
||||
def test_tolerates_missing_executable(self, tmp_path):
|
||||
"""A scenario referencing an unavailable tool must not crash setup."""
|
||||
scenario = _FakeScenario(
|
||||
id="t3",
|
||||
setup_commands=("nonexistent-tool-xyz arg",),
|
||||
)
|
||||
|
||||
def fake_run(args, **kwargs):
|
||||
if args[0] == "nonexistent-tool-xyz":
|
||||
raise FileNotFoundError(2, "No such file or directory")
|
||||
return subprocess.CompletedProcess(args=args, returncode=0)
|
||||
|
||||
with patch("scripts.runner.subprocess.run", side_effect=fake_run):
|
||||
# Must NOT raise — missing tools are skipped, not fatal
|
||||
_setup_sandbox(tmp_path, scenario)
|
||||
|
||||
def test_real_commands_still_run(self, tmp_path):
|
||||
"""Skip logic must not break legitimate setup commands."""
|
||||
scenario = _FakeScenario(
|
||||
id="t4",
|
||||
setup_commands=("touch file.txt", "cd ignored", "echo hi"),
|
||||
)
|
||||
called_args: list[list[str]] = []
|
||||
|
||||
def fake_run(args, **kwargs):
|
||||
called_args.append(args)
|
||||
return subprocess.CompletedProcess(args=args, returncode=0)
|
||||
|
||||
with patch("scripts.runner.subprocess.run", side_effect=fake_run):
|
||||
_setup_sandbox(tmp_path, scenario)
|
||||
|
||||
# Real commands present, cd absent
|
||||
assert ["touch", "file.txt"] in called_args
|
||||
assert ["echo", "hi"] in called_args
|
||||
assert ["cd", "ignored"] not in called_args
|
||||
|
||||
|
||||
class TestRunScenarioMaxTurnsTermination:
|
||||
"""rc=1 with terminal_reason=max_turns is graceful termination, not failure.
|
||||
|
||||
claude -p returns rc=1 when --max-turns is reached, but the stream-json
|
||||
output is still valid. Treating this as RuntimeError aborts scenarios
|
||||
that would have produced useful observations. Detect the marker in stdout
|
||||
and downgrade rc=1 + max_turns to non-fatal.
|
||||
"""
|
||||
|
||||
def test_rc1_with_max_turns_marker_returns_normally(self, tmp_path, monkeypatch):
|
||||
scenario = _FakeScenario(id="mt1", prompt="long task", setup_commands=())
|
||||
|
||||
# Skip sandbox setup side effects
|
||||
monkeypatch.setattr("scripts.runner._setup_sandbox", lambda *a, **kw: None)
|
||||
|
||||
max_turns_stdout = (
|
||||
'{"type":"system","subtype":"init","session_id":"s1"}\n'
|
||||
'{"type":"result","terminal_reason":"max_turns"}\n'
|
||||
)
|
||||
|
||||
fake_result = subprocess.CompletedProcess(
|
||||
args=["claude"], returncode=1, stdout=max_turns_stdout, stderr=""
|
||||
)
|
||||
|
||||
with patch("scripts.runner.subprocess.run", return_value=fake_result):
|
||||
# Must NOT raise — max_turns is graceful termination
|
||||
run_scenario(scenario, model="haiku")
|
||||
|
||||
def test_rc1_without_max_turns_marker_still_raises(self, tmp_path, monkeypatch):
|
||||
"""Real failures (rc≠0 with no max_turns marker) must still raise."""
|
||||
scenario = _FakeScenario(id="mt2", prompt="oops", setup_commands=())
|
||||
monkeypatch.setattr("scripts.runner._setup_sandbox", lambda *a, **kw: None)
|
||||
|
||||
fake_result = subprocess.CompletedProcess(
|
||||
args=["claude"], returncode=1, stdout="", stderr="auth error"
|
||||
)
|
||||
|
||||
with patch("scripts.runner.subprocess.run", return_value=fake_result):
|
||||
with pytest.raises(RuntimeError, match="claude -p failed"):
|
||||
run_scenario(scenario, model="haiku")
|
||||
|
||||
|
||||
class TestRunScenarioErrorIncludesStdoutTail:
|
||||
"""Error messages must include stdout tail, not only stderr.
|
||||
|
||||
When claude -p fails inside an LLM call, useful diagnostic context often
|
||||
appears in stdout (partial stream-json events, model error JSON), not
|
||||
stderr. Including stdout tail in the RuntimeError message dramatically
|
||||
improves debug-ability without adding any new dependency.
|
||||
"""
|
||||
|
||||
def test_error_message_contains_stdout_tail(self, tmp_path, monkeypatch):
|
||||
scenario = _FakeScenario(id="e1", prompt="x", setup_commands=())
|
||||
monkeypatch.setattr("scripts.runner._setup_sandbox", lambda *a, **kw: None)
|
||||
|
||||
diagnostic_marker = "DIAG_STDOUT_MARKER_xyz123"
|
||||
fake_result = subprocess.CompletedProcess(
|
||||
args=["claude"],
|
||||
returncode=2,
|
||||
stdout=f"some context {diagnostic_marker} more text",
|
||||
stderr="generic error",
|
||||
)
|
||||
|
||||
with patch("scripts.runner.subprocess.run", return_value=fake_result):
|
||||
with pytest.raises(RuntimeError) as excinfo:
|
||||
run_scenario(scenario, model="haiku")
|
||||
|
||||
# Stdout marker MUST appear in the error message
|
||||
assert diagnostic_marker in str(excinfo.value)
|
||||
skills/vite-patterns/SKILL.md (new file, 449 lines)
@@ -0,0 +1,449 @@
|
||||
---
|
||||
name: vite-patterns
|
||||
description: Vite build tool patterns including config, plugins, HMR, env variables, proxy setup, SSR, library mode, dependency pre-bundling, and build optimization. Activate when working with vite.config.ts, Vite plugins, or Vite-based projects.
|
||||
origin: ECC
|
||||
---
|
||||
|
||||
# Vite Patterns
|
||||
|
||||
Build tool and dev server patterns for Vite 8+ projects. Covers configuration, environment variables, proxy setup, library mode, dependency pre-bundling, and common production pitfalls.
|
||||
|
||||
## When to Use
|
||||
|
||||
- Configuring `vite.config.ts` or `vite.config.js`
|
||||
- Setting up environment variables or `.env` files
|
||||
- Configuring dev server proxy for API backends
|
||||
- Optimizing build output (chunks, minification, assets)
|
||||
- Publishing libraries with `build.lib`
|
||||
- Troubleshooting dependency pre-bundling or CJS/ESM interop
|
||||
- Debugging HMR, dev server, or build errors
|
||||
- Choosing or ordering Vite plugins
|
||||
|
||||
## How It Works
|
||||
|
||||
- **Dev mode** serves source files as native ESM — no bundling. Transforms happen on-demand per module request, which is why cold starts are fast and HMR is precise.
|
||||
- **Build mode** uses Rolldown (v7+) or Rollup (v5–v6) to bundle the app for production with tree-shaking, code-splitting, and Oxc-based minification.
|
||||
- **Dependency pre-bundling** converts CJS/UMD deps to ESM once via esbuild and caches the result under `node_modules/.vite`, so subsequent starts skip the work.
|
||||
- **Plugins** share a unified interface across dev and build — the same plugin object works for both the dev server's on-demand transforms and the production pipeline.
|
||||
- **Environment variables** are statically inlined at build time. `VITE_`-prefixed vars become public constants in the bundle; everything unprefixed is invisible to client code.
|
||||
|
||||
## Examples
|
||||
|
||||
### Config Structure
|
||||
|
||||
#### Basic Config
|
||||
|
||||
```typescript
|
||||
// vite.config.ts
|
||||
import { defineConfig } from 'vite'
|
||||
import react from '@vitejs/plugin-react'
|
||||
|
||||
export default defineConfig({
|
||||
plugins: [react()],
|
||||
resolve: {
|
||||
alias: { '@': new URL('./src', import.meta.url).pathname },
|
||||
},
|
||||
})
|
||||
```
|
||||
|
||||
#### Conditional Config
|
||||
|
||||
```typescript
|
||||
// vite.config.ts
|
||||
import { defineConfig, loadEnv } from 'vite'
|
||||
import react from '@vitejs/plugin-react'
|
||||
|
||||
export default defineConfig(({ command, mode }) => {
|
||||
const env = loadEnv(mode, process.cwd()) // VITE_ prefixed only (safe)
|
||||
|
||||
return {
|
||||
plugins: [react()],
|
||||
server: command === 'serve' ? { port: 3000 } : undefined,
|
||||
define: {
|
||||
__API_URL__: JSON.stringify(env.VITE_API_URL),
|
||||
},
|
||||
}
|
||||
})
|
||||
```
|
||||
|
||||
#### Key Config Options
|
||||
|
||||
| Key | Default | Description |
|
||||
|-----|---------|-------------|
|
||||
| `root` | `'.'` | Project root (where `index.html` lives) |
|
||||
| `base` | `'/'` | Public base path for deployed assets |
|
||||
| `envPrefix` | `'VITE_'` | Prefix for client-exposed env vars |
|
||||
| `build.outDir` | `'dist'` | Output directory |
|
||||
| `build.minify` | `'oxc'` | Minifier (`'oxc'`, `'terser'`, or `false`) |
|
||||
| `build.sourcemap` | `false` | `true`, `'inline'`, or `'hidden'` |
|
||||
|
||||
### Plugins

#### Essential Plugins

Most plugin needs are covered by a handful of well-maintained packages. Reach for these before writing your own.

| Plugin | Purpose | When to use |
|--------|---------|-------------|
| `@vitejs/plugin-react-swc` | React HMR + Fast Refresh via SWC | Default for React apps (faster than Babel variant) |
| `@vitejs/plugin-react` | React HMR + Fast Refresh via Babel | Only if you need Babel plugins (emotion, MobX decorators) |
| `@vitejs/plugin-vue` | Vue 3 SFC support | Vue apps |
| `vite-plugin-checker` | Runs `tsc` + ESLint in worker thread with HMR overlay | **Any TypeScript app** — Vite does NOT type-check during `vite build` |
| `vite-tsconfig-paths` | Honors `tsconfig.json` `paths` aliases | Any time you already have aliases in `tsconfig.json` |
| `vite-plugin-dts` | Emits `.d.ts` files in library mode | Publishing TypeScript libraries |
| `vite-plugin-svgr` | Imports SVGs as React components | React apps using SVGs as components |
| `rollup-plugin-visualizer` | Bundle treemap/sunburst report | Periodic bundle size audits (use `enforce: 'post'`) |
| `vite-plugin-pwa` | Zero-config PWA + Workbox | Offline-capable apps |

**Critical callout:** `vite build` transpiles but does NOT type-check. Type errors silently ship to production unless you add `vite-plugin-checker` or run `tsc --noEmit` in CI.

#### Authoring Custom Plugins

Authoring is rare — most needs are covered by existing plugins. When you do need one, start inline in `vite.config.ts` and only extract if reused.

```typescript
// vite.config.ts — minimal inline plugin
import type { Plugin } from 'vite'

function myPlugin(): Plugin {
  return {
    name: 'my-plugin', // required, must be unique
    enforce: 'pre',    // 'pre' | 'post' (optional)
    apply: 'build',    // 'build' | 'serve' (optional)
    transform(code, id) {
      if (!id.endsWith('.custom')) return
      return { code: transformCustom(code), map: null }
    },
  }
}
```

**Key hooks:** `transform` (modify source), `resolveId` + `load` (virtual modules), `transformIndexHtml` (inject into HTML), `configureServer` (add dev middleware), `hotUpdate` (custom HMR — replaces deprecated `handleHotUpdate` in v7+).

**Virtual modules** use the `\0` prefix convention — `resolveId` returns `'\0virtual:my-id'` so other plugins skip it. User code imports `'virtual:my-id'`.

For the full plugin API, see [vite.dev/guide/api-plugin](https://vite.dev/guide/api-plugin). Use `vite-plugin-inspect` during development to debug the transform pipeline.

### HMR API

Framework plugins (`@vitejs/plugin-react`, `@vitejs/plugin-vue`, etc.) handle HMR automatically. Reach for `import.meta.hot` directly only when building custom state stores, dev tools, or framework-agnostic utilities that need to persist state across updates.

```typescript
// src/store.ts — manual HMR for a vanilla module
if (import.meta.hot) {
  // Persist state across updates (must MUTATE, never reassign .data)
  import.meta.hot.data.count = import.meta.hot.data.count ?? 0

  // Cleanup side effects before module is replaced
  import.meta.hot.dispose((data) => clearInterval(data.intervalId))

  // Accept this module's own updates
  import.meta.hot.accept()
}
```

All `import.meta.hot` code is tree-shaken out of production builds — no guard removal needed.
### Environment Variables

Vite loads `.env`, `.env.local`, `.env.[mode]`, and `.env.[mode].local` in that order (later overrides earlier); `*.local` files are gitignored and meant for local secrets.

#### Client-Side Access

Only `VITE_`-prefixed vars are exposed to client code:

```typescript
import.meta.env.VITE_API_URL // string
import.meta.env.MODE         // 'development' | 'production' | custom
import.meta.env.BASE_URL     // base config value
import.meta.env.DEV          // boolean
import.meta.env.PROD         // boolean
import.meta.env.SSR          // boolean
```

#### Using Env in Config

```typescript
// vite.config.ts
import { defineConfig, loadEnv } from 'vite'

export default defineConfig(({ mode }) => {
  const env = loadEnv(mode, process.cwd()) // VITE_ prefixed only (safe)
  return {
    define: {
      __API_URL__: JSON.stringify(env.VITE_API_URL),
    },
  }
})
```

### Security

#### `VITE_` Prefix is NOT a Security Boundary

Any variable prefixed with `VITE_` is **statically inlined into the client bundle at build time**. Minification, base64 encoding, and disabling source maps do NOT hide it. A determined attacker can extract any `VITE_` var from the shipped JavaScript.

**Rule:** Only public values (API URLs, feature flags, public keys) go in `VITE_` vars. Secrets (API tokens, database URLs, private keys) MUST live server-side behind an API or serverless function.

#### The `loadEnv('')` Trap

```typescript
// BAD: passing '' as the third arg loads ALL env vars — including server secrets —
// and makes them available to inline into client code via `define`.
const env = loadEnv(mode, process.cwd(), '')

// GOOD: explicit prefix list
const env = loadEnv(mode, process.cwd(), ['VITE_', 'APP_'])
```

#### Source Maps in Production

Production source maps leak your original source code. Disable them unless you upload to an error tracker (Sentry, Bugsnag) and delete locally afterward:

```typescript
build: {
  sourcemap: false, // default — keep it this way
}
```

#### `.gitignore` Checklist

- `.env.local`, `.env.*.local` — local secret overrides
- `dist/` — build output
- `node_modules/.vite` — pre-bundle cache (stale entries cause phantom errors)
### Server Proxy

```typescript
// vite.config.ts — server.proxy
server: {
  proxy: {
    '/foo': 'http://localhost:4567', // string shorthand

    '/api': {
      target: 'http://localhost:8080',
      changeOrigin: true, // needed for virtual-hosted backends
      rewrite: (path) => path.replace(/^\/api/, ''),
    },
  },
}
```

For WebSocket proxying, add `ws: true` to the route config.

### Build Optimization

#### Manual Chunks

```typescript
// vite.config.ts — build.rolldownOptions
build: {
  rolldownOptions: {
    output: {
      // Object form: group specific packages
      manualChunks: {
        'react-vendor': ['react', 'react-dom'],
        'ui-vendor': ['@radix-ui/react-dialog', '@radix-ui/react-popover'],
      },
    },
  },
}
```

```typescript
// Function form: split by heuristic
manualChunks(id) {
  if (id.includes('node_modules/react')) return 'react-vendor'
  if (id.includes('node_modules')) return 'vendor'
}
```

### Performance

#### Avoid Barrel Files

Barrel files (`index.ts` re-exporting everything from a directory) force Vite to load every re-exported file even when you import a single symbol. This is the #1 dev-server slowdown flagged by the official docs.

```typescript
// BAD — importing one util forces Vite to load the whole barrel
import { slash } from '@/utils'

// GOOD — direct import, only the one file is loaded
import { slash } from '@/utils/slash'
```

#### Be Explicit with Import Extensions

Each implicit extension forces up to 6 filesystem checks via `resolve.extensions`. In large codebases, this adds up.

```typescript
// BAD
import Component from './Component'

// GOOD
import Component from './Component.tsx'
```

Narrow `tsconfig.json` `allowImportingTsExtensions` + `resolve.extensions` to only the extensions you actually use.

#### Warm-Up Hot-Path Routes

`server.warmup.clientFiles` pre-transforms known hot entries before the browser requests them — eliminating the cold-load request waterfall on large apps.

```typescript
// vite.config.ts
server: {
  warmup: {
    clientFiles: ['./src/main.tsx', './src/routes/**/*.tsx'],
  },
}
```

#### Profiling Slow Dev Servers

When `vite dev` feels slow, start with `vite --profile`, interact with the app, then press `p+enter` to save a `.cpuprofile`. Load it in [Speedscope](https://www.speedscope.app) to find which plugins are eating time — usually `buildStart`, `config`, or `configResolved` hooks in community plugins.
### Library Mode

When publishing an npm package, use `build.lib`. Two footguns matter more than config detail:

1. **Types are not emitted** — add `vite-plugin-dts` or run `tsc --emitDeclarationOnly` separately.
2. **Peer dependencies MUST be externalized** — unlisted peers get bundled into your library, causing duplicate-runtime errors in consumers.

```typescript
// vite.config.ts
build: {
  lib: {
    entry: 'src/index.ts',
    formats: ['es', 'cjs'],
    fileName: (format) => `my-lib.${format}.js`,
  },
  rolldownOptions: {
    external: ['react', 'react-dom', 'react/jsx-runtime'], // every peer dep
  },
}
```

### SSR Externals

Bare `createServer({ middlewareMode: true })` setups are framework-author territory. Most apps should use Nuxt, Remix, SvelteKit, Astro, or TanStack Start instead. What you *will* tweak as a framework user is the externals config when deps break in SSR:

```typescript
// vite.config.ts — ssr options
ssr: {
  external: ['node-native-package'], // keep as require() in SSR bundle
  noExternal: ['esm-only-package'],  // force-bundle into SSR output (fixes most SSR errors)
  target: 'node',                    // 'node' or 'webworker'
}
```

### Dependency Pre-Bundling

Vite pre-bundles dependencies to convert CJS/UMD to ESM and reduce request count.

```typescript
// vite.config.ts — optimizeDeps
optimizeDeps: {
  include: [
    'lodash-es',              // force pre-bundle known heavy deps
    'cjs-package',            // CJS deps that cause interop issues
    'deep-lib/components/**', // glob for deep imports
  ],
  exclude: ['local-esm-package'], // must be valid ESM if excluded
  force: true,                    // ignore cache, re-optimize (temporary debugging)
}
```
### Common Pitfalls

#### Dev Does Not Match Build

Dev uses esbuild/Rolldown for transforms; build uses Rolldown for bundling. CJS libraries can behave differently between the two. Always verify with `vite build && vite preview` before deploying.

#### Stale Chunks After Deployment

New builds produce new chunk hashes. Users with active sessions request old filenames that no longer exist. Vite has no built-in solution. Mitigations:

- Keep old `dist/assets/` files live for a deployment window
- Catch dynamic import errors in your router and force a page reload

#### Docker and Containers

Vite binds to `localhost` by default, which is unreachable from outside a container:

```typescript
// vite.config.ts — Docker/container setup
server: {
  host: true,                // bind 0.0.0.0
  hmr: { clientPort: 3000 }, // if behind a reverse proxy
}
```

#### Monorepo File Access

Vite restricts file serving to the project root. Packages outside root are blocked:

```typescript
// vite.config.ts — monorepo file access
server: {
  fs: {
    allow: ['..'], // allow parent directory (workspace root)
  },
}
```

### Anti-Patterns

```typescript
// BAD: Setting envPrefix to '' exposes ALL env vars (including secrets) to the client
envPrefix: ''

// BAD: Assuming require() works in application source code — Vite is ESM-first
const lib = require('some-lib') // use import instead

// BAD: Splitting every node_module into its own chunk — creates hundreds of tiny files
manualChunks(id) {
  if (id.includes('node_modules')) {
    return id.split('node_modules/')[1].split('/')[0] // one chunk per package
  }
}

// BAD: Not externalizing peer deps in library mode — causes duplicate runtime errors
// build.lib without rolldownOptions.external

// BAD: Using deprecated esbuild minifier
build: { minify: 'esbuild' } // use 'oxc' (default) or 'terser'

// BAD: Mutating import.meta.hot.data by reassignment
import.meta.hot.data = { count: 0 } // WRONG: must mutate properties, not reassign
import.meta.hot.data.count = 0      // CORRECT
```

**Process anti-patterns:**

- **`vite preview` is NOT a production server** — it is a smoke test for the built bundle. Deploy `dist/` to a real static host (NGINX, Cloudflare Pages, Vercel static) or use a multi-stage Dockerfile.
- **Expecting `vite build` to type-check** — it only transpiles. Type errors silently ship to production. Add `vite-plugin-checker` or run `tsc --noEmit` in CI.
- **Shipping `@vitejs/plugin-legacy` by default** — it bloats bundles ~40%, breaks source-map bundle analyzers, and is unnecessary for the 95%+ of users on modern browsers. Gate it on real analytics, not assumption.
- **Hand-rolling 30+ `resolve.alias` entries that duplicate `tsconfig.json` paths** — use `vite-tsconfig-paths` instead. Observed in Excalidraw and PostHog; avoid in new projects.
- **Leaving stale `node_modules/.vite` after dep changes** — pre-bundle cache causes phantom errors. Clear it when switching branches or after patching deps.

## Quick Reference

| Pattern | When to Use |
|---------|-------------|
| `defineConfig` | Always — provides type inference |
| `loadEnv(mode, root, ['VITE_'])` | Access env vars in config (explicit prefix) |
| `vite-plugin-checker` | Any TypeScript app (fills the type-check gap) |
| `vite-tsconfig-paths` | Instead of hand-rolled `resolve.alias` |
| `optimizeDeps.include` | CJS deps causing interop issues |
| `server.proxy` | Route API requests to backend in dev |
| `server.host: true` | Docker, containers, remote access |
| `server.warmup.clientFiles` | Pre-transform hot-path routes |
| `build.lib` + `external` | Publishing npm packages |
| `manualChunks` (object) | Vendor bundle splitting |
| `vite --profile` | Debug slow dev server |
| `vite build && vite preview` | Smoke-test prod bundle locally (NOT a prod server) |

## Related Skills

- `frontend-patterns` — React component patterns
- `docker-patterns` — containerized dev with Vite
- `nextjs-turbopack` — alternative bundler for Next.js
@@ -1,13 +1,23 @@
 """Prompt module for prompt building and normalization."""

 from llm.prompt.builder import PromptBuilder, adapt_messages_for_provider, get_provider_builder
-from llm.prompt.templates import TEMPLATES, get_template, get_template_or_default
+from llm.prompt.templates import (
+    TEMPLATES,
+    clear_templates,
+    deregister_template,
+    get_template,
+    get_template_or_default,
+    register_template,
+)

 __all__ = (
     "PromptBuilder",
     "TEMPLATES",
+    "clear_templates",
+    "deregister_template",
     "adapt_messages_for_provider",
     "get_provider_builder",
     "get_template",
     "get_template_or_default",
+    "register_template",
 )
@@ -19,9 +19,32 @@ class PromptConfig:
     tool_format: str = "native"


 class PromptBuilder:
-    def __init__(self, config: PromptConfig | None = None) -> None:
-        self.config = config or PromptConfig()
+    def __init__(
+        self,
+        config: PromptConfig | None = None,
+        *,
+        system_template: str | None = None,
+        user_template: str | None = None,
+        include_tools_in_system: bool | None = None,
+        tool_format: str | None = None,
+    ) -> None:
+        if config is not None and any(
+            value is not None
+            for value in (system_template, user_template, include_tools_in_system, tool_format)
+        ):
+            raise ValueError("Pass either config or PromptBuilder keyword options, not both")
+
+        if config is None:
+            overrides = {
+                "system_template": system_template,
+                "user_template": user_template,
+                "include_tools_in_system": include_tools_in_system,
+                "tool_format": tool_format,
+            }
+            config = PromptConfig(**{key: value for key, value in overrides.items() if value is not None})
+
+        self.config = config

     def build(self, messages: list[Message], tools: list[ToolDefinition] | None = None) -> list[Message]:
         if not messages:
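The config-vs-keyword guard in this hunk can be exercised standalone. The sketch below mirrors the diff's logic with a dataclass stand-in for `PromptConfig` (fields trimmed to two; nothing here is imported from the actual `llm` package):

```python
from dataclasses import dataclass


@dataclass
class PromptConfig:
    # Stand-in mirroring two of the fields the hunk touches.
    system_template: str = ""
    tool_format: str = "native"


class PromptBuilder:
    def __init__(self, config=None, *, system_template=None, tool_format=None):
        # Reject ambiguous calls: either a full config object or keyword overrides.
        if config is not None and any(v is not None for v in (system_template, tool_format)):
            raise ValueError("Pass either config or PromptBuilder keyword options, not both")
        if config is None:
            overrides = {"system_template": system_template, "tool_format": tool_format}
            # Only non-None keywords override the dataclass defaults.
            config = PromptConfig(**{k: v for k, v in overrides.items() if v is not None})
        self.config = config


builder = PromptBuilder(system_template="You are a pirate.")
print(builder.config.system_template)  # -> You are a pirate.
```

Filtering out `None` before building the config is what keeps the keyword path from clobbering dataclass defaults the caller never mentioned.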
@@ -1 +1,41 @@
-# Templates module for provider-specific prompt templates
+"""Provider-specific prompt template helpers."""
+
+from __future__ import annotations
+
+_TEMPLATE_REGISTRY: dict[str, str] = {}
+TEMPLATES = _TEMPLATE_REGISTRY
+
+
+def _validate_template_input(name: str, template: str | None = None) -> None:
+    """Validate template registry inputs before mutating the registry."""
+    if not isinstance(name, str) or not name.strip():
+        raise ValueError("Template name must be a non-empty string")
+    if template is not None and (not isinstance(template, str) or not template.strip()):
+        raise ValueError("Template content must be a non-empty string")
+
+
+def register_template(name: str, template: str) -> None:
+    """Register or replace a named prompt template."""
+    _validate_template_input(name, template)
+    _TEMPLATE_REGISTRY[name] = template
+
+
+def deregister_template(name: str) -> None:
+    """Remove a named prompt template if it is registered."""
+    _validate_template_input(name)
+    _TEMPLATE_REGISTRY.pop(name, None)
+
+
+def clear_templates() -> None:
+    """Remove all registered prompt templates."""
+    _TEMPLATE_REGISTRY.clear()
+
+
+def get_template(name: str) -> str | None:
+    """Return a named prompt template when one is registered."""
+    return _TEMPLATE_REGISTRY.get(name)
+
+
+def get_template_or_default(name: str, default: str = "") -> str:
+    """Return a named prompt template or a caller-provided default."""
+    return _TEMPLATE_REGISTRY.get(name, default)
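The registry is a plain module-level dict behind small helper functions, with `TEMPLATES` kept as a public alias for existing callers. A standalone sketch of the same round-trip (names copied from the hunk, logic re-implemented here rather than imported):

```python
_TEMPLATE_REGISTRY: dict[str, str] = {}
TEMPLATES = _TEMPLATE_REGISTRY  # public alias: mutations through either name are visible in both


def register_template(name: str, template: str) -> None:
    # Simplified validation; the real module splits this into a helper.
    if not name.strip() or not template.strip():
        raise ValueError("name and template must be non-empty strings")
    _TEMPLATE_REGISTRY[name] = template


def get_template_or_default(name: str, default: str = "") -> str:
    return _TEMPLATE_REGISTRY.get(name, default)


register_template("system", "You are helpful.")
print(get_template_or_default("system"))               # -> You are helpful.
print(get_template_or_default("missing", "fallback"))  # -> fallback
```

Because `TEMPLATES` aliases the same dict rather than copying it, code that still writes `TEMPLATES["legacy"] = ...` directly keeps working alongside the new helper functions.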
@@ -57,27 +57,39 @@ class ClaudeProvider(LLMProvider):
         }
         if input.max_tokens:
             params["max_tokens"] = input.max_tokens
         else:
             params["max_tokens"] = 8192  # required by Anthropic API
         if input.tools:
             params["tools"] = [tool.to_anthropic_tool() for tool in input.tools]

         response = self.client.messages.create(**params)

-        tool_calls = None
-        if response.content and hasattr(response.content[0], "type"):
-            if response.content[0].type == "tool_use":
-                tool_calls = [
-                    ToolCall(
-                        id=getattr(response.content[0], "id", ""),
-                        name=getattr(response.content[0], "name", ""),
-                        arguments=getattr(response.content[0].input, "__dict__", {}),
-                    )
-                ]
-
-        return LLMOutput(
-            content=response.content[0].text if response.content else "",
-            tool_calls=tool_calls,
+        text_parts: list[str] = []
+        tool_calls: list[ToolCall] = []
+        for block in response.content or []:
+            block_type = getattr(block, "type", None)
+            if block_type == "text":
+                text = getattr(block, "text", "")
+                if text:
+                    text_parts.append(text)
+            elif block_type == "tool_use":
+                raw_arguments = getattr(block, "input", {})
+                arguments = (
+                    raw_arguments.copy()
+                    if isinstance(raw_arguments, dict)
+                    else getattr(raw_arguments, "__dict__", {}).copy()
+                )
+                tool_calls.append(
+                    ToolCall(
+                        id=getattr(block, "id", ""),
+                        name=getattr(block, "name", ""),
+                        arguments=arguments,
+                    )
+                )
+
+        return LLMOutput(
+            content="".join(text_parts),
+            tool_calls=tool_calls or None,
             model=response.model,
             usage={
                 "input_tokens": response.usage.input_tokens,
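The parsing change can be illustrated without the Anthropic SDK: the new loop folds every `text` block into one string and collects every `tool_use` block, instead of inspecting only `content[0]`. A sketch using `SimpleNamespace` stand-ins for response blocks (the `fold_content` helper name is mine, not from the diff):

```python
from types import SimpleNamespace


def fold_content(blocks):
    # Mirror of the new loop: gather all text and all tool_use blocks.
    text_parts, tool_calls = [], []
    for block in blocks or []:
        kind = getattr(block, "type", None)
        if kind == "text" and getattr(block, "text", ""):
            text_parts.append(block.text)
        elif kind == "tool_use":
            raw = getattr(block, "input", {})
            # Copy so later mutation of the provider's object can't leak into our result.
            args = raw.copy() if isinstance(raw, dict) else vars(raw).copy()
            tool_calls.append({"id": block.id, "name": block.name, "arguments": args})
    return "".join(text_parts), (tool_calls or None)


content, calls = fold_content([
    SimpleNamespace(type="text", text="I will search. "),
    SimpleNamespace(type="tool_use", id="toolu_1", name="search", input={"query": "claude"}),
    SimpleNamespace(type="text", text="Done."),
])
print(content)  # -> I will search. Done.
```

The old code silently dropped any tool call that was not the first content block; the loop shape fixes that and also handles `input` arriving as either a dict or an attribute object.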
@@ -2002,6 +2002,28 @@ function runTests() {
     cleanupTestDir(testDir);
   })) passed++; else failed++;

+  if (test('rejects agent with duplicate top-level frontmatter keys', () => {
+    const testDir = createTestDir();
+    fs.writeFileSync(path.join(testDir, 'dup-model.md'),
+      '---\nname: dup\nmodel: sonnet\ntools: Read, Write\ndescription: test\nmodel: opus\n---\n# Agent');
+
+    const result = runValidatorWithDir('validate-agents', 'AGENTS_DIR', testDir);
+    assert.strictEqual(result.code, 1, 'Should reject duplicate top-level YAML keys');
+    assert.ok(result.stderr.includes('Duplicate frontmatter keys'), 'Should report duplicate keys');
+    assert.ok(result.stderr.includes('model'), 'Should name the duplicated key');
+    cleanupTestDir(testDir);
+  })) passed++; else failed++;
+
+  if (test('allows duplicate-looking nested frontmatter keys', () => {
+    const testDir = createTestDir();
+    fs.writeFileSync(path.join(testDir, 'nested.md'),
+      '---\nmodel: sonnet\ntools: Read\nmetadata:\n  model: display-only\n---\n# Agent');
+
+    const result = runValidatorWithDir('validate-agents', 'AGENTS_DIR', testDir);
+    assert.strictEqual(result.code, 0, 'Indented nested keys should not count as top-level duplicates');
+    cleanupTestDir(testDir);
+  })) passed++; else failed++;
+
   // ── Round 32: empty frontmatter & edge cases ──
   console.log('\nRound 32: validate-agents (empty frontmatter):');
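The rule these two tests pin down — duplicate keys at column 0 fail, indented keys nested under a map do not — can be sketched in a few lines of Python (the JS validator's actual implementation may differ; this only illustrates the check):

```python
def duplicate_top_level_keys(frontmatter: str) -> list[str]:
    # Only keys that start at column 0 count; indented keys belong to nested maps.
    seen, dupes = set(), []
    for line in frontmatter.splitlines():
        if not line or line[0] in (" ", "\t", "#"):
            continue  # blank, indented, or comment line: never a top-level key
        key, sep, _ = line.partition(":")
        if not sep:
            continue  # not a key: value line
        if key in seen and key not in dupes:
            dupes.append(key)
        seen.add(key)
    return dupes


print(duplicate_top_level_keys("name: dup\nmodel: sonnet\nmodel: opus"))          # -> ['model']
print(duplicate_top_level_keys("model: sonnet\nmetadata:\n  model: display-only"))  # -> []
```

Skipping indented lines is what lets `metadata:\n  model: display-only` coexist with a top-level `model:` key, matching the second test case.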
@@ -1,4 +1,10 @@
 import sys
 from pathlib import Path

+import pytest
+
 sys.path.insert(0, str(Path(__file__).parent.parent / "src"))
+
+
+def pytest_configure(config: pytest.Config) -> None:
+    config.addinivalue_line("markers", "unit: marks fast unit tests")
@@ -24,22 +24,37 @@ class TestPromptBuilder:
         assert len(result) == 2
         assert result[0].role == Role.SYSTEM

-    def test_build_adds_system_from_config(self):
-        messages = [Message(role=Role.USER, content="Hello")]
-        builder = PromptBuilder(system_template="You are a pirate.")
-        result = builder.build(messages)
-
-        assert len(result) == 2
-        assert "pirate" in result[0].content
-
-    def test_build_adds_system_from_config(self):
-        messages = [Message(role=Role.USER, content="Hello")]
-        builder = PromptBuilder(config=PromptConfig(system_template="You are a pirate."))
-        result = builder.build(messages)
-
-        assert len(result) == 2
-        assert "pirate" in result[0].content
+    def test_build_adds_system_from_keyword_options(self):
+        messages = [Message(role=Role.USER, content="Hello")]
+        builder = PromptBuilder(system_template="You are a pirate.")
+        result = builder.build(messages)
+
+        assert len(result) == 2
+        assert "pirate" in result[0].content
+
+    def test_build_adds_system_from_prompt_config(self):
+        messages = [Message(role=Role.USER, content="Hello")]
+        builder = PromptBuilder(config=PromptConfig(system_template="You are a pirate."))
+        result = builder.build(messages)
+
+        assert len(result) == 2
+        assert "pirate" in result[0].content
+
+    def test_rejects_config_with_keyword_options(self):
+        with pytest.raises(ValueError, match="Pass either config or PromptBuilder keyword options"):
+            PromptBuilder(
+                config=PromptConfig(system_template="Configured."),
+                system_template="Keyword override.",
+            )
+
+    def test_empty_system_template_does_not_add_blank_system_message(self):
+        messages = [Message(role=Role.USER, content="Hello")]
+        builder = PromptBuilder(system_template="")
+        result = builder.build(messages)
+
+        assert result == messages

     def test_build_with_tools(self):
         messages = [Message(role=Role.USER, content="Search for something")]
         tools = [
             ToolDefinition(name="search", description="Search the web", parameters={}),
tests/test_claude_provider.py (new file, 111 lines)
@@ -0,0 +1,111 @@
from types import SimpleNamespace
from typing import Any

import pytest

from llm.core.types import LLMInput, Message, Role
from llm.providers.claude import ClaudeProvider


class FakeMessages:
    def __init__(self, response: SimpleNamespace) -> None:
        self.response = response

    def create(self, **_params: object) -> SimpleNamespace:
        return self.response


class FakeClient:
    def __init__(self, response: SimpleNamespace) -> None:
        self.messages = FakeMessages(response)
        self.api_key = "test-key"


def make_provider(response: SimpleNamespace) -> ClaudeProvider:
    provider = ClaudeProvider(api_key="test-key")
    provider.client = FakeClient(response)
    return provider


def make_response(content: list[SimpleNamespace], stop_reason: str = "tool_use") -> SimpleNamespace:
    return SimpleNamespace(
        content=content,
        model="claude-test",
        usage=SimpleNamespace(input_tokens=3, output_tokens=5),
        stop_reason=stop_reason,
    )


@pytest.mark.unit
def test_generate_collects_text_and_tool_use_blocks() -> None:
    provider = make_provider(
        make_response(
            [
                SimpleNamespace(type="text", text="I will search. "),
                SimpleNamespace(type="tool_use", id="toolu_1", name="search", input={"query": "claude"}),
                SimpleNamespace(type="text", text="Done."),
            ]
        )
    )

    output = provider.generate(LLMInput(messages=[Message(role=Role.USER, content="Search")]))

    assert output.content == "I will search. Done."
    assert output.tool_calls is not None
    assert len(output.tool_calls) == 1
    assert output.tool_calls[0].id == "toolu_1"
    assert output.tool_calls[0].name == "search"
    assert output.tool_calls[0].arguments == {"query": "claude"}


@pytest.mark.unit
def test_generate_collects_multiple_tool_use_blocks() -> None:
    provider = make_provider(
        make_response(
            [
                SimpleNamespace(type="tool_use", id="toolu_1", name="search", input={"query": "claude"}),
                SimpleNamespace(
                    type="tool_use",
                    id="toolu_2",
                    name="read",
                    input=SimpleNamespace(path="README.md"),
                ),
            ]
        )
    )

    output = provider.generate(LLMInput(messages=[Message(role=Role.USER, content="Use tools")]))

    assert output.content == ""
    assert [call.id for call in output.tool_calls or []] == ["toolu_1", "toolu_2"]
    assert (output.tool_calls or [])[1].arguments == {"path": "README.md"}


@pytest.mark.unit
def test_generate_copies_tool_use_dict_arguments() -> None:
    raw_arguments: dict[str, Any] = {"query": "claude"}
    provider = make_provider(
        make_response(
            [SimpleNamespace(type="tool_use", id="toolu_1", name="search", input=raw_arguments)]
        )
    )

    output = provider.generate(LLMInput(messages=[Message(role=Role.USER, content="Use tools")]))
    raw_arguments["query"] = "mutated"

    assert (output.tool_calls or [])[0].arguments == {"query": "claude"}


@pytest.mark.unit
def test_generate_text_only_has_no_tool_calls() -> None:
    provider = make_provider(
        make_response(
            [SimpleNamespace(type="text", text="Hello.")],
            stop_reason="end_turn",
        )
    )

    output = provider.generate(LLMInput(messages=[Message(role=Role.USER, content="Hi")]))

    assert output.content == "Hello."
    assert output.tool_calls is None
tests/test_templates.py (new file, 71 lines)
@@ -0,0 +1,71 @@
import pytest

from llm.prompt import (
    TEMPLATES,
    clear_templates,
    deregister_template,
    get_template,
    get_template_or_default,
    register_template,
)


@pytest.fixture(autouse=True)
def restore_template_registry():
    snapshot = dict(TEMPLATES)
    clear_templates()
    yield
    try:
        clear_templates()
    finally:
        TEMPLATES.update(snapshot)


@pytest.mark.unit
def test_register_template_exposes_public_template_mapping():
    register_template("system", "You are helpful.")

    assert get_template("system") == "You are helpful."
    assert get_template_or_default("missing", "fallback") == "fallback"
    assert TEMPLATES["system"] == "You are helpful."


@pytest.mark.unit
def test_templates_mapping_remains_mutable_for_existing_callers():
    TEMPLATES["legacy"] = "Use the existing public mapping."

    assert get_template("legacy") == "Use the existing public mapping."


@pytest.mark.unit
def test_deregister_template_removes_named_template():
    register_template("system", "You are helpful.")

    deregister_template("system")

    assert get_template("system") is None


@pytest.mark.unit
def test_clear_templates_removes_all_registered_templates():
    register_template("system", "You are helpful.")
    register_template("user", "Answer clearly.")

    clear_templates()

    assert TEMPLATES == {}


@pytest.mark.unit
@pytest.mark.parametrize(
    ("name", "template", "error_match"),
    [
        ("", "content", "Template name must be a non-empty string"),
        (" ", "content", "Template name must be a non-empty string"),
        ("system", "", "Template content must be a non-empty string"),
        ("system", " ", "Template content must be a non-empty string"),
    ],
)
def test_register_template_rejects_empty_inputs(name, template, error_match):
    with pytest.raises(ValueError, match=error_match):
        register_template(name, template)