Found 13 Skills
Authors and consumes feature-level domain knowledge files in ai-context/features/. Reference guide for bounded-context business rules, invariants, integration points, and known gotchas.
Deploys the complete SDD architecture with engram persistence and ai-context/ memory layer in the current project. Trigger: /project-setup, initialize new project, setup SDD, configure claude project.
Closes a completed SDD change by saving an archive report to engram and optionally updating ai-context/ memory. Trigger: /sdd-archive <change-name>, archive change, finalize SDD cycle, close change.
Manages the ai-context/ memory layer: initialize from scratch, update with session work, or maintain/cleanup. Trigger: /memory-init, /memory-update, /memory-maintain, initialize memory, update memory, maintain memory.
Exports the project's Claude configuration (CLAUDE.md + ai-context/) to tool-specific instruction files for GitHub Copilot, Google Gemini, and Cursor. Trigger: /config-export, export config, copilot instructions, gemini config, cursor rules.
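The export described above can be sketched as a simple file-mapping step. The target paths below (`.github/copilot-instructions.md`, `GEMINI.md`, `.cursor/rules/project.mdc`) are the conventional instruction-file locations for each tool, and the concatenation logic is an illustrative assumption, not the skill's actual implementation:

```python
from pathlib import Path

# Conventional instruction-file locations per tool (an assumption;
# the skill's actual output layout may differ).
TARGETS = {
    "copilot": Path(".github/copilot-instructions.md"),
    "gemini": Path("GEMINI.md"),
    "cursor": Path(".cursor/rules/project.mdc"),
}

def export_config(root: Path) -> list[Path]:
    """Concatenate CLAUDE.md and ai-context/ markdown files, then write
    one copy per target tool. Returns the paths written."""
    sources = [root / "CLAUDE.md"] + sorted((root / "ai-context").rglob("*.md"))
    body = "\n\n".join(p.read_text() for p in sources if p.is_file())
    written = []
    for rel in TARGETS.values():
        out = root / rel
        out.parent.mkdir(parents=True, exist_ok=True)
        out.write_text(body)
        written.append(out)
    return written
```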
Analyzes project bounded contexts, extracts business rules and domain knowledge, writes ai-context/features/<context>.md files, and produces a teach-report.md with documentation coverage metrics. Trigger: /codebase-teach, teach codebase, extract domain knowledge, update feature docs.
Documents brownfield projects for AI context. Use when the user says "document this project" or "generate project docs".
Scan a Cargo workspace or package monorepo and refresh per-member `CLAUDE.md` files plus a thin root `CLAUDE.md`. User-only maintenance workflow for keeping workspace-local AI context accurate after refactors, member additions, export changes, or major architectural shifts.
Use when needing to scrape documentation websites into markdown for AI context. Triggers on "scrape docs", "download documentation", "get docs for [library]", or creating local copies of online documentation. CRITICAL: always analyze the sitemap before scraping.
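The sitemap-first step can be sketched with the standard library alone. The `/docs/` path filter here is an illustrative assumption, not the skill's actual selection logic:

```python
import xml.etree.ElementTree as ET

# Namespace declared by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str, must_contain: str = "/docs/") -> list[str]:
    """Parse a sitemap.xml payload and keep only URLs that look like
    documentation pages (the /docs/ filter is an assumption)."""
    root = ET.fromstring(xml_text)
    urls = [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]
    return [u for u in urls if must_contain in u]
```

Analyzing the URL list first lets you scope the crawl to documentation pages and estimate page counts before fetching anything.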
Shows how much context-window usage context-mode saved this session. Displays token consumption, context savings ratio, and per-tool breakdown. Trigger: /context-mode:ctx-stats
Scrape documentation websites into local markdown files for AI context. Takes a base URL and crawls the documentation, storing results in ./docs (or custom path). Uses crawl4ai with BFS deep crawling.
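The BFS deep-crawl order can be illustrated on a toy in-memory link graph. In practice crawl4ai handles fetching and markdown conversion; this sketch only shows the breadth-first traversal with a depth limit:

```python
from collections import deque

def bfs_crawl(links: dict[str, list[str]], start: str, max_depth: int) -> list[str]:
    """Visit pages breadth-first from `start`, never going deeper than
    `max_depth` hops and skipping already-seen URLs."""
    seen = {start}
    order = []
    queue = deque([(start, 0)])
    while queue:
        url, depth = queue.popleft()
        order.append(url)
        if depth == max_depth:
            continue  # depth limit reached; do not enqueue children
        for nxt in links.get(url, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return order
```

Breadth-first order means all pages one hop from the base URL are saved before any two-hop page, which keeps shallow, high-value documentation pages first when the crawl is capped.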
Use when migrating to another AI platform: exports all stored memories, context, preferences, and instructions so you can import them elsewhere.