Found 13 Skills
Compress large language models using knowledge distillation from teacher to student models. Use when deploying smaller models with retained performance, transferring GPT-4 capabilities to open-source models, or reducing inference costs. Covers temperature scaling, soft targets, reverse KLD, logit distillation, and MiniLLM training strategies.
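For orientation, here is a minimal sketch of the temperature-scaled soft-target loss this skill covers. It shows the classic Hinton-style forward KLD; `T`, `alpha`, and the function name are illustrative, and MiniLLM's reverse KLD would flip the direction of the KL term.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend the soft-target KD term with the hard-label cross-entropy term.

    T: temperature; higher values soften the teacher's distribution.
    alpha: weight on the distillation term vs. the hard-label term.
    """
    # Forward KLD between temperature-softened teacher and student distributions.
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)

    # Standard cross-entropy on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

The `T * T` factor keeps gradient magnitudes comparable across temperatures, following the original distillation formulation.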
Bootstraps modular Agent Skills from any repository. Clones the source to `sources/`, extracts core documentation into categorized references under `skills/`, and registers the output in the workspace `AGENTS.md`.
The foundational knowledge distillation pattern for building and maintaining an AI-powered Obsidian wiki. Based on Andrej Karpathy's LLM Wiki architecture. Use this skill whenever the user wants to understand the wiki pattern, set up a new knowledge base, or needs guidance on the three-layer architecture (raw sources → wiki → schema). Also use when discussing knowledge management strategy, wiki structure decisions, or how to organize distilled knowledge. This is the "theory" skill — other skills handle specific operations (ingesting, querying, linting).
Use when reducing model size, improving inference speed, or deploying to edge devices. Covers quantization, pruning, knowledge distillation, ONNX export, and TensorRT optimization.
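As a rough illustration of the quantization and ONNX export steps this skill mentions (the model architecture, input shape, and file name below are placeholders, not part of the skill itself):

```python
import torch
from torch import nn

# Illustrative model; substitute the real network to be compressed.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))
model.eval()

# Post-training dynamic quantization: Linear weights stored as int8,
# activations quantized on the fly at inference time (CPU deployment).
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# ONNX export of the float model (tracing requires a sample input);
# the resulting graph can then feed TensorRT or another runtime.
dummy = torch.randn(1, 512)
torch.onnx.export(model, dummy, "model.onnx", opset_version=17)
```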
Sync the current project's knowledge into the Obsidian wiki. Use this skill from any project when the user says "update wiki", "sync to wiki", "save this to my wiki", "update obsidian", or wants to distill what they've been working on into their knowledge base. This is the cross-project skill that lets you push knowledge from wherever you are into the vault.
Given a domain, identify the few independent forces that truly underpin it. Reduce dozens of phenomena to the minimal set of generators—only when you can regenerate all phenomena from these generators does it count. Use this when the user says 'rank reduction', 'find rank', 'what is rank', 'what supports this domain', 'what lies behind it', or wants to decompose any domain into its irreducible generators.
Human-led curation of accumulated metis and guardrails. Surface patterns across sessions, propose what to promote, compact, or dismiss. Use after multiple sessions, before a new phase, or when search results feel noisy.
Ingest GitHub Copilot CLI session history into an Obsidian wiki as distilled knowledge pages. Use this skill when the user wants to capture their Copilot CLI sessions into a personal wiki — extracting architecture decisions, debug notes, and patterns into searchable Obsidian pages. Triggers on phrases like "ingest my copilot sessions into obsidian", "add my copilot history to my wiki", "pull my copilot session history into the vault", "capture what I've learned from copilot into obsidian", "just the new sessions since last time", or "mine patterns across my copilot sessions". Also triggers when the user mentions session-store.db, ~/.copilot/session-state, or VS Code copilot-chat transcripts in the context of building a wiki or knowledge base. Does NOT trigger for general copilot usage questions, searching sessions, or backing up history.
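A hypothetical sketch of the incremental "just the new sessions since last time" pass. The schema of `session-store.db` is not documented here, so the path, table, and column names below are assumptions to check against the real database before relying on them.

```python
import sqlite3
from pathlib import Path

# Assumed location, composed from the paths the skill mentions.
DB_PATH = Path.home() / ".copilot" / "session-state" / "session-store.db"

def new_sessions_since(last_sync_iso: str):
    con = sqlite3.connect(DB_PATH)
    try:
        # Inspect what is actually there before relying on assumed names.
        tables = [r[0] for r in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        print("tables:", tables)
        # Assumed query; adjust table/column names to the schema found above.
        return con.execute(
            "SELECT id, updated_at FROM sessions WHERE updated_at > ?",
            (last_sync_iso,),
        ).fetchall()
    finally:
        con.close()
```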
Capture the current task into a structured temporary session bundle under `.agents/sessions/` so a learning agent can later distill durable repo knowledge. Use for completed, blocked, or abandoned tasks with meaningful changes, debugging, validation, or reusable lessons.
DISTILL
Efficient AI techniques including model compression, quantization, pruning, knowledge distillation, and hardware-aware optimization for production systems.
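For the pruning side, a minimal sketch using PyTorch's built-in `torch.nn.utils.prune`; the layer and sparsity level are illustrative.

```python
import torch
from torch import nn
from torch.nn.utils import prune

layer = nn.Linear(256, 256)

# Zero out the 30% of weights with the smallest L1 magnitude.
prune.l1_unstructured(layer, name="weight", amount=0.3)

# Make the pruning permanent (removes the reparametrization mask).
prune.remove(layer, "weight")
print(f"sparsity: {(layer.weight == 0).float().mean().item():.2%}")
```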
Perform "Rank Reduction" on any domain — start from phenomena, extract dimensions, identify constraints, find irreducible independent generators (rank), and verify through generation tests and validation. Use when the user says "Rank Reduction", "find the rank", "what is the rank of this domain", or wants to find the irreducible principles of any domain.