Found 722 Skills
Create and manage Dexie/IndexedDB repositories with type-safe interfaces, converters, and standardized CRUD operations. Use when (1) adding entity storage, (2) implementing save/load/delete operations, (3) designing database schema and indexes, (4) converting between database (Db*) and domain types, (5) handling database errors or migrations, (6) using existing repositories (SettingsRepository, WorkoutsRepository, TemplatesRepository, CustomExercisesRepository, BenchmarksRepository, ActiveWorkoutRepository). Triggers include "database", "repository", "save data", "fetch from database", "delete from storage", "database schema", "database table", "indexes", "migration", "persist", "convert workout", "converter", "buildPartialUpdate", "mock repository", "database error", "bulk operations", "import/export", or specific repository names.
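A hedged sketch of the repository-plus-converter pattern this skill describes; the table name, field names, and WorkoutsRepository shape below are illustrative assumptions, not the project's actual code:

```ts
import Dexie, { type Table } from 'dexie';

// Database row shape (Db*) vs. domain type, with converters at the storage boundary.
interface DbWorkout { id?: number; name: string; performedAt: string; }
interface Workout { id?: number; name: string; performedAt: Date; }

const toDomain = (row: DbWorkout): Workout => ({ ...row, performedAt: new Date(row.performedAt) });
const toDb = (w: Workout): DbWorkout => ({ ...w, performedAt: w.performedAt.toISOString() });

class AppDatabase extends Dexie {
  workouts!: Table<DbWorkout, number>;
  constructor() {
    super('app-db');
    // '++id' = auto-incremented primary key; 'performedAt' is a secondary index.
    this.version(1).stores({ workouts: '++id, performedAt' });
  }
}

const db = new AppDatabase();

// Minimal repository exposing standardized CRUD over the converter boundary.
export const WorkoutsRepository = {
  async save(w: Workout): Promise<number> {
    return db.workouts.put(toDb(w));
  },
  async getAll(): Promise<Workout[]> {
    const rows = await db.workouts.toArray();
    return rows.map(toDomain);
  },
  async delete(id: number): Promise<void> {
    await db.workouts.delete(id);
  },
};
```

The converter pair keeps serialization concerns (Date vs. ISO string) at the storage edge, which is the usual reason for splitting Db* rows from domain types.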
Add a new field to the .agents.yml config schema (updates init.md templates, schema version, and migration)
GitHub Actions 2025 features, including 1 vCPU runners, immutable releases, and the Node 24 migration
Frappe Bench CLI command reference for site management, app management, development, and production operations. Use when running bench commands, managing sites, or handling migrations, builds, and deployments.
Automatically discover database skills when working with SQL, PostgreSQL, MongoDB, Redis, database schema design, query optimization, migrations, connection pooling, ORMs, or database selection. Activates for database design, optimization, and implementation tasks.
Instructions for using the DeepBase multi-driver persistence library. Use when a task requires data persistence, storage abstraction, multi-backend setups, data migration between drivers, or integrating DeepBase into a Node.js project.
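Purely as an illustration of the multi-driver idea (this is not DeepBase's actual API; every name below is a hypothetical stand-in), a driver abstraction with driver-to-driver migration could be sketched as:

```ts
// Hypothetical driver interface -- illustrative only, not DeepBase's real API.
interface StorageDriver {
  get(key: string): Promise<unknown>;
  set(key: string, value: unknown): Promise<void>;
  keys(): Promise<string[]>;
}

// In-memory driver; a real setup might back the same interface with a file, Redis, or SQL driver.
class MemoryDriver implements StorageDriver {
  private data = new Map<string, unknown>();
  async get(key: string) { return this.data.get(key); }
  async set(key: string, value: unknown) { this.data.set(key, value); }
  async keys() { return [...this.data.keys()]; }
}

// Copy every key from one driver to another: the shape a driver-to-driver migration takes.
async function migrate(from: StorageDriver, to: StorageDriver): Promise<void> {
  for (const key of await from.keys()) {
    await to.set(key, await from.get(key));
  }
}
```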
Comprehensive Biome (biomejs.dev) integration for professional TypeScript/JavaScript development. Use for linting, formatting, code quality, and integrating Biome into codebases. Covers installation, configuration, migration from ESLint/Prettier, all linter rules, formatter options, CLI usage, editor integration, monorepo setup, and CI/CD integration. Use when working with Biome tooling, configuring biome.json, setting up linting/formatting, migrating projects, debugging Biome issues, or implementing production-ready Biome workflows.
Professional Pydantic v2.12 development for data validation, serialization, and type-safe models. Use when working with Pydantic for (1) creating or modifying BaseModel classes, (2) implementing validators and serializers, (3) configuring model behavior, (4) handling JSON schema generation, (5) working with settings management, (6) debugging validation errors, (7) integrating with ORMs or APIs, or (8) any production-grade Python data validation tasks. Includes complete API reference, concept guides, examples, and migration patterns.
Unifies API response patterns across endpoints including pagination format, error structure, status codes, response envelopes, and versioning strategy. Provides contract documentation, shared TypeScript types, middleware utilities, and migration plan. Use when standardizing "API contracts", "response formats", "API conventions", or "API consistency".
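One common shape such shared types can take, sketched with assumed field names (ok, data, error, meta); the skill's actual contract may mandate different names:

```ts
// One possible response envelope; field names here are illustrative assumptions.
interface ApiError {
  code: string;        // machine-readable error code, e.g. "NOT_FOUND"
  message: string;     // human-readable description
  details?: unknown;   // optional structured context
}

interface PageMeta {
  page: number;
  pageSize: number;
  total: number;
}

// Success and failure share one discriminated union so callers can narrow on `ok`.
type ApiResponse<T> =
  | { ok: true; data: T; meta?: PageMeta }
  | { ok: false; error: ApiError };

// Example: the return type of a paginated list endpoint.
type ListUsersResponse = ApiResponse<{ id: string; name: string }[]>;
```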
SQLAlchemy and database patterns for Python. Triggers on: sqlalchemy, database, orm, migration, alembic, async database, connection pool, repository pattern, unit of work.
Architect and co-design futureproof persistence systems built on open data principles. Use when designing data layers, choosing storage formats, structuring knowledge bases, building file-system-as-database architectures, or evaluating existing systems for portability and longevity. Use when user says "design my data model", "how should I store this", "is my data portable", "audit my persistence layer", "plan a migration", or asks about file-based databases, Markdown schemas, or Obsidian-compatible data formats. Do NOT use for general coding tasks, database query optimization, or SQL schema design.
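A minimal sketch of the file-system-as-database idea, assuming a hypothetical notes/<slug>.md layout with a title-only frontmatter block (a real system would use a proper frontmatter parser rather than the naive regex below):

```ts
import { promises as fs } from 'node:fs';
import * as path from 'node:path';

// One record per Markdown file: frontmatter for fields, body for free text.
interface Note { slug: string; title: string; body: string; }

const DATA_DIR = './notes'; // assumed layout: notes/<slug>.md

async function saveNote(note: Note): Promise<void> {
  const frontmatter = `---\ntitle: ${note.title}\n---\n\n`;
  await fs.mkdir(DATA_DIR, { recursive: true });
  await fs.writeFile(path.join(DATA_DIR, `${note.slug}.md`), frontmatter + note.body, 'utf8');
}

async function loadNote(slug: string): Promise<Note> {
  const raw = await fs.readFile(path.join(DATA_DIR, `${slug}.md`), 'utf8');
  // Naive frontmatter split for illustration only.
  const match = raw.match(/^---\ntitle: (.*)\n---\n\n([\s\S]*)$/);
  if (!match) throw new Error(`Malformed note: ${slug}`);
  return { slug, title: match[1], body: match[2] };
}
```

Records stored this way stay readable in any text editor and in Obsidian-style tools, which is the portability argument the skill is built around.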
Audits codebases for quantum-vulnerable cryptography and plans migration to Post-Quantum Cryptography (PQC) standards to ensure long-term data security.
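A rough sketch of what the audit step could look like; the pattern list below is an illustrative assumption, not the skill's actual ruleset:

```ts
import { promises as fs } from 'node:fs';

// Identifiers commonly tied to quantum-vulnerable primitives (illustrative, not exhaustive).
const VULNERABLE_PATTERNS: RegExp[] = [
  /\bRSA\b/i,
  /\bECDSA\b/i,
  /\bECDH\b/i,
  /\bDiffie[- ]?Hellman\b/i,
];

// Flag lines in a source file that mention a quantum-vulnerable algorithm.
async function auditFile(filePath: string): Promise<string[]> {
  const findings: string[] = [];
  const lines = (await fs.readFile(filePath, 'utf8')).split('\n');
  lines.forEach((line, i) => {
    for (const pattern of VULNERABLE_PATTERNS) {
      if (pattern.test(line)) {
        findings.push(`${filePath}:${i + 1}: ${line.trim()}`);
        break;
      }
    }
  });
  return findings;
}
```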