Skill4Agent

AI Agent Skills Directory with categorization, English/Chinese translation, and script security checks.

Contact Us: osulivan147@qq.com

© 2026 Skill4Agent. All rights reserved.

All Skills

31,369 skills in total; the AI & Machine Learning category has 5,079.

Categories

Showing 12 of 5,079 skills

AI & Machine Learning · davila7/claude-code-templ...

modal-serverless-gpu

Serverless GPU cloud platform for running ML workloads. Use when you need on-demand GPU access without infrastructure management, when deploying ML models as APIs, or when running batch jobs with automatic scaling.

🇺🇸 English · Translated · Downloads: 2
AI & Machine Learning · davila7/claude-code-templ...

model-pruning

Reduce LLM size and accelerate inference using pruning techniques like Wanda and SparseGPT. Use when compressing models without retraining, achieving 50% sparsity with minimal accuracy loss, or enabling faster inference on hardware accelerators. Covers unstructured pruning, structured pruning, N:M sparsity, magnitude pruning, and one-shot methods.

🇺🇸 English · Translated · Downloads: 2
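As a concrete illustration of the magnitude pruning this skill covers, the sketch below zeroes out the smallest-magnitude weights of a layer until a target sparsity is reached. This is a pure-Python toy on a flat weight list, not the Wanda or SparseGPT implementations:

```python
def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude weights until `sparsity`
    fraction of them are zero (one-shot magnitude pruning)."""
    k = int(len(weights) * sparsity)  # number of weights to drop
    if k == 0:
        return list(weights)
    # The k-th smallest absolute value becomes the pruning threshold.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

# 50% sparsity: the two smallest-magnitude weights are zeroed.
pruned = magnitude_prune([0.9, -0.1, 0.5, -0.05], sparsity=0.5)
```

Real methods like Wanda weight this threshold by activation norms rather than raw magnitude, but the zero-out-below-threshold mechanic is the same.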
AI & Machine Learning · assistant-ui/skills

update

Update assistant-ui and AI SDK to latest versions. Detects current versions, identifies breaking changes, and executes migrations.

🇺🇸 English · Translated · Downloads: 2
AI & Machine Learning · davila7/claude-code-templ...

training-llms-megatron

Trains large language models (2B-462B parameters) using NVIDIA Megatron-Core with advanced parallelism strategies. Use when training models >1B parameters, when you need maximum GPU efficiency (47% MFU on H100), or when you require tensor/pipeline/sequence/context/expert parallelism. A production-ready framework used for Nemotron, LLaMA, and DeepSeek.

🇺🇸 English · Translated · Downloads: 2
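Of the parallelism strategies listed, tensor parallelism is the easiest to sketch: split a layer's weight matrix column-wise across devices, compute each shard independently, and concatenate the partial outputs. A toy illustration with plain Python lists standing in for GPU shards (a concept sketch, not Megatron-Core code):

```python
def matmul(x, w):
    """x: length-n vector; w: n x m matrix -> length-m output."""
    return [sum(xi * w[i][j] for i, xi in enumerate(x))
            for j in range(len(w[0]))]

def column_parallel(x, w, shards=2):
    """Column-parallel linear layer: each 'device' holds a slice of
    the columns of w, computes its slice, and results are concatenated."""
    m = len(w[0])
    outs = []
    for s in range(shards):
        cols = range(s * m // shards, (s + 1) * m // shards)
        w_shard = [[row[j] for j in cols] for row in w]  # this device's slice
        outs.extend(matmul(x, w_shard))                  # local compute
    return outs                                          # all-gather step

x = [1.0, 2.0]
w = [[1, 2, 3, 4], [5, 6, 7, 8]]
# Sharded result matches the unsharded matmul exactly.
assert column_parallel(x, w, shards=2) == matmul(x, w)
```

In a real system the shards live on different GPUs and the final concatenation is an all-gather collective; the math is identical.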
AI & Machine Learning · davila7/claude-code-templ...

moe-training

Train Mixture of Experts (MoE) models using DeepSpeed or HuggingFace. Use when training large-scale models with limited compute (5× cost reduction vs dense models), implementing sparse architectures like Mixtral 8x7B or DeepSeek-V3, or scaling model capacity without proportional compute increase. Covers MoE architectures, routing mechanisms, load balancing, expert parallelism, and inference optimization.

🇺🇸 English · Translated · Downloads: 2
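The routing mechanism this skill covers can be illustrated in a few lines: a small router scores every expert, and each token is dispatched only to the top-k experts, with the gate weights renormalized over those k. A Mixtral-style top-k gate as a toy sketch (not DeepSpeed code):

```python
import math

def softmax(xs):
    m = max(xs)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def top_k_route(router_logits, k=2):
    """Pick the k experts with the highest router scores and return
    (expert_index, gate_weight) pairs, weights renormalized to sum to 1."""
    probs = softmax(router_logits)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    total = sum(probs[i] for i in top)
    return [(i, probs[i] / total) for i in top]

# Four experts, token routed to the two highest-scoring ones.
routed = top_k_route([2.0, 0.5, 1.0, -1.0], k=2)
```

Only the selected experts run their feed-forward blocks for that token, which is where the compute savings over a dense model come from.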
AI & Machine Learning · davila7/claude-code-templ...

mlflow

Track ML experiments, manage a model registry with versioning, deploy models to production, and reproduce experiments with MLflow, a framework-agnostic ML lifecycle platform.

🇺🇸 English · Translated · Downloads: 2
AI & Machine Learning · davila7/claude-code-templ...

instructor

Extract structured data from LLM responses with Pydantic validation, retry failed extractions automatically, parse complex JSON with type safety, and stream partial results with Instructor, a battle-tested structured-output library.

🇺🇸 English · Translated · Downloads: 2
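The retry-on-validation-failure loop that Instructor automates looks roughly like this. A stdlib-only concept sketch: `generate` is a hypothetical stand-in for the LLM call, and the `{field: type}` dict stands in for the Pydantic model Instructor actually uses:

```python
import json

def extract_with_retry(generate, schema, max_retries=2):
    """Parse the model's JSON output and check required fields against
    `schema`; on failure, retry, feeding the error text back to the model."""
    feedback = None
    for _ in range(max_retries + 1):
        raw = generate(feedback)  # hypothetical LLM call; sees the prior error
        try:
            data = json.loads(raw)
            for field, typ in schema.items():
                if not isinstance(data.get(field), typ):
                    raise ValueError(f"field {field!r} must be {typ.__name__}")
            return data
        except ValueError as err:  # json.JSONDecodeError is a ValueError
            feedback = str(err)
    raise RuntimeError(f"extraction failed: {feedback}")

# Simulated model: first reply is malformed, second one validates.
replies = iter(['not json', '{"name": "Ada", "age": 36}'])
result = extract_with_retry(lambda fb: next(replies), {"name": str, "age": int})
```

The key design point is that the validation error becomes part of the retry prompt, so the model can correct the specific field it got wrong.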
AI & Machine Learning · davila7/claude-code-templ...

outlines

Guarantee valid JSON/XML/code structure during generation, use Pydantic models for type-safe outputs, support local models (Transformers, vLLM), and maximize inference speed with Outlines, dottxt.ai's structured-generation library.

🇺🇸 English · Translated · Downloads: 2
AI & Machine Learning · davila7/claude-code-templ...

audiocraft-audio-generation

PyTorch library for audio generation including text-to-music (MusicGen) and text-to-sound (AudioGen). Use when you need to generate music from text descriptions, create sound effects, or perform melody-conditioned music generation.

🇺🇸 English · Translated · Downloads: 2
AI & Machine Learning · davila7/claude-code-templ...

peft-fine-tuning

Parameter-efficient fine-tuning for LLMs using LoRA, QLoRA, and 25+ methods. Use when fine-tuning large models (7B-70B) with limited GPU memory, when you need to train <1% of parameters with minimal accuracy loss, or for multi-adapter serving. HuggingFace's official library integrated with the transformers ecosystem.

🇺🇸 English · Translated · Downloads: 2
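The LoRA idea behind this skill: freeze the full weight matrix W and learn only a low-rank update B·A, so the effective weight is W + αBA while only 2·r·d parameters train instead of d². A toy sketch with plain Python lists for a square layer (illustrating the math, not the PEFT API):

```python
def lora_effective_weight(W, A, B, alpha=1.0):
    """Return W + alpha * (B @ A).
    W is d x d and frozen; A is r x d and B is d x r with rank r << d,
    so only the small A and B matrices carry trainable parameters."""
    d, r = len(W), len(A)
    return [
        [W[i][j] + alpha * sum(B[i][k] * A[k][j] for k in range(r))
         for j in range(d)]
        for i in range(d)
    ]

W = [[1.0, 0.0], [0.0, 1.0]]  # frozen base weight (identity, d=2)
A = [[1.0, 2.0]]              # r=1 trainable down-projection
B = [[1.0], [3.0]]            # r=1 trainable up-projection
W_eff = lora_effective_weight(W, A, B)
```

For a 4096-wide layer at rank r=8, that is 2·8·4096 ≈ 65K trainable parameters against ~16.8M frozen ones, which is where the <1% figure in the description comes from.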
AI & Machine Learning · davila7/claude-code-templ...

sentence-transformers

Framework for state-of-the-art sentence, text, and image embeddings. Provides 5000+ pre-trained models for semantic similarity, clustering, and retrieval. Supports multilingual, domain-specific, and multimodal models. Use for generating embeddings for RAG, semantic search, or similarity tasks. Best for production embedding generation.

🇺🇸 English · Translated · Downloads: 2
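Once embeddings are generated, the semantic-search use case mentioned above reduces to cosine similarity. The sketch below ranks a toy corpus of precomputed vectors against a query vector; the vectors are hand-written placeholders, not real sentence-transformers output:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def semantic_search(query_vec, corpus_vecs, top_k=1):
    """Return the top_k (index, score) pairs, most similar first."""
    scored = sorted(
        ((i, cosine_similarity(query_vec, v)) for i, v in enumerate(corpus_vecs)),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return scored[:top_k]

# The corpus vector pointing the same way as the query ranks first.
hits = semantic_search([1.0, 0.0], [[0.0, 1.0], [1.0, 1.0], [1.0, 0.0]])
```

With the real library you would obtain the vectors from a model's `encode` call and keep the ranking logic the same (or use its built-in search utilities).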
AI & Machine Learning · davila7/claude-code-templ...

autonomous-agent-patterns

Design patterns for building autonomous coding agents. Covers tool integration, permission systems, browser automation, and human-in-the-loop workflows. Use when building AI agents, designing tool APIs, implementing permission systems, or creating autonomous coding assistants.

🇺🇸 English · Translated · Downloads: 2
Pages: 1 … 107 108 109 110 111 … 424