Generates llms.txt documentation optimized for LLMs. Use when the user says "crear llms.txt", "documentar para AI", "crear documentación para LLMs", "generar docs para modelos", or wants to make the repo readable for Claude/AI.
Install:

```shell
npx skill4agent add testacode/llm-toolkit llms-txt-generator
```

Generated structure:

```
llm-docs/
├── llm.txt          # Main index (~1-2 KB)
├── llm.version.txt  # Metadata and sync info (~0.3 KB)
└── llm.{domain}.txt # Domain-specific files (~3-50 KB each)
```

What language do you prefer for the documentation?
- Spanish
- English
- Bilingual (technical terms in English, explanations in Spanish)

| Indicator | Project Type |
|---|---|
| | Frontend/UI Library |
| | CLI Tool |
| | REST/GraphQL API |
| | Generic Library |

Per-type examples for the `llm.{domain}.txt` output:
- Frontend/UI Library: `references/frontend-example.md`
- CLI Tool: `references/cli-example.md`
- REST/GraphQL API: `references/api-example.md`
- Generic Library: `references/library-example.md`

| Condition | Approach |
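The type detection above relies on indicators found in the repo. One way to sketch it in code, assuming a Node.js project whose `package.json` is already parsed (the specific heuristics below are illustrative assumptions, not the skill's actual rules):

```javascript
// Heuristic project-type detection from package.json contents.
// The indicator rules below are illustrative assumptions.
function detectProjectType(pkg) {
  const deps = { ...pkg.dependencies, ...pkg.devDependencies };
  if (deps.react || deps.vue || deps.svelte) return 'Frontend/UI Library';
  if (pkg.bin) return 'CLI Tool';
  if (deps.express || deps.fastify || deps.graphql) return 'REST/GraphQL API';
  return 'Generic Library';
}

module.exports = { detectProjectType };
```

The checks are order-dependent on purpose: a UI dependency wins over a `bin` entry, so a CLI that bundles React templates would need extra rules.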
|---|---|
| Structured data exists (JSON, JSDoc, OpenAPI) | Create generator script |
| Manual documentation needed | Write static markdown files |
| Mixed sources | Hybrid: script for structured, manual for rest |
Name the script's npm entry `generate:llms`.

Template for `llm.version.txt`:

# {Project} LLM Documentation
- **Version**: {semantic version}
- **Last Updated**: {YYYY-MM-DD}
- **Documentation Version**: 1.0.0
- **Files**: {count} domain files
- **Total Size**: ~{X} KB

Template for `llm.txt` (main index):

# {Project} - LLM Documentation
## Project Metadata
- **Name**: {project name}
- **Type**: {frontend|cli|api|library}
- **Language**: {primary language}
- **Purpose**: {one-line description}
## Quick Reference
- **Key Modules**: {list main areas}
- **Patterns**: {architectural patterns used}
- **Dependencies**: {key dependencies}
## Documentation Structure
### {Domain 1}
#### llm.{domain1}.txt
- **Focus**: {what this file covers}
- **Use when**: {scenarios to read this file}
### {Domain 2}
...
## Reading Guide
1. Start with `llm.version.txt` for metadata
2. Read `llm.{primary-domain}.txt` for core concepts
3. Reference other files as needed

Template for each `llm.{domain}.txt`:

# {Domain} - {Project}
## Overview
{2-3 sentences explaining this domain}
## {Section 1}
| Name | Type | Description |
|------|------|-------------|
| ... | ... | ... |
## {Section 2}
### {Subsection}
{Content with code examples}
## Related Files
- `llm.{related}.txt` - {why related}

Generator script skeleton (`build-scripts/create-llms-docs.js`):

```javascript
// Structure
const config = { COMPONENTS_DIR, OUTPUT_DIR, ... };

// Utilities
function readFile(path) { ... }
function writeOutput(filename, content) { ... }

// Extractors (one per data source)
function extractComponents() { ... }
function extractTokens() { ... }

// Generators (one per output file)
function generateIndex() { ... }
function generateVersion() { ... }
function generateDomain() { ... }

// Main
function main() {
  // Extract all data
  // Generate all files
  // Log summary
}

// Export for testing
module.exports = { extractors, generators };

// Run if main
if (require.main === module) main();
```

Wire it to the `generate:llms` script in `package.json`:

```json
{
  "scripts": {
    "generate:llms": "node build-scripts/create-llms-docs.js"
  }
}
```

Example `llm.version.txt` for a CLI project:

# DeployCLI LLM Documentation
- **Version**: 2.1.0
- **Last Updated**: 2025-12-15
- **Documentation Version**: 1.0.0
- **Files**: 4 domain files
- **Total Size**: ~35 KB

Example `llm.txt` (main index):

# DeployCLI - LLM Documentation
## Project Metadata
- **Name**: deploy-cli
- **Type**: CLI Tool
- **Language**: TypeScript
- **Purpose**: Deploy applications to multiple cloud providers
## Quick Reference
- **Key Modules**: commands, providers, config
- **Patterns**: Command pattern, Provider abstraction
- **Dependencies**: commander, chalk, ora
## Documentation Structure
### Commands
#### llm.commands.txt
- **Focus**: All CLI commands and subcommands
- **Use when**: Need to understand available commands and flags
### Providers
#### llm.providers.txt
- **Focus**: Cloud provider integrations (AWS, GCP, Vercel)
- **Use when**: Adding or modifying provider support
### Configuration
#### llm.config.txt
- **Focus**: Config file format and options
- **Use when**: Understanding how users configure the CLI

Example `llm.commands.txt`:

# Commands - DeployCLI
## Overview
DeployCLI exposes 5 main commands for deployment management.
## Commands
| Command | Description | Flags |
|---------|-------------|-------|
| `deploy` | Deploy to target provider | `--provider`, `--env`, `--dry-run` |
| `rollback` | Revert to previous deployment | `--version`, `--force` |
| `status` | Check deployment status | `--watch`, `--json` |
| `config` | Manage configuration | `--init`, `--validate` |
| `logs` | Stream deployment logs | `--follow`, `--since` |
## deploy
Main deployment command.
### Usage
```bash
deploy-cli deploy --provider aws --env production
```
### Flags
- `--provider, -p`: Target provider (aws, gcp, vercel)
- `--env, -e`: Environment (development, staging, production)
- `--dry-run`: Simulate without deploying
- `--config, -c`: Path to config file
## Related Files
- `llm.providers.txt` - Provider-specific deployment details
- `llm.config.txt` - Configuration options for deployments

Prompt usage:

```
@llm-docs/llm.commands.txt How do I deploy to staging?
```

Add to the repo README:

## LLM Documentation
See `llm-docs/` for AI-optimized documentation.

Regenerate after source changes:

```shell
npm run generate:llms
```