Skill4Agent

AI Agent Skills Directory with categorization, English/Chinese translation, and script security checks.

Contact Us: osulivan147@qq.com

© 2026 Skill4Agent. All rights reserved.

All Skills

30,472 skills in total; the Data Processing category contains 1,461.

Showing 12 of 1,461 skills
Data Processing · octagonai/skills

sector-performance-snapshot

Retrieve a snapshot of market sector performance using Octagon MCP. Use when analyzing sector-wide metrics including revenue, EBITDA, net income, market cap, and enterprise value for companies within a specific sector and exchange.

English (translated) · 1 download
Data Processing · octagonai/skills

sec-segment-reporting

Analyze business segment performance and reporting from SEC filings using Octagon MCP. Use when researching segment revenue, operating income, margins, geographic breakdown, and segment restructuring from 10-K and 10-Q filings.

English (translated) · 1 download
Data Processing · octagonai/skills

stock-price-change

Retrieve stock price change statistics across multiple time periods using Octagon MCP. Use when analyzing short-term and long-term returns, comparing performance across timeframes, and evaluating momentum and historical growth.

English (translated) · 1 download
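As a sketch of the kind of multi-timeframe return comparison this skill describes, the snippet below computes percent changes of the latest close against several lookback periods. The `period_returns` helper, its labels, and the return shape are illustrative assumptions, not the Octagon MCP output format.

```python
def period_returns(prices, periods):
    """Percent change of the latest close vs. each lookback period.

    `prices` is a list of closes, oldest to newest. `periods` maps a
    label (e.g. '1d') to how many entries back to compare against.
    Both are hypothetical shapes for illustration only.
    """
    latest = prices[-1]
    out = {}
    for label, lookback in periods.items():
        if lookback < len(prices):
            past = prices[-1 - lookback]
            out[label] = round(100.0 * (latest - past) / past, 2)
    return out

closes = [100.0, 102.0, 101.0, 105.0, 110.0]
# period_returns(closes, {"1d": 1, "3d": 3}) -> {'1d': 4.76, '3d': 7.84}
```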
Data Processing · timescale/pg-aiguide

migrate-postgres-tables-to-hypertables

Use this skill to migrate identified PostgreSQL tables to Timescale/TimescaleDB hypertables with optimal configuration and validation.

**Trigger when user asks to:**

  • Migrate or convert PostgreSQL tables to hypertables
  • Execute hypertable migration with minimal downtime
  • Plan blue-green migration for large tables
  • Validate hypertable migration success
  • Configure compression after migration

**Prerequisites:** Tables already identified as candidates (use find-hypertable-candidates first if needed).

**Keywords:** migrate to hypertable, convert table, Timescale, TimescaleDB, blue-green migration, in-place conversion, create_hypertable, migration validation, compression setup.

Step-by-step migration planning including: partition column selection, chunk interval calculation, PK/constraint handling, migration execution (in-place vs blue-green), and performance validation queries.

English (translated) · 1 download
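A minimal sketch of what the in-place conversion path could run, assuming the standard TimescaleDB `create_hypertable` call; the table and column names here are hypothetical examples, not part of the skill.

```python
def hypertable_migration_sql(table, time_column, chunk_interval="7 days"):
    """Build TimescaleDB statements for an in-place table conversion.

    Sketch only: the caller's table/column names and the chunk
    interval are assumptions to illustrate the skill's steps.
    """
    return [
        # Convert the existing table; migrate_data moves current rows
        # into chunks (takes a lock, so it suits smaller tables --
        # large tables are what the blue-green path is for).
        f"SELECT create_hypertable('{table}', '{time_column}', "
        f"chunk_time_interval => INTERVAL '{chunk_interval}', "
        f"migrate_data => true);",
        # Enable compression only after the migration has been validated.
        f"ALTER TABLE {table} SET (timescaledb.compress);",
    ]

stmts = hypertable_migration_sql("metrics", "ts", "1 day")
```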
Data Processing · captaindpt/truth-terminal

tool-nasdaq-quote

Use the nasdaq_quote tool to fetch a US equity quote (free; delayed) with lightweight caching and latency metadata.

English (translated) · 1 download
Data Processing · captaindpt/truth-terminal

tool-nasdaq-candles

Use the nasdaq_candles tool to fetch OHLCV candles (free) with caching and latency metadata; good for quick charting.

English (translated) · 1 download
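The "lightweight caching and latency metadata" both quote tools describe could look like the wrapper below: a per-symbol TTL cache around an arbitrary fetch function. The `fetch(symbol)` callable and the result shape are stand-ins, not the tools' actual API.

```python
import time

def make_cached_fetcher(fetch, ttl_seconds=60.0, clock=time.monotonic):
    """Wrap fetch(symbol) with a per-symbol TTL cache.

    Returns dicts with the payload plus cache/latency metadata; this
    shape is an assumption for illustration, not the tool's contract.
    """
    cache = {}  # symbol -> (expires_at, payload)

    def cached_fetch(symbol):
        now = clock()
        hit = cache.get(symbol)
        if hit and hit[0] > now:
            # Fresh entry: serve from cache, no upstream call.
            return {"data": hit[1], "cached": True, "latency_s": 0.0}
        start = clock()
        payload = fetch(symbol)
        cache[symbol] = (start + ttl_seconds, payload)
        return {"data": payload, "cached": False, "latency_s": clock() - start}

    return cached_fetch
```

Injecting `clock` keeps the TTL logic testable without sleeping.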
Data Processing · victory-hugo/s2-agent-ski...

xlsx

Comprehensive spreadsheet creation, editing, and analysis tool with support for formulas, formatting, data analysis, and visualization. Use when working with spreadsheets (.xlsx, .xlsm, .csv, .tsv, etc.), including: (1) creating new spreadsheets with formulas and formatting, (2) reading or analyzing data, (3) modifying existing spreadsheets while preserving formulas, (4) performing data analysis and visualization within a spreadsheet, or (5) recalculating formulas.

English (translated) · 1 download · 1 script (Checked)
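For the "read or analyze data" case, a stdlib-only sketch of summarizing one numeric column from a CSV is shown below; real .xlsx/.xlsm files would need a spreadsheet library instead of `csv`, and the sample data is made up.

```python
import csv
import io

def column_summary(csv_text, column):
    """Summarize one numeric column of a CSV: count, total, mean.

    Minimal illustration of the skill's data-analysis case; the
    column name and input are hypothetical.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    values = [float(row[column]) for row in rows]
    return {"count": len(values), "total": sum(values),
            "mean": sum(values) / len(values)}

sample = "region,sales\nnorth,120\nsouth,80\n"
# column_summary(sample, "sales") -> {'count': 2, 'total': 200.0, 'mean': 100.0}
```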
Data Processing · omer-metin/skills-for-ant...

bioinformatics-workflows

Patterns for building, maintaining, and scaling bioinformatics workflows. Covers Nextflow, Snakemake, WDL/Cromwell, container orchestration, and best practices for reproducible computational biology.

English (translated) · 1 download
Data Processing · plurigrid/asi

acsets-relational-thinking

ACSets (Attributed C-Sets) for categorical database design and DPO rewriting

English (translated) · 1 download
Data Processing · vibery-studio/templates

udemy-crawler

Extract Udemy course content to markdown. Use when user asks to scrape/crawl Udemy course pages.

English (translated) · 1 download
Data Processing · httprunner/skills

result-bitable-reporter

Collect app events via evalpkgs into sqlite, then filter/report capture_results to Feishu Bitable with retry-safe writeback. Use for collect-start/collect-stop/filter/report/retry-reset workflows.

English (translated) · 1 download · 1 script (Attention)
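The "retry-safe writeback" idea here can be sketched with SQLite alone: key each captured result by event id and upsert, so a retried collection or report never duplicates rows. The schema and function names below are hypothetical, not the skill's actual interface to evalpkgs or Feishu Bitable.

```python
import sqlite3

def init_results(conn):
    # One row per event id makes retries idempotent (retry-safe).
    conn.execute("""CREATE TABLE IF NOT EXISTS capture_results (
        event_id TEXT PRIMARY KEY,
        payload  TEXT NOT NULL,
        reported INTEGER NOT NULL DEFAULT 0)""")

def record(conn, event_id, payload):
    # Upsert: a retried collect pass overwrites instead of duplicating.
    conn.execute(
        "INSERT INTO capture_results (event_id, payload) VALUES (?, ?) "
        "ON CONFLICT(event_id) DO UPDATE SET payload = excluded.payload",
        (event_id, payload))

def unreported(conn):
    # Rows still waiting to be written back to the report target.
    return [r[0] for r in conn.execute(
        "SELECT event_id FROM capture_results WHERE reported = 0")]

def mark_reported(conn, event_id):
    conn.execute(
        "UPDATE capture_results SET reported = 1 WHERE event_id = ?",
        (event_id,))
```

Filtering on the `reported` flag is what lets a crashed report run resume without re-sending rows.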
Data Processing · willoscar/research-units-...

arxiv-search

Retrieve paper metadata from arXiv using keyword queries and save results as JSONL (`papers/papers_raw.jsonl`).

**Trigger**: arXiv, arxiv, paper search, metadata retrieval, literature search, offline import.

**Use when**: an initial paper collection is needed (Stage C1 of a survey/snapshot) and the source is arXiv (online search or offline export import).

**Skip if**: a usable `papers/papers_raw.jsonl` already exists, or the data source is not arXiv.

**Network**: online search requires network access; offline `--input <export.*>` does not.

**Guardrail**: metadata only; do not write long prose into `output/`.

English (translated) · 1 download · 1 script (Checked)
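The JSONL output format this skill targets is simply one JSON object per line. A stdlib sketch of the serialization step is below; the field names (`id`, `title`, `authors`) are illustrative and may differ from the skill's actual `papers/papers_raw.jsonl` schema.

```python
import json

def to_jsonl(papers):
    """Serialize paper metadata records to JSONL: one object per line.

    ensure_ascii=False keeps non-ASCII titles/authors readable.
    """
    return "".join(json.dumps(p, ensure_ascii=False) + "\n" for p in papers)

records = [
    {"id": "2401.00001", "title": "Example Paper", "authors": ["A. Author"]},
    {"id": "2401.00002", "title": "Another Paper", "authors": ["B. Author"]},
]
```

Each line round-trips through `json.loads`, which is what makes JSONL convenient for streaming and offline import.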