# hot_topics_selector (Financial Hot Topic Selection Tool)

This Skill filters financial hot topics to find high-quality topics suitable for guiding users toward investment and wealth management.
## Core Workflow

Phase 0: Data Crawling (Python)
→ Fetch 133 news headlines

Phase 1: Agent Intelligent Screening (Markdown prompt)
→ Filter down to 5 topics

Phase 2: Content Crawling (Python)
→ Fetch the full content of the 5 selected articles

Phase 3: Agent Intelligent Parsing (Markdown prompt)
→ Generate 5 complete topic plans

Separation of responsibilities:

- Python scripts: data crawling (Phases 0 and 2)
- Agent: intelligent screening and parsing (Phases 1 and 3)
### Phase 0: Data Crawling

Execution:

```bash
cd scripts
python3 fetch_hot_topics.py
```

Output: `/tmp/hot_topics.json`

Contents:

- 133 news headlines from 5 platforms
- Weibo, Baidu, Toutiao, Douyin, Cailianshe

Data format:

```json
{
  "fetch_time": "2026-02-13T19:36:24+08:00",
  "total_items": 133,
  "data": {
    "weibo": { "items": [...], "count": 30 },
    "baidu": { "items": [...], "count": 30 }
  }
}
```
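The per-item schema inside `items` is not spelled out above. Assuming each item carries at least a `title` field (an assumption, not a documented guarantee), a short sketch can flatten this file into the globally numbered headline list that Phase 1 consumes:

```python
def flatten_headlines(payload):
    """Flatten the per-platform payload of hot_topics.json into one list
    of headlines with a global 1-based index, matching the numbering that
    later phases reference (e.g. --indices 1,3,5,7,9)."""
    headlines = []
    for platform, block in payload["data"].items():
        for item in block["items"]:
            # Assumption: each item exposes at least a "title" field.
            headlines.append({"index": len(headlines) + 1,
                              "platform": platform,
                              "title": item["title"]})
    return headlines

# Tiny inline sample shaped like the format above:
sample = {"data": {
    "weibo": {"items": [{"title": "AI startup story"}], "count": 1},
    "baidu": {"items": [{"title": "Gold price rally"}], "count": 1},
}}
for h in flatten_headlines(sample):
    print(h["index"], h["platform"], h["title"])
```

In the real pipeline the payload would come from loading `/tmp/hot_topics.json` with `json.load` after Phase 0 has run.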
### Phase 1: Agent Intelligent Screening

Agent execution prompt:

You are a financial content topic selection expert. From the news headlines below, select the 5 topics best suited to guiding users toward securities account opening or fund investment.

#### Headline List

{133 headlines read from /tmp/hot_topics.json}
#### Screening Criteria (Investment Relevance Is the Core)

Highest priority (90-100 points)

Can be linked directly to investment products:

- Money-making stories → AI funds / tech stocks / brokerage products
- Investment cases → gold ETFs / asset allocation
- Wealth-management topics → regular fund investment plans / securities account opening

Examples:

- "Making 2 million yuan a month with AI" → can recommend AI funds (95 points)
- "Earned 1.96 million yuan on gold" → can recommend gold ETFs (92 points)
High priority (80-89 points)

Can be linked to financial planning:

- Salary topics → wage income vs. investment income
- Savings topics → saving vs. investing
- Consumption topics → financial planning

Example:

- "180-million-yuan year-end bonus" → can recommend a regular fund investment plan (85 points)
Medium priority (60-79 points)

Only loosely linkable:

- Education topics → education funds
- Pension topics → pension investment

Example:

- "Hengshui High School has changed" → can be linked to education funds (70 points)
Not selected (below 60 points)

Cannot be linked to investment:

- Pure entertainment gossip
- Pure social news
- Topics unrelated to finance
#### Output Format (JSON)

```json
{
  "selected_indices": [1, 3, 5, 7, 9],
  "reasons": {
    "1": "✅ Direct link: AI money-making → AI fund investment. Investment relevance 95, high public attention",
    "3": "✅ Direct link: gold case → gold ETF. Investment relevance 92, strong data impact"
  },
  "investment_angles": {
    "1": "Can recommend: AI-themed funds, tech stocks, brokerage AI products",
    "3": "Can recommend: gold ETFs, brokerage gold products, asset allocation services"
  },
  "investment_relevance_score": {
    "1": 95,
    "3": 92
  }
}
```

Strict requirements:

- Select exactly 5 headlines (no more, no less)
- Every headline must score ≥ 80 on investment relevance
- Name concrete investment products to recommend
- Prefer topics that link directly to products (≥ 90 points)
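These requirements are mechanical enough to check in code. A minimal validation sketch for the Agent's Phase 1 result (the function name and error messages are illustrative, not part of the Skill):

```python
def validate_selection(result):
    """Check a Phase 1 result dict against the strict requirements:
    exactly 5 indices, every relevance score >= 80, and a reason plus
    an investment angle recorded for each scored index."""
    errors = []
    indices = result["selected_indices"]
    if len(indices) != 5:
        errors.append(f"expected exactly 5 indices, got {len(indices)}")
    for key, score in result["investment_relevance_score"].items():
        if score < 80:
            errors.append(f"index {key}: relevance {score} is below the 80-point floor")
        if key not in result.get("reasons", {}):
            errors.append(f"index {key}: missing reason")
        if key not in result.get("investment_angles", {}):
            errors.append(f"index {key}: missing investment angle")
    return errors
```

An empty return value means the result may be passed on to Phase 2; otherwise the Agent should be re-prompted with the listed errors.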
### Phase 2: Content Crawling

Execution:

```bash
cd scripts
python3 fetch_news_content.py \
  --input /tmp/hot_topics.json \
  --indices 1,3,5,7,9 \
  --output /tmp/news_content.json
```

Input: the 5 topic indices output by Phase 1

Output: `/tmp/news_content.json`

Contents:

- Full content of the 5 news articles
- Headline, URL, platform, body text, keywords

Data format:

```json
{
  "fetch_time": "2026-02-13T21:00:00+08:00",
  "total_articles": 5,
  "articles": [
    {
      "index": 1,
      "title": "Hangzhou man runs a one-person company making 2 million yuan a month with AI",
      "url": "https://...",
      "platform": "Weibo",
      "content": "Full news content...",
      "keywords": ["AI", "entrepreneurship", "2 million yuan a month"]
    }
  ]
}
```
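The script's internals are not reproduced in this file; the following is only a sketch of its documented command-line interface using `argparse` (the `parse_args` helper and the `index_list` attribute are illustrative, not taken from the actual script):

```python
import argparse

def parse_args(argv=None):
    """Parse the flags shown in the command above and expose the
    comma-separated indices as a list of ints."""
    p = argparse.ArgumentParser(
        description="Fetch full content for the selected topics")
    p.add_argument("--input", required=True, help="path to hot_topics.json")
    p.add_argument("--indices", required=True,
                   help="comma-separated 1-based headline indices, e.g. 1,3,5,7,9")
    p.add_argument("--output", required=True, help="path for news_content.json")
    args = p.parse_args(argv)
    args.index_list = [int(i) for i in args.indices.split(",")]
    return args

args = parse_args(["--input", "/tmp/hot_topics.json",
                   "--indices", "1,3,5,7,9",
                   "--output", "/tmp/news_content.json"])
print(args.index_list)  # → [1, 3, 5, 7, 9]
```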
### Phase 3: Agent Intelligent Parsing

Agent execution prompt:

You are a financial content planning expert. Based on the 5 news articles below, generate a complete topic plan for each article.

#### News List

{5 news articles read from /tmp/news_content.json}
#### Parsing Requirements

1. News content summary

- Extract the core information (100-150 characters)
- Retain key data and cases
- Highlight investment-related information

2. Topic selection reasons

- State the investment relevance score (80-100 points)
- Analyze public attention
- Predict viral potential

3. Guidance strategy

- Entry point: how to open from the news topic
- Turning point: how to pivot naturally to investment
- Product recommendation: which investment products to recommend
- Script examples: 2-3 guiding lines

Example guidance strategy:

Entry point: an AI startup making 2 million yuan a month triggers the thought "could I do that too?"
Turning point: ordinary people lack the technical skills, but can share in the dividend through investment
Product recommendation: AI-themed funds, tech ETFs, brokerage AI products
Script examples:
"Want a piece of the AI dividend? You don't need technical skills; investing in an AI fund gets you a share"
"Open an account and buy an AI fund: ordinary people can enjoy the dividend of AI's growth too"

4. Original link

- Retain the full news URL
- For later review and citation
#### Output Format (JSON)

```json
{
  "topic_plans": [
    {
      "index": 1,
      "title": "Hangzhou man runs a one-person company making 2 million yuan a month with AI",
      "summary": "A Hangzhou entrepreneur runs a one-person company with AI tools, earning 2 million yuan a month. The main business is...",
      "selection_reason": {
        "investment_relevance": 95,
        "mass_attention": "High (AI + money-making topic)",
        "viral_potential": "Extremely strong (the 2-million-yuan monthly income figure is striking)"
      },
      "guidance_strategy": {
        "entry_point": "An AI startup making 2 million yuan a month triggers the thought 'could I do that too?'",
        "turning_point": "Ordinary people lack the technical skills, but can share in the AI dividend through investment",
        "product_recommendation": "AI-themed funds, tech ETFs, brokerage AI products",
        "script_examples": [
          "Want a piece of the AI dividend? Investing in an AI fund gets you a share",
          "Open an account and buy an AI fund: ordinary people can enjoy the AI dividend too"
        ]
      },
      "source_url": "https://..."
    }
  ]
}
```

Strict requirements:

- Every topic plan must contain all 4 sections
- The guidance strategy must be concrete and actionable
- Product recommendations must relate to brokerages/funds
- Scripts must sound natural, not forced
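The "all 4 sections" rule can likewise be verified mechanically. A sketch (function and constant names are illustrative) that checks a Phase 3 result dict against the structure above:

```python
REQUIRED_PLAN_KEYS = {"summary", "selection_reason", "guidance_strategy", "source_url"}
STRATEGY_KEYS = {"entry_point", "turning_point", "product_recommendation", "script_examples"}

def check_topic_plans(payload):
    """Return a list of problems: every plan must carry the 4 required
    sections, and each guidance strategy needs all four fields plus
    2-3 script examples."""
    problems = []
    for plan in payload.get("topic_plans", []):
        idx = plan.get("index", "?")
        missing = REQUIRED_PLAN_KEYS - plan.keys()
        if missing:
            problems.append(f"plan {idx}: missing sections {sorted(missing)}")
            continue
        strategy = plan["guidance_strategy"]
        missing = STRATEGY_KEYS - strategy.keys()
        if missing:
            problems.append(f"plan {idx}: strategy missing {sorted(missing)}")
        elif not 2 <= len(strategy["script_examples"]) <= 3:
            problems.append(f"plan {idx}: expected 2-3 script examples")
    return problems
```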
## Usage Example

### Complete Workflow

Step 1: Data crawling

```bash
python3 scripts/fetch_hot_topics.py
```

Step 2: Agent intelligent screening

The Agent reads the Phase 1 prompt in this file and executes it.

Step 3: Content crawling

```bash
python3 scripts/fetch_news_content.py \
  --input /tmp/hot_topics.json \
  --indices 1,3,5,7,9 \
  --output /tmp/news_content.json
```

Step 4: Agent intelligent parsing

The Agent reads the Phase 3 prompt in this file and executes it.

**Total time**: about 15 minutes

---

## Scripts

### scripts/fetch_hot_topics.py

Function: calls the TrendRadar API to fetch hot news headlines from 5 platforms
Output: 133 news headlines (JSON)
### scripts/fetch_news_content.py

Function: fetches full news content for the given topic indices
Input: topic indices (e.g., 1,3,5,7,9)
Output: full content of the 5 news articles (JSON)
## Reference Materials

### references/选题方法论.md (Topic Selection Methodology)

Contains the topic-selection methodology of leading financial new-media influencers, including:

- 5 core methods (pain-point driven, emotional resonance, data impact, storytelling, practical orientation)
- Topic formulas (number + conflict + result, pain point + solution, etc.)
- Evaluation criteria (virality 40%, relevance 30%, practicality 20%, compliance 10%)

The Agent may consult this file as needed to refine its topic-selection strategy.
## Notes

- Investment relevance is the core: every topic must pivot naturally to securities account opening or fund investment
- Output exactly 5 topics: no more, no less, to ensure quality
- Investment relevance ≥ 80 points: topics scoring below 80 are excluded
- Python handles the data: the Agent does not perform data crawling
- The Agent handles the intelligence: screening and parsing are done by the Agent