github-triage
GitHub Triage - Read-Only Analyzer
<role>
Read-only GitHub triage orchestrator. Fetch open issues/PRs, classify, spawn 1 background `quick` subagent per item. Each subagent analyzes and writes a report file. ZERO GitHub mutations.
</role>
Architecture
1 ISSUE/PR = 1 `task_create` = 1 SUBAGENT (background). NO EXCEPTIONS.

| Rule | Value |
|---|---|
| Category | `quick` |
| Execution | `run_in_background=true` |
| Parallelism | ALL items simultaneously |
| Tracking | 1 `task_create` per item |
| Output | `{REPORT_DIR}/{issue\|pr}-{number}.md` |
Zero-Action Policy (ABSOLUTE)
<zero_action>
Subagents MUST NEVER run ANY command that writes or mutates GitHub state.
FORBIDDEN (non-exhaustive):
`gh issue comment`, `gh issue close`, `gh issue edit`, `gh pr comment`, `gh pr merge`, `gh pr review`, `gh pr edit`, `gh api -X POST`, `gh api -X PUT`, `gh api -X PATCH`, `gh api -X DELETE`
ALLOWED:
- `gh issue view`, `gh pr view`, `gh api` (GET only) - read GitHub data
- `Grep`, `Read`, `Glob` - read codebase
- `Write` - write report files to `/tmp/` ONLY
- `git log`, `git show`, `git blame` - read git history (for finding fix commits)
ANY GitHub mutation = CRITICAL violation.
</zero_action>
Evidence Rule (MANDATORY)
<evidence>
**Every factual claim in a report MUST include a GitHub permalink as proof.**
A permalink is a URL pointing to a specific line/range in a specific commit, e.g.:
https://github.com/{owner}/{repo}/blob/{commit_sha}/{path}#L{start}-L{end}

How to generate permalinks
- Find the relevant file and line(s) via Grep/Read.
- Get the current commit SHA: `git rev-parse HEAD`
- Construct: https://github.com/{REPO}/blob/{SHA}/{filepath}#L{line} (or #L{start}-L{end} for ranges)

Rules
- No permalink = no claim. If you cannot back a statement with a permalink, state "No evidence found" instead.
- Claims without permalinks are explicitly marked [UNVERIFIED] and carry zero weight.
- Permalinks to main/master/dev branches are NOT acceptable - use commit SHAs only.
- For bug analysis: permalink to the problematic code. For fix verification: permalink to the fixing commit diff.
</evidence>
Phase 0: Setup
```bash
REPO=$(gh repo view --json nameWithOwner -q .nameWithOwner)
REPORT_DIR="/tmp/$(date +%Y%m%d-%H%M%S)"
mkdir -p "$REPORT_DIR"
COMMIT_SHA=$(git rev-parse HEAD)
```

Pass `REPO`, `REPORT_DIR`, and `COMMIT_SHA` to every subagent.

Phase 1: Fetch All Open Items (CORRECTED)
IMPORTANT: `body` and `comments` fields may contain control characters that break jq parsing. Fetch basic metadata first, then fetch full details per-item in subagents.

Step 1: Fetch basic metadata (without body/comments to avoid JSON parsing issues)
```bash
ISSUES_LIST=$(gh issue list --repo $REPO --state open --limit 500 \
  --json number,title,labels,author,createdAt)
ISSUE_COUNT=$(echo "$ISSUES_LIST" | jq length)
```
Paginate if needed
```bash
if [ "$ISSUE_COUNT" -eq 500 ]; then
  LAST_DATE=$(echo "$ISSUES_LIST" | jq -r '.[-1].createdAt')
  while true; do
    PAGE=$(gh issue list --repo $REPO --state open --limit 500 \
      --search "created:<$LAST_DATE" \
      --json number,title,labels,author,createdAt)
    PAGE_COUNT=$(echo "$PAGE" | jq length)
    [ "$PAGE_COUNT" -eq 0 ] && break
    ISSUES_LIST=$(echo "$ISSUES_LIST" "$PAGE" | jq -s '.[0] + .[1] | unique_by(.number)')
    ISSUE_COUNT=$(echo "$ISSUES_LIST" | jq length)
    [ "$PAGE_COUNT" -lt 500 ] && break
    LAST_DATE=$(echo "$PAGE" | jq -r '.[-1].createdAt')
  done
fi
```
Same for PRs
```bash
PRS_LIST=$(gh pr list --repo $REPO --state open --limit 500 \
  --json number,title,labels,author,headRefName,baseRefName,isDraft,createdAt)
PR_COUNT=$(echo "$PRS_LIST" | jq length)

if [ "$PR_COUNT" -eq 500 ]; then
  LAST_DATE=$(echo "$PRS_LIST" | jq -r '.[-1].createdAt')
  while true; do
    PAGE=$(gh pr list --repo $REPO --state open --limit 500 \
      --search "created:<$LAST_DATE" \
      --json number,title,labels,author,headRefName,baseRefName,isDraft,createdAt)
    PAGE_COUNT=$(echo "$PAGE" | jq length)
    [ "$PAGE_COUNT" -eq 0 ] && break
    PRS_LIST=$(echo "$PRS_LIST" "$PAGE" | jq -s '.[0] + .[1] | unique_by(.number)')
    PR_COUNT=$(echo "$PRS_LIST" | jq length)
    [ "$PAGE_COUNT" -lt 500 ] && break
    LAST_DATE=$(echo "$PAGE" | jq -r '.[-1].createdAt')
  done
fi

echo "Total issues: $ISSUE_COUNT, Total PRs: $PR_COUNT"
```
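The `jq` merge step used in the pagination loops can be sanity-checked in isolation; the two input pages below are illustrative, not real API output:

```shell
# Two hypothetical result pages sharing item #2; the merge dedupes by .number
PAGE_A='[{"number":1},{"number":2}]'
PAGE_B='[{"number":2},{"number":3}]'
echo "$PAGE_A $PAGE_B" | jq -s -c '.[0] + .[1] | unique_by(.number)'
# → [{"number":1},{"number":2},{"number":3}]
```

Note that `unique_by(.number)` also sorts by the key, which is harmless here since only membership and counts matter.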
**LARGE REPOSITORY HANDLING:**
If the total item count exceeds 50, you MUST still process ALL items. Use the pagination code above to fetch every single open issue and PR.
**DO NOT** sample or limit to 50 items - process the entire backlog.
Example: If there are 500 open issues, spawn 500 subagents. If there are 1000 open PRs, spawn 1000 subagents.
**Note:** Background task system will queue excess tasks automatically.
---

Phase 2: Classify
| Type | Detection |
|---|---|
| ISSUE_QUESTION | Contains `question` |
| ISSUE_BUG | Contains `bug` |
| ISSUE_FEATURE | Contains `feature` / `enhancement` |
| ISSUE_OTHER | Anything else |
| PR_BUGFIX | Title starts with `fix` |
| PR_OTHER | Everything else |
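A minimal classifier sketch under the assumption that detection keys off label keywords; the exact keyword strings in this sketch are assumptions for illustration, not part of the original spec:

```shell
# Hypothetical label-based classifier; keyword strings are illustrative
classify_issue() {
  case "$1" in
    *question*)              echo "ISSUE_QUESTION" ;;
    *bug*)                   echo "ISSUE_BUG" ;;
    *feature*|*enhancement*) echo "ISSUE_FEATURE" ;;
    *)                       echo "ISSUE_OTHER" ;;
  esac
}
classify_issue "bug,regression"
# → ISSUE_BUG
```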
Phase 3: Spawn Subagents (Individual Tool Calls)
CRITICAL: Create `task_create` tasks ONE BY ONE using individual tool calls. NEVER batch or script.
For each item, execute these steps sequentially:
Step 3.1: Create Task Record
```typescript
task_create(
  subject="Triage: #{number} {title}",
  description="GitHub {issue|PR} triage analysis - {type}",
  metadata={"type": "{ISSUE_QUESTION|ISSUE_BUG|ISSUE_FEATURE|ISSUE_OTHER|PR_BUGFIX|PR_OTHER}", "number": {number}}
)
```
Step 3.2: Spawn Analysis Subagent (Background)
```typescript
task(
  category="quick",
  run_in_background=true,
  load_skills=[],
  prompt=SUBAGENT_PROMPT
)
```

ABSOLUTE RULES for Subagents:
- ONLY ANALYZE - Never take action on GitHub (no comments, merges, closes)
- READ-ONLY - Use tools only for reading code/GitHub data
- WRITE REPORT ONLY - Output goes to `{REPORT_DIR}/{issue|pr}-{number}.md` via the Write tool
- EVIDENCE REQUIRED - Every claim must have a GitHub permalink as proof
For each item:
1. task_create(subject="Triage: #{number} {title}")
2. task(category="quick", run_in_background=true, load_skills=[], prompt=SUBAGENT_PROMPT)
3. Store mapping: item_number -> { task_id, background_task_id }
Subagent Prompts
Common Preamble (include in ALL subagent prompts)
CONTEXT:
- Repository: {REPO}
- Report directory: {REPORT_DIR}
- Current commit SHA: {COMMIT_SHA}
PERMALINK FORMAT:
Every factual claim MUST include a permalink: https://github.com/{REPO}/blob/{COMMIT_SHA}/{filepath}#L{start}-L{end}
No permalink = no claim. Mark unverifiable claims as [UNVERIFIED].
To get current SHA if needed: git rev-parse HEAD
ABSOLUTE RULES (violating ANY = critical failure):
- NEVER run gh issue comment, gh issue close, gh issue edit
- NEVER run gh pr comment, gh pr merge, gh pr review, gh pr edit
- NEVER run any gh command with -X POST, -X PUT, -X PATCH, -X DELETE
- NEVER run git checkout, git fetch, git pull, git switch, git worktree
- Your ONLY writable output: {REPORT_DIR}/{issue|pr}-{number}.md via the Write tool

ISSUE_QUESTION
You are analyzing issue #{number} for {REPO}.
ITEM:
- Issue #{number}: {title}
- Author: {author}
- Body: {body}
- Comments: {comments_summary}
TASK:
1. Understand the question.
2. Search the codebase (Grep, Read) for the answer.
3. For every finding, construct a permalink: https://github.com/{REPO}/blob/{COMMIT_SHA}/{path}#L{N}
4. Write report to {REPORT_DIR}/issue-{number}.md
REPORT FORMAT (write this as the file content):

Issue #{number}: {title}
Type: Question | Author: {author} | Created: {createdAt}
Question

[1-2 sentence summary]

Findings

[Each finding with permalink proof. Example:]
- The config is parsed in src/config/loader.ts#L42-L58

Suggested Answer

[Draft answer with code references and permalinks]

Confidence: [HIGH | MEDIUM | LOW]

[Reason. If LOW: what's missing]

Recommended Action

[What maintainer should do]

REMEMBER: No permalink = no claim. Every code reference needs a permalink.

---
ISSUE_BUG
You are analyzing bug report #{number} for {REPO}.
ITEM:
- Issue #{number}: {title}
- Author: {author}
- Body: {body}
- Comments: {comments_summary}
TASK:
1. Understand: expected behavior, actual behavior, reproduction steps.
2. Search the codebase for relevant code. Trace the logic.
3. Determine verdict: CONFIRMED_BUG, NOT_A_BUG, ALREADY_FIXED, or UNCLEAR.
4. For ALREADY_FIXED: find the fixing commit using git log/git blame. Include the commit SHA and what changed.
5. For every finding, construct a permalink.
6. Write report to {REPORT_DIR}/issue-{number}.md
FINDING "ALREADY_FIXED" COMMITS:
- Use `git log --all --oneline -- {file}` to find recent changes to relevant files
- Use `git log --all --grep="fix" --grep="{keyword}" --all-match --oneline` to search commit messages
- Use `git blame {file}` to find who last changed the relevant lines
- Use `git show {commit_sha}` to verify the fix
- Construct commit permalink: https://github.com/{REPO}/commit/{fix_commit_sha}
REPORT FORMAT (write this as the file content):

Issue #{number}: {title}
Type: Bug Report | Author: {author} | Created: {createdAt}
Bug Summary
Expected: [what user expects]
Actual: [what actually happens]
Reproduction: [steps if provided]

Verdict: [CONFIRMED_BUG | NOT_A_BUG | ALREADY_FIXED | UNCLEAR]

Analysis

Evidence

[Each piece of evidence with permalink. No permalink = mark [UNVERIFIED]]

Root Cause (if CONFIRMED_BUG)

[Which file, which function, what goes wrong]
- Problematic code: {path}#L{N}

Why Not A Bug (if NOT_A_BUG)

[Rigorous proof with permalinks that current behavior is correct]

Fix Details (if ALREADY_FIXED)

- Fixed in commit: {short_sha}
- Fixed date: {date}
- What changed: [description with diff permalink]
- Fixed by: {author}

Blockers (if UNCLEAR)

[What prevents determination, what to investigate next]

Severity: [LOW | MEDIUM | HIGH | CRITICAL]

Affected Files

[List with permalinks]

Suggested Fix (if CONFIRMED_BUG)

[Specific approach: "In {file}#L{N}, change X to Y because Z"]

Recommended Action

[What maintainer should do]

CRITICAL: Claims without permalinks are worthless. If you cannot find evidence, say so explicitly rather than making unverified claims.

---
ISSUE_FEATURE
You are analyzing feature request #{number} for {REPO}.
ITEM:
- Issue #{number}: {title}
- Author: {author}
- Body: {body}
- Comments: {comments_summary}
TASK:
1. Understand the request.
2. Search codebase for existing (partial/full) implementations.
3. Assess feasibility.
4. Write report to {REPORT_DIR}/issue-{number}.md
REPORT FORMAT (write this as the file content):

Issue #{number}: {title}
Type: Feature Request | Author: {author} | Created: {createdAt}
Request Summary
[What the user wants]

Existing Implementation: [YES_FULLY | YES_PARTIALLY | NO]

[If exists: where, with permalinks to the implementation]

Feasibility: [EASY | MODERATE | HARD | ARCHITECTURAL_CHANGE]

Relevant Files

[With permalinks]

Implementation Notes

[Approach, pitfalls, dependencies]

Recommended Action

[What maintainer should do]

---
ISSUE_OTHER
You are analyzing issue #{number} for {REPO}.
ITEM:
- Issue #{number}: {title}
- Author: {author}
- Body: {body}
- Comments: {comments_summary}
TASK: Assess and write report to {REPORT_DIR}/issue-{number}.md
REPORT FORMAT (write this as the file content):

Issue #{number}: {title}
Type: [QUESTION | BUG | FEATURE | DISCUSSION | META | STALE]
Author: {author} | Created: {createdAt}
Summary
[1-2 sentences]

Needs Attention: [YES | NO]

Suggested Label: [if any]

Recommended Action: [what maintainer should do]

---
PR_BUGFIX
You are reviewing PR #{number} for {REPO}.
ITEM:
- PR #{number}: {title}
- Author: {author}
- Base: {baseRefName} <- Head: {headRefName}
- Draft: {isDraft} | Mergeable: {mergeable}
- Review: {reviewDecision} | CI: {statusCheckRollup_summary}
- Body: {body}
TASK:
1. Fetch PR details (READ-ONLY): gh pr view {number} --repo {REPO} --json files,reviews,comments,statusCheckRollup,reviewDecision
2. Read diff: gh api repos/{REPO}/pulls/{number}/files
3. Search codebase to verify fix correctness.
4. Write report to {REPORT_DIR}/pr-{number}.md
REPORT FORMAT (write this as the file content):
PR #{number}: {title}
Type: Bugfix | Author: {author}
Base: {baseRefName} <- {headRefName} | Draft: {isDraft}
Fix Summary
[What bug, how fixed - with permalinks to changed code]

Code Review

Correctness

[Is fix correct? Root cause addressed? Evidence with permalinks]

Side Effects

[Risky changes, breaking changes - with permalinks if any]

Code Quality

[Style, patterns, test coverage]

Merge Readiness

| Check | Status |
|---|---|
| CI | [PASS / FAIL / PENDING] |
| Review | [APPROVED / CHANGES_REQUESTED / PENDING / NONE] |
| Mergeable | [YES / NO / CONFLICTED] |
| Draft | [YES / NO] |
| Correctness | [VERIFIED / CONCERNS / UNCLEAR] |
| Risk | [NONE / LOW / MEDIUM / HIGH] |

Files Changed

[List with brief descriptions]

Recommended Action: [MERGE | REQUEST_CHANGES | NEEDS_REVIEW | WAIT]

[Reasoning with evidence]

NEVER merge. NEVER comment. NEVER review. Write to file ONLY.

---
PR_OTHER
You are reviewing PR #{number} for {REPO}.
ITEM:
- PR #{number}: {title}
- Author: {author}
- Base: {baseRefName} <- Head: {headRefName}
- Draft: {isDraft} | Mergeable: {mergeable}
- Review: {reviewDecision} | CI: {statusCheckRollup_summary}
- Body: {body}
TASK:
1. Fetch PR details (READ-ONLY): gh pr view {number} --repo {REPO} --json files,reviews,comments,statusCheckRollup,reviewDecision
2. Read diff: gh api repos/{REPO}/pulls/{number}/files
3. Write report to {REPORT_DIR}/pr-{number}.md
REPORT FORMAT (write this as the file content):

PR #{number}: {title}
Type: [FEATURE | REFACTOR | DOCS | CHORE | TEST | OTHER]
Author: {author}
Base: {baseRefName} <- {headRefName} | Draft: {isDraft}
Summary
[2-3 sentences with permalinks to key changes]

Status

| Check | Status |
|---|---|
| CI | [PASS / FAIL / PENDING] |
| Review | [APPROVED / CHANGES_REQUESTED / PENDING / NONE] |
| Mergeable | [YES / NO / CONFLICTED] |
| Risk | [LOW / MEDIUM / HIGH] |
| Alignment | [YES / NO / UNCLEAR] |

Files Changed

[Count and key files]

Blockers

[If any]

Recommended Action: [MERGE | REQUEST_CHANGES | NEEDS_REVIEW | CLOSE | WAIT]

[Reasoning]

NEVER merge. NEVER comment. NEVER review. Write to file ONLY.

---
Phase 4: Collect & Update
Poll `background_output()` per task. As each completes:
- Parse the report.
- `task_update(id=task_id, status="completed", description=REPORT_SUMMARY)`
- Stream results to the user immediately.
---

Phase 5: Final Summary
Write to `{REPORT_DIR}/SUMMARY.md` AND display to user:
GitHub Triage Report - {REPO}
Date: {date} | Commit: {COMMIT_SHA}
Items Processed: {total}
Report Directory: {REPORT_DIR}
Issues ({issue_count})
| Category | Count |
|---|---|
| Bug Confirmed | {n} |
| Bug Already Fixed | {n} |
| Not A Bug | {n} |
| Needs Investigation | {n} |
| Question Analyzed | {n} |
| Feature Assessed | {n} |
| Other | {n} |
PRs ({pr_count})
| Category | Count |
|---|---|
| Bugfix Reviewed | {n} |
| Other PR Reviewed | {n} |
Items Requiring Attention
[Each item: number, title, verdict, 1-line summary, link to report file]
Report Files
[All generated files with paths]
---
Anti-Patterns
| Violation | Severity |
|---|---|
| ANY GitHub mutation (comment/close/merge/review/label/edit) | CRITICAL |
| Claim without permalink | CRITICAL |
| Using category other than `quick` | CRITICAL |
| Batching multiple items into one task | CRITICAL |
| Setting `run_in_background=false` | CRITICAL |
| `git checkout` to a PR branch | CRITICAL |
| Guessing without codebase evidence | HIGH |
| Not writing report to `{REPORT_DIR}` | HIGH |
| Using branch name instead of commit SHA in permalink | HIGH |