multi-llm-consult


Multi-LLM Consult


Overview


Use a bundled script to query external LLM providers with a sanitized prompt and return a concise comparison.

Setup


  • Configure API keys in the TUI: open the Command Palette (Ctrl+P) and run Configure LLM Providers.
  • Keys are stored in `settings.json` under `llm_providers`.
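As a rough sketch, the stored keys might look like the fragment below. Only the `llm_providers` key name comes from this document; the per-provider field layout is an assumption, not verified against the actual `settings.json` schema:

```json
{
  "llm_providers": {
    "gemini": { "api_key": "YOUR_GEMINI_KEY" },
    "qwen": { "api_key": "YOUR_QWEN_KEY" }
  }
}
```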

Workflow


  1. Identify the purpose (`second-opinion`, `plan`, `review`, or `delegate`).
  2. Summarize the task and sanitize sensitive data before sending it out.
  3. Run the consult script with the chosen provider.
  4. Compare responses and reconcile with your own plan before acting.
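Step 2's sanitization can be sketched as a small redaction pass over the prompt before it leaves the machine. This is a minimal illustration, not part of the bundled script; the secret patterns are assumptions and are far from exhaustive:

```python
import re

# Illustrative patterns only: an "sk-..."-style API key and common
# "password=..." / "token: ..." assignments. Extend for your own secrets.
SECRET_PATTERNS = [
    (re.compile(r"sk-[A-Za-z0-9]{20,}"), "[REDACTED_API_KEY]"),
    (re.compile(r"(?i)(password|token|secret)\s*[:=]\s*\S+"), r"\1=[REDACTED]"),
]

def sanitize(text: str) -> str:
    """Replace likely secrets in a prompt with redaction markers."""
    for pattern, replacement in SECRET_PATTERNS:
        text = pattern.sub(replacement, text)
    return text
```

Run the sanitized prompt through a quick manual scan as well; regexes catch patterns, not judgment calls like internal hostnames or customer names.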

Consult Script


Always run `--help` first:

```bash
python scripts/consult_llm.py --help
```

Example: second opinion

```bash
python scripts/consult_llm.py \
  --provider gemini \
  --purpose second-opinion \
  --prompt "We plan to refactor module X. What risks or gaps do you see?"
```

Example: delegate a review

```bash
python scripts/consult_llm.py \
  --provider qwen \
  --purpose review \
  --prompt-file /tmp/review_request.md \
  --context-file /tmp/patch.diff
```

Example: plan check with Codex (OpenAI)

```bash
python scripts/consult_llm.py \
  --provider codex \
  --purpose plan \
  --prompt "Draft a 5-step plan for implementing feature Y."
```

Output Handling


  • Treat responses as advisory; verify against repo constraints and current state.
  • Summarize the external response in 3-6 bullets before acting.
  • If responses conflict, call out the differences explicitly and choose a path.
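To compare or summarize responses programmatically, the consult script's output can be captured with a thin wrapper. This sketch assumes the script prints its result to stdout and uses only the flags shown in the examples above; the wrapper itself (`build_consult_cmd`, `consult`) is hypothetical helper code, not part of the bundled tooling:

```python
import subprocess

def build_consult_cmd(provider: str, purpose: str, prompt: str) -> list[str]:
    """Assemble the consult invocation using the flags documented above."""
    return [
        "python", "scripts/consult_llm.py",
        "--provider", provider,
        "--purpose", purpose,
        "--prompt", prompt,
    ]

def consult(provider: str, purpose: str, prompt: str) -> str:
    """Run the consult script and return its stdout for summarizing."""
    result = subprocess.run(
        build_consult_cmd(provider, purpose, prompt),
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```

Capturing stdout this way makes it easy to diff two providers' answers side by side before reconciling them with your own plan.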

References


  • Provider defaults and configuration:
    references/providers.md