# Sync Provider


Sync changes from cloned provider repositories while preserving local customizations.

## Overview


Most providers (except `opencode`) are cloned from external repositories and need to be kept in sync with upstream changes. This skill guides the complete workflow from checking for updates to applying changes safely.

## Prerequisites


1. **GitHub CLI (`gh`)**: Must be installed and authenticated.

   ```bash
   gh --version
   gh auth login
   gh auth status
   ```

2. **GitHub Token**: Add `GITHUB_TOKEN` to the root `.env` file.

   ```bash
   GITHUB_TOKEN=your_token_here
   ```

3. **Repository Info**: Each provider's `package.json` contains a `repository.url` field (format: `https://github.com/owner/repo-name`).
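For reference, the `repository` field that later steps read typically looks like the excerpt below; the values shown are placeholders for illustration, not a real provider's metadata:

```json
{
  "repository": {
    "type": "git",
    "url": "https://github.com/owner/repo-name"
  }
}
```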

## Workflow


### Step 1: Identify Provider and Repository


Check the provider's `package.json` for the repository URL:

```bash
cat providers/<provider>/package.json | grep -A 2 repository
```

Extract the `owner/repo` format: `https://github.com/owner/repo-name` becomes `owner/repo-name`.

### Step 2: Check for New Commits


Use `check-provider-commit.sh` to automatically check for new commits:

```bash
COMMIT_HASH=$(scripts/check-provider-commit.sh <provider> 2>/dev/null)
EXIT_CODE=$?

if [ $EXIT_CODE -eq 0 ] && [ -n "$COMMIT_HASH" ]; then
  echo "Proceeding with sync from commit: $COMMIT_HASH"
elif [ $EXIT_CODE -eq 0 ]; then
  echo "Already up to date - no sync needed"
  exit 0
else
  echo "No state file found - need to determine initial commit hash manually"
  exit 1
fi
```

**Script Behavior:**

- **Up to date**: Exits with code 0 with empty output and shows "Already up to date"
- **New commits**: Outputs the last synced commit hash and exits with code 0
- **No state file**: Exits with code 1 and shows the latest commit (the first sync requires determining the initial commit hash manually)

**Important**: Redirect stderr (`2>/dev/null`) when capturing the commit hash to avoid mixing in informational messages.

### Step 3: Run Git Diff Script


Execute the sync script with appropriate parameters:

```bash
REPO=$(cat providers/<provider>/package.json | grep -A 2 repository | grep url | cut -d'"' -f4 | sed 's|https://github.com/||')

pnpm exec tsx scripts/git-diff.ts \
  -r $REPO \
  -c "$COMMIT_HASH" \
  -i "src/**/*" \
  -i "docs/**/*" \
  -i "examples/**/*" \
  --state-file-path ./providers/<provider>/diff_last_commit.txt
```

**Parameters:**

- `-r, --repo`: GitHub repository in `owner/repo` format (required)
- `-c, --commit`: Commit hash to compare from (required) - use the last synced commit from the state file
- `-i, --include`: Glob pattern(s) to filter files (can be specified multiple times). Always use these three default patterns:
  - `src/**/*` - all source files
  - `docs/**/*` - documentation files
  - `examples/**/*` - example files
- `--state-file-path`: Path to store the last synced commit hash
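The `grep`/`cut`/`sed` pipeline above is fragile if the field layout of `package.json` changes. Since this is a pnpm/tsx project, Node should already be on PATH, so a sturdier alternative (a sketch, not part of the documented workflow; a temp file stands in for the real `package.json`) is to read the JSON instead of pattern-matching it:

```shell
# Sketch: read repository.url as JSON instead of pattern-matching it.
# A temp file stands in for providers/<provider>/package.json.
cat > /tmp/sample-package.json <<'EOF'
{ "repository": { "type": "git", "url": "https://github.com/owner/repo-name" } }
EOF

REPO=$(node -p "require('/tmp/sample-package.json').repository.url.replace('https://github.com/','')")
echo "$REPO"   # prints owner/repo-name
```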

### Step 4: Deep Research and Analysis


**MANDATORY**: Before applying any changes, perform deep research:

```bash
# List all generated diff files
ls -R .diffs/<commit_hash>/

# Review all diffs to understand scope
find .diffs/<commit_hash>/ -name "*.txt" -exec echo "=== {} ===" \; -exec cat {} \;

# Get list of affected files
find .diffs/<commit_hash>/ -name "*.txt" | sed "s|.diffs/<commit_hash>/||" | sed 's|.txt$||'
```
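Before reading every diff in full, a rough per-directory count can help gauge scope (a convenience sketch, not required by the workflow; a fabricated layout stands in for `.diffs/<commit_hash>/`):

```shell
# Sketch: count diff files per top-level directory to gauge scope.
mkdir -p /tmp/demo-diffs/src /tmp/demo-diffs/docs
touch /tmp/demo-diffs/src/a.ts.txt /tmp/demo-diffs/src/b.ts.txt /tmp/demo-diffs/docs/readme.md.txt

find /tmp/demo-diffs -name "*.txt" | awk -F/ '{print $(NF-1)}' | sort | uniq -c
```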

**Research Steps:**

1. **Analyze All Diffs**: Review every diff file to understand:
   - Files being modified, added, or removed
   - Nature of changes (bug fixes, features, refactoring, breaking changes)
   - Impact on existing local customizations
   - Dependencies between changes

2. **Check Local Customizations**: Review current local files to identify:
   - Custom modifications that might conflict
   - Local additions that should be preserved
   - Configuration differences

3. **Review Project Rules**: **MANDATORY STEP** - Check `.cursor/rules` to understand:
   - Which rules apply to this provider (TypeScript, React, etc.)
   - Project standards and patterns
   - Best practices for the technologies involved


### Step 5: Create Refactoring Plan Using Pal MCP


**MANDATORY**: Use the Pal MCP refactor tool with Gemini 3.0 Pro to create a comprehensive plan.

<critical>
- **TASK INVALIDATION**: **THE TASK WILL BE INVALIDATED** if you make ANY edits to files before completing ALL steps of the Pal Refactor tool
- **NO EXCEPTIONS**: You MUST complete the entire Pal Refactor workflow (all steps until `next_step_required: false`) before touching ANY files
- **VERIFICATION**: Only proceed to Step 7 after receiving confirmation that all Pal Refactor steps are complete
</critical>

**Refactor Tool Usage:**

1. **Identify Relevant Files**: Collect all files that will be affected:
   - Files from `.diffs/<commit_hash>/` that have changes
   - Current local files in `providers/<provider>/` that correspond to changed files
   - Use FULL absolute paths for all files
2. **Run Pal Refactor Analysis**:

   ```typescript
   // Use mcp_zen_refactor with:
   // - model: "gemini-3.0-pro" or "anthropic/claude-opus-4.6"
   // - relevant_files: Array of absolute paths to affected files
   // - refactor_type: "modernize" or "organization" (as appropriate)
   // - focus_areas: ["sync-upstream-changes", "preserve-local-customizations"]
   ```

3. **Complete ALL Steps**: MANDATORY - finalize ALL steps until `next_step_required: false`
   - Continue calling the refactor tool until you receive confirmation that all steps are complete
   - Do NOT proceed to any file edits until this is confirmed
   - TASK WILL BE INVALIDATED if you skip this step
4. **Review Refactoring Recommendations**: The tool will provide:
   - The best approach for applying changes
   - How to preserve local customizations
   - Potential conflicts and how to resolve them
   - Code quality improvements
   - Architecture considerations

**Refactor Tool Requirements:**

- **Model**: Must use `gemini-3.0-pro` or `anthropic/claude-opus-4.6`
- **Multi-step**: Complete ALL steps until `next_step_required: false` - NO EXCEPTIONS
- **File Paths**: Use FULL absolute paths (e.g., `/Users/pedronauck/Dev/compozy/compozy-code/providers/claude-code/src/index.ts`)
- **Focus Areas**: Include context about syncing upstream changes while preserving local customizations
- **Completion Verification**: Only proceed when the tool confirms all steps are complete

### Step 6: Create Implementation Plan


Based on the Pal refactor analysis, create a detailed implementation plan:

1. **Prioritize Changes**: Order by dependencies (apply dependencies first), risk level (low-risk first), and impact (critical files first)
2. **Identify Conflicts**: Document files with local customizations that conflict with upstream changes, the strategy for resolving each conflict, and decisions on what to preserve vs. update
3. **Plan Testing Strategy**: Define which tests to run after each change, how to verify local customizations are preserved, and the integration points to test
4. **Document Decisions**: Record why certain changes are applied or skipped, how local customizations are preserved, and any architectural decisions made
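One lightweight way to capture the plan and its decisions is to scaffold a plan file before touching code. The filename and headings below are a suggested convention, not a project requirement:

```shell
# Hypothetical convention: scaffold a sync plan file before touching code.
PLAN=/tmp/SYNC_PLAN.md   # in real use: providers/<provider>/SYNC_PLAN.md

cat > "$PLAN" <<'EOF'
# Sync Plan

## Prioritized Changes
<!-- ordered by dependencies, risk level, impact -->

## Conflicts and Resolutions
<!-- file, conflicting customization, decision: preserve / update -->

## Testing Strategy
<!-- tests to run after each change -->

## Decisions
<!-- why changes were applied or skipped -->
EOF

echo "plan scaffolded at $PLAN"
```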

### Step 7: Apply Changes According to Plan


<critical>
- **VERIFICATION REQUIRED**: Before proceeding, verify that:
  1. All Pal Refactor steps are complete (`next_step_required: false`)
  2. You have received the complete refactoring analysis and recommendations
  3. You have created a detailed implementation plan (Step 6)
- **TASK INVALIDATION**: **THE TASK WILL BE INVALIDATED** if you start editing files before completing ALL Pal Refactor steps
- **NO EXCEPTIONS**: Even if you think you understand the changes, you MUST complete the full Pal Refactor workflow first
</critical>

Only after completing steps 4-6 AND verifying Pal Refactor completion, apply changes:

1. **For Modified Files**: Apply changes according to the refactor plan, preserving local customizations
2. **For Added Files**: Add new files following project standards
3. **For Removed Files**: Evaluate whether the removal should be applied locally (may need to preserve)
4. **For Renamed Files**: Handle the rename and content changes according to the plan

**Best Practices:**

- Apply changes incrementally, following the prioritized plan
- Test after each significant change
- Preserve local customizations as identified in the plan
- Document any deviations from the plan

### Step 8: Update State File


After successfully applying changes, verify the state file was updated:

```bash
cat providers/<provider>/diff_last_commit.txt
# Should contain the latest commit hash that was synced
```
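To catch a sync that silently failed to update the state, the file's contents can be checked against the shape of a git commit hash (a hedged sketch; a temp file with a sample hash stands in for the real state file):

```shell
# Sketch: fail loudly if the state file does not hold a plausible commit hash.
STATE_FILE=/tmp/diff_last_commit.txt   # in real use: providers/<provider>/diff_last_commit.txt
echo "1a2b3c4d5e6f7a8b9c0d1a2b3c4d5e6f7a8b9c0d" > "$STATE_FILE"

if grep -qE '^[0-9a-f]{7,40}$' "$STATE_FILE"; then
  echo "state file OK: $(cat "$STATE_FILE")"
else
  echo "state file missing or malformed" >&2
fi
```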


## Example Workflow


### Syncing claude-code Provider


```bash
# 1. Check repository info
cat providers/claude-code/package.json | grep repository

# 2. Check for new commits
COMMIT_HASH=$(scripts/check-provider-commit.sh claude-code 2>/dev/null)
EXIT_CODE=$?

if [ $EXIT_CODE -ne 0 ]; then
  echo "⚠️ No state file found - need to determine initial commit hash manually"
  exit 1
elif [ -z "$COMMIT_HASH" ]; then
  echo "✅ Already up to date - no sync needed"
  exit 0
fi

# 3. Extract repository name
REPO=$(cat providers/claude-code/package.json | grep -A 2 repository | grep url | cut -d'"' -f4 | sed 's|https://github.com/||')

# 4. Run sync script
pnpm exec tsx scripts/git-diff.ts \
  -r $REPO \
  -c "$COMMIT_HASH" \
  -i "src/**/*" \
  -i "docs/**/*" \
  -i "examples/**/*" \
  --state-file-path ./providers/claude-code/diff_last_commit.txt

# 5. Deep research and analysis
find .diffs/$COMMIT_HASH/ -name "*.txt" -exec echo "=== {} ===" \; -exec cat {} \;

# 6. Use Pal MCP refactor tool (complete all steps)
# 7. Create implementation plan
# 8. Apply changes according to plan

# 9. Verify state file updated
cat providers/claude-code/diff_last_commit.txt
```

## Available Providers


| Provider | Repository | Notes |
|----------|------------|-------|
| claude-code | ben-vargas/ai-sdk-provider-claude-code | Cloned |
| gemini | ben-vargas/ai-sdk-provider-gemini-cli | Cloned |
| codex | ben-vargas/ai-sdk-provider-codex-cli | Cloned |
| opencode | N/A | Created locally - no sync needed |

## Critical Requirements


<critical>
- **YOU MUST** use the `gh` CLI to automatically check for new commits before syncing
- **YOU MUST** verify the repository URL from `package.json` before running the script
- **YOU MUST** always use the three default include patterns: `src/**/*`, `docs/**/*`, `examples/**/*`
- **YOU MUST** use the commit hash from the state file (last synced commit) as the `-c` parameter
- **YOU MUST** perform deep research and analysis of all diffs before creating a plan
- **YOU MUST** review `.cursor/rules` files to understand which rules apply (MANDATORY STEP)
- **YOU MUST** use the Pal MCP refactor tool with Gemini 3.0 Pro to find the best way to apply changes
- **YOU MUST** complete ALL steps of the Pal refactor tool - don't stop the process in the middle
- **YOU MUST** verify Pal Refactor completion (`next_step_required: false`) before proceeding to any file edits
- **YOU MUST** create a detailed implementation plan based on the refactor analysis before applying changes
- **YOU MUST** use FULL absolute paths when using Pal MCP tools (never relative paths)
- **YOU MUST** preserve local customizations when applying upstream changes
- **YOU MUST** test the provider after applying changes (`pnpm test` in the provider directory)
- **YOU MUST** run lint and typecheck after applying changes (`pnpm run lint && pnpm run typecheck`)
- **YOU MUST** update the state file path to match the provider directory structure
- **NEVER** sync the `opencode` provider (it's created locally, not cloned)
- **NEVER** apply changes directly without deep research and planning first
- **NEVER** start editing files before completing ALL Pal Refactor steps
- **NEVER** use workarounds - always prefer good and well-designed solutions
- **ALWAYS** use the exact repository format: `owner/repo-name` (no `https://github.com/` prefix)
- **ALWAYS** check that the `.diffs/<commit_hash>/` folder exists and contains the expected files before planning
- **ALWAYS** verify that the `gh` CLI is authenticated (`gh auth status`)
- **ALWAYS** follow the greenfield approach - don't worry about backwards compatibility, prioritize quality
- **ALWAYS** double-check which rules from `.cursor/rules` apply to the task before starting (MANDATORY STEP)
</critical>

## Troubleshooting


### No files matching glob pattern


- Check if the commit hash is correct
- Verify the repository name format
- Use broader patterns like `**/*` to see all changes

### State file not found


- **First sync**: Use the initial commit hash from when you cloned
- **Subsequent syncs**: The script creates the state file automatically
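For a first sync, one way to seed the state file is with the commit your clone was taken from; `gh api repos/<owner>/<repo>/commits --jq '.[0].sha'` can list recent upstream commits to choose from. A sketch (the hash and path below are placeholders):

```shell
# First sync only: seed the state file with the commit your clone was taken from.
# The hash below is a placeholder - substitute the real initial commit.
STATE_FILE=/tmp/diff_last_commit.txt   # in real use: providers/<provider>/diff_last_commit.txt
echo "0000000000000000000000000000000000000000" > "$STATE_FILE"
cat "$STATE_FILE"
```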

### Binary files detected


Binary files are noted in `.diffs/<commit_hash>/<file-path>.txt` but not diffed. Handle these manually:

- **Images**: Review in GitHub or download directly
- **Other binaries**: Decide if an update is needed

### GitHub API rate limits


- Wait before retrying
- Use a GitHub token with higher rate limits
- Consider syncing in smaller batches

## After Syncing


- **MUST TEST**: Run tests in the provider directory:

  ```bash
  cd providers/<provider>
  pnpm test
  pnpm run lint
  pnpm run typecheck
  ```

- **MUST VERIFY**: Ensure all changes are correctly applied and no local customizations are lost
- **MUST DOCUMENT**: Record any decisions made during the sync process, especially:
  - Conflicts resolved
  - Local customizations preserved
  - Deviations from upstream changes
  - Architectural decisions