sf-datacloud-connect: Data Cloud Connect Phase
Use this skill when the user needs source connection work: connector discovery, connection metadata, connection testing, browsing source objects, or understanding what connector type to use.
When This Skill Owns the Task
Use when the work involves:
- the `sf data360 connection *` commands
- connector catalog inspection
- connection creation, update, test, or delete
- browsing source objects, fields, databases, or schemas
- identifying connector types already in use
Delegate elsewhere when the user is:
- creating data streams or DLOs → sf-datacloud-prepare
- creating DMOs, mappings, IR rulesets, or data graphs → sf-datacloud-harmonize
- writing Data Cloud SQL or search-index workflows → sf-datacloud-retrieve
Required Context to Gather First
Ask for or infer:
- target org alias
- connector type or source system
- whether the user wants inspection only or live mutation
- connection name if one already exists
- whether credentials are already configured outside the CLI
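The checklist above can be enforced as a small preflight guard before running any connection command. This is a sketch, not part of the plugin: the variable names (`TARGET_ORG`, `CONNECTOR_TYPE`, `CONNECTION_NAME`) and the `require_context` helper are illustrative, not anything the `sf` CLI reads.

```shell
# Hypothetical preflight guard for the required-context checklist.
# TARGET_ORG / CONNECTOR_TYPE / CONNECTION_NAME are illustrative names.
require_context() {
  mode="$1"  # "inspect" or "mutate"
  [ -n "$TARGET_ORG" ]     || { echo "missing: target org alias"; return 1; }
  [ -n "$CONNECTOR_TYPE" ] || { echo "missing: connector type"; return 1; }
  # Live mutation additionally needs an existing connection name.
  if [ "$mode" = "mutate" ] && [ -z "$CONNECTION_NAME" ]; then
    echo "missing: connection name (required for live mutation)"
    return 1
  fi
  echo "context ok for $mode"
}

TARGET_ORG=myorg CONNECTOR_TYPE=REDSHIFT require_context inspect
```

Calling the guard with `mutate` and no `CONNECTION_NAME` fails fast instead of letting a half-specified create or delete reach the org.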
Core Operating Rules
- Verify the plugin runtime first; see ../sf-datacloud/references/plugin-setup.md.
- Run the shared readiness classifier before mutating connections: `node ~/.claude/skills/sf-datacloud/scripts/diagnose-org.mjs -o <org> --phase connect --json`.
- Prefer read-only discovery before connection creation.
- Suppress linked-plugin warning noise with `2>/dev/null` for standard usage.
- Remember that `connection list` requires `--connector-type`.
- Discover existing connector types from streams first when the org is unfamiliar.
- API-based external connector creation is supported, but payloads are connector-specific.
- Do not use query-plane errors from other phases to declare connect work unavailable.
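The `2>/dev/null` rule can be applied once in a wrapper instead of repeated on every command. A sketch, assuming the standard CLI behavior that linked-plugin warnings go to stderr; `sf` is replaced with a stub here so the pattern runs without an authenticated org, and the stub's warning text is invented.

```shell
# Stub standing in for the real `sf` CLI so this runs without an org.
# The warning text is invented; real linked-plugin noise is likewise
# emitted on stderr, which is why redirecting fd 2 silences it.
sf() {
  echo "Warning: linked plugin in use" >&2
  echo '{"status":0,"result":[]}'
}

# dc: run any data360 subcommand with warning noise suppressed.
dc() {
  sf data360 "$@" 2>/dev/null
}

dc connection connector-list -o myorg
```

Only the JSON line reaches stdout; the stderr warning is dropped, so the output stays parseable.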
Recommended Workflow
1. Classify readiness for connect work
```bash
node ~/.claude/skills/sf-datacloud/scripts/diagnose-org.mjs -o <org> --phase connect --json
```

2. Discover connector types
```bash
sf data360 connection connector-list -o <org> 2>/dev/null
sf data360 data-stream list -o <org> 2>/dev/null
```

3. Inspect connections by type
```bash
sf data360 connection list -o <org> --connector-type SalesforceDotCom 2>/dev/null
sf data360 connection list -o <org> --connector-type REDSHIFT 2>/dev/null
```

4. Inspect a specific connection
```bash
sf data360 connection get -o <org> --name <connection> 2>/dev/null
sf data360 connection objects -o <org> --name <connection> 2>/dev/null
sf data360 connection fields -o <org> --name <connection> 2>/dev/null
```

5. Test or create only after discovery
```bash
sf data360 connection test -o <org> --name <connection> 2>/dev/null
sf data360 connection create -o <org> -f connection.json 2>/dev/null
```

6. Start from curated example payloads for external connectors
Use the phase-owned examples before inventing a payload from scratch:
- examples/connections/heroku-postgres.json
- examples/connections/redshift.json
To discover payload fields for a connector type not covered by those examples, create one in the UI and inspect it:
```bash
sf api request rest "/services/data/v66.0/ssot/connections/<id>" -o <org>
```

High-Signal Gotchas
- `connection list` has no true global "list all" mode; query by connector type.
- The connection catalog name and connection connector type are not always the same label.
- Some external connector credential setup still depends on UI-side configuration.
- Use connection metadata inspection before guessing available source objects or databases.
- An empty connection list usually means "enabled but not configured yet", not "feature disabled".
- Heroku Postgres and Redshift payloads use different credential / parameter names. Reuse the curated examples instead of guessing.
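Since `connection list` has no global mode, iterating known connector types approximates one. A sketch of the loop shape; `sf` is stubbed so it runs without an org, and the type list here is illustrative.

```shell
# Stub for `sf` so the loop is runnable without an org; a real run
# would invoke the actual CLI with the same arguments.
sf() { echo "stub: $*"; }

# Emulate a global "list all" by querying each connector type in turn.
# The type list is illustrative; in a real org, derive it from
# `connection connector-list` or from existing data streams.
for t in SalesforceDotCom REDSHIFT S3; do
  sf data360 connection list -o myorg --connector-type "$t" 2>/dev/null
done
```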
Output Format
```text
Connect task: <inspect / create / test / update>
Connector type: <SalesforceDotCom / REDSHIFT / S3 / ...>
Target org: <alias>
Commands: <key commands run>
Verification: <passed / partial / blocked>
Next step: <prepare phase or connector follow-up>
```

References
- README.md
- examples/connections/heroku-postgres.json
- examples/connections/redshift.json
- ../sf-datacloud/references/plugin-setup.md
- ../sf-datacloud/references/feature-readiness.md
- ../sf-datacloud/UPSTREAM.md