cx-data-pipeline
# Data Pipeline Skill
Use this skill when configuring how Coralogix processes, enriches, and transforms data. It covers parsing rules (extract structured fields from raw logs), enrichments (add context from lookup tables), Events2Metrics (derive metrics from log/span events), and recording rules (precompute PromQL expressions).
## CLI Commands
| Command | Subcommands | Purpose |
|---|---|---|
| `cx parsing-rules` | `list`, `get`, `create`, `update`, `bulk-delete`, `usage-limits` | Manage log parsing rules |
| `cx enrichments` | `list`, `add`, `settings`, `limit` | Manage enrichment rules |
| `cx enrichments custom` | `list`, `create`, `search` | Manage custom enrichment tables |
| `cx e2m` | `list`, `get`, `create`, `limits`, `labels-cardinality` | Manage Events2Metrics definitions |
| `cx recording-rules` | `list`, `get`, `create`, `update` | Manage Prometheus recording rule groups |
Key flags:
- All create/update operations use `--from-file <path>` (or `-` for stdin)
- All commands support `-o json` for structured output and `-p <profile>` for profile selection
- `cx parsing-rules update` and `cx recording-rules update` require both `--from-file` and the rule group ID
- `cx enrichments custom search` requires `--id <table-id>` and `--query <text>`
- `cx parsing-rules bulk-delete` requires `--ids <id1> <id2> ...`
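These flags compose with `jq`. For example, collecting the IDs of disabled rule groups for a later `bulk-delete` can be sketched as follows (the inline array stands in for real `cx parsing-rules list -o json` output; the group names are illustrative):

```bash
# Inline sample standing in for `cx parsing-rules list -o json`
groups='[{"id":"g1","name":"nginx","enabled":true},{"id":"g2","name":"legacy","enabled":false}]'

# Print the IDs of disabled groups, one per line
echo "$groups" | jq -r '.[] | select(.enabled | not) | .id'
```

The resulting IDs can then be passed to `cx parsing-rules bulk-delete --ids`.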
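Since every create/update goes through `--from-file`, a quick pre-flight that the payload at least parses as JSON saves a round trip. A minimal sketch (the sample file content is illustrative):

```bash
# Write an illustrative payload, then pre-flight it with jq before --from-file
printf '{"name":"my-rules","rules":[]}' > /tmp/template.json
if jq -e . /tmp/template.json >/dev/null; then
  echo "valid JSON"
else
  echo "invalid JSON" >&2
fi
```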
## Working with JSON Payloads
These commands use complex JSON structures. Always template from an existing resource to avoid format errors:

```bash
# 1. Get an existing resource as a template
cx parsing-rules get <rule-group-id> -o json > template.json

# 2. Modify the template (change fields, remove the ID for create operations)

# 3. Create or update
cx parsing-rules create --from-file template.json
cx parsing-rules update --from-file template.json <rule-group-id>
```

This pattern applies to all create/update operations across all 4 commands. It prevents payload format errors, the #1 cause of failed attempts.
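The "modify the template" step can itself be scripted with `jq`. A minimal sketch, assuming the payload exposes the group ID as a top-level `id` key (inspect your real template first); the inline sample stands in for `cx parsing-rules get` output:

```bash
# Sample payload standing in for `cx parsing-rules get <rule-group-id> -o json`
payload='{"id":"abc123","name":"nginx-parsing","rules":[]}'

# Strip the ID so the result is valid for a create operation
echo "$payload" | jq 'del(.id)'
```

With a real group, you would pipe `cx parsing-rules get <rule-group-id> -o json` into the same `jq` filter and redirect the result to `template.json`.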
---

## Parsing Rules Workflow
### 1. List Existing Rules
```bash
cx parsing-rules list -o json
cx parsing-rules list -o json | jq '[.[] | {id, name, enabled, rule_count: (.rules | length)}]'
```

### 2. Get a Template
```bash
cx parsing-rules get <existing-rule-group-id> -o json > rule-template.json
```

### 3. Create New Rule Group
Edit the template for your new service, then:
```bash
cx parsing-rules create --from-file rule-template.json
```

### 4. Verify Parsing
Use the `cx-query-logs` skill to query recent logs and confirm fields are extracted:

```bash
cx logs 'source logs | filter $d.subsystem == "my-service" | limit 10' -o json
```

### 5. Check Usage Limits
```bash
cx parsing-rules usage-limits -o json
```

## Enrichment Workflow
### 1. List Enrichment Rules
```bash
cx enrichments list -o json
cx enrichments settings -o json
cx enrichments limit -o json
```

### 2. Create Custom Enrichment Table (if needed)
```bash
cx enrichments custom list -o json
cx enrichments custom create --from-file table-definition.json
```

### 3. Add Enrichment Rules
```bash
cx enrichments add --from-file enrichment-rules.json
```

### 4. Search Custom Table Data
```bash
cx enrichments custom search --id <table-id> --query "search term"
```

### 5. Verify Enriched Fields
Query logs on hot storage (the FrequentSearch tier) to confirm the enriched fields appear. Avoid verifying against the archive; ingestion delays there can cause false negatives.
```bash
cx logs 'source logs | filter $d.enriched_field != null | limit 5' -o json
```

## Events2Metrics Workflow
### 1. Design the Metric
Decide the metric name, labels, and aggregation type before creating.
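Label choice drives cardinality, which `cx e2m labels-cardinality` reports after the fact; you can also estimate it up front from a log sample with `jq`. A sketch (the inline array stands in for a `cx logs ... -o json` result, and `status_code` is an illustrative candidate label):

```bash
# Inline sample standing in for a recent log query's JSON output
logs='[{"status_code":200},{"status_code":404},{"status_code":200},{"status_code":500}]'

# Count the distinct values the candidate label would take
echo "$logs" | jq '[.[] | .status_code] | unique | length'
```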
### 2. Check Limits
```bash
cx e2m limits -o json
cx e2m labels-cardinality -o json
```

### 3. Get a Template
```bash
cx e2m list -o json
cx e2m get <existing-e2m-id> -o json > e2m-template.json
```

### 4. Create E2M Definition
```bash
cx e2m create --from-file e2m-definition.json
```

### 5. Verify Metric
Use the `cx-metrics-query` skill to confirm the new metric appears:

```bash
cx metrics search --name "new_metric_name"
```

## Recording Rules Workflow
### 1. List Existing Recording Rules
```bash
cx recording-rules list -o json
cx recording-rules list -o json | jq '[.[] | {id, name, rules: [.rules[]?.record]}]'
```

### 2. Get a Template
```bash
cx recording-rules get <existing-id> -o json > recording-rule-template.json
```

### 3. Create Recording Rule Group
```bash
cx recording-rules create --from-file recording-rule-group.json
```

### 4. Verify with PromQL
Use the `cx-metrics-query` skill to confirm the precomputed metric is available:

```bash
cx metrics query "new_precomputed_metric" --time now
```

## Key Principles
- Always template from existing - run `cx <command> get <id> -o json > template.json` before any create
- Verify after create - query logs/metrics to confirm the pipeline change took effect
- Use `-o json` - all payload inspection and creation should use JSON output
- Check limits first - run `cx parsing-rules usage-limits` and `cx e2m limits` before creating to avoid hitting caps
- Bulk operations - use `cx parsing-rules bulk-delete --ids` for cleanup, not individual deletes
## Related Skills
- `cx-query-logs` - verify parsing results and enriched fields in log data
- `cx-metrics-query` - verify E2M and recording rule output metrics
- `cx-dataprime` - DataPrime syntax reference for rule expressions
- `cx-telemetry-querying` - discover what data is available before configuring the pipeline