# Datadog Logs (dd-logs)

Search, process, and archive logs with cost awareness.
## Prerequisites

Datadog Pup (`dd-pup`/`pup`) should already be installed:

```bash
go install github.com/datadog-labs/pup@latest
```

## Quick Start
```bash
pup auth login
```

## Search Logs
```bash
# Basic search
pup logs search --query="status:error" --from="1h"

# With filters
pup logs search --query="service:api status:error" --from="1h" --limit 100

# JSON output
pup logs search --query="@http.status_code:>=500" --from="1h" --json
```
## Search Syntax
| Query | Meaning |
|---|---|
| `error timeout` | Full-text search |
| `service:api` | Tag equals |
| `@http.status_code:404` | Attribute equals |
| `@http.status_code:[400 TO 499]` | Numeric range |
| `status:error AND service:api` | Boolean |
| `service:web*` | Wildcard |
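Terms compose freely with the boolean operators. A tiny helper for assembling query strings programmatically (illustrative only, not part of `pup`):

```python
def build_query(*terms: str, operator: str = "AND") -> str:
    """Join individual Datadog search terms into one query string.

    Illustrative helper, not part of pup itself.
    """
    return f" {operator} ".join(terms)

query = build_query("service:web*", "@http.status_code:[400 TO 499]", "-status:debug")
print(query)  # service:web* AND @http.status_code:[400 TO 499] AND -status:debug
```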
## Pipelines

Process logs before indexing:

```bash
# List pipelines
pup logs pipelines list

# Create pipeline (JSON)
pup logs pipelines create --json @pipeline.json
```

### Common Processors
```json
{
  "name": "API Logs",
  "filter": {"query": "service:api"},
  "processors": [
    {
      "type": "grok-parser",
      "name": "Parse nginx",
      "source": "message",
      "grok": {"match_rules": "access %{IPORHOST:client_ip} %{DATA:method} %{DATA:path} %{NUMBER:status}"}
    },
    {
      "type": "status-remapper",
      "name": "Set severity",
      "sources": ["level", "severity"]
    },
    {
      "type": "attribute-remapper",
      "name": "Remap user_id",
      "sources": ["user_id"],
      "target": "usr.id"
    }
  ]
}
```
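To see what the two remappers do, here is a rough Python sketch of their effect on a single record (an illustration, not Datadog's actual processing):

```python
def apply_remappers(record: dict) -> dict:
    """Rough sketch of the status-remapper and attribute-remapper above."""
    out = dict(record)
    # status-remapper: promote the first matching source to the official status
    for source in ("level", "severity"):
        if source in out:
            out["status"] = out.pop(source)
            break
    # attribute-remapper: rename user_id to the standard usr.id attribute
    if "user_id" in out:
        out["usr.id"] = out.pop("user_id")
    return out

result = apply_remappers({"level": "warn", "user_id": "42", "message": "slow query"})
print(result["status"], result["usr.id"])  # warn 42
```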
## ⚠️ Exclusion Filters (Cost Control)
Index only what matters:

```json
{
  "name": "Drop debug logs",
  "filter": {"query": "status:debug"},
  "is_enabled": true
}
```

### High-Volume Exclusions
```bash
# Find the noisiest log sources
pup logs search --query="*" --from="1h" --json | jq 'group_by(.service) | map({service: .[0].service, count: length}) | sort_by(-.count)[:10]'
```

| Exclude | Query |
|---------|-------|
| Health checks | `@http.url:"/health" OR @http.url:"/ready"` |
| Debug logs | `status:debug` |
| Static assets | `@http.url:*.css OR @http.url:*.js` |
| Heartbeats | `@message:*heartbeat*` |
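The jq pipeline above can also be sketched in Python (assuming the `--json` output yields one JSON record per line with a `service` field):

```python
import json
from collections import Counter

def noisiest_services(lines, top=10):
    """Count records per service, mirroring the jq group-by pipeline."""
    counts = Counter(json.loads(line).get("service", "unknown") for line in lines)
    return counts.most_common(top)

sample = [
    '{"service": "api", "message": "a"}',
    '{"service": "api", "message": "b"}',
    '{"service": "web", "message": "c"}',
]
print(noisiest_services(sample))  # [('api', 2), ('web', 1)]
```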
## Archives
Store logs cheaply for compliance:

```bash
# List archives
pup logs archives list
```

### Archive config (S3 example)
```json
{
  "name": "compliance-archive",
  "query": "*",
  "destination": {
    "type": "s3",
    "bucket": "my-logs-archive",
    "path": "/datadog"
  },
  "rehydration_tags": ["team:platform"]
}
```

## Rehydrate (Restore)
```bash
# Rehydrate archived logs
pup logs rehydrate create \
  --archive-id abc123 \
  --from "2024-01-01T00:00:00Z" \
  --to "2024-01-02T00:00:00Z" \
  --query "service:api status:error"
```
## Log-Based Metrics
Create metrics from logs (cheaper than indexing):

```bash
# Count api errors per endpoint
pup logs metrics create \
  --name "api.errors.count" \
  --query "service:api status:error" \
  --group-by "endpoint"
```

**⚠️ Cardinality warning:** Group by bounded values only.
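Why the warning matters: the series count is roughly the product of the distinct values of every `--group-by` tag. A back-of-the-envelope check (numbers are illustrative):

```python
# Each distinct combination of group-by values creates one time series.
endpoints = 50          # bounded: routes in the service
statuses = 5            # bounded: HTTP status classes
user_ids = 1_000_000    # unbounded: never group by this

print(endpoints * statuses)             # 250 series: fine
print(endpoints * statuses * user_ids)  # 250,000,000 series: cardinality explosion
```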
## Sensitive Data
### Scrubbing Rules

```json
{
  "type": "hash-remapper",
  "name": "Hash emails",
  "sources": ["email", "@user.email"]
}
```
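For intuition, a rough Python sketch of what a hash-remapper does (Datadog's actual hashing scheme may differ):

```python
import hashlib

def hash_remap(record: dict, sources=("email", "user.email")) -> dict:
    """Replace matching attribute values with a stable, irreversible hash."""
    out = dict(record)
    for key in sources:
        if key in out:
            # Truncated SHA-256 for readability; the real scheme may differ
            out[key] = hashlib.sha256(out[key].encode()).hexdigest()[:16]
    return out

rec = hash_remap({"email": "jane@example.com", "msg": "login ok"})
print(rec["email"])  # a stable hash, not the address
```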
### Never Log
```python
# In your app: sanitize before sending
import re

def sanitize_log(message: str) -> str:
    # Remove credit card numbers
    message = re.sub(r'\b\d{4}[-\s]?\d{4}[-\s]?\d{4}[-\s]?\d{4}\b', '[REDACTED]', message)
    # Remove SSNs
    message = re.sub(r'\b\d{3}-\d{2}-\d{4}\b', '[REDACTED]', message)
    return message
```

## Troubleshooting
| Problem | Fix |
|---|---|
| Logs not appearing | Check agent, pipeline filters |
| High costs | Add exclusion filters |
| Search slow | Narrow time range, use indexes |
| Missing attributes | Check grok parser |