# Datadog Logs (dd-logs)

Search, process, and archive logs with cost awareness.

## Prerequisites

Datadog Pup (dd-pup/pup) should already be installed:

```bash
go install github.com/datadog-labs/pup@latest
```

## Quick Start

```bash
pup auth login
```

## Search Logs

```bash
# Basic search
pup logs search --query="status:error" --from="1h"

# With filters
pup logs search --query="service:api status:error" --from="1h" --limit 100

# JSON output
pup logs search --query="@http.status_code:>=500" --from="1h" --json
```
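The `--json` flag makes results scriptable. A minimal post-processing sketch in Python (the event shape below is an assumption for illustration; real field names depend on your log schema):

```python
import json

# Stand-in for `pup logs search ... --json` output; field names here
# are assumed, not guaranteed by the CLI.
raw = '''[
  {"service": "api", "status": "error", "message": "upstream timeout"},
  {"service": "api", "status": "info",  "message": "request ok"},
  {"service": "web", "status": "error", "message": "500 returned"}
]'''

events = json.loads(raw)
errors = [e for e in events if e["status"] == "error"]
print(len(errors))  # 2
```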

## Search Syntax

| Query | Meaning |
|-------|---------|
| `error` | Full-text search |
| `status:error` | Tag equals |
| `@http.status_code:500` | Attribute equals |
| `@http.status_code:>=400` | Numeric range |
| `service:api AND env:prod` | Boolean |
| `@message:*timeout*` | Wildcard |
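To make the tag vs. attribute distinction concrete, here is a toy single-term matcher (illustrative only; real Datadog query semantics are much richer):

```python
def matches(event: dict, query: str) -> bool:
    """Toy matcher: `tag:value` checks tags, `@attr:value` checks attributes."""
    key, _, value = query.partition(":")
    if key.startswith("@"):
        # Attribute queries target parsed event content.
        return str(event.get("attributes", {}).get(key[1:])) == value
    # Tag queries target metadata attached at ingestion.
    return event.get("tags", {}).get(key) == value

event = {"tags": {"status": "error"}, "attributes": {"http.status_code": 500}}
print(matches(event, "status:error"))           # True - tag match
print(matches(event, "@http.status_code:500"))  # True - attribute match
```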

## Pipelines

Process logs before indexing:

```bash
# List pipelines
pup logs pipelines list

# Create pipeline (JSON)
pup logs pipelines create --json @pipeline.json
```

## Common Processors

```json
{
  "name": "API Logs",
  "filter": {"query": "service:api"},
  "processors": [
    {
      "type": "grok-parser",
      "name": "Parse nginx",
      "source": "message",
      "grok": {"match_rules": "%{IPORHOST:client_ip} %{DATA:method} %{DATA:path} %{NUMBER:status}"}
    },
    {
      "type": "status-remapper",
      "name": "Set severity",
      "sources": ["level", "severity"]
    },
    {
      "type": "attribute-remapper",
      "name": "Remap user_id",
      "sources": ["user_id"],
      "target": "usr.id"
    }
  ]
}
```
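The `grok-parser` rule above pulls four fields out of an access-log line. As a rough illustration of what it does, here is a plain-regex equivalent in Python (simplified; grok's `IPORHOST`/`DATA`/`NUMBER` patterns are more permissive than these):

```python
import re

# Simplified stand-in for:
# %{IPORHOST:client_ip} %{DATA:method} %{DATA:path} %{NUMBER:status}
LINE = re.compile(
    r"(?P<client_ip>\S+) (?P<method>\S+) (?P<path>\S+) (?P<status>\d+)"
)

m = LINE.match("203.0.113.7 GET /api/users 200")
print(m.groupdict())
# {'client_ip': '203.0.113.7', 'method': 'GET', 'path': '/api/users', 'status': '200'}
```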

## ⚠️ Exclusion Filters (Cost Control)

Index only what matters:

```json
{
  "name": "Drop debug logs",
  "filter": {"query": "status:debug"},
  "is_enabled": true
}
```
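Back-of-envelope impact: exclusion filters cut indexing spend roughly in proportion to the volume they drop. The numbers below are made up for illustration; substitute your own volumes and contract rate:

```python
# Hypothetical figures - replace with your own.
monthly_events = 500_000_000   # events/month ingested
debug_share = 0.40             # fraction matching status:debug
price_per_million = 1.70       # $/million indexed events (illustrative)

indexed_before = monthly_events / 1_000_000 * price_per_million
indexed_after = indexed_before * (1 - debug_share)
print(f"${indexed_before:,.0f} -> ${indexed_after:,.0f} per month")
```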

### High-Volume Exclusions

```bash
# Find noisiest log sources
pup logs search --query="*" --from="1h" --json | jq 'group_by(.service) | map({service: .[0].service, count: length}) | sort_by(-.count)[:10]'
```

| Exclude | Query |
|---------|-------|
| Health checks | `@http.url:"/health" OR @http.url:"/ready"` |
| Debug logs | `status:debug` |
| Static assets | `@http.url:*.css OR @http.url:*.js` |
| Heartbeats | `@message:*heartbeat*` |
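If jq is not available, the same top-talkers ranking is a few lines of Python; a sketch assuming each event carries a `service` field:

```python
from collections import Counter

# Sample events standing in for parsed `pup logs search --json` output.
events = [
    {"service": "api"}, {"service": "api"}, {"service": "api"},
    {"service": "web"}, {"service": "web"},
    {"service": "worker"},
]

counts = Counter(e["service"] for e in events)
top = counts.most_common(10)  # same shape as jq's sort_by(-.count)[:10]
print(top)  # [('api', 3), ('web', 2), ('worker', 1)]
```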

## Archives

Store logs cheaply for compliance:

```bash
# List archives
pup logs archives list
```

Archive config (S3 example):

```json
{
  "name": "compliance-archive",
  "query": "*",
  "destination": {
    "type": "s3",
    "bucket": "my-logs-archive",
    "path": "/datadog"
  },
  "rehydration_tags": ["team:platform"]
}
```

### Rehydrate (Restore)

```bash
# Rehydrate archived logs
pup logs rehydrate create \
  --archive-id abc123 \
  --from "2024-01-01T00:00:00Z" \
  --to "2024-01-02T00:00:00Z" \
  --query "service:api status:error"
```
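Rehydration scans everything in the window, so it pays to sanity-check `--from`/`--to` before submitting. A small helper (hypothetical, not part of pup):

```python
from datetime import datetime

def window_hours(frm: str, to: str) -> float:
    """Span of a rehydration window in hours; raises on an inverted window."""
    start = datetime.fromisoformat(frm.replace("Z", "+00:00"))
    end = datetime.fromisoformat(to.replace("Z", "+00:00"))
    if end <= start:
        raise ValueError("--to must be after --from")
    return (end - start).total_seconds() / 3600

print(window_hours("2024-01-01T00:00:00Z", "2024-01-02T00:00:00Z"))  # 24.0
```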

## Log-Based Metrics

Create metrics from logs (cheaper than indexing):

```bash
# Count errors per service
pup logs metrics create \
  --name "api.errors.count" \
  --query "service:api status:error" \
  --group-by "endpoint"
```

**⚠️ Cardinality warning:** Group by bounded values only.
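Why bounded grouping matters: custom-metric cost scales with the number of distinct timeseries, which is the product of distinct values per grouped tag. Illustrative arithmetic (the counts are made up):

```python
# Each extra group-by dimension multiplies the timeseries count.
endpoints = 50         # bounded: fine to group by
status_codes = 10      # bounded: fine to group by
user_ids = 1_000_000   # unbounded: never group by this

print(endpoints * status_codes)             # 500 timeseries - OK
print(endpoints * status_codes * user_ids)  # 500,000,000 - cardinality explosion
```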

## Sensitive Data

### Scrubbing Rules

```json
{
  "type": "hash-remapper",
  "name": "Hash emails",
  "sources": ["email", "@user.email"]
}
```
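What a hash remapper effectively does: replace the value with a stable one-way digest, so the same email still correlates across logs without being readable. A Python illustration (SHA-256 is an assumption here; the exact digest Datadog uses is not documented in this guide):

```python
import hashlib

def hash_value(value: str) -> str:
    # Deterministic digest: identical inputs map to the same token,
    # so hashed fields stay usable for grouping and joins.
    return hashlib.sha256(value.encode()).hexdigest()[:16]

a = hash_value("alice@example.com")
b = hash_value("alice@example.com")
print(a == b)  # True - deterministic, but not reversible
```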

### Never Log

```python
# In your app - sanitize before sending
import re

def sanitize_log(message: str) -> str:
    # Remove credit cards
    message = re.sub(r'\b\d{4}[-\s]?\d{4}[-\s]?\d{4}[-\s]?\d{4}\b', '[REDACTED]', message)
    # Remove SSNs
    message = re.sub(r'\b\d{3}-\d{2}-\d{4}\b', '[REDACTED]', message)
    return message
```
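The card pattern covers dashed, spaced, and compact number formats; a standalone check of that regex:

```python
import re

# Same credit-card pattern as the sanitizer above, tested on its own.
CARD = re.compile(r"\b\d{4}[-\s]?\d{4}[-\s]?\d{4}[-\s]?\d{4}\b")

for msg in ("card 4111-1111-1111-1111",
            "card 4111 1111 1111 1111",
            "card 4111111111111111"):
    print(CARD.sub("[REDACTED]", msg))  # card [REDACTED] (all three formats)
```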

## Troubleshooting

| Problem | Fix |
|---------|-----|
| Logs not appearing | Check agent, pipeline filters |
| High costs | Add exclusion filters |
| Search slow | Narrow time range, use indexes |
| Missing attributes | Check grok parser |

## References/Documentation