sentry-setup-ai-monitoring


Set Up Sentry AI Agent Monitoring


Configure Sentry to track LLM calls, agent executions, tool usage, and token consumption.

Invoke This Skill When


  • User asks to "monitor AI/LLM calls" or "track OpenAI/Anthropic usage"
  • User wants "AI observability" or "agent monitoring"
  • User asks about token usage, model latency, or AI costs

Prerequisites


AI monitoring requires tracing enabled (`tracesSampleRate > 0`).

Detection First


Always detect installed AI SDKs before configuring.

JavaScript:

```bash
grep -E '"(openai|@anthropic-ai/sdk|ai|@langchain|@google/genai)"' package.json
```

Python:

```bash
grep -E '(openai|anthropic|langchain|huggingface)' requirements.txt pyproject.toml 2>/dev/null
```
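The same detection can be scripted rather than grepped. A minimal sketch, assuming a JavaScript project; the `detect_ai_sdks` helper and `AI_PACKAGES` list are illustrative, not part of any Sentry tooling:

```python
import json

# Packages whose presence suggests an AI SDK worth instrumenting.
AI_PACKAGES = {"openai", "@anthropic-ai/sdk", "ai", "@google/genai"}
AI_PREFIXES = ("@langchain/",)

def detect_ai_sdks(package_json_text):
    """Return the AI SDK package names found in a package.json string."""
    data = json.loads(package_json_text)
    deps = {}
    for key in ("dependencies", "devDependencies"):
        deps.update(data.get(key, {}))
    return sorted(
        name for name in deps
        if name in AI_PACKAGES or name.startswith(AI_PREFIXES)
    )
```

For example, a `package.json` declaring `openai` and `@langchain/core` would yield both names, while a project with no AI dependencies yields an empty list.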

Supported SDKs


JavaScript


| Package | Integration | Min Sentry SDK | Auto? |
| --- | --- | --- | --- |
| `openai` | `openAIIntegration()` | 10.2.0 | Yes |
| `@anthropic-ai/sdk` | `anthropicAIIntegration()` | 10.12.0 | Yes |
| `ai` (Vercel) | `vercelAIIntegration()` | 10.6.0 | Node only* |
| `@langchain/*` | `langChainIntegration()` | 10.22.0 | Yes |
| `@langchain/langgraph` | `langGraphIntegration()` | 10.25.0 | Yes |
| `@google/genai` | `googleGenAIIntegration()` | 10.14.0 | Yes |

*Vercel AI requires explicit setup for the Edge runtime and `experimental_telemetry` per call.

Python


| Package | Install | Min SDK |
| --- | --- | --- |
| `openai` | `pip install "sentry-sdk[openai]"` | 2.41.0 |
| `anthropic` | `pip install "sentry-sdk[anthropic]"` | 2.x |
| `langchain` | `pip install "sentry-sdk[langchain]"` | 2.x |
| `huggingface_hub` | `pip install "sentry-sdk[huggingface_hub]"` | 2.x |

JavaScript Configuration


Auto-enabled integrations (OpenAI, Anthropic, Google GenAI, LangChain)


Just ensure tracing is enabled. To capture prompts/outputs:
```javascript
Sentry.init({
  dsn: "YOUR_DSN",
  tracesSampleRate: 1.0,
  integrations: [
    Sentry.openAIIntegration({ recordInputs: true, recordOutputs: true }),
  ],
});
```

Next.js OpenAI (additional step required)


For Next.js projects using OpenAI, you must wrap the client:
```javascript
import OpenAI from "openai";
import * as Sentry from "@sentry/nextjs";

const openai = Sentry.instrumentOpenAiClient(new OpenAI());
// Use the 'openai' client as normal
```

LangChain / LangGraph (explicit)


```javascript
integrations: [
  Sentry.langChainIntegration({ recordInputs: true, recordOutputs: true }),
  Sentry.langGraphIntegration({ recordInputs: true, recordOutputs: true }),
],
```

Vercel AI SDK


Add to `sentry.edge.config.ts` for the Edge runtime:

```javascript
integrations: [Sentry.vercelAIIntegration()],
```

Enable telemetry per call:

```javascript
await generateText({
  model: openai("gpt-4o"),
  prompt: "Hello",
  experimental_telemetry: { isEnabled: true, recordInputs: true, recordOutputs: true },
});
```

Python Configuration


```python
import sentry_sdk
from sentry_sdk.integrations.openai import OpenAIIntegration  # or anthropic, langchain

sentry_sdk.init(
    dsn="YOUR_DSN",
    traces_sample_rate=1.0,
    send_default_pii=True,  # Required for prompt capture
    integrations=[OpenAIIntegration(include_prompts=True)],
)
```

Manual Instrumentation


Use when no supported SDK is detected.

Span Types


| `op` Value | Purpose |
| --- | --- |
| `gen_ai.request` | Individual LLM calls |
| `gen_ai.invoke_agent` | Agent execution lifecycle |
| `gen_ai.execute_tool` | Tool/function calls |
| `gen_ai.handoff` | Agent-to-agent transitions |

Example (JavaScript)


```javascript
await Sentry.startSpan({
  op: "gen_ai.request",
  name: "LLM request gpt-4o",
  attributes: { "gen_ai.request.model": "gpt-4o" },
}, async (span) => {
  span.setAttribute("gen_ai.request.messages", JSON.stringify(messages));
  const result = await llmClient.complete(prompt);
  span.setAttribute("gen_ai.usage.input_tokens", result.inputTokens);
  span.setAttribute("gen_ai.usage.output_tokens", result.outputTokens);
  return result;
});
```

Key Attributes


| Attribute | Description |
| --- | --- |
| `gen_ai.request.model` | Model identifier |
| `gen_ai.request.messages` | JSON input messages |
| `gen_ai.usage.input_tokens` | Input token count |
| `gen_ai.usage.output_tokens` | Output token count |
| `gen_ai.agent.name` | Agent identifier |
| `gen_ai.tool.name` | Tool identifier |
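As a sketch of how these attributes are consumed downstream, here is a small hypothetical helper (not part of the Sentry SDK) that totals token usage across a list of span attribute dicts, e.g. when summarizing cost per trace:

```python
def total_token_usage(span_attribute_dicts):
    """Sum gen_ai token-usage attributes across a list of span attribute dicts.

    Spans missing a usage attribute (e.g. some streaming responses)
    simply contribute zero.
    """
    totals = {"input_tokens": 0, "output_tokens": 0}
    for attributes in span_attribute_dicts:
        totals["input_tokens"] += attributes.get("gen_ai.usage.input_tokens", 0)
        totals["output_tokens"] += attributes.get("gen_ai.usage.output_tokens", 0)
    return totals
```

For two spans reporting 120/40 and 80/0 tokens, this returns totals of 200 input and 40 output tokens.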

PII Considerations


Prompts/outputs are PII. To capture them:
  • JS: `recordInputs: true, recordOutputs: true` per integration
  • Python: `include_prompts=True` + `send_default_pii=True`

Troubleshooting


| Issue | Solution |
| --- | --- |
| AI spans not appearing | Verify `tracesSampleRate > 0`; check SDK version |
| Token counts missing | Some providers don't return token counts for streaming responses |
| Prompts not captured | Enable `recordInputs` / `include_prompts` |
| Vercel AI not working | Add `experimental_telemetry` to each call |
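When a provider omits token counts for streamed responses, a rough client-side estimate can fill the gap in manual instrumentation. A sketch using the common "~4 characters per token" rule of thumb (a crude approximation, not the provider's tokenizer; real counts should always be preferred when available):

```python
def estimate_tokens(text):
    """Very rough token estimate: ~4 characters per token for English text."""
    if not text:
        return 0
    return max(1, len(text) // 4)

# Attach the estimate as gen_ai.usage.output_tokens on the manual span
# only when the provider returned no usage data for the stream.
```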