# Instrument — Add Opik Tracing to a Codebase
You are instrumenting an existing codebase with Opik observability. Follow these steps precisely.

## Step 1 — Scope

If `$ARGUMENTS` is provided, scope your work to those files or directories. Otherwise, discover the project root and instrument the main application code.

## Step 2 — Detect Language & Frameworks
Scan the codebase to determine:

- Language: Python (look for `*.py`, `pyproject.toml`, `requirements.txt`) or TypeScript (look for `*.ts`, `*.tsx`, `package.json`)
- LLM frameworks in use — search imports for these patterns:

| Import pattern | Framework | Integration |
|---|---|---|
| `openai` | OpenAI | `track_openai` |
| `anthropic` | Anthropic | `track_anthropic` |
| `langchain` | LangChain | `OpikTracer` callback |
| `langgraph` | LangGraph | `OpikTracer` callback |
| `crewai` | CrewAI | see `references/integrations.md` |
| `dspy` | DSPy | see `references/integrations.md` |
| `google.genai` | Google Gemini | see `references/integrations.md` |
| `boto3` (Bedrock runtime) | AWS Bedrock | see `references/integrations.md` |
| `llama_index` | LlamaIndex | see `references/integrations.md` |
| `litellm` | LiteLLM | span-context metadata (see Step 4) |
| `pydantic_ai` | Pydantic AI | Logfire OTLP bridge |
| `google.adk` | Google ADK | see `references/integrations.md` |
| `ollama` | Ollama | see `references/integrations.md` |
| `agents` | OpenAI Agents SDK | see `references/integrations.md` |
| `haystack` | Haystack | see `references/integrations.md` |
| `openai` (npm) | OpenAI (TS) | `opik-openai` |
| `ai` (npm) | Vercel AI SDK | `opik-vercel` |
| `langchain` (npm) | LangChain.js | `opik-langchain` |
| `@google/genai` (npm) | Gemini (TS) | `opik-gemini` |

- Existing Opik usage — check whether `opik` or `@opik.track` is already imported. If so, audit rather than re-instrument.
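The detection pass above can be sketched as a simple import scan. This is an illustrative stdlib-only sketch, not Opik tooling; the regexes and the framework map cover only a subset of the table:

```python
import re

# Maps an import-pattern regex to the framework it signals (illustrative subset).
FRAMEWORK_PATTERNS = {
    r"^\s*(from|import)\s+openai\b": "OpenAI",
    r"^\s*(from|import)\s+anthropic\b": "Anthropic",
    r"^\s*(from|import)\s+litellm\b": "LiteLLM",
    r"^\s*(from|import)\s+langchain\b": "LangChain",
}

def detect_frameworks(source: str) -> set[str]:
    """Return the set of known LLM frameworks imported by one source file."""
    found = set()
    for line in source.splitlines():
        for pattern, framework in FRAMEWORK_PATTERNS.items():
            if re.match(pattern, line):
                found.add(framework)
    return found

sample = "import litellm\nfrom openai import OpenAI\n"
```

Running `detect_frameworks(sample)` yields `{"LiteLLM", "OpenAI"}`; in practice you would run this over every file in scope and union the results.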
## Step 3 — Identify the Call Graph

Find:

- Entrypoint: the top-level function that kicks off the agent (e.g., `main`, `run`, `agent`, `handle_message`, a route handler, or whatever the user's main orchestration function is)
- LLM call sites: functions that call an LLM provider directly
- Tool functions: retrieval, search, API calls, or other tool-like operations
- Existing config classes: dataclasses, Pydantic models, or plain classes holding model names, temperatures, prompts, or other tunable parameters
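One mechanical way to surface candidate LLM call sites is to walk the AST for method calls with provider-like names. A rough sketch with the stdlib `ast` module; the marker method names are assumptions about typical provider SDKs:

```python
import ast

# Assumed method names that suggest an LLM provider call (e.g. .create(), litellm.completion()).
LLM_CALL_MARKERS = {"create", "completion", "generate"}

def find_llm_call_sites(source: str) -> list[str]:
    """Return names of functions whose bodies call a suspected LLM method."""
    tree = ast.parse(source)
    hits = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            for call in ast.walk(node):
                if (isinstance(call, ast.Call)
                        and isinstance(call.func, ast.Attribute)
                        and call.func.attr in LLM_CALL_MARKERS):
                    hits.append(node.name)
                    break
    return hits

sample = """
def ask(prompt):
    return client.chat.completions.create(model="gpt-4o", messages=[])

def helper(x):
    return x + 1
"""
```

Here `find_llm_call_sites(sample)` returns `["ask"]` — a starting list to review by hand, not a definitive classification.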
## Step 4 — Add Framework Integrations

For each detected framework, add the appropriate integration at the module level. See the integration table above and `references/integrations.md` for the exact patterns.

**Python examples:**

```python
# OpenAI
from opik.integrations.openai import track_openai
client = track_openai(OpenAI())  # wrap existing client

# Anthropic
from opik.integrations.anthropic import track_anthropic
client = track_anthropic(anthropic.Anthropic())

# LangChain / LangGraph
from opik.integrations.langchain import OpikTracer
tracer = OpikTracer()
# pass config={"callbacks": [tracer]} to invoke()

# LiteLLM inside @opik.track — CRITICAL: pass span context
from opik.opik_context import get_current_span_data
# in every litellm.completion() call, add:
# metadata={"opik": {"current_span_data": get_current_span_data()}}
```
**TypeScript examples:**

```typescript
// OpenAI
import { trackOpenAI } from "opik-openai";
const trackedClient = trackOpenAI(openai);

// Vercel AI SDK
import { OpikExporter } from "opik-vercel";
// set up NodeSDK with OpikExporter
```

## Step 5 — Add `@opik.track` Decorators (Python) or Client Tracing (TypeScript)

**Python**
Add `import opik` at the top of each file you instrument.

| Function role | Decorator |
|---|---|
| Entrypoint (top-level agent) | `@opik.track(entrypoint=True)` |
| LLM call | `@opik.track(type="llm")` |
| Tool / retrieval | `@opik.track(type="tool")` |
| Guardrail / validation | `@opik.track(type="guardrail")` |
| Other helper in the call chain | `@opik.track` |

- Place the decorator above any existing decorators (e.g., above `@app.route`)
- For async functions, `@opik.track` works the same way — no changes needed
- If the function is a script entrypoint (not a long-running server), add `opik.flush_tracker()` after the top-level call
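Decorator order matters because only the outermost wrapper observes the fully decorated function. A stdlib-only sketch of why the tracer belongs on top, using stand-in decorators (not Opik's):

```python
import functools

calls: list[str] = []

def trace(fn):  # stand-in for @opik.track
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        calls.append(f"trace:{fn.__name__}")
        return fn(*args, **kwargs)
    return wrapper

def route(fn):  # stand-in for a framework decorator like @app.route
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        calls.append(f"route:{fn.__name__}")
        return fn(*args, **kwargs)
    return wrapper

@trace   # outermost: fires first, so the trace wraps the framework layer too
@route
def handler():
    return "ok"

handler()
```

After the call, `calls` is `["trace:handler", "route:handler"]`: the trace wrapper runs first and therefore times and captures everything beneath it. Flip the order and the framework layer would run outside the trace.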
**TypeScript**

Use the client-based approach:

```typescript
import { Opik } from "opik";
const client = new Opik({ projectName: "<project-name>" });

// In the entrypoint function:
const trace = client.trace({ name: "<agent-name>", input: { ... } });
const span = trace.span({ name: "<operation>", type: "tool", input: { ... } });
// ... logic
span.end({ output: { ... } });
trace.end({ output: { ... } });
await client.flush();
```

For entrypoints that should be discoverable by `opik connect`:

```typescript
import { track } from "opik";
const myAgent = track(
  { name: "<agent-name>", entrypoint: true, params: [{ name: "query", type: "string" }] },
  async (query: string) => { /* ... */ }
);
```

## Step 6 — Conversational Agents: Add `thread_id`

If the agent handles multi-turn conversations (chat bots, support agents, multi-step assistants), wire `thread_id`:

```python
@opik.track(entrypoint=True)
def handle_message(session_id: str, message: str) -> str:
    opik.update_current_trace(thread_id=session_id)
    return generate_response(session_id, message)
```

Skip this for single-shot agents or batch processing.
## Step 7 — Environment Config

Follow the setup decision tree from the main opik skill:

- If the project has `.env` / `.env.local` → append `OPIK_API_KEY`, `OPIK_WORKSPACE`, `OPIK_URL_OVERRIDE` (if missing)
- If no `.env` exists → Python: create/update `~/.opik.config`; TypeScript: create `.env` or `.env.local`
- Never introduce a second config mechanism
- Never overwrite existing values
- Update `.env.example` / `.env.sample` if one exists
- Set `project_name` in code, not in env files
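The "append if missing, never overwrite" rule can be sketched like this (key names come from the step above; the file handling is illustrative, operating on the `.env` body as a string):

```python
REQUIRED_KEYS = ("OPIK_API_KEY", "OPIK_WORKSPACE", "OPIK_URL_OVERRIDE")

def merge_env(existing: str) -> str:
    """Append any missing Opik keys to a .env body without touching existing values."""
    present = {line.split("=", 1)[0].strip()
               for line in existing.splitlines() if "=" in line}
    out = existing.rstrip("\n")
    for key in REQUIRED_KEYS:
        if key not in present:
            out += f"\n{key}="  # placeholder for the user to fill in
    return out + "\n"

merged = merge_env("OPIK_API_KEY=sk-123\nOTHER=1\n")
```

The existing `OPIK_API_KEY=sk-123` line survives unchanged, and only the two missing keys are appended as empty placeholders.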
## Step 8 — Install Dependencies

Print the install command but do NOT run it automatically. Let the user decide.

Python: `pip install opik`, plus any integration packages if needed (most are included in `opik`).

TypeScript: `npm install opik`, plus framework-specific packages as needed: `opik-openai`, `opik-vercel`, `opik-langchain`, `opik-gemini`.

## Step 9 — Verify
After instrumentation, do a quick audit:

- Every LLM call site is traced (via integration wrapper or `@opik.track`)
- Exactly one function has `entrypoint=True`
- Script entrypoints call `opik.flush_tracker()` (Python) or `await client.flush()` (TypeScript)
- LiteLLM calls inside `@opik.track` pass `current_span_data` via metadata
- No hardcoded API keys were introduced
- Existing tests still import correctly (no circular imports introduced)
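Part of this audit is mechanical. For example, the single-entrypoint check can be a regex pass over the instrumented sources (the pattern assumes the decorator is written inline, as in the examples above):

```python
import re

def count_entrypoints(sources: list[str]) -> int:
    """Count @opik.track(entrypoint=True) occurrences across source files."""
    pattern = re.compile(r"@opik\.track\(\s*entrypoint\s*=\s*True")
    return sum(len(pattern.findall(src)) for src in sources)

files = [
    '@opik.track(entrypoint=True)\ndef main(): ...',
    '@opik.track(type="tool")\ndef search(): ...',
]
```

`count_entrypoints(files)` returning anything other than 1 means the audit fails: zero breaks discovery, and two or more is ambiguous.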
## Anti-Patterns to Avoid

- Double-wrapping: don't add `@opik.track(type="llm")` to a function that already uses a framework integration (e.g., `track_openai`). The integration handles tracing.
- Orphaned LiteLLM traces: always pass `current_span_data` when `OpikLogger` is used inside `@opik.track` code.
- Missing entrypoint: without `entrypoint=True`, the Local Runner (`opik connect`) won't discover the agent.
- Missing flush: scripts that exit without flushing lose trace data.
- Overwriting config: check before writing to `.env` or `~/.opik.config`.
## References

For detailed API signatures and advanced patterns, see:

- `../opik/references/tracing-python.md` — Python SDK reference
- `../opik/references/tracing-typescript.md` — TypeScript SDK reference
- `../opik/references/integrations.md` — all 40+ framework integrations
- `../opik/references/observability.md` — core concepts (traces, spans, threads)