Add Opik tracing to an existing codebase. Detects language (Python/TypeScript), identifies LLM frameworks, adds appropriate decorators and integrations, marks entrypoints, and wires up environment config. Use for "instrument my code", "add opik tracing", "add observability", or "trace my agent".
Install the skill:

```shell
npx skill4agent add comet-ml/opik-skills instrument
```

Arguments (e.g. the target path) are passed through `$ARGUMENTS`.

## Detection

Detect the language from the files present (Python: `*.py`, `pyproject.toml`, `requirements.txt`; TypeScript: `*.ts`, `*.tsx`, `package.json`), then identify frameworks by import pattern:

| Import pattern | Framework | Integration |
|---|---|---|
| `openai` | OpenAI | `track_openai` |
| `anthropic` | Anthropic | `track_anthropic` |
| `langchain` | LangChain | `OpikTracer` callback |
| `langgraph` | LangGraph | `OpikTracer` callback |
| `crewai` | CrewAI | `track_crewai` |
| `dspy` | DSPy | `OpikCallback` |
| `google.genai` | Google Gemini | `track_genai` |
| `boto3` (Bedrock client) | AWS Bedrock | `track_bedrock` |
| `llama_index` | LlamaIndex | `set_global_handler("opik")` |
| `litellm` | LiteLLM | `OpikLogger` / `current_span_data` |
| `pydantic_ai` | Pydantic AI | Logfire OTLP bridge |
| `google.adk` | Google ADK | `OpikTracer` (ADK) |
| `ollama` | Ollama | OpenAI-compatible client + `track_openai` |
| `agents` | OpenAI Agents SDK | `OpikTracingProcessor` |
| `haystack` | Haystack | `OpikConnector` |
| `openai` (npm) | OpenAI (TS) | `trackOpenAI` (`opik-openai`) |
| `ai` (npm) | Vercel AI SDK | `OpikExporter` (`opik-vercel`) |
| `@langchain/*` (npm) | LangChain.js | `opik-langchain` |
| `@google/genai` (npm) | Gemini (TS) | `opik-gemini` |
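The detection step above can be sketched in a few lines. Everything here (the function name, the regexes, the subset of frameworks) is illustrative, not the skill's actual implementation:

```python
# Hypothetical framework detection: scan source text for the import
# patterns from the table above and report which frameworks appear.
import re

FRAMEWORK_PATTERNS = {
    "OpenAI": r"^\s*(import openai|from openai\b)",
    "Anthropic": r"^\s*(import anthropic|from anthropic\b)",
    "LangChain": r"^\s*from langchain",
    "LiteLLM": r"^\s*(import litellm|from litellm\b)",
}

def detect_frameworks(source: str) -> list[str]:
    """Return the frameworks whose import pattern appears in the source."""
    return [
        name
        for name, pattern in FRAMEWORK_PATTERNS.items()
        if re.search(pattern, source, flags=re.MULTILINE)
    ]
```

A real implementation would walk the project tree and parse imports properly; substring scanning is enough to pick the right integration in most codebases.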
## Add decorators and integrations

Install `opik` and decorate every function on the call path with `@opik.track`. Mark the entrypoint, typically a function named `main`, `run`, `agent`, or `handle_message`. Full per-framework instructions are in `references/integrations.md`.

```python
# OpenAI
from opik.integrations.openai import track_openai
client = track_openai(OpenAI())  # wrap existing client

# Anthropic
from opik.integrations.anthropic import track_anthropic
client = track_anthropic(anthropic.Anthropic())

# LangChain / LangGraph
from opik.integrations.langchain import OpikTracer
tracer = OpikTracer()
# pass config={"callbacks": [tracer]} to invoke()

# LiteLLM inside @opik.track — CRITICAL: pass span context
from opik.opik_context import get_current_span_data
# in every litellm.completion() call, add:
# metadata={"opik": {"current_span_data": get_current_span_data()}}
```

```typescript
// OpenAI
import { trackOpenAI } from "opik-openai";
const trackedClient = trackOpenAI(openai);

// Vercel AI SDK
import { OpikExporter } from "opik-vercel";
// set up NodeSDK with OpikExporter
```

Functions not covered by an integration get `@opik.track` directly (add `import opik` at the top of the file), with the decorator chosen by role:

| Function role | Decorator |
|---|---|
| Entrypoint (top-level agent) | `@opik.track(entrypoint=True)` |
| LLM call | `@opik.track(type="llm")` |
| Tool / retrieval | `@opik.track(type="tool")` |
| Guardrail / validation | `@opik.track(type="guardrail")` |
| Other helper in the call chain | `@opik.track` |
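Decorator placement matters when a framework decorator (such as a web route) is also present: Python applies stacked decorators bottom-up, so the tracing decorator should sit closest to the function and the framework decorator stays outermost, registering the already-traced function. A dependency-free sketch, where `fake_track` and `fake_route` are stand-ins, not Opik or Flask APIs:

```python
# Demonstrates decorator stacking order: the innermost decorator wraps the
# function first; the outermost one then sees (and registers) that wrapper.
import functools

calls: list[str] = []

def fake_track(fn):                      # stand-in for @opik.track
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        calls.append("trace:start")
        result = fn(*args, **kwargs)
        calls.append("trace:end")
        return result
    return wrapper

def fake_route(fn):                      # stand-in for @app.route(...)
    calls.append(f"registered:{fn.__name__}")  # runs at decoration time
    return fn

@fake_route   # outermost: registers the traced handler
@fake_track   # innermost: wraps the handler itself
def handle_message(message: str) -> str:
    return message.upper()
```

Calling `handle_message("hi")` returns `"HI"` and records the trace events around the call, after the one-time registration entry.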
Keep framework decorators such as `@app.route` above `@opik.track` so the route registers the traced function. In short-lived scripts, call `opik.flush_tracker()` before exit so buffered traces are sent.

## TypeScript: manual traces

When no integration covers the code, create traces and spans with the `Opik` client directly:

```typescript
import { Opik } from "opik";
const client = new Opik({ projectName: "<project-name>" });
// In the entrypoint function:
const trace = client.trace({ name: "<agent-name>", input: { ... } });
const span = trace.span({ name: "<operation>", type: "tool", input: { ... } });
// ... logic
span.end({ output: { ... } });
trace.end({ output: { ... } });
await client.flush();
```

Alternatively, wrap functions with the `track` helper:

```typescript
import { track } from "opik";
const myAgent = track(
  { name: "<agent-name>", entrypoint: true, params: [{ name: "query", type: "string" }] },
  async (query: string) => { /* ... */ }
);
```

## Conversation threads

For multi-turn agents, set `thread_id` on the current trace; turns that share a `thread_id` are grouped into one conversation thread:

```python
@opik.track(entrypoint=True)
def handle_message(session_id: str, message: str) -> str:
    opik.update_current_trace(thread_id=session_id)
    return generate_response(session_id, message)
```

## Environment config

- Credentials come from `OPIK_API_KEY` and `OPIK_WORKSPACE` (plus `OPIK_URL_OVERRIDE` for self-hosted deployments), set in `.env` or `.env.local`; Opik also reads `~/.opik.config`, and `opik connect` can set them up.
- Never write real keys into tracked files; add placeholders to `.env.example` / `.env.sample` instead.
- Set `project_name` so traces land in the intended project.

## Install

- Python: `pip install opik`
- TypeScript: `npm install opik` plus the integration package in use (`opik-openai`, `opik-vercel`, `opik-langchain`, `opik-gemini`)

## Verify

- Every function on the call path is decorated with `@opik.track`, and the entrypoint uses `entrypoint=True`.
- Short-lived scripts flush: `opik.flush_tracker()` in Python, `await client.flush()` in TypeScript.
- LiteLLM calls inside `@opik.track` pass `current_span_data`; raw LLM helpers use `@opik.track(type="llm")` or a wrapped client (e.g. `track_openai`); standalone LiteLLM uses the `OpikLogger` callback.
- Configuration is in place via `opik connect`, `.env`, or `~/.opik.config`.

## References

- `../opik/references/tracing-python.md`
- `../opik/references/tracing-typescript.md`
- `../opik/references/integrations.md`
- `../opik/references/observability.md`