# pica-openai-agents

Integrate PICA into an application using the OpenAI Agents SDK. Use when adding PICA tools to an OpenAI agent via `@openai/agents`, setting up PICA MCP with the OpenAI Agents SDK, or when the user mentions PICA with OpenAI Agents.

Source: picahq/skills

Install via NPX:

```bash
npx skill4agent add picahq/skills pica-openai-agents
```

## PICA MCP Integration with the OpenAI Agents SDK
PICA provides a unified API platform that connects AI agents to third-party services (CRMs, email, calendars, databases, etc.) through MCP tool calling.
### PICA MCP Server

PICA exposes its capabilities through an MCP server distributed as `@picahq/mcp`. It uses stdio transport, running as a local subprocess via `npx`.

#### MCP Configuration
```json
{
  "mcpServers": {
    "pica": {
      "command": "npx",
      "args": ["@picahq/mcp"],
      "env": {
        "PICA_SECRET": "your-pica-secret-key"
      }
    }
  }
}
```

- Package: `@picahq/mcp` (run via `npx`, no install needed)
- Auth: the `PICA_SECRET` environment variable (obtain it from the PICA dashboard at https://app.picaos.com/settings/api-keys)
- Transport: stdio (standard input/output)
### Environment Variables

Always store the PICA secret in an environment variable; never hardcode it:

```
PICA_SECRET=sk_test_...
OPENAI_API_KEY=sk-...
```

Add both to `.env.local` (or equivalent) and document them in `.env.example`.

### Using PICA with the OpenAI Agents SDK
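Before wiring up the SDK, it can help to fail fast when a required variable is missing. A minimal sketch, assuming both keys live in the environment; the `requireEnv` helper is illustrative, not part of either SDK:

```typescript
// Illustrative helper (not an SDK API): read a required environment
// variable or throw a descriptive error at startup.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage at startup, before constructing the MCP server or agent:
// const picaSecret = requireEnv("PICA_SECRET");
// const openaiKey = requireEnv("OPENAI_API_KEY");
```

Failing at startup gives a clearer error than letting the MCP subprocess reject an empty `PICA_SECRET` later.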
The OpenAI Agents SDK (`@openai/agents`) has first-class MCP support via `MCPServerStdio`. No additional MCP client package is needed; the SDK handles tool discovery, conversion, and execution automatically.

#### Required packages
```bash
pnpm add @openai/agents zod
```

- `@openai/agents`: the main SDK (includes `Agent`, `run`, and `MCPServerStdio`)
- `zod`: required by the SDK (v4+)
#### Before implementing: look up the latest docs
The OpenAI Agents SDK API may change between versions. Always check the latest docs first:
- Docs: https://openai.github.io/openai-agents-js/
- MCP guide: https://openai.github.io/openai-agents-js/guides/mcp/
- GitHub: https://github.com/openai/openai-agents-js
#### Integration pattern

- Create an MCP server using `MCPServerStdio` with `command: "npx"` and `args: ["@picahq/mcp"]`
- Connect the server via `await mcpServer.connect()`
- Create an `Agent` with `mcpServers: [mcpServer]`; tools are discovered automatically
- Run the agent with `run(agent, input, { stream: true })`; the SDK handles the full agent loop (tool calls, execution, multi-step)
- Stream events by iterating the result; handle `raw_model_stream_event` for text deltas and `run_item_stream_event` for tool calls
- Close the MCP server when done via `await mcpServer.close()`
When passing environment variables, spread `process.env` so the subprocess inherits `PATH` and other system variables:

```typescript
env: {
  ...(process.env as Record<string, string>),
  PICA_SECRET: process.env.PICA_SECRET!,
}
```

#### Minimal example
```typescript
import { Agent, run, MCPServerStdio } from "@openai/agents";

const mcpServer = new MCPServerStdio({
  name: "PICA MCP Server",
  command: "npx",
  args: ["@picahq/mcp"],
  env: {
    ...(process.env as Record<string, string>),
    PICA_SECRET: process.env.PICA_SECRET!,
  },
});

await mcpServer.connect();

try {
  const agent = new Agent({
    name: "PICA Assistant",
    model: "gpt-4o-mini",
    instructions: "You are a helpful assistant.",
    mcpServers: [mcpServer],
  });

  // Non-streaming
  const result = await run(agent, "List my connected integrations");
  console.log(result.finalOutput);

  // Streaming
  const streamResult = await run(agent, "List my connected integrations", {
    stream: true,
  });
  for await (const event of streamResult) {
    if (event.type === "raw_model_stream_event") {
      const data = event.data as Record<string, unknown>;
      if (data.type === "response.output_text.delta") {
        process.stdout.write(data.delta as string);
      }
    }
  }
  await streamResult.completed;
} finally {
  await mcpServer.close();
}
```

#### Streaming SSE events for a chat UI
When building a Next.js API route, stream responses as SSE events using a `ReadableStream`. Emit events in this format for compatibility with the `PythonChat` frontend component:

- `{ type: "text", content: "..." }`: streamed text chunks
- `{ type: "tool_start", name: "tool_name", input: "..." }`: tool execution starting
- `{ type: "tool_end", name: "tool_name", output: "..." }`: tool execution result
- `{ type: "error", content: "..." }`: error messages
- `data: [DONE]`: stream finished
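The event shapes above can be wrapped into SSE frames with a small helper. A sketch, assuming the route returns a web-standard `ReadableStream`; the `sseEncode` and `toSSEStream` helpers are illustrative, not part of the SDK:

```typescript
// Illustrative helpers (not SDK APIs) for emitting the event shapes above as SSE.
type ChatEvent =
  | { type: "text"; content: string }
  | { type: "tool_start"; name: string; input: string }
  | { type: "tool_end"; name: string; output: string }
  | { type: "error"; content: string };

// An SSE frame is "data: <payload>" followed by a blank line.
function sseEncode(event: ChatEvent | "[DONE]"): string {
  const payload = event === "[DONE]" ? "[DONE]" : JSON.stringify(event);
  return `data: ${payload}\n\n`;
}

// Wrap a sequence of events in a ReadableStream suitable for a route Response body.
function toSSEStream(events: ChatEvent[]): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    start(controller) {
      for (const event of events) {
        controller.enqueue(encoder.encode(sseEncode(event)));
      }
      controller.enqueue(encoder.encode(sseEncode("[DONE]"))); // terminate the stream
      controller.close();
    },
  });
}
```

In a real route, events would be enqueued as they arrive from the agent loop rather than from a prebuilt array; the framing is the same. Return the stream with a `Content-Type: text/event-stream` header.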
#### Handling streaming events

The SDK emits three event types when streaming:

| Event Type | Purpose | Key Fields |
|---|---|---|
| `raw_model_stream_event` | Raw model token deltas | `data` |
| `run_item_stream_event` | Tool calls, outputs, messages | `item`, `name` |
| `agent_updated_stream_event` | Agent switched (handoff) | `agent` |

For text streaming, match `data.type === "response.output_text.delta"` and read `data.delta`.

For tool events, check `item.rawItem.type`:

- `"function_call"`: the tool was invoked (has `call_id`, `name`, and `arguments`)
- `"function_call_output"`: the tool returned (has `call_id` and `output`, but no `name`; track names via a `Map<call_id, name>`)

Important: `run_item_stream_event` may fire multiple times for the same tool call (created, in-progress, completed). Use a `Set<call_id>` to deduplicate `tool_start` events.

Fallback: after the stream loop completes, check `result.finalOutput`. If no text deltas were streamed (e.g., the model returned a single non-streamed response), send `finalOutput` as a text event.
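The name tracking and deduplication described above can be sketched as plain bookkeeping. A simplified sketch; the `RawToolItem` shape mirrors the fields listed above, and `handleToolItem` is an illustrative helper, not an SDK API:

```typescript
// Illustrative bookkeeping (not an SDK API) for tool-call stream events.
type RawToolItem =
  | { type: "function_call"; call_id: string; name: string; arguments: string }
  | { type: "function_call_output"; call_id: string; output: string };

const toolNames = new Map<string, string>(); // call_id -> tool name
const started = new Set<string>();           // call_ids already emitted as tool_start

function handleToolItem(
  item: RawToolItem
): { kind: "tool_start" | "tool_end"; name: string } | null {
  if (item.type === "function_call") {
    toolNames.set(item.call_id, item.name);
    // The same call may appear as created/in-progress/completed; emit once.
    if (started.has(item.call_id)) return null;
    started.add(item.call_id);
    return { kind: "tool_start", name: item.name };
  }
  // Output events carry no name, so look it up by call_id.
  return { kind: "tool_end", name: toolNames.get(item.call_id) ?? "unknown" };
}
```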
#### Multi-turn input format
Pass conversation history as an array of message objects:
```typescript
const input = messages.map((m: { role: string; content: string }) => ({
  role: m.role as "user" | "assistant",
  content: m.content,
}));

const result = await run(agent, input, { stream: true });
```

### Checklist
When setting up PICA MCP with the OpenAI Agents SDK:
- is installed
@openai/agents - (v4+) is installed
zod - is set in
OPENAI_API_KEY.env.local - is set in
PICA_SECRET.env.local - documents both
.env.exampleandOPENAI_API_KEYPICA_SECRET - uses
MCPServerStdio,command: "npx"args: ["@picahq/mcp"] - Full is spread into the MCP server's
process.envoptionenv - is called before creating the agent
mcpServer.connect() - Agent has — tools are auto-discovered
mcpServers: [mcpServer] - is called with
run()for streaming responses{ stream: true } - is awaited after iterating the stream
result.completed - Fallback to if no text deltas were streamed
result.finalOutput - Tool call names are tracked by (output events lack
call_id)name - Tool start events are deduplicated with a
Set<call_id> - is called in a
mcpServer.close()blockfinally