pica-langchain

PICA MCP Integration with LangChain

PICA provides a unified API platform that connects AI agents to third-party services (CRMs, email, calendars, databases, etc.) through MCP tool calling.

PICA MCP Server

PICA exposes its capabilities through an MCP server distributed as `@picahq/mcp`. It uses stdio transport: it runs as a local subprocess via `npx`.

MCP Configuration

```json
{
  "mcpServers": {
    "pica": {
      "command": "npx",
      "args": ["@picahq/mcp"],
      "env": {
        "PICA_SECRET": "your-pica-secret-key"
      }
    }
  }
}
```
  • Package: `@picahq/mcp` (run via `npx`, no install needed)
  • Auth: `PICA_SECRET` environment variable (obtain from the PICA dashboard: https://app.picaos.com/settings/api-keys)
  • Transport: stdio (standard input/output)
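
As a sanity check, the configuration above can be parsed and inspected programmatically. This is an illustrative sketch only, not part of the PICA tooling:

```python
import json

# The mcpServers config from above, embedded as a string for illustration.
config = json.loads("""
{
  "mcpServers": {
    "pica": {
      "command": "npx",
      "args": ["@picahq/mcp"],
      "env": {"PICA_SECRET": "your-pica-secret-key"}
    }
  }
}
""")

server = config["mcpServers"]["pica"]
# Reconstruct the launch command an MCP host would run for this server.
print(server["command"], *server["args"])
```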

Environment Variable

Always store the PICA secret in an environment variable; never hardcode it:

```
PICA_SECRET=sk_test_...
```

Add it to `.env` and load it with `python-dotenv`.
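
A minimal fail-fast sketch for reading the secret; `get_pica_secret` is a hypothetical helper name, not part of any PICA package:

```python
import os

def get_pica_secret() -> str:
    # Read the secret from the environment, failing fast if it is absent.
    secret = os.environ.get("PICA_SECRET", "")
    if not secret:
        raise RuntimeError("PICA_SECRET is not set; add it to .env and call load_dotenv() first")
    return secret

# Demonstration only: set a placeholder value so the lookup succeeds.
os.environ["PICA_SECRET"] = "sk_test_placeholder"
print(get_pica_secret())
```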

Using PICA with LangChain

LangChain provides MCP client support via the `langchain-mcp-adapters` package. Always refer to the latest docs before implementing. See langchain-mcp-reference.md.

Required packages

```bash
pip install langchain-mcp-adapters langgraph langchain-anthropic mcp python-dotenv
```

Before implementing: look up the latest docs

The `langchain-mcp-adapters` API has changed across versions (e.g., `MultiServerMCPClient` is no longer a context manager as of v0.1.0). Always check the latest docs before writing code. See langchain-mcp-reference.md.

Integration pattern

  1. Create an MCP client using `MultiServerMCPClient` with stdio transport pointed at `npx @picahq/mcp`
  2. Get tools from the client via `await client.get_tools()`
  3. Create a ReAct agent using `create_react_agent(model, tools)` from `langgraph.prebuilt`
  4. Stream or invoke the agent with your messages
  5. Pass environment variables (`PICA_SECRET`, `PATH`, `HOME`) to the MCP client's `env` config

Minimal example

```python
import os

from langchain_anthropic import ChatAnthropic
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

model = ChatAnthropic(model="claude-haiku-4-5-20251001", streaming=True)

client = MultiServerMCPClient({
    "pica": {
        "command": "npx",
        "args": ["@picahq/mcp"],
        "transport": "stdio",
        "env": {
            "PICA_SECRET": os.environ.get("PICA_SECRET", ""),
            "PATH": os.environ.get("PATH", ""),
            "HOME": os.environ.get("HOME", ""),
        },
    },
})
tools = await client.get_tools()
agent = create_react_agent(model, tools)
```

Invoke

```python
result = await agent.ainvoke({"messages": [{"role": "user", "content": "..."}]})
```
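
The snippets above use top-level `await`; in a plain Python script, wrap the setup and invocation in an async entry point and run it with `asyncio.run`. A minimal sketch, with the agent calls elided as comments:

```python
import asyncio

async def main() -> str:
    # In real code: create the MultiServerMCPClient, fetch tools,
    # build the agent, then:
    #   result = await agent.ainvoke({"messages": [...]})
    return "done"

if __name__ == "__main__":
    print(asyncio.run(main()))
```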

Or stream events

```python
async for event in agent.astream_events({"messages": messages}, version="v2"):
    kind = event["event"]
    if kind == "on_chat_model_stream":
        content = event["data"]["chunk"].content
        # content may be a list of content blocks (Anthropic models) or a string
```

Important: Anthropic content blocks

When streaming with `astream_events(version="v2")`, Anthropic models return `chunk.content` as a list of content blocks, not plain strings. Always handle both:

```python
if isinstance(content, list):
    text = "".join(
        block.get("text", "") if isinstance(block, dict) else str(block)
        for block in content
    )
elif isinstance(content, str):
    text = content
```
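
The branching above can be wrapped in a small helper; the name `flatten_content` is illustrative, and the sample content blocks mimic the Anthropic shape:

```python
def flatten_content(content) -> str:
    # Normalize Anthropic streaming content: a list of content blocks or a plain string.
    if isinstance(content, list):
        return "".join(
            block.get("text", "") if isinstance(block, dict) else str(block)
            for block in content
        )
    if isinstance(content, str):
        return content
    return ""

# Both shapes flatten to the same text.
print(flatten_content([{"type": "text", "text": "Hello, "}, {"type": "text", "text": "world"}]))
print(flatten_content("Hello, world"))
```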

Checklist

When setting up PICA MCP with LangChain:
  • `langchain-mcp-adapters`, `langgraph`, `langchain-anthropic`, and `mcp` are installed
  • `PICA_SECRET` is set in `.env`
  • `.env` is loaded via `python-dotenv` (`load_dotenv()` at the top of the file)
  • The MCP client uses stdio transport with `npx @picahq/mcp`
  • `PATH` and `HOME` are passed in the MCP client's `env` config
  • `MultiServerMCPClient` is NOT used as a context manager (API changed in v0.1.0)
  • Streaming handles both list and string content from Anthropic models
  • Tool events (`on_tool_start`, `on_tool_end`) are handled for UI rendering

Additional resources

  • For LangChain MCP adapter docs and API details, see langchain-mcp-reference.md