pica-langchain
# PICA MCP Integration with LangChain
PICA provides a unified API platform that connects AI agents to third-party services (CRMs, email, calendars, databases, etc.) through MCP tool calling.
## PICA MCP Server
PICA exposes its capabilities through an MCP server distributed as `@picahq/mcp`. It uses stdio transport, running as a local subprocess via `npx`.

### MCP Configuration
```json
{
  "mcpServers": {
    "pica": {
      "command": "npx",
      "args": ["@picahq/mcp"],
      "env": {
        "PICA_SECRET": "your-pica-secret-key"
      }
    }
  }
}
```

- Package: `@picahq/mcp` (run via `npx`, no install needed)
- Auth: `PICA_SECRET` environment variable (obtain from the PICA dashboard: https://app.picaos.com/settings/api-keys)
- Transport: stdio (standard input/output)
### Environment Variable
Always store the PICA secret in an environment variable, never hardcode it:

```
PICA_SECRET=sk_test_...
```

Add it to `.env` and load with `python-dotenv`.
## Using PICA with LangChain
LangChain provides MCP client support via the `langchain-mcp-adapters` package. Always refer to the latest docs before implementing. See langchain-mcp-reference.md.

### Required packages
```bash
pip install langchain-mcp-adapters langgraph langchain-anthropic mcp python-dotenv
```

### Before implementing: look up the latest docs
The `langchain-mcp-adapters` API has changed across versions (e.g., `MultiServerMCPClient` is no longer a context manager as of v0.1.0). Always check the latest docs before writing code. See langchain-mcp-reference.md.

### Integration pattern
- Create an MCP client using `MultiServerMCPClient` with stdio transport pointed at `npx @picahq/mcp`
- Get tools from the client via `await client.get_tools()`
- Create a ReAct agent using `create_react_agent(model, tools)` from `langgraph.prebuilt`
- Stream or invoke the agent with your messages
- Pass environment variables (`PICA_SECRET`, `PATH`, `HOME`) to the MCP client's `env` config
### Minimal example
```python
import os

from langchain_anthropic import ChatAnthropic
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

model = ChatAnthropic(model="claude-haiku-4-5-20251001", streaming=True)

client = MultiServerMCPClient({
    "pica": {
        "command": "npx",
        "args": ["@picahq/mcp"],
        "transport": "stdio",
        "env": {
            "PICA_SECRET": os.environ.get("PICA_SECRET", ""),
            "PATH": os.environ.get("PATH", ""),
            "HOME": os.environ.get("HOME", ""),
        },
    },
})

# Run inside an async function (top-level await is not valid in a script)
tools = await client.get_tools()
agent = create_react_agent(model, tools)
```

### Invoke
```python
result = await agent.ainvoke({"messages": [{"role": "user", "content": "..."}]})
```
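Since `get_tools()` and `ainvoke` are coroutines, a standalone script usually wraps setup and invocation in an async function driven by `asyncio.run`. A structural sketch (the agent calls are stubbed with a placeholder so the shape runs on its own; `run_agent` is an illustrative name):

```python
import asyncio

async def run_agent(prompt: str) -> dict:
    # In a real script, the awaited calls from the minimal example go here:
    #   tools = await client.get_tools()
    #   agent = create_react_agent(model, tools)
    #   return await agent.ainvoke({"messages": [{"role": "user", "content": prompt}]})
    await asyncio.sleep(0)  # placeholder for the awaited agent calls
    return {"messages": [{"role": "assistant", "content": f"echo: {prompt}"}]}

result = asyncio.run(run_agent("..."))
```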
### Or stream events
```python
async for event in agent.astream_events({"messages": messages}, version="v2"):
    kind = event["event"]
    if kind == "on_chat_model_stream":
        content = event["data"]["chunk"].content
        # content may be a list of content blocks (Anthropic models) or a string
```

### Important: Anthropic content blocks
When streaming with `astream_events(version="v2")`, Anthropic models return `chunk.content` as a list of content blocks, not plain strings. Always handle both:

```python
if isinstance(content, list):
    text = "".join(
        block.get("text", "") if isinstance(block, dict) else str(block)
        for block in content
    )
elif isinstance(content, str):
    text = content
```

## Checklist
When setting up PICA MCP with LangChain:
- `langchain-mcp-adapters`, `langgraph`, `langchain-anthropic`, `mcp` are installed
- `PICA_SECRET` is set in `.env`
- `.env` is loaded via `python-dotenv` (`load_dotenv()` at top of file)
- MCP client uses stdio transport with `npx @picahq/mcp`
- `PATH` and `HOME` are passed in the MCP client's `env` config
- `MultiServerMCPClient` is NOT used as a context manager (API changed in v0.1.0)
- Streaming handles both list and string content from Anthropic models
- Tool events (`on_tool_start`, `on_tool_end`) are handled for UI rendering
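The last two checklist items can be sketched together in one event handler, decoupled from LangChain so it runs standalone. The `event` dicts below mimic the `astream_events(version="v2")` shapes used earlier (`event`, `name`, `data.chunk.content`); the fake events and the bracketed UI strings are illustrative:

```python
from types import SimpleNamespace

def flatten_content(content) -> str:
    """Handle both Anthropic content-block lists and plain strings."""
    if isinstance(content, str):
        return content
    return "".join(
        block.get("text", "") if isinstance(block, dict) else str(block)
        for block in content
    )

def handle_event(event: dict, out: list) -> None:
    kind = event["event"]
    if kind == "on_chat_model_stream":
        out.append(flatten_content(event["data"]["chunk"].content))
    elif kind in ("on_tool_start", "on_tool_end"):
        out.append(f"[{kind}: {event['name']}]")  # e.g. render a tool badge in the UI

# Fake events standing in for what agent.astream_events would yield
events = [
    {"event": "on_tool_start", "name": "pica_search"},
    {"event": "on_chat_model_stream",
     "data": {"chunk": SimpleNamespace(content=[{"type": "text", "text": "Hello"}])}},
    {"event": "on_tool_end", "name": "pica_search"},
]
out: list = []
for e in events:
    handle_event(e, out)
# out == ["[on_tool_start: pica_search]", "Hello", "[on_tool_end: pica_search]"]
```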
## Additional resources

- For LangChain MCP adapter docs and API details, see langchain-mcp-reference.md