Python MCP SDK Best Practices
The Model Context Protocol (MCP) Python SDK (`mcp` on PyPI) provides the canonical Python implementation for building servers and clients that connect LLMs to external data and tools in a standardized way.
Installation
Use uv (recommended) or pip:
```bash
uv add "mcp[cli]"
```

or

```bash
pip install "mcp[cli]"
```

Requires Python ≥ 3.10. The `[cli]` extra adds the `mcp` CLI for development tooling.

Three Primitives
MCP servers expose three primitives to LLM clients:
| Primitive | Analogy | Purpose |
|---|---|---|
| Resources | GET endpoint | Load data into LLM context (read-only) |
| Tools | POST endpoint | Execute actions, produce side effects |
| Prompts | Template | Reusable interaction patterns for LLMs |
Choose the right primitive for each capability:
- Use Resources for data retrieval that has no side effects.
- Use Tools for operations that compute, write, or call external APIs.
- Use Prompts for structured instruction templates clients can invoke by name.
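To make the split concrete, the registration model can be sketched as a toy server with one registry per primitive. This is a pure-stdlib illustration with hypothetical names (`ToyServer`, `delete_file`), not the SDK's actual implementation:

```python
# Toy model of an MCP server's three primitive registries.
# Illustrative only -- the real SDK (FastMCP, below) handles this for you.
from typing import Callable

class ToyServer:
    def __init__(self, name: str):
        self.name = name
        self.resources: dict[str, Callable] = {}  # read-only data, like GET
        self.tools: dict[str, Callable] = {}      # actions, like POST
        self.prompts: dict[str, Callable] = {}    # named message templates

    def resource(self, uri: str):
        def register(fn):
            self.resources[uri] = fn
            return fn
        return register

    def tool(self):
        def register(fn):
            self.tools[fn.__name__] = fn
            return fn
        return register

server = ToyServer("demo")

@server.resource("config://settings")
def get_settings() -> str:          # no side effects: a Resource
    return '{"debug": false}'

@server.tool()
def delete_file(path: str) -> str:  # side effects: a Tool
    return f"deleted {path}"
```

The real FastMCP decorators shown below do the same bookkeeping, plus schema generation and protocol handling.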
FastMCP — The High-Level API
Server Initialization
```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP(
    "MyServer",
    stateless_http=True,  # recommended for production HTTP
    json_response=True,   # recommended for scalability
)
```

Always name servers descriptively — the name appears in client UIs and logs.
Defining Tools
Annotate function parameters and return types. FastMCP generates JSON Schema from type hints automatically.
```python
from pydantic import BaseModel, Field

class WeatherData(BaseModel):
    temperature: float = Field(description="Temperature in Celsius")
    condition: str
    humidity: float

@mcp.tool()
def get_weather(city: str, unit: str = "celsius") -> WeatherData:
    """Get current weather for a city.

    Returns structured weather data validated against WeatherData schema.
    """
    # Implementation calls a real weather API
    return WeatherData(temperature=22.5, condition="sunny", humidity=45.0)
```

Key rules for tools:
- Write clear, descriptive docstrings — LLMs use them to decide when to call the tool.
- Use Pydantic `BaseModel` return types for structured output; the schema is exposed to clients.
- Use `TypedDict` or `dataclass` as lighter alternatives when full Pydantic validation is not needed.
- Prefer `async def` for I/O-bound tools to avoid blocking the event loop.
- Add a `ctx: Context` parameter last when progress reporting or logging is needed.
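To see what the automatic schema generation buys you, here is a simplified, stdlib-only sketch of deriving a parameter schema from type hints (`schema_from_signature` is a hypothetical helper; the SDK's real generator is Pydantic-based and far richer):

```python
import inspect
from typing import get_type_hints

# Minimal mapping from Python annotations to JSON Schema types.
PY_TO_JSON = {str: "string", float: "number", int: "integer", bool: "boolean"}

def schema_from_signature(fn) -> dict:
    """Sketch of deriving a tool's parameter schema from its type hints."""
    hints = get_type_hints(fn)
    sig = inspect.signature(fn)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        properties[name] = {"type": PY_TO_JSON[hints[name]]}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value means required
    return {"type": "object", "properties": properties, "required": required}

def get_weather(city: str, unit: str = "celsius") -> str:
    return "sunny"

schema = schema_from_signature(get_weather)
# city is required; unit has a default, so it is optional
```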
Defining Resources
```python
@mcp.resource("file://documents/{name}")
def read_document(name: str) -> str:
    """Read a document by name from the document store."""
    # Read from disk, DB, or cache
    return f"Content of {name}"

@mcp.resource("config://settings")
def get_settings() -> str:
    """Return current application settings as JSON."""
    return '{"theme": "dark", "debug": false}'
```

Resources must be idempotent and free of significant side effects. Use URI templates (`{param}`) for dynamic resources.
Defining Prompts
```python
from mcp.server.fastmcp.prompts import base

@mcp.prompt(title="Code Review")
def review_code(code: str, language: str = "python") -> list[base.Message]:
    """Generate a structured code review prompt."""
    return [
        base.UserMessage(f"Please review this {language} code:"),
        base.UserMessage(f"```{language}\n{code}\n```"),
        base.AssistantMessage("I'll analyze the code for correctness, style, and potential issues."),
    ]
```

Context Object
Inject `ctx: Context` into any tool or resource function to access MCP capabilities. FastMCP injects it automatically — it does not appear in the tool's JSON Schema.

```python
from mcp.server.fastmcp import Context, FastMCP
from mcp.server.session import ServerSession

@mcp.tool()
async def long_running_task(
    task_name: str,
    steps: int,
    ctx: Context[ServerSession, None],
) -> str:
    """Run a multi-step task with progress reporting."""
    await ctx.info(f"Starting task: {task_name}")
    for i in range(steps):
        await ctx.report_progress(
            progress=(i + 1) / steps,
            total=1.0,
            message=f"Step {i + 1} of {steps}",
        )
    await ctx.info("Task complete")
    return f"Completed {task_name}"
```

Context capabilities:
| Method | Purpose |
|---|---|
| `ctx.info()` | Send info log to client |
| `ctx.debug()` | Send debug log |
| `ctx.warning()` | Send warning log |
| `ctx.error()` | Send error log |
| `ctx.report_progress()` | Report numeric progress |
| `ctx.read_resource()` | Read another resource from within a tool |
| `ctx.elicit()` | Request structured input from the user |
| `ctx.request_id` | Unique ID for current request |
| `ctx.fastmcp` | Access server instance metadata |
Lifespan — Managing Shared Resources
Use the lifespan pattern for database connections, HTTP clients, or any resource that must be initialized once and shared across requests.
```python
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager
from dataclasses import dataclass

import httpx
from mcp.server.fastmcp import Context, FastMCP

@dataclass
class AppState:
    http_client: httpx.AsyncClient

@asynccontextmanager
async def lifespan(server: FastMCP) -> AsyncIterator[AppState]:
    async with httpx.AsyncClient() as client:
        yield AppState(http_client=client)

mcp = FastMCP("MyServer", lifespan=lifespan)

@mcp.tool()
async def fetch_url(url: str, ctx: Context) -> str:
    """Fetch content from a URL using the shared HTTP client."""
    state: AppState = ctx.request_context.lifespan_context
    response = await state.http_client.get(url)
    return response.text
```

Always type the lifespan context with a `@dataclass` or `TypedDict` — this provides IDE support and avoids attribute lookup errors at runtime.
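The lifespan is an ordinary async context manager: everything before the `yield` runs once at startup, everything after runs once at shutdown. A stdlib-only sketch of that flow, with a hypothetical `FakeClient` standing in for `httpx.AsyncClient`:

```python
import asyncio
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager
from dataclasses import dataclass

events: list[str] = []

@dataclass
class FakeClient:          # stands in for httpx.AsyncClient
    closed: bool = False

@asynccontextmanager
async def lifespan(server: str) -> AsyncIterator[FakeClient]:
    client = FakeClient()
    events.append("setup")         # runs once, at server startup
    try:
        yield client               # the server handles requests here
    finally:
        client.closed = True
        events.append("teardown")  # runs once, at server shutdown

async def main() -> None:
    async with lifespan("MyServer") as client:
        events.append("handling requests")
        assert not client.closed   # shared state is live during requests

asyncio.run(main())
```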
Error Handling
Raise standard Python exceptions in tools — MCP transmits them as structured error responses.
```python
@mcp.tool()
def divide(a: float, b: float) -> float:
    """Divide a by b."""
    if b == 0:
        raise ValueError("Division by zero is not allowed")
    return a / b
```

For tools that can return partial results, use `CallToolResult` directly:
```python
from mcp.types import CallToolResult, TextContent

@mcp.tool()
def safe_parse(data: str) -> CallToolResult:
    """Parse data, returning errors inline rather than raising."""
    try:
        result = parse(data)
        return CallToolResult(
            content=[TextContent(type="text", text=str(result))]
        )
    except ParseError as exc:
        return CallToolResult(
            content=[TextContent(type="text", text=f"Parse failed: {exc}")],
            isError=True,
        )
```

Use `isError=True` to signal tool-level failures that should not halt the LLM's reasoning.
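Conceptually, the framework wraps each tool call so a raised exception becomes an error-flagged result rather than a crash. A stdlib sketch of that wrapping (`ToyResult` and `call_tool_safely` are hypothetical stand-ins, not SDK APIs):

```python
from dataclasses import dataclass

@dataclass
class ToyResult:            # stand-in for mcp.types.CallToolResult
    text: str
    is_error: bool = False

def call_tool_safely(fn, *args, **kwargs) -> ToyResult:
    """Sketch of how a raised exception becomes a structured error response."""
    try:
        return ToyResult(text=str(fn(*args, **kwargs)))
    except Exception as exc:
        # The LLM sees the error text and can retry or adjust its arguments.
        return ToyResult(text=f"{type(exc).__name__}: {exc}", is_error=True)

def divide(a: float, b: float) -> float:
    if b == 0:
        raise ValueError("Division by zero is not allowed")
    return a / b
```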
Quick Reference
Start dev server with MCP Inspector:

```bash
uv run mcp dev server.py
```

Install to Claude Desktop:

```bash
uv run mcp install server.py --name "My Server"
```

Run with extra dependencies:

```bash
uv run mcp dev server.py --with pandas --with numpy
```

Run production HTTP server (uvicorn):

```bash
uvicorn server:mcp.streamable_http_app --host 0.0.0.0 --port 8000
```
| Pattern | Recommendation |
|---------|---------------|
| Transport (production) | Streamable HTTP with `stateless_http=True, json_response=True` |
| Transport (local/stdio) | stdio via `mcp.run()` or `uv run mcp run server.py` |
| I/O tools | Use `async def` |
| Shared state | Use lifespan context |
| Structured output | Return Pydantic `BaseModel` subclass |
| Progress reporting | Use `ctx.report_progress()` |
| Secrets/config | Pass via environment variables, not hardcoded |
Additional Resources
- `references/server-patterns.md` — Advanced patterns: structured output, elicitation, sampling, notifications, authentication, and mounting multiple servers.
- `references/transports-and-deployment.md` — Transport comparison (stdio vs SSE vs Streamable HTTP), CORS, ASGI mounting, and production deployment checklist.