python-mcp


# Python MCP SDK Best Practices


The Model Context Protocol (MCP) Python SDK (`mcp` on PyPI) provides the canonical Python implementation for building servers and clients that connect LLMs to external data and tools in a standardized way.

## Installation

Use uv (recommended) or pip:

```bash
uv add "mcp[cli]"
```

or

```bash
pip install "mcp[cli]"
```

Requires Python ≥ 3.10. The `[cli]` extra adds the `mcp` CLI for development tooling.

## Three Primitives

MCP servers expose three primitives to LLM clients:

| Primitive | Analogy | Purpose |
|-----------|---------|---------|
| Resources | GET endpoint | Load data into LLM context (read-only) |
| Tools | POST endpoint | Execute actions, produce side effects |
| Prompts | Template | Reusable interaction patterns for LLMs |

Choose the right primitive for each capability:

- Use Resources for data retrieval that has no side effects.
- Use Tools for operations that compute, write, or call external APIs.
- Use Prompts for structured instruction templates clients can invoke by name.

## FastMCP — The High-Level API

`FastMCP` is the primary interface. It wraps the low-level protocol and handles connection management, message routing, and serialization automatically.

## Server Initialization

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP(
    "MyServer",
    stateless_http=True,   # recommended for production HTTP
    json_response=True,    # recommended for scalability
)
```

Always name servers descriptively — the name appears in client UIs and logs.

## Defining Tools

Annotate function parameters and return types. FastMCP generates JSON Schema from type hints automatically.

```python
from pydantic import BaseModel, Field

class WeatherData(BaseModel):
    temperature: float = Field(description="Temperature in Celsius")
    condition: str
    humidity: float

@mcp.tool()
def get_weather(city: str, unit: str = "celsius") -> WeatherData:
    """Get current weather for a city.

    Returns structured weather data validated against WeatherData schema.
    """
    # Implementation calls a real weather API
    return WeatherData(temperature=22.5, condition="sunny", humidity=45.0)
```

Key rules for tools:

- Write clear, descriptive docstrings — LLMs use them to decide when to call the tool.
- Use Pydantic `BaseModel` return types for structured output; the schema is exposed to clients.
- Use `TypedDict` or `dataclass` as lighter alternatives when full Pydantic validation is not needed.
- Prefer `async def` for I/O-bound tools to avoid blocking the event loop.
- Add a `ctx: Context` parameter last when progress reporting or logging is needed.
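Because the tool schema comes straight from the Pydantic model, you can inspect what clients will see without running a server. Reusing the `WeatherData` model:

```python
from pydantic import BaseModel, Field

class WeatherData(BaseModel):
    temperature: float = Field(description="Temperature in Celsius")
    condition: str
    humidity: float

# Pydantic v2 emits the same JSON Schema that FastMCP exposes to clients.
schema = WeatherData.model_json_schema()
print(schema["required"])                                  # ['temperature', 'condition', 'humidity']
print(schema["properties"]["temperature"]["description"])  # Temperature in Celsius
```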

## Defining Resources

```python
@mcp.resource("file://documents/{name}")
def read_document(name: str) -> str:
    """Read a document by name from the document store."""
    # Read from disk, DB, or cache
    return f"Content of {name}"

@mcp.resource("config://settings")
def get_settings() -> str:
    """Return current application settings as JSON."""
    return '{"theme": "dark", "debug": false}'
```

Resources must be idempotent and free of significant side effects. Use URI templates (`{param}`) for dynamic resources.

## Defining Prompts

````python
from mcp.server.fastmcp.prompts import base

@mcp.prompt(title="Code Review")
def review_code(code: str, language: str = "python") -> list[base.Message]:
    """Generate a structured code review prompt."""
    return [
        base.UserMessage(f"Please review this {language} code:"),
        base.UserMessage(f"```{language}\n{code}\n```"),
        base.AssistantMessage("I'll analyze the code for correctness, style, and potential issues."),
    ]
````

## Context Object

Inject `ctx: Context` into any tool or resource function to access MCP capabilities. FastMCP injects it automatically — it does not appear in the tool's JSON Schema.

```python
from mcp.server.fastmcp import Context, FastMCP
from mcp.server.session import ServerSession

@mcp.tool()
async def long_running_task(
    task_name: str,
    steps: int,
    ctx: Context[ServerSession, None],
) -> str:
    """Run a multi-step task with progress reporting."""
    await ctx.info(f"Starting task: {task_name}")

    for i in range(steps):
        await ctx.report_progress(
            progress=(i + 1) / steps,
            total=1.0,
            message=f"Step {i + 1} of {steps}",
        )

    await ctx.info("Task complete")
    return f"Completed {task_name}"
```

Context capabilities:

| Method | Purpose |
|--------|---------|
| `await ctx.info(msg)` | Send info log to client |
| `await ctx.debug(msg)` | Send debug log |
| `await ctx.warning(msg)` | Send warning log |
| `await ctx.error(msg)` | Send error log |
| `await ctx.report_progress(progress, total, message)` | Report numeric progress |
| `await ctx.read_resource(uri)` | Read another resource from within a tool |
| `await ctx.elicit(message, schema)` | Request structured input from the user |
| `ctx.request_id` | Unique ID for current request |
| `ctx.fastmcp` | Access server instance metadata |

## Lifespan — Managing Shared Resources

Use the lifespan pattern for database connections, HTTP clients, or any resource that must be initialized once and shared across requests.

```python
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager
from dataclasses import dataclass

import httpx
from mcp.server.fastmcp import Context, FastMCP

@dataclass
class AppState:
    http_client: httpx.AsyncClient

@asynccontextmanager
async def lifespan(server: FastMCP) -> AsyncIterator[AppState]:
    async with httpx.AsyncClient() as client:
        yield AppState(http_client=client)

mcp = FastMCP("MyServer", lifespan=lifespan)

@mcp.tool()
async def fetch_url(url: str, ctx: Context) -> str:
    """Fetch content from a URL using the shared HTTP client."""
    state: AppState = ctx.request_context.lifespan_context
    response = await state.http_client.get(url)
    return response.text
```

Always type the lifespan context with a `@dataclass` or `TypedDict` — this provides IDE support and avoids attribute lookup errors at runtime.
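The init-once, share, tear-down flow can be seen in a stdlib-only sketch, with a dummy state dict standing in for the httpx client:

```python
import asyncio
from contextlib import asynccontextmanager

events: list[str] = []

@asynccontextmanager
async def lifespan(server):
    events.append("open")           # e.g. connect a DB pool or HTTP client
    try:
        yield {"client": "shared"}  # state handed to every request handler
    finally:
        events.append("close")      # runs once, on server shutdown

async def main():
    async with lifespan(None) as state:
        events.append(f"use:{state['client']}")

asyncio.run(main())
print(events)  # ['open', 'use:shared', 'close']
```

The `finally` block guarantees cleanup runs even if the server exits with an error, which is why the SDK uses this pattern rather than separate startup/shutdown hooks.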

## Error Handling

Raise standard Python exceptions in tools — MCP transmits them as structured error responses.

```python
@mcp.tool()
def divide(a: float, b: float) -> float:
    """Divide a by b."""
    if b == 0:
        raise ValueError("Division by zero is not allowed")
    return a / b
```

For tools that can return partial results, use `CallToolResult` directly:

```python
from mcp.types import CallToolResult, TextContent

@mcp.tool()
def safe_parse(data: str) -> CallToolResult:
    """Parse data, returning errors inline rather than raising."""
    try:
        # parse() and ParseError are application-specific placeholders
        result = parse(data)
        return CallToolResult(
            content=[TextContent(type="text", text=str(result))]
        )
    except ParseError as exc:
        return CallToolResult(
            content=[TextContent(type="text", text=f"Parse failed: {exc}")],
            isError=True,
        )
```

Use `isError=True` to signal tool-level failures that should not halt the LLM's reasoning.

## Quick Reference

Start dev server with MCP Inspector:

```bash
uv run mcp dev server.py
```

Install to Claude Desktop:

```bash
uv run mcp install server.py --name "My Server"
```

Run with extra dependencies:

```bash
uv run mcp dev server.py --with pandas --with numpy
```

Run production HTTP server (uvicorn):

```bash
uvicorn server:mcp.streamable_http_app --host 0.0.0.0 --port 8000
```

| Pattern | Recommendation |
|---------|---------------|
| Transport (production) | Streamable HTTP with `stateless_http=True, json_response=True` |
| Transport (local/stdio) | stdio via `mcp.run()` or `uv run mcp run server.py` |
| I/O tools | Use `async def` |
| Shared state | Use lifespan context |
| Structured output | Return Pydantic `BaseModel` subclass |
| Progress reporting | Use `ctx.report_progress()` |
| Secrets/config | Pass via environment variables, not hardcoded |

## Additional Resources

- `references/server-patterns.md` — Advanced patterns: structured output, elicitation, sampling, notifications, authentication, and mounting multiple servers.
- `references/transports-and-deployment.md` — Transport comparison (stdio vs SSE vs Streamable HTTP), CORS, ASGI mounting, and production deployment checklist.