letta-api-client
Letta API Client Skill
Build applications on top of the Letta API — a model-agnostic, stateful API for building persistent agents with memory and long-term learning. The Letta API powers Letta Code and the Learning SDK. This skill covers the core patterns for creating agents, managing memory, building custom tools, and handling multi-user scenarios.
When to Use This Skill
- Building applications that need persistent, stateful AI agents
- Creating chatbots, assistants, or autonomous agents with memory
- Integrating Letta into existing web/mobile applications
- Building multi-user applications where each user has their own agent
- Understanding the API layer that Letta Code and Learning SDK are built on
Quick Start
See getting-started.md for first-time setup and common onboarding issues.
SDK Versions Tested
Examples last tested with:
- Python SDK: `letta-client==1.7.1`
- TypeScript SDK: `@letta-ai/letta-client@1.7.1`
Core Concepts
1. Client Setup
See client-setup.md for initialization patterns:
- Letta Cloud vs self-hosted connections
- Environment variable management
- Singleton patterns for web frameworks
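The singleton pattern for web frameworks can be sketched as a cached factory. This is a sketch, not the official setup code; the `LETTA_BASE_URL` fallback variable name is an assumption, while `LETTA_API_KEY` and the constructor arguments follow the Quick Reference below:

```python
import os
from functools import lru_cache

@lru_cache(maxsize=1)
def get_client():
    """Create the Letta client once and reuse it across requests."""
    from letta_client import Letta  # deferred so this module imports without the SDK
    api_key = os.getenv("LETTA_API_KEY")
    if api_key:
        return Letta(api_key=api_key)  # Letta Cloud
    # Self-hosted fallback; LETTA_BASE_URL is a hypothetical variable name
    return Letta(base_url=os.getenv("LETTA_BASE_URL", "http://localhost:8283"))
```

Because `lru_cache(maxsize=1)` memoizes the factory, every request handler that calls `get_client()` shares one client instance.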
2. Memory Architecture
See memory-architecture.md for memory patterns:
- Core Memory Blocks: Always in-context (persona, human, custom blocks)
- Archival Memory: Large corpus with semantic search
- Conversation History: Searchable message history
- Shared Blocks: Multi-agent coordination
3. Custom Tools
See custom-tools.md for tool creation:
- Simple function tools with auto-generated schemas
- Tools with environment variable secrets
- BaseTool class for complex schemas
- Sandboxed execution requirements
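A minimal function tool illustrating the sandbox rules from custom-tools.md: imports go inside the function body, and the docstring needs an Args section for schema generation. The weather payload is a placeholder for illustration, not a real API call:

```python
def get_weather(city: str) -> str:
    """Look up a short weather summary for a city.

    Args:
        city: Name of the city to look up.
    """
    # Imports must live inside the function body: the sandbox that runs the
    # tool does not execute your module's top-level imports.
    import json

    # Placeholder result; a real tool would call a weather API here.
    return json.dumps({"city": city, "forecast": "unknown"})
```

A function like this is registered via `client.tools.create(source_code=...)` as shown in the Quick Reference.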
4. Client-Side Tools
See client-side-tools.md for local tool execution:
- Execute tools on your machine while the agent runs on the Letta API
- How Letta Code runs Bash/Read/Write locally
- Approval-based flow with `type: "tool"` responses
- Access local files, databases, and private APIs
5. Client Injection & Secrets
See client-injection.md for server-side tool patterns:
- Pre-injected `client` variable on Letta Cloud
- Building custom memory tools that modify agent state
- Agent secrets via `os.getenv()`
- `LETTA_AGENT_ID` for self-referential tools
6. Multi-User Patterns
See multi-user.md for scaling:
- One agent per user (personalization)
- Shared agent with Conversations API
- Identity system for user context
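The one-agent-per-user pattern reduces to a keyed lookup plus create-on-miss. A sketch with an in-memory map (use a database in production); the model and embedding strings are copied from the Quick Reference, and the memory-block values are illustrative:

```python
_user_agents = {}  # user_id -> agent_id; use a real database in production

def agent_for_user(client, user_id: str) -> str:
    """Return the user's dedicated agent id, creating the agent on first use."""
    if user_id not in _user_agents:
        agent = client.agents.create(
            model="anthropic/claude-sonnet-4-5-20250929",
            embedding="openai/text-embedding-3-small",
            memory_blocks=[
                {"label": "persona", "value": "You are this user's personal assistant."},
                {"label": "human", "value": f"User id: {user_id}"},
            ],
        )
        _user_agents[user_id] = agent.id
    return _user_agents[user_id]
```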
7. Streaming
See streaming.md for real-time responses:
- Basic SSE streaming
- Long-running operations with `include_pings`
- Background execution and resumable streams
8. Conversations
Conversations enable parallel sessions with shared memory:
- Thread-safe concurrent messaging (agents.messages.create is NOT thread-safe)
- Shared memory blocks across all conversations
- Separate context windows per conversation
- Use for: same user with multiple parallel tasks, multi-threaded applications
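Since `agents.messages.create` is not thread-safe, each worker thread should open its own conversation. A sketch of the fan-out; the call shapes follow the Conversations snippet in the Quick Reference, so treat the exact signatures as assumptions:

```python
import threading

def run_parallel_prompts(client, agent_id, prompts):
    """Send each prompt in its own conversation so threads never share a context."""
    results = [None] * len(prompts)

    def worker(i, prompt):
        # One conversation per thread: separate context window, shared memory blocks.
        conv = client.conversations.create(agent_id=agent_id)
        results[i] = client.conversations.messages.create(
            conv.id, messages=[{"role": "user", "content": prompt}]
        )

    threads = [threading.Thread(target=worker, args=(i, p)) for i, p in enumerate(prompts)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```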
9. Sleeptime Agents
See sleeptime.md for background memory processing:
- Enable with `enable_sleeptime=True`
- Background agent refines memory between conversations
- Good for agents that learn over time
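Enabling sleeptime is a single flag at creation time. A sketch; the model and embedding values are copied from the Quick Reference, and the persona value is illustrative:

```python
def create_learning_agent(client, persona: str):
    """Create an agent whose memory is refined in the background between chats."""
    return client.agents.create(
        model="anthropic/claude-sonnet-4-5-20250929",
        embedding="openai/text-embedding-3-small",
        memory_blocks=[{"label": "persona", "value": persona}],
        enable_sleeptime=True,  # spawns a background agent that refines memory
    )
```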
10. Agent Files & Folders
See agent-files.md for portability and file access:
- Export/import agents with `.af` files
- Attach folders to give agents document access
- Migration checklist for moving agents
11. Tool Rules
See tool-rules.md for constraining tool execution:
- `InitToolRule` - Force a tool to run first
- `ChildToolRule` - Control which tools can follow
- `TerminalToolRule` - End agent turn after tool
- Sequential pipelines and approval workflows
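The three rule types map to plain payloads, presumably passed when creating the agent. The shapes below mirror the TypeScript SDK Notes section; the tool names are hypothetical:

```python
# Hypothetical pipeline: load context first, then only allow search, then finish.
tool_rules = [
    {"type": "run_first", "tool_name": "load_context"},                 # InitToolRule
    {"type": "constrain_child_tools", "tool_name": "load_context",
     "children": ["search_docs"]},                                      # ChildToolRule
    {"type": "exit_loop", "tool_name": "send_report"},                  # TerminalToolRule
]
```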
Quick Reference
Python SDK
```bash
pip install letta-client
```

```python
from letta_client import Letta

# Cloud
client = Letta(api_key="LETTA_API_KEY")

# Self-hosted
client = Letta(base_url="http://localhost:8283")
```
TypeScript SDK

```bash
npm install @letta-ai/letta-client
```

```typescript
import { Letta } from "@letta-ai/letta-client";

// Cloud
const client = new Letta({ apiKey: process.env.LETTA_API_KEY });

// Self-hosted (note: baseURL, per the TypeScript SDK Notes below)
const client = new Letta({ baseURL: "http://localhost:8283" });
```

Examples
See the `examples/` directory for runnable code:

Python:
- `01_basic_client.py` - Client initialization
- `02_create_agent.py` - Agent creation with memory blocks
- `03_custom_tool_simple.py` - Basic custom tool
- `04_custom_tool_secrets.py` - Tool with environment variables
- `05_send_message.py` - Basic messaging
- `06_send_message_stream.py` - Streaming responses
- `07_multi_user.py` - Multi-user patterns
- `08_archival_memory.py` - Archival memory operations
- `09_shared_blocks.py` - Multi-agent shared memory
- `10_conversations.py` - Parallel sessions with conversations
- `11_client_injection.py` - Custom memory tools with injected client
- `12_tool_rules.py` - Constraining tool execution order
- `13_client_side_tools.py` - Execute tools locally (like Letta Code)

TypeScript:
- `01_basic_client.ts` - Client initialization
- `02_create_agent.ts` - Agent creation
- `03_send_message.ts` - Basic messaging
- `04_send_message_stream.ts` - Streaming
- `05_nextjs_singleton.ts` - Next.js pattern
- `06_multi_user.ts` - Multi-user patterns
- `07_conversations.ts` - Parallel sessions
- `08_custom_tool.ts` - Custom tools with secrets
- `09_archival_memory.ts` - Long-term storage
- `10_shared_blocks.ts` - Multi-agent shared memory
- `11_client_injection.ts` - Custom memory tools
- `12_tool_rules.ts` - Tool execution order
- `13_client_side_tools.ts` - Execute tools locally (like Letta Code)
Troubleshooting
| Error | Cause | Fix |
|---|---|---|
| 401 Unauthorized | Invalid or missing API key | Check `LETTA_API_KEY` |
| 422 Validation Error | Missing required field | Add the missing field |
| Tool not found | Tool not attached to agent | Run `client.agents.tools.attach()` |
| Secret not configured | Agent secret not set | Add it via `client.agents.update(secrets=...)` |
| 524 Timeout | Long operation without pings | Add `include_pings=True` when streaming |
| Agent not responding | Model issue or empty response | Check the response for errors |
| Memory block not updating | Looking at wrong agent | Verify the `agent_id` |
| Import error in tool | Top-level import | Move imports inside the function body |
Key Gotchas
- Imports in tools must be inside the function - Tools run in a sandbox without access to top-level imports
- Use `os.getenv()` for secrets - Don't pass sensitive data as function arguments
- On Cloud, use the injected `client` - Don't instantiate `Letta()` inside tools; use the pre-injected client
- Memory blocks are character-limited - Use archival memory for large data
- Streaming requires `include_pings=True` for long operations - Prevents timeout on Cloud
- SDK 1.0 uses `.update()` not `.modify()` - Method was renamed
- `LETTA_AGENT_ID` is always available - Use it in tools to reference the current agent
- Archival tools need `include_base_tools=True` - Not attached by default
- Use `memory_insert` for shared blocks - Safest for concurrent writes (append-only)
- Tool docstrings require an Args section - Parameters need descriptions or schema generation fails
TypeScript SDK Notes
```typescript
// Client initialization uses baseURL (not baseUrl)
const client = new Letta({ apiKey: "...", baseURL: "http://localhost:8283" });

// Block API: positional args changed
client.agents.blocks.attach(blockId, { agent_id }); // blockId is first
client.agents.blocks.retrieve(blockLabel, { agent_id }); // label is first

// Passages.create returns an array
const passages = await client.agents.passages.create(agentId, { text: "..." });
const passage = passages[0];

// Content can be string | array - use a type guard
const content = typeof msg.content === "string" ? msg.content : JSON.stringify(msg.content);

// Conversations API returns streams by default
const stream = await client.conversations.messages.create(convId, { messages: [...] });
for await (const chunk of stream) { ... }

// Tool rule types
{ type: "run_first", tool_name: "..." } // InitToolRule
{ type: "constrain_child_tools", tool_name: "...", children: [...] } // ChildToolRule
{ type: "exit_loop", tool_name: "..." } // TerminalToolRule
```

Quick Reference
Client

```python
client = Letta(api_key=os.getenv("LETTA_API_KEY"))
```
Create agent
```python
agent = client.agents.create(
    model="anthropic/claude-sonnet-4-5-20250929",
    embedding="openai/text-embedding-3-small",
    memory_blocks=[{"label": "persona", "value": "..."}],
    include_base_tools=True,  # archival memory tools
    enable_sleeptime=True,    # background memory processing
)
```
Send message
```python
response = client.agents.messages.create(
    agent_id=agent.id,
    messages=[{"role": "user", "content": "Hello"}]
)
```
Stream response
```python
stream = client.agents.messages.stream(
    agent_id=agent.id,
    messages=[{"role": "user", "content": "Hello"}],
    stream_tokens=True,
    include_pings=True,  # prevent timeout
)
```
Create tool
```python
tool = client.tools.create(source_code="def my_tool(x: str) -> str: ...")
client.agents.tools.attach(agent_id=agent.id, tool_id=tool.id)
```
Memory blocks
```python
client.agents.blocks.retrieve(agent_id=agent.id, block_label="persona")
client.agents.blocks.update(agent_id=agent.id, block_label="persona", value="...")
```
Folders
```python
folder = client.folders.create(name="docs")
client.folders.files.upload(file=f, folder_id=folder.id)
client.agents.folders.attach(agent_id=agent.id, folder_id=folder.id)
```
Conversations (parallel sessions)
```python
conv = client.conversations.create(agent_id=agent.id)
stream = client.conversations.messages.create(conv.id, messages=[...])
```
Agent secrets (for tools)
```python
client.agents.update(agent_id=agent.id, secrets={"API_KEY": "..."})
```

Resources
Platform:
- Letta Cloud (ADE) - Agent Development Environment
- API Keys - Get your API key
Documentation:
- Letta Docs - Full documentation
- Agents Guide - Agent concepts
- Memory Blocks - Memory architecture
- Custom Tools - Tool creation
- Streaming - Real-time responses
- Multi-User - Scaling patterns
SDKs:
- Python SDK - `pip install letta-client`
- TypeScript SDK - `npm install @letta-ai/letta-client`
Examples:
- Chatbot Example - Full app example