# Letta Configuration
Complete guide for configuring models on agents and providers on servers.
## When to Use This Skill
Agent-level (model configuration):
- Creating agents with specific model configurations
- Adjusting model settings (temperature, max tokens, context window)
- Configuring provider-specific features (OpenAI reasoning, Anthropic thinking)
- Changing models on existing agents
Server-level (provider configuration):
- Setting up BYOK (bring your own key) providers
- Configuring self-hosted deployments with environment variables
- Validating provider credentials
- Setting up custom OpenAI-compatible endpoints
Not covered here: Model selection advice (which model to choose) - see the `agent-development` skill.

## Part 1: Model Configuration (Agent-Level)
### Model Handles
Models use the format `provider/model-name`, where the prefix matches the provider type (e.g. `openai/gpt-4o`, `anthropic/claude-sonnet-4-5-20250929`).

| Provider | Handle Prefix |
|---|---|
| OpenAI | `openai/` |
| Anthropic | `anthropic/` |
| Google AI | `google_ai/` |
| Azure OpenAI | `azure/` |
| AWS Bedrock | `bedrock/` |
| Groq | `groq/` |
| Together | `together/` |
| OpenRouter | `openrouter/` |
| Ollama (local) | `ollama/` |
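A handle can be split into its provider and model parts with plain string handling; a minimal sketch (the helper is illustrative, not part of the Letta SDK):

```python
def parse_model_handle(handle: str) -> tuple[str, str]:
    """Split a model handle like 'openai/gpt-4o' into (provider, model name)."""
    provider, sep, model_name = handle.partition("/")
    if not sep or not provider or not model_name:
        raise ValueError(f"invalid model handle {handle!r}; expected 'provider/model-name'")
    return provider, model_name

print(parse_model_handle("openai/gpt-4o"))  # ('openai', 'gpt-4o')
```

Note that `partition` splits at the first `/`, so handles whose model names themselves contain slashes still resolve to the correct provider prefix.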
### Basic Model Configuration

```python
from letta_client import Letta

client = Letta(api_key="your-api-key")

agent = client.agents.create(
    model="openai/gpt-4o",
    model_settings={
        "provider_type": "openai",  # Required - must match model provider
        "temperature": 0.7,
        "max_output_tokens": 4096,
    },
    context_window_limit=128000,
)
```

### Common Settings
| Setting | Type | Description |
|---|---|---|
| `provider_type` | string | Required. Must match the model's provider (e.g. `openai` for `openai/gpt-4o`). |
| `temperature` | float | Controls randomness (0.0-2.0). Lower = more deterministic. |
| `max_output_tokens` | int | Maximum tokens in the response. |
### Changing an Agent's Model

```python
client.agents.update(
    agent_id=agent.id,
    model="anthropic/claude-sonnet-4-5-20250929",
    model_settings={"provider_type": "anthropic", "temperature": 0.5},
    context_window_limit=64000,
)
```

Note: Agents retain memory and tools when changing models.

### Provider-Specific Settings

For OpenAI reasoning models and Anthropic extended thinking, see `references/provider-settings.md`.

## Part 2: Provider Configuration (Server-Level)
### Quick Start

```bash
# Add provider via API
python scripts/setup_provider.py --type openai --api-key sk-...

# Generate .env for Docker
python scripts/generate_env.py --providers openai,anthropic,ollama

# Validate credentials
python scripts/validate_provider.py --provider-id provider-xxx
```

### Add BYOK Provider
Via the REST API:

```bash
curl -X POST http://localhost:8283/v1/providers \
  -H "Content-Type: application/json" \
  -d '{ "name": "My OpenAI", "provider_type": "openai", "api_key": "sk-your-key-here" }'
```
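The same request can be issued from Python. A minimal sketch: the payload builder below is grounded in the REST body above, while the commented SDK call is an assumption (it presumes `letta_client` exposes a `client.providers.create` that mirrors the endpoint):

```python
import json


def provider_payload(name: str, provider_type: str, api_key: str) -> str:
    """Build the JSON body for POST /v1/providers, matching the curl example."""
    return json.dumps({"name": name, "provider_type": provider_type, "api_key": api_key})


print(provider_payload("My OpenAI", "openai", "sk-your-key-here"))

# Hypothetical SDK equivalent (assumed to mirror the REST endpoint; verify
# against your installed letta_client version):
# from letta_client import Letta
# client = Letta(base_url="http://localhost:8283")
# provider = client.providers.create(
#     name="My OpenAI", provider_type="openai", api_key="sk-your-key-here"
# )
```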
### Supported Provider Types
`openai`, `anthropic`, `azure`, `google_ai`, `google_vertex`, `ollama`, `groq`, `deepseek`, `xai`, `together`, `mistral`, `cerebras`, `bedrock`, `vllm`, `sglang`, `hugging_face`, `lmstudio_openai`

For detailed configuration of each provider, see:
- `references/common_providers.md` - OpenAI, Anthropic, Azure, Google
- `references/self_hosted_providers.md` - Ollama, vLLM, LM Studio
- `references/all_providers.md` - Complete reference
- `references/environment_variables.md` - Docker/self-hosted setup
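For Docker deployments, `scripts/generate_env.py` emits an env file along these lines. The variable names below are assumptions based on common Letta conventions; treat `references/environment_variables.md` as the authoritative list:

```shell
# .env - sketch only; confirm variable names in references/environment_variables.md
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
OLLAMA_BASE_URL=http://host.docker.internal:11434
```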
## Anti-Hallucination Checklist

Before configuring:
- Model handle uses the correct `provider/model-name` format
- `model_settings` includes the required `provider_type` field
- `context_window_limit` is set at the agent level, not in `model_settings`
- Provider-specific settings use the correct nested structure
- For self-hosted: an embedding model is specified
- Temperature is within the valid range (0.0-2.0)
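Most of the checklist can be automated before an agent is created; a minimal sketch (the function and its rules are illustrative, not part of the Letta SDK):

```python
# Provider types taken from the supported list in this document.
VALID_PROVIDER_TYPES = {
    "openai", "anthropic", "azure", "google_ai", "google_vertex", "ollama",
    "groq", "deepseek", "xai", "together", "mistral", "cerebras", "bedrock",
    "vllm", "sglang", "hugging_face", "lmstudio_openai",
}


def check_model_config(model: str, model_settings: dict) -> list[str]:
    """Return a list of problems found in an agent model configuration."""
    problems = []
    provider, sep, model_name = model.partition("/")
    if not sep or not provider or not model_name:
        problems.append(f"handle {model!r} is not in provider/model-name format")
    if "provider_type" not in model_settings:
        problems.append("model_settings is missing the required provider_type field")
    elif model_settings["provider_type"] not in VALID_PROVIDER_TYPES:
        problems.append(f"unknown provider_type {model_settings['provider_type']!r}")
    if "context_window_limit" in model_settings:
        problems.append("context_window_limit belongs at the agent level, not in model_settings")
    temperature = model_settings.get("temperature")
    if temperature is not None and not 0.0 <= temperature <= 2.0:
        problems.append(f"temperature {temperature} is outside the valid range 0.0-2.0")
    return problems


print(check_model_config("openai/gpt-4o", {"provider_type": "openai", "temperature": 0.7}))  # []
```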
## Scripts

Model configuration:
- `scripts/basic_config.py` - Basic model configuration
- `scripts/basic_config.ts` - TypeScript equivalent
- `scripts/change_model.py` - Changing models on existing agents
- `scripts/provider_specific.py` - OpenAI reasoning, Anthropic thinking

Provider configuration:
- `scripts/setup_provider.py` - Add providers via REST API
- `scripts/validate_provider.py` - Check provider credentials
- `scripts/generate_env.py` - Generate .env for Docker