Letta Configuration


Complete guide for configuring models on agents and providers on servers.

When to Use This Skill


Agent-level (model configuration):
  • Creating agents with specific model configurations
  • Adjusting model settings (temperature, max tokens, context window)
  • Configuring provider-specific features (OpenAI reasoning, Anthropic thinking)
  • Changing models on existing agents
Server-level (provider configuration):
  • Setting up BYOK (bring your own key) providers
  • Configuring self-hosted deployments with environment variables
  • Validating provider credentials
  • Setting up custom OpenAI-compatible endpoints
Not covered here: Model selection advice (which model to choose); see the `agent-development` skill.


Part 1: Model Configuration (Agent-Level)


Model Handles


Models use a `provider/model-name` format:

| Provider | Handle Prefix | Example |
| --- | --- | --- |
| OpenAI | `openai/` | `openai/gpt-4o`, `openai/gpt-4o-mini` |
| Anthropic | `anthropic/` | `anthropic/claude-sonnet-4-5-20250929` |
| Google AI | `google_ai/` | `google_ai/gemini-2.0-flash` |
| Azure OpenAI | `azure/` | `azure/gpt-4o` |
| AWS Bedrock | `bedrock/` | `bedrock/anthropic.claude-3-5-sonnet` |
| Groq | `groq/` | `groq/llama-3.3-70b-versatile` |
| Together | `together/` | `together/meta-llama/Llama-3-70b` |
| OpenRouter | `openrouter/` | `openrouter/anthropic/claude-3.5-sonnet` |
| Ollama (local) | `ollama/` | `ollama/llama3.2` |

Basic Model Configuration


```python
from letta_client import Letta

client = Letta(api_key="your-api-key")

agent = client.agents.create(
    model="openai/gpt-4o",
    model_settings={
        "provider_type": "openai",  # Required - must match model provider
        "temperature": 0.7,
        "max_output_tokens": 4096,
    },
    context_window_limit=128000
)
```

Common Settings


| Setting | Type | Description |
| --- | --- | --- |
| `provider_type` | string | Required. Must match the model provider (`openai`, `anthropic`, `google_ai`, etc.) |
| `temperature` | float | Controls randomness (0.0-2.0). Lower = more deterministic. |
| `max_output_tokens` | int | Maximum tokens in the response. |

Changing an Agent's Model


```python
client.agents.update(
    agent_id=agent.id,
    model="anthropic/claude-sonnet-4-5-20250929",
    model_settings={"provider_type": "anthropic", "temperature": 0.5},
    context_window_limit=64000
)
```
Note: Agents retain memory and tools when changing models.
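
When switching providers, the `provider_type` inside `model_settings` must be updated to match the new handle's prefix. A hypothetical helper (illustrative only, not part of the SDK) that keeps the two in sync:

```python
def model_update_payload(handle: str, **settings) -> dict:
    """Build update kwargs whose provider_type always matches the handle prefix."""
    provider, _, model_name = handle.partition("/")
    if not model_name:
        raise ValueError(f"invalid model handle: {handle!r}")
    return {
        "model": handle,
        "model_settings": {"provider_type": provider, **settings},
    }

payload = model_update_payload("anthropic/claude-sonnet-4-5-20250929", temperature=0.5)
# client.agents.update(agent_id=agent.id, **payload, context_window_limit=64000)
```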

Provider-Specific Settings


For OpenAI reasoning models and Anthropic extended thinking, see `references/provider-settings.md`.


Part 2: Provider Configuration (Server-Level)


Quick Start


```bash
# Add provider via API
python scripts/setup_provider.py --type openai --api-key sk-...

# Generate .env for Docker
python scripts/generate_env.py --providers openai,anthropic,ollama

# Validate credentials
python scripts/validate_provider.py --provider-id provider-xxx
```

Add BYOK Provider


Via REST API:

```bash
curl -X POST http://localhost:8283/v1/providers \
  -H "Content-Type: application/json" \
  -d '{"name": "My OpenAI", "provider_type": "openai", "api_key": "sk-your-key-here"}'
```
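
The same request can be issued from Python without extra dependencies; a minimal sketch using the standard library, mirroring the curl call above (the endpoint and payload come from it, the rest is illustrative):

```python
import json
import urllib.request

payload = {
    "name": "My OpenAI",
    "provider_type": "openai",
    "api_key": "sk-your-key-here",
}
req = urllib.request.Request(
    "http://localhost:8283/v1/providers",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment against a running Letta server:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```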

Supported Provider Types

支持的提供商类型

`openai`, `anthropic`, `azure`, `google_ai`, `google_vertex`, `ollama`, `groq`, `deepseek`, `xai`, `together`, `mistral`, `cerebras`, `bedrock`, `vllm`, `sglang`, `hugging_face`, `lmstudio_openai`

For detailed configuration of each provider, see:
  • `references/common_providers.md` - OpenAI, Anthropic, Azure, Google
  • `references/self_hosted_providers.md` - Ollama, vLLM, LM Studio
  • `references/all_providers.md` - Complete reference
  • `references/environment_variables.md` - Docker/self-hosted setup


Anti-Hallucination Checklist


Before configuring:
  • Model handle uses correct `provider/model-name` format
  • `model_settings` includes the required `provider_type` field
  • `context_window_limit` is set at agent level, not in `model_settings`
  • Provider-specific settings use the correct nested structure
  • For self-hosted: embedding model is specified
  • Temperature is within valid range (0.0-2.0)
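
Several of these checks are mechanical and can be automated before calling the API; a hypothetical lint helper (names are illustrative, not part of the SDK):

```python
def check_model_config(model: str, model_settings: dict) -> list:
    """Return a list of checklist violations for a proposed configuration."""
    problems = []
    provider, _, model_name = model.partition("/")
    if not model_name:
        problems.append("model handle must use provider/model-name format")
    elif model_settings.get("provider_type") != provider:
        problems.append("model_settings.provider_type must match the handle prefix")
    if "context_window_limit" in model_settings:
        problems.append("context_window_limit is agent-level, not a model_settings key")
    temperature = model_settings.get("temperature")
    if temperature is not None and not 0.0 <= temperature <= 2.0:
        problems.append("temperature must be within 0.0-2.0")
    return problems

check_model_config("openai/gpt-4o", {"provider_type": "openai", "temperature": 0.7})  # → []
```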

Scripts


Model configuration:
  • `scripts/basic_config.py` - Basic model configuration
  • `scripts/basic_config.ts` - TypeScript equivalent
  • `scripts/change_model.py` - Changing models on existing agents
  • `scripts/provider_specific.py` - OpenAI reasoning, Anthropic thinking
Provider configuration:
  • `scripts/setup_provider.py` - Add providers via REST API
  • `scripts/validate_provider.py` - Check provider credentials
  • `scripts/generate_env.py` - Generate .env for Docker