dust-llm

# Adding Support for a New LLM Model

This skill guides you through adding support for a newly released LLM.
## Quick Reference

### Files to Modify

| File | Purpose |
|---|---|
| `front/types/assistant/models/<provider>.ts` | Model ID + configuration |
| `front/lib/api/assistant/token_pricing.ts` | Pricing per million tokens |
| `front/types/assistant/models/models.ts` | Central registry |
| `front/lib/api/llm/clients/<provider>/types.ts` | Router whitelist |
| `sdks/js/src/types.ts` | SDK types |
| `front/components/providers/types.ts` | UI availability (optional) |
| `front/lib/api/llm/tests/llm.test.ts` | Integration tests |
## Prerequisites

Before adding a model, gather:

- **Model ID**: the exact provider identifier (e.g., `gpt-4-turbo-2024-04-09`)
- **Context size**: total context window in tokens
- **Pricing**: input/output cost per million tokens
- **Capabilities**: vision, structured output, reasoning effort levels
- **Tokenizer**: a compatible tokenizer for token counting
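Provider pricing pages sometimes quote rates per 1K tokens, while this codebase expects USD per million tokens. A quick sanity-check conversion (the $0.01/1K figure below is hypothetical, not a real provider rate):

```typescript
// Hypothetical provider quote: $0.01 per 1K input tokens.
const perThousandUsd = 0.01;

// The pricing table in this repo expects USD per *million* tokens.
const perMillionUsd = perThousandUsd * 1_000;

console.log(perMillionUsd); // 10
```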
## Step-by-Step: Adding an OpenAI Model

### Step 1: Add Model Configuration

Edit `front/types/assistant/models/openai.ts`:

```typescript
export const GPT_4_TURBO_2024_04_09_MODEL_ID =
  "gpt-4-turbo-2024-04-09" as const;

export const GPT_4_TURBO_2024_04_09_MODEL_CONFIG: ModelConfigurationType = {
  providerId: "openai",
  modelId: GPT_4_TURBO_2024_04_09_MODEL_ID,
  displayName: "GPT 4 Turbo",
  contextSize: 128_000,
  recommendedTopK: 32,
  recommendedExhaustiveTopK: 64,
  largeModel: true,
  description: "OpenAI's GPT 4 Turbo model for complex tasks (128k context).",
  shortDescription: "OpenAI's second best model.",
  isLegacy: false,
  isLatest: false,
  generationTokensCount: 2048,
  supportsVision: true,
  minimumReasoningEffort: "none",
  maximumReasoningEffort: "none",
  defaultReasoningEffort: "none",
  supportsResponseFormat: false,
  tokenizer: { type: "tiktoken", base: "cl100k_base" },
};
```

### Step 2: Add Pricing
Edit `front/lib/api/assistant/token_pricing.ts`:

```typescript
const CURRENT_MODEL_PRICING: Record<BaseModelIdType, PricingEntry> = {
  // ... existing
  "gpt-4-turbo-2024-04-09": {
    input: 10.0, // USD per million input tokens
    output: 30.0, // USD per million output tokens
    cache_read_input_tokens: 1.0, // Optional: cached reads
    cache_creation_input_tokens: 12.5, // Optional: cache creation
  },
};
```

### Step 3: Register in Central Registry
Edit `front/types/assistant/models/models.ts`:

```typescript
export const MODEL_IDS = [
  // ... existing
  GPT_4_TURBO_2024_04_09_MODEL_ID,
] as const;

export const SUPPORTED_MODEL_CONFIGS: ModelConfigurationType[] = [
  // ... existing
  GPT_4_TURBO_2024_04_09_MODEL_CONFIG,
];
```

### Step 4: Update Router Whitelist
Edit `front/lib/api/llm/clients/openai/types.ts`:

```typescript
export const OPENAI_WHITELISTED_MODEL_IDS = [
  // ... existing
  GPT_4_TURBO_2024_04_09_MODEL_ID,
] as const;
```

### Step 5: Update SDK Types
Edit `sdks/js/src/types.ts`:

```typescript
const ModelLLMIdSchema = FlexibleEnumSchema<
  // ... existing
  | "gpt-4-turbo-2024-04-09"
>();
```

### Step 6: Add to UI (Optional)
Edit `front/components/providers/types.ts`:

```typescript
export const USED_MODEL_CONFIGS: readonly ModelConfig[] = [
  // ... existing
  GPT_4_TURBO_2024_04_09_MODEL_CONFIG,
] as const;
```

### Step 7: Test (Mandatory)
Edit `front/lib/api/llm/tests/llm.test.ts`:

```typescript
const MODELS = {
  // ... existing
  [GPT_4_TURBO_2024_04_09_MODEL_ID]: {
    runTest: true, // Enable for testing
    providerId: "openai",
  },
};
```

Run the test:

```bash
RUN_LLM_TEST=true npx vitest --config lib/api/llm/tests/vite.config.js lib/api/llm/tests/llm.test.ts --run
```

After the test passes, set `runTest: false` to avoid expensive CI runs.

## Adding Anthropic Models
Same pattern, with the Anthropic-specific files:

- `front/types/assistant/models/anthropic.ts`: add `CLAUDE_X_MODEL_ID` and config
- `front/lib/api/llm/clients/anthropic/types.ts`: add to `ANTHROPIC_WHITELISTED_MODEL_IDS`
- `front/types/assistant/models/models.ts`: register in the central registry
- `front/lib/api/assistant/token_pricing.ts`: add pricing
- `sdks/js/src/types.ts`: update SDK types
- Test and validate
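The first bullet can be sketched as follows, mirroring the OpenAI pattern from Step 1. The model ID, display name, context size, and tokenizer below are placeholders (not a real Anthropic release), and the inline `ModelConfigurationType` is a trimmed stand-in for the real type in `front/types/assistant/models/`:

```typescript
// Trimmed stand-in for the real ModelConfigurationType (illustration only).
interface ModelConfigurationType {
  providerId: string;
  modelId: string;
  displayName: string;
  contextSize: number;
  largeModel: boolean;
  supportsVision: boolean;
  supportsResponseFormat: boolean;
  tokenizer: { type: string; base?: string };
}

// Hypothetical Claude release; all values are placeholders.
export const CLAUDE_X_MODEL_ID = "claude-x-2024-01-01" as const;

export const CLAUDE_X_MODEL_CONFIG: ModelConfigurationType = {
  providerId: "anthropic",
  modelId: CLAUDE_X_MODEL_ID,
  displayName: "Claude X",
  contextSize: 200_000, // placeholder context window
  largeModel: true,
  supportsVision: true,
  supportsResponseFormat: false,
  tokenizer: { type: "anthropic" }, // assumption: provider-specific tokenizer
};
```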
## Model Configuration Properties

| Property | Description |
|---|---|
| `supportsVision` | Can process images |
| `supportsResponseFormat` | Supports structured output (JSON) |
| `minimumReasoningEffort` | Min reasoning level (`"none"`, `"low"`, `"medium"`, `"high"`) |
| `maximumReasoningEffort` | Max reasoning level |
| `defaultReasoningEffort` | Default reasoning level |
| `tokenizer` | Tokenizer config for token counting |
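The three reasoning-effort properties should be mutually consistent (minimum ≤ default ≤ maximum). A minimal sketch of that check; the `assertEffortRange` helper is hypothetical, not part of the codebase:

```typescript
// Ranked reasoning-effort levels, lowest to highest.
const EFFORT_LEVELS = ["none", "low", "medium", "high"] as const;
type ReasoningEffort = (typeof EFFORT_LEVELS)[number];

// Hypothetical helper: verify minimum <= default <= maximum.
function assertEffortRange(
  min: ReasoningEffort,
  def: ReasoningEffort,
  max: ReasoningEffort
): boolean {
  const rank = (e: ReasoningEffort) => EFFORT_LEVELS.indexOf(e);
  return rank(min) <= rank(def) && rank(def) <= rank(max);
}

console.log(assertEffortRange("none", "none", "none")); // true (matches Step 1)
console.log(assertEffortRange("low", "medium", "high")); // true
console.log(assertEffortRange("high", "low", "none")); // false
```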
## Validation Checklist

- Model config added to provider file
- Pricing updated (input, output, cache if applicable)
- Registered in central registry (`MODEL_IDS` + `SUPPORTED_MODEL_CONFIGS`)
- Router whitelist updated
- SDK types updated
- UI config added (if needed)
- Integration test passes
- Test disabled after validation
## Troubleshooting

- **Model not in UI**: check `USED_MODEL_CONFIGS` in `front/components/providers/types.ts`
- **API calls failing**: verify the model ID matches the provider's exact identifier, and check the router whitelist
- **Token counting errors**: validate the context size and tokenizer configuration
- **Pricing issues**: ensure prices are in USD per million tokens
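To catch pricing mistakes early, it can help to compute an example request cost by hand. A minimal sketch, assuming the per-million-token semantics shown in Step 2 (the `PricingEntry` shape here is a trimmed stand-in for the real type):

```typescript
// Trimmed stand-in for the pricing entry shape in token_pricing.ts.
interface PricingEntry {
  input: number; // USD per million input tokens
  output: number; // USD per million output tokens
}

function requestCostUsd(
  pricing: PricingEntry,
  inputTokens: number,
  outputTokens: number
): number {
  return (
    (inputTokens / 1_000_000) * pricing.input +
    (outputTokens / 1_000_000) * pricing.output
  );
}

// gpt-4-turbo-2024-04-09 rates from Step 2: $10 in / $30 out per million tokens.
const cost = requestCostUsd({ input: 10.0, output: 30.0 }, 2_000, 500);
console.log(cost); // ~0.035 USD
```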
## Reference

- See `front/types/assistant/models/openai.ts` and `anthropic.ts` for examples
- Provider docs: OpenAI, Anthropic, Google, Mistral