
Provider Implementation


Checklist


  1. Implement the full interface (ChatModel, Embedder, VectorStore, STT, TTS, etc.).
  2. Register in `init()` via the parent package's `Register()`.
  3. Map provider errors to `core.Error` with the correct ErrorCode.
  4. Support context cancellation.
  5. Include token/usage metrics where applicable.
  6. Compile-time check: `var _ Interface = (*Impl)(nil)`.
  7. Unit tests with mocked HTTP responses (httptest).

File Structure


```
llm/providers/openai/
├── openai.go          # Implementation + New() + init()
├── stream.go          # Streaming
├── errors.go          # Error mapping
├── openai_test.go     # Tests
└── testdata/          # Recorded HTTP responses
```

Template


```go
var _ llm.ChatModel = (*Model)(nil)

func init() {
	llm.Register("openai", func(cfg llm.ProviderConfig) (llm.ChatModel, error) { return New(cfg) })
}

func New(cfg llm.ProviderConfig) (*Model, error) {
	if cfg.APIKey == "" {
		return nil, &core.Error{Op: "openai.new", Code: core.ErrAuth, Message: "API key required"}
	}
	return &Model{client: newClient(cfg.APIKey, cfg.BaseURL), model: cfg.Model}, nil
}

func (m *Model) Stream(ctx context.Context, msgs []schema.Message, opts ...llm.GenerateOption) iter.Seq2[schema.StreamChunk, error] {
	return func(yield func(schema.StreamChunk, error) bool) { /* stream implementation */ }
}
```

Error Mapping


```go
switch apiErr.StatusCode {
case 401:
	code = core.ErrAuth
case 429:
	code = core.ErrRateLimit
case 408, 504:
	code = core.ErrTimeout
case 400:
	code = core.ErrInvalidInput
}
```

See `docs/providers.md` for provider categories and priorities.
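Wrapped as a standalone function, the mapping above looks like the sketch below. The `ErrorCode` type and constants are local stand-ins for the framework's `core` package (their names mirror the snippet, but the fallback `ErrProvider` is an assumption for statuses the switch doesn't cover):

```go
package main

import "fmt"

// ErrorCode is a local stand-in for the core package's error code type.
type ErrorCode string

const (
	ErrAuth         ErrorCode = "auth"
	ErrRateLimit    ErrorCode = "rate_limit"
	ErrTimeout      ErrorCode = "timeout"
	ErrInvalidInput ErrorCode = "invalid_input"
	ErrProvider     ErrorCode = "provider" // assumed fallback for unmapped statuses
)

// codeForStatus maps an HTTP status from the provider API to an ErrorCode,
// mirroring the switch in errors.go.
func codeForStatus(status int) ErrorCode {
	switch status {
	case 401:
		return ErrAuth
	case 429:
		return ErrRateLimit
	case 408, 504:
		return ErrTimeout
	case 400:
		return ErrInvalidInput
	default:
		return ErrProvider
	}
}

func main() {
	fmt.Println(codeForStatus(429)) // rate_limit
	fmt.Println(codeForStatus(504)) // timeout
}
```

Keeping the mapping in one function makes it trivial to table-test, and callers can embed the result in a `core.Error` alongside the operation name and provider message.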