tanstack-ai

TanStack AI (React)

AI chat framework with isomorphic tools, streaming, and full type safety.

Packages

  • `@tanstack/ai` — core: `chat()`, `toolDefinition()`, `toServerSentEventsResponse()`, `maxIterations()`
  • `@tanstack/ai-react` — React: `useChat()` hook, re-exports connection adapters
  • `@tanstack/ai-client` — headless: `ChatClient`, `clientTools()`, `createChatClientOptions()`, `InferChatMessages`
  • `@tanstack/ai-{openai,anthropic,gemini,ollama,grok,openrouter,fal}` — adapter packages

Quick Start

Install

```bash
npm install @tanstack/ai @tanstack/ai-react @tanstack/ai-openai
```

Server (Next.js API Route)

```typescript
import { chat, toServerSentEventsResponse } from "@tanstack/ai";
import { openaiText } from "@tanstack/ai-openai";

export async function POST(request: Request) {
  const { messages } = await request.json();
  const stream = chat({
    adapter: openaiText("gpt-5.2"),
    messages,
  });
  return toServerSentEventsResponse(stream);
}
```

Client (React)

```tsx
import { useState } from "react";
import { useChat, fetchServerSentEvents } from "@tanstack/ai-react";

export function Chat() {
  const [input, setInput] = useState("");
  const { messages, sendMessage, isLoading } = useChat({
    connection: fetchServerSentEvents("/api/chat"),
  });

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          <strong>{message.role}:</strong>
          {message.parts.map((part, idx) => {
            if (part.type === "text") return <span key={idx}>{part.content}</span>;
            if (part.type === "thinking") return <em key={idx}>{part.content}</em>;
            return null;
          })}
        </div>
      ))}
      <form onSubmit={(e) => { e.preventDefault(); sendMessage(input); setInput(""); }}>
        <input value={input} onChange={(e) => setInput(e.target.value)} disabled={isLoading} />
        <button type="submit" disabled={isLoading}>Send</button>
      </form>
    </div>
  );
}
```

useChat Hook

```typescript
const {
  messages,          // UIMessage[] — current messages
  sendMessage,       // (content: string | MultimodalContent) => Promise<void>
  append,            // (message: ModelMessage | UIMessage) => Promise<void>
  isLoading,         // boolean
  error,             // Error | undefined
  stop,              // () => void — cancel current stream
  reload,            // () => Promise<void> — regenerate last response
  clear,             // () => void — clear all messages
  setMessages,       // (messages: UIMessage[]) => void
  addToolResult,     // (result: { toolCallId, tool, output, state? }) => Promise<void>
  addToolApprovalResponse, // (response: { id, approved }) => Promise<void>
} = useChat({
  connection: fetchServerSentEvents("/api/chat"),
  tools?,             // client tool implementations
  initialMessages?,   // UIMessage[]
  id?,                // string — unique chat instance id
  body?,              // additional body params sent with every request
  onResponse?,        // (response) => void
  onChunk?,           // (chunk) => void
  onFinish?,          // (message) => void
  onError?,           // (error) => void
});
```

Message Structure

Messages use `UIMessage` with a `parts` array:

```typescript
interface UIMessage {
  id: string;
  role: "user" | "assistant";
  parts: (TextPart | ThinkingPart | ToolCallPart | ToolResultPart)[];
}
```
Render parts by type:

  • `part.type === "text"` — `part.content` (string)
  • `part.type === "thinking"` — `part.content` (model reasoning, UI-only, not sent back)
  • `part.type === "tool-call"` — `part.name`, `part.input`, `part.output`, `part.state`
  • `part.type === "tool-result"` — `part.output`, `part.state`
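Narrowing on `part.type` is what makes this rendering type-safe. A minimal plain-TypeScript sketch of a text renderer over this union (the part types are mirrored locally from the shapes above and simplified; fields beyond `type` and `content` are as documented):

```typescript
// Part types mirrored (and simplified) from the documented union.
type TextPart = { type: "text"; content: string };
type ThinkingPart = { type: "thinking"; content: string };
type ToolCallPart = { type: "tool-call"; name: string; input: unknown; output?: unknown; state?: string };
type ToolResultPart = { type: "tool-result"; output: unknown; state?: string };
type Part = TextPart | ThinkingPart | ToolCallPart | ToolResultPart;

// Switching on `type` narrows each variant, so `content` and `name`
// are only accessible where they actually exist.
export function renderPartsAsText(parts: Part[]): string {
  return parts
    .map((part) => {
      switch (part.type) {
        case "text":
          return part.content;
        case "thinking":
          return `[thinking] ${part.content}`;
        case "tool-call":
          return `[tool-call] ${part.name}`;
        case "tool-result":
          return "[tool-result]";
      }
    })
    .join("\n");
}
```

The same narrowing pattern drives the JSX renderer in the Quick Start component above.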

Connection Adapters

```typescript
import { fetchServerSentEvents, fetchHttpStream, stream } from "@tanstack/ai-react";

// SSE (recommended — auto-reconnection)
fetchServerSentEvents("/api/chat", { headers: { Authorization: "Bearer token" } })

// HTTP stream (NDJSON)
fetchHttpStream("/api/chat")

// Custom
stream(async (messages, data, signal) => { /* return async iterable */ })
```
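The custom `stream()` form expects the callback to produce an async iterable of chunks. A self-contained sketch of such a chunk source and a consumer, with the `{ type, delta }` chunk shape assumed purely for illustration (TanStack AI's actual chunk protocol may differ):

```typescript
// Hypothetical chunk source: yields text deltas a few characters at a time.
// The { type: "text-delta", delta } shape is an assumption for this sketch.
export async function* chunksFrom(text: string, size = 4) {
  for (let i = 0; i < text.length; i += size) {
    yield { type: "text-delta", delta: text.slice(i, i + size) };
  }
}

// Reassemble the stream into a string, the way a client might.
export async function collectText(
  chunks: AsyncIterable<{ type: string; delta: string }>
): Promise<string> {
  let out = "";
  for await (const chunk of chunks) {
    if (chunk.type === "text-delta") out += chunk.delta;
  }
  return out;
}
```

Any async generator with this contract could back a custom connection, which is what makes `stream()` useful for non-HTTP transports (WebSockets, workers, tests).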

Adapters

The model name is passed to the adapter factory — one function per activity, so unused adapters tree-shake away:

```typescript
import { openaiText } from "@tanstack/ai-openai";        // openaiText('gpt-5.2')
import { anthropicText } from "@tanstack/ai-anthropic";  // anthropicText('claude-sonnet-4-5')
import { geminiText } from "@tanstack/ai-gemini";        // geminiText('gemini-2.5-pro')
import { ollamaText } from "@tanstack/ai-ollama";        // ollamaText('llama3')
import { grokText } from "@tanstack/ai-grok";            // grokText('grok-4')
import { openRouterText } from "@tanstack/ai-openrouter"; // openRouterText('openai/gpt-5')
```

Tools Overview

Two-step process: define the schema with `toolDefinition()`, then implement with `.server()` or `.client()`.

```typescript
import { toolDefinition } from "@tanstack/ai";
import { z } from "zod";

const getWeatherDef = toolDefinition({
  name: "get_weather",
  description: "Get current weather for a location",
  inputSchema: z.object({ location: z.string() }),
  outputSchema: z.object({ temperature: z.number(), conditions: z.string() }),
  needsApproval: false, // optional
});

// Server implementation — runs on server with DB/API access
const getWeather = getWeatherDef.server(async ({ location }) => {
  const data = await fetchWeather(location);
  return { temperature: data.temp, conditions: data.conditions };
});

// Client implementation — runs in browser for UI/localStorage
const getWeatherClient = getWeatherDef.client((input) => {
  return { temperature: 72, conditions: "cached" };
});
```
For detailed tool patterns (server, client, hybrid, approval, agentic cycle), see references/tools.md.
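The agentic cycle works like this: the model emits a tool call, the framework runs the tool, and the result is fed back to the model until it produces a final answer (bounded by `maxIterations()`). A conceptual, self-contained sketch of that loop — `fakeModel` and the loop below are illustrative stand-ins, not TanStack AI's API:

```typescript
type Turn =
  | { kind: "tool-call"; name: string; input: { location: string } }
  | { kind: "answer"; text: string };

// Stand-in "model": requests the weather once, then answers using the result.
function fakeModel(history: string[]): Turn {
  return history.length === 0
    ? { kind: "tool-call", name: "get_weather", input: { location: "Paris" } }
    : { kind: "answer", text: `Weather: ${history[history.length - 1]}` };
}

// The agentic loop: execute the requested tool, append its result,
// and re-prompt the model, capped like maxIterations().
export function runAgenticCycle(maxIterations = 5): string {
  const history: string[] = [];
  for (let i = 0; i < maxIterations; i++) {
    const turn = fakeModel(history);
    if (turn.kind === "answer") return turn.text;
    // "Run" the tool (here a canned result) and record its output.
    history.push(`sunny in ${turn.input.location}`);
  }
  return "max iterations reached";
}
```

In the real framework this loop is driven by `chat()` on the server (and by client tools via `useChat`), with the iteration cap keeping a misbehaving model from looping forever.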

Type Safety

Use `clientTools()` + `createChatClientOptions()` + `InferChatMessages` for full type inference:

```typescript
import { clientTools, createChatClientOptions, type InferChatMessages } from "@tanstack/ai-client";

const tools = clientTools(updateUI, saveToStorage); // no 'as const' needed
const chatOptions = createChatClientOptions({
  connection: fetchServerSentEvents("/api/chat"),
  tools,
});
type ChatMessages = InferChatMessages<typeof chatOptions>;

// In component:
const { messages } = useChat(chatOptions);
// messages typed — part.name is a discriminated union; part.input/output typed from Zod schemas
```

Devtools

```bash
npm install -D @tanstack/react-ai-devtools @tanstack/react-devtools
```

```tsx
import { TanStackDevtools } from "@tanstack/react-devtools";
import { aiDevtoolsPlugin } from "@tanstack/react-ai-devtools";

<TanStackDevtools
  plugins={[aiDevtoolsPlugin()]}
  eventBusConfig={{ connectToServerBus: true }}
/>
```

Additional Guides

  • Server setup patterns (Next.js, TanStack Start): see references/server-setup.md
  • Tool system (server, client, hybrid, approval, agentic cycle): see references/tools.md
  • Advanced features (multimodal, structured outputs, runtime adapter switching): see references/advanced.md