# API Response Test Fixtures
For provider response parsing tests, we aim to store test fixtures containing the true responses from the providers (unless they are too large, in which case trimming that does not change semantics is advised).
The fixtures are stored in a `__fixtures__` subfolder, e.g. `packages/openai/src/responses/__fixtures__`. See the file names in `packages/openai/src/responses/__fixtures__` for naming conventions and `openai-responses-language-model.test.ts` for how to set up test helpers.

You can use our examples under `/examples/ai-functions` to generate test fixtures.
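As a rough illustration of how a parsing test can consume a stored fixture (the fixture name and response shape here are hypothetical; the real conventions are defined by the files in the `__fixtures__` folders), a test loads the captured JSON and feeds it to the parser under test:

```typescript
// Sketch: loading a captured provider response as a test fixture.
// A temp directory stands in for a package's __fixtures__ folder.
import { mkdtempSync, writeFileSync, readFileSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';

const fixturesDir = mkdtempSync(join(tmpdir(), '__fixtures__-'));

// Stand-in for a captured response, trimmed without changing semantics.
writeFileSync(
  join(fixturesDir, 'openai-generate-text.json'),
  JSON.stringify({
    id: 'resp_123',
    output: [{ type: 'message', content: 'Hello' }],
  }),
);

// A test would read the fixture back and pass it to the response parser:
const fixture = JSON.parse(
  readFileSync(join(fixturesDir, 'openai-generate-text.json'), 'utf8'),
);
console.log(fixture.output[0].content); // prints "Hello"
```

Because the fixture is checked in, the test stays deterministic and never calls the provider.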
## generateText (doGenerate testing)
For `generateText`, log the raw response output to the console and copy it into a new test fixture.

```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';
import { run } from '../lib/run';

run(async () => {
  const result = await generateText({
    model: openai('gpt-5-nano'),
    prompt: 'Invent a new holiday and describe its traditions.',
  });

  console.log(JSON.stringify(result.response.body, null, 2));
});
```

## streamText (doStream testing)
For `streamText`, you need to set `includeRawChunks` to `true` and use the special `saveRawChunks` helper. Run the script from the `/examples/ai-functions` folder via `pnpm tsx src/stream-text/script-name.ts`. The result is then stored in the `/examples/ai-functions/output` folder. You can copy it to your fixtures folder and rename it.

```ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import { run } from '../lib/run';
import { saveRawChunks } from '../lib/save-raw-chunks';

run(async () => {
  const result = streamText({
    model: openai('gpt-5-nano'),
    prompt: 'Invent a new holiday and describe its traditions.',
    includeRawChunks: true,
  });

  await saveRawChunks({ result, filename: 'openai-gpt-5-nano' });
});
```
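A minimal sketch of what a `saveRawChunks`-style helper could do, to clarify why `includeRawChunks: true` is required (this is an assumption about the helper's behavior, not its actual implementation — the real one lives under `/examples/ai-functions`; the `RawChunk` type and the stubbed stream below are illustrative only):

```typescript
// Sketch: collect the provider's unparsed chunks from a stream result and
// write them to a JSON file, similar in spirit to the saveRawChunks helper.
import { writeFileSync, mkdirSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';

type StreamPart = { type: string; rawValue?: unknown };

async function saveRawChunksSketch(opts: {
  result: { fullStream: AsyncIterable<StreamPart> };
  filename: string;
  outputDir: string;
}): Promise<number> {
  const rawChunks: unknown[] = [];
  for await (const part of opts.result.fullStream) {
    // With includeRawChunks: true, the stream also carries the provider's
    // unparsed chunks; without it, there is nothing raw to capture.
    if (part.type === 'raw') rawChunks.push(part.rawValue);
  }
  mkdirSync(opts.outputDir, { recursive: true });
  writeFileSync(
    join(opts.outputDir, `${opts.filename}.json`),
    JSON.stringify(rawChunks, null, 2),
  );
  return rawChunks.length;
}

// Usage with a stubbed stream (no API call, no network):
async function* fakeStream(): AsyncIterable<StreamPart> {
  yield { type: 'raw', rawValue: { choices: [{ delta: { content: 'Hi' } }] } };
  yield { type: 'text-delta' };
}

const saved = await saveRawChunksSketch({
  result: { fullStream: fakeStream() },
  filename: 'demo',
  outputDir: join(tmpdir(), 'raw-chunks-demo'),
});
console.log(`saved ${saved} raw chunks`);
```

The key point for fixture capture: only the raw, unparsed chunks are stored, so the fixture exercises the provider's actual wire format rather than the SDK's already-parsed output.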