aliyun-qwen-generation
Category: provider
Model Studio Qwen Text Generation
Validation
bash
mkdir -p output/aliyun-qwen-generation
python -m py_compile skills/ai/text/aliyun-qwen-generation/scripts/prepare_generation_request.py && echo "py_compile_ok" > output/aliyun-qwen-generation/validate.txt

Pass criteria: the command exits 0 and output/aliyun-qwen-generation/validate.txt is generated.
Output And Evidence
- Save prompt templates, normalized request payloads, and response summaries under output/aliyun-qwen-generation/.
- Keep one reproducible request example with the model name, region, and key parameters.

Use this skill for general text generation, reasoning, tool-calling, and long-context chat on Alibaba Cloud Model Studio.
Critical model names
Prefer the current flagship families:

- qwen3-max / qwen3-max-2026-01-23
- qwen3.5-plus / qwen3.5-plus-2026-02-15
- qwen3.5-flash / qwen3.5-flash-2026-02-23

Common related variants listed in the official model catalog:

- qwen3.5-397b-a17b
- qwen3.5-122b-a10b
- qwen3.5-35b-a3b
- qwen3.5-27b
Prerequisites
- Install SDK in a virtual environment:
bash
python3 -m venv .venv
. .venv/bin/activate
python -m pip install dashscope

- Set DASHSCOPE_API_KEY in your environment, or add dashscope_api_key to ~/.alibabacloud/credentials.
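Key resolution with the credentials-file fallback can be sketched as below. The environment variable name and file path come from this doc; the INI layout and section handling are assumptions about the credentials file, not something it specifies.

```python
import configparser
import os
from pathlib import Path


def resolve_api_key():
    """Return the DashScope API key: environment first, then the
    ~/.alibabacloud/credentials file (assumed INI-style; any section
    containing a dashscope_api_key option is accepted)."""
    key = os.environ.get("DASHSCOPE_API_KEY")
    if key:
        return key
    cred_path = Path.home() / ".alibabacloud" / "credentials"
    if cred_path.exists():
        parser = configparser.ConfigParser()
        parser.read(cred_path)
        for section in parser.sections():
            if parser.has_option(section, "dashscope_api_key"):
                return parser.get(section, "dashscope_api_key")
    return None
```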
Normalized interface (text.generate)
Request
- messages (array<object>, required): standard chat turns.
- model (string, optional): default qwen3.5-plus.
- temperature (number, optional)
- top_p (number, optional)
- max_tokens (int, optional)
- enable_thinking (bool, optional)
- tools (array<object>, optional)
- response_format (object, optional)
- stream (bool, optional)
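The request schema above can be enforced with a small builder. This is a minimal sketch; the function name and the fail-fast validation strategy are mine, not part of the skill's published scripts.

```python
def build_generate_request(messages, model="qwen3.5-plus", **options):
    """Build a normalized text.generate request dict.

    Only optional keys from the schema are accepted; unknown parameters
    raise immediately rather than being silently forwarded."""
    allowed = {"temperature", "top_p", "max_tokens", "enable_thinking",
               "tools", "response_format", "stream"}
    unknown = set(options) - allowed
    if unknown:
        raise ValueError(f"unexpected parameters: {sorted(unknown)}")
    if not messages:
        raise ValueError("messages is required and must be non-empty")
    request = {"model": model, "messages": list(messages)}
    # Drop None values so defaults stay server-side.
    request.update({k: v for k, v in options.items() if v is not None})
    return request
```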
Response
- text (string): assistant output.
- finish_reason (string, optional)
- usage (object, optional)
- raw (object, optional)
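Mapping an OpenAI-compatible completion onto this normalized shape might look like the sketch below, assuming the standard choices/message/usage layout of the compatible-mode endpoint.

```python
def normalize_response(api_response):
    """Map an OpenAI-compatible chat completion response onto the
    normalized response shape: text, finish_reason, usage, raw."""
    choice = api_response["choices"][0]
    return {
        "text": choice["message"]["content"],
        "finish_reason": choice.get("finish_reason"),
        "usage": api_response.get("usage"),
        "raw": api_response,  # keep the full payload for debugging
    }
```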
Quick start (OpenAI-compatible endpoint)
bash
curl -sS https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions \
-H "Authorization: Bearer $DASHSCOPE_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "qwen3.5-plus",
"messages": [
{"role": "system", "content": "You are a concise assistant."},
{"role": "user", "content": "Summarize why object storage helps media pipelines."}
],
"stream": false
}'
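The same call can be made from Python with only the standard library. A sketch: it builds the identical payload but does not send it unless you call chat_completion yourself with DASHSCOPE_API_KEY set.

```python
import json
import os
import urllib.request

API_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions"


def chat_completion(payload):
    """POST a chat-completions payload to the OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['DASHSCOPE_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Same payload as the curl example above.
payload = {
    "model": "qwen3.5-plus",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize why object storage helps media pipelines."},
    ],
    "stream": False,
}
```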
Local helper script
bash
python skills/ai/text/aliyun-qwen-generation/scripts/prepare_generation_request.py \
--prompt "Draft a concise architecture summary for a media ingestion pipeline." \
  --model qwen3.5-plus
Operational guidance
- Use snapshot IDs when reproducibility matters.
- Prefer qwen3.5-flash for lower-latency simple tasks and qwen3-max for harder multi-step tasks.
- Keep tool schemas minimal and explicit when enabling tool calls.
- For multimodal input, route to dedicated VL or Omni skills unless the task is primarily text-centric.
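The first two guidelines can be encoded in a tiny default-picking helper; the routing heuristic below is a hypothetical sketch, not part of the skill.

```python
def pick_model(multi_step, pinned_snapshot=None):
    """Pick a model per the guidance above: a pinned snapshot ID wins when
    reproducibility matters, qwen3-max for harder multi-step tasks, and
    qwen3.5-flash for lower-latency simple ones."""
    if pinned_snapshot:
        return pinned_snapshot
    return "qwen3-max" if multi_step else "qwen3.5-flash"
```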
Output location
- Default output: output/aliyun-qwen-generation/requests/
- Override the base dir with OUTPUT_DIR.
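Resolving the output directory with the OUTPUT_DIR override can look like this; a sketch, since the doc only specifies the variable name and the default path.

```python
import os
from pathlib import Path


def output_dir():
    """Resolve the request output directory, honoring OUTPUT_DIR if set."""
    base = Path(os.environ.get("OUTPUT_DIR", "output"))
    path = base / "aliyun-qwen-generation" / "requests"
    path.mkdir(parents=True, exist_ok=True)
    return path
```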
References
references/sources.md