n8n Workflow Automation Skill
Purpose
Provide specialized guidance for developing workflows, custom nodes, and integrations on the n8n automation platform. Enable AI assistants to design workflows, write custom code nodes, build TypeScript-based custom nodes, integrate external services, and implement AI agent patterns.
When to Use This Skill
Invoke this skill when:
- Designing automation workflows combining multiple services
- Writing JavaScript/Python code within workflow nodes
- Building custom nodes in TypeScript
- Integrating APIs, databases, and cloud services
- Creating AI agent workflows with LangChain
- Troubleshooting workflow execution errors
- Planning self-hosted n8n deployments
- Converting manual processes to automated workflows
Do NOT use this skill for:
- Generic automation advice (use appropriate language/platform skill)
- Cloud platform-specific integrations (combine with cloud provider skill)
- Database design (use database-specialist skill)
- Frontend development (n8n has minimal UI customization)
Core n8n Concepts
Platform Architecture
Runtime Environment:
- Node.js-based execution engine
- Primarily TypeScript (~90% of the codebase) with a Vue.js frontend
- pnpm monorepo structure
- Self-hosted or cloud deployment options
Workflow Execution Models:
- Manual trigger - User-initiated execution
- Webhook trigger - HTTP endpoint activation
- Schedule trigger - Cron-based timing
- Event trigger - External service events (database changes, file uploads)
- Error trigger - Workflow failure handling
Fair-code License:
- Apache 2.0 with Commons Clause
- Free for self-hosting and unlimited executions
- Commercial restrictions for SaaS offerings
Node Types and Categories
Core Nodes (Data manipulation):
- Code - Execute JavaScript/Python
- Set - Assign variable values
- If - Conditional branching
- Switch - Multi-branch routing
- Merge - Combine data streams
- Split In Batches - Process large datasets incrementally
- Loop Over Items - Iterate through data
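For intuition, the slicing behavior of Split In Batches can be sketched in plain JavaScript (a simplified model, not n8n's actual implementation):

```javascript
// Simplified model of Split In Batches: slice an item array into
// fixed-size chunks that a downstream branch processes one at a time.
function splitInBatches(items, batchSize) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}
```

With 5 items and a batch size of 2, this yields batches of 2, 2, and 1 items; n8n then feeds each batch through the loop branch in turn.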
Trigger Nodes (Workflow initiation):
- Webhook - HTTP endpoint
- Schedule - Time-based execution
- Manual Trigger - User activation
- Error Trigger - Catch workflow failures
- Start - Default entry point
Action Nodes (500+ integrations):
- API connectors (REST, GraphQL, SOAP)
- Database clients (PostgreSQL, MongoDB, MySQL, Redis)
- Cloud services (AWS, GCP, Azure, Cloudflare)
- Communication (Email, Slack, Discord, SMS)
- File operations (FTP, S3, Google Drive, Dropbox)
- Authentication (OAuth2, API keys, JWT)
AI Nodes (LangChain integration):
- AI Agent - Autonomous decision-making
- AI Chain - Sequential LLM operations
- AI Transform - Data manipulation with LLMs
- Vector Store - Embedding storage and retrieval
- Document Loaders - Text extraction from files
Data Flow and Connections
Connection Types:
- Main connection - Primary data flow (solid line)
- Error connection - Failure routing (dashed red line)
Data Structure:
```javascript
// Input/output format for all nodes
[
  {
    json: { /* Your data object */ },
    binary: { /* Optional binary data (files, images) */ },
    pairedItem: { /* Reference to source item */ }
  }
]
```

Data Access Patterns:
- Expression (current node output): `{{ $json.field }}`
- Input reference (specific node): `{{ $('NodeName').item.json.field }}`
- All items (entire dataset): `{{ $input.all() }}`
- First item (single item): `{{ $input.first() }}`
- Item index (current iteration): `{{ $itemIndex }}`
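To build intuition outside the editor, the item array and the `$input.all()` access pattern above can be simulated in plain JavaScript (a model of the data shape, not n8n internals):

```javascript
// Simulated node input: each item wraps its payload under `json`.
const input = [
  { json: { field: 'alpha' } },
  { json: { field: 'beta' } },
];

// A Code-node-style transform: read every item, emit new items
// in the same { json: ... } shape.
function runNode(items) {
  return items.map(item => ({
    json: { upper: item.json.field.toUpperCase() },
  }));
}

const output = runNode(input);
// output[0].json.upper is 'ALPHA', output[1].json.upper is 'BETA'
```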
Credentials and Authentication
Credential Types:
- Predefined - Pre-configured for popular services (OAuth2, API key)
- Generic - HTTP authentication (Basic, Digest, Header Auth)
- Custom - User-defined credential structures
Security Practices:
- Credentials stored encrypted in database
- Environment variable support for sensitive values
- Credential sharing across workflows (optional)
- Rotation: Manual update required
Workflow Design Methodology
Planning Phase
Step 1: Define Requirements
- Input sources (webhooks, schedules, databases)
- Data transformations needed
- Output destinations (APIs, files, databases)
- Error handling requirements
- Execution frequency and volume
Step 2: Map Data Flow
- Identify trigger events
- List transformation steps
- Specify validation rules
- Define branching logic
- Plan error recovery
Step 3: Select Nodes
Decision criteria:
- Use native nodes when available (optimized, maintained)
- Use Code node for custom logic <50 lines
- Build custom node for reusable complex logic >100 lines
- Use HTTP Request node for APIs without native nodes
- Use Execute Command node for system operations (security risk)
Implementation Phase
Workflow Structure Pattern:

```
[Trigger] → [Validation] → [Branch (If/Switch)] → [Processing] → [Error Handler]
                                  ↓                     ↓
                          [Path A nodes]         [Path B nodes]
                                  ↓                     ↓
                          [Merge/Output]            [Output]
```

Modular Design:
- Extract reusable logic to sub-workflows
- Use Execute Workflow node for modularity
- Limit main workflow to 15-20 nodes (readability)
- Parameterize workflows with input variables
Error Handling Strategy:
- Error Trigger workflows - Capture all failures
- Try/Catch pattern - Error output connections on nodes
- Retry logic - Configure per-node retry settings
- Validation nodes - If/Switch for data checks
- Notification - Alert on critical failures (Email, Slack)
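Retry logic is usually configured in the node's settings, but the same idea can be sketched as a Code-node-style helper (a minimal sketch; the delays and retry counts are illustrative):

```javascript
// Retry an async operation with exponential backoff.
// `operation` is any function returning a Promise.
async function withRetry(operation, { retries = 3, baseDelayMs = 100 } = {}) {
  let lastError;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await operation();
    } catch (error) {
      lastError = error;
      if (attempt < retries) {
        // Exponential backoff: 100ms, 200ms, 400ms, ...
        await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```

In a real workflow the equivalent behavior comes from per-node retry settings or a dedicated Error Trigger workflow; this helper is only useful inside a Code node that makes its own external calls.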
Testing Phase
Local Testing:
- Execute with sample data
- Verify each node output (inspect data panel)
- Test error paths with invalid data
- Check credential connections
Production Validation:
- Enable workflow, monitor executions
- Review execution history for failures
- Check resource usage (execution time, memory)
- Validate output data quality
Code Execution in Workflows
Code Node (JavaScript)
Available APIs:
- Node.js built-ins - `fs`, `path`, `crypto`, `https`
- Lodash - `_.groupBy()`, `_.sortBy()`, etc.
- Luxon - DateTime manipulation
- n8n helpers - `$input`, `$json`, `$binary`
Basic Structure:
```javascript
// Access input items
const items = $input.all();

// Process data
const processedItems = items.map(item => {
  const inputData = item.json;
  return {
    json: {
      // Output fields
      processed: inputData.field.toUpperCase(),
      timestamp: new Date().toISOString()
    }
  };
});

// Return transformed items
return processedItems;
```

Data Transformation Patterns:
Filtering:

```javascript
const items = $input.all();
return items.filter(item => item.json.status === 'active');
```

Aggregation:
```javascript
const items = $input.all();
const grouped = _.groupBy(items, item => item.json.category);
return [{
  json: {
    summary: Object.keys(grouped).map(category => ({
      category,
      count: grouped[category].length
    }))
  }
}];
```

API calls (async):
```javascript
const items = $input.all();
const results = [];
for (const item of items) {
  const response = await fetch(`https://api.example.com/data/${item.json.id}`);
  const data = await response.json();
  results.push({
    json: {
      original: item.json,
      enriched: data
    }
  });
}
return results;
```

Error Handling in Code:
```javascript
const items = $input.all();
return items.map(item => {
  try {
    // Risky operation
    const result = JSON.parse(item.json.data);
    return { json: { parsed: result } };
  } catch (error) {
    return {
      json: {
        error: error.message,
        original: item.json.data
      }
    };
  }
});
```
Code Node (Python)
Available Libraries:
- Standard library - `json`, `datetime`, `re` (plus `requests` for HTTP)
- NumPy - Array operations
- Pandas - Data analysis (if installed)

Basic Structure:

```python
from datetime import datetime

# Access input items
items = _input.all()

# Process data
processed_items = []
for item in items:
    input_data = item['json']
    processed_items.append({
        'json': {
            'processed': input_data['field'].upper(),
            'timestamp': datetime.now().isoformat()
        }
    })

# Return transformed items
return processed_items
```

**Complexity Rating: Code Nodes**
- Simple transformations (map/filter): **1**
- API calls with error handling: **2**
- Multi-step async operations: **3**
- Complex algorithms with libraries: **4**
- Performance-critical processing: **5** (consider custom node)

Custom Node Development
When to Build Custom Nodes
Build custom node when:
- Reusable logic across multiple workflows (>3 workflows)
- Complex authentication requirements
- Performance-critical operations (Code node overhead)
- Community contribution (public npm package)
- Organization-specific integrations
Use Code node when:
- One-off transformations
- Rapid prototyping
- Simple API calls (<100 lines)
Development Styles
[See Code Examples: examples/n8n_custom_node.ts]
1. Programmatic Style (Full control)
Use for:
- Complex authentication flows
- Advanced parameter validation
- Custom UI components
- Polling operations with state management
[See: `CustomNode` class in examples/n8n_custom_node.ts]

2. Declarative Style (Simplified)
Use for:
- Standard CRUD operations
- RESTful API wrappers
- Simple integrations without complex logic
[See: `operations` and `router` exports in examples/n8n_custom_node.ts]

Additional Examples:
- Credential configuration: `customApiCredentials` in examples/n8n_custom_node.ts
- Credential validation: `validateCredentials()` in examples/n8n_custom_node.ts
- Polling trigger: `PollingTrigger` class in examples/n8n_custom_node.ts
Development Workflow
Step 1: Initialize Node

```bash
# Create from template
npm create @n8n/node my-custom-node
```

Directory structure created:

```
├── nodes/
│   └── MyCustomNode/
│       └── MyCustomNode.node.ts
├── credentials/
│   └── MyCustomNodeApi.credentials.ts
└── package.json
```
**Step 2: Implement Logic**
- Define node properties (parameters, credentials)
- Implement execute method
- Add error handling
- Write unit tests (optional)
**Step 3: Build and Test**

```bash
# Build TypeScript
npm run build

# Link locally for testing
npm link

# In n8n development environment
cd ~/.n8n/nodes
npm link my-custom-node

# Restart n8n to load node
n8n start
```
**Step 4: Publish**

```bash
# Community node (npm package)
npm publish
```

Install in n8n:
Settings → Community Nodes → Install → Enter package name

**Complexity Rating: Custom Nodes**
- Declarative CRUD wrapper: **2**
- Programmatic with authentication: **3**
- Complex state management: **4**
- Advanced polling/webhooks: **5**

Integration Patterns
API Integration Strategy
Decision Tree:

```
Has native node? ──Yes──> Use native node
        │
        No
        ├──> Simple REST API? ──Yes──> HTTP Request node
        ├──> Complex auth (OAuth2)? ──Yes──> Build custom node
        ├──> Reusable across workflows? ──Yes──> Build custom node
        └──> One-off integration? ──Yes──> Code node with fetch()
```

HTTP Request Node Patterns
GET with query parameters:

```
URL: https://api.example.com/users
Method: GET
Query Parameters:
- status: active
- limit: 100
Authentication: Header Auth
- Name: Authorization
- Value: Bearer {{$credentials.apiKey}}
```

POST with JSON body:

```
URL: https://api.example.com/users
Method: POST
Body Content Type: JSON
Body:
{
  "name": "={{ $json.name }}",
  "email": "={{ $json.email }}"
}
```

Pagination handling (Code node):

```javascript
let allResults = [];
let page = 1;
let hasMore = true;
while (hasMore) {
  const response = await this.helpers.request({
    method: 'GET',
    url: `https://api.example.com/data?page=${page}`,
    json: true,
  });
  allResults = allResults.concat(response.results);
  hasMore = response.hasNext;
  page++;
}
return allResults.map(item => ({ json: item }));
```
Webhook Patterns
Receiving webhooks:
- Create webhook trigger node
- Configure HTTP method (POST/GET)
- Set authentication (None/Header Auth/Basic Auth)
- Get webhook URL from node
- Register URL with external service

Responding to webhooks:

```javascript
// In Code node after webhook trigger
const webhookData = $input.first().json;

// Process data (processData is a placeholder for your own logic)
const result = processData(webhookData);

// Return response (synchronous webhook)
return [{
  json: {
    status: 'success',
    data: result
  }
}];
```

Webhook URL structure:

```
Production: https://your-domain.com/webhook/workflow-id
Test: https://your-domain.com/webhook-test/workflow-id
```
Database Integration
Common patterns:

Query with parameters:

```sql
-- PostgreSQL node
SELECT * FROM users
WHERE created_at > $1
  AND status = $2
ORDER BY created_at DESC
```

Parameters from previous node:

```
Parameters: ['{{ $json.startDate }}', 'active']
```

Batch insert:

```javascript
// Code node preparing data for database
const items = $input.all();
const values = items.map(item => ({
  name: item.json.name,
  email: item.json.email,
  created_at: new Date().toISOString()
}));
return [{ json: { values } }];
// Next node: PostgreSQL
// INSERT INTO users (name, email, created_at)
// VALUES {{ $json.values }}
```
File Operations
Upload to S3:

```
Workflow: File Trigger → S3 Upload
- File Trigger: Monitor directory for new files
- S3 node:
  - Operation: Upload
  - Bucket: my-bucket
  - File Name: {{ $json.fileName }}
  - Binary Data: true (from file trigger)
```

Download and process:

```
HTTP Request (download) → Code (process) → Google Drive (upload)
- HTTP Request: Binary response enabled
- Code: Process $binary.data
- Google Drive: Upload with binary data
```
AI Agent Workflows
LangChain Integration
AI Agent Node Configuration:
- Agent type: OpenAI Functions, ReAct, Conversational
- LLM: OpenAI, Anthropic, Hugging Face, Ollama (local)
- Memory: Buffer, Buffer Window, Summary
- Tools: Calculator, Webhook, Database query, Custom API calls

Basic Agent Pattern:

```
Manual Trigger → AI Agent → Output
- AI Agent:
  - Prompt: "You are a helpful assistant that {{$json.task}}"
  - Tools: [Calculator, HTTP Request]
  - Memory: Conversation Buffer Window
```
Gatekeeper Pattern (Supervised AI)
Use case: Human approval before agent actions

```
Webhook → AI Agent → If (requires approval) → Send Email → Wait for Webhook → Execute Action
                           ↓ (auto-approve)
                     Execute Action
```

Implementation:
- AI Agent generates action plan
- If node checks confidence score
- Low confidence → Email approval request
- Wait for webhook (approve/reject)
- Execute or abort based on response
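The If-node confidence check can be sketched as a plain routing function (the threshold and the `confidence` field name are illustrative assumptions, not an n8n API):

```javascript
// Decide whether an agent-proposed action can auto-execute or
// must be routed to a human for approval, based on a score in [0, 1].
function routeAction(plan, threshold = 0.8) {
  if (typeof plan.confidence !== 'number') {
    // Fail safe: anything without a score goes to a human
    return { route: 'approval', reason: 'missing confidence score' };
  }
  return plan.confidence >= threshold
    ? { route: 'auto', reason: 'high confidence' }
    : { route: 'approval', reason: 'low confidence' };
}
```

Failing toward the approval path on missing or malformed scores is the key design choice: the gatekeeper should never auto-execute by accident.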
Iterative Agent Pattern
Use case: Multi-step problem solving with state

```
Loop Start → AI Agent → Tool Execution → State Update → Loop End (condition check)
    ↑______________________________________________________________|
```

State management:

```javascript
// Code node - Initialize state
return [{
  json: {
    task: 'Research topic',
    iteration: 0,
    maxIterations: 5,
    context: [],
    completed: false
  }
}];
```

```javascript
// Code node - Update state
// (checkGoalMet is a placeholder for your own completion check)
const state = $json;
state.iteration++;
state.context.push($('AI Agent').item.json.response);
state.completed = state.iteration >= state.maxIterations || checkGoalMet(state);
return [{ json: state }];
```
RAG (Retrieval Augmented Generation) Pattern
```
Query Input → Vector Store Search → Format Context → LLM → Response Output
```

Vector Store setup:
- Document Loader node → Split text into chunks
- Embeddings node → Generate vectors (OpenAI, Cohere)
- Vector Store node → Store in Pinecone/Qdrant/Supabase
- Query: Retrieve relevant chunks → Inject into LLM prompt

Complexity Rating: AI Workflows
- Simple LLM call: 1
- Agent with tools: 3
- Gatekeeper pattern: 4
- Multi-agent orchestration: 5
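The "split text into chunks" step in the RAG setup can be sketched as a character-window chunker with overlap (a simplified stand-in for the text splitters the Document Loader node uses):

```javascript
// Split text into overlapping chunks for embedding.
// Overlap keeps context that straddles a chunk boundary retrievable.
function chunkText(text, chunkSize = 500, overlap = 50) {
  if (overlap >= chunkSize) {
    throw new Error('overlap must be smaller than chunkSize');
  }
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap;
  }
  return chunks;
}
```

Real splitters usually break on sentence or token boundaries rather than raw characters; this sketch only illustrates the windowing-with-overlap idea.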
Deployment and Hosting
Self-Hosting Options
[See Code Examples: examples/n8n_deployment.yaml]

Docker (Recommended):
- Docker Compose with PostgreSQL
- Queue mode configuration for scaling
- Resource requirements by volume
[See: docker-compose configurations in examples/n8n_deployment.yaml]

npm (Development):

```bash
npm install n8n -g
n8n start
```
Access: http://localhost:5678
**Environment Configuration:**
[See: Complete environment variable reference in examples/n8n_deployment.yaml]
Essential variables:
- `N8N_HOST` - Public URL for webhooks
- `WEBHOOK_URL` - Webhook endpoint base
- `N8N_ENCRYPTION_KEY` - Credential encryption (must persist)
- `DB_TYPE` - Database (SQLite/PostgreSQL/MySQL/MariaDB)
- `EXECUTIONS_DATA_SAVE_ON_ERROR` - Error logging
- `EXECUTIONS_DATA_SAVE_ON_SUCCESS` - Success logging
Performance tuning variables documented in examples/n8n_deployment.yaml
Scaling Considerations
Queue Mode (High volume):

```bash
# Separate main and worker processes

# Main process (UI + queue management)
N8N_QUEUE_MODE=main n8n start

# Worker processes (execution only)
N8N_QUEUE_MODE=worker n8n worker
```
**Database:**
- SQLite: Development/low volume (<1000 executions/day)
- PostgreSQL: Production (recommended)
- MySQL/MariaDB: Alternative for existing infrastructure
**Resource Requirements:**
| Workflow Volume | CPU | RAM | Database |
|----------------|-----|-----|----------|
| <100 exec/day | 1 core | 512MB | SQLite |
| 100-1000/day | 2 cores | 2GB | PostgreSQL |
| 1000-10000/day | 4 cores | 4GB | PostgreSQL |
| >10000/day | 8+ cores | 8GB+ | PostgreSQL + Queue mode |
**Monitoring:**
- Enable execution logs (EXECUTIONS_DATA_SAVE_*)
- Set up error workflows (Error Trigger node)
- Monitor database size (execution history cleanup)
- Track webhook response times
Best Practices
Workflow Design
1. Modularity:
- Extract reusable logic to Execute Workflow nodes
- Limit workflows to single responsibility
- Use sub-workflows for common operations (validation, formatting)
2. Error Resilience:
- Add error outputs to critical nodes
- Implement retry logic (node settings)
- Create Error Trigger workflows for alerts
- Log errors to external systems (Sentry, CloudWatch)
3. Performance:
- Use Split In Batches for large datasets (>1000 items)
- Minimize HTTP requests in loops (batch API calls)
- Disable execution logging for high-frequency workflows
- Cache expensive operations in variables
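Minimizing HTTP requests in loops usually means collapsing per-item calls into batched calls; a sketch of the idea (the batch-capable `fetchBatch` function is a hypothetical stand-in for an API that accepts many IDs per request):

```javascript
// Collapse N per-item requests into ceil(N / batchSize) batched requests.
// `fetchBatch` is a placeholder: one HTTP call taking an array of IDs
// and resolving to an array of results.
async function fetchAllBatched(ids, fetchBatch, batchSize = 50) {
  const results = [];
  for (let i = 0; i < ids.length; i += batchSize) {
    const batch = ids.slice(i, i + batchSize);
    results.push(...await fetchBatch(batch));
  }
  return results;
}
```

For 500 items and a batch size of 50, this issues 10 requests instead of 500, which matters both for workflow execution time and for upstream rate limits.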
4. Security:
- Store secrets in credentials (not hardcoded)
- Use environment variables for configuration
- Enable webhook authentication
- Restrict Execute Command node usage (or disable globally)
- Review code nodes for injection vulnerabilities
5. Maintainability:
- Add notes to complex workflows (Sticky Note node)
- Use consistent naming (verb + noun: "Fetch Users", "Transform Data")
- Document workflow purpose in workflow settings
- Version control workflows (export JSON, commit to Git)
Code Quality in Nodes
1. Data validation:

```javascript
// Always validate input structure
const items = $input.all();
for (const item of items) {
  if (!item.json.email || !item.json.name) {
    throw new Error(`Invalid input: missing required fields at item ${item.json.id}`);
  }
}
```

2. Error context:

```javascript
// Provide debugging information
// (apiCall is a placeholder for your own request helper)
try {
  const result = await apiCall(item.json.id);
} catch (error) {
  throw new Error(`API call failed for ID ${item.json.id}: ${error.message}`);
}
```

3. Idempotency:

```javascript
// Check existence before creation
// (checkExists/createRecord are placeholders for your own helpers)
const exists = await checkExists(item.json.uniqueId);
if (!exists) {
  await createRecord(item.json);
}
```
Workflow Pattern Library
Pattern 1: API Sync
Use case: Sync data between two systems

```
Schedule Trigger (hourly) → Fetch Source Data → Transform → If (record exists) → Update Target
                                                                   ↓ (new)
                                                           Create in Target
```

Complexity: 2
Pattern 2: Error Recovery
Use case: Retry failed operations with exponential backoff

```
Main Workflow → Process → Error → Error Trigger Workflow
                                           ↓
                          Wait (delay) → Retry → If (max retries) → Alert
```

Complexity: 3
Pattern 3: Data Enrichment
Use case: Augment data with external sources

```
Webhook → Split In Batches → For Each Item:
                 ↓
         API Call (enrich) → Code (merge) → Batch Results
                 ↓
         Database Insert
```

Complexity: 3
Pattern 4: Event-Driven Processing
Use case: Process events from message queue

```
SQS Trigger → Parse Message → Switch (event type) → [Handler A, Handler B, Handler C] → Confirm/Delete Message
```

Complexity: 3
Pattern 5: Human-in-the-Loop
Use case: Approval workflow

```
Trigger → Generate Request → Send Email (approval link) → Webhook (approval response) → If (approved) → Execute Action
                                                                                              ↓ (rejected)
                                                                                      Send Rejection Notice
```

Complexity: 4
Pattern 6: Multi-Stage ETL
Use case: Complex data pipeline

```
Schedule → Extract (API) → Validate → Transform → Load (Database) → Success Notification
                ↓                                       ↓
          Error Handler ────────────────────────> Error Notification
```

Complexity: 3
Quality Gates
Definition of Done: Workflows
A workflow is production-ready when:
- Functionality:
  - ✓ All nodes execute without errors on test data
  - ✓ Error paths tested with invalid input
  - ✓ Output format validated against requirements
- Error Handling:
  - ✓ Error outputs configured on critical nodes
  - ✓ Error Trigger workflow created for alerts
  - ✓ Retry logic configured where applicable
- Security:
  - ✓ Credentials used (no hardcoded secrets)
  - ✓ Webhook authentication enabled
  - ✓ Input validation implemented
- Documentation:
  - ✓ Workflow description filled in settings
  - ✓ Complex logic documented with notes
  - ✓ Parameter descriptions clear
- Performance:
  - ✓ Tested with realistic data volume
  - ✓ Execution time acceptable
  - ✓ Resource usage within limits
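The input-validation item on the security checklist can be a small guard at the start of a webhook workflow, rejecting malformed payloads before they reach downstream nodes. A sketch with illustrative required fields:

```javascript
// Guard for a webhook workflow: reject bad payloads early.
// The required fields (userId, action) are an assumed example contract.
function validatePayload(body) {
  if (body === null || typeof body !== 'object') {
    return { ok: false, error: 'body must be a JSON object' };
  }
  for (const field of ['userId', 'action']) {
    if (!(field in body)) {
      return { ok: false, error: `missing required field: ${field}` };
    }
  }
  return { ok: true };
}

console.log(validatePayload({ userId: 1, action: 'sync' })); // { ok: true }
console.log(validatePayload({ userId: 1 }));                 // missing action
```

Wiring the `ok: false` branch to an early HTTP 400 response keeps invalid input from ever triggering side effects.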
Definition of Done: Custom Nodes
A custom node is production-ready when:
- Functionality:
  - ✓ All operations implemented and tested
  - ✓ Credentials integration working
  - ✓ Parameters validated
- Code Quality:
  - ✓ TypeScript types defined
  - ✓ Error handling comprehensive
  - ✓ No hardcoded values (use parameters)
- Documentation:
  - ✓ README with installation instructions
  - ✓ Parameter descriptions clear
  - ✓ Example workflows provided
- Distribution:
  - ✓ Published to npm (if public)
  - ✓ Versioned appropriately
  - ✓ Dependencies declared in package.json
Error Handling Guide
Common Issues and Resolutions
Issue: Workflow fails with "Invalid JSON"
- Cause: Node output not in correct format
- Resolution:
```javascript
// Ensure return format
return [{ json: { your: 'data' } }];
// NOT: return { your: 'data' };
```
Issue: "Cannot read property of undefined"
- Cause: Missing data from previous node
- Resolution:
```javascript
// Check existence before access
const value = $json.field?.subfield ?? 'default';
```
Issue: Webhook not receiving data
- Cause: Incorrect webhook URL or authentication
- Resolution:
- Verify URL matches external service configuration
- Check authentication method matches (None/Header/Basic)
- Test with curl:
```bash
curl -X POST https://your-n8n.com/webhook/test \
  -H "Content-Type: application/json" \
  -d '{"test": "data"}'
```
Issue: Custom node not appearing
- Cause: Not properly linked/installed
- Resolution:
```bash
# Check installation
npm list -g | grep n8n-nodes-

# Reinstall if needed
npm install -g n8n-nodes-your-node

# Restart n8n
```
Issue: High memory usage
- Cause: Processing large datasets without batching
- Resolution:
- Use Split In Batches node (batch size: 100-1000)
- Disable execution data saving for high-frequency workflows
- Set EXECUTIONS_DATA_PRUNE=true
Issue: Credentials not working
- Cause: Incorrect credential configuration or expired tokens
- Resolution:
- Re-authenticate OAuth2 credentials
- Verify API key/token validity
- Check credential permissions in service
Debugging Strategies
1. Inspect node output:
- Click node → View executions data
- Check json and binary tabs
- Verify data structure matches expectations
2. Add debug Code nodes:
```javascript
// Log intermediate values
const data = $json;
console.log('Debug data:', JSON.stringify(data, null, 2));
return [{ json: data }];
```
3. Use If node for validation:
```javascript
// Expression to check data quality
{{ $json.email && $json.email.includes('@') }}
```
4. Enable execution logging:
- Settings → Log → Set level to debug
- Check docker logs:
```bash
docker logs n8n -f
```
5. Test in isolation:
- Create test workflow with Manual Trigger
- Copy problematic nodes
- Use static test data
References
Official Documentation
- Main docs: https://docs.n8n.io
- Node reference: https://docs.n8n.io/integrations/builtin/
- Custom node development: https://docs.n8n.io/integrations/creating-nodes/
- Expression reference: https://docs.n8n.io/code/expressions/
Code Repositories
- Core platform: https://github.com/n8n-io/n8n
- Documentation: https://github.com/n8n-io/n8n-docs
- Community nodes: https://www.npmjs.com/search?q=n8n-nodes
Community Resources
- Forum: https://community.n8n.io
- Templates: https://n8n.io/workflows (workflow library)
- YouTube: Official n8n channel (tutorials)
Related Skills
- For cloud integrations: Use skill cloud-devops-expert
- For database design: Use skill database-specialist
- For API design: Use skill api-design-architect
- For TypeScript development: Use language-specific skills
Version: 1.0.0
Last Updated: 2025-11-13
Complexity Rating: 3 (Moderate - requires platform-specific knowledge)
Estimated Learning Time: 8-12 hours for proficiency