Custom Job Steps Skill
This skill guides you through creating new custom job steps for Salesforce B2C Commerce batch processing.
Running an existing job? If you need to execute jobs or import site archives via CLI, use the `b2c-cli:b2c-job` skill instead.
When to Use
- Creating a new scheduled job for batch processing
- Building a data import job (customers, products, orders)
- Building a data export job (reports, feeds, sync)
- Implementing data sync between systems
- Creating cleanup or maintenance tasks
Overview
Custom job steps allow you to execute custom business logic as part of B2C Commerce jobs. There are two execution models:
| Model | Use Case | Progress Tracking |
|---|---|---|
| Task-oriented | Single operations (FTP, import/export) | Limited |
| Chunk-oriented | Bulk data processing | Fine-grained |
File Structure
```
my_cartridge/
├── cartridge/
│   ├── scripts/
│   │   └── steps/
│   │       ├── myTaskStep.js    # Task-oriented script
│   │       └── myChunkStep.js   # Chunk-oriented script
│   └── my_cartridge.properties
└── steptypes.json               # Step type definitions (at cartridge ROOT)
```

Important: The `steptypes.json` file must be placed in the root folder of the cartridge, not inside the `cartridge/` directory (i.e., not `cartridge/steptypes.json`). Only one `steptypes.json` file is allowed per cartridge.

Step Type Definition (steptypes.json)
```json
{
  "step-types": {
    "script-module-step": [
      {
        "@type-id": "custom.MyTaskStep",
        "@supports-parallel-execution": "false",
        "@supports-site-context": "true",
        "@supports-organization-context": "false",
        "description": "My custom task step",
        "module": "my_cartridge/cartridge/scripts/steps/myTaskStep.js",
        "function": "execute",
        "timeout-in-seconds": 900,
        "parameters": {
          "parameter": [
            {
              "@name": "InputFile",
              "@type": "string",
              "@required": "true",
              "description": "Path to input file"
            },
            {
              "@name": "Enabled",
              "@type": "boolean",
              "@required": "false",
              "default-value": "true",
              "description": "Enable processing"
            }
          ]
        },
        "status-codes": {
          "status": [
            {
              "@code": "OK",
              "description": "Step completed successfully"
            },
            {
              "@code": "ERROR",
              "description": "Step failed"
            },
            {
              "@code": "NO_DATA",
              "description": "No data to process"
            }
          ]
        }
      }
    ],
    "chunk-script-module-step": [
      {
        "@type-id": "custom.MyChunkStep",
        "@supports-parallel-execution": "true",
        "@supports-site-context": "true",
        "@supports-organization-context": "false",
        "description": "Bulk data processing step",
        "module": "my_cartridge/cartridge/scripts/steps/myChunkStep.js",
        "before-step-function": "beforeStep",
        "read-function": "read",
        "process-function": "process",
        "write-function": "write",
        "after-step-function": "afterStep",
        "total-count-function": "getTotalCount",
        "chunk-size": 100,
        "transactional": "false",
        "timeout-in-seconds": 1800,
        "parameters": {
          "parameter": [
            {
              "@name": "CategoryId",
              "@type": "string",
              "@required": "true"
            }
          ]
        }
      }
    ]
  }
}
```
Task-Oriented Steps
Use for single operations like FTP transfers, file generation, or import/export.
Script (scripts/steps/myTaskStep.js)
```javascript
'use strict';

var Status = require('dw/system/Status');
var Logger = require('dw/system/Logger');

/**
 * Execute the task step
 * @param {Object} parameters - Job step parameters
 * @param {dw.job.JobStepExecution} stepExecution - Step execution context
 * @returns {dw.system.Status} Execution status
 */
exports.execute = function (parameters, stepExecution) {
    var log = Logger.getLogger('job', 'MyTaskStep');

    try {
        var inputFile = parameters.InputFile;
        var enabled = parameters.Enabled;

        if (!enabled) {
            log.info('Step disabled, skipping');
            return new Status(Status.OK, 'SKIP', 'Step disabled');
        }

        // Your business logic here
        log.info('Processing file: ' + inputFile);

        // Return success
        return new Status(Status.OK);
    } catch (e) {
        log.error('Step failed: ' + e.message);
        return new Status(Status.ERROR, 'ERROR', e.message);
    }
};
```
Status Codes
```javascript
// Success
return new Status(Status.OK);
return new Status(Status.OK, 'CUSTOM_CODE', 'Custom message');

// Error
return new Status(Status.ERROR);
return new Status(Status.ERROR, null, 'Error message');
```

Important: Custom status codes work only with the OK status; a custom code combined with ERROR is replaced by ERROR. Custom status codes cannot contain commas, wildcards, or leading/trailing whitespace, and cannot exceed 100 characters.
Chunk-Oriented Steps
Use for bulk processing of countable data (products, orders, customers).
Important: You cannot define custom exit statuses for chunk-oriented steps. Chunk modules always finish with either OK or ERROR.
Required Functions
| Function | Purpose | Returns |
|---|---|---|
| `read` | Get next item | Item, or nothing to end |
| `process` | Transform item | Processed item, or nothing (filters) |
| `write` | Save chunk of items | Nothing |
Optional Functions
| Function | Purpose | Returns |
|---|---|---|
| `beforeStep` | Initialize (open files, queries) | Nothing |
| `afterStep` | Cleanup (close files) | Nothing |
| `getTotalCount` | Return total items for progress | Number |
| `beforeChunk` | Before each chunk | Nothing |
| `afterChunk` | After each chunk | Nothing |
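Putting these functions together, the engine's loop can be sketched in plain JavaScript. This is a simplified illustration, not platform code: the real engine also handles transactions, parallel execution, and passes `(parameters, stepExecution)` into each hook.

```javascript
// Simplified driver showing how the job engine calls a chunk module.
function runChunkStep(module, chunkSize) {
    if (module.beforeStep) module.beforeStep();
    var endOfData = false;
    while (!endOfData) {
        var chunk = [];
        while (chunk.length < chunkSize) {
            var item = module.read();             // returning nothing signals end of data
            if (item === undefined) { endOfData = true; break; }
            var processed = module.process(item); // returning nothing filters the item out
            if (processed !== undefined) chunk.push(processed);
        }
        if (chunk.length > 0) module.write(chunk); // one write call per chunk
    }
    if (module.afterStep) module.afterStep(true);
}

// Example: numbers 1..5, odd ones filtered out, chunk size 2.
var data = [1, 2, 3, 4, 5];
var cursor = 0;
var written = [];
runChunkStep({
    read: function () { if (cursor < data.length) return data[cursor++]; },
    process: function (n) { if (n % 2 === 0) return n * 10; },
    write: function (items) { written.push(items.slice()); }
}, 2);
// written is now [[20, 40]]
```

Note how filtering in `process` shrinks a chunk rather than ending it: `write` receives only the items that survived.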
Script (scripts/steps/myChunkStep.js)
```javascript
'use strict';

var ProductMgr = require('dw/catalog/ProductMgr');
var Logger = require('dw/system/Logger');
var File = require('dw/io/File');
var FileWriter = require('dw/io/FileWriter');

var log = Logger.getLogger('job', 'MyChunkStep');
var products;
var fileWriter;

/**
 * Initialize before processing
 */
exports.beforeStep = function (parameters, stepExecution) {
    log.info('Starting chunk processing');

    // Open resources
    var outputFile = new File(File.IMPEX + '/export/products.csv');
    fileWriter = new FileWriter(outputFile);
    fileWriter.writeLine('ID,Name,Price');

    // Query products
    products = ProductMgr.queryAllSiteProducts();
};

/**
 * Get total count for progress tracking
 */
exports.getTotalCount = function (parameters, stepExecution) {
    return products.count;
};

/**
 * Read next item
 * Return nothing to signal end of data
 */
exports.read = function (parameters, stepExecution) {
    if (products.hasNext()) {
        return products.next();
    }
    // Return nothing = end of data
};

/**
 * Process single item
 * Return nothing to filter out the item
 */
exports.process = function (product, parameters, stepExecution) {
    // Filter: skip offline products
    if (!product.online) {
        return; // Filtered out
    }

    // Transform
    return {
        id: product.ID,
        name: product.name,
        price: product.priceModel.price.value
    };
};

/**
 * Write chunk of processed items
 */
exports.write = function (items, parameters, stepExecution) {
    for (var i = 0; i < items.size(); i++) {
        var item = items.get(i);
        fileWriter.writeLine(item.id + ',' + item.name + ',' + item.price);
    }
};

/**
 * Cleanup after all chunks
 */
exports.afterStep = function (success, parameters, stepExecution) {
    // Close resources
    if (fileWriter) {
        fileWriter.close();
    }
    if (products) {
        products.close();
    }

    if (success) {
        log.info('Chunk processing completed successfully');
    } else {
        log.error('Chunk processing failed');
    }
};
```
Parameter Types
| Type | Description | Example Value |
|---|---|---|
| `string` | Text value | `hello` |
| `boolean` | true/false | `true` |
| `long` | Integer | `42` |
| `double` | Decimal | `3.14` |
| `datetime-string` | ISO datetime | `2024-01-15T10:30:00.000Z` |
| `date-string` | ISO date | `2024-01-15` |
| `time-string` | ISO time | `10:30:00` |
Parameter Validation Attributes
| Attribute | Applies To | Description |
|---|---|---|
| `@trim` | All | Trim whitespace before validation (default: `false`) |
| `@required` | All | Mark as required (default: `false`) |
| `@timezone-aware` | datetime-string, date-string, time-string | Convert to UTC |
| `pattern` | string | Regex pattern for validation |
| `min-length` | string | Minimum string length (must be ≥ 1) |
| `max-length` | string | Maximum string length (max 1000 chars total) |
| `min-value` | long, double, datetime-string, time-string | Minimum numeric value |
| `max-value` | long, double, datetime-string, time-string | Maximum numeric value |
| `enum-values` | All | Restrict to allowed values (dropdown in BM) |
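As an illustration of how these attributes combine, a parameters block might look like the sketch below. `ExportMode` and `BatchSize` are hypothetical parameter names, and the `enum-values`/`min-value` shapes follow the steptypes.json schema as described above; verify element names against the platform documentation before relying on them.

```json
{
  "parameters": {
    "parameter": [
      {
        "@name": "ExportMode",
        "@type": "string",
        "@required": "true",
        "@trim": "true",
        "description": "Which feed variant to generate",
        "default-value": "FULL",
        "enum-values": {
          "value": ["FULL", "DELTA"]
        }
      },
      {
        "@name": "BatchSize",
        "@type": "long",
        "@required": "false",
        "default-value": "100",
        "min-value": 1,
        "max-value": 1000,
        "description": "Items per batch"
      }
    ]
  }
}
```

Business Manager renders `enum-values` as a dropdown, which prevents typos in free-text job configuration.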
Configuration Options
steptypes.json Attributes
| Attribute | Required | Description |
|---|---|---|
| `@type-id` | Yes | Unique ID (must start with `custom.`) |
| `@supports-parallel-execution` | No | Allow parallel execution (default: `false`) |
| `@supports-site-context` | No | Available in site-scoped jobs (default: `true`) |
| `@supports-organization-context` | No | Available in org-scoped jobs (default: `false`) |
| `module` | Yes | Path to the script module |
| `function` | Yes | Function name to execute (task-oriented) |
| `timeout-in-seconds` | No | Step timeout (recommended to set) |
| `transactional` | No | Wrap in a single transaction (default: `false`) |
| `chunk-size` | Yes* | Items per chunk (*required for chunk steps) |

Context Constraints: `@supports-site-context` and `@supports-organization-context` cannot both be `true` or both be `false`; one must be `true` and the other `false`.
Best Practices
- Use chunk-oriented steps for bulk data - better progress tracking and resumability
- Close resources in `afterStep()` - queries, files, connections
- Set explicit timeouts - the default may be too short
- Log progress - it helps debugging
- Handle errors gracefully - return proper `Status` objects
- Don't rely on `transactional=true` - use `Transaction.wrap()` for fine-grained control
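On that last point, the explicit-transaction pattern looks like the sketch below. A local stand-in mimics the commit-on-return, rollback-on-throw semantics of `dw/system/Transaction.wrap()` so the snippet can run outside the platform; in a real job step you would `require('dw/system/Transaction')` instead.

```javascript
// Stand-in for dw/system/Transaction (illustration only). wrap() runs the
// callback inside a transaction: a normal return commits, a throw rolls back.
var Transaction = {
    wrap: function (callback) {
        // begin transaction
        try {
            var result = callback();
            // commit happens here
            return result;
        } catch (e) {
            // rollback happens here
            throw e;
        }
    }
};

// Recommended pattern: declare "transactional": "false" in steptypes.json and
// scope each write explicitly, so a failure rolls back only the current item.
function markProcessed(item) {
    return Transaction.wrap(function () {
        item.processed = true;
        return item;
    });
}

var item = markProcessed({ id: 'p1' });
// item.processed is now true
```

Keeping the transaction scope small also avoids holding locks across an entire chunk when only one write actually needs them.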
Related Skills
- `b2c-cli:b2c-job` - For running existing jobs and importing site archives via CLI
- `b2c:b2c-webservices` - When job steps need to call external HTTP services or APIs; use this skill for service configuration and HTTP client patterns
Detailed Reference
- Task-Oriented Steps - Full task step patterns
- Chunk-Oriented Steps - Full chunk step patterns
- steptypes.json Reference - Complete schema