AWS Lambda Functions
Overview
AWS Lambda enables you to run code without provisioning or managing servers. Build serverless applications using event-driven triggers, pay only for compute time consumed, and scale automatically with workload.
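Because billing is per request plus GB-seconds of compute, memory size and duration trade off directly against cost. A back-of-envelope estimator can make that concrete; the two rates below are illustrative assumptions only, not current AWS prices:

```javascript
// Rough Lambda cost estimator. The rates are illustrative assumptions;
// check the AWS pricing page for your region and architecture.
const PRICE_PER_GB_SECOND = 0.0000166667; // assumed compute rate
const PRICE_PER_MILLION_REQUESTS = 0.20;  // assumed request rate

function estimateMonthlyCost({ invocations, avgDurationMs, memoryMb }) {
  // Compute is billed in GB-seconds: duration (s) x allocated memory (GB)
  const gbSeconds = invocations * (avgDurationMs / 1000) * (memoryMb / 1024);
  const computeCost = gbSeconds * PRICE_PER_GB_SECOND;
  const requestCost = (invocations / 1e6) * PRICE_PER_MILLION_REQUESTS;
  return { gbSeconds, totalUsd: computeCost + requestCost };
}

// Example: 1M invocations/month, 200 ms average duration, 256 MB memory
console.log(estimateMonthlyCost({ invocations: 1e6, avgDurationMs: 200, memoryMb: 256 }));
```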
When to Use
- API endpoints and webhooks
- Scheduled batch jobs and data processing
- Real-time file processing (S3 uploads)
- Event-driven workflows (SNS, SQS)
- Microservices and backend APIs
- Data transformations and ETL jobs
- IoT and sensor data processing
- WebSocket connections
Implementation Examples
1. Basic Lambda Function with AWS CLI
```bash
# Create Lambda execution role
aws iam create-role \
  --role-name lambda-execution-role \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"Service": "lambda.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }]
  }'

# Attach basic execution policy
aws iam attach-role-policy \
  --role-name lambda-execution-role \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

# Create function from ZIP
zip function.zip index.js
aws lambda create-function \
  --function-name my-function \
  --runtime nodejs18.x \
  --role arn:aws:iam::ACCOUNT:role/lambda-execution-role \
  --handler index.handler \
  --zip-file fileb://function.zip \
  --timeout 30 \
  --memory-size 256 \
  --environment 'Variables={ENV=production,DB_HOST=db.example.com}'  # quoted to prevent shell brace expansion

# Invoke function (--cli-binary-format is required for raw JSON payloads in AWS CLI v2)
aws lambda invoke \
  --function-name my-function \
  --payload '{"name":"John","age":30}' \
  --cli-binary-format raw-in-base64-out \
  response.json
```
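The `zip function.zip index.js` step above assumes an `index.js` that exports a `handler` matching the `--handler index.handler` setting. A minimal sketch (the greeting logic is illustrative; the payload shape mirrors the invoke example):

```javascript
// index.js — minimal handler for the create-function example above.
const handler = async (event) => {
  const name = event.name ?? 'world';
  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${name}!` })
  };
};
exports.handler = handler;
```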
2. Lambda Function with Node.js

```javascript
// index.js
exports.handler = async (event) => {
  console.log('Event:', JSON.stringify(event));

  try {
    // Parse different event sources
    const body = typeof event.body === 'string'
      ? JSON.parse(event.body)
      : event.body || {};

    // Process S3 event
    if (event.Records && event.Records[0].s3) {
      const bucket = event.Records[0].s3.bucket.name;
      // S3 object keys arrive URL-encoded, with spaces as '+'
      const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
      console.log(`Processing S3 object: ${bucket}/${key}`);
    }

    // Database query simulation
    const results = await queryDatabase(body);

    return {
      statusCode: 200,
      headers: {
        'Content-Type': 'application/json',
        'Access-Control-Allow-Origin': '*'
      },
      body: JSON.stringify({
        message: 'Success',
        data: results
      })
    };
  } catch (error) {
    console.error('Error:', error);
    return {
      statusCode: 500,
      body: JSON.stringify({ error: error.message })
    };
  }
};

async function queryDatabase(params) {
  // Simulate database call
  return { items: [] };
}
```
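The handler distinguishes API Gateway proxy events (body delivered as a JSON string) from direct payloads. A quick local sanity check of that parsing branch, runnable without AWS; the logic is inlined here so the sketch is self-contained, but in a real project you would `require('./index')` instead:

```javascript
// Same body-parsing logic as the handler above, inlined for a local check.
function parseBody(event) {
  return typeof event.body === 'string'
    ? JSON.parse(event.body)
    : event.body || {};
}

// API Gateway proxy events carry the request body as a JSON string:
const fromApi = parseBody({ body: '{"name":"John","age":30}' });

// Direct invokes (aws lambda invoke) pass the payload as the event itself,
// so there is no .body wrapper and the {} fallback applies:
const fromDirect = parseBody({ name: 'John', age: 30 });

console.log(fromApi.name);                   // John
console.log(Object.keys(fromDirect).length); // 0
```

Note the asymmetry: a directly invoked payload never hits the `JSON.parse` branch, which is why the handler keeps the `event.body || {}` fallback.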
3. Terraform Lambda Deployment
```hcl
# main.tf
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = var.aws_region
}

# Lambda execution role
resource "aws_iam_role" "lambda_role" {
  name = "lambda-execution-role"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action = "sts:AssumeRole"
      Effect = "Allow"
      Principal = {
        Service = "lambda.amazonaws.com"
      }
    }]
  })
}

# CloudWatch Logs policy
resource "aws_iam_role_policy_attachment" "lambda_logs" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}

# Lambda layer (dependencies)
resource "aws_lambda_layer_version" "dependencies" {
  filename            = "layer.zip"
  layer_name          = "nodejs-dependencies"
  compatible_runtimes = ["nodejs18.x"]
}

# Lambda function
resource "aws_lambda_function" "api_handler" {
  filename      = "lambda.zip"
  function_name = "api-handler"
  role          = aws_iam_role.lambda_role.arn
  handler       = "index.handler"
  runtime       = "nodejs18.x"
  timeout       = 30
  memory_size   = 256
  layers        = [aws_lambda_layer_version.dependencies.arn]

  environment {
    variables = {
      STAGE   = "production"
      DB_HOST = var.database_host
    }
  }

  depends_on = [aws_iam_role_policy_attachment.lambda_logs]
}

# API Gateway trigger
resource "aws_lambda_permission" "api_gateway" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.api_handler.function_name
  principal     = "apigateway.amazonaws.com"
}

# S3 trigger (assumes an aws_s3_bucket "upload_bucket" resource defined elsewhere)
resource "aws_lambda_permission" "s3_trigger" {
  statement_id  = "AllowS3Invoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.api_handler.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.upload_bucket.arn
}

resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket     = aws_s3_bucket.upload_bucket.id
  depends_on = [aws_lambda_permission.s3_trigger]

  lambda_function {
    lambda_function_arn = aws_lambda_function.api_handler.arn
    events              = ["s3:ObjectCreated:*"]
    filter_prefix       = "uploads/"
    filter_suffix       = ".jpg"
  }
}
```
4. Lambda with SAM (Serverless Application Model)
```yaml
# template.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Timeout: 30
    MemorySize: 256
    Runtime: nodejs18.x
    Tracing: Active

Parameters:
  Environment:
    Type: String
    Default: dev
    AllowedValues: [dev, prod]

Resources:
  # Lambda function
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: !Sub '${Environment}-my-function'
      CodeUri: src/
      Handler: index.handler
      Architectures:
        - x86_64
      Environment:
        Variables:
          STAGE: !Ref Environment
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref DataTable
        - S3CrudPolicy:
            BucketName: !Ref DataBucket
      Events:
        ApiEvent:
          Type: Api
          Properties:
            Path: /api/{proxy+}
            Method: ANY
            RestApiId: !Ref MyApi
        S3Upload:
          Type: S3
          Properties:
            Bucket: !Ref DataBucket
            Events: s3:ObjectCreated:*
            Filter:
              S3Key:
                Rules:
                  - Name: prefix
                    Value: uploads/

  # DynamoDB table
  DataTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: !Sub '${Environment}-data'
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH

  # S3 bucket
  DataBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub '${Environment}-data-${AWS::AccountId}'
      VersioningConfiguration:
        Status: Enabled

  # API Gateway
  MyApi:
    Type: AWS::Serverless::Api
    Properties:
      Name: !Sub '${Environment}-api'
      StageName: !Ref Environment
      Cors:
        AllowMethods: "'*'"
        AllowHeaders: "'Content-Type,Authorization'"
        AllowOrigin: "'*'"

Outputs:
  FunctionArn:
    Value: !GetAtt MyFunction.Arn
  ApiEndpoint:
    Value: !Sub 'https://${MyApi}.execute-api.${AWS::Region}.amazonaws.com/${Environment}'
```
5. Lambda Layers for Code Sharing
```bash
# Create layer directory structure (Node.js layers must use a nodejs/ folder)
mkdir -p layer/nodejs
cd layer/nodejs

# Install dependencies
npm install lodash axios moment

# Go back and create zip
cd ..
zip -r layer.zip .

# Upload layer
aws lambda publish-layer-version \
  --layer-name shared-utils \
  --zip-file fileb://layer.zip \
  --compatible-runtimes nodejs18.x
```
Best Practices
✅ DO
- Use environment variables for configuration
- Implement proper error handling and logging
- Optimize package size and dependencies
- Set appropriate timeout and memory
- Use Lambda Layers for shared code
- Implement concurrency limits
- Enable X-Ray tracing for debugging
- Use reserved concurrency for critical functions
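The "use environment variables for configuration" practice pairs well with fail-fast validation at startup. A minimal sketch; `loadConfig` is a hypothetical helper, and the variable names (`DB_HOST`, `STAGE`) mirror the examples earlier in this document:

```javascript
// Validate required environment variables once, at module load, so a
// misconfigured function fails immediately rather than mid-request.
function loadConfig(env = process.env) {
  const required = ['DB_HOST', 'STAGE'];
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
  return {
    dbHost: env.DB_HOST,
    stage: env.STAGE,
    timeoutMs: Number(env.TIMEOUT_MS ?? 5000) // optional, with a default
  };
}
```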
❌ DON'T
- Store sensitive data in code
- Create long-running operations (>15 min)
- Ignore cold start optimization
- Forget to handle concurrent executions
- Ignore CloudWatch metrics
- Use too much memory unnecessarily
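The cold-start point deserves a concrete pattern: expensive initialization (SDK clients, database connections) belongs at module scope, where it runs once per container rather than once per invocation. A sketch; `createClient` is a stand-in for a real SDK or database client:

```javascript
// Cold-start mitigation: initialize once at module scope, reuse on warm invokes.
let initCount = 0;

function createClient() {
  initCount += 1; // runs once per cold start (module load), not per invocation
  return { query: async () => ({ items: [] }) };
}

const client = createClient(); // module scope: shared across warm invocations

const handler = async () => {
  const data = await client.query(); // reuses the cached client
  return { statusCode: 200, body: JSON.stringify({ inits: initCount, data }) };
};
exports.handler = handler;
```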
Monitoring & Troubleshooting
- CloudWatch Logs for application logging
- CloudWatch Metrics for duration, errors, throttles
- X-Ray tracing for performance analysis
- Dead Letter Queues for failed messages
- Alarms for errors and throttling
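CloudWatch Logs captures each `console.log` line as a log event, and structured JSON lines are far easier to query (e.g. with CloudWatch Logs Insights) than free-form text. A sketch of the idea; `logEvent` is a hypothetical helper, not a library API:

```javascript
// Emit one structured JSON log line per event for easy querying.
function logEvent(level, message, fields = {}) {
  const line = JSON.stringify({
    timestamp: new Date().toISOString(),
    level,
    message,
    ...fields
  });
  console.log(line); // each console.log line becomes one CloudWatch Logs event
  return line;
}

logEvent('INFO', 'order processed', { orderId: 'o-123', durationMs: 42 });
```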