# GCP Cloud Functions
## Overview
Google Cloud Functions enables event-driven serverless computing on Google Cloud Platform. Functions scale automatically, plug into IAM for security, and integrate directly with other Google Cloud services (Pub/Sub, Cloud Storage, Firestore), making them well suited to rapid development.
## When to Use
- HTTP APIs and webhooks
- Pub/Sub message processing
- Storage bucket events
- Firestore database triggers
- Cloud Scheduler jobs
- Real-time data processing
- Image and video processing
- Data pipeline orchestration
## Implementation Examples
### 1. Cloud Function Creation with the gcloud CLI
```bash
# Install the Google Cloud SDK
curl https://sdk.cloud.google.com | bash
exec -l $SHELL

# Initialize and authenticate
gcloud init
gcloud auth application-default login

# Set the project
gcloud config set project MY_PROJECT_ID

# Create a service account
gcloud iam service-accounts create cloud-function-sa \
  --display-name "Cloud Function Service Account"

# Grant permissions
gcloud projects add-iam-policy-binding MY_PROJECT_ID \
  --member="serviceAccount:cloud-function-sa@MY_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/cloudfunctions.invoker"

# Deploy an HTTP function
gcloud functions deploy my-http-function \
  --gen2 \
  --runtime nodejs18 \
  --region us-central1 \
  --source ./src \
  --entry-point httpHandler \
  --trigger-http \
  --allow-unauthenticated \
  --timeout 60 \
  --memory 256MB \
  --max-instances 100 \
  --set-env-vars NODE_ENV=production,API_KEY=xxx \
  --service-account cloud-function-sa@MY_PROJECT_ID.iam.gserviceaccount.com

# Deploy a Pub/Sub-triggered function
gcloud functions deploy my-pubsub-function \
  --gen2 \
  --runtime nodejs18 \
  --region us-central1 \
  --source ./src \
  --entry-point pubsubHandler \
  --trigger-topic my-topic \
  --memory 256MB \
  --timeout 300 \
  --service-account cloud-function-sa@MY_PROJECT_ID.iam.gserviceaccount.com

# Deploy a Cloud Storage-triggered function
gcloud functions deploy my-storage-function \
  --gen2 \
  --runtime nodejs18 \
  --region us-central1 \
  --source ./src \
  --entry-point storageHandler \
  --trigger-bucket my-bucket \
  --trigger-location us-central1 \
  --timeout 60 \
  --service-account cloud-function-sa@MY_PROJECT_ID.iam.gserviceaccount.com

# List functions
gcloud functions list

# Get function details
gcloud functions describe my-http-function --gen2 --region us-central1

# Call a function
gcloud functions call my-http-function \
  --region us-central1 \
  --data '{"name":"John"}'

# View logs
gcloud functions logs read my-http-function --limit 50 --gen2 --region us-central1

# Delete a function
gcloud functions delete my-http-function --gen2 --region us-central1
```
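The `gcloud functions logs read` output is much easier to filter when the function writes structured JSON logs, since Cloud Logging promotes a top-level `severity` field from one-JSON-object-per-line output. A minimal sketch of such a helper (`logEntry` is our name, not a Google API):

```javascript
// Cloud Logging treats each JSON object written to stdout/stderr as a
// structured log entry and promotes its "severity" field.
// logEntry is a hypothetical helper, not part of any Google SDK.
function logEntry(severity, message, fields = {}) {
  return JSON.stringify({ severity, message, ...fields });
}

console.log(logEntry('INFO', 'Processing request', { name: 'John' }));
console.error(logEntry('ERROR', 'Upstream call failed', { status: 502 }));
```

Filtering in the Logs Explorer then becomes as simple as `severity>=ERROR`.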
### 2. Cloud Functions Implementation (Node.js)
```javascript
// Requires @google-cloud/functions-framework as a dependency in package.json.
// 2nd-gen event-triggered functions receive a CloudEvent, so handlers are
// registered via the Functions Framework rather than plain exports.
const functions = require('@google-cloud/functions-framework');

// HTTP trigger function
functions.http('httpHandler', async (req, res) => {
  try {
    // Enable CORS
    res.set('Access-Control-Allow-Origin', '*');
    res.set('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
    if (req.method === 'OPTIONS') {
      res.status(204).send('');
      return;
    }

    // Parse request
    const { name } = req.query;
    if (!name) {
      return res.status(400).json({ error: 'Name is required' });
    }

    // Structured log for Cloud Logging
    console.log(JSON.stringify({
      severity: 'INFO',
      message: 'Processing request',
      name: name
    }));

    // Business logic
    const response = {
      message: `Hello ${name}!`,
      timestamp: new Date().toISOString()
    };
    res.status(200).json(response);
  } catch (error) {
    console.error(JSON.stringify({
      severity: 'ERROR',
      message: error.message,
      stack: error.stack
    }));
    res.status(500).json({ error: 'Internal server error' });
  }
});

// Pub/Sub trigger function (the Pub/Sub payload arrives base64-encoded
// inside cloudEvent.data.message.data)
functions.cloudEvent('pubsubHandler', async (cloudEvent) => {
  try {
    const pubsubMessage = cloudEvent.data.message.data
      ? Buffer.from(cloudEvent.data.message.data, 'base64').toString()
      : null;
    console.log('Received message:', pubsubMessage);

    const data = JSON.parse(pubsubMessage);
    await processMessage(data);
    console.log('Message processed successfully');
  } catch (error) {
    console.error('Error processing message:', error);
    throw error; // Rethrow so the event is redelivered if retries are enabled
  }
});

// Cloud Storage trigger function
functions.cloudEvent('storageHandler', async (cloudEvent) => {
  try {
    const { name, bucket } = cloudEvent.data;
    console.log(JSON.stringify({
      message: 'Processing storage event',
      bucket: bucket,
      file: name,
      eventId: cloudEvent.id,
      eventType: cloudEvent.type
    }));

    // Only process image files
    if (!name.endsWith('.jpg') && !name.endsWith('.png')) {
      console.log('Skipping non-image file');
      return;
    }

    await processImage(bucket, name);
    console.log('Image processed successfully');
  } catch (error) {
    console.error('Error processing file:', error);
    throw error;
  }
});

// Cloud Scheduler (cron) target: a plain HTTP function
functions.http('cronHandler', async (req, res) => {
  try {
    console.log('Scheduled job started');
    await performBatchJob();
    res.status(200).json({ message: 'Batch job completed' });
  } catch (error) {
    console.error('Error in batch job:', error);
    res.status(500).json({ error: error.message });
  }
});

// Helper functions
async function processMessage(data) {
  // Business logic placeholder
  return new Promise(resolve => setTimeout(resolve, 1000));
}

async function processImage(bucket, filename) {
  // Use the Cloud Vision API or similar here
  return true;
}

async function performBatchJob() {
  // Batch processing logic placeholder
  return true;
}
```

### 3. Terraform Cloud Functions Configuration
```hcl
# cloud-functions.tf
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 5.0"
    }
  }
}

provider "google" {
  project = var.project_id
  region  = var.region
}

variable "project_id" {
  description = "GCP Project ID"
  type        = string
}

variable "region" {
  type    = string
  default = "us-central1"
}

# Service account for functions
resource "google_service_account" "function_sa" {
  account_id   = "cloud-function-sa"
  display_name = "Cloud Function Service Account"
}

# Grant invoker role
resource "google_project_iam_member" "function_invoker" {
  project = var.project_id
  role    = "roles/cloudfunctions.invoker"
  member  = "serviceAccount:${google_service_account.function_sa.email}"
}

# Grant Cloud Logging role
resource "google_project_iam_member" "function_logs" {
  project = var.project_id
  role    = "roles/logging.logWriter"
  member  = "serviceAccount:${google_service_account.function_sa.email}"
}

# Source archive bucket
resource "google_storage_bucket" "function_source" {
  name     = "${var.project_id}-function-source"
  location = var.region
}

# Archive function code
data "archive_file" "function" {
  type        = "zip"
  source_dir  = "${path.module}/src"
  output_path = "${path.module}/function.zip"
}

# Upload function code
resource "google_storage_bucket_object" "function_zip" {
  name   = "function-${data.archive_file.function.output_md5}.zip"
  bucket = google_storage_bucket.function_source.name
  source = data.archive_file.function.output_path
}

# HTTP Cloud Function (2nd gen)
resource "google_cloudfunctions2_function" "http_function" {
  name        = "my-http-function"
  location    = var.region
  description = "HTTP trigger function"

  build_config {
    runtime     = "nodejs18"
    entry_point = "httpHandler"
    source {
      storage_source {
        bucket = google_storage_bucket.function_source.name
        object = google_storage_bucket_object.function_zip.name
      }
    }
  }

  service_config {
    max_instance_count    = 100
    available_memory      = "256M"
    timeout_seconds       = 60
    service_account_email = google_service_account.function_sa.email
    environment_variables = {
      NODE_ENV = "production"
      API_KEY  = "your-api-key" # prefer Secret Manager for real secrets
    }
  }

  labels = {
    env = "production"
  }
}

# Allow public HTTP access (2nd gen functions are invoked through their
# underlying Cloud Run service, so grant roles/run.invoker there)
resource "google_cloud_run_service_iam_member" "http_public" {
  location = var.region
  service  = google_cloudfunctions2_function.http_function.name
  role     = "roles/run.invoker"
  member   = "allUsers"
}

# Pub/Sub topic
resource "google_pubsub_topic" "messages" {
  name = "message-topic"
}

# Pub/Sub Cloud Function
resource "google_cloudfunctions2_function" "pubsub_function" {
  name        = "my-pubsub-function"
  location    = var.region
  description = "Pub/Sub trigger function"

  build_config {
    runtime     = "nodejs18"
    entry_point = "pubsubHandler"
    source {
      storage_source {
        bucket = google_storage_bucket.function_source.name
        object = google_storage_bucket_object.function_zip.name
      }
    }
  }

  service_config {
    max_instance_count    = 100
    available_memory      = "256M"
    timeout_seconds       = 300
    service_account_email = google_service_account.function_sa.email
  }

  event_trigger {
    trigger_region = var.region
    event_type     = "google.cloud.pubsub.topic.v1.messagePublished"
    pubsub_topic   = google_pubsub_topic.messages.id
  }
}

# Cloud Storage bucket
resource "google_storage_bucket" "uploads" {
  name     = "${var.project_id}-uploads"
  location = var.region
}

# Cloud Storage trigger function
resource "google_cloudfunctions2_function" "storage_function" {
  name        = "my-storage-function"
  location    = var.region
  description = "Cloud Storage trigger function"

  build_config {
    runtime     = "nodejs18"
    entry_point = "storageHandler"
    source {
      storage_source {
        bucket = google_storage_bucket.function_source.name
        object = google_storage_bucket_object.function_zip.name
      }
    }
  }

  service_config {
    max_instance_count    = 50
    available_memory      = "256M"
    timeout_seconds       = 60
    service_account_email = google_service_account.function_sa.email
  }

  event_trigger {
    trigger_region = var.region
    event_type     = "google.cloud.storage.object.v1.finalized"
    event_filters {
      attribute = "bucket"
      value     = google_storage_bucket.uploads.name
    }
  }
}

# Cloud Scheduler job (cron)
resource "google_cloud_scheduler_job" "batch_job" {
  name             = "batch-job-scheduler"
  description      = "Scheduled batch job"
  schedule         = "0 2 * * *" # daily at 02:00
  time_zone        = "UTC"
  attempt_deadline = "320s"
  region           = var.region

  retry_config {
    retry_count = 1
  }

  http_target {
    uri         = google_cloudfunctions2_function.http_function.service_config[0].uri
    http_method = "POST"
    headers = {
      "Content-Type" = "application/json"
    }
    body = base64encode(jsonencode({
      job_type = "batch"
    }))
    oidc_token {
      service_account_email = google_service_account.function_sa.email
    }
  }
}

# Cloud Logging sink for function logs
resource "google_logging_project_sink" "function_logs" {
  name        = "cloud-function-logs"
  destination = "logging.googleapis.com/projects/${var.project_id}/locations/global/buckets/_Default"
  filter      = "resource.type=\"cloud_function\" AND resource.labels.function_name=\"my-http-function\""
}

# Monitoring alert on error rate
resource "google_monitoring_alert_policy" "function_errors" {
  display_name = "Cloud Function Error Rate"
  combiner     = "OR"

  conditions {
    display_name = "Error rate threshold"
    condition_threshold {
      filter          = "metric.type=\"cloudfunctions.googleapis.com/function/error_count\" AND resource.type=\"cloud_function\""
      duration        = "60s"
      comparison      = "COMPARISON_GT"
      threshold_value = 10
      aggregations {
        alignment_period   = "60s"
        per_series_aligner = "ALIGN_RATE"
      }
    }
  }
}

output "http_function_url" {
  value = google_cloudfunctions2_function.http_function.service_config[0].uri
}
```
## Best Practices

### ✅ DO
- Use service accounts with least privilege
- Store secrets in Secret Manager
- Implement proper error handling
- Use environment variables for configuration
- Monitor with Cloud Logging and Cloud Monitoring
- Set appropriate memory and timeout
- Use event filters to reduce invocations
- Implement idempotent functions
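"Implement idempotent functions" matters because Pub/Sub and Eventarc deliver at least once, so the same event can arrive twice. A minimal sketch keyed on the event ID (`handleOnce` is our name; the in-memory `Set` is for illustration only — real functions should persist seen IDs in Firestore or Redis, since instances come and go):

```javascript
// Deduplicate deliveries by event ID. The in-memory Set is illustrative:
// function instances are ephemeral, so production code persists IDs instead.
const processed = new Set();

async function handleOnce(eventId, work) {
  if (processed.has(eventId)) return 'skipped'; // duplicate delivery
  processed.add(eventId);
  await work();
  return 'done';
}

// Same event delivered twice: the work runs only once.
(async () => {
  console.log(await handleOnce('evt-123', async () => {})); // done
  console.log(await handleOnce('evt-123', async () => {})); // skipped
})();
```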
### ❌ DON'T
- Store secrets in code
- Use default service account
- Create long-running functions
- Ignore error handling
- Deploy without testing
- Use unauthenticated access for sensitive functions
## Monitoring
- Cloud Logging for application logs
- Cloud Monitoring for metrics
- Error Reporting for error tracking
- Cloud Trace for distributed tracing
- Cloud Profiler for performance analysis
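Error Reporting needs no client library in a Cloud Function: it groups any `ERROR`-severity log entry whose message contains a stack trace, and the `ReportedErrorEvent` `@type` makes the intent explicit. A sketch of that log shape (`reportError` is our hypothetical helper, not a Google API):

```javascript
// Shape Error Reporting recognizes: ERROR severity plus a stack trace,
// tagged with the ReportedErrorEvent @type to be explicit.
// reportError is a hypothetical helper, not part of any Google SDK.
function reportError(err) {
  return JSON.stringify({
    severity: 'ERROR',
    '@type': 'type.googleapis.com/google.devtools.clouderrorreporting.v1beta1.ReportedErrorEvent',
    message: err.stack || String(err),
  });
}

try {
  throw new Error('boom');
} catch (err) {
  console.error(reportError(err));
}
```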