5 Ways to Write a Simple Lambda Function in Your Cloud App: A DevTools Comparison Guide
TL;DR
As the saying goes, all roads lead to Rome... and in tech, there are 5 roads that lead to a simple Lambda function 🤩
Let's compare 5 DevTools for getting there.
Introduction
As developers try to bridge the gap between development and DevOps, I think it helps to compare programming languages and DevTools.
Let's start with a simple function that uploads a text file to a bucket in our cloud app.
Then we'll walk through several ways to accomplish it.
Note: In cloud development, managing permissions and bucket identities, packaging the runtime code, and juggling separate files for infrastructure and runtime all add complexity to the development process.
 
Let's dive into some code!
1. Wing
After installing Wing, let's create a file:
main.w
bring cloud;
let bucket = new cloud.Bucket();
new cloud.Function(inflight () => {
  bucket.put("hello.txt", "world!");
});
Let's break down what's happening in the code above.
bring cloud is Wing's import syntax.
Creating a cloud Bucket:
let bucket = new cloud.Bucket(); initializes a new cloud bucket instance. Behind the scenes, Wing provisions a new bucket in your cloud provider's environment. This bucket is used to store and retrieve data.
Creating a cloud Function:
The new cloud.Function(inflight () => { ... }); statement defines a new cloud function. When triggered, this function executes the operations defined in its body.
bucket.put("hello.txt", "world!"); uploads a file named hello.txt with the content world! to the bucket created earlier.
Compile and deploy to AWS
- wing compile --platform tf-aws main.w
- terraform apply
That's it. Wing takes care of the complexity (permissions, getting the bucket's identity into the runtime code, packaging the runtime code into a bucket, having to write multiple files for infrastructure and runtime), and so on.
Not to mention that it generates IaC (Terraform or CloudFormation), plus JavaScript you can deploy with your existing tools.
 
But during development you can use the local simulator to get instant feedback and shorten iteration cycles.
Wing even has a playground you can try right in your browser!
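For a quick local iteration loop, a minimal sketch might look like this (command names follow the Wing CLI at the time of writing; check wing --help if they have changed):
# Run the app in the local simulator and open the Wing Console
wing it main.w

# When you're ready for the cloud, compile to Terraform for AWS and deploy
wing compile --platform tf-aws main.w
terraform apply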
2. Pulumi
Step 1: Initialize a new Pulumi project
mkdir pulumi-s3-lambda-ts
cd pulumi-s3-lambda-ts
pulumi new aws-typescript
Step 2: Write the code that uploads a text file to S3.
This will be your project structure:
pulumi-s3-lambda-ts/
├─ src/
│  ├─ index.ts              # Pulumi infrastructure code
│  └─ lambda/
│     └─ index.ts           # Lambda function code to upload a file to S3
├─ tsconfig.json            # TypeScript configuration
└─ package.json             # Node.js project file with dependencies
Add this code to index.ts:
import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";
// Create an AWS S3 bucket
const bucket = new aws.s3.Bucket("myBucket", {
    acl: "private",
});
// IAM role for the Lambda function
const lambdaRole = new aws.iam.Role("lambdaRole", {
    assumeRolePolicy: JSON.stringify({
        Version: "2012-10-17",
        Statement: [{
            Action: "sts:AssumeRole",
            Principal: {
                Service: "lambda.amazonaws.com",
            },
            Effect: "Allow",
            Sid: "",
        }],
    }),
});
// Attach the AWSLambdaBasicExecutionRole policy
new aws.iam.RolePolicyAttachment("lambdaExecutionRole", {
    role: lambdaRole,
    policyArn: aws.iam.ManagedPolicy.AWSLambdaBasicExecutionRole,
});
// Policy to allow Lambda function to access the S3 bucket
const lambdaS3Policy = new aws.iam.Policy("lambdaS3Policy", {
    policy: bucket.arn.apply(arn => JSON.stringify({
        Version: "2012-10-17",
        Statement: [{
            Action: ["s3:PutObject", "s3:GetObject"],
            Resource: `${arn}/*`,
            Effect: "Allow",
        }],
    })),
});
// Attach policy to Lambda role
new aws.iam.RolePolicyAttachment("lambdaS3PolicyAttachment", {
    role: lambdaRole,
    policyArn: lambdaS3Policy.arn,
});
// Lambda function
const lambda = new aws.lambda.Function("myLambda", {
    code: new pulumi.asset.AssetArchive({
        ".": new pulumi.asset.FileArchive("./src/lambda"),
    }),
    runtime: aws.lambda.Runtime.NodeJS12dX,
    role: lambdaRole.arn,
    handler: "index.handler",
    environment: {
        variables: {
            BUCKET_NAME: bucket.bucket,
        },
    },
});
export const bucketName = bucket.id;
export const lambdaArn = lambda.arn;
Next, create the lambda/ directory with an index.ts file for the Lambda function code:
import { S3 } from "aws-sdk";
const s3 = new S3();
export const handler = async (): Promise<void> => {
    const bucketName = process.env.BUCKET_NAME || "";
    const fileName = "example.txt";
    const content = "Hello, Pulumi!";
    const params = {
        Bucket: bucketName,
        Key: fileName,
        Body: content,
    };
    try {
        await s3.putObject(params).promise();
        console.log(`File uploaded successfully at https://${bucketName}.s3.amazonaws.com/${fileName}`);
    } catch (err) {
        console.log(err);
    }
};
Step 3: TypeScript configuration (tsconfig.json)
{
  "compilerOptions": {
    "target": "ES2018",
    "module": "CommonJS",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*.ts"],
  "exclude": ["node_modules", "**/*.spec.ts"]
}
When the Pulumi project is created, a Pulumi.yaml file is generated automatically:
name: s3-lambda-pulumi
runtime: nodejs
description: A simple example that uploads a file to an S3 bucket using a Lambda function
template:
  config:
    aws:region:
      description: The AWS region to deploy into
      default: us-west-2
Deploy with Pulumi
Make sure the lambda directory contains the index.js file (Lambda executes JavaScript, so the TypeScript handler has to be compiled first). Then run the following command to deploy the infrastructure: pulumi up
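A minimal sketch of that sequence (the tsc flags are illustrative, and the asset archive in index.ts points at ./src/lambda):
# Compile the handler to JavaScript inside src/lambda (emits src/lambda/index.js)
npx tsc src/lambda/index.ts --outDir src/lambda --module commonjs --target ES2018

# Deploy the stack
pulumi up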
3. AWS-CDK
Step 1: Initialize a new CDK project
mkdir cdk-s3-lambda
cd cdk-s3-lambda
cdk init app --language=typescript
Step 2: Add dependencies
npm install @aws-cdk/aws-lambda @aws-cdk/aws-s3
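Note: this install and the imports below use the CDK v1 per-service packages. If you are on CDK v2, the equivalent is a single consolidated package (shown here as a sketch; the imports then come from aws-cdk-lib):
# CDK v2 alternative to the per-service packages
npm install aws-cdk-lib constructs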
Step 3: Define the AWS resources in CDK
File: lib/cdk-s3-lambda-stack.ts (the stack file generated by cdk init)
import * as cdk from '@aws-cdk/core';
import * as lambda from '@aws-cdk/aws-lambda';
import * as s3 from '@aws-cdk/aws-s3';
export class CdkS3LambdaStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);
    // Create the S3 bucket
    const bucket = new s3.Bucket(this, 'MyBucket', {
      removalPolicy: cdk.RemovalPolicy.DESTROY, // NOT recommended for production code
    });
    // Define the Lambda function
    const lambdaFunction = new lambda.Function(this, 'MyLambda', {
      runtime: lambda.Runtime.NODEJS_14_X, // Define the runtime
      handler: 'index.handler', // Specifies the entry point
      code: lambda.Code.fromAsset('lambda'), // Directory containing your Lambda code
      environment: {
        BUCKET_NAME: bucket.bucketName,
      },
    });
    // Grant the Lambda function permissions to write to the S3 bucket
    bucket.grantWrite(lambdaFunction);
  }
}
Step 4: Lambda function code
Create the same file structure as in the Pulumi project above, with the handler in lambda/index.ts:
import { S3 } from 'aws-sdk';
const s3 = new S3();
exports.handler = async (event: any) => {
  const bucketName = process.env.BUCKET_NAME;
  const fileName = 'uploaded_file.txt';
  const content = 'Hello, CDK! This file was uploaded by a Lambda function!';
  try {
    const result = await s3.putObject({
      Bucket: bucketName!,
      Key: fileName,
      Body: content,
    }).promise();
    console.log(`File uploaded successfully: ${result}`);
    return {
      statusCode: 200,
      body: `File uploaded successfully: ${fileName}`,
    };
  } catch (error) {
    console.log(error);
    return {
      statusCode: 500,
      body: `Failed to upload file: ${error}`,
    };
  }
};
Deploy the CDK stack
First, compile your TypeScript code: npm run build
Then deploy your CDK stack to AWS: cdk deploy
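Putting it together, the deploy sequence might look like the following sketch (the extra tsc step is only needed if your build does not already compile the lambda directory, and cdk bootstrap is a one-time step per account and region):
# Compile the stack and make sure the handler exists as lambda/index.js
npm run build
npx tsc lambda/index.ts --outDir lambda --module commonjs --target ES2018

# One-time bootstrap of the target account/region, then deploy
cdk bootstrap
cdk deploy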
4. CDK for Terraform (CDKTF)
Step 1: Initialize a new CDKTF project
mkdir cdktf-s3-lambda-ts
cd cdktf-s3-lambda-ts
Then, initialize a new CDKTF project using TypeScript:
cdktf init --template="typescript" --local
Step 2: Install the AWS provider and add dependencies
npm install @cdktf/provider-aws
Step 3: Define the infrastructure
Edit main.ts to define the S3 bucket and the Lambda function:
import { Construct } from 'constructs';
import { App, TerraformStack } from 'cdktf';
import { AwsProvider, s3, lambdafunction, iam } from '@cdktf/provider-aws';
class MyStack extends TerraformStack {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    new AwsProvider(this, 'aws', { region: 'us-west-2' });
    // S3 bucket
    const bucket = new s3.S3Bucket(this, 'lambdaBucket', {
      bucketPrefix: 'cdktf-lambda-'
    });
    // IAM role for Lambda
    const role = new iam.IamRole(this, 'lambdaRole', {
      name: 'lambda_execution_role',
      assumeRolePolicy: JSON.stringify({
        Version: '2012-10-17',
        Statement: [{
          Action: 'sts:AssumeRole',
          Principal: { Service: 'lambda.amazonaws.com' },
          Effect: 'Allow',
        }],
      }),
    });
    new iam.IamRolePolicyAttachment(this, 'lambdaPolicy', {
      role: role.name,
      policyArn: 'arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole',
    });
    const lambdaFunction = new lambdafunction.LambdaFunction(this, 'MyLambda', {
      functionName: 'myLambdaFunction',
      handler: 'index.handler',
      role: role.arn,
      runtime: 'nodejs14.x',
      s3Bucket: bucket.bucket, // Assuming the Lambda code is uploaded to this bucket
      s3Key: 'lambda.zip', // Assuming the Lambda code zip file is named lambda.zip
      environment: {
        variables: {
          BUCKET_NAME: bucket.bucket,
        },
      },
    });
    // Grant the Lambda function permissions to write to the S3 bucket
    new s3.S3BucketPolicy(this, 'BucketPolicy', {
      bucket: bucket.bucket,
      policy: JSON.stringify({
        Version: '2012-10-17',
        Statement: [{
          Action: 's3:*',
          Resource: `${bucket.arn}/*`, // bucket.arn is a Terraform token resolved at deploy time
          Effect: 'Allow',
          Principal: {
            AWS: role.arn,
          },
        }],
      }),
    });
  }
}
const app = new App();
new MyStack(app, 'cdktf-s3-lambda-ts');
app.synth();
Step 4: Lambda function code
The Lambda function code should be written in TypeScript and compiled to JavaScript, since AWS Lambda natively executes JavaScript. Here is an example index.ts for the Lambda function that you will need to compile and zip:
import { S3 } from 'aws-sdk';
const s3 = new S3();
exports.handler = async () => {
  const bucketName = process.env.BUCKET_NAME || '';
  const content = 'Hello, CDKTF!';
  const params = {
    Bucket: bucketName,
    Key: `upload-${Date.now()}.txt`,
    Body: content,
  };
  try {
    await s3.putObject(params).promise();
    return { statusCode: 200, body: 'File uploaded successfully' };
  } catch (err) {
    console.error(err);
    return { statusCode: 500, body: 'Failed to upload file' };
  }
};
You need to compile this TypeScript code to JavaScript, zip it, and upload it to the S3 bucket, either manually or with a script.
Make sure the s3Key in the LambdaFunction resource points to the correct zip file in the bucket.
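A possible packaging sequence, as a sketch only (paths and the bucket name are illustrative, and the AWS CLI is assumed to be configured):
# Compile the handler and package it as lambda.zip
npx tsc index.ts --outDir dist --module commonjs --target ES2018
cd dist && zip ../lambda.zip index.js && cd ..

# Upload the archive to the bucket referenced by s3Bucket / s3Key
# (the bucket must already exist before the function can be created)
aws s3 cp lambda.zip s3://<your-lambda-code-bucket>/lambda.zip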
Compile and deploy your CDKTF project
Compile your project with npm run build
Generate the Terraform configuration files:
Run the cdktf synth command. This executes your CDKTF app, which generates the Terraform configuration files (*.tf.json) in the cdktf.out directory.
Deploy your infrastructure:
cdktf deploy
5. Terraform
Step 1: Terraform setup
Define your AWS provider and S3 bucket.
Create a file named main.tf with the following content:
provider "aws" {
  region = "us-west-2" # Choose your AWS region
}
resource "aws_s3_bucket" "lambda_bucket" {
  bucket_prefix = "lambda-upload-bucket-"
  acl           = "private"
}
resource "aws_iam_role" "lambda_execution_role" {
  name = "lambda_execution_role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = "sts:AssumeRole"
        Effect = "Allow"
        Principal = {
          Service = "lambda.amazonaws.com"
        }
      },
    ]
  })
}
resource "aws_iam_policy" "lambda_s3_policy" {
  name        = "lambda_s3_policy"
  description = "IAM policy for Lambda to access S3"
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action   = ["s3:PutObject", "s3:GetObject"],
        Effect   = "Allow",
        Resource = "${aws_s3_bucket.lambda_bucket.arn}/*"
      },
    ]
  })
}
resource "aws_iam_role_policy_attachment" "lambda_s3_access" {
  role       = aws_iam_role.lambda_execution_role.name
  policy_arn = aws_iam_policy.lambda_s3_policy.arn
}
resource "aws_lambda_function" "uploader_lambda" {
  function_name = "S3Uploader"
  s3_bucket = "YOUR_DEPLOYMENT_BUCKET_NAME" # Set your deployment bucket name here
  s3_key    = "lambda.zip" # Upload your ZIP file to S3 and set its key here
  handler = "index.handler"
  role    = aws_iam_role.lambda_execution_role.arn
  runtime = "nodejs14.x"
  environment {
    variables = {
      BUCKET_NAME = aws_s3_bucket.lambda_bucket.bucket
    }
  }
}
Step 2: Lambda function code (TypeScript)
Create a TypeScript file index.ts for the Lambda function:
import { S3 } from 'aws-sdk';
const s3 = new S3();
exports.handler = async (event: any) => {
  const bucketName = process.env.BUCKET_NAME;
  const fileName = `uploaded-${Date.now()}.txt`;
  const content = 'Hello, Terraform and AWS Lambda!';
  try {
    await s3.putObject({
      Bucket: bucketName!,
      Key: fileName,
      Body: content,
    }).promise();
    console.log('Upload successful');
    return {
      statusCode: 200,
      body: JSON.stringify({ message: 'Upload successful' }),
    };
  } catch (error) {
    console.error('Upload failed:', error);
    return {
      statusCode: 500,
      body: JSON.stringify({ message: 'Upload failed' }),
    };
  }
};
Deploy
Finally, after uploading your Lambda function code to the specified S3 bucket, run terraform apply.
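End to end, the order of operations might look like this sketch (YOUR_DEPLOYMENT_BUCKET_NAME is the placeholder from main.tf; the compile and zip step mirrors the CDKTF section):
# Package the compiled handler as lambda.zip
npx tsc index.ts --outDir dist --module commonjs --target ES2018
cd dist && zip ../lambda.zip index.js && cd ..

# Upload the archive to the deployment bucket referenced by s3_bucket / s3_key
aws s3 cp lambda.zip s3://YOUR_DEPLOYMENT_BUCKET_NAME/lambda.zip

# Provision everything
terraform init
terraform apply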
I hope you enjoyed this comparison of five simple ways to write a function that uploads a text file to a bucket in our cloud app.
As you can see, most of them get pretty complicated, except for one.
That's a wrap!
 
Source: https://dev.to/winglang/5-ways-to-write-a-simple-function-in-your-cloud-app-1jgl
If you're interested in Wing and like how we simplify the cloud development process, please give us a ⭐ star.