Setup log for a production-grade Node.js application using S3 and AWS Lambda.
Initializing the Project after Git Clone
Tags: typescript, S3, utility
Use utility functions to download files from S3. These are plain functions that accept parameters such as bucket and fileKey and download the file.
import aws from 'aws-sdk';
import fs from 'fs';

const s3 = new aws.S3();

export class S3Utils {
  static downloadFileFromS3 = function (bucket: string, fileKey: string, filePath: string) {
    console.log('downloading', bucket, fileKey, filePath);
    return new Promise((resolve, reject) => {
      const file = fs.createWriteStream(filePath),
        stream = s3
          .getObject({
            Bucket: bucket,
            Key: fileKey
          })
          .createReadStream();
      stream.on('error', reject);
      file.on('error', reject);
      file.on('finish', () => {
        console.log('downloaded', bucket, fileKey);
        resolve(filePath);
      });
      stream.pipe(file);
    });
  };

  static uploadFileToS3 = function (
    bucket: string,
    fileKey: string,
    filePath: string,
    contentType: string
  ) {
    console.log('uploading', bucket, fileKey, filePath);
    return s3
      .upload({
        Bucket: bucket,
        Key: fileKey,
        Body: fs.createReadStream(filePath),
        ACL: 'private',
        ContentType: contentType
      })
      .promise();
  };

  // fs.unlink is callback-based, so awaiting it does nothing;
  // use the promise API so the cleanup actually completes before returning
  static cleanDownloadedFile = async (filePath: string) => {
    await fs.promises.unlink(filePath);
    console.log('temporary file deleted');
  };
}
Related links:
- https://github.com/Mohammad-Faisal/aws-sam-lambda-trigger-s3-upload
- https://www.mohammadfaisal.dev/
- https://56faisal.medium.com/4d16b08fede1
Place the Lambda Handler Under the src Folder
Tags: typescript, lambda, src, handler
In this lambda, the event object will be an S3CreateEvent because we want this function to get triggered when a new file is uploaded to a particular S3 bucket.
Note: We are using this function to read .xlsx and .csv files. If you want to support other file types, add them to the supportedFormats array.
import { S3CreateEvent, Context } from 'aws-lambda';
import path from 'path';
import os from 'os';
import { S3Utils } from '../utils/s3-utils';

const supportedFormats = ['csv', 'xlsx'];

function extractS3Info(event: S3CreateEvent) {
  const eventRecord = event.Records && event.Records[0];
  const bucket = eventRecord.s3.bucket.name;
  const { key } = eventRecord.s3.object;
  return { bucket, key };
}

// Use an ES module export to match the import syntax above
export const handler = async (event: S3CreateEvent, context: Context) => {
  try {
    const s3Info = extractS3Info(event);
    const id = context.awsRequestId;
    const extension = path.extname(s3Info.key).toLowerCase();
    const tempFile = path.join(os.tmpdir(), id + extension);
    const extensionWithoutDot = extension.slice(1);
    console.log('converting', s3Info.bucket, ':', s3Info.key, 'using', tempFile);
    if (!supportedFormats.includes(extensionWithoutDot)) {
      throw new Error(`unsupported file type ${extension}`);
    }
    await S3Utils.downloadFileFromS3(s3Info.bucket, s3Info.key, tempFile);
    // do whatever you want to do with the file
    // contentType = `image/${extensionWithoutDot}`;
    // await S3Utils.uploadFileToS3(OUTPUT_BUCKET, s3Info.key, tempFile, contentType);
    await S3Utils.cleanDownloadedFile(tempFile);
  } catch (err) {
    // JSON.stringify on an Error yields '{}', so log the error object itself
    console.log(err);
  }
};
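Before deploying, the key-extraction logic can be sanity-checked locally with a hand-written event. The record below is a minimal sketch of the S3 put-notification shape the handler receives; the bucket and key values are made up for illustration:

```typescript
// Minimal local sanity check for the key-extraction logic above.
// The event shape mirrors an S3 ObjectCreated notification; only the
// fields the extractor reads are included, and the values are dummies.
const sampleEvent = {
  Records: [
    {
      s3: {
        bucket: { name: 'dummy-bucket' },
        object: { key: 'reports/sales.xlsx' },
      },
    },
  ],
};

function extractS3Info(event: typeof sampleEvent) {
  const eventRecord = event.Records && event.Records[0];
  const bucket = eventRecord.s3.bucket.name;
  const { key } = eventRecord.s3.object;
  return { bucket, key };
}

const { bucket, key } = extractS3Info(sampleEvent);
console.log(bucket, key); // dummy-bucket reports/sales.xlsx
```

The same JSON shape, saved to a file, also works as the payload for sam local invoke when testing the full handler offline.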
Related links:
- https://github.com/Mohammad-Faisal/aws-sam-lambda-trigger-s3-upload
- https://www.mohammadfaisal.dev/
- https://56faisal.medium.com/4d16b08fede1
Update the Template.yaml File
Tags: amazon-web-services, handler, lambda, amazon-cloudformation
We added three things: an S3 bucket for file uploads, a Lambda that will be triggered when a new file is uploaded, and a policy that allows the Lambda to read the contents of the S3 bucket. We also attach the policy to the function's role.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: AWS SAM demo Lambda react to file uploaded to s3

Globals:
  Function:
    Runtime: nodejs14.x
    Timeout: 30

Resources:
  S3BucketToRespond:
    Type: AWS::S3::Bucket
    Properties:
      # S3 bucket names must be globally unique and lowercase
      BucketName: 'dummy-bucket'

  LambdaThatWillReactToFileUpload:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/s3-file-upload-reaction
      Handler: app.handler
      Events:
        FileUploadedToS3:
          Type: S3
          Properties:
            Bucket: !Ref S3BucketToRespond
            Events: s3:ObjectCreated:*

  ReadS3BucketPolicy:
    Type: AWS::IAM::Policy
    Properties:
      PolicyName: ReadS3BucketPolicy
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Action:
              - s3:GetObject
            Resource:
              - !Sub '${S3BucketToRespond.Arn}/*'
      Roles:
        - !Ref LambdaThatWillReactToFileUploadRole
Related links:
- https://github.com/Mohammad-Faisal/aws-sam-lambda-trigger-s3-upload
- https://www.mohammadfaisal.dev/
- https://56faisal.medium.com/4d16b08fede1
Commands To Deploy via AWS SAM
Tags: aws-lambda, aws-sam, shell, amazon-web-services, aws-sam-cli
We added the policy as a separate resource to avoid a circular dependency between the bucket and the function's role. And that's it. Now you can deploy the code to your region.
To deploy your application, you first configure your environment. You can find the details in the links below.
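A minimal deploy sequence with the SAM CLI looks like this; the stack name is whatever you choose during the guided deploy (the logs command later in this log assumes sam-lambda-trigger-s3-file-upload):

```shell
# compile and package the function code and template
sam build

# first deployment: prompts for stack name, region, and IAM capabilities,
# and saves the answers to samconfig.toml
sam deploy --guided

# subsequent deployments reuse samconfig.toml
sam deploy
```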
Related links:
- https://github.com/Mohammad-Faisal/aws-sam-lambda-trigger-s3-upload
- https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html
- https://www.mohammadfaisal.dev/
- https://56faisal.medium.com/4d16b08fede1
Test the Lambda Using the S3 Console
Tags: logs, lambda, upload
To test whether it works, go to the AWS S3 console, upload a file, and check the logs.
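If you prefer the CLI over the console, you can upload a test file like this; substitute your actual bucket name, and note the extension must be one of the supported formats:

```shell
# upload a sample CSV to the bucket created by the template
aws s3 cp ./sample.csv s3://<your-bucket-name>/sample.csv
```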
To check the logs from your local machine, use this:
sam logs -n LambdaThatWillReactToFileUpload --stack-name sam-lambda-trigger-s3-file-upload --tail
Related links:
- https://github.com/Mohammad-Faisal/aws-sam-lambda-trigger-s3-upload
- https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html
- https://www.mohammadfaisal.dev/
- https://56faisal.medium.com/4d16b08fede1