
Create Thumbnail Worker With S3 and Lambda: Test the Trigger


Introduction

In this post, we are going to see how we can use existing AWS infrastructure to create thumbnails for uploaded images. We are going to use an S3 trigger to execute a Lambda function that can do whatever we want. In our case, it will create a thumbnail so that we can save some bandwidth when only a smaller version of an image is required on the frontend.

So let’s see the pre-requisites:

  • Some familiarity with S3
  • Some familiarity with Lambda
  • Some familiarity with the AWS CLI

Although the tutorial can be followed through AWS Console, I’ve decided to explore AWS CLI as it can do things quicker in some cases.

This post is crudely divided into 2 sections:

  1. Test the trigger (you are here)
  2. Make the thumbnail

Overview

In this section, we’ll check whether the trigger is invoked properly when a file is uploaded to a bucket, and do the initial setup needed to achieve that.

Let’s move to the next subsection.

Create a source bucket

This is the only bucket we’ll create in Test the trigger section. Later on in Make the thumbnail section, we will create another bucket.

$ aws s3api create-bucket --bucket source-bucket-sntshk --region ap-south-1 --create-bucket-configuration LocationConstraint=ap-south-1
{
    "Location": "http://source-bucket-sntshk.s3.amazonaws.com/"
}

Let’s go over the above command’s flags one by one:

  1. aws is the command line interface to the Amazon Web Services API. Each service in the AWS toolbelt is available as a subcommand of the aws CLI.
  2. s3api is the subcommand that exposes S3 API operations, such as creating and deleting buckets, among many others.
  3. create-bucket is the sub-command/action to s3api. The rest of the flags provide options to this create-bucket.
  4. --region ap-south-1 provides the region for the bucket. Note: Please take note of this region. We’ll have to create the Lambda function in the same region for this experiment to work.
  5. --create-bucket-configuration LocationConstraint=ap-south-1 is just another required flag in my case. You may not have to pass this flag if you are creating the bucket in the us-east-1 region.

Once the command finishes executing, it will spit out some output. Ignore this for now.

Upload a test object to bucket

After we are done with creating the bucket, let’s put a test object in the bucket before proceeding to the Lambda trigger part.

$ aws s3api put-object --bucket source-bucket-sntshk --key TestImage.jpg --body TestImage.jpg

{
    "ETag": "\"7ad9294573ba03dd91442e0c4aba4ad7\""
}

Explanation of the command:

  1. put-object is used to upload an individual object to the given bucket.
  2. --bucket source-bucket-sntshk provides the name of the bucket.
  3. --key TestImage.jpg is the prospective name of the file on the S3 side.
  4. --body TestImage.jpg is the name of the file on our local system.

On successful upload, you will see an “ETag” in the output, as shown above.
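As a side note, for a simple single-part, unencrypted upload like this one, the ETag S3 returns is the MD5 hex digest of the object body, so we can sanity-check an upload locally. This is a sketch under that assumption; multipart uploads and SSE-KMS encryption produce a different ETag format.

```python
import hashlib

# Compute the MD5 hex digest of a local file. For a plain put-object
# upload (no multipart, no SSE-KMS), this matches the ETag S3 returns
# (minus the surrounding quotes).
def local_etag(path):
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()
```

Comparing `local_etag("TestImage.jpg")` with the quoted ETag in the response above is a quick way to confirm the object arrived intact.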

Create the Lambda function

While creating a Lambda function is fairly simple, its pre-requisites are not. That is because we need to give our function a role so that it is able to communicate with other AWS services. This is one case where I refrain from using the AWS CLI; things are quicker with the console here.

Step 1: Go to Lambda console page and push the button called Create function.
Step 2: Choose the option Use a blueprint. This is the reason we are not using the CLI, as the CLI does not give a way to use blueprints. With this radio button checked, search for s3 and choose the Python option (you can choose Node.js if you like, all the best). Now click Configure.
Step 3: Type in a function name in the Function name section. And in Execution role section, choose Create a new role with basic Lambda permission. You may choose Use an existing role if you are following this tutorial for the second time.

Basically, what a role does is give our Lambda function permission to perform certain actions on other AWS services. We can have different policies attached inside our roles. The role used in our context does these two things:

  1. One is to read the new files that have been uploaded to the S3 bucket.
  2. Another is for analytical purposes: uploading logs to CloudWatch.
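As an illustration (the bucket name is our source bucket, but the log resource ARN is left as a wildcard placeholder; the console-generated role scopes it more tightly), a role combining those two permissions might contain statements like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::source-bucket-sntshk/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```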
Basic information for Lambda

Step 4: In the next section, we see settings related to S3 triggers. Choose the bucket we created in the earlier section. For event type, I am selecting All object create events. The Prefix and Suffix options we leave blank, but they are pretty self-explanatory and you can use them if you like.
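To get an intuition for Prefix and Suffix, here is a hypothetical sketch of the matching they perform (S3 evaluates these server-side; the function name below is made up for illustration):

```python
# A key fires the trigger only if it starts with the configured prefix
# AND ends with the configured suffix (empty strings match everything).
def matches_filter(key, prefix="", suffix=""):
    return key.startswith(prefix) and key.endswith(suffix)

print(matches_filter("uploads/TestImage.jpg", prefix="uploads/", suffix=".jpg"))  # True
print(matches_filter("TestImage.png", prefix="uploads/", suffix=".jpg"))          # False
```

So a Suffix of `.jpg` would, for example, keep the function from firing on non-image uploads.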

S3 trigger options for lambda

Before pushing the button Create function, you can review the sample code that is provided by the blueprint.

import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')


def lambda_handler(event, context):
    #print("Received event: " + json.dumps(event, indent=2))

    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e

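One detail worth noting in the blueprint: object keys arrive URL-encoded in the event record, which is why the handler runs them through unquote_plus. A minimal sketch with a trimmed-down event containing only the fields the handler reads:

```python
import urllib.parse

# Trimmed-down s3-put event: just the fields the blueprint handler reads.
event = {
    "Records": [
        {"s3": {"bucket": {"name": "source-bucket-sntshk"},
                "object": {"key": "test%2Fkey"}}}
    ]
}

bucket = event["Records"][0]["s3"]["bucket"]["name"]
key = urllib.parse.unquote_plus(event["Records"][0]["s3"]["object"]["key"],
                                encoding="utf-8")
print(bucket)  # source-bucket-sntshk
print(key)     # test/key -- '%2F' decodes to '/'
```

Without the decode step, a key like `test%2Fkey` would be passed to get_object verbatim and the lookup would fail.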
Step 5: Press the Create function button.

Test the function manually

Time to test if our function is really working before we proceed to the next step. Follow the steps below:

Note: The GUI might change in the future. But I suppose there will always be an option to “Test” as long as there is an option to view code in the console.

  1. Navigate to your Lambda function. Scroll down until you see tabs like “Code”, “Test”, “Monitor”, etc. You should be on the “Code” tab by default. That’s the tab we need to be in.
  2. You should see an orange button named “Test” with a caret symbol (downward-facing arrowhead). On clicking it, you should see an option called Configure test event; click that.
  3. You will be presented with a dialog. Select Create test event, then in Event template, select s3-put, and then give this test a name.
Configure test event for Lambda
  4. Now in the event JSON, replace the occurrence of example-bucket with the name of the bucket we created in the Create a source bucket section. Replace the occurrence of test%2Fkey with the key of the test object we uploaded in the Upload a test object to bucket section. Then push the orange “Create” button.

Here is a little gif showing steps 2 to 4.

Configure test event

When tested, you should see Status: Succeeded on the upper right of the execution results.

Test the function by uploading something

Please note that if you got an AccessDenied error in the previous section, you should see the Extra: What the AccessDenied section below.

Now that our test has started to pass, we should proceed to check whether the function is invoked when a new object is uploaded to the bucket. Let’s do that.

$ aws s3api put-object --bucket source-bucket-sntshk --key TestImage.png --body .\Screenshot_20201108_004543.png
{
    "ETag": "\"a71acffffa4934a9a0a556365a54b2a7\""
}

This command adds a new object to the bucket.

You can see the function invocation in the “Monitor” tab of our Lambda function. If you don’t see it, try adjusting the time window to something like the last 15 minutes.

You can also verify that the function executed by pressing the View logs in CloudWatch button and checking the latest log to see whether it threw an error or executed successfully.

Extra: What the AccessDenied

Question: Hi Santosh, I don’t see a success message; instead I see an AccessDenied message in the Execution results. Why?

Answer: Make sure the role which Lambda created has two policies attached to it: one for writing logs to CloudWatch and another for reading from the S3 bucket. As part of the exercise, I deliberately delete the S3 policy and re-attach it in the gif below.

Attach role with S3 GetObject permission

Conclusion

In this post, we checked how we can execute a piece of code when something happens on S3. There are a plethora of services which can be used with Lambda.

In the next post, we will see how we can create a thumbnail based on the work we did in this post.


WRITTEN BY
Santosh Kumar
Fullstack Developer at Method Studios