
DynamoDB Streams with Lambda

DynamoDB Streams is a feature in Amazon DynamoDB that gives you a real-time stream of the data changes that are made to your DynamoDB tables. You can use this feature to build applications that react to those changes, such as updating caches or sending notifications.

Of course, you can also use DynamoDB Streams to trigger downstream processes and AWS Lambda functions. Lambda, by definition, is a serverless compute service that runs your code in response to events and automatically manages the compute resources for you.

You can write your Lambda code in Node.js, Python, Java, or C# to process the stream records and take appropriate actions. The main benefit of integrating DynamoDB Streams with Lambda is that Lambda lets you run backend services or application code without administering any servers.

How to Use the AWS DynamoDB Streams with Lambda

While it is possible to create a Lambda function that consumes the events from a DynamoDB stream, the process can be quite involved, particularly on your first attempt. The following steps will help:

Step 1: Ensure that Your System Meets the Prerequisites

This procedure will only be successful if you know the basic Lambda operations and processes. Thus, your first step should be to ensure that your understanding of Lambda is above average.

The second prerequisite that is worth considering is confirming your system’s AWS version. You can use the following command:

aws --version

The result for the provided command should look like this:

aws-cli/2.x.x Python/3.x.x Linux/4.x.x-xxx-std botocore/2.x.x

The given sample response contains the installed version of the AWS CLI (aws-cli/2.x.x), the Python version (Python/3.x.x), and the operating system (Linux/4.x.x-xxx-std). The final part of the response defines the Botocore library version that your AWS CLI runs on (botocore/2.x.x).


Step 2: Create an Execution Role

The next step is to create an execution role using the AWS CLI. An execution role is an AWS Identity and Access Management (IAM) role that an AWS service assumes to perform tasks on your behalf. It grants your Lambda function access to the AWS resources that it needs along the way.

You can create a role using the following command:

aws iam create-role \
    --role-name LambdaDynamoDBExecutionRole \
    --assume-role-policy-document file://assume-role-policy.json \
    --description "AWSLambdaDynamoDBExecutionRole"

The previous command creates the role with the AWS CLI; the assume-role-policy.json file that it references contains a trust policy that allows the Lambda service (lambda.amazonaws.com) to assume the role. You can also use the AWS Management Console to create a role. Once you are at the IAM console, open the Roles page and click the Create role button.

Proceed to enter the following:

  • Trusted Entity: Lambda
  • Role Name: lambda-dynamodb-role
  • Permissions: AWSLambdaDynamoDBExecutionRole

You can also perform this step with Python by first installing Boto3, the AWS SDK for Python:

pip install boto3
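If you prefer to stay in Python for this step, the following is a minimal Boto3 sketch that creates the same role. It embeds the trust policy inline instead of reading it from assume-role-policy.json and then attaches the AWSLambdaDynamoDBExecutionRole managed policy; the role name and description shown here are only examples:

import json
import boto3

iam = boto3.client('iam')

# Trust policy that lets the Lambda service assume the role
assume_role_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

# Create the execution role
role = iam.create_role(
    RoleName='LambdaDynamoDBExecutionRole',
    AssumeRolePolicyDocument=json.dumps(assume_role_policy),
    Description='Execution role for the DynamoDB stream Lambda function'
)

# Attach the managed policy that grants stream-read and logging permissions
iam.attach_role_policy(
    RoleName='LambdaDynamoDBExecutionRole',
    PolicyArn='arn:aws:iam::aws:policy/service-role/AWSLambdaDynamoDBExecutionRole'
)

print(role['Role']['Arn'])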

Step 3: Enable the DynamoDB Streams on Your Table

You need to enable DynamoDB Streams on your table. For this illustration, we use Boto3, the AWS SDK for Python. The following code will help:

import boto3

# Connect to the DynamoDB service
dynamodb = boto3.client('dynamodb')

# Enable DynamoDB Streams on the 'mytable' table
response = dynamodb.update_table(
    TableName='mytable',
    StreamSpecification={
        'StreamEnabled': True,
        'StreamViewType': 'NEW_AND_OLD_IMAGES'
    }
)

# Check the response to make sure the stream was enabled successfully
if response['TableDescription']['StreamSpecification']['StreamEnabled']:
    print("DynamoDB stream enabled successfully")
else:
    print("Error enabling DynamoDB stream")

This code enables the DynamoDB stream on the “mytable” table and streams both the new and old images of items whenever a change occurs. You can choose to stream only the new images by setting the StreamViewType to “NEW_IMAGE”.

Notably, running this code may not enable the stream on your table immediately; the process may take some time. You can use the describe_table method to check the status of the stream.
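The following is a minimal Boto3 sketch of that check (assuming the same “mytable” table); it reads the stream settings and the latest stream ARN from the table description:

import boto3

dynamodb = boto3.client('dynamodb')

# Read the current table description
table = dynamodb.describe_table(TableName='mytable')['Table']

# The stream settings and the stream ARN appear once the stream is active
stream_spec = table.get('StreamSpecification', {})
print("Stream enabled:", stream_spec.get('StreamEnabled', False))
print("Latest stream ARN:", table.get('LatestStreamArn'))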

Step 4: Create the Lambda Function

The next step is creating a Lambda function that triggers the DynamoDB stream. The following steps should help:

  • Open the AWS Lambda console and click the “Create function” button. On the “Create function” page, choose “Author from scratch” and enter a name for your function. You also need to select your runtime at this point. We selected Python for this illustration.
  • Under “Choose or create an execution role”, select “Create a new role with basic Lambda permissions” to create an IAM role with the necessary permissions for your Lambda function.
  • Click the “Create function” button to create your Lambda function.
  • On the “Configuration” page for your function, scroll down to the “Designer” section and click the “Add trigger” tab.
  • In the “Trigger configuration” box that appears, select “DynamoDB” from the “Trigger” dropdown menu.
  • Select the DynamoDB table that you want to use to trigger the function. Once done, choose whether you want the function triggered on all updates to the table or only on specific updates (such as changes to particular attributes).
  • Click the “Add” button to create the trigger.
  • In the “Function code” editor, write the Python code for your function. You can use the event object that is passed to your function to access the data that triggered it.
  • Click the “Save” button to have the function saved.

That’s all it takes to create the Lambda function! It is now triggered whenever there are updates to the specified DynamoDB table.
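If you prefer to script the trigger instead of clicking through the console, the following is a minimal Boto3 sketch that subscribes the function to the table’s stream with create_event_source_mapping; the “myfunction” name and the batch settings are only examples:

import boto3

dynamodb = boto3.client('dynamodb')
lambda_client = boto3.client('lambda')

# Look up the ARN of the table's latest stream
stream_arn = dynamodb.describe_table(TableName='mytable')['Table']['LatestStreamArn']

# Subscribe the Lambda function to the stream
mapping = lambda_client.create_event_source_mapping(
    EventSourceArn=stream_arn,
    FunctionName='myfunction',
    StartingPosition='LATEST',  # only process changes made from now on
    BatchSize=100               # up to 100 stream records per invocation
)

print("Event source mapping UUID:", mapping['UUID'])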

Here’s an example of a simple Python function that the DynamoDB stream can trigger:

def lambda_handler(event, context):
    # Each invocation receives a batch of stream records
    for record in event['Records']:
        print(record['dynamodb']['NewImage'])

This function iterates through the records in the event object and prints out the new image of each item in the DynamoDB table that triggered the function.
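Note that not every record carries a NewImage: REMOVE events only carry an OldImage (which is available here because the stream uses NEW_AND_OLD_IMAGES). The following is a slightly fuller sketch that branches on each record’s eventName; the print statements stand in for whatever your application actually does:

def lambda_handler(event, context):
    for record in event['Records']:
        event_name = record['eventName']   # INSERT, MODIFY, or REMOVE
        data = record['dynamodb']

        if event_name == 'INSERT':
            print("New item:", data['NewImage'])
        elif event_name == 'MODIFY':
            print("Before:", data['OldImage'])
            print("After:", data['NewImage'])
        elif event_name == 'REMOVE':
            print("Deleted item:", data['OldImage'])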

Step 5: Test the Lambda Function

To test a Lambda function that a DynamoDB stream can trigger, you can use the boto3 library to access the DynamoDB API and the invoke method of the Lambda client to trigger the function.

Here’s an example of how to do it:

import boto3

# Connect to the DynamoDB service
dynamodb = boto3.client('dynamodb')

# Connect to the Lambda service
lambda_client = boto3.client('lambda')

# Insert an item into the 'mytable' table
response = dynamodb.put_item(
    TableName='mytable',
    Item={
        'id': {'N': '123'},
        'name': {'S': 'Joel Austin'},
        'age': {'N': '34'}
    }
)

# Check the response to make sure the item was inserted successfully
if response['ResponseMetadata']['HTTPStatusCode'] == 200:
    print("Item inserted successfully")
else:
    print("Error inserting item")

# Trigger the Lambda function that is subscribed to the 'mytable' table
response = lambda_client.invoke(
    FunctionName='myfunction',
    InvocationType='Event',
    Payload='{"Records": [{"dynamodb": {"NewImage": {"id": {"N": "123"}, "name": {"S": "Joel Austin"}, "age": {"N": "34"}}}}]}'
)

# Check the response to make sure the function was triggered successfully
if response['StatusCode'] == 202:
    print("Lambda function triggered successfully")
else:
    print("Error triggering Lambda function")

This code first inserts an item into the mytable table and then triggers the myfunction Lambda function by sending a sample event payload to the function using the invoke method. The event payload simulates a DynamoDB stream event that includes the new image of the item that was just inserted.

You can then check the logs of the Lambda function to see if it successfully triggered and processed the event data.
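The following is a minimal Boto3 sketch of that check; it assumes the function is named “myfunction”, so its output lands in the default “/aws/lambda/myfunction” log group:

import boto3

logs = boto3.client('logs')

# Lambda writes to a log group named after the function by default
log_group = '/aws/lambda/myfunction'

# Fetch the most recent log events for the function
events = logs.filter_log_events(logGroupName=log_group, limit=20)

for event in events['events']:
    print(event['message'].rstrip())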

Conclusion

It is vital to note that a Lambda function triggered by a DynamoDB stream can be invoked more than once for the same stream record. The primary reason behind this is that stream records are delivered at least once, so the Lambda function may process the same record multiple times. It is crucial to design your Lambda function to handle this case correctly, that is, to make it idempotent.
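As one possible sketch of such handling, the following assumes a hypothetical “processed-events” DynamoDB table that is used purely for deduplication: each record’s eventID is written with a conditional put, and records that were already seen are skipped:

import boto3

dynamodb = boto3.client('dynamodb')

def lambda_handler(event, context):
    for record in event['Records']:
        try:
            # Remember the record's eventID; fail if it was already processed
            dynamodb.put_item(
                TableName='processed-events',
                Item={'eventID': {'S': record['eventID']}},
                ConditionExpression='attribute_not_exists(eventID)'
            )
        except dynamodb.exceptions.ConditionalCheckFailedException:
            continue  # duplicate delivery, skip it

        # Safe to process the record exactly once here
        print(record['dynamodb'].get('NewImage'))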


