AWS Database Services

Lambda Integration With Amazon DynamoDB Streams

Updated June 21, 2025

Objective & Use Cases

The goal is to create an automated workflow that captures item-level changes (creations, updates, and deletions) in a DynamoDB table and processes those changes with a Lambda function.

Common Use Cases:

  • Data Replication: Replicating data from a DynamoDB table to another destination, such as a data warehouse like Amazon Redshift, a search service like Amazon OpenSearch, or simply another DynamoDB table.
  • Real-time Notifications: Sending notifications (e.g., via SNS or SES) when specific events occur, like a new user signing up or an order status changing.
  • Data Aggregation: Performing real-time calculations. For example, updating a dashboard every time a new sale is recorded in the table.
  • Archiving: Moving old or infrequently accessed items from a DynamoDB table to a lower-cost storage solution like Amazon S3 Glacier.

Architecture Overview

The architecture is simple, serverless, and event-driven.

Workflow Diagram:

[DynamoDB Table Change (INSERT/MODIFY/REMOVE)] -> [DynamoDB Stream] -> [Event Source Mapping] -> [AWS Lambda Function]

  1. Change: An item in the DynamoDB table is created, updated, or deleted.
  2. Capture: DynamoDB Streams, if enabled on the table, captures this change as a time-ordered record.
  3. Poll & Invoke: An event source mapping, a poller managed by the Lambda service, continuously polls the stream for new records. When records are found, it invokes the target Lambda function synchronously with a batch of records.
  4. Process: The Lambda function executes its logic, processing the batch of records it received as its event payload.

Core Components

  • Amazon DynamoDB Table: The source database where your application data resides.
  • DynamoDB Stream: An ordered flow of information about changes to items in a DynamoDB table. This is a feature you must enable on your table. It is the "change log" for your table.
  • AWS Lambda Function: The serverless compute function that contains your business logic for processing the changes.
  • Event Source Mapping: The "glue" that connects the DynamoDB Stream to your Lambda function. It is responsible for reading records from the stream and invoking your function.

Key Configuration Details

1. Enabling DynamoDB Streams

This is the most critical setup step on the DynamoDB side. When enabling a stream on your table, you must choose a Stream View Type, which determines what information is written to the stream record.

  • KEYS_ONLY: Only the key attributes (partition key and sort key) of the modified item. Use when you only need to know which item changed and can look up the rest of the data yourself.
  • NEW_IMAGE: The entire item as it appears after it was modified. Perfect for replicating new or updated data to another system.
  • OLD_IMAGE: The entire item as it appeared before it was modified. Useful for archiving or for comparing state before and after a change.
  • NEW_AND_OLD_IMAGES: Both the new and the old images of the item. The most flexible option, ideal for determining which specific attributes changed.

2. Lambda Trigger Configuration

When you set the DynamoDB Stream as a trigger for your Lambda function, you are creating an Event Source Mapping. Key settings include:

  • Batch Size: The maximum number of records read from the stream per invocation (the default is 100). The function receives a payload containing between one record and the configured batch size.
  • Error Handling: Configure what happens if your function fails. You can limit retry attempts, split failed batches, or send details about failed batches to an on-failure destination such as an SQS queue or SNS topic.

Understanding the Lambda Event Payload

Your Lambda function receives an event object that contains an array of records. Your code needs to loop through this array to process each change individually.

Example Event Structure:

{
  "Records": [
    {
      "eventID": "1",
      "eventName": "INSERT",
      "dynamodb": {
        "Keys": {
          "Id": { "S": "101" }
        },
        "NewImage": {
          "Id": { "S": "101" },
          "Message": { "S": "New item!" }
        },
        "StreamViewType": "NEW_AND_OLD_IMAGES",
        ...
      }
    },
    {
      "eventID": "2",
      "eventName": "MODIFY",
      ...
    }
  ]
}
  • Records: An array of change records.
  • eventName: The type of modification (INSERT, MODIFY, or REMOVE).
  • dynamodb: An object containing the change data.
  • Keys: The key attributes of the modified item.
  • NewImage: The item's state after the change (available if the view type is NEW_IMAGE or NEW_AND_OLD_IMAGES).
  • OldImage: The item's state before the change (available if the view type is OLD_IMAGE or NEW_AND_OLD_IMAGES).
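A minimal handler sketch that loops over the batch; the attribute names (Id, Message) match the example payload above, and note that values arrive in DynamoDB's typed format (e.g. {"S": "101"}):

```python
def lambda_handler(event, context):
    """Process a batch of DynamoDB Streams records."""
    processed = 0
    for record in event["Records"]:
        event_name = record["eventName"]  # INSERT | MODIFY | REMOVE
        keys = record["dynamodb"]["Keys"]
        item_id = keys["Id"]["S"]  # unwrap the typed attribute value
        if event_name == "INSERT":
            new_image = record["dynamodb"]["NewImage"]
            print(f"New item {item_id}: {new_image.get('Message', {}).get('S')}")
        elif event_name == "MODIFY":
            print(f"Item {item_id} was updated")
        elif event_name == "REMOVE":
            print(f"Item {item_id} was deleted")
        processed += 1
    return {"batchSize": processed}
```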
IAM Role Permissions

The execution role for your Lambda function needs permissions to read from DynamoDB Streams. The AWS-managed policy AWSLambdaDynamoDBExecutionRole provides the necessary permissions:

  • dynamodb:DescribeStream
  • dynamodb:GetRecords
  • dynamodb:GetShardIterator
  • dynamodb:ListStreams

It also includes permissions to write logs to CloudWatch.
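An inline policy roughly equivalent to the managed policy would look like the following (for production use, scope Resource down to your specific stream ARN rather than "*"):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:DescribeStream",
        "dynamodb:GetRecords",
        "dynamodb:GetShardIterator",
        "dynamodb:ListStreams"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```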