
DynamoDB Transactions with AWS Step Functions

Published at: 10/19/2024
Categories: aws, dynamodb, stepfunctions
Author: viktorardelean

1. Overview

In this post, we'll explore how to leverage direct service integrations in AWS Step Functions to build a workflow for executing DynamoDB transactions. AWS Step Functions is an excellent tool for breaking down business workflows into individual steps, promoting separation of concerns and encapsulating discrete actions within each step.

2. The Use Case

Let's consider a real-life scenario to demonstrate this approach. We start with an object stored in Amazon S3. When the object is deleted, we must remove the corresponding item from two DynamoDB tables. To ensure data consistency, we'll wrap both delete operations inside a transaction, preventing a situation where one delete succeeds while the other fails.

Here's an example of an Amazon EventBridge rule that captures all delete events from a specific Amazon S3 bucket:

{
  "detail": {
    "bucket": {
      "name": ["bucket_name"]
    },
    "deletion-type": ["Permanently Deleted"]
  },
  "detail-type": ["Object Deleted"],
  "source": ["aws.s3"]
}
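
For this rule to receive anything, the source bucket must have EventBridge notifications enabled. A minimal boto3 sketch, assuming the same placeholder bucket name as in the rule above:

import boto3

s3 = boto3.client('s3')

# Turning on the EventBridge configuration sends all S3 events for this bucket
# to the default event bus, where the rule above can match the delete events.
s3.put_bucket_notification_configuration(
    Bucket='bucket_name',
    NotificationConfiguration={'EventBridgeConfiguration': {}}
)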

3. The Traditional Lambda Solution

A classic design would involve enabling Amazon S3 event notifications to Amazon EventBridge. Once the event reaches the event bus, an Amazon EventBridge rule would trigger an AWS Lambda function to execute the DynamoDB transaction. Here's what this architecture might look like:

[Architecture diagram: S3 delete event → EventBridge rule → AWS Lambda function → DynamoDB transaction]

Let's examine a potential AWS Lambda function implementation:

import json

import boto3

dynamodb = boto3.client('dynamodb')


def lambda_handler(event, context):
    # Extract the S3 object key from the EventBridge event
    s3_key = event["detail"]["object"]["key"]

    # Construct the DynamoDB delete operations
    delete_item_in_table_A = {
        'Delete': {
            'TableName': "ddb_table_a",
            'Key': {
                'YourPrimaryKeyAttributeName': {'S': s3_key}
            }
        }
    }

    delete_item_in_table_B = {
        'Delete': {
            'TableName': "ddb_table_b",
            'Key': {
                'YourPrimaryKeyAttributeName': {'S': s3_key}
            }
        }
    }

    # Perform a DynamoDB transaction to ensure both deletes happen together
    dynamodb.transact_write_items(
        TransactItems=[delete_item_in_table_A, delete_item_in_table_B]
    )

    return {
        'statusCode': 200,
        'body': json.dumps('Delete transaction succeeded')
    }

While this solution is concise and functional, it has some drawbacks. The AWS Lambda function merely receives an event and performs an API call, without any substantial business logic. It acts as a simple connector in the data pipeline, executing a couple of delete operations.

AWS Lambda functions that primarily connect services or transform events, without complex business logic, can often be replaced with direct service integrations. Let's explore this alternative approach.

4. The AWS Step Functions Solution

Amazon EventBridge supports AWS Step Functions as a target, allowing us to replace the AWS Lambda function with an AWS Step Functions workflow. This approach enables us to build a no-code solution using DynamoDB direct service integrations within the workflow.
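
Wiring the existing EventBridge rule to the state machine is a one-time configuration step. Here's a minimal boto3 sketch, assuming the rule already exists and that an IAM role allowing states:StartExecution has been created; the rule name, ARNs, and target Id below are placeholders:

import boto3

events = boto3.client('events')

# Point the existing rule at the Step Functions state machine.
# EventBridge assumes RoleArn in order to start the workflow execution.
events.put_targets(
    Rule='s3-object-deleted-rule',
    Targets=[
        {
            'Id': 'delete-transaction-workflow',
            'Arn': 'arn:aws:states:us-east-1:123456789012:stateMachine:DeleteTransaction',
            'RoleArn': 'arn:aws:iam::123456789012:role/eventbridge-invoke-stepfunctions'
        }
    ]
)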

Here's an overview of this solution:

[Architecture diagram: S3 delete event → EventBridge rule → AWS Step Functions workflow → DynamoDB transaction]

Now, let's dive into the implementation of the AWS Step Functions workflow.

4.1 AWS Step Functions Service Integrations

Since our use case doesn't involve complex logic, we can build our workflow using service integrations. AWS Step Functions offers two types of integrations with other AWS services:

  1. AWS SDK Integrations: These cover over 200 services and are similar to the API calls you'd make in an AWS Lambda function.
  2. Optimized Integrations: Available for about 20 core services, these add convenience by automatically converting output to JSON and handling asynchronous tasks, eliminating the need for custom polling mechanisms.

We have two AWS SDK integration options, DynamoDB:TransactWriteItems and DynamoDB:ExecuteTransaction, to wrap both delete statements in a transaction.

Let's explore both implementations.

4.2 DynamoDB:TransactWriteItems

The TransactWriteItems API allows for synchronous, atomic write operations across multiple items. It supports up to 100 actions (Put, Update, Delete, or ConditionCheck) in different tables within the same AWS account and region. This API doesn't allow read operations within the transaction and ensures all actions either succeed or fail together.

Using this approach, we need just a single step in our workflow:

[Workflow diagram: a single DeleteTransaction task state calling DynamoDB TransactWriteItems]

Here's the workflow's ASL (Amazon States Language) definition for the transaction:

{
  "Comment": "DynamoDB Transaction for Delete Statements",
  "StartAt": "DeleteTransaction",
  "States": {
    "DeleteTransaction": {
      "Type": "Task",
      "Parameters": {
        "TransactItems": [
          {
            "Delete": {
              "TableName": "ddb_table_a",
              "Key": {
                "PK": {
                  "S.$": "$.detail.object.key"
                }
              }
            }
          },
          {
            "Delete": {
              "TableName": "ddb_table_b",
              "Key": {
                "PK": {
                  "S.$": "$.detail.object.key"
                }
              }
            }
          }
        ]
      },
      "Resource": "arn:aws:states:::aws-sdk:dynamodb:transactWriteItems",
      "End": true
    }
  }
}
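
To try the workflow without waiting for a real S3 deletion, you can start an execution with a hand-crafted event that mimics the EventBridge payload. A minimal boto3 sketch; the state machine ARN and object key are placeholders:

import json

import boto3

sfn = boto3.client('stepfunctions')

# A trimmed-down copy of the "Object Deleted" event; only the field referenced
# by the workflow ($.detail.object.key) is required here.
test_event = {'detail': {'object': {'key': 'some/object/key'}}}

sfn.start_execution(
    stateMachineArn='arn:aws:states:us-east-1:123456789012:stateMachine:DeleteTransaction',
    input=json.dumps(test_event)
)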

4.3 DynamoDB:ExecuteTransaction

The ExecuteTransaction API allows for transactional reads or writes using PartiQL statements. A transaction can contain up to 100 statements, but all operations must be either reads or writes, not a mix. It ensures that all statements in the transaction are executed atomically.

Our workflow would look like this:

[Workflow diagram: a single ExecuteTransaction task state running the PartiQL delete statements]

Defining the delete statements for this approach can be tricky: each ? placeholder in a statement is bound positionally to an entry in that statement's Parameters list. Here's an example implementation:

{
  "Comment": "DynamoDB Transaction for Delete Statements",
  "StartAt": "ExecuteTransaction",
  "States": {
    "ExecuteTransaction": {
      "Type": "Task",
      "Parameters": {
        "TransactStatements": [
          {
            "Statement": "DELETE FROM \"ddb_table_a\" WHERE PK = ?",
            "Parameters": [
              {
                "S.$": "$.detail.object.key"
              }
            ]
          },
          {
            "Statement": "DELETE FROM \"ddt_table_b\" WHERE PK = ?",
            "Parameters": [
              {
                "S.$": "$.detail.object.key"
              }
            ]
          }
        ]
      },
      "Resource": "arn:aws:states:::aws-sdk:dynamodb:executeTransaction",
      "End": true
    }
  }
}
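
The same statements can be exercised directly against the DynamoDB API, which is a convenient way to validate the PartiQL syntax before putting it in the workflow. A minimal boto3 sketch using the table names and key attribute from the examples above:

import boto3

dynamodb = boto3.client('dynamodb')

s3_key = 'some/object/key'

# Each ? in a statement is replaced, in order, by the matching entry
# in that statement's Parameters list.
dynamodb.execute_transaction(
    TransactStatements=[
        {
            'Statement': 'DELETE FROM "ddb_table_a" WHERE PK = ?',
            'Parameters': [{'S': s3_key}]
        },
        {
            'Statement': 'DELETE FROM "ddb_table_b" WHERE PK = ?',
            'Parameters': [{'S': s3_key}]
        }
    ]
)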

In both cases, we only need to define a single state with the delete statements. This approach eliminates the need for maintaining code, dealing with cold starts, or managing runtime updates.
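
One thing the workflow still needs is an execution role that is allowed to perform the DynamoDB operations. Below is a rough sketch of a policy document, written as a Python dict so it can be passed to IAM via json.dumps; the table ARNs are placeholders, and the exact actions required (item-level actions for TransactWriteItems, PartiQL actions for ExecuteTransaction) should be verified against the DynamoDB documentation for your setup:

import json

# Sketch of the permissions the workflow's execution role likely needs for the
# two variants shown above.
policy_document = {
    'Version': '2012-10-17',
    'Statement': [
        {
            # TransactWriteItems is authorized through the underlying item
            # actions -- here, the two deletes.
            'Effect': 'Allow',
            'Action': ['dynamodb:DeleteItem'],
            'Resource': [
                'arn:aws:dynamodb:us-east-1:123456789012:table/ddb_table_a',
                'arn:aws:dynamodb:us-east-1:123456789012:table/ddb_table_b'
            ]
        },
        {
            # ExecuteTransaction with PartiQL DELETE statements uses the
            # PartiQL-specific action instead.
            'Effect': 'Allow',
            'Action': ['dynamodb:PartiQLDelete'],
            'Resource': [
                'arn:aws:dynamodb:us-east-1:123456789012:table/ddb_table_a',
                'arn:aws:dynamodb:us-east-1:123456789012:table/ddb_table_b'
            ]
        }
    ]
}

print(json.dumps(policy_document, indent=2))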

5. Cost Considerations

When it comes to costs, choosing the Express workflow type is the most economical option for such a simple and fast workflow.

An often overlooked fact is that AWS Step Functions Express workflows bill memory in 64 MB increments, so a minimal workflow like this one is charged for only 64 MB, versus the 128 MB minimum memory of an AWS Lambda function.

To illustrate, let's consider a scenario with 3 million invocations, each lasting 100ms:

  • A 128 MB AWS Lambda function in us-east-1 would cost $0.51
  • A 64 MB Express Step Function in the same region would cost $0.31

This demonstrates the potential for significant cost savings when using AWS Step Functions for simple workflows.

6. Conclusion

By leveraging AWS Step Functions with direct service integrations, we can create efficient, no-code solutions for executing DynamoDB transactions. This approach offers several advantages over traditional AWS Lambda-based implementations:

  1. Simplified architecture with reduced code maintenance
  2. Improved separation of concerns
  3. Potential cost savings, especially for simple, high-volume workflows
  4. Elimination of cold starts and runtime management

As we've seen, both TransactWriteItems and ExecuteTransaction APIs provide robust options for implementing transactional operations in DynamoDB through AWS Step Functions. The choice between them depends on your specific use case and whether you need to include read operations in your transactions.

By adopting this serverless, no-code approach, you can simplify your data pipeline processes and focus more on building scalable, maintainable applications in AWS.
