Vibhor

Reputation: 3

Inconsistent data in DynamoDB after concurrent operations in Lambda?

I am invoking five Lambdas from a Kafka consumer, passing the messages as event data. These Lambdas concurrently read, update, write, and delete data in DynamoDB, and this causes inconsistency: one Lambda reads an item from the table and updates it, but in the meantime another Lambda reads the older version of the item and writes its own update, overwriting the first. Is there a way to deal with this?

Upvotes: 0

Views: 971

Answers (2)

Shawn

Reputation: 9402

You can use strongly consistent reads to ensure that reads always reflect all successful writes that occur prior to the read.

DynamoDB uses eventually consistent reads, unless you specify otherwise. Read operations (such as GetItem, Query, and Scan) provide a ConsistentRead parameter. If you set this parameter to true, DynamoDB uses strongly consistent reads during the operation.
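As a minimal illustration (in Python, with boto3's parameter shape, since the question doesn't name a language — the table and key names here are hypothetical), a strongly consistent read is just a matter of setting that flag on the request:

```python
def consistent_get_item_params(table_name, key):
    """Build GetItem parameters for a strongly consistent read.
    Pass the result to boto3's dynamodb_client.get_item(**params)."""
    return {
        "TableName": table_name,
        "Key": key,
        # Default is False (eventually consistent); True forces the read
        # to reflect all writes that succeeded before it.
        "ConsistentRead": True,
    }

params = consistent_get_item_params("Orders", {"OrderId": {"S": "order-123"}})
```

Note that strongly consistent reads consume more read capacity and are not supported on global secondary indexes, so check the linked page before switching everything over.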

See https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.ReadConsistency.html for a full discussion, including some disadvantages to consider.

Also see Amazon - DynamoDB Strong consistent reads, Are they latest and how?

Upvotes: 1

jarmod

Reputation: 78583

DynamoDB supports conditional updates; see the announcement and a Java example.
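As a rough sketch of the idea (in Python rather than Java, with hypothetical table and attribute names): a conditional update attaches a `ConditionExpression` that must hold for the write to go through, otherwise DynamoDB rejects it with a `ConditionalCheckFailedException` instead of silently overwriting:

```python
def conditional_update_params(table_name, key, new_status, expected_status):
    """Build UpdateItem parameters whose write only succeeds when the
    item's current status still matches what this writer last read."""
    return {
        "TableName": table_name,
        "Key": key,
        "UpdateExpression": "SET #s = :new",
        # The write is applied only if this condition holds on the server.
        "ConditionExpression": "#s = :expected",
        # 'status' is a DynamoDB reserved word, hence the placeholder.
        "ExpressionAttributeNames": {"#s": "status"},
        "ExpressionAttributeValues": {
            ":new": {"S": new_status},
            ":expected": {"S": expected_status},
        },
    }

update = conditional_update_params(
    "Orders", {"OrderId": {"S": "order-123"}}, "SHIPPED", "PACKED"
)
```

A Lambda that catches the `ConditionalCheckFailedException` can then re-read the item and decide whether to retry, which is exactly the read-modify-write race described in the question.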

If you use Java, also look at the optimistic locking support in the AWS SDK.
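The SDK's optimistic locking is built on the same conditional-update mechanism with a version attribute. A language-neutral sketch of the pattern (shown in Python with hypothetical names: a `version` number attribute and a `payload` attribute):

```python
def versioned_update_params(table_name, key, new_payload, read_version):
    """Optimistic locking: the write succeeds only if no other writer
    bumped 'version' since this reader fetched the item. On a
    ConditionalCheckFailedException, re-read and retry."""
    return {
        "TableName": table_name,
        "Key": key,
        # Write the new payload and increment the version in one step.
        "UpdateExpression": "SET #p = :p, #v = :next",
        # Fail the write if someone else already changed the version.
        "ConditionExpression": "#v = :read",
        "ExpressionAttributeNames": {"#p": "payload", "#v": "version"},
        "ExpressionAttributeValues": {
            ":p": {"S": new_payload},
            ":read": {"N": str(read_version)},
            ":next": {"N": str(read_version + 1)},
        },
    }

locked = versioned_update_params(
    "Orders", {"OrderId": {"S": "order-123"}}, "updated-data", 7
)
```

With this in place, the scenario in the question resolves itself: the second Lambda's write fails the version check, and it must re-read the current item before retrying, so no update is silently lost.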

Upvotes: 1