Mike Yinger

Reputation: 23

NoSQL Workbench commit to DynamoDB incomplete

I use NoSQL Workbench to design single-table models: I create the test data as JSON in VS Code, import it into the Workbench, and iterate on the PK, SK, and non-key attributes over the course of weeks as I build up the key design. Once I like how it behaves, I create the table in DynamoDB and start coding queries. Twice in the last few years I've run into a situation where, after a while, the Workbench commit to DynamoDB fails to write the entire test data set. I'm sure the test data is valid JSON. The Workbench pops a dialog stating how many items were not committed, and if I delete the new table and re-commit, that number sometimes changes. The DynamoDB CloudWatch log is not helpful; next time it happens, I'll revisit this post and add the log message.

This is what I do when that happens: I use the Workbench to create a new model and export it as JSON, then I snip the "NonKeyAttributes" and "TableData" sections from the old model, add them to the new JSON model, re-import it, and the commit to DynamoDB then takes all the test data.
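For reference, this is roughly the merge I do by hand. It's only a minimal sketch: the file and table names are placeholders, and it assumes each table sits under a top-level "DataModel" list in the Workbench export, which may differ by Workbench version.

    import json

    OLD_MODEL = "old_model.json"   # export whose NonKeyAttributes/TableData I want to keep
    NEW_MODEL = "new_model.json"   # freshly created model exported from Workbench
    TABLE_NAME = "MyTable"         # placeholder table name

    with open(OLD_MODEL) as f:
        old = json.load(f)
    with open(NEW_MODEL) as f:
        new = json.load(f)

    def find_table(model, name):
        # Locate a table entry by name inside the export's DataModel list.
        return next(t for t in model["DataModel"] if t["TableName"] == name)

    src = find_table(old, TABLE_NAME)
    dst = find_table(new, TABLE_NAME)

    # Carry over only the attribute definitions and the test items.
    dst["NonKeyAttributes"] = src["NonKeyAttributes"]
    dst["TableData"] = src["TableData"]

    with open("merged_model.json", "w") as f:
        json.dump(new, f, indent=2)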

Since these are single-table designs, the PK and SK are highly overloaded, but I don't see why that would corrupt the model's metadata and force me to build a new model.

If anyone can shed some light on what might be happening, or suggest an easier way to fix the problem, I'd be grateful.

Upvotes: 0

Views: 345

Answers (1)

Leeroy Hannigan

Reputation: 19893

I believe this happens because NoSQL Workbench creates the table with 1 WCU, which causes throttling when migrating large amounts of data, resulting in some items being dropped.
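If you want to confirm that, or work around it until a fix ships, you could check the capacity Workbench provisioned and give the table more write headroom (or switch it to on-demand) before re-loading the missing items. A rough boto3 sketch; the table name and capacity numbers are placeholders:

    import boto3

    dynamodb = boto3.client("dynamodb")

    # See what capacity Workbench actually provisioned on the committed table.
    table = dynamodb.describe_table(TableName="MyTable")["Table"]
    print(table["ProvisionedThroughput"])  # WriteCapacityUnits == 1 would explain the throttling

    # Give the table enough write headroom for the data load...
    dynamodb.update_table(
        TableName="MyTable",
        ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 50},
    )

    # ...or remove the WCU ceiling entirely by switching to on-demand billing:
    # dynamodb.update_table(TableName="MyTable", BillingMode="PAY_PER_REQUEST")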

The team is aware of this edge case, and a fix should land in an upcoming release.

Upvotes: 1
