Reputation: 442
I have a multi-tenant product offering backed by DynamoDB, so all of our web requests are served from DynamoDB. I have a use case where I need to move a tenant's data from one region to another; this would be a background process.
How do I ensure the background process does not hog the database? Otherwise it will degrade the user experience and may bring the website down.
Is there a way to provision dedicated read and write capacity for the background process?
Upvotes: 0
Views: 81
Reputation: 8107
Sorry, but the answer from Kirk is not a good way to save $$$. DynamoDB has a TTL feature: instead of deleting an item yourself, you set its TTL attribute so it expires. (Note that expired items can still show up in reads until they are actually removed, so filter on the TTL attribute in your queries.)
The item is not DELETED immediately! It is scheduled for deletion later, and those background TTL deletions consume none of your precious write capacity units, unlike deleting items one by one yourself. That saves you real money, and it is exactly what the feature is for.
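To make the "expire instead of delete" idea concrete, here is a minimal sketch that builds the UpdateItem parameters which set a TTL attribute to the current epoch time. The table name `TenantData`, the key attributes `pk`/`sk`, and the TTL attribute name `expires_at` are assumptions for illustration, not from the question:

```python
import time

# Assumed names for illustration only; your table will differ.
TTL_ATTRIBUTE = "expires_at"

def build_expire_update(pk: str, sk: str, delay_seconds: int = 0) -> dict:
    """Build kwargs for DynamoDB UpdateItem that mark an item as expired.

    Instead of a costly DeleteItem per row, set the TTL attribute to an
    epoch-seconds timestamp and let DynamoDB delete the item for free later.
    """
    expiry = int(time.time()) + delay_seconds  # TTL values must be epoch seconds
    return {
        "TableName": "TenantData",
        "Key": {"pk": {"S": pk}, "sk": {"S": sk}},
        "UpdateExpression": f"SET {TTL_ATTRIBUTE} = :t",
        "ExpressionAttributeValues": {":t": {"N": str(expiry)}},
    }

# These kwargs would be passed to boto3's client("dynamodb").update_item(**kwargs)
params = build_expire_update("tenant#42", "order#1001")
```

Remember that TTL must first be enabled on the table (pointing at the same attribute name) for the background deletion to happen.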
Upvotes: 1
Reputation: 4845
You cannot dedicate read and write capacity units to specific processes, but you could temporarily switch the table to on-demand capacity mode for the move, then switch it back to provisioned mode once the move is complete. You can make this capacity mode switch once every 24 hours. In on-demand mode you are much less likely to be throttled in this situation.
That said, without knowing your tables' current capacity mode and capacity settings, it is difficult to make concrete recommendations.
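Whichever capacity mode you use, you can also keep the background migration from hogging the table by capping its own request rate. Below is a minimal, self-contained sketch of a token-bucket limiter; the rate of 50 units/second and the loop body are placeholders (in practice each unit would be one DynamoDB call, e.g. a scan page or a `batch_write_item`):

```python
import time

class TokenBucket:
    """Allow at most `rate` units of work per second (simple token bucket)."""

    def __init__(self, rate: float):
        self.rate = rate          # tokens added per second
        self.capacity = rate      # burst size: at most one second's budget
        self.tokens = self.capacity
        self.last = time.monotonic()

    def consume(self, units: float = 1.0) -> None:
        """Block until `units` tokens are available, then spend them."""
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= units:
                self.tokens -= units
                return
            time.sleep((units - self.tokens) / self.rate)

# Example: cap the background copier at 50 work units per second.
bucket = TokenBucket(rate=50)
processed = 0
for _ in range(100):   # stand-in for iterating over scan pages
    bucket.consume(1)  # wait for budget before each read/write
    processed += 1     # stand-in for copying one item to the target region
```

This keeps the migration's consumption bounded regardless of table size, so the foreground web traffic always has headroom.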
Upvotes: 1