Reputation: 11
By existing tables I mean tables that were created daily throughout 2020; I want to save them to S3 and then delete them from DynamoDB. I came across a few ways this can be achieved and wanted to know which one would work most efficiently in my case.
Note: These tables are around 1 GB in size and contain approximately 100,000+ items each.
Please share your take on these, and do mention any other way of doing it if there is one. Thanks! I appreciate your time.
UPDATE: I used Export to S3, but it saved the table as four different JSON files. Will it be easy to import it back into DynamoDB from S3 when required?
Upvotes: 1
Views: 2084
Reputation: 8885
The Export to S3 option seems like the best choice if you really need the data to be in S3. As with almost everything in AWS, if you can do it in the console, you can do it via an API call. For this one, you need the ExportTableToPointInTime API, which is export_table_to_point_in_time in boto3.
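For example, here's a minimal boto3 sketch of that call; the table ARN, bucket name, and key prefix are placeholders:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# The export requires point-in-time recovery to be enabled on the table.
response = dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/my-table-2020-01-01",
    S3Bucket="my-export-bucket",
    S3Prefix="dynamodb-exports/my-table-2020-01-01",
    ExportFormat="DYNAMODB_JSON",  # "ION" is the other supported format
)

# The export runs asynchronously; poll describe_export until it finishes.
export_arn = response["ExportDescription"]["ExportArn"]
status = dynamodb.describe_export(ExportArn=export_arn)["ExportDescription"]["ExportStatus"]
print(status)  # IN_PROGRESS, then COMPLETED or FAILED
```

A nice side effect: the managed export doesn't consume read capacity on the table, which matters for tables of this size.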
Upvotes: 1
Reputation: 4616
If you have point-in-time recovery enabled for your table, you can use this API to export it directly: https://docs.aws.amazon.com/cli/latest/reference/dynamodb/export-table-to-point-in-time.html
To do the export manually you'll need to use the DynamoDB Scan operation. You can use the boto3 paginator https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#DynamoDB.Paginator.Scan to handle the pagination. A rough sketch of that approach is below.
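Here's a sketch, assuming one S3 object per scanned page; the table and bucket names are placeholders:

```python
import json
import boto3

dynamodb = boto3.client("dynamodb")
s3 = boto3.client("s3")

# The paginator follows LastEvaluatedKey automatically; each page
# returns at most 1 MB of data.
paginator = dynamodb.get_paginator("scan")

for page_number, page in enumerate(paginator.paginate(TableName="my-table-2020-01-01")):
    # Items arrive in DynamoDB's attribute-value JSON, e.g. {"id": {"S": "abc"}}.
    # Writing one object per page keeps memory usage flat for a ~1 GB table.
    s3.put_object(
        Bucket="my-export-bucket",
        Key=f"manual-exports/my-table-2020-01-01/page-{page_number:05d}.json",
        Body=json.dumps(page["Items"]),
    )
```

Unlike the managed export, a full Scan does consume read capacity, so keep that in mind if the table is still serving live traffic.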
Upvotes: 0