Reputation: 31672
I don't want to use Data Pipeline because it is too cumbersome. I also have a relatively small table, so it would be heavy-handed to use Data Pipeline for it — I could run a script locally to do the import because it's so small.
I used the fully managed Export to S3 feature to export a table to a bucket (in a different account): https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DataExport.html
What are my options now for importing that to a new table in the other account?
If there isn't a managed feature for this, does AWS provide a canned script I can point at an S3 folder and give the name of the new table I want to create from it?
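For a table this small, a local script is indeed feasible. A minimal sketch of reading the export's data files, assuming the default DYNAMODB_JSON export format (gzipped files, one `{"Item": {...}}` JSON object per line) and hypothetical local file paths:

```python
import gzip
import json

def iter_export_items(paths):
    """Yield DynamoDB-JSON items from downloaded export data files.

    Assumes each file is gzipped with one JSON object per line of the
    form {"Item": {...}}, as produced by a DYNAMODB_JSON table export.
    """
    for path in paths:
        with gzip.open(path, "rt") as f:
            for line in f:
                yield json.loads(line)["Item"]

# Usage against the new table (requires boto3 and target-account credentials;
# file paths and table name below are placeholders):
#   import boto3
#   client = boto3.client("dynamodb")
#   for item in iter_export_items(["part-000.json.gz"]):
#       client.put_item(TableName="NewTable", Item=item)
```

The items are already in DynamoDB's typed JSON format, so they can be passed to `put_item` as-is.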
Upvotes: 0
Views: 4963
Reputation: 4865
As of 18 August 2022, this feature is now built into DynamoDB and you need no other services or code.
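A minimal boto3 sketch of starting such an import; the bucket, prefix, key schema, and table name are placeholder assumptions, not values from the question:

```python
def build_import_request(bucket, prefix, table_name):
    """Assemble arguments for DynamoDB's import_table API.

    Assumes the export used the default DYNAMODB_JSON format with GZIP
    compression, and a table with a single string partition key "pk".
    """
    return {
        "S3BucketSource": {"S3Bucket": bucket, "S3KeyPrefix": prefix},
        "InputFormat": "DYNAMODB_JSON",
        "InputCompressionType": "GZIP",
        "TableCreationParameters": {
            "TableName": table_name,
            "AttributeDefinitions": [
                {"AttributeName": "pk", "AttributeType": "S"},
            ],
            "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
            "BillingMode": "PAY_PER_REQUEST",
        },
    }

# Kick off the import from the target account (requires boto3 and
# credentials with read access to the source bucket):
#   import boto3
#   client = boto3.client("dynamodb")
#   resp = client.import_table(
#       **build_import_request("my-bucket", "AWSDynamoDB/data/", "NewTable"))
#   print(resp["ImportTableDescription"]["ImportArn"])
```

The import creates the new table itself, so the table must not already exist.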
Upvotes: 3
Reputation: 397
Amazon DynamoDB now supports importing data from S3 buckets into new DynamoDB tables, as announced in this blog post.
The steps for importing data from S3 can be found in the developer guide.
Upvotes: 3
Reputation: 16805
You may want to create an AWS Data Pipeline, which already has a recommended template for importing DynamoDB data from S3.
This is the closest you can get to a "managed feature" where you select the S3 prefix and the DynamoDB table.
Upvotes: 1
Reputation: 25779
Another AWS-blessed option is cross-account DynamoDB table replication that uses Glue in the target account to import the S3 extract and DynamoDB Streams for ongoing replication.
Upvotes: 1