Reputation: 42523
I have a legacy SQL Server DB, and I need to copy part of a very large table from it over to a new Aurora DB cluster on AWS (RDS).
The old table in SQL Server has 1.8 billion records and 43 columns; in the new DB, however, only 13 of those columns will be carried over, along with almost all of the rows.
I was wondering if anyone has any ideas on the best way to move this data across.
I wrote a simple Python script to query SQL Server and then execute INSERT statements on the new DB, but based on tests with smaller sets of data, I estimate it would take about 30 hours to run.
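For context, here's a stripped-down sketch of that approach (the connection details, table, and column names are placeholders, and I only show 3 of the 13 columns):

```python
import pyodbc   # SQL Server side
import pymysql  # Aurora (MySQL) side

BATCH = 10_000  # rows per fetch/insert round trip

src = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                     "SERVER=legacy-host;DATABASE=legacy;UID=user;PWD=secret")
dst = pymysql.connect(host="aurora-endpoint", user="user",
                      password="secret", database="newdb")

cols = ["col1", "col2", "col3"]  # really the 13 carried-over columns
select_sql = "SELECT {} FROM dbo.big_table".format(", ".join(cols))
insert_sql = "INSERT INTO new_table ({}) VALUES ({})".format(
    ", ".join(cols), ", ".join(["%s"] * len(cols)))

with src.cursor() as read_cur, dst.cursor() as write_cur:
    read_cur.execute(select_sql)
    while True:
        rows = read_cur.fetchmany(BATCH)
        if not rows:
            break
        # pymysql's executemany batches these into multi-row INSERTs
        write_cur.executemany(insert_sql, [tuple(r) for r in rows])
        dst.commit()
```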
Any ideas?
P.S. Aurora is based on MySQL, so I would imagine that whatever works for MySQL would work for Aurora.
Upvotes: 1
Views: 1026
Reputation: 1364
Assuming you can get the data you want into something like a CSV file, LOAD DATA LOCAL INFILE should be pretty performant.
I did wonder whether it would be allowed on RDS and discovered an AWS article on importing data into MySQL on RDS. I couldn't find an equivalent one for Aurora, only one on migrating from an RDS-based MySQL instance. There is an Amazon RDS for Aurora Export/Import Performance Best Practices document that has one reference to LOAD DATA LOCAL INFILE, however.
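From Python, that could look something like the sketch below (untested against Aurora; the file path, table, and column names are placeholders, and pymysql needs local_infile=True — on Aurora, the server-side local_infile setting lives in the DB cluster parameter group, as far as I can tell):

```python
import pymysql

# local_infile must be enabled on the client connection and on the server.
conn = pymysql.connect(host="aurora-endpoint", user="user",
                       password="secret", database="newdb",
                       local_infile=True)

load_sql = """
    LOAD DATA LOCAL INFILE '/data/big_table_chunk.csv'
    INTO TABLE new_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    (col1, col2, col3)
"""

with conn.cursor() as cur:
    cur.execute(load_sql)
conn.commit()
```

Splitting the export into multiple CSV chunks and loading them one at a time would also keep any single load from getting unmanageably large.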
Upvotes: 1