Reputation: 191
I have read access to a SQL Server database with a lot of data stored. Each night I pull about a million or so rows from a table on this DB and load them into a MySQL database on my local machine, where I then perform analyses on the data. My current method of copying this data is a Perl script that basically does:
(1) Connect to the MySQL database.
(2) Drop the old table.
(3) Create the table.
(4) Fetch the data from the SQL Server DB.
(5) Insert the data into the MySQL database using the following loop:
while (my @data = $sth->fetchrow_array()) {
    my ($id, $at, $ar, $dt, $v, $di) = @data;
    # PUSH ONTO MYSQL, one INSERT per row
    $dbh2->do("INSERT INTO datatable VALUES (?, ?, ?, ?, ?, ?)",
              undef, $id, $at, $ar, $dt, $v, $di);
}
This is very slow. Is there a faster method of copying this data, taking into account my limited privileges on the remote SQL Server? (Someone suggested creating a linked server, but I don't think I can without more of an admin role on the remote server.) In the future I might have to work with even larger sets of data, so figuring out a faster way would be ideal.
Of note: I am copying the entire date range from the remote server every night, even though only the most recent day's worth of data has actually been added. However, there is a possibility that some of the data from previous dates might be changed later, which is why we thought it prudent to copy the whole thing every time. That's obviously part of what makes it slow. Is there a way to copy only the changes each night?
FYI my work machine is running Windows 7 and I'm using MySQL 5.7
Upvotes: 0
Views: 161
Reputation: 80
According to dev.mysql.com, LOAD DATA INFILE can be around 20 times faster than INSERT. That means it would be much faster to have your Perl script write the rows to a text file and then load that file with LOAD DATA INFILE.
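A minimal sketch of that approach. The escaping follows MySQL's default LOAD DATA format (tab-separated fields, newline-terminated rows, `\N` for NULL); the sample rows here are illustrative stand-ins for what your `$sth->fetchrow_array()` loop would produce, and the final `LOAD DATA` statement is shown as a comment since it needs a live MySQL connection:

```perl
use strict;
use warnings;

# Escape one field for MySQL's default LOAD DATA INFILE format:
# tab-separated columns, newline-terminated rows, \N for NULL.
sub escape_field {
    my ($field) = @_;
    return '\\N' unless defined $field;   # NULL marker
    $field =~ s/\\/\\\\/g;                # backslashes first
    $field =~ s/\t/\\t/g;                 # embedded tabs
    $field =~ s/\n/\\n/g;                 # embedded newlines
    return $field;
}

# In the real script these rows would come from
# $sth->fetchrow_array(); hardcoded here for illustration.
my @rows = (
    [1, 'a1', 'r1', '2016-01-01', 3.14, undef],
    [2, "has\ttab", 'r2', '2016-01-02', 2.72, 'x'],
);

# Write everything to one tab-delimited file instead of
# issuing one INSERT per row.
open my $out, '>', 'datatable.tsv' or die "open: $!";
print {$out} join("\t", map { escape_field($_) } @$_), "\n" for @rows;
close $out;

# Then a single bulk load replaces a million INSERTs.
# LOCAL makes the client send the file (the DSN needs
# mysql_local_infile=1 with DBD::mysql):
# $dbh2->do(q{LOAD DATA LOCAL INFILE 'datatable.tsv'
#             INTO TABLE datatable});
```

The escaping step matters: a stray tab or newline inside a field would otherwise shift every following column on that row.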
Upvotes: 1