evilpenguin

Reputation: 5478

Transfer data between large MySQL tables

I'm looking to transfer some data from a multi-million row table into another table. The problems I'm facing are:

So I'm basically thinking I should use a stored procedure or something similar, which takes 100 rows at a time and inserts them into the other table.
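For example, something like this (table and column names are just placeholders, and I'm assuming an auto-increment primary key), repeated until the INSERT affects 0 rows:

SET @last_id = 0;

-- Each pass copies the next 100 rows by primary key,
-- resuming where the previous batch finished.
INSERT INTO target_table (id, col_a, col_b)
SELECT id, col_a, col_b
FROM source_table
WHERE id > @last_id
ORDER BY id
LIMIT 100;

SET @last_id = (SELECT MAX(id) FROM target_table);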

Any ideas of solutions/best practices in this matter?

Thanks,

Upvotes: 1

Views: 2988

Answers (2)

Manse

Reputation: 38147

You could use SELECT INTO OUTFILE:

SELECT * INTO OUTFILE 'backup.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM <tablename> WHERE <conditions> <other clauses if required>

Then insert:

LOAD DATA INFILE 'backup.csv'
INTO TABLE <tablename>
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(field1, field2, field3, ...)
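As a concrete illustration (table and column names here are hypothetical), archiving old rows from a users table might look like this. Note that the file is written on the MySQL server host and must not already exist:

SELECT id, name, email INTO OUTFILE '/tmp/users_backup.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM users WHERE created_at < '2012-01-01';

LOAD DATA INFILE '/tmp/users_backup.csv'
INTO TABLE users_archive
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(id, name, email);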

Upvotes: 3

Moshe L

Reputation: 1905

What is your update frequency?

If you don't need to track updates, you can just use LIMIT. In the past I wrote a function (VB.NET) that keeps running the statement until no rows are left:

Function ExecCommandInParts(ByVal sql As String, Optional ByVal Updater As Func(Of Integer, Integer) = Nothing) As Integer
    Dim limit As Integer = 10000
    Dim count As Integer = limit
    Dim total As Integer = 0
    ' Re-run the statement in LIMIT-sized batches until a pass
    ' affects fewer rows than the limit, i.e. nothing is left to do.
    While count = limit
        count = ExecCommandWithTimeout(sql & " LIMIT " & limit)
        total += count
        ' Report progress to the caller, if a delegate was supplied.
        If Updater IsNot Nothing Then Updater(count)
    End While
    Return total
End Function

sql = the SQL query to run (INSERT ... SELECT, DELETE ...); Updater = a delegate that reports the command's progress back to the calling object/user.

ExecCommandWithTimeout = a function that executes the SQL, waits 500 milliseconds, and returns the number of rows affected.
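For illustration, here is a hypothetical statement you could pass in as sql (table and column names are made up); the anti-join keeps each pass from re-copying rows that are already across, so the loop terminates once the transfer is complete:

-- Copy only rows not yet present in the archive table;
-- ExecCommandInParts appends the LIMIT clause itself.
INSERT INTO archive_table (id, col_a, col_b)
SELECT s.id, s.col_a, s.col_b
FROM live_table s
LEFT JOIN archive_table a ON a.id = s.id
WHERE a.id IS NULL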

Tested on a heavily loaded production server; it works.

Upvotes: 0
