Chris Visser

Reputation: 1647

How to handle millions of separate insert queries

I have a situation in which I have to insert over 10 million separate records into one table. Normally a batch insert split into chunks does the job for me. The problem, however, is that this 3+ GB file contains over 10 million separate INSERT statements. Since every query takes 0.01 to 0.1 seconds, it would take over 2 days to insert everything.

I'm sure there must be a way to optimize this, either by lowering the insert time drastically or by importing the data in a different way.

I'm currently just using the CLI:

source /home/blabla/file.sql

Note: It's a third party that is providing me this file.

Small update: I removed all indexes.

Upvotes: 1

Views: 61

Answers (3)

claj

Reputation: 5402

Drop the indexes, then re-index when you are done!
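
For concreteness, a minimal sketch of that approach, assuming a hypothetical table records with a secondary index idx_field1 (the names are placeholders, not from the question):

-- drop the secondary index before the bulk load
ALTER TABLE records DROP INDEX idx_field1;

-- ... run the 10 million inserts here, e.g. source /home/blabla/file.sql ...

-- rebuild the index in one pass when the load is done
ALTER TABLE records ADD INDEX idx_field1 (field1);

Building the index once over the finished table is usually much cheaper than updating it on every single insert.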

Upvotes: 2

Bjoern

Reputation: 16304

There are some ways to improve the speed of your INSERT statements:

  • Try to insert many rows at once if this is an option.

  • An alternative is to insert the data into a copy of your desired table that has no indexes, then add the indexes and rename the table (see the sketch after this list).

  • Maybe use LOAD DATA INFILE, if this is an option (also shown in the sketch below).
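
As a rough sketch of the last two points, assuming a hypothetical table records(field1, field2) with one index idx_field1, and assuming the third-party data can be converted from INSERT statements into a CSV file (all names and paths are placeholders):

-- 1) create an index-free copy of the target table
CREATE TABLE records_new LIKE records;
ALTER TABLE records_new DROP INDEX idx_field1;

-- 2) bulk-load the rows, skipping per-statement parsing overhead
LOAD DATA INFILE '/home/blabla/file.csv'
INTO TABLE records_new
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- 3) add the index back and swap the tables atomically
ALTER TABLE records_new ADD INDEX idx_field1 (field1);
RENAME TABLE records TO records_old, records_new TO records;

Note that LOAD DATA INFILE reads the file on the server; if the file lives on the client, LOAD DATA LOCAL INFILE is the variant to look at.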

The MySQL manual's section on optimizing INSERT statements has something to say about that, too.

Upvotes: 0

iMx

Reputation: 846

Maybe you can parse the file and combine several INSERT queries into one query, like this:

INSERT INTO tablename (field1, field2, ...) VALUES (val1, val2, ...), (val3, val4, ...), ...;
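
For instance, assuming a hypothetical table records(field1, field2), a chunk of the file's statements could be rewritten into one multi-row insert and committed as a single transaction:

START TRANSACTION;
INSERT INTO records (field1, field2) VALUES
    (1, 'a'),
    (2, 'b'),
    (3, 'c');
COMMIT;

Each combined statement has to stay below the server's max_allowed_packet, so in practice you would split the 10 million rows into chunks of a few thousand rows per statement.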

Upvotes: 0
