Reputation: 30956
I have hundreds of thousands of elements to insert into a database. I realized that calling an insert statement per element is way too costly and I need to reduce the overhead. I reckon each insert can specify multiple rows, such as
INSERT INTO example (Parent, DataNameID) VALUES (1,1), (1,2)
My issue is that since the "DataName" keeps repeating itself for each element, I thought it would save space if I stored these string names in another table and referenced them. However, that causes problems for my idea of the bulk insert, which now requires a way to look up the ID from the name before building the bulk insert.
Any recommendations? Should I simply de-normalize and insert the data every time as a plain string? Also, what is the limit on the size of the query string? Mine amounts to almost 1.2 MB.
I am using PHP with a MySQL backend.
Upvotes: 1
Views: 293
Reputation: 8704
You might want to read up on LOAD DATA (LOCAL) INFILE. It works great; I use it all the time.
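A minimal sketch of that approach, assuming the rows have first been written to a tab-separated file (the file path is just a placeholder; the table and column names are taken from the question):

LOAD DATA LOCAL INFILE '/tmp/example_rows.tsv'
INTO TABLE example
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(Parent, DataNameID);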
EDIT: this answer only addresses the sluggishness of individual inserts. As @bemace points out, it says nothing about resolving the string names to IDs.
Upvotes: -1
Reputation: 27916
You haven't given us a lot of info on the database structure or size, but this may be a case where absolute normalization isn't worth the hassle.
However, if you want to keep it normalized and the strings are already in your other table (let's call it datanames), you can do something like
INSERT INTO example (Parent, DataNameID) VALUES
(1, (select id from datanames where name='Foo')),
(1, (select id from datanames where name='Bar'))
Upvotes: 1
Reputation: 31743
First, insert the name into the table.
Then call LAST_INSERT_ID() to get the id.
Then you can do your normal inserts.
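A rough sketch of that flow (the datanames table name is borrowed from the other answer as a placeholder; the rest comes from the question):

INSERT INTO datanames (name) VALUES ('Foo');
SET @foo_id = LAST_INSERT_ID();
INSERT INTO example (Parent, DataNameID) VALUES (1, @foo_id), (2, @foo_id);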
If your table is MyISAM-based you can use INSERT DELAYED to improve performance: http://dev.mysql.com/doc/refman/5.5/en/insert-delayed.html
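For example (MyISAM only; the table and column names are taken from the question):

INSERT DELAYED INTO example (Parent, DataNameID) VALUES (1, 1), (1, 2);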
Upvotes: 1