Reputation: 1838
So I have a database where a lot of data is being inserted from a Java application. Usually I insert into table1 and get the last id, then insert into table2 and get the last id from there, and finally insert into table3 and get that id as well to work with it within the application. And I insert around 1000-2000 rows of data every 10-15 minutes.
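The pattern looks roughly like this (just a sketch; the column names are made up, and I grab each id with a SELECT):

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class PerRowInserts {
        // One row's worth of traffic: an INSERT plus a SELECT per table,
        // repeated 1000-2000 times per run.
        static void insertOne(Connection conn) throws SQLException {
            try (Statement st = conn.createStatement()) {
                st.executeUpdate("INSERT INTO table1 (colA) VALUES ('x')");
                long id1;
                try (ResultSet rs = st.executeQuery("SELECT LAST_INSERT_ID()")) {
                    rs.next();
                    id1 = rs.getLong(1);
                }
                st.executeUpdate("INSERT INTO table2 (table1_id, colB) VALUES (" + id1 + ", 'y')");
                // ...then SELECT LAST_INSERT_ID() again, and the same dance for table3.
            }
        }
    }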
And using that many small inserts and selects on a production web server is not really good, because it sometimes bogs down the server.
My question is: is there a way to insert multiple rows into table1, table2 and table3 without such a huge amount of selects and inserts? Is there some SQL-fu technique I'm missing?
Upvotes: 7
Views: 952
Reputation: 18397
You could redesign your database such that the primary key is not a database-generated, auto-incremented value, but rather a client-generated UUID. Then you can generate all the keys for every record up front and batch the inserts however you like.
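A minimal sketch of the idea, assuming table1's id column can hold a UUID string (the schema and names here are made up):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.UUID;

    public class UuidBatchInsert {
        // Hypothetical schema: table1(id CHAR(36) PRIMARY KEY, colA VARCHAR(100)).
        static List<UUID> insertTable1(Connection conn, List<String> values) throws SQLException {
            List<UUID> ids = new ArrayList<>();
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO table1 (id, colA) VALUES (?, ?)")) {
                for (String v : values) {
                    UUID id = UUID.randomUUID();  // key known before the insert, no round trip
                    ids.add(id);
                    ps.setString(1, id.toString());
                    ps.setString(2, v);
                    ps.addBatch();
                }
                ps.executeBatch();  // one batch instead of thousands of single inserts
            }
            return ids;  // table2 and table3 batches can reference these ids directly
        }
    }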
Upvotes: 1
Reputation: 563001
Since you're probably relying on auto_increment primary keys, you have to do the inserts one at a time, at least for table1 and table2, because MySQL won't give you more than the very last key generated.
You should never have to select, though. You can get the last inserted id from the Statement using the getGeneratedKeys() method; see the example showing this in the MySQL manual for Connector/J.
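A minimal version of that pattern (the table and column names are placeholders):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class GetGeneratedKeys {
        static long insertAndGetId(Connection conn, String value) throws SQLException {
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO table1 (colA) VALUES (?)",
                    Statement.RETURN_GENERATED_KEYS)) {  // ask the driver to hand back the auto_increment value
                ps.setString(1, value);
                ps.executeUpdate();
                try (ResultSet rs = ps.getGeneratedKeys()) {  // no extra SELECT round trip
                    rs.next();
                    return rs.getLong(1);
                }
            }
        }
    }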
Other recommendations:

- Use multi-row INSERT syntax for table3 (see the sketch below).
- Use ALTER TABLE DISABLE KEYS while you're importing, and re-enable them when you're finished.

Unfortunately, you can't use the fastest method for bulk loading data, LOAD DATA INFILE, because that doesn't allow you to get the generated id values per row.
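A rough sketch of both recommendations together (hypothetical schema; note that DISABLE KEYS only skips maintenance of non-unique indexes, and only on MyISAM tables):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.util.List;

    public class Table3BulkInsert {
        static void insertTable3(Connection conn, long parentId, List<String> values)
                throws SQLException {
            // Build "INSERT INTO table3 (...) VALUES (?, ?), (?, ?), ..." -- one statement, many rows.
            StringBuilder sql = new StringBuilder("INSERT INTO table3 (table2_id, colC) VALUES ");
            for (int i = 0; i < values.size(); i++) {
                sql.append(i == 0 ? "(?, ?)" : ", (?, ?)");
            }
            try (Statement st = conn.createStatement()) {
                st.execute("ALTER TABLE table3 DISABLE KEYS");  // defer index maintenance during the import
            }
            try (PreparedStatement ps = conn.prepareStatement(sql.toString())) {
                int p = 1;
                for (String v : values) {
                    ps.setLong(p++, parentId);
                    ps.setString(p++, v);
                }
                ps.executeUpdate();
            }
            try (Statement st = conn.createStatement()) {
                st.execute("ALTER TABLE table3 ENABLE KEYS");  // rebuild the indexes once at the end
            }
        }
    }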
Upvotes: 6
Reputation: 309028
There's a lot to talk about here:
Upvotes: 2