Reputation: 21499
I have a table with 3 columns. Each has a unique index.
I'd like to do multiple inserts at once (300 records a pop). When a duplicate entry occurs it cancels the insert in its entirety. This means if 1 out of the 300 is a duplicate, none of them will be inserted.
Is there a way around this?
Upvotes: 6
Views: 1859
Reputation: 14814
Try changing your query from INSERT INTO ... to INSERT IGNORE INTO .... This turns duplicate-key errors into warnings, and your other records will still be inserted.
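A runnable sketch of this behaviour, using Python's stdlib sqlite3 as a stand-in for MySQL (an assumption: SQLite spells it INSERT OR IGNORE rather than INSERT IGNORE, but both skip rows that violate a unique index instead of aborting the whole statement; table and column names here are made up):

```python
import sqlite3

# SQLite's INSERT OR IGNORE is the closest stdlib analogue of MySQL's
# INSERT IGNORE: rows that hit a unique index are skipped, not fatal.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
conn.execute("INSERT INTO t VALUES (1, 'john')")   # pre-existing row

rows = [(1, 'john'), (2, 'jim'), (3, 'mary')]      # first row is a duplicate
conn.executemany("INSERT OR IGNORE INTO t VALUES (?, ?)", rows)

# The duplicate was skipped; the other two rows were inserted.
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # → 3
```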
Upvotes: 8
Reputation: 813
You can use something like this:
INSERT INTO table (username, id)
VALUES
('john', 1),
('jim', 2),
('mary', 3),
('jack', 4),
('helen', 4)
ON DUPLICATE KEY UPDATE id = id
Note that ON DUPLICATE KEY IGNORE is not valid MySQL syntax; the no-op assignment id = id is the idiomatic way to skip duplicate rows. If you want the entry to be replaced instead, assign the new values, e.g. ON DUPLICATE KEY UPDATE username = VALUES(username).
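For the replace variant, here is a runnable sketch using Python's stdlib sqlite3 as a stand-in (an assumption: in MySQL you would write ON DUPLICATE KEY UPDATE; SQLite's upsert clause, ON CONFLICT ... DO UPDATE, available since SQLite 3.24, behaves similarly):

```python
import sqlite3

# SQLite upsert: on a duplicate key, update the existing row in place
# rather than rejecting the insert.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
conn.execute("INSERT INTO users VALUES (4, 'jack')")

conn.execute(
    "INSERT INTO users (id, username) VALUES (4, 'helen') "
    "ON CONFLICT(id) DO UPDATE SET username = excluded.username"
)
# id 4 now holds the new value instead of the original one.
print(conn.execute("SELECT username FROM users WHERE id = 4").fetchone()[0])  # → helen
```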
Upvotes: -1
Reputation: 17182
If your inserts are idempotent, UPDATE or REPLACE would help you out.
Given they likely are not, there isn't any super-efficient way to do this: something has to fall back to inserting individual rows in order to isolate the problem row.
If you are batching inserts to reduce client-to-server latency, consider a stored procedure that takes all the rows in one shot and performs the inserts on the server side; it can fall back to row-by-row error handling when a duplicate occurs.
That assumes, of course, that there is some meaningful error handling that can be done on the server side without needing synchronous communication with the client.
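The batch-then-fallback strategy above can be sketched in client-side Python (using the stdlib sqlite3 module as a stand-in for MySQL; the table, column names, and helper function are made up for illustration):

```python
import sqlite3

def insert_batch(conn, rows):
    """Try the whole batch in one statement; if any row violates a
    unique index, fall back to row-by-row inserts so only the
    offending rows are isolated and skipped."""
    try:
        with conn:  # commits on success, rolls back on error
            conn.executemany("INSERT INTO t (id, name) VALUES (?, ?)", rows)
    except sqlite3.IntegrityError:
        for row in rows:
            try:
                with conn:
                    conn.execute("INSERT INTO t (id, name) VALUES (?, ?)", row)
            except sqlite3.IntegrityError:
                pass  # real code would log or handle the duplicate here

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
insert_batch(conn, [(1, 'a'), (2, 'b'), (2, 'dup'), (3, 'c')])

# Only the duplicate (2, 'dup') was skipped.
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # → 3
```

The same shape works inside a MySQL stored procedure with a CONTINUE handler; the point is that the per-row path only runs when the fast batch path fails.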
Upvotes: 1
Reputation: 9332
Use a transaction: http://dev.mysql.com/doc/refman/5.0/en/commit.html
START TRANSACTION;
-- your queries here
COMMIT;
If anything goes wrong in the queries, none of the records will be committed.
If you want to ignore errors instead, use --force from your mysql command line.
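The all-or-nothing behaviour of a transaction can be sketched with Python's stdlib sqlite3 as a stand-in for MySQL (an assumption; the connection's context manager plays the role of START TRANSACTION / COMMIT / ROLLBACK):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY)")
try:
    with conn:  # START TRANSACTION ... COMMIT, rollback on error
        conn.execute("INSERT INTO t VALUES (1)")
        conn.execute("INSERT INTO t VALUES (1)")  # duplicate key: aborts batch
except sqlite3.IntegrityError:
    pass

# The whole batch was rolled back, so nothing was committed.
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # → 0
```

Note this gives the opposite of what the question asks for: a single duplicate rolls back all 300 rows, rather than letting the other 299 through.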
Upvotes: 0