Reputation: 1148
I have a big array of data that I want to insert into a DB. Each element of the array is itself an array representing one row in my DB. For each element I create an ActiveRecord instance.
There can be 10K elements or more, and I insert them into the DB from a cron job. Before each insert my ActiveRecord runs validation and makes a few SELECTs.
But sometimes I get duplicates in the DB. To simplify, suppose I have the following array of data:
Field1 Field2 Field3
field1Value1 field2Value1 field3Value1
field1Value2 field2Value2 field3Value2
field1Value3 field2Value3 field3Value3
field1Value4 field2Value4 field3Value4
And in my DB I can get something like this:
id Field1 Field2 Field3 time
1 field1Value1 field2Value1 field3Value1 2013-11-11 17:17:40
2 field1Value2 field2Value2 field3Value2 2013-11-11 17:17:40
3 field1Value4 field2Value4 field3Value4 2013-11-11 17:17:40
4 field1Value2 field2Value2 field3Value2 2013-11-11 17:17:40
5 field1Value3 field2Value3 field3Value3 2013-11-11 17:17:40
In this case the row with value 2 was inserted twice. Also note that the row with value 4 was inserted right after the first row with value 2.
These cases are occasional, and I can't find a pattern. I tried using transactions, but that didn't solve the problem.
In my PHP code there is just a loop that creates a new ActiveRecord instance for each element, so there's no way a row should be inserted twice. So the problem must be in MySQL.
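For reference, the loop looks roughly like this (a simplified sketch, not my exact code; `MyRecord` and the field names are placeholders for the real model):

```php
<?php
// Sketch of the cron insert loop: one ActiveRecord instance per element.
foreach ($rows as $row) {
    $model = new MyRecord();
    $model->field1 = $row[0];
    $model->field2 = $row[1];
    $model->field3 = $row[2];

    // validate() runs the rules (including the SELECT-based checks)
    // before save() issues the INSERT.
    if ($model->validate()) {
        $model->save(false); // already validated, skip re-validation
    }
}
```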
Upvotes: 0
Views: 93
Reputation: 111
I always recommend enforcing data integrity at the database level when possible. If you should never have duplicate values across those three columns, add a unique index on them. Then, when your looping script tries to insert a duplicate, the insert will fail with a duplicate-key error instead of silently creating a second row. That is the best advice I can give without seeing the code that performs the inserts.
CREATE UNIQUE INDEX index_name ON table(field1, field2, field3);
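With the index in place you can also make the cron job tolerate reruns instead of erroring out. A sketch, reusing the placeholder table and column names from above:

```sql
-- Silently skip rows that would violate the unique index:
INSERT IGNORE INTO table (field1, field2, field3)
VALUES ('field1Value1', 'field2Value1', 'field3Value1');

-- Or overwrite the existing row on conflict:
INSERT INTO table (field1, field2, field3)
VALUES ('field1Value1', 'field2Value1', 'field3Value1')
ON DUPLICATE KEY UPDATE field3 = VALUES(field3);
```

Note that `INSERT IGNORE` also suppresses other errors, so `ON DUPLICATE KEY UPDATE` is usually the safer choice if you only want to handle the duplicate case.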
Upvotes: 1