Reputation: 990
I have a case where I need to limit the number of rows per user in a table. Currently I do this with a SELECT COUNT(*) FROM table
check before the insert, and if the count is equal to or greater than the allowed limit, I throw an error. The COUNT and INSERT queries run in a single transaction.
But with 5000 online users and 50K requests per minute, I end up with extra records (more than the limit) in the table. It looks like a race condition on parallel inserts. How can I avoid this? Can anyone suggest some best practices?
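Roughly, the current flow looks like this (table name, column names, and the literal values are simplified placeholders, not my real schema):

    BEGIN;

    -- count the user's existing rows
    SELECT COUNT(*) FROM user_items WHERE user_id = 42;
    -- the application compares this count with the allowed limit
    -- and throws an error if the limit is already reached

    INSERT INTO user_items (user_id, payload) VALUES (42, 'some data');

    COMMIT;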
Upvotes: 1
Views: 2063
Reputation: 1678
This issue is known as a phantom read. It can typically be resolved by running the transaction at the SERIALIZABLE isolation level:
https://en.wikipedia.org/wiki/Isolation_(database_systems)
However, it can reduce performance, so if you have a lot of inserts, also try the other options from the comments.
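As a sketch (PostgreSQL-flavoured syntax; the table and column names are placeholders, not from the question), the same check-then-insert could run like this:

    BEGIN ISOLATION LEVEL SERIALIZABLE;

    -- count the user's existing rows
    SELECT COUNT(*) FROM user_items WHERE user_id = 42;
    -- if the count is below the limit, continue; otherwise ROLLBACK

    INSERT INTO user_items (user_id, payload) VALUES (42, 'some data');

    COMMIT;

Under SERIALIZABLE, when two such transactions race for the same user, one of them fails with a serialization error instead of slipping past the limit, so the application has to be prepared to retry.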
Upvotes: 0
Reputation: 33
Use a separate table that maintains, for each user, the count of rows inserted, with the user id as a foreign key to the main table. If you have a session-based application, you can load this count into the session or memory, keep updating it in the session/memory and in the database after every insert, and only then actually insert into the main table.
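A rough sketch of this idea in SQL (table names, column names, and the limit of 100 are assumptions for illustration): the per-user counter row is updated first, which also locks it and serializes concurrent inserts for the same user.

    BEGIN;

    -- bump the counter only if the user is still under the limit;
    -- the UPDATE locks this user's counter row until COMMIT
    UPDATE user_row_counts
       SET row_count = row_count + 1
     WHERE user_id = 42
       AND row_count < 100;

    -- if the UPDATE affected 0 rows, the limit is reached: ROLLBACK and report an error
    -- otherwise, do the actual insert
    INSERT INTO user_items (user_id, payload) VALUES (42, 'some data');

    COMMIT;

Because concurrent transactions for the same user have to wait on the locked counter row, the count check and the insert can no longer interleave, so the limit holds even under heavy parallel load.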
Upvotes: 1