mr-sk

Reputation: 13417

Doctrine entity manager and multiple threads updating database

I currently have an XHR request that can fire off N times from the client. These requests are handled by the server, and each request typically creates a new row in the database (all Doctrine / XML mappings).

Before I persist() the object, I make sure it has a unique filename (I am uploading assets). I do this by overriding persist(), calling my getUniqueFilename() and then calling parent::persist().
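The check-then-generate logic described above might look like this minimal sketch. The function name matches the one mentioned in the question, but the body is an assumption; $takenNames stands in for the result of a repository query against the database, which is exactly where the race window opens.

```php
<?php
// Hypothetical sketch of getUniqueFilename(): append _1, _2, ... until the
// name is not taken. $takenNames stands in for a database lookup, so two
// concurrent requests can see the same state and pick the same name.
function getUniqueFilename(string $base, array $takenNames): string
{
    if (!in_array($base, $takenNames, true)) {
        return $base;
    }
    $i = 1;
    while (in_array($base . '_' . $i, $takenNames, true)) {
        $i++;
    }
    return $base . '_' . $i; // two threads can both reach here with the same $i
}

var_dump(getUniqueFilename('photo', ['photo']));            // "photo_1"
var_dump(getUniqueFilename('photo', ['photo', 'photo_1'])); // "photo_2"
```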

I get a race condition when I perform multiple XHR requests with the same filename. The problem is that multiple threads run at the exact same time, each checking the database for duplicates in order to generate the unique filename.

So when multiple XHR requests are handled concurrently, a race condition occurs and multiple rows are inserted into the database with the same name (filename_1 is generated multiple times).

I think the way to solve this is to make the filename unique in the database and restart on a failing flush, but I'm not sure that's the best approach.

What would you do?

Upvotes: 2

Views: 1571

Answers (2)

Sgoettschkes

Reputation: 13189

I would suggest a different strategy: use the MySQL auto_increment to get an id, i.e. save the asset to the database, retrieve the id, and then add it to the filename. All the other approaches have drawbacks: you have to perform partial rollbacks, handle duplicated filenames, etc.

I would also suggest not using the original filename for storing the object: you run into trouble with forbidden characters on different operating systems as well as character encoding, and possible duplicates for some reason (e.g. because the database is case-sensitive while the file system is not). There may be other drawbacks, like maximum filename length, which you might not be aware of right now.

My solution is simply to use the MySQL auto increment as the filename. If you think about it, it makes sense: the auto increment is already a unique identifier. If you make sure to only store objects from one table in one folder, you have no problems identifying the different assets, filenames etc.
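Under this strategy, the stored filename is derived purely from the id, so uniqueness is guaranteed by the database and no name check is needed. A minimal sketch (the function name is an assumption; in Doctrine the id would come from the entity after $em->flush()):

```php
<?php
// Sketch: build the stored filename from the auto_increment id. The original
// name contributes only its extension, kept for convenience when serving the
// file; the id alone makes the name unique.
function storageFilename(int $id, string $originalName): string
{
    $ext = strtolower((string) pathinfo($originalName, PATHINFO_EXTENSION));
    return $ext === '' ? (string) $id : $id . '.' . $ext;
}

echo storageFilename(42, 'My Photo.JPG'); // 42.jpg
```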

If you insist on keeping your approach, you could make the filename unique in the database and then restart on a failing flush, as you suggested.
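Since the question mentions XML mappings, the unique constraint could be declared there; this is a sketch in which the entity, table and column names are assumptions. With it in place, a concurrent duplicate makes flush() throw, which is the point at which you would retry with a new name.

```xml
<!-- Sketch of a Doctrine XML mapping; entity/field names are assumptions. -->
<doctrine-mapping xmlns="http://doctrine-project.org/schemas/orm/doctrine-mapping">
    <entity name="Asset" table="asset">
        <unique-constraints>
            <unique-constraint name="uniq_asset_filename" columns="filename"/>
        </unique-constraints>
        <id name="id" type="integer">
            <generator strategy="AUTO"/>
        </id>
        <field name="filename" type="string" length="255"/>
    </entity>
</doctrine-mapping>
```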

Upvotes: 2

Louis-Philippe Huberdeau

Reputation: 5431

Adding a unique constraint is the safest way to ensure consistent data. Anything at the PHP level is subject to race conditions unless you add some other form of locking, which will be less efficient.
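At the database level the constraint is a one-liner; the table and column names below are assumptions to adapt to your schema.

```sql
-- Table/column names assumed; the database now rejects the second of any
-- two concurrent inserts that picked the same filename.
ALTER TABLE asset ADD CONSTRAINT uniq_asset_filename UNIQUE (filename);
```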

You could also avoid the problem by making the filenames unique through other attributes, such as the user they came from, or by keeping a version history, which would simply make it look like a new version was created at almost the same time.
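Scoping uniqueness per user, as suggested, can be expressed as a composite constraint; again the names are assumptions.

```sql
-- Uniqueness scoped per user: two users may both upload "photo.jpg",
-- but the same user cannot insert the same filename twice.
ALTER TABLE asset ADD CONSTRAINT uniq_asset_user_filename UNIQUE (user_id, filename);
```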

Upvotes: 4
