Hugo Zonderland

Reputation: 92

PHP: Best way forward [serialisation, objects, Redis]

I have been developing a PHP application for quite a while. The basic idea is as follows: users can build web pages using blocks. These blocks can contain images, text, etc., and each block has its own options. The blocks are modelled in PHP following Domain-Driven Design.

I've built the application around a PHP controller that handles requests from a jQuery/JavaScript front end. Each time the user edits an option, a request is sent to this controller, which unserialises a collection of blocks (PHP objects) from Redis and/or the PHP session, sets the attributes of the edited blocks, and adds or removes blocks as needed. This is how the domain logic is enforced.

That was fine while I was developing for myself; I never kept race conditions and the like in mind. Now that the product is moving forward, I notice that users lose data. I'll explain what happens:

  1. The user edits an option of a block,
  2. presses save,
  3. and a request is made to the controller, which
  4. unserialises the collection,
  5. sets the blocks based on their UUID,
  6. puts the blocks back in the collection, and
  7. serialises the collection again.
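The read-modify-write cycle in the steps above is where edits get lost. A minimal, self-contained sketch makes the failure visible; `Block` and `BlockCollection` here are simplified stand-ins for the real domain classes, and `$store` plays the role of Redis:

```php
<?php

class Block {
    public function __construct(public string $uuid, public array $options = []) {}
}

class BlockCollection {
    /** @var Block[] keyed by uuid */
    public array $blocks = [];
    public function set(Block $block): void { $this->blocks[$block->uuid] = $block; }
}

$store = []; // stand-in for Redis

// Initial state: one block, serialised into the store (steps 6-7)
$collection = new BlockCollection();
$collection->set(new Block('abc-123', ['color' => 'red']));
$store['page:1'] = serialize($collection);

// Two concurrent requests both unserialise the same snapshot (step 4)
$reqA = unserialize($store['page:1']);
$reqB = unserialize($store['page:1']);

// Request A edits one option, request B another (step 5)
$reqA->blocks['abc-123']->options['color'] = 'blue';
$reqB->blocks['abc-123']->options['size']  = 'large';

// Both write the whole collection back (step 7): whoever writes last wins
$store['page:1'] = serialize($reqA);
$store['page:1'] = serialize($reqB);

// The stored options are now ['color' => 'red', 'size' => 'large'] -
// request A's colour change has been silently overwritten.
```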

There are scenarios where two concurrent requests are made, and one request overwrites the edits of the other.

I know I need to rewrite this part of the application. The question is what the best approach is. I could:

  1. Implement some JavaScript library, which would take a lot of work because it would require rewriting that entire part of the application. I also do not have much experience implementing JavaScript-based solutions, though I don't mind stepping into something new. I do want JavaScript testing to prevent future problems and to enable cross-browser testing.
  2. Apply Redis/session locking so the controller processes only a single request at a time, preventing concurrent requests from overwriting the data set by a previous request. This lowers the chance of concurrent requests causing data loss, but does not eliminate it. Users with really slow internet connections might lose their connections while still producing a lot of concurrent requests.
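Option 2 could look roughly like the sketch below, assuming the phpredis extension; the key names and the `$pageId` source are illustrative. The lock is taken with `SET` + `NX`/`EX` so it cannot outlive a crashed request:

```php
<?php
// Sketch of option 2: a per-page Redis lock so only one request at a time
// runs the unserialise-edit-serialise cycle. Requires the phpredis extension.

$pageId = $_POST['pageId'] ?? '1'; // illustrative

$redis = new Redis();
$redis->connect('127.0.0.1');

$lockKey = 'lock:page:' . $pageId;

// NX: acquire only if nobody holds the lock; EX: auto-expire after 5 s
$acquired = $redis->set($lockKey, uniqid('', true), ['nx', 'ex' => 5]);
if (!$acquired) {
    http_response_code(409); // tell the client to retry
    exit;
}

try {
    $collection = unserialize($redis->get('page:' . $pageId));
    // ... apply the edit to the collection ...
    $redis->set('page:' . $pageId, serialize($collection));
} finally {
    $redis->del($lockKey);
}
```

Note that serialising requests this way does not decide which of two edits should win; it only prevents them from interleaving.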

I'm curious what other approaches I might be missing, or whether one of the two mentioned above will suffice.

Upvotes: 0

Views: 78

Answers (1)

Boris Guéry

Reputation: 47614

As far as I understand your problem, what you may want to implement is optimistic locking.

A simple way to implement it is to version your aggregate.

Every time someone edits your object, increment its version.

When you POST your edited blocks, you send back the version on which you are trying to apply your changes.

Then, when you get your object back from your persistence storage, you compare the versions and ensure you are actually working on an up-to-date object.

If it is up to date, save it; if it is not, reject the modification, notify the user, reload the object, and take the appropriate action (it depends on your needs).
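A minimal, self-contained sketch of this versioning scheme (names are illustrative, and the compare-and-swap shown here should itself be made atomic in production, e.g. with Redis `WATCH`/`MULTI` or a Lua script, otherwise two requests can still pass the check simultaneously):

```php
<?php

class VersionedCollection {
    public function __construct(public int $version = 1, public array $blocks = []) {}
}

// Accept the write only if the client edited the version currently stored.
function save(array &$store, string $key, VersionedCollection $edited, int $basedOnVersion): bool
{
    $current = unserialize($store[$key]);
    if ($current->version !== $basedOnVersion) {
        return false; // stale edit: reject and let the client reload
    }
    $edited->version = $basedOnVersion + 1;
    $store[$key] = serialize($edited);
    return true;
}

$store = ['page:1' => serialize(new VersionedCollection(1, ['title' => 'Hello']))];

// Requests A and B both loaded version 1
$a = unserialize($store['page:1']);
$b = unserialize($store['page:1']);

$a->blocks['title'] = 'Hello A';
var_dump(save($store, 'page:1', $a, 1)); // bool(true)  - store is now version 2

$b->blocks['title'] = 'Hello B';
var_dump(save($store, 'page:1', $b, 1)); // bool(false) - B must reload and retry
```

The second save is rejected instead of silently overwriting the first, which is exactly the lost-update scenario from the question.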

Upvotes: 1
