Dmitry

Reputation: 23

Laravel: how to prevent concurrent handling of requests sent by the same user

I have the following controller method that creates a new order if and only if there are no open orders (an open order has status = 0, a closed order has status = 1).

public function createOrder(Request $req){
    // some validation stuff
    $last_active = Orders::where('user_id', $this->user->id)->where('status', 0)->orderBy('id', 'desc')->first();
    if ($last_active){
        return ['status' => 'error'];
    }
    $order = Orders::create([
        'status' => 0,
        // some details
    ]);
    return ['status' => 'success'];
}

This method is bound to a particular route:

Route::post('/create', 'OrderController@createOrder');

The client makes an AJAX request to this route. The logic behind this is super simple: I want a user to have only one active order at a time, so the user must take some action to close the previous order before creating a new one. The code above works perfectly for a normal user, but not for a user who wants to harm my application. So here is the problem. A user can send tons of such requests per second (I do it right from the Google Chrome dev console with the following script):

for (var i = 0; i < 20; i++)
    setTimeout(function () {
        $.ajax({
            url: '/create',
            type: 'post',
            success: function (d) {
                console.log(d);
            }
        });
    }, 1);

This causes multiple records with status = 0 to be inserted into the database, when only one should be inserted and the rest rejected. What I think happens is:

  1. Many requests hit the webserver (nginx in my case).
  2. The webserver spawns many PHP processes (via php-fpm in my case).
  3. Multiple PHP processes run the method at the same time and all pass the if ($last_active){...} check before any of them has inserted a record, resulting in multiple records being inserted.

What I have tried to fix this:

  1. On the nginx side, I limit the request rate (10 r/s). This does not help much, because it still allows 10 requests to be sent very fast, with a tiny delay between them, before they start being rejected. I cannot set the rate limit lower than 10 r/s because that would hurt normal users.
  2. On the Laravel side, I tried wrapping the logic in a transaction:
public function createOrder(Request $req){
    // some validation stuff
    DB::beginTransaction();
    try{
        $last_active = Orders::where('user_id', $this->user->id)->where('status', 0)->orderBy('id', 'desc')->first();
        if ($last_active){
            DB::rollBack();  // I don't think I even need this
            return ['status' => 'error'];
        }
        $order = Orders::create([
            'status' => 0,
            // some details
        ]);
        DB::commit();
    }
    catch (\Exception $e){
        DB::rollBack();
        return ['status' => 'error'];
    }
    return ['status' => 'success'];
}

Using a transaction significantly reduces the number of rows inserted (and often works as intended, allowing only one row to be inserted, but not always). I guess this is because a plain SELECT inside the transaction does not lock anything, so two concurrent transactions can still both see no open order and both insert one.

  3. I created a middleware that keeps track of when the user's last request was made and stores this information in the session:
    public function handle($request, Closure $next)
    {
        if (!session()->has('last_request_time')
            || (microtime(true) - session()->get('last_request_time')) > 1) {
            session()->put('last_request_time', microtime(true));
            return $next($request);
        }

        return abort(429);
    }

It did not help at all, because it just moves the same race to the middleware level: two concurrent requests can both read the session before either of them has written last_request_time.
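For what it is worth, this hand-rolled middleware is roughly what Laravel's built-in throttle middleware does already, so I could presumably replace it with just the route definition below (a sketch using the standard throttle:attempts,minutes syntax); but like my middleware it only limits the rate, it does not stop two requests from running the controller at the same time.

// Hypothetical replacement for my custom middleware: at most 10 requests per minute per client.
// This limits the rate, but two requests that arrive together can still run concurrently.
Route::post('/create', 'OrderController@createOrder')->middleware('throttle:10,1');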

  4. I also tried something odd:
public function createOrder(Request $req){
    if (Cache::has('action.' . $this->user->id)) {
        return ['status' => 'error'];
    }
    Cache::put('action.' . $this->user->id, '', 0.5);
    // some validation stuff
    $last_active = Orders::where('user_id', $this->user->id)->where('status', 0)->orderBy('id', 'desc')->first();
    if ($last_active){
        Cache::forget('action.' . $this->user->id);
        return ['status' => 'error'];
    }
    $order = Orders::create([
        'status' => 0,
        // some details
    ]);
    Cache::forget('action.' . $this->user->id);
    return ['status' => 'success'];
}

This kind of works in many cases, especially combined with the transaction, but sometimes it still allows up to 2 rows to be inserted (in 1-2 cases out of 30). And it also looks weird to me. I thought about queues, but as the Laravel docs state, they are intended for time-consuming tasks. I also thought about table locking, but that seems weird too and would hurt performance for normal users. I believe there must be a clean and simple solution for this problem, but I cannot find anything reasonable on Google; maybe I am missing something super obvious? Can you help, please? Also, there are so many similar cases in my application that I would really like to find some general solution for situations where concurrent execution causes such bugs, not only with the database, but also with sessions, cache, Redis, etc.
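For completeness, here is a minimal sketch of what an atomic version of that last attempt might look like, assuming Laravel 5.6+ and a cache driver that supports atomic locks (e.g. redis or memcached), with Cache::lock() replacing the manual Cache::has()/Cache::put() pair; I am not sure this is the clean general solution I am after:

public function createOrder(Request $req){
    // Try to take a short per-user lock; a second concurrent request fails to acquire it
    // and is rejected immediately instead of racing with the first one.
    $lock = Cache::lock('action.' . $this->user->id, 5);
    if (!$lock->get()) {
        return ['status' => 'error'];
    }
    try {
        // some validation stuff
        $last_active = Orders::where('user_id', $this->user->id)->where('status', 0)->orderBy('id', 'desc')->first();
        if ($last_active) {
            return ['status' => 'error'];
        }
        Orders::create([
            'status' => 0,
            // some details
        ]);
        return ['status' => 'success'];
    } finally {
        // Always release the lock, whichever branch returned.
        $lock->release();
    }
}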

Upvotes: 2

Views: 5073

Answers (1)

levi

Reputation: 25071

You should be able to use lockForUpdate() on the user model to prevent concurrent orders from being inserted by the same user:

DB::beginTransaction();
User::where('id', $this->user->id)->lockForUpdate()->first();
// Create order if not exists etc...
DB::commit();
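Filling in the "create order if not exists" part with the query from the question, a minimal sketch of the whole method might look like this (assuming MySQL/InnoDB or another engine where SELECT ... FOR UPDATE blocks concurrent transactions, and using DB::transaction() so the lock is released on commit or rollback):

public function createOrder(Request $req){
    // some validation stuff
    return DB::transaction(function () {
        // Lock this user's row; a second concurrent request for the same user
        // blocks here until the first transaction commits or rolls back.
        User::where('id', $this->user->id)->lockForUpdate()->first();

        $last_active = Orders::where('user_id', $this->user->id)
            ->where('status', 0)
            ->orderBy('id', 'desc')
            ->first();

        if ($last_active) {
            return ['status' => 'error'];
        }

        Orders::create([
            'status' => 0,
            // some details
        ]);

        return ['status' => 'success'];
    });
}

Because every request for the same user must acquire the same row lock before checking for an open order, the check-then-insert sequence is serialized per user. The same pattern (a transaction plus lockForUpdate() on some anchor row) can be reused for the other check-then-act spots mentioned in the question.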

Upvotes: 6
