Reputation: 393
My database is slow and I get a lot of API requests at the same time, so I want to prevent the method from being executed twice simultaneously. I'm trying to implement a Laravel atomic lock as a solution (as described in: https://laravel.com/docs/8.x/cache#atomic-locks ) in the following way:
$lock = Cache::lock("service-{$id}", 10);

try {
    // Lock acquired after waiting a maximum of 5 seconds...
    $lock->block(5);
} catch (LockTimeoutException $e) {
    // Unable to acquire lock...
    return null;
}

// Execute code once the lock is acquired...

// Release the lock when complete.
optional($lock)->release();
But for some reason I still get duplicate executions of the code despite the lock. I am using Redis as the cache driver. Is there anything I am doing wrong?
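For reference, the Laravel docs linked above also show a closure form of block() that releases the lock automatically when the closure finishes, which avoids the lock staying held if the protected code throws before release() runs. A sketch reusing the names from the snippet above:

```php
use Illuminate\Support\Facades\Cache;
use Illuminate\Contracts\Cache\LockTimeoutException;

try {
    // Wait up to 5 seconds for the lock; it is released
    // automatically when the closure returns or throws.
    Cache::lock("service-{$id}", 10)->block(5, function () {
        // Execute code while holding the lock...
    });
} catch (LockTimeoutException $e) {
    // Unable to acquire the lock within 5 seconds...
    return null;
}
```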
Upvotes: 2
Views: 1083
Reputation: 21
Requests arriving at Redis at the same time (within milliseconds of each other) can still get past this lock.
You can mitigate the problem by delaying incoming requests by a random few milliseconds.
Here is an example:
$key = Auth::id(); // user unique id or something similar

// Sleep a random 0–1 second to spread out simultaneous requests.
usleep(rand(0, 1000000));

// Current time in milliseconds.
$time = (int) (microtime(true) * 1000);

if ($time - (int) Redis::get('lock_'.$key) < 1000) {
    return response()->json([
        'message' => 'Rate Limit Exceeded',
        'errors'  => ['Please wait a few seconds and try again!'],
    ], 429);
}

Redis::set('lock_'.$key, $time);
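Note that a read-then-write pair like this is itself not atomic: two requests can still interleave between the GET and the SET. One way to close that gap is Cache::add(), which on a Redis cache store sets the key only if it does not already exist, in a single atomic operation. A sketch under that assumption, with the same hypothetical lock_-prefixed per-user key:

```php
use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\Cache;

$key = 'lock_'.Auth::id(); // hypothetical per-user key, as above

// Atomically claim the key for 1 second; add() returns false
// if the key already exists, i.e. another request holds it.
if (! Cache::add($key, 1, 1)) {
    return response()->json([
        'message' => 'Rate Limit Exceeded',
        'errors'  => ['Please wait a few seconds and try again!'],
    ], 429);
}
```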
Upvotes: 0