llioor

Reputation: 6238

Laravel job size limit of 256 KB is being exceeded (AWS SQS queue)

I searched online but didn't find any solution for my issue. I'm using Laravel 5.2 with SQS as the queue driver. I'm dispatching a job in order to send email messages to 100 users. The job receives the "Article" model and an array of "User" models, and each user is supposed to receive an email with the "article".

When there are 10 users, everything is OK. With 100 users I receive a "400 Bad Request" error from the Amazon SQS service, and the response is: "Reason: Message must be shorter than 262144 bytes." I understand the job's request is too big because of the users array.

I want to split the users array in order to reduce the job's request size to less than 256 KB. I can do this by looping through the users array and, every time I get close to 256 KB, dispatching a job with the article and the users collected so far, then continuing over the rest of the users in the array.
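A rough sketch of that chunking idea (SendArticleEmails is a hypothetical job name, and the chunk size of 100 is arbitrary; in practice the safe size depends on how large each serialized user is):

// Dispatch one job per fixed-size chunk of users, so no single
// SQS message carries the whole list.
foreach (array_chunk($users, 100) as $chunk) {
    dispatch(new SendArticleEmails($article, $chunk));
}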

  1. How can I check the size of a job's request before dispatching it?
  2. Do you have a better solution to offer?

Thank you very much in advance, Leo.

Upvotes: 7

Views: 7721

Answers (2)

llioor

Reputation: 6238

I found a solution for the problem. The answer is written in the docs:

Because of the SerializesModels trait that the job is using, Eloquent models will be gracefully serialized and unserialized when the job is processing. If your queued job accepts an Eloquent model in its constructor, only the identifier for the model will be serialized onto the queue.

It means Laravel saves just the IDs of serialized Eloquent models, which reduces the size of the dispatched job. On the other hand, I still don't know how to check the size of the job before dispatching it.
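For illustration, a minimal job that relies on this behavior (SendArticleEmail and App\Article are hypothetical names; the class shape follows the standard Laravel 5.2 job scaffolding):

use App\Article;
use App\Jobs\Job;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class SendArticleEmail extends Job implements ShouldQueue
{
    use InteractsWithQueue, SerializesModels;

    protected $article;

    public function __construct(Article $article)
    {
        // Thanks to SerializesModels, only the model class and its ID
        // end up in the queue payload; the worker re-fetches the row.
        $this->article = $article;
    }

    public function handle()
    {
        // $this->article is a fully hydrated model again here.
    }
}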

Edit + Solution:

Basically, the best way to handle it is by dispatching many small jobs. I will explain with an example. Let's say we have an event at work and we want to notify all workers about it. The best way to handle it is to dispatch ONE queued job with the event->id, and then inside that job to loop over the workers and dispatch many small queued jobs, each with a $worker->id. This way no single queue payload ever gets too big. If we dispatched one job holding all the workers of an event, it could carry thousands of worker IDs, and we would hit the queue size limit again.
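A sketch of that fan-out pattern (NotifyWorkersAboutEvent, NotifyWorker, and the Event model with a workers relation are all hypothetical names):

use App\Event;
use App\Jobs\Job;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class NotifyWorkersAboutEvent extends Job implements ShouldQueue
{
    use InteractsWithQueue, SerializesModels;

    protected $event;

    public function __construct(Event $event)
    {
        $this->event = $event; // serialized as just the event ID
    }

    public function handle()
    {
        // Fan out: each worker gets their own tiny job, so every
        // queue message stays far below the 256 KB limit.
        foreach ($this->event->workers as $worker) {
            dispatch(new NotifyWorker($worker));
        }
    }
}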

Hope it was clear. Lior.

Upvotes: 2

alepeino

Reputation: 9771

OK, you already figured out that the SerializesModels trait helps keep the payload size low by saving just the model class and the IDs, to be fetched again from the database when the job is resolved.

If it still makes the job too large, the Queue object is the one that creates the payload, in the createObjectPayload method. It's protected, so you may not be able to access it easily from outside, but most of the payload is just serialize(clone $job). That will give you an idea of the current size of the job. Maybe create a method in the job class that adds Users to its inner Eloquent\Collection until serialize($this) reaches a limit?
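As a quick illustration of that serialize(clone $job) idea (YourJob and its constructor arguments are placeholders, and the real SQS payload will be slightly larger because of the wrapper Laravel adds around the serialized job):

$job = new YourJob($article, $users); // hypothetical constructor args

// Rough pre-dispatch estimate; SQS rejects messages of 262144 bytes or more.
if (strlen(serialize(clone $job)) >= 262144) {
    // Too big: split the users before dispatching.
}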


If the job constructor is

public function __construct(\Illuminate\Database\Eloquent\Collection $users, $maxSize)
{
    $this->users = new \Illuminate\Database\Eloquent\Collection();

    // Move users into this job until the serialized size approaches
    // the limit or the source collection runs out.
    while (! $users->isEmpty() && strlen(serialize(clone $this)) < $maxSize) {
        $this->users->push($users->shift());
    }
}

Then you call it like this:

// $usersToNotify is the collection with the users

$job = new YourJob($usersToNotify, $maxSize);

// Now the job has all users within the limit,
// and $usersToNotify has the remaining users.
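
And since the constructor shifts users out of the original collection, you could keep building jobs until it is drained, e.g.:

// Each pass takes as many users as fit under $maxSize.
while (! $usersToNotify->isEmpty()) {
    dispatch(new YourJob($usersToNotify, $maxSize));
}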

Upvotes: 2
