Niklas B

Reputation: 1969

Celery: enqueuing multiple (100-1000) tasks at the same time via send_task?

We quite often need to enqueue many messages (we chunk them into groups of 1000) using Celery (backed by RabbitMQ). Does anyone have a way to do this? We're basically trying to "batch" a large group of messages in one send_task call.

If I were to guess, we would need to go a step "deeper" and hook into kombu or even py-amqp.

Regards,
Niklas

Upvotes: 5

Views: 2429

Answers (2)

Niklas B

Reputation: 1969

What I ended up doing, provisionally at least, was making sure to keep the Celery connection open, via:

import celery

with celery.Celery(set_as_current=False) as celeryapp:
    ...
    # Hold one broker connection open and reuse it for every publish
    with celeryapp.connection_for_write(connect_timeout=connection_timeout) as conn:
        for message in messages:
            celeryapp.send_task(...)

That way I don't have to re-create a connection for each message I produce.

Upvotes: 1

DejanLekic

Reputation: 19787

No need to "go deeper" and use Kombu directly. There are a few solutions suitable for different use cases:

  • You may want to use the chunks primitive if you prefer Celery workflows (see the first sketch after this list).

  • There is nothing stopping you from calling send_task() thousands of times.

  • If calling send_task() in a loop is too slow, you may want to use a pool of threads that concurrently send N tasks to the queue (see the thread-pool sketch after this list).
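
A minimal sketch of the chunks option. The task process_item and the broker URL are assumptions for illustration; chunks() turns 1000 calls into 10 task messages, each carrying 100 argument tuples, and the worker unpacks each chunk into individual process_item() calls.

from celery import Celery

app = Celery(broker="amqp://guest@localhost//")  # assumed broker URL

@app.task
def process_item(x):
    return x * 2

# 1000 items -> 10 messages of 100 argument tuples each
result = process_item.chunks([(i,) for i in range(1000)], 100)()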
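
And a minimal sketch of the thread-pool option. The task name "tasks.process_item" and the broker URL are again assumptions; send_task publishes by name, so the task code does not need to be importable in the producer process.

from concurrent.futures import ThreadPoolExecutor

from celery import Celery

app = Celery(broker="amqp://guest@localhost//")  # assumed broker URL

def enqueue(item):
    # Publishes one task message; returns an AsyncResult
    return app.send_task("tasks.process_item", args=[item])

with ThreadPoolExecutor(max_workers=8) as pool:
    # Each worker thread publishes its share of the 1000 messages concurrently
    async_results = list(pool.map(enqueue, range(1000)))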

Upvotes: 2
