Reputation: 1969
We quite often need to enqueue many messages (we chunk them into groups of 1000) using Celery (backed by RabbitMQ). Does anyone know of a way to do this? We are essentially trying to "batch" a large group of messages in one send_task call.
If I were to guess, we would need to go a step "deeper" and hook into kombu, or even py-amqp.
Regards,
Niklas
Upvotes: 5
Views: 2429
Reputation: 1969
What I - provisionally at least - ended up doing was making sure to keep the celery connection open, via:
import celery

with celery.Celery(set_as_current=False) as celeryapp:
    ...
    with celeryapp.connection_for_write(connect_timeout=connection_timeout) as conn:
        for message in messages:
            celeryapp.send_task(...)
That way I don't have to re-create a producer connection for each message.
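For completeness, here is a slightly fuller, hypothetical version of that pattern. The task name tasks.add, the broker URL, and the messages list are placeholders I made up; send_task() accepts a connection argument, so each publish can be pointed explicitly at the one open connection:

import celery

messages = [(i, i) for i in range(10_000)]  # placeholder payloads

with celery.Celery(broker="amqp://guest@localhost//",
                   set_as_current=False) as celeryapp:
    with celeryapp.connection_for_write(connect_timeout=5) as conn:
        for args in messages:
            # Reuse the single open connection for every publish.
            celeryapp.send_task("tasks.add", args=args, connection=conn)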
Upvotes: 1
Reputation: 19787
No need to "go deeper" and use Kombu directly. There are a few solutions, suitable for different use cases:
You may want to use Celery's chunks primitive if you prefer Celery workflows (see the first sketch after this list).
There is nothing stopping you from calling send_task() thousands of times.
If calling send_task() serially is too slow, you may want to use a pool of threads that concurrently send N tasks to the queue (see the second sketch after this list).
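A minimal sketch of the chunks approach, assuming a hypothetical add task and broker URL. chunks(it, n) splits the calls into groups of n, so 10,000 calls become only 10 messages on the broker:

from celery import Celery

app = Celery("tasks", broker="amqp://guest@localhost//")  # placeholder broker URL

@app.task
def add(x, y):
    return x + y

# Split 10,000 calls into chunks of 1,000: only 10 messages are published.
add.chunks(zip(range(10_000), range(10_000)), 1_000)()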
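And a sketch of the thread-pool approach, using the standard-library ThreadPoolExecutor. The task name, worker count, and payloads are again assumptions, not anything from the question:

from concurrent.futures import ThreadPoolExecutor
from celery import Celery

app = Celery("tasks", broker="amqp://guest@localhost//")  # placeholder broker URL

def publish(args):
    # Each worker thread publishes via Celery's producer pool.
    app.send_task("tasks.add", args=args)

with ThreadPoolExecutor(max_workers=8) as pool:
    # Consume the iterator so all publishes complete before the pool exits.
    list(pool.map(publish, ((i, i) for i in range(10_000))))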
Upvotes: 2