Liam

Reputation: 21

Celery with RabbitMQ creating too many queues

When running Django/Celery/RabbitMQ on a production server, some tasks are sent and consumed correctly. However, RabbitMQ starts using up all the CPU after processing is done. I believe this is related to the following report:

RabbitMQ on EC2 Consuming Tons of CPU

In that thread, it is suggested to set these config values (the option names come from that discussion; the expiry value below is just an example):
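    CELERY_IGNORE_RESULT = True
    CELERY_AMQP_TASK_RESULT_EXPIRES = 3600  # seconds; example value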

I forked and customized the celery-haystack package to set both of those values when calling apply_async(); however, it seems to have had no effect.

I think Celery is automatically creating a large number of UUID-named queues (one per task) to store results, but I don't seem to be able to stop it.

Any ideas?

Upvotes: 2

Views: 1131

Answers (1)

Vinh

Reputation: 83

I just spent a day digging into this problem myself. I think the two options you mentioned can be explained like this:

  • CELERY_IGNORE_RESULT: if True, the results of tasks are ignored, so they won't return anything when you call them with delay or apply_async (the call still gives you an AsyncResult, but no result is ever stored for it).
  • CELERY_AMQP_TASK_RESULT_EXPIRES: the expiration time for a result stored in the result backend. You can set this option to a reasonable value so RabbitMQ can delete the expired result queues.
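If it helps, ignore_result can also be set per task instead of globally, something like this (a sketch using the Celery app API; add is just a dummy example task):

    from celery import Celery

    app = Celery('proj', broker='amqp://guest@localhost//')

    # per-task alternative to the global CELERY_IGNORE_RESULT setting
    @app.task(ignore_result=True)
    def add(x, y):
        return x + y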

The many queues being generated are only for storing results. So if you don't want to store any results at all, you can remove the CELERY_RESULT_BACKEND option from your config file.
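In other words, something like this in your config (a sketch; setting names in the django-celery style of that era):

    # no result backend configured => no per-task result queues are created
    BROKER_URL = 'amqp://guest@localhost//'

    # CELERY_RESULT_BACKEND = 'amqp'  # remove or comment out this line
    CELERY_IGNORE_RESULT = True       # and/or ignore results globally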

Have a nice day!

Upvotes: 1
