Evan Gui

Reputation: 795

How to set up celery workers on separate machines?

I am new to Celery. I know how to install and run one server, but I need to distribute tasks to multiple machines. My project uses Celery to assign user requests coming in through a web framework to different machines and then returns the result. I read the documentation, but it doesn't mention how to set up multiple machines. What am I missing?

Upvotes: 63

Views: 33113

Answers (3)

Saeid Mohammadi Nejati

Reputation: 521

For those who use Flask or any other application that needs Celery as an async worker:

  1. The connection between the dispatcher and the worker is just a queue, which can be RabbitMQ or Redis, so you must send your job to this queue.
  2. The job is a function that is serialized and sent to the queue, so you should keep everything about it untouched (including names and configuration).

So if your tasks and Celery definitions are inside a module named "tasks.py", you just need to copy this module to another machine (keeping every name and config untouched) and then start workers based on this module on the target machine.

Now, from any machine that uses the same "tasks.py" module, you can start a job on the target (worker) machine.
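
A minimal sketch of such a module, assuming a Redis broker at a placeholder host (the URLs and the `add` task are illustrative, not from the answer):

    # tasks.py -- must be identical on dispatcher and workers:
    # same module name, same task names, same broker settings.
    from celery import Celery

    app = Celery(
        "tasks",
        broker="redis://broker-host:6379/0",   # placeholder shared broker
        backend="redis://broker-host:6379/1",  # placeholder result backend
    )

    @app.task
    def add(x, y):
        return x + y

On the worker machine, start a worker from the copied module; on the dispatcher, just call the task:

    celery -A tasks worker --loglevel=info    # on the worker machine

    >>> from tasks import add                 # on the dispatcher
    >>> add.delay(2, 3).get()
    5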

Upvotes: 0

user2471214

Reputation: 749

The way I deployed it is like this:

  1. clone your Django project on a Heroku instance (this will run the frontend)
  2. add RabbitMQ as an add-on and configure it (a configuration sketch follows the list)
  3. clone your Django project into another Heroku instance (call it something like "worker") where you will run the Celery tasks
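
A sketch of the wiring, assuming the CloudAMQP add-on (which exposes the broker URL through the CLOUDAMQP_URL environment variable) and a hypothetical project name `myproject`:

    # myproject/celery.py -- a sketch; both Heroku instances run the same
    # code, they just start different processes.
    import os
    from celery import Celery

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

    app = Celery("myproject",
                 broker=os.environ.get("CLOUDAMQP_URL", "amqp://localhost"))
    app.autodiscover_tasks()

The frontend instance then runs the web process and the "worker" instance runs the Celery worker, e.g. via each Procfile:

    web: gunicorn myproject.wsgi                 # frontend instance
    worker: celery -A myproject worker -l info   # worker instance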

Upvotes: 3

Noufal Ibrahim

Reputation: 72835

My understanding is that your app will push requests into a queueing system (e.g. RabbitMQ), and then you can start any number of workers on different machines (with access to the same code as the app which submitted the task). They will pick tasks off the message queue and get to work on them. Once they're done, they will update the tombstone (result) database.

The upshot of this is that you don't have to do anything special to start multiple workers. Just start them on separate identical (same source tree) machines.
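
As a concrete illustration (a sketch; `tasks.add` is a hypothetical task defined in a module shared by the app and the workers):

    # On the machine that submits jobs -- same source tree as the workers.
    from tasks import add   # hypothetical shared task module

    result = add.delay(2, 3)        # pushes a message onto the broker queue
    print(result.get(timeout=10))   # blocks until some worker stores the result

Any idle worker on any machine can pick this message up; the submitter never needs to know which one ran it.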

The server that hosts the message queue need not be the one running the workers, and neither needs to be one of the machines that submit jobs. You just need to put the location of the message queue in your celeryconfig.py, and all the workers on all the machines can pick up jobs from the queue and perform tasks.
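
For example (a sketch with placeholder host names; `broker_url` and `result_backend` are the modern lowercase setting names, loaded with `app.config_from_object("celeryconfig")`):

    # celeryconfig.py -- the same file is deployed to every machine.
    broker_url = "amqp://user:password@queue-host:5672//"  # machine running RabbitMQ
    result_backend = "redis://results-host:6379/0"         # where tombstones are stored

Each machine then starts as many workers as it likes (e.g. `celery -A tasks worker --loglevel=info`), and they all consume from the same queue.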

Upvotes: 66
