basudev patro

Reputation: 5

How to assign multiple queues for multiple tasks in Celery?

I have a Python big-data project in which I'm trying to run my tasks using Celery and a Redis server. The thing is, I need three different queues for the three different tasks that I'm sending to Celery.

This is the configuration I've made to run the three tasks at once, but they all use a single queue and the tasks are performed one after the other, so it takes a lot of time.

from celery import Celery
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

app = Celery("tasks", broker="redis://localhost:6379")

@app.task()
def main(capital, gamma, alpha):
    ...  # task body omitted

@app.task()
def gain(capital, gamma, alpha):
    ...  # task body omitted

@app.task()
def lain(capital, gamma, alpha):
    ...  # task body omitted

And to start the Celery workers I'm using these lines of code:

("celery -A task worker --loglevel=info -P eventlet --concurrency=10 -n worker1@%h" , shell=True)

("celery -A task worker --loglevel=info -P eventlet --concurrency=10 -n worker2@%h" , shell=True)

("celery -A task worker --loglevel=info -P eventlet --concurrency=10 -n worker3@%h" , shell=True)

This code runs my app with the three tasks, but I need three independent queues, one for each task, so that all three tasks can run concurrently/in parallel.

This is how my Celery app looks with the three tasks, but it's using only one queue, the default queue named celery:

 -------------- worker1@EC2AMAZ-RTM8UD8 v5.2.3 (dawn-chorus)
--- ***** -----
-- ******* ---- Windows-10-10.0.20348-SP0 2022-03-10 12:59:07
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x248adf81e80
- ** ---------- .> transport:   redis://localhost:6379//
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 10 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . task.gain
  . task.lain
  . task.main

[2022-03-10 12:59:07,461: INFO/MainProcess] Connected to redis://localhost:6379//
[2022-03-10 12:59:07,477: INFO/MainProcess] mingle: searching for neighbors
[2022-03-10 12:59:08,493: INFO/MainProcess] mingle: all alone
[2022-03-10 12:59:08,493: INFO/MainProcess] pidbox: Connected to redis://localhost:6379//.
[2022-03-10 12:59:08,493: INFO/MainProcess] worker1@EC2AMAZ-RTM8UD8 ready.

So can anyone please help me define three different queues for the three different tasks in the Celery app? Any help would be very much appreciated :)

Upvotes: 0

Views: 2781

Answers (1)

DejanLekic

Reputation: 19822

It should be as simple as adding -Q <queue name> (a different queue per worker) to those three Celery workers that you run.
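For example (a minimal sketch; the queue names main_queue, gain_queue and lain_queue are placeholders you can choose yourself), each worker gets its own queue:

celery -A task worker --loglevel=info -P eventlet --concurrency=10 -n worker1@%h -Q main_queue
celery -A task worker --loglevel=info -P eventlet --concurrency=10 -n worker2@%h -Q gain_queue
celery -A task worker --loglevel=info -P eventlet --concurrency=10 -n worker3@%h -Q lain_queue

You also need to tell Celery which task goes to which queue, e.g. via task_routes in the app configuration:

# Hypothetical routing; the queue names are the same placeholders as above.
app.conf.task_routes = {
    "task.main": {"queue": "main_queue"},
    "task.gain": {"queue": "gain_queue"},
    "task.lain": {"queue": "lain_queue"},
}

or per call with main.apply_async(args=(capital, gamma, alpha), queue="main_queue"). With that in place the three workers consume from separate queues and the tasks run in parallel.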

Upvotes: 2
