Anubhav

Reputation: 585

How to start multiple workers for only one of rq Queues?

I need to start multiple workers for only one of my queues (the "high" priority one below). How can I do this in the context of a worker script that I am using to start my workers?

from config import Config
from redis import from_url
from rq import Worker, Queue, Connection

listen = ['high', 'mid', 'low']

conn = from_url(Config.REDIS_URL)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(list(map(Queue, listen)), log_job_description=True)
        worker.work()

The worker script itself is being called by a supervisor process that spawns 2 worker instances for each of my queues.


[supervisord]

[program:worker]
command=python -m worker
process_name=%(program_name)s-%(process_num)s
numprocs=2
directory=.
stopsignal=TERM
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
redirect_stderr=true

If I want to have 3 workers ready for my "high" queue but only 2 for the "mid" and "low" queues, how do I go about achieving this?

I tried starting workers in "burst" mode, but that also kills the workers if there are not enough jobs. I could live with a solution that autoscales the workers like burst does, but keeps at least ONE worker alive at all times.

Upvotes: 5

Views: 1739

Answers (1)

peeyush113

Reputation: 120

You can modify your worker file to read queue names from command-line arguments, then run separate supervisor programs to manage the different types of workers, each listening on different queues.

import sys

from config import Config
from redis import from_url
from rq import Worker, Queue, Connection

try:
    # Listen only on the queue named on the command line, if given
    listen = [sys.argv[1]]
except IndexError:
    # No argument: fall back to listening on all queues
    listen = ['high', 'mid', 'low']

conn = from_url(Config.REDIS_URL)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(list(map(Queue, listen)), log_job_description=True)
        worker.work()
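If you later want one worker process to listen on several explicitly chosen queues, the argv handling above can be generalized to accept any number of queue names (a small sketch; the queue names and the `queues_from_argv` helper are illustrative, not part of the original script):

```python
import sys

# Queues from the question, used when no arguments are given
DEFAULT_QUEUES = ['high', 'mid', 'low']

def queues_from_argv(argv):
    """Return the queue names passed on the command line, or the defaults."""
    return argv[1:] or DEFAULT_QUEUES

# e.g. listen = queues_from_argv(sys.argv)
```

With this, `python -m worker high mid` would start a worker that drains `high` before `mid`, since rq dequeues in the order the queues are listed.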

and the supervisor config might look like this:

[supervisord]

[program:high_worker]
command=python -m worker high
process_name=%(program_name)s-%(process_num)s
numprocs=1
directory=.
stopsignal=TERM
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
redirect_stderr=true

[program:worker]
command=python -m worker
process_name=%(program_name)s-%(process_num)s
numprocs=2
directory=.
stopsignal=TERM
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
redirect_stderr=true

These scripts are for example purposes; you might need to adapt them to your needs.
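As an aside, rq also ships a CLI worker, so the custom script could potentially be dropped entirely and the queue names passed straight to `rq worker` in the supervisor `command=` lines (a sketch; the Redis URL here is an assumed placeholder, substitute your own):

```shell
# high-priority-only workers (run 3 via numprocs=3 in supervisor)
rq worker high --url redis://localhost:6379

# general workers listening on all three queues, in priority order
rq worker high mid low --url redis://localhost:6379
```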

Upvotes: 0
