user3201419

Reputation: 41

Is it common to run 20 Python workers which use Redis as a queue?

This program listens to Redis queues. If there is data in Redis, a worker starts doing its job. All these jobs have to run simultaneously, which is why each worker listens to one particular Redis queue.

My question is: is it common to run more than 20 workers listening to Redis?

python /usr/src/worker1.py
python /usr/src/worker2.py
python /usr/src/worker3.py
python /usr/src/worker4.py
python /usr/src/worker5.py
....
....
python /usr/src/worker6.py
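Each worker is essentially a blocking pop loop like the one below (a minimal sketch using the redis-py client; the queue key queue:worker1 and the handle_job() helper are placeholders, not taken from the original program):

# worker1.py -- minimal sketch of a single worker blocking on its own Redis list
import json
import redis

r = redis.Redis(host='localhost', port=6379, db=0)

def handle_job(payload):
    # the actual long-running work would go here
    print('processing', payload)

while True:
    # BLPOP blocks until an item is available on the list,
    # then returns a (key, value) pair
    _key, raw = r.blpop('queue:worker1')
    handle_job(json.loads(raw))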

Upvotes: 1

Views: 1026

Answers (2)

bruno desthuilliers

Reputation: 77942

Having multiple worker processes (and by "multiple" I mean hundreds or more), possibly running on different machines, fetching jobs from a job queue is indeed a common pattern nowadays. There are even whole packages/frameworks devoted to such workflows, for example Celery.

What is less common is trying to write the whole task queue system from scratch in a seemingly ad-hoc way instead of using a dedicated task queue system like Celery, ZeroMQ or something similar.
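For comparison, a minimal Celery setup with a Redis broker might look like this (a sketch under my own assumptions: the module name tasks.py, the task name process and the broker URL are all illustrative, not from the question):

# tasks.py -- hedged sketch of the equivalent setup in Celery, using Redis as the broker
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def process(payload):
    # the long-running job goes here
    return payload

You would then start as many worker processes as needed with something like celery -A tasks worker --concurrency=20 and enqueue jobs with process.delay(payload); Celery handles the per-worker queue plumbing for you.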

Upvotes: 2

Indent

Reputation: 4967

If your workers need to do long-running tasks on the data, it is a workable solution, but each piece of data must be processed by a single worker.

This way, you can easily distribute your tasks (without threads, etc.), and it works even better if your workers don't all run on the same server.
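On the "single worker per piece of data" point: because Redis pops list items atomically, several identical workers can block on the same list and each job is still delivered to exactly one of them, so a shared queue can replace one queue per worker. A short sketch (assuming the redis-py client and a shared list key jobs that I made up):

# producer.py -- push jobs onto one shared list; any number of identical
# workers calling blpop('jobs') will each receive a given job exactly once
import json
import redis

r = redis.Redis()
r.rpush('jobs', json.dumps({'job_id': 1, 'data': 'example'}))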

Upvotes: 2
