jazzblue

Reputation: 2437

Celery: how to limit number of tasks in queue and stop feeding when full?

I am very new to Celery, and here is my question:

Suppose I have a script that is supposed to constantly fetch new data from a DB and send it to workers via Celery.

tasks.py

# Celery Task
from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')

@app.task
def process_data(x):
    # Do something with x
    pass

fetch_db.py

# Fetch new data from DB and dispatch to workers.
import time

from tasks import process_data

while True:
    # Run DB query here to fetch new data into fetched_data
    fetched_data = ...

    process_data.delay(fetched_data)

    time.sleep(30)

Here is my concern: the data is fetched every 30 seconds, but process_data() could take much longer, so with too few workers the queue will keep growing, as I understand it.

  1. I cannot increase number of workers.
  2. I can modify the code to refrain from feeding the queue when it is full.

The question is: how do I set the queue size, and how do I know when it is full? In general, how should I deal with this situation?

Upvotes: 15

Views: 10374

Answers (1)

faisal burhanudin

Reputation: 1150

You can set RabbitMQ's x-max-length argument when predeclaring the queue with kombu.

Example:

from celery import Celery
from kombu import Queue, Exchange

class Config(object):
    BROKER_URL = "amqp://guest@localhost//"

    CELERY_QUEUES = (
        Queue(
            'important',
            exchange=Exchange('important'),
            routing_key="important",
            queue_arguments={'x-max-length': 10}
        ),
    )

app = Celery('tasks')
app.config_from_object(Config)


@app.task(queue='important')
def process_data(x):
    pass
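One caveat worth knowing (my addition, not part of the original answer): by default RabbitMQ handles an overflowing queue with drop-head behavior, i.e. it silently discards the oldest messages once x-max-length is reached. If you would rather have new publishes refused at the limit, newer RabbitMQ versions (3.7+) also accept an x-overflow queue argument. A sketch of the same declaration with that argument added:

```python
from kombu import Queue, Exchange

# 'reject-publish' tells the broker to refuse new messages at the limit
# instead of dropping the oldest ones (requires RabbitMQ 3.7+).
important = Queue(
    'important',
    exchange=Exchange('important'),
    routing_key='important',
    queue_arguments={'x-max-length': 10, 'x-overflow': 'reject-publish'},
)
```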

or using a policy (here matching the important queue from the example and capping it at 10 messages):

rabbitmqctl set_policy limit-important "^important$" '{"max-length":10}' --apply-to queues
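As for the "how do I know it is full?" part on the producer side: an AMQP passive queue declare returns the current message count, so the fetch loop can check the depth and skip publishing when the queue is at its limit. A minimal sketch (assuming a local RabbitMQ broker and the important queue declared as above; should_publish, queue_depth, and MAX_QUEUE_LENGTH are illustrative names of my own):

```python
MAX_QUEUE_LENGTH = 10  # must match the x-max-length queue argument

def should_publish(message_count, limit=MAX_QUEUE_LENGTH):
    # Feed the queue only while it is below its configured limit.
    return message_count < limit

def queue_depth(connection, queue_name='important'):
    # A passive declare only inspects the queue: it returns the
    # current message count and raises if the queue does not exist,
    # instead of creating it.
    _, message_count, _ = connection.default_channel.queue_declare(
        queue=queue_name, passive=True)
    return message_count

# Producer loop sketch (requires a running RabbitMQ broker):
# from kombu import Connection
# with Connection('amqp://guest@localhost//') as conn:
#     if should_publish(queue_depth(conn)):
#         process_data.delay(fetched_data)
```

Polling the depth this way is racy (other producers may publish between the check and the delay call), so treat it as a soft limit; the broker-side x-max-length remains the hard guarantee.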

Upvotes: 11
