bradley.ayers

Reputation: 38372

Retrieve list of tasks in a queue in Celery

How can I retrieve a list of tasks in a queue that are yet to be processed?

Upvotes: 227

Views: 291210

Answers (19)

Rafen

Reputation: 158

To get the number of tasks in a queue, you can use the flower library. Here is a simplified example:

import asyncio
from flower.utils.broker import Broker
from django.conf import settings

def get_queue_length(queue):
    broker = Broker(settings.CELERY_BROKER_URL)
    queues_result = broker.queues([queue])
    res = asyncio.run(queues_result) or [{ "messages": 0 }]
    length = res[0].get('messages', 0)
    return length

Upvotes: 2

mlissner

Reputation: 18156

If you don't use prioritized tasks, this is actually pretty simple if you're using Redis. To get the task counts:

redis-cli -h HOST -p PORT -n DATABASE_NUMBER llen QUEUE_NAME

But prioritized tasks use different keys in Redis, so the full picture is slightly more complicated: you need to query Redis for every priority level. In Python (adapted from the Flower project), this looks like:

import redis
from django.conf import settings  # assumes the REDIS_* values used below live in Django settings

PRIORITY_SEP = '\x06\x16'
DEFAULT_PRIORITY_STEPS = [0, 3, 6, 9]


def make_queue_name_for_pri(queue, pri):
    """Make a queue name for redis
    
    Celery uses PRIORITY_SEP to separate different priorities of tasks into
    different queues in Redis. Each queue-priority combination becomes a key in
    redis with names like:
    
     - batch1\x06\x163 <-- P3 queue named batch1
     
    There's more information about this on GitHub, but it doesn't look like it
    will change any time soon:
     
      - https://github.com/celery/kombu/issues/422
      
    In that ticket the code below, from the Flower project, is referenced:
    
      - https://github.com/mher/flower/blob/master/flower/utils/broker.py#L135
        
    :param queue: The name of the queue to make a name for.
    :param pri: The priority to make a name with.
    :return: A name for the queue-priority pair.
    """
    if pri not in DEFAULT_PRIORITY_STEPS:
        raise ValueError('Priority not in priority steps')
    return '{0}{1}{2}'.format(*((queue, PRIORITY_SEP, pri) if pri else
                                (queue, '', '')))


def get_queue_length(queue_name='celery'):
    """Get the number of tasks in a celery queue.
    
    :param queue_name: The name of the queue you want to inspect.
    :return: the number of items in the queue.
    """
    priority_names = [make_queue_name_for_pri(queue_name, pri) for pri in
                      DEFAULT_PRIORITY_STEPS]
    r = redis.StrictRedis(
        host=settings.REDIS_HOST,
        port=settings.REDIS_PORT,
        db=settings.REDIS_DATABASES['CELERY'],
    )
    return sum([r.llen(x) for x in priority_names])

If you want to get an actual task, you can use something like:

redis-cli -h HOST -p PORT -n DATABASE_NUMBER lrange QUEUE_NAME 0 -1

From there you'll have to deserialize the returned list. In my case I was able to accomplish this with something like:

import base64
import json
import pickle

import redis

r = redis.StrictRedis(
    host=settings.REDIS_HOST,
    port=settings.REDIS_PORT,
    db=settings.REDIS_DATABASES['CELERY'],
)
l = r.lrange('celery', 0, -1)
pickle.loads(base64.b64decode(json.loads(l[0])['body']))

Just be warned that deserialization can take a moment, and you'll need to adjust the commands above to work with various priorities.
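
For example, a rough sketch that reuses make_queue_name_for_pri from above to peek at every priority level (same Redis connection and pickle-serialization assumptions as the snippets above):

for pri in DEFAULT_PRIORITY_STEPS:
    name = make_queue_name_for_pri('celery', pri)
    # Each priority level lives under its own Redis key.
    for raw in r.lrange(name, 0, -1):
        print(name, pickle.loads(base64.b64decode(json.loads(raw)['body'])))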

Upvotes: 39

Valentino

Reputation: 21

I found an approach in the Flower codebase for getting the broker queue length. It's as fast as direct broker access.

app = Celery("tasks")

from flower.utils.broker import Broker
broker = Broker(
    app.connection(connect_timeout=1.0).as_uri(include_password=True),
    broker_options=app.conf.broker_transport_options,
    broker_use_ssl=app.conf.broker_use_ssl,
)

async def queue_length():
    queues = await broker.queues(["celery"])
    return queues[0].get("messages")
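
Since queue_length is a coroutine, it has to be driven by an event loop; a minimal usage sketch:

import asyncio

print(asyncio.run(queue_length()))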

Upvotes: 1

Kevin Cohen

Reputation: 1341

def get_queue_length(total_tasks: int, queue_name: str, node_name: str):
    queue_size = 0
    inspector = app.control.inspect()
    stats = inspector.stats()
    if stats is not None:
        if f"celery@{node_name}" in stats.keys():
            total = stats[f"celery@{node_name}"]["total"]
            if queue_name in total.keys():
                active_tasks = total[queue_name] 
                if int(total_tasks) > int(active_tasks):
                    queue_size = total_tasks - active_tasks
    return queue_size

This leverages celery's control and inspect commands but also keeps an eye on the tasks that have been submitted.

This alone doesn't really work unless you have some sort of loop that is enqueueing items, like the following:

total_tasks = 0
max_queue_length = 100 # choose your number
queue = "celery_queue"
full_queue_name = "YourCeleryApp.your_celery_queue_name"
node_name = "your_node_name"  # placeholder: the hostname part of celery@<hostname>
for item in list_of_tasks:
    total_tasks += 1
    queue_length = get_queue_length(total_tasks=total_tasks, queue_name=full_queue_name, node_name=node_name)
    while int(queue_length) >= max_queue_length:
        time.sleep(10)
        queue_length = get_queue_length(total_tasks=total_tasks, queue_name=full_queue_name, node_name=node_name)
    your_celery_task.apply_async(kwargs={}, queue=queue)

With this, what's happening is the following:

  1. We keep track of how many items have been submitted.
  2. get_queue_length reads the worker's total stat, which is the number of tasks that have been processed by a specific worker on a particular queue.
  3. We check whether the number of tasks submitted is greater than the number of tasks Celery has already processed; the difference is what is still queued.

What this means is that if there are 50 tasks submitted and 30 have been processed, then there are 50 - 30 = 20 tasks in the queue.

Upvotes: 0

HungNV

Reputation: 11

This works for me without removing messages from the queue:

def get_broker_tasks(queue_name) -> list:
    conn = <CELERY_APP_INSTANCE>.connection()

    try:
        simple_queue = conn.SimpleQueue(queue_name)
        queue_size = simple_queue.qsize()
        messages = []

        for _ in range(queue_size):
            # Fetched messages are not acknowledged, so the broker returns
            # them to the queue once the connection closes.
            message = simple_queue.get(block=False)
            messages.append(message)

        return messages
    except Exception:
        return []
    finally:
        print("Close connection")
        conn.close()

Don't forget to swap out CELERY_APP_INSTANCE with your own.

@Owen: Hope my solution meets your expectations.

Upvotes: 0

Shirantha Madusanka

Reputation: 1695

# Check whether a given job_id is queued, active, or scheduled on the first
# worker that responds (the inspect() calls return None when no workers are up).
inspector = current_celery_app.control.inspect()
scheduled = list(inspector.scheduled().values())[0]
active = list(inspector.active().values())[0]
reserved = list(inspector.reserved().values())[0]
# registered() lists task names known to the worker, not queued task
# instances, so it is not included in the search below.
registered = list(inspector.registered().values())[0]
lst = [*scheduled, *active, *reserved]
for i in lst:
    if job_id == i['id']:
        print("Job found")

Upvotes: 2

Caleb Syring

Reputation: 1189

This worked for me in my application:

def get_queued_jobs(queue_name):
    connection = <CELERY_APP_INSTANCE>.connection()

    try:
        channel = connection.channel()
        name, jobs, consumers = channel.queue_declare(queue=queue_name, passive=True)
        active_jobs = []

        def dump_message(message):
            active_jobs.append(message.properties['application_headers']['task'])

        channel.basic_consume(queue=queue_name, callback=dump_message)

        for job in range(jobs):
            connection.drain_events()

        return active_jobs
    finally:
        connection.close()

active_jobs will be a list of strings that correspond to tasks in the queue.

Don't forget to swap out CELERY_APP_INSTANCE with your own.

Thanks to @ashish for pointing me in the right direction with his answer here: https://stackoverflow.com/a/19465670/9843399

Upvotes: 13

DejanLekic

Reputation: 19787

As far as I know, Celery does not provide an API for examining tasks that are waiting in the queue. This is broker-specific. If you use Redis as a broker, for example, then examining tasks that are waiting in the celery (default) queue is as simple as:

  1. connect to the broker
  2. list items in the celery list (the LRANGE command, for example; see the sketch below)

Keep in mind that these are tasks WAITING to be picked by available workers. Your cluster may have some tasks running - those will not be in this list as they have already been picked.

The process of retrieving tasks in a particular queue is broker-specific.
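
A minimal sketch of those two steps, assuming a Redis broker on localhost, the default celery queue, and Celery's default JSON serialization (all connection details are placeholders):

import base64
import json

import redis

# Step 1: connect to the broker (host/port/db are assumptions).
r = redis.Redis(host='localhost', port=6379, db=0)

# Step 2: list the messages waiting in the default 'celery' queue.
for raw in r.lrange('celery', 0, -1):
    message = json.loads(raw)
    # The task name lives in the headers; the arguments are base64-encoded in the body.
    print(message['headers']['task'], base64.b64decode(message['body']))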

Upvotes: 5

semarj

Reputation: 2549

EDIT: See other answers for getting a list of tasks in the queue.

You should look here: Celery Guide - Inspecting Workers

Basically this:

my_app = Celery(...)

# Inspect all nodes.
i = my_app.control.inspect()

# Show the items that have an ETA or are scheduled for later processing
i.scheduled()

# Show tasks that are currently active.
i.active()

# Show tasks that have been claimed by workers
i.reserved()

Depending on what you want.
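
Each of those calls returns a dict keyed by worker node name (or None when no workers respond), so a small sketch of pulling task details out might look like this (names are illustrative):

reserved = i.reserved() or {}
for worker, tasks in reserved.items():
    for task in tasks:
        print(worker, task['name'], task['id'])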

Upvotes: 246

Alexandr S.

Reputation: 1776

If you are using Celery+Django, the simplest way to inspect tasks is to run commands directly from your terminal in your virtual environment, or using a full path to celery:

Doc: http://docs.celeryproject.org/en/latest/userguide/workers.html?highlight=revoke#inspecting-workers

$ celery inspect reserved
$ celery inspect active
$ celery inspect registered
$ celery inspect scheduled

Also if you are using Celery+RabbitMQ you can inspect the list of queues using the following command:

More info: https://linux.die.net/man/1/rabbitmqctl

$ sudo rabbitmqctl list_queues

Upvotes: 72

sashaboulouds

Reputation: 1844

With subprocess.run:

import subprocess
import re

def count_active_tasks():
    # Shell out to `celery inspect active` and count worker_pid occurrences.
    active_process_txt = subprocess.run(
        ['celery', '-A', 'my_proj', 'inspect', 'active'],
        stdout=subprocess.PIPE).stdout.decode('utf-8')
    return len(re.findall(r'worker_pid', active_process_txt))

Be careful to replace my_proj with your project's name.

Upvotes: -1

张朝龙

Reputation: 17

from celery.task.control import inspect  # pre-5.0 import path; on newer Celery use app.control.inspect()

def key_in_list(k, l):
    return bool([True for i in l if k in i.values()])

def check_task(task_id):
    # active() maps each worker node to the list of tasks it is executing.
    for task_list in inspect().active().values():
        if key_in_list(task_id, task_list):
            return True
    return False

Upvotes: 0

hedleyroos

Reputation: 367

If you control the code of the tasks then you can work around the problem by letting a task trigger a trivial retry the first time it executes, and then checking inspect().reserved(). The retry registers the task with the result backend, and Celery can see that. The task must accept self or context as its first parameter so we can access the retry count.

@task(bind=True)
def mytask(self):
    if self.request.retries == 0:
        raise self.retry(exc=MyTrivialError(), countdown=1)
    ...

This solution is broker agnostic, i.e. you don't have to worry about whether you are using RabbitMQ or Redis to store the tasks.

EDIT: after testing I've found this to be only a partial solution. The size of reserved is limited to the prefetch setting for the worker.

Upvotes: 1

Max Malysh

Reputation: 31545

A copy-paste solution for Redis with json serialization:

def get_celery_queue_items(queue_name):
    import base64
    import json  

    # Get a configured instance of a celery app:
    from yourproject.celery import app as celery_app

    with celery_app.pool.acquire(block=True) as conn:
        tasks = conn.default_channel.client.lrange(queue_name, 0, -1)
        decoded_tasks = []

    for task in tasks:
        j = json.loads(task)
        body = json.loads(base64.b64decode(j['body']))
        decoded_tasks.append(body)

    return decoded_tasks

It works with Django. Just don't forget to change yourproject.celery.

Upvotes: 17

Peter Shannon

Reputation: 315

I've come to the conclusion the best way to get the number of jobs on a queue is to use rabbitmqctl as has been suggested several times here. To allow any chosen user to run the command with sudo I followed the instructions here (I did skip editing the profile part as I don't mind typing in sudo before the command.)

I also grabbed jamesc's grep and cut snippet and wrapped it up in subprocess calls.

from subprocess import Popen, PIPE
p1 = Popen(["sudo", "rabbitmqctl", "list_queues", "-p", "[name of your virtual host]"], stdout=PIPE)
p2 = Popen(["grep", "-e", r"^celery\s"], stdin=p1.stdout, stdout=PIPE)
p3 = Popen(["cut", "-f2"], stdin=p2.stdout, stdout=PIPE)
p1.stdout.close()
p2.stdout.close()
print("number of jobs on queue: %i" % int(p3.communicate()[0]))

Upvotes: 2

ashish

Reputation: 2170

To retrieve the number of tasks waiting in a queue on the broker, use this:

from amqplib import client_0_8 as amqp
conn = amqp.Connection(host="localhost:5672", userid="guest",
                       password="guest", virtual_host="/", insist=False)
chan = conn.channel()
# With passive=True the declare call does not modify the queue; 'jobs' is the
# number of messages currently waiting in it.
name, jobs, consumers = chan.queue_declare(queue="queue_name", passive=True)

Upvotes: 16

Paul in 't Hout

Reputation: 431

The celery inspect module appears to only be aware of tasks from the workers' perspective. If you want to view the messages that are in the queue (yet to be pulled by the workers), I suggest using pyrabbit, which can interface with the RabbitMQ HTTP API to retrieve all kinds of information from the queue.

An example can be found here: Retrieve queue length with Celery (RabbitMQ, Django)
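
A minimal sketch with pyrabbit, assuming the RabbitMQ management plugin is enabled on localhost:15672 with the default guest credentials (all values here are placeholders):

from pyrabbit.api import Client

# The management HTTP API usually listens on port 15672.
cl = Client('localhost:15672', 'guest', 'guest')

# Number of messages still sitting in the 'celery' queue on vhost '/'.
print(cl.get_queue_depth('/', 'celery'))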

Upvotes: 7

Ali

Reputation: 6958

If you are using RabbitMQ, use this in the terminal:

sudo rabbitmqctl list_queues

It will print a list of queues with the number of pending tasks. For example:

Listing queues ...
0b27d8c59fba4974893ec22d478a7093    0
0e0a2da9828a48bc86fe993b210d984f    0
[email protected] 0
11926b79e30a4f0a9d95df61b6f402f7    0
15c036ad25884b82839495fb29bd6395    1
[email protected]    0
celery  166
celeryev.795ec5bb-a919-46a8-80c6-5d91d2fcf2aa   0
celeryev.faa4da32-a225-4f6c-be3b-d8814856d1b6   0

The number in the right column is the number of tasks in the queue. Above, the celery queue has 166 pending tasks.

Upvotes: 57

Sebastian Blask

Reputation: 2938

I think the only way to get the tasks that are waiting is to keep a list of tasks you started and let the task remove itself from the list when it's started.

With rabbitmqctl and list_queues you can get an overview of how many tasks are waiting, but not the tasks themselves: http://www.rabbitmq.com/man/rabbitmqctl.1.man.html

If what you want includes tasks that are being processed but not yet finished, you can keep a list of your tasks and check their states:

from tasks import add
result = add.delay(4, 4)

result.ready() # True if finished

Or you let Celery store the results with CELERY_RESULT_BACKEND and check which of your tasks are not in there.
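
A rough sketch of that bookkeeping, assuming a configured result backend (the add task is the one from above):

from tasks import add

# Keep a handle on everything we enqueue.
results = [add.delay(i, i) for i in range(10)]

# Anything not ready yet is still waiting in the queue or being processed.
not_finished = [r for r in results if not r.ready()]
print(len(not_finished), "tasks not finished yet")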

Upvotes: 5
