Mammouth

Reputation: 41

Django - Celery with RabbitMQ: task always remains in PENDING

I have to use Celery 4.0.2 with RabbitMQ 3.6.10 to handle an async task, so I followed this tutorial: https://www.codementor.io/uditagarwal/asynchronous-tasks-using-celery-with-django-du1087f5k

However, I have a problem with my task: I can never get a result, and the task always remains in the "PENDING" state.

My question is: what do I have to do to get a result?

Thank you in advance for your answer.

Here is my code:

Here is my __init__.py:

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ['celery_app']
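
(For context: this is the project-level __init__.py from the tutorial. Assuming the project package is called mysite and the Django app is called blog, which is what DJANGO_SETTINGS_MODULE and the shell import below suggest, the layout is roughly:)

mysite/
    manage.py
    mysite/
        __init__.py   <- the file above
        settings.py
        celery.py
    blog/
        tasks.py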

Here is part of my settings.py:

BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
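
(One thing I am not sure about: since celery.py below calls config_from_object(..., namespace='CELERY'), my understanding is that only settings prefixed with CELERY_ are picked up, so the prefixed equivalents would look like this — just my reading of the Celery/Django docs, not something I have verified:)

CELERY_BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'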

Here is my celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

app = Celery('mysite',
    backend='amqp',
    broker='amqp://guest@localhost//')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
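
(Side note: from what I read in the Celery 4 docs, the old 'amqp' result backend is deprecated and rpc:// is recommended with RabbitMQ, so the app could presumably also be created like this — just a sketch based on the docs:)

app = Celery('mysite',
    backend='rpc://',
    broker='amqp://guest:guest@localhost//')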

And my tasks.py:

# Create your tasks here
from __future__ import absolute_import, unicode_literals
from celery import shared_task, current_task


@shared_task
def add(x, y):
    test = "ok"
    current_task.update_state(state='PROGRESS',
        meta={'test': test})
    return x + y
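
(For reference, my understanding is that the Celery docs usually show state updates with a bound task, which avoids importing current_task — a sketch only, not what I currently run:)

@shared_task(bind=True)
def add(self, x, y):
    # with bind=True the task instance is passed in as `self`
    self.update_state(state='PROGRESS', meta={'test': 'ok'})
    return x + y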

And here is my Django shell session:

>>> from blog.tasks import *
>>> job = add.delay(2,3)
>>> job.state
'PENDING'
>>> job.result
>>>
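
(As far as I understand, while no worker is running the message just sits in the RabbitMQ queue and the AsyncResult stays PENDING. With a worker running and a result backend configured, I would expect something like this — an expected-output sketch, not what I actually see:)

>>> job = add.delay(2, 3)
>>> job.get(timeout=10)    # blocks until the worker stores the result
5
>>> job.state
'SUCCESS'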

Here is also a screenshot of my RabbitMQ management interface (image not included).

Upvotes: 0

Views: 432

Answers (1)

rafalmp

Reputation: 4068

You need to start a worker that will process the tasks you add to the queue. From your virtualenv, run:

celery worker -A blog -l info
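
(If the Celery app instance lives in the project package, as the celery.py in the question suggests, the worker can also be pointed at the project instead — assuming the project package is named mysite as in the question's celery.py, and running from the directory that contains manage.py:)

celery -A mysite worker -l info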

Upvotes: 1
