Reputation: 46
I use Celery in my Django project. It works well on my MacBook and in a CentOS VM, but when I run it in a Docker container, any request that calls add.delay() (add is a task) blocks forever.
I created a demo project on GitHub: https://github.com/fengyouchao/proj_test
My task:
from celery import shared_task

@shared_task
def add(x, y):
    return x + y
My view:
from django.http import HttpResponse

def index(request):
    a = int(request.GET.get('a', 1))
    b = int(request.GET.get('b', 2))
    add.delay(a, b)  # the request blocks here when running in Docker
    return HttpResponse("Hello world")

def hello(request):
    return HttpResponse("hello")
In the demo project I created three services in docker-compose.yml: a web service for Django, a celery worker, and a rabbitmq-server broker.
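A minimal sketch of that compose file (image names, build context, and worker command are assumptions here; the actual file is in the linked repository):

version: "3"
services:
  rabbitmq:
    image: rabbitmq:3
    ports:
      - "5672:5672"   # published so a server running on the host can reach it too
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
    depends_on:
      - rabbitmq
  celery:
    build: .
    command: celery -A proj worker -l info
    depends_on:
      - rabbitmq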
Run the services:
docker-compose up
Test:
curl localhost:8000 # blocked
curl localhost:8000/hello # OK
Run the Django project on the host system (using the same rabbitmq-server container as the broker):
python manage.py runserver 0.0.0.0:18000
Test:
curl localhost:18000 # OK, and the "celery" service printed the task log
This problem has been bothering me for a long time, and I can't figure out where it lies. I hope someone can help me. Thanks!
Upvotes: 1
Views: 1228
Reputation: 111
I have faced the same issue, and fixed it by importing the app created in proj/proj/celery.py
in my proj/proj/__init__.py
like this:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
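For that import to work, proj/proj/celery.py has to define the app. A typical version, adapted from the Celery guide referenced below (proj being the project name from the question), looks like this:

import os
from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Read Celery configuration from Django settings; every Celery-related
# setting there is expected to carry a CELERY_ prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Discover tasks.py modules in all installed Django apps.
app.autodiscover_tasks()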
You can find more information in Celery's First steps with Django documentation.
Hope it helps!
Upvotes: 1
Reputation: 11
I just came across a similar issue. I am using a rabbitmq container as the broker, so I added CELERY_BROKER_URL to settings.py. When I ran add.delay() in the Django shell (manage.py shell) inside the container it got stuck, although it worked fine in production. After I made the following change, it started working:

from celery import Celery

app = Celery('app', broker="amqp://rabbitmq")
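The key detail is the hostname: inside the Compose network the broker is reachable by its service name (rabbitmq here), whereas localhost inside the web container points at the container itself. If you prefer to keep the URL in settings.py, the equivalent setting (assuming the usual config_from_object('django.conf:settings', namespace='CELERY') setup) would be:

# settings.py
# "rabbitmq" is the Compose service name; "localhost" inside the
# web container would point at the container itself, not the broker.
CELERY_BROKER_URL = "amqp://rabbitmq"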
Upvotes: 1