Reputation: 1104
When calling the .delay()
method of an imported task from a Django application, the process gets stuck and the request never completes.
We also don't get any error on the console.
Setting a set_trace()
breakpoint with pdb results in the same thing.
The following questions were reviewed but didn't help resolve the issue:
Calling celery task hangs for delay and apply_async
celery .delay hangs (recent, not an auth problem)
E.g., in settings.py:
CELERY_BROKER_URL = os.environ.get("CELERY_BROKER", RABBIT_URL)
CELERY_RESULT_BACKEND = os.environ.get("CELERY_BROKER", RABBIT_URL)
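For reference, RABBIT_URL is defined elsewhere in the asker's settings; a RabbitMQ broker URL typically looks like the following (this exact value is an illustration using RabbitMQ's built-in guest account and default vhost, not the asker's actual configuration):

```python
# Hypothetical fallback: amqp scheme, default guest credentials,
# default port 5672, and "//" for the default "/" vhost.
RABBIT_URL = "amqp://guest:guest@localhost:5672//"

print(RABBIT_URL)
```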
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')
app = Celery('backend')
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
tasks.py:
import time
from celery import shared_task
@shared_task
def upload_file(request_id):
    time.sleep(request_id)
    return True
views.py:
from rest_framework.views import APIView
from .tasks import upload_file
class UploadCreateAPIView(APIView):
    # other methods...
    def post(self, request, *args, **kwargs):
        id = request.data.get("id", None)
        # business logic ...
        print("Going to submit task.")
        import pdb; pdb.set_trace()
        upload_file.delay(id)  # <- this hangs the runserver as well as the set_trace()
        print("Submitted task.")
Upvotes: 7
Views: 6215
Reputation: 1
I've run into this issue where Celery calls through delay
or apply_async
may randomly hang the program indefinitely. I tried all the broker_transport_options
and retry_policy
settings to let Celery recover, but it still happens. Then I found this solution: enforce an execution time limit on a block/function using Python's underlying signal handlers.
import signal
from contextlib import contextmanager

class TimeoutException(Exception):
    pass

@contextmanager
def time_limit(seconds):
    def signal_handler(signum, frame):
        raise TimeoutException("Timed out!")
    signal.signal(signal.SIGALRM, signal_handler)
    signal.alarm(seconds)
    try:
        yield
    finally:
        signal.alarm(0)  # cancel the alarm once the block finishes

def my_function():
    # Give the broker call at most 3 seconds before bailing out.
    with time_limit(3):
        celery_call.apply_async(kwargs={"k1": "v1"}, expires=30)
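To verify that the helper actually interrupts a blocking call, here is a self-contained demonstration. It is Unix-only (SIGALRM is not available on Windows) and must run in the main thread; time.sleep stands in for the hanging broker call:

```python
import signal
import time
from contextlib import contextmanager

class TimeoutException(Exception):
    pass

@contextmanager
def time_limit(seconds):
    def signal_handler(signum, frame):
        raise TimeoutException("Timed out!")
    signal.signal(signal.SIGALRM, signal_handler)
    signal.alarm(seconds)
    try:
        yield
    finally:
        signal.alarm(0)  # always cancel the pending alarm

# time.sleep(5) stands in for a .delay()/.apply_async() call that never returns.
try:
    with time_limit(1):
        time.sleep(5)
    timed_out = False
except TimeoutException:
    timed_out = True

print(timed_out)  # the alarm fires after 1 second and aborts the sleep
```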
Upvotes: 0
Reputation: 1104
The issue was with the setup of the Celery application with Django. We need to make sure that the Celery app is imported and initialized in the following file:
backend/__init__.py
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
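With that import in place, @shared_task functions bind to this app when Django starts. One way to sanity-check the fix (assuming the project root is the working directory and the broker is reachable) is to start a worker and look for the task in the [tasks] banner printed at startup:

```shell
# upload_file should appear in the [tasks] list on startup
celery -A backend worker -l info
```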
Upvotes: 14