Reputation: 1597
I have a Celery 3.1.19 setup which uses a BROKER_URL including a virtual host.
# in settings.py
BROKER_URL = 'amqp://guest:guest@localhost:5672/yard'
Celery starts normally, loads the tasks, and the tasks I define with the @app.task decorator work fine. I assume that my RabbitMQ and Celery configuration on this end is correct.
Tasks that I define with @shared_task and load with app.autodiscover_tasks also load correctly at startup. However, when I call such a task, the message ends up in the (still existing) amqp://guest:guest@localhost:5672/ default virtual host.
Question: What am I missing here? Where do shared tasks get their actual configuration from?
And here some more details:
# celery_app.py
from celery import Celery
celery_app = Celery('celery_app')
celery_app.config_from_object('settings')
celery_app.autodiscover_tasks(['connectors'])
@celery_app.task
def i_do_work():
    print 'this works'
And in connectors/tasks.py (with an __init__.py in the same folder):
# in connectors/tasks.py
from celery import shared_task
@shared_task
def I_do_not_work():
    print 'bummer'
Again, the shared task also gets picked up by the Celery instance. It just somehow lacks the context to send messages to the right BROKER_URL.
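This is roughly how I observed it. I am only assuming here that the shared_task proxy resolves against whatever app is "current" at call time, so treat it as a diagnostic sketch rather than anything definitive:
# diagnostic sketch (assumption: the shared_task proxy forwards attribute
# access to the task registered on whatever app is "current" at call time)
from celery import current_app
from connectors.tasks import I_do_not_work

# both of these print Celery's built-in default broker for me,
# not the /yard vhost from settings.py
print current_app.conf.BROKER_URL
print I_do_not_work.app.conf.BROKER_URL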
Btw, why are shared_tasks so poorly documented? Do they rely on some Django context? I am not using Django.
Or do I need additional parameters in my settings?
Thanks a lot.
Upvotes: 4
Views: 2135
Reputation: 1597
The celery_app was not yet imported at application start. Within my project, I added the following code to the __init__.py at the same module level as my celery_app definition:
from __future__ import absolute_import
try:
    from .celery_app import celery_app
except ImportError:
    # just in case someone develops the application
    # without celery running
    pass
I was confused by the fact that Celery seems to come with a perfectly working default app. In this case, a more interface-like structure raising a NotImplementedError might have been more helpful. Nevertheless, Celery is awesome.
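For completeness, this is roughly how I convinced myself the fix works. The package name myproject below is just a placeholder for the package whose __init__.py now contains the import above; adjust the imports to your layout:
# verification sketch -- 'myproject' is a placeholder package name (assumption);
# its __init__.py imports celery_app, which makes the configured app current
import myproject
from celery import current_app
from connectors.tasks import I_do_not_work

print current_app.conf.BROKER_URL   # now shows amqp://guest:guest@localhost:5672/yard
I_do_not_work.delay()               # the message lands in the /yard virtual host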
Upvotes: 4