Venkat

Reputation: 462

Running periodic tasks with django and celery

I'm trying to create a simple background periodic task using the Django-Celery-RabbitMQ combination. I installed Django 1.3.1, and downloaded and set up djcelery. Here is what my settings.py file looks like:

BROKER_HOST = "127.0.0.1"
BROKER_PORT = 5672
BROKER_VHOST = "/"
BROKER_USER = "guest"
BROKER_PASSWORD = "guest"
....
import djcelery
djcelery.setup_loader()
...
INSTALLED_APPS = (
    'djcelery',
)

And I put a 'tasks.py' file in my application folder with the following contents:

from celery.task import PeriodicTask
from celery.registry import tasks
from datetime import timedelta
from datetime import datetime

class MyTask(PeriodicTask):
    run_every = timedelta(minutes=1)

    def run(self, **kwargs):
        # datetime.now() must be formatted before combining with a str
        self.get_logger().info("Time now: %s" % datetime.now())
        print("Time now: %s" % datetime.now())

tasks.register(MyTask)
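As a side note, concatenating a str with a datetime object (e.g. `"Time now: " + datetime.now()`) raises a TypeError, so the timestamp has to be formatted first. A minimal, self-contained illustration (the fixed timestamp is only for the example):

```python
from datetime import datetime

# str + datetime raises TypeError; format the datetime first
now = datetime(2012, 4, 29, 7, 50, 54)
message = "Time now: %s" % now.isoformat()
print(message)  # Time now: 2012-04-29T07:50:54
```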

And then I start up my django server (local development instance):

python manage.py runserver

Then I start up the celerybeat process:

python manage.py celerybeat --logfile=<path_to_log_file> -l DEBUG

I can see entries like this in the log:

[2012-04-29 07:50:54,671: DEBUG/MainProcess] tasks.MyTask sent. id->72a5963c-6e15-4fc5-a078-dd26da663323

And I can also see the corresponding entries getting created in the database, but I can't find where it logs the text I specified in the run method of the MyTask class.

I tried fiddling with the logging settings and tried using the Django logger instead of the Celery logger, but to no avail. I'm not even sure my task is getting executed. If I print any debug information in the task, where does it go?

Also, this is the first time I'm working with any kind of message queuing system. It looks like the task will get executed as part of the celerybeat process, outside the Django web framework. Will I still be able to access all the Django models I created?

Thanks, Venkat.

Upvotes: 4

Views: 4988

Answers (2)

Rustem

Reputation: 2932

Celerybeat only schedules tasks and publishes them when they are due; it does not execute them. Your task instances are stored on the RabbitMQ server. You need to run the celeryd daemon to actually execute your tasks:

python manage.py celeryd --logfile=<path_to_log_file> -l DEBUG
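If you prefer a single process during development, the worker can also embed the beat scheduler instead of running celerybeat separately (assuming django-celery's manage.py commands, where `-B` enables the embedded scheduler):

```shell
# run the worker and the beat scheduler in one process (development only)
python manage.py celeryd -B --logfile=<path_to_log_file> -l DEBUG
```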

Also, if you are using RabbitMQ, I recommend installing the RabbitMQ management plugin:

rabbitmq-plugins list
rabbitmq-plugins enable rabbitmq_management
service rabbitmq-server restart

It will be available at http://&lt;your-server&gt;:55672/ (login: guest, password: guest). There you can check how many tasks are in your RabbitMQ instance.
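Alternatively, queue depths can be checked from the command line with rabbitmqctl, which ships with RabbitMQ:

```shell
# list each queue and the number of messages waiting in it
rabbitmqctl list_queues name messages
```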

Upvotes: 5

alexarsh

Reputation: 5391

You should check the RabbitMQ logs, since Celery sends the tasks to RabbitMQ and it should execute them. So all the prints from the tasks should be in the RabbitMQ logs.

Upvotes: 0
