Reputation: 357
I'm using Celery with Django to process a task that returns a JSON value which needs to be stored on a Model record. Right now I see 2 options to persist it in the Django database:

1. Have the task update the record itself.
2. Use the django-db result backend for Celery, which stores the result in Celery's own task_result table. This means I'd have to persist the AsyncResult id on the record, and whenever the client requests the record I'd look up whether the task has finished.

To me it seems that option 1 is better, but since I haven't worked with Celery in recent years I want to know if there are downsides to it, and/or in which situations option 2 would be the better fit.
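For reference, the read path of option 2 would look roughly like this (a sketch; MyRecord and its task_id field are placeholders for my actual model):

from celery.result import AsyncResult
from django.http import JsonResponse

from .models import MyRecord  # placeholder model that stores the AsyncResult id in task_id

def get_record(request, record_id):
    record = MyRecord.objects.get(id=record_id)
    result = AsyncResult(record.task_id)
    if result.successful():
        # the task finished; result.result is the JSON value it returned
        return JsonResponse({"status": result.state, "value": result.result})
    return JsonResponse({"status": result.state})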
Thanks!
Upvotes: 3
Views: 3566
Reputation: 473
No, there is nothing wrong with the first approach.
tasks.py

from celery import shared_task

from app.models import your_model

@shared_task
def update_model(instance_id):
    model_obj = your_model.objects.get(id=instance_id)
    # do your stuff here: update the record and save it

views.py

from app.tasks import update_model

def your_view(request):
    # your code
    update_model.delay(id_of_the_instance_you_want_to_update)
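Since the goal is to store the task's JSON output on the record, the body of update_model could look roughly like this (a sketch: the payload construction and the json_result JSONField are assumptions, not part of the code above):

from celery import shared_task

from app.models import your_model

@shared_task
def update_model(instance_id):
    model_obj = your_model.objects.get(id=instance_id)
    # placeholder for the real work that produces the JSON value
    payload = {"id": model_obj.pk, "status": "processed"}
    model_obj.json_result = payload  # assumes a JSONField named json_result on your_model
    model_obj.save(update_fields=["json_result"])
    return payload  # returning it also keeps the value visible to a result backend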
If you are worried about atomicity, you can use this example (taken from the Celery docs) so that the task is only sent after the database transaction commits; otherwise the worker could pick up the task before the row it needs is committed:
from functools import partial

from django.db import transaction

from .models import Article, Log
from .tasks import send_article_created_notification

def create_article(request):
    with transaction.atomic():
        article = Article.objects.create(**request.POST)
        # send this task only if the rest of the transaction succeeds.
        transaction.on_commit(partial(
            send_article_created_notification.delay, article_id=article.pk))
        Log.objects.create(type=Log.ARTICLE_CREATED, object_pk=article.pk)
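If you also want option 2's status lookup on top of this, one possible variation (a sketch; the task_id field on the model is an assumption) is to capture the AsyncResult id inside the on_commit callback:

from functools import partial

from django.db import transaction

from app.models import your_model
from app.tasks import update_model

def enqueue_update(instance_id):
    # runs after COMMIT; delay() returns an AsyncResult whose id we persist
    result = update_model.delay(instance_id)
    your_model.objects.filter(id=instance_id).update(task_id=result.id)

def your_view(request):
    with transaction.atomic():
        obj = your_model.objects.create()  # plus whatever fields you need
        transaction.on_commit(partial(enqueue_update, obj.id))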
Upvotes: 1