etnguyen03

Reputation: 620

django.core.cache.lock doesn't work in a Celery task

I have the following (tasks.py):

from celery import shared_task
from django.core.cache import cache

@shared_task
def test(param1: str) -> None:
    with cache.lock("lock-1"):
        print("hello")

When I do a test.delay(), nothing is printed, which makes me believe that there is something wrong with cache.lock("lock-1").

I'm using Redis as my cache and Celery backend, and this is configured in settings.py.

What is wrong here? If django.core.cache cannot be used as a locking mechanism (to ensure that only one test runs at a time), what could be used instead? Thank you!

Upvotes: 2

Views: 1725

Answers (1)

etnguyen03

Reputation: 620

Figured out why - it turns out I hadn't read the documentation.

Adding this worked:

# https://docs.celeryproject.org/en/latest/tutorials/task-cookbook.html
import time
from contextlib import contextmanager

from django.core.cache import cache

LOCK_EXPIRE = 60 * 10  # lock expires after 10 minutes

@contextmanager
def redis_lock(lock_id):
    timeout_at = time.monotonic() + LOCK_EXPIRE - 3
    # cache.add returns False if the key already exists,
    # which makes acquisition atomic; the second value is arbitrary
    status = cache.add(lock_id, "lock", timeout=LOCK_EXPIRE)
    try:
        yield status
    finally:
        # memcache delete is very slow, but we have to use it to take
        # advantage of using add() for atomic locking
        if time.monotonic() < timeout_at and status:
            # don't release the lock if we exceeded the timeout
            # to lessen the chance of releasing an expired lock
            # owned by someone else
            # also don't release the lock if we didn't acquire it
            cache.delete(lock_id)

Then, the lock statement became (checking the yielded status, since the context manager does not block when the lock is already held):

with redis_lock("lock-1") as acquired:
    if acquired:
        print("hello")
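To see why the yielded flag matters: a second acquisition of the same lock id returns False until the first holder releases it. Below is a minimal self-contained sketch of the same add-based pattern, with a plain in-process dict standing in for Django's cache (an assumption for illustration only; the real code uses cache.add/cache.delete against Redis):

```python
import time
from contextlib import contextmanager

LOCK_EXPIRE = 60 * 10  # lock expires after 10 minutes

# Stand-in for Django's cache, mimicking add()/delete() semantics.
_store = {}

def cache_add(key, value, timeout):
    # add() succeeds only if the key is absent (or expired);
    # this is the atomic step that makes the lock work
    if key in _store and _store[key][1] > time.monotonic():
        return False
    _store[key] = (value, time.monotonic() + timeout)
    return True

def cache_delete(key):
    _store.pop(key, None)

@contextmanager
def redis_lock(lock_id):
    timeout_at = time.monotonic() + LOCK_EXPIRE - 3
    status = cache_add(lock_id, "lock", timeout=LOCK_EXPIRE)
    try:
        yield status
    finally:
        # only release a lock we acquired and that hasn't expired
        if time.monotonic() < timeout_at and status:
            cache_delete(lock_id)

# First acquisition succeeds; a nested one on the same id fails.
with redis_lock("lock-1") as acquired:
    print(acquired)   # True
    with redis_lock("lock-1") as second:
        print(second) # False
```

After the outer block exits, the lock is released and can be acquired again, which is exactly the behavior a Celery task needs to skip work while another worker holds the lock.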

Upvotes: 1
