ealeon

Reputation: 12452

Celery: Redis as broker leaving task meta keys

I have celery app with Redis as broker.

The code consists of the following, run in a loop:

running = []
res = add.apply_async([1, 2], queue='add')
running.append(res)

while running:
    r = running.pop()
    if r.ready():
        print(r.get())
    else:
        running.insert(0, r)

Everything works fine, but when I connect with redis-cli and execute keys * I see a bunch of celery-task-meta keys.

Why aren't they cleaned up?
What are they for?

--

[EDIT]

I've read about the CELERY_TASK_RESULT_EXPIRES setting.
Is it possible for the task keys in Redis to be cleaned up right after the result is read, rather than waiting until the expiration time?

Upvotes: 6

Views: 6294

Answers (3)

Sudip Bhandari

Reputation: 2275

I think what you are looking for is to ignore the result completely, which can be done by setting the task_ignore_result flag to True. With this setting, the result is not stored at all.

https://docs.celeryq.dev/en/stable/userguide/configuration.html#task-ignore-result
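A minimal configuration sketch, assuming a typical Celery app backed by Redis (the app name, broker URL, and add task are placeholders, not from the question):

```python
from celery import Celery

app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')

# Global switch: store no results at all, so no
# celery-task-meta keys are ever written to Redis.
app.conf.task_ignore_result = True

# The same behavior can also be requested per task:
@app.task(ignore_result=True)
def add(x, y):
    return x + y
```

Note the trade-off: with results ignored, res.get() in the question's polling loop can no longer retrieve return values.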

Upvotes: 0

Quantyle

Reputation: 21

I was having the same issue. What fixed it for me was adding app.autodiscover_tasks() to my celery.py file.

Upvotes: 0

bin381

Reputation: 362

From the Celery Doc:

AsyncResult.forget()
   Forget about (and possibly remove the result of) this task.

You have to call r.get() first, then r.forget().

But you don't need to clean the keys up yourself, because the docs say:

CELERY_TASK_RESULT_EXPIRES

Default is to expire after 1 day.
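The question's polling loop can be adapted to forget each result as soon as it is consumed. A runnable sketch follows; FakeResult is a hypothetical stand-in mimicking the AsyncResult methods used, so the pattern can be demonstrated without a broker (with a real app you would append results from add.apply_async instead):

```python
class FakeResult:
    """Hypothetical stand-in for celery.result.AsyncResult."""
    def __init__(self, value):
        self._value = value
        self.forgotten = False

    def ready(self):
        return True

    def get(self):
        return self._value

    def forget(self):
        # The real forget() deletes the celery-task-meta key
        # from the result backend (Redis, here).
        self.forgotten = True


def drain(running):
    """Poll pending results; fetch and forget each one once ready."""
    values = []
    while running:
        r = running.pop()
        if r.ready():
            values.append(r.get())
            r.forget()  # free the backend key immediately after reading
        else:
            running.insert(0, r)
    return values


results = [FakeResult(i) for i in range(3)]
print(drain(results))
```

With this pattern, each key is removed right after its result is read, instead of lingering until the one-day expiry.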

Upvotes: 4
