dingx

Reputation: 1671

AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for' when running with python + celery, but it's fine under heroku local

I've started celery using:

celery -A tasks worker --loglevel=info

and then I started my python app using:

python app.py

In app.py, when I want to check the AsyncResult states:

for id in task_id.keys():
    st[id] = str(AsyncResult(id).state)

it throws this error:

Traceback (most recent call last):
  File "<...>/app.py", line 271, in update_task_status
    st[id] = str(AsyncResult(id).state)
  File "../anaconda3/lib/python3.7/site-packages/celery/result.py", line 473, in state
    return self._get_task_meta()['status']
  File "<...>/anaconda3/lib/python3.7/site-packages/celery/result.py", line 412, in _get_task_meta
    return self._maybe_set_cache(self.backend.get_task_meta(self.id))
  File "<...>/anaconda3/lib/python3.7/site-packages/celery/backends/base.py", line 386, in get_task_meta
    meta = self._get_task_meta_for(task_id)
AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

However, when I start both the app and Celery with Heroku, it runs fine:

heroku local

and the Procfile is:
web: gunicorn -b 0.0.0.0:8000 app:server --log-file=-
worker: celery -A tasks worker --loglevel=info

Strangely, when I set more workers/threads under heroku local, it gives the same error again.

heroku local's Procfile:
web: gunicorn --workers 6 --threads 2 -b 0.0.0.0:11000 app:server --log-file=-
#web: gunicorn -b 0.0.0.0:11000 app:server --log-file=-
worker: celery -A tasks worker --loglevel=info

In both cases, Celery seems to be working correctly and can receive and process the messages:


[tasks]
  . tasks.query_adj_info
  . tasks.query_perf_by
  . tasks.query_perf_dai
  . tasks.query_perf_sum

[2019-07-03 21:56:59,539: INFO/MainProcess] Connected to redis://0.0.0.0:6379//
[2019-07-03 21:56:59,549: INFO/MainProcess] mingle: searching for neighbors
[2019-07-03 21:57:00,573: INFO/MainProcess] mingle: all alone
[2019-07-03 21:57:00,582: INFO/MainProcess] celery@r ready.
[2019-07-03 21:58:57,779: INFO/MainProcess] Received task: tasks.query[depid_eddd6a1a-610f-484f-b2fa-f3052e0c910b_celtid_perfa616b950-e7b8-4c5b-9d80-dc9907796be8]  
[2019-07-03 21:58:57,780: INFO/MainProcess] Received task: ....

Upvotes: 1

Views: 1294

Answers (1)

strg

Reputation: 116

Check whether you have configured a result backend:

app = Celery('proj', backend='amqp://', broker='amqp://guest@localhost//')
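
Since the worker log in the question shows a Redis broker, here is a minimal sketch of a tasks.py with a result backend enabled (the Redis URLs and the task body are assumptions, not the asker's actual code):

# tasks.py -- minimal sketch; the Redis URLs and the task body are assumed
from celery import Celery

app = Celery(
    'tasks',
    broker='redis://localhost:6379/0',   # assumed broker URL
    backend='redis://localhost:6379/0',  # without a backend, results fall back to DisabledBackend
)

@app.task
def query_perf_sum(payload):
    ...  # placeholder for the real task body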

If you use Django, try python manage.py shell instead of plain python. I had issues with the Celery backend in a pure Python shell.
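
Also note that a bare AsyncResult(id) binds to whichever Celery app is "current", which can be the default app with DisabledBackend even when your own app has a backend. A safer pattern is to go through the configured app, roughly like this (names such as known_task_ids are illustrative):

# app.py -- look up task state through the app that has the backend configured
from tasks import app as celery_app  # the app from the sketch above

st = {}
for tid in known_task_ids:            # known_task_ids: assumed iterable of task ids
    st[tid] = celery_app.AsyncResult(tid).state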

Upvotes: 1
