Reputation: 426
I am trying to switch over from LocalExecutor to CeleryExecutor on my local machine. I should have the necessary libraries installed, since I ran pip install airflow[celery]. I also have redis installed and running.
However, when I try to run airflow worker, I get the error:
DEFAULT_EXECUTOR = CeleryExecutor()
NameError: name 'CeleryExecutor' is not defined
I have broker_url and celery_result_backend both set to redis://localhost:6379, with redis running in the background. What am I doing wrong?
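For context, here is a minimal sketch of the airflow.cfg entries I believe are involved (section names and exact keys are assumptions and can differ between Airflow versions):
[core]
# tell Airflow which executor to use
executor = CeleryExecutor

[celery]
# both pointing at the local redis instance
broker_url = redis://localhost:6379
celery_result_backend = redis://localhost:6379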
Upvotes: 2
Views: 2298
Reputation: 1459
Open up a Python shell and run from celery import Celery
to test whether you have Celery installed.
If not, you can install it through pip:
pip install celery
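As a quick non-interactive version of the same check, a sketch (assuming python is the same interpreter Airflow runs under):
python -c "import celery; print(celery.__version__)"
# prints the installed Celery version, or raises ImportError if Celery is missing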
Upvotes: 1
Reputation: 402
It seems that you switched, but Airflow doesn't know about it. Check that you have the AIRFLOW_HOME env variable set before running each airflow command, and that you don't have a ~/airflow folder or a ~/airflow.cfg file (in your home directory!), as those will be used as the default (ignoring your AIRFLOW_HOME).
More details about this issue: https://github.com/puckel/docker-airflow/issues/132
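A quick way to sanity-check this, as a sketch (the AIRFLOW_HOME path below is just a placeholder):
export AIRFLOW_HOME=/path/to/your/airflow_home   # placeholder path
echo $AIRFLOW_HOME                               # confirm it is set in this shell
ls -d ~/airflow ~/airflow.cfg 2>/dev/null        # leftover defaults that could shadow your config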
Upvotes: 1
Reputation: 6548
Try opening a Python shell where Airflow is installed and run this:
>>> from airflow.executors.celery_executor import CeleryExecutor
I think you'll get a more useful error.
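The same check can be run non-interactively; a sketch, assuming python is the interpreter Airflow runs under:
python -c "from airflow.executors.celery_executor import CeleryExecutor"
# if this raises an ImportError, its message usually names the missing Celery dependency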
Upvotes: 0