Reputation: 2513
I am using Airflow 1.7.1.3, installed using pip.
I would like to limit the logging to ERROR level for the workflows executed by the scheduler, but I could not find anything beyond setting the log file location in the settings.py file.
The online resources also led me to this Google Groups discussion, but there is not much information there either.
Any idea how to control logging in Airflow?
Upvotes: 16
Views: 39434
Reputation: 189
If you are using docker-compose.yaml, add the logging level to the common environment block:
x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.1.2}
  environment:
    &airflow-common-env
    # ...the other parameters...
    AIRFLOW__CORE__LOGGING_LEVEL: DEBUG  # add this line
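Since the question is about quieting the logs rather than turning them up, the same override works with ERROR in place of DEBUG. On Airflow 2.x the option officially lives in the [logging] section, so the section-scoped variable is the safer spelling (a sketch against the same compose file; the [core] spelling above should still be honoured as a deprecated alias):
    # same place in the environment block, but at ERROR level
    AIRFLOW__LOGGING__LOGGING_LEVEL: ERROR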
Upvotes: 0
Reputation: 10887
As @Dimo Boyadzhiev pointed out the change, here is the file path and the relevant settings for more info.
File - $AIRFLOW_HOME/airflow.cfg
# Logging level
logging_level = INFO
fab_logging_level = WARN
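A quick way to confirm the setting took effect (a minimal sketch, assuming an Airflow 1.x install where airflow.configuration exposes a module-level get()):
# read back the configured level from airflow.cfg
from airflow import configuration as conf
print(conf.get('core', 'logging_level'))  # e.g. INFO or WARN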
Upvotes: 3
Reputation: 171
I tried the workaround below, and it seems to work for setting LOGGING_LEVEL outside of settings.py:
1. Update settings.py:
Remove or comment out the line:
LOGGING_LEVEL = logging.INFO
Add the line (the expanduser call is a no-op on a level name; it just mirrors how the neighbouring settings are read):
LOGGING_LEVEL = os.path.expanduser(conf.get('core', 'LOGGING_LEVEL'))
2. Update the airflow.cfg configuration file:
Add a line under [core]:
logging_level = WARN
3. Restart the webserver and scheduler services.
Alternatively, set the environment variable AIRFLOW__CORE__LOGGING_LEVEL=WARN (sketched below).
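A minimal sketch of the environment-variable route, assuming the variable is set before anything from Airflow is imported (settings.py reads LOGGING_LEVEL at import time, as the excerpt in the answer below shows):
import os

# Must happen before any airflow import, because settings.py is
# evaluated when the package is first imported.
os.environ['AIRFLOW__CORE__LOGGING_LEVEL'] = 'WARN'

import airflow  # settings load here with the override applied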
See the official docs for details.
Upvotes: 16
Reputation: 1357
The logging functionality and its configuration will be changed in version 1.9 with this commit.
Upvotes: 8
Reputation: 171
The only solution I am aware of is changing LOGGING_LEVEL in the settings.py file. The default level is set to INFO.
AIRFLOW_HOME = os.path.expanduser(conf.get('core', 'AIRFLOW_HOME'))
SQL_ALCHEMY_CONN = conf.get('core', 'SQL_ALCHEMY_CONN')
LOGGING_LEVEL = logging.INFO
DAGS_FOLDER = os.path.expanduser(conf.get('core', 'DAGS_FOLDER'))
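For the ERROR-only logging the question asks for, the edit is a one-line change (a sketch; note that this patches the installed package, so a pip upgrade will overwrite it):
AIRFLOW_HOME = os.path.expanduser(conf.get('core', 'AIRFLOW_HOME'))
SQL_ALCHEMY_CONN = conf.get('core', 'SQL_ALCHEMY_CONN')
LOGGING_LEVEL = logging.ERROR  # was logging.INFO
DAGS_FOLDER = os.path.expanduser(conf.get('core', 'DAGS_FOLDER'))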
Upvotes: 1