Seifolah

Reputation: 561

Apache Airflow configuration is empty and dags & plugins folders missing

I have installed Apache Airflow on Ubuntu 18.04 following this guide: https://airflow.apache.org/docs/apache-airflow/stable/start/local.html. Now when I run Airflow with

airflow webserver --port 8080

the Admin/Configuration page is empty and shows this message:

"Your Airflow administrator chose not to expose the configuration, most likely for security reasons." enter image description here

What did I do wrong? Additional information that may be helpful: I created a user [airflow] and did the whole installation with sudo, so my airflow info output is:

Paths info                                                                                                                                                               
airflow_home    | /home/airflow/airflow                                                                                                                                  
system_path     | /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin                                                     
python_path     | /usr/local/bin:/usr/lib/python36.zip:/usr/lib/python3.6:/usr/lib/python3.6/lib-dynload:/usr/local/lib/python3.6/dist-packages:/usr/lib/python3/dist-packages:/home/airflow/airflow/dags:/home/airflow/airflow/config:/home/airflow/airflow/plugins
airflow_on_path | True                                                                                                                                                   
                                                                                                                                                                         
Config info                                                                                                                                                                   
executor             | LocalExecutor                                                                                                                                          
task_logging_handler | airflow.utils.log.file_task_handler.FileTaskHandler                                                                                                    
sql_alchemy_conn     | postgresql+psycopg2://airflow:airflow@localhost:5432/airflow                                                                                           
dags_folder          | /home/airflow/airflow/dags                                                                                                                             
plugins_folder       | /home/airflow/airflow/plugins                                                                                                                          
base_log_folder      | /home/airflow/airflow/logs  

However, these folders do not exist either: /home/airflow/airflow/dags and /home/airflow/airflow/plugins.

Upvotes: 13

Views: 15453

Answers (6)

Eric Flynn

Reputation: 21

For those running Airflow with docker-compose, you can modify the docker-compose.yaml to include the last line below:

version: '3.8'
x-airflow-common:
  &airflow-common
  # In order to add custom dependencies or upgrade provider packages you can use your extended image.
  # Comment the image line, place your Dockerfile in the directory where you placed the docker-compose.yaml
  # and uncomment the "build" line below, Then run `docker-compose build` to build the images.
  image: ${AIRFLOW_IMAGE_NAME:-airflow-custom}
  build: .
  environment:
    &airflow-common-env
    AIRFLOW__WEBSERVER__EXPOSE_CONFIG: 'true'
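
After saving the change, the containers need to be recreated so the new environment variable is picked up; assuming the standard docker-compose setup, something like this should be enough:

docker-compose up -d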

Upvotes: 2

Jiraheta

Reputation: 481

I just wanted to add that for the Azure Bitnami Airflow multi-tier implementation, setting

expose_config = True

in airflow.cfg and restarting the web server did the trick.

Upvotes: 0

D Shridhar Reddy

Reputation: 31

If you are running Airflow inside Docker using the YAML file, open the docker-compose.yaml file and add this line under the environment section: AIRFLOW__WEBSERVER__EXPOSE_CONFIG: 'true'

This should fix the issue.

Upvotes: 3

santos

Reputation: 21

I deploy Airflow with the Helm chart, but this should help.

First, dump the chart's default values to a file with this Helm command:

helm show values apache-airflow/airflow > values.yaml

Find extraEnv in the values file, add the value below, and save.

extraEnv: |
  - name: AIRFLOW__WEBSERVER__EXPOSE_CONFIG
    value: 'True'

Now apply the changes with the command below:

helm upgrade --install airflow apache-airflow/airflow -n airflow -f values.yaml --debug
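
Optionally, you can check that the override made it into the release; assuming the release is called airflow in the airflow namespace, a quick sanity check is:

helm get values airflow -n airflow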

Upvotes: 2

Yu-Lin Chen

Reputation: 559

As @somuchtolearnandshare mentioned, it should be AIRFLOW__WEBSERVER__EXPOSE_CONFIG: "True"

Upvotes: 6

Prarthi

Reputation: 311

You will probably need to set expose_config = True in airflow.cfg and restart the web-server.
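
For reference, the option sits in the [webserver] section of airflow.cfg, so the relevant part of the file should look roughly like this:

[webserver]
expose_config = True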

Upvotes: 21
