Reputation: 10948
Problem: new dags not shown on docker airflow, no error when running airflow dags list-import-errors
Docker image: official airflow image
DAGs path inside docker-compose.yaml (this is the default path):
volumes:
  - ./dags:/opt/airflow/dags
I put the DAG file inside the dags folder in the main directory. However, the DAGs still do not show up in either the webserver UI or airflow dags list. Running airflow dags list-import-errors also yields no result.
When I open a terminal in the Docker container, I can see my DAG inside the dags folder via the ls command. I also tried making the owner root using chown, but both of my DAGs still do not show up in the list.
Airflow runs successfully (via docker compose), as I can see the example DAGs, but not my own.
Any help will be appreciated. Thanks!
Upvotes: 2
Views: 4843
Reputation: 705
In my case, I was using macOS and my Docker application (Docker Desktop) did not have permission to read the DAG files.
So if you are using a Mac, make sure your Docker app (whether it is Docker Desktop, Rancher, or anything else) has access to the files where your DAGs are located. By default these apps do not have permission over your files; you have to go to System Preferences > Security & Privacy > Files and Folders and give your Docker app access to read the files on your system.
Also, one way to check whether this is really the problem is to open a shell session in the container running Airflow with docker exec -it -u root <CONTAINER_ID> /bin/bash, then run ls -a dags/ in that session. If the permission issue mentioned above is the cause, you will see an error much like Operation not permitted.
Upvotes: 0
Reputation: 114
It might also be a problem with permissions. Check which user owns the DAG files, or better, try to view the files from inside the container, like this:
docker exec -it <container> bash
cat /opt/airflow/dags/test_gcp.py
Upvotes: 1
Reputation: 20097
- airflow info, run in your scheduler container (provided that it is run with the same env as the running scheduler), should show all the configuration (see the sketch after this list)
- the airflow dags sub-commands (e.g. list, list-import-errors) can also help narrow things down
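For example, a quick way to confirm which dags_folder the running environment actually resolves (a minimal Python sketch, assuming Airflow 2.x, run inside the scheduler container):
# prints the effective dags_folder from the same config the scheduler reads
from airflow.configuration import conf
print(conf.get("core", "dags_folder"))
With the default compose file this should print /opt/airflow/dags; if it points elsewhere, the scheduler is not scanning the mounted folder.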
Upvotes: 1
Reputation: 249
I would try a few things:
- rename the files to gcp_dag.py and python_dag.py
- make sure import airflow is present in each file
- make sure there is a DAG object in each file (a minimal sketch follows this list)
- add an (empty) __init__.py file to the dags folder
It would also be helpful to see the contents of at least one of those files.
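For illustration, a minimal sketch of what one of those files could look like (the file name gcp_dag.py and the task are assumptions, not the asker's actual code), with import airflow present and a DAG object defined at module level:
# hypothetical gcp_dag.py
import airflow  # by default the scheduler only parses .py files containing the strings "airflow" and "dag"
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# a module-level DAG object, so the scheduler can register it
with DAG(
    dag_id="gcp_dag",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    BashOperator(task_id="say_hello", bash_command="echo hello")
The with DAG(...) context manager is one common way to attach tasks; assigning the DAG to a module-level variable works just as well.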
Upvotes: 4