Reputation: 535
I am a newbie as far as both Airflow and Docker are concerned; to make things more complicated, I use Astronomer, and to make things worse, I run Airflow on Windows (not on a Unix subsystem - I could not install Docker on Ubuntu 20.04). `astro dev start` breaks with an error, but in Docker Desktop I can see, and start, 3 Airflow-related containers. They see my DAGs just fine, but my DAGs don't see the local file system. Is this unavoidable with the Airflow + Docker combo? (It seems like a big handicap; one could only use files in the cloud.)
Upvotes: 0
Views: 2651
Reputation: 366
In general, you can declare a volume at container run time in Docker using the `-v` switch with your `docker run` command. It mounts a local folder on your host to a mount point in your container, and you can access the files at that point from inside the container.
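For example, a minimal sketch (the host path, container path, and image are placeholders, not anything from your setup):

```sh
# Mount the host folder /host/data at /data inside the container (read-only),
# then list its contents to confirm the files are visible from inside.
docker run --rm -v /host/data:/data:ro alpine ls /data
```

On Windows with Docker Desktop, the host side of the mount can be a Windows path such as C:\Users\you\data.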
If you go on to use `docker-compose up` to orchestrate your containers, you can instead specify the volumes in the `docker-compose.yml` file, which configures them for the containers that run.
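The equivalent sketch in a `docker-compose.yml` (the service name, image, and paths are again illustrative):

```yaml
services:
  airflow-scheduler:
    image: apache/airflow:2.7.3
    volumes:
      # host path (left) mounted at the container path (right)
      - ./data:/opt/airflow/data
```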
In your case, the Astronomer docs suggest it is possible to add a custom directive in a `docker-compose.override.yml` file to mount volumes into the Airflow containers created by your `astro` commands; those mounts should then be visible to your DAGs.
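A sketch of such an override, assuming the default Astronomer service name (`scheduler`) and the container's Airflow home of `/usr/local/airflow` - check the docs for your CLI version, and treat the host path as a placeholder:

```yaml
# docker-compose.override.yml, placed in the root of your Astro project.
# The service name and the /usr/local/airflow home follow Astronomer
# defaults as I understand them; the host path is a placeholder.
version: "3.1"
services:
  scheduler:
    volumes:
      - /c/Users/you/airflow-data:/usr/local/airflow/include/data
```

After adding the file, restart the stack with `astro dev stop` and `astro dev start` so the override is picked up.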
Upvotes: 1