Reputation: 976
I have the following hierarchy:
-airflow
    -dags
        -test.py
    -deployment
        -dockerfile
        -docker compose
    -scripts
    -requirements.txt
The test.py file uses functions from the scripts directory. Some of the scripts have external import statements, like import boto3. I assume this is where the problem is, because when I run the Airflow webserver I can see that all the DAGs which don't require those external packages load up, but the DAGs which do require them fail to load with:
Broken DAG: [/usr/local/airflow/dags/test.py] No module named 'boto3'
The docker compose file looks something like this:
version: '3'
services:
  webserver:
    build: .
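For context, a fuller service definition could look roughly like this; the context, dockerfile, and volumes entries below are illustrative assumptions based on the hierarchy above, not my exact file:

version: '3'
services:
  webserver:
    # build from the dockerfile in deployment/; COPY paths in the
    # dockerfile are resolved relative to this build context
    build:
      context: .
      dockerfile: dockerfile
    # mount the DAGs folder so DAG edits don't require a rebuild
    volumes:
      - ../dags:/usr/local/airflow/dags
    ports:
      - "8080:8080"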
I tried to add something like this to my dockerfile:
FROM puckel/docker-airflow:1.10.9
WORKDIR /airflow
COPY requirements.txt /airflow
RUN pip install -U pip && pip install -r requirements.txt
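One detail worth noting: the puckel/docker-airflow image, as I recall, ships an entrypoint script that runs pip install -r /requirements.txt at container start if that file exists, so copying the file to that exact path is an alternative to installing at build time. A minimal sketch under that assumption (whether requirements.txt is reachable depends on your build context, which is an assumption here):

FROM puckel/docker-airflow:1.10.9
# place requirements.txt where the image's entrypoint script looks
# for it; the entrypoint pip-installs it on every container start
COPY requirements.txt /requirements.txt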
But the packages don't seem to get installed. How can I install my requirements.txt whenever I boot up the webserver (docker compose up)?
Upvotes: 0
Views: 3828
Reputation: 699
Could be as easy as...
WORKDIR /airflow
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
You may also have to instruct Docker to rebuild the image, since docker-compose up reuses a previously built image by default. I think docker-compose build (or docker-compose build --no-cache to ignore cached layers), or docker-compose up --build in one step
... but that's based on a loose memory
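Something like this, assuming the service is named webserver as in your compose file:

# rebuild the image, ignoring cached layers, so the pip install step reruns
docker-compose build --no-cache webserver

# or rebuild and start in one go
docker-compose up --build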
Upvotes: 1