Reputation: 2477
I'm trying to use apache-airflow, and I'd like to explore Docker to run tasks in containers.
My current airflow installation is in a dedicated virtualenv and airflow is restarted automatically with systemd.
I already have multiple projects I want to move onto Airflow.
Each project should have its own dag.
I'd like each project's DAG to be written with PythonOperator but run inside a Docker container, using an image I've previously built with all the correct dependencies.
This would guarantee that code dependencies are isolated between projects.
Is this achievable somehow?
Upvotes: 1
Views: 554
Reputation: 2342
There is a DockerOperator: https://airflow.apache.org/docs/stable/_api/airflow/operators/docker_operator/index.html
As well as a PythonVirtualenvOperator: https://airflow.apache.org/docs/stable/_api/airflow/operators/python_operator/index.html#airflow.operators.python_operator.PythonVirtualenvOperator
In any case, within a PythonOperator you can run whatever code you want, so you could create a new virtual environment there, install the dependencies, and even build a Docker image.
Upvotes: 1