zZzZ

Reputation: 173

How to keep Airflow tasks running all the time

I installed Airflow in a conda virtual environment on my local machine. I ran the following commands to automate some scripts:

% airflow scheduler -D
% airflow webserver

The DAG was running smoothly as scheduled (listening on port 8080 for the UI), but it stopped running once I closed the terminal. The next time I activate the conda environment, I will need to run the airflow scheduler command again. Is there any way to keep it running all the time?


If I do the same thing by creating a conda virtual environment on a GCP Compute Engine instance, will it be able to run all the time without human intervention?

Upvotes: 0

Views: 1479

Answers (1)

Bas Harenslak

Reputation: 3094

There are several ways to deal with this.

On your own laptop, you could run the scheduler with nohup, which survives the terminal being closed: https://unix.stackexchange.com/a/4006. An alternative is to run Airflow as a service with systemd. The Airflow repository holds several systemd scripts that you can use for inspiration: https://github.com/apache/airflow/tree/main/scripts/systemd (docs).
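As a minimal sketch of the nohup approach (the log file paths here are just placeholders):

% nohup airflow scheduler > ~/airflow/scheduler.log 2>&1 &
% nohup airflow webserver --port 8080 > ~/airflow/webserver.log 2>&1 &

If you go the systemd route instead, you would adapt the unit files from that repository (e.g. airflow-scheduler.service and airflow-webserver.service) to your environment and then manage them like any other service:

% sudo systemctl enable --now airflow-scheduler
% sudo systemctl enable --now airflow-webserver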

However, stopping your personal computer will still stop Airflow, so another machine (such as a GCP Compute Engine instance) is a better choice. Such a machine can run 24/7, and you can safely shut down your own computer. There are several ways to install Airflow; read the docs for more information: https://airflow.apache.org/docs/apache-airflow/stable/installation/index.html.
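As a rough sketch (not a definitive setup), creating a small VM and doing a pip-based install could look like the following; the instance name, zone, machine type, and the Airflow/Python versions are placeholders you would adjust, and the constraint URL follows the pattern from the installation docs:

% gcloud compute instances create airflow-vm --zone=us-central1-a --machine-type=e2-medium
% gcloud compute ssh airflow-vm --zone=us-central1-a
# on the VM, inside a conda or virtual environment:
% pip install "apache-airflow==2.3.3" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.3.3/constraints-3.8.txt"
% airflow db init
% airflow scheduler -D
% airflow webserver

Combine that with the systemd approach above so both the scheduler and the webserver come back up automatically after a reboot.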

Upvotes: 1
