DennisLi

Reputation: 4154

Airflow scheduler does not appear to be running after executing a task

When a task is running, Airflow pops up a notice saying the scheduler does not appear to be running, and it keeps showing until the task finishes:

The scheduler does not appear to be running. Last heartbeat was received 5 minutes ago.

The DAGs list may not update, and new tasks will not be scheduled.

Actually, the scheduler process is running; I have checked it. After the task finishes, the notice disappears and everything goes back to normal.

My task is kind of heavy and may run for a couple of hours.

Upvotes: 56

Views: 138004

Answers (15)

Vinay Kulkarni

Reputation: 300

I had a similar issue and have been trying to troubleshoot this for a while now.

I managed to fix it by setting this value in airflow.cfg:

scheduler_health_check_threshold = 240
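
For context, this option lives in the [scheduler] section of airflow.cfg; a minimal sketch (240 is the value from above, and the default is 30 seconds as far as I know):

[scheduler]
# Seconds since the last scheduler heartbeat before the UI shows the warning.
# Raise it above the longest pause you expect.
scheduler_health_check_threshold = 240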

PS: Based on a recent conversation in the Airflow Slack community, it could happen due to contention on the database side. So another suggested workaround was to scale up the database. In my case, this was not a viable solution.

EDIT: This was last tested with Airflow Version 2.3.3

Upvotes: 6

PandaPhi

Reputation: 377

If it matters: somehow, the -D flag causes a lot of problems for me. airflow webserver -D crashes immediately after starting, and airflow scheduler -D somehow does next to nothing for me.

Weirdly enough, it works without the detach flag. This means I can just run the program normally and send it to the background myself, e.g. with nohup airflow scheduler &.
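
A sketch of that workaround, with the output redirected to a log file instead of the default nohup.out (the filename is my choice):

nohup airflow scheduler > scheduler.log 2>&1 &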

Upvotes: 1

Índio

Reputation: 611

This happens to me when AIRFLOW_HOME is not set. With AIRFLOW_HOME set to the correct path, the configured executor is picked up.
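
For example, a minimal sketch assuming your airflow.cfg lives under ~/airflow:

export AIRFLOW_HOME=~/airflow
airflow scheduler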

Upvotes: 0

dogdog

Reputation: 133

Our problem was that the logs/scheduler.log file had grown too large (1 TB). After cleaning up this file, everything was fine.
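
One way to empty the file in place (a sketch; the path is relative to the Airflow home directory):

truncate -s 0 logs/scheduler.log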

Upvotes: 1

searain

Reputation: 3301

In simple words: using LocalExecutor and PostgreSQL can fix this error.

Running Airflow locally following the instructions at https://airflow.apache.org/docs/apache-airflow/stable/start/local.html, you get the default config:

executor = SequentialExecutor
sql_alchemy_conn = sqlite:////Users/yourusername/airflow/airflow.db

It will use SequentialExecutor and sqlite by default, and it will have this "The scheduler does not appear to be running." error.

To fix it, I followed Jarek Potiuk's advice. I changed the following config:

executor = LocalExecutor
sql_alchemy_conn = postgresql://postgres:masterpasswordforyourlocalpostgresql@localhost:5432

Then I re-ran airflow db init and re-created the admin user:

airflow db init

airflow users create \
--username admin \
--firstname Peter \
--lastname Parker \
--role Admin \
--email [email protected]

After the db is initialized, run:

airflow webserver --port 8080
airflow scheduler

This fixed the airflow scheduler error.

Upvotes: 2

Kanna TJ

Reputation: 11

Check the airflow-scheduler.err and airflow-scheduler.log files.

I got an error like this:

Traceback (most recent call last):
  File "/home/myVM/venv/py_env/lib/python3.8/site-packages/lockfile/pidlockfile.py", line 77, in acquire
    write_pid_to_pidfile(self.path)
  File "/home/myVM/venv/py_env/lib/python3.8/site-packages/lockfile/pidlockfile.py", line 161, in write_pid_to_pidfile
    pidfile_fd = os.open(pidfile_path, open_flags, open_mode)
FileExistsError: [Errno 17] File exists: '/home/myVM/venv/py_env/airflow-scheduler.pid'

I removed the existing airflow-scheduler.pid file and started the scheduler again with airflow scheduler -D. It worked fine after that.
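
In commands, using the path from the traceback above (yours will differ):

rm /home/myVM/venv/py_env/airflow-scheduler.pid
airflow scheduler -D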

Upvotes: 1

yukai.huang

Reputation: 1

I had the same issue after changing the Airflow timezone. I then restarted the airflow-scheduler and it worked. You can also check whether the airflow-scheduler and airflow-worker are running on different servers.
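
A sketch of the restart, assuming the scheduler runs under systemd (the unit name airflow-scheduler is an assumption; adjust for your setup):

sudo systemctl restart airflow-scheduler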

Upvotes: 0

JIANG

Reputation: 1887

On the Composer page, click on your environment name; it will open the Environment details. Go to the PyPI Packages tab.

Click on the Edit button and increase any package version.

For example, I increased the version of the pymsql package, which restarted the Airflow environment; it took a while to update. Once it was done, I no longer had this error.

You can also add a Python package; that will restart the Airflow environment as well.
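
If you prefer the CLI over the Console, something like this should trigger the same environment rebuild (the environment name, location, and version here are placeholders; check gcloud composer environments update --help for the exact flags):

gcloud composer environments update my-environment \
  --location us-central1 \
  --update-pypi-package pymsql==1.0.3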

Upvotes: 0

Sudhakar Tripathi

Reputation: 41

I solved this issue by deleting the airflow-scheduler.pid file and then running airflow scheduler -D.

Upvotes: 3

JohnDoe_Scientist

Reputation: 660

A quick fix could be to run the Airflow scheduler separately. Perhaps not the best solution, but it did work for me. To do so, run this command in the terminal:

airflow scheduler

Upvotes: 17

Ganesh

Reputation: 757

You have started the Airflow webserver, but you haven't started your Airflow scheduler. Run the scheduler in the background:

airflow scheduler > /console/scheduler_log.log &

Upvotes: 11

Jarek Potiuk

Reputation: 20097

I think it is expected for SequentialExecutor. SequentialExecutor runs one thing at a time, so it cannot run the heartbeat and a task at the same time.

Why do you need to use SequentialExecutor / sqlite? The advice to switch to another DB/executor makes perfect sense.

Upvotes: 25

DennisLi

Reputation: 4154

After changing the executor from SequentialExecutor to LocalExecutor, it works!

In airflow.cfg:

executor = LocalExecutor

Upvotes: -3

as - if

Reputation: 3307

I had the same issue. I switched to PostgreSQL by updating the airflow.cfg file:

sql_alchemy_conn = postgresql+psycopg2://airflow@localhost:5432/airflow
executor = LocalExecutor

This link may help with setting this up locally: https://medium.com/@taufiq_ibrahim/apache-airflow-installation-on-ubuntu-ddc087482c14

Upvotes: 10

amoskaliov

Reputation: 799

I had the same issue while using sqlite. There was a special message in the Airflow logs:

ERROR - Cannot use more than 1 thread when using sqlite. Setting max_threads to 1

If you use only 1 thread, the scheduler will be unavailable while executing a DAG.

So if you use sqlite, try switching to another database. If you don't, check the max_threads value in your airflow.cfg.
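
If you do run with more than one thread (i.e. a non-sqlite database), the setting lives in the [scheduler] section of airflow.cfg; a sketch with an illustrative value (this option name belongs to older Airflow 1.x versions, as far as I know):

[scheduler]
max_threads = 2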

Upvotes: 0
