Sumith08

Reputation: 770

Airflow Scheduler error ERROR - DagFileProcessorManager (PID=1234) last sent a heartbeat 50.72 seconds ago! Restarting it

I recently installed Airflow 2.1.3 using the apache-airflow Helm repo on an Azure AKS cluster. After the installation, the DAG files are not displayed in the UI. The reason may be that the scheduler keeps getting terminated. Below is the error. Can anyone please help me with this issue?

[2021-10-28 05:16:49,322] {manager.py:254} INFO - Launched DagFileProcessorManager with pid: 1268
[2021-10-28 05:16:49,339] {settings.py:51} INFO - Configured default timezone Timezone('UTC')
[2021-10-28 05:17:39,997] {manager.py:414} ERROR - DagFileProcessorManager (PID=1268) last sent a heartbeat 50.68 seconds ago! Restarting it
[2021-10-28 05:17:39,998] {process_utils.py:100} INFO - Sending Signals.SIGTERM to GPID 1268
[2021-10-28 05:17:40,251] {process_utils.py:66} INFO - Process psutil.Process(pid=1268, status='terminated', exitcode=0, started='05:16:48') (1268) terminated with exit code 0
[2021-10-28 05:17:40,256] {manager.py:254} INFO - Launched DagFileProcessorManager with pid: 1313
[2021-10-28 05:17:40,274] {settings.py:51} INFO - Configured default timezone Timezone('UTC')

Upvotes: 3

Views: 4011

Answers (2)

TaeKyung Yoo

Reputation: 41

Increasing

dag_file_processor_timeout

in airflow.cfg, or setting the environment variable AIRFLOW__CORE__DAG_FILE_PROCESSOR_TIMEOUT to a value higher than 50, should help (default: 50 seconds).

https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#dag-file-processor-timeout

Your log messages just mean that this timeout was exceeded.
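As a sketch, the change could look like this in airflow.cfg (the value 180 is an example, not a recommendation; tune it for the number and complexity of your DAG files):

```ini
# airflow.cfg
[core]
# Seconds before a DagFileProcessor is considered timed out (default: 50)
dag_file_processor_timeout = 180
```

Equivalently, in a Helm-based deployment you could set the environment variable AIRFLOW__CORE__DAG_FILE_PROCESSOR_TIMEOUT=180 on the scheduler pod instead of editing the file.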

Upvotes: 0

Vinay Kulkarni

Reputation: 300

I have previously been able to fix this by setting a higher value for scheduler_health_check_threshold in airflow.cfg.

For example:
scheduler_health_check_threshold = 240

Also, ensure that orphaned_tasks_check_interval is greater than the value that you set for scheduler_health_check_threshold
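Putting both settings together, a minimal airflow.cfg sketch might look like this (the value 300 for orphaned_tasks_check_interval is an assumed example chosen only to satisfy the "greater than" condition above):

```ini
# airflow.cfg
[scheduler]
# Seconds without a heartbeat before the scheduler is considered unhealthy
scheduler_health_check_threshold = 240
# Keep this greater than scheduler_health_check_threshold (example value)
orphaned_tasks_check_interval = 300
```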

Upvotes: 1

Related Questions