Reputation: 1
When I run a DAG, let's say with one task, in some cases the task finishes the expected process (I can see this in the server logs), but in the Airflow UI the task is still shown as running and the DAG run also stays in the running state. After dagrun_timeout (for example 60 minutes), the task moves to the skipped state and the DAG run moves to failed.
This is my DAG:
from datetime import timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.dummy import DummyOperator
from airflow.utils.dates import days_ago

# args = {'owner': 'admin'}
# OWNER = "airflow"

with DAG(
    dag_id='ITM_STG_LAUNCH',
    default_args={
        "owner": "airflow",
        "retries": 5,
        "retry_delay": timedelta(minutes=5),
    },
    schedule='0 12 * * *',
    start_date=days_ago(2),
    catchup=False,
    dagrun_timeout=timedelta(minutes=15),
    tags=['ITM', 'STG', 'DFAI'],
    # params={"example_key": "example_value"},
) as dag:

    Launch_air = BashOperator(
        task_id='Launch_air',
        bash_command='/opt/app/Batch/run.ksh LAUNCH_AIR',
        queue='Q_SRAI-GIIFITM-SIT1',
        cwd="/opt/app/Batch",
    )

    Launch_air
Notice the state colors in this screenshot of the Airflow UI: [screenshot]
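To show the mismatch between the logs and the UI, here is roughly how the states recorded in the metadata database could be compared with what the UI displays. This is just a sketch, assuming the standard Airflow 2.x CLI; <run_id> is a placeholder for the run that the UI shows as running.

# list recent runs of this DAG and the state recorded for each run
airflow dags list-runs -d ITM_STG_LAUNCH

# show the recorded state of each task instance in that run
airflow tasks states-for-dag-run ITM_STG_LAUNCH <run_id>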
I upgraded Airflow from 2.9.2 to 2.10.1, and I also commented out dagrun_timeout.
Upvotes: 0
Views: 17