ardiantovn

Reputation: 55

Why can only a DummyOperator task run in my Airflow DAG?

from airflow import DAG
from datetime import datetime, timedelta
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.bash_operator import BashOperator

dag_name = 'dagName'

default_args = {
    'owner': 'Airflow',
    'start_date': datetime(2022, 2, 16),
    'schedule_interval': '@once',
    'retries': 1,
    'retry_delay': timedelta(seconds=5),
    'depends_on_past': False,
    'catchup': False
}

with DAG(dag_name, default_args=default_args) as dag:

    t1 = DummyOperator(task_id="start")

    t2 = BashOperator(task_id='hello_world',
                      bash_command='echo "Hi!!"')

    t3 = DummyOperator(task_id="end")

    t1 >> t2 >> t3

[Screenshot: task statuses in the Airflow UI]

I have run this code on Apache Airflow 2. Only the start task is marked as success, while the hello_world task stays in the queued state. I checked the task instance details, and it shows: Task is in the 'queued' state which is not a valid state for execution. The task must be cleared in order to be run. I then cleared the hello_world task, but it is still queued. Is there any solution for this? Thank you
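(For reference, clearing a single task can also be done from the CLI, e.g. airflow tasks clear dagName -t hello_world, assuming the DAG id above; the result was the same, the task went back to queued.)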

Upvotes: 0

Views: 2249

Answers (1)

Elad Kalif

Reputation: 15931

DummyOperator doesn't have any actual code to execute, so it would be redundant to submit it to a worker. For that reason, Airflow has an optimization: DummyOperator, and any of its subclasses, is never sent to the workers; the scheduler marks such tasks as success automatically (assuming no on_execute_callback is set, etc.).
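As a minimal sketch of what that optimization implies (the MarkerOperator name is hypothetical, for illustration only):

from airflow.operators.dummy_operator import DummyOperator

# Hypothetical subclass for illustration. Because it inherits from
# DummyOperator and defines no callbacks, the scheduler should mark its
# task instances as success directly instead of queuing them on a worker.
class MarkerOperator(DummyOperator):
    pass

This is also why your start task shows as success even though nothing else runs: the scheduler marked it itself, without going through the executor at all.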

Your description suggests that some configuration is not set up properly (typically the executor or its workers); this is not a DAG code issue.
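As a starting point for debugging, you can check which executor is configured (a minimal sketch, assuming it runs in the environment where Airflow is installed):

from airflow.configuration import conf

# Print the configured executor. Tasks that stay queued forever usually
# point at this layer, e.g. CeleryExecutor with no worker consuming the
# queue, or a scheduler that isn't running.
executor = conf.get("core", "executor")
print(f"executor = {executor}")

You can also run the task in isolation with airflow tasks test dagName hello_world 2022-02-16, which bypasses the executor and the queue entirely; if that succeeds, the DAG code is fine and the problem is in the executor/worker setup.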

Upvotes: 1
