hky404

Reputation: 1159

Why is the task inside my DAG not running?

I have scheduled my Airflow DAGs to run; every DAG has one task inside it. When the DAGs run, the tasks inside them don't get executed.

Here's my code for the same (I am trying to SSH into an EC2 server and run a bash command):

from datetime import timedelta, datetime
from airflow import DAG
from airflow.contrib.operators.ssh_operator import SSHOperator


default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': ['[email protected]'],
    'email_on_failure': True,
    'email_on_retry': True,
    'start_date': datetime.now() - timedelta(days=1),
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(dag_id='back_fill_reactivated_photo_dimension',
          default_args=default_args,
          schedule_interval='55 * * * *',
          dagrun_timeout=timedelta(seconds=120))

t1_bash = """
/usr/local/bin/dp/database_jobs/run_py.sh "backfill_photo_dim_reactivated.py"
"""

t1 = SSHOperator(
    ssh_conn_id='ssh_aws_ec2',
    task_id='backfill_photo_dim',
    command=t1_bash,
    dag=dag)

The Airflow UI shows the DAG to be in the running state, but the actual task inside the DAG never runs. Am I missing something in my code?

Also, is there a way to force run a DAG regardless of its cron schedule?

Upvotes: 2

Views: 4772

Answers (3)

kotartemiy

Reputation: 104

Most likely you do not have the scheduler running.

Run `airflow scheduler -D` to start it as a daemon in the background. That should resolve the issue.
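If you are not sure whether a scheduler is already up, a quick sketch of how to check and then start one (command syntax assumes Airflow 1.x, matching the `airflow.contrib` import in the question):

```shell
# Look for a running scheduler process; the [a] trick keeps
# the grep command itself out of the results
ps aux | grep "[a]irflow scheduler"

# If nothing is listed, start the scheduler as a background daemon
airflow scheduler -D
```

Without a scheduler process, DAG runs are created in the UI but their tasks stay queued and are never dispatched to the executor.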

Upvotes: 2

Ravi Ranjan

Reputation: 62

There is nothing wrong with your DAG; check your configuration. Can you share your `airflow.cfg` file?

Upvotes: 1

Breathe

Reputation: 724

A task stuck in "scheduled" generally means you have no pool or no queue available. Are you using the local executor? If so, is the scheduler running?

You can force run (or test) a task using the command line.
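A sketch of both options with the Airflow 1.x CLI (version assumed from the `airflow.contrib` import in the question), using the DAG and task IDs from your code:

```shell
# Trigger a full DAG run immediately, ignoring the cron schedule
airflow trigger_dag back_fill_reactivated_photo_dimension

# Run a single task in isolation for a given execution date;
# this needs no scheduler and does not record state in the database
airflow test back_fill_reactivated_photo_dimension backfill_photo_dim 2019-01-01
```

`airflow test` is handy for debugging exactly this situation, since it bypasses the scheduler entirely: if the task succeeds here but never runs otherwise, the problem is with scheduling, not the task.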

Upvotes: 1
