Arie

Reputation: 3573

Airflow Only works with the Celery, CeleryKubernetes or Kubernetes executors

I have this DAG; nevertheless, when I try to run it, it gets stuck in the Queued state. When I then try to run it manually, I get this error:

Error:

Only works with the Celery, CeleryKubernetes or Kubernetes executors

Code:

from airflow import DAG
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.operators.python import PythonOperator
from datetime import datetime

def helloWorld():
    print('Hello World')

def take_clients():
    hook = PostgresHook(postgres_conn_id="postgres_robert")
    df = hook.get_pandas_df(sql="SELECT * FROM clients;")
    print(df)
    # do what you need with the df....

with DAG(dag_id="test",
         start_date=datetime(2021, 1, 1),
         schedule_interval="@once",
         catchup=False) as dag:

    task1 = PythonOperator(
        task_id="hello_world",
        python_callable=helloWorld)

    task2 = PythonOperator(
        task_id="get_clients",
        python_callable=take_clients)

    task1 >> task2

Upvotes: 0

Views: 4440

Answers (2)

Elad Kalif

Reputation: 15979

I guess you are trying to use the Run button in the UI. This button is enabled only for executors that support it. In your Airflow setup you are using an executor that doesn't support this command. In newer Airflow versions the button is simply disabled if you are using an executor that doesn't support it:

(screenshot: the disabled Run button in the Airflow UI)

I assume that what you are after is creating a new run; in that case you should use the Trigger DAG button. If you are looking to re-run a specific task, use the Clear button instead.
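If the UI buttons are not available, the same actions can also be done from the Airflow CLI; a minimal sketch, assuming Airflow 2.x and the dag_id test / task_id get_clients from the question:

airflow dags trigger test                  # create a new DAG run (equivalent of the Trigger DAG button)
airflow tasks clear test -t get_clients    # clear a task instance so the scheduler re-runs it (equivalent of Clear)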

Upvotes: 2

Ashkan Goleh Pour

Reputation: 522

You are running it with the LocalExecutor; you have to change your executor to CeleryExecutor, CeleryKubernetesExecutor, KubernetesExecutor, or DaskExecutor.

If you are using docker-compose, add:

AIRFLOW__CORE__EXECUTOR: CeleryExecutor
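For context, in the docker-compose.yaml shipped with the Airflow docs that variable sits in the shared environment block; a sketch, assuming that layout (your file may differ, and CeleryExecutor also needs the redis and worker services that file already defines):

x-airflow-common:
  &airflow-common
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    # ...the other AIRFLOW__* variables stay as they are...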

Otherwise, see the Airflow Executor documentation and set the executor in your Airflow configuration.
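If you are not using Docker, the same setting lives in airflow.cfg; a minimal sketch (note that the environment variable above overrides this value when both are set):

[core]
executor = CeleryExecutor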

Upvotes: 0
