Radka Žmers

Reputation: 45

Airflow test - how to pass parameters from the CLI

I don't know how to continue with this task. Could you please help? I am creating a DAG where a parameter will be passed in manually by the user. The DAG is supposed to take a "pid" and kill the hanging query in the Postgres database. I have written the code, but I can't pass the parameter to test it via the CLI. I am using this command:

airflow tasks test killer_dag get_idle_queries 20210802 -t '{"pid":"12345"}

and this is the code, killer_dag.py:

from datetime import datetime, timedelta

from airflow.models import DAG
from airflow.models.log import Log
from airflow.utils.db import create_session
from plugins.platform.utils import skyflow_email_list
from dags.utils.utils import kill_hanging_queries

# Look up the most recent manual trigger events of this DAG in the metadata DB
with create_session() as session:
    results = session.query(Log.dttm, Log.dag_id, Log.execution_date,
                            Log.owner, Log.extra) \
        .filter(Log.dag_id == 'killer_dag', Log.event == 'trigger') \
        .order_by(Log.dttm.desc()).all()

killer_dag = DAG(
    dag_id="killer_dag",
    default_args={
        "owner": "Data Intelligence: Data Platform",
        "email": skyflow_email_list,
        "email_on_failure": True,
        "email_on_retry": False,
        "depends_on_past": False,
        "start_date": datetime(2021, 8, 1, 0, 0, 0),
        "retries": 10,
        "retry_delay": timedelta(minutes=1),
        "sla": timedelta(minutes=90),
    },
    schedule_interval=timedelta(days=1),
)

kill_hanging_queries(killer_dag)

and

utils.py

import logging

from airflow.operators.python_operator import PythonOperator
from psycopg2.extras import RealDictCursor
from plugins.platform.kw_postgres_hook import KwPostgresHook
from airflow.models import DagRun
from airflow.providers.postgres.hooks.postgres import PostgresHook


def get_idle_queries(**kwargs):
    logging.info("STARTING TO FETCH THE PID")
    logging.info(kwargs)
    pid = kwargs["pid"]
    logging.info(pid)
    logging.info("received pid: %s", pid)
    # return 'Whatever you return gets printed in the logs'
    analdb_hook = KwPostgresHook(postgres_conn_id="anal_db")
    analdb_conn = analdb_hook.get_conn()
    analdb_cur = analdb_conn.cursor(cursor_factory=RealDictCursor)
    get_idle_queries_query = """
        SELECT pg_terminate_backend('{pid}');
    """
    analdb_cur.execute(get_idle_queries_query)
    hanging_queries = analdb_cur.fetchall()
    logging.info(f"Listing info about hanging queries {hanging_queries}")
    for record in hanging_queries:
        query = record["terminate_q"]
        logging.info(f"Running query: {query}")
        analdb_cur.execute(query)
    analdb_conn.close()


def kill_hanging_queries(killer_dag):
    PythonOperator(
        task_id="get_idle_queries",
        python_callable=get_idle_queries,
        dag=killer_dag,
        provide_context=True,
    )
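
Note on the query itself: as written, get_idle_queries_query sends the literal text '{pid}' to Postgres, because the string is neither an f-string nor passed through .format(). A minimal sketch of binding the value with a psycopg2 placeholder instead (assuming the same cursor as above and that pid can be cast to an integer) would be:

# bind the pid as a query parameter rather than formatting it into the SQL text
analdb_cur.execute("SELECT pg_terminate_backend(%s);", (int(pid),))
logging.info("pg_terminate_backend returned: %s", analdb_cur.fetchone())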

Upvotes: 2

Views: 1947

Answers (1)

NicoE

Reputation: 4873

To pass params from the CLI, the way you did it is pretty much correct (unless you are really missing the closing ' as in your post above):

airflow tasks test killer_dag get_idle_queries 20210802 -t '{"pid":"12345"}'

So I think the problem in your code is related to how you are trying to access those params. In get_idle_queries you can access them with kwargs["params"]["pid"], like this:

def get_idle_queries(**kwargs):
    logging.info("STARTING TO FETCH THE PID")
    logging.info(kwargs)
    # values passed with -t '{"pid":"12345"}' show up under kwargs["params"]
    pid = kwargs["params"]["pid"]
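
Putting it together, a minimal version of the callable that reads the pid from params and hands it to pg_terminate_backend as a bound parameter could look like this (a sketch only; KwPostgresHook and the anal_db connection id are taken from your question, and the int() cast is an assumption):

import logging
from psycopg2.extras import RealDictCursor
from plugins.platform.kw_postgres_hook import KwPostgresHook

def get_idle_queries(**kwargs):
    # pid arrives from `airflow tasks test ... -t '{"pid":"12345"}'` and is exposed under params
    pid = kwargs["params"]["pid"]
    logging.info("received pid: %s", pid)
    analdb_hook = KwPostgresHook(postgres_conn_id="anal_db")
    analdb_conn = analdb_hook.get_conn()
    try:
        analdb_cur = analdb_conn.cursor(cursor_factory=RealDictCursor)
        # bind the pid as a query parameter instead of formatting it into the SQL string
        analdb_cur.execute("SELECT pg_terminate_backend(%s);", (int(pid),))
        logging.info("pg_terminate_backend returned: %s", analdb_cur.fetchone())
    finally:
        analdb_conn.close()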

Let me know if that worked for you.

Upvotes: 1
