Sreekanth

Reputation: 1947

In Airflow, can we get the output of SQL executed in JdbcOperator?

Can we see or retrieve the output of SQL executed in a JdbcOperator?

with DAG(dag_id='Exasol_DB_Checks',
         schedule_interval='@hourly',
         default_args=default_args,
         catchup=False,
         template_searchpath=tmpl_search_path) as dag:

    start_task = DummyOperator(task_id='start_task', dag=dag)

    sql_task_1 = JdbcOperator(task_id='sql_cmd',
                              jdbc_conn_id='Exasol_db',
                              sql=['select current_timestamp;',
                                   'select current_user from DUAL;',
                                   'test.sql'],
                              autocommit=True,
                              params={'my_param': '{{ var.value.source_path }}'})

    start_task >> sql_task_1

Upvotes: 0

Views: 1189

Answers (1)

jim

Reputation: 583

Maybe you can use a JdbcHook inside a PythonOperator for your needs:

def do_work():
    # Note: no trailing comma here -- a comma after this call would make
    # jdbc_hook a one-element tuple instead of a JdbcHook.
    jdbc_hook = JdbcHook(jdbc_conn_id="some_db")
    jdbc_conn = jdbc_hook.get_conn()
    jdbc_cursor = jdbc_conn.cursor()
    jdbc_cursor.execute('SELECT ......')
    row = jdbc_cursor.fetchone()[0]
    jdbc_cursor.close()
    # The callable's return value is pushed to XCom, so downstream
    # tasks can read the query result.
    return row

task = PythonOperator(
    task_id='task1',
    python_callable=do_work,
    dag=dag
)

task1 >> task2

https://airflow.apache.org/docs/stable/concepts.html#hooks
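As a self-contained sketch of the same pattern: `JdbcHook.get_conn()` returns a DB-API connection, so the cursor/fetch steps are identical to any DB-API driver. Here `sqlite3` stands in for the JDBC connection (and the query is a placeholder) purely so the example can run on its own:

import sqlite3

def fetch_scalar(conn, sql):
    """Run a query and return the first column of the first row."""
    cursor = conn.cursor()
    cursor.execute(sql)
    row = cursor.fetchone()
    cursor.close()
    return row[0] if row else None

# sqlite3 is a stand-in for jdbc_hook.get_conn(); the fetch pattern is the same.
conn = sqlite3.connect(":memory:")
result = fetch_scalar(conn, "SELECT 1 + 1")
conn.close()

Returning such a value from the PythonOperator callable pushes it to XCom, which is how a later task would consume it.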

Upvotes: 1
