M Hrytsenia

Reputation: 21

Is it possible to handle a task shutdown from inside an Airflow task?

We use Neo4j and run queries against it from Airflow tasks. The issue is that these queries often keep running even after the task is marked as "Failed" or "Completed" in the Airflow GUI. So I would like to find a way to issue a kill query from inside the currently running task when it is marked as "Failed" or "Completed".

In Airflow, the query is executed with the session.run(query) method from GraphDatabase.driver, where GraphDatabase is part of the neo4j Python library.

Is there any straightforward way to do this?

Upvotes: 1

Views: 228

Answers (2)

Jarek Potiuk

Reputation: 20067

BaseOperator has an "on_kill" method that you can override: https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/models/baseoperator/index.html#airflow.models.baseoperator.BaseOperator.on_kill

It's likely that the operator you use (Neo4j) does not implement it properly, but you can always create a custom operator with a proper on_kill implementation and possibly contribute it back as a PR.
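A minimal sketch of that pattern, with several assumptions: `Neo4jKillableOperator` and `driver_factory` are hypothetical names, and the stub `BaseOperator` below only stands in for `airflow.models.baseoperator.BaseOperator` so the example is self-contained. The idea is that `on_kill()` closes the driver, which drops the connection and lets the server abandon the running query.

```python
class BaseOperator:
    """Stand-in for airflow.models.baseoperator.BaseOperator (illustration only)."""
    def __init__(self, task_id):
        self.task_id = task_id


class Neo4jKillableOperator(BaseOperator):  # hypothetical operator name
    def __init__(self, task_id, driver_factory, query):
        super().__init__(task_id)
        # driver_factory would be e.g.
        #   lambda: GraphDatabase.driver(uri, auth=(user, password))
        self.driver_factory = driver_factory
        self.query = query
        self._driver = None

    def execute(self, context):
        self._driver = self.driver_factory()
        try:
            with self._driver.session() as session:
                # Consume the result while the session is still open
                return list(session.run(self.query))
        finally:
            self._close()

    def on_kill(self):
        # Airflow calls on_kill() when the task is externally stopped
        # (e.g. marked "Failed" in the UI); closing the driver aborts
        # the in-flight query on the server side.
        self._close()

    def _close(self):
        if self._driver is not None:
            self._driver.close()
            self._driver = None
```

Keeping the driver handle on the operator instance is the key design choice: `execute` and `on_kill` run in the same task process, so `on_kill` can reach the live connection and tear it down.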

Upvotes: 2

aldrin

Reputation: 4572

If you know the query id, you could try running the following query:

CALL dbms.killQuery(queryId)

See https://neo4j.com/docs/operations-manual/current/monitoring/query-management/ . This page also shows how to list running queries and transactions.
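Putting the two together, a hedged sketch: list the running queries first to find the id, then kill it. This assumes Neo4j 4.x, where these `dbms.*` procedures exist (later versions replace them with `SHOW TRANSACTIONS` / `TERMINATE TRANSACTIONS`), and `'query-123'` is a placeholder id, not a real value.

```cypher
// Find the id of the long-running query (Neo4j 4.x procedure)
CALL dbms.listQueries() YIELD queryId, query, elapsedTimeMillis;

// Kill it by id ('query-123' is a placeholder)
CALL dbms.killQuery('query-123');
```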

Upvotes: 0
