Reputation: 61
I have a list of DAGs hosted on Airflow. I want to get the names of the DAGs in an AWS Lambda function so that I can use the names to trigger the DAGs using the experimental API. I am stuck on getting the names of the DAGs. Any help would be appreciated.
Upvotes: 6
Views: 23264
Reputation: 11
This way works, but slowly, because it needs to parse every Python file in the DAGs folder to populate the DagBag:

from airflow.models import DagBag

dag_ids = DagBag(include_examples=False).dag_ids
for dag_id in dag_ids:
    print(dag_id)

In my case, on the PROD environment, it takes about 20 seconds to parse all DAGs. Does anyone know a better way, maybe reading the DAGs from the database? I tried doing that by passing read_dags_from_db=True to DagBag, but it returns an empty list of DAGs.
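For anyone hitting the same empty list: in Airflow 2.x, a DagBag created with read_dags_from_db=True does not load anything from the database until asked to. A minimal sketch, assuming Airflow 2.x with DAG serialization enabled so the scheduler has written rows to the serialized_dag table:

from airflow.models import DagBag

# read serialized DAGs from the metadata DB instead of parsing files
dagbag = DagBag(read_dags_from_db=True)
dagbag.collect_dags_from_db()  # populates dagbag.dags from the serialized_dag table
for dag_id in dagbag.dag_ids:
    print(dag_id)

This avoids re-parsing the DAG files, so it should be much faster than the 20-second parse.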
Upvotes: 0
Reputation: 41
You can connect to the Airflow backend (metadata) database directly; by default Airflow uses SQLite.
Then you can check the DAGs' status in the dag table, using the is_active and is_paused columns, e.g.:

airflow=# SELECT dag_id FROM dag WHERE is_active=TRUE AND is_paused=FALSE;
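From a Lambda function, the same query can be run with a database driver. A minimal sketch, assuming a PostgreSQL metadata DB and hypothetical connection details (use sqlite3, pymysql, etc. for other backends):

import psycopg2

# hypothetical connection details - replace with your metadata DB settings
conn = psycopg2.connect(host="your-airflow-db-host", dbname="airflow",
                        user="airflow", password="your-password")
with conn, conn.cursor() as cur:
    cur.execute("SELECT dag_id FROM dag WHERE is_active=TRUE AND is_paused=FALSE")
    dag_ids = [row[0] for row in cur.fetchall()]
print(dag_ids)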
Upvotes: 2
Reputation: 256
from airflow.models import DagBag

dag_ids = DagBag(include_examples=False).dag_ids
for dag_id in dag_ids:
    print(dag_id)
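Note that this requires Airflow itself (and the DAGs folder) to be available where the code runs, which is usually not the case inside a Lambda function. Once you have the DAG names, triggering from Lambda via the experimental API mentioned in the question looks roughly like this; a sketch assuming Airflow 1.10.x with the experimental API enabled and a hypothetical webserver URL:

import requests

AIRFLOW_URL = "http://your-airflow-webserver:8080"  # hypothetical endpoint

def trigger_dag(dag_id):
    # POST /api/experimental/dags/<DAG_ID>/dag_runs creates a new DAG run
    resp = requests.post(
        f"{AIRFLOW_URL}/api/experimental/dags/{dag_id}/dag_runs",
        json={"conf": {}},
    )
    resp.raise_for_status()
    return resp.json()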
Upvotes: 10
Reputation: 11
This command shows all DAGs, including disabled (paused) DAGs as well:

airflow list_dags
Upvotes: 0
Reputation: 22887
Since Airflow 2.0, the airflow list_dags command is now:

airflow dags list [-h] [-o table, json, yaml, plain] [-S SUBDIR] [-v]

with the following named arguments:

-o, --output
-S, --subdir
-v, --verbose
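If you need the names in a script, the JSON output is the easiest to consume. A sketch, assuming Airflow 2.x is installed on the machine running it (the exact field names can vary between versions):

import json
import subprocess

# run the CLI and parse its machine-readable output
raw = subprocess.check_output(["airflow", "dags", "list", "-o", "json"])
dags = json.loads(raw)
print([d["dag_id"] for d in dags])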
Upvotes: 9
Reputation: 439
All Airflow CLI commands for the various versions are listed on this URL - https://airflow.apache.org/docs/apache-airflow/stable/usage-cli.html

In Airflow 1.10.x you could do:

airflow list_dags

(in 2.x this became airflow dags list, as noted above).
Upvotes: 2