Reputation: 139
I'm new to Airflow. I'm following the official tutorial to set up my first DAG and task:
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime, timedelta

default_args = {
    'owner': 'admin',
    'retries': 3,
    'retry_delay': timedelta(minutes=1)
}

with DAG(
    dag_id="hello_world_dag",
    description="Hello world DAG",
    start_date=datetime(2023, 1, 16),
    schedule_interval='@daily',
    default_args=default_args
) as dag:
    task1 = BashOperator(
        task_id="hello_task",
        bash_command="echo hello world!"
    )

    task1
When I tried to trigger this DAG manually, it always failed. I've checked the web server logs and the scheduler logs; neither shows any obvious errors. I also checked the task run logs, which are empty.
The setup is pretty simple: SequentialExecutor with SQLite. My question is: where can I see the worker logs, or is there any other place where useful messages get logged?
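For context, these are the locations I've been checking so far (a sketch assuming the stock `AIRFLOW_HOME` of `~/airflow`; adjust if yours differs):

```shell
# Assumed default locations; override AIRFLOW_HOME if your install differs.
AIRFLOW_HOME="${AIRFLOW_HOME:-$HOME/airflow}"
echo "scheduler logs: $AIRFLOW_HOME/logs/scheduler"
echo "task run logs:  $AIRFLOW_HOME/logs"   # per-DAG subfolders live here
```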
Upvotes: 0
Views: 73
Reputation: 139
OK, I finally figured this out.
First, let me correct my question: there actually is an error raised in the scheduler log saying that the "BashTaskRunner" cannot be loaded. So I searched Airflow's source code and found that it was renamed to StandardTaskRunner about 3 years ago (link).
This is the only occurrence of the word BashTaskRunner in the whole repo. So I'm curious how the AIRFLOW_HOME/airflow.cfg is generated, since it sets this as the default task_runner value.
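The fix, then, is to update the stale task_runner entry in airflow.cfg. A minimal sketch, assuming the default Airflow 2.x config layout (section name and default path may differ on older versions):

```ini
# $AIRFLOW_HOME/airflow.cfg
[core]
# Was: task_runner = BashTaskRunner  (class no longer exists)
task_runner = StandardTaskRunner
```

After editing the file, restart the scheduler so the new value is picked up.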
Upvotes: 1