I am trying to run bash scripts using Apache Airflow on Docker. This is the bash script I am running:
#!/bin/bash
rootdir=/home/anti/Documents/logistics/ariflowtest
crondir="$rootdir/cron/$(date "+%Y%m%d/%H")"
mkdir -p "$crondir"
/usr/bin/python3 "$rootdir/her.py" >> "$crondir/cron.log" 2>&1
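For reference, a minimal sketch of the directory layout that script produces, using a path under /tmp as a stand-in for the real rootdir above — it creates one log directory per hour:

```shell
#!/bin/sh
# Stand-in for the real rootdir (/home/anti/Documents/logistics/ariflowtest):
rootdir=/tmp/ariflowtest_demo
# Same pattern as the script above: cron/YYYYMMDD/HH
crondir="$rootdir/cron/$(date "+%Y%m%d/%H")"
mkdir -p "$crondir"
echo "created: $crondir"
```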
When I run the Airflow DAG, it shows this error:
Running command: ./aus.sh
[2020-11-12 08:15:14,821] {{bash_operator.py:122}} INFO - Output:
[2020-11-12 08:15:14,823] {{bash_operator.py:126}} INFO - /tmp/airflowtmp_856j2oz/task_ausg7zzejtp: line 1: ./aus.sh: No such file or directory
I tried passing the relative path of the script, but that did not work either. This is my DAG code:
from airflow.models import DAG
from airflow.utils.dates import days_ago
from airflow.operators.python_operator import PythonOperator
from airflow.operators.bash_operator import BashOperator
from airflow.operators.dummy_operator import DummyOperator
import os

args = {
    'owner': 'anti',
    'start_date': days_ago(1)
}

dag = DAG(dag_id='sn_logistics2', default_args=args, schedule_interval='*/5 * * * *')

create_command_dpd = "./home/anti/Documents/logistics/ariflowtest/dpd.sh "
create_command_aus = "./home/anti/Documents/logistics/ariflowtest/aus.sh "

with dag:
    dummy_operator = DummyOperator(task_id='mothertrigger', retries=3)
    task_dpd = BashOperator(
        task_id='task_dpd',
        bash_command="/dpd.sh ",
        # bash_command=create_command_dpd,
        xcom_push=True,
        dag=dag
    )
    task_aus = BashOperator(
        task_id='task_aus',
        bash_command="/aus.sh ",
        # bash_command=create_command_aus,
        xcom_push=True,
        dag=dag
    )
    dummy_operator >> task_dpd
    dummy_operator >> task_aus
I checked whether my dag folder is mounted:
sudo docker inspect -f '{{ .Mounts }}' 1e6c9974a9f3
Output:
[{bind /home/anti/Documents/dags /usr/local/airflow/dags rw true rprivate}]
The script files I am trying to run and the docker-compose file are in two different locations. The scripts are in:
/home/anti/Documents/logistics/ariflowtest
and docker-compose is in:
/home/anti/Documents
I also made sure the dags folder is mounted correctly in the docker-compose file:
version: '3.7'
services:
    postgres:
        image: postgres:9.6
        environment:
            - POSTGRES_USER=airflow
            - POSTGRES_PASSWORD=airflow
            - POSTGRES_DB=airflow
        logging:
            options:
                max-size: 10m
                max-file: "3"
    webserver:
        image: puckel/docker-airflow:1.10.9
        restart: always
        depends_on:
            - postgres
        environment:
            - LOAD_EX=y
            - EXECUTOR=Local
        logging:
            options:
                max-size: 10m
                max-file: "3"
        volumes:
            - ./dags:/usr/local/airflow/dags
            - ./scripts:/usr/local/airflow/scripts
            # - ./plugins:/usr/local/airflow/plugins
        ports:
            - "8080:8080"
        command: webserver
        healthcheck:
            test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
            interval: 30s
            timeout: 30s
            retries: 3
Upvotes: 0
Views: 2886
Reputation: 5076
The problem is that the scripts aus.sh and dpd.sh are not in the docker container. You can solve this by mounting them as well, e.g.:
volumes:
    - ./dags:/usr/local/airflow/dags
    - ./scripts:/usr/local/airflow/scripts
    - /home/anti/Documents/logistics/ariflowtest:/scripts
    # - ./plugins:/usr/local/airflow/plugins
In the DAG code you will then have to change bash_command="/aus.sh " to bash_command="/scripts/aus.sh ". The same goes for dpd.sh.
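As a quick sanity check on the host side (sketched here with a /tmp directory standing in for /home/anti/Documents/logistics/ariflowtest): the container path /scripts/aus.sh maps back to the host file, which must exist and be executable for bash_command="/scripts/aus.sh " to run it directly.

```shell
#!/bin/sh
# Stand-in for the host directory that the extra volume would mount:
host_root=/tmp/ariflowtest_demo_mount
mkdir -p "$host_root"
# Stand-in for the real aus.sh:
printf '#!/bin/sh\necho aus ran\n' > "$host_root/aus.sh"
# The executable bit is needed because the BashOperator command
# invokes the script directly rather than via "bash aus.sh":
chmod +x "$host_root/aus.sh"
test -x "$host_root/aus.sh" && echo "mount source ok"
```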
Upvotes: 1
Reputation: 189658
You are using ./home/anti/Documents/logistics/ariflowtest/aus.sh where you almost certainly mean /home/anti/Documents/logistics/ariflowtest/aus.sh, without a dot in front. In brief, you are looking for a directory named home in the current directory, and inside that anti, and inside that Documents; whereas presumably the real location is home in the root directory.
Perhaps see also Difference between ./ and ~/
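A minimal sketch of why the relative form fails here: BashOperator executes its command from a freshly created temporary directory (the /tmp/airflowtmp... path in the error log above), so a leading "./" is resolved against that empty directory.

```shell
#!/bin/sh
# Stand-in for Airflow's per-task temporary working directory:
workdir=$(mktemp -d)
cd "$workdir"
./aus.sh 2>&1 || true   # fails: there is no aus.sh in the temp directory
# A path starting with "/" is resolved from the filesystem root and
# works regardless of the current directory.
```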
Upvotes: 0