phenderbender

Reputation: 785

Errno 13 Permission denied when Airflow tries to write to logs

We're running into a permission error when using Airflow, receiving the following error:

PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/logs/scheduler/2019-12-18/../../../../home

We've tried using chmod 777 -R on the /usr/local/airflow/logs/scheduler directory within the container, but this doesn't seem to have done the trick.

We have this piece in our entrypoint.sh script:

export AIRFLOW__CORE__BASE_LOGS_FOLDER="/usr/local/airflow/logs"

Has anyone else run into this airflow log permission issue? Can't seem to find much about this one in particular online.

Upvotes: 22

Views: 64095

Answers (8)

Giorgos Myrianthous

Reputation: 39810

Another approach would be to copy the files into the image whilst also changing the ownership.

COPY --chown=airflow . .
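
For context, a minimal Dockerfile sketch of that idea (the base image tag below is an assumption; the airflow user exists in the official apache/airflow images):

FROM apache/airflow:2.3.0
# copy DAGs and project files into the image, owned by the airflow user so the
# container can create and write log files under them
COPY --chown=airflow . .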

Upvotes: 0

Dominik Sajovic

Reputation: 841

I had the same error.

PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/logs/scheduler'

The reason I got that error is that I didn't create the initial three folders (dags, logs, plugins) before running the Airflow Docker container. Docker seems to have created them automatically, but with the wrong permissions.

Steps to fix:

  1. Stop the current containers:
docker-compose down --volumes --remove-orphans
  2. Delete the dags, logs and plugins folders.
  3. Just in case, destroy the images and volumes already created (in Docker Desktop).
  4. Recreate the folders from the command line:
mkdir logs dags plugins
  5. Run Airflow in Docker again (the full sequence is consolidated in the sketch after this list):
docker-compose up airflow-init
docker-compose up
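
Taken together, the recovery looks roughly like this as a shell session (run it from the directory that holds your docker-compose.yaml; the commands simply mirror the steps above):

docker-compose down --volumes --remove-orphans   # stop containers and drop volumes
rm -rf dags logs plugins                         # remove the folders Docker created as root
mkdir dags logs plugins                          # recreate them owned by your own user
docker-compose up airflow-init                   # re-initialise the Airflow metadata DB
docker-compose up                                # start Airflow again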

Upvotes: 6

kkpalczewski

Reputation: 45

A little late to the party, but you can run the service with the user/group that owns the directory.

When your docker-compose stack is up, you can run docker-compose exec SERVICE_NAME bash, check which group the directory in question belongs to, and then add that group to the user setting in docker-compose.yml:

service_name:
     ...
     user: USER_NAME:USER_GROUP
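
As an illustrative sketch (the service name airflow-scheduler, the path /opt/airflow/logs and the IDs 50000:0 are assumptions, not values taken from the question):

docker-compose exec airflow-scheduler ls -ldn /opt/airflow/logs   # prints the numeric owner UID and GID

and then in docker-compose.yml:

    airflow-scheduler:
      ...
      user: "50000:0"   # UID:GID matching the owner/group reported above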

Upvotes: 0

Starnuto di topo

Reputation: 3569

I was having the same problem running an Airflow image in Docker hosted on Windows.

My solution was to override the CMD in the scheduler's Dockerfile with a CMD that sets the file permissions before launching the default CMD.

The default CMD can be obtained with docker inspect -f '{{.Config.Cmd}}' <schedulerImageId>.

For example, I used the Bitnami image (docker.io/bitnami/airflow-scheduler:2.1.4-debian-10-r16). Inspecting the image, I saw that the default CMD was

/opt/bitnami/scripts/airflow-scheduler/run.sh

So I created a run.sh script with the following content:

#! /bin/bash

# relax permissions on the logs directory, then hand off to the image's original CMD
chmod -R 777 /opt/bitnami/airflow/logs/
. /opt/bitnami/scripts/airflow-scheduler/run.sh

Then I added the following lines at the end of my Dockerfile:

COPY run.sh /
RUN  chmod +x /run.sh

CMD /run.sh
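
If the scheduler is started through docker-compose, the customised image can be wired in roughly like this (the service name and Dockerfile name here are assumptions):

    airflow-scheduler:
      build:
        context: .
        dockerfile: Dockerfile.scheduler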

Upvotes: 0

Galuoises

Reputation: 3283

I solved the issue: in my case the problem was that the volume-mounted folders, logs and dags, didn't have write permission. I added it with

chmod -R 777 dags/
chmod -R 777 logs/

and in the docker-compose file they are mounted as

    volumes:
      - ./dags:/opt/bitnami/airflow/dags
      - ./logs:/opt/bitnami/airflow/logs

Upvotes: 10

rubbengimenez

Reputation: 249

Just for anyone with the same issue...

Surprisingly, I had to take a look at the Airflow documentation... and according to it:

On Linux, the mounted volumes in container use the native Linux filesystem user/group permissions, so you have to make sure the container and host computer have matching file permissions.

mkdir ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env

Once you have matched file permissions:

docker-compose up airflow-init
docker-compose up
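
For context, the official docker-compose.yaml reads those values back so the containers run as your host user; the exact line differs between Airflow versions, but it is roughly:

    user: "${AIRFLOW_UID:-50000}:0"

so files created in the bind-mounted dags, logs and plugins folders end up owned by your host UID rather than root.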

Upvotes: 16

Sairam Krish

Reputation: 11681

Permissions on a bind-mounted folder can also result in this error.

For example:

docker-compose.yml (pseudo code)

   service_name:
     ...
     volumes:
      - /home/user/airflow_logs:/opt/airflow/logs

Grant permission on the local folder so that the Airflow container can write logs, create directories if needed, etc.:

 sudo chmod u=rwx,g=rwx,o=rwx /home/user/airflow_logs
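
If you prefer not to open the folder to everyone, an alternative (assuming the official apache/airflow image, which runs as UID 50000 in group 0 by default) is to hand ownership of the host folder to that user instead:

 sudo chown -R 50000:0 /home/user/airflow_logs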

Upvotes: 13

Muhammad Radifar

Reputation: 1499

I also had the same problem using Apache Airflow 1.10.7.

Traceback (most recent call last):
  File "/usr/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 135, in _run_file_processor
    set_context(log, file_path)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/logging_mixin.py", line 198, in set_context
    handler.set_context(value)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 65, in set_context
    local_loc = self._init_file(filename)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 148, in _init_file
    os.makedirs(directory)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  [Previous line repeated 5 more times]
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 221, in makedirs
    mkdir(name, mode)
PermissionError: [Errno 13] Permission denied: '/media/radifar/radifar-dsl/Workflow/Airflow/airflow-home/logs/scheduler/2020-01-04/../../../../../../../home'

After checking how file_processor_handler.py works, I found that the error was caused by the example DAGs living in a different directory from the one in our dag folder settings. In my case, seven folders above the 2020-01-04 folder is /media/radifar. In your case, four folders above the 2019-12-18 folder is /usr/local. That's why the PermissionError was raised.

I was able to solve this problem by cleaning the AIRFLOW_HOME folder, running airflow version, and setting load_examples to False in airflow.cfg. Then I ran airflow initdb. After that I could use Airflow without errors.
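
Roughly, those steps as shell commands (the sed edit is illustrative; you can change load_examples in airflow.cfg by hand instead):

# assumes AIRFLOW_HOME is set to your Airflow home directory
rm -rf "${AIRFLOW_HOME:?}"/*              # clean out the existing AIRFLOW_HOME
airflow version                           # regenerates a fresh airflow.cfg
sed -i 's/^load_examples = True/load_examples = False/' "$AIRFLOW_HOME/airflow.cfg"
airflow initdb                            # Airflow 1.10.x; Airflow 2.x uses "airflow db init"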

Upvotes: 2
