Miha Šušteršič

Reputation: 10052

gitlab-CI pipeline: lftp error 550 when trying to delete files

I am using the free shared runners on gitlab.com. I have a GitLab CI pipeline that runs the following lftp commands one after the other:

The purpose of these commands is to delete the contents of the httpdocs folder (the previous files) and then upload the new build artifact.

The CI pipeline is triggered from a CMS. It sometimes happens that the content editors update content in parallel, resulting in many triggers that run concurrently (the pipeline takes about 3 minutes to finish).

The pipeline will then start failing with the following error:

rm: Access failed: 550 /httpdocs/build-html-styles.css: No such file or directory

This happens because a file already deleted by another pipeline is still queued for deletion in this one. A very similar error occurs when the httpdocs folder is completely empty. This results in my whole pipeline failing (the second lftp command, the upload, does not get executed at all).
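As an aside (not part of the original post): the upload never runs because CI job scripts abort on the first failing command, errexit-style. A minimal, hypothetical stopgap is to swallow the delete step's failure with `|| true` so the upload still executes; this hides the symptom rather than fixing the race:

```shell
#!/bin/bash
# CI job scripts behave like `set -e`: the first failing command
# aborts the job, so the upload after a failed delete never runs.
set -e
# Hypothetical stopgap: swallow the delete's failure so later steps run.
rm no-such-file 2>/dev/null || true
echo "upload step still runs"
```

Note this makes the job pass even when the delete fails for a real reason (wrong credentials, wrong path), so it should be a last resort.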

Examples of failing pipelines and their output:

How do I prevent this from happening? Using lftp to upload the artifact is not a must - I am running the node:8.10.0 Docker image. Gitlab-ci.yml file in question.

Upvotes: 4

Views: 1265

Answers (3)

Timothy Gonzalez

Reputation: 760

A concurrency level of 50 (--parallel=50) seems high. I had a 550 permission error with lftp as well, and lowering the level of concurrency fixed the issue for me.
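Applied to the mirror invocation from the question, that would mean lowering the flag, e.g. (a sketch only, not runnable without a live FTP server; USERNAME, PASSWORD and HOST are placeholders):

```shell
# Same mirror call as the question, with concurrency lowered from 50 to 5.
lftp -c "set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST; \
         mirror -R public/ httpdocs --ignore-time --parallel=5 \
         --exclude-glob .git* --exclude .git/"
```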

Upvotes: 0

KamilCuk

Reputation: 141155

In the comments I suggested simple file locking with active polling. I have no experience with lftp, but piecing things together from various internet resources like this one, I have written the following. Since lftp does not support file locking in the protocol, you could do something like this:

const="set ftp:ssl-allow no; open -u $USERNAME,$PASSWORD $HOST"
# wait while the lockfile exists (another pipeline holds the lock)
while lftp -c "$const; df lockfile"; do sleep 1; done
# create the lockfile
lftp -c "$const; mkdir lockfile"
# the work
lftp -c "$const; glob -a rm -r ./httpdocs/*"
lftp -c "$const; mirror -R public/ httpdocs --ignore-time --parallel=50 --exclude-glob .git* --exclude .git/"
# remove the lockfile
lftp -c "$const; rmdir lockfile"

I used mkdir/rmdir and a directory instead of a file because I don't know how to create an empty file with lftp. There is still a race condition between checking for the lockfile and creating it, but this should protect at least against two concurrent accesses. To shrink the window further you could do something like sleep 0.$(printf "%02d" $((RANDOM % 100))) - make the sleep time random, so the pipelines are less likely to create the lockfile at the same moment.
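The randomized back-off can be sketched as a local demo (no FTP involved; bash's $RANDOM supplies the two-digit fraction):

```shell
#!/bin/bash
# Random fractional delay between 0.00 and 0.99 seconds, so that
# concurrent pipelines are unlikely to race for the lock at the same instant.
delay="0.$(printf '%02d' $((RANDOM % 100)))"
sleep "$delay"
echo "backed off for ${delay}s"
```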

Also, just in case, I wouldn't mirror to the httpdocs directory, but to some temporary directory like tmp=httpdocs_$(uuidgen); lftp "mirror .. $tmp", which could later be renamed with lftp "rmdir httpdocs; rename $tmp httpdocs". This makes deployments safer, with less downtime (less time with httpdocs being empty). For the future, I suggest moving to a safer/more advanced protocol for connecting to your remote server, one that supports file locking - like ssh, or maybe samba.
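The swap pattern can be demonstrated locally with plain mv standing in for lftp's rename (directory names are illustrative; the lftp equivalents are noted in the comments):

```shell
#!/bin/bash
# Local demo of the swap-deploy pattern: build into a uniquely named
# directory, then replace httpdocs in one quick rename so the site is
# almost never empty.
set -e
workdir=$(mktemp -d) && cd "$workdir"
mkdir httpdocs                        # current live content
tmp="httpdocs_$$"                     # unique staging dir ($$ stands in for uuidgen)
mkdir "$tmp"
echo "new build" > "$tmp/index.html"  # with lftp: mirror -R public/ $tmp
rm -rf httpdocs                       # brief window; with lftp: rm -r httpdocs
mv "$tmp" httpdocs                    # with lftp: rename $tmp httpdocs
cat httpdocs/index.html               # -> new build
```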

Upvotes: 1

VonC

Reputation: 1324935

lavv17/lftp issue 302 advocated for lftp to skip such content, but that has not gained any traction.

The mailing list suggests:

To delete a directory, use rmdir.

In your case: rm -rf httpdocs/ (followed by mkdir httpdocs if you need the empty folder back).

Upvotes: 0
