Reputation: 1244
A script regularly downloads some data file from a remote server with wget:
CERTDIR=folder1
SPOOLDIR=folder2
URL="http://..."
FILENAME="$SPOOLDIR/latest.xml.gz"
/usr/bin/wget \
    -N \
    --quiet \
    --private-key="${CERTDIR}/keynopass.pem" \
    --ca-certificate="${CERTDIR}/ca.pem" \
    --certificate="${CERTDIR}/client.pem" \
    "$URL" \
    --output-document "${FILENAME}"
The -N switch turns on timestamping (possibly redundant, as this seems to be the default).
I expected the file to be downloaded only if there is a newer remote version, but that is not the case: the download happens regardless of whether the remote file has the same timestamp as the local file.
The file is fairly large, so my plan was to check frequently for a new version but download it only when needed. Unfortunately this does not seem possible with this approach.
Just a guess: the URL does not reference a file but is an API call. Could this be the reason?
However, the timestamp of the local file is set to the timestamp of the remote file, so I know that the timestamp information is available.
Am I missing something?
Upvotes: 2
Views: 299
Reputation: 13097
The documentation mentions that:
Use of -O is not intended to mean simply "use the name file instead of the one in the URL;" rather, it is analogous to shell redirection: wget -O file http://foo is intended to work like wget -O - http://foo > file; file will be truncated immediately, and all downloaded content will be written there. For this reason, -N (for timestamp-checking) is not supported in combination with -O: since file is always newly created, it will always have a very new timestamp. A warning will be issued if this combination is used.
Thus one option would be to leave out the -O option, let wget download the file (if needed), and just create a symlink in your target directory called latest.xml.gz pointing to the downloaded file...
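A minimal sketch of that approach, assuming the URL ends in an actual file name (data.xml.gz and example.com are placeholders, not taken from the question):

CERTDIR=folder1
SPOOLDIR=folder2
URL="https://example.com/data.xml.gz"   # assumed URL ending in a real file name

cd "$SPOOLDIR" || exit 1

# Without -O, -N can compare timestamps and skip the download
# when the remote file is not newer than the local copy.
/usr/bin/wget \
    -N \
    --quiet \
    --private-key="${CERTDIR}/keynopass.pem" \
    --ca-certificate="${CERTDIR}/ca.pem" \
    --certificate="${CERTDIR}/client.pem" \
    "$URL"

# Keep the stable name that the rest of the script expects.
ln -sf "$(basename "$URL")" latest.xml.gz

If the URL really is an API endpoint with no file name component, wget will derive the local file name from the URL in some other way, so the name passed to ln may need adjusting.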
Upvotes: 3