Decrypter

Reputation: 3000

Keep files updated from remote server

I have a server at hostname.com/files. Whenever a file is uploaded, I want to download it.

I was thinking of creating a script that constantly checked the files directory. It would check the timestamp of the files on the server and download them based on that.

Is it possible to check the files' timestamps using a bash script? Are there better ways of doing this?

I could just download all the files on the server every hour. Would it therefore be better to use a cron job?

Upvotes: 0

Views: 605

Answers (1)

icedwater

Reputation: 4887

If you have a regular interval at which you'd like to update your files, yes, a cron job is probably your best bet. Just write a script that does the checking and run that at an hourly interval.
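
For the checking itself, one option is to compare the server's Last-Modified header against the value saved on the previous run. Here is a minimal sketch, assuming the server actually sends a Last-Modified header; the URL and local paths are placeholders:

#!/bin/bash
# Minimal sketch: re-download only when the server's Last-Modified
# header changes. The URL and paths below are placeholders.
URL="http://hostname.com/files/example.txt"
LOCAL="/local/files/example.txt"
STAMP="/local/files/example.txt.stamp"

# HEAD request only: -s silences progress output, -I fetches headers.
remote=$(curl -sI "$URL" | grep -i '^Last-Modified:')

# Download only when the header differs from the one saved last run.
if [ ! -f "$STAMP" ] || [ "$remote" != "$(cat "$STAMP")" ]; then
    curl -s -o "$LOCAL" "$URL"
    printf '%s\n' "$remote" > "$STAMP"
fi

Run that from cron every hour and you only transfer data when something has actually changed.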

As @Barmar commented above, rsync could be another option. Put something like this in the crontab and you should be set:

# min  hour  day-of-month  month  day-of-week  user  command
17     *     *             *      *            user  rsync -av http://hostname.com/ /local/files/ >> rsync.log

This would grab files from the server at that location and append the details to rsync.log at 17 minutes past every hour (note that the user field belongs only in the system-wide /etc/crontab, not in a per-user crontab). Right now, though, I can't get rsync to fetch files from a web server: rsync doesn't speak HTTP, so the server needs to offer SSH access or run an rsync daemon.
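
If you do have SSH access to the server, a working variant might look like the entry below. This is hypothetical: it assumes the files live in /var/www/files on the server and that the entry goes in your own crontab, which has no user field.

# Hypothetical per-user crontab entry: pull the directory over SSH
# at 17 minutes past every hour, logging both output and errors.
17 * * * * rsync -av user@hostname.com:/var/www/files/ /local/files/ >> rsync.log 2>&1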

Another option using wget is:

wget -Nrb -np -o wget.log http://hostname.com/

where -N downloads a file only if the copy on the server is newer than the timestamp on the local version, -b sends the process to the background, -r recurses into directories, and -o specifies a log file. -np keeps wget from ascending into the parent directory, so it doesn't end up spidering the entire server's content. Unlike rsync, this works against an arbitrary web server.
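
Put together with cron, an entry along these lines would mirror the directory hourly. The paths are hypothetical, and -b is dropped since cron already runs the job non-interactively:

# Hypothetical per-user crontab entry: hourly wget mirror that only
# fetches files newer than the local copies.
17 * * * * cd /local/files && wget -Nr -np -o wget.log http://hostname.com/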

More details, as usual, will be in the man pages of rsync or wget.

Upvotes: 1
