Val Nolav

Reputation: 902

Getting an XML file from a website and copying it to a server

My question is fairly broad, since I do not know exactly what I need to do. This is my first attempt at something like this, so I hope I can express my requirements adequately.

I want to read an XML file from a website and copy it to my Amazon cloud server account. I want to run the code on my Amazon server, which is a Linux platform.

1- I need to check the XML file on the website every day.

2- If there is a change in the XML file, I will fetch it and copy it to my Amazon cloud server. (Perhaps I can compare the character lengths of today's and the previous day's XML files to tell whether there is a change.)

3- I did some research and found that the wget command can be used to download a file.

Could you please give me some sample code and guidelines?

Many thanks,

I apologize if my question is nonsense or ambiguous.

Upvotes: 0

Views: 550

Answers (1)

imm

Reputation: 5919

Yes, you could use the wget or curl commands to download the XML file, and diff to compare the new file to the old one. Look into creating a bash shell script to automate these steps, and schedule it to run periodically with cron. You could have this run directly on your "cloud server", rather than transferring the XML file there after doing the checks elsewhere.
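A minimal sketch of what such a script might look like is below. The URL, directory, and script paths are placeholders you would need to adjust for your own setup; it keeps a dated copy only when the downloaded file differs from the previous one.

    #!/usr/bin/env bash
    # Download the XML file and keep a copy only when it has changed.
    # URL and paths below are placeholders -- adjust them for your setup.
    set -euo pipefail

    URL="https://example.com/data.xml"   # the XML file on the remote website (placeholder)
    DIR="$HOME/xml-snapshots"            # where copies are stored on the server
    CURRENT="$DIR/current.xml"           # last saved version
    NEW="$DIR/new.xml"                   # freshly downloaded version

    mkdir -p "$DIR"

    # Fetch the file; -q keeps wget quiet, -O writes to a fixed name.
    wget -q -O "$NEW" "$URL"

    # If there is no previous copy, or the contents differ, keep the new file.
    if [ ! -f "$CURRENT" ] || ! diff -q "$CURRENT" "$NEW" >/dev/null; then
        cp "$NEW" "$DIR/$(date +%F).xml"   # dated archive copy
        mv "$NEW" "$CURRENT"
        echo "XML changed; new copy saved."
    else
        rm "$NEW"
        echo "No change."
    fi

You could then add a line like the following to your crontab (edit it with crontab -e) to run the check once a day at 06:00; the script path here is just an example:

    0 6 * * * /home/ec2-user/check_xml.sh >> /home/ec2-user/check_xml.log 2>&1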

Upvotes: 2
