Reputation: 533
I need to download some files from a website. The website contains some charts and offers the option to download each chart's data as a CSV file. I tried using wget:
wget --save-cookies cookies.txt --post-data 'user=foo&password=bar' 'https://websiteyyyyyyyy/cacti/graph_xport.php?local_graph_id=1234'
But the command only downloads the login page of the website, and I need to download the CSV files attached to the charts.
I don't know if it is even possible to get the files this way.
I tried using curl but got the same result.
Any advice?
Upvotes: 0
Views: 3291
Reputation: 181
Many websites keep track of whether you are logged in via session cookies. If the server sees that you are not logged in (because you have not sent a valid session cookie), it redirects you to the login page, even though you have provided your password. Therefore you may have to run wget once to log in, and then run it a second time to actually retrieve the file. Note that --save-cookies alone may not be enough; you may also need to add --keep-session-cookies:
wget --save-cookies cookies.txt --keep-session-cookies --post-data 'user=foo&password=bar' 'https://websiteyyyyyyyy/loginpage'
wget --load-cookies cookies.txt 'https://websiteyyyyyyyy/graph_xport.php?local_graph_id=1234'
There is an example very similar to the above in the Wget documentation in the section about the --post-file option.
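Since the question also mentions trying curl without success, here is a minimal Python sketch of the same two-step flow, using only the standard library. The URLs and form field names (`user`, `password`) are the hypothetical ones from the commands above; a cookie jar attached to the opener plays the role of `--save-cookies`/`--load-cookies`:

```python
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

def make_session():
    # Build an opener whose cookie jar stores and replays cookies,
    # including session cookies, across requests.
    jar = CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    return opener, jar

def login_and_fetch(opener, login_url, data_url, user, password):
    # Step 1: POST the credentials. Any Set-Cookie headers from the
    # server land in the jar attached to this opener.
    creds = urllib.parse.urlencode({"user": user, "password": password}).encode()
    opener.open(login_url, data=creds).read()
    # Step 2: reuse the same opener (and therefore the session cookie)
    # to fetch the CSV export.
    return opener.open(data_url).read()

# Example call (hypothetical URLs, as in the wget commands above):
# opener, jar = make_session()
# csv_bytes = login_and_fetch(
#     opener,
#     "https://websiteyyyyyyyy/loginpage",
#     "https://websiteyyyyyyyy/graph_xport.php?local_graph_id=1234",
#     "foo", "bar",
# )
```

The key point is the same as with wget: both requests must share one cookie store, otherwise the second request arrives without the session cookie and gets the login page again.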
Upvotes: 1
Reputation: 73
That URL looks like it points to the page that hosts the file, but not to the file itself. To download a file directly, you need something like
wget http://www.examplesite.com/subpage/yourfile.txt
which downloads yourfile.txt.
However, if those charts are created dynamically by a server-side script (for example, chart generation based on user input), simply pointing at the page that embeds the chart will not run that script, which in turn will not create the file or initiate the download. You have to use a URL that points to an actual file location, and the file must already exist before the request is made.
Upvotes: 0