Reputation: 170856
I would like to know if there is an easy way to list all files/directories from an HTTP file share - by default the HTTP server displays them, but I'm wondering if there is an easy way to get the list of files without manually parsing the returned webpage.
Any solution that uses curl, wget or python should be just fine.
Upvotes: 0
Views: 4109
Reputation: 864
wget is only designed to download files, not list directories.
If that's all you've got, though...
wget -r http://SOME.SITE/PATH 2>&1 | grep 'Saving to:' | sed "s/Saving to: \`\([^?']*\).*'/\1/" | uniq -u
rm -rf SOME.SITE
(Just so you don't sue me later, this is downloading all of the files from the site and then deleting them when it's done)
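Since the question also allows python, a minimal sketch that parses the directory index page itself, instead of downloading everything, might look like the following. It assumes the server returns a standard HTML autoindex page whose entries are plain anchor tags; the URL is the same placeholder as above.

#!/usr/bin/env python3
# Minimal sketch: list entries from an HTTP directory index without downloading them.
# Assumes a standard HTML autoindex page whose entries are <a href="..."> links.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkLister(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect every href except the parent-directory link.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value not in ("../", "/"):
                    self.links.append(value)

parser = LinkLister()
parser.feed(urlopen("http://SOME.SITE/PATH/").read().decode("utf-8", errors="replace"))
for link in parser.links:
    print(link)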
Edit: Sorry, I'm tired. If you want only the top-level directories, you can do something like this:
wget -rq http://SOME.SITE/PATH
ls -1p SOME.SITE | grep '/$'
rm -rf SOME.SITE
This does the same as above, but only lists immediate subdirectories of the URL.
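If you go the python route instead, the LinkLister sketch above gives you the immediate subdirectories by filtering on the trailing slash that autoindex pages normally put on directory entries:

for link in parser.links:
    if link.endswith("/"):
        print(link)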
Upvotes: 1