Jay H

Reputation: 649

How to get a list of all paths/files on a webpage using wget or cURL in PHP?

I use wget -p $url to fetch all the files on a webpage so that I can build a list of them, but for some URLs only index.html gets fetched. Is there a way to get a list of the files at a specific URL with wget or cURL? Do I need to inspect the request and response headers?
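For reference, here is roughly what I am running now (with http://example.com/ standing in for $url), plus a spider-style variant that only logs URLs instead of saving files; it still depends on the pages linking to the files, so it doesn't help when only index.html comes back:

    # -p fetches one page plus its requisites (images, CSS); it does not follow links.
    wget -p http://example.com/

    # Recursive crawl (depth 2, no ascending to parent directories) that only
    # checks URLs and records them in urls.log instead of saving the files.
    wget -r -np -l 2 --spider -nv -o urls.log http://example.com/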

Upvotes: 0

Views: 6884

Answers (1)

gcochard

Reputation: 11744

Some servers do not allow directory listings at all, and even when they do, a default document (such as index.html) in the directory takes over, so you cannot browse the listing either way.

You need to implement a spider that parses the HTML, extracts every path, file, and link it references, and builds up the set of files the pages actually declare and use. Then you can download those files.
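For illustration, here is a minimal single-level sketch in PHP using cURL and DOMDocument. The listLinkedUrls helper name, the example URL, and the choice of tags to scan are just assumptions for the example; a real spider would also resolve relative URLs and recurse into the pages it finds.

    <?php
    // Minimal sketch: fetch one page and list every URL it references.
    // Assumes the files you want are linked from <a>, <img>, <script>,
    // or <link> tags in the page's HTML.
    function listLinkedUrls($pageUrl)
    {
        // Fetch the page body with cURL.
        $ch = curl_init($pageUrl);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        $html = curl_exec($ch);
        curl_close($ch);
        if ($html === false) {
            return array();
        }

        // Parse the HTML and collect href/src attribute values.
        $doc = new DOMDocument();
        libxml_use_internal_errors(true);   // tolerate broken markup
        $doc->loadHTML($html);
        libxml_clear_errors();

        $urls = array();
        $targets = array('a' => 'href', 'img' => 'src', 'script' => 'src', 'link' => 'href');
        foreach ($targets as $tag => $attr) {
            foreach ($doc->getElementsByTagName($tag) as $node) {
                $value = trim($node->getAttribute($attr));
                if ($value !== '' && strpos($value, '#') !== 0) {
                    // May be relative; resolve against $pageUrl before downloading.
                    $urls[] = $value;
                }
            }
        }
        return array_values(array_unique($urls));
    }

    // Usage: print every path/file the page references.
    foreach (listLinkedUrls('http://example.com/') as $u) {
        echo $u, PHP_EOL;
    }

From the resulting list you can issue one more cURL request per URL to download each file.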

Upvotes: 2
