khernik

Reputation: 2091

Getting the source of websites using file_get_contents

I have a list of a couple of thousand websites. I have to iterate over them and, for each one, call file_get_contents on the URL, search the returned source for some information with a regex, and write the matches to another file.
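
Roughly, the loop looks like this (the file names and the regex here are simplified placeholders, not my actual ones):

    <?php
    // Read one URL per line; urls.txt and output.txt are placeholder names.
    $urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    foreach ($urls as $url) {
        $html = file_get_contents($url);   // one blocking HTTP request per URL
        if ($html === false) {
            continue;                      // skip URLs that fail to load
        }
        if (preg_match('/<title>(.*?)<\/title>/s', $html, $m)) {
            file_put_contents('output.txt', $m[1] . PHP_EOL, FILE_APPEND);
        }
    }

Every iteration blocks on a full HTTP round trip, which seems to be where most of the time goes.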

The problem is that it's very, very slow. I split the work so that each page refresh only processes about 50 URLs, but even so:

Is there a way to speed this up?

Upvotes: 0

Views: 28

Answers (1)

jde

Reputation: 198

set_time_limit(int $seconds) can help you increase the maximum execution time: http://php.net/manual/fr/function.set-time-limit.php

I assume you're running your script from a browser. Maybe you should consider running it from the command line instead, since that is better suited to long-running scripts.
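
A quick sketch of what I mean (fetch.php is just a placeholder name for your script; run it with `php fetch.php`):

    <?php
    // Lift the execution time limit; 0 means no limit.
    // Note: when PHP runs from the CLI, max_execution_time already defaults to 0.
    set_time_limit(0);

    $urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach ($urls as $i => $url) {
        $html = file_get_contents($url);
        if ($html !== false && preg_match('/<title>(.*?)<\/title>/s', $html, $m)) {
            file_put_contents('output.txt', $m[1] . PHP_EOL, FILE_APPEND);
        }
        // On the command line you can print progress for long runs.
        echo ($i + 1) . '/' . count($urls) . " done\n";
    }

This won't make the individual requests any faster, but it lets the whole batch run in one go instead of 50 URLs per page refresh.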

Upvotes: 1
