Thomas Veit

Reputation: 47

Scraping a lot of data from different URLs with simple_html_dom.php

I basically want to do something exactly like this: Simple Html DOM Caching

I got everything to work so far, but now I'm getting the following error because I scrape many sites (6 at the moment; I want up to 25 sites):

Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 39 bytes)

I'm a PHP newbie =/ ... so how can I "serialize" the scraping process step by step so that my memory doesn't give up? :-)

Code sample:

// Include the library
include('simple_html_dom.php');

// Retrieve the page and grab the first element matching the ID
$html0 = file_get_html('http://www.site.com/');
$aktuelle_spiele = $html0->find('#id', 0);

// Cache the scraped fragment to disk
file_put_contents("cache/cache0.html", $aktuelle_spiele);
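
One way this could be serialized, sketched under the assumption that the sites can simply be fetched one after another: load a site, write its cache file, then free the parsed document with simple_html_dom's clear() method before loading the next one, so only one document is held in memory at a time. The URLs and the '#id' selector below are placeholders:

// Include the library
include('simple_html_dom.php');

// Placeholder list of sites to scrape (up to 25)
$sites = array(
    'http://www.site.com/',
    'http://www.site2.com/',
);

foreach ($sites as $i => $url) {
    $html = file_get_html($url);
    if (!$html) {
        continue; // fetch failed, skip this site
    }

    // First element matching the ID, or null if there is none
    $element = $html->find('#id', 0);
    if ($element) {
        file_put_contents("cache/cache$i.html", $element->outertext);
    }

    // simple_html_dom holds circular references; clear() breaks them
    // so the memory is actually released before the next iteration.
    $html->clear();
    unset($html);
}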

Thank you very much in advance for your help!

Upvotes: 1

Views: 222

Answers (2)

Luigi Siri

Reputation: 2018

You can increase the memory limit at the start of your script.

Like this:

ini_set('memory_limit', '128M');

Upvotes: 0

Hackerman

Reputation: 12295

In your php.ini, change this line:

memory_limit = 32M

to this one:

memory_limit = 256M ; or another, greater value

Or add this piece of code at the start of every PHP script that uses simple_html_dom:

ini_set('memory_limit', '128M'); //or a greater value
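
For example, placed before the include so the higher limit is already in effect when the parser starts allocating (a sketch; it assumes your host allows raising memory_limit at runtime, and the exact value is whatever your sites need):

// Raise the limit before anything large is allocated
ini_set('memory_limit', '128M');

// Then include the library and scrape as usual
include('simple_html_dom.php');
$html = file_get_html('http://www.site.com/');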

Upvotes: 3
