Ivan

Reputation: 51

simplexml_load_file in PHP

I've been using simplexml_load_file to parse an XML file with 15,000 records. It was working fine, but when I tried to work with many files, each of them with 15,000 records, it gave me this error:

Fatal error:  Allowed memory size of 134217728 bytes exhausted (tried to allocate 64 bytes)

I'm not sure what to do. The following is a sample of what I'm doing:

$xml = simplexml_load_file($file)
       or die("Error: Cannot create object");

foreach ($xml->children() as $events) {
    foreach ($events->load as $load) {
        $record = $load->loadrecord['record'] . "    ";
        if ($load->loadrecord['record'] == "test") {
            foreach ($events->receiveds as $received) {
                $release = $received->received['release'];
            }
            foreach ($events->sender as $sender) {
                $test1 = $sender['test1'];
                $test2 = $sender['test2'];
                $test3 = $sender['test3'];
                $test4 = $sender['test4'];
            }
            foreach ($events->screens as $filter) {
                $record = $filter->filter['record'];
            }
        }
    }
}

Do I need to free something after the parsing is completed? Please note that the issue only happens with many files; I tried with two files and there was no problem.

Upvotes: 0

Views: 1933

Answers (4)

Ivan

Reputation: 51

Finally found the problem: you need to unset after each iteration. For example, you need to unset $xml here:

$xml = simplexml_load_file($file_name);
foreach ($xml->children() as $logs) {
    // ... do stuff with $logs ...
    unset($xml);
}
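
Applied to the multi-file case from the question, the same idea looks something like this ($files is a placeholder for however you collect your file names):

foreach ($files as $file) {
    $xml = simplexml_load_file($file)
           or die("Error: Cannot create object");

    foreach ($xml->children() as $events) {
        // ... process $events as in the question ...
    }

    // Free the parsed document before loading the next file,
    // so only one document is held in memory at a time.
    unset($xml);
}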

What I had before was like this:

$xml = simplexml_load_file($file_name);
foreach ($xml->children() as $logs) {
    // ... do stuff with $logs ...
}
unset($xml);

I wouldn't really have found it without your guidance.

Upvotes: 1

Spudley

Reputation: 168813

Unfortunately for you, the SimpleXML class loads the whole XML file into memory. This will obviously cause you issues if you give it a large file.

Instead, you'll need to use the XMLReader class. This class reads the XML one element at a time, and throws it away after reading. This means you have a minimum of data in memory at any one time.

A convenient way to use this class is by wrapping it in an Iterator class. This means you can use foreach() to loop through the elements just as if they were all loaded at once.

Here's a link to a great example of an Iterator class for XMLReader. When I had exactly this issue, I found this class very helpful. I had to make a couple of minor tweaks to suit my needs, but it pretty much worked first time. I hope it works for you too.
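
The linked class is more complete, but a minimal sketch of the idea looks like this (the class name and the element name are placeholders, and it assumes the target elements are not nested inside one another):

class XmlElementIterator implements Iterator
{
    private XMLReader $reader;
    private string $file;
    private string $element;
    private int $key = -1;
    private ?SimpleXMLElement $current = null;

    public function __construct(string $file, string $element)
    {
        $this->file = $file;
        $this->element = $element;
        $this->reader = new XMLReader();
    }

    public function rewind(): void
    {
        // (Re)open the file and advance to the first matching element.
        $this->reader->open($this->file);
        $this->key = -1;
        $this->next();
    }

    public function next(): void
    {
        $this->current = null;
        while ($this->reader->read()) {
            if ($this->reader->nodeType === XMLReader::ELEMENT
                    && $this->reader->localName === $this->element) {
                // Expand only this one element into SimpleXML; the rest
                // of the file is never held in memory at the same time.
                $this->current = new SimpleXMLElement($this->reader->readOuterXml());
                $this->key++;
                return;
            }
        }
    }

    public function valid(): bool { return $this->current !== null; }
    public function key(): int { return $this->key; }
    public function current(): ?SimpleXMLElement { return $this->current; }
}

// Usage sketch, streaming one <load> element at a time:
foreach (new XmlElementIterator($file, 'load') as $load) {
    // $load is a small SimpleXMLElement for one record
}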

Upvotes: 0

Almog Baku

Reputation: 800

DON'T use SimpleXML for large files. Use the XML DOM object instead.

You may use a more advanced tool like SAX or XMLReader, or another third-party parser, to parse the data.
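
For reference, a minimal sketch of PHP's built-in SAX-style API (the expat-based xml_parser functions); the element and attribute names here are taken from the question's markup and may need adjusting:

$parser = xml_parser_create();

xml_set_element_handler(
    $parser,
    // Start-tag handler: fires once per opening tag with its attributes.
    // Note: expat uppercases element names by default (case folding).
    function ($parser, $name, array $attrs) {
        if ($name === 'LOADRECORD' && isset($attrs['RECORD'])) {
            // handle one record here
        }
    },
    // End-tag handler: fires once per closing tag.
    function ($parser, $name) {}
);

// Feed the file in small chunks instead of loading it whole,
// so memory use stays flat regardless of file size.
$handle = fopen($file, 'rb');
while (!feof($handle)) {
    xml_parse($parser, fread($handle, 8192), feof($handle));
}
fclose($handle);
xml_parser_free($parser);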

Upvotes: 2

Martin Lyne

Reputation: 3065

You could consider using a cron job to process these files one by one, then store their output somewhere and retrieve it when it is finished.

This of course relies on you not needing the result immediately. If you did, you could begin this process and use AJAX requests to check for when it is done and to grab the final output.

Obviously, the needs and requirements may mean that's not feasible. Raising your memory limit is an option, but typically not if you are on a shared hosting platform. It also just means you bypass the issue rather than solving it (i.e. if the number of records increases again, the problem will come back).
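
For completeness, raising the limit is a one-line change, either in php.ini or at the top of the script (a stopgap, for the reasons above):

// Stopgap only: raise the 128M per-script ceiling seen in the
// fatal error. Shared hosts may disallow or ignore this.
ini_set('memory_limit', '256M');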

Edit: misread question, modified answer to suit.

Upvotes: 0
