Andy Tait

Reputation: 199

Memory limit exceeded with PHP foreach

I'm attempting to foreach through many thousands of items in an array, do some operations, and save some values to a MySQL table.

However, as I loop through, the memory usage continually grows until it hits the limit I specify in php.ini, which happens pretty quickly.

I've tried using unset(), setting variables to null, and forcing garbage collection, but nothing has had an impact.
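Roughly, the end-of-iteration cleanup I've tried looks like this (a sketch only; the real code does more work in between):

foreach ($subscribers as $subscriber)
{
    $member = new Member($subscriber['id']);
    // ... do the work ...

    unset($member, $bulletin);   // explicit unset
    $member = null;              // or nulling the variables instead
    gc_collect_cycles();         // or forcing a garbage collection pass
}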

Is there a more efficient way I can loop through these elements (i.e. so the memory usage does not continually grow)?

Below is a simplified example of what I'm doing.

foreach ($subscribers as $subscriber)
{
    $member = new Member($subscriber['id']);
    if ($member['id'] > 0)
    {
        $bulletin = Bulletin::getCustomBulletin($member['id']);

        Bulletin::compileBulletin($member['email'], time(), $bulletin['title'], $bulletin['content']);

        echo $member['email'] . "\n";
        echo memory_get_usage() . "\n";
    }
}

This produces the following results:

[email protected]
11336688
[email protected]
12043640
[email protected]
12749952

Upvotes: 4

Views: 11188

Answers (3)

Paul Campbell

Reputation: 1145

Unfortunately it's hard to tell from the code you've posted, as there's no clue as to what the Member object does internally, but this looks like a possible case of recursive references leaking memory.

I would eliminate the new Member object creation in the code above to check whether that is the source of the memory leak. From what you've said, creating the Member object may be unnecessary anyway, and it could be replaced with a static lookup member function.
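For instance, something along these lines (just a sketch: getEmailById(), the members table, and the $pdo connection are assumptions, since we can't see the real Member internals):

class Member
{
    // Hypothetical static lookup: fetch only the columns the loop needs,
    // instead of constructing (and retaining) a full Member object per subscriber.
    public static function getEmailById(PDO $pdo, $id)
    {
        $stmt = $pdo->prepare('SELECT id, email FROM members WHERE id = ?');
        $stmt->execute(array($id));
        return $stmt->fetch(PDO::FETCH_ASSOC); // false if no such member
    }
}

foreach ($subscribers as $subscriber)
{
    $row = Member::getEmailById($pdo, $subscriber['id']);
    if ($row && $row['id'] > 0)
    {
        $bulletin = Bulletin::getCustomBulletin($row['id']);
        Bulletin::compileBulletin($row['email'], time(), $bulletin['title'], $bulletin['content']);
    }
}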

Upvotes: 0

Matthieu Napoli

Reputation: 49703

Is $subscribers the result of a DB query?

If so, it may be the source of your problem: the rows will be buffered in memory, even though you go through them one at a time.

You can try using unbuffered queries, or limit the number of results per query and run several smaller queries.
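For example, with mysqli an unbuffered query looks roughly like this (the connection, table, and column names are assumptions):

// MYSQLI_USE_RESULT streams rows from the server one at a time
// instead of buffering the whole result set in PHP memory.
// Note: you can't run other queries on the same connection until the result is freed.
$result = $mysqli->query('SELECT id FROM subscribers', MYSQLI_USE_RESULT);
while ($subscriber = $result->fetch_assoc())
{
    // ... process one subscriber ...
}
$result->free();

Or, if you'd rather keep buffered queries, fetch the rows in smaller batches:

$offset = 0;
$batch  = 1000;
do {
    // Hypothetical helper that runs SELECT ... LIMIT $offset, $batch
    $subscribers = fetchSubscribers($offset, $batch);
    foreach ($subscribers as $subscriber)
    {
        // ... process one subscriber ...
    }
    $offset += $batch;
} while (count($subscribers) === $batch);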

See also: Why "Allowed memory size exhausted"?

Upvotes: 2

webbiedave

Reputation: 48887

There's nothing suspect about your foreach loop itself (there are no variables to unset, as they are all overwritten on each iteration). You will need to find out which line is causing the memory growth. A profiler, such as those found in professional IDEs, would help with this. If you do not have access to one, use memory_get_usage() as you've been doing, but put it after every line to see which one is causing the bottleneck.
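For instance, instrumenting the posted loop could look like this (same classes as in the question; only the labelled echo lines are added):

foreach ($subscribers as $subscriber)
{
    $member = new Member($subscriber['id']);
    echo 'after new Member:         ' . memory_get_usage() . "\n";

    if ($member['id'] > 0)
    {
        $bulletin = Bulletin::getCustomBulletin($member['id']);
        echo 'after getCustomBulletin: ' . memory_get_usage() . "\n";

        Bulletin::compileBulletin($member['email'], time(), $bulletin['title'], $bulletin['content']);
        echo 'after compileBulletin:   ' . memory_get_usage() . "\n";
    }
}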

There are also free profiling tools such as Xdebug.

Upvotes: 0
