Reputation: 512
I have a few foreach loops, and at the end of every iteration I call
set_time_limit(30)
which should reset the execution timer back to zero.
The script runs for quite a while (it fetches 5000-10000 articles through an API and stores them in the database), yet after it has already processed a lot of data I still get "Maximum execution time of 30 seconds exceeded".
Could this be caused by a lack of memory? How can I tackle this problem?
The script fetches articles from an API, and the foreach loop is used basically like this:
foreach ($articles as $article)
{
    // do stuff with a single article using $article
    set_time_limit(30);
}
I would not expect fetching and processing a single article to take more than 30 seconds, yet after the script has been running for a while it hits that limit anyway. What am I doing wrong? I don't want to allow a job that should take about 5 seconds per article to run longer than a maximum of 30 seconds by using set_time_limit(9000) or something like that; it would probably get the job done, but I suppose that is not a good way to solve the issue.
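In case it helps, this is roughly how I could instrument the loop to see whether one particular article is slow or whether memory keeps growing (the timing and memory calls are only there for measurement; the processing is the same as above):
set_time_limit(0); // only while measuring

foreach ($articles as $i => $article) {
    $start = microtime(true);

    // do stuff with the single article using $article

    // log how long this iteration took and how much memory is currently in use
    error_log(sprintf(
        "article %d: %.2f s, %.1f MB",
        $i,
        microtime(true) - $start,
        memory_get_usage(true) / 1048576
    ));
}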
Upvotes: 0
Views: 1122
Reputation: 3425
Try this:
PHP has a function, set_time_limit, which sets the maximum execution time of the script.
It takes one argument, in seconds.
If you call it like this,
set_time_limit(0)
it removes the time limit entirely.
Put this call at the top of your script.
You can read more here: http://php.net/manual/en/function.set-time-limit.php
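For example, at the very top of the import script (a minimal sketch; fetch_articles_from_api() is just a placeholder for your own API call):
<?php
set_time_limit(0); // remove the 30-second limit for this long-running import

$articles = fetch_articles_from_api(); // placeholder for your own API call

foreach ($articles as $article) {
    // process the article and store it in the DB
}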
Upvotes: 0
Reputation: 10627
If, for some crazy reason, you feel that you must impose a time limit, consider the following:
set_time_limit(0);
$n = 0; $inc = 1;
foreach ($articles as $v) {
    // do stuff with $v
    $n += $inc;
}
set_time_limit($n);
Change $inc to suit your needs: with $inc = 1, the limit re-imposed after the loop is one second per processed article.
Upvotes: 1
Reputation: 2634
Why don't you just make
set_time_limit(0);
the first line of your script?
Upvotes: 3