Fluidbyte

Reputation: 5210

PHP usleep to prevent script timeout

I have a script that loads a massive directory listing and, by its nature, takes a long time to run. In some cases it now times out. Could I use something like usleep to keep the script from timing out, or will that just make the situation worse?

Upvotes: 0

Views: 1851

Answers (4)

Bez Hermoso

Reputation: 1132

Have you tried using RecursiveDirectoryIterator for generating the directory listing?

I used to use a recursive function to generate directory listings, which inadvertently caused script timeouts when I had to work with a massive number of files nested several levels deep. Switching to RecursiveDirectoryIterator solved many of my problems.
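A minimal sketch of that approach, using the SPL iterators in place of a hand-rolled recursive function (the `$path` here is a placeholder for whatever directory you need to list):

```php
<?php
// Walk a directory tree without manual recursion. SKIP_DOTS omits
// the "." and ".." entries; SELF_FIRST yields directories before
// their contents.
$path = __DIR__;

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);

foreach ($iterator as $file) {
    // $file is an SplFileInfo object.
    echo $file->getPathname(), PHP_EOL;
}
```

Because the iterator walks the tree lazily instead of building the whole listing in memory, it tends to behave much better on very deep or very large trees.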

Upvotes: 1

HNygard

Reputation: 4816

Can't you raise the timeout limit with set_time_limit()?

If you set it to 0, the script will run forever.

set_time_limit(0);

usleep() halts execution of the PHP script for the given number of microseconds. During that time your script will not be listing any directories or doing anything else; it just freezes until it is allowed to continue.
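A short sketch of the difference (timings are illustrative; note that on some platforms the execution-time limit is measured in wall-clock time, so sleeping may not even buy you anything):

```php
<?php
// Raising (or removing) the limit is the direct fix:
set_time_limit(0);      // 0 = no execution time limit

// usleep() just pauses the script; no directory listing happens
// while it sleeps.
$start = microtime(true);
usleep(250000);         // pause for 250 ms of pure waiting
$elapsed = microtime(true) - $start;
// $elapsed is roughly 0.25 seconds during which nothing was done
```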

Upvotes: 3

Tessmore

Reputation: 1039

PHP can spend a long time trying to find a file/directory that doesn't exist, so if you're already using something like:

if ((is_dir($path) || file_exists($path)) && ($dh = opendir($path)))
{
  while (($file = readdir($dh)) !== false)
  {
    // file or dir found, do stuff :)
  }

  closedir($dh);
}

then never mind. But if you simply use:

$dh = opendir($path);

on a path that doesn't exist, it can take a few minutes for the script to time out, during which it does nothing at all.

Upvotes: 1

alexpirine

Reputation: 3283

You can try set_time_limit or see if you can optimise your code:

  • execute ls -l > results.txt & on your system so that the listing runs in the background and writes its output to the results.txt file
  • reduce the number of files in the directory by splitting them into subdirectories
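The first suggestion could be sketched like this from PHP; the directory and output paths here are hypothetical placeholders, not anything from the question:

```php
<?php
// Hypothetical sketch: kick off the listing in the background so the
// PHP request itself returns quickly, then read the result file on a
// later request once it exists.
$dir = '/path/to/huge/dir';   // placeholder
$out = '/tmp/results.txt';    // placeholder

// The trailing "&" detaches the command; redirecting stderr keeps
// PHP from blocking on its output.
$cmd = sprintf(
    'ls -l %s > %s 2>/dev/null &',
    escapeshellarg($dir),
    escapeshellarg($out)
);
exec($cmd);

// Later (e.g. on a subsequent request), check whether it's done:
if (is_file($out)) {
    $listing = file($out, FILE_IGNORE_NEW_LINES);
}
```

The trade-off is that the listing is no longer generated synchronously, so the page has to poll (or be refreshed) until the file appears.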

Upvotes: 0
