MarathonStudios

Reputation: 4321

Chaining cron scripts to get around 30-second runtime limit

I have an infrequently run custom sitemap generator script that I need to cron on a site currently hosted on a HostGator shared server. The generator script is written in PHP and executed with the following cron command:

php public_html/mysite.com/core/update.php > update.log

The generation process involves calling an internal API and doing distance calculations based on the result, so the script generally needs 5-10 minutes to run. The hardcoded PHP runtime limit is 30 seconds for HostGator's shared servers.

Is there any way I can have my script break itself down into smaller processes? Does anyone know whether rewriting it in Perl or Python would get around the 30-second limit?

Upvotes: 1

Views: 957

Answers (2)

Tim McDonald

Reputation: 1252

Putting this

set_time_limit(30);

inside a function or loop where your code spends most of its time should do the trick. Each call to set_time_limit() resets the timeout counter to zero, so the 30-second limit applies per iteration rather than to the whole run.
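A minimal sketch of that idea, assuming the heavy work happens in a loop over work items ($items, load_pending_items() and process_item() are placeholders, not the asker's actual code):

<?php
// Hypothetical batch loop; the helper names are placeholders.
$items = load_pending_items();      // e.g. pages that need sitemap entries

foreach ($items as $item) {
    // Reset the execution timer at the top of each iteration. As long as a
    // single iteration finishes within 30 seconds, the script keeps running.
    set_time_limit(30);

    process_item($item);            // API call + distance calculation for one item
}

Note that some shared hosts disable set_time_limit() or enforce the limit at the process level, in which case this call has no effect.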

Upvotes: 2

Brent Baisley

Reputation: 12721

You can try forking. You would need to make sure each fork can do its work and create a new fork within 30 seconds. Since a fork is a copy of the running process, you retain all the data you've pulled so far.
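A rough sketch of that chaining pattern, assuming the pcntl extension is available to CLI PHP (it often is not on shared hosting) and that the work can be split into chunks; get_chunks() and process_chunk() are hypothetical helpers:

<?php
// Each process handles one chunk, then forks a successor and exits,
// so no single process runs longer than one chunk's worth of work.
// get_chunks() and process_chunk() are placeholders, not the asker's code.
function run_chunk(array $chunks, $i)
{
    if ($i >= count($chunks)) {
        return;                             // all chunks done
    }

    process_chunk($chunks[$i]);             // must finish well under 30 seconds

    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    }
    if ($pid === 0) {
        // Child: inherits everything loaded so far and carries on with the next chunk.
        run_chunk($chunks, $i + 1);
    }
    // Parent: falls through and exits, keeping its own runtime short.
}

run_chunk(get_chunks(), 0);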

Alternatively, use multi-curl to hit your server with many requests at once so the entire process completes in under 30 seconds. You would, of course, need to divide the work appropriately.
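A sketch of the multi-curl idea, assuming the generator can be split across HTTP requests to a worker script; the worker.php URL and the chunk parameter are hypothetical:

<?php
// Fire ten parallel requests, each generating one slice of the sitemap,
// so every individual request stays within the 30-second limit.
$mh = curl_multi_init();
$handles = array();

for ($chunk = 0; $chunk < 10; $chunk++) {
    $ch = curl_init("http://mysite.com/core/worker.php?chunk=$chunk"); // hypothetical endpoint
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all handles until every request has completed.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running && $status === CURLM_OK);

foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);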

Upvotes: 1
