taras

Reputation: 2263

Long-running cron job via wget/curl?

I am working on cron jobs for my PHP app and plan to trigger them from cron via wget/curl. Some of my PHP cron jobs can take 2-3 hours. How can I let PHP run for 2-3 hours when started from crontab? Is it good practice to run such long jobs via cron with wget/curl? Does anyone have experience with this? I also have an email queue that needs to be processed every 10 seconds, but crontab only has minute-level granularity. Any suggestions for this case?

Thanks for reading.

Upvotes: 3

Views: 7965

Answers (4)

site

Reputation: 1638

If using wget to call the long-running cron job, consider some of its defaults that aren't desirable in this situation. By default, it will retry up to 20 times (--tries 20) if no data is read within the --read-timeout of 900 seconds. So at a minimum, set tries to 1 to ensure the cron job is only called once, since you of course wouldn't want a long-running script called multiple times while it's still running.

wget -t 1 https://url/

If you're on cPanel and don't want the output emails after each cron job run, also use the -O /dev/null and -q flags to discard output, and redirect errors to standard output with 2>&1.

wget -O /dev/null -q -t 1 https://url/ 2>&1
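
Putting it together, a full crontab entry with these flags might look like the following (the 2 a.m. schedule and the URL are placeholders, not from the original answer):

0 2 * * * wget -O /dev/null -q -t 1 https://url/ 2>&1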

Upvotes: 0

Marcel

Reputation: 21

I've just run into this problem with a long-running PHP script executed via cron and wget. The script terminates after 15 minutes because of the default read timeout in wget.

However, I believe this method does in fact work if you set the correct timeouts. The PHP script you run needs an unlimited run time; best to set this at the start of your long-running script:

set_time_limit(0);

When using wget you also need to remove its timeout by passing -T 0, e.g.

wget -q -T 0 http://yourhost/job.php

Be very careful not to overload your server with long running scripts.

Upvotes: 2

Ciaran McNulty

Reputation: 18868

You can use the following at the start of your script to tell PHP to effectively never time out:

set_time_limit(0);

What may be wiser, if crontab is going to run the script every 24 hours, is to set the timeout to 24 hours so that you don't get two copies running at the same time:

set_time_limit(24*60*60);
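
A complementary safeguard (my own suggestion, not part of this answer) is an exclusive lock, so that a second copy started by cron exits immediately instead of running alongside the first. A minimal sketch, with a hypothetical lock-file path:

<?php
// Only one copy of the job may run at a time.
$fp = fopen('/tmp/myjob.lock', 'c'); // 'c': create if missing, don't truncate
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    exit(0); // another copy already holds the lock
}

set_time_limit(24 * 60 * 60); // cap the run at 24 hours, as above

// ... the long-running job goes here ...

flock($fp, LOCK_UN);
fclose($fp);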

Crontab only allows minute-level execution because that's the most often you should really be launching a script - there are startup/shutdown costs that make more rapid scheduling inefficient.

If your application requires a queue to be checked every 10 seconds, a better strategy might be to have a single long-running script that does that checking and uses sleep() occasionally to stop itself from hogging system resources.
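
As a minimal sketch of that approach (processEmailQueue() is a hypothetical stand-in for whatever your app actually uses to drain the queue):

<?php
set_time_limit(0); // run indefinitely

// Hypothetical placeholder: replace with your app's queue-draining code.
function processEmailQueue()
{
    // fetch pending messages and send them here
}

while (true) {
    processEmailQueue();
    sleep(10); // pause 10 seconds so the loop doesn't hog the CPU
}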

On a UNIX system such a script should really run as a daemon - have a look at the PEAR System_Daemon package to see how that can be accomplished simply.
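
If you'd rather not pull in the PEAR package, the basic idea can also be sketched by hand with PHP's pcntl and posix extensions (assuming they are enabled; CLI only):

<?php
// Minimal daemonization sketch: fork, let the parent exit,
// and detach the child from the controlling terminal.
$pid = pcntl_fork();
if ($pid < 0) {
    exit(1); // fork failed
} elseif ($pid > 0) {
    exit(0); // parent exits; the child keeps running in the background
}
posix_setsid(); // become session leader, detached from the terminal

set_time_limit(0);
while (true) {
    // ... check the queue here, as in the loop above ...
    sleep(10);
}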

Upvotes: 2

Thejesh GN

Reputation: 1128

When you use wget/curl, you are requesting a page from a web server. Every web server has a timeout period, so the request might time out.

Also, some hosting providers may kill processes that run beyond a certain number of minutes (basically done to rein in rogue threads).

So it is not advisable to schedule via wget/curl if the job takes more than a few minutes.

Try scheduling it with an actual scheduler instead. You can run PHP from the command line:

php [options] [-f] <file> [--] [args...]

The php command should be on your PATH.
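
For example, a crontab entry running a script directly through the CLI binary (the paths and the every-3-hours schedule are hypothetical) could look like:

0 */3 * * * /usr/bin/php -f /path/to/job.php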

Upvotes: 4
