Reputation: 2547
I am running a crawler programmed in PHP every hour with a cron job. When everything goes as expected, the script quits automatically. However, for some reason, it sometimes gets stuck in an infinite loop. It gets worse: because I use a lock file to avoid duplicate runs, when the crawler gets stuck it never runs again until I kill it manually (ps aux -> kill).
How can I make sure that the script ends after a couple of hours whatever happens?
Should I add a line in the PHP code? Wouldn't it be more robust to do that directly in Linux?
The best idea that I have so far is to make a small batch file with all the necessary commands and then invoke that batch with cron instead of the php script directly.
Am I right, and what should the commands be?
Thanks
Edit: the best I found so far is: http://www.linuxquestions.org/questions/linux-general-1/how-to-kill-the-process-after-specific-time-624453
The bash script there is way too long; I was hoping for a smarter, shorter solution (something like the sketch below).
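For illustration, a minimal wrapper along those lines, assuming GNU coreutils timeout and util-linux flock are available (the paths and the two-hour limit are placeholders):

#!/bin/sh
# Hypothetical wrapper called by cron instead of the PHP script directly.
# flock -n replaces the manual lock file: it gives up immediately if a
# previous run still holds the lock instead of queuing behind it.
# timeout sends SIGTERM after 2 hours and SIGKILL 30 seconds later if needed.
flock -n /tmp/crawler.lock \
    timeout --kill-after=30s 2h php /path/to/crawler.php

Cron would then invoke this wrapper every hour instead of the PHP script, e.g. 0 * * * * /usr/local/bin/crawler-wrapper.sh (the path is again hypothetical).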
Cheers
Upvotes: 2
Views: 1655
Reputation: 1300
This would kill any php processes that were started more than an hour ago:
$(ps -eo comm,pid,etimes | awk '/^php/ {if ($3 > 3600) { print "kill "$2}}')
3600 is the elapsed-time threshold in seconds (one hour).
P.S. You can run the command
> ps -eo comm,pid,etimes
before and after to make sure everything worked.
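If more than one php process can match, a variant that pipes the PIDs to xargs, rather than executing the generated kill commands through command substitution, may be safer; a sketch, assuming GNU xargs (-r skips the kill when nothing matches):

# Kill php processes whose elapsed time exceeds 3600 seconds
ps -eo comm=,pid=,etimes= | awk '$1 == "php" && $3 > 3600 {print $2}' | xargs -r kill

The trailing = signs suppress the ps header, and the exact match on comm avoids catching processes such as php-fpm.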
P.P.S. I know this is an old question, but someone might find it helpful.
Upvotes: 3