SaidbakR

Reputation: 13544

Using CRON jobs to visit a URL?

I have a web application that has to perform repeated tasks, such as sending messages and alerts. Currently I use a script page, http://example.com/tasks.php, that runs those tasks when it is loaded in the browser, and I include it in every page of my web application by means of an iframe.

Now I want to switch to CRON jobs, because the first approach can hurt performance. How can I make a CRON job that visits http://example.com/tasks.php? I also don't want this CRON job to create output files such as day.*!

I host the application on shared hosting service that permits CRON jobs via cPanel.

Upvotes: 88

Views: 160337

Answers (11)

BogdanPopa

Reputation: 475

For the cron job to work you just need to hit the URL, without producing any output and without downloading anything.

I use wget with just these two parameters:

*/2 * * * * wget -q --spider https://example.com/

*/2 : run every 2 minutes

-q : turn off Wget's output

--spider : when invoked with this option, Wget behaves as a web spider, which means it will not download the pages, just check that they are there

Documentation: https://www.gnu.org/software/wget/manual/wget.pdf

Upvotes: 1

Saeed Awan

Reputation: 456

Here is a simple example; you can use it like this:

wget -q -O - http://example.com/backup >/dev/null 2>&1

At the start you add the schedule fields (e.g. * * * * *). It's up to your system requirements whether you want to run it every minute, every hour, etc.
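For reference, the five schedule fields are minute, hour, day of month, month, and day of week. A commented sketch using the same wget line (the 30-minute interval is just an example):

```shell
# field order: minute  hour  day-of-month  month  day-of-week
# e.g. hit the URL every 30 minutes:
*/30 * * * * wget -q -O - http://example.com/backup >/dev/null 2>&1
```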

Upvotes: 0

Abdul Alim

Reputation: 110

You can use this command:

links https://www.honeymovies.com

Upvotes: 2

Val Kornea

Reputation: 4717

* * * * * wget --quiet https://example.com/file --output-document=/dev/null

I find --quiet clearer than -q, and --output-document=/dev/null clearer than -O - > /dev/null

Upvotes: 2

Walk

Reputation: 1649

You can try this:


    wget -q -O - http://www.example.com/ >/dev/null 2>&1

Upvotes: 2

VPS-Managed.com

Reputation: 31

You can use this for a URL with parameters:

lynx -dump "http://vps-managed.com/tasks.php?code=23456"

lynx is installed by default on many systems.
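Note that lynx -dump prints the page body to standard output, which cron will try to mail to you. In a crontab entry you will usually want to discard it (same URL as above; the 10-minute schedule is just an example):

```shell
*/10 * * * * lynx -dump "http://vps-managed.com/tasks.php?code=23456" >/dev/null 2>&1
```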

Upvotes: 2

Jerzy Drożdż

Reputation: 448

You can use curl, as in this thread.

For the lazy:

*/5 * * * * curl --request GET 'http://example.com/path/check.php?param1=1'

This will be executed every 5 minutes.
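One caveat: curl prints the response body to standard output, and cron mails any output to the crontab's owner. To keep the mailbox quiet, a variant of the same line (same assumed URL and schedule) silences curl and discards the body:

```shell
*/5 * * * * curl -s -o /dev/null 'http://example.com/path/check.php?param1=1'
```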

Upvotes: 29

Abbas Arif

Reputation: 390

I use these commands:

wget -q -O /dev/null "http://example.com/some/cron/job.php" > /dev/null 2>&1

Cron task:

* * * * * wget -q -O /dev/null "http://example.com/some/cron/job.php" > /dev/null 2>&1

Upvotes: 8

mrraka

Reputation: 133

You can also use the local command-line PHP interpreter (php-cli):

* * * * * php /local/root/path/to/tasks.php > /dev/null

It is faster and decreases the load on your web server.
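If the script emits PHP warnings or notices on stderr, cron will still mail those even with stdout redirected. A variant of the line above (same hypothetical path) that discards errors as well:

```shell
* * * * * php /local/root/path/to/tasks.php >/dev/null 2>&1
```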

Upvotes: 10

Diego Torres Milano

Reputation: 69318

You don't need the redirection; use only:

* * * * * wget -qO /dev/null http://yoursite.com/tasks.php

Upvotes: 25

Mitch Dempsey

Reputation: 39929

* * * * * wget -O - http://yoursite.com/tasks.php >/dev/null 2>&1

That should work for you. Just have a wget command that loads the page.

Using -O - means that the output of the web request will be sent to STDOUT (standard output).

By adding >/dev/null, we instruct standard output to be redirected to a black hole. By adding 2>&1, we instruct STDERR (errors) to also be sent to STDOUT, so all output ends up in the black hole. (It will load the website but never write a file anywhere.)
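The effect of the two redirections can be checked in any shell, independent of wget (a standalone demo; /no/such/path is just a path that does not exist):

```shell
# stdout alone discarded: the error message still appears on the terminal
ls /no/such/path >/dev/null
# stderr redirected to wherever stdout points (/dev/null): nothing appears
ls /no/such/path >/dev/null 2>&1
```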

Upvotes: 249
