Cripto

Reputation: 3751

php exec and running a bash script until it's finished

I have a bash script that can take hours to finish.

I have a web frontend for it to make it easy to use.

On this main page, I want to have a URL I can hit that starts my PHP command:

<?php exec('myscript that takes a long time'); ?>

After the exec has finished, I want it to set a cookie.

setcookie('status',"done");

This is all easily done and works as is. However, the URL that runs my exec command is a blank white page. I don't want this. I want the URL to act as a trigger that starts my PHP script and sets the cookie when the exec command returns, all in the background.
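
For context, the page behind that URL is currently just those two snippets combined, roughly like this (the file name and script path are placeholders):

<?php
// run.php (placeholder name) - the page the URL points at today
exec('/path/to/myscript.sh');   // blocks for hours until the script finishes
setcookie('status', 'done');    // the cookie only goes out once exec() returns
?>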

Is this possible?

If not, how close can I get to this behavior?

EDIT:

function foo() {
    var conn = new Ext.data.Connection();
    conn.request({
        url: 'request.php',
        method: 'POST',
        success: function(responseObject) {
            alert("Hello, World!");
        },
        failure: function() {
            alert("Something failed");
        }
    });
}

I have tried the above code with no luck.

Upvotes: 0

Views: 604

Answers (2)

Cripto

Reputation: 3751

While the other answer is correct and the WWW is not meant for long-open requests, we have to consider that the WWW was never "meant" to take information either.

As programmers, we made it take information such as logins.

Furthermore, my question was simple: how do I perform action A and get result B? While sending an email, as the other post by @symcbean suggests, would be nice and dandy, it's not a solution but a sidestep around the problem.

Web applications often need to communicate with the webserver to update their status. This seems contradictory because the webserver is stateless. Cookies are the solution.

Here was my solution:

// JavaScript side: call the PHP page that runs the long job
$.ajax({
  url: url,          // the PHP "function page" described below
  data: data,
  success: success,
  dataType: dataType
});

// PHP side, inside that page: reached only after exec() has returned
setcookie('status', "done");

The URL points to a PHP page with an if statement acting as a switch and running my external script that takes a really long time. On the server side that call is blocking, that is, PHP will not execute setcookie until exec has finished.

Once it does, it will set my cookie and the rest of my web application can continue working.
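
Roughly, the PHP page behind that URL looks like this (the action name and script path here are placeholders, not my real code):

<?php
// request.php - sketch of the "function page"; the if acts like a switch on the requested action
if (isset($_POST['action']) && $_POST['action'] === 'run') {
    exec('/path/to/myscript.sh');   // blocks until the external script finishes
    setcookie('status', 'done');    // sent with the response once the script is done
    echo 'done';
}
?>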

Upvotes: 0

symcbean

Reputation: 48387

I have a bash script that can take hours to finish

Stop there. That's your first problem. The WWW is not designed to maintain open requests for more than a couple of minutes. Maybe one day (since we now have websockets), but even if you know how to configure your webserver so this isn't an off-switch for anyone passing by, it's exceedingly unlikely that the network in between or your browser will be willing to wait this long.

This job cannot be run synchronously with a web request. It must be run asynchronously.
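
For example (a sketch only; the paths are placeholders), redirecting the script's output and backgrounding it lets exec() return immediately instead of holding the request open:

<?php
// start.php - sketch only; paths are placeholders
// Redirecting output and appending '&' detaches the job, so exec() returns
// right away instead of waiting hours for the script to finish.
exec('nohup /path/to/myscript.sh > /tmp/myscript.log 2>&1 &');
echo 'Job started';
?>

The trade-off is that the web request no longer knows when (or whether) the script finished, which is where the status reporting below comes in.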

By all means poll the status of the job from the web page (either via a meta-refresh or an AJAX call), but I'm having trouble understanding the benefit of setting a cookie when it has completed; usually for stuff like this I'd send out an email from the task when it completes. You also need either a way to separate out concurrent tasks invoked like this or a method of ensuring that only one runs at a time.

One solution would be to pass the PHP session id as an argument to the script, then have the script write a file named with the session id on completion - or even provide partial updates via the file - then your web page can poll the status of the job using the session id. Of course your code should check there isn't already an instance of the job running before starting a new one.
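
A rough sketch of that approach (every file name, path and parameter here is hypothetical):

<?php
// start.php - launches the job with the session id as an argument (sketch only)
session_start();
$id   = session_id();
$lock = '/tmp/job_running_' . $id;   // hypothetical lock file

if (!file_exists($lock)) {           // don't start a second instance for this session
    touch($lock);
    // the script itself is expected to write /tmp/job_done_<id> (and remove the lock) when it finishes
    exec('nohup /path/to/myscript.sh ' . escapeshellarg($id) . ' > /dev/null 2>&1 &');
}
?>

<?php
// status.php - polled from the web page via AJAX or a meta-refresh
session_start();
echo file_exists('/tmp/job_done_' . session_id()) ? 'done' : 'running';
?>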

Upvotes: 7
