I have one PHP script that can take several minutes to complete. The script downloads a file to the user's PC.
I have another PHP script whose role is to monitor the progress of the main download script. That script is called by the client via AJAX and should return download progress information.
Right now, my tests show that during the execution of the main script (in other words, during the file download), the AJAX monitor script returns no values at all. It starts behaving normally once the main download script finishes.
Is it possible that PHP cannot run two or more scripts simultaneously and only allows them to run sequentially?
I could insert my code, but I think it is not needed for the purpose of my question. I simply need to know whether two or more PHP scripts may run simultaneously for the same client.
As I was asked to show my code, please see the relevant parts below.
Main PHP (later download) script:
<?php
// disable script expiry
set_time_limit(0);

// start session if session is not already started
if (session_status() !== PHP_SESSION_ACTIVE)
{
    session_start();
}

// prepare session variable
$_SESSION['DownloadProgress'] = 0;

for ($count = 0; $count < 60; $count++)
{
    sleep(1);
    echo "Iteration No: " . $count;
    $_SESSION['DownloadProgress']++;
    echo '$_SESSION[\'DownloadProgress\'] = ' . $_SESSION['DownloadProgress'];
    flush();
    ob_flush();
}
?>
Monitoring PHP script:
<?php
// start the session so the progress value written by the download script is readable here
session_start();

// construct JSON
$array = array("result" => 1, "download_progress" => $_SESSION['DownloadProgress']);
echo json_encode($array);
?>
JavaScript code where I call both PHP scripts:
SearchResults.myDownloadFunction = function ()
{
    console.log( "Calling: PHP/fileDownload.php" );
    window.location.href = 'PHP/fileDownload.php?upload_id=1';
    console.log( "Calling: getUploadStatus()" );
    FileResort.SearchResults.getUploadStatus();
    console.log( "Called both functions" );
};
JavaScript AJAX:
// call AJAX function to get upload status from the server
SearchResults.getUploadStatus = function ()
{
    var SearchResultsXMLHttpRequest = FileResort.Utils.createRequest();
    if (SearchResultsXMLHttpRequest == null)
    {
        console.log("unable to create request object.");
    }
    else
    {
        SearchResultsXMLHttpRequest.onreadystatechange = function ()
        {
            console.log("Response Text: " + SearchResultsXMLHttpRequest.responseText);
            console.log("AJAX Call Returned");
            if ((SearchResultsXMLHttpRequest.readyState == 4) && (SearchResultsXMLHttpRequest.status == 200))
            {
                //if (that.responseJSON.result == "true")
                {
                    var responseJSON = eval('(' + SearchResultsXMLHttpRequest.responseText + ')');
                    console.log("Download Progress: " + responseJSON.download_progress);
                }
            }
        };
        var url = "PHP/fileDownloadStatus.php";
        SearchResultsXMLHttpRequest.open("POST", url, true);
        SearchResultsXMLHttpRequest.send();
    }
};
PHP Script that will later download files:
<?php
// disable script expiry
set_time_limit(0);

for ($count = 0; $count < 60; $count++)
{
    sleep(1);
}
?>
PHP Monitoring script that outputs test values:
<?php
$test_value = 25;
// construct JSON
$array = array("result" => 1, "download_progress" => $test_value);
//session_write_close();
echo json_encode($array);
?>
Both scripts are called as follows:
SearchResults.myDownloadFunction = function ()
{
    console.log( "Calling: PHP/fileDownload.php" );
    window.setTimeout(function(){FileResort.SearchResults.fileDownload()}, 3000);
    console.log( "Calling: getUploadStatus()" );
    window.setInterval(function(){FileResort.SearchResults.getDownloadStatus()}, 1000);
    console.log( "Called both functions" );
};
Without more info there are a few possibilities here, but I suspect that the issue is your session. When a script that uses the session starts, PHP locks the session file until session_write_close() is called or the script completes. While the session is locked, any other script that accesses the session will be unable to do anything until the first script is done and writes/closes the session file (so the AJAX calls have to wait until the session file is released). Try writing and closing the session as soon as you've done validation etc. in the first script, and subsequent scripts should be able to start.
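As a minimal sketch of that idea (the script name in the comment is just a placeholder, not your actual code), the long-running script can release the lock as soon as it has written what it needs:

<?php
// hypothetical long-running script (e.g. fileDownload.php)
session_start();                    // PHP acquires the session lock here
$_SESSION['DownloadProgress'] = 0;  // do any session writes you need up front
session_write_close();              // release the lock; parallel AJAX requests can now read the session

// ...the long-running work continues without holding the session lock...
sleep(60);
?>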
Here's a quick and dirty approach:
This is the page where the user clicks the download link:
<html>
<head>
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script>
$(document).ready(function(e) {
//Every 500ms check monitoring script to see what the progress is
$('#large_file_link').click(function(){
window.p_progress_checker = setInterval( function(){
$.get( "monitor.php", function( data ) {
$( ".download_status" ).html( data +'% complete' );
//when it's done or aborted we stop the interval
if (parseInt(data) >= 100 || data=='ABORTED'){
clearInterval(window.p_progress_checker);
}
//if it's aborted we display that
if (data=='ABORTED'){
$( ".download_status" ).html( data );
$( ".download_status" ).css('color','red').css('font-weight','bold');
}
})
}, 500);
});
});
</script>
</head>
<body>
<div class="download_status"><!-- GETS POPULATED BY AJAX CALL --></div>
<p><a href="download.php" id="large_file_link">Start downloading large file</a></p>
</body>
</html>
This is the PHP script that serves the large file. It breaks the file into chunks, and after sending each chunk it closes the session so that the session becomes available to other scripts. Also notice that I've added an ignore_user_abort()/connection_aborted() handler so that it can take a special action should the connection be terminated. This is the section that actually deals with the session_write_close() issue, so focus on this script.
<?php
/*Ignore user abort so we can catch it with connection_aborted*/
ignore_user_abort(true);
function send_file_to_user($filename) {
//Set the appropriate headers:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($filename));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($filename));
$chunksize = 10*(1024); // how many bytes per chunk (i.e. 10K per chunk)
$buffer = '';
$already_transferred =0;
$file_size = filesize( $filename );
$handle = fopen($filename, 'rb');
if ($handle === false) {
return false;
}
while (!feof($handle)) {
/*if we're using a session variable to communicate, just open the session
when sending a chunk and then close the session again so that other
scripts which have requested the session are able to access it*/
session_start();
//see if the user has aborted the connection, if so, set the status
if (connection_aborted()) {
$_SESSION['file_progress'] = "ABORTED";
return;
}
//otherwise send the next packet...
$buffer = fread($handle, $chunksize);
echo $buffer;
ob_flush();
flush();
//now update the session variable with our progress
$already_transferred += strlen($buffer);
$percent_complete = round( ($already_transferred / $file_size) * 100);
$_SESSION['file_progress'] = $percent_complete;
/*now close the session again so any scripts which need the session
can use it before the next chunk is sent*/
session_write_close();
}
$status = fclose($handle);
return $status;
}
send_file_to_user( 'large_example_file.pdf');
?>
This is the script that is called via AJAX and is in charge of reporting progress back to the landing page:
<?php
session_start();
echo $_SESSION['file_progress'];
?>
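If the status script ever needs to do more work after reading the value, a variant (just a sketch, not part of the original code) that releases the lock immediately after the read would look like this:

<?php
session_start();
$progress = isset($_SESSION['file_progress']) ? $_SESSION['file_progress'] : 0;
session_write_close(); // release the lock right away so the download script is never delayed
echo $progress;
?>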