Reputation: 140
I have two AJAX calls. The first call reads a file and saves it into the DB (MySQL); at the bottom of a for loop it sets a session variable with the current "status". The second call, fired on an interval, returns that session variable.
This is my JavaScript code:
var interval = null;

function test(data) {
    var i = 0;
    interval = setInterval(function () {
        $.ajax({
            url: '/admin/movies/progress',
            type: "GET",
            async: true,
            dataType: "text",
            success: function (data) {
                console.log(data);
                $('#saveFileProgressBar').width(data[0]);
            },
            error: function (XMLHttpRequest, textStatus, errorThrown) {
                toastr.error('error progressbar', 'Download File');
            }
        });
        i++;
        if (i == 5) {
            clearInterval(interval);
        }
    }, 500);
    $.ajax({
        url: '/admin/movies/1',
        type: "GET",
        async: true,
        dataType: "text",
        success: function (data) {
            console.log(data);
        },
        error: function (XMLHttpRequest, textStatus, errorThrown) {
            toastr.error('error', 'Download File');
        }
    });
}
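As a side note, the polling above stops after a fixed five ticks rather than when the long request actually finishes. A minimal sketch (the helper name `createPoller` and the callback `pollFn` are my own, assuming jQuery for the usage comment) of clearing the interval from the main call's `complete` handler instead:

```javascript
// Hypothetical helper: start polling pollFn every intervalMs milliseconds
// and return a handle whose stop() clears the interval. pollFn stands in
// for whatever performs the progress $.ajax request.
function createPoller(pollFn, intervalMs) {
    var timer = setInterval(pollFn, intervalMs);
    return {
        stop: function () {
            clearInterval(timer);
        }
    };
}

// Usage sketch (assumed names): stop polling in the main call's `complete`
// handler, which jQuery fires after both success and error:
//   var poller = createPoller(fetchProgress, 500);
//   $.ajax({ url: '/admin/movies/1', complete: function () { poller.stop(); } });
```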
This is my Laravel 5.4 code:
// Mapped to mysite.com/admin/movies/progress
public function getProgress() {
    return Response::json(array(Session::get('progress')));
}

// Mapped to mysite.com/admin/movies/1
public function postGenerate() {
    // getting values from form (like $record_num)
    Session::put('progress', 0);
    Session::save(); // Remember to call save()

    for ($i = 1; $i < 100; $i++) {
        sleep(1);
        Session::put('progress', $i);
        Session::save(); // Remember to call save()
    }

    return "done";
}
///////EDIT///////
My new PHP code:
public function getProgress() {
    $rawData = file_get_contents('plugins/elFinder-2.1.25/files/data/progressFile.json');
    $cacheData = json_decode($rawData, true);
    return $cacheData;
}

public function postGenerate() {
    // getting values from form (like $record_num)
    for ($i = 0; $i < 10; $i++) {
        $data['progress'] = $i;
        $fres = fopen('plugins/elFinder-2.1.25/files/data/progressFile.json', 'w');
        fwrite($fres, json_encode($data));
        fclose($fres);
        sleep(1);
    }
    return "true";
}
My JavaScript code:
var interval = null;

function test() {
    var i = 0;
    interval = setInterval(function () {
        $.ajax({
            url: '/admin/movies/progress',
            type: "GET",
            async: true,
            dataType: "text",
            success: function (data) {
                console.log(data);
                $('#saveFileProgressBar').width(data.progress);
            },
            error: function (XMLHttpRequest, textStatus, errorThrown) {
                toastr.error('An error occurred while reading the file', 'Download File');
            }
        });
        i++;
        if (i == 5) {
            clearInterval(interval);
        }
    }, 500);
    $.ajax({
        url: '/admin/movies/1',
        type: "GET",
        async: true,
        dataType: "text",
        success: function (data) {
            console.log(data);
        },
        error: function (XMLHttpRequest, textStatus, errorThrown) {
            toastr.error('An error occurred while reading the file', 'Download File');
        }
    });
}
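One detail worth noting in the edited version: the endpoint now returns a JSON string such as {"progress": 3}, but with dataType: "text" jQuery hands the raw string to the success callback, so data.progress is undefined. A minimal sketch (the helper name parseProgress is my own) that parses the string and also tolerates an empty body, e.g. while the file is mid-write:

```javascript
// Hypothetical helper: turn the raw text response of the progress endpoint
// into a number, or null when the body is empty, partial, or not a number.
function parseProgress(raw) {
    try {
        var parsed = JSON.parse(raw);
        return typeof parsed.progress === "number" ? parsed.progress : null;
    } catch (e) {
        // Empty or truncated body, e.g. while the file is being rewritten.
        return null;
    }
}

// In the success callback one could then write:
//   var p = parseProgress(data);
//   if (p !== null) { $('#saveFileProgressBar').width(p); }
```

Alternatively, setting dataType: "json" would let jQuery do the parsing, but then an empty body during a write would land in the error callback instead.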
Upvotes: 1
Views: 1303
Reputation: 229
(Image: list of AJAX requests)
(Image: log of responses)
Hi,
I just ran a test of your script, and the response is in the images above. The first image is the list of AJAX requests: the green ones are completed requests, and the gray one is a request still in progress. The second image shows a log of the responses in the order the requests completed (the first completed request on top, and so on).
1 in the images refers to the first AJAX call (of course, I renamed it to /set-progress instead of /admin/movies/1). The subsequent second AJAX calls are tagged 2 - 6, and /progress = /admin/movies/progress.
I presume your problem is that the subsequent (second-call) requests do not run until the first one has completed. My test shows this to be false: we can clearly see that the first request is still pending while the other requests have already completed. So we can say that your requests are running in parallel.
In your case, there is only one other possible issue (the least probable, provided you have enough resources to process a small request): your server is out of resources, meaning it does not have enough memory to serve the second calls. Since the first call went through and exhausted the server's resources, the server cannot process further requests, and hence blocks the second calls until the first request has completed. So your first call, or some other running task/process/request, is consuming your server's resources and thereby blocking further requests.
Regarding the comments and the previous answer claiming that the file is locked, so parallel requests can't be sent to the server, or that the server cannot accept parallel requests: this statement is also incorrect. It is true that the session file is locked each time it is being updated, but this has nothing to do with the browser sending multiple requests at once.
If the file is unavailable or does not exist during a request, Laravel simply throws a 500 error. The attached image below shows the response for the case where the file is not available.
If the file is locked while being written, the response (to the progress call) is simply empty. This means the file exists, but its contents are unavailable at the moment. The image below shows a sample response.
If this answer does not satisfy you, please follow these links and see if they help: Link 1 and Link 2.
My test environment:
Upvotes: 2
Reputation: 3081
Instead of using the session, use a regular file. The session file is locked for each unique session until the script has finished, so requests for that session can't run in parallel. What you can do is store the data in a plain file and access it.
To read from the cached file:
$rawData = file_get_contents('data/progressFile.json');
$cacheData = json_decode($rawData, true);
To write to it:
$data['progress'] = 100;
$fres = fopen('data/progressFile.json', 'w');
fwrite($fres, json_encode($data));
fclose($fres);
You can use one JSON structure for all the session data and parse it to get the particular progress value, or you may use a separate file for each progress entry. If you are using one large shared file, you may need to use flock() to lock the file during writing.
Upvotes: 1