Reputation: 26904
My application runs up to 180 AJAX jobs that are IO-intensive on the server side (long-running SELECT queries).
I would like to optimize the load on the multiple CPU cores I have available, switching from a design in which each AJAX call is executed sequentially to one in which these requests are executed with a maximum of, say, 4 in parallel.
A possible, but ugly, solution could be to issue all 180 requests at the same time on the client and have the server throttle them with a Semaphore stored at Session or Application level. I will discuss application workloads later.
I would like to find a nicer solution in which the calls are all started in order (each row in a table is a different check query), but when any one terminates the next is started, so that there is a fixed number (namely 4) of concurrent AJAX requests, each with its own loader indicator.
I have tried to use Threadpool-js, but I have found that I cannot use jQuery from web workers.
My current code is the following:
function GlobalCheck() { //entry point
    if (ValidateDate()) {
        //Below are global variables
        list = $(".chkClass:checked"); //Only checked rows lead to an AJAX request
        num = $(".chkClass:checked").length; //Total number of AJAX calls
        done = 0; //Count of completed calls. When it reaches num we are done!
        if (list.length == 0) {
            alert('...');
            return;
        }
        $(".pMessage").fadeOut();
        $(".tbStatus").html('');
        $(".submit").hide();
        $(".exportFunctions").fadeOut();
        $(".loader").show();
        $(":checkbox").attr('disabled', true);
        SingleCheck(0); //simplification, I do other uninteresting things here
    }
}
function SingleCheck(index) {
    var aValue = $($(list).get(index)).val(); //var matters: a shared global would be overwritten by concurrent calls before the success callback runs
    var splitted = aValue.split('_');
    $('#loader_' + aValue).show();
    $('#green_' + aValue).hide();
    $('#yellow_' + aValue).hide();
    $('#red_' + aValue).hide();
    $('#print_' + aValue).hide();
    $('#xls_' + aValue).hide();
    $('#summ_' + aValue).hide();
    $.ajax({
        type: 'GET',
        url: '@Url.Action("Single", "Check")',
        data: {
            pType: splitted[0],
            pIdQuery: splitted[1],
            pDateBegin: $('#date_begin').attr('value'),
            pDateEnd: $('#date_end').attr('value'),
            pNow: Math.floor(Math.random() * 1000000)
        },
        success: function (data) {
            if (!CheckSessionExpired(data)) {
                //alert(data);
                $("#tdStatus_" + aValue).html(data);
                $("#loader_" + aValue).hide();
                done++; //Done 1 more query
                $(".progress").each(function (i, cRow) { $(this).html([update status]); });
                if (done == num) { // Finished?
                    FinishCheck();
                }
                else {
                    SingleCheck(done); //Go to the next
                }
            }
        },
        error: function (XMLHttpRequest, textStatus, errorThrown) {
            alert(errorThrown);
            RedirectToError();
        }
    });
}
The question is: what approach can I follow in order to create concurrent AJAX requests in my scenario?
[edit] I forgot to discuss application demands: this application runs live but does not serve a large user base. When a user submits data to be checked, the application performs intensive operations, then stays idle for long periods of time.
Upvotes: 11
Views: 6771
Reputation: 6032
I had difficulty with this because I was also trying to track errors and pass a serialised object to the server side.
The following worked for me; I hope it helps.
var args1 = {
    "table": "users",
    "order": " ORDER BY id DESC ",
    "local_domain": ""
};
var args2 = {
    "table": "parts",
    "order": " ORDER BY date DESC ",
    "local_domain": ""
};
$.when(
    $.ajax({
        url: args1.local_domain + '/my/restful',
        headers: {
            'Content-Type': 'application/x-www-form-urlencoded'
        },
        type: "POST",
        dataType: "json",
        contentType: "application/json; charset=utf-8",
        data: JSON.stringify(args1),
        error: function (err1) {
            alert('(Call 1) An error just happened... ' + JSON.stringify(err1));
        }
    }),
    $.ajax({
        url: args2.local_domain + '/my/restful',
        headers: {
            'Content-Type': 'application/x-www-form-urlencoded'
        },
        type: "POST",
        dataType: "json",
        contentType: "application/json; charset=utf-8",
        data: JSON.stringify(args2),
        error: function (err2) {
            alert('(Call 2) An error just happened... ' + JSON.stringify(err2));
        }
    })
).then(function (data1, data2) {
    data1 = cleanDataString(data1);
    data2 = cleanDataString(data2);
    data1.forEach(function (e) {
        console.log("ids " + e.id);
    });
    data2.forEach(function (e) {
        console.log("dates " + e.date);
    });
});
function cleanDataString(data) { // this is extra
    data = decodeURIComponent(data);
    // the next if statement is only needed because I got an additional object
    // on the back of my JSON object: parse it out, then add the 2 closing brackets back
    if (data !== undefined && data.toString().includes('}],success,')) {
        var temp = data.toString().split('}],success,');
        data = temp[0] + '}]';
    }
    data = JSON.parse(data);
    return data; // return the parsed object
}
Upvotes: 1
Reputation: 17487
Note: the number of concurrent requests a browser can make to a given domain is limited. So you might as well fire all your requests at once; the browser will take care of queuing them and will only execute about 6 to 8 concurrently.
Source: Max parallel http connections in a browser?
However, if you want more precise control, you might implement something like this:
//define your list of URLs and what to do with the data
var sources = [
    {
        url: "http://...",
        callback: function (data) { /* do something */ }
    },
    {
        url: "http://...",
        callback: function (data) { /* do something */ }
    }
];
//state
var maxConcurrentRequests = 4;
var concurrentRequests = 0;
var currentSourceIndex = -1;
//this function will ensure that as long as there are sources left, there are 4 requests running
function startRequestIfNeeded() {
    while (currentSourceIndex < sources.length - 1 && concurrentRequests < maxConcurrentRequests) {
        currentSourceIndex++;
        concurrentRequests++;
        var source = sources[currentSourceIndex];
        doRequest(source);
    }
}
//this fires the request and executes the callback
function doRequest(source) {
    $.getJSON(source.url, function (data) {
        source.callback(data);
        concurrentRequests--;
        startRequestIfNeeded();
    });
}
startRequestIfNeeded();
I'll leave the error handling to you. And you'll have to add logic if you want to detect when all requests are done. Maybe look into promises.
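Following up on that promise hint, here is one possible sketch in plain JavaScript (no jQuery; runPool, tasks and limit are made-up names, and each task is a function returning a Promise that stands in for one request):

```javascript
// Hedged sketch: same kind of pool as above, but returning a Promise that
// resolves once every task has completed. Error handling is still left out:
// a rejected task would stall the pool as written.
function runPool(tasks, limit) {
    var next = 0;   // index of the next task to start
    var active = 0; // tasks currently in flight
    return new Promise(function (resolve) {
        function startIfNeeded() {
            if (next === tasks.length && active === 0) {
                resolve(); // nothing left to start and nothing running: all done
                return;
            }
            while (next < tasks.length && active < limit) {
                active++;
                tasks[next++]().then(function () {
                    active--;
                    startIfNeeded(); // a slot freed up: start the next task
                });
            }
        }
        startIfNeeded();
    });
}
```

Usage would be `runPool(tasks, 4).then(allDone)`, where each element of `tasks` wraps one AJAX call.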
Upvotes: 0
Reputation: 14600
$.ajax()
is an asynchronous function which initiates a request and then returns immediately. So if you call it multiple times in a row, it will create concurrent requests.
Your code still runs on a single thread, but the HTTP requests happen in parallel in the background and jQuery invokes your callbacks as they return data. So you get parallelism without having to deal with threads directly.
In your GlobalCheck()
:
var CONCURRENT_REQUESTS = 4;
started = 0; //global, like done: index of the next query to start
while (started < CONCURRENT_REQUESTS && started < num) {
    SingleCheck(started++);
}
will start four parallel requests. In the success handler, replace SingleCheck(done) with if (started < num) SingleCheck(started++); so that a new request is started each time one finishes, and you will always have 4 parallel requests running. (Once several requests are in flight, done counts completions and can no longer double as the index of the next query, otherwise queries get skipped.) And because your code only runs on a single thread, you don't have to worry about concurrency issues with your done
variable etc.
For more concurrent requests, just increase the value of CONCURRENT_REQUESTS
, but note that very quickly you'll hit the browser's limit of concurrent requests to a single domain - it varies per browser but it's always a pretty small number. See this answer for specifics.
Upvotes: 3
Reputation: 1191
In my opinion you need an array, "nbRequest", to know how many requests are currently working (4 max).
Then use setInterval.
Start the interval; store your "num" inside "nbRequest". When the AJAX request finishes, remove your "num" from "nbRequest".
In the meantime, the interval will check whether the length of "nbRequest" equals 4. If not, start a new request.
When "done == num", stop the interval.
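A possible sketch of this interval-based idea in plain JavaScript (runWithInterval and doRequest are illustrative names; doRequest stands in for the real AJAX call and must invoke its callback when the request finishes):

```javascript
// Hedged sketch of the setInterval approach: the interval polls the "in flight"
// array and tops it up to 4 requests until every item has been processed.
function runWithInterval(items, doRequest, onAllDone) {
    var nbRequest = []; // items currently in flight (4 max)
    var next = 0;       // index of the next item to start
    var finished = 0;
    var timer = setInterval(function () {
        // fewer than 4 running? start new requests
        while (nbRequest.length < 4 && next < items.length) {
            (function (it) {
                nbRequest.push(it);
                doRequest(it, function () {
                    nbRequest.splice(nbRequest.indexOf(it), 1); // remove it when done
                    finished++;
                    if (finished === items.length) { // "done == num": stop the interval
                        clearInterval(timer);
                        onAllDone();
                    }
                });
            })(items[next++]);
        }
    }, 50);
}
```

One design note: compared to restarting a request directly from the completion callback, the polling interval adds up to 50 ms of latency per batch, which is why the other answers prefer callback chaining.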
Upvotes: 0
Reputation: 536
When you have, for instance, an AJAX call in a function 'doCall', just call this function an x amount of times, depending on the number of 'threads' you want.
doCall(x);
doCall(x);
doCall(x);
Now you have 3 'threads'. To keep them going, 'restart' the function from within itself: at the end of the doCall function, call doCall(x) again to 'keep the thread alive'.
You get some sort of 'loop', and the requests keep getting fired asynchronously.
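This self-restarting pattern could be sketched like so in plain JavaScript (queue, doCall and the ajax stub are made-up names for illustration; ajax(item, cb) stands in for the real AJAX call and must invoke cb() when the request finishes):

```javascript
// Hedged sketch: three self-restarting "threads" draining a shared queue.
var queue = ["a", "b", "c", "d", "e", "f", "g"]; // pending jobs
var active = 0;                                  // "threads" currently busy

function doCall(ajax, onAllIdle) {
    var item = queue.shift();          // take the next job, if any
    if (item === undefined) {
        if (active === 0) onAllIdle(); // the last "thread" just ran out of work
        return;
    }
    active++;
    ajax(item, function () {
        active--;
        doCall(ajax, onAllIdle);       // 'restart' the function to keep the thread alive
    });
}

// start 3 "threads":
// doCall(myAjax, whenAllIdle);
// doCall(myAjax, whenAllIdle);
// doCall(myAjax, whenAllIdle);
```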
Upvotes: 1