Reputation: 920
I have what I would have thought was a very basic issue whose solution is for some reason completely eluding me.
I have an HTML form #sendform with a lot of checkboxes, each with the class .userCheckbox. The number of checkboxes is not predictable; it can be anywhere from just one or two to several thousand. Each box represents a user, and if it is checked, that user should receive an e-mail when the form is submitted.
The intended outcome is that submitting the form sends an AJAX request to the PHP backend, adding a class loading to the parent container (which shows a little spinning wheel) when the request is sent, and then replacing that with a success or error class, as relevant, when the request completes. I don’t want to send off the entire bulk of checkboxes at once, since (correct me if I’m wrong) that would entail having to wait for all rows to be processed before getting any visual feedback on the page.
So the very basic skeleton is something like this:
$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);
    $form.find('input.userCheckbox').each(function(i) {
        var $this = $(this);
        if (this.checked) {
            $.ajax({
                url: $form.prop('action'),
                type: $form.prop('method'),
                dataType: 'json',
                data: { /* ... stuff here ... */ },
                beforeSend: function() { /* Add loading class to parent */ },
                success: function(data, textStatus, jqXHR) { /* Add success/fail class to parent depending on data returned */ },
                error: function(jqXHR, textStatus, errorThrown) { /* Add fail class to parent */ }
            });
        }
    });
});
This of course works, in theory, but in practice, all the AJAX requests being sent off pretty much at the same time wreaks havoc. The server (a bog-standard, low-end shared server) seems to handle about twenty or so more-or-less-simultaneous AJAX requests, and then it gives up and the rest of the calls fail with the textStatus “Error” and the errorThrown “Too many requests”.
So what I need is to fire off the AJAX requests one at a time, each iteration of the loop waiting until the previous AJAX call has completed before sending off another one—but also, preferably, without locking up the entire browser, as async: false would do. Enter callback hell, or at least the (to me) very inelegant solution of a recursive function called in the complete callback of the AJAX request.
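For reference, a minimal sketch of that recursive approach (with the per-request data elided, as in the snippets below) might look like this:

function sendNext($boxes, index, $form) {
    if (index >= $boxes.length) return; // all boxes processed
    var $box = $boxes.eq(index);
    $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { /* ... per-user stuff, e.g. $box.val() ... */ },
        complete: function() {
            // Only start the next request once this one has finished
            sendNext($boxes, index + 1, $form);
        }
    });
}
// Kicked off with, e.g.: sendNext($form.find('input.userCheckbox:checked'), 0, $form);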
Surely this must be an exceedingly common question to have, I thought to myself, and went searching for better alternatives. Sure enough, there is a plethora of StackOverflow questions on variations of this topic, and several plethorata¹ of random Internet people offering various solutions to it as well.
The common denominator among the vast majority of these solutions is that using deferred objects and promises is The Way To Go. Very well—I’ve never really grokked or made use of deferred objects, so I figure this is as good a time as any to figure out just what exactly they are and how they work. This turns out to be trickier than expected, and I’m still not sure I’ve quite got it, which may be why I’m not getting this.
I’ve tried implementing at least five different variations of promise-based solutions found here and elsewhere now, and my problem is that none of them works. Partly because the vast majority are six or seven years old, and deferreds and promises have changed a lot since then in ways that I can’t quite keep track of.
At the moment, I have implemented a variation on this solution, essentially thus:
$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);
    var promise = $.when({});
    $form.find('input.userCheckbox').each(function(i) {
        var $this = $(this);
        if (this.checked) {
            promise = promise.then(sendmail($this, $form));
        }
    });
});
function sendmail($this, $form) {
    var defer = $.Deferred();
    $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { /* Stuff here */ },
        beforeSend: function() {
            console.log($this.val() + " start: " + Date.now());
            /* + Add loading class to parent */
        },
        success: function(data, textStatus, jqXHR) { /* Add success/fail class to parent */ },
        error: function(jqXHR, textStatus, errorThrown) { /* Add fail class to parent */ },
        complete: function() {
            console.log($this.val() + " end: " + Date.now());
            defer.resolve();
        }
    });
    return defer.promise();
}
This works… exactly the same way that the code at the top works. In other words, it just shoots off all the AJAX requests basically at once. No waiting for the previous call in the promise queue to finish before sprinting ahead. I log the start and completion times for each AJAX request to the console just to check; and sure enough, this is what I get, trying with ten checkboxes:²
1007 start: 12:12:41.333
1008 start: 12:12:41.341
1009 start: 12:12:41.346
1010 start: 12:12:41.350
1011 start: 12:12:41.355
1012 start: 12:12:41.359
1013 start: 12:12:41.363
1014 start: 12:12:41.367
1015 start: 12:12:41.372
1016 start: 12:12:41.375
1007 end: 12:12:42.140
1008 end: 12:12:42.553
1010 end: 12:12:42.639
1009 end: 12:12:42.772
1011 end: 12:12:42.889
1013 end: 12:12:43.007
1015 end: 12:12:43.157
1016 end: 12:12:43.289
1012 end: 12:12:43.422
1014 end: 12:12:43.570
Each AJAX request takes about a second or so to complete, but the next one gets started immediately, within a few milliseconds.
I think I see why this is happening. The sendmail function returns a promise object; but this happens immediately, before the complete callback to the AJAX request has resolved the underlying deferred object. So it essentially makes no difference: there is no instruction to actually wait until the deferred object is resolved before adding the next call to the queue.
All the different approaches I’ve come across here on SO and elsewhere have had this in common—they essentially do nothing to solve the actual issue, as far as I can tell.
So the actual question:
Am I just completely missing something blindingly obvious, or misunderstanding how deferreds and promises work?
¹ Yes, I know plethora is feminine, not neuter—indulge me.
² Not actually using Date.now() since Unix timestamp milliseconds aren’t the most readable things in the world, but just a trivial wrapper function that I’ve left out here lest this question get any longer than it already is.
Upvotes: 0
Views: 1781
Reputation: 1
Better yet: make one AJAX request to the PHP handler and handle an array on the server side:
function sendMail(userCheckbox, $form) {
    $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { array: userCheckbox /* and more data here */ },
        beforeSend: function() {
            console.log("start: " + Date.now());
            /* + Add loading class to parent */
        },
        success: function(response) {
            /* check response and do something like if (response == 'success') */
        },
        complete: function() {
            console.log("end: " + Date.now());
        }
    });
}
$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);
    var userCheckbox = [];
    $form.find('input.userCheckbox:checked').each(function(i) {
        var $this = $(this);
        if ($this.val())
            userCheckbox.push($this.val());
    });
    if (userCheckbox.length > 0) sendMail(userCheckbox, $form);
});
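As a side note on what this sends over the wire: jQuery serializes the array using bracket notation, which PHP parses back into an array automatically. A quick way to see the default serialization:

// jQuery's default serialization of an array parameter:
console.log($.param({ array: [1007, 1008, 1009] }));
// => "array%5B%5D=1007&array%5B%5D=1008&array%5B%5D=1009"
// i.e. array[]=1007&array[]=1008&array[]=1009, available as $_POST['array'] in PHP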
Upvotes: -1
Reputation: 3151
Try this, but only with jQuery 3+, because of changes to callback execution:
$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);
    var promise = $.when();
    $form.find('input.userCheckbox').each(function(i) {
        var $this = $(this);
        if (this.checked) {
            promise = promise.then(function() {
                return sendmail($this, $form);
            }).then(function(data) {
                // success
            }, function() {
                // fail
            });
        }
    });
});
function sendmail($this, $form) {
    return $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { /* Stuff here */ },
        beforeSend: function() {
            console.log($this.val() + " start: " + Date.now());
            /* + Add loading class to parent */
        },
        complete: function() {
            console.log($this.val() + " end: " + Date.now());
        }
    });
}
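The crucial difference from the version in the question is that sendmail is not called while the chain is being built; .then() is handed a function, which it invokes only once the previous request has settled:

// Eager (the question's version): sendmail() runs immediately, and .then()
// receives a promise rather than a function, so the argument is simply ignored:
promise = promise.then(sendmail($this, $form));

// Lazy (this answer): .then() receives a function, called only after the
// previous request completes:
promise = promise.then(function() {
    return sendmail($this, $form);
});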
Upvotes: 0
Reputation: 1074138
So it sounds like you want to limit the number of concurrent outstanding requests.
Before answering that, some thoughts regarding other things to do instead:
I suspect you'd be better served by "chunking" groups of checkboxes together, so you have (say) 10 or 20 requests rather than 100, and get back your results in blocks (a sketch of this appears after this list).
It is possible to have a single request and then get back information incrementally from the server as that request is processed. It's been so long since I've had to do this that an iframe was the best approach at the time, but I'm sure things have moved on in the intervening ~17 years. Basically, the resource you're posting to writes each result to its response as that result becomes available, and flushes its output; the server sends that to the client without terminating the connection (because there's still more data to come), and the client can read that partial data and update. So it may be worth researching how to do that here in 2017, as opposed to with a dodgy iframe. :-)
Another approach is to send a single request with all the data, and have its response be purely that the request has been received. Then have the code poll periodically to get the results available as they become available.
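To illustrate the first of those suggestions, here is a minimal sketch of the chunking idea, inside the submit handler where $form is defined as in the question (the users parameter name and the chunk size are placeholders, not anything from the question):

// Split the checked boxes into groups of (say) 20 per request:
var checkboxes = $form.find('input.userCheckbox:checked').get();
var chunkSize = 20;
var chunks = [];
for (var i = 0; i < checkboxes.length; i += chunkSize) {
    chunks.push(checkboxes.slice(i, i + chunkSize));
}
// Each chunk becomes one request carrying an array of user values;
// these per-chunk requests could themselves be fed through the
// concurrency-limited chains shown below.
chunks.forEach(function(chunk) {
    $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { users: chunk.map(function(cb) { return cb.value; }) }
        // success/error handlers would update all rows in this chunk at once
    });
});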
With that out of the way, back to your actual question: :-)
From a Promise perspective, you can use my solution in this answer:
const items = /* ...1000 items... */;
const concurrencyLimit = 10;
const promise = Promise.all(items.reduce((promises, item, index) => {
    // What chain do we add it to?
    const chainNum = index % concurrencyLimit;
    let chain = promises[chainNum];
    if (!chain) {
        // New chain
        chain = promises[chainNum] = Promise.resolve();
    }
    // Add it
    promises[chainNum] = chain.then(_ => foo(item));
    return promises;
}, []));
Adapting that to your situation, I get something along these lines:
$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);
    // Get a true array for only the checked checkboxes
    var checkboxes = $form.find('input.userCheckbox:checked').get();
    var concurrencyLimit = 10;
    Promise.all(checkboxes.reduce(function(promises, checkbox, index) {
        // What chain do we add it to?
        var chainNum = index % concurrencyLimit;
        var chain = promises[chainNum];
        if (!chain) {
            // New chain
            chain = promises[chainNum] = Promise.resolve(); // Or $.Deferred().resolve().promise() would work, I think
        }
        // Add it
        promises[chainNum] = chain.then(function() {
            return sendmail($(checkbox), $form);
        });
        return promises;
    }, [])).then(function() {
        // Entire process is done
    });
});
With these small changes to sendmail:
function sendmail($this, $form) {
    return $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { /* Stuff here */ },
        beforeSend: function() {
            console.log($this.val() + " start: " + Date.now());
            /* + Add loading class to parent */
        },
        success: function(data, textStatus, jqXHR) { /* Add success/fail class to parent */ },
        error: function(jqXHR, textStatus, errorThrown) { /* Add fail class to parent */ },
        complete: function() {
            console.log($this.val() + " end: " + Date.now());
        }
    });
}
That's assuming a fairly up-to-date version of jQuery, in which the various issues with $.Deferred have been cleaned up and brought in line with the Promises/A+ spec.
We can take $.Deferred out of the equation entirely if you prefer, via these changes to sendmail:
function sendmail($this, $form) {
    return new Promise(function(resolve, reject) {
        $.ajax({
            url: $form.prop('action'),
            type: $form.prop('method'),
            dataType: 'json',
            data: { /* Stuff here */ },
            beforeSend: function() {
                console.log($this.val() + " start: " + Date.now());
                /* + Add loading class to parent */
            },
            success: function(data, textStatus, jqXHR) {
                /* Add success/fail class to parent */
                resolve();
            },
            error: function(jqXHR, textStatus, errorThrown) {
                /* Add fail class to parent */
                reject();
            },
            complete: function() {
                console.log($this.val() + " end: " + Date.now());
            }
        });
    });
}
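Since the jqXHR object that $.ajax returns is itself thenable, another option along the same lines (a sketch, with the handlers elided) is to let a native promise assimilate it directly:

function sendmail($this, $form) {
    // Promise.resolve() adopts any thenable, including jQuery's jqXHR:
    return Promise.resolve($.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { /* Stuff here */ }
    }));
}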
Upvotes: 1
Reputation: 338148
An HTTP request is a slow thing. And it's a resource hog, on the server side and the client side, when any processing at all is involved. You want to make as few of them as possible, while making the server's response as fast as possible.
Throttling the requests is a method of last resort, not something to do casually. When one instance of your web app bogs down the server with too many requests, you have an architectural issue, not a processing issue.
You can't really fix bad architecture by changing the processing side. What your code currently does is:
- intercept the form's submit event, and
- send one HTTP request per checked checkbox.
Clearly the second step is misguided, especially since the only justification seems to be "I want the user to see intermediary progress". That's a nice goal, but making them wait longer and hammering the server at the same time can't be the solution.
Making the 100 requests in sequence causes the user to wait even longer and only solves the server-hammering problem. So that's an even worse solution.
My advice would be to send the form as it would be sent - in one request - and work on the server side to make the processing of that request as fast as you possibly can. I'm pretty sure there is room for optimization there.
In a second step you can think about how to make the client more interactive during the wait. But keep in mind - sending one request and letting the server process it is the shortest wait you can get. Anything that sends more than one request will of course be slower overall due to the added overhead.
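For illustration only - the endpoint, response shape and helper below are entirely hypothetical - that second step could be a single submit followed by periodic progress polling:

// Hypothetical sketch: POST the whole form once...
$.post($form.prop('action'), $form.serialize());

// ...then poll a (hypothetical) progress endpoint until the server reports completion.
var poll = setInterval(function() {
    $.getJSON('/mail-progress', function(status) {
        updateRows(status.done);              // hypothetical helper: mark processed rows
        if (status.finished) clearInterval(poll);
    });
}, 1000);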
Upvotes: 1