Reputation: 2436
I am maintaining software that unfortunately has to run on IE8. The problem with IE8 is that it throws the 'script unresponsive' error if a synchronous execution runs too long:
This message displays when Internet Explorer reaches the maximum number of synchronous instructions for a piece of JavaScript. as described here
The standard way of dealing with this is something like
setTimeout(function(){ /* slow code */ }, 1);
but, in my case the slow part is actually:
jQuery(/*selectors*/).each()// iteration
How can I iterate through the elements found with jQuery().each(), where the actual .each() part is carried out recursively with timeouts? Even if the each() block does nothing, I still get the pop-up warning. There are about 20,000 elements to iterate through... I know...
What is the best way to do this without rewriting anything on the page (let's assume I really can't rewrite the 20,000-element table)?
Upvotes: 2
Views: 2190
Reputation: 707456
FYI, if your problem actually occurs because this operation all by itself:
jQuery(selectors)
takes too long, then you will have to change the selector to be something that is much faster to evaluate or change the HTML somehow so you can query for pieces of the table at a time. jQuery in IE8 is likely using the Sizzle library to evaluate the selector so if you've got a combination of large HTML, selector and Sizzle that are just too slow for IE8, then you will have to change one of the three.
We can't help with specifics on this issue without seeing the actual HTML and probably having some sort of test bed to experiment with. My guess would be that there could be a better selector, perhaps using natively supported query mechanisms such as getElementsByTagName()
or something like that, but we'd have to see the actual HTML to make a more concrete recommendation. As you already know, 20,000 elements in a really slow browser is just a bad recipe to start with.
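As a sketch of that idea: if the table happened to have an id (the "bigTable" id and the collectRows helper below are hypothetical names, not anything from your page), you could fetch its rows with native DOM calls and hand the array-like node list straight to jQuery, which accepts one directly:

```javascript
// Hypothetical sketch: "bigTable" and collectRows are made-up names.
// Native id/tag lookups are much cheaper in IE8 than a complex Sizzle
// selector; jQuery() accepts the resulting array-like node list directly.
function collectRows(doc, $) {
    var table = doc.getElementById("bigTable");  // native id lookup
    var rows = table.getElementsByTagName("tr"); // native tag query
    return $(rows);                              // wrap for jQuery methods
}
```

In a page this would be called as collectRows(document, jQuery), yielding a jQuery object of rows without ever running a Sizzle selector over the whole table.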
If the selector query itself completes fine and you just want help with the iteration, you can't use .each() directly because it will run all at once. Instead, you will have to manually iterate the jQuery list of DOM objects.
function processLargeArray(items) {
    // set chunk size to whatever number of items you can process at once
    var chunk = 100;
    var index = 0;
    function doChunk() {
        var cnt = chunk;
        while (cnt-- && index < items.length) {
            // process items.eq(index) here
            ++index;
        }
        if (index < items.length) {
            // set timeout for async iteration
            setTimeout(doChunk, 1);
        }
    }
    doChunk();
}
var data = jQuery(selectors);
processLargeArray(data);
FYI, this code is adapted for use with a jQuery object from a more general purpose answer on the subject: Best way to iterate over an array without blocking the UI
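For reference, that general-purpose plain-array pattern can be sketched as follows (processChunked and its callback parameters are illustrative names, not any library's API):

```javascript
// Sketch of general-purpose chunked iteration over a plain array.
// processChunked and its parameters are illustrative names only.
function processChunked(items, chunkSize, processFn, doneFn) {
    var index = 0;
    function doChunk() {
        var cnt = chunkSize;
        while (cnt-- && index < items.length) {
            processFn(items[index], index); // handle one item
            ++index;
        }
        if (index < items.length) {
            setTimeout(doChunk, 1); // yield to the UI before the next chunk
        } else if (doneFn) {
            doneFn(); // all items processed
        }
    }
    doChunk();
}
```

For example, summing 10,000 numbers in chunks of 100 never blocks for more than 100 iterations at a stretch, which is exactly the property that keeps IE8's "unresponsive script" counter from tripping.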
And here's a version that uses a jQuery plugin to create a similar interface to .each()
(but it's async).
jQuery.fn.eachChunk = function(chunk, eachFn, completeFn) {
    var index = 0;
    var obj = this;
    function next() {
        var temp;
        if (index < obj.length) {
            temp = obj.slice(index, index + chunk);
            temp.each(eachFn);
            index += chunk;
            setTimeout(next, 1);
        } else {
            if (completeFn) {
                completeFn();
            }
        }
    }
    next();
    return this;
};
jQuery(selectors).eachChunk(100, yourFn);
Upvotes: 2
Reputation: 1055
I really like the Async Library:
https://github.com/caolan/async
which gives you a whole bunch of options for running sequential, asynchronous functions.
async.each([...], function(item, callback){
    // iterator function: process one item, then signal completion
    callback();
});
Works in the same way as jQuery's each, but unlike jQuery, you get more control, such as:
async.eachSeries([...], function(item, callback){
    // will handle each element one at a time
    callback();
});
And async.eachLimit lets you control the number of tasks being run at any given time, so that at most x tasks are running in parallel;
So, for example:
async.eachLimit([...], 2, function(item, callback){
    // this will run up to 2 tasks simultaneously
    callback(); // if successful
});
Will run the iterator function over all the elements in the array, but will limit the number of concurrently running tasks to 2. If you want 4 tasks, just change the second argument to 4, and so on.
If you need a timeout that makes a task fail silently (tasks that exceed the timeout just never finish, and the rest of the queue moves on):
async.eachLimit([...], 2, function(item, callback){
    var finished = false;
    function done(err) {
        if (finished) return; // make sure callback fires only once
        finished = true;
        callback(err);
    }
    // start the task here and call done() when it succeeds
    setTimeout(function(){
        done(); // silent timeout: report success and move on
    }, timeoutLength);
});
If you need a timeout that makes a task fail un-silently (one error will stop everything):
async.eachLimit([...], 2, function(item, callback){
    var finished = false;
    function done(err) {
        if (finished) return; // make sure callback fires only once
        finished = true;
        callback(err);
    }
    // start the task here and call done() when it succeeds
    setTimeout(function(){
        done("Error"); // timed out: pass an error so the whole run stops
    }, timeoutLength);
});
I'm not sure what the exact requirements of your job are, but I think the async library is a good candidate for this kind of stuff, and has a lot of control flow flexibility to get your job done.
Upvotes: 1
Reputation: 74420
Here is one jQuery way, using an each loop:
(function () { // avoid global vars for the test
    var $elems = $('selector'), // elements to iterate
        chunk = 50; // number of elements to process before pausing iteration
    (function doStuff() { // recursive method
        $elems.each(function (i) {
            // delaying method when needed
            if (i && !(i % chunk)) {
                $elems = $elems.slice(i);
                setTimeout(doStuff, 1);
                return false;
            }
            // do slow stuff HERE
            // ...
        });
    })();
})();
Upvotes: 0