Reputation: 43441
I have a huge file to be processed on the server. I upload the file to the server, then read it, building an array. Now I need to send that information back to the server:
var tmp = [];
var i = 0;

function getXMLFile(file) { // single call
    $.ajax({
        url: '....',
        type: 'post',
        dataType: 'json',
        data: { filename: file },
        success: function (json) {
            $.each(json, function (key, value) { // iterates over 50 000 items
                tmp.push(value);
                i++;
                if (i > 10000) {
                    setTimeout(function () {
                        insert(tmp);
                        tmp = [];
                        i = 0;
                    }, 1000);
                }
            });
        }
    });
}
And here is the function that locks the browser:
function insert(data) { // called from getXMLFile(); @data -> array of 10 000 code entries
    $.ajax({
        url: '....', // for now the PHP function does nothing
        type: 'post',
        dataType: 'json',
        data: { codes: data },
        async: true // !!!!
    });
}
As you can see, I have 'async: true' and I am using setTimeout so that my browser does not get locked. But it still locks up... Have I done something wrong?
Upvotes: 0
Views: 1497
Reputation: 11671
You are uploading a file to the server, and the server then returns a lot of data. From that data you iterate over about 50,000 items, making a request every 10,000 iterations with an array that keeps growing: `tmp` is only cleared inside the setTimeout callback, which cannot run until the synchronous $.each loop has finished. You end up with several requests firing after about 1 second, each carrying large data.
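One detail to check on the client: because `tmp` is only reset inside the timeout callback, every batch you post is larger than the previous one. A minimal sketch of batching that resets the buffer synchronously (reusing the insert() function from the question) could look like this:

    success: function (json) {
        var tmp = [];
        $.each(json, function (key, value) {
            tmp.push(value);
            if (tmp.length >= 10000) {
                insert(tmp); // send the finished batch
                tmp = [];    // reset immediately, before the next push
            }
        });
        if (tmp.length > 0) {
            insert(tmp); // send the last, partial batch
        }
    }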
It makes sense that this impacts your browser's performance. I propose doing on the server whatever can be done there: the data the server returns the first time can also be processed by the server itself, without the client making requests with large payloads just to send the data back again. This way you avoid the poor performance in your browser.
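For example, since the server already knows the uploaded file's name (you send `filename` in the first request), a single small request could ask the server to parse the file and store the codes itself. A sketch, assuming a hypothetical process.php endpoint that does the parsing server-side:

    function processXMLFile(file) {
        $.ajax({
            url: 'process.php', // hypothetical endpoint: parses the uploaded file and inserts the codes on the server
            type: 'post',
            dataType: 'json',
            data: { filename: file },
            success: function (result) {
                console.log('server finished:', result);
            }
        });
    }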
To help you resolve this, look at your browser's memory consumption and try working with a small data set. If the browser no longer locks up, you will know that the amount of data you are trying to process on the client side is too large.
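For instance, assuming the response is a plain array, you can slice it before iterating to find the size at which the browser starts to struggle:

    success: function (json) {
        var sample = json.slice(0, 1000); // start with 1 000 items and raise the limit gradually
        $.each(sample, function (key, value) {
            tmp.push(value);
        });
    }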
Upvotes: 2