Reputation: 4830
I have a situation where I need to make several getJSON calls and, once all of the data has been returned, call another function, like so (code simplified for the example):
var data = {};
for (var i in funcs) {
    $.getJSON(base_url + i, function(d) {
        data[i] = d;
    });
}
do_something(data);
Obviously this doesn't work as I am making the call to do_something before the getJSON calls have returned any data.
My current approach to get around this is to make the calls synchronously, like so:
var data = {};
$.ajaxSetup({'async': false});
for (var i in funcs) {
    $.getJSON(base_url + i, function(d) {
        data[i] = d;
    });
}
$.ajaxSetup({'async': true});
do_something(data);
My question is: is there a better way of doing this, or am I best off making the calls synchronously as above?
Upvotes: 4
Views: 4646
Reputation: 4830
As per the link posted by Felix Kling to a similar question, the answer was to use deferred objects.
However, there is a further complication due to the use of i in the getJSON success function: by the time the callbacks run, it would always hold the value of the last iteration. See my other question for more details:
jQuery Deferred - Variable scope in deferred getJSON success functions
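To make the scoping issue concrete: without wrapping the callback, every success function closes over the same loop variable i, which by the time the responses arrive holds the key from the final iteration. A minimal sketch (the keys in funcs are made up purely for illustration):

var funcs = { first: true, second: true, third: true };  // hypothetical keys
for (var i in funcs) {
    $.getJSON(base_url + i, function(d) {
        console.log(i);  // logs the same key (the last one enumerated) for every response
    });
}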
Full solution:
var data = {};
var calls = [];
for (var i in funcs) {
    calls.push(
        $.getJSON(base_url + i,
            (function(i) {
                return function(d) {
                    data[i] = d;
                };
            }(i))
        )
    );
}
$.when.apply($, calls).then(function() {
    do_something(data);
});
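For reference, in more recent environments the same idea can be written without the wrapper function, since let gives each iteration its own binding and the jqXHR objects returned by $.getJSON are thenable, so Promise.all (where available) can stand in for $.when.apply. A rough sketch, not the code from the linked answer:

var data = {};
var calls = [];
for (let key in funcs) {                  // let: each iteration gets its own key
    calls.push($.getJSON(base_url + key, function(d) {
        data[key] = d;
    }));
}
Promise.all(calls).then(function() {      // jqXHR objects are thenable
    do_something(data);
});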
Upvotes: 6
Reputation: 384
I had the same problem in a recent application I was developing.
In that case, the best way to handle the data and be sure your function only runs once the data has actually arrived is to put the call inside the $.getJSON success callback.
In your case it would look like this:
var data = {};
for (var i in funcs) {
    $.getJSON(base_url + funcs[i], function(d) {
        data[funcs[i]] = d;
        do_something(data);
    });
}
This way you can be sure the call is only made once the data has been received, so it doesn't throw errors.
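Note that written this way, do_something runs once per response rather than once after all of them have arrived. If it should only fire when every request has completed, a simple counter can gate the call; a rough sketch, assuming funcs is an array of URL fragments:

var data = {};
var remaining = funcs.length;               // assumes funcs is an array
for (var i = 0; i < funcs.length; i++) {
    (function(i) {                          // capture the current index
        $.getJSON(base_url + funcs[i], function(d) {
            data[funcs[i]] = d;
            if (--remaining === 0) {
                do_something(data);         // fires once, after the final response
            }
        });
    }(i));
}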
Upvotes: 0