dev7

Reputation: 6369

jQuery load unknown number of json files with error handling

I have an index.json file that returns a list of N additional JSON files that need to be loaded. They need to be loaded using a deferred approach so that, once they have all loaded, I can handle them all at once.

Each one of the additional JSON files may or may not exist on the server.

I am using the following approach to load the data, which works fine when all files actually exist on the server:

$.getJSON('index.json').then(function (response) {
    var files = response.files;
    $.when
     .apply(null, getDeferreds(files))
     .done(function() {
          //process the files
     });
});

function getDeferreds(files) {
    var deferreds = [], i;
    for (i in files) {
       //file type 1
       deferreds.push(
          $.getJSON(files[i] + '_1.json')
            .then(function (response) {
              //do something
             })
       );
      //file type 2
      deferreds.push(
          $.getJSON(files[i] + '_2.json')
            .then(function (response) {
              //do something
             })
       );
    }
    return deferreds;
}

This approach works great, HOWEVER... when any of the files is missing, e.g. somefile_2.json (sometimes the index is created before the file actually exists on the server), the whole process fails and none of the data is retrieved.

Within $.getJSON (or $.get) I can detect the error using the .fail() method, but that doesn't prevent the call from failing, and .done() is never called.

How would I refactor this so that the .done() callback always runs, even when some files are missing?

Upvotes: 0

Views: 127

Answers (2)

Roamer-1888

Reputation: 19288

What you have written looks good, provided do something includes a return statement to deliver data further down the promise chain (in the caller).
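As a minimal sketch of that chaining rule (shown here with native Promises, which jQuery 3+ Deferreds also follow): whatever a handler returns becomes the input of the next .then, and forgetting the return would make the next handler receive undefined.

```javascript
// Each handler's return value feeds the next .then in the chain.
Promise.resolve({ files: ["a.json", "b.json"] })
  .then(function (response) {
    return response.files;     // deliver the list down the chain
  })
  .then(function (files) {
    console.log(files.length); // 2
  });
```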

You might consider some tidying, and "cutting the cake" differently, by:

  • aggregating promises in the subfunction
  • performing all processing of the delivered data in the caller

It's a matter of preference, but I would probably choose to write:

$.getJSON('index.json').then(function (response) {
    return getFileData(response.files);
}).then(function(dataArray) {
    // `dataArray` is an array of data items
    // Do all processing here
});

function getFileData(files) {
    var promises = files.map(function(file) {
        return $.getJSON(file)
        .then(null, // No processing here
        function() {
            return $.when([]); // $.when(value) is a useful shorthand for $.Deferred().resolve(value)
        });
    });
    // By aggregating the promises here, caller(s) can be delivered an array of data items, wrapped in a single promise.
    return $.when.apply(null, promises).then(function() {
        return Array.prototype.slice.call(arguments); //convert function's `arguments` to a proper array
    });
}

Of course, if getFileData() isn't going to be called from elsewhere, you could do everything in the caller.
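For illustration, here is the same aggregation pattern sketched with native Promises (jQuery 3+ Deferreds are Promises/A+ compliant), using a hypothetical fetchJSON stand-in for $.getJSON; a missing file simply shows up as an empty array in the delivered results instead of failing the whole batch:

```javascript
// Hypothetical stand-in for $.getJSON: resolves for a known file,
// rejects (like a 404) for anything else.
function fetchJSON(file) {
  return file === "good.json"
    ? Promise.resolve({ name: file })
    : Promise.reject(new Error("404: " + file));
}

function getFileData(files) {
  var promises = files.map(function (file) {
    // Recover from a rejection by substituting an empty array,
    // so the aggregate never short-circuits on a missing file.
    return fetchJSON(file).then(null, function () { return []; });
  });
  return Promise.all(promises);
}

getFileData(["good.json", "missing.json"]).then(function (dataArray) {
  console.log(dataArray); // [ { name: 'good.json' }, [] ]
});
```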

Upvotes: 0

dev7

Reputation: 6369

It turns out the solution was simpler than I thought.

My guess was that when a call fails somewhere along the way, its deferred ends up rejected rather than resolved, so $.when bails out.

Simply returning a resolved deferred, return $.Deferred().resolve([]);, from the fail callback did the trick.
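In isolation, that recovery pattern looks like this (sketched with native Promises, whose chaining rules jQuery 3+ Deferreds share; brokenRequest is a hypothetical stand-in for a $.getJSON call against a missing file):

```javascript
// Hypothetical always-failing request, standing in for $.getJSON
// on a file that does not exist on the server.
function brokenRequest() {
  return Promise.reject(new Error("404 Not Found"));
}

brokenRequest()
  .then(function (response) {
    return response; // success path: pass the data through
  }, function () {
    return [];       // failure path: recover with an empty placeholder
  })
  .then(function (data) {
    // The chain continues on the success track with the placeholder.
    console.log(Array.isArray(data)); // true
  });
```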

Posting a full solution with error handling in case it helps someone:

$.getJSON('index.json').then(function (response) {
    var files = response.files;
    $.when
     .apply(null, getDeferreds(files))
     .done(function() {
          //process the files
     });
});

function getDeferreds(files) {
    var deferreds = [], i;
    for (i in files) {
         deferreds.push(
           $.getJSON(files[i])
                .then(function (response) {
                   //do something
                }, function() {
                    return $.Deferred().resolve([]);
                })
         );
    }
    return deferreds;
}

Upvotes: 1
