josiekre

Reputation: 803

Using queue() to load multiple files and assign to globals

I can't get multiple files to load and assign their data to globals. I've read up on similar questions and related examples, but I'm still having trouble.

var origins = [],
    geoJSON = {
      "type": "FeatureCollection",
      "features": []
    };

queue(1)
    .defer(d3.csv, "path_to.csv", function(d) {
        origins.push(d.o_geoid)
      })
    .defer(d3.json, "path_to.json", function(d) {
      // Limit GeoJSON features to those in CSV
      for (var i = d.features.length - 1; i >= 0; i--) {
        if($.inArray(d.features[i].properties['GEOID10'], origins) != -1) {
          geoJSON.features.push(d.features[i]);
        }
      }
    })
    .await(ready);

function ready() {
  console.log(geoJSON);
}

I'm happy to filter the geoJSON features within ready() if that works better, but I need it to happen before I start creating the map with

d3.g.selectAll("path")
    .data(geoJSON.features)
  .enter().append("path")
...

I'm assuming this has to do with callbacks and empty results, but I can't quite get it working. I have figured out that using .await(console.log(geoJSON)); outputs the correct object to the console, but the ready() function won't execute. Thanks for any help understanding and fixing this problem.

Upvotes: 3

Views: 3757

Answers (1)

Cool Blue

Reputation: 6476

Your question was already answered by Jason Davies' reply to the thread you linked, but anyway, here it is restated in terms of your exact example (note that queue(1) matters in this version: it runs the deferred tasks one at a time, so origins is fully populated by the CSV task before the JSON task filters against it)...

var origins = [],
    geoJSON = {
        "type": "FeatureCollection",
        "features": []
    };

queue(1)
    .defer(function(url, callback) {
        d3.csv(url, function(error, csvData) {
            if(!error) csvData.forEach(function(d) {origins.push(d.o_geoid)});
            callback(error, csvData);
        })
    }, "path_to.csv")
    .defer(function(url, callback) {
        d3.json(url, function(error, jsonData) {
            // Limit GeoJSON features to those in CSV
            for(var i = jsonData.features.length - 1; !error && i >= 0; i--) {
                if($.inArray(jsonData.features[i].properties['GEOID10'], origins) != -1) {
                    geoJSON.features.push(jsonData.features[i]);
                }
            }
            callback(error, jsonData);
        })
    }, "path_to.json")
    .await(ready);

function ready(error) {
    console.log(error ? "error: " + error.responseText : geoJSON);
}

I've never used queue, but if you think about it, it's pretty obvious from Jason's answer how it works.
The basic pattern is

queue()
    .defer(asynchRequest1, url1)
    .defer(asynchRequest2, url2)
    .await(callback)

function callback(error){
    console.log(error ? 
        "error: " + error.responseText : 
        "completed, " + (arguments.length - 1) + " objects retrieved"
    );
}  

The call signature for the first argument of .defer is function(url, callback), and the signature of the callback is function(error, result). The former follows d3 conventions (for which queue is obviously designed) and the latter is conventional asynchronous JavaScript (i.e. Node) practice.
To make this work, under the hood queue needs to provide the callback argument, and that needs to be a function that hits the await object, with the results of the async request as arguments, using the standard function(error, result) signature.
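
In other words, a deferred task is just a function with that shape. Here's a minimal sketch (loadCsv and the url are placeholder names for illustration, not part of your code):

function loadCsv(url, callback) {
    // queue invokes this with the url passed to .defer, plus its own callback as the last argument
    d3.csv(url, function(error, rows) {
        // report back to queue using the node-style (error, result) signature
        callback(error, rows);
    });
}

queue()
    .defer(loadCsv, "path_to.csv")
    .await(function(error, rows) {
        // rows is whatever loadCsv passed to callback
    });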

If you use the direct pattern, where the first argument of defer is d3.csv for example, then, after it completes, d3.csv will invoke the callback provided by queue, thereby connecting with the await object and passing its error/result state.
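
So in the direct pattern, these two defer calls do the same thing (a sketch to make the equivalence explicit; the url is a placeholder):

// direct: queue appends its own callback, so d3.csv is called as d3.csv(url, queueCallback)
queue()
    .defer(d3.csv, "path_to.csv")
    .await(ready);

// equivalent wrapper: queue's callback is handed straight through to d3.csv
queue()
    .defer(function(url, callback) { d3.csv(url, callback); }, "path_to.csv")
    .await(ready);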

In the indirect pattern described by Jason Davies, d3.csv is wrapped in another function with the same signature, which defers invocation of the internally provided queue callback until after d3.csv has completed and your post-processing is done.

Now that we understand what's going on, we can think about refactoring to make it cleaner. Perhaps like this...

queue(1)
    .defer(d3.csv, "path_to.csv")
    .defer(d3.json, "path_to.json")
    .await(ready);

function ready(error, csvData, jsonData) {
    if(error) return console.log("error: " + error.responseText);
    csvData.forEach(function(d) {origins.push(d.o_geoid)})
    // Limit GeoJSON features to those in CSV
    for(var i = jsonData.features.length - 1; i >= 0; i--) {
        if($.inArray(jsonData.features[i].properties['GEOID10'], origins) != -1) {
            geoJSON.features.push(jsonData.features[i]);
        }
    }
}

...which has exactly the same effect.
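
From there, the map drawing from your question can go at the end of ready(), once geoJSON.features is populated. A sketch, assuming you already have an svg selection and a d3.geo.path() generator named path (both placeholders here):

// inside ready(), after the filtering loop
svg.selectAll("path")
    .data(geoJSON.features)
  .enter().append("path")
    .attr("d", path);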

Upvotes: 4
