Etheryte

Reputation: 25318

Array.prototype.reduce vs a simple for loop for filtering and modifying data

"Reducing Filter and Map Down to Reduce" by Elijah Manor outlines the following use case for Array.prototype.reduce():

Given the following data (from the linked article):

var doctors = [
    { number: 1,  actor: "William Hartnell",      begin: 1963, end: 1966 },
    { number: 2,  actor: "Patrick Troughton",     begin: 1966, end: 1969 },
    { number: 3,  actor: "Jon Pertwee",           begin: 1970, end: 1974 },
    { number: 4,  actor: "Tom Baker",             begin: 1974, end: 1981 },
    { number: 5,  actor: "Peter Davison",         begin: 1982, end: 1984 },
    { number: 6,  actor: "Colin Baker",           begin: 1984, end: 1986 },
    { number: 7,  actor: "Sylvester McCoy",       begin: 1987, end: 1989 },
    { number: 8,  actor: "Paul McGann",           begin: 1996, end: 1996 },
    { number: 9,  actor: "Christopher Eccleston", begin: 2005, end: 2005 },
    { number: 10, actor: "David Tennant",         begin: 2005, end: 2010 },
    { number: 11, actor: "Matt Smith",            begin: 2010, end: 2013 },
    { number: 12, actor: "Peter Capaldi",         begin: 2013, end: 2013 }    
];

We can both modify and filter the data at the same time using .reduce() as shown below (also from the linked article):

doctors = doctors.reduce(function(memo, doctor) {
    if (doctor.begin > 2000) { // this serves as our `filter`
        memo.push({ // this serves as our `map`
            doctorNumber: "#" + doctor.number,
            playedBy: doctor.actor,
            yearsPlayed: doctor.end - doctor.begin + 1
        });
    }
    return memo;
}, []);

However, why would one prefer this over a simple for loop? I doubt it outperforms a simple iterating loop with similar contents (though I can't test that claim since jsPerf is down).

Is there any reason (performance or otherwise) to use the .reduce() implementation over a simple loop, other than syntax preference?
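
For reference, a minimal for-loop equivalent of the reduce above would look something like this (a sketch, not from the linked article; the filteredDoctors variable name is my own):

var filteredDoctors = [];
for (var i = 0; i < doctors.length; i++) {
    var doctor = doctors[i];
    if (doctor.begin > 2000) { // the `filter` step
        filteredDoctors.push({ // the `map` step
            doctorNumber: "#" + doctor.number,
            playedBy: doctor.actor,
            yearsPlayed: doctor.end - doctor.begin + 1
        });
    }
}
doctors = filteredDoctors; // mirrors the reassignment in the reduce version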

Upvotes: 0

Views: 1006

Answers (1)

deceze

Reputation: 522261

One objective benefit of array operations like map or reduce is the inherent variable scoping and reduction of boilerplate code. Compare:

var result = 0;
for (var i = 0, length = arr.length; i < length; i++) {
    result += arr[i];
}

// vs:

var result = arr.reduce(function (acc, val) { return acc + val; }, 0);

The giant loop declaration has nothing inherently to do with what you're trying to accomplish here; it's just boilerplate code. Further, your scope now has two additional variables, i and length, floating around that nobody asked for, and that may introduce non-obvious bugs if you're not careful.

The reduce code, on the other hand, contains just the minimum parts necessary to do what it needs to do. In something like CoffeeScript, its syntax can be reduced even further to arr.reduce (acc, val) -> acc + val, which is pretty darn concise. And even if you need to create additional variables inside the loop body or the callback, those won't clutter the outer scope.
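
For example (a sketch of the scoping point, not from the original answer; `doubled` is a hypothetical intermediate), any variable declared inside the callback stays local to it:

var result = arr.reduce(function (acc, val) {
    var doubled = val * 2; // hypothetical intermediate, scoped to the callback
    return acc + doubled;
}, 0);
// `doubled` is not visible out here, unlike `i` and `length` in the for-loop version.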

Second, reduce is a well-known paradigm and serves as a good statement of intent for that code block. You don't need to parse auxiliary loop variables and figure out what they're for; you can simply break the operation down into what array is being operated on (arr) and what result the callback expression (acc + val) produces, then extrapolate that over the entire array to know what the result will be.
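
To make the "extrapolate over the entire array" point concrete (a sketch, not from the original answer):

var arr = [1, 2, 3];
// The callback expression `acc + val` is applied left to right,
// starting from the initial value 0:
//   ((0 + 1) + 2) + 3 === 6
var result = arr.reduce(function (acc, val) { return acc + val; }, 0);
console.log(result); // logs 6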

Potentially, such operations can also be better optimised by a compiler under certain circumstances, but I believe this is mostly not seen in practice at this time.

Upvotes: 2
