Reputation: 100000
I have this simple situation where I want to filter and map to the same value, like so:
const files = results.filter(function(r) {
    return r.file;
})
.map(function(r) {
    return r.file;
});
To save lines of code, as well as increase performance, I am looking for:
const files = results.filterAndMap(function(r) {
    return r.file;
});
Does this exist, or should I write something myself? I have wanted such functionality in a few places, but never bothered to look into it before.
Upvotes: 6
Views: 3141
Reputation: 350147
Not faster and not in one method call, but you could avoid the repetition of r.file by swapping the map and filter calls, so that the filter only needs to check for a truthy value, for which you can use Boolean directly:
const files = results.map(r => r.file).filter(Boolean);
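For instance, with some made-up input (this sample data is mine, not from the question):

const results = [{file: 'a.txt'}, {}, {file: 'b.txt'}, {file: null}];

// Boolean acts as the predicate: it returns true only for truthy values
const files = results.map(r => r.file).filter(Boolean);

console.log(files); // ['a.txt', 'b.txt']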
To avoid an intermediate array (only useful when you have huge arrays and need to save on space), you could use iterator helper methods (introduced in ECMAScript 2025):
const files = results.values().map(r => r.file).filter(Boolean).toArray();
Here, map and filter are iterator helper methods.
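Iterator helpers are lazy: each element flows through the whole pipeline one at a time, so no intermediate array is ever built. A small sketch of that behaviour (assuming an ES2025-capable runtime; the sample data is mine):

const it = [0, 1, 2].values().map(x => x * 10).filter(Boolean);

console.log(it.next()); // { value: 10, done: false } – elements are only pulled on demand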
Upvotes: 0
Reputation: 25840
Arrays are iterable objects, and we can apply all needed operations in just one iteration.
The example below does such a single iteration, using the iter-ops library:
import {pipe, filter, map} from 'iter-ops';

const i = pipe(
    results,
    filter(r => !!r.file),
    map(m => m.file)
);

console.log('files:', [...i]);
Upvotes: 0
Reputation: 79
const file = (array) => {
    // keep only the file values of entries that have one
    return array.reduce((acc, curr) => curr.file ? acc.concat(curr.file) : acc, []);
};
Process:
acc is initialized as [] (an empty array); see the reduce docs.
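A usage sketch with made-up data (the sample array is mine, and it relies on the concat of curr.file shown above):

const results = [{file: 'a'}, {}, {file: 'b'}];

console.log(file(results)); // ['a', 'b']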
Upvotes: -2
Reputation: 135197
Transducers
In its most generic form, the answer to your question lies in transducers. But before we go too abstract, let's see some basics first – below, we implement a couple of transducers, mapReduce, filterReduce, and tapReduce; you can add any others that you need.
const mapReduce = map => reduce =>
    (acc, x) => reduce (acc, map (x))

const filterReduce = filter => reduce =>
    (acc, x) => filter (x) ? reduce (acc, x) : acc

const tapReduce = tap => reduce =>
    (acc, x) => (tap (x), reduce (acc, x))

const tcomp = (f, g) =>
    k => f (g (k))

const concat = (xs, ys) =>
    xs.concat(ys)

const transduce = (...ts) => xs =>
    xs.reduce (ts.reduce (tcomp, k => k) (concat), [])
const main =
    transduce (
        tapReduce (x => console.log('with:', x)),
        filterReduce (x => x.file),
        tapReduce (x => console.log('has file:', x.file)),
        mapReduce (x => x.file),
        tapReduce (x => console.log('final:', x)))

const data =
    [{file: 1}, {file: undefined}, {}, {file: 2}]

console.log (main (data))
// with: { file: 1 }
// has file: 1
// final: 1
// with: { file: undefined }
// with: {}
// with: { file: 2 }
// has file: 2
// final: 2
// => [ 1, 2 ]
Chainable API
Maybe you're satisfied with the simplicity of the code but you're unhappy with the somewhat unconventional API. If you want to preserve the ability to chain .map, .filter, and .whatever calls without adding undue iterations, we can make a generic interface for transducing and build our chainable API on top of that – this answer is adapted from the link I shared above and other answers I have written about transducers.
// Trans Monoid
const Trans = f => ({
    runTrans: f,
    concat: ({runTrans: g}) =>
        Trans (k => f (g (k)))
})

Trans.empty = () =>
    Trans (k => k)

// transducer "primitives"
const mapper = f =>
    Trans (k => (acc, x) => k (acc, f (x)))

const filterer = f =>
    Trans (k => (acc, x) => f (x) ? k (acc, x) : acc)

const tapper = f =>
    Trans (k => (acc, x) => (f (x), k (acc, x)))

// chainable API
const Transduce = (t = Trans.empty ()) => ({
    map: f =>
        Transduce (t.concat (mapper (f))),
    filter: f =>
        Transduce (t.concat (filterer (f))),
    tap: f =>
        Transduce (t.concat (tapper (f))),
    run: xs =>
        xs.reduce (t.runTrans ((xs, ys) => xs.concat(ys)), [])
})
// demo
const main = data =>
    Transduce()
        .tap (x => console.log('with:', x))
        .filter (x => x.file)
        .tap (x => console.log('has file:', x.file))
        .map (x => x.file)
        .tap (x => console.log('final:', x))
        .run (data)

const data =
    [{file: 1}, {file: undefined}, {}, {file: 2}]

console.log (main (data))
// with: { file: 1 }
// has file: 1
// final: 1
// with: { file: undefined }
// with: {}
// with: { file: 2 }
// has file: 2
// final: 2
// => [ 1, 2 ]
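The payoff of this design is that new chainable methods are one-liners. As a sketch (my own addition, not part of the original answer), here is a hypothetical reject – filter with the predicate negated – built from the same primitives:

const rejecter = f =>
    Trans (k => (acc, x) => f (x) ? acc : k (acc, x))

// and inside Transduce, alongside map/filter/tap:
//   reject: f =>
//       Transduce (t.concat (rejecter (f)))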
Chainable API, take 2
As an exercise to implement the chaining API with as little dependency ceremony as possible, I rewrote the code snippet without relying upon the Trans monoid implementation or the primitive transducers mapper, filterer, etc – thanks for the comment @ftor.
This is a definite downgrade in terms of overall readability. We lost the ability to just look at it and understand what was happening. We also lost the monoid interface, which made it easy for us to reason about our transducers in other expressions. A big gain here, though, is that the definition of Transduce is contained within 10 lines of source code, compared to 28 before – so while the expressions are more complex, you can probably finish reading the entire definition before your brain starts struggling.
// chainable API only (no external dependencies)
const Transduce = (t = k => k) => ({
    map: f =>
        Transduce (k => t ((acc, x) => k (acc, f (x)))),
    filter: f =>
        Transduce (k => t ((acc, x) => f (x) ? k (acc, x) : acc)),
    tap: f =>
        Transduce (k => t ((acc, x) => (f (x), k (acc, x)))),
    run: xs =>
        xs.reduce (t ((xs, ys) => xs.concat(ys)), [])
})
// demo (this stays the same)
const main = data =>
    Transduce()
        .tap (x => console.log('with:', x))
        .filter (x => x.file)
        .tap (x => console.log('has file:', x.file))
        .map (x => x.file)
        .tap (x => console.log('final:', x))
        .run (data)

const data =
    [{file: 1}, {file: undefined}, {}, {file: 2}]

console.log (main (data))
// with: { file: 1 }
// has file: 1
// final: 1
// with: { file: undefined }
// with: {}
// with: { file: 2 }
// has file: 2
// final: 2
// => [ 1, 2 ]
Performance
When it comes to speed, no functional variant of this is ever going to beat a static for loop which combines all of your program statements in a single loop body. However, the transducers above do have the potential to be faster than a series of .map/.filter/.whatever calls where multiple iterations through a large data set would be expensive.
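For reference, the kind of single loop that paragraph alludes to might look like this (a sketch, assuming the same data shape as the demos above):

const files = []
for (const x of data)
    if (x.file)
        files.push (x.file)

console.log (files) // => [ 1, 2 ]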
Coding style & implementation
The very essence of the transducer lies in mapReduce, which is why I chose to introduce it first. If you can understand how to take multiple mapReduce calls and sequence them together, you'll understand transducers.
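To make that concrete, here is a minimal sketch (my own illustration, reusing mapReduce, filterReduce, and concat from the first snippet) of sequencing two reducers by hand, without tcomp:

const step =
    filterReduce (x => x > 1) (mapReduce (x => x * 10) (concat))

console.log ([1, 2, 3].reduce (step, [])) // => [ 20, 30 ]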
Of course you can implement transducers in any number of ways, but I found Brian's approach the most useful as it encodes transducers as a monoid – having a monoid allows us to make all sorts of convenient assumptions about it. And once we transduce an Array (one type of monoid), you might wonder how you can transduce any other monoid... in such a case, get reading that article!
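For instance, because Trans is a monoid, a whole list of transducers collapses with one generic fold (a sketch against the Trans type above):

const fold = ts =>
    ts.reduce ((acc, t) => acc.concat (t), Trans.empty ())

// fold ([ mapper (f), filterer (g) ]) is itself a Trans, ready to compose further or run via runTrans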
Upvotes: 11
Reputation: 12709
If you really need to do it in 1 function, you'll need to use reduce, like this:
results.reduce(
    // add the file name to the accumulator if it exists
    (acc, result) => result.file ? acc.concat([result.file]) : acc,
    // pass an empty array as the initial accumulator value
    []
)
And if you need to squeeze out more performance, you can change concat to push and return the original accumulator array to avoid creating extra arrays.
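That push-based variant might look like this (a sketch of the suggestion, not code from the original answer):

results.reduce((acc, result) => {
    // mutate and return the same accumulator – no intermediate arrays are created
    if (result.file) acc.push(result.file)
    return acc
}, [])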
However, the fastest solution is probably a good old for loop, which avoids all the function calls and stack frames:
const files = []
for (var i = 0; i < results.length; i++) {
    var file = results[i].file
    if (file) files.push(file)
}
But I think the filter/map approach is much more expressive and readable.
Upvotes: 9
Reputation: 8060
Why not just forEach?
const files = [];
results.forEach(function(r) {
    if (r.file) {
        files.push(r.file);
    }
});
If this is not fast enough, you can use fast.js and make some other micro-optimizations:
const files = [];
const length = results.length;
for (var i = 0; i < length; i++) {
    if (results[i].file) {
        files[files.length] = results[i].file;
    }
}
Upvotes: 2
Reputation: 7438
To increase performance, you have to measure which solution is actually faster. Let's play for a moment: https://jsperf.com/filter-than-map-or-reduce/1
Any other test cases are welcome.
If you want to play with the benchmark against Node.js (remember to npm i benchmark):
var suite = new (require('benchmark')).Suite

function getSampleInput() {
    return [{file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'}, {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'}, {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'}, {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'}, {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'}, {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'}, {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'}, {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'}]
}

// author https://stackoverflow.com/users/3716153/gaafar
function reduce(results) {
    return results.reduce(
        (acc, result) => result.file ? acc.concat([result.file]) : acc,
        []
    )
}

// author https://stackoverflow.com/users/1223975/alexander-mills
function filterThanMap(results) {
    return results.filter(function(r) {
        return r.file;
    })
    .map(function(r) {
        return r.file;
    });
}

// author https://stackoverflow.com/users/5361130/ponury-kostek
function forEach(results) {
    const files = [];
    results.forEach(function(r) {
        if (r.file) files.push(r.file);
    });
    return files
}

suite
    .add('filterThanMap', function() { filterThanMap(getSampleInput()) })
    .add('reduce', function() { reduce(getSampleInput()) })
    .add('forEach', function() { forEach(getSampleInput()) })
    .on('complete', function() {
        console.log('results:')
        this.forEach(function(result) {
            console.log(result.name, result.count, result.times.elapsed)
        })
        console.log('the fastest is', this.filter('fastest').map('name')[0])
    })
    .run()
Upvotes: 3
Reputation: 386560
You could concat either the value of o.file or an empty array onto the result:
results.reduce((r, o) => r.concat(o.file || []), []);
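This works because concat flattens array arguments one level, so concatenating an empty array contributes nothing:

[1].concat([]);      // [1]
[1].concat('a.txt'); // [1, 'a.txt']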
Upvotes: 1
Reputation: 1
You can use Array.prototype.reduce():
const results = [{file: {file: 1}}, {notfile: {file: 1}}];

const files = results.reduce(function(arr, r) {
    return r.file ? [...arr, r.file.file] : arr;
}, []);

console.log(files); // [1]
Upvotes: 0