Reputation: 386
I am parsing data like this:
getData()
.filter(fn)
.filter(fn2)
.filter(fn3)
.map(fn4)
in which the filters are conceptually separated and do different operations.
For debugging purposes, is there a JavaScript library or a way to wrap promises such that I can do this:
getData()
.filter(fn)
.then((result) => { log(result.count); return result })
.filter(fn2)
.then(debugFn) // extra chained debug step (not iterating through arr)
.filter(fn3)
.map(fn4)
Or is this an anti-pattern?
Upvotes: 0
Views: 525
Reputation: 2462
You can do what you want pretty easily with rubico, a functional programming library built around promises:
import { pipe, tap, map, filter, transform } from 'rubico'

const pipeline = pipe([
  getData,
  filter(fn),
  tap((result) => { log(result.count) }),
  filter(fn2),
  debugFn,
  filter(fn3),
  map(fn4),
])
You can use the above pipeline as a transducer (leaving debugFn out for now, since I'm not sure exactly what it does) with rubico's transform:
transform(pipeline, [])
You are left with an efficient transformation pipeline based on transduction.
Upvotes: 0
Reputation: 18961
EDIT
After some thought, I'm convinced that the best answer to this question has been given by V-for-Vaggelis: just use breakpoints.
If you do proper function composition, then inserting a few tap calls in your pipeline is cheap, easy and non-intrusive, but it won't give you as much information as a breakpoint (and knowing how to use a debugger to step through your code) would.
Applying a function to x and returning x as is, no matter what, already has a name: tap. In libraries like ramda.js, it is described as follows:
Runs the given function with the supplied object, then returns the object.
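In plain JavaScript, tap boils down to a one-liner (a minimal sketch, not Ramda's actual implementation):
// run a side effect with the value, then return the value unchanged
const tap = (fn) => (x) => { fn(x); return x; };

tap(x => console.log('saw', x))([1, 2, 3]); // logs "saw [1, 2, 3]" and returns [1, 2, 3]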
Since filter, map, ... all return new instances, you probably have no other choice than to extend the prototype.
We can find ways to do it in a controlled manner, though. This is what I'd suggest:
const debug = (xs) => {
  // temporarily add a chainable `tap` to all arrays
  Array.prototype.tap = function (fn) {
    fn(this);
    return this;
  };
  // `debugEnd` removes both helpers and returns the array as is
  Array.prototype.debugEnd = function () {
    delete Array.prototype.tap;
    delete Array.prototype.debugEnd;
    return this;
  };
  return xs;
};
const a = [1, 2, 3];

const b =
  debug(a)
    .tap(x => console.log('Step 1', x))
    .filter(x => x % 2 === 0)
    .tap(x => console.log('Step 2', x))
    .map(x => x * 10)
    .tap(x => console.log('Step 3', x))
    .debugEnd();

console.log(b);

try {
  b.tap(x => console.log('WAT?!'));
} catch (e) {
  console.log('Array prototype is "clean"');
}
If you can afford a library like Ramda, the safest way (IMHO) would be to introduce tap in your pipeline:
const a = [1, 2, 3];

const transform =
  pipe(
    tap(x => console.log('Step 1', x))
    , filter(x => x % 2 === 0)
    , tap(x => console.log('Step 2', x))
    , map(x => x * 10)
    , tap(x => console.log('Step 3', x))
  );

console.log(transform(a));
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.min.js"></script>
<script>const {pipe, filter, map, tap} = R;</script>
Upvotes: 2
Reputation: 14199
The main issue here is that you're relying on the chaining pattern, which doesn't scale very well.
a.method().method() only lets you apply functions (methods) that are supported by the prototype of the given context (a in this case).
I'd rather suggest you take a look at function composition (pipe vs. compose). This design pattern doesn't depend on a specific context, so you can provide behaviour externally.
const asyncPipe = R.pipeWith(R.then);

const fetchWarriors = (length) => Promise.resolve(
  Array.from({ length }, (_, n) => n),
);

const battle = asyncPipe([
  fetchWarriors,
  R.filter(n => n % 2 === 0),
  R.filter(n => n / 5 < 30),
  R.map(n => n ** n),
  R.take(4),
  R.tap(list => console.log('survivors are', list)),
]);

/* const survivors = await */ battle(100);
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.js"></script>
As you can see from the snippet above, the Array type doesn't really need to implement everything itself...
Upvotes: 2
Reputation: 18240
If you don't want to overwrite the prototype of either Array or Promise, you could write a wrapper function that takes a promise and gives you back a modified promise with the additional features you want. However, the problem here is that you would need to import every method that might be used, which is bad for tree-shaking.
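A rough sketch of such a wrapper (hypothetical names, not from any library); note how every chainable method has to be spelled out by hand, which is exactly the enumeration/tree-shaking problem mentioned above:
const wrapP = (promise) => ({
  filter: (fn) => wrapP(promise.then(arr => arr.filter(fn))),
  map: (fn) => wrapP(promise.then(arr => arr.map(fn))),
  tap: (fn) => wrapP(promise.then(arr => { fn(arr); return arr; })), // debug step
  then: (...args) => promise.then(...args), // keeps the wrapper await-able
});

// usage
wrapP(Promise.resolve([1, 2, 3, 4]))
  .filter(n => n % 2 === 0)
  .tap(arr => console.log('after filter:', arr)) // [2, 4]
  .map(n => n * 10)
  .then(console.log); // [20, 40]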
The pipeline operator proposal tries to address this problem (it is still a TC39 proposal, not part of any released ECMAScript version yet).
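For illustration only, under the F#-style variant of that proposal the chain would look roughly like this (hypothetical, and not runnable without a transpiler plugin):
// fn and fn2 are the question's predicates; `data` is assumed to be the resolved array
const result = data
  |> (arr => arr.filter(fn))
  |> (arr => arr.filter(fn2));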
Until then, things like lodash's _.flow remain, which allow you to do this:
// assumes lodash/fp, whose methods are auto-curried
_.flow([
  _.filter(fn),
  _.filter(fn2),
])(data);
Now you basically want this in an async way. This should be pretty easy to accomplish with tools like Ramda.
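A minimal async sketch (assuming Ramda, as used elsewhere in this thread; getData, fn and fn2 are the question's names):
// R.pipeWith(R.then) awaits each step before passing the result on
// (newer Ramda versions call this R.andThen)
const run = R.pipeWith(R.then, [
  getData,                                         // may return a promise of an array
  R.filter(fn),
  R.tap(arr => console.log('count:', arr.length)), // extra debug step
  R.filter(fn2),
]);

run().then(console.log);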
Upvotes: 1
Reputation: 809
You could monkey-patch Array.prototype, but it's not recommended.
As long as you only use it for debugging:
Array.prototype.debug = function (fn) {
  fn(this);
  return this;
};

// example usage
[1, 2, 3].map(n => n * 2).debug(console.log).map(n => n * 3);
It's not a promise (you probably don't need async here anyway), but it gives you .then-like behaviour.
Upvotes: 2
Reputation: 107
I believe one could use breakpoints to debug something like this.
Upvotes: 1
Reputation: 414086
Adding functions to built-in object prototypes is controversial, so many people might advise against it. However, if you really want to be able to do what you're asking, that's probably the only option:
Object.defineProperty(Array.prototype, "examine", {
  value: function(callback) {
    callback.call(this, this);
    return this;
  }
});
Then you can put .examine(debugFn) calls in the chain of .filter() calls, as you described.
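A hypothetical usage sketch, reusing the names from the question (and assuming getData() returns a plain array):
getData()
  .filter(fn)
  .examine(result => console.log('after fn:', result.length))
  .filter(fn2)
  .examine(debugFn)
  .filter(fn3)
  .map(fn4);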
Upvotes: 2