Reputation: 1333
I'm studying the annotated source of Underscore, and I have a quick question about its usage of apply in the _.defer function. In particular, _.defer calls apply on the _ object, while _.delay calls apply on null. Why can't we use null instead of _ in _.defer, since we don't actually use 'this' in either function?
var slice = Array.prototype.slice;

// Delays a function for the given number of milliseconds,
// and then calls it with the arguments supplied.
_.delay = function(func, wait) {
  var args = slice.call(arguments, 2);
  return setTimeout(function() {
    return func.apply(null, args);
  }, wait);
};

// Defers a function, scheduling it to run after the current call stack has cleared.
_.defer = function(func) {
  return _.delay.apply(_, [func, 1].concat(slice.call(arguments, 1)));
};
Upvotes: 1
Views: 64
Reputation: 9706
In delay(), calling apply() executes a user-supplied function. There is no obvious or documented contract for what 'this' should be inside that function, so passing null as the 'this' reference makes sense: it simplifies debugging for the end user, because errors are easy to investigate.
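To see the debugging benefit concretely, here is a small sketch (not from the underscore source; badCallback is a made-up name) showing that in strict mode a callback invoked via apply(null, ...) keeps 'this' as null, so code that wrongly assumes an object receiver fails loudly with a TypeError instead of silently reading properties off the global object:

```javascript
"use strict";

// A callback that wrongly assumes `this` is an object.
function badCallback() {
  return this.count + 1;
}

var threw = false;
try {
  // In strict mode, apply(null, ...) leaves `this` as null,
  // so the property access below throws immediately.
  badCallback.apply(null, []);
} catch (e) {
  threw = e instanceof TypeError; // clear, easy-to-trace failure
}
```

(In non-strict code, 'this' would instead be coerced to the global object, and the bug could surface much later as a silent NaN rather than an immediate error.)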
On the other hand, in defer(), apply() is used to call another Underscore API method, which is well known and under the developer's control. Using the underscore object as 'this' makes sense as a common rule for the whole library: it is intuitive, it is short, and it provides flexibility if it is ever needed later.
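The key point can be demonstrated directly: the first argument to apply() only matters when the called function actually reads 'this'. Since neither _.delay nor the deferred function does so here, apply(_, ...) and apply(null, ...) behave identically, and passing _ is purely a stylistic convention. A minimal sketch (lib is a hypothetical stand-in for the _ object):

```javascript
"use strict";

var lib = { name: "fake-underscore" }; // stand-in for the `_` object

// A function that ignores `this`: the apply() receiver is irrelevant.
function add(a, b) {
  return a + b;
}
var withNull = add.apply(null, [2, 3]); // 5
var withLib = add.apply(lib, [2, 3]);   // 5, same result

// A function that does read `this`: now the receiver matters.
function whoAmI() {
  return this && this.name;
}
var asLib = whoAmI.apply(lib);   // "fake-underscore"
var asNull = whoAmI.apply(null); // null (strict mode keeps `this` as null)
```

So the answer to the question is: you could use null in _.defer and nothing would break; _ is chosen as a library-wide convention, not out of necessity.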
Upvotes: 1