Reputation: 1782
It's been a few days since I learned about generator functions in JavaScript, which were introduced in ES6.
Generators are explained in many places, but what I find most intriguing is that everywhere it is said that
A generator function is a way to write async code synchronously.
The question I want to raise is: "Why was there such a need to introduce a completely different programming strategy just for this?"
I understand that the asynchronous nature of JS code makes it difficult for a newbie to understand and debug, but does that require a complete change in coding style altogether?
I may be wrong, or may not completely understand the concept behind its introduction, but being inquisitive about all of this is what compelled me to ask this question.
Upvotes: 7
Views: 1586
Reputation: 15945
Actually, generator functions are not so much about asynchrony; they are about functions that can be interrupted. How the flow is interrupted is determined by the caller of the generator, through the iterator it returns -- more explanation below.
function* ab3() {
  console.log(1);
  var c = yield 2;  // execution pauses here; the value passed to the next call of next() becomes c
  console.log(2);
  return 1 + c;
  console.log(3);   // never reached: the generator has already returned
}
A generator function is a function whose execution can be cut off partway through.
Calling a generator function does not run its body; it returns an iterator, and execution only starts when iterator.next() is called.
The first call to next() runs the generator's statements up to the first yield and returns the yielded value, so here it logs 1 and returns { value: 2, done: false }.
The second call, iterator.next(3) in this example, resumes execution: the 3 passed to next() becomes the value of the paused yield expression and is stored in c (var c = yield 2;), execution continues through console.log(2) to the return statement, and the call returns { value: 4, done: true } because the end of the generator function has been reached.
Any further call (the 3rd here) to iterator.next() returns { value: undefined, done: true }, since the generator function has nothing left to produce.
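To make that concrete, here is the same sequence of calls against the ab3 generator above and what each one produces:
var iterator = ab3();          // nothing runs yet; we only get an iterator back
console.log(iterator.next());  // logs 1, then prints { value: 2, done: false }
console.log(iterator.next(3)); // logs 2, then prints { value: 4, done: true }
console.log(iterator.next());  // prints { value: undefined, done: true }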
Upvotes: 0
Reputation: 754
Scenario 1 (Async): You might have heard of the importance of writing "non-blocking" JavaScript. When we do an I/O operation, we use callbacks or promises to keep the code non-blocking.
Scenario 2 (Sync): running an infinite loop, e.g. node -e 'while(true) {}', blocks the event loop completely and can make the process (and possibly your machine) unresponsive.
With all of this in mind, ES6 generators allow us to effectively "pause" execution in the middle of a function and resume it at some point in the future, which is what makes it possible to write async code in a synchronous style; a sketch follows.
Use case: imagine you need to work with an infinite sequence of values. An array can't hold infinitely many items, but an ES6 generator function can describe the sequence lazily, as sketched below.
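To give a flavour of that "async code synchronously" idea, here is a minimal sketch (not any particular library's API) of a runner that drives a generator yielding promises, so the generator body reads top to bottom like synchronous code:
function run(genFn) {
  var it = genFn();
  function step(value) {
    var result = it.next(value);                      // resume the generator with the resolved value
    if (result.done) return Promise.resolve(result.value);
    return Promise.resolve(result.value).then(step);  // wait for the yielded promise, then resume
  }
  return step();
}

run(function* () {
  var user = yield Promise.resolve({ name: 'Ada' });  // reads like synchronous code, resolves asynchronously
  console.log('got user:', user.name);                // "got user: Ada"
});
This is essentially the pattern that libraries such as co popularised before async/await existed.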
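A minimal sketch of such a generator (the name generateRandoms is just illustrative, matching the snippet below) could be:
function* generateRandoms() {
  // describes an infinite sequence; nothing executes until a value is requested
  while (true) {
    yield Math.random();
  }
}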
var iterator = generateRandoms(); // a generator yielding an infinite sequence of random numbers; calling it returns an iterator, not the values
// each call to iterator.next() resumes the generator just long enough to yield the next value in the sequence
console.log(iterator.next()); // { value: 0.4900301224552095, done: false }
console.log(iterator.next()); // { value: 0.8244022422935814, done: false }
And as far as complexity is concerned, it is new syntax, but it does not take long to grasp.
For further reading:
Upvotes: 2
Reputation: 155418
Because closures are less convenient for simple iteration; simplifying syntax for reasonably common tasks can be worth it, even if the language supported the same pattern before. Just compare:
function chain() {
  var args = Array.from(arguments);
  return function() {
    if (args.length === 0) return undefined; // Or some other sentinel
    var nextval = args[0].shift(); // Destructive to avoid copies or more closure vars
    if (args[0].length === 0) args.shift();
    return nextval;
  };
}
var x;
// a, b and c must be indexable, e.g. Arrays; we can't handle other closures without
// requiring some API specific protocol for generation
for (var nextchain = chain(a, b, c); (x = nextchain()) !== undefined;) {
  // do stuff with current value
}
to:
function* chain() {
  for (var i = 0; i < arguments.length; ++i)
    yield* arguments[i];
}
// a, b and c can be any iterable object; yield* can handle
// strings, Arrays, other generators, etc., all with no special handling
for (var x of chain(a, b, c)) {
  // do stuff with current value
}
Sure, the savings in lines of code aren't incredible. It's mostly just reducing boilerplate and unnecessary names, removing the need to deal with closures for simple cases, and, with the for...of syntax, providing a common mechanism to iterate arbitrary iterable things, rather than requiring the user to explicitly construct the initial closure and advance it by name. But if the pattern is common enough, that's useful enough.
As noted in comments, a, b and c must be Array-like for the closure-based approach (or you'd use a different closure-based approach where the writer of chain imposes arbitrary requirements on the stuff passed to it, with special cases for Array-like stuff vs. generator-like closures), and processing is destructive (you'd need to add more closure state or make copies to make it non-destructive, making it more complex or slower); for the generator-based approach with yield*, no special cases are required. This makes generators composable without complex specs; they can build on one another easily.
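For instance, a quick sketch of that composability using the generator-based chain above (the values are just illustrative):
var inner = chain([1, 2], 'ab');    // a generator object, itself iterable
for (var x of chain(inner, [3, 4])) {
  console.log(x);                   // 1, 2, 'a', 'b', 3, 4
}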
Upvotes: 4