Marco Righele

Reputation: 2852

ES6 generators: transforming callbacks to iterators

I'm experimenting with ES6 generators with the help of Babel, and I'm having trouble understanding how (or whether!) I can effectively use callback-based async functions to produce an iterator.

Let's say I want to be able to write a function that takes a number of URLs, downloads them asynchronously, and returns each one as soon as it has been downloaded. I would like to be able to write something like the following:

let urls = ['http://www.google.com', 'http://www.stackoverflow.com' ];
for (const {url, data} of downloadUrls(urls)) {
    console.log("Content of url", url, "is");
    console.log(data);
}

How can I implement downloadUrls ? Ideally I would like to be able to write the following:

var downloadUrls = function*(urls) {
    for( let url of urls ) {
        $.ajax(url).done( function(data) {
            yield data;
        });
    }
};

This of course doesn't work, since `yield` is being invoked inside a callback and not directly inside the generator. I can find many examples online of people trying the same thing; the solutions are either not very transparent, require enabling browser/node flags, or use node-specific features/libraries. The library closest to what I need seems to be task.js, but I'm unable to get even the simplest example to run on current Chrome.

Is there a way to get the intended behaviour using standard and current features (by "current" I mean usable with transpilers like Babel, but without the need to enable extra flags in the browser), or do I have to wait for async/await?

Upvotes: 8

Views: 1140

Answers (3)

Dan Dascalescu

Reputation: 151936

2019 update

Yielding via callbacks is actually pretty simple. Since you can only call yield directly from the generator function* where it appears (and not from callbacks), you need to yield a Promise instead, which will be resolved from the callback:

async function* fetchUrls(urls) {
  for (const url of urls)
    yield new Promise((resolve, reject) => {
      // pass reject as the second handler so failures reach the consumer
      // instead of leaving the promise forever pending
      fetch(url, { mode: 'no-cors' }).then(response => resolve(response.status), reject);
    });
}

(async function main() {
  const urls = ['https://www.ietf.org/rfc/rfc2616.txt', 'https://www.w3.org/TR/PNG/iso_8859-1.txt'];
  // for-await-of syntax
  for await (const status of fetchUrls(urls))
    console.log(status);
}());

If the example doesn't work in the browser (it may return 0 instead of 200 due to Cross-Origin Read Blocking), try it live on repl.it.
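One practical consequence of yielding promises this way: a rejection surfaces inside the `for await` loop itself, so it can be caught with an ordinary try/catch around the loop. A minimal self-contained sketch (the `setTimeout`-based tasks here are stand-ins for real network requests, not part of the original answer):

```javascript
// An async generator yielding promises; the third one rejects.
async function* fetchAll(tasks) {
  for (const task of tasks) {
    yield new Promise((resolve, reject) => {
      setTimeout(
        () => (task.ok ? resolve(task.value) : reject(new Error(task.value))),
        10
      );
    });
  }
}

(async function main() {
  const tasks = [
    { ok: true, value: 'first' },
    { ok: true, value: 'second' },
    { ok: false, value: 'network down' },
  ];
  try {
    for await (const value of fetchAll(tasks)) {
      console.log('got', value);
    }
  } catch (err) {
    // the loop stops at the first rejection
    console.log('caught:', err.message);
  }
})();
```

The loop logs the first two values and then jumps to the catch block when the third promise rejects.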

Upvotes: 2

bruceceng

Reputation: 2182

Here is a clean way to use a generator/iterator to flatten asynchronous code, which works for me in Node.js:

var asyncProcedureGenerator1 = function*() {
    var it = yield(0); //get a reference to the iterator
    try {
        var a = yield (asyncPart1.bind(it))(0); //call the function, set this = it
        var b = yield (asyncPart2.bind(it))(a);
        var c = yield (asyncPart3.bind(it))(b);
        console.log("c = ", c);
    }
    catch(err)
    {
        console.log("Something went wrong: ", err);
    }
};

var runAsyncGenerator = function(generator) {
    var asyncProcedureIterator = generator(); //create an iterator
    asyncProcedureIterator.next(); //start the iterator
    asyncProcedureIterator.next(asyncProcedureIterator); //pass a reference of the iterator to itself
}

var asyncPart1 = function(param1) {
    var it = this; //the iterator will be equal to this.
    console.log("Starting asyncPart1 with param1 = ", param1);
    setTimeout(function() {
        console.log("Done with asyncPart1");
        var returnValue = 42 + param1;
        console.log("asyncPart1 returned ", returnValue);
        it.next(returnValue); //when we are done, resume the iterator which has yielded to us.
    },2000);
};

var asyncPart2 = function(param1) {
    var it = this; //the iterator will be equal to this.
    console.log("Starting asyncPart2 with param1 = ", param1);
    setTimeout(function() {
        console.log("Done with asyncPart2");
        var returnValue = param1 / 2;
        console.log("asyncPart2 returned ", returnValue);
        //it.throw("Uh oh.");

        it.next(returnValue);

    },2000);
};

var asyncPart3 = function(param1) {
    var it = this; //the iterator will be equal to this.
    console.log("Starting asyncPart3 with param1 = ", param1);
    setTimeout(function() {
        console.log("Done with asyncPart3");
        var returnValue = param1 / 3;
        console.log("asyncPart3 returned ", returnValue);
        it.next(returnValue);
    },2000);
};

runAsyncGenerator(asyncProcedureGenerator1); 

The idea is to run the generator, create an iterator, and then pass a reference to that iterator to itself.

Then the generator can call asynchronous functions (with yield), passing them a reference to itself, so that each function can either resume execution on success by calling iterator.next(result) or signal failure by calling iterator.throw(error).

I just came up with this pattern, so there may be some gotchas I haven't found yet, but it seems to work and allows very flat code with minimal additions.

Upvotes: 0

Bergi

Reputation: 664307

Is there a way to get the intended behaviour using standard and current features

Yes, use promises and generators. Many promise libraries, and some standalone ones, feature the use of generator "coroutines".

But notice that you cannot mix iteration with asynchrony - you can use generators for one or the other, not both at once. Your example seems to confuse them a bit: it looks like you expect the for ( {url, data} of downloadUrls(urls) ) loop to work synchronously, which cannot work.
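The "coroutine" pattern those libraries implement can be sketched in a few lines: a runner drives a generator that yields promises, resuming it with each resolved value. This is a minimal illustration of the pattern (the name `spawn` is illustrative, not any particular library's API):

```javascript
// Minimal coroutine runner: drives a generator that yields promises,
// resuming it with each resolved value and throwing rejections back in.
function spawn(genFn) {
  return new Promise((resolve, reject) => {
    const gen = genFn();
    function step(method, arg) {
      let result;
      try {
        result = gen[method](arg); // advance the generator
      } catch (err) {
        return reject(err); // uncaught error inside the generator
      }
      if (result.done) return resolve(result.value);
      // wait for the yielded promise, then resume (or throw) accordingly
      Promise.resolve(result.value).then(
        value => step('next', value),
        err => step('throw', err)
      );
    }
    step('next', undefined);
  });
}

// Usage: inside the generator, `yield promise` reads like `await promise`.
spawn(function* () {
  const a = yield Promise.resolve(2);
  const b = yield Promise.resolve(3);
  return a + b;
}).then(sum => console.log(sum)); // logs 5
```

This is essentially what co, Bluebird's `coroutine`, and task.js provide, and it is also the desugaring that transpilers use for async/await.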

do I have to wait for async/await?

No, you don't have to wait, Babel already supports them!
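With transpiled async/await, the question's downloadUrls idea can be written as a plain async function that hands each result to a callback as soon as its download settles (a callback rather than an iterator since, as noted above, a synchronous for-of loop cannot deliver async results). A sketch, where `fetchUrl` and `onEach` are illustrative parameter names standing in for any promise-returning download function and per-result handler:

```javascript
// Sketch: start all downloads in parallel and report each {url, data}
// pair as soon as that download completes.
async function downloadUrls(urls, fetchUrl, onEach) {
  await Promise.all(urls.map(async url => {
    const data = await fetchUrl(url);
    onEach({ url, data }); // invoked in completion order, not input order
  }));
}

// Usage with a fake fetch that resolves after a short delay:
const fakeFetch = url =>
  new Promise(resolve => setTimeout(() => resolve('body of ' + url), url.length));

downloadUrls(['http://a.example', 'http://bb.example'], fakeFetch, ({ url, data }) => {
  console.log('Content of url', url, 'is');
  console.log(data);
});
```

The returned promise resolves once every download has been handled, so callers can also await downloadUrls(...) to know when the whole batch is done.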

Upvotes: 0
