Reputation: 119
I'm trying to fetch a large amount of data from MongoDB. Here is my code:
let cont = 0
const stream = await dataModel.find({}).lean().cursor(); // it will return around 2,000 elements
console.log("Checkpoint one")
await stream.on('data', async (res) => {
    try {
        cont += 1
    } catch (e) {
        console.log(e)
    }
});
await stream.on('close', () => {
    console.log(`Execution ended. Number of elements: ${cont}.`);
});
console.log("Checkpoint two")
Output:
Checkpoint one
Checkpoint two
Execution ended. Number of elements: 2194.
Expected output:
Checkpoint one
Execution ended. Number of elements: 2194.
Checkpoint two
When I console.log each res inside the 'data' handler, it also logs after "Checkpoint two".
Upvotes: 0
Views: 665
Reputation: 196
stream.on('close', () => {
    console.log(`Execution ended. Number of elements: ${cont}.`);
});
This DEFINES (it does not execute) a callback that is called when the stream is closed, and then execution immediately moves on. You cannot await it, and you cannot await stream.on('data') either, for the same reason: .on() only registers a listener and returns right away.
If you want the parent function to wait until the stream has finished reading, you can create a new Promise that is fulfilled by your stream.on('close') handler, and then await that promise just before your console.log("Checkpoint two"). This should work; give it a try.
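A minimal sketch of that idea, assuming this runs inside an async function and that dataModel is your Mongoose model (the 'error' listener is an extra safeguard I added, not part of your original code):

let cont = 0;
const stream = dataModel.find({}).lean().cursor();
console.log("Checkpoint one");

stream.on('data', () => {
    cont += 1; // count each document as it streams in
});

// Wrap the 'close' event in a Promise so the parent function can await it
await new Promise((resolve, reject) => {
    stream.on('close', () => {
        console.log(`Execution ended. Number of elements: ${cont}.`);
        resolve();
    });
    stream.on('error', reject); // reject if the cursor errors out
});

console.log("Checkpoint two");

With this, "Checkpoint two" is only printed after the 'close' handler has resolved the promise.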
Upvotes: 1