Reputation: 251
I am building a MERN stack application, and at one point I need to fetch a large amount of data from a MongoDB database. I came across MongoDB streams and was wondering what the best way to implement this is. I've tried these two options:
var stream = Product.find({}).stream();
stream.on('error', (err) => {
  console.error(err);
});
stream.on('data', (doc) => {
  return res.json(doc);
});
In this example I receive the error Cannot set headers after they are sent to the client. Second try:
await Product.find({})
  .cursor()
  .pipe(JSON.stringify())
  .pipe(res);
In this example I get Cannot read property 'on' of undefined. I was not able to find a proper explanation of the whole cursor().pipe() chaining, so if anyone knows, I'd be more than glad if you could explain the logic. I do not want to use pagination in this example.
Upvotes: 1
Views: 1772
Reputation: 1357
I usually use this flow:
var stream = Product.find({}).stream();
const results = [];
stream.on('error', (err) => {
  console.error(err);
});
stream.on('data', (doc) => {
  results.push(doc);
});
stream.on('end', () => {
  res.json(results);
});
Upvotes: 0
Reputation: 514
You can send the response to the client only once; after it is sent, the connection is closed. In your case you are trying to send the response more than once, hence the error.
You need to use WebSockets or server-sent events to stream the data.
Upvotes: 1