Reputation: 1585
I am trying to do this without any 3rd-party dependencies, as I don't feel they should be needed. Please note, due to an architect ruling we have to use the MongoDB native driver, and not Mongoose (don't ask!).
Basically I have a getAll
function that will return all documents (based on a passed-in query) from a single collection.
The number of documents could easily hit multiple thousand, so I want to stream them out as I receive them.
I have the following code:
db.collection('documents')
  .find(query)
  .stream({
    transform: (result) => {
      return JSON.stringify(new Document(result));
    }
  })
  .pipe(res);
This kind of works, except that it loses the array the documents should sit in: the response comes out as {...}{...}, with no brackets or commas.
There has to be a way of doing this right?
Upvotes: 2
Views: 1696
Reputation: 7112
What you can do is write the start of the array explicitly with res.write("[")
before querying the database, write a ,
between each JSON-stringified object, and on stream end write the end of the array with res.write("]").
This can work, but it is not advisable:
JSON.stringify
is a slow operation, and you should use it as little as possible.
A better approach is to use a streamable JSON.stringify implementation such as json-stream-stringify:
const JsonStreamStringify = require('json-stream-stringify');

app.get('/api/users', (req, res, next) => {
  const stream = db.collection('documents').find().stream();
  new JsonStreamStringify(stream).pipe(res);
});
Be aware of using pipe in production: pipe does not destroy the source or destination stream when an error occurs. It is advisable to go for
pump
or stream.pipeline
in production to avoid memory leaks.
Upvotes: 1