Morgan Le Floc'h

Reputation: 21

How to send a huge amount of documents from MongoDB over HTTP?

I have a MongoDB database and I want to be able to fetch millions of documents at once, without crashing and without cursor errors. I want to send the data over HTTP, using Express (Node.js). My collection has thousands and thousands of documents, and each one has a field containing thousands of smaller documents. The current size of my collection is 500 MB. Do you know the best practices for this big-data case? Should I implement a limit/skip based solution? If so, could you please provide a code sample?

I already tried document streaming, which seems more reliable, but I still run into the same cursor problem ("Cursor not found"):

app.get("/api/:collection", (req, res) => {
    const filter = JSON.parse(req.query["filter"] || "{}");
    const projection = JSON.parse(req.query["projection"] || "{}");
    const sort = JSON.parse(req.query["sort"] || "{}");

    db.collection(req.params.collection).find(filter)
        .project(projection).sort(sort)
        // the cursor flag must be set on the cursor itself, before
        // .stream() turns it into a plain Readable
        .addCursorFlag("noCursorTimeout", true)
        // newline-delimit the documents so the client can parse the stream
        .stream({ transform: (doc) => JSON.stringify(doc) + "\n" })
        .pipe(res);
});
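For reference, here is the kind of limit/skip pagination I have in mind (a rough sketch; the /page/:page route and the pageSize value are just placeholders I made up):

app.get("/api/:collection/page/:page", (req, res) => {
    const pageSize = 1000; // hypothetical batch size
    const page = parseInt(req.params.page, 10) || 0;

    db.collection(req.params.collection)
        .find({})
        .skip(page * pageSize) // note: skip still walks past the skipped
        .limit(pageSize)       // documents server-side, so large offsets get slow
        .toArray()
        .then((docs) => res.json(docs))
        .catch((err) => res.status(500).json({ error: err.message }));
});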

Upvotes: 2

Views: 239

Answers (1)

Lucas Rosenberger

Reputation: 255

You should gzip your response. JSON compresses very well, so this can shrink the transfer to a fraction of its original size.

npm i --save compression

var compression = require('compression');
var express = require('express');

var app = express();
app.use(compression()); // gzip-encode responses for clients that accept it
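
For example, mounted in front of the streaming endpoint from the question (a sketch; the middleware must be registered before any route it should compress):

app.use(compression()); // must come before the routes

app.get("/api/:collection", (req, res) => {
    // ...same streaming handler as in the question; the piped response is
    // now gzip-encoded whenever the client sends Accept-Encoding: gzip
});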

Upvotes: 2
