Guilherme Viebig

Reputation: 6932

Mongodb thousands of queries running in parallel

We had a PHP application that performed well while running 30k unique queries against a MongoDB collection.

foreach ($request->numbers as $number) {
    $query = [
        'cn' => $ddd,
        'prefix_init' => array('$lte' => (int) $number),
        'prefix_final' => array('$gte' => (int) $number)
    ];

    $result = $cadup_full->findOne($query);
    $results[] = $result;
}

We converted that application to Node.js and started to have performance issues: in PHP the queries were triggered synchronously, one at a time, whereas a naive Node.js loop fires all 30k at once, which degrades performance.

How can I run these queries efficiently while still using Node.js?
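For illustration (names hypothetical), the problematic pattern looks like this: `Array.prototype.map` starts every query immediately, so all of them are in flight at once. A stub stands in for `findOne` so the effect is measurable without a database.

```javascript
// Sketch of the problem with a stub in place of findOne (hypothetical
// names): map() starts every call synchronously, so the number of
// in-flight "queries" peaks at the full size of the input.
let inFlight = 0;
let peak = 0;

function fakeFindOne() {
  inFlight++;
  peak = Math.max(peak, inFlight);
  return new Promise(resolve =>
    setImmediate(() => { inFlight--; resolve({}); }));
}

const numbers = Array.from({ length: 1000 }, (_, i) => i);

// Every fakeFindOne call fires before any of them resolves.
Promise.all(numbers.map(() => fakeFindOne())).then(() => {
  console.log(peak); // peak === numbers.length
});
```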

Upvotes: 0

Views: 335

Answers (1)

chrisbajorin

Reputation: 6153

As JohnnyHK mentioned, you can use the async library, specifically mapLimit.

I don't know PHP, so here's a rough guess at its equivalent syntax in Node.

// the async library: npm install async
const async = require("async");

// max number of requests running at once
let concurrencyLimit = 10;

async.mapLimit(numbers, concurrencyLimit, function(number, numberCallback) {

    let query = {
        "cn": ddd,
        "prefix_init": {
            $lte: number
        },
        "prefix_final": {
            $gte: number
        }
    };

    // numberCallback is function(error, document);
    // findOne passes its result straight through to mapLimit
    myCollection.findOne(query, numberCallback);

}, function(error, results) {

    // results holds the documents, in the same order as numbers
    console.log(results);
});
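For completeness, the same concurrency cap can be hand-rolled with plain Promises, with no library dependency. This is a sketch under the assumption that your driver's `findOne` returns a Promise; the `mapLimit` function below is a hypothetical stand-in, not part of any library.

```javascript
// Dependency-free sketch: `limit` workers each pull the next item until
// the list is exhausted, so at most `limit` calls to fn are in flight
// at any moment. Results come back in the same order as items.
async function mapLimit(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;
  async function worker() {
    while (next < items.length) {
      const i = next++;                 // claim an index (single-threaded, no race)
      results[i] = await fn(items[i]);
    }
  }
  const workers = Array.from({ length: Math.min(limit, items.length) }, worker);
  await Promise.all(workers);
  return results;
}

// Usage with the question's names (assumed, not verified):
// const docs = await mapLimit(numbers, 10, number =>
//   myCollection.findOne({ cn: ddd,
//                          prefix_init: { $lte: number },
//                          prefix_final: { $gte: number } }));
```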

Upvotes: 2
