HjalmarCarlson

Reputation: 868

Connection Pool for NodeJS

I have an app that has been maxing out the number of connections to MongoDB, and I was under the assumption that if the driver was set up correctly you didn't need to worry about closing connections.

I've seen people mention the Generic Pool module, but what is the best process for closing or pooling connections using Node & MongoDB?

Here is my connection code for the app:

var sys = require("sys");
var app = require('http').createServer(handler);
var io = require('socket.io').listen(app);

app.listen(1337);

io.configure(function () {
  io.set('authorization', function (handshakeData, callback) {
    callback(null, true);
  });
});

function handler (req, res, data) {
    sys.puts('request made to trackerapp.js'); 
    res.writeHead(200);
    res.end(data);
}

io.sockets.on('connection', function (socket) {
  socket.on('adTracker', function (data) {
    var adRequestData = data;
    var databaseUrl = "mongodb://dbuser:dbpass@mongolab.com/tracker";
    var collections = ["cmnads"];
    var db = require("mongojs").connect(databaseUrl, collections);

    db.cmnads.insert({adRequest: adRequestData}, function (err, updated) {
      if (err || !updated) console.log("mongo not updated" + err);
      else console.log("data stored");
    });
  });
});

Upvotes: 1

Views: 3400

Answers (2)

HjalmarCarlson

Reputation: 868

After seeing JohnnyHK's comment, I was able to pull the MongoDB connection out of the Socket.io connection event and it worked fine. See the solution below:

var mongojs = require("mongojs");

var databaseUrl = "mongodb://dbuser:dbpass@mongolab.com/tracker";
var collections = ["cmnads"];
var db = mongojs.connect(databaseUrl, collections);

io.sockets.on('connection', function (socket) {
  socket.on('adTracker', function (data) {
    var adRequestData = data;

    // vars for MongoDB used to be created here, so a new connect was called
    // on every request to socket.io

    db.cmnads.insert({adRequest: adRequestData}, function (err, updated) {
      if (err || !updated) console.log("mongo not updated" + err);
      else console.log("data stored");
    });
  });
});

Upvotes: 1

JB.

Reputation: 861

A technique I used with my Express apps, which seems to have some measure of success, is to open a connection to a Mongo instance once (thereby getting a connection pool) and then share that db instance (now in the "connected" state) wherever it is needed. Something like this:

var mongodb = require('mongodb'),
    Server = mongodb.Server,
    Db = mongodb.Db;

server = new Server(app.settings.dbsettings.host, app.settings.dbsettings.port, {auto_reconnect: true, poolSize: 5});
db = new Db(app.settings.dbsettings.db, server, {native_parser: false});

db.open(function(err, db) {
  app.db = db;

  server = app.listen(app.settings.port);
  console.log("Express server listening on port %d in %s mode", app.settings.port, app.settings.env);

  require('./apps/socket-io')(app, server);
});

This connects to the database at the highest level in my app, before the program moves into its listening state.
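For example, a route handler could then reuse the shared instance instead of opening its own connection. Here is a minimal sketch, assuming an Express app and borrowing the cmnads collection name from the question (the /ads route is hypothetical):

// Hypothetical route that reuses the pooled connection stored on app.db
app.get('/ads', function (req, res) {
  app.db.collection('cmnads', function (err, collection) {
    if (err) return res.send(500);
    collection.find().toArray(function (err, docs) {
      if (err) return res.send(500);
      res.json(docs);
    });
  });
});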

Before I used this pattern, I would create a new database object whenever I needed to interact with the database. The problem I found is that each new database object would create a new connection pool, consuming a bunch of ports that were never cleaned up properly. After a period of time the machine hosting the app would run out of ports!

Anyway, I believe a variation on the code shown above is a good starting point for your own design.

Upvotes: 0
