Connection leak (error on too many connections)?

I'm working on a backend system that lets users write a query and run it against our DB.

We have a limit of 100 connections to our Postgres DB, and after running a couple of queries we got an error from the DB that the connection limit was maxed out.

Now I'm trying to figure out how to write this correctly so it isn't an issue.

Here is my current code:

def in_transaction
  # Check a connection out of the pool for the duration of the block and
  # run the block in a transaction that is always rolled back.
  connection_model.connection_pool.with_connection do
    connection_model.transaction do
      yield
      raise ActiveRecord::Rollback
    end
  end
end

begin
  in_transaction do
    set_timeout(data_source.timeout) if data_source.timeout
    result = select_all(statement)
    columns = result.columns
  end
end

Should I be adding something like:

ActiveRecord::Base.connection.close or ActiveRecord::Base.connection_pool.release_connection, and if so, where?

Upvotes: 1

Views: 786

Answers (1)

Daniel Westendorf

Reputation: 3465

You likely don't need to manage your connections manually.
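To make that concrete: with_connection checks a connection out of the pool, yields it, and checks it back in when the block exits (even if it raises), so a helper like your in_transaction already releases its connection. A minimal sketch, reusing your connection_model name:

connection_model.connection_pool.with_connection do |conn|
  conn.select_all("SELECT 1") # runs on the checked-out connection
end
# the connection has been returned to the pool here; nothing to close manually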

There is a pretty good document by Heroku on database connection pools that applies regardless of your host. Basically, it boils down to the number of connections you configure per process multiplied by the number of processes; that product needs to stay under your maximum of 100.
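For reference, each Rails process reads its pool size from the pool: key in config/database.yml, and in recent Rails versions (5.1+) you can inspect what a running process is actually holding. A quick sketch, assuming the stock ActiveRecord setup:

# maximum connections this one process is allowed to open
ActiveRecord::Base.connection_pool.size
# live pool stats, e.g. { size: 20, connections: 3, busy: 1, idle: 2, ... }
ActiveRecord::Base.connection_pool.stat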

For example, if you have 3 web processes and 2 background worker processes (Sidekiq, Resque, etc.), and each one has a pool of 20, you'd have (3 + 2) * 20 = 100 connections, hitting your maximum.
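The same arithmetic as a quick Ruby check (the numbers are just this example, not anything measured from your app):

web_processes    = 3
worker_processes = 2
pool_per_process = 20

(web_processes + worker_processes) * pool_per_process # => 100, right at the Postgres limit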

If you've hit the point where you simply need more connections, or the appearance of more connections, check out PgBouncer.

Upvotes: 1
