Konrad

Reputation: 902

QueuePool limit reached for Flask application

I am running an application with Flask and Flask-SQLAlchemy.

from config import FlaskDatabaseConfig
from flask import Flask
from flask import request
from flask_migrate import Migrate
from flask_sqlalchemy import SQLAlchemy

application = Flask(__name__)
application.config.from_object(FlaskDatabaseConfig())
db = SQLAlchemy(application)

@application.route("/queue/request", methods=["POST"])
def handle_queued_request():
    stuff()
    return ""

def stuff():
    # Includes database queries and updates and a call to db.session.commit()
    # db.session.begin() and db.session.close() are not called
    pass

if __name__ == "__main__":
    application.run(debug=False, port=5001)

Now, from my understanding, by using Flask-SQLAlchemy I do not need to manage sessions on my own. So why am I getting the following error when I run several requests against my endpoint one after another?

sqlalchemy.exc.TimeoutError: QueuePool limit of size 5 overflow 10 reached, connection timed out, timeout 30 (Background on this error at: http://sqlalche.me/e/3o7r)

I've tried using db.session.close() but then, instead of this error, my database updates are not committed properly. What am I doing incorrectly? Do I need to manually close connections with the database once a request has been handled?

Upvotes: 7

Views: 7789

Answers (2)

Minh DN

Reputation: 31

Just remove the session every time you finish querying the db; this returns its connection to the pool:

products = db.session.query(Product).limit(20).all()
db.session.remove()
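
If you would rather not repeat that call after every query, the removal can also be hooked into a Flask teardown handler so it runs once per request. A minimal sketch, reusing the application and db objects from the question (the handler name is made up):

@application.teardown_appcontext
def remove_session(exception=None):
    # At the end of each request, discard the scoped session and return
    # its connection to the pool, whether or not the request succeeded.
    db.session.remove()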

Upvotes: 3

Konrad

Reputation: 902

I have found a solution to this. The issue was that I had a lot of processes that were "idle in transaction" because I did not call db.session.commit() after certain database SELECT statements made with Query.first().

To investigate this, I queried my (development) PostgreSQL database directly using:

SELECT * FROM pg_stat_activity
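
The fix itself then amounted to committing after those reads. A minimal sketch of what stuff() now does, with a hypothetical SomeModel standing in for my real models:

def stuff():
    # The SELECT issued by Query.first() opens a transaction on a pooled connection.
    record = db.session.query(SomeModel).first()
    # ... work with `record`, apply updates ...
    # Without this commit, the connection stays "idle in transaction"
    # and is never returned to the pool.
    db.session.commit()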

Upvotes: 5
