Reputation: 32710
I recently found an automatically generated connection string specifying "Pooling=False" and wondered why it was set up that way. From my understanding, pooling is almost always advantageous as long as it is not badly misconfigured.
Are there any reasons to disable pooling? Does it depend on the OS, the physical connection, or the DBMS in use?
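For reference, the flag appears in an ADO.NET-style connection string like this (server and database names here are placeholders, not the actual values from the string I found):

```
Server=myServer;Database=myDb;Integrated Security=true;Pooling=False
```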
Upvotes: 6
Views: 2216
Reputation: 1855
One reason is that your code may change the state of the underlying connection in a way the pool does not track. For example, if you do something that affects TLS (Transport Layer Security) on an LDAP connection, you should not use a connection pool: LDAP connection pooling does not track such state changes, so the pool can hand a connection with the wrong security state to another caller, compromising security.
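The failure mode above can be sketched in a few lines. This is a toy simulation, not any real LDAP library: `Connection`, `NaivePool`, and `start_tls` are hypothetical names standing in for a pool that never inspects or resets per-connection state:

```python
class Connection:
    """Stand-in for a network connection with mutable security state."""
    def __init__(self):
        self.tls_started = False

    def start_tls(self):
        self.tls_started = True


class NaivePool:
    """A pool that hands out connections without tracking state changes."""
    def __init__(self, size=1):
        self._free = [Connection() for _ in range(size)]

    def acquire(self):
        return self._free.pop()

    def release(self, conn):
        # The pool neither resets nor inspects the connection's state.
        self._free.append(conn)


pool = NaivePool(size=1)

# Caller A upgrades its connection to TLS, then returns it to the pool.
conn_a = pool.acquire()
conn_a.start_tls()
pool.release(conn_a)

# Caller B is handed the same connection, carrying TLS state it never
# negotiated itself.
conn_b = pool.acquire()
print(conn_b is conn_a, conn_b.tls_started)  # → True True
```

Because the pool is blind to the state change, every later borrower inherits whatever the previous borrower left behind; disabling pooling avoids the problem entirely.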
Upvotes: 2
Reputation: 4906
Yes, there's a reason to disable pooling: you need to check how a particular pooling library copes with temporary network disconnects. Some database drivers and/or pool libraries do nothing when the connection is lost but the connection object is still alive; instead of spawning a new connection, the pool hands you stale connections and you get errors. Other pool implementations check whether a connection is alive by issuing a fast command to the server before serving it to the application; if that command fails, they kill the connection and spawn a new one. You should always test your pool library against such scenarios before enabling pooling.
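The second behavior described above is often called "test on borrow". Here is a minimal sketch of it; `FakeConn`, `ValidatingPool`, and `ping` are made-up names simulating a driver, not a real pooling library:

```python
import queue


class FakeConn:
    """Stand-in for a driver connection; 'alive' mimics the server side."""
    def __init__(self):
        self.alive = True

    def ping(self):
        # Fast liveness check (a real driver might send "SELECT 1").
        if not self.alive:
            raise ConnectionError("connection lost")


class ValidatingPool:
    """A pool that pings each connection before lending it out and
    replaces dead ones instead of serving them."""
    def __init__(self, factory, size=1):
        self._factory = factory
        self._free = queue.Queue()
        for _ in range(size):
            self._free.put(factory())

    def acquire(self):
        while True:
            conn = self._free.get()
            try:
                conn.ping()                      # validate before lending out
                return conn
            except ConnectionError:
                self._free.put(self._factory())  # respawn a fresh connection

    def release(self, conn):
        self._free.put(conn)


pool = ValidatingPool(FakeConn, size=1)
c = pool.acquire()
c.alive = False          # simulate a network drop while the app holds it
pool.release(c)
c2 = pool.acquire()      # the stale connection is detected and replaced
print(c2 is c, c2.alive)  # → False True
```

A pool *without* this check would have returned the dead connection `c` again, which is exactly the stale-connection error scenario described above.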
Upvotes: 4
Reputation: 22914
If it's a single-threaded app, pooling seems unnecessary. Was it running on a resource-constrained device? Is startup time important to the application? These are some factors that might lead to the decision to turn pooling off.
In general, I think you are right that pooling is beneficial. If it's a typical web app, I would ask why it was disabled.
Upvotes: 2