genpet

Reputation: 453

mysql connection design and concurrent connections

My application will be hosted on a shared hosting provider with a limit of 25 concurrent MySQL connections. Below is my code for connecting to the DB:

function dbConnect() {
    $dbHost = 'localhost';
    $dbUser = 'user';
    $dbPass = 'pass';
    $dbName = 'dbase';
    mysql_connect($dbHost, $dbUser, $dbPass) or die(mysql_error());
    mysql_select_db($dbName) or die(mysql_error());
}

My application is database-heavy. The home page alone has at least 20 mysql_query() calls in it. The way I code my home page is like below:

include 'config.php';
dbConnect();

query1 ... //to get initial variables

process variables that result in multiple query

query2... // process result

query3...// process result

and so on up to....

query 20...// process result

I can't reduce the number of queries any further because most of the queries are prerequisites for the others. I am expecting at least 1000 users daily, and possibly 50% of them will connect at the same time, or at least only seconds apart.

  1. With this design of connecting to the DB, will I easily reach the connection limit?
  2. Does PHP close the connection right after each query (meaning it closes after the first query, then opens for the second and closes again right after, and so on) even if I'm not closing the connection myself?
  3. Is my assumption correct that the above scenario automatically consumes 20 concurrent connections?

Thanks in advance.

Upvotes: 2

Views: 1542

Answers (2)

RandomSeed

Reputation: 29769

For what it's worth, some answers to questions 2. and 3. of genpet's original post:

  1. PHP closes your connection to MySQL if and only if (assuming it wasn't opened with mysql_pconnect()):

    • you call mysql_close()
    • you reach the end of the script
  2. Therefore, if you send 20 queries during the same script, you only consume 1 connection.
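To make that concrete, here is a minimal sketch of one connection serving every query in a script. It is written with mysqli, since the mysql_* extension is removed in PHP 7+; the credentials, table names, and queries are placeholders, not the asker's actual schema:

```php
<?php
// One connection is opened at the start of the script...
$db = mysqli_connect('localhost', 'user', 'pass', 'dbase')
    or die(mysqli_connect_error());

// ...and every subsequent query reuses it. Twenty calls here
// still count as a single concurrent connection on the server.
$res1 = mysqli_query($db, 'SELECT id FROM users LIMIT 1');
$res2 = mysqli_query($db, 'SELECT COUNT(*) AS n FROM posts');
// ... queries 3 through 20 on the same $db handle ...

// The connection is released when the script ends
// (or earlier, if you call mysqli_close yourself).
mysqli_close($db);
```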

One thought just comes to my mind. Consider this pattern:

  • open (or reuse a pooled) connection with *_pconnect()
  • run a query, deal with the results
  • release the connection ASAP with *_close()
  • open/pool again if you need to query the DB later in the script
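A sketch of that pattern, using mysqli's persistent-connection syntax (prefixing the hostname with `p:`), since mysql_pconnect() is gone from modern PHP; the credentials and queries are placeholders:

```php
<?php
// Open (or reuse) a pooled connection only while we actually need it.
$db = mysqli_connect('p:localhost', 'user', 'pass', 'dbase')
    or die(mysqli_connect_error());

$res = mysqli_query($db, 'SELECT 1');  // run a query, deal with the results
mysqli_close($db);                     // release the handle ASAP

// ... processing that needs no DB access keeps no connection open ...

// Pool again later in the script when the next query is due.
$db = mysqli_connect('p:localhost', 'user', 'pass', 'dbase');
$res = mysqli_query($db, 'SELECT 2');
mysqli_close($db);
```

As the answer itself notes, this is untested theory: whether closing a persistent handle actually returns it to the pool promptly depends on the PHP version and driver configuration.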

This way, I believe one could work around the low concurrent-connection limit (even though I agree with the people advising that your hosting plan may be inappropriate for your needs).

This is theory, though; I haven't tried it in the real world.

Upvotes: 2

Robbie

Reputation: 17720

It won't let me post this as a comment above, so this isn't an answer, but an extension of my first comment ("upgrade your hosting plan").

IMO, your connection design is not OK. die() is a terminal function; if you're expecting failure, build in a retry option and a user-friendly error-handling function. Also, opening and closing the DB connection each time is time-intensive: you might be able to squeeze in three queries faster than closing down and opening again for one (though that depends on several factors, which can be mitigated with shared pools as suggested by others). But my advice is to upgrade the hosting plan and develop "normally". Then your style is OK, except use a friendlier error screen than die() :-) as well as PDO or mysqli, not the mysql_* functions.
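A sketch of what "retry plus a friendly error" could look like with PDO. The DSN, credentials, retry count, and error message are illustrative placeholders, not recommended production values:

```php
<?php
// Hypothetical replacement for the question's dbConnect():
// retries the connection a few times, then shows a friendly
// page instead of dumping die(mysql_error()) at the visitor.
function dbConnect(int $retries = 3): PDO
{
    $dsn = 'mysql:host=localhost;dbname=dbase;charset=utf8mb4';
    for ($attempt = 1; $attempt <= $retries; $attempt++) {
        try {
            return new PDO($dsn, 'user', 'pass', [
                PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
            ]);
        } catch (PDOException $e) {
            if ($attempt === $retries) {
                http_response_code(503);
                exit('The site is busy right now; please try again in a moment.');
            }
            sleep(1); // brief pause before retrying
        }
    }
}
```

With ERRMODE_EXCEPTION set, later query failures can also be caught and routed to the same friendly error page rather than terminating mid-output.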

Upvotes: 0
