AM-

Reputation: 881

Proper way to optimize a multi-DB web application

I am currently building a web application that will be using two databases at the same time. The user first creates a general account and afterwards uses it to create multiple characters on multiple 'servers'. (think of it as creating an account on Facebook and then using it to sign up on other websites).

My current database setup is the following:

The main_english and main_russian databases contain the session and general account data for users who registered in the corresponding language.

The server1 and server2 databases handle the created character data of the English users, while server3 handles the Russian ones.

Users will be updating tables every 5-10 seconds or so. My question is: how can I efficiently handle working with several databases?

So far I've come to several options:

  1. Opening two DB connections, one for the main_ database and another for the server database; however, I've read that opening connections is fairly expensive.
  2. Modifying all my queries to explicitly state which DB to update/select data from (the project is still young, so this wouldn't be that painful); see the sketch after this list.
  3. Modifying my $db class to issue a select_db() call with each query, which, in my opinion, is both messy and inefficient.
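
Here is a minimal sketch of what option 2 could look like, assuming the main_english and server1 databases live on the same MySQL server and the connecting user has privileges on both; the table and column names (sessions, characters, hp, token, account_id) are invented for illustration:

    <?php
    // Option 2 sketch: a single mysqli connection where every query
    // qualifies its tables as database.table, so no select_db() call
    // or second connection is needed. Assumes main_english and server1
    // are on the same MySQL server; schema names below are made up.
    $db = new mysqli('localhost', 'app_user', 'secret'); // no default DB selected

    // Session lookup hits the main (account) database...
    $stmt = $db->prepare(
        'SELECT account_id FROM main_english.sessions WHERE token = ?'
    );
    $stmt->bind_param('s', $token);
    $stmt->execute();

    // ...while character updates target the game-server database.
    $stmt = $db->prepare(
        'UPDATE server1.characters SET hp = ? WHERE account_id = ?'
    );
    $stmt->bind_param('ii', $hp, $accountId);
    $stmt->execute();

Note that database-qualified queries only work while both databases sit on the same MySQL instance; if the server databases ever move to separate hosts, a second connection becomes unavoidable anyway.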

What could you suggest? Perhaps I am overreacting about opening a second connection for the server queries?

Upvotes: 4

Views: 107

Answers (1)

Cody Caughlan

Reputation: 32748

MySQL is known for fast connection handling, so that's unlikely to be a bottleneck. That being said, you could look into persistent connections in the PDO or mysqli extensions.
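
If connection setup ever does show up in your profiling, here's a minimal sketch of persistent connections in both extensions; the host, credentials, and database name are placeholders:

    <?php
    // Persistent connection sketch. With PDO::ATTR_PERSISTENT the
    // connection is cached and reused across requests instead of
    // being reopened each time. Credentials below are placeholders.
    $pdo = new PDO(
        'mysql:host=localhost;dbname=main_english;charset=utf8',
        'app_user',
        'secret',
        [
            PDO::ATTR_PERSISTENT => true,                // reuse across requests
            PDO::ATTR_ERRMODE    => PDO::ERRMODE_EXCEPTION,
        ]
    );

    // The mysqli equivalent: prefix the host with "p:".
    $mysqli = new mysqli('p:localhost', 'app_user', 'secret', 'main_english');

One caveat: persistent connections carry per-connection state (open transactions, temporary tables, session variables) over between requests, so they need some care.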

In terms of server access and data modeling, it's difficult to say without knowing more about your application. How do DB slaves factor in? Is replication lag acceptable?

Upvotes: 1
