Reputation: 881
I am currently building a web application that will be using two databases at the same time. The user first creates a general account and afterwards uses it to create multiple characters on multiple 'servers'. (think of it as creating an account on Facebook and then using it to sign up on other websites).
My current database setup is the following:
```
main_english
main_russian
server1
server2
server3
```
The `main_english` and `main_russian` databases contain the session and general account data for users who registered in the corresponding language. The `server1` and `server2` databases hold the created character data of the English users, while `server3` holds the Russian ones.
Users will be updating tables every 5-10 seconds or so. My question is: how can I efficiently handle working with several databases at once?
So far I've come up with two options:

1. Opening two connections: one for the `main_` databases and another for the `server` ones. However, I've read that opening connections is fairly expensive.
2. Keeping a single connection and issuing `select_db()` statements before each query, which, in my opinion, is both messy and inefficient.

What could you suggest? Perhaps I am overreacting about opening a second connection for the `server` queries?
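There is also a third route worth mentioning: if all five databases live on the same MySQL server and the account has privileges on each, a single connection can reach every one of them by qualifying table names with the database name, with no `select_db()` calls at all. A minimal sketch (table and column names here are hypothetical, not from the question):

```php
<?php
// One connection to the MySQL server; note the DSN selects no default database.
$pdo = new PDO('mysql:host=localhost;charset=utf8mb4', 'user', 'password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Account lookup in a main database, using a db-qualified table name.
$stmt = $pdo->prepare('SELECT id FROM main_english.accounts WHERE email = ?');
$stmt->execute(['player@example.com']);
$accountId = $stmt->fetchColumn();

// Character update in a server database, over the same connection.
$stmt = $pdo->prepare(
    'UPDATE server1.characters SET last_seen = NOW() WHERE account_id = ?'
);
$stmt->execute([$accountId]);
```

This keeps the frequent 5-10 second updates on one connection and avoids both the second-connection cost and the per-query database switching.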
Upvotes: 4
Views: 107
Reputation: 32748
MySQL is known for fast connection handling, so that's unlikely to be a bottleneck. That said, you could look into persistent connections in the `PDO` or `mysqli` extensions.
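For reference, enabling persistent connections is a one-line change in either extension (credentials below are placeholders). With a persistent connection, the underlying link is cached and reused across requests instead of being torn down when each script ends:

```php
<?php
// PDO: pass PDO::ATTR_PERSISTENT in the options array.
$pdo = new PDO(
    'mysql:host=localhost;dbname=main_english;charset=utf8mb4',
    'user', 'password',
    [PDO::ATTR_PERSISTENT => true]
);

// mysqli: prefix the hostname with "p:".
$mysqli = new mysqli('p:localhost', 'user', 'password', 'main_english');
```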
In terms of the server access and data modeling, it's difficult to say without knowing more about your application. How do DB slaves factor in? Is replication lag OK?
Upvotes: 1