Reputation: 919
I have a performance and best-practice question concerning MySQL tables. I'm working on an application that connects to a database which gets filled by other programs. This system is deployed in different locations, and from location to location the names of some database tables can change (but the fields in these tables stay the same).
As I don't want to change all the SQL queries in my application for every location, I thought about creating a MySQL view which simply mirrors the contents of that table under the normally used table name.
Is this a suitable solution, or could it get awfully slow with big tables?
Upvotes: 1
Views: 57
Reputation: 1680
Mmm, this is tricky. If there are multiple tables, then a quick-and-dirty version would be something like
SELECT * FROM (SELECT * FROM table1
               UNION
               SELECT * FROM table2
               UNION
               SELECT * FROM table3) t
Which I think will work. You will of course have problems with pagination, sorting, and searching, because you will have to do all of that over three or more tables.
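That quick-and-dirty union could itself be wrapped in a view, so the application keeps querying a single table name. A sketch, assuming the per-location tables are `table1`..`table3` and the application expects `all_records` (all of these names are placeholders):

```sql
-- UNION ALL keeps duplicate rows but skips the duplicate-elimination
-- pass that plain UNION performs, which matters on big tables.
CREATE VIEW all_records AS
SELECT * FROM table1
UNION ALL
SELECT * FROM table2
UNION ALL
SELECT * FROM table3;
```

One caveat: MySQL cannot use the MERGE algorithm for a view containing UNION; it materializes the result into a temporary table, which is exactly where the pagination, sorting, and searching problems mentioned above come from.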
Another way would be this
Create a table with the table names and a counter
ImportTable
name
id
In this you enter the names of the tables and, for each, the last id up to which you have imported.
Create another table to import the records
TableRecords
source
id
field1
field2
etc
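The two helper tables described above might look like this (a sketch only; the types and the `field1`/`field2` columns are placeholders for the real schema):

```sql
-- Tracks each source table and the last id imported from it.
CREATE TABLE ImportTable (
  name VARCHAR(64) NOT NULL PRIMARY KEY,  -- source table name
  id   BIGINT      NOT NULL DEFAULT 0     -- last imported id
);

-- Holds the combined records from all source tables.
CREATE TABLE TableRecords (
  source VARCHAR(64) NOT NULL,  -- which table the row came from
  id     BIGINT      NOT NULL,  -- id in the source table
  field1 VARCHAR(255),
  field2 VARCHAR(255),
  -- etc: mirror the remaining fields of the source tables
  PRIMARY KEY (source, id)
);
```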
Now run something that goes through the tables listed in ImportTable,
grabs any new records, and inserts them into TableRecords.
Now this becomes really simple: you can query TableRecords
and have pagination, sorting, and searching with none of the previous troubles.
Schedule something that runs this every 2 minutes, say. TableRecords
will be at most 2 minutes behind, but everything will be easy and run like a dream.
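The import step for one source table could be sketched like this (`table1` is a placeholder, and `import_new_records` is a hypothetical stored procedure that runs one such step per table listed in ImportTable):

```sql
-- Copy rows newer than the last imported id for this source...
INSERT INTO TableRecords (source, id, field1, field2)
SELECT 'table1', t.id, t.field1, t.field2
FROM table1 AS t
JOIN ImportTable AS i ON i.name = 'table1'
WHERE t.id > i.id;

-- ...then advance the counter to the highest id imported so far.
UPDATE ImportTable
SET id = (SELECT COALESCE(MAX(id), 0)
          FROM TableRecords
          WHERE source = 'table1')
WHERE name = 'table1';

-- MySQL's event scheduler can run the whole import every 2 minutes
-- (the event scheduler must be enabled on the server):
CREATE EVENT import_new_records_event
ON SCHEDULE EVERY 2 MINUTE
DO CALL import_new_records();
```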
Upvotes: 0
Reputation: 7590
Simple views (created as SELECT * FROM table) behave like the underlying table, performance-wise.
It should be a suitable solution for your case.
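For example, if the application expects a table called `orders` but the table at one location is named `orders_site_a` (both names are placeholders), the mapping is one statement:

```sql
-- ALGORITHM = MERGE (the default for a view this simple) makes MySQL
-- rewrite queries against the view into queries against the base table,
-- so indexes are used as usual and there is no materialization cost.
CREATE ALGORITHM = MERGE VIEW orders AS
SELECT * FROM orders_site_a;
```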
Upvotes: 1