Reputation: 704
Is there a way to save the output of a specific MySQL query to a file on my application server (not the DB server) and refresh that data every n seconds, so that users load the cached output instead of sending queries to my database?
I thought about using a cron job that runs a specific .php file, which executes the queries and saves the output to a file that is served when users request the data. If this is a good idea, is it possible to make this .php file inaccessible to public HTTP requests?
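One common way to keep such a script out of public reach is to place it outside the web server's document root and run it with the PHP CLI from cron. A sketch, assuming a hypothetical layout where `/var/app/public` is the document root and the script lives in `/var/app/scripts` (note that standard cron granularity is one minute, not seconds):

```
# Crontab entry: refresh the cache every minute via the PHP CLI.
# Because /var/app/scripts is outside the document root, the script
# can never be reached by an HTTP request.
* * * * * /usr/bin/php /var/app/scripts/refresh_cache.php
```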
Upvotes: 1
Views: 79
Reputation: 1269447
First, this is probably not a good idea. If your goal is to reduce I/O, you may actually be increasing it when the user load is low.
Second, if you do this, you will probably need to write custom code to maintain the local cache. You then have issues of cache contention -- particularly when writes and reads are happening at the same time.
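One standard way to avoid read/write contention on a file cache is to write to a temporary file and then rename it over the cache file, so readers see either the old contents or the new, never a half-written file. A minimal sketch in Python for brevity (the same pattern works in PHP via `tempnam()` and `rename()`); the path is a hypothetical example:

```python
import json
import os
import tempfile

CACHE_PATH = "/tmp/query_cache.json"  # hypothetical cache location

def write_cache(rows):
    """Atomically replace the cache file with the latest query result."""
    # Create the temp file in the same directory so the rename stays
    # on the same filesystem (required for an atomic replace).
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(CACHE_PATH))
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(rows, f)
        os.replace(tmp, CACHE_PATH)  # atomic on POSIX
    except BaseException:
        os.unlink(tmp)  # clean up only if the replace never happened
        raise

def read_cache():
    """Readers never need a lock: the file is always complete."""
    with open(CACHE_PATH) as f:
        return json.load(f)
```

This removes the need for read locks entirely, though writers still need coordination if more than one refresh job can run at once.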
Third, if the problem is that queries are taking too long (which is not clear because you are focused on I/O bandwidth), then optimizing the database and queries would be a first step.
Fourth, lots of data being returned from the database to the application often suggests a poor design of the data interface. Thinking through what is needed can be really helpful. For instance, perhaps sorting on the server and returning the first 10 rows is the best solution. Or, perhaps, counting on the database side is the right solution.
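To illustrate the point about pushing work to the database: sorting, limiting, and counting in SQL returns a handful of rows instead of the whole table. A sketch using an in-memory SQLite table as a stand-in for the real MySQL server (table and column names are hypothetical; the same `ORDER BY ... LIMIT` and `COUNT(*)` clauses work in MySQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the MySQL server
conn.execute("CREATE TABLE scores (player TEXT, points INTEGER)")
conn.executemany(
    "INSERT INTO scores VALUES (?, ?)",
    [("a", 30), ("b", 10), ("c", 50), ("d", 20)],
)

# Sort on the database side and return only the top rows,
# instead of fetching everything and sorting in the application:
top = conn.execute(
    "SELECT player, points FROM scores ORDER BY points DESC LIMIT 2"
).fetchall()

# Count on the database side, instead of len() over all fetched rows:
(total,) = conn.execute("SELECT COUNT(*) FROM scores").fetchone()
```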
Finally, any solution of this sort necessarily means that the local cache can be out-of-sync with the database. This is usually the killer problem with this type of approach.
All of these can be solved, but they generally go in the direction of making the code more complex. If you don't have a real performance problem now, then remember what Knuth advised: "Premature optimization is the root of all evil." (I agree with the sentiment but there is definitely other evil out there in the world ;)
Upvotes: 2