Reputation: 11649
So, I have a table of data in a MySQL database; let's say, for example, 100 artists.
Because this data will be modified very rarely, I have created an interface whereby, when the data is modified by the administrator through the content management system, the system queries the database and saves a serialized PHP array of the data as a file on the server.
This file is recreated every time a new artist is added, say once a week.
On the frontend, when the page is loaded, instead of querying the database the page simply includes the file (using output buffering) and produces the HTML page layout from that array.
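Roughly, the two halves look like this (simplified; the DSN, credentials, table and file names here are placeholders, and I'm using PDO for brevity):

    <?php
    // Admin side: regenerate the cache file whenever an artist is saved.
    $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $rows = $pdo->query('SELECT id, name FROM artists ORDER BY name')
                ->fetchAll(PDO::FETCH_ASSOC);
    file_put_contents('/cache/artists.txt', serialize($rows));

    // Frontend: include the file under output buffering, then unserialize.
    ob_start();
    include '/cache/artists.txt';
    $artists = unserialize(ob_get_clean());

    foreach ($artists as $artist) {
        echo '<li>', htmlspecialchars($artist['name']), '</li>';
    }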
Is this a good idea? Is this going to be quicker than having hundreds of users query the database every time the page loads?
As an extension to this question: if I start to page the data, e.g. set a LIMIT of 10 rows on the MySQL result, will it be slower to load the whole table as a PHP array and cut that down into groups of 10, echoing out the relevant section based on the query string?
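For the array-slicing variant, I mean something like this (again simplified, with a made-up page parameter):

    <?php
    // Page through the cached array in PHP instead of using LIMIT in SQL.
    $perPage = 10;
    $page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;

    $artists = unserialize(file_get_contents('/cache/artists.txt'));
    $slice   = array_slice($artists, ($page - 1) * $perPage, $perPage);

    foreach ($slice as $artist) {
        echo '<li>', htmlspecialchars($artist['name']), '</li>';
    }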
Upvotes: 3
Views: 2165
Reputation: 5743
I think you would do better to cache ready-made pages, instead of generating a data file that still has to be read and parsed.
If update frequency is low, you can add a routine to regenerate static files, one for each page: say artists-1.html (last 10 inserted), artists-2.html (the next 10), and so on.
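A rough sketch of such a regeneration routine (the query, the chunk size and the file names are just illustrations, and a PDO connection with placeholder credentials is assumed):

    <?php
    // Rebuild one static HTML file per page of 10 artists.
    // Run this from the CMS whenever an artist is added or edited.
    $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $rows = $pdo->query('SELECT name FROM artists ORDER BY id DESC')
                ->fetchAll(PDO::FETCH_ASSOC);

    foreach (array_chunk($rows, 10) as $i => $chunk) {
        $html = '<ul>';
        foreach ($chunk as $artist) {
            $html .= '<li>' . htmlspecialchars($artist['name']) . '</li>';
        }
        $html .= '</ul>';
        file_put_contents(sprintf('artists-%d.html', $i + 1), $html);
    }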
Querying has a cost, but filtering a file's contents isn't necessarily faster than querying a DB (more data to scan, less efficient).
Another thing to mention: 100 rows is almost nothing for a DB. Really. If it were 10,000+ you could start worrying about performance; 100 rows is a piece of cake for any modern database.
And as I said, you can "compile" those static files on every update; this will take just a few seconds or less with that amount of data, and it has the advantage of providing direct access, which is the fastest way to retrieve information (not discussing memory here, as memory is even faster, but it requires a more sophisticated solution).
Upvotes: 1
Reputation: 3408
I can't give you a PHP-specific answer, but I would advise against serialising the data, because databases are designed to handle large volumes of data well. I would also advise you to keep the data in the DB, as you might want to use it for other things later, like generating reports that require aggregated data. Such an exercise would be prohibitive without a database engine such as MySQL.
Lastly, on the performance note: IIS caches the HTML output, and I would think Apache does as well, although I'm not 100% sure on that.
Upvotes: 1
Reputation: 48897
the system queries the database, and saves a serialized PHP array of the data as a file on the server.
If you're going to do this, you should just go the whole nine yards and cache the resulting HTML and have your page include that. This way you avoid including executable code, iterating over arrays and creating HTML every time.
You can even cache the paginated HTML using this method.
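A minimal sketch of that idea (the template path is hypothetical; the cache is rebuilt by the CMS, and the frontend only includes finished markup):

    <?php
    // CMS side: render the page once and store the finished HTML.
    ob_start();
    include 'templates/artist-list.php'; // queries the DB and prints markup
    file_put_contents('/cache/artist-list.html', ob_get_clean());

    // Frontend: no code to execute, no arrays to walk.
    include '/cache/artist-list.html';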
Regarding the scalability of your file approach: loading serialized arrays into memory on every request is much less efficient than letting the database load data incrementally from its files, as the memory usage of the former approach can slow down your entire server.
Upvotes: 3
Reputation: 44215
That speed improvement might be tempting, but I don't think it will be worth the effort of creating and parsing the PHP file, and it might even be slower than a simple database query. Additionally, you are giving up a lot of flexibility (you would have to implement every kind of query you want to run on your data yourself). A common way to reduce database load is a caching system like Memcached. But when speed really becomes an issue you are not talking about hundreds of users but more in the scale of tens of thousands, so don't optimize there before you have to.
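For completeness, the usual Memcached read-through pattern looks something like this (a sketch using the pecl memcached extension; the key name, TTL and connection details are arbitrary):

    <?php
    // Check the cache first; fall back to the database on a miss.
    $memcached = new Memcached();
    $memcached->addServer('127.0.0.1', 11211);

    $artists = $memcached->get('artists_all');
    if ($artists === false) {
        $pdo     = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $artists = $pdo->query('SELECT id, name FROM artists')
                       ->fetchAll(PDO::FETCH_ASSOC);
        $memcached->set('artists_all', $artists, 3600); // cache for an hour
    }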
Upvotes: 0
Reputation: 2009
I don't think it makes much difference either way when you are only talking about 100 records. Really, all you are doing is caching a MySQL query. If you wanted it to be even faster, you could cache the query result in memory using APC. Faster still would be to cache each page of results as a separate object in memory.
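Something along those lines (a sketch with the old apc_* functions; key names, the TTL and the schema are assumptions):

    <?php
    // Cache each page of results under its own key in APC.
    $page   = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
    $offset = ($page - 1) * 10; // always an integer, safe to inline below

    $rows = apc_fetch('artists_page_' . $page);
    if ($rows === false) {
        $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $rows = $pdo->query("SELECT id, name FROM artists LIMIT $offset, 10")
                    ->fetchAll(PDO::FETCH_ASSOC);
        apc_store('artists_page_' . $page, $rows, 3600); // one-hour TTL
    }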
Upvotes: 0
Reputation: 101926
It probably will be slower. Storing it on the filesystem will require unserialization of the full file even if you need only a small chunk of data.
Never forget: databases are fast! And if they are not, you are missing an index ;)
PS: If you want to increase performance (and do so only if you really need it!), either use APC or, even better, cache the fully generated page, so it may be served nearly statically.
Upvotes: 5
Reputation: 464
I am not sure about the speed difference between file parsing and MySQL query execution. Since your content is fairly static until the admin changes the list, you can create the HTML page layout and store it as an HTML file that is served to the users. That will definitely reduce the server load, since no parsing or SQL query execution is needed for the page.
And if you are using pagination, there is no need to load the full result set from MySQL; you can limit the rows fetched from the table using the LIMIT keyword.
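For example (a sketch; the table and column names are assumptions):

    <?php
    // Fetch only the rows for the requested page.
    $perPage = 10;
    $page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
    $offset  = ($page - 1) * $perPage; // integer, safe to inline

    $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $stmt = $pdo->query("SELECT name FROM artists LIMIT $offset, $perPage");

    foreach ($stmt as $row) {
        echo '<li>', htmlspecialchars($row['name']), '</li>';
    }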
Upvotes: 1