Reputation: 459
I am creating a site that will handle lots of searches, and I need to log data about every search for later analysis.
I anticipate ultimately having load distributed across a number of servers; each month I will download and import all the logs into a single MySQL database at my end for analysis.
At the moment I've been looking at setting each server up as a MySQL 'master' that live-updates a slave analysis server, which would essentially also act as a backup.
However, I'm aiming for efficiency. The benefit of MySQL replication is obviously that the logs are always available centrally and I don't have to import and reset log files on each server every month.
How much more efficient would it be to log to a plaintext file and just dump that logfile each month for a central import into MySQL? Is a plaintext dump much, if at all, more efficient or faster than writing to MySQL?
Thanks for your thoughts!
Upvotes: 5
Views: 1352
Reputation: 5525
Databases are good for much more than inserts: locking mechanisms, transaction management, fast searches, connection pooling, and the list goes on.
On the other hand, if all you need to do is write chunks of data to disk, a database is a huge overhead.
Given the above, and since all you do for a month at a time is write, I would recommend plain log files: once a month, take the logs, merge them together, and analyze them. You can then decide whether to load them into a database (if that gives you some added value) or just keep the merged text.
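To make the idea concrete, here is a minimal sketch of the append-only logging approach, assuming a tab-separated format with a UTC timestamp, the server name, and the search term. The file path, field layout, and function names are illustrative, not from the question:

```python
# Append-only search log: one plain file write per search, no DB round-trip.
# Assumed format: <iso timestamp>\t<server>\t<search term>
from datetime import datetime, timezone

LOG_PATH = "search.log"

def log_search(term, server="web1"):
    """Append one tab-separated line for a single search."""
    stamp = datetime.now(timezone.utc).isoformat()
    with open(LOG_PATH, "a", encoding="utf-8") as f:
        f.write(f"{stamp}\t{server}\t{term}\n")

log_search("mysql replication")
log_search("plaintext logging", server="web2")
```

A file in this shape can later be bulk-loaded centrally with MySQL's `LOAD DATA INFILE`, which is typically far faster than replaying individual inserts.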
BTW, you could even write the INSERT statements directly into the log, and then use the log itself as a script to load everything into the database. Give it a thought :-)
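A rough sketch of that "log as SQL script" idea: each search appends a ready-to-run INSERT statement, so the monthly import is just `mysql dbname < searches.sql`. The table name, column names, and the minimal escaping helper are assumptions for illustration:

```python
# Write each search as an INSERT statement; the resulting file doubles
# as the import script. Table/column names are hypothetical.
SQL_LOG = "searches.sql"

def quote(value):
    """Minimal string escaping for the sketch; use a proper escaper in production."""
    return "'" + value.replace("\\", "\\\\").replace("'", "\\'") + "'"

def log_search_sql(term, server="web1"):
    stmt = (
        "INSERT INTO search_log (server, term, searched_at) "
        f"VALUES ({quote(server)}, {quote(term)}, NOW());\n"
    )
    with open(SQL_LOG, "a", encoding="utf-8") as f:
        f.write(stmt)

log_search_sql("o'reilly books")
```

The trade-off versus the tab-separated format is file size and import speed: a stream of single-row INSERTs is bulkier and slower to load than `LOAD DATA INFILE`, but it needs no extra tooling at import time.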
Upvotes: 3