Bogdan Gusiev

Reputation: 8305

Log all requests to web site in database

I need to log all POST and GET requests on the web site to the database. There will be two tables:

I will use it only for analytical reports once per month. No regular usage of this data.

I have about one million requests a day, so the request parameters table will be very large. Can MySQL handle a table of that size without problems?

Upvotes: 1

Views: 1711

Answers (3)

Paweł Polewicz

Reputation: 3852

The usual solution to this type of problem is to write a program that parses the logs for the whole month. If you don't need sophisticated MySQL capabilities, you should consider this approach.

If you really need the database, then consider parsing the logs offline. Otherwise, if your database goes down, you will lose data. Logs are known to be pretty safe.
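The offline approach can be sketched roughly like this: a small script that walks the month's access log and aggregates request counts per day, with no database involved at all. This is a minimal sketch assuming the common Apache/Nginx access-log format; the regex and function name are illustrative.

```python
import re
from collections import Counter

# Matches the start of a common-log-format line, e.g.
# 1.2.3.4 - - [10/Oct/2009:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*"'
)

def requests_per_day(lines):
    """Count GET/POST requests per day, straight from the log lines."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m and m.group("method") in ("GET", "POST"):
            counts[m.group("day")] += 1
    return counts

sample = [
    '1.2.3.4 - - [10/Oct/2009:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326',
    '1.2.3.5 - - [10/Oct/2009:14:01:02 -0700] "POST /login HTTP/1.0" 302 512',
]
print(requests_per_day(sample))
```

For a once-a-month report, running something like this over the rotated log files is often all that's needed.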

Table indexes are not free. The more indexes you have, the faster queries run, but the slower inserting data becomes, because every insert must update every index.
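The trade-off can be seen in miniature below: the index makes the report query a fast index lookup, but each of the million daily inserts must also maintain that index. This is a sketch using `sqlite3` as a stand-in for MySQL; the table and index names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE requests (ts TEXT, path TEXT)")
# This index speeds up the monthly report query below,
# but every INSERT now also has to update idx_requests_path.
conn.execute("CREATE INDEX idx_requests_path ON requests (path)")

conn.executemany(
    "INSERT INTO requests VALUES (?, ?)",
    [("2009-06-20", "/a"), ("2009-06-20", "/b")],
)

# The query planner confirms the read side uses the index.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM requests WHERE path = '/a'"
).fetchone()
print(plan[-1])
```

With write-heavy logging tables, it is common to keep few or no indexes on the live table and add them only before running the reports.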

Upvotes: 0

SpliFF

Reputation: 39014

I'd avoid writing to the DB on each request, or you'll be vulnerable to the Slashdot effect. Parse your web logs during quiet times to update the DB instead.
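A minimal sketch of that batch approach: accumulate parsed log rows and insert them in one transaction during a quiet period, rather than issuing one INSERT per live web request. `sqlite3` stands in for MySQL here, and the table and column names are assumptions for illustration.

```python
import sqlite3

# Rows parsed earlier from the web logs: (timestamp, method, path).
rows = [
    ("2009-06-20 01:00:00", "GET", "/index.html"),
    ("2009-06-20 01:00:01", "POST", "/login"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE request_log (ts TEXT, method TEXT, path TEXT)")

with conn:  # one transaction for the whole batch
    conn.executemany(
        "INSERT INTO request_log (ts, method, path) VALUES (?, ?, ?)",
        rows,
    )

count = conn.execute("SELECT COUNT(*) FROM request_log").fetchone()[0]
print(count)
```

Run from a nightly cron job, a batch like this keeps the database entirely out of the request path, so a traffic spike can at worst slow the web server, not take the database down.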

Upvotes: 1

nightcoder

Reputation: 13519

Yes, MySQL will normally handle millions of rows, but depending on what you want to do with the data later and on the indexes on those tables, performance may not be very high.

PS. In my project we have a huge price list with a few million products in it, and it works without any problems.

Upvotes: 0
