Reputation: 298
I'm working on a PHP project that processes and modifies data entries on a large scale. Data is taken from the database, goes through seven validation tests, and is then sorted and inserted into the database again. This process is time-consuming, and it is difficult to track errors and failures.
I want to create a log writer class for my application.
There are basically two options:
1. Write the log to a text file.
2. Write the log to a database table.
Which method is more efficient, or are there other options?
Upvotes: 0
Views: 3583
Reputation: 23789
Text files are easier to monitor from the command line on Unix machines:
$ tail -f /path/thefile.log
This gives you a constantly-updating view of the newest lines of a text file. You don't have to do it that way, of course. I think in most cases a text file is sufficient.
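For illustration, a minimal file-based log writer could look something like this (the class name and log path are just examples, not anything from your project):
<?php
// Minimal sketch of a file-based log writer.
class FileLogger
{
    private $handle;

    public function __construct($path = '/path/thefile.log')
    {
        // 'a' appends, so existing entries are kept
        $this->handle = fopen($path, 'a');
    }

    public function log($level, $message)
    {
        $line = sprintf("[%s] %s: %s\n", date('Y-m-d H:i:s'), $level, $message);
        fwrite($this->handle, $line);
    }

    public function __destruct()
    {
        fclose($this->handle);
    }
}

$logger = new FileLogger();
$logger->log('error', 'Validation test 3 failed for entry 42');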
Databases are great if you want to do efficient sorting, filtering, and relational operations (i.e., you want to relate one log entry to other information you have stored). It takes longer to set this up, so be sure to determine how permanent this logging needs to be and how robust it has to be. If it's for simple review, I'd stick with a text file.
Upvotes: 1
Reputation: 21007
Writing logs to a text file is efficient, but it suits you only if those logs are used rarely (for example, you only need to check an error once in a while). If you want to sort and filter errors, the database is the way to go.
You could create a table like this:
CREATE TABLE `logs` (
`id` INT AUTO_INCREMENT,
`original_id` INT NOT NULL, -- will contain FOREIGN KEY
`new_id` INT DEFAULT NULL, -- will contain FOREIGN KEY
`validator_id` INT, -- ID of validator which triggered error
`type` ENUM('error', 'notice') DEFAULT 'error',
`message` VARCHAR(255) DEFAULT '',
PRIMARY KEY (`id`)
)
which will allow you to browse your errors really easily.
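A sketch of writing one such entry with PDO (the DSN, credentials, and the concrete values are placeholders, not part of your project):
<?php
// Sketch: inserting a row into the `logs` table defined above.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'password');

$stmt = $pdo->prepare(
    'INSERT INTO logs (original_id, new_id, validator_id, type, message)
     VALUES (:original_id, :new_id, :validator_id, :type, :message)'
);

$stmt->execute(array(
    ':original_id'  => 42,     // entry being validated
    ':new_id'       => null,   // not re-inserted yet
    ':validator_id' => 3,      // which of the 7 validators triggered the error
    ':type'         => 'error',
    ':message'      => 'Value out of range',
));

// Browsing errors later is then a simple query, e.g.:
// SELECT * FROM logs WHERE type = 'error' AND validator_id = 3;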
Upvotes: 1
Reputation: 7956
While you could write your own logger, I would recommend log4php, which is a well-tested library with the support of the Apache Foundation.
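A minimal usage sketch (the include path, appender configuration, and file name are assumptions; see the log4php documentation for the details):
<?php
// Sketch of basic log4php usage; adjust the require path to where
// log4php is installed.
require_once 'log4php/Logger.php';

// Route everything to a single file appender.
Logger::configure(array(
    'rootLogger' => array('appenders' => array('default')),
    'appenders' => array(
        'default' => array(
            'class' => 'LoggerAppenderFile',
            'params' => array('file' => 'validation.log'),
        ),
    ),
));

$log = Logger::getLogger('validation');
$log->info('Starting validation run');
$log->error('Validation test 3 failed for entry 42');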
Upvotes: 2