Jan Richter

Reputation: 2086

Bash: grep PHP error_log fatal errors to a file by filter


I have a webserver running several PHP web applications. It is configured so that the php error_log is deleted at midnight (so the file only ever covers the last 24 hours).

The thing is, I would like to log all fatal errors to another file or database, but only for specific web applications (about 20 are running; 4 of them are mine).

My idea was a bash script that greps the error_log for "Fatal" and the URLs of my applications, appends the output to a file, and remembers the last processed line number of the current error_log in a separate cache file.

I would then run the script from cron every few minutes, each run starting at the line where the previous one ended.
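A minimal sketch of that incremental approach (the log path, output path, state file, and application names here are assumptions, not anything from the original setup):

```shell
#!/bin/sh
# Collect fatal errors for selected apps, scanning only lines added
# since the previous run. Paths and app names are placeholders.
LOG=/var/log/php/error_log
OUT=/var/log/php/my_fatals.log
STATE=/var/tmp/error_log.offset

# Line number where the previous run stopped (0 on the first run).
last=0
[ -f "$STATE" ] && last=$(cat "$STATE")

total=$(wc -l < "$LOG")
# If the log was deleted/truncated since the last run, start over.
[ "$total" -lt "$last" ] && last=0

# Scan only the new lines for fatal errors in my applications.
tail -n +"$((last + 1))" "$LOG" \
  | grep 'Fatal' \
  | grep -E 'app1|app2|app3|app4' >> "$OUT"

# Remember where this run stopped.
echo "$total" > "$STATE"
```

A crontab entry such as `*/5 * * * * /usr/local/bin/collect_fatals.sh` would then run it every five minutes.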

The whole idea feels a little convoluted, and I suspect it could be done more efficiently. Any ideas?

Upvotes: 0

Views: 842

Answers (1)

geert3

Reputation: 7341

Writing a cron job seems OK if you can't configure this out of the box; I don't know PHP well enough to say. In Java, for example, you can route the same log message to several log files depending on criteria.

But I would have your cron job do both: collect the fatal errors AND delete the previous day's log file. That way a single run at midnight suffices, and you save yourself the complexity of tracking where you stopped last time (and the risk of missing errors logged just before midnight). If the collection succeeded, delete the old file; otherwise leave it in place for diagnosis and retry. It also saves you a bunch (24*60) of script invocations per day.
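The collect-then-delete-on-success logic described above could be sketched like this (again, the paths and app names are placeholders, and truncation with `: >` stands in for "delete" so PHP's open file handle keeps working):

```shell
#!/bin/sh
# Nightly job: collect the day's fatal errors, then clear the log
# only if collection succeeded. Paths and app names are placeholders.
LOG=/var/log/php/error_log
OUT=/var/log/php/my_fatals.log

grep 'Fatal' "$LOG" | grep -E 'app1|app2|app3|app4' >> "$OUT"
rc=$?
# grep returns 0 (matches found) or 1 (no matches); anything
# higher indicates a real error, e.g. the output append failed.
if [ "$rc" -le 1 ]; then
    # Collection done: truncate the log so a fresh day starts.
    : > "$LOG"
else
    # Keep the old log for diagnosis; cron will retry tomorrow.
    echo "collection failed, keeping $LOG" >&2
    exit 1
fi
```

Scheduled with a single crontab entry at midnight, e.g. `0 0 * * * /usr/local/bin/rotate_fatals.sh`, this replaces both the every-few-minutes collector and the separate midnight deletion.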

Upvotes: 1
