chance

Reputation: 6487

How complete should log files be?

We are using Log4j to write log files in our batch applications. After a run, the log files are several gigabytes in size, which makes it difficult and time-consuming to open, read, or find useful information in them, even after a rolling appender has split them into smaller (e.g. 500 MB) pieces.

I think one reason is that we always write long, complete English sentences at all logging levels, because we believe that reading a log file should be like reading a story.

Do you have any best practices for dealing with this problem? Is using abbreviations (such as 'OK' instead of 'Successfully created ...'), at least at the DEBUG and TRACE levels, common practice?

Upvotes: 1

Views: 96

Answers (1)

Robert Harvey

Reputation: 180808

Write your log files as comma-separated values (CSV) text.

There are a number of reasons why:

  1. You can open them in newer versions of Excel, no matter how large they are (well, up to about a million rows/log entries).

  2. You can filter them by time, if you provide a time column.

  3. You can search them.

  4. You can filter or sort them by type or severity (if you provide a column containing that information, e.g. Message, Warning, Error, Critical).
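For example, with Log4j 1.x you can get such a file without any custom code by giving a `PatternLayout` a comma-separated conversion pattern. A minimal sketch (the file name, size limit, and backup count are placeholders mirroring your setup; note that `PatternLayout` does not quote or escape commas that appear inside messages):

```java
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;
import org.apache.log4j.RollingFileAppender;

public class CsvLoggingSetup {
    public static void main(String[] args) throws Exception {
        // Columns: timestamp, severity, logger name, message.
        // A SimpleDateFormat pattern is used instead of ISO8601,
        // because Log4j's ISO8601 format itself contains a comma.
        PatternLayout layout =
                new PatternLayout("%d{yyyy-MM-dd HH:mm:ss.SSS},%p,%c,%m%n");

        // Same rolling behavior as before, but writing CSV rows
        RollingFileAppender appender =
                new RollingFileAppender(layout, "batch-log.csv");
        appender.setMaxFileSize("500MB");
        appender.setMaxBackupIndex(10);

        Logger.getRootLogger().addAppender(appender);

        Logger log = Logger.getLogger(CsvLoggingSetup.class);
        log.info("Batch run started");
    }
}
```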

With respect to verbosity, you can provide two columns: a summary description and a detailed description. Excel allows you to hide or delete columns, so users can slice and dice the files any way they want.
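One way to fill those two columns (a hypothetical sketch; the summary/detail texts are invented examples) is to put both descriptions in the message itself, separated by a comma, so that together with the pattern above each call produces a row of the form timestamp,level,logger,summary,detail:

```java
import org.apache.log4j.Logger;

public class SummaryDetailExample {
    private static final Logger log =
            Logger.getLogger(SummaryDetailExample.class);

    public static void main(String[] args) {
        // Summary field first, detail field second (example texts)
        log.info("OK,Successfully created customer record 4711");
        log.error("FAIL,Could not open input file for batch step 3");
    }
}
```

A terse summary like 'OK' or 'FAIL' also addresses your abbreviation question: the short form lands in a filterable column while the full sentence stays available in the detail cell.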

In addition, CSV files are machine-readable, so they can potentially be run through a post-processing program for further analysis.
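As a sketch of such post-processing (assuming the column order from the example above and the hypothetical file name batch-log.csv), a few lines of Java are enough to tally entries by severity:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.HashMap;
import java.util.Map;

public class LogSummary {
    public static void main(String[] args) throws Exception {
        Map<String, Integer> counts = new HashMap<String, Integer>();
        BufferedReader in = new BufferedReader(new FileReader("batch-log.csv"));
        String line;
        while ((line = in.readLine()) != null) {
            // Split into at most 4 fields so commas inside the
            // message (e.g. the summary/detail pair) stay intact
            String[] fields = line.split(",", 4);
            if (fields.length < 2) {
                continue; // skip blank or malformed rows
            }
            String severity = fields[1];
            Integer n = counts.get(severity);
            counts.put(severity, n == null ? 1 : n + 1);
        }
        in.close();
        System.out.println(counts); // e.g. {INFO=1200, ERROR=3}
    }
}
```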

Upvotes: 1
