Andrei F

Reputation: 4394

Advanced logging for Java Web Applications

I want to build a more advanced logging mechanism for my Java web applications, similar to App Engine logs. My needs are:

  1. Stream logs to a database (for ex. sql, bigquery or something else)
  2. Automatically log important data (like app context, request url, request id, browser user agent, user id, etc.)

For point 1, I can use a "buffering" implementation, where logs are collected into in-memory lists, and periodically a cron job (thread) gathers all the logs in memory and writes them to the database (which can also be on another server).
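
As a minimal sketch of that buffering idea (all class and method names here are illustrative, not from any library): entries accumulate in a thread-safe queue, and a scheduled background task drains them in batches. The database write is left as a stub.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class BufferedLogWriter {
    // Thread-safe buffer that request threads append to without blocking.
    private final ConcurrentLinkedQueue<String> buffer = new ConcurrentLinkedQueue<>();
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    // Start the periodic "cron" thread that flushes the buffer.
    public void start(long periodMillis) {
        scheduler.scheduleAtFixedRate(this::flush, periodMillis, periodMillis,
                TimeUnit.MILLISECONDS);
    }

    public void log(String entry) {
        buffer.add(entry);
    }

    // Drain everything currently buffered and hand it off as one batch.
    public List<String> flush() {
        List<String> batch = new ArrayList<>();
        String entry;
        while ((entry = buffer.poll()) != null) {
            batch.add(entry);
        }
        if (!batch.isEmpty()) {
            writeBatch(batch);
        }
        return batch;
    }

    // Stub: a real implementation would batch-insert into SQL/BigQuery here.
    protected void writeBatch(List<String> batch) {
    }

    public void stop() {
        scheduler.shutdown();
        flush(); // write whatever is still buffered
    }
}
```

A real version would also cap the buffer size and decide what to do when the database is unreachable (drop, retry, or spill to disk).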

For point 2, the only way I found of doing this is to inject the needed objects into my classes (subsystems), like ServletContext, HttpServletRequest, the current user, etc., all modeled into a custom class (let's say AppLogContext), which can then be used by the logging mechanism.
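
One way to keep the servlet API out of lower layers is to make that AppLogContext a plain value object, built once per request (for example in a servlet filter, from HttpServletRequest) and passed to the logging code. The class below is a hypothetical sketch of that idea, using only the fields mentioned above:

```java
// Hypothetical AppLogContext: an immutable snapshot of request data,
// so subsystems log through it instead of touching HttpServletRequest.
public class AppLogContext {
    private final String requestId;
    private final String requestUrl;
    private final String userAgent;
    private final String userId;

    public AppLogContext(String requestId, String requestUrl,
                         String userAgent, String userId) {
        this.requestId = requestId;
        this.requestUrl = requestUrl;
        this.userAgent = userAgent;
        this.userId = userId;
    }

    // Render the context as a prefix for a log line.
    public String toLogPrefix() {
        return "[req=" + requestId + " url=" + requestUrl
                + " ua=" + userAgent + " user=" + userId + "]";
    }
}
```

Because the object is immutable and contains only strings, passing it around does not expose the servlet context or request objects themselves.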

The problem here is that I don't know whether this is good practice. It means that many classes would have to hold this object, which has access to the servlet context and HTTP request objects, and I'm thinking this may create architectural problems (when building modules, layers, etc.) or even security issues.

App Engine will automatically log this kind of information (and much more, like latencies, CPU usage, etc., but that is more complicated). It can be found in the project's console logs (and can also be duplicated to BigQuery tables). I need something similar for Jetty or other Java web app servers.

So, is there another way of doing this, other patterns, different approaches? (I couldn't find third-party libraries for any of these points.)

Thank you.

Upvotes: 1

Views: 601

Answers (1)

Igor Artamonov

Reputation: 35961

You don't really need to reinvent the wheel.

There is a common practice that you can follow:

  • Just log using standard logger to a file
  • (if you need to see logs in request context) Logback, Log4j and SLF4J support Mapped Diagnostic Context (MDC); that's what you can use to put the current request into every log line (just initialize the context in a filter and put a request id there, for example a random UUID). You can aggregate log entries by this id later
  • Then use the ELK stack:
    • Logstash to gather the logs
    • ElasticSearch to store them
    • Kibana to analyze them
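
To show the mechanics of the MDC point with plain JDK classes: an MDC is essentially a per-thread map that a filter fills at the start of a request and clears at the end. With SLF4J you would call `MDC.put("requestId", ...)` inside a `javax.servlet.Filter` and reference it in the log pattern as `%X{requestId}`; the class and method names below are an illustrative simulation, not a real servlet or SLF4J API.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Stdlib-only simulation of MDC: a ThreadLocal map of context values.
public class RequestContext {
    private static final ThreadLocal<Map<String, String>> CONTEXT =
            ThreadLocal.withInitial(HashMap::new);

    public static void put(String key, String value) {
        CONTEXT.get().put(key, value);
    }

    public static String get(String key) {
        return CONTEXT.get().get(key);
    }

    public static void clear() {
        CONTEXT.get().clear();
    }

    // Shape of a servlet filter's doFilter: set the request id, run the
    // request, and always clear afterwards so a pooled thread that serves
    // the next request does not leak the previous request's context.
    public static void handleRequest(Runnable request) {
        put("requestId", UUID.randomUUID().toString());
        try {
            request.run();
        } finally {
            clear();
        }
    }
}
```

The `finally { clear(); }` part is the important detail: servlet containers reuse threads, so a context that is not cleared would bleed into unrelated requests.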

Upvotes: 2
