MusikPolice

Reputation: 1749

Is there a Logback Layout that Creates JSON Objects with Message Parameters as Attributes?

I want to send log events to Loggly as JSON objects with parameterized string messages. Our project currently has a lot of code that looks like this:

String someParameter = "1234";
logger.info("This is a log message with a parameter {}", someParameter);

We're currently using Logback as our SLF4J backend, and Logback's JsonLayout to serialize our ILoggingEvent objects into JSON. Consequently, by the time our log events are shipped to Loggly, they look like this:

{
    "message": "This is a log message with a parameter 1234",
    "level": "INFO",
    ...
}

While this does work, it sends a different message string for every value of someParameter, which renders Loggly's automatic filters next to useless.

Instead, I'd like to have a Layout that creates JSON that looks like this:

{
    "message": "This is a log message with a parameter {}",
    "level": "INFO",
    "parameters": [
        "1234"
    ]
}

This format would allow Loggly to group all log events with the message "This is a log message with a parameter {}" together, regardless of the value of someParameter.

It looks like Logstash's KV filter does something like this - is there any way to accomplish this task with Logback, short of writing my own layout that performs custom serialization of the ILoggingEvent object?
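For reference, a rough sketch of the kind of custom layout I'd like to avoid writing, assuming logback-contrib's JsonLayout and its addCustomDataToJsonMap hook (the class name and attribute names are just illustrative):

import java.util.Map;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.contrib.json.classic.JsonLayout;

public class ParameterizedJsonLayout extends JsonLayout {
    @Override
    protected void addCustomDataToJsonMap(Map<String, Object> map, ILoggingEvent event) {
        // getMessage() returns the raw template ("... {}"), unlike getFormattedMessage()
        map.put("rawMessage", event.getMessage());
        Object[] args = event.getArgumentArray();
        if (args != null && args.length > 0) {
            map.put("parameters", args);
        }
    }
}

These fields would sit alongside the formatted "message" attribute the base layout already emits, so Loggly could group on "rawMessage" instead.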

Upvotes: 30

Views: 58027

Answers (6)

Harald K

Reputation: 27114

Not a Layout, but the format looks "exactly" like the output of Logback's JsonEncoder (available since Logback 1.3.8/1.4.8):

{
    ...
    "level": "INFO",
    ...
    "message": "This is a log message with a parameter {}",
    "arguments": [
        "1234"
    ]
    ...
}

The only difference is the "parameters" array, which is named arguments in the JSON output. See the link above for a full example (which unfortunately omits arguments, but it does format messages like this).
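If it helps, a minimal appender configuration using that encoder might look like this (the appender name and file path are just placeholders):

<appender name="JSON_FILE" class="ch.qos.logback.core.FileAppender">
    <file>app.json.log</file>
    <!-- JsonEncoder ships with Logback 1.3.8/1.4.8+ and needs no extra dependencies -->
    <encoder class="ch.qos.logback.classic.encoder.JsonEncoder"/>
</appender>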

Upvotes: 0

ThomasRS

Reputation: 8287

As already answered, you'll get a one-dimensional JSON tree via the MDC and/or by using a Marker with logstash-logback-encoder.

If you are also looking for the following:

  • codebooks defining the logged data-type keys and types,
  • configuration of log-aggregation tools (like Elasticsearch),
  • generated Java helper code for efficient and correct logging,

then try a project I've created: json-log-domain. It uses a simple YAML definition from which all of the above can be generated.

An example helper-code statement would be

logger.info(host("localhost").port(8080), "Hello world");

while the generated Markdown documentation would look something like this.

Upvotes: 1

andrewps

Reputation: 350

In my case I was trying to log execution times, so I created a POJO called ExecutionTime with name, method, className, and duration fields.
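The class itself isn't shown here, but a minimal sketch of such a POJO might look like this (constructor and toString() are illustrative; the field names match the JSON output below, and the getters are what the encoder uses to serialize the "metric" field):

public class ExecutionTime {
    private final String name;
    private final String method;
    private final String className;
    private final long duration;

    public ExecutionTime(String name, String method, String className, long duration) {
        this.name = name;
        this.method = method;
        this.className = className;
        this.duration = duration;
    }

    // Getters let the JSON encoder serialize this object as a nested JSON object
    public String getName() { return name; }
    public String getMethod() { return method; }
    public String getClassName() { return className; }
    public long getDuration() { return duration; }

    @Override
    public String toString() {
        return "ExecutionTime [name=" + name + ", method=" + method
                + ", className=" + className + ", duration=" + duration + "]";
    }
}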

I was then able to create it:

// sw is a stopwatch started earlier (e.g. Spring's StopWatch), not shown in this snippet
ExecutionTime time = new ExecutionTime("Controller Hit", methodName, className, sw.getTotalTimeMillis());

For logging I then used:

private final Logger logger = LoggerFactory.getLogger(this.getClass());
logger.info(append("metric", time), time.toString());

Make sure you have:

import static net.logstash.logback.marker.Markers.append;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

This will log something like this:

{  
   "ts":"2017-02-16T07:41:36.680-08:00",
   "msg":"ExecutionTime [name=Controller Hit, method=setupSession, className=class com.xxx.services.controllers.SessionController, duration=3225]",
   "logger":"com.xxx.services.metrics.ExecutionTimeLogger",
   "level":"INFO",
   "metric":{  
      "name":"Controller Hit",
      "method":"setupSession",
      "className":"class com.xxx.services.controllers.SessionController",
      "duration":3225
   }
}

Your setup might differ, as I was using logback-spring.xml to output my logs as JSON:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <include resource="org/springframework/boot/logging/logback/base.xml"/>
    <property name="PROJECT_ID" value="my_service"/>
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <File>app/logs/${PROJECT_ID}.json.log</File>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
            <fieldNames>
                <timestamp>ts</timestamp>
                <message>msg</message>
                <thread>[ignore]</thread>
                <levelValue>[ignore]</levelValue>
                <logger>logger</logger>
                <version>[ignore]</version>
            </fieldNames>
        </encoder>
        <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
            <maxIndex>10</maxIndex>
            <FileNamePattern>app/logs/${PROJECT_ID}.json.log.%i</FileNamePattern>
        </rollingPolicy>
        <triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
            <MaxFileSize>20MB</MaxFileSize>
        </triggeringPolicy>
    </appender>
    <logger name="com.xxx" additivity="false" level="DEBUG">
        <appender-ref ref="FILE"/>
        <appender-ref ref="CONSOLE"/>
    </logger>
    <root level="WARN">
        <appender-ref ref="FILE"/>
    </root>
</configuration>

Upvotes: 12

muttonUp

Reputation: 6727

You could use the Mapped Diagnostic Context (MDC) to set a stamp for each of those types of log messages, which you could then filter on once in Loggly.

According to the source of JsonLayout, the stamp is stored as a separate value in the JSON.
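A minimal sketch of that idea, using the question's example (the MDC key names are just illustrative):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class MdcStampExample {
    private static final Logger logger = LoggerFactory.getLogger(MdcStampExample.class);

    public void logWithStamp(String someParameter) {
        // Put a constant "stamp" (and optionally the parameter itself) into the MDC,
        // so JsonLayout emits them as separate JSON attributes to filter on in Loggly.
        MDC.put("messageKey", "log-message-with-parameter");
        MDC.put("someParameter", someParameter);
        try {
            logger.info("This is a log message with a parameter {}", someParameter);
        } finally {
            // Always clean up, since MDC values are bound to the current thread
            MDC.remove("messageKey");
            MDC.remove("someParameter");
        }
    }
}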

Upvotes: 5

ash

Reputation: 5165

Here's a recently created project that provides a JSON-specific logging API and works with SLF4J:

https://github.com/savoirtech/slf4j-json-logger

Upvotes: 5

Mark Roper

Reputation: 1389

There is a JSON Logstash encoder for Logback: logstash-logback-encoder.
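One way it can address the grouping problem from the question is with structured arguments, which write the parameter as its own JSON field. A minimal sketch, assuming the LogstashEncoder is configured on the appender (the field name and message are just the question's example):

import static net.logstash.logback.argument.StructuredArguments.keyValue;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class StructuredArgumentExample {
    private static final Logger logger = LoggerFactory.getLogger(StructuredArgumentExample.class);

    public void logIt(String someParameter) {
        // With no {} placeholder, the message text stays constant for every event,
        // while the encoder still writes "someParameter": "1234" as a separate JSON field.
        logger.info("This is a log message with a parameter", keyValue("someParameter", someParameter));
    }
}

As far as I know, the plain JsonLayout mentioned in the question would not pick these fields up; they are a feature of the logstash-logback-encoder encoders.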

Upvotes: 21
