George

Reputation: 81

Spring Boot -> Logback -> Logstash

Events are not being sent from my Spring Boot app to Logstash. Here is my logback.xml file:

<configuration>
  <appender name="STASH-C" class="net.logstash.logback.appender.LogstashAccessTcpSocketAppender">
    <destination>arc-poc01:5044</destination>
    <encoder class="net.logstash.logback.encoder.LogstashAccessEncoder" />
    <keepAliveDuration>5 minutes</keepAliveDuration>
  </appender>
  <appender name="STASH-B" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>arc-poc01:5045</destination>
    <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder" >
        <providers>
            <timestamp/>
            <version/>
            <loggerName/>
            <pattern>
                <pattern>
                    {
                    "custom_constant": "cfg",
                    "level": "%level",
                    "thread": "%thread",
                    "message": "%message"
                    }
                </pattern>
            </pattern>
         </providers>
    </encoder>
    <keepAliveDuration>5 minutes</keepAliveDuration>
  </appender>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <!-- encoders are assigned the type ch.qos.logback.classic.encoder.PatternLayoutEncoder 
      by default -->
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n
      </pattern>
    </encoder>
  </appender>
  <logger name="com.gw.test" level="INFO" />
  <logger name="org.springframework" level="INFO" />
  <logger name="com.netflix.astyanax" level="INFO" />

  <root level="DEBUG">
    <appender-ref ref="STASH-B" />
    <appender-ref ref="STASH-C" />
    <appender-ref ref="STDOUT" />
  </root>
</configuration>

Here is my Logstash pipeline configuration:

input {
    tcp {
      port => 5044
      codec => json
      data_timeout => -1
    }
    log4j {
        mode => "server"
        host => "0.0.0.0"
        port => 5045
        type => "log4j"
        codec => json
     }
}

output {
    stdout { codec => rubydebug }
    elasticsearch {
        hosts => [ "http://arc-poc01:9200" ]
    }
}

When I run the app, console output is generated as expected, but no events are sent to Logstash. I ran curl -XGET 'localhost:9600/_node/stats/pipeline?pretty' to check for incoming traffic, and here is the output:

{
  "host" : "55357b6f0969",
  "version" : "5.4.1",
  "http_address" : "0.0.0.0:9600",
  "id" : "65153f2f-10af-48c3-9be5-5db0913bf7d8",
  "name" : "55357b6f0969",
  "pipeline" : {
    "events" : {
      "duration_in_millis" : 0,
      "in" : 0,
      "filtered" : 0,
      "out" : 0,
      "queue_push_duration_in_millis" : 0
    },
    "plugins" : {
      "inputs" : [ {
        "id" : "45e636052face5ff7a0b8cb463fa2b88c59c5697-2",
        "events" : {
          "out" : 0,
          "queue_push_duration_in_millis" : 0
        },
        "name" : "log4j"
      }, {
        "id" : "45e636052face5ff7a0b8cb463fa2b88c59c5697-1",
        "events" : {
          "out" : 0,
          "queue_push_duration_in_millis" : 0
        },
        "name" : "tcp"
      } ],
      "filters" : [ ],
      "outputs" : [ {
        "id" : "45e636052face5ff7a0b8cb463fa2b88c59c5697-3",
        "name" : "stdout"
      }, {
        "id" : "45e636052face5ff7a0b8cb463fa2b88c59c5697-4",
        "name" : "elasticsearch"
      } ]
    },
    "reloads" : {
      "last_error" : null,
      "successes" : 0,
      "last_success_timestamp" : null,
      "last_failure_timestamp" : null,
      "failures" : 0
    },
    "queue" : {
      "type" : "memory"
    },
    "id" : "main"
  }
}

There are no errors in the logs of either my Spring Boot application or Logstash. Is there something specific about configuring Logback in a Spring Boot app that I am missing? I have spent two days looking into this and have run out of ideas. Any hints on how to troubleshoot this would be greatly appreciated!
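
In case it is relevant, the appender and encoder classes above come from the logstash-logback-encoder library, which I pull in as a dependency roughly like this (Maven shown; the version is only illustrative):

<!-- logstash-logback-encoder provides LogstashTcpSocketAppender and the JSON encoders -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.11</version> <!-- illustrative; use whatever version matches your setup -->
</dependency>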

Upvotes: 1

Views: 9172

Answers (2)

Rimmy Mathew

Reputation: 157

Try moving the json codec from the input to a filter, so the tcp input receives plain lines and the json filter parses the JSON out of the message field:

input {
    tcp {
        port => 5045
    }
}
filter {
    json {
        source => "message"
        remove_field => "message"
    }
}
output {
    stdout { codec => rubydebug }
    elasticsearch {
        hosts => [ "http://arc-poc01:9200" ]
    }
}

Upvotes: 0

George

Reputation: 81

I should also add to my original description that I am running the ELK stack in Docker containers deployed on Mesos via Marathon scripts. Here is what ultimately got my app to stream its logs through the ELK pipeline:

1) Change the Logstash pipeline definition as follows:

input {
    tcp {
        port => 5045
        codec => json
    }
}
output {
    stdout { codec => rubydebug }
    elasticsearch {
        hosts => [ "http://arc-poc01:9200" ]
    }
}

2) Configure the Logstash appender in the logback.xml file as follows:

   <appender name="STASH-B" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>arc-poc01:5045</destination>
    <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder" >
        <providers>
            <timestamp/>
            <version/>
            <loggerName/>
            <pattern>
                <pattern>
                    {
                    "custom_constant": "cfg",
                    "level": "%level",
                    "thread": "%thread",
                    "message": "%message"
                    }
                </pattern>
            </pattern>
         </providers>
    </encoder>
    <keepAliveDuration>5 minutes</keepAliveDuration>
  </appender>

3) Get debug traces from the Logstash appender in my client app, showing the Logstash connection going up and down, by adding a status listener to the logback.xml file as follows:

<statusListener class="ch.qos.logback.core.status.OnConsoleStatusListener" />

The log entries showed the Logstash connection being established and then immediately closed by the Logstash server. I believe the problem was a mismatch between the input and codec configuration in the pipeline (the log4j input listening on port 5045) and the JSON content being sent by the Logstash appender on the client application side.
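
For anyone troubleshooting something similar, a quick way to take the application out of the equation is to push a single JSON line at the Logstash tcp input by hand, for example with netcat (the host and port below match my setup, and this assumes nc is available on the machine):

# Send one JSON event to the tcp input and watch Logstash's rubydebug output;
# if it shows up there, the input and codec are fine and the problem is on the appender side.
echo '{"message": "manual test event", "level": "INFO"}' | nc arc-poc01 5045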

Upvotes: 3
