user3469584

Reputation: 633

Using Log4J with LogStash

I'm new to LogStash. I have some logs written from a Java application in Log4J. I'm in the process of trying to get those logs into ElasticSearch. For the life of me, I can't seem to get it to work consistently. Currently, I'm using the following logstash configuration:

input {
  file {
    type => "log4j"
    path => "/home/ubuntu/logs/application.log"
  }
}
filter {
  grok {
    type => "log4j"
    add_tag => [ "ApplicationName" ]
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp}  %{LOGLEVEL:level}" ]
  }
}
output {
  elasticsearch {
    protocol => "http"
    codec => "plain"
    host => "[myIpAddress]"
    port => "[myPort]"
  }
}

This configuration seems to be hit or miss, and I'm not sure why. For instance, I have two messages: one works, and the other throws a parse failure. Here are the messages and their respective results:

Tags                   Message
------                 -------
["_grokparsefailure"]  2014-04-04 20:14:11,613 TRACE c.g.w.MyJavaClass [pool-2- 
                       thread-6] message was null from https://domain.com/id-1/env-
                       MethodName

["ApplicationName"]    2014-04-04 20:14:11,960 TRACE c.g.w.MyJavaClass [pool-2-
                       thread-4] message was null from https://domain.com/id-1/stable-
                       MethodName

The entry tagged ["ApplicationName"] has my custom timestamp and level fields. However, the entry tagged ["_grokparsefailure"] does NOT have them. The strange part is that the two log lines are nearly identical, as shown in the message column above. This is really confusing me, and I don't know how to figure out what the problem is or how to get past it. Does anyone know how I can import log4j logs into logstash and get those fields consistently?

Thank you for any help you can provide. Even if I can just get the log level, timestamp, and log message, that would be a HUGE help. I sincerely appreciate it!

Upvotes: 29

Views: 57482

Answers (3)

albgorski

Reputation: 117

On my blog (edit: dead link removed) I described how to send JSON messages to Elasticsearch and then parse them with grok. The post included a description as well as a simple Maven example project (the complete project is on GitHub).
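
Since the post is no longer available, here is only a minimal sketch of the idea, assuming the application writes each event as a timestamp and level followed by a JSON payload (the grok pattern and the json_payload field name are assumptions, not taken from the post):

filter {
  grok {
    # Extract the leading timestamp and level; keep the rest as a JSON string
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:json_payload}" ]
  }
  json {
    # Parse the JSON string into fields on the event
    source => "json_payload"
  }
}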

Hope it helps you.

Upvotes: 1

Arijit B

Reputation: 21

It looks like the SocketAppender solution that was used before is deprecated because of a security issue. The currently recommended approach is to use the log4j FileAppender, ship the file to Logstash with Filebeat, and do the filtering there. For more information you can refer to the links below; a rough sketch of the pipeline follows them:

https://www.elastic.co/blog/log4j-input-logstash

https://www.elastic.co/guide/en/logstash/current/plugins-inputs-log4j.html
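
As a rough sketch of that pipeline (the paths, hostnames, and ports are placeholder assumptions, and the Filebeat syntax shown is for recent releases):

log4j.properties, writing plain log lines to a file:

log4j.rootLogger=INFO, FILE
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=/var/log/myapp/application.log
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{ISO8601} %-5p %c [%t] %m%n

filebeat.yml, shipping that file to Logstash:

filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/application.log
output.logstash:
  hosts: ["logstash_host:5044"]

Logstash input, receiving the events from Filebeat (the grok filter and elasticsearch output can stay much as in the question):

input {
  beats {
    port => 5044
  }
}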

Upvotes: 1

ranxxerox

Reputation: 281

I'd recommend using Logstash's log4j input (a socket listener) together with the Log4J SocketAppender.

Logstash conf:

input {
  log4j {
    mode => server
    host => "0.0.0.0"
    port => [logstash_port]
    type => "log4j"
  }
}
output {
  elasticsearch {
    protocol => "http"
    host => "[myIpAddress]"
    port => "[myPort]"
  }
}

log4j.properties:

log4j.rootLogger=[level], [myAppender]
log4j.appender.[myAppender]=org.apache.log4j.net.SocketAppender
log4j.appender.[myAppender].port=[log4j_port]
log4j.appender.[myAppender].remoteHost=[logstash_host]
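
For completeness, nothing special is needed on the application side; any class that logs through Log4J will have its events forwarded by the SocketAppender configured above (the class and message below are just an illustration):

import org.apache.log4j.Logger;

public class MyJavaClass {
    private static final Logger LOG = Logger.getLogger(MyJavaClass.class);

    public void handleResponse(String message, String url) {
        if (message == null) {
            // This event is serialized and sent over the socket to Logstash
            LOG.trace("message was null from " + url);
        }
    }
}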

There's more info in the logstash docs for their log4j input: http://logstash.net/docs/1.4.2/inputs/log4j

Upvotes: 28
