Godfried

Reputation: 151

Logstash Grok Filter key/value pairs

Working on getting our ESET log files (json format) into elasticsearch. I'm shipping logs to our syslog server (syslog-ng), then to logstash, and elasticsearch. Everything is going as it should. My problem is in trying to process the logs in logstash...I cannot seem to separate the key/value pairs into separate fields.

Here's a sample log entry:

Jul  8 11:54:29 192.168.1.144 1 2016-07-08T15:55:09.629Z era.somecompany.local ERAServer 1755 Syslog {"event_type":"Threat_Event","ipv4":"192.168.1.118","source_uuid":"7ecab29a-7db3-4c79-96f5-3946de54cbbf","occured":"08-Jul-2016 15:54:54","severity":"Warning","threat_type":"trojan","threat_name":"HTML/Agent.V","scanner_id":"HTTP filter","scan_id":"virlog.dat","engine_version":"13773 (20160708)","object_type":"file","object_uri":"http://malware.wicar.org/data/java_jre17_exec.html","action_taken":"connection terminated","threat_handled":true,"need_restart":false,"username":"BATHSAVER\\sickes","processname":"C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe"}

Here is my logstash conf:

input {
  udp {
    type => "esetlog"
    port => 5515
  }
  tcp {
    type => "esetlog"
    port => 5515
  }
}

filter {
  if [type] == "esetlog" {
    grok {
      match => { "message" => "%{DATA:timestamp} %{IPV4:clientip} <%{POSINT:num1}>%{POSINT:num2} %{DATA:syslogtimestamp} %{HOSTNAME} %{IPORHOST} %{POSINT:syslog_pid} %{DATA:type} %{GREEDYDATA:msg}" }
    }
    kv {
      source => "msg"
      value_split => ":"
      target => "kv"
    }
  }
}

output {
  elasticsearch {
    hosts => ['192.168.1.116:9200']
    index => "eset-%{+YYYY.MM.dd}"
  }
}

When the data is displayed in Kibana, everything other than the date and time is lumped together in the "message" field, with no separate key/value pairs.

I've been reading and searching for a week now. I've done similar things with other log files with no problems at all, so I'm not sure what I'm missing. Any help/suggestions are greatly appreciated.

Upvotes: 2

Views: 9308

Answers (1)

Tw K

Reputation: 70

Can you try the Logstash configuration below?

    grok {
      match => {
        "message" => ["%{CISCOTIMESTAMP:timestamp} %{IPV4:clientip} %{POSINT:num1} %{TIMESTAMP_ISO8601:syslogtimestamp} %{USERNAME:hostname} %{USERNAME:iporhost} %{NUMBER:syslog_pid} Syslog %{GREEDYDATA:msg}"]
      }
    }
    json {
      source => "msg"
    }

It's working, tested at http://grokconstructor.appspot.com/do/match#result
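Putting the two pieces together, the complete filter section of the question's config might look like this (a sketch, keeping the asker's type => "esetlog" conditional; since the payload after "Syslog" is JSON, the json filter parses msg directly into fields and the kv filter is no longer needed):

    filter {
      if [type] == "esetlog" {
        grok {
          match => {
            "message" => ["%{CISCOTIMESTAMP:timestamp} %{IPV4:clientip} %{POSINT:num1} %{TIMESTAMP_ISO8601:syslogtimestamp} %{USERNAME:hostname} %{USERNAME:iporhost} %{NUMBER:syslog_pid} Syslog %{GREEDYDATA:msg}"]
          }
        }
        # Parse the JSON payload instead of splitting on ":" with kv,
        # which would break on values that themselves contain colons,
        # e.g. "occured":"08-Jul-2016 15:54:54".
        json {
          source => "msg"
        }
      }
    }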

Regards.


Upvotes: 1
