olg32

Reputation: 345

Parsing Syslog with Logstash grok filter isn't working with Kibana

I have created a very basic grok filter to parse Cisco Syslogs:

input {
  udp {
    port => 5140
    type => syslog
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:Timestamp_Local} %{IPV4:IP_Address} %{GREEDYDATA:Event}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    index => "ciscologs-%{+YYYY.MM.dd}"
  }
}
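As a sanity check on the pattern itself, here is a minimal Python sketch that approximates the three grok patterns with simplified regular expressions; the sample log line is made up for illustration and is not an actual message from my devices:

```python
import re

# Rough, simplified regex equivalents of the grok patterns
# TIMESTAMP_ISO8601, IPV4 and GREEDYDATA (illustration only).
pattern = re.compile(
    r"(?P<Timestamp_Local>\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}"
    r"(?:\.\d+)?(?:Z|[+-]\d{2}:?\d{2})?) "
    r"(?P<IP_Address>\d{1,3}(?:\.\d{1,3}){3}) "
    r"(?P<Event>.*)"
)

# Hypothetical sample line in the expected "timestamp, IP, event" order.
line = ("2021-07-15T10:21:33Z 192.168.1.10 "
        "%LINK-3-UPDOWN: Interface GigabitEthernet0/1, changed state to up")

m = pattern.match(line)
print(m.groupdict())
```

A line in this shape parses cleanly, which suggests the pattern is fine and the problem lies elsewhere.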

After reloading Logstash and verifying that its logs showed no major issues, I reloaded Kibana and refreshed the indices.

When accessing the Discover section, I saw that the index was indeed created. But looking at the fields, they were the default ones and not the ones defined in the grok filter.

The logs received after adding the filter show the following tag in Kibana:

(screenshot of the tag shown in Kibana)

Before adding the filter, I made sure it works using Kibana's Grok Debugger. The tag indicates that there was a problem parsing the logs, but that's as far as I've gotten.

Running versions are: 7.7.1 (Kibana/Elasticsearch) and 7.13.3 (Logstash)

I'm not sure where the issue might be, so any help would be appreciated.

Upvotes: 0

Views: 402

Answers (1)

olg32

Reputation: 345

I found the problem: I was trying to match the fields in the order sent by the Cisco devices, not in the order they actually appear in the "message" field. Once I modified that, the filter started working as expected.
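For illustration only, a hedged sketch of the kind of adjustment this means; the field order shown here is hypothetical, since the actual layout of your "message" field may differ:

```ruby
filter {
  grok {
    # Hypothetical: if the received "message" carried the IP address
    # before the timestamp, the pattern would need to be reordered to
    # match what is actually in the field, e.g.:
    match => { "message" => "%{IPV4:IP_Address} %{TIMESTAMP_ISO8601:Timestamp_Local} %{GREEDYDATA:Event}" }
  }
}
```

Testing the pattern in Kibana's Grok Debugger against the raw contents of the "message" field (rather than a hand-typed line) shows the mismatch immediately.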

Upvotes: 0
