Veer Shrivastav

Reputation: 5496

Querying Kibana using a grok pattern

We have configured the ELK stack over our daily logs and use the Kibana UI to perform basic search/query operations on the set of logs.

Some of our logs contain a certain field in the message while others don't, so we have not configured it as a separate field in our Logstash configuration.

I have logs like:

[28/Jun/2016:23:59:56 +0530] 192.168.xxx.xxx [API:Profile]get_data_login: Project password success:  9xxxxxxxxx0
[28/Jun/2016:23:59:56 +0530] 192.168.xxx.xxx [API:Profile]session_end: logout success:  9xxxxxxxxx0 TotalTime:1.1234

From logs like these, I wish to extract TotalTime for all session_end logs and visualize it.

How should I do it?

I can search for all the logs that contain session_end; however, I am not able to apply a grok pattern to that set of logs.

Upvotes: 3

Views: 2805

Answers (2)

baudsp

Reputation: 4100

You can use two different grok patterns in the same filter:

grok {
  # Patterns are tried in order, so the more specific session_end pattern must come first
  match => {
    "message" => ['\[%{HTTPDATE}\] %{IP} \[API:Profile\]session_end: %{GREEDYDATA:session} TotalTime:%{GREEDYDATA:tt}',
                  '\[%{HTTPDATE}\] %{IP} \[API:Profile\]%{GREEDYDATA:data}']
  }
}

Each message will be tested against the first pattern; if it contains session_end: and TotalTime:, you'll get an Elasticsearch document with the session and tt fields. You'll then be able to do aggregations and visualisations on these fields.

The other messages (without session_end: and TotalTime:) will be parsed by the second pattern.
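Note that %{GREEDYDATA:tt} captures TotalTime as a string. For numeric aggregations in Kibana, one option is to convert it after the grok; a minimal sketch using the standard mutate filter (the tt field name comes from the pattern above):

mutate {
  # Convert the extracted TotalTime from string to float so
  # Elasticsearch indexes it as a number and Kibana can aggregate it
  convert => { "tt" => "float" }
}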

Upvotes: 1

Karup

Reputation: 2079

Inside your filter in Logstash you can have something like:

filter {

    ...

    if [message] =~ /session_end/ {
        grok {
            # grok written specifically for the second log format (session_end), e.g.:
            match => { "message" => '\[%{HTTPDATE}\] %{IP} \[API:Profile\]session_end: %{GREEDYDATA:session} TotalTime:%{GREEDYDATA:tt}' }
        }
    }
    else if [message] =~ /get_data_login/ {
        grok {
            # grok written specifically for the first log format (get_data_login), e.g.:
            match => { "message" => '\[%{HTTPDATE}\] %{IP} \[API:Profile\]get_data_login: %{GREEDYDATA:data}' }
        }
    }

    ...

}

Note that grok patterns cannot be used for querying in Kibana; the parsing has to happen in Logstash before the events are indexed.
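Since parsing happens at ingest time, it helps to test patterns locally before touching the production pipeline. A minimal sketch of a standalone test config, assuming the session_end pattern from the other answer (stdin and stdout are standard Logstash plugins):

input {
  stdin {}
}

filter {
  grok {
    # example pattern for the session_end format from the question
    match => { "message" => '\[%{HTTPDATE}\] %{IP} \[API:Profile\]session_end: %{GREEDYDATA:session} TotalTime:%{GREEDYDATA:tt}' }
  }
}

output {
  # rubydebug prints each event with all of its extracted fields
  stdout { codec => rubydebug }
}

Running logstash -f with this config and pasting a sample session_end line into the console will print the event with its extracted session and tt fields.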

Upvotes: 3
