user2406467

Reputation: 1047

Extracting fields in Logstash

I am using Logstash (with Kibana as the UI). I would like to extract some fields from my logs so that I can filter by them on the left-hand side of the UI.

A sample line from my log looks like this:

2013-07-04 00:27:16.341 -0700 [Comp40_db40_3720_18_25] client_login=C-316fff97-5a19-44f1-9d87-003ae0e36ac9 ip_address=192.168.4.1

In my logstash conf file, I put this:

filter {
    grok {
        type => "mylog"
        pattern => "(?<CLIENT_NAME>Comp\d+_db\d+_\d+_\d+_\d+)"
    }
}

Ideally, I would like to extract Comp40_db40_3720_18_25 (the number of digits can vary, but there will always be at least 1 in each section separated by _) and client_login (which can also be client_logout). Then I could search for CLIENT_NAME=Comp40..., CLIENT_NAME=Comp55, etc.

Am I missing something in my config to make this a field that I can use in Kibana?

Thanks!

Upvotes: 3

Views: 4381

Answers (1)

Adam

Reputation: 1982

If you are having any difficulty getting the pattern to match correctly, the Grok Debugger is a great tool for working it out.

For your given problem, you could separate out the part you want to search on into its own field and capture the additional varying digits in another (trash) field.

For example:

(?<SEARCH_FIELD>Comp\d+)%{GREEDYDATA:trash_variable}]

(You can verify the above pattern in the Grok Debugger.)
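For completeness, here is a rough sketch of how that pattern might sit in a filter block like the one in your question. The field names SEARCH_FIELD, CLIENT_ACTION, CLIENT_ID and IP_ADDRESS are just illustrative, and the pattern assumes every line follows the sample format you posted:

filter {
    grok {
        type    => "mylog"
        # keep the searchable prefix, discard the varying digits, then pick up the login/logout key and its value
        pattern => "(?<SEARCH_FIELD>Comp\d+)%{GREEDYDATA:trash_variable}] (?<CLIENT_ACTION>client_log(?:in|out))=%{NOTSPACE:CLIENT_ID} ip_address=%{IP:IP_ADDRESS}"
    }
}

If that matches, SEARCH_FIELD, CLIENT_ACTION and the other captures should show up as filterable fields in Kibana, and trash_variable can simply be ignored (or removed with a mutate filter).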

Upvotes: 4
