Nick

Reputation: 189

Logstash issue with KV filter

I am trying to index a document in Elasticsearch through Logstash. An example line from the file I am trying to index is as follows:

GET firstname=john&lastname=smith 400

My objective is to index each event with fields like the following:

    HTTPMethod: GET
    firstname : john
    lastname: smith
    query_time : 400

Here is what I have so far:

    filter {
      grok {
        match => { "message" => "%{WORD:HttpMethod} %{GREEDYDATA:KVText} %{NUMBER:time:int}" }
      }
      kv {
        source => "KVText"
        value_split => "&"
        remove_field => [ "KVText" ]
      }
    }

However, when I run Logstash with this conf file I see the following:

      "query_time": 400,
      "message": "GET firstname=john&lastname=smith 400\r",
      "HttpMethod": "GET",
      "firstname=john": "lastname=smith"

The key/value pairs are not being split into discrete fields, e.g. firstname: john and lastname: smith.

Also, whenever I make a change to my log file, Logstash doesn't pick up the change for indexing in real time. I have to rename the file and restart Logstash. I understand this has something to do with the sincedb_path in my logstash.conf.
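From what I have read, pointing sincedb_path at /dev/null together with start_position => "beginning" in the file input should force a full re-read of the file on each run, e.g. something like the sketch below (the path is just a placeholder), but I am not sure that is the right approach:

    input {
      file {
        path => "/path/to/mylog.log"     # placeholder path
        start_position => "beginning"    # read the file from the start instead of tailing
        sincedb_path => "/dev/null"      # don't persist the read position (testing only; "NUL" on Windows)
      }
    }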

Any pointers are truly appreciated.

Thanks, Nick

Upvotes: 0

Views: 1644

Answers (1)

whites11

Reputation: 13300

You are configuring the kv filter the wrong way.

The value_split parameter tells the filter which character separates a key from its value (so it should be "="), while the field_split setting tells it which character separates one key/value pair from the next. Try:

    kv {
      source => "KVText"
      value_split => "="
      field_split => "&"
      remove_field => [ "KVText" ]
    }
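With field_split set to "&" and value_split set to "=", the pairs in KVText should be split into discrete fields, so (keeping the field names from your output) the event should look roughly like:

      "HttpMethod": "GET",
      "firstname": "john",
      "lastname": "smith",
      "query_time": 400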

Upvotes: 2
