Reputation: 167
My kv filter's field_split looks something like the following:
field_split => ", "
The intent is to split fields on a comma followed by a space. But one of my values is a JSON string, and Logstash seems to drop everything after the first comma it encounters. For example, the mdc field of the log looks like:
abc=abcvalue request={"key1":"value1","key2":"value2"}
What it parses this into (the output is directed to Elasticsearch) is:
"abc": "abcvalue"
"request": "{"key1":"value1""
How do I get the request field to come out as follows?
"request": "{"key1":"value1","key2":"value2"}"
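For context, the truncation is consistent with field_split being treated as a set of single-character delimiters rather than a literal two-character string, so ", " splits on every comma or space. A minimal Python sketch of that behaviour (this imitates the observed splitting, it is not Logstash's actual kv implementation):

```python
import re

mdc = 'abc=abcvalue request={"key1":"value1","key2":"value2"}'

# Split on any comma OR space, as a character-set delimiter would,
# then keep only the tokens that look like key=value pairs
tokens = re.split(r"[, ]", mdc)
kv = dict(t.split("=", 1) for t in tokens if "=" in t)

print(kv["request"])  # truncated at the first comma inside the JSON
```

Running this yields `{"key1":"value1"` for the request key, matching the truncated output described above.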
Upvotes: 0
Views: 691
Reputation: 17165
If you truly have JSON, you can extract it with grok and then parse it with the json filter.
For example:
filter {
  grok {
    # match from "{" to the first "}" and put it in request_json
    match => { "message" => "request=(?<request_json>{[^}]+})" }
  }
  json {
    source => "request_json"
    target => "request"
    remove_field => ["request_json"]
  }
}
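The same two steps (regex capture, then JSON parse) can be sketched in Python to check that the pattern works on the sample line from the question; the `request=(?P<request_json>{[^}]+})` pattern mirrors the grok expression above:

```python
import json
import re

message = 'abc=abcvalue request={"key1":"value1","key2":"value2"}'

# Step 1: capture from "{" to the first "}" after "request="
m = re.search(r'request=(?P<request_json>{[^}]+})', message)

# Step 2: parse the captured string as JSON, giving a structured object
request = json.loads(m.group("request_json"))

print(request)  # {'key1': 'value1', 'key2': 'value2'}
```

Note that `[^}]+` stops at the first closing brace, so this works for a flat JSON object but would truncate nested objects; for nested JSON you would need a more elaborate pattern.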
Upvotes: 1