Reputation: 3186
I'm processing some metrics data and storing it in Elasticsearch. Now I want to fetch that data from Elasticsearch and filter it in Logstash, with the goal of having more relevant fields after the filtering. For this purpose I planned to use a grok filter, but I'm not a grok expert and I have never parsed this kind of data.
Here is a sample document from Elasticsearch:
{
  "_index" : "metrics",
  "_type" : "metrics",
  "_id" : "AVh4R8n3cN8PY7B3sFIM",
  "_score" : 1.0,
  "_source" : {
    "event_time" : "2016-11-18T16:31:59.769Z",
    "message" : "[{\"values\":[0.04,0.18,0.17],\"dstypes\":[\"gauge\",\"gauge\",\"gauge\"],\"dsnames\":[\"shortterm\",\"midterm\",\"longterm\"],\"time\":1479486719.645,\"interval\":10.000,\"host\":\"test-host\",\"plugin\":\"load\",\"plugin_instance\":\"\",\"type\":\"load\",\"type_instance\":\"\"}]",
    "version" : "1",
    "tags" : [ ]
  }
}
After the Logstash filtering, I expect to have this:
{
  "_index" : "metrics",
  "_type" : "metrics",
  "_id" : "AVh4R8n3cN8PY7B3sFIM",
  "_score" : 1.0,
  "_source" : {
    "event_time" : "2016-11-18T16:31:59.769Z",
    "values" : [0.04,0.18,0.17],
    "dstypes" : ["gauge","gauge","gauge"],
    "dsnames" : ["shortterm","midterm","longterm"],
    "time" : 1479486719.645,
    "interval" : 10.000,
    "host" : "test-host",
    "plugin" : "load",
    "plugin_instance" : "",
    "type" : "load",
    "type_instance" : ""
  }
}
Can someone help me with advice or a sample grok filter to achieve this?
Thank you in advance!
Upvotes: 1
Views: 291
Reputation: 3186
I finally resolved this issue by using another filter; grok was not appropriate for this use case.
filter {
  json {
    source => "message"
  }
}
The json filter parses the JSON string in the message field and adds each key/value pair as a field on the event, which solves the issue.
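Note that in this sample the message field contains a JSON *array* with a single object, not a bare object. If the array can hold several objects, or if your Logstash version refuses to merge a root-level array into the event, a variant is to parse into a target field and then use the split filter to emit one event per array element. This is only a sketch; the field name parsed_metrics is illustrative, not something from the original pipeline:

filter {
  json {
    source => "message"
    target => "parsed_metrics"   # parsed array lands under this (illustrative) field
  }
  split {
    field => "parsed_metrics"    # one event per element of the array
  }
}

After the split, each event carries one metrics object under parsed_metrics; its keys can then be promoted to top-level fields (for example with a mutate/rename or a small ruby filter) if the flat layout shown above is required.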
Upvotes: 0