Reputation: 1666
Consider a subset of a sample output from http://demo.nginx.com/status:
{
  "timestamp": 1516053885198,
  "server_zones": {
    "hg.nginx.org": {
      ... // Data for "hg.nginx.org"
    },
    "trac.nginx.org": {
      ... // Data for "trac.nginx.org"
    }
  }
}
The keys "hg.nginx.org"
and "track.nginx.org"
are quite arbitrary, and I would like to parse them into something meaningful for Elasticsearch. In other words, each key under "server_zones"
should be transformed into a separate event. Logstash should thus emit the following events:
[
  {
    "timestamp": 1516053885198,
    "server_zone": "hg.nginx.org",
    ... // Data for "hg.nginx.org"
  },
  {
    "timestamp": 1516053885198,
    "server_zone": "trac.nginx.org",
    ... // Data for "trac.nginx.org"
  }
]
What is the best way to go about doing this?
Upvotes: 1
Views: 381
Reputation: 5135
You can try the ruby filter. Get the server zones and build a new object from the key/value pairs you want to include. Off the top of my head, something like the snippet below should work. Obviously you then need to map the object to your field in the index. Adapt the snippet to your custom format, i.e. build the array or object as you want.
filter {
  ruby {
    code => "
      time = event.get('timestamp')
      my_arr = []
      # keep only the 'server_zones' key and iterate over its entries
      event.to_hash.select { |k, v| ['server_zones'].include?(k) }.each do |key, value|
        my_custom_object = {}
        # map the key/value pairs into my_custom_object
        my_custom_object['timestamp'] = time
        my_custom_object[key] = value
        my_arr.push(my_custom_object) # you'd probably move this out based on nesting level
      end
      # a ruby filter has no 'map' (that belongs to the aggregate filter);
      # write the result back onto the event instead
      event.set('my_indexed_field', my_arr)
    "
  }
}
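Note that this emits a single event carrying an array. If you want each zone as its own event, as in the question, one option is to follow the ruby filter with the split filter. A minimal sketch, assuming the ruby filter above has populated my_indexed_field with one hash per zone:
filter {
  # split clones the event once per element of the array field
  split {
    field => "my_indexed_field"
  }
}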
In the output section, use the rubydebug codec to inspect the emitted events while debugging:
output {
  stdout { codec => rubydebug }
}
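For completeness, if you are polling the status endpoint from Logstash itself, a sketch of an input using the http_poller plugin (assuming it is installed and the endpoint returns JSON; the "nginx_status" name and 10s interval are arbitrary choices) might look like:
input {
  http_poller {
    # poll the demo status endpoint on a fixed schedule
    urls => {
      nginx_status => "http://demo.nginx.com/status"
    }
    schedule => { every => "10s" }
    codec => "json"
  }
}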
Upvotes: 1