Reputation: 296
I have a log file that's an array of objects that looks something like this:
[
{ "cate1": "data1a", "cate2": "data2a" },
{ "cate1": "data1b", "cate2": "data2b" },
{ "cate1": "data1c", "cate2": "data2c" }
]
and I need each object in the array to be a separate entry in Elasticsearch, with each "cate" as its own field. My current logstash.conf file is:
input {
  tcp {
    port => 5000
  }
}

## Add your filters / logstash plugins configuration here
filter {
  json {
    source => "message"
    target => "event"
  }
  mutate {
    gsub => ["message", "\]", ""]
    gsub => ["message", "\[", ""]
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}
but it tags every entry except the first with "_jsonparsefailure", and the square brackets get parsed along with the data as well. How would I go about configuring Logstash to handle this properly?
Upvotes: 3
Views: 3545
Reputation: 4089
Instead of using the json filter, you should look into using the json codec on your input. It seems to do exactly what you want:
This codec may be used to decode (via inputs) and encode (via outputs) full JSON messages. If the data being sent is a JSON array at its root multiple events will be created (one per element).
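For the array in the question, that means three separate events, each with its own "cate1" and "cate2" fields, roughly:

{ "cate1": "data1a", "cate2": "data2a" }
{ "cate1": "data1b", "cate2": "data2b" }
{ "cate1": "data1c", "cate2": "data2c" }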
It would look something like this:
input {
  tcp {
    port => 5000
    codec => json { }
  }
}
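
To check the behaviour end to end, you can send the sample array to the TCP input yourself. Here is a minimal sketch in Python, assuming Logstash is listening on localhost:5000 as configured above:

import socket

# The sample array from the question, sent as one message over TCP.
payload = b'[{"cate1": "data1a", "cate2": "data2a"}, {"cate1": "data1b", "cate2": "data2b"}, {"cate1": "data1c", "cate2": "data2c"}]\n'

# Assumes Logstash's tcp input is reachable on localhost:5000.
with socket.create_connection(("localhost", 5000)) as sock:
    sock.sendall(payload)

Each element of the array should then arrive in Elasticsearch as its own document, with "cate1" and "cate2" as top-level fields.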
Upvotes: 3