Reputation: 2787
I have the following Logstash config file:
input {
  tcp { type => "tcp_test" port => 1514 add_field => [ "log_type", "romote_log" ] }
}
filter {
  if [type] == "tcp_test" {
    grok {
      type => tcp_test
      match => [ "message", "%{WORD:client} %{WORD:app}" ]
    }
    date {
      match => ["timestamp", "yyyy-MM-dd-HH:mm:ss"]
      locale => "en"
      target => "@timestamp"
    }
  }
}
output {
  elasticsearch { host => localhost }
}
My input for this configuration is dummy_computer from_leo_messi 2013-05-15-23:19:27 this is blah blah logs.. but in Kibana the event does not get that time as its timestamp. I just want 2013-05-15-23:19:27 as the timestamp. How can I do this?
Upvotes: 0
Views: 344
Reputation: 11571
Your date filter is attempting to parse the "timestamp" field, but there is no such field. You'll have to adjust your grok filter accordingly. This works:
filter {
  grok {
    match => [
      "message",
      "%{WORD:client} %{WORD:app} (?<timestamp>\S+) %{GREEDYDATA:message}"
    ]
    overwrite => ["message"]
  }
  date {
    match => ["timestamp", "YYYY-MM-dd-HH:mm:ss"]
    remove_field => ["timestamp"]
  }
}
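To see why this works on the sample line, here is a rough sketch in Python (purely illustrative, not how Logstash runs internally) of what the grok capture and the date match do. Python regexes spell named groups as (?P<name>...) where grok's Oniguruma syntax uses (?<name>...), and the Joda-Time pattern "YYYY-MM-dd-HH:mm:ss" corresponds roughly to strptime's "%Y-%m-%d-%H:%M:%S":

```python
import re
from datetime import datetime

# The sample log line from the question.
line = "dummy_computer from_leo_messi 2013-05-15-23:19:27 this is blah blah logs.."

# %{WORD:client} %{WORD:app} (?<timestamp>\S+) %{GREEDYDATA:message},
# approximated with plain regex: WORD ~ \w+, GREEDYDATA ~ .*
pattern = re.compile(r"(?P<client>\w+) (?P<app>\w+) (?P<timestamp>\S+) (?P<message>.*)")
m = pattern.match(line)

# The date filter then parses the captured "timestamp" field;
# "YYYY-MM-dd-HH:mm:ss" (Joda) ~ "%Y-%m-%d-%H:%M:%S" (strptime).
ts = datetime.strptime(m.group("timestamp"), "%Y-%m-%d-%H:%M:%S")

print(m.group("client"))    # dummy_computer
print(m.group("timestamp")) # 2013-05-15-23:19:27
print(ts.isoformat())       # 2013-05-15T23:19:27
```

With overwrite => ["message"], the trailing GREEDYDATA capture replaces the original message field, so the event's message becomes just "this is blah blah logs..".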
For an explanation of the "overwrite" option, see my answer to logstash, syslog and grok.
Upvotes: 1