Reputation: 1
I'm new to the ELK stack, and after reading the documentation I am still unable to get Logstash to recognise the custom timestamp format in my logs.
The logs look like this:
2014-11-30 03:30:01.000118,122.99.34.242,123.56.212.1,44,u,q,NA,0 ? a.root-servers.net A
My Logstash filter is this:
filter {
  date {
    # 2014-11-30 03:30:01.000118,122.99.34.242,123.56.212.1,44,u,q,NA,0 ? a.root-servers.net A
    match => ["timestamp", "YYYY-MM-dd HH:mm:ss"]
    locale => "en"
  }
}
However, when I run the log file through this config, Logstash doesn't appear to recognise the match I have defined:
{
       "message" => "2014-11-30 03:30:01.011895,123.52.36.153,213.55.121.1,55,u,q,NA,0\t? mobile-collector.newrelic.com A\t",
      "@version" => "1",
    "@timestamp" => "2014-12-03T03:09:49.857Z",
          "type" => "syslog",
          "host" => "0.0.0.0",
          "path" => "/var/log/dns/2014-11-30_0335_dnsparse.log"
}
I'm running "logstash 1.4.2-modified".
For completeness, here is my entire Logstash config:
input {
  file {
    path => "/var/log/dns/*.log"
    type => "dns_parselog"
    start_position => "beginning"
  }
}

filter {
  date {
    # 2014-11-30 03:30:01.000118,122.99.34.242,123.56.212.1,44,u,q,NA,0 ? a.root-servers.net A
    match => ["timestamp", "YYYY-MM-dd HH:mm:ss"]
    locale => "en"
  }
}

output {
  elasticsearch {
    host => "localhost"
  }
  stdout { codec => rubydebug }
}
Upvotes: 0
Views: 166
Reputation: 11571
You're asking the date filter to parse a field named "timestamp", but there is no such field. Start by extracting the timestamp in the message to a separate field, then use the date filter.
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
  }
  date {
    match => ["timestamp", "YYYY-MM-dd HH:mm:ss.SSSSSS"]
    remove_field => ["timestamp"]
  }
}
You'll want to add additional patterns to the grok expression to extract additional fields, but that's out of the scope of this question.
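To see why the two-step approach works, here is a minimal Python sketch of the same pipeline: first isolate the leading timestamp from the message (what grok does), then parse it with an explicit format whose fractional-second part matches the six digits in the logs (what the date filter's `SSSSSS` does). The regex here is a hypothetical stand-in for `%{TIMESTAMP_ISO8601}`, narrowed to this log format:

```python
import re
from datetime import datetime

# Stand-in for grok's %{TIMESTAMP_ISO8601:timestamp}, narrowed to the
# "YYYY-MM-dd HH:mm:ss.SSSSSS" shape seen at the start of each log line.
TIMESTAMP_RE = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{6})")

def extract_timestamp(message: str) -> datetime:
    """Step 1: pull the timestamp out of the line (grok's job).
    Step 2: parse it with an explicit format (the date filter's job).
    %f consumes the six fractional-second digits, like SSSSSS."""
    match = TIMESTAMP_RE.match(message)
    if match is None:
        raise ValueError("no timestamp at start of line")
    return datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S.%f")

line = "2014-11-30 03:30:01.000118,122.99.34.242,123.56.212.1,44,u,q,NA,0\t? a.root-servers.net A"
ts = extract_timestamp(line)
```

Without the extraction step there is no `timestamp` field to parse, which is exactly the failure in the question: the date filter silently does nothing and `@timestamp` keeps the ingest time.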
Upvotes: 1