Reputation: 27
I was trying to filter a message to extract the timestamp and use the date filter to convert the string to a date, but the converted date is different from the original.
Filter code:
filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \| %{LOGLEVEL:loglevel} \| %{NOTSPACE:taskid} \| %{NOTSPACE:logger} \| %{WORD:label}( \| %{INT:duration:int})?" ]
  }
  date {
    match => ["timestamp", "YYYY-MM-DD HH:mm:ss,SSS"]
    target => "timestamp"
  }
}
Input:
2021-04-19 12:06:39,586 | INFO | 12345 | TASK_START | start
Output:
"timestamp" => 2021-01-19T06:36:39.586Z,
The hour and minute have changed.
Upvotes: 0
Views: 4862
Reputation: 1616
If the timestamps in your logs are not UTC, you can provide the timezone information (and use dd, not DD, for the day of the month). For example:
date {
  match => ["timestamp", "YYYY-MM-dd HH:mm:ss,SSS"]
  timezone => "Asia/Kolkata"
  target => "@timestamp" # <--- this is optional, @timestamp is the default
}
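If you want to check the result without sending anything to Elasticsearch, a throwaway pipeline like the sketch below prints the parsed event to the console. The grok pattern is copied from the question, and Asia/Kolkata is only an assumption about where the logs were written:
input { stdin { } }

filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \| %{LOGLEVEL:loglevel} \| %{NOTSPACE:taskid} \| %{NOTSPACE:logger} \| %{WORD:label}( \| %{INT:duration:int})?" ]
  }
  date {
    # dd (day of month) rather than DD (day of year)
    match => ["timestamp", "YYYY-MM-dd HH:mm:ss,SSS"]
    timezone => "Asia/Kolkata"
  }
}

output { stdout { codec => rubydebug } }
Pasting the sample line from the question on stdin should print "@timestamp" => 2021-04-19T06:36:39.586Z.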
Upvotes: 2
Reputation: 4072
Logstash and Elasticsearch store dates as UTC, and Kibana will map that to the browser's timezone. By default the date filter uses the local timezone, so if you are in the Asia/Kolkata timezone, which is +05:30 relative to UTC, this is working exactly as expected: 12:06:39 local time is 06:36:39 UTC. If the timestamp field is in a different timezone, then use the timezone option of the date filter to tell it which one.
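As a minimal sketch of that last point, assuming the logs are written in Asia/Kolkata local time (substitute your own zone) and reusing the field names from the question's filter:
date {
  # Assumption: the raw timestamp string is local time in Asia/Kolkata (+05:30)
  match => ["timestamp", "YYYY-MM-dd HH:mm:ss,SSS"]
  timezone => "Asia/Kolkata"
  target => "timestamp"
}
With that in place, 2021-04-19 12:06:39,586 is stored as 2021-04-19T06:36:39.586Z, and Kibana will display it as 12:06:39 again for a browser in the same timezone.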
Upvotes: 1