Reputation: 4580
I'm currently discovering Elasticsearch, Kibana, and Logstash with Docker (version 7.1.1). The three containers are running well.
I have some data files containing tab-separated lines like this one:
foo=bar type=alpha T=20180306174204527
My logstash.conf contains:
input {
    file {
        path => "/tmp/data/*.txt"
        start_position => "beginning"
    }
}
filter {
    kv {
        field_split => "\t"
        value_split => "="
    }
}
output {
    elasticsearch { hosts => ["elasticsearch:9200"] }
    stdout {
        codec => rubydebug
    }
}
This gives me the following event:
{
    "host" => "07f3051a3bec",
    "foo" => "bar",
    "message" => "foo=bar\ttype=alpha\tT=20180306174204527",
    "T" => "20180306174204527",
    "@timestamp" => 2019-06-17T13:47:14.589Z,
    "path" => "/tmp/data/ucL12018_03_06.txt",
    "type" => "alpha",
    "@version" => "1"
}
The first step of the job is done.
Now I want to add a filter to transform the value of the key T into a timestamp, so that I get:
{
    ...
    "T" => "2018-03-06T17:42:04.527Z",
    "@timestamp" => 2019-06-17T13:47:14.589Z,
    ...
}
I do not know how to do it. I tried adding a second filter just after the kv filter, but nothing changes when I add new files.
Upvotes: 1
Views: 29
Reputation: 4110
Add this filter after the kv filter:
date {
    match => [ "T", "yyyyMMddHHmmssSSS" ]
    target => "T"
}
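For reference, the complete filter section would then look like this (nothing new here, just the kv filter from your question combined with the date filter above):

filter {
    kv {
        field_split => "\t"
        value_split => "="
    }
    date {
        # Parse e.g. 20180306174204527 into 2018-03-06T17:42:04.527Z
        match => [ "T", "yyyyMMddHHmmssSSS" ]
        # Write the parsed date back into T instead of the default @timestamp
        target => "T"
    }
}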
The date filter will try to parse the field T using the provided pattern to create a date, which will be written back to the T field (by default it overwrites the @timestamp field).
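Conversely, if you want the parsed date to replace the event's @timestamp (for example, so that Kibana orders events by the time in your data rather than the ingestion time), just omit the target option:

date {
    # Without target, the parsed date overwrites @timestamp
    match => [ "T", "yyyyMMddHHmmssSSS" ]
}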
Upvotes: 1