Tsiyona Dershowitz

Reputation: 77

Convert log message timestamp to UTC before storing it in Elasticsearch

I am collecting and parsing Tomcat access-log messages using Logstash and storing the parsed messages in Elasticsearch. I am using Kibana to display the log messages stored in Elasticsearch. Currently I am using Elasticsearch 2.0.0, Logstash 2.0.0, and Kibana 4.2.1.

An access-log line looks something like the following:

02-08-2016 19:49:30.669 ip=11.22.333.444  status=200  tenant=908663983 user=0a4ac75477ed42cfb37dbc4e3f51b4d2 correlationId=RID-54082b02-4955-4ce9-866a-a92058297d81  request="GET /pwa/rest/908663983/rms/SampleDataDeployment HTTP/1.1" userType=Apache-HttpClient requestInfo=- duration=4 bytes=2548 thread=http-nio-8080-exec-5 service=rms itemType=SampleDataDeployment itemOperation=READ dataLayer=MongoDB incomingItemCnt=0 outgoingItemCnt=7 

The time displayed in the log file (e.g. 02-08-2016 19:49:30.669) is in local time, not UTC.

Here is how I parse the message line:

    filter {
            grok {
                match => { "message" => "%{DATESTAMP:logTimestamp}\s+" }
            }

            kv {}

            mutate {
                convert => { "duration" => "integer" }
                convert => { "bytes" => "integer" }
                convert => { "status" => "integer" }
                convert => { "incomingItemCnt" => "integer" }
                convert => { "outgoingItemCnt" => "integer" }

                gsub => [ "message", "\r", "" ]
            }

            grok {
                match => { "request" => [ "(?:%{WORD:method} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpVersion})?)" ] }
                overwrite => [ "request" ]
            }
    }

I would like Logstash to convert the time read from the log message ('logTimestamp' field) into UTC before storing it in Elasticsearch.

Can someone assist me with that please?

--

I have added a date filter to my processing, but I had to specify a timezone explicitly.

    filter {
            grok {
                match => { "message" => "%{DATESTAMP:logTimestamp}\s+" }
            }

            date {
               match => [ "logTimestamp" , "MM-dd-yyyy HH:mm:ss.SSS" ]
               timezone => "Asia/Jerusalem"
               target => "logTimestamp"
            }

            ...
   }

Is there a way to convert the date to UTC without supplying the local timezone explicitly, so that Logstash uses the timezone of the machine it is running on?

The motivation behind this question is that I would like to use the same configuration file in all my deployments, across various timezones.
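In other words, I am hoping something like the following would work, with the date filter falling back to the timezone of the host when no timezone option is given (a sketch, not tested):

    filter {
            grok {
                match => { "message" => "%{DATESTAMP:logTimestamp}\s+" }
            }

            date {
               # No explicit timezone -- ideally the filter would use
               # the local timezone of the machine Logstash runs on.
               match => [ "logTimestamp" , "MM-dd-yyyy HH:mm:ss.SSS" ]
               target => "logTimestamp"
            }
    }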

Upvotes: 1

Views: 4766

Answers (2)

Alexander Marquardt

Reputation: 1539

This can also be done in an ingest processor as follows:

PUT _ingest/pipeline/chage_local_time_to_iso
{
  "processors": [
    {
      "date" : {
        "field" : "my_time",
        "target_field": "my_time", 
        "formats" : ["dd/MM/yyyy HH:mm:ss"],
        "timezone" : "Europe/Madrid"
      }
    }
  ]
}
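To apply the pipeline when indexing, reference it in the index request; the index name and document below are just for illustration (the date string matches the `dd/MM/yyyy HH:mm:ss` format declared above):

    PUT my-index/_doc/1?pipeline=chage_local_time_to_iso
    {
      "my_time": "08/02/2016 19:49:30"
    }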

Upvotes: 0

Alain Collins

Reputation: 16362

That's what the date{} filter is for: it parses a string field containing a date and replaces the [@timestamp] field with that value, converted to UTC.
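A minimal sketch of that approach, using the field name and timezone from the question (without a `target`, the parsed value goes into `@timestamp` by default):

    filter {
            date {
               # Parse the local-time string; the result is stored
               # in @timestamp as UTC.
               match => [ "logTimestamp" , "MM-dd-yyyy HH:mm:ss.SSS" ]
               timezone => "Asia/Jerusalem"
            }
    }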

Upvotes: 1
