Mukesh

Reputation: 953

Elasticsearch error while parsing date from logs

Hi, I have created an ingest pipeline to ingest custom logs. My pipeline, with its processors, looks like this:

[
  {
    "grok": {
      "field": "message",
      "patterns": [
        "\\[%{TIMESTAMP_ISO8601:timestamp}\\] %{DATA:env}\\.%{DATA:log.level}: (?<message>(.|\r|\n)*)"
      ],
      "ignore_missing": true
    }
  },
  {
    "date": {
      "field": "timestamp",
      "formats": [
        "yyyy-MM-dd'T'HH:mm:ss.SSXX"
      ],
      "target_field": "@timestamp"
    }
  },
  {
    "json": {
      "field": "message",
      "add_to_root": true,
      "ignore_failure": true
    }
  }
]

Now, when I send logs with this date format:

 [2022-01-27T08:31:16.806171+00:00] local.INFO: {"request-id":"5f9c3819-97b3-4439-87ab-30c58bffd2a5","event_name":"cancel_pending_withdraw","message":"action webhook sent"}
 [2022-01-27T12:31:09.972653+00:00] local.INFO: {"request-id":"6cea1e1d-8e54-4225-b7d8-5383e39690bb","event_name":"deposit_approved","message":"Triggering all action now"}

it gives this error:

{"type":"illegal_argument_exception","reason":"failed to parse date field [2022-01-27T10:22:49.234717+00:00] with format [yyyy-MM-dd'T'HH:mm:ss.SSXX]","caused_by":{"type":"date_time_parse_exception","reason":"Text '2022-01-27T10:22:49.234717+00:00' could not be parsed at index 22"}}

Can anyone help me figure out what exactly is wrong here?

Upvotes: 0

Views: 522

Answers (1)

Val

Reputation: 217254

The yyyy-MM-dd'T'HH:mm:ss.SSXX pattern only accepts two fractional-second digits, while your timestamps carry six (microsecond precision), which is why parsing stops at index 22. You can use the built-in strict_date_optional_time_nanos date format instead, which supports fractional seconds up to nanosecond precision, and it will work:

  {
    "date": {
      "field": "timestamp",
      "formats": [
        "strict_date_optional_time_nanos"
      ],
      "target_field": "@timestamp"
    }
  },
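
For example, a quick check with the simulate API (using one of the timestamps from the question) should return the document with @timestamp populated instead of the parse exception:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "date": {
          "field": "timestamp",
          "formats": ["strict_date_optional_time_nanos"],
          "target_field": "@timestamp"
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "timestamp": "2022-01-27T08:31:16.806171+00:00"
      }
    }
  ]
}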

Upvotes: 1
