Valera Fedorenko

Reputation: 99

How to view trace logs from OpenTelemetry in Elastic APM

I receive logs from the opentelemetry-collector in Elastic APM with the following log structure:

"{Timestamp:HH:mm:ss} {Level:u3} trace.id={TraceId} transaction.id={SpanId}{NewLine}{Message:lj}{NewLine}{Exception}"

example:

08:27:47 INF trace.id=898a7716358b25408d4f193f1cd17831 transaction.id=4f7590e4ba80b64b SOME MSG

I tried to use this pipeline:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "parse multiple patterns",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{TIMESTAMP_ISO8601:logtime} %{LOGLEVEL:loglevel} \\[trace.id=%{TRACE_ID:trace.id}(?: transaction.id=%{SPAN_ID:transaction.id})?\\] %{GREEDYDATA:message}"],
          "pattern_definitions": {
            "TRACE_ID": "[0-9A-Fa-f]{32}",
            "SPAN_ID": "[0-9A-Fa-f]{16}"
          }
        },
        "date": { "field": "logtime", "target_field": "@timestamp", "formats": ["HH:mm:ss"] }
      }
    ]
  }
}

My goal is to see the logs in Elastic APM like this:

{
  "@timestamp": "2021-01-05T10:10:10",
  "message": "Protocol Port MIs-Match",
  "trace": {
    "traceId": "898a7716358b25408d4f193f1cd17831",
    "spanId": "4f7590e4ba80b64b"
  }
}


Upvotes: 2

Views: 1813

Answers (1)

Val

Reputation: 217324

Good job so far. Your pipeline is almost there; however, the grok pattern needs some fixing, and you have some orphan curly braces (the date processor needs to be its own entry in the processors array). Here is a working example:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "parse multiple patterns",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            """%{TIME:logtime} %{WORD:loglevel} trace.id=%{TRACE_ID:trace.id}(?: transaction.id=%{SPAN_ID:transaction.id})? %{GREEDYDATA:message}"""
          ],
          "pattern_definitions": {
            "TRACE_ID": "[0-9A-Fa-f]{32}",
            "SPAN_ID": "[0-9A-Fa-f]{16}"
          }
        }
      },
      {
        "date": {
          "field": "logtime",
          "target_field": "@timestamp",
          "formats": [
            "HH:mm:ss"
          ]
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "08:27:47 INF trace.id=898a7716358b25408d4f193f1cd17831 transaction.id=4f7590e4ba80b64b SOME MSG"
      }
    }
  ]
}

Response:

{
  "docs" : [
    {
      "doc" : {
        "_index" : "_index",
        "_type" : "_doc",
        "_id" : "_id",
        "_source" : {
          "trace" : {
            "id" : "898a7716358b25408d4f193f1cd17831"
          },
          "@timestamp" : "2021-01-01T08:27:47.000Z",
          "loglevel" : "INF",
          "message" : "SOME MSG",
          "logtime" : "08:27:47",
          "transaction" : {
            "id" : "4f7590e4ba80b64b"
          }
        },
        "_ingest" : {
          "timestamp" : "2021-03-30T11:07:52.067275598Z"
        }
      }
    }
  ]
}
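
Once the simulated output looks right, the pipeline has to be stored and attached to the index that actually receives the collector's log documents; otherwise the parsing only happens in the simulation. Below is a minimal sketch, assuming the pipeline is called otel-logs and the logs are written to an index named logs-otel (both names are placeholders for your own setup):

PUT _ingest/pipeline/otel-logs
{
  "description": "parse OpenTelemetry collector log lines",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          """%{TIME:logtime} %{WORD:loglevel} trace.id=%{TRACE_ID:trace.id}(?: transaction.id=%{SPAN_ID:transaction.id})? %{GREEDYDATA:message}"""
        ],
        "pattern_definitions": {
          "TRACE_ID": "[0-9A-Fa-f]{32}",
          "SPAN_ID": "[0-9A-Fa-f]{16}"
        }
      }
    },
    {
      "date": {
        "field": "logtime",
        "target_field": "@timestamp",
        "formats": ["HH:mm:ss"]
      }
    }
  ]
}

PUT logs-otel/_settings
{
  "index.default_pipeline": "otel-logs"
}

With index.default_pipeline set, every document indexed into logs-otel runs through the pipeline automatically, so trace.id and transaction.id become available for correlation in the APM UI.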

Just note that the exact date is missing from the log line, so the @timestamp field resolves to January 1st of the current year.
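
If the log line can be made to carry the full date (for instance by changing the output template's {Timestamp:HH:mm:ss} to {Timestamp:yyyy-MM-dd HH:mm:ss}), the grok and date processors can be adjusted to pick it up. A sketch under that assumption, with %{TIMESTAMP_ISO8601:logtime} replacing %{TIME:logtime} and a matching date format:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            """%{TIMESTAMP_ISO8601:logtime} %{WORD:loglevel} trace.id=%{TRACE_ID:trace.id}(?: transaction.id=%{SPAN_ID:transaction.id})? %{GREEDYDATA:message}"""
          ],
          "pattern_definitions": {
            "TRACE_ID": "[0-9A-Fa-f]{32}",
            "SPAN_ID": "[0-9A-Fa-f]{16}"
          }
        }
      },
      {
        "date": {
          "field": "logtime",
          "target_field": "@timestamp",
          "formats": ["yyyy-MM-dd HH:mm:ss"]
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "2021-01-05 08:27:47 INF trace.id=898a7716358b25408d4f193f1cd17831 transaction.id=4f7590e4ba80b64b SOME MSG"
      }
    }
  ]
}

With the full date present, @timestamp resolves to 2021-01-05T08:27:47.000Z instead of defaulting to January 1st.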

Upvotes: 2
