Rémy.W

Reputation: 225

elasticsearch mapping parser exception

I'm trying to import a Jmeter test report to an ELK stack (ElasticSearch, Logstash, Kibana).

I'm using:

I'm feeding this .csv file to Logstash and importing the data into Elasticsearch:

Label,SampleSize,Average,Median,90% Line,95% Line,99% Line,Min,Max,Error %,Throughput,Received KB/sec,Std Deviation,targethost
myRequest,5,59,58,63,66,66,52,66,0.00%,5.8,12.3,4.87,myHost

Here's the same data in table form, for clarity:

Label     | SampleSize | Average | Median | 90% Line | 95% Line | 99% Line | Min | Max | Error % | Throughput | Received KB/sec | Std Deviation | targethost
myRequest | 5          | 59      | 58     | 63       | 66       | 66       | 52  | 66  | 0.00%   | 5.8        | 12.3            | 4.87          | myHost
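
For reference, a minimal Logstash pipeline for a CSV like this could look like the sketch below; the file path, Elasticsearch host, and conversion list are illustrative assumptions rather than the actual configuration used here.

# Illustrative pipeline: read the JMeter CSV, parse it, and index it into jmeter-report.
# The path, host, and sincedb settings are placeholders.
input {
  file {
    path => "/path/to/jmeter-report.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # re-read the whole file on each run (handy while testing)
  }
}

filter {
  csv {
    separator => ","
    columns => ["Label","SampleSize","Average","Median","90% Line","95% Line","99% Line",
                "Min","Max","Error %","Throughput","Received KB/sec","Std Deviation","targethost"]
  }
  mutate {
    # Convert the numeric columns so they match the long/double types in the mapping.
    convert => {
      "SampleSize"      => "integer"
      "Average"         => "integer"
      "Median"          => "integer"
      "90% Line"        => "integer"
      "95% Line"        => "integer"
      "99% Line"        => "integer"
      "Min"             => "integer"
      "Max"             => "integer"
      "Throughput"      => "float"
      "Received KB/sec" => "float"
      "Std Deviation"   => "float"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "jmeter-report"
  }
}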

And I created this mapping in Elasticsearch:

PUT jmeter-report
{
  "settings": {
    "number_of_shards": 5
  },
  "mappings": {
    "doc": {
      "properties": {
        "90% Line": {
          "type": "long"
        },
        "95% Line": {
          "type": "long"
        },
        "99% Line": {
          "type": "long"
        },
        "Max": {
          "type": "long"
        },
        "Median": {
          "type": "long"
        },
        "Min": {
          "type": "long"
        },
        "Received KB/sec": {
          "type": "double"
        },
        "SampleSize": {
          "type": "long"
        },
        "Std Deviation": {
          "type": "double"
        },
        "Error %": {
          "type": "double"
        },
        "Average": {
          "type": "long"
        },
        "Throughput": {
          "type": "double"
        }
      }
    }
  }
}

But I get the following error message in the Logstash logs:

[WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"jmeter-report", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x3a7561c3>], :response=>{"index"=>{"_index"=>"jmeter-report", "_type"=>"doc", "_id"=>"qEXlwHsBSxy0UyVRzKMJ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [99% Line] of type [long] in document with id 'qEXlwHsBSxy0UyVRzKMJ'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: "99% Line""}}}}}

If I stop and restart Logstash, the field named in the error message (here "99% Line") isn't always the same; it appears to be more or less random.

Upvotes: 1

Views: 193

Answers (1)

Badger

Reputation: 4072

"reason"=>"For input string: "99% Line"

It is failing to index the header line, because Elasticsearch cannot parse the column names as longs.

You could just ignore the error, since indexing the header adds no value, or you could drop the header line in Logstash:

if [message] =~ /99% Line/ { drop {} }
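
For context, that conditional goes inside the filter section of the pipeline, before any csv parsing, so the header line is discarded rather than indexed; the surrounding layout below is only a sketch:

filter {
  # Discard the CSV header line before it is parsed or sent to Elasticsearch
  if [message] =~ /99% Line/ { drop {} }
  # ... csv filter, conversions, etc. follow here ...
}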

Upvotes: 0
