Maya

Reputation: 31

Logstash with Elasticsearch indexes only 10,000 documents

I am working with Filebeat and Logstash to upload logs to Elasticsearch (all components are the 7.3-oss version). My log file contains billions of rows, yet Elasticsearch only shows 10,000 documents.

I added another output, stdout { codec => rubydebug }, to print events to the screen, and it looks like all the data is arriving from Filebeat, but for some reason Logstash only uploads 10,000 docs. I also tried removing the json filter in Logstash, but the issue still occurs.

Filebeat config

filebeat.inputs:
- type: log
  paths:
    - \\some-path\my.json
output.logstash:
  hosts: ["localhost:5044"]

Logstash pipeline

input {
  beats {
    port => 5044
  }
}

filter {
    json {
        source => "message"
    }
}

output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        hosts => [ "machine-name:9200" ]
    }
}

logstash.yml

is empty, as in the default installation.

Upvotes: 2

Views: 420

Answers (1)

Maya

Reputation: 31

I found that it was my search that caused the confusion. According to https://www.elastic.co/guide/en/elasticsearch/reference/7.3/search-request-body.html#request-body-search-track-total-hits, Elasticsearch simply doesn't return the accurate hit count by default; it just states that it is greater than or equal to 10,000.

Changing my search query to

GET logstash-*/_search
{
  "track_total_hits": true
}

returned the correct total.
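For context, Elasticsearch 7.x stops counting hits at 10,000 by default and only reports a lower bound in hits.total. A sketch of the relevant fragment of the search response, with and without the setting (the exact count shown is illustrative):

# default: counting stops early, "gte" means "at least this many"
"hits": {
  "total": { "value": 10000, "relation": "gte" }
}

# with "track_total_hits": true the exact count is returned
"hits": {
  "total": { "value": 2000000000, "relation": "eq" }
}

GET logstash-*/_count should also return the exact number of documents in the matching indices without running a full search.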

Upvotes: 1
