BoJack Horseman

Reputation: 4452

Logstash and Elasticsearch, possible data loss

I have a log file with 18360 log lines. I have a pattern to split and analyze the data, but unfortunately, when I ran my Logstash config, the query matched only 8478 hits. I then deleted all the data in the index, bringing the hits down to zero, and ran my Logstash config again; this time I got 11432 hits. This looks pretty random to me, as if data is being lost. I pass my data through stdin using the following command:

cat foo.log | /opt/logstash/bin/logstash -f bar.conf

(I have also tested it with the file as a direct input; the number of hits is still random.)

Is this evidence of data loss, or could it be something else?

Upvotes: 2

Views: 1159

Answers (1)

Jilles van Gurp

Reputation: 8294

Do a curl 'localhost:9200/logstash*/_count' to get a reliable count (quote the URL so the shell does not expand the wildcard). If that's less than you are expecting, you are indeed experiencing data loss.
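A minimal sketch of that comparison, assuming Elasticsearch is on localhost:9200 and the events come from foo.log with one event per line (the grep-based extraction of the count field is a convenience assumption, not the only way to parse the JSON):

```shell
# Lines we expect Logstash to have indexed (one event per line)
expected=$(wc -l < foo.log)

# Documents Elasticsearch actually holds in the logstash indices;
# quote the wildcard so the shell does not glob it against local files
indexed=$(curl -s 'localhost:9200/logstash*/_count' | grep -o '"count":[0-9]*' | cut -d: -f2)

echo "expected: $expected  indexed: $indexed"
```

If the two numbers differ, some events never made it into the index.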

If so, check your Elasticsearch log for errors, and check your Logstash log as well. Common issues that can result in data loss: out-of-memory errors, problems with your mapping, network timeouts, too few file handles, etc. The logs will tell you.
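As a sketch, that log check might look like the following (the log paths are typical package-install defaults and the search pattern is an assumption; adjust both to your installation):

```shell
# Scan the Elasticsearch and Logstash logs for common failure signatures
# (paths below are assumptions, not guaranteed locations)
for log in /var/log/elasticsearch/elasticsearch.log /var/log/logstash/logstash.log; do
    [ -f "$log" ] && grep -i -E 'error|exception|rejected|timeout|OutOfMemory' "$log" | tail -n 20
done

# A low open-file limit is a common cause of dropped events; check it
ulimit -n
```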

If not, check the Elasticsearch log for problems with the queries Kibana is sending. This can happen.

Upvotes: 1
