Reputation: 1070
I'm using the ELK stack and wondered how to handle a crisis in my Elasticsearch cluster: what is the best practice for buffering logs coming from Logstash when Elasticsearch fails but logs keep arriving?
Alternatively, if you have a better solution for surviving an Elasticsearch failure while keeping Logstash "live and on air", I'd like to hear it.
Upvotes: 5
Views: 1808
Reputation: 16362
You don't mention what your inputs actually are. Filebeat will stop sending data to Logstash/Elasticsearch if it is unable to connect. Since it keeps track of its position in each file, you get a durable buffer for free. Note that you may have problems if a log file is rotated away while the server is still down.
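As a minimal sketch of that setup, a `filebeat.yml` along these lines tails the files and ships to Logstash; the path and host below are placeholders, and key names can differ slightly between Filebeat versions (older releases used `filebeat.prospectors`):

```yaml
# filebeat.yml — minimal sketch, placeholder paths/hosts
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log   # adjust to your application's log files

# Filebeat records its read offset in its registry file, so if
# Logstash is unreachable it simply stops and resumes from the
# same offset once the connection is back.
output.logstash:
  hosts: ["logstash.example.com:5044"]
```

The backpressure is automatic: Filebeat only advances its registry offset once events are acknowledged downstream.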
Upvotes: 0
Reputation: 414
Place a Buffer (Redis, RabbitMQ ...) in front of your Logstash machine that will be the entry point for all log events that are shipped to your system. It will then buffer the data until the downstream components have enough resources to index.
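One common way to wire this up is to split Logstash into a "shipper" that pushes events into Redis and an "indexer" that pulls from Redis and writes to Elasticsearch. A sketch of the two pipeline configs, with placeholder hosts and a hypothetical key name:

```conf
# --- shipper pipeline: accepts events and pushes them to the Redis buffer ---
input {
  beats {
    port => 5044
  }
}
output {
  redis {
    host      => "redis.example.com"   # placeholder
    data_type => "list"
    key       => "logstash-buffer"     # hypothetical list key
  }
}

# --- indexer pipeline: drains the Redis list and indexes into Elasticsearch ---
input {
  redis {
    host      => "redis.example.com"   # placeholder
    data_type => "list"
    key       => "logstash-buffer"
  }
}
output {
  elasticsearch {
    hosts => ["es.example.com:9200"]   # placeholder
  }
}
```

If Elasticsearch goes down, the indexer stalls and events simply accumulate in the Redis list; the shipper and your log sources stay up, and indexing catches up once the cluster recovers.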
Upvotes: 1