Reputation: 81
I have multiple logstash instances shipping logs directly to a central elasticsearch server (using output->elasticsearch).
This works fine so far, but when elasticsearch goes down (e.g. the whole server is restarted), logstash doesn't resume sending logs once elasticsearch is back up again.
I have to restart logstash manually. Additionally, all logs produced between elasticsearch going down and logstash being restarted are lost.
How can I change my setup to make it more fault-tolerant?
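For reference, the relevant part of my shipping configuration looks roughly like this (the hostname is a placeholder):

    output {
      elasticsearch {
        host => "elasticsearch.example.org"
      }
    }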
Upvotes: 2
Views: 1415
Reputation: 1618
You should consider using a broker: send all your logs to a message queue (for example RabbitMQ), and have Logstash pull messages from there and send the data to Elasticsearch. If Elasticsearch goes down, Logstash stops pulling messages and they accumulate in the broker; once the connection is re-established, the queued messages are written.
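This splits Logstash into a shipper (writes to the queue) and an indexer (reads from the queue and writes to Elasticsearch). A rough sketch of both configurations, assuming RabbitMQ as the broker; the hostnames, exchange name, and queue name are placeholders, and exact option names can vary with your Logstash version:

    # shipper: forward events to RabbitMQ instead of Elasticsearch
    output {
      rabbitmq {
        host          => "broker.example.org"
        exchange      => "logstash"
        exchange_type => "direct"
        key           => "logstash"
        durable       => true    # exchange survives a broker restart
        persistent    => true    # messages are persisted to disk
      }
    }

    # indexer: drain the queue and write to Elasticsearch
    input {
      rabbitmq {
        host    => "broker.example.org"
        queue   => "logstash"
        key     => "logstash"
        durable => true
      }
    }
    output {
      elasticsearch {
        host => "elasticsearch.example.org"
      }
    }

With durable and persistent set, messages queued while Elasticsearch is down should survive even a restart of the broker itself.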
Upvotes: 1
Reputation: 8949
Add another server and start an elasticsearch cluster. Elasticsearch is built to scale to multiple nodes, and the logstash client will join the cluster and fail over automatically.
Learn more about the distributed nature of elasticsearch.
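With a cluster in place, a minimal sketch of pointing Logstash at more than one node (hostnames are placeholders, and the option name depends on the Logstash version):

    output {
      elasticsearch {
        # newer versions accept a list of nodes and fail over between them;
        # older versions use "host" or "cluster" instead of "hosts"
        hosts => ["es1.example.org:9200", "es2.example.org:9200"]
      }
    }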
Upvotes: 0