Reputation: 11
I am trying to build a log pipeline using RabbitMQ + ELK on Windows Server.
RabbitMQ --> Logstash --> ElasticSearch --> Kibana.
Ideally I want to have 2 instances of RabbitMQ, 2 of Logstash, 3 of ElasticSearch, and 1 of Kibana.
Has anyone set up something like this? I know we can set up an ElasticSearch cluster easily by setting the cluster name in the yml. What is the mechanism for Logstash to write to the ES cluster?
Should I set up RabbitMQ+Logstash combos in each instance, so that if the MQs are behind a load balancer, each MQ has its own Logstash instance and from there the data goes to the cluster?
Upvotes: 1
Views: 1417
Reputation: 1811
It's not recommended to write data directly from Logstash to ES. ES writes are slow, so under heavy load you can lose data.
The idea is to add a broker between Logstash and ES:
Logstash --> Broker --> Elasticsearch
Logstash supports Redis and RabbitMQ as brokers.
The broker can absorb large bursts of input and acts as a queue.
The Logstash documentation puts Redis forward as the primary choice, because of its simplicity of setup and monitoring.
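As a sketch of that layout with Redis as the broker (host names and the list key below are placeholders; this assumes the stock redis input/output plugins):

```conf
# --- Shipper pipeline: push events into Redis instead of ES directly ---
output {
  redis {
    host      => "redis-broker"   # placeholder broker address
    data_type => "list"
    key       => "logstash"       # Redis list used as the queue
  }
}

# --- Indexer pipeline (separate Logstash process): drain Redis into ES ---
input {
  redis {
    host      => "redis-broker"
    data_type => "list"
    key       => "logstash"
  }
}
output {
  elasticsearch {
    hosts => ["es-node1:9200"]    # placeholder ES node
  }
}
```

The shipper stays fast because pushing to a Redis list is cheap; the indexer drains the list at whatever rate the ES cluster can sustain.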
Upvotes: 1
Reputation: 46
Technically you could write directly from Logstash to ES using the elasticsearch output plugin, or the elasticsearch_http output plugin if your ES version is not compatible with the native elasticsearch output. That said, for an enterprise scenario you need fault tolerance and the ability to handle volume, so it's a good idea to have RabbitMQ/Redis in between.
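For reference, the direct route is just an elasticsearch output pointed at the cluster (a sketch; node names are placeholders, and the `hosts` array syntax assumes a newer plugin version — older Logstash 1.x releases used a single `host` setting):

```conf
output {
  elasticsearch {
    # List several cluster nodes so Logstash can fail over between them
    hosts => ["es-node1:9200", "es-node2:9200", "es-node3:9200"]
  }
}
```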
Your config above looks good, although the input to your RabbitMQ cluster would come from one or many Logstash shippers (instances running on the client machines where the logs live), which would point to an HA RabbitMQ cluster. A Logstash indexer would then be configured with the RabbitMQ queue(s) as its input and the Elasticsearch cluster as its output.
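The shipper/indexer split above can be sketched like this (a minimal outline, assuming the rabbitmq input/output plugins; the host, exchange, and queue names are placeholders for your own setup):

```conf
# --- Shipper (runs on each client machine): publish to the HA RabbitMQ cluster ---
output {
  rabbitmq {
    host          => "rabbitmq-lb"   # load-balanced cluster address (placeholder)
    exchange      => "logs"
    exchange_type => "direct"
    key           => "logstash"
  }
}

# --- Indexer (separate Logstash process): consume the queue, write to ES ---
input {
  rabbitmq {
    host    => "rabbitmq-lb"
    queue   => "logstash"
    durable => true                  # survive broker restarts
  }
}
output {
  elasticsearch {
    hosts => ["es-node1:9200", "es-node2:9200", "es-node3:9200"]
  }
}
```

This keeps the clients lightweight (they only publish to RabbitMQ) while the indexer tier can be scaled independently of both the clients and the ES cluster.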
Hope that helps.
Upvotes: 1