Reputation: 291
I am trying to use ELK to build a log analysis system. I have seen a lot of architectures that use ELK in different ways. One of them is:
Logstash->Redis->Logstash->Elasticseach->Kibana
The first Logstash instance is used for collecting logs, and the second is used for filtering them.
I am not very clear on the Redis part. Do we have to use it? Why not use Kafka instead?
Upvotes: 4
Views: 5483
Reputation: 278
You can use the simple ELK setup (without Redis) if you don't need it. You can go through the below link for the full setup: how to load logs using Logstash, search them using Elasticsearch, and build visualizations in Kibana.
Upvotes: 0
Reputation: 16362
The Redis instance between the two Logstash instances is a buffer; it's there in case Elasticsearch or the Logstash indexer goes down.
Depending on what you're processing with logstash, you may not need it. If you're reading log files, logstash (the shipper) will stop sending logs when logstash (the indexer) is overwhelmed. This way, you get a distributed cache (in your log files!).
If you're receiving one-time events (e.g. traps or syslog messages from network devices), then a buffer like Redis or RabbitMQ is important for storing them until logstash (the indexer) is available.
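As an illustrative sketch of that shipper/buffer/indexer layout (the hostnames, file paths, and Redis key here are assumptions, not values from the question), the two Logstash configs might look roughly like this:

```conf
# shipper.conf — runs where the logs are produced; no filtering, just forwarding
input {
  file {
    path => "/var/log/app/*.log"      # assumed log location
  }
}
output {
  redis {
    host      => "redis.example.com"  # assumed Redis host
    data_type => "list"               # push events onto a Redis list
    key       => "logstash"           # assumed list key; must match the indexer
  }
}
```

```conf
# indexer.conf — runs near Elasticsearch; pops events off Redis, filters, indexes
input {
  redis {
    host      => "redis.example.com"
    data_type => "list"
    key       => "logstash"
  }
}
filter {
  # parsing/enrichment (grok, date, etc.) goes here
}
output {
  elasticsearch {
    hosts => ["es.example.com:9200"]  # assumed Elasticsearch endpoint
  }
}
```

If the indexer or Elasticsearch goes down, events simply accumulate in the Redis list until the indexer comes back and drains it.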
Upvotes: 3