Reputation: 45
Hope you're all well during this pandemic.
I've got a Kubernetes cluster running. Communication between pods is done through Kafka. Everything currently logs to stdout only — no files, no Kafka topic. This is obviously pretty bad.
I want to set up a Grafana instance that lets me centralize all logs there. The storage would be Loki + S3.
To do that, I found that many people use tools like Fluentd, Fluent Bit, and Promtail, which collect the logs and ship them to Loki. However, I already have Kafka running. I can't see why I'd use a tool like Fluentd if I can send all logs to Kafka through a "logging" topic.
My question is: how could I send all messages inside the logging topic to Loki? As far as I can tell, Fluentd cannot take input from Kafka.
Would I have to set up some script that runs periodically, sorts the data, and sends it to Loki directly?
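To make the idea concrete, here is a rough sketch of what such a script might do, using only the Python standard library. It assumes you already have the messages from the logging topic in hand (the Kafka-consumer part is omitted); the Loki URL, topic name, and labels are placeholders. Loki's push API (`/loki/api/v1/push`) expects nanosecond string timestamps, sorted ascending within a stream.

```python
# Hypothetical sketch: batch log records into a Loki push-API payload.
# LOKI_URL is an assumed in-cluster address, not something from my setup.
import json
import urllib.request

LOKI_URL = "http://loki:3100/loki/api/v1/push"


def build_payload(records, labels):
    """records: iterable of (unix_ts_seconds, log_line) pairs.

    Loki wants nanosecond timestamps as strings, in ascending order
    per stream, so we sort before converting.
    """
    values = [[str(int(ts * 1e9)), line] for ts, line in sorted(records)]
    return {"streams": [{"stream": labels, "values": values}]}


def push(records, labels):
    """POST a batch of records to Loki's push endpoint."""
    body = json.dumps(build_payload(records, labels)).encode()
    req = urllib.request.Request(
        LOKI_URL, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)


# Usage against a real Loki instance (labels are illustrative):
# push([(1600000000.0, "pod started")], {"job": "logging-topic"})
```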
Upvotes: 2
Views: 4125
Reputation: 1191
I recommend you use Promtail rather than the Kafka solution, since Promtail is also from Grafana.
If you send the logs from your apps to Kafka, you add an extra hop: something still has to consume the logging topic and forward the entries to Loki.
With one of the normally proposed approaches, an agent like Promtail (or Fluent Bit) simply tails the container logs on each node and pushes them straight to Loki.
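For comparison, a minimal Promtail configuration for the agent-based approach might look like the following. This is a sketch, not a drop-in config: the Loki URL and job name are placeholders, and a real Kubernetes deployment would usually run this as a DaemonSet with more relabeling rules.

```yaml
# Hypothetical minimal Promtail config; loki:3100 is an assumed address.
server:
  http_listen_port: 9080

clients:
  - url: http://loki:3100/loki/api/v1/push

scrape_configs:
  - job_name: kubernetes-pods
    kubernetes_sd_configs:
      - role: pod          # discover pods via the Kubernetes API
    relabel_configs:
      - source_labels: [__meta_kubernetes_pod_name]
        target_label: pod  # attach the pod name as a Loki label
```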
But if you do want to go with your solution, with Kafka in the middle, Fluentd has plugins to use Kafka as both input and output: https://github.com/fluent/fluent-plugin-kafka
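Using that plugin, a Fluentd pipeline from your logging topic into Loki could be sketched roughly as below. This assumes fluent-plugin-kafka and fluent-plugin-grafana-loki are installed, that the topic's messages are JSON, and that the broker/topic names and Loki URL are placeholders for your own.

```
# Hypothetical Fluentd config: consume the "logging" topic, forward to Loki.
<source>
  @type kafka
  brokers kafka-broker:9092   # assumed broker address
  topics logging
  format json
</source>

# The kafka input tags events with the topic name by default.
<match logging>
  @type loki
  url http://loki:3100        # assumed Loki address
  extra_labels {"job":"kafka-logging"}
  <buffer>
    flush_interval 10s        # batch pushes instead of per-message requests
  </buffer>
</match>
```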
Upvotes: 0