Reputation: 1335
We are setting up Elasticsearch, Kibana, Logstash and Filebeat on a server to analyse log files from many applications. Due to reasons*, each application's log file ends up in a separate directory on the ELK server. We have about 20 log files.
Thank you!
*There are different vendors responsible for different applications, and they run across many different operating systems; many of them will not or cannot install anything like Filebeat.
Upvotes: 2
Views: 1399
Reputation: 7875
We do not recommend reading log files from network volumes. Whenever possible, install Filebeat on the host machine and send the log files directly from there. Reading files from network volumes (especially on Windows) can have unexpected side effects. For example, changed file identifiers may result in Filebeat reading a log file from scratch again.
We always recommend installing Filebeat on the remote servers. Using shared folders is not supported. The typical setup is that you have a Logstash + Elasticsearch + Kibana in a central place (one or multiple servers) and Filebeat installed on the remote machines from where you are collecting data.
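As a minimal sketch of that topology, the Filebeat configuration on each remote machine would point its output at the central Logstash. The hostname and port below are placeholders, not values from this setup:

output.logstash:
  hosts: ["elk-server.example.com:5044"]   # central Logstash; replace with your server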
For a single running Filebeat instance, you can apply different configuration settings to different files by defining multiple input sections, as in the example below; see the Filebeat input documentation for more:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - 'C:\App01_Logs\log.txt'
  tags: ["App01"]
  fields:
    app_name: App01
- type: log
  enabled: true
  paths:
    - 'C:\App02_Logs\log.txt'
  tags: ["App02"]
  fields:
    app_name: App02
- type: log
  enabled: true
  paths:
    - 'C:\App03_Logs\log.txt'
  tags: ["App03"]
  fields:
    app_name: App03
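As a side note, the paths setting also accepts glob patterns, so if your ~20 directories follow a naming convention a single entry can match several files. The pattern below is only an illustration; it would merge the matched apps into one input, so you would lose the per-app fields shown above:

  paths:
    - 'C:\App*_Logs\*.txt'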
And you can have one Logstash pipeline with conditional statements in the filter:
filter {
  if [fields][app_name] == "App01" {
    grok { }
  } else if [fields][app_name] == "App02" {
    grok { }
  } else {
    grok { }
  }
}
The condition can also be if "App02" in [tags]
or if [source] == "C:\App01_Logs\log.txt",
since we send those values from Filebeat.
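For completeness, a minimal sketch of how the whole pipeline could be wired together. The Beats port, Elasticsearch address, and index pattern are assumptions for illustration, not part of the setup described above:

input {
  beats {
    port => 5044   # assumed port that Filebeat ships to
  }
}
filter {
  # per-app conditionals as shown above
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]                      # assumed Elasticsearch address
    index => "%{[fields][app_name]}-%{+YYYY.MM.dd}"  # e.g. one index per application per day
  }
}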
Upvotes: 0