Reputation: 136381
Consider the following directory structure:
/var/log/quodo/campaigns/deployment_29/campaign-32/users.log
/var/log/quodo/campaigns/deployment_29/campaign-12/ads.log
/var/log/quodo/campaigns/deployment_55/campaign-77/users.log
/var/log/quodo/campaigns/deployment_55/campaign-37/ads.log
...
I would like to log-ship all the logs under /var/log/quodo/campaigns/<whatever1>/<whatever2>
to ElasticSearch using Filebeat.
Filebeat does not feature recursive monitoring of a directory:
To fetch all files from a predefined level of subdirectories, the following pattern can be used: /var/log/*/*.log. This fetches all .log files from the subfolders of /var/log. It does not fetch log files from the /var/log folder itself. Currently it is not possible to recursively fetch all files in all subdirectories of a directory.
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/quodo/campaigns/*/*/*
  scan_frequency: 1s
output.elasticsearch:
  <connection data>
Can I use two levels of * in the directory hierarchy in the Filebeat configuration?
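As a quick sanity check of the glob itself, independent of Filebeat, the same two-level pattern can be expanded in a shell; the paths below are the hypothetical ones from the directory listing above:
# Should list users.log and ads.log under every deployment_*/campaign-* directory
ls -1 /var/log/quodo/campaigns/*/*/*.log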
Upvotes: 2
Views: 3091
Reputation: 476
It works for me on Filebeat version 7.6.1 to monitor directories recursively:
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/**/*
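Note that on Filebeat 7.x the prospector-style keys shown above have been superseded; if that snippet is rejected on 7.6.1, the equivalent input-style configuration should look roughly like this (same path, only the key names change):
filebeat.inputs:        # 7.x replacement for filebeat.prospectors
- type: log             # replaces input_type
  paths:
    - /var/log/**/*     # ** is expanded recursively by Filebeat, up to a fixed depth (8 levels per the docs)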
Upvotes: 5
Reputation: 146630
Below is my Filebeat version:
$ filebeat.sh --version
filebeat version 5.6.2 (amd64), libbeat 5.6.2
And I tried the following config:
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/**/**/*
output.console:
  pretty: true
logging.level: debug
And it works great for me:
{
  "@timestamp": "2017-10-07T18:12:17.694Z",
  "beat": {
    "hostname": "vagrant",
    "name": "vagrant",
    "version": "5.6.2"
  },
  "input_type": "log",
  "message": "tarun",
  "offset": 6,
  "source": "/var/log/test1/test3/test.log",
  "type": "log"
}
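For reference, the event above corresponds to a test layout along these lines (a hypothetical reconstruction from the source and message fields shown, not the original poster's exact commands):
# Recreate the nested test file that produced the event shown above
mkdir -p /var/log/test1/test3
echo "tarun" > /var/log/test1/test3/test.log   # "tarun" plus newline is 6 bytes, matching "offset": 6
filebeat.sh -e -c filebeat.yml                 # run in the foreground with the config above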
Upvotes: 6