Raja Muneer

Reputation: 13

How to send custom logs in a specified path to filebeat running inside docker

I am new to Filebeat and the ELK stack. I am trying to send custom logs to Elasticsearch directly using Filebeat. Both the ELK stack and Filebeat are running inside Docker containers. The custom logs are in the folder home/username/docker/hello.log. Here is my filebeat.yml file:

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/raju/elk/docker/*.log
filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

processors:
- add_cloud_metadata: ~

output.elasticsearch:
  hosts: ["my_ip:9200"]

And here is my custom log file:

This is a custom log file 
Sending logs to elastic search

And this is the command I am using to run Filebeat:

docker run -d \
  --name=filebeat \
  --user=root \
  --volume="$(pwd)/filebeat.docker.yml:/usr/share/filebeat/filebeat.yml:ro" \
  --volume="/var/lib/docker/containers:/var/lib/docker/containers:ro" \
  --volume="/var/run/docker.sock:/var/run/docker.sock:ro" \
  docker.elastic.co/beats/filebeat:8.5.3 filebeat -e --strict.perms=false

When I use the above command to run Filebeat, I can see the logs of the Docker containers on my Kibana dashboard. But I am struggling with how to make Filebeat read my custom logs from the location specified above and show the lines inside the log file on the Kibana dashboard.

Any help would be appreciated.

Upvotes: 1

Views: 1284

Answers (1)

Ayush

Reputation: 336

Filebeat inputs can generally accept multiple log file paths to harvest. In your case, you just need to add the log file's location to the paths attribute of your log input, similar to:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/raju/elk/docker/*.log
    - /home/username/docker/hello.log
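
Note that because Filebeat here runs inside a Docker container, it can only harvest paths that exist in the container's filesystem. Assuming hello.log lives on the host (as the question suggests), the host directory would also need to be bind-mounted into the Filebeat container, with the mount point matching the path configured in filebeat.yml. A sketch, adding one extra --volume flag to the docker run command from the question (the in-container mount path here is an assumption):

docker run -d \
  --name=filebeat \
  --user=root \
  --volume="$(pwd)/filebeat.docker.yml:/usr/share/filebeat/filebeat.yml:ro" \
  --volume="/var/lib/docker/containers:/var/lib/docker/containers:ro" \
  --volume="/var/run/docker.sock:/var/run/docker.sock:ro" \
  --volume="/home/username/docker:/home/username/docker:ro" \
  docker.elastic.co/beats/filebeat:8.5.3 filebeat -e --strict.perms=false

After restarting the container, the configured path resolves inside the container and the harvested lines should appear in Kibana.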

Upvotes: 1
